Blog

  • Latency is 4-eva’

    I filled out a survey this morning sent to me by a college kid from Ireland, Patrick Wilson (email him requesting the survey to help him out, and remove ANTI SPIZZAM from the email address before sending), about multiplayer online games. Either that, or the Word doc just p@wned my comp with macro viruses.

    Anyway, it had interesting questions, and one made me really think. It asked if you think people would pay for infrastructure to remove the latency experienced in Massively Multiplayer Online Games, or whether consumers would just continue to suffer in exchange for lower prices.

    Naturally, I responded: damn straight I’d pay, kill all lag!!!

    He finished with: do you think latency will go away?

    Lag go away? No way! Even if we all had G8s (T3s on crack, or whatever those G things are called), would your ping ever be under 10ms? And if so, is that really the problem?

    I mean, here’s the current scenario for an FPS (first-person shooter) like Counter-Strike (these aren’t the actual algorithms, just how I’d code them; see the sketch after the list):
    – I click the mouse
    – the mouse sends a message to Windows’ message pump
    – Counter-Strike gets the message when the CPU processes said instruction
    – Counter-Strike then determines client-side if a hit is probable
    – Counter-Strike sends an event packet to the server
    – the event is processed down the networking stack into packets
    – the packets fly across my DSL phone wire and hop across various routers until they reach their destination
    – upon re-assembly, the message is handed to the CS server
    – the server determines who the event affects (the receiver of the shot, others seeing me shoot, me getting the results of my shot)
    – the server fires off the necessary messages after confirming the event packet’s state matches the game state enough to designate a shot and a hit
    – the events do the reverse heading back to the clients
    – the messages bounce back across the wire, hopping on various routers
    – at my comp, the packets are reassembled, Counter-Strike gets the message, and the screen updates
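
    To make the client side of that list concrete, here’s a minimal sketch in ActionScript 2 (the language I’d actually code it in). This is NOT Counter-Strike’s real code; the class, the socket setup, and the message format are all made up for illustration:

        // Shooter.as — hypothetical sketch of the client-side half of the list above
        class Shooter
        {
            private var sock:XMLSocket; // already-connected socket to the game server

            function Shooter(sock:XMLSocket)
            {
                this.sock = sock;
            }

            // steps 1-4: mouse click arrives, client guesses if a hit is probable
            public function onMouseClick(x:Number, y:Number, target:MovieClip):Void
            {
                var probableHit:Boolean = target.hitTest(x, y, true); // client-side guess only
                sendShotEvent(x, y, probableHit);
            }

            // step 5: package the event and push it down the networking stack;
            // the server still gets the final say against its own game state
            private function sendShotEvent(x:Number, y:Number, probableHit:Boolean):Void
            {
                var msg:XML = new XML("<shot x='" + x + "' y='" + y +
                                      "' probableHit='" + probableHit +
                                      "' time='" + getTimer() + "' />");
                sock.send(msg);
            }
        }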

    Now, EVEN IF the networking gap in this equation is removed, is latency still there? Yeah. There are too many components involved that take time. The event described above is resolved in less than a second with a good ping, but that’s still latency; those aren’t real-life time frames.

    At what point do all the components above become real-time? And by the time that happens, will we even still be using the components above?

    I think latency will always exist, at least in computer/console games. The only ones negatively affected by it are those fast-paced FPSs; the MMORPGs are not, as long as the connection is continuous and latency remains at a stable level. Compensating for networking latency by tweening characters’ movements, auto-performing character attack actions, and queuing up events to smooth the experience are all great compensations for lag (see the sketch below).
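
    Here’s a minimal ActionScript 2 sketch of the tweening part, assuming the server only sends a remote character’s position every so often; the variable names and the character_mc clip are made up for illustration:

        // ease a remote character toward infrequent server updates instead of
        // teleporting him to each new position (hypothetical sketch)
        var serverX:Number = 0;     // last position reported by the server
        var smoothing:Number = 0.2; // fraction of the gap to close per frame

        function onServerUpdate(newX:Number):Void
        {
            serverX = newX; // just remember it; don’t snap the character there
        }

        character_mc.onEnterFrame = function():Void
        {
            // ease toward the server’s position each frame, hiding the network gaps
            this._x += (serverX - this._x) * smoothing;
        };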

    So to me, the question is not will latency ever go away, but rather, will latency ever stop being an issue?

    In playing multiplayer games over the years, I’ve seen drastic improvements in latency compensation, improving the lag experience. Game developers are getting better at defining which parameters are show-stoppers and which are acceptable boundaries, even leaving a lot of that up to the players.

    For example, most games nowadays have clearly written rules for latency caps, meaning if your ping rises above a certain threshold a set number of times, you are booted. Since a client’s participation also adds a level of latency, because it is one more non-instantaneous client to keep track of, you end up with the weakest-link scenario: those with the highest pings have the potential to ruin the game for everyone else. A cap could be as simple as the sketch below.
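
    A minimal ActionScript 2 sketch of such a cap; the threshold, the strike count, and the kickPlayer function are all made up for illustration:

        // boot a client whose ping exceeds the cap too many times (sketch)
        var PING_CAP:Number = 350;  // milliseconds
        var MAX_STRIKES:Number = 3; // how many bad pings before the boot
        var strikes:Number = 0;

        function onPingMeasured(ping:Number):Void
        {
            if (ping > PING_CAP)
            {
                strikes++;
                if (strikes >= MAX_STRIKES)
                {
                    kickPlayer(); // hypothetical: drop the weakest link
                }
            }
            else
            {
                strikes = 0; // a good ping resets the count
            }
        }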

    Different games handle this in different ways. In the early days of StarCraft, an online strategy game, almost half the games had suffixes in their names like “_Broadband Only”. If your latency indicator (5 bars, green when healthy) turned yellow or red, you were booted, even if it was the person hosting the game causing the problem. Racism based on color took on a whole new twist.

    Now, it’s by the numbers for most PC games. Until consoles get to that level of g33k maturity, you’ll still see the gauges and bars that showcase the very simple idea of network reliability, which is a new concept for some people. Even with broadband, things aren’t perfect.

    The higher your ping, the more you are stigmatized. If it starts high but goes low, or is even remotely unstable, you are mistrusted. The moniker of “LPB”, or low ping bastard, was immortalized in early games like Quake, Unreal, and Half-Life, and despite the name, it carries a lot of respect. With a low ping, you have a better chance of doing well, mainly because the game gets quicker, and thus usually more accurate, information about your game state. Did you really dodge that bullet? If your ping is lower, probably so.

    It’s not really about rich or poor, though; most of it is circumstance. I have DSL, for instance, but the connection sucks and is unreliable.

    These stigmas, however, do not transition to MMORPGs as much, mainly because those games do not require extremely low latency, but rather stable connections. Dependable is more important than responsive.

    Games have made great strides in softening the latency blow by pausing the game if it gets really bad, or by letting you still move and interact with the world, pausing only while waiting on the server’s response for the actions that truly need it, thus not negatively affecting your gameplay. A sketch of that idea follows.
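
    A minimal ActionScript 2 sketch of letting local actions run immediately while queuing the ones the server must confirm; every name here is made up for illustration:

        // queue server-dependent actions; let purely local ones run now (sketch)
        var pending:Array = []; // actions waiting on a server response

        function doAction(action:Object):Void
        {
            if (action.needsServer)
            {
                pending.push(action); // hold it until the server answers
                sendToServer(action); // hypothetical send helper
                showWaitIndicator();  // hypothetical spinner, not a full pause
            }
            else
            {
                action.execute();     // walking, chatting, etc. run instantly
            }
        }

        function onServerResponse(ok:Boolean):Void
        {
            var action:Object = pending.shift();
            if (ok)
            {
                action.execute(); // server confirmed; now it really happens
            }
            hideWaitIndicator();  // hypothetical
        }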

    However, some games are still built assuming low-latency connections. It’s a lot like how newer software, rather than being optimized for today’s hardware, actually depends on the hardware evolving so it can run its new, resource-hungry features. Windows Vista (aka Longhorn) is a perfect example. Not many computers today can really run the beta well enough to get the true benefit of a hardware-accelerated OS (unless you’re a Mac g33k).

    That says to me that connections are expected to improve just as hardware does.

    Now, this attitude towards hardware has brought forth renewed interest in interpreted languages, for example. They were formerly criticized for their slow speed because they are not compiled ahead of time to machine code, but compiled just in time, on the fly, or to some higher-level bytecode. Now that hardware has improved, people can look past their shortcomings in speed… because those shortcomings no longer exist, or at least not enough to adversely affect anyone.

    As such, will networking continue down the same path? Will connections get so fast and reliable that games take advantage of that fact? It’s following the same path as hardware, so as far as the developers are concerned, apparently so.

    I still think there are too many factors involved to truly remove latency, not just the networking components, but the hardware and software.

    Even if things used quantum computing or some other faster-than-light computing power, there is still one problem… people.

    People communicate via computers, and communication is an inherently flawed process. A message goes through 2 filters: the sender interprets it while delivering it, and the receiver has to interpret what was said. These 2 filters distort the message’s original intent, changing what it originally was. They also take time.

    So, even if hardware, networking, and software are instantaneous, people communicating and interpreting information are not.

    …still, the thought of lag becoming history is just awesome. I’m sure it’ll re-surface when I try to play Quake 27 with someone who lives on Pluto and I’m on a wireless gravity packet accelerator (uses the Sun’s gravitational pull to slingshot packets to their destination) or even a hyperspace satellite (drops into hyperspace for faster-than-light travel to its destination orbit to deliver information more quickly). It’s all good; striving to be more connected is what the Information Age is all about, right?

  • Getting AS2 Remoting Classes Into Your MTASC SWF

    Just read on the MTASC list the tail end of a long thread about event handlers and scope. We have a lot of new blood getting into Flash, and even experienced programmers still need to learn how Flash handles scope in different situations.

    As such, the conversation covered why you use the -mx compiler option for MTASC. Bottom line: it makes MTASC ignore all classes whose package path starts with “mx”. Since there is a plethora of Flash MX 2004 v2 component classes that will not compile under MTASC, you use the -mx option to have MTASC just ignore those classes when you choose to use a pre-compiled SWF. Most people compile in Flash after building their movieclips and artwork, and then use MTASC for the rest of their development cycle, unless they need to add more artwork or movieclips again (unless they are using SWFMill, of course).

    I had problems doing a FAME-only project a while ago; the AS2 remoting classes would NOT compile into my SWF using MTASC, even when using dependencies, and I couldn’t figure out why. The SWF even had some MX 2004 components in there already, so I was baffled.

    After reading that thread today, I figured out that unless Flash MX 2004 puts them there, via you compiling in it first, MTASC will NOT add them to the SWF, because they fall under the -mx compiler option’s rule: don’t put “mx.*” classes in the SWF.

    So, instead of using those Remoting Components you can drag into the FLA to force them in, or re-writing the AS2 remoting components to compile in MTASC (thanks Mario! I finally understood your email after 22 days… I know, I’m slow), just compile in Flash first so it puts the classes in the SWF; then you can use that SWF with MTASC. The workflow looks something like the lines below.
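
    With file names that are just examples: compile once in Flash MX 2004 so the mx.remoting.* classes get baked into main.swf, then from there on out update that same SWF with MTASC, telling it to leave the pre-compiled mx classes alone:

        mtasc -swf main.swf -main -mx Main.as

    (-main assumes your Main class has a static main function to kick things off; drop it if you bootstrap some other way.)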

  • Army 1st Lt. Timothy E. Price Dies in Iraq

    Tim Price

    Quote from his family’s statement:

    “1LT Timothy E. Price, leader of the 3rd Platoon, 127th MP Company, was killed in action in Baghdad, Iraq, on Tuesday, Sept. 7, 2004. At the time of his death, Tim was attempting to secure a defensive perimeter around a disabled Army vehicle that had been struck by an IED (Improvised Explosive Device) and was in flames. Tim, who was 25, was serving his second tour of duty in the Baghdad area.”

    My friend Derrick from high school, who broke the news to me 2 weeks ago after finding me by searching on the internet, said he was shot down by a sniper. During high school back in Richmond, Virginia, Tim and I were good friends. He was the only guy I knew more high strung than me, and he was always happy and upbeat. A lot of good memories.

    I miss you, dude! My deepest condolences go out to Tim’s family and friends.

  • Flash Player 8 Hardware Accelerated on Mac OSX 10.2+

    I hereby decree no more whining from you Apple-groupies about how Flash runs slow on your Picasso plastics. My video card doesn’t break a sweat running Half-Life 2 with its settings as high as they’ll go… but because I’m on a PC, I don’t need Flash Player using OpenGL to get good performance. All you toolbar-boxers now get hardware-accelerated Flash, and I don’t.

    Why Flash Player 8 runs mach-2 on Mac