I filled out a survey this morning sent to me by a college kid from Ireland, Patrick Wilson (email him requesting the survey to help him out, and remove ANTI SPIZZAM from the email address before sending), about multiplayer online games. Either that, or Word just p@wned my comp full of macro viruses.
Anyway, there were interesting questions on it, and one made me really think. It asked whether people would pay for infrastructure to remove the latency experienced in Massively Multiplayer Online Games, or whether consumers would just continue to suffer in exchange for lower prices.
Naturally, I responded damn straight I’d pay, kill all lag!!!
He finished with, “Do you think latency will go away?”
Lag go away? No way! Even if we all had G8s (T3s on crack, or whatever those G things are called), would your ping ever be under 10? And if so, is that really the problem?
I mean, here’s the current scenario for an FPS (first-person shooter) like Counter-Strike (these aren’t the actual algorithms, just how I’d code them):
– I click mouse
– mouse sends message to Windows’ message pump
– Counter-Strike gets the message when the CPU processes it
– Counter-Strike then determines client-side if a hit is probable
– Counter-Strike sends an event packet to the server
– event is processed down the networking stack to a packet
– packets fly across my DSL phone wire and hop across various servers until they reach their destination
– upon re-assembly, the message is sent to the CS server
– the server determines who the event affects (receiver of shot, others seeing me shoot, me getting results of my shot)
– server fires off the necessary messages after confirming the event packet’s state matches the game state enough to designate a shot and a hit
– events do the reverse heading back to clients
– messages bounce back across the wire, hopping on various servers
– at my comp, the packets are reassembled, and Counter-Strike gets the message and updates the screen
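The steps above can be sketched as a toy model. Every millisecond number below is made up for illustration (nothing here is measured, and it’s certainly not Counter-Strike’s real internals); the point is just that even if you zero out the wire, the rest of the pipeline still adds up:

```python
# A toy model of the shot round trip described above. All delays are
# made-up illustrative numbers (milliseconds), not measurements.

PIPELINE_MS = {
    "mouse_to_message_pump": 1.0,
    "client_processes_event": 2.0,
    "client_hit_prediction": 0.5,
    "packetize_down_net_stack": 0.5,
    "wire_and_hops_to_server": 25.0,   # the only purely "network" step
    "server_reassembly": 0.5,
    "server_resolves_event": 2.0,
    "wire_and_hops_to_client": 25.0,   # the return trip
    "client_reassembly": 0.5,
    "client_updates_screen": 16.7,     # one frame at ~60 Hz
}

def round_trip_ms(pipeline, zero_network=False):
    """Total event latency; optionally pretend the wire is instantaneous."""
    return sum(0.0 if (zero_network and "wire" in name) else ms
               for name, ms in pipeline.items())

print(round_trip_ms(PIPELINE_MS))                     # roughly 74 ms total
print(round_trip_ms(PIPELINE_MS, zero_network=True))  # still roughly 24 ms
                                                      # with a magic zero-ms wire
```

Even with these generous guesses, killing the network only removes part of the delay; the OS, the game loop, and the monitor refresh are all still in the chain.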
Now, EVEN IF the networking gap in this equation is removed, is latency still there? Yeah. There are too many components involved that take time. The event described above is resolved in less than a second with a good ping, but that’s still latency; not real-life time frames.
At what point do all the components above become real-time, or when that time comes are we not using the components above?
I think latency will always exist, at least in computer/console games. The only ones negatively affected by it are those fast-paced FPSs; the MMORPGs are not, as long as the connection is continuous and latency remains at a stable level. Compensating for networking latency by tweening characters’ movements, auto-performing character attack actions, and queuing up events to smooth the experience are all great compensations for lag.
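As a rough sketch of what that tweening looks like (the snapshot format and numbers are mine, not from any real engine): instead of snapping a remote character to each server update, the client interpolates between the last two known positions so movement looks smooth between packets.

```python
# Minimal tweening sketch: interpolate a remote character between the
# last two server snapshots instead of teleporting it on each update.
# Snapshot format (timestamp, (x, y)) is illustrative only.

def lerp(a, b, t):
    """Linear interpolation between positions a and b, t in [0, 1]."""
    return tuple(ax + (bx - ax) * t for ax, bx in zip(a, b))

def tweened_position(prev_update, next_update, now):
    """prev_update/next_update are (timestamp, (x, y)) server snapshots."""
    t0, p0 = prev_update
    t1, p1 = next_update
    if t1 <= t0:
        return p1
    t = min(max((now - t0) / (t1 - t0), 0.0), 1.0)  # clamp to [0, 1]
    return lerp(p0, p1, t)

# Server snapshots arrive every 100 ms; render frames fall in between.
pos = tweened_position((0.0, (0.0, 0.0)), (0.1, (10.0, 0.0)), now=0.05)
print(pos)  # (5.0, 0.0) — halfway between the two snapshots
```

The trade-off is that you’re always rendering remote players slightly in the past, which is exactly why the server has to be the referee on hits.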
So to me, the question is, not will latency ever go away, but rather, will latency ever stop being an issue?
In playing multiplayer games over the years, I’ve seen drastic improvements in latency compensation, improving the lag experience. Game developers are getting better at defining the parameters of what is a show stopper and what is an acceptable boundary, even leaving a lot of that up to players.
For example, most games nowadays have clearly written rules for latency caps, meaning if your ping rises above a certain threshold a set number of times, you are booted. Since a client’s participation also adds a level of latency, because it is one more non-instantaneous client to keep track of, you end up with a weakest-link scenario: those with the highest ping have the potential to ruin the game for others.
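One plausible way to code that cap rule; the threshold, strike count, and the “consecutive samples” reading are all my assumptions, not any particular game’s:

```python
# Sketch of a latency-cap rule: boot a player once their ping exceeds
# the cap for too many consecutive samples. Values are made-up examples.

PING_CAP_MS = 250
MAX_STRIKES = 3

def should_boot(ping_samples, cap=PING_CAP_MS, max_strikes=MAX_STRIKES):
    """Return True once `max_strikes` consecutive samples exceed the cap."""
    strikes = 0
    for ping in ping_samples:
        strikes = strikes + 1 if ping > cap else 0  # reset on a good sample
        if strikes >= max_strikes:
            return True
    return False

print(should_boot([40, 300, 280, 90, 260]))   # False — never 3 bad in a row
print(should_boot([40, 300, 280, 310, 90]))   # True — three straight spikes
```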
Different games handle this in different ways. In the early days of StarCraft, an online strategy game, almost half the games had suffixes in their names like “_Broadband Only”. If your latency indicator (5 green bars) turned yellow or red, you were booted, even if it was the person hosting the game causing the problem. Racism based on color took a whole new twist.
Now it’s by the numbers for most PC games. Until consoles get to that level of g33k maturity, you’ll still see the gauges and bars showcasing the very simple concept of network reliability, which is a new concept for some people. Even with broadband, things aren’t perfect.
The higher your ping, the more you are stigmatized. If it starts low but spikes high, or is even remotely unstable, you are mistrusted. Although the moniker “LPB”, or low ping bastard, was immortalized in early games like Quake, Unreal, and Half-Life, it does in fact garner a lot of respect. With a low ping, you have a better chance of doing well, mainly because the game gets quicker, and thus usually more accurate, information about your game state. Did you really dodge that bullet? If your ping is lower, probably so.
It’s not really about rich or poor, though; most of it is circumstance. I have DSL, for instance, but the connection sucks and is unreliable.
These stigmas, however, do not transition to MMORPGs as much, mainly because those games do not require extremely low latency, but rather stable connections. Dependable is more important than responsive.
Games have made great strides in softening the latency blow by pausing the game if it gets really bad, or by allowing you to still move and interact with the world, pausing only while waiting for the server’s response to those necessary actions, thus not negatively affecting your gameplay.
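A toy sketch of that split: purely local actions resolve instantly, while server-gated ones queue up until an ack arrives. Which actions fall on which side of the line is just my illustrative guess, not any game’s actual design:

```python
# Toy action queue: local actions never block on the network; actions
# needing server authority wait in a queue while play continues.

from collections import deque

class ActionQueue:
    LOCAL = {"move", "turn", "chat_draft"}       # resolve client-side
    SERVER_GATED = {"attack", "trade", "loot"}   # wait for the server

    def __init__(self):
        self.pending = deque()   # server-gated actions awaiting an ack
        self.resolved = []       # actions the player has seen complete

    def submit(self, action):
        if action in self.LOCAL:
            self.resolved.append(action)   # no waiting, no perceived lag
        else:
            self.pending.append(action)    # gameplay continues meanwhile

    def on_server_ack(self):
        if self.pending:
            self.resolved.append(self.pending.popleft())

q = ActionQueue()
for a in ["move", "attack", "move", "loot"]:
    q.submit(a)
print(q.resolved)   # ['move', 'move'] — movement was never blocked
q.on_server_ack()
print(q.resolved)   # ['move', 'move', 'attack']
```

The player keeps moving the whole time; only the attack’s outcome is held back until the server weighs in.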
However, some are still built for low-latency connection settings. It’s a lot like how newer software, rather than being more optimized for the hardware, actually depends on the hardware evolving so it can run its new features that require more resources. Windows Vista (aka Longhorn) is a perfect example. Not many computers today can really run the beta well enough to get the true benefit of a hardware-accelerated OS (unless you’re a Mac g33k).
As such, that says to me that connections are expected to improve just as hardware does.
Now, this attitude towards hardware has brought forth renewed interest in interpreted languages, for example. They were formerly criticized for their slow speed, because they are not compiled to machine code ahead of time, but compiled just in time, on the fly, or to some higher-level bytecode. Now that hardware has improved, people can look past their shortcomings in speed… because those shortcomings no longer exist, or not enough to adversely affect anyone.
As such, will networking continue down the same path? Will connections get so fast and reliable, that games take advantage of that fact? It’s following the same path as hardware, so as far as the developers are concerned, apparently so.
I still think there are too many factors involved to truly remove latency, not just the networking components, but the hardware and software.
Even if things used quantum computing or some other faster than light computing power, there is still one problem… people.
People communicate via computers, and communication is an inherently flawed process. A message goes through 2 filters: the sender interprets the idea while delivering it, and the receiver has to interpret what the sender said. These 2 filters change the message’s original intent, altering what it originally was. These 2 flaws additionally take time.
So, even if hardware, networking, and software are instantaneous, people communicating and interpreting information are not.
…still, the thought of lag becoming history is just awesome. I’m sure it’ll re-surface when I try to play Quake 27 with someone who lives on Pluto and I’m on a wireless-gravity packet accelerator (uses the Sun’s gravitational pull to slingshot packets to their destination) or even a hyperspace satellite (drops into hyperspace for faster-than-light travel to its destination orbit to more quickly deliver information). It’s all good; striving to be more connected is what the Information Age is all about, right?