
Google Peddles Physics-Defying ‘Negative Latency’ Stadia Hype

Thomas Bardwell
Last Updated September 23, 2020 1:07 PM

The launch of Google Stadia next month should answer a lot of questions about the viability of cloud-based gaming. In particular, it should shed light on a facet of the technology that has long concerned those intrigued by streamed video games: how will the tech giant tackle latency and offer a responsive, lag-free experience on the user’s end?

The varying speed and reliability of internet connections across the globe, the differing hardware processing demands of individual games, the physical distance to Google’s data centers: there are many hurdles to overcome.

According to an interview with Stadia’s vice president of engineering, Majd Bakar, in the most recent issue of Edge magazine, Google is working on what it calls ‘negative latency,’ with an eye on offering a gaming experience that surpasses local hardware in both speed and responsiveness within a year or two.

Source: Google

Google Stadia ‘Negative Latency’

Despite the confusing terminology, which sounds as though Google has a hand in some sort of physics-defying sorcery, Bakar suggests that negative latency involves Google’s streaming tech using an AI-driven buffer to compensate for network latency. The key is detecting and mitigating latency-causing factors.

In practice, this could mean upping the FPS count to lower latency or, more interestingly, predicting what the player will do next and preparing for that eventuality in advance by rendering upcoming frames accordingly. This means Stadia will anticipate player input (character movement, button presses, jumps, attacks, camera positioning, etc.) and push out several of the most likely ‘predictions.’
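To make the idea concrete, here is a minimal, hypothetical sketch of speculative rendering in Python. None of the names or structures below come from Google; `simulate`, `render`, and `SpeculativeRenderer` are illustrative stand-ins. The point is simply that the server pre-renders frames for the handful of inputs a player is most likely to send next and falls back to a normal render when the guess misses.

```python
from collections import Counter

def simulate(state, player_input):
    # Stand-in for the game's per-tick state update.
    return state

def render(state):
    # Stand-in for rasterising and encoding a video frame for streaming.
    return b"frame-bytes"

class SpeculativeRenderer:
    """Pre-render frames for the inputs a player is most likely to send next."""

    def __init__(self, game_state, top_k=3):
        self.state = game_state
        self.top_k = top_k
        self.input_history = Counter()   # crude model: frequency of past inputs
        self.prerendered = {}            # predicted input -> finished frame

    def prerender_likely_frames(self):
        """Render ahead of time for the top-k most frequent recent inputs."""
        likely = [inp for inp, _ in self.input_history.most_common(self.top_k)]
        self.prerendered = {inp: render(simulate(self.state, inp)) for inp in likely}

    def on_input(self, actual_input):
        """Serve a pre-rendered frame on a hit; render normally on a miss."""
        self.input_history[actual_input] += 1
        frame = self.prerendered.get(actual_input)
        if frame is None:
            # Misprediction: the player pays the full round-trip latency.
            frame = render(simulate(self.state, actual_input))
        self.state = simulate(self.state, actual_input)
        self.prerendered.clear()
        return frame
```

A real system would model likelihood far more cleverly than a frequency count, but the structure (speculate, cache, correct on a miss) is the same.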

It does raise questions as to how the client and servers will respond when player input diverges entirely from the predictions, and whether in these cases players will experience a jarring barrage of latency jumps as the system corrects itself. Modern games carry a massive variety of input permutations, especially with non-linear devices like the mouse, so a disconnect between input and prediction is guaranteed at least some of the time.

That said, the technology isn’t all that new. Microsoft’s DeLorean system used a similar prediction technique, and programmer Tony Cannon’s GGPO (Good Game Peace Out) netcode rolled back game states based on a balancing act between prediction and remote player inputs. Both produced encouraging results.
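For reference, rollback can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration of the general technique rather than Cannon’s actual GGPO code: each frame is simulated immediately against a predicted remote input, and when the real input arrives and disagrees, the session rolls back to the last correct state and re-simulates with the true inputs.

```python
def simulate(state, local_input, remote_input):
    # Stand-in for the game's deterministic per-frame update.
    return (state[0] + local_input, state[1] + remote_input)

class RollbackSession:
    """Run the simulation optimistically and repair it when a prediction misses."""

    def __init__(self, initial_state):
        self.states = [initial_state]   # states[i] = state before frame i
        self.local_inputs = []
        self.remote_inputs = []         # predicted until confirmed
        self.confirmed = []             # has the real remote input arrived?

    def advance(self, local_input):
        """Simulate the next frame using a predicted remote input."""
        predicted = self.remote_inputs[-1] if self.remote_inputs else 0
        self.local_inputs.append(local_input)
        self.remote_inputs.append(predicted)
        self.confirmed.append(False)
        self.states.append(simulate(self.states[-1], local_input, predicted))
        return self.states[-1]

    def receive_remote(self, frame, remote_input):
        """Confirm the remote input for `frame`, rolling back on a misprediction."""
        mispredicted = self.remote_inputs[frame] != remote_input
        self.remote_inputs[frame] = remote_input
        self.confirmed[frame] = True
        if mispredicted:
            # Roll back: re-simulate every frame from the misprediction onward.
            for i in range(frame, len(self.local_inputs)):
                self.states[i + 1] = simulate(
                    self.states[i], self.local_inputs[i], self.remote_inputs[i]
                )
        return self.states[-1]
```

The correction is invisible when predictions hold and shows up as a small visual snap when they don’t, which is exactly the trade-off raised above for Stadia.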

Misplaced Hype or Reality?

With a vast arsenal of state-of-the-art data centers in tow, the idea of large-scale latency mitigation doesn’t seem all that far-fetched, especially with Google’s cutting-edge AI tech. The company unquestionably has the resources, but scaling this to hundreds of thousands, if not millions, of concurrent users with any degree of fidelity seems unlikely.

Google Stadia’s controller. | Source: Cody Engel/Shutterstock.com

The idea of negative latency also veers off a plausible path with claims that the tech will outshine the responsiveness of local devices. A single local console or PC is no match for a stacked cloud gaming data center in terms of raw power, but local hardware has the upper hand thanks to its proximity to the player and, importantly, to their inputs. Data doesn’t have to travel hundreds of miles and then wait for the server to complete its calculations before being beamed back, which is a definite advantage.

Without further details, or the ability to experience the system ourselves, it’s hard to render a firm verdict on Google’s latest outlandish, hype-inducing claim. If negative latency turns out to be everything it promises to be, then next month will be one to jot down in the history of video games. Short of that, Stadia’s long-term prospects of squaring up to Sony’s PlayStation 5 may take a significant hit.