Verizon scores in new cloud gaming report, but AT&T leads in latency

A new report from RootMetrics looks at the gaming experience for all four carriers in Los Angeles, both on 4G and 5G. (Pixabay)

Gaming, an oft-cited application for 5G, is getting some validation from a new report from RootMetrics. The firm found that carrier speeds in Los Angeles were fast enough for nearly all cloud gaming, with 5G delivering faster speeds than LTE for all four carriers.

Using the bare-minimum speed and latency requirements set by Google Stadia, Microsoft xCloud and Steam Remote Play, RootMetrics analyzed results from its most recent mobile performance testing in LA, conducted in the first half of 2020, to show which carriers in the city can deliver smooth mobile cloud gaming experiences for casual games in standard definition (SD) and multiplayer online games in high definition (HD).

The study found that speeds were generally fast enough for both casual and multiplayer online gaming on all four carriers in LA. Latency was a different story, however: none of the carriers met the 10-30 ms minimum latency requirement set by the game providers. AT&T came closest, clocking 44.0 ms on LTE and 45.5 ms on 5G.
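The report's pass/fail logic here is simple enough to express directly. The sketch below (Python, purely illustrative) checks the latencies cited in the article against the upper end of the game providers' 10-30 ms requirement; only the AT&T and T-Mobile figures appear in the article, so the dictionary is limited to those.

```python
# Illustrative check of the article's measured latencies against the
# game providers' 10-30 ms minimum requirement. Values in milliseconds;
# T-Mobile's "roughly 77 ms" applies to both LTE and 5G per the report.
LATENCY_CEILING_MS = 30.0  # upper end of the 10-30 ms requirement

measured_latency_ms = {
    "AT&T LTE": 44.0,
    "AT&T 5G": 45.5,
    "T-Mobile LTE": 77.0,
    "T-Mobile 5G": 77.0,
}

def meets_requirement(latency_ms: float, ceiling: float = LATENCY_CEILING_MS) -> bool:
    """A connection meets the bar only if round-trip latency is at or below the ceiling."""
    return latency_ms <= ceiling

for network, latency in measured_latency_ms.items():
    status = "meets" if meets_requirement(latency) else "misses"
    print(f"{network}: {latency} ms -> {status} the 10-30 ms requirement")
```

Run as written, every measured network misses the bar, which matches the report's conclusion that latency, not speed, is the bottleneck.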

Cloud gaming allows users to play games hosted on the cloud, rather than on a console or gaming PC, and all the processing, graphics and video rendering that traditionally takes place on consoles now takes place on the cloud. The team chose LA because it was one of two cities they have found so far where all carriers have live 5G, with Philadelphia being the other market.

Looking at 5G speeds specifically, Verizon once again demonstrated a shining performance, clocking a “remarkable” 5G median download speed of 254.7 Mbps, along with “near-perfect” packet loss and jitter results. “In short, Verizon’s mmWave 5G showed incredible potential, and Verizon’s 5G gamers shouldn’t see any issues from speed, packet loss, or jitter,” according to RootMetrics.

RELATED: ‘Real’ 5G relies on 5G NR, Standalone architecture: Special Report

Of course – cue Debbie Downer music here – Verizon’s mmWave 5G is greatly limited in availability, as the service provider is targeting dense urban areas and signals don’t travel as far in the higher band spectrum. Verizon has said it’s working on that – adding up to five times more small cells this year while it continues to work on rolling out dynamic spectrum sharing (DSS), which will enable it to provide much better 5G coverage using LTE spectrum.

In its report, RootMetrics said T-Mobile offered the most 5G in LA, with a 5G availability rate of 32.1%, and the carrier’s 5G led to faster speeds, “excellent packet loss,” and lower jitter. “While T-Mobile had the second lowest latency in LA (roughly 77ms on both LTE and 5G), lag will likely be an issue for most multiplayer games in HD,” the firm stated.

The good news for all the carriers is they’re moving from the non-standalone (NSA) version of the 5G standard to the standalone (SA) version, which will enable them to offer lower latency and higher speeds. Just yesterday, T-Mobile announced it will launch SA 5G later this year and ticked off a long list of partners: Cisco, Ericsson, MediaTek, Nokia, OnePlus and Qualcomm.

RELATED: T-Mobile touts multi-vendor milestones on path to 5G SA

According to RootMetrics, edge computing may be the best tool mobile carriers and game providers have for improving latency: it moves processing, video rendering and video encoding physically closer to the user, which in turn reduces lag. The firm also said it’s important to understand that most latency metrics, including RootMetrics’ LA results and those recommended by the game providers, only factor in the round-trip time it takes for user inputs to reach the cloud server and return to the user.

There’s also lag that happens between the cloud server and the actual game server, such as EA’s servers. “While that 'second layer' of latency is out of the control of both mobile carriers and cloud providers, it’s always there,” according to the mobile analytics firm. “That being said, if latency is reduced on the carrier side by 5G or edge computing, then overall lag will be reduced, even though that cloud-to-game server latency exists. Ideal lag between the cloud server and game server is less than 100ms.”
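The two latency layers described above add up, which is why carrier-side improvements still matter even though the cloud-to-game-server hop is out of the carriers' hands. A minimal sketch (Python, illustrative only; the 44.0 ms carrier figure is AT&T's LTE latency from the report, and 100 ms is the ideal second-layer lag RootMetrics cites):

```python
# End-to-end lag is the sum of two layers: the carrier-side round trip
# (user <-> cloud gaming server) plus the "second layer" between the
# cloud server and the game publisher's server (e.g. EA's servers).
SECOND_LAYER_IDEAL_MS = 100.0  # ideal cloud-to-game-server lag per RootMetrics

def total_lag_ms(carrier_rtt_ms: float, cloud_to_game_ms: float) -> float:
    """Total lag is the sum of both layers; reducing either reduces the total."""
    return carrier_rtt_ms + cloud_to_game_ms

# AT&T's measured LTE round trip plus an ideal second layer:
baseline = total_lag_ms(44.0, SECOND_LAYER_IDEAL_MS)   # 144.0 ms
# Shaving the carrier side (e.g. via 5G SA or edge computing) lowers the
# total even though the cloud-to-game-server latency is unchanged:
improved = total_lag_ms(20.0, SECOND_LAYER_IDEAL_MS)   # 120.0 ms
print(baseline, improved)
```

The point of the arithmetic: the second layer sets a floor, but every millisecond removed on the carrier side comes straight off the total.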

Standalone, edge and cloud native core

Suzanth Subramaniyan, director of Mobile Networks at RootMetrics, said via email that SA will help with over-the-air latency (the portion of latency due to the time taken to communicate with the serving tower). “With NSA, this is typically in the order of tens of ms (best case) and hundreds (worst case); with stand-alone, in contrast, we could be seeing <10 ms (best case) and <100 ms (worst case),” he told Fierce.

However, there are also opportunities for improvement from the cloud native core, which enables virtualization and network slicing. With these, operators can prioritize network resources for gaming services. Edge computing improvements are already in progress even in NSA mode, and RootMetrics expects to see their impact this year.

In sum, standalone will bring meaningful improvements to latency, with the biggest impact coming from combining SA, edge computing and the cloud native core, according to Subramaniyan.

Story updated with additional commentary on impact of standalone 5G.