For more than a year, a group of engineers at Verizon has been quietly filing patents and working on a big 5G problem: how to orchestrate and load-balance data over a 5G network onto a GPU or GPU cluster sitting at the network edge.
Why? It started about 24 months ago when the team at Envrmnt, a branded organization within Verizon, was asked to figure out how its expertise in artificial intelligence (AI), virtual reality (VR) and other areas intersects with 5G. Machine learning, AI, augmented, virtual and mixed reality (AR/VR/MR) and other futuristic technologies depend heavily on the GPU, rather than the CPU, for compute.
T.J. Vitolo, the head of Verizon’s XR Lab, and his team set out to combine the speed and low latency of 5G with GPU compute in the cloud, at the network edge. What they quickly realized is that GPUs, from an architecture standpoint, fundamentally don’t work the way CPUs do: you can’t have hundreds of thousands of users accessing a single GPU for their tasks. Only one or two users, or at least their requests, can hit a given GPU at the same time.
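That session limit is what makes the load-balancing problem different from ordinary cloud scheduling. The sketch below is a toy illustration, not Verizon's design: a hypothetical dispatcher that routes each user's render requests to the least-loaded GPU in an edge pool, where every GPU accepts only a couple of concurrent sessions.

```python
class EdgeGpuPool:
    """Toy dispatcher: each GPU serves at most `max_sessions` users at once,
    so traffic must be spread across a pool rather than shared CPU-style.
    Names and policy here are illustrative assumptions, not Verizon's system."""

    def __init__(self, num_gpus, max_sessions=2):
        # Remaining free session slots per GPU.
        self.capacity = {g: max_sessions for g in range(num_gpus)}

    def assign(self, user_id):
        # Pick the GPU with the most free slots; None means the pool is
        # exhausted and the caller must queue the user or scale out.
        free = [g for g, c in self.capacity.items() if c > 0]
        if not free:
            return None
        gpu = max(free, key=lambda g: self.capacity[g])
        self.capacity[gpu] -= 1
        return gpu


pool = EdgeGpuPool(num_gpus=2, max_sessions=2)
assignments = [pool.assign(u) for u in range(5)]
print(assignments)  # four slots total, so the fifth request is refused
```

With two GPUs at two sessions each, the first four users land on alternating GPUs and the fifth gets `None`, which is exactly the capacity wall the team had to design around.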
The typical mobile device today has an embedded GPU, and that chip accounts for much of the device’s power draw. By removing that one piece of hardware, Verizon can take bulk out of a device, whether it’s a pair of glasses or a smartphone, and move the GPU work up to the cloud, so to speak.
Asked if it’s Verizon’s own cloud where the GPU sits, Vitolo said they’re still testing to determine architecturally where everything will end up, but the test they did exists in the Verizon edge network.
So what are they going to do with this new technology? That plays into one of the big questions on everyone’s minds: How are operators going to make money from 5G? Basically, Verizon is trying to jump-start the developer ecosystem, much the same way it did with 4G. The team created eight services for developers to use when creating their own applications and solutions on Verizon’s 5G edge technology.
One of them involves real-time ray tracing. It has to do with the idea of recreating real-world reflections, shadows and light within an environment to make it look as realistic as possible, according to Vitolo. Light bounces off everything, and programmers have spent years trying to make objects look more real, not only in how light is cast but also in how it reflects off a person’s body, or how it bounces off a wall and onto an object.
Graphics processor manufacturers can render these scenarios in real time, as you can see in video games, but that capability isn’t available on mobile.
With this capability, Verizon can deliver that high-fidelity graphics experience to a mobile device, the kind you typically only get on a high-end computer or high-end game console, he explained.
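The bounce Vitolo describes has a compact core: when a ray strikes a surface, its direction is mirrored about the surface normal. A minimal sketch of that one step, with illustrative vectors of my own choosing, looks like this:

```python
def reflect(direction, normal):
    # Mirror a ray direction about a unit surface normal:
    #   r = d - 2 (d . n) n
    # This is the per-ray bounce at the heart of a ray tracer; a real
    # renderer fires millions of these per frame, which is the GPU's job.
    dot = sum(di * ni for di, ni in zip(direction, normal))
    return tuple(di - 2 * dot * ni for di, ni in zip(direction, normal))


# A ray heading down and to the right hits a floor whose normal points up:
bounced = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))
print(bounced)  # (1.0, 1.0, 0.0): same rightward motion, now heading up
```

Repeating this bounce recursively, once per reflection, shadow test and light sample, is what makes the technique so compute-hungry and so well suited to an edge GPU.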
Another service for developers is related to spatial audio. An example would be taking the kind of sound you hear in a movie theater and applying it to a mobile phone. You can only do that, though, if the GPU is on the edge, making mobile games that much more immersive.
Verizon’s commercial 5G node will be powering the demonstrations at the convention center during Mobile World Congress Los Angeles, where developers will get a chance to see some of these capabilities in action.