DALLAS—A top T-Mobile executive said that edge computing could well bloom into a major opportunity, but not anytime soon.
“Edge compute is early days from my perspective,” said T-Mobile’s Dave Mayo, SVP of the carrier’s 5G and IoT businesses. Mayo spoke here at the FierceWireless Next Gen Wireless Networks Summit. “It’s not as imminent as maybe the hype cycle will suggest.”
Mayo first explained that latency on LTE networks is already as low as 20 ms, and he said that figure could decline to 5-10 ms on 5G networks. He said those latencies will be suitable for most applications in the near term. (Latency is the time it takes data to traverse a network, such as the pause between saying "hey Alexa" and hearing "yes?")
"The need to broadly deploy edge compute diminishes" as those latencies decline, he said.
However, Mayo said that some select applications could make use of edge computing capabilities in the next few years. Specifically, he pointed to oil and gas operations, where providers might need local computing capabilities in order to keep all their data within a specific geographic location.
“That’s a great use case where edge compute is absolutely required,” he noted.
Mayo also pointed to other potential edge computing use cases, such as manufacturing and medical applications that could rely on very low latencies. He added that T-Mobile will continue to look for opportunities to deploy edge computing operations.
“We’ll go looking for those opportunities,” he said.
Indeed, Mayo noted that T-Mobile operates 60,000 locations where edge computing could be deployed—likely a reference to the number of cell towers T-Mobile operates around the country.
Interestingly, Mayo also said T-Mobile wouldn’t build its own cloud in order to deploy edge computing, and will instead work with existing cloud providers.
Despite Mayo’s comments, some industry players continue to talk up the edge computing opportunity. After all, people are using increasingly complex data services like Siri, Alexa and virtual reality. An edge computing design—where queries are processed in a data center geographically closer to an end user—can make those communications go faster by lowering the network latency.
This would represent a major change from most of today’s computing designs, where queries are sent hundreds or thousands of miles away to be processed in a data center.
Nonetheless, Mayo said that, as 5G gets built out on a wide scale, developers and others will likely create services and business models that make use of edge computing. For example, he pointed to drones that could carry edge computing capabilities on board, allowing them to analyze video in real time rather than streaming that video elsewhere to be analyzed.
“It will come to play over time,” he said. “I think it will happen. It’s a matter of what pace and rate.”
He added that “we just can’t even contemplate some of the things that will be required” in the future, and that edge computing demand will be “application specific.”
But Mayo cautioned that it could take decades for the edge computing opportunity to fully develop—he mentioned 2035 as a possible timeframe for when edge computing might turn into a major opportunity.
“It will be a fun future,” Mayo said.
Mayo isn't the only top wireless executive to throw cold water on the edge computing hype.
“There’s a lot of excitement about it [edge computing] … It’s talked about like crazy,” acknowledged Alex Gellman, CEO of Vertical Bridge, the largest private owner and operator of wireless communications infrastructure in the United States. But, he added, “I’m not sure that edge data centers are coming.”