In this six-part series on 5G networks and digital innovation, I am sharing some of my observations and suggestions on how the wireless ecosystem can come together to address the unprecedented set of opportunities before us in the next decade. In the first piece, I laid out the basic thesis of why 5G is different and how operators should think about strategy and network economics. In the second part, we looked at the cost element and the role software will play in managing costs and expanding EBITDA. In the third piece, we explored the revenue equation. In this one, I will look at the measurements needed to give us a better picture of the 5G evolution.
First, let’s look at network performance data. Historically, we have focused on “average network speeds” as a measure of how networks are performing. Unfortunately, such metrics give us an incomplete picture of the true state of the network in any given city or region. 70% of the traffic is shouldered by only 35% of the sites, and congestion patterns during commute hours narrow the burden even further. So, if the average of the datasets is taken without any weighting, it leads to an inaccurate assessment. For example, if one smartphone is experiencing 45 Mbps on a cell site with no other users, and another is seeing 1 Mbps due to congestion, a simple average reports 23 Mbps. Unless the readings from congested sites and congested time periods are weighted accordingly, the average data reading will be misleading.
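The arithmetic above can be sketched in a few lines. This is illustrative only: the readings and the use of active-user counts as weights are hypothetical stand-ins for real load data, not any operator's actual methodology.

```python
# Hypothetical per-cell speed readings (Mbps). An unweighted average hides
# congestion; weighting each reading by the number of affected users exposes it.
readings = [
    {"speed_mbps": 45.0, "active_users": 1},   # lightly loaded cell
    {"speed_mbps": 1.0,  "active_users": 40},  # congested cell
]

# Simple (unweighted) average -- the misleading "23 Mbps" view.
simple_avg = sum(r["speed_mbps"] for r in readings) / len(readings)

# Load-weighted average: each reading counts once per affected user.
total_users = sum(r["active_users"] for r in readings)
weighted_avg = sum(r["speed_mbps"] * r["active_users"] for r in readings) / total_users

print(f"unweighted: {simple_avg:.1f} Mbps")   # 23.0 Mbps
print(f"weighted:   {weighted_avg:.1f} Mbps") # ~2.1 Mbps
```

The weighted figure (about 2 Mbps) is far closer to what the typical user on this pair of cells actually experiences than the 23 Mbps headline number.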
I hope we get away from average speeds as the defining metric for network competition. Indeed, at our annual senior executive summit, Mobile Future Forward, John Saw, CTO of Sprint, in conversation with Ludwig Siegele, U.S. technology editor of The Economist, suggested as much. He said, “The success of 5G is not going to be measured by a speed test.” In fact, for a number of applications, peak or average speeds are meaningless; what matters is the quality of the network session, with speeds that don’t fluctuate for its duration, whether that session is a cloud gaming or VR stream, an autonomous vehicle, or a manufacturing environment. For latency-sensitive applications and services, keeping jitter within acceptable limits is extremely important. This is true for services in both the consumer and enterprise domains.
As such, a better measure for the industry to aspire to is consistency – consistency in user experience for customers across sites and time periods throughout the day. Right now, there is a wide gulf between the average user experience and the readings during congestion times at congested sites. The figure below illustrates the issue in more detail. If we go by the average speed data, network speeds look great. However, if we collect data directly from the RAN in the same market, which yields an enormously larger dataset, we see a wide variation in network performance, to the point that during congested times on congested sites the network slows to a virtual crawl. To really measure network performance, the industry will benefit from network consistency metrics.
Furthermore, this data should be overlaid with network coverage maps to really understand the gaps and upgrade requirements. This will keep us grounded and closer to reality. We are entering an era where consistency of network performance will be the cornerstone of user experience, SLAs, and competition. Average speeds are a vestige of the 3G era that needs to be retired, and newer, more modern metrics should be adopted by the industry and the regulators. A peak speed of 1 Gbps and an average speed of 500 Mbps are of little use to a new generation of applications and services if the standard deviation of speeds is high.
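A minimal sketch of what a consistency metric could look like, using Python's standard library. The throughput samples, the coefficient-of-variation score, and the percentile cut are all illustrative assumptions, not a standardized industry metric.

```python
import statistics

# Hypothetical per-sample throughput readings (Mbps) collected across
# sites and hours of the day; the dips represent congested periods.
samples = [500, 480, 510, 490, 30, 20, 495, 505, 25, 500]

mean_speed = statistics.mean(samples)
stdev_speed = statistics.pstdev(samples)  # population standard deviation

# Coefficient of variation: a lower value means a more consistent experience.
cov = stdev_speed / mean_speed

# A low-percentile speed (here, roughly the 5th percentile) approximates the
# worst-case experience: a user at a congested site during a congested hour.
p5 = sorted(samples)[int(0.05 * len(samples))]

print(f"mean: {mean_speed:.0f} Mbps, stdev: {stdev_speed:.0f} Mbps")
print(f"coefficient of variation: {cov:.2f}")
print(f"low-percentile speed: {p5} Mbps")
```

Note how a network averaging over 350 Mbps can still deliver only 20 Mbps to its worst-served sessions; it is exactly this spread that averages conceal and a consistency metric surfaces.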
The adoption of new metrics means that we can’t manage congestion and network traffic the old-fashioned way either. Operators can’t plan their capex investments based on outdated parameters. We need to adopt more modern software techniques to manage such congestion. Applying AI and ML to real-time network loads to manage the traffic will be of utmost importance.
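As a toy illustration of the idea, not any real operator system, the sketch below forecasts each cell's utilization with an exponentially weighted moving average and flags cells for traffic steering before they saturate. The cell names, samples, and 80% threshold are all invented for the example.

```python
def ewma_forecast(loads, alpha=0.5):
    """One-step-ahead load forecast via an exponentially weighted moving average."""
    forecast = loads[0]
    for load in loads[1:]:
        forecast = alpha * load + (1 - alpha) * forecast
    return forecast

def cells_to_offload(cell_loads, threshold=0.8, alpha=0.5):
    """Return IDs of cells whose forecast utilization exceeds the threshold."""
    return [
        cell_id
        for cell_id, loads in cell_loads.items()
        if ewma_forecast(loads, alpha) > threshold
    ]

# Hypothetical recent utilization samples (fraction of capacity) per cell.
recent = {
    "cell-A": [0.55, 0.70, 0.85, 0.90],  # trending toward congestion
    "cell-B": [0.40, 0.35, 0.45, 0.40],  # steady, lightly loaded
}

print(cells_to_offload(recent))  # ['cell-A']
```

A production system would use far richer models than a moving average, but the principle is the same: act on the predicted load, not the average, and steer traffic before users hit the wall.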
As we alluded to in the previous columns, software and sensors will define the 5G cycle. While a vibrant access layer is critical to the 5G economy, the 4th Wave services will rule the day. We are already seeing signs of services that will enable a new class of applications for both consumers and enterprises. These services will demand new ways to measure network performance; average speeds across the city just won’t cut it. Enterprises will work with operators who can provide a high degree of reliability for their applications. They will care less about how the network is performing nationwide and for the consumer applications.
Operators who design their networks and internal metrics to address this imminent market demand will be rewarded. Capex planning should reflect this new reality: operators should optimize their networks for consistency and reliability, not average and peak speeds. Regulatory tools should be updated to sync up with the emerging trends. Software-enabled networks that predict, adjust, and manage the flow of traffic will be the most cost-efficient. If operators optimize their networks for the wrong metrics, it will hurt EBITDA. Hopefully, the industry will come together to focus on the right network and financial metrics as we step into the next decade of hyper-growth.
Chetan Sharma is CEO of Chetan Sharma Consulting, an 18-year-young management consulting firm, and is an advisor to CXOs and boards of companies in the wireless industry. Over his 25 years in the industry, he has worked with operators on all five continents and has the rare distinction of advising each of the top 9 global mobile operators. Chetan has written 15 books on various wireless topics, and his research work has helped shape many strategic decisions and dialogues in the industry. He is the curator of the industry’s premier brainstorming summit, Mobile Future Forward. More information at www.chetansharma.com. You can follow his musings at @chetansharma.
"Industry Voices" are opinion columns written by outside contributors--often industry experts or analysts--who are invited to the conversation by Fierce staff. They do not represent the opinions of Fierce.