Marek’s Take: The five biggest wireless tech developments of the last decade

Carrier aggregation is one of many clever techniques that the telecom industry is using to squeeze more capacity from existing spectrum. (Pixabay)

In early 2010, LTE was just getting started, network congestion was becoming a big issue, and software-defined networking (SDN) was a collection of protocols without a concrete mission.

Flash forward 10 years and LTE is in the rear-view mirror as operators turn their attention to their 5G deployments. Offloading data to Wi-Fi networks is a common occurrence, machine-to-machine (M2M) communications is an old-school term that has largely been replaced by the Internet of Things, and SDN is a key part of the 5G future.

I decided to look back and review what I believe are the five most important technologies of the decade that changed the way wireless networks work.

1. LTE put an end to the CDMA and GSM battle

Verizon launched its nationwide LTE network in 2010, and that launch established the U.S. as a wireless network technology leader. Prior to that, the U.S. had always been a fast follower: Europe was the leader in GSM technology, and Asia paved the way with CDMA technology. Verizon's LTE launch also moved the operator off the CDMA network evolution path and put pressure on rival AT&T (a GSM operator at the time) to step up its network technology game and make the move to LTE.

2. Wi-Fi offloading helped operators manage the explosion in data traffic  

Network congestion was a big issue during the past decade, and will continue to be as users seem to have an insatiable demand for video viewing. According to Ericsson’s November 2019 Mobility Report, video traffic on mobile networks accounts for 60% of all mobile data traffic, and it is expected to grow to 75% of all data traffic by 2025. Operators had to learn various data management techniques so they could handle users’ data demands. One of those techniques was offloading traffic to Wi-Fi networks for extra capacity. AT&T even purchased a Wi-Fi company called Wayport in 2008 as a way to incorporate its hotspots into AT&T’s footprint. Today, Wi-Fi firm Boingo says it has offloading agreements with several major U.S. operators.  

3. Internet of Things became a significant part of operators’ business models

Although the term "Internet of Things" was actually coined in 1999, it really gained traction in the past decade as a more up-to-date term for what the industry used to refer to as M2M (machine-to-machine) communications. During the past decade operators began to take a serious look at IoT as a viable business with long-term potential.

Initially operators used their legacy 2G and 3G networks for their IoT traffic, but in 2017 AT&T and Verizon deployed LTE-M networks to handle a big portion of their IoT traffic.

Then in 2019, T-Mobile launched the first nationwide narrowband IoT (NB-IoT) network, which can operate alongside the operator's existing LTE network without interfering with other traffic. NB-IoT offers slower download speeds than LTE-M (in the range of 100 to 250 Kbps), but it enables battery life of up to 10 years for NB-IoT devices. Verizon and AT&T followed a short time later with their own NB-IoT networks. Sprint so far has only launched an LTE-M network.

According to Ericsson’s November 2019 Mobility report, there are about 1.3 billion cellular IoT connections globally this year, but that number is expected to grow to 5 billion by 2025.

4. Small cells improved network coverage and added capacity

Small cells started gaining traction at the beginning of the decade as a way for network operators to improve their coverage and add targeted capacity. Small cells are not a replacement for the macro network; instead, they are a complementary technology that can be deployed indoors or outdoors depending upon the use case. Small cells can be deployed using both licensed and unlicensed spectrum, and they come in different sizes, form factors, and power levels.

In October 2014, the FCC approved rules designed to accelerate the deployment of small cells. The rules made it easier to put equipment on not just buildings and cell towers but also utility poles.

In 2018, the Small Cell Forum forecast that 400,000 small cells would be deployed in North America that year, and the organization estimated that by 2020 enterprises would deploy a total of 552,000 small cells in North America.

5. Carrier aggregation allowed operators to combine spectrum

Carrier aggregation is a complicated concept, but the benefits are easy to understand. Basically, it is a technique in which an operator combines multiple frequency blocks of spectrum (called component carriers), and assigns them to the same user as a way to increase the data rate.  
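The arithmetic behind that benefit can be sketched simply: if each component carrier's peak rate is roughly its bandwidth times a spectral efficiency, aggregating carriers assigned to one user adds those rates together. The bandwidths and the 3 bits/s/Hz efficiency below are assumed example values for illustration, not figures from any specific network.

```python
def aggregate_peak_rate_mbps(carrier_bandwidths_mhz, bits_per_hz=3.0):
    """Approximate the combined peak rate from carrier aggregation.

    Each component carrier contributes roughly bandwidth (MHz) times
    spectral efficiency (bits/s/Hz); since all carriers are assigned
    to the same user, their individual rates simply add up.
    """
    return sum(bw * bits_per_hz for bw in carrier_bandwidths_mhz)

# A single 20 MHz LTE carrier at an assumed 3 bits/s/Hz:
print(aggregate_peak_rate_mbps([20]))          # 60.0 Mbps
# Three aggregated component carriers (20 + 10 + 5 MHz):
print(aggregate_peak_rate_mbps([20, 10, 5]))   # 105.0 Mbps
```

Real-world peak rates depend on modulation, MIMO configuration, and overhead, but the additive effect of combining component carriers is the core idea.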

Carrier aggregation is a key feature of LTE-Advanced, and operators started deploying it in 2014 and 2015 as a way to provide more bandwidth to users. Carrier aggregation is considered a good way to manage spectrum resources while also increasing network capacity, delivering higher peak data speeds, and improving load balancing on wireless networks.

So far carrier aggregation hasn't made its way to 5G networks. However, it is one of many clever techniques that the telecom industry is using to squeeze more capacity from existing spectrum. Dynamic spectrum sharing (DSS) is another interesting concept. DSS is part of 3GPP Release 15, and it allows operators to dynamically allocate some of their existing 4G spectrum to 5G and use existing radios (as long as they are 5G NR-capable) to deliver 5G services.

DSS is a bit premature for this list: it is expected to be deployed in 2020, but so far operators have only conducted trials of the technology. However, DSS may very well be on my 2030 list of top wireless technologies of the past decade. Happy New Year! — Sue