Much has been said and written of late about congestion in mobile data networks, a subject brought to the fore by the introduction of the iPhone and its subsequent clones. Indeed, the problem has precipitated a whole new sub-section of the OSS/BSS industry devoted specifically to identifying and controlling wireless broadband data traffic.
There is potentially an equally serious problem, however, in the form of congestion on the signaling channel caused by ‘chatty’ applications and the signaling requirements of increasingly complex services running on smartphones.
The potential problem of signaling congestion has not been ignored completely, far from it, but there is a question over whether it can be accurately modeled theoretically, or whether we will simply have to wait and see what happens when smartphones begin running over LTE networks.
There are a number of potential pinch points in both the RAN and the core network, and signaling traffic management techniques addressing these areas will vary accordingly. However, much of the extra signaling traffic will be created by applications which, though they may generate relatively low volumes of actual data traffic, have a very high signaling overhead.
Some of these applications may overload the signaling channel, and so drop the connection, long before the bearer network is threatened. The problem is familiar on existing mobile networks, but dealing with it on LTE networks is still a relatively unexplored area.
Although the flat-IP core networks of 3GPP Release 8 (LTE) and subsequent releases are expected to reduce signaling overload, mobile operators are still unfamiliar with the effects of smartphones on these networks. Moreover, LTE networks are launching with smartphones from day one, in contrast to earlier networks, where smartphones arrived long after feature phones. That change means LTE networks will experience heavier signaling traffic, which may have adverse effects on network operation.
Typical examples include smartphone applications such as games, which carry large volumes of advertisements, and social networking, in which users typically remain connected to multiple platforms for long periods of time. Indeed, social networking sites such as Facebook are increasingly featuring online games such as FarmVille and various role-play games, all of which ramp up the signaling requirement.
In addition, signaling growth is triggered by attempts to reduce battery drain for ‘always on’ devices. Again, while this has been tackled on existing networks by defining different ‘states’ for end-user devices that minimize battery consumption and signaling, the lack of available devices means this is still a relatively uncharted area in the LTE environment.
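The tension between battery saving and signaling load can be sketched with a toy model. The message counts and timer values below are purely illustrative assumptions, not 3GPP figures: the point is only that a device which is released to an idle state after inactivity pays a connection-setup signaling cost every time a keepalive wakes it up.

```python
# Toy model of how a 'chatty' keepalive interacts with connection states.
# All figures are hypothetical, chosen to show the shape of the problem.

IDLE_TO_CONNECTED_MSGS = 30   # assumed signaling cost of promoting idle -> connected
CONNECTED_TO_IDLE_MSGS = 5    # assumed cost of releasing the connection
INACTIVITY_TIMER_S = 10       # network releases the device after this idle gap

def signaling_per_hour(keepalive_interval_s: int) -> int:
    """Count signaling messages over one hour for a given keepalive period."""
    if keepalive_interval_s <= INACTIVITY_TIMER_S:
        # Keepalives arrive before the inactivity timer fires, so the
        # device never drops to idle: one promotion, then it stays connected.
        return IDLE_TO_CONNECTED_MSGS
    # Otherwise every keepalive finds the device idle and triggers a
    # full promote/release cycle.
    cycles = 3600 // keepalive_interval_s
    return cycles * (IDLE_TO_CONNECTED_MSGS + CONNECTED_TO_IDLE_MSGS)

for interval in (5, 30, 60, 300):
    print(interval, signaling_per_hour(interval))
```

Under these assumptions a 5-second keepalive generates almost no signaling but keeps the radio up continuously (draining the battery), while a 30-second keepalive, chosen precisely to save battery, produces thousands of signaling messages per hour per device. Multiplied across a cell of smartphones, that is the congestion scenario the article describes.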
The lack of paid applications on Android has forced developers to use in-app advertising and, in the case of games, different advertisements may be displayed for different levels, or when the same level is replayed. The practice generates a significant number of additional signaling messages for advertisement downloads, especially when compared with paid applications or games that carry no advertisements.
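A back-of-envelope comparison makes the ad-supported penalty concrete. The figures below are hypothetical assumptions, not measurements: the sketch supposes that each ad fetch after an idle gap re-establishes the radio connection, so it incurs connection-setup signaling on top of the download itself.

```python
# Hypothetical comparison of signaling generated by an ad-supported game
# versus a paid, ad-free build. All constants are illustrative assumptions.

SETUP_MSGS = 30            # assumed signaling messages per connection setup
AD_FETCHES_PER_LEVEL = 2   # assumed ads fetched per level (and per replay)

def session_setups(levels_played: int, ad_supported: bool) -> int:
    """Signaling messages spent on connection setups in one play session."""
    if not ad_supported:
        # The paid build needs at most one setup for the whole session.
        return SETUP_MSGS
    # Each ad fetch arrives after an idle gap and triggers a fresh setup.
    return levels_played * AD_FETCHES_PER_LEVEL * SETUP_MSGS

print(session_setups(10, ad_supported=False))  # 30
print(session_setups(10, ad_supported=True))   # 600
```

Even under these modest assumptions, the ad-supported session generates twenty times the connection-setup signaling of the paid one, while the actual data volume of the ads remains trivial, which is exactly the high signaling-to-data ratio discussed above.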
To some extent it is the responsibility of developers to ensure that their apps offer a positive customer experience that is not achieved at the expense of heavy battery drain and high signaling overheads. It is also in operators' own interest to encourage the development of such applications.
However, OSS/BSS vendors also have to factor an incremental increase in signaling traffic into the design of next-generation platforms, particularly with regard to real-time functionality and dynamic network optimization. Significant additions to the growing signaling overhead will only degrade the quality of service seen by the end user, which would negate one of the main reasons for implementing LTE.
Peter Dykes is a senior analyst on Informa Telecoms & Media’s Networks Intelligence Centre