Much has been said and written of late about congestion in mobile data networks, a subject brought to the fore by the introduction of the iPhone and its subsequent clones. Indeed, the problem has precipitated a whole new sub-section of the OSS/BSS industry devoted specifically to identifying and controlling wireless broadband data traffic.
There is potentially an equally serious problem, however, in the form of congestion on the signaling channel caused by ‘chatty’ applications and the signaling requirements of increasingly complex services running on smartphones.
The potential problem of signaling congestion has not been ignored completely - far from it - but there is a question over whether it is something that can be accurately modeled in theory, or whether we will simply have to wait and see what happens when smartphones begin running over LTE networks.
There are a number of potential pinch points in both the RAN and the core network, and signaling traffic management techniques addressing these areas will vary accordingly. However, much of the extra signaling traffic will be created by applications which, though they may generate relatively low volumes of actual data traffic, carry a very high signaling overhead.
Some of these applications may overload the signaling channel, causing dropped connections, long before the bearer network itself is threatened. This is known to be a problem on existing mobile networks, but dealing with it on LTE networks remains a relatively unexplored area.
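To see why a low-volume application can still be a heavy signaling consumer, consider a simple back-of-the-envelope model. The sketch below is illustrative only, with hypothetical parameter values (keep-alive interval, inactivity timer, messages per transition), not measurements from any operator's network: when an app's keep-alive interval exceeds the network's radio inactivity timer, the connection is torn down between keep-alives, so every keep-alive forces a fresh idle-to-connected transition, each of which costs multiple signaling messages across the RAN and core.

```python
# Illustrative model of "chatty" application signaling load.
# All parameter values are hypothetical assumptions for the sketch.

def signaling_msgs_per_hour(keepalive_s: float,
                            inactivity_timer_s: float,
                            msgs_per_transition: int = 10) -> int:
    """Estimate signaling messages per hour caused by periodic keep-alives.

    If the keep-alive interval is longer than the network's inactivity
    timer, the radio connection is released between keep-alives, so each
    keep-alive triggers a full idle-to-connected transition (assumed here
    to cost `msgs_per_transition` signaling messages).
    """
    if keepalive_s <= inactivity_timer_s:
        # Connection never goes idle between keep-alives: no extra
        # transitions (though holding the bearer up has its own costs).
        return 0
    transitions_per_hour = 3600 / keepalive_s
    return int(transitions_per_hour * msgs_per_transition)

# A 60 s keep-alive against a 10 s inactivity timer forces 60 transitions
# per hour - roughly 600 signaling messages to carry perhaps a few hundred
# bytes of actual payload.
print(signaling_msgs_per_hour(keepalive_s=60, inactivity_timer_s=10))  # 600
print(signaling_msgs_per_hour(keepalive_s=5, inactivity_timer_s=10))   # 0
```

Crude as it is, the model captures the core imbalance: the signaling cost scales with how often the connection is re-established, not with how much data is sent.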
Although the flat-IP core networks of 3GPP Release 8 (LTE) and subsequent releases are expected to reduce signaling overload, mobile operators are still unfamiliar with the effects of smartphones on these networks. Moreover, LTE networks are being launched with smartphones from day one, in contrast to earlier networks, where smartphones arrived much later than feature phones. LTE networks will therefore experience heavier signaling traffic from the outset, which may have adverse effects on network operation.