
    LTE networks – When best effort is not good enough

    Is a ‘best effort’ mobile service good enough? Jonathan Borrill describes the challenges of testing a live LTE network

    To build a high-performance LTE network, operators will need to learn the new constraints on network planning and optimisation that LTE introduces. Since LTE uses all-IP transport and a new radio technology, simply carrying over the network-testing techniques that operators have used in 3G networks may not be enough.

    The introduction of LTE networks marks the end of the transition of mobile wireless services from a traditional telephony-style circuit-switched network to all-IP transport. This will allow operators to deliver diverse broadband services at competitive prices, because of the efficient way in which IP networks use bandwidth.

    This transition has progressed through technologies such as High-Speed Packet Access (HSPA) on the current 3G platform, which combines circuit-switched voice with packet-based data. As an entirely packet-based technology, however, LTE will improve the performance of mobile data services at the same time as it allows operators to package mobile services in new ways to make them suitable, from a cost and performance point of view, to a wider range of businesses and consumers than is possible today with 3G.

    Although referred to as an evolution, then, LTE represents a significant step forward. This is not only because all traffic, including time-sensitive traffic, is packet-based, but also because LTE replaces the established W-CDMA radio technology with a new radio-access technology based on Orthogonal Frequency Division Multiple Access (OFDMA).

    Compared to W-CDMA, OFDMA will allow LTE to support data rates of up to 300Mbps over the air, as well as lower delay and latency for data packets. In combination with the simplified core network that entirely packet-based traffic makes possible, OFDMA is central to meeting the LTE objective of delivering more bandwidth at lower cost, by providing better spectral efficiency.
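
    As a sanity check on that headline figure, the back-of-envelope Python sketch below multiplies out the main physical-layer parameters of a 20MHz, 4x4 MIMO, 64QAM downlink; the 25 per cent overhead figure is an assumption chosen for illustration, not a number taken from the 3GPP specifications.

        # Back-of-envelope LTE downlink peak-rate estimate (illustrative figures).
        subcarriers   = 100 * 12   # 20 MHz channel: 100 resource blocks x 12 subcarriers
        symbols_per_s = 14 * 1000  # 14 OFDM symbols per 1 ms subframe (normal cyclic prefix)
        bits_per_sym  = 6          # 64QAM carries 6 bits per symbol
        mimo_layers   = 4          # 4x4 spatial multiplexing

        raw_rate = subcarriers * symbols_per_s * bits_per_sym * mimo_layers  # bits/s
        overhead = 0.25            # assumed share lost to reference signals, control, coding

        print(f"raw physical rate: {raw_rate / 1e6:.0f} Mbps")
        print(f"usable peak rate : {raw_rate * (1 - overhead) / 1e6:.0f} Mbps")  # ~300 Mbps
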
    Network planners, however, must deal with two major challenges if they are to implement LTE successfully. They must understand how to optimise the network to meet Quality of Service (QoS) goals, and they must establish new models for the location and configuration of basestations in order to predict accurately the coverage they will achieve with the new OFDM radio infrastructure. Effective optimisation and network modelling will both depend on reliable data obtained from accurate field tests of live LTE networks.

    Network Planning
    LTE networks will give operators extra flexibility to optimise for coverage, call quality and data speed. This is due to the nature of an all packet-switched network. In traditional telephony, the beginning of a call opens a circuit between the participants. The participants then have all the bandwidth on that circuit available to them for the duration of the call. If they need less bandwidth, the operator cannot share the bandwidth with other users. If they need more, they cannot have it.

    In LTE, by contrast, the network simply streams a succession of packets (of voice and of data) from and to the user, by whichever routes the operator's algorithms decide are most appropriate. This gives the operator the freedom to prioritise different types of packets according to Quality of Service demands. Packets belonging to higher-value services can be prioritised over those of lower-value services; streamed video packets can be prioritised over e-mail packets; and so on. But IP also imposes a ‘best effort’ constraint on packet delivery – in other words, the network will do its best to get packets to their destination on time, but it cannot guarantee their arrival schedule. By contrast, voice signals in a circuit-switched 3G call are guaranteed to reach their destination with minimum time variation (jitter).
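
    To make the prioritisation idea concrete, here is a minimal Python sketch of a strict-priority packet scheduler; the service classes and their rankings are hypothetical examples, not values from any LTE specification.

        import heapq
        import itertools

        # Lower number = higher priority; the classes are hypothetical examples.
        PRIORITY = {"voice": 0, "video": 1, "web": 2, "email": 3}

        class PacketScheduler:
            """Strict-priority queue: always transmits the highest-priority packet first."""
            def __init__(self):
                self._queue = []
                self._seq = itertools.count()  # tie-breaker keeps FIFO order per class

            def enqueue(self, service_class, payload):
                heapq.heappush(self._queue,
                               (PRIORITY[service_class], next(self._seq), payload))

            def dequeue(self):
                _, _, payload = heapq.heappop(self._queue)
                return payload

        sched = PacketScheduler()
        sched.enqueue("email", "e1")
        sched.enqueue("voice", "v1")
        print(sched.dequeue())  # "v1": the voice packet jumps the earlier email packet

    A real scheduler would also have to guard against starving the lowest classes, which is exactly the kind of policy decision the operator's QoS model has to settle.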

    With so many network-management decisions to make, network operators have a great opportunity to apply highly differentiated marketing strategies to their networks, meeting the needs of different market segments.

    This freedom, however, places a responsibility on the network engineer to ensure that the user's experience matches the service that the operator is contracted to provide. This in turn will require more and more intensive field testing of live networks. Where in the 3G context it might have been enough to test the aggregate bandwidth capability of a cell, in LTE the operator must continuously monitor the network's ability to deliver, for instance, smooth video streams or fast internet access to individual users who have paid for it.

    Success begins with effective network planning. When planning a network, the initial models for simulation are built using theoretical information contained in the published 3GPP standard. These initial models contain approximations and assumptions which are then ‘tuned’ over time using data from field tests. As with previous generations, network testing aims to assess coverage accurately on all channels, identify interference between neighbouring cells, and characterise coverage patterns.

    Feeding this data back into the simulation tools allows more accurate prediction of aspects such as ideal basestation sites, optimum power levels, and allocation of sub-carriers, bandwidth and codes. Also, as before, planners must take into account the practical restrictions on basestation locations, such as negotiating for permission to use a particular site, or gaining planning approvals. Working within any such limitations, planners will rely on simulation results based on actual field data to achieve the desired coverage. This is generally achieved using a combination of main basestation macro sites, with smaller basestations creating picocells in areas where the models predict poor coverage.

    Although the rules relating to basestation positioning are different for LTE, compared to 3G and previous networks, LTE planners can take advantage of knowledge and equipment technologies from WiMAX applications, which also use OFDM radio technology.
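
    As a concrete illustration of the ‘tuning’ step, the sketch below fits the two parameters of a simple log-distance path-loss model to drive-test samples; the model choice and the measurement values are assumptions, and real planning tools use far more sophisticated propagation models.

        import math

        # Hypothetical drive-test samples: (distance from basestation in m, path loss in dB)
        samples = [(100, 92.0), (300, 108.5), (700, 121.0), (1500, 133.5), (3000, 144.0)]

        # Log-distance model: PL(d) = PL0 + 10 * n * log10(d), fitted by least squares.
        xs = [10 * math.log10(d) for d, _ in samples]
        ys = [pl for _, pl in samples]
        x_mean = sum(xs) / len(xs)
        y_mean = sum(ys) / len(ys)
        num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
        den = sum((x - x_mean) ** 2 for x in xs)
        n_exp = num / den            # path-loss exponent
        pl0 = y_mean - n_exp * x_mean  # intercept

        print(f"fitted exponent n = {n_exp:.2f}, intercept PL0 = {pl0:.1f} dB")
        print(f"predicted loss at 2 km: {pl0 + 10 * n_exp * math.log10(2000):.1f} dB")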

    Network Optimisation
    As noted above, the task of managing an LTE network is very different from that of managing a 3G network. To deliver the agreed QoS for all customers, the network operator must prioritise packets correctly. This includes allocating the available bandwidth most appropriately, and depends on factors such as the type of traffic.

    Channel conditions also fluctuate continuously, so the modulation and channel-coding schemes may be adapted in various ways to make best use of the conditions prevailing at any point in time.
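
    The sketch below illustrates the principle of such adaptive modulation and coding: pick the fastest scheme the measured signal quality will support. The SNR thresholds and scheme list are invented for illustration and do not reproduce the 3GPP CQI tables.

        # Illustrative adaptive modulation and coding. Thresholds are examples only.
        MCS_TABLE = [  # (minimum SNR in dB, scheme, effective bits per symbol)
            (18.0, "64QAM 3/4", 4.5),
            (12.0, "16QAM 3/4", 3.0),
            (6.0,  "QPSK 3/4",  1.5),
            (0.0,  "QPSK 1/4",  0.5),
        ]

        def select_mcs(snr_db):
            """Return the highest-throughput scheme whose SNR floor is met."""
            for floor, scheme, efficiency in MCS_TABLE:
                if snr_db >= floor:
                    return scheme, efficiency
            return "out of range", 0.0

        for snr in (21.0, 9.5, -3.0):
            scheme, eff = select_mcs(snr)
            print(f"SNR {snr:5.1f} dB -> {scheme} ({eff} bits/symbol)")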

    In turn, the testing strategies, routines and equipment employed to optimise and manage a live LTE network will also differ from those applicable to a 3G network. The change in radio technology from 3G to LTE requires new measurement technologies to be embedded in the basestation analysers and test handsets used in the field, in order to measure the large number of variables associated with OFDMA.

    Key field-test data necessary to optimise LTE network QoS include latency (delays in packet transmission), jitter (variation in latency), and dropped packets. Users experience these effects as service defects such as jerky video, or echoes and delays that degrade audio quality on voice calls. Users of today's PC-based VoIP telephone services might be familiar with such voice effects.
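
    As a minimal illustration, the sketch below derives all three figures from a hypothetical field-test log of packet send and receive timestamps; the log format and values are invented.

        # Hypothetical log: (sequence no., send time s, receive time s, or None if lost)
        log = [(1, 0.000, 0.042), (2, 0.020, 0.061), (3, 0.040, None),
               (4, 0.060, 0.118), (5, 0.080, 0.125)]

        delays = [rx - tx for _, tx, rx in log if rx is not None]
        latency = sum(delays) / len(delays)  # mean one-way delay
        jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)
        loss = sum(1 for _, _, rx in log if rx is None) / len(log)

        print(f"mean latency : {latency * 1000:.1f} ms")
        print(f"mean jitter  : {jitter * 1000:.1f} ms")
        print(f"packet loss  : {loss:.0%}")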

    The normal technique to minimise or eliminate these effects is to implement buffers in the core network and in the subscriber handset. There is, however, a trade-off between efficiency and QoS here. Broadly, increasing buffer size helps insulate the network from the effects of latency, jitter and dropped packets, but it also decreases data rates and the efficiency of bandwidth utilisation. Network engineers therefore need to find a level for buffer sizes that is adequate but not excessive, and accurate, comprehensive field-test data describing latency, jitter and dropped packets is essential if they are to determine the optimum sizes.
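
    The simulation sketch below illustrates that trade-off: packets arrive with random network delay, and enlarging the buffer's delay budget rescues more late packets at the cost of added end-to-end delay. All of the timing figures are assumptions chosen for illustration.

        import random

        random.seed(1)

        # Packets sent every 20 ms, each delayed by a random 30-90 ms in the network.
        arrivals = [i * 0.020 + random.uniform(0.030, 0.090) for i in range(1000)]

        def late_fraction(buffer_ms):
            """Fraction of packets whose network delay exceeds the buffer's delay budget."""
            budget = buffer_ms / 1000.0
            late = sum(1 for i, t in enumerate(arrivals) if t - i * 0.020 > budget)
            return late / len(arrivals)

        for buf in (40, 60, 80, 100):
            print(f"buffer {buf:3d} ms -> {late_fraction(buf):5.1%} of packets too late")

    A 100ms budget rescues every packet in this toy model, but only by adding 100ms of delay to every call, which is why the field-measured jitter distribution, not a worst case, should set the buffer size.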

    The intensive effort required to plan and optimise the network must also be factored into the overall cost of deployment and, therefore, into the cost of services to end users. To minimise these costs, network operators are interested in self-optimising network elements, incorporating embedded software to adjust settings automatically for optimal coverage and capacity. This can eliminate much of the labour-intensive work involved in manually optimising the network.

    Self-optimising software is itself, however, dependent on accurate knowledge of the practical factors influencing network performance, such as the effectiveness of call handovers between 2G, 3G and LTE network equipment. This knowledge must first be acquired from field test activity, to allow equipment vendors to deliver network elements that are truly self-optimising.

    Monitoring LTE Services
    As we have seen, the ability to apply different service models to different customer types presents operators with exciting new marketing opportunities. It also imposes more complex challenges in terms of monitoring service delivery and quality.

    This challenge is heightened by the difficulty of assessing end-to-end network performance in a packet-based environment. Once a packet leaves the operator's basestation, it is transported across a core network where the operator might no longer have control of its routing. Hence, LTE will bring a demand for improved capabilities in OSS monitoring systems. OSS monitoring systems, of course, are already well established in GSM and 3G networks, and enable operators to analyse various network performance parameters. This information helps operators assess the experiences of individual customers, and identify and solve any problems quickly and efficiently.

    The same principles apply to LTE networks. Because IP-based services such as SIP and IMS are characterised by a significantly higher volume of data making up each transaction, however, operators will need LTE OSS monitoring systems that can filter this data logically in order to isolate critical network-performance information. Decisions must be made as to which parameters to monitor, which data patterns to look for, and which thresholds will allow the monitoring system to respond accurately when QoS falls below the agreed level. Because LTE operators are able to offer highly differentiated services to customers, OSS monitoring systems must be capable of assessing network performance in relation to multiple, diverse QoS targets.
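
    A minimal sketch of this kind of threshold checking might look as follows; the service classes, KPIs and target values are hypothetical examples rather than real operator policies.

        # Hypothetical per-service QoS targets an OSS monitor might enforce.
        QOS_TARGETS = {
            "premium_video": {"latency_ms": 50,  "jitter_ms": 10, "loss_pct": 0.5},
            "best_effort":   {"latency_ms": 200, "jitter_ms": 50, "loss_pct": 2.0},
        }

        def check_qos(service, latency_ms, jitter_ms, loss_pct):
            """Return the list of KPI breaches for one measurement sample."""
            t = QOS_TARGETS[service]
            measured = {"latency_ms": latency_ms, "jitter_ms": jitter_ms,
                        "loss_pct": loss_pct}
            return [f"{kpi}: {measured[kpi]} exceeds target {limit}"
                    for kpi, limit in t.items() if measured[kpi] > limit]

        # The same sample breaches the premium target but passes best effort.
        print(check_qos("premium_video", latency_ms=65, jitter_ms=8, loss_pct=0.3))
        print(check_qos("best_effort",   latency_ms=65, jitter_ms=8, loss_pct=0.3))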

    In an LTE network, the challenges associated with monitoring service quality are further complicated by the fact that there can often be an appreciable difference between the bandwidth available through the LTE radio link and the effective data rate experienced by the end user. This difference is determined largely by the volume of traffic over the radio link. Augmenting OSS monitoring results with data acquired using field-test equipment will therefore allow accurate comparisons between the network performance the OSS reports and the service the user is actually experiencing. Operators will benefit greatly from test equipment that can perform radio-link tests and protocol tests simultaneously, so that the source of service defects can be pinpointed.
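
    The toy calculation below makes the point: with an equal-share scheduler and an illustrative (assumed) 100Mbps of usable cell capacity, the effective per-user rate falls quickly as the number of active users rises.

        # Toy model of an equally shared cell; the capacity figure is illustrative.
        CELL_CAPACITY_MBPS = 100.0

        def per_user_rate(active_users):
            """Effective data rate per user when the cell is shared equally."""
            return CELL_CAPACITY_MBPS / max(active_users, 1)

        for users in (1, 5, 20, 50):
            print(f"{users:3d} active users -> {per_user_rate(users):6.1f} Mbps each")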

    Jonathan Borrill is Director of Marketing, Anritsu EMEA