
End to end tests are essential for matching application behaviour to network behaviour if the IoT is to succeed

In the near future, several billion devices will be connected to the Internet of Things (IoT). According to Harbour Research, 2 billion IoT devices were sold in 2014; by 2020, that figure will exceed 7 billion. But this will only happen with cost effective communications networks and wireless devices.

By using technologies such as Bluetooth, ZigBee, Wi-Fi or cellular networks, almost any device anywhere in the world can be connected to the Internet quickly and reliably. The latest cellular technologies – LTE-M and narrowband IoT (NB-IoT) – will play an important role in the success of the IoT.

One of the main challenges isn't posed by new communication technologies; rather, it is the huge number of IoT devices that need to be tested against regulatory, standards and operator requirements. Currently, mobile operators probably test a few hundred mobile phones every year in order to allow them to connect to their network. With the adoption of technologies like NB-IoT or LTE-M and the expected variation in device behaviour on the network, operators need to be prepared to test thousands of devices every year. This requires more efficient conformance and carrier acceptance testing, along with highly flexible and efficient testing solutions. Overall test efficiency will be the biggest topic in all phases of the product life cycle.

M2M and LTE

Because LTE is optimised for the mobile broadband market, the IoT has generated little demand for 4G technology, so the costs for an LTE modem are still relatively high in comparison to a GSM modem. However, some aspects of LTE make it increasingly attractive. One is global accessibility; according to GSMA, 422 operators in 146 countries offered commercial LTE services as of September 2015. The long term availability of LTE is another consideration. More cellular operators are saying they will no longer support 2G mobile networks, making it necessary to convert to the latest technology. The first LTE chipsets optimised for the M2M market in terms of cost and power consumption are already available, with LTE offering advantages with respect to spectral efficiency, latency and data throughput.

The need for optimised solutions for the IoT market was recognised in the early stages of defining the 3GPP standard and specific improvements for machine type communication have been developed. For example, features defined in Rel. 10/11 are intended to protect the mobile network against overload by IoT devices. Network operators need to be armed against the possibility of several thousand devices trying to connect to the network at the same time. This could happen after a sudden event – for example, the power grid coming back online after a power failure. Overload mechanisms and options for reducing the signalling traffic have been introduced to handle these possibilities.

Many IoT applications – sensor networks, for example – only send data infrequently and do not need to operate with precise timing. These devices can report to the network that they can accept longer delays during the connection setup (delay tolerant access). Rel. 10 includes a process that permits the network to initially reject connection requests from these devices and delay them (extended wait time). With Rel. 11, access to the cellular network can be controlled by means of access classes. In this case, a device may set up a connection only if it is assigned a class that is currently permitted by the network. The network transmits a bitmap (EAB barring bitmap) that identifies which classes are currently barred from access (see fig 1).

Fig 1: A barring bitmap identifies which classes are denied access
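The barring check itself can be sketched in a few lines. This is an illustrative model only, assuming the convention that a set bit in the bitmap means the corresponding access class is barred; it is not an implementation of the 3GPP signalling.

```python
# Illustrative sketch of an EAB barring-bitmap check.
# Assumption: bit i set to 1 means access class i is barred.

def is_barred(barring_bitmap: int, access_class: int) -> bool:
    """Return True if the device's access class is currently barred."""
    return bool((barring_bitmap >> access_class) & 1)

# Example: classes 0 and 3 barred (bits 0 and 3 set -> 0b1001).
bitmap = 0b1001
print(is_barred(bitmap, 0))  # True  - device must defer its connection attempt
print(is_barred(bitmap, 1))  # False - device may attempt to connect
```

A device in a barred class simply defers its connection attempt, which is how the network spreads out signalling load after an event such as a power grid recovery.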

The processes introduced in Rel. 10 and 11 ensure reliable and stable operation of current and future IoT devices within cellular networks without endangering the mobile broadband service.

The only thing still missing is solutions optimised for IoT devices with low data traffic, low power consumption and low costs. The committee started to address this in Rel. 12 and it quickly became clear there will be no single, simple solution; the requirements for applications such as container tracking, smart meters, agricultural sensors and personal health trackers are too varied. Rel. 12 therefore concentrates on power consumption and cost effective modems. The results are a power saving mode (PSM) that is especially important for battery operated devices and a new LTE device category 0, which should have only 50% of the complexity of a category 1 modem.

Power saving mode

The PSM process starts after a data link is terminated or after the periodic tracking area update (TAU) procedure completes. The device first enters the normal idle mode, in which it switches periodically to receive mode. As a result, it remains available via paging. PSM is entered after timer T3324 expires (see fig 2). In this mode, the device is always ready to send messages because it remains registered in the network. However, the front end is switched off, so the device is not accessible via paging. PSM is thus suited to sensor networks in which data needs to be sent to the device only rarely and in small amounts. It is not suitable for applications where a quick response from the sensor or a time-critical reaction is expected.

Fig 2: Power saving mode starts after a timer expires
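The timing behaviour described above can be modelled as a simple state function. The timer values below are illustrative assumptions, not standard defaults; T3324 controls the idle (paging-reachable) window and the periodic TAU timer bounds the PSM interval.

```python
# Sketch of the PSM timeline: after entering idle, the device is reachable
# via paging until T3324 expires, then sleeps in PSM until the periodic
# TAU timer fires. Timer values are illustrative assumptions only.

def psm_state(t_since_idle_start: float,
              t3324: float = 60.0,      # active/idle window, seconds (assumed)
              t3412: float = 3600.0):   # periodic TAU interval, seconds (assumed)
    """Return the device state at a given time after entering idle mode."""
    if t_since_idle_start < t3324:
        return "IDLE"  # reachable via paging
    if t_since_idle_start < t3412:
        return "PSM"   # registered in the network, but front end off
    return "TAU"       # device wakes for the tracking area update

print(psm_state(10))    # IDLE
print(psm_state(120))   # PSM
print(psm_state(4000))  # TAU
```

The design task mentioned below is essentially choosing t3324 and t3412 so that the paging window is long enough for the application while the device spends as much time as possible in PSM.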

Applications that use PSM must tolerate this behaviour and the design must include the optimal timer values for idle mode and power saving mode. End to end tests are essential for matching application behaviour to the behaviour of the network.

Cost effective devices

The introduction of category 0 was a first attempt at permitting significantly less expensive LTE modems for M2M applications. Category 0 modems are also intended to use less power; to achieve this, modem complexity was reduced by lowering the supported data rate to 1Mbit/s, which minimises processing power and memory requirements. Manufacturers can also eliminate full duplex mode and multiple antennas and, as a result, the duplex filters that would otherwise be necessary to prevent interference between the transmitter and receiver are not required.

Category 0 devices are still in development and will probably be introduced later in 2016.

Development of LTE-M

LTE-M has taken its first steps: Rel. 13 includes additional cost reduction measures, especially lower bandwidths in the uplink and downlink, lower data rates and reduced transmit power.

In Rel. 14, automotive industry requirements have prompted the investigation of options for reducing the latency in communications between consumer devices, allowing real time communication between cars, for example.

End to end testing

In parallel with the standardisation activity, there needs to be a new approach to testing. In addition to verifying the compatibility of the device, there will be a need for more end to end application testing in order to understand how dependencies between the different communication layers affect overall application performance, including characteristics such as power consumption or delay. This will guarantee proper interworking of all applications. Additionally, security aspects such as encrypted communication and resilience of IoT end to end connections should not be neglected and therefore need to be verified.
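A minimal end to end check of the kind described above sends a payload through the whole stack and asserts on delivery and round-trip delay. The loopback transport here is a placeholder assumption standing in for a real device-to-network-to-server path.

```python
# Hedged sketch of an end-to-end application test: deliver a payload and
# check both integrity and delay against an application-level budget.
import time

def loopback_transport(payload: bytes) -> bytes:
    """Placeholder for the real path (device -> network -> application server)."""
    return payload

def end_to_end_test(payload: bytes, max_delay_s: float) -> bool:
    start = time.monotonic()
    echoed = loopback_transport(payload)
    delay = time.monotonic() - start
    return echoed == payload and delay <= max_delay_s

print(end_to_end_test(b"sensor-reading", max_delay_s=1.0))  # True
```

In a real campaign the transport, delay budget and payload sizes would come from the application profile, and power consumption would be measured alongside delay.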

Author profile:
Joerg Koepp is head of the wireless market segment at Rohde & Schwarz.

