LTE IoT Will Make Low-Power IoT Devices More Efficient Than Ever
Depending on where and how they will be used, IoT devices have radically different needs. Surveillance equipment might require an uninterrupted, high-speed data connection to enable transmission of live video feeds with minimal latency. At the other end of the spectrum, smart utility meters have much smaller data usage needs and only require intermittent connections to the network.
While use cases differ, most IoT devices share a need for low power consumption and extended battery life. LTE IoT, the low-power wide-area network (LPWAN) branch of LTE, expands the range of possible device architectures by adding features like Power Save Mode (PSM) and extended Discontinuous Reception (eDRX) to reduce power consumption. These features allow engineers to maximize battery life by optimizing sleep cycles based on factors such as time of day, business hours, or seasons. Though the technological capabilities are in place, the burden is on device vendors to design their devices for optimal battery life based on the specific needs of their use case.
A Key Element for Most IoT Use Cases
Massive IoT deployments in a wide swath of verticals, including smart cities, mobile health, smart utilities, smart buildings and asset tracking, require low power consumption to make them feasible. Power economy results in either longer battery life or reduction in the size and capacity of the battery pack needed, reducing the cost and improving the flexibility of the device.
Key Steps in the Design Process for Low-Power IoT Devices
For OEMs interested in designing a power-saving IoT device, the first task is to define the needs of their use case. For example, if the equipment will be part of an agricultural deployment of water sensors on rural farms, coverage will need to be broad and battery life must be lengthy to minimize maintenance costs. Here are a few questions to consider when evaluating a device’s functional requirements:
- What data speeds will be needed (hundreds of Mbps vs. a few Mbps vs. tens of kbps)?
- How much data traffic will the device generate (GBs vs. MBs vs. KBs)?
- Are there any minimum latency requirements (for example, voice communications)?
- What extent of coverage will be required given the deployment scenario (e.g., above ground, in basements, urban areas or rural areas)?
- What type of battery will be used (large custom battery vs. AA, AAA vs. button cells)?
- How easy will it be to charge or replace the battery?
- Do the applications have device-originated or device-terminated data?
- Does the device need to be available for connections all the time or only during scheduled times?
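As an illustration, the answers to these questions can be captured in a small requirements profile that drives the technology discussion with a partner. The field names, units, and the mapping below are hypothetical, meant only to sketch how such criteria might inform a choice:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical profile capturing the checklist above; field names and
# units are illustrative, not taken from any standard.
@dataclass
class DeviceRequirements:
    peak_data_rate_kbps: int         # e.g. 100_000 for ~100 Mbps, 20 for tens of kbps
    monthly_traffic_kb: int          # expected traffic volume
    max_latency_ms: Optional[int]    # None if latency-tolerant
    deep_coverage: bool              # basements / rural deployments
    battery_type: str                # "custom", "AA", "AAA", "coin"
    battery_replaceable: bool
    device_terminated_data: bool     # must the network reach the device?
    always_reachable: bool           # or only at scheduled times?

def suggested_technology(req: DeviceRequirements) -> str:
    """Very rough, illustrative mapping from requirements to an LTE flavor."""
    if req.peak_data_rate_kbps > 1_000:
        return "LTE Cat 1 or higher"
    if req.always_reachable and req.device_terminated_data:
        return "LTE-M with eDRX"
    return "NB-IoT or LTE-M with PSM"
```

A real evaluation would weigh many more factors (coverage maps, module cost, carrier support), but a structured profile like this keeps the conversation grounded in the use case.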
Once all of these criteria are gathered, OEMs should consult with a trusted IoT partner to identify the right technology and cellular module for the device.
LTE Advancements Enable Massive IoT Applications
Previous iterations of LTE, initially designed for smartphones, prioritized high data speeds and seamless mobility based on the assumption that consumers would charge the devices at least once a day. However, frequent charging is not practical for many IoT deployments in which equipment may be dispersed in remote areas away from dedicated power sources. As cellular moved into LPWAN, the focus shifted to include low-throughput communications, ubiquitous coverage, latency tolerance, lower mobility and longer battery life.
Today, there is a broad spectrum of device capabilities within LTE, ranging from Category 1 (Cat 1), which supports peak speeds of 10 Mbps, up to gigabit speeds of Cat 18 and multi-gigabit speeds of 5G. LTE has evolved to accommodate the needs of all connected devices, from underground water sensors to broadband gateways.
Module vendors offer a variety of modules to support every speed and technology. It’s up to the IoT device OEM and systems integrator to select the right module for their project and the best mix of features to optimize the device’s performance.
Power-Saving Enhancements of LTE IoT
In moving from a smartphone-focused design toward one that addresses IoT needs, 3GPP worked to curtail the complexity found in today's smartphones. That design simplification reduced power consumption considerably, and 3GPP added several enhancements specific to IoT devices.
- Flexible, Simple Device Architecture
As part of the simplified design, 3GPP reduced the peak speeds that IoT devices can support, eliminating features such as carrier aggregation, higher-order modulation and MIMO. It chose narrower bandwidths (1.4 MHz for LTE-M and 200 kHz for NB-IoT), and LTE IoT also eliminated antenna diversity, keeping only one antenna for transmitting and receiving data. This simplification also lowers the amount of memory the devices need, further reducing power consumption.
- Power Save Mode (PSM)
Extending sleep cycles in IoT devices is the best way to prolong battery life. Smartphones must always be alert for data coming to them from the network, but this constant monitoring consumes power and drains the battery quickly. Many IoT devices don’t require an uninterrupted connection. Instead, using the PSM function, they can remain in sleep mode for most of the day and only wake up during fixed, pre-determined times to send their collected data to the network. They can be programmed to wake up if they sense a problem — for example, a smart water sensor that detects leakage — to send an alert. Otherwise, remaining in PSM for most of the time allows these IoT devices to extend battery life from one day to many years. For instance, an LTE-M (Cat M1) device that transmits once per day in full PSM mode could last more than ten years on two AA batteries. One drawback of PSM is that equipment will be out of network reach during sleep time. As a result, PSM is only suitable for applications such as smart meters that do not require frequent, network-initiated contact.
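The ten-year figure can be sanity-checked with a back-of-the-envelope duty-cycle calculation. All current-draw numbers below are illustrative assumptions, not specifications of any particular module:

```python
# Rough battery-life estimate for a PSM device that wakes once per day
# to transmit. Current figures are assumed, not module specs.

SLEEP_CURRENT_UA = 3.0            # deep-sleep (PSM) current, microamps
ACTIVE_CURRENT_MA = 120.0         # average current during the daily wake-up, milliamps
ACTIVE_SECONDS_PER_DAY = 30.0     # one short uplink report per day
BATTERY_CAPACITY_MAH = 5000.0     # two AA cells in parallel, ~2500 mAh each

def psm_battery_life_years() -> float:
    """Estimate battery life in years from a daily charge budget."""
    sleep_mah_per_day = (SLEEP_CURRENT_UA / 1000.0) * 24.0
    active_mah_per_day = ACTIVE_CURRENT_MA * (ACTIVE_SECONDS_PER_DAY / 3600.0)
    daily_mah = sleep_mah_per_day + active_mah_per_day
    return BATTERY_CAPACITY_MAH / daily_mah / 365.0
```

With these assumed figures, sleep consumes well under a tenth of the daily budget; the daily transmission dominates, and the estimate lands beyond the ten-year mark cited above.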
- Extended Discontinuous Reception (eDRX)
This feature allows devices to extend sleep cycles between paging occasions. Instead of waking up every few seconds, devices can sleep longer with eDRX, waking to check whether there is data awaiting them. If not, they return to sleep mode. If there is data to receive, they establish the connection, transfer that data, and go back to sleep. An LTE-M device that transmits data once per day and wakes up about every ten minutes can achieve 4.7 years of battery life on two AA batteries. With lengthier sleep cycles, battery life can extend much further. The eDRX feature is especially useful in applications in which devices must be reachable when needed, such as object tracking and the smart grid.
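A rough duty-cycle model shows why periodic paging wake-ups cost battery life relative to sleeping all day. The current figures below are illustrative assumptions, not module specifications:

```python
# Rough battery-life estimate for an eDRX device that checks for pages
# every ten minutes and transmits once per day. Figures are assumed.

SLEEP_CURRENT_UA = 3.0          # deep-sleep current, microamps
PAGING_CURRENT_MA = 40.0        # brief receive burst to check for pages, milliamps
PAGING_BURST_SECONDS = 1.0      # duration of each paging check
EDRX_CYCLE_SECONDS = 600.0      # wake roughly every ten minutes
TX_CURRENT_MA = 120.0           # average current during the daily transmission
TX_SECONDS_PER_DAY = 30.0
BATTERY_CAPACITY_MAH = 5000.0   # two AA cells in parallel, ~2500 mAh each

def edrx_battery_life_years() -> float:
    """Estimate battery life in years including paging wake-ups."""
    wakeups_per_day = 86400.0 / EDRX_CYCLE_SECONDS
    paging_mah = PAGING_CURRENT_MA * (PAGING_BURST_SECONDS / 3600.0) * wakeups_per_day
    tx_mah = TX_CURRENT_MA * (TX_SECONDS_PER_DAY / 3600.0)
    sleep_mah = (SLEEP_CURRENT_UA / 1000.0) * 24.0
    return BATTERY_CAPACITY_MAH / (paging_mah + tx_mah + sleep_mah) / 365.0
```

Under these assumptions the 144 daily paging checks consume more charge than the daily transmission itself, which is why eDRX yields years rather than a decade of battery life: reachability is traded for lifetime.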
3GPP Rel. 14 and 15 Bring Further Enhancements
Most commercial mobile IoT networks today use 3GPP Rel. 13, but Rel. 14 and Rel. 15 are coming soon with additional features to improve device performance and battery life. Here are a few of those features:
- Higher data rates for both LTE-M and NB-IoT
- Coverage extension and TDD support for NB-IoT
- Better mobility for LTE-M
- Single-cell broadcasting, efficiently delivering common data for multiple devices in a cell
Developers must prepare for the power trade-offs that come with each improvement. For example, power modeling for a Rel. 13 device yields a required battery capacity, which translates into weight and volume constraints that determine the physical design of the device's enclosure. If an upgrade to Rel. 14 or 15 is planned, OEMs should model power usage and battery life now to ensure the next-generation device can be smaller and lighter.
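A minimal sketch of this kind of power modeling: given an average daily charge budget and a target lifetime, size the battery and estimate its mass. The energy-density figure is an assumed ballpark for lithium thionyl chloride cells, not a datasheet value:

```python
# Sketch: translate a daily charge budget and target lifetime into
# battery capacity and approximate mass. Figures are assumptions.

LI_SOCL2_WH_PER_KG = 500.0   # assumed energy density for Li-SOCl2 primary cells
CELL_VOLTAGE = 3.6           # nominal cell voltage, volts

def required_battery(daily_mah: float, target_years: float) -> tuple:
    """Return (capacity_mah, mass_grams) needed for the target lifetime."""
    capacity_mah = daily_mah * 365.0 * target_years
    energy_wh = capacity_mah / 1000.0 * CELL_VOLTAGE
    mass_g = energy_wh / LI_SOCL2_WH_PER_KG * 1000.0
    return capacity_mah, mass_g
```

Halving the daily budget, for instance through Rel. 14 power-saving features, halves the required capacity and mass for the same lifetime, which is exactly the headroom that lets a next-generation enclosure shrink.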
Design Is Essential to Maximizing Battery Life
For OEMs, designing devices to suit specific use cases is essential to maximizing power efficiency. If extended battery life is a primary need for your IoT deployment, approach the design and dimensioning process with that priority in mind. The effort will pay off with a device that’s future-proof, efficient and suited to function in your use case.