Industry White Papers

Use an Advanced Bluetooth 5.2 SoC to Build Secure Low-Power IoT Devices

August 31, 2020 by Digi-Key

Learn how Silicon Labs' EFR32BG22 Bluetooth Low-Energy 5.2 SoC family meets the power and performance requirements for building IoT devices and other battery-powered products capable of sustained operation for five to ten years.

Bluetooth connectivity and low-power performance are critical requirements in battery-powered designs underlying high-volume products for the Internet of Things (IoT), wearables, connected home, and building automation applications. In building these designs, developers have struggled to find low-cost Bluetooth system-on-chip (SoC) devices able to deliver high-performance functionality within tight power budgets. Too often, developers have been forced to compromise some aspect of performance or even sacrifice increasingly critical capabilities such as security to meet requirements for low-cost, low-power design solutions.

To mitigate the degree of compromise required, the Bluetooth 5.2 specification incorporates power-saving features such as LE Power Control and periodic advertising sync transfer (PAST), along with advanced low-power mesh networking and location tracking capabilities. What’s required is a single integrated IC that supports these features and is backed by development kits and software that allow developers to get up and running quickly and efficiently with Bluetooth 5.2’s low-power enhancements.

This article shows how Silicon Labs' EFR32BG22 Bluetooth Low-Energy 5.2 SoC family can meet the broad power and performance requirements of battery-powered products. Using the EFR32BG22 SoC family and its associated development ecosystem, developers can build IoT devices and other battery-powered products capable of sustained operation for over five years on a single CR2032 coin cell battery, or over ten years on a CR2354 battery.

 

Optimizing Power with Advanced BLE Features

Bluetooth connectivity has become a familiar feature of mass-market consumer products, but the availability of more advanced Bluetooth Low Energy (BLE) capabilities is expected to usher in a range of more advanced products for the IoT, wearables, and other mobile applications. In delivering those features, however, developers face underlying expectations for extended battery life and enhanced security in their products.

Underlying any Bluetooth data exchange, mesh network transaction or location service operation, the choice of transmitter power setting is critical for achieving a high signal-to-noise ratio (SNR). If the transmitter power setting is too low, reduced SNR can lead to increased error rates. If it’s set too high, the transmitting device is not only wasting power, but its high-power signal can lead to communication failures by increasing interference in multi-node networks, or by saturating nearby receivers.

Power control: The LE Power Control feature introduced in Bluetooth 5.2 addresses these concerns with a protocol that allows a BLE device to negotiate an optimal transmitter power setting with its peer. A receiving device can use LE Power Control to request that a compatible transmitter change its transmit power level to improve the receiver's SNR. Conversely, a transmitter can use LE Power Control data to lower its output power to a level that remains serviceable to the receiver, using the received signal strength indicator (RSSI) reported by the receiver to independently tune its transmit power.
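The decision logic behind this negotiation is straightforward. The following sketch shows, in C, how a transmitter might step its output power up or down based on the RSSI its peer reports; the target window, step size, and power limits are illustrative assumptions rather than values defined by the Bluetooth specification.

```c
#include <stdint.h>

/* Illustrative RSSI window for the peer's receiver (dBm); actual targets
 * are negotiated by the LE Power Control protocol at the link layer. */
#define RSSI_TARGET_HIGH_DBM  (-40)
#define RSSI_TARGET_LOW_DBM   (-70)
#define TX_POWER_STEP_DBM       2
#define TX_POWER_MAX_DBM        6
#define TX_POWER_MIN_DBM      (-26)

/* Given the peer-reported RSSI, return an adjusted TX power level.
 * This mirrors the intent of LE Power Control: back off when the link
 * is stronger than needed, step up when the link is marginal. */
static int8_t adjust_tx_power(int8_t current_tx_dbm, int8_t reported_rssi_dbm)
{
    if (reported_rssi_dbm > RSSI_TARGET_HIGH_DBM &&
        current_tx_dbm > TX_POWER_MIN_DBM) {
        current_tx_dbm -= TX_POWER_STEP_DBM;   /* link too hot: save power */
    } else if (reported_rssi_dbm < RSSI_TARGET_LOW_DBM &&
               current_tx_dbm < TX_POWER_MAX_DBM) {
        current_tx_dbm += TX_POWER_STEP_DBM;   /* link marginal: raise power */
    }
    return current_tx_dbm;
}
```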

In some applications, developers are less concerned about optimizing transmitter power than ensuring that their device has sufficient transmitter power to reach some distant host or communications hub. The need to ensure effective wireless connectivity across extended distances has traditionally been at odds with power and security, particularly in resource-constrained designs at the heart of battery-powered products.

Mesh networking: BLE mesh networking can help eliminate the need for high transmitter power to reach distant hosts. Here, battery-powered devices communicate at lower power with nearby line-powered nodes. Because their messages are relayed from node to node, a low-power device can communicate across a distance not feasible even at the device's maximum transmitter power and receiver sensitivity. In applications such as home or building automation, developers can further take advantage of Bluetooth's broadcast features to cause multiple devices to respond to a single command, for example to change area lighting. In this way, BLE mesh networking protocols can help satisfy the conflicting demands of extended operating range and low-power operation.
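The relaying behavior that makes this possible can be pictured with a minimal sketch. The structure and helper below are simplified placeholders, not the Bluetooth mesh stack's actual PDU format or API; they only illustrate how a time-to-live (TTL) hop budget bounds how far a relayed message propagates.

```c
#include <stdbool.h>
#include <stdint.h>

/* Simplified view of a relayed mesh message; real Bluetooth mesh PDUs
 * carry additional fields (sequence number, IV index, MIC, etc.). */
typedef struct {
    uint16_t src_addr;
    uint16_t dst_addr;
    uint8_t  ttl;       /* remaining hop budget */
} mesh_msg_t;

/* Decide whether a relay node should forward a received message.
 * Each relay decrements the TTL; a message with TTL <= 1 is not
 * forwarded further, which bounds how far a broadcast propagates. */
static bool relay_should_forward(mesh_msg_t *msg, uint16_t my_addr)
{
    if (msg->dst_addr == my_addr) {
        return false;            /* addressed to us: consume, don't relay */
    }
    if (msg->ttl <= 1) {
        return false;            /* hop budget exhausted */
    }
    msg->ttl--;                  /* spend one hop and forward */
    return true;
}
```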

Location services: Bluetooth location services add the need for effective signal processing to the challenge of efficient radio operation. The availability of radio direction finding features in Bluetooth lets developers implement real-time locating systems (RTLS) for asset tracking, or indoor positioning systems (IPS) for navigating within buildings. With the introduction of support for angle of arrival (AoA) and angle of departure (AoD) direction finding in Bluetooth 5.1, RTLS and IPS applications can achieve a level of position accuracy beyond that available with earlier methods based on RSSI.

AoA and AoD methods essentially provide complementary capabilities. Multi-antenna receivers can use AoA calculations to track the location of a moving asset that broadcasts a direction finding signal from a single antenna. Conversely, multi-antenna transmitters can enable a device such as a wearable to use AoD calculations to determine its position (Figure 1).

 


Figure 1. Bluetooth's AoA method allows a receiver to use an antenna array to pinpoint the position of a transmitting asset, while the AoD method allows a receiving device, such as a wearable, to find its own position with respect to an antenna array. (Image source: Bluetooth SIG)

 

In each method, AoA receivers or AoD devices use quadrature signal processing to determine the phase shift of the signal received or broadcast, respectively, by the multi-antenna array. Device requirements therefore differ between the asset being tracked with AoA methods and the device determining its own location with AoD methods. The tracked asset requires the lowest possible power consumption to ensure extended battery life while transmitting. In contrast, the location-finding device needs sufficient processing power to perform IQ sampling of the in-phase (I) and quadrature (Q) signal components and to carry out the phase shift calculations required to maintain accurate position information as it moves.
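To make the phase shift calculation concrete, the following sketch estimates an angle of arrival from one pair of IQ samples taken on a two-element antenna array. It assumes ideal, calibrated samples and a known element spacing; a real direction finding implementation averages many samples across the constant tone extension and compensates for frequency offset and antenna switching effects.

```c
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* One IQ sample pair as produced by the receiver's IQ sampling. */
typedef struct {
    float i;
    float q;
} iq_sample_t;

/* Estimate angle of arrival (radians) from IQ samples taken on two
 * antenna elements separated by 'spacing_m' meters, for a carrier
 * wavelength 'lambda_m'. */
static float estimate_aoa(iq_sample_t ant0, iq_sample_t ant1,
                          float spacing_m, float lambda_m)
{
    /* Phase of the signal seen at each antenna element. */
    float phase0 = atan2f(ant0.q, ant0.i);
    float phase1 = atan2f(ant1.q, ant1.i);

    /* Wrap the phase difference into [-pi, pi). */
    float dphi = phase1 - phase0;
    while (dphi >= (float)M_PI)  dphi -= 2.0f * (float)M_PI;
    while (dphi < -(float)M_PI)  dphi += 2.0f * (float)M_PI;

    /* A path difference of d*sin(theta) produces a phase shift of
     * 2*pi*d*sin(theta)/lambda, so invert that relation. */
    float s = dphi * lambda_m / (2.0f * (float)M_PI * spacing_m);
    if (s > 1.0f)  s = 1.0f;     /* clamp against noise */
    if (s < -1.0f) s = -1.0f;
    return asinf(s);
}
```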

Additional Bluetooth features enable developers to reduce power consumption without loss of positioning precision. To implement AoD in a wearable, for example, the Bluetooth protocol allows the transmitter and receiver to synchronize their activity so both wake at the same time to complete a location scan. This approach eliminates the need for devices to waste energy randomly sending or listening for advertising packets. Wireless processors can simply sleep in low-power mode until built-in timers wake them at the required time. This synchronized approach also mitigates the collisions and loss of efficiency that would arise when a large number of transmitters and receivers were operating in close proximity.
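A hedged sketch of this scheduling pattern is shown below; the timer and sleep functions are placeholders rather than a specific SDK API, but they capture the idea of sleeping between agreed synchronization points instead of listening continuously.

```c
#include <stdint.h>

/* Placeholder hardware hooks -- not a specific SDK API. */
extern uint32_t rtc_get_ticks(void);              /* free-running RTC count   */
extern void     rtc_set_wakeup(uint32_t ticks);   /* schedule a wake event    */
extern void     enter_deep_sleep(void);           /* sleep until RTC wakes us */
extern void     do_location_scan(void);           /* receive one scheduled burst */

#define SYNC_INTERVAL_TICKS  32768u   /* e.g., one second at a 32.768 kHz RTC */

/* Sleep between agreed sync points instead of listening continuously:
 * both ends wake at the same anchor times, exchange the direction-finding
 * packet, and go back to deep sleep. */
void synchronized_scan_loop(void)
{
    uint32_t next_anchor = rtc_get_ticks() + SYNC_INTERVAL_TICKS;

    for (;;) {
        rtc_set_wakeup(next_anchor);
        enter_deep_sleep();           /* radio and CPU idle until the anchor */
        do_location_scan();           /* brief, scheduled radio activity     */
        next_anchor += SYNC_INTERVAL_TICKS;
    }
}
```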

Bluetooth's periodic advertising sync transfer (PAST) provides a means for further reductions in power consumption for paired devices such as a wearable and a smartphone (Figure 2).

 

Figure 2. Rather than consuming power to maintain its own synchronized connection with a transmitter (left), a wearable can use Bluetooth's PAST mechanism to reduce power consumption by relying on a paired smartphone to provide required synchronization data (right). (Image source: Bluetooth SIG)

 

With PAST, the wearable device relies on the smartphone's periodic advertising synchronization with the transmitter. As a result, the power-constrained wearable avoids the power costs associated with waking and performing the synchronized advertising transaction with the transmitter. If needed during low battery conditions, the wearable device can go further by reducing the rate of its positioning data update with the smartphone, trading positioning accuracy for extended operating time.

To take full advantage of BLE’s advanced features, however, developers need a Bluetooth SoC able to meet the competing requirements of reduced power consumption and high performance computing capability. The Silicon Labs EFR32BG22 Bluetooth Low-Energy 5.2 SoC family is designed specifically to support these requirements in high volume, battery-powered products.

 

Meeting Power and Performance Requirements

Built around the Arm® Cortex®-M33 core, the Silicon Labs EFR32BG22 Bluetooth Low-Energy 5.2 SoC family architecture integrates a comprehensive set of features and capabilities required in designs for battery-powered IoT devices, wearables and other mobile products (Figure 3).

 

Figure 3. The Silicon Labs EFR32BG22 SoC architecture combines an Arm Cortex-M33 core and a comprehensive set of peripherals with features designed to optimize BLE communications, enhance security, and minimize power consumption in low-power designs. (Image source: Silicon Labs)

 

Along with the Arm Cortex-M33 core and associated memory, the baseline EFR32BG22 SoC architecture combines an extensive set of serial interfaces, GPIO channels, clocks, and timers. The integrated 12-bit analog-to-digital converter (ADC) supports single-ended or differential input processing at up to 1 megasample per second (MSPS) with a novel architecture that combines elements from successive approximation register (SAR) and delta-sigma converters.
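For reference, converting a 12-bit result to a voltage is simple arithmetic, as in the sketch below; the 3.3 V reference is an assumption, and the reference actually configured for the ADC should be used instead.

```c
#include <stdint.h>

/* Convert a 12-bit single-ended ADC code to volts.
 * ADC_VREF is an assumed reference voltage; substitute the reference
 * actually selected in the ADC configuration. */
#define ADC_VREF        3.3f
#define ADC_FULL_SCALE  4095.0f   /* 2^12 - 1 for a 12-bit converter */

static inline float adc_code_to_volts(uint16_t code)
{
    return ((float)code / ADC_FULL_SCALE) * ADC_VREF;
}
```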

Within the EFR32BG22 family, different family members are designed to meet specific requirements for processing and Bluetooth operations. For example, developers building designs with more compute-intensive requirements can select the EFR32BG22C222 SoC, which provides a higher-speed core, more GPIOs, and higher transmit (TX) power. For designs built for RTLS or IPS applications, developers can turn to the EFR32BG22C224 SoC with built-in support for IQ sampling and increased receiver (RX) sensitivity.

At the foundation of each member of the EFR32BG22 family are a complete radio subsystem, security module, and energy management unit that provide a wide range of services required for secure low-power Bluetooth communications.

 

Low-power Bluetooth Radio Subsystem

The EFR32BG22 family radio subsystem supports Bluetooth Low Energy 5.2 through separate TX and RX signal paths controlled by a dedicated ultra-low-power Arm Cortex-M0+ processor core. The radio subsystem design complements the processing capability of this core with dedicated blocks including a frame controller (FRC), a cyclic redundancy check (CRC) module and a dedicated radio buffer controller (BUFC) that manages RAM buffers (Figure 4).

 

Figure 4. The EFR32BG22 SoC integrates a complete BLE radio subsystem controlled by a dedicated Arm Cortex-M0+ processor core. (Image source: Silicon Labs)

 

Based on a direct-conversion transmitter architecture, the TX path combines an on-chip power amplifier (PA) with modulator (MOD) and frequency synthesizer. In performing any required carrier-sense multiple access with collision avoidance (CSMA/CA) or listen-before-talk (LBT) protocols, the Arm Cortex-M0+ radio controller automatically manages the necessary frame transmission timing.

The RX path uses a low intermediate frequency (IF) receiver architecture that integrates a low-noise amplifier (LNA), automatic gain control (AGC), and an IF ADC. Demodulation (DEMOD) is performed digitally, with decimation and filtering that can be configured to support receiver bandwidths from 0.1 to 2530 kilohertz (kHz). Finally, the RX signal chain generates the receiver RSSI value used for a wide variety of services, including power optimization, signal quality control, and proximity detection, among others.
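One common use of the RSSI value for proximity estimation is a log-distance path loss model, sketched below. The calibration constants, the measured power at one meter and the path loss exponent, are assumptions that must be characterized for the actual product and environment.

```c
#include <math.h>

/* Log-distance path loss model: distance in meters estimated from RSSI.
 * 'rssi_at_1m_dbm' (measured power at 1 m) and 'path_loss_exponent'
 * (roughly 2.0 in free space, higher indoors) are calibration values
 * that must be measured for the actual deployment. */
static float rssi_to_distance_m(float rssi_dbm,
                                float rssi_at_1m_dbm,
                                float path_loss_exponent)
{
    return powf(10.0f, (rssi_at_1m_dbm - rssi_dbm) /
                       (10.0f * path_loss_exponent));
}
```

For example, with a measured power of -50 dBm at 1 m and an exponent of 2.0, an RSSI reading of -70 dBm corresponds to a distance of roughly 10 m.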

Operating in parallel with the RX signal path, Silicon Labs' RFSENSE module monitors the input signal and wakes the device when it detects RF energy above a defined threshold. To help reduce false alerts when operating in electrically noisy environments, the RFSENSE module also provides a selective mode that generates the wake-up signal only when it detects a specific pattern in the energy rather than a burst of random RF energy. In this case, the pattern corresponds to an on-off keying (OOK) preamble in a transmitted packet, so the energy detected by the RFSENSE module is more likely to signal an actual communications transaction.
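Conceptually, the selective mode amounts to a configuration choice like the one sketched below. The structure and field names are illustrative placeholders, not the RFSENSE register map or a Silicon Labs API; they only show the distinction between waking on raw energy and waking on a recognized OOK pattern.

```c
#include <stdbool.h>
#include <stdint.h>

/* Conceptual configuration for an RF wake-up detector -- field names are
 * illustrative placeholders, not the actual RFSENSE programming interface. */
typedef struct {
    bool     selective_mode;        /* require an OOK preamble pattern      */
    uint32_t ook_sync_word;         /* expected on-off keying pattern       */
    int16_t  energy_threshold_dbm;  /* minimum RF energy to consider at all */
} rfsense_cfg_t;

/* In selective mode, random bursts of RF energy above the threshold do not
 * wake the device; only energy carrying the expected OOK preamble does. */
static const rfsense_cfg_t wake_cfg = {
    .selective_mode       = true,
    .ook_sync_word        = 0xB16Fu,   /* illustrative pattern */
    .energy_threshold_dbm = -60,
};
```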

 

Hardware Support for Building Secure Systems

Securing battery-powered connected devices requires capabilities that traditional processors used in earlier designs were never built to provide. Designed for operation in less vulnerable conditions, those processors have lacked some of the physical and functional capabilities required to protect today's IoT devices and wearables. For example, the ready physical availability of IoT devices and wearables makes it easy for hackers to attack them with side-channel methods such as differential power analysis (DPA) that can expose secret data and private keys. Using these keys, hackers can employ a variety of methods to spoof legitimate devices and gain access to secure networks and supposedly protected resources. Even more easily, hackers already routinely penetrate wireless networks to reach poorly secured connected devices as a prelude to the same sort of attack.

For designers, requirements for a minimal bill of materials (BOM) and extended battery life have often forced the adoption of software-based security methods. Unfortunately, those methods remain only as secure as the application software and operating system themselves. Perhaps worse from the user's point of view, security mechanisms implemented purely in software introduce noticeable lags in communications and in the application's perceived responsiveness. To harden security without sacrificing performance, connected designs depend on hardware-based security mechanisms.

The EFR32BG22 family helps developers protect device designs using a combination of hardware-based security mechanisms. At the core of these mechanisms, a cryptographic accelerator speeds data encryption and decryption using a wide range of Advanced Encryption Standard (AES) key lengths and modes. For authentication and signing operations, the accelerator supports popular elliptic curve cryptography (ECC) curves and hashes.

At a lower level, a true random number generator (TRNG) provides the non-deterministic values needed to avoid the threats that arise when pseudorandom number generators produce predictable or repeating sequences. A still lower-level mechanism protects the accelerator from the kind of side-channel DPA attacks mentioned earlier.
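In practice, developers typically reach these hardware capabilities through a standard cryptographic library bundled with the vendor's SDK, such as Mbed TLS, with hardware offload handled transparently by accelerated builds. The sketch below encrypts a single block with AES-128 using the standard Mbed TLS API; a production design would use an authenticated mode such as AES-CCM or AES-GCM rather than ECB, which is shown only to keep the example to one block.

```c
#include "mbedtls/aes.h"   /* standard Mbed TLS AES interface */

/* Encrypt a single 16-byte block with AES-128 in ECB mode.
 * Returns 0 on success, or an Mbed TLS error code. */
int encrypt_block(const unsigned char key[16],
                  const unsigned char in[16],
                  unsigned char out[16])
{
    mbedtls_aes_context ctx;
    int ret;

    mbedtls_aes_init(&ctx);
    ret = mbedtls_aes_setkey_enc(&ctx, key, 128);   /* 128-bit key */
    if (ret == 0) {
        ret = mbedtls_aes_crypt_ecb(&ctx, MBEDTLS_AES_ENCRYPT, in, out);
    }
    mbedtls_aes_free(&ctx);
    return ret;
}
```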

Implementing system security with these mechanisms is only half the battle in any connected product. Indeed, mitigating threats in deployed systems is a constant struggle made even more difficult in sophisticated battery-powered designs. In the past, developers who deployed an otherwise secure design often left it exposed to malware injection attacks or even to penetration through open debug interfaces. The EFR32BG22 family addresses both of those concerns through specialized capabilities designed to mitigate malicious firmware injection and debug interface penetration.

These SoCs provide a security feature called Secure Boot with root of trust and secure loader (RTSL) that uses a two-stage bootloader designed to ensure that an EFR32BG22-based system boots only with authenticated firmware (Figure 5).

 

Figure 5. Supported in the Silicon Labs EFR32BG22 SoC family, Secure Boot with RTSL builds a root of trust on trusted firmware booted from ROM. (Image source: Silicon Labs)

 

Conceptually, Secure Boot with RTSL addresses a weakness in older single-stage bootloader systems that permitted hackers to take complete control of a connected system by booting it with compromised firmware. The use of signed firmware would seem to provide a solution to this problem. In practice, however, the use of counterfeit certificates to sign firmware or use of legitimate certificates fraudulently obtained by bad actors can leave even signed booting methods exposed to attack.

In contrast, an EFR32BG22-based system establishes a root of trust built on a first stage bootloader that pulls trusted firmware from ROM. In turn, this trusted software uses strict authentication methods to verify the source and integrity of the second stage bootloader code, which in turn verifies and loads the application code.
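The chain of trust can be pictured with a minimal sketch, shown below with placeholder verification and launch functions; this is a conceptual outline of a two-stage secure boot flow, not the actual Silicon Labs bootloader implementation.

```c
#include <stdbool.h>
#include <stdint.h>

/* Placeholders for signature checks against keys provisioned in ROM/OTP
 * and for handing control to a verified image. */
extern bool verify_signature(const void *image, uint32_t len,
                             const void *public_key);
extern void jump_to_image(const void *image);
extern void halt_boot(void);

/* Stage 1 runs from immutable ROM: it only verifies and launches the
 * second-stage bootloader, establishing the root of trust. */
void first_stage_boot(const void *stage2_image, uint32_t stage2_len,
                      const void *rom_public_key)
{
    if (!verify_signature(stage2_image, stage2_len, rom_public_key)) {
        halt_boot();                 /* refuse to run unauthenticated code */
    }
    jump_to_image(stage2_image);
}

/* Stage 2 verifies and launches the application firmware, extending the
 * chain of trust rooted in ROM. */
void second_stage_boot(const void *app_image, uint32_t app_len,
                       const void *app_public_key)
{
    if (!verify_signature(app_image, app_len, app_public_key)) {
        halt_boot();
    }
    jump_to_image(app_image);
}
```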

The ability to build a system solution on a root of trust allows developers to deliver products with high confidence in the ongoing integrity of the software, even through over-the-air (OTA) firmware update cycles. Sometimes, however, developers need deeper access to deployed systems at the level of the system's debug port.

Of course, deploying a system solution with an open debug port is a recipe for disaster. The EFR32BG22 family's secure debug feature provides a practical solution for developers of complex software systems who need the ability to trace faults without compromising the security of the overall system. With secure debug, developers use secure authentication mechanisms to unlock the debug port and gain the visibility they need for fault analysis without compromising the confidentiality of user data in the deployed system.

 

Optimizing Power Consumption

The most effective Bluetooth communications and security mechanisms will nevertheless leave a battery-powered device at a disadvantage if it is unable to provide extended battery life. In fact, energy management and power optimization features are built into the foundation of the EFR32BG22 SoC architecture. Taking full advantage of the low-power Arm Cortex-M33 core, these SoCs consume only 27 microamperes per megahertz (μA/MHz) while running at maximum frequency (76.8 MHz) in their fully active mode (EM0) with all peripherals disabled.

During idle periods, developers can place the SoC in one of several low-power modes including sleep (EM1), deep sleep (EM2), stop (EM3), and shutoff (EM4) mode. As the SoC transitions to lower power modes, the integrated energy management unit (EMU) turns off an increasing number of functional blocks until only the minimum set of blocks needed to wake the SoC remains powered (see Figure 3 again). In addition, the EMU automatically scales down the internal supply voltage when switching to lower power modes. As a result, in a 3.0 V system using the internal DC-DC converter and with all peripherals disabled, power consumption drops dramatically to 17 μA/MHz (76.8 MHz operation) in sleep mode, 1.4 μA in deep sleep mode with full RAM retention, 1.05 μA in stop mode, and 0.17 μA in shutoff mode.

In earlier processors, developers faced a difficult decision in choosing a low-power mode due to the extended time required to wake up those processors. An extended wake-up time not only forces the system to remain unresponsive during the wake-up period, it also wastes energy performing "non-productive" operations associated with the wake-up process. Often, developers would be forced to select a higher power mode than otherwise required to ensure that the processor could wake up in time. In contrast, an EFR32BG22-based system running from RAM requires as little as 1.42 microseconds (μs) to wake up from EM1 sleep mode, or 5.15 μs from EM2 deep sleep or EM3 stop mode. Even waking up from shutoff mode requires only 8.81 milliseconds (ms), which is typically well below the update interval of many battery-powered wearables or IoT devices.
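A rough duty-cycle estimate shows how these figures translate into battery life. All of the values in the sketch below are assumptions for illustration: roughly 220 mAh of usable CR2032 capacity, a combined CPU-plus-radio active current of 3.5 mA, 5 ms of activity every 5 seconds, and the 1.4 μA EM2 deep sleep current quoted above. Under these assumptions the average current works out to a few microamperes, giving a lifetime on the order of five years.

```c
#include <stdio.h>

/* Back-of-the-envelope battery life estimate for a duty-cycled device.
 * All figures are illustrative assumptions, not measured values. */
int main(void)
{
    const double capacity_mah = 220.0;    /* assumed usable CR2032 capacity  */
    const double active_ma    = 3.5;      /* assumed CPU + radio when active */
    const double sleep_ma     = 1.4e-3;   /* 1.4 uA in EM2 deep sleep        */
    const double active_s     = 0.005;    /* active time per reporting cycle */
    const double period_s     = 5.0;      /* one report every five seconds   */

    /* Duty-cycle-weighted average current in mA. */
    double avg_ma = (active_ma * active_s + sleep_ma * (period_s - active_s))
                    / period_s;

    double hours = capacity_mah / avg_ma;
    printf("average current: %.4f mA, lifetime: %.1f years\n",
           avg_ma, hours / (24.0 * 365.0));
    return 0;
}
```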

The ability to take full advantage of these relatively fast wake-up times depends on the availability of mechanisms able to maintain some level of activity even when the SoC is in its EM3 stop power mode. Along with capabilities such as RFSENSE described earlier, other functional blocks such as the SoC's real-time clock (RTC) enable the device to maintain real-world time while sleeping, and its Low Energy Timer (LETIMER) enables the device to generate different waveforms or provide counters for other peripherals. In fact, on-chip peripherals can continue to operate thanks to the SoC's Peripheral Reflex System (PRS), which can route signals between different on-chip peripherals and perform basic logic operations along the way, all without any CPU involvement.
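The idea behind this kind of peripheral-to-peripheral routing is sketched below; the helper function and signal identifiers are placeholders rather than the actual PRS API, and simply illustrate wiring a low-energy timer event directly to an ADC trigger so the CPU can stay asleep.

```c
#include <stdint.h>

/* Conceptual peripheral-to-peripheral routing -- this helper and the
 * signal identifiers are placeholders, not a Silicon Labs API. The point
 * is that the producing and consuming peripherals are wired together
 * through a reflex channel, so no CPU wake-up is needed per event. */
extern void prs_route(uint8_t channel, uint32_t producer_signal,
                      uint32_t consumer_trigger);

#define SIGNAL_LETIMER_UNDERFLOW  0x01u  /* illustrative signal IDs */
#define TRIGGER_ADC_START         0x02u

void configure_timer_triggered_sampling(void)
{
    /* Each LETIMER underflow starts an ADC conversion directly through
     * the reflex channel; the CPU can remain in a low-power mode and
     * only wake when a batch of results is ready. */
    prs_route(0, SIGNAL_LETIMER_UNDERFLOW, TRIGGER_ADC_START);
}
```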

 

Efficient System Development

To help accelerate the implementation of EFR32BG22-based solutions, developers can take advantage of a comprehensive set of tools and libraries built around Silicon Labs' Simplicity Studio integrated development environment (IDE). Within its Bluetooth Low Energy software development kit (SDK), Silicon Labs provides support for advanced features including Bluetooth mesh networking, AoA and AoD processing, and secure over-the-air firmware updates. Along with a full set of Bluetooth profiles, the SDK includes sample applications and source code for implementing custom software.

 

Conclusion

Fast-rising demand for advanced BLE features in battery-powered mobile products places developers under increasing pressure to solve the conflict between needed performance and available power. In the past, these conflicting requirements often led to compromise in system capability, size and cost. Using an advanced Bluetooth SoC, however, developers can build high-volume IoT devices and other battery-powered products capable of supporting next-generation features such as indoor navigation and mesh networking while operating for years on a single coin cell battery.

Other products in the EFR32BG22 line include: