Aug. 1, 2016
by Hesham ElHamahmy
Enough with the “Billions!”
I get it. There are going to be A LOT of devices untethering and interconnecting our world. Can we focus for a moment on the major factors of cost, security, and power consumption that go into a successful IoT device? Cost, the most cited impediment to rapid deployment, comes down to one predominant factor: radio technology. Security is the second most cited concern among corporations, because billions of bits traverse many types of networks, and every hop is an opportunity for intrusion. And finally, for IoT devices that cost 10x as much to replace as they cost to manufacture (a remote sensor, for example), energy management must ensure the device lasts the expected 3-10 years.


Cost

Cellular modems will always be more expensive than their low-power cousins: WiFi, Zigbee, and BLE. A typical LTE chip can cost $20-$30 vs. under $5 for WiFi, and even less for Zigbee or BLE. The premium buys benefits such as high-speed mobility, connection reliability, and security, where the cheaper radios offer high bandwidth and low power consumption, so just make sure those benefits are worth it.
What should you look for?
  • Does the product need to gather and transmit data over long distances (kilometers)?
  • Is the device intended for use in a highly mobile environment, such as a vehicle?
  • How accurate does the location of the device have to be?
  • Does the product interface with other entities, i.e., machine or human?
  • Can you rely on WiFi or mesh networks to ensure communication coverage?
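The checklist above can be sketched as a simple decision helper. This is a minimal illustration of how the questions map to a radio choice; the function name and its boolean inputs are my own assumptions, not from any standard or library.

```python
# Illustrative sketch: map the radio-selection checklist to a coarse choice.
# The thresholds and function name are assumptions for illustration only.

def recommend_radio(long_range: bool,
                    highly_mobile: bool,
                    needs_network_location: bool,
                    local_coverage: bool) -> str:
    """Return a coarse radio-technology recommendation from checklist answers."""
    # Long range, high mobility, or network-assisted positioning all point
    # toward cellular; otherwise cheaper local radios can do the job.
    if long_range or highly_mobile or needs_network_location:
        return "cellular (LTE)"
    if local_coverage:
        return "WiFi / Zigbee / BLE"
    # No WiFi or mesh coverage to rely on: cellular by elimination.
    return "cellular (LTE)"

print(recommend_radio(long_range=False, highly_mobile=False,
                      needs_network_location=False, local_coverage=True))
# → WiFi / Zigbee / BLE
```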
For those who have concluded LTE is the only viable choice, there is good news. The 3GPP standards body has just approved LTE-M1 and LTE-M2, which aim to drive the cost of LTE chips down to $6-$8 for IoT applications. Even more importantly, many operators are announcing deployments of LTE-M1 or LTE-M2.


Security

Historically, it has been difficult to hack into a mobile network, given its closed nature, vs. WiFi and other mesh networks, where device security is left to the consumer's or enterprise's IT security prowess. That said, even mobile networks are increasingly forced to create aggregation points (hubs) to manage IoT devices, whose count is 10x that of subscribers, and those hubs create an opportunity for malicious intruders. Regardless of the access technology, the challenge is clear: how do you achieve more security than a phone offers, with less processing power than a phone? Here are some typical threats to any mobile network to consider:
1. Improper mobility handling that triggers camping on cells with poor coverage
2. Chatty applications that use unnecessary bandwidth, impacting load balance
3. Requesting more radio resources than are actually used, thereby wasting network resources
4. Poor upper-layer (IoT client to IoT cloud server) security that exposes the RAN to malware behaving within the rules of the link layer
5. SIM spoofing
6. Attach procedures that flood the network, given that link-layer security hasn't been established yet
7. Paging floods in deployments where the device IP address is NAT'ed by a controller instead of the RAN
8. SIP client invite floods
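A common defense against the flood-style threats above (attach floods, paging floods, SIP invite floods) is per-device rate limiting at the aggregation point. The sketch below uses a classic token bucket; the class, its parameters, and the thresholds are illustrative assumptions, not part of any 3GPP interface.

```python
import time

# Illustrative token-bucket sketch of per-device request rate limiting,
# the kind a gateway could apply to attach or paging requests before
# link-layer security is established. All names/values are assumptions.

class AttachRateLimiter:
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec        # sustained requests allowed per second
        self.burst = burst              # short-term burst allowance
        self.tokens = float(burst)      # bucket starts full
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if this request should be admitted."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        self.tokens = min(float(self.burst),
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # budget exhausted: drop or back off, a flood is underway

limiter = AttachRateLimiter(rate_per_sec=1.0, burst=3)
admitted = sum(limiter.allow() for _ in range(10))  # a burst of 10 requests
print(admitted)  # only roughly the burst allowance (≈3) gets through
```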
For anyone in mHealth, financial enterprise, or automotive, the solution can be the difference between a viable and a non-viable product. Of course, nothing is free. Many of the above considerations will add design time and cost, due to the extra hardware performance required to address the range of static and dynamic attacks, provide recovery resiliency, and support remote updates of the device.

Energy Consumption

I refer to energy consumption rather than battery performance because the life expectancy of a battery-operated IoT device is largely determined by the amount of energy the device consumes. Once deployed, the battery is a fixed supply of energy. Many of the challenges with energy consumption are independent of the wireless technology.
  • Cellular devices have to transmit over longer distances and therefore use more power
  • Is the RF technology requiring the device to be always “awake”?
  • Does the device have to activate both transmit and receive RF to communicate, or is discontinuous transmit/receive an option?
  • How does the application usage “chatter” compare to pre-deployment assumptions?
  • Does the device always transmit at maximum throughput, or are there options to set the rate to a fixed or adjustable level as a function of battery level?
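A back-of-the-envelope duty-cycle calculation shows why the questions above, especially application "chatter," dominate battery life. All current and capacity figures below are illustrative assumptions, not measured values for any particular modem.

```python
# Illustrative sketch: estimate battery life from an hourly duty cycle.
# Sleep/active currents and battery capacity are assumed example values.

def battery_life_years(battery_mah: float,
                       sleep_ma: float,
                       active_ma: float,
                       active_sec_per_hour: float) -> float:
    """Estimate device lifetime in years from average current draw."""
    active_frac = active_sec_per_hour / 3600.0
    # Time-weighted average current over one hour of operation.
    avg_ma = active_ma * active_frac + sleep_ma * (1.0 - active_frac)
    hours = battery_mah / avg_ma
    return hours / (24 * 365)

# A device that wakes for 2 s of radio activity per hour vs. a "chatty"
# one that is active 30 s per hour, on the same assumed 2400 mAh battery:
quiet = battery_life_years(2400, sleep_ma=0.01, active_ma=100, active_sec_per_hour=2)
chatty = battery_life_years(2400, sleep_ma=0.01, active_ma=100, active_sec_per_hour=30)
print(round(quiet, 1), round(chatty, 1))  # → 4.2 0.3 (years)
```

The same hardware goes from roughly four years of life to a few months purely as a function of how often the application talks, which is why pre-deployment chatter assumptions must be validated against real usage.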
While most designs do a good job of factoring in the consumption of the wireless modem and the sensor processor under test conditions, application chatter and other factors often deplete the battery faster than expected, leading to costly replacements.
There are plenty of M2M (so passé) devices already enhancing productivity in many markets. The current (r)evolution is the result of a convergence between powerful new data-analytics solutions, which capitalize on the centralization of data in the Cloud, and sharp declines in the cost of cellular modems. Review your design goals to ensure the cost, security, and energy consumption are aligned with your business goals.

For more information on this article or on products and product engineering services to help deliver compelling IoT devices, contact us at