It has come to be the number guiding us through the IoT revolution and all the tech innovation driven by it: a projected 50 billion connected devices by 2020. Forbes wrote of retailers betting big on the Internet of Things, infrastructure was reworked to accommodate the impending wave of data and new protocols, and government programs were drafted to try to regulate the deluge – but the storm never arrived as expected.
In fact, it looks like IoT adoption, while far from reversing course, has stalled considerably. Statista.com, for example, shows steady growth but no geometric progression, putting the early 2020s at around the 30 billion mark. Gartner, too, revised its estimate down to a much more modest 20 billion.
It may be that we’ve been wrongfully assuming that the biggest challenge for IoT evolution is connectivity – it’s not. Broadband is currently affordable in 111 countries, with a basic fixed or mobile plan costing less than 5 percent of Gross National Income (GNI) per capita. With that out of the way in developed countries, some turned, instead, to the other components: the IoT layers. From the device itself to embedded systems, and from the data processors to the cloud, all of this ground was covered in the past year by new technologies, services, and protocols. Still, not enough.
What are we missing?
The real challenge is perfecting the UI/UX layer by addressing the three pillars it sits on within every IoT deployment – safety, relevance, and performance – through a slew of improvements made possible only by continuous, deep monitoring.
Keeping the user safe and the notion of digital trust
Consumer devices are the main driver of IoT, with over 5 billion units sold in 2017 alone – 63 percent of the total. Smart cars, smart TVs, digital set-top boxes and wearables are as ubiquitous as cell phones nowadays, but with scandals like the Norwegian Consumer Council’s #ToyFail report and many, many stories of data breaches hitting the media, 62 percent of consumer IoT users are now very concerned about privacy, and 54 percent are worried about their data security. (Add to that the 21 percent who openly declare they’re scared of AI taking over the world, and the picture is quite apocalyptic.)
Digital trust, built before any app is even installed, is an exercise in transparency. Failing to mention that you’ll record everything a kid says through their smart toy, and then selling that info to third parties, is a blatant breach – but so are other, more sophisticated ways of collecting information. As a result, smart users are getting smarter, and “what gateways does your railway use?” or “who’s your bank’s APM?” have a chance to become as meaningful as “Intel Inside.”
Keeping the user using and the notion of digital relevance
Customers procure, buy, sign up; but the truth is, ownership does not guarantee usership. In fact, at one point, of the 19 million registered Fitbit users, only 9.5 million were active. What’s more, about a third of smart wearable owners ditch their devices or turn some of the smart services off. Even so, wearables are on the rise, with new launches announced each month, some of which are already making a huge impact – like Pebble and internal health trackers. Other lifestyle IoT devices are also here to stay. Smart home applications are stable and rising as well, with the term itself, smart home, being searched over 600,000 times every day.
So why aren’t buyers using more? The explanation may lie in two things. First, the category is very fragmented. Data shared with one app may skip another app, or simply not work well with a third provider’s back end – and it’s simply too hard for the user to keep it all updated.
Secondly, faulty business logic or system limitations impede the automatic correlation of customer engagement data with business performance data (and vice versa). If the wrong consumer data is collected or looked at in the first place, turning it into insights and then action does not make a positive difference on the user side. In other words, the relationship with the app or service becomes irrelevant because the analytics are wrong and the offering no longer reflects a need.
The formula for making wearables stick is still a huge TBD and not much can be done independently to integrate platforms and apps from different providers. However, a lot can be done to affect one app’s usability and how it’s measured, thus making it more relevant—and competitive.
Successful companies not only cover these bases by employing the right kind of performance monitoring tools; they have also discovered that the best monitoring is not a patchwork of disparate systems but a unified platform that correlates real-time user behavior with business performance. That real-time visibility lets them calibrate the right parameters, at the right time, all the time.
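As a minimal sketch of that correlation idea (the metric names and numbers here are hypothetical, not from any particular product), a unified platform might measure how closely a user-engagement series actually tracks a business-outcome series:

```python
# Hypothetical daily metrics that a unified monitoring platform
# would collect side by side from its user and business feeds.
daily_active_sessions = [120, 150, 90, 200, 170, 160, 210]
daily_orders = [14, 18, 10, 25, 20, 19, 26]

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A value near 1.0 suggests engagement is driving business outcomes;
# a weak or negative value flags the "wrong data" problem above.
r = pearson(daily_active_sessions, daily_orders)
```

In a patchwork of disparate tools, these two series live in different systems and the correlation is never computed; in a unified platform it is one query away.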
Keeping the consumer IoT usable with real-time performance
If digital trust and effective need are satisfied, the user experience relies entirely on performance, including speed of delivery, accuracy, and reliability.
That said, IT teams now have to deal with several additional layers of complexity on top of the usual management conundrums: a growing number of devices, a variety of data formats, custom business logic, external threats, and the custom nature of IoT applications – all of these, and the relationships between them, leave monitoring with a lot of blind spots.
And with the sheer footprint of an IoT infrastructure (with all its layers) being as wide as it is, the lines between the integrity of the infrastructure, the performance of the infrastructure, and the performance of the user applications blur.
While automating each of these key IoT components is justified and desirable, they have to work together seamlessly to create a scalable, intelligent system—otherwise the puzzle is disjointed and unstable, and the full picture unclear.
So, again, what gives?
Speed of delivery (of services, analytics, and triggered actions) depends on the infrastructure and the performance management system in place. If the infrastructure supports stream processing capabilities and the system performance and application performance management (APM) tools are able to monitor, baseline, troubleshoot, and scale in real time, speed of delivery is optimal.
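A toy illustration of that real-time baselining idea (the class name and thresholds are my own, not any particular APM product’s): keep a rolling window of latency samples and flag any new sample that deviates far from the recent baseline.

```python
from collections import deque

class LatencyBaseline:
    """Rolling baseline over the last `window` samples; flags outliers."""

    def __init__(self, window=50, threshold=3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold  # how many std-devs count as anomalous

    def observe(self, latency_ms):
        """Record a sample; return True if it deviates from the baseline."""
        history = list(self.samples)
        self.samples.append(latency_ms)
        if len(history) < 10:  # not enough history to baseline yet
            return False
        mean = sum(history) / len(history)
        var = sum((s - mean) ** 2 for s in history) / len(history)
        std = var ** 0.5 or 1.0  # guard against a perfectly flat baseline
        return abs(latency_ms - mean) / std > self.threshold
```

A real APM system would keep one such baseline per endpoint and device class, and stream the flags into alerting and auto-scaling logic.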
Accuracy determines relevance, and it depends largely on both the ability to monitor end user behavior, and—as previously mentioned—on correlating it with application performance levels and business outcome in real time.
Reliability, beyond the previous indicators, is also conditioned by uptime. Continuous execution, and the ability to anticipate failures and threats before they occur, is paramount.
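One way to anticipate a failure before it occurs (a sketch under my own simplifying assumptions, not a production technique) is to fit a linear trend to recent error-rate samples and alert when the projection crosses a limit within a short horizon:

```python
def breaches_soon(error_rates, limit, horizon):
    """Least-squares linear trend over recent samples; True if the
    extrapolated rate reaches `limit` within `horizon` future steps."""
    n = len(error_rates)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(error_rates) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, error_rates))
    den = sum((x - mean_x) ** 2 for x in xs) or 1.0
    slope = num / den
    # Project the fitted line `horizon` steps past the latest sample.
    projected = mean_y + slope * (n - 1 - mean_x + horizon)
    return projected >= limit
```

A steadily climbing error rate then triggers an alert well before the limit is actually hit, while a flat one stays quiet.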