Making IoT data work for your business

January 6, 2021

Posted by: Anasia D'mello

Adam Mayer of Qlik

There was a time you couldn’t escape the discussion around the Internet of Everything (IoE), which in itself felt symbolic of the journey we’d inevitably go on with the Internet of Things (IoT).

“Our immediate reaction when discovering new technology is more, more, more,” says Adam Mayer, senior manager at Qlik, “without necessarily ensuring we’re getting the most out of what we already have.” Consequently, organisations were being encouraged to put sensors on every light, door and toilet before they had begun to see any return on investment.

This is a journey similar to the one many early adopters of Big Data went on; it took time to understand that having more data didn’t necessarily translate into improved outcomes without better ways to visualise and analyse it. Subsequently, organisations are coming to realise that the greatest potential of the IoT lies in how the data produced by these devices can be explored and probed to provide learnings and improve outcomes.

The Breathe London Project, which our partner C40 Cities is running with the Greater London Authority, is an example of this. As part of an investigation into Londoners’ exposure to air pollution, a network of 100 sensor pods was installed on lampposts and buildings throughout the city, while Google Street View cars used mobile sensors to continuously transmit air quality measurements across London.

While the information is undoubtedly interesting, the value of the project lies not in the gathering and representation of data but in the policy decisions that will be made to reduce the pollution ‘hotspots’ these sensors identify.

Barriers to analysing IoT data

However, for many organisations, this is easier said than done. There are significant challenges associated with integrating IoT data for analysis.

Firstly, organisations must overcome the challenge of integrating a variety of data from different sources into their data pipeline. Qlik’s research with IDC revealed that integrating disparate data into standard formats is one of the greatest challenges organisations face in transforming data into an analytics-ready form (cited by 37%).

The introduction of the IoT significantly exacerbates this challenge, as it can quickly multiply the number of data sources feeding the pipeline, often in unfamiliar or unstructured formats that must be transformed before they are ready for analysis.
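
To make that concrete, a minimal sketch of this kind of normalisation might look as follows – the payload fields, units and device types shown are assumptions for illustration only, not a specific Qlik or Breathe London schema:

# Minimal sketch: mapping two hypothetical sensor payload formats onto one
# standard record before it enters the analytics pipeline.
from datetime import datetime, timezone

def normalise(payload: dict) -> dict:
    """Map a raw device payload onto a common schema."""
    if "pm25_ugm3" in payload:          # e.g. a fixed air-quality pod
        return {
            "sensor_id": payload["pod_id"],
            "measured_at": payload["ts"],               # already ISO 8601
            "metric": "pm2.5",
            "value": float(payload["pm25_ugm3"]),
        }
    if "reading" in payload:            # e.g. a mobile sensor using epoch time
        return {
            "sensor_id": payload["device"],
            "measured_at": datetime.fromtimestamp(
                payload["epoch"], tz=timezone.utc).isoformat(),
            "metric": payload["reading"]["type"],
            "value": float(payload["reading"]["value"]),
        }
    raise ValueError("Unknown payload format")

print(normalise({"pod_id": "pod-17", "ts": "2021-01-06T09:00:00Z", "pm25_ugm3": "12.4"}))
print(normalise({"device": "car-03", "epoch": 1609923600,
                 "reading": {"type": "pm2.5", "value": 15.1}}))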

The issue is further aggravated by the second challenge: the high volume and velocity of throughput. With many IoT devices taking continuous readings, data is produced in far greater quantities than most organisations are used to handling. This leads naturally to the final hurdle: even if the data pipeline is robust enough to ingest and transform the continuous data flow from IoT devices, many visualisation and analytics solutions aren’t able to provide real-time information updates.

This means that, whether the bottleneck lies in the software or in the time that elapses before a user reviews its output, the learnings from the data can only be applied retrospectively – not in real time.

Keeping up with the pace of data

Organisations hoping to take advantage of the IoT can overcome these challenges by building a data supply chain that can quickly integrate and transform data from the multitude of different sources.

Traditional batch-orientated methods like Extract, Transform and Load (ETL) are too slow, inefficient and disruptive to support the timely integration and analysis of IoT data, and they often require heavy coding and scripting. With 31% of global organisations citing ‘a lack of skilled resources to process data’ as one of the greatest challenges in making data analytics-ready, it’s critical to the success of IoT implementations that organisations reduce this significant drain on the time of skilled programmers.

Change Data Capture (CDC) technology presents an achievable, smart alternative for those wanting to process their IoT data for analysis quickly. Instead of repeatedly reloading entire datasets from source systems, CDC enables continuous incremental replication by identifying and copying data updates as they take place. Streaming data in this way significantly increases the velocity with which data can be ingested and transferred into data warehouses or data lakes for analysis.
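
As a highly simplified sketch of the idea – assuming a source system that records its changes in an append-only log, and using illustrative names throughout rather than any particular CDC product – an incremental replication loop might look like this:

# Sketch of change data capture: only rows changed since the last offset are
# copied to the target, instead of reloading the whole source table.
from typing import Dict, List, Tuple

def capture_changes(change_log: List[dict], last_offset: int) -> Tuple[List[dict], int]:
    """Return the changes recorded after last_offset, plus the new offset."""
    return change_log[last_offset:], len(change_log)

def apply_changes(target: Dict[str, dict], changes: List[dict]) -> None:
    """Incrementally replicate inserts, updates and deletes into the target store."""
    for change in changes:
        if change["op"] == "delete":
            target.pop(change["key"], None)
        else:                                   # insert or update
            target[change["key"]] = change["row"]

# Two polling cycles: only the delta since the previous offset is processed.
change_log = [
    {"op": "insert", "key": "pod-17", "row": {"pm2.5": 12.4}},
    {"op": "update", "key": "pod-17", "row": {"pm2.5": 13.0}},
]
target, offset = {}, 0
changes, offset = capture_changes(change_log, offset)
apply_changes(target, changes)

change_log.append({"op": "delete", "key": "pod-17"})
changes, offset = capture_changes(change_log, offset)
apply_changes(target, changes)
print(target)   # {} – the second pass replicated a single delete, not a full reload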

Finally, when the data pipeline can integrate data in near-real time, it’s important that the analytics solutions are not only capable of continuously visualising up-to-date information, but also that a layer of proactivity is built in to support the decision-making process. Real-time alerting provides not only insights, but can also recommend actions for users to trigger quickly. Leveraging cognitive engines to deliver this Active Intelligence will be a key feature of the next generation of BI tools.
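
A minimal sketch of that alerting idea, with an assumed threshold and a stubbed notification step purely for illustration, might look like this:

# Evaluate each incoming reading against a threshold and pair any breach
# with a suggested action for the user to act on.
THRESHOLDS = {"pm2.5": 25.0}    # assumed limit in micrograms per cubic metre

def notify(alert: dict) -> None:
    # Stand-in for email, SMS or an in-app notification.
    print(f"ALERT {alert['sensor_id']}: {alert['message']} -> {alert['action']}")

def evaluate(reading: dict) -> None:
    """Check one reading and emit an alert with a recommended action."""
    limit = THRESHOLDS.get(reading["metric"])
    if limit is not None and reading["value"] > limit:
        notify({
            "sensor_id": reading["sensor_id"],
            "message": f"{reading['metric']} at {reading['value']} exceeds {limit}",
            "action": "flag this location as a pollution hotspot for review",
        })

evaluate({"sensor_id": "pod-17", "metric": "pm2.5", "value": 31.2})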

A data pipeline to deliver the promise of the IoT

Organisations must ensure they don’t fall into the same trap with the IoT as many did in the early days of Big Data, when the goal of having more data took precedence over using what they had to drive the best outcomes. Looking at early adopters of the IoT, too many are more focused on receiving real-time updates than on taking the necessary steps to transform and analyse that data to empower better decision-making.

The promise of the IoT is the opportunity to continuously learn, act and react. To ensure IoT implementations in organisations have the velocity and flexibility to support advanced analytics, they must first ensure their whole data pipeline is up to the task.

The author is Adam Mayer, senior manager at Qlik.

Comment on this article below or via Twitter @IoTGN