Keith Brody of DigitalRoute
In the first two blogs in this series, the first about digitisation and the second about data integration and management utilities, we established the importance of data management to the success of the digital transformation process.
Having done so, in the next two blogs we will look at the benefits that can accrue from deploying effective data integration and management technology.
To do this, we’ll examine the impact of such applications in actual deployments.
First we’ll consider the case of Alpiq, a leading Swiss utility. A year ago (in 2016) Alpiq identified a clear strategy to transform itself into a digital innovator and market leader. A central aspect of its approach was to develop, promote and market business models that required automated and highly developed smart technologies.
Doing this successfully meant taking control of asset data from across all the company's portfolios and consolidating it into one enterprise-wide IoT platform, on top of which smart applications based on artificial intelligence could be leveraged for smart grid management, trading and demand-side management, says Keith Brody, director of Global Marketing at DigitalRoute.
As is so often the case in major IT projects, reaching goals that sound achievable is hindered by the realities of the existing legacy landscape. Complex legacy infrastructures, tight integrations between IT applications and the network, rapidly growing data volumes, siloed architectures and technology evolution itself combine in many ways to inhibit rather than encourage digital transformation.
Thus, for Alpiq, its evolving infrastructure (think in terms of asset management, energy trading, wholesale generation, distribution operations, and customer service) had become a series of functions working in isolation from each other. The static dependency between network and business operations was preventing rather than driving digital success.
Given the speed of market evolution in recent years this is, perhaps, not surprising. In an industry whose revenues were historically handled by batch-run, post-paid billing processes, the rapid change, first to accommodate strategies that relied on situational awareness (machine-to-human interaction, a growing number of data sources, the need for visualisation, etc.) and today to support stream processing, has been dramatic. The new industrial realities of machine-to-machine interaction, exponentially growing data sources, and the automation of processes demand infrastructure change if digital transformations are to succeed.
In the fourth blog in this series, we'll look further into the details of both the deployment itself and what Alpiq's chosen technology now delivers.
The author of this blog is Keith Brody, director of Global Marketing at DigitalRoute.
Comment on this article below or via Twitter @IoTGN