With the rise of connected assets and sensors, and the ever-expanding capabilities of predictive analytics and machine learning, insurers have huge opportunities to know and understand more of the world around them. We’ve spoken in the past about how this will enable insurers to change their business models, moving from a model of insurance (i.e. what they’ve been doing for the past few hundred years) to one of assurance (where risks are reduced or prevented, and where the insurer actively helps the customer to protect their assets). We’ve also spoken about how the ‘law of unintended consequences’ means that hyper-connectivity will bring about new, unforeseen developments. On a macro level, the insurance market may develop in unexpected ways: the various players (insurers, customers, OEMs, brokers, etc.) may change places or be replaced entirely as each develops new technological capabilities. But there will be consequences on a micro level too, as our understanding of device and asset behaviour becomes much more detailed. One technology driving that understanding is Digital Twin technology.
Digital Twin technology was rated by Gartner as one of the top ten strategic technology trends for 2017. In effect, it allows companies to create a digital version of a device, machine or system that can then be used for simulation purposes. This may include diagnosing faults, preventing downtime or generating predictive models that allow greater understanding of the real, physical version of the device. An early form of Digital Twin technology was pioneered by NASA in the early days of space exploration, allowing engineers to diagnose and fix machine faults remotely. As Bernard Marr wrote in Forbes (6 March 2017), when disaster struck the Apollo 13 mission, “it was the innovation of mirrored systems still on earth that allowed engineers and astronauts to determine how they could rescue the mission”. Indeed, the importance of this is difficult to overstate, and as Thomas Kaiser, SAP Senior Vice President of IoT, states, “digital twins are becoming a business imperative, covering the entire lifecycle of an asset or process and forming the foundation for connected products and services. Companies that fail to respond will be left behind.”
So what are the implications for insurance? We’ve gone on record many times to say that insurers who fail to adopt connected technologies will become extinct in the near future. But deciding to adopt connected technologies is only part of the answer: they need to be baked into the company’s business model. Part of doing this means utilising the information generated by networks of smart devices. If you’re a marine insurer and all of your customers have vessels packed full of engine diagnostic sensors and connected technology, you could, theoretically speaking, create predictive models that tell you how likely each vessel is to have engine problems (and how likely it is to generate a claim).
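To make the idea concrete, here is a minimal sketch of what such a predictive model might look like. The sensor features (vibration, oil temperature, hours since service) and the logistic weights are entirely illustrative assumptions, not calibrated against any real vessel or claims data; a production model would be fitted to historical sensor and claims records.

```python
import math

def engine_risk_score(vibration_mm_s, oil_temp_c, hours_since_service):
    """Toy logistic model mapping three illustrative sensor readings
    to a 0-1 probability-style risk score. All weights and baselines
    below are made-up assumptions for illustration only."""
    z = (0.8 * (vibration_mm_s - 4.0)            # excess vibration
         + 0.05 * (oil_temp_c - 90.0)            # oil running hot
         + 0.002 * (hours_since_service - 500))  # overdue maintenance
    return 1.0 / (1.0 + math.exp(-z))

# A well-maintained vessel scores low; a neglected one scores high.
healthy = engine_risk_score(3.5, 85.0, 200)
stressed = engine_risk_score(7.0, 110.0, 1500)
```

The point is not the particular formula but the shape of the capability: continuous sensor feeds become per-asset risk scores that can feed into pricing, renewal and claims-likelihood decisions.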
Rolls-Royce are already doing this, applying predictive analytics to maintain efficiency and prevent downtime in their engines. They are also talking about autonomous vessels by 2020, which would make huge amounts of digital data available for analysis. In a similar vein, GE are applying this technology to their renewable energy offering, with predictive maintenance modelling for wind farms. Indeed, Concirrus works with CHS Engineering and Heathrow Airport to monitor critical infrastructure and detect potential malfunctions before they result in downtime.
What happens to insurers when this kind of data is being produced by their customers? How do they ensure that they aren’t side-lined within the industry, with other players (brokers, for example) becoming powerful, data-fuelled entities with massive insight into the risks of the market?
The answer lies in the insurer’s ability to translate raw data into behavioural insight. With greater insight into the likely behaviour (and risk) of individual assets, insurers can begin to manage risk more effectively and even work with their customers to help prevent it. Closer relationships with customers’ risk management departments are likely to be key here.
This data will also help insurers to model and plan their overall risk portfolio, envisaging likely scenarios and ensuring that an optimal balance of risk is maintained. This ‘digital portfolio’ would help remove some elements of uncertainty within the business and help plan more effective reinsurance policies, for example.
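One way to picture a ‘digital portfolio’ is as a Monte Carlo simulation over the book of business. The sketch below is a deliberately crude assumption-laden example: each policy claims independently with a modelled probability and, if it claims, pays out its full sum insured. Real severity distributions, correlations and catastrophe scenarios would make this far richer, but even this toy version shows how an insurer could sanity-check an expected loss or a reinsurance attachment point.

```python
import random

def simulate_portfolio_losses(policies, n_scenarios=10_000, seed=42):
    """Monte Carlo sketch of a 'digital portfolio'. `policies` is a list
    of (annual_claim_probability, sum_insured) pairs — an illustrative
    simplification where a claim always costs the full sum insured.
    Returns total portfolio loss for each simulated year."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_scenarios):
        total = sum(sum_insured
                    for claim_prob, sum_insured in policies
                    if rng.random() < claim_prob)
        losses.append(total)
    return losses

# Illustrative three-policy book: (claim probability, sum insured)
book = [(0.02, 1_000_000), (0.05, 250_000), (0.10, 100_000)]
losses = simulate_portfolio_losses(book)
expected_loss = sum(losses) / len(losses)
# A high-percentile loss, e.g. to sanity-check a reinsurance attachment point
var_99 = sorted(losses)[int(0.99 * len(losses))]
```

The expected loss from the simulation should sit near the analytical figure (0.02 × 1,000,000 + 0.05 × 250,000 + 0.10 × 100,000 = 42,500), while the 99th-percentile figure gives a feel for the tail that reinsurance would need to cover.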
But in order to leverage this data, insurers will need ways of bringing it all together and making sense of it. This is currently beyond the capabilities of many organisations, which means that third-party technology providers have a huge role to play. These companies will be the intermediary between insurers and the data providers (customers, industry datasets, OEMs, etc.). The race to achieve this is on, and the winners will enjoy a period of substantial competitive advantage over the rest of the market.