Business interruption from supply chain vulnerability is one of the most complex risks in global business environments. Extreme weather caused by climate change has significantly increased this vulnerability.
Yet it is not only supply chains that are subject to interruptions due to climate impacts. Company-internal processes are significantly affected, too.
During the extended heat period in Central Europe in 2018, which caused high temperatures and low water levels in the Rhine, the global chemical manufacturer BASF had to adjust production and logistics at its Ludwigshafen headquarters. These adjustments were accompanied by a quantitative risk assessment, which was crucial for pricing insurance products adequately and for designing parametric products around business interruption.
What is more, climate change has material regulatory and liability implications. While conventional emission trading schemes have been in place for several years, emissions regulation is tightening and new targets and instruments are being shaped. Germany’s recent ruling on emission taxation affects the business risks of companies and demands strong improvement in both fact-based carbon accounting of business processes and industry-wide climate risk reporting and disclosure standards. The latter will increase the exposure of directors and executives who oversee the disclosure of risks.
A comprehensive framework for risk disclosure has been developed by the Task Force on Climate-related Financial Disclosures (TCFD). For insurers and climate risk analysts who work with these risks on a daily basis, it is clear that the complexity of climate risks makes generic, industry-level approaches unsuitable: processes differ substantially among companies within the same industrial segment.
Artificial Intelligence (AI) fed by an underlying modern data logistics layer is best suited to account for such evolving risks in a robust and reliable manner. But how can the data consumed by AI models be integrated most effectively? Let us investigate the state of data systems in businesses and how this is set to change.
From closed data systems to an interoperable world of APIs and micro-services
In any business segment, IT and data systems are heterogeneous and lacking in interoperability. This problem emerges from either a diverse landscape of applications and vendors serving different needs, or from monolithic Enterprise Resource Planning (ERP) systems that are complex in setup, cumbersome in operation and costly in maintenance and expansion.
Use cases that employ AI to work with data across this system landscape are hindered by this lack of interoperability, which makes it almost impossible to consolidate and analyze data. Concretely, rapid AI adoption is blocked because companies need to:
- Carefully evaluate which process is suitable for machine learning and which data is required to deliver a working AI solution;
- Be clear on the desired outcome and the best-suited algorithm and machine learning technique.
Today, typical IT architectures use data warehouses as the workhorse for data management. Data warehouses are highly organized and structured IT systems for data storage (as indicated in the upper panel of Figure 1). Data is transformed and loaded into the warehouse in a predefined manner. Changes to the structure of the data warehouse are costly and often impossible due to mutual dependencies among internal sub-systems. Data warehouses are highly tuned to solve a specific set of problems, but catering to new use cases outside the original scope is difficult.
The predefined structure and aggregation level of the data, the complex system architecture of many interrelated databases, batch-oriented processes and the missing interoperability with external data sources all pose obstacles to the fast development of AI-supported applications. Interoperable data access would allow an AI to learn from a much richer set of data, while direct access to raw data sources without intermediate ETL (extract – transform – load) processes would enable the AI to identify more fine-grained patterns in the data where they exist, and to learn to aggregate the data by itself.
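To make this concrete, here is a minimal sketch in Python. The sensor data and field names are hypothetical; the point is only to show how a warehouse-style ETL step that aggregates to a predefined level discards exactly the fine-grained signal an AI could exploit, whereas direct access to the raw records preserves it:

```python
from statistics import mean

# Hypothetical raw readings: (day, hour, temperature in °C) from a production site.
# Afternoon hours (12-16) carry a heat spike of +10 °C.
raw_readings = [
    ("2018-07-01", h, 20 + (10 if 12 <= h <= 16 else 0)) for h in range(24)
]

def etl_daily_average(readings):
    """Classic warehouse-style ETL: transform raw readings into one value per day."""
    by_day = {}
    for day, _, temp in readings:
        by_day.setdefault(day, []).append(temp)
    return {day: mean(temps) for day, temps in by_day.items()}

warehouse_table = etl_daily_average(raw_readings)
print(warehouse_table["2018-07-01"])  # ≈ 22.08: the afternoon spike is averaged away

# Direct raw access keeps the fine-grained pattern visible.
peak_hours = [h for _, h, t in raw_readings if t >= 30]
print(peak_hours)  # [12, 13, 14, 15, 16]
```

Once the data lands in the warehouse at daily granularity, no downstream model can recover the hourly spike; with raw access, the aggregation level becomes a modeling choice instead of an architectural constraint.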
Our recommendation to resolve this dilemma is to modularize data systems. Instead of designing complex processes and systems to integrate data, companies should focus on making disparate data sources accessible, making any data point searchable and instantly retrievable, using micro-services for dedicated applications on the data.
This approach can be seen in the lower panel of Figure 1. One technology that allows for full interoperability of corporate data builds on the orchestration and industrialization of open-source software. The transition to micro-services can be performed step by step: priority should be given to processes that either create high value (internally or towards customers and partners) or generate major waste. Based on the applications used in these processes and the way they are used, targeted micro-services can be designed with APIs (interfaces) that are as publicly accessible as possible. Financial services companies in Switzerland have started to follow these recommendations and to make the databases of legacy systems API-ready.
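As an illustration of the pattern, the sketch below wraps a single data set in a minimal micro-service with a JSON API, using only the Python standard library. The service name, endpoint paths and emissions figures are invented for the example; a production micro-service would add authentication, schema validation and a real data backend:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical data source the micro-service makes retrievable.
EMISSIONS = {
    "plant-ludwigshafen": {"co2_tonnes": 1234},
    "plant-basel": {"co2_tonnes": 567},
}

class EmissionsAPI(BaseHTTPRequestHandler):
    """A micro-service exposing one data set through a small JSON API."""

    def do_GET(self):
        key = self.path.strip("/")
        record = EMISSIONS.get(key)
        body = json.dumps(record if record else {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging for the example

# Port 0 lets the OS pick a free port; the service runs in a background thread.
server = HTTPServer(("127.0.0.1", 0), EmissionsAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any client (or AI pipeline) can now retrieve a data point over HTTP.
url = f"http://127.0.0.1:{server.server_port}/plant-ludwigshafen"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
print(data)  # {'co2_tonnes': 1234}
server.shutdown()
```

The design choice worth noting is that the service owns exactly one concern and speaks a plain HTTP/JSON contract, so it can be replaced or scaled independently of the legacy system behind it.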
The Internet-of-Things: Ready-to-use points-of-truth for sustainability management
Modularization of data sources also implies that valuable information from outside an organization can be integrated seamlessly via APIs. One such data source is the Internet-of-Things (IoT) – objects which collect data and communicate this information over the internet.
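A sketch of what such an integration step might look like: the payload format, device name and field names below are hypothetical, but the pattern of mapping a device-specific message received via an API onto a common record for downstream analytics is the essential adapter:

```python
import json
from datetime import datetime, timezone

# Hypothetical payload as an IoT water-level sensor might deliver it via an API.
payload = json.dumps({
    "device_id": "rhine-gauge-07",
    "ts": 1531699200,  # Unix epoch seconds
    "readings": {"water_level_cm": 152, "water_temp_c": 27.4},
})

def normalize(raw: str) -> dict:
    """Map a device-specific payload onto a common record for downstream analytics."""
    msg = json.loads(raw)
    return {
        "source": msg["device_id"],
        "observed_at": datetime.fromtimestamp(msg["ts"], tz=timezone.utc).isoformat(),
        "metrics": msg["readings"],
    }

record = normalize(payload)
print(record["observed_at"])  # 2018-07-16T00:00:00+00:00
```

With one such adapter per device family, every IoT source ends up in the same record shape, so risk models consume a single schema regardless of vendor.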
A first collection of IoT-driven use cases in the financial services industry was undertaken by Finextra, a financial technology news company. It mentions B2C applications, such as automated utility payments through smart metering and building surveillance, which allow banks and insurance companies to provide personalized and dynamic products based on the actual usage patterns of buildings.
The International Institute for Sustainable Development (IISD), a think tank, describes how digital technologies come together for data acquisition, storage and curation, trust and accountability, and insight and analysis in sustainable finance. And the Sustainable Digital Finance Alliance introduces a range of applications for IoT technology, for instance lowering the cost of validating green investments through asset monitoring.
The impact of IoT is so significant because it reaches people, objects and processes that other technologies could not reach to the same degree. Sensors and cloud-based analytics therefore enable more informed capital planning for sustainable investments.
If you would like to hear more about Data Logistics and IoT integration for climate-related impact and risk analysis, listen to our recent video interview with the Swiss Re Global Institute.