AI adoption is racing ahead, and its impact on our lives keeps growing. But it may do more harm than good if it is not implemented sustainably in legal, technical and societal terms. At Grundfos, the focus is not only on making AI itself sustainable, but also on the impact that can be achieved with it.
They see AI sustainability as an opportunity to deliver both environmental and economic impact, which in turn funds the next sustainability efforts, explains Christian Rasmussen, Head of Technology, Innovation Lab 3 at Grundfos, introducing his presentation at the Data Innovation Summit 2019.
Grundfos is strongly dedicated to supporting the UN's Sustainable Development Goals number 13 (climate action) and number 6 (clean water and sanitation) – two areas where they can make an impact, since they manufacture pumps.
Digital offerings and data quality
Apart from manufacturing water pumps, Grundfos provides digital offerings to their clients for monitoring, control, performance and fault detection based on analysis of performance sensor data from the pumping systems.
And as data is at the core of their digital offerings, ensuring high data quality along their whole value chain is Grundfos' top priority.
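To illustrate the kind of analysis such offerings rely on, here is a minimal sketch of rule-based fault detection on pump sensor readings. The sensor names and thresholds are invented for illustration; they are not Grundfos' actual logic.

```python
# Minimal sketch of rule-based fault detection on pump sensor data.
# Sensor fields and thresholds are illustrative, not Grundfos' actual logic.

def detect_faults(readings, max_temp_c=80.0, min_flow_lpm=5.0):
    """Flag readings that breach simple operating limits."""
    faults = []
    for r in readings:
        if r["temperature_c"] > max_temp_c:
            faults.append((r["timestamp"], "overtemperature"))
        if r["flow_lpm"] < min_flow_lpm:
            faults.append((r["timestamp"], "low flow / possible dry run"))
    return faults

readings = [
    {"timestamp": "2019-03-01T10:00", "temperature_c": 65.0, "flow_lpm": 12.0},
    {"timestamp": "2019-03-01T10:05", "temperature_c": 85.5, "flow_lpm": 11.5},
    {"timestamp": "2019-03-01T10:10", "temperature_c": 66.0, "flow_lpm": 2.0},
]
faults = detect_faults(readings)
```

Note how directly the value of the analysis depends on the quality of the incoming sensor data, which is the point of the sections that follow.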
The data dilemma at Grundfos
However, when an enterprise starts working with data and digital offerings, it faces a dilemma that Grundfos, too, came across.
Christian refers to this as the enterprise DAIR dilemma. On the one hand, an enterprise can focus on building rock-solid data management and data governance, enterprise-wide capabilities, scaling and re-usability.
On the other hand, it can focus on use case agility: run analytics on every pilot separately and operationalise quickly with minimal upfront investment. But this approach lacks the foundation and platform needed to scale.
So the main question is how an enterprise balances these two extremes. Christian presented how Grundfos solved this dilemma by adopting balanced data management across their entire data value chain.
Balancing data and innovation at Grundfos
As we've seen previously, Grundfos has integrated digital transformation into their business core and built digital services on top of it, so they can move ahead in the digital era. This new core of digital offerings is a more efficient way of doing business, but it is still evolving and developing. Christian therefore states that they are focused on developing an engine of balanced innovation, which covers both laying the data foundations and doing the pilots. They run the pilots one by one, take the learnings and implement them in the data foundation.
What is good data quality?
If we talk about ensuring data quality, we first need to define what good data quality means.
“The purpose for which you are going to use the data, defines the quality of the data”, explains Christian.
“The level of quality of data represents the degree to which data meets the expectations of data consumers, based on their intended use of the data.” – Laura Sebastian-Coleman
Data quality depends on the expectations of data consumers, e.g., what a data scientist needs from a data set in order to be able to work with it.
Data Quality Assessment Method
With this in mind, Grundfos developed their Data Quality Assessment Method, which helps raise awareness about data quality and gives the organisation both an understanding of what good data quality is and a common language for talking about it.
Christian gives the example of using the term “data lineage” when talking with the business function. Communication was difficult because business people are not data scientists and don't use this technical jargon. So he started using the term “data supply chain” instead, because everyone in a company knows what a supply chain is.
Finding a common language also makes it easier for data scientists to explain the requirements the data must meet for them to work with it. This, in turn, helps improve data quality.
Grundfos’ Data Quality Assessment Method consists of three dimensions:
- Data definition – assessing the quality of metadata describing the data set.
- Data quality – assessing the data quality in a quantitative manner and comparing it to the set requirements.
- Data availability – assessing the FAIRness of the data set.
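The second dimension, quantitative data quality, can be sketched as measuring a data set against agreed requirements. The metric (completeness), field names and thresholds below are illustrative assumptions, not Grundfos' actual method.

```python
# Sketch of a quantitative data quality check (the "Data quality" dimension):
# compare a measured metric against agreed requirements per field.
# Field names, the completeness metric and thresholds are illustrative.

def completeness(records, field):
    """Share of records where the field is present and non-null."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def assess(records, requirements):
    """Return pass/fail per field against the required completeness level."""
    return {
        field: completeness(records, field) >= required
        for field, required in requirements.items()
    }

records = [
    {"pump_id": "P1", "flow_lpm": 10.2},
    {"pump_id": "P2", "flow_lpm": None},
    {"pump_id": "P3", "flow_lpm": 9.8},
]
# Require pump_id to always be filled, flow readings at least 90% of the time.
result = assess(records, {"pump_id": 1.0, "flow_lpm": 0.9})
```

A real assessment would cover more dimensions (validity, timeliness, consistency), but the principle is the same: the requirement comes from the data consumer, the measurement from the data.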
The Data Quality Assessment Method is also a tool that Grundfos uses to educate people on data quality, responsibility and how to improve it.
It has six steps:
- Identify the stakeholders
- Identify persons who will be responsible for findings and commitment
- Map the data models
- Handover of the report to identified persons from step 2
- Identified persons will delegate improvement tasks, and data quality improvement will begin.
What is FAIR data?
FAIR data is:
- Findable – data recorded in a central catalogue, not on individual computers.
- Accessible – data can be accessed by new members who didn’t create it.
- Interoperable – data can be used across the organisation.
- Reusable – data is kept up to date at all times.
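The four criteria above can be recorded as a simple checklist per data set. The structure below is an illustrative sketch, not Grundfos' actual tooling.

```python
# Sketch of recording a FAIR assessment for a data set as a checklist.
# The class and its fields are illustrative, not Grundfos' actual tooling.
from dataclasses import dataclass, asdict


@dataclass
class FairAssessment:
    findable: bool       # recorded in a central catalogue
    accessible: bool     # usable by members who didn't create it
    interoperable: bool  # usable across the organisation
    reusable: bool       # kept up to date

    def is_fair(self):
        """A data set is FAIR only if all four criteria hold."""
        return all(asdict(self).values())


assessment = FairAssessment(
    findable=True, accessible=True, interoperable=True, reusable=False
)
```

Making the criteria explicit like this is what turns “FAIR” from a slogan into something a team can actually check and act on.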
Who is involved in improving data quality?
Several functions within the company sit in different parts of the data value chain, and they need to work together to improve data quality.
- Data Scientists understand the use of the data and can articulate the requirements the data must fulfil to be of good quality.
- The Business function creates the value and defines the next steps forward in the business that data should support.
- FAIR Q forum – a group of people from across the whole company who help build the quality guidelines and foundation.
- Quality functions – a group of people across the entire organisation that ensures data quality in every step of the process.
In the process of creating the new digital offerings, people from these different functions gather in Grundfos’ Digital Transformation office to work together.
The FAIR Q umbrella
The FAIR Q umbrella is a network that brings people together and enables them to talk about data quality questions.
It consists of several parts or building blocks for ensuring data quality:
- Data catalogues
- Good data quality reports
- Data process management
- Data quality assessment method.
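As a concrete illustration of the first building block, a data catalogue entry might look like the sketch below. Every field here, including the “data supply chain” lineage section, is a hypothetical example, not Grundfos' actual catalogue schema.

```python
# Sketch of a data catalogue entry, one of the FAIR Q building blocks.
# All fields, including the "data supply chain" lineage, are illustrative.

catalogue_entry = {
    "dataset": "pump_performance_telemetry",
    "owner": "digital_offerings_team",      # responsible party
    "description": "Sensor readings from installed pumping systems.",
    "data_supply_chain": [                  # lineage, in business terms
        "pump sensors",
        "edge gateway",
        "cloud ingestion",
        "analytics platform",
    ],
    "quality_report": "reports/pump_telemetry_dq.html",
}


def find(catalogue, keyword):
    """Findability in practice: locate data sets by a keyword search."""
    return [e["dataset"] for e in catalogue if keyword in e["description"].lower()]


matches = find([catalogue_entry], "sensor")
```

A central catalogue like this is what makes data findable by people other than its creators, which is the first of the FAIR criteria described above.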
The FAIR Q network enables Grundfos to manage their data foundation. As Christian says, it's an entirely different approach from appointing a Chief Data Officer as a central role who looks after the data. It is a tool that helps them work both bottom-up and top-down. At the bottom, people from different functions work together and talk in a structured way, creating guidelines and making recommendations, since they are hands-on involved in data quality work. The top then makes decisions and creates policies based on those guidelines and recommendations.
Implementing data quality requirements into the development process
The overall process of implementing data quality into the business is actually an old-fashioned product development process, with a step-by-step waterfall methodology, states Christian.
And today, hardly anyone wants to do it the traditional way; everybody wants to do agile. But this data quality implementation process is embedded in Grundfos' core business: how they operate and produce products.
They help their people become more data literate, know where data quality management and assessment belong in the process, and ultimately create better data. The journey from a product company to a service company also means making sure the products they produce, and the data those products create, are of good quality, as the service is derived from that data.
The Data Innovation Summit has gone 100% Online and become a Global event!
You can now join the summit from the comfort of your home or office and enjoy the unparalleled content shared through the program. The entire program will be streamed LIVE through the event platform Agorify from 18 to 21 August 2020.
Register on the link below to get your online ticket and listen to more than 300 sessions delivered by the leading data-driven companies in the world!