One industry that abundantly reaps the fruits of AI innovation is, without doubt, manufacturing. AI has proven to be a real game-changer for the heavy-asset and manufacturing industry.
Industrial AI is a manifestation of the digitalisation that is transforming manufacturing. Industry 4.0, Industrial Internet of Things (IIoT) or smart manufacturing are the main drivers of this digitalisation and all refer to applying information and communication technology to every step in the manufacturing supply chain.
“Manufacturing is the sector that generates far more data than any other sector,” stated Diego Galar, Professor of Condition Monitoring at Lulea University of Technology and Head of Maintenance & Reliability at Tecnalia, in his presentation on industrial AI at the Data Innovation Summit 2019.
Intelligent manufacturing and AI
Digitalisation has evolved manufacturing to the point where we have moved from smart manufacturing, where sensors were deployed to automate and optimise factory processes, to digital factories, which allow monitoring digital assets, i.e. products before they are even manufactured. The impact of this evolution is such that traditional manufacturing can be transformed into digital manufacturing in the course of 2-3 years, points out Diego.
What’s even more remarkable is that digital transformation has reached even the most conservative sectors, like metallurgy and mining. But where do we see AI’s role in this evolution? Diego states that the concept of AI isn’t new, but its adaptation to industry provides a specific context and the potential for AI to grow even faster and augment our lives. He points to maintenance, operations and production as the biggest beneficiaries of industrial AI.
AI and engineers are in a position to co-create solutions. Diego argues that artificial intelligence can’t replace traditional engineers, but their joint efforts can produce solutions that improve industrial assets and reduce machine shutdowns. Because at the end of the day, that is what Industry 4.0 is all about – increasing performance and decreasing the number of failures, states Diego.
Enemies to industrial AI
Artificial intelligence in industry isn’t free from challenges. It has to cope with different anomalies, Diego warns: point anomalies, contextual anomalies and collective anomalies.
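These three anomaly types are standard terms in anomaly detection. As a minimal sketch of the simplest case (a hypothetical illustration, not from Diego's talk), a point anomaly can be flagged with a z-score test against the rest of the series:

```python
from statistics import mean, stdev

def point_anomalies(readings, z_threshold=2.5):
    """Flag point anomalies: individual readings far from the series mean."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [i for i, x in enumerate(readings)
            if sigma > 0 and abs(x - mu) / sigma > z_threshold]

# A vibration-like signal with one obvious spike at index 5.
signal = [1.0, 1.1, 0.9, 1.0, 1.2, 9.5, 1.0, 0.8, 1.1, 1.0]
print(point_anomalies(signal))  # → [5]
```

Contextual and collective anomalies need more machinery than this: a reading can be normal in absolute terms yet anomalous for its context (e.g. time of day or operating mode), and a whole subsequence can be anomalous even though each individual point looks normal.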
The enemies of artificial intelligence are false positives and false negatives. They are detrimental to process operations: false positives trigger actions that are not required, while false negatives let real failures go undetected. A still more dangerous consequence is that they erode both the performance of the AI and the confidence in the system.
At the end of the day, Industry 4.0 is all about increasing performance and decreasing the number of failures.
Being the main protagonist in the midst of a revolution is a hard job and attracts many nemeses. The biggest ones for AI in the industry are black swans and the swan song.
Black swans are outliers that are not found in our data, but when they appear they cause extreme, catastrophic consequences. They are characterised by retrospective predictability: humans can find explanations for them only after they happen.
The swan song works the other way around. It refers to the signals a machine emits indicating that it is about to die. These signals can be detected with AI, so that a shutdown can be planned in advance and catastrophic consequences avoided.
With all these enemies lurking about, working in manufacturing entails a risk perspective. On one side we have probability-based, historical data and the knowledge derived from it; on the other side we have surprises. It is AI’s role to fill the gap between our knowledge and the surprises.
In order to do that, we need a tool that brings all the company data into one place and enables running AI systems. This tool, or set of tools, is the digital twin.
Digital twins are the enabling technology for running AI systems, maintains Diego. They are data repositories, models or virtualisations of the assets built to serve the two main goals of digitalisation, increasing performance and decreasing failures, and they also provide business KPIs for the assets.
These digital twins have been evolving extremely rapidly, Diego highlights. The first generation, digital twins 1.0, is basic machine learning where the OT (operational technology) data of the asset is fed into an anomaly detection model.
The need to integrate IT systems into the digital twins led to the second generation. Bringing the taxonomies and ontologies of all assets together in the digital twin, and converging OT and IT technology, creates digital twins 2.0. This second generation combines real-time data from the asset with data from the IT systems in order to produce metadata that can be exploited.
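As a sketch of what this OT/IT convergence means in practice, the snippet below joins a hypothetical real-time sensor reading with asset records from an IT registry; all asset names, fields and values are invented for illustration:

```python
# Hypothetical OT stream: real-time readings keyed by asset id.
ot_stream = [
    {"asset_id": "pump-01", "temp_c": 78.4, "vibration_mm_s": 4.1},
    {"asset_id": "pump-02", "temp_c": 61.0, "vibration_mm_s": 1.2},
]

# Hypothetical IT registry: asset taxonomy and maintenance records.
it_registry = {
    "pump-01": {"model": "CP-200", "criticality": "high", "last_service": "2019-03-10"},
    "pump-02": {"model": "CP-200", "criticality": "low", "last_service": "2019-07-22"},
}

def enrich(reading, registry):
    """Join one OT reading with its IT metadata into a twin record."""
    meta = registry.get(reading["asset_id"], {})
    return {**reading, **meta}

twin_records = [enrich(r, it_registry) for r in ot_stream]
for rec in twin_records:
    print(rec["asset_id"], rec["criticality"], rec["temp_c"])
```

The point of the join is that an anomaly model can now weigh a high vibration reading differently for a high-criticality asset than for a low-criticality one.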
However, digital twins 2.0 still have a limitation: they do not employ all the data available in the organisation. There is much more data the organisation collects that is not related to the asset but is still crucial to consider. When this contextual data is brought to the asset, we create context awareness, i.e. all the circumstances related to the asset as described by the dataset, giving rise to digital twins 2.1, or digital twins 2.X, as Diego describes them. Even with context awareness, digital twins 2.X are limited with regard to failures and performance issues that are not in the dataset and therefore can’t be predicted (black swan losses).
To fill this gap, we have to merge the datasets with the physics of failure, or create hybrid models, as Diego calls them. This convergence of data and physics is called digital twins 3.0. Digital twins 3.0, or hybrid models, bring together data from the field, the physics of the failure and context awareness in order to deliver context-driven services. They are the most sophisticated generation, but that doesn’t render the previous versions 1.0 and 2.0 useless, points out Diego. Digital twins 1.0 are useful for operational-level warnings and early detection, while 2.0 are applied for edge computing and tactical-level operation and maintenance. And 3.0 is applied at the higher level of strategic management.
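A hybrid model in this sense can be sketched as a physics-based baseline corrected by a data-driven term learned from field measurements. The following is a toy illustration with invented numbers and formulas, not Diego's actual models:

```python
def physics_temp(load_kw, ambient_c):
    """Toy first-principles model: temperature rise proportional to load."""
    return ambient_c + 0.5 * load_kw

def residual_correction(history):
    """Data-driven part: mean error of the physics model on past data."""
    errors = [measured - physics_temp(load, amb)
              for load, amb, measured in history]
    return sum(errors) / len(errors)

def hybrid_temp(load_kw, ambient_c, history):
    """Hybrid model: physics baseline plus a learned bias correction."""
    return physics_temp(load_kw, ambient_c) + residual_correction(history)

# Field measurements: (load_kw, ambient_c, measured_temp_c).
history = [(10, 20, 26.0), (20, 20, 31.0), (30, 20, 36.0)]
# Physics alone predicts 25, 30, 35: a consistent +1.0 degree bias.
print(hybrid_temp(40, 20, history))  # → 41.0 instead of physics-only 40.0
```

Real hybrid twins use far richer physics-of-failure models and learned components, but the division of labour is the same: physics covers the regimes the data has never seen, and data corrects where the physics is too coarse.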
How digital twins help manufacturing
As Diego stressed earlier in his presentation, digital twins are expected to enable virtual commissioning of our assets as digital assets. They serve not merely as repositories into which all the data is dumped, but as a compound of all the databases and the physics of failure, built in order to deliver services.
The evolution of digital twins delivers the different types of analytics applied in the industry:
- Diagnostic/descriptive analytics – applied to gain insight into failures (what happened and why it happened).
- Predictive analytics – used to predict when the next failure/anomaly is going to happen.
- Prescriptive analytics – applied to show the implications of different decision models.
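The three levels can be illustrated on a hypothetical failure log: counting failures by cause is descriptive/diagnostic, a naive mean-time-between-failures forecast is predictive, and turning that forecast into a recommended action is prescriptive. All data and thresholds below are invented:

```python
from datetime import datetime, timedelta

# Hypothetical failure log for one asset: (date, root cause).
failures = [
    ("2019-01-05", "bearing"),
    ("2019-02-20", "bearing"),
    ("2019-04-02", "seal"),
    ("2019-05-18", "bearing"),
]
dates = [datetime.strptime(d, "%Y-%m-%d") for d, _ in failures]

# Descriptive/diagnostic: what happened, and why (counts per cause).
by_cause = {}
for _, cause in failures:
    by_cause[cause] = by_cause.get(cause, 0) + 1

# Predictive: naive mean time between failures -> next expected failure.
gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
mtbf = sum(gaps) / len(gaps)
next_failure = dates[-1] + timedelta(days=mtbf)

# Prescriptive: turn the diagnosis and prediction into an action.
action = ("schedule bearing inspection before predicted failure"
          if by_cause.get("bearing", 0) >= 2 else "monitor only")

print(by_cause, next_failure.date(), action)
```

Cognitive analytics, the level Diego describes next, would close this loop automatically: learning from whether the prescribed action worked and updating its own models.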
AI in the industry is striving to achieve cognitive analytics: analytics that embodies all of the previous types, where the AI autonomously learns from past actions, finds correlations, learns from outcomes and gives prescriptions.
The Data Innovation Summit has gone 100% Online and become a Global event!
You can now join the summit from the comfort of your home or office, and enjoy the unparalleled content shared through the program. The entire program will be streamed LIVE through the event platform Agorify from the 18th to the 21st of August 2020.
Register on the link below to get your online ticket and listen to more than 300 sessions delivered by the leading data-driven companies in the world!