Thought Leadership

AI breaks down the silos of industrial data Podcast – Transcript

In a recent podcast, host Spencer Acain is joined by Ralf Wagner, Senior Vice President of Data-Driven Manufacturing at Siemens, to explore core applications of AI within Siemens Insights Hub, ranging from out-of-the-box solutions to powerful customizations for expert users. Beyond that, Ralf examines the importance of data-driven manufacturing going forward, and why AI will play a key role in it.

Check out the full podcast here or keep reading for a transcript of that conversation.

Spencer Acain:

Hello, and welcome to the AI Spectrum podcast. I’m your host, Spencer Acain. In this series, we explore a wide range of AI topics from all across Siemens and how they’re applied to different technologies. Today I’m joined once again by Ralf Wagner, senior vice president of Data-Driven Manufacturing at Siemens. Last time we talked, Ralf, we kind of left things on a bit of a cliffhanger, talking about your four major AI applications within Insights Hub. So I’d love to pick things up again from there, and maybe you can just kind of really dive in and tell us more about how you’re applying AI to the data-driven manufacturing process and adopting it across the entire breadth of your tools and your applications.

Ralf Wagner:

Yeah, maybe I can actually walk you through those four areas and put a little more meat on the bone here. So let me start with the out-of-the-box solutions, which are basically a SaaS offering, software as a service, tailored toward a specific kind of use case pattern. So we have a solution called Quality Prediction, a solution called Energy Optimization, and some others, and these are just applications running in your browser, specifically dedicated to those areas, like energy and quality. But in the background there are AI models empowering everything you see on the screen as a user. So you don’t need to be an AI expert at all, but you do need to be an expert in the domain this solution and this application is targeting: a quality engineer, say, or maybe a sustainability or automation engineer when it comes to energy optimization.

But let me give you an example for the first out-of-the-box solution, which has been in our portfolio for a while: Insights Hub Quality Prediction. This is a real-time, in-production analytics solution that analyzes the data coming from the equipment, from the quality system, and from the environment, like temperature, humidity, and everything that can influence quality. And while you are in the production process, it gives you an indication of whether, at the end of that process, you will have the quality level you actually want and are planning for. I’ll give you one example. If you are in a gas turbine manufacturing environment, every gas turbine has hundreds of blades in it, and each blade gets an individual coating. And the coating process, depending on the size of the gas turbine, takes about 15 to 20 minutes. It’s multiple layers and it has to be perfect.

If it’s not perfect, there’s a higher risk that the blade breaks while the gas turbine is being operated. And we can all imagine that a blade breaking in a gas turbine means a major breakdown of that turbine, which we all want to avoid. So that’s why the quality of the coating for that blade is so important. Usually at the end of the coating process there are X-ray tests, all kinds of tests, to make sure the quality is at that level. But with a solution like Insights Hub Quality Prediction, we are monitoring the coating process in detail with every kind of information and data we get from, as I said, the equipment, the environment, and the quality system. So we can actually predict, each and every minute of this 15 to 20-minute coating process, whether we are still in spec for the quality outcome we planned for.

So maybe after seven minutes you have enough data collected, and the AI model is predicting that this coating process is not going to make the quality test at the end. What you can do is, after those seven minutes, stop the coating process, take the blade out, maybe put it to rework, or maybe completely uncoat it and start from scratch, or maybe even throw it away. And you can start the coating process for the next blade. Basically, instead of waiting for the complete coating, which takes the 15 to 20 minutes, you stop the process after seven minutes and save the additional time you would have wasted finishing the coating, only to find out in the X-ray test that it didn’t pass. We helped customers using Insights Hub Quality Prediction in that one step of their production, the coating, to cut non-conformance by 70%.
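The early-stopping logic described above can be sketched in a few lines. This is a minimal illustration, not Siemens code: the scoring function, features, and threshold stand in for a trained quality-prediction model.

```python
# Minimal sketch of in-process quality prediction with early stopping.
# The scoring function stands in for a trained AI model; the readings
# and abort threshold are illustrative assumptions.

def predict_in_spec(readings):
    """Toy stand-in for a quality model: returns an estimated score for
    whether the finished coating will pass the final quality test."""
    # Hypothetical rule: quality drops as layer-thickness variance grows.
    mean = sum(readings) / len(readings)
    variance = sum((r - mean) ** 2 for r in readings) / len(readings)
    return max(0.0, 1.0 - variance)  # 1.0 = certainly in spec

def monitor_coating(per_minute_readings, abort_below=0.5):
    """Re-check the prediction after every minute of the coating process;
    abort early when the pass score falls below the threshold."""
    history = []
    for minute, reading in enumerate(per_minute_readings, start=1):
        history.append(reading)
        if predict_in_spec(history) < abort_below:
            return ("abort", minute)  # stop now, rework or scrap the blade
    return ("pass", len(history))

# A stable process runs to completion; a drifting one is stopped early.
print(monitor_coating([1.0, 1.0, 1.01, 0.99, 1.0]))   # ("pass", 5)
print(monitor_coating([1.0, 1.0, 1.1, 1.8, 3.5, 3.6]))  # ("abort", 5)
```

The saving Ralf describes comes from the `abort` branch: minutes not spent finishing a coating that the model already expects to fail.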

So, with that example, an out-of-the-box, data-driven AI solution for the main quality expert, the quality manager on the production site, has a major impact on non-conformance. This is something we can provide out of the box. And that’s only the first quadrant of the four. So let’s go to the next one, which is the analytical tools. This is also a set of browser-based SaaS applications, but they help you more generically. This is more for the production engineer, more for the operator of the equipment. You collect your time series data from the equipment and from the OT systems, the events and everything. And then you can actually detect anomalies in the behavior of the equipment, and you’re given alerts and maybe even an indication, a prediction, that something you may not want is going to happen pretty soon.

So maybe there is a CNC machine with a certain pattern in its temperature-to-energy-consumption ratio. And every time this ratio reaches a certain level, based on whatever is being produced right now, a certain pattern can be observed, and a breakdown of this machine might be coming. So you detect an anomaly in the temperature-to-energy-consumption ratio, and based on the past data and the AI model trained against that historical data, you can predict that there might be a breakdown if you see this kind of pattern. So that’s a set of tools not tailored and built for a specific use case, but for more generic use, where the production engineer and the operator of the line and the machine can use the data in a much broader set of use cases, in order to get insights into what is happening with the equipment and maybe even predict its future behavior and potential breakdowns.
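The ratio-pattern check can be sketched with a simple statistical baseline. A real deployment would use a trained model over time-series data; here a z-score against historically observed ratios stands in for it, and all values are illustrative.

```python
# Sketch of the temperature-to-energy-ratio anomaly check: flag samples
# whose ratio deviates strongly from the historical baseline. A z-score
# stands in here for a trained anomaly-detection model.
from statistics import mean, stdev

def ratio_anomalies(temps, energies, baseline_ratios, z_limit=3.0):
    """Return indices where temperature/energy deviates from the
    historically observed ratios by more than z_limit deviations."""
    mu, sigma = mean(baseline_ratios), stdev(baseline_ratios)
    flagged = []
    for i, (t, e) in enumerate(zip(temps, energies)):
        z = abs(t / e - mu) / sigma
        if z > z_limit:
            flagged.append(i)  # candidate precursor of a breakdown
    return flagged

# Baseline ratios hover near 2.0; sample 3 drifts far outside that band.
baseline = [2.0, 1.9, 2.1, 2.0, 2.05, 1.95]
temps = [80, 82, 81, 140]
energies = [40, 41, 40, 35]
print(ratio_anomalies(temps, energies, baseline))  # [3]
```

The flagged index would feed the alerting Ralf mentions; the prediction part comes from having seen this pattern precede breakdowns in past data.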

So that’s the second set of AI-related tools we have. Let me move to the third one: the AI model management and execution area. This is tailored toward data scientists. Many manufacturing companies don’t really have a lot of data scientists, but the larger enterprises especially do. And they also want to work with the data that sits in Insights Hub, which holds a lot of value. Typically you don’t want to move that data out of Insights Hub to somewhere else to be analyzed, because IoT data is huge. It’s big data. And data has gravity; moving data around typically comes with associated cost. So you’d better do it the other way around: bring your AI, your models, to the data, and not the data to the AI model in some other cloud.

So we allow the data scientists to train an AI or ML model with the tools they want to use. Data scientists are sometimes a little picky when it comes to their toolchain, and we don’t want to tell them what to use. We give them complete freedom, whether they want to use AWS SageMaker, an open-source framework like TensorFlow, or something else. In their local environment, they can get access to the data in Insights Hub. They can train their model, and then we give them an environment to execute and manage that model in Insights Hub.
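The "bring the model to the data" pattern can be sketched generically. The class and method names below are illustrative, not the Insights Hub API: the point is that the trained model travels into an execution environment that already holds the data.

```python
# Generic sketch of the "bring the model to the data" pattern: the data
# stays put, and externally trained models are registered and executed
# next to it. Names here are illustrative, not the Insights Hub API.

class ModelExecutionEnv:
    """Holds datasets in place and runs registered models against them."""

    def __init__(self, datasets):
        self._datasets = datasets   # data never leaves this environment
        self._models = {}

    def register(self, name, predict_fn):
        # predict_fn was trained elsewhere (SageMaker, TensorFlow, ...)
        # and exported as a plain callable.
        self._models[name] = predict_fn

    def run(self, model_name, dataset_name):
        # The model travels to the data, not the data to the model.
        model = self._models[model_name]
        return [model(record) for record in self._datasets[dataset_name]]

# A locally trained model is just a callable once exported.
env = ModelExecutionEnv({"spindle_temps": [61, 64, 93, 59]})
env.register("overheat_flagger", lambda t: t > 90)
print(env.run("overheat_flagger", "spindle_temps"))  # [False, False, True, False]
```

This is why data gravity matters: only the small model artifact crosses the network, never the big time-series store.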

Let me give you one example, a simple one. One of our automotive customers had a quality issue because sometimes the model type they put on the back of the trunk, those little letters and numbers, didn’t really match the actual model of the car itself. So at the end of the line, when they put the model type on the trunk, they took a little picture and linked it to the individual car. And there was an image recognition model, a little model that a data scientist at that manufacturing site created, because there were certain letters, and by the way, this was in China, so Chinese characters as well. So he built the image recognition model, trained it against the typical characters they use, and put it into Insights Hub. And then every time a car came, a picture was taken, analyzed by the model, and linked to the car, and there was a validation of whether the badge put on the trunk actually matched the car it was on.
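The validation step at the end of that pipeline is simple to sketch. Here a plain string stands in for the image-recognition output, and the model code "X200 L" is a made-up example, not a real customer value.

```python
# Toy version of the trunk-badge check: the image recognition model
# (stood in for here by a plain string) reads the badge, and the result
# is validated against the model type the car record says it should be.

def validate_badge(expected_model, recognized_text):
    """Compare recognized badge text with the expected model code,
    ignoring case and spacing differences from the image pipeline."""
    def normalize(s):
        return "".join(s.split()).upper()
    return normalize(recognized_text) == normalize(expected_model)

# "X200 L" is a hypothetical model code for illustration.
print(validate_badge("X200 L", "X200L"))  # True: badge matches the car
print(validate_badge("X200 L", "X300L"))  # False: wrong badge applied
```

In the real setup the `recognized_text` would come from the trained character-recognition model running in Insights Hub, and a `False` result would raise a quality alert before the car leaves the line.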

So this is a typical scenario: a specific problem, a model can be trained, data scientists do this and then use the Insights Hub capability, with the data in there and managed, to get the value and this use case implemented. So that’s the third area. And the fourth area, which we just added at the end of last year, is what we call the Copilot. We started with the production Copilot, which actually allows you, in a ChatGPT-like interface, to talk to Insights Hub, to the data that sits in Insights Hub, whether in the time series and event store, in the integrated data lake, or in the MIS data warehouse. It doesn’t matter. You can start to interact with your data in a chat-like interface and environment.

So these are the four quadrants which we have added to Insights Hub over the last few years, more and more, in constant innovation. There are constantly capabilities being added using new technology. And as I said, sometimes it’s not really visible to the user; we use it in the background and take the complexity away. But sometimes we put those high-end tools into the hands of an Insights Hub user or data scientist, so they can take advantage of the latest and greatest in the environment as well.

Spencer Acain:

Wow, it really sounds like you’re rapidly incorporating a lot of these types of AI technologies into Insights Hub, and like you and your customers are already seeing a lot of value from that. So I guess I have to ask now: why is moving to this kind of data-driven manufacturing process so important for the industry as a whole, and what makes AI such an important part of that process?

Ralf Wagner:

As I mentioned before, the continuous improvement process is something that is basically in the genes of every manufacturing company, and still today it’s mostly done with pen and paper, Excel spreadsheets, and whiteboards. Getting the decisions and the root cause analysis done in a fact- and data-driven way is a major improvement step, delivering value in some of the examples I mentioned before. And to make that a systematic process, you embed these tools into the daily stand-ups: don’t look at just an Excel spreadsheet, but at the dashboard Insights Hub provides. And if somebody has a question, you can just double-click and go down to the next level, and the next, and find out more, without asking somebody to open up a process, go away to do some more analysis, and come back a week later with maybe an answer, or maybe not. So it makes that overall continuous improvement way, way more efficient, which leads to better decisions and faster implementation at the end of the day.

And going forward, it’s not only about the data which comes from the equipment, the quality systems, or some of the OT systems like SCADA. It’s also about getting more context around the data linked to the equipment, which is where IoT and analytics basically started. Because in industrial IoT, the T, the things in Internet of Things, is the equipment on the shop floor. But more and more over the last few years we have said, “IoT is not a silo in itself. It has to be connected to the systems left and right,” and most prominently to the MES system, the manufacturing execution system. And let me give you an example of why it is important to integrate this into the overall OT landscape. IoT is basically collecting data, mainly from the equipment, and making it available, as in the examples we mentioned before, for analytics, for dashboarding, for transparency, for root cause analysis, and for better decision-making.

MES, on the contrary, takes the data from the top, meaning from the ERP system: how many orders of green, white, black, and yellow bicycles do I have in the system for my customers? Do I have the right color, the right paint? Do I have the material? Is the capacity on the lines for models X, Y, and Z enough? Then I need to do detailed scheduling and distribute the work across the different lines. And then I execute down to the PLCs, to the automation level, and execute the production. That is MES. And now, if we bring these two systems together and contextualize them, taking MES out of its silo and IoT out of its silo and contextualizing between IoT and MES, in the Siemens case between Insights Hub and Opcenter, we can get the next level of insights and contextualization out of that data.

And let me give you a very brief example. Say you have an IoT solution and an anomaly is detected from maybe a CNC machine; some vibration is coming up, so an alert is triggered and the service technician is alarmed. The service technician goes to the machine but doesn’t find anything: “The machine is working well, there’s no issue with it. I don’t know where the vibration is coming from.” And by the way, the vibration is gone by now.

So maybe the following week the same thing happens again. Again there’s an alert, a vibration, an anomaly detected; the service technician goes in and doesn’t find anything. Now, take that IoT anomaly, that vibration, and contextualize it with what is happening on the machine right now: what is being produced, what material is being used, from what supplier, who is operating the machine, all of which sits in the MES system. They would have found out that every time the machine vibrates with an anomaly, it’s not a problem with the machine. It’s not a problem with the operator. It’s because the material being used comes from a different supplier.

And immediately you take a completely different set of action steps, because now you know you don’t have a problem with the machine or anything else; you have a problem with the material. You probably have a supplier quality issue. And you immediately pick up the phone and start to talk to your supplier: “What is happening? Why do I get this kind of vibration, and therefore potentially a quality impact, with the last batch of material you have been sending me?” So this is the next level of insights for data-driven manufacturing: taking the data out of IoT, not keeping it equipment-related only, but relating it to, for example, the MES system and contextualizing it. Again, this brings way, way faster insights, decisions, and actionable findings for the manufacturer, to really fix these kinds of things much faster. In that vibration quality problem, they would maybe not have found the cause for weeks. Maybe they would even have had a recall later because of that quality problem, which they now found because the data was utilized in a completely different way to get to these kinds of insights very, very fast.
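The contextualization step described here is essentially a join between IoT alerts and MES records, followed by a tally of which production attribute co-occurs with the anomalies. This sketch uses made-up field names and values; the real systems would be Insights Hub and an MES like Opcenter.

```python
# Sketch of contextualizing IoT anomalies with MES data: join each
# vibration alert with the MES record covering that time, then count
# which production attribute co-occurs with the anomalies.
# Field names, timestamps, and values are illustrative.
from collections import Counter

def correlate_anomalies(anomaly_times, mes_records, attribute):
    """For each anomaly timestamp, look up the MES record covering it
    and tally the values of one attribute (operator, supplier, ...)."""
    counts = Counter()
    for t in anomaly_times:
        for rec in mes_records:
            if rec["start"] <= t < rec["end"]:
                counts[rec[attribute]] += 1
    return counts

mes = [
    {"start": 0,  "end": 10, "supplier": "A", "operator": "Kim"},
    {"start": 10, "end": 20, "supplier": "B", "operator": "Kim"},
    {"start": 20, "end": 30, "supplier": "A", "operator": "Lee"},
    {"start": 30, "end": 40, "supplier": "B", "operator": "Lee"},
]
alerts = [12, 17, 33, 38]  # vibration anomalies, as timestamps

# Operators split evenly; supplier "B" accounts for every anomaly.
print(correlate_anomalies(alerts, mes, "supplier"))
print(correlate_anomalies(alerts, mes, "operator"))
```

A skewed tally like the supplier one is exactly the signal Ralf describes: the machine and the operator are cleared, and the conversation moves to supplier quality.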

Spencer Acain:

I see, it really sounds like there’s huge potential for benefit in having these systems that can talk to each other and connect with each other in a more in-depth way than was possible before. And to enable this, it sounds like you have a lot of different kinds of AI solutions that you’re offering as part of this whole package. But unfortunately, I think we’re going to have to delve into those applications of AI in another episode, because we are out of time for today. So once again, I have been your host, Spencer Acain, joined by Ralf Wagner on the AI Spectrum podcast. Tune in again next time as we continue our discussion on the applications of AI in data-driven manufacturing.


Siemens Digital Industries Software helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform. Siemens’ software and the comprehensive digital twin enable companies to optimize their design, engineering and manufacturing processes to turn today’s ideas into the sustainable products of the future. From chips to entire systems, from product to process, across all industries. Siemens Digital Industries Software – Accelerating transformation.

Spencer Acain


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/2025/05/16/ai-breaks-down-the-silos-of-industrial-data-podcast-transcript/