June 9th 1:30 pm CEST
Open discussion with Katerina Mouliadou at ATS GLOBAL
Manufacturing is rarely digital native. Production means have evolved with automation and industrial robotics, but much more can be done thanks to connectivity and Artificial Intelligence. Being able to take decisions from data always requires a transformation involving multiple steps. What are those milestones? Are there expected gains from each step? How do we measure the impact? During this webinar, Ms Katerina Mouliadou, Digital Innovation Lead at ATS Global, will share insights on the steps to reach the ultimate objective of a data-driven operation. We will discuss concrete examples of success stories, but also the difficulties that can be encountered along the way. Are there differences between SMEs and large companies, or from one vertical to another? Is there a generic roadmap for this transformation? What are the tools?
Atlas, the innovation core of ATS Global, creates software-as-a-service (SaaS) solutions, increasing efficiency and opening the door to the fourth industrial revolution for businesses of any size. With no-code functionality, Atlas has made connecting to manufacturing equipment and integrating data from multiple business systems and sources at various data maturity levels flexible and scalable.
Katerina is the Innovation Lead at ATS Global and has over 10 years of experience in R&D and the innovation of new products, manufacturing processes and Industry 4.0 technologies.
She joined ATS in 2019, where her role involves managing the global innovation portfolio, aiming to develop and commercialise smart IoT solutions for the manufacturing industry. Since joining ATS, Katerina has secured and managed over £4M of external funding for the development of new technology offerings, which have won multiple awards and led to the commercialisation of the SaaS offering Atlas.
Her previous work in product and technology development for highly regulated sectors, such as the medical device and pharmaceutical industries, achieved manufacturing efficiency improvements worth over $8M in savings for J&J’s family of companies. Katerina now applies this ‘customer perspective’ to IoT solution development in her role at ATS.
She has an Honours BEng degree in Product Design Engineering from Loughborough University, an MSc in Biomedical Engineering from Imperial College London, and is currently an Executive MBA candidate at Warwick Business School, where her research focuses on enabling data-driven organisations.
Katerina is passionate about enabling tangible value for the business as well as its customers through innovation and strategic collaboration.
ATS Global is an IT systems integrator for manufacturing organizations, present all over the world and growing in the digital space. They install Manufacturing Execution Systems (MES) and Enterprise Resource Planning (ERP) systems, and they get machines to talk to each other. They have also grown through acquisitions in the last few years and now own several Product Lifecycle Management (PLM) solutions as well.
Atlas was initiated back in 2018 by our current CTO, Martin Kelman. He had the idea of creating something more flexible beyond the current MES systems, which require a lot of configuration. He started looking at developing new business models, such as Software as a Service (SaaS) offerings, and we started to develop technology integration. We have a platform that gathers all the solutions we deem most appropriate to resolve the different problems our clients might face.
We’re working on cloud analytics and edge computing. Our strategy is to capture the process level first, to identify what needs to be done, and then the data required to run the processes. We disagree with the idea of capturing data for the sake of it: there should be value and meaning behind the data. This is how we enable our customers to get to grips with their data and make it meaningful for their organization.
We work as a startup within a larger organization, very independently from the rest of the business. This has enabled us to be quite innovative and to create and bring new products to market faster. We’ve branded ourselves as an innovation team. To date we have launched three products:
The backbone of the whole application: software where you can map and execute your business and manufacturing processes.
A follow-up version adding the capability to visualize your data through customizable dashboards and to clean it up, because just getting data is not enough: you need to do a lot of cleaning, a lot of sorting, and a lot of correlating with other inputs.
An edge capability to collect data at the next level, as well as to deploy algorithms at the edge.
Yes, we’re doing both. We took part in projects funded by the European Commission, and we also won innovation awards for Atlas Play. Since 2019, we’ve managed to work with 37 partners. This shows that innovation and collaboration are key to bringing new ideas to the table and accelerating products to market.
Actually, all those criteria are linked with each other. The concept of a “data-driven company” is indeed defined, but defined slightly differently by the various consultancies and organizations working on this.
There is a lack of consistency regarding what data-driven means. It probably means different things to different companies, and different companies will have different objectives about what level of data-drivenness they would like to reach. This diversity is really key to consider when we evaluate a company.
However, in general terms, data-drivenness means that organizations are capable of making decisions based on data available across the whole organization.
That said, it really depends on the organization and on whom you are talking to within it: what they think the problem is versus what the actual problem is.
If you speak to people in charge of production, they will say they don’t have the right IoT integration, or that processes are lacking, or that people don’t have the knowledge to adapt to new technologies and need to be upskilled. If you speak to their IT team, the feedback is quite different.
If we look at it from the triangle of people, processes and technology, the issue is not in the technology anymore. There are many technologies available which can do impressive things if they’re applied correctly. The issue seems to be mainly at the people level: the culture and the mindset needed to adapt to and adopt these new technologies. Because, whether we want it or not, we’ve all been disrupted, especially in the last few years, and forced to use digital tools to communicate and progress in our jobs. Similarly, the manufacturing environment is being disrupted by new technologies coming forward. Some people will adopt them, progress and build a competitive advantage. So if you lag behind, you’re just going to keep lagging and miss the new advancements.
It really seems to be a problem of mindset that can be changed with a good company strategy. What is required first is to define a team with a chief data officer who is clearly responsible for delivering such an initiative and strategy within the organization. It is also really key to have the backing of the CEO to be able to push the initiative down.
Many case studies have been built on a top-down approach, and eventually people need to accept and understand the benefit the business will get and how important it is to adopt these technologies. At the moment, departments often still work in silos: one department has digitized everything, but another department in the same company has no visibility into the data produced.
Another problem we need to solve is the availability of real-time data to support fast & efficient decision processes.
A few of the data-driven companies at the forefront are the ones that offer digital software solutions or products, such as Amazon, Microsoft, Facebook or Google. The question is how the rest of us can catch up with them. Obviously, the capability they have in terms of knowledge of the technologies is exceptional, so it is difficult to compare with them.
Some of the companies that have reached good maturity levels compared to other manufacturing companies have done so by putting the right structure in place, and by upskilling and training people to ensure the required level of data literacy across the organization.
The other key success factor is diversifying the project groups, instead of having a segregated team of tech people while everybody else continues their day-to-day job, be it administrative or engineering work. They diversify by seeding a technical person into every team, so those conversations start happening, or by designating internal champions whom people can reach out to and ask how to do something in a more digital way.
There are many interesting studies about digitalization levels across companies globally. Company size doesn’t seem to make any difference to the data-driven readiness of a company.
This probably comes from the different challenges they face. Large companies obviously have a lot of people to manage; they need a huge initiative to change the mindset, which can be very difficult and usually takes a long time. They usually have very strict processes and have to make sure the introduction of new technologies doesn’t disrupt everything too much, and that people still know what they’re doing. On the other hand, they can financially afford the integration of new, advanced technologies.
Conversely, smaller companies obviously don’t have the same financial freedom, but they work in smaller groups, so they’re more agile. I think agility is key to becoming data-driven, and the ability to adapt and change based on your environment is a competitive advantage of small companies.
Today, with the funding available from national and European institutions, these smaller organizations can get an extra push to become data-driven and embrace what digitalisation relies on: the skills, the attitude, the mindset and the structure. Here they will see that the size of the company is an advantage rather than a drawback.
This is exactly what we say all the time to our clients: “you get what you put in”. This brings us back to the discussion about the mindset and the people involvement/commitment.
If your system relies on manual inputs from people, you need to make sure they do it on time and that the values they enter are accurate. Otherwise, obviously, your data quality will be completely compromised. Data quality is definitely one of the key issues, and it should be resolved by upskilling people and giving them a good level of data literacy, so that they understand how important data quality and the automation of data collection are for the business.
Indeed, cybersecurity is often considered a common challenge, but I see it more as a risk than a challenge. There are secure technologies out there, and probably most of us use the cloud a lot more than we even realize.
At the organizational level, it’s important to know about all these data flows: where everything goes, where it is stored, for how long, who has access to it, and so on. Technically speaking, it is feasible to isolate the data and make sure it stays private to the individual company. What companies need to do is look at their processes and make sure ownership and responsibilities are aligned, so that the risk of cyber threats is tackled and constantly monitored. So it’s not a challenge; it’s more a risk-management topic.
One of the projects we’ve developed lately focused on incorporating new technologies and improving the current process based on data collection and treatment. The project was called “World Zero” and involved a few end users, like BMW and Lancor, and 67 companies active in the construction and oil-and-gas industries. All of them have a few manual processes they’d like to automate, for different reasons. It is quite interesting to see the same approach and the same solution bring different value to different industries. Their objectives vary, but the challenge remains the same: collecting meaningful data.
Their first challenge was to connect their machines. This is sometimes difficult, depending on the machine provider, because some of them lock down their equipment so it’s impossible to get any useful data out of it. And even once you connect your machine, there is very little you can do with the data you are collecting on its own. It’s really important to conceive of those digitisation processes as a supply chain, and to acknowledge that being more open about sharing data, giving value to others as well as to ourselves within our ecosystem, is a way to be far more effective.
An issue remains with communication standards. More than 200 different standards have been identified in the manufacturing industry. You can imagine why digitisation is so slow and so difficult, even though the technology seems to be there. It is a challenge for any integrator or solution provider to cover this whole range of standards. There is a lack of continuity across different industries, and even across different companies within the same industry. Each organization has its own communication standards and its own data standards, and we need to reach a common understanding of what the best practices should be.
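In practice, the integrator’s answer to this fragmentation is usually an adapter layer that maps each vendor-specific payload onto one common schema. This is a minimal, hypothetical sketch of that idea; the vendor names and field names are invented for illustration, and real deployments would map established standards such as OPC UA or MTConnect instead.

```python
# Hypothetical adapter layer: each machine vendor emits data in its
# own format, and we normalize everything to one common record shape
# {"timestamp", "metric", "value"}. All names here are illustrative.

def normalize(vendor: str, payload: dict) -> dict:
    """Map a vendor-specific payload to a common record."""
    if vendor == "vendor_a":
        # e.g. payload = {"ts": 1717934400, "spindle_rpm": 1200}
        return {"timestamp": payload["ts"],
                "metric": "rpm",
                "value": payload["spindle_rpm"]}
    if vendor == "vendor_b":
        # e.g. payload = {"time": 1717934401, "speed": {"rpm": 1180}}
        return {"timestamp": payload["time"],
                "metric": "rpm",
                "value": payload["speed"]["rpm"]}
    raise ValueError(f"no adapter for {vendor}")
```

The value of the pattern is that everything downstream (dashboards, analytics, AI) only ever sees the common schema, so supporting a new machine means writing one more adapter, not touching the whole stack.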
There is also a need for more research on technical elements. We looked at deploying AI by collecting data at the edge, with the AI trained in the cloud. This means you need to upload loads of data continuously, although you don’t necessarily need all of it for your AI algorithm. That is simply how the data is structured coming off the machine, and you really need to break it down before you upload it, because uploading it record by record would cost a fortune. This is the costing model of the cloud: you are charged by events, so whether you send one string of data or a whole batch, the cost is the same. This makes it difficult to have real-time processing capability. Of course, once you have that sorted out, you have the perfect algorithm that you can deploy at the edge and save a lot of computing power. It’s the more sustainable option. But to get there you obviously need to think about these architectural elements in the right way and know how your algorithms work based on the data you’re collecting.
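The per-event pricing described above implies a simple edge-side tactic: since one event costs the same whether it carries one reading or many, buffering readings and sending them as a single event divides the bill roughly by the batch size. This is a generic sketch of that idea, not ATS’s implementation; `send_event` stands in for whatever cloud client call a real deployment would use.

```python
# Sketch of edge-side batching under per-event cloud pricing:
# buffer readings and emit one event per batch instead of one per
# reading. `send_event` is a placeholder for a real cloud client.

class EdgeBatcher:
    def __init__(self, send_event, batch_size=100):
        self.send_event = send_event
        self.batch_size = batch_size
        self.buffer = []

    def add(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            # one billed event carries the whole batch
            self.send_event(self.buffer)
            self.buffer = []
```

The trade-off is latency: a batch of 100 readings means the first reading waits for 99 more before it reaches the cloud, which is exactly the tension with real-time processing mentioned above.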
At the moment, it’s more of a trial-and-error process. We’re on the learning journey ourselves. We use these projects to learn which elements we need to look at to develop our deployment methodologies, and we will have that as a competitive advantage for commercial projects. But it is still a challenge, because most AI is very custom and very application-specific, so it’s difficult to develop something applicable to all the possible options you will have to explore. You still need that element of flexibility and adaptability in your methodology and your thinking.
I think it goes back, again, to the mindset. It’s all about changing the mindset. Probably the big customers, having the buying power, need to push their supply chains to share data openly. But they also need to share data back. I have discussions with small organizations within the supply chain that provide equipment. They say they would love to know when their equipment is about to break down, so they can arrange for a service straight away, rather than making the customer unhappy because they don’t have the scheduling or the capacity to service it. This question of sharing data is a two-way thing. So yes, obviously, it’s all about changing the mindset and being more collaborative.
I definitely agree. Data is the new oil, but it’s crude oil: it needs a lot of refining to become usable. From that perspective, it’s not the quantity that matters but the quality you get.
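The "refining" in the crude-oil analogy typically means a pass that deduplicates records, drops incomplete ones, and filters out physically implausible values. A minimal sketch, with invented field names and limits purely for illustration:

```python
# Minimal data-cleaning pass for raw sensor readings: deduplicate,
# drop incomplete records, filter implausible values. The field
# names ("sensor", "timestamp", "value") and the plausible range
# [lo, hi] are illustrative assumptions.

def clean(readings, lo=0.0, hi=200.0):
    seen = set()
    out = []
    for r in readings:
        key = (r.get("sensor"), r.get("timestamp"))
        if None in key or r.get("value") is None:
            continue                      # incomplete record
        if key in seen:
            continue                      # duplicate reading
        if not lo <= r["value"] <= hi:
            continue                      # physically implausible
        seen.add(key)
        out.append(r)
    return out
```

Even a pass this simple illustrates the point in the text: the cleaned output is a fraction of the raw input, and it is that fraction, not the raw volume, that carries the value.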
We have the example of Rolls-Royce, which has adopted AI in its supply chain. They are also exploring through R² Lab, the digital group of Rolls-Royce. It goes back, again, to the importance of having the right structure and a dedicated team working on specific data-driven initiatives. They started by simply putting together the right sources of information, and discovered that they had different sources reporting different results, with records all over the place. Just putting everything together and ensuring that what they had was accurate in description and content saved them a huge amount of metal scrap across the supply chain, worth a few million pounds in a single year. It was worth two Eiffel Towers, actually.
It is certainly related. We see companies improving their offerings, incorporating new business models, and identifying opportunities they didn’t know they had beforehand. With the ability to digitize, and having that data, they foresee new value and optimization opportunities.
Regarding “digitize or die”, this is also very true. It’s what we mentioned earlier: if you fall behind, you will only keep falling further behind. It is a stream we all need to jump into, whether we like it or not. It’s just part of our lives now.
This was a great collaborative project that we completed with Ford, Vodafone, PWY and VFE, amongst other partners. The idea was to try to apply 5G within a manufacturing environment.
One of the use cases we explored was deploying 5G on a welding line that incorporated some automated inspection. We tried to correlate the data collected from inspection and optimize the manufacturing process.
The second use case was remote maintenance assistance. If a machine broke down, they could link up with their maintenance team, which was an external organization, to provide secure access to the data from the machine and give remote instructions to the operator so they could set up the machine themselves. That would save them time, and then, potentially, they could start collecting data and predicting when the machine is going to fail, rather than waiting for it to fail, and hence schedule maintenance better. In this use case, we faced the issue regarding communication standards I mentioned before. As there was no specific standard, there was a lot of work to adapt the application and develop data communication standards. This is not difficult, it’s just time-consuming, and it adds to the time needed to bring a new offering to the production floor. It also diminishes the value they get from the solution.
Then, within the consortium, we discovered quite a few challenges for 5G adaptability and adoption by the wider industry. It’s a great technology for processing data very fast, but in the production context we discovered that increasing the upload speed is equally or even more important than the download speed. This is quite different from the current consumer model, where download speed is the only key requirement. In the manufacturing environment, we had to upload lots of inspection data in real time, and the upload speed was significantly lower. It wasn’t that bad, but it’s not necessarily aligned with the benefits that have been claimed and published around 5G.
I know that some providers are looking at reducing this difference. There are some configurations in which this would be possible but, as far as I know, it is still in the exploration phase.
If you look at Gartner’s hype cycle, 5G is now at the peak of inflated expectations: everybody thinks 5G can resolve all their problems. So we try to explore further. As no technology is going to solve all our problems, we need to identify where the real potential of 5G lies as those inflated expectations gradually deflate.
For me, quality is more important than quantity. 5G gives you the option to upload vast quantities fast, but the question is: do you really need those quantities? It depends on the application. Sometimes it’s a yes, sometimes it’s a no.
If we’re talking about an infrastructure where you collect data from many cars on the streets, you have huge amounts of data, and 5G is probably a great application for that. Within the factory, it depends on the application. We used a welding application for the 5G project because it creates so much data. But you also have limitations in how fast the devices can stream that data. So you might have a very powerful network, but you cannot collect the data that fast anyway.
As mentioned about the app streaming into the cloud, it depends on your architecture.
We have a few small companies in our consortiums. This is one of the benefits of these externally funded government projects: they force companies to work together in a good way. This is how smaller companies get the opportunity to work with larger ones and with large customers, and hence showcase their value. In the “World Zero” project, we worked with a robotics SME based near London which managed to reach out to three large companies through the consortium. These projects are really about fostering collaboration, so if anybody would like to work with us, please reach out.
The DIH² project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 824964.