by Kumar Singh, Research Director, Automation & Analytics, Supply Chain Management, SAPinsider
The lucrative opportunity in smart manufacturing
Back in 2018, IDC estimated that by 2020 at least 55% of organizations would leverage Industry 4.0 technologies to transform their businesses and operating models. While IDC has not published more recent stats on this yet, another research firm, IoT Analytics, believes that the adoption rate of Industry 4.0 in 2020 was less than 30%. One aspect, however, is pretty clear: the opportunity in this area is still big, and hence many Industry 4.0-specific solutions, particularly in the area of smart manufacturing, have emerged in the last few years.
The real value of digital manufacturing lies in how you leverage the data
One of the primary advantages of digital manufacturing is that the data generated can be leveraged for various forms of analytics. When an organization invests millions in digital manufacturing to set up a “smart” factory and develop a digital twin, the real ROI comes from exploiting the data that these smart factories generate.
There are various areas of the manufacturing process where data generated by smart sensors can be leveraged, and in this post we will cover a few of these application areas and their methodology at a very high level.
I intend to focus on how the analytical approaches work rather than the math involved. The math behind the concepts, however, is anything but simple. Complexity arises from the sheer number of variables involved: if there are 100 sensors on the floor, we are looking at equations with up to 100 degrees of freedom. The resulting equations may be computationally easy to solve individually, but the interactions among subsystems grow the complexity exponentially. And the number of sensors on the floor can run into the thousands for complex manufacturing operations. This, in fact, is the very definition of Big Data.
So let us now jump into applications of analytics in the following areas in a manufacturing setup:
(1) Design and testing
(2) Anomaly detection
(3) Quality testing
(4) Process optimization
Since my undergrad was in electrical engineering, I will use the example of a plant that manufactures induction motors. The key parts of an induction motor are shown below. Our hypothetical manufacturing company is planning to launch a new model of induction motor, and its design engineers are working on testing a prototype.
Picture source: IEEE
Leveraging data for design and testing
During the test runs, the sensors on the prototype generate data on rotor speed, angular velocity, coil temperature, torque generated, and so on. As you can imagine, many of these variables are interconnected, with the value of one impacting the others.
The data from the sensors is captured by a broker and ingested in real time into a NoSQL database. (Note that I am simplifying the technical architecture here; the NoSQL database is just an example, and other types of databases can be leveraged as well.)
The NoSQL database feeds the training dataset of a model, which is then used to react to live data. The result is translated into an action and sent back to the IoT device (an actuator). The actuator then tweaks physical hardware characteristics, like rotor speed, to vary the operating parameters. Hardware components may be replaced as well based on the sensor data (a coil with thicker gauge wire, a rotor with a slightly smaller diameter, etc.).
This “update and improve” process continues until the prototype parameters align with the ideal design parameters of the product.
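One iteration of this loop can be sketched in a few lines of Python. This is a minimal illustration, not a real architecture: the target values, the `predict_adjustment` stand-in for a trained model, and the `apply_to_actuator` placeholder are all hypothetical names invented for this example.

```python
# Sketch of the sensor -> model -> actuator loop described above.
# A real deployment would sit behind a broker (e.g. MQTT) and a
# streaming database; here everything is in-process for clarity.

# Hypothetical ideal design parameters for the prototype motor.
TARGET = {"rotor_speed_rpm": 1450.0, "coil_temp_c": 75.0}

def predict_adjustment(reading, target):
    """Stand-in for a trained model: returns a correction per parameter."""
    return {k: target[k] - reading.get(k, target[k]) for k in target}

def apply_to_actuator(adjustment):
    """Placeholder for the command sent back to the IoT actuator."""
    return {k: round(v, 2) for k, v in adjustment.items()}

# One pass of the "update and improve" cycle on a live sensor reading:
live_reading = {"rotor_speed_rpm": 1438.5, "coil_temp_c": 78.2}
command = apply_to_actuator(predict_adjustment(live_reading, TARGET))
print(command)  # corrections that nudge the prototype toward TARGET
```

In practice the `predict_adjustment` step would be a model trained on the historical sensor data held in the database, and the loop would repeat until the readings converge on the design parameters.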
While the architecture above is generic (on purpose), many solutions exist today. One of the leading ones is Simcenter from Siemens. Siemens has been collaborating closely with SAP in the Industry 4.0 arena, and hence those looking for an end-to-end Industry 4.0 solution in the SAP ecosystem will benefit from this partnership.
Analyzing process parameters
The engineers have validated and approved the final design, and after the product passes all testing phases, Motor Co. starts manufacturing these motors. However, since this is a new product, and considering the significant investment they have put into it, they want to monitor the production of the first few batches very carefully.
Each piece of equipment on the shop floor has certain operating parameters, and sensors on the floor monitor these parameters very carefully. Anomaly detection, a subset of pattern recognition (which is itself a subset of machine learning), is employed in many classes of analytics. In general, model variables, or equations of model variables, are mathematically compared for equivalency or ranges of equivalency.
For example, anomaly detection is applied to the friction coefficient in the roller bearings of a robotic arm. In this one-dimensional case, the anomaly is detected by stream analysis if the coefficient is higher than a predefined threshold. The manufacturing shop engineers know that it is abnormal for the frictional coefficient to be higher than, say, 0.35, so when 0.36 is detected, it triggers an action such as sending a message or lubricating a bearing. This can also be used to drive descriptive analytics to graphically display any model variable that is out of the norm.
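The one-dimensional case above can be sketched in a few lines. The 0.35 threshold comes from the example; the sample readings and the lubrication action are illustrative.

```python
# Minimal threshold-based anomaly detection on a sensor stream, as in
# the friction-coefficient example above. Stream analysis frameworks
# apply the same logic continuously; this version works on a list.

FRICTION_THRESHOLD = 0.35  # value the shop engineers consider abnormal

def detect_anomalies(stream, threshold=FRICTION_THRESHOLD):
    """Return (index, value) for every reading above the threshold."""
    return [(i, v) for i, v in enumerate(stream) if v > threshold]

readings = [0.31, 0.33, 0.36, 0.32, 0.38]  # made-up sensor samples
for i, v in detect_anomalies(readings):
    # In production this would send a message or trigger lubrication.
    print(f"reading {i}: coefficient {v} exceeds {FRICTION_THRESHOLD}")
```

The same flagged readings can also feed the descriptive-analytics layer, graphically highlighting any variable that drifts out of the norm.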
Quality and testing
As the first few batches of motors start to come out of the production lines, Motor Co. wants to make sure the product is world class and meets stringent quality criteria.
In the testing bay, motors are run while connected to sensors that capture and relay key quality parameters. One example: when the rotor in an induction motor moves, it makes a squeak. Normally the squeak is so feeble that we can’t hear it, but it can still be measured. The squeak is largely due to friction between the bearings and the ring holding them. By capturing squeak frequency data for known faulty motors, Motor Co. engineers have developed a squeak profile. In the quality testing runs, data collected via the sensors is used to generate a squeak profile for each motor, which is compared against the threshold squeak profile to ensure the motors meet this criterion.
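A squeak-profile check of this kind might look like the sketch below. The frequency bands, amplitude values, and fault thresholds are entirely invented for illustration; a real profile would be built from spectral analysis of recordings from known faulty motors.

```python
# Hypothetical quality gate: compare a motor's measured squeak
# amplitude per frequency band against thresholds derived from
# faulty motors. All numbers here are made up.

FAULT_PROFILE = {"1kHz": 0.8, "2kHz": 1.2, "4kHz": 0.5}  # fault thresholds

def passes_quality_check(measured, fault_profile=FAULT_PROFILE):
    """A motor passes only if every band stays below the faulty level."""
    return all(measured.get(band, 0.0) < limit
               for band, limit in fault_profile.items())

good_motor = {"1kHz": 0.2, "2kHz": 0.4, "4kHz": 0.1}
bad_motor = {"1kHz": 0.9, "2kHz": 0.3, "4kHz": 0.2}  # fails on 1kHz band
print(passes_quality_check(good_motor))  # True
print(passes_quality_check(bad_motor))   # False
```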
Solve or Optimize
We generally don’t need new data for this type of analysis, as it is mostly done post-process. Motor Co. leverages the data from the entire manufacturing process to:
(1) Ensure that the process flow is optimal: Algorithms can evaluate millions of permutations and combinations to determine the optimal process flow or path. The objective can be minimizing total cycle time, or the cycle time for a particular work area.
(2) Reverse engineer: Leveraging the data to work back to a desired result or product specification.
Note that one of the key aspects of these approaches is the digital twin. So when I refer to a “model,” it is essentially a digital twin, which in simple terms is a real-time replication of one or more manufacturing processes that allows monitoring as well as control of those processes. The topic of digital twins is extensive in itself and hence outside the scope of this article.
Kumar Singh, Research Director, Automation & Analytics, SAPinsider, can be reached at email@example.com