Dr Sven Schmidt-Rohr, Chief Executive Officer of ArtiMinds Robotics GmbH, believes that a digital twin only makes sense if it can be easily integrated into an existing process, and explains how that can be achieved in robotic applications.
According to the German Informatics Society, digital twins are more than just data. They consist of models of the objects or processes they represent and can also contain simulations, algorithms and services that describe and influence the properties or behaviour of the respective object or process. They often offer complementary services as well. The digital twin is therefore usually significantly smarter than the plant or process it represents.
In many cases, digital twins collect data from a process, enrich it with information and derive consequences for future action, for example, through calculations. Used correctly, this results in numerous advantages for process optimisation and product quality improvement. But how does the digital twin obtain the necessary data with relatively little effort? And how can the truly relevant data be extracted from this abundance and processed in a meaningful way?
Intelligent adapter
Depending on the approach, creating a digital twin may require integrating additional hardware or software into the system to be mapped. In robotics applications, however, there is a relatively simple way to realise one. Dr Sven Schmidt-Rohr, Chief Executive Officer of ArtiMinds Robotics GmbH, explained: “From our point of view, a digital twin only makes sense if it can be easily integrated into the existing process. We therefore refer to our solution as the intelligent adapter for the digital twin, because it can be used to create a digital twin very quickly and without additional hardware. To do this, we use the robot, which by nature is the central element of a plant section. It is linked to sensors and actuators and communicates with these components anyway.”
In other words, a great deal of relevant process data is already available in the robot. A data-collection solution that starts here can be integrated compactly and used easily. The result is a lean solution that can be added to an existing process with little effort. The robotics experts at ArtiMinds demonstrate what this can look like with their Robot Programming Suite (RPS).
No-code programming
With RPS, robotic applications can be planned, programmed, operated, analysed and optimised in a single tool. The suite’s no-code programming approach is manufacturer-independent: programs are assembled via drag and drop from individual function blocks, and the software then generates the native robot code automatically. Despite the easy-to-use approach, very complex programs can be realised. RPS follows this approach not only in programming but also in analysis. With the Learning & Analytics for Robots (LAR) add-on module, live sensor data can be monitored, analysed and optimised, and the analyses are adapted to the respective application with just a few clicks via the corresponding parameters. LAR can therefore be used to create the digital twin as early as the development stage, where it provides detailed insights into the robotic production process. Based on the Robot Programming Suite, live sensor data from the robot, the force/torque sensor, the vision system and the end effector (the last element of the kinematic chain, for example, the gripper or welding head) is transferred from the robot controller. The sensor data is automatically decomposed, assigned to the individual components and permanently stored in a local database at the user’s site.
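The general pattern described here, splitting a controller’s sensor stream by component and persisting it locally, can be illustrated with a short sketch. The following Python example is purely hypothetical (the schema, component and channel names are invented) and is not the ArtiMinds implementation:

```python
# Illustrative sketch only: a generic pattern for decomposing a robot
# controller's sensor stream by component and persisting it locally.
# Table and field names are hypothetical, not the ArtiMinds schema.
import sqlite3
import time

DB = sqlite3.connect("process_history.db")
DB.execute("""
    CREATE TABLE IF NOT EXISTS sensor_samples (
        cycle_id   INTEGER,
        timestamp  REAL,
        component  TEXT,    -- e.g. 'robot', 'ft_sensor', 'vision', 'gripper'
        channel    TEXT,    -- e.g. 'joint_1', 'force_z', 'grip_width'
        value      REAL
    )
""")

def store_frame(cycle_id: int, frame: dict[str, dict[str, float]]) -> None:
    """Split one raw sensor frame by component and channel and store it."""
    now = time.time()
    rows = [
        (cycle_id, now, component, channel, value)
        for component, channels in frame.items()
        for channel, value in channels.items()
    ]
    DB.executemany("INSERT INTO sensor_samples VALUES (?, ?, ?, ?, ?)", rows)
    DB.commit()

# Example frame as it might arrive from the controller in one cycle step
store_frame(42, {
    "robot": {"joint_1": 0.31, "joint_2": -1.07},
    "ft_sensor": {"force_z": 12.4, "torque_x": 0.8},
    "gripper": {"grip_width": 0.021},
})
```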
To create the intelligent adapter for the digital twin, the robot programmer navigates through the various sensor data of the robot program on the basis of the individual function blocks. Sven explained: “Each function block is used to perform certain tasks. Our LAR tool automatically makes suggestions for suitable analysis and monitoring options depending on the blocks. Here, too, the user then selects from predefined ‘tiles’, meaning analysis methods, or ‘rules’, the monitoring methods, that might suit the sensor data in question. With this approach, setup time is drastically reduced. Moreover, even users without in-depth expert knowledge can understand complex processes.”
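As a rough illustration of how block-specific suggestions of this kind could work in principle, the following sketch maps hypothetical function-block types to invented tile and rule names; it does not reflect the actual LAR catalogue:

```python
# Hypothetical illustration of per-block analysis suggestions.
# Block, tile and rule names are invented for the example.
SUGGESTIONS = {
    "force_controlled_insertion": {
        "tiles": ["force_path_diagram", "start_end_point_analysis"],
        "rules": ["max_force_threshold", "contact_timeout"],
    },
    "vision_based_pick": {
        "tiles": ["success_rate", "pose_scatter_2d"],
        "rules": ["pose_tolerance_band"],
    },
}

def suggest(block_type: str) -> dict[str, list[str]]:
    """Return analysis tiles and monitoring rules suited to a block type."""
    return SUGGESTIONS.get(block_type, {"tiles": [], "rules": []})

print(suggest("force_controlled_insertion"))
```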
Dashboards can be set up to visualise the process data. These can bundle information from different processes or display very detailed analyses for specific sub-processes. Sven added: “What is displayed in the dashboards can be configured very flexibly. This means that every user can choose the solution that best suits their requirements. Processes can be visualised in 2D or 3D. We have a large selection of standard tiles for this as well.”
Force curves along a robot path, for example, are best displayed in 3D diagrams, while 2D diagrams are more suitable for evaluating process tolerances. In the end, the range of information that can be displayed is limited only by the user’s imagination; it ranges, for example, from success rates per component to force-path diagrams of different assembly steps.
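As a generic illustration of the two views, the following sketch plots synthetic data with matplotlib: a force magnitude coloured along a 3D path, and a single force channel against an assumed tolerance band. It is not tied to any specific RPS tile:

```python
# Minimal visualisation sketch with synthetic data (not real process data):
# left, force magnitude along a 3D robot path; right, one force channel
# against an assumed 2D tolerance band.
import numpy as np
import matplotlib.pyplot as plt

# Synthetic path (x, y, z) and a force magnitude along it
t = np.linspace(0, 1, 200)
x, y, z = np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), t
force = 10 + 3 * np.sin(6 * np.pi * t) + np.random.normal(0, 0.3, t.size)

fig = plt.figure(figsize=(10, 4))

# 3D view: colour the path by force magnitude
ax3d = fig.add_subplot(121, projection="3d")
sc = ax3d.scatter(x, y, z, c=force, cmap="viridis", s=8)
fig.colorbar(sc, ax=ax3d, label="|F| [N]")
ax3d.set_title("Force along robot path")

# 2D view: one channel against an assumed tolerance band
ax2d = fig.add_subplot(122)
ax2d.plot(t, force, label="force_z")
ax2d.fill_between(t, 7, 14, alpha=0.2, label="tolerance band")
ax2d.set_xlabel("normalised path position")
ax2d.set_ylabel("force [N]")
ax2d.set_title("Process tolerance check")
ax2d.legend()

plt.tight_layout()
plt.show()
```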
Quality and optimisation
Users benefit in many ways from an intelligent digital twin that is easy to set up. In terms of quality, two topics are becoming increasingly important in automated production: digitally supported plant maintenance and digitally managed product quality. On closer inspection, the two are closely interlinked, and the intelligent digital twin adapter can help with both. It automatically collects and evaluates inline sensor data from the process without additional hardware. In this way, it documents the process behaviour in detail for each day and builds up the digital twin’s history. This provides insights into the production process that were not easily available before. Plant operators no longer have to rely on their gut feeling or on the good ear of a plant supervisor with many years of experience; instead, they have clear data and facts that reflect the condition of the plant and the quality of the manufactured products. The detailed documentation of each cycle’s entire process behaviour is a helpful tool for process optimisation and quality improvement. Especially with sporadically occurring errors, it is often difficult to understand in retrospect what caused them. If process data is recorded, evaluated and consistently stored, it remains possible to investigate the causes and eliminate errors retrospectively.
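Building on the hypothetical database sketch above, a retrospective investigation of sporadic outliers could look roughly like the following query, which pulls the peak force per stored cycle; again, this is a generic illustration rather than an LAR feature:

```python
# Retrospective root-cause query against the hypothetical schema above:
# find cycles whose peak force exceeded an assumed tolerance limit.
import sqlite3

DB = sqlite3.connect("process_history.db")
rows = DB.execute("""
    SELECT cycle_id, MAX(value) AS peak_force
    FROM sensor_samples
    WHERE component = 'ft_sensor' AND channel = 'force_z'
    GROUP BY cycle_id
    HAVING peak_force > 14.0        -- assumed tolerance limit
    ORDER BY peak_force DESC
""").fetchall()

for cycle_id, peak in rows:
    print(f"cycle {cycle_id}: peak force {peak:.1f} N exceeded tolerance")
```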
In addition to historical data, the digital twin also shows current process behaviour and helps detect anomalies in real time. Coupled with clever calculations, it can automatically determine which process adjustments are necessary to prevent a plant malfunction or product quality violations. The Start-End Point Analysis tile, for example, automatically calculates suggestions for how the robot program can be modified to optimally compensate for tolerances that occur in a particular sub-process. This enables the plant operator to react quickly and purposefully during the running process. In addition, predictions can be made with regard to system malfunctions on the one hand and product quality violations on the other; for this purpose, the Epsilon tile monitors, for example, the sub-processes in which the sensor data takes an unexpected course. Sven concluded: “These tiles are just two examples of the analysis and monitoring methods that we make available to users as standard. With these methods, they can create a digital twin with minimal effort, which helps them to monitor and specifically improve their performance and product quality over the entire life cycle of a plant.”
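The kind of calculation such a compensation tile might perform can be sketched generically: average the drift between the programmed and the measured end points over recent cycles and propose an offset. All numbers and names below are invented for illustration and do not represent the ArtiMinds algorithm:

```python
# Generic end-point compensation sketch: average the drift between the
# nominal target and recently measured end points, then suggest an offset.
import numpy as np

programmed_end = np.array([0.500, 0.120, 0.300])   # metres, nominal target

# Measured end points from the last few cycles (e.g. from contact detection)
measured_ends = np.array([
    [0.5021, 0.1193, 0.3004],
    [0.5018, 0.1191, 0.3006],
    [0.5024, 0.1195, 0.3003],
])

drift = measured_ends.mean(axis=0) - programmed_end
if np.linalg.norm(drift) > 0.001:   # assumed 1 mm threshold
    correction = -drift
    print(f"Suggested program offset [m]: {np.round(correction, 4)}")
else:
    print("End point within tolerance, no correction suggested")
```

In practice, a suggestion of this kind would of course be checked against the plant’s safety limits before any change is applied to the running robot program.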