Why simulation is important in industrial robotics
It’s hard to imagine building any physical object without first fully designing, assembling, and even testing it on your computer. CAD/CAM tools and multi-physics simulations are readily available, allowing you to iterate quickly in the digital world rather than building prototype after prototype of your widget and spending a lot of money trying to get it right.
So why would it be any different for an industrial robot workcell?
For anyone who works with Offline Programming (OLP) or simulation tools, the beauty of being able to work from the comfort of a digital twin is obvious. You can experiment in a no-risk environment and validate an entire process from the comfort of your desk. Validating a process in simulation is critical because there is very little margin for error on a real robot. Robots can be big, powerful beasts that will obediently destroy thousands of dollars of material just because you were off by 1 millimeter in your calculations. Not to mention the potential danger to personnel who happen to be near the robot during testing. So, the more we can simulate, the safer and more productive we will all be.
Surprisingly, although OLP and simulation tools are becoming more common and popular, they are still not the norm for most robotic workcells. A lot of programming occurs online, i.e., in situ on the production floor. This happens for two main reasons:
- Simulation tools can be very complex and require specific expertise.
- There are always errors or differences between simulation and reality that you have to correct in the real world.
So let’s dive into these challenges one at a time.
Simulation tools can be very complex and require specific expertise.
Depending on your application, the complexity of your simulation can vary greatly. If you have a single robot welding a part on a table in a closed cell, you really just need to know if the robot can reach all the weld locations, if it will collide with anything, and if it can weld within the required cycle time limits. You might also want to simulate the loading and unloading process of the part as well. This can all be done without the need for a really complex physics simulation and is not too calculation-intensive. Fuzzy Studio was designed with these types of applications in mind. We wanted to make robot simulation accessible to anyone, regardless of their robotics background.
But let’s say your MIG torch has a bulky cable that might obstruct the robot’s motion. Now we are getting into some more complex issues. You need to model the cable, which is a flexible element (the parameters of which you can only guess at) and properly simulate its movement based on where it is attached and its physical model. This is a challenging problem. Many tools provide approximations that aren’t bad but require a lot of faith in rule-of-thumb thinking.
What if we are working on a handling application, such as picking boxes off a conveyor and putting them onto a pallet? We need to model the motion of the conveyor and the boxes on it. We have to simulate the process of “picking” the box, making it a part of the robot, and then “placing” the box on the pallet or among already placed boxes. All of this requires significantly more setup and thus tools that are inherently more complex.
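The “pick” step is mostly bookkeeping with frames: at the moment of grasp you record the box’s pose relative to the gripper, and from then on the box’s pose is derived from the gripper’s. A minimal sketch with 4×4 homogeneous transforms in NumPy (the function names are illustrative, not any simulator’s actual API):

```python
import numpy as np

def translation(x, y, z):
    """Homogeneous 4x4 transform for a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def attach(T_world_gripper, T_world_box):
    """At pick time, record the box pose in the gripper frame.
    From then on, the box is effectively 'part of the robot'."""
    return np.linalg.inv(T_world_gripper) @ T_world_box

def box_pose(T_world_gripper, T_gripper_box):
    """While attached, the box world pose follows the gripper."""
    return T_world_gripper @ T_gripper_box

# Pick: the gripper grasps a box sitting 0.1 m below its flange.
T_wg = translation(0.5, 0.0, 0.3)
T_wb = translation(0.5, 0.0, 0.2)
T_gb = attach(T_wg, T_wb)

# Move the gripper over the pallet; the box rides along.
T_wg_place = translation(1.2, 0.4, 0.2)
T_wb_now = box_pose(T_wg_place, T_gb)
print(T_wb_now[:3, 3])  # box stays 0.1 m below the gripper
```

The “place” step is just the reverse: freeze the box’s current world pose and stop deriving it from the gripper.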
Moral of the story: the more things you want to simulate and the closer you want your simulation to match reality, the more complex your simulation tools will need to be. So, you should always try to find the right tool for the job. If a basic kinematic simulation will suffice – great. If you need a full-blown physics engine because you have complex dynamic interactions between objects – then get ready to roll up your sleeves. But no matter how well you choose your simulation tool and craft your simulation, there is one hard truth you must keep in mind: your simulation will never perfectly match your reality.
There are always errors, or differences, between simulation and reality that you have to correct in the real world.
Simulations are nothing more than complex and imperfect mathematical models of the physical world. Some mathematical models are better than others, but there are many phenomena that are tough to get right – contact between objects, flexible body deformation, fluid mechanics, etc.
Even with the best equations, there are always things that go wrong in reality. Mechanical parts and assemblies all have tolerances and unmeasured errors. Robots that we assume are perfectly rigid will flex when loaded with hundreds of kilos of payload. Parts might be mounted differently than planned in the simulation. And let’s not forget good old wear and tear. This list goes on and on.
But all is not lost. There are many tricks, tools, and good practices that can mitigate these errors enough so that we can forget about them.
Things are not where they should be
This is the biggest issue in my experience. Parts, end-of-arm tooling, mountings, etc. are never where you placed them in your perfect digital model. They are usually quite close, but as my fifth-grade math teacher always said, “Close only counts in horseshoes and hand grenades”! When it comes to industrial robotics, a few centimeters of error can lead to significant problems depending on which direction you are off…
This is where calibration comes in handy. Calibration can mean a lot of different things in robotics, but generally speaking, it is the process of determining the difference in position and orientation between two frames of reference. Most commonly, we are talking about where an object is located with respect to the base of our robot. This is a rich topic full of really cool math and clever real-world tricks that we will dedicate a full tutorial series to. For now, let’s say there is pretty much a solution for any calibration problem you might have (some just come with hefty price tags).
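As a small taste of that math: the most common calibration primitive is fitting the rigid transform that best maps a set of points measured in one frame onto the same physical points measured in another (say, CAD coordinates of fixture holes versus the robot-base coordinates where the tool actually touched them). A hedged sketch of the standard SVD-based (Kabsch) solution, assuming NumPy and at least three non-collinear measured points:

```python
import numpy as np

def best_fit_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q.
    P, Q: (N, 3) arrays of the same physical points measured in two
    frames, e.g. CAD coordinates vs. robot-base coordinates."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Four reference points, and the "unknown" offset we want to recover:
P = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
theta = np.pi / 6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0.0,            0.0,           1]])
Q = P @ Rz.T + np.array([0.5, -0.2, 0.1])

R, t = best_fit_transform(P, Q)
print(np.allclose(R, Rz), np.round(t, 3))  # recovers rotation and offset
```

With noisy real-world measurements the same code returns the least-squares best fit, which is exactly what you want from a calibration routine.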
Robots getting all bent out of shape
I generally don’t speak in generalities, but generally speaking, this issue only becomes troublesome with larger robots. Robots that lift heavy payloads (>50 kg) or robots with a long reach (>3 m) are particularly affected. If you’re not clear on what I’m talking about, try holding your desk chair with an outstretched arm. Unless you’re super strong or your chair is very light, your arm will probably droop a bit. Robots face the same problem.
Drooping is most assuredly not the technical term for this, but fixing it usually comes in three flavors:
- Buy a bigger robot.
- Model the drooping and compensate for it in the control of the robot.
- Design end-of-arm tooling that can be placed “approximately” and will then precisely position itself for the needed operations.
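The second option can be surprisingly effective even with a crude model. A toy sketch that treats one joint as a linear torsional spring and pre-offsets the command by the predicted gravity deflection (the stiffness value is invented for illustration; real compensation models account for link flexibility, multiple joints, and the robot’s configuration):

```python
# Toy gravity-droop compensation: model a joint as a linear torsional
# spring and pre-offset the commanded angle so the loaded arm lands on
# target. The stiffness figure below is made up for illustration.
GRAVITY = 9.81  # m/s^2

def droop_compensated_command(target_angle_rad, payload_kg,
                              lever_arm_m, joint_stiffness_nm_per_rad):
    """Offset the joint command by the predicted static deflection."""
    gravity_torque = payload_kg * GRAVITY * lever_arm_m        # N*m
    deflection = gravity_torque / joint_stiffness_nm_per_rad   # rad
    return target_angle_rad + deflection

# 100 kg payload on a 2 m lever arm, 500 kN*m/rad joint stiffness:
cmd = droop_compensated_command(0.0, 100.0, 2.0, 5.0e5)
print(round(cmd * 1000, 3), "mrad of pre-compensation")
```

A few milliradians sounds tiny, but over a 3 m reach that is roughly a centimeter at the tool tip, which is exactly the kind of error that ruins a weld or a machining pass.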
Cables getting in the way
As mentioned earlier, modeling cables is difficult to get right. Most of the time, people just don’t do it and basically say “YOLO” before running their first trajectory on the real robot. Honestly, this is not as crazy as it sounds. Any robot motion you plan to run on the real robot should be first executed at a reduced speed with one hand on the dead-man switch. Human validation of the first trajectory is a critical step that should always be part of a deployment. During that first “dry-run” you check to make sure nothing is colliding or getting tangled.
Cable harnesses can get really complicated depending on the tooling, and even checking manually is not without risk. This is a real pain point for systems integrators, so there are a number of cable management systems on the market that can tidy up all of the clutter.
Another common practice is to limit certain joints to ranges that won't cause the cable or cable management system to break or tangle during usage. However, this limits the overall range of motion of the robot and can be overly conservative.
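In practice this often boils down to a pre-flight check: reject any planned waypoint that drives a cable-constrained joint outside a conservative window. A minimal sketch (the joint names and limit values are purely illustrative):

```python
# Reject trajectories that would drive a cable-constrained joint
# outside a conservative window. Limits below are illustrative.
SAFE_LIMITS_DEG = {
    "joint_4": (-170.0, 170.0),   # wrist roll limited by cable twist
    "joint_6": (-200.0, 200.0),   # tool flange limited by dress pack
}

def violations(trajectory):
    """trajectory: list of {joint_name: angle_deg} waypoints.
    Returns (waypoint_index, joint, angle) for each limit breach."""
    out = []
    for i, waypoint in enumerate(trajectory):
        for joint, angle in waypoint.items():
            lo, hi = SAFE_LIMITS_DEG.get(joint, (-360.0, 360.0))
            if not lo <= angle <= hi:
                out.append((i, joint, angle))
    return out

traj = [{"joint_4": 0.0,   "joint_6": 150.0},
        {"joint_4": 175.0, "joint_6": 210.0}]
print(violations(traj))  # flags the second waypoint on both joints
```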
Wear and tear
Like any machine, robots wear over time: lubrication becomes less effective, and friction and mechanical play introduce positioning errors (among a billion other possible problems). As with your health, “an ounce of prevention is worth a pound of cure”. Regular maintenance and checkups are your friends here.
What You See the robot do in simulation Is NOT What You Get in reality (WYSINWYG)
I have mainly talked about simulation so far, but the term OLP refers not only to simulating the robot workcell but also to generating the robot code that must be uploaded to the robot controller. This replaces the tedium of writing the lines on the teach pendant or manually teaching each point – which can take hours for complex motions. When you work with an OLP solution from the robot manufacturer, you can be fairly certain that the simulated motion will precisely match the real robot motion, because the manufacturer knows exactly how its controllers work. Things get problematic when you work with third-party software that can only approximate what the OEM robot controller will do. For simple motions, like a straight line from one point to another (whether in joint space or Cartesian space), third-party simulations can get pretty close to how the OEM controller will behave. But for more complex motions with dozens, hundreds, or thousands of points to move through, third-party simulations don’t even come close. This is particularly troubling for Continuous Path (CP) motions, such as polishing an airplane wing or deburring a manifold.
To fix these inconsistencies, you can correct the points manually on the teach pendant if there are only a few dozen. But if there are a few hundred points, you will likely want to redo your simulated motion or its post-processing to robot code. The standard offline programming workflow looks a bit like this: you do your simulation, export your robot code, copy it to a USB key, upload and compile it on the robot controller, retry it on the real robot, and cross your fingers hoping it works – if it doesn’t, you try again. These iterations can be slow, painstaking, and ultimately fruitless.
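For context, the “export your robot code” step is handled by a post-processor that turns simulated waypoints into the controller’s program text. A toy sketch using a generic MoveL-style syntax (illustrative only, not any OEM’s actual language):

```python
# Toy "post-processor": turn simulated Cartesian waypoints into
# robot-program text. The MoveL-style syntax below is generic and
# illustrative, not any specific OEM controller language.
def postprocess(waypoints, speed_mm_s=250):
    lines = ["MODULE PalletDemo", "PROC main()"]
    for x, y, z in waypoints:
        lines.append(f"  MoveL [{x:.1f},{y:.1f},{z:.1f}], v{speed_mm_s};")
    lines += ["ENDPROC", "ENDMODULE"]
    return "\n".join(lines)

program = postprocess([(500.0, 0.0, 300.0), (500.0, 0.0, 210.0)])
print(program)
```

Every round trip through a step like this – export, copy, upload, compile, test – is one iteration of the workflow described above, which is why cutting the number of iterations matters so much.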
This was precisely the reason why we built Fuzzy RTOS.
Our very first clients needed to execute complex trajectories at high speeds and modify them in real time. For a single motion, we often had to do 10-30 of these workflow iterations to get it right. The classical OLP iteration cycle was just not efficient enough for us.
Having done it in our past lives, we decided to bypass the proprietary robot code and directly control the robot joints in real time ourselves. This meant that we had to reinvent the motion algorithms from scratch, but once we did, we were able to ensure that whatever motion we had in our 3D simulation would be the same as the one run on the real robot. We could do all of our iterating on the computer and execute on the robot in seconds, rather than wasting hours going back and forth between the robot and the traditional OLP workflow.
The reality gap
Simulation tools are the clear path to making robotics easier, safer, and more accessible because we can work in a zero-risk environment before even setting foot in the factory. As simulation tools improve and become cheaper and more accessible, they will open robotics to more and more users. It’s easy to draw parallels with the world of CAD, with the advent of accessible tools like Autodesk’s Fusion 360. Similarly, the world of 3D printing software has made leaps and bounds in just a few years. Now, using a 3D printer is almost easier than using your run-of-the-mill office printer.
But no matter how good these tools get, the reality gap – the difference between our simulation and our real machine – will always be there. That’s why we need tools, technology and know-how to close that gap and successfully automate new and exciting robotic applications.