Real-Time, Voice Programming for Robotic Applications

Jan. 14, 2022
Comau's MI.RA/Dexter software allows end users to program robots in real time using simple verbal commands, such as “look,” “touch,” and “execute.”

Quick hits:

  • Intuitive methods for programming robots have become more common in a push to lower barriers to adoption.
  • Currently, there are three common methods for programming robots: the teach method, hand-guiding, and offline robot programming.
  • Comau's MI.RA/Dexter uses a programming metalanguage to take human syntax and translate it into robot syntax, so that programming can take place in real-time via voice command.


Read the transcript below:

Hello and welcome to Take Five with Automation World. I'm David Miller, Senior Technical Writer for Automation World. Today I'm going to be talking about a very innovative new approach to robot programming, and how it links up with other trends we've seen in this domain.

So, before I get into this new methodology for programming robots, let's talk about the current techniques that are in widespread use. For some time now, we've seen the programming of robots becoming easier and more intuitive in the interest of lowering barriers to adoption.

At the current juncture, robotic programming is done through one of three methods: the teach method, hand-guiding, and offline robot programming.

The teach method involves using a device called a teach pendant to guide a robot through a series of specific points of motion, which are then saved to its memory. The robot is basically steered or controlled with this teach pendant device through every incremental movement it will be required to make; each movement is recorded, and the full sequence is then essentially played back at a higher speed. Because the robot will probably be working in a highly standardized environment with products of a uniform type, this exact repetition of motions is sufficient to keep a line moving.
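The record-and-playback idea behind the teach method can be sketched in a few lines of Python. This is purely illustrative: the class, method names, and speed model here are assumptions for the sake of the example, not any real pendant or controller API.

```python
# Minimal sketch of the teach method: record waypoints one at a time while
# jogging the robot with a pendant, then replay the sequence faster.
# All names and the speed model are hypothetical, not a vendor API.
from dataclasses import dataclass
from typing import List


@dataclass
class Waypoint:
    joints: List[float]  # joint angles in degrees at the taught pose
    speed: float         # fraction of max speed used while teaching


class TeachProgram:
    def __init__(self) -> None:
        self.waypoints: List[Waypoint] = []

    def record(self, joints: List[float], speed: float = 0.1) -> None:
        """Save the robot's current pose, as jogged via the pendant."""
        self.waypoints.append(Waypoint(list(joints), speed))

    def playback(self, speed_scale: float = 5.0) -> List[Waypoint]:
        """Replay the taught sequence at a higher (capped) speed."""
        return [
            Waypoint(w.joints, min(1.0, w.speed * speed_scale))
            for w in self.waypoints
        ]


# Teaching phase: jog to each pose slowly and record it.
prog = TeachProgram()
prog.record([0.0, -90.0, 90.0, 0.0])
prog.record([45.0, -60.0, 80.0, 10.0])

# Production phase: the same poses, replayed at production speed.
for wp in prog.playback():
    print(wp.joints, wp.speed)
```

Hand-guiding would use the same recording loop; only the input changes, from pendant jogging to physically moving the arm.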

Hand-guiding is similar, but it's done entirely through manual guidance rather than a controller like the teach pendant. Quite simply, if I needed the robot to move its arm like this, and then like this, and then like this, I would simply move it manually in that manner, record the movement at each juncture to lock it into memory, and the robot would then repeat that sequence of motions on its own.

Those two types of programming are obviously highly intuitive, and so basically anyone could carry them out. You don't need to have much specialized technical knowledge. Offline robot programming involves using virtual models and may be more technically sophisticated, so we're not going to go into that in more detail today. Instead, I want to return to what we started on – a new method of programming robots that is similarly intuitive.  

So, this method of robotic programming comes from Comau, via a software product called MI.RA/Dexter, and it utilizes what's called a programming metalanguage to take human syntax and translate it into robot syntax. In essence, it's able to process simple voice commands. A robotic arm outfitted with a vision system and AI – an intelligent robotic arm – is simply spoken to by the user. In real time, they can give it verbal commands such as to “look” at a certain object, “touch” a certain point in space, or “execute” a certain pre-programmed action. It's that simple, and it can again be done completely in real time.
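One way to picture the metalanguage layer is as a dispatcher that maps human verbs onto robot-level routines. The sketch below is a guess at the general idea only; the command set matches the verbs mentioned in the episode, but the handlers and their behavior are invented for illustration and have nothing to do with the actual MI.RA/Dexter implementation.

```python
# Illustrative sketch of translating simple verbal commands ("look", "touch",
# "execute") into robot-level calls. All handlers are hypothetical.
from typing import Callable, Dict


def look(target: str) -> str:
    """Stand-in for a vision-system routine that locates an object."""
    return f"vision: locate '{target}'"


def touch(point: str) -> str:
    """Stand-in for a motion routine that moves the tool to a point."""
    return f"motion: move tool to '{point}'"


def execute(action: str) -> str:
    """Stand-in for running a pre-programmed action by name."""
    return f"run pre-programmed action '{action}'"


# The "metalanguage" layer: human verbs keyed to robot-level routines.
COMMANDS: Dict[str, Callable[[str], str]] = {
    "look": look,
    "touch": touch,
    "execute": execute,
}


def interpret(utterance: str) -> str:
    """Translate one spoken phrase, e.g. 'touch point A', into a robot call."""
    verb, _, argument = utterance.partition(" ")
    handler = COMMANDS.get(verb.lower())
    if handler is None:
        return f"unknown command: '{verb}'"
    return handler(argument)


print(interpret("look bolt"))            # → vision: locate 'bolt'
print(interpret("touch home position"))  # → motion: move tool to 'home position'
```

In a real system the hard parts sit on either side of this table: speech recognition in front of it, and vision-guided motion planning behind it.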

This software was actually recently demonstrated programming the infotainment systems on Fiat 500 vehicles. This entails literally pushing the buttons on a touchscreen to set the system up prior to shipping the automobile out, and typically a person would just have to sit there and do it step by step. It's tedious, it's boring, and it's probably a waste of that person's time. Using MI.RA/Dexter, Comau was able to send a robotic arm in through the window of the vehicle with a smaller robot attached to it that could very delicately push the buttons on the infotainment system. And the really interesting thing is that this is a high-precision task involving a certain amount of spatial variability, so fairly impressive software intelligence is required to make the robot capable of repeating it consistently.

Yet the robot was able to achieve this even while being driven by such an intuitive programming method, and that's impressive.

That's about all I have for you today, but if you enjoyed this video keep your eye on this space for more to come in the days ahead.