Inside IT
Cobot’s Transformation from a Robot to a Human (2)

In the previous posting, we talked about the principle behind Cobots (Collaborative and Symbiotic Robots). Today, let's look at some more specific mechanisms and cases showing how Cobots have evolved so far, followed by examples of commercialized virtual robot systems and robot cloud platforms.

Collaborative_Robot_1

Robots cooperating with a human in assembly work

Let me show a simple example of a situation a real factory may face and the role a Cobot can play in it. Suppose a worker is assembling a small product with the help of a robot holding a screwdriver or a hand-drill-like tool, as seen in the image above.

If the robot is designed to follow a fixed routine according to a given program, it will drive screws into products at precisely fixed locations in a fixed order. But if a human worker places the parts however they like, that rule no longer helps, since the location and angle of each part is then effectively random.

This may lead the robot to botch the job completely or even injure the worker's hands. An advanced collaborative robot can be a huge help here, because it can detect the changing position and angle of the object and react flexibly to that information. It also senses whether a human is nearby and adjusts its speed, force, and distance to keep workers safe.

In factories, workers are mostly stationed along the assembly line. Although people usually call this 'simple assembly work', from a robot's point of view it requires more intelligence than conventional roles such as transportation or welding.

Many experts, including Prof. Julie A. Shah, who leads a robotics group at MIT, and Prof. Manuela Veloso of Carnegie Mellon, expect robots, which used to be simple and crude, to evolve into machines that coexist with humans[1], and research in this direction is actively in progress.

In the end, the most important thing is to secure workers' safety, and then to create more added value through the greater productive efficiency these robots bring. Ultimately, however, we have to remember that these robots will eventually replace human workers.

Soft_Robotics

The gripper made of soft material from Soft Robotics (left), the gripper from Festo imitating an elephant’s nose (middle), and Jamming Gripper from MIT (right)

The end-effectors (grippers) attached to robots can be designed in the form of human hands or animal claws with the help of 3D printing, using soft materials and pneumatic mechanisms. Moreover, with two arms and seven or more joints each, like a human arm, the robots shown below are becoming more human-like in both appearance and motion.

As a result, robots can now go beyond the so-called 3D (dirty, dangerous, difficult) jobs: they can pick up highly fragile materials and do delicate work that previously only humans could do (e.g. assembling microelectronic circuits). In the image below, you can see prototypes and new models: the Universal Robots UR series, KUKA LBR, ABB's dual-arm concept robots (YuMi, FRIDA), Rethink Robotics' Baxter and Sawyer, and Yaskawa Motoman.

cobot

Examples of advanced industrial robots – Rethink Robotics' Baxter/Sawyer, Universal Robots, ABB YuMi/FRIDA, and KUKA (clockwise from top left)

ros

What are the most important factors in getting these intelligent, advanced service robots to collaborate with other robots and with workers on site, and in using them safely?

They are an interface to a brain-like virtual system built on machine learning and situational awareness, together with better communication functions. Reflecting these needs, virtual software and user-friendly interfaces are being developed and standardized through the Robot Operating System (ROS).

Robotics experts, developers, and others in related manufacturing fields are already participating in this open-source project to build a library of middleware and interfaces. They are studying and defining the functions and rules needed for a symbiotic relationship between robots and humans in assembly processes that demand accuracy. The key for robotics research is therefore to focus on the relationship among robots, manufacturing environments, and humans, and to satisfy requirements such as the 'subdivision of tasks', 'programming', 'OS-independent, simple and compatible robot interfaces', 'flexibility to adapt to any situation', and 'behavioral inductivity'.
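
As a rough sketch of the kind of middleware ROS standardizes, the following Python node publishes a status message that a safety supervisor in a collaborative cell might emit. The rospy and std_msgs APIs are ROS 1's standard Python client libraries, but the node name, topic name, and message content here are hypothetical, chosen only to illustrate the publish/subscribe pattern.

```python
#!/usr/bin/env python
# Minimal ROS 1 node sketch (node/topic names and message text are hypothetical).
import rospy
from std_msgs.msg import String

def monitor():
    # Register this process with the ROS master as a node.
    rospy.init_node('cobot_safety_monitor')
    # Publish a human-readable status on a hypothetical topic.
    pub = rospy.Publisher('/cobot/safety_status', String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        # In a real cell this text would be derived from proximity/force sensors.
        pub.publish(String(data='worker_nearby: reduce speed'))
        rate.sleep()

if __name__ == '__main__':
    try:
        monitor()
    except rospy.ROSInterruptException:
        pass
```

A matching subscriber on the robot controller side could listen to the same topic and slow the arm whenever a worker is detected.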

Commercialized Virtual Robot Systems and Robot Cloud Platforms

Virtual robot simulation and programming

Virtual systems are expected to become the medium through which service robots are managed, controlled, and operated. On top of such a system, the interface can be divided according to people's roles (workers, users, administrators, programmers).

Examples of such virtual factory systems for commercial robots are Delmia Robotics Simulation, supported by Dassault Systèmes' cloud-based 3DEXPERIENCE platform, and Tecnomatix from Siemens. These systems also ship with extensive libraries covering diverse types of robots.

Industrial_Robotics

Delmia Industrial Robotics Simulation from Dassault Systèmes and Tecnomatix from Siemens (Source: http://www.3ds.com/)

By using these virtual applications, you can also work out space arrangements tied to the robot layout and the production process even when you are not at the production site. This reduces potential errors and collisions, and in turn the waste of materials and the risk of having to redo production. A standardized connection between 3D CAD models of products and other virtual business programs such as PLM (which holds classification information on components for various products) also lets the robot's end-effector work be synchronized with product and component specifications and increases the reuse rate of a production line.

These universal simulation tools, as well as the robot-specific programming tools for Baxter, Nao, and Pepper[2], show that the current trend is to make programming simple and intuitive regardless of the robot's purpose (industrial or personal). They support high-level interfaces, such as the choreographed workflows familiar from business systems.

Home_Humanoid

Visual choreographed-workflow programming for the home humanoid Nao, designed to suit end users (top), and SolidWorks-based 3D robot simulation for the industrial Baxter (bottom) (Source: www.businesswire.com; www-users.cs.umn.edu/~karen/dreu/)

Baxter works with 3D CAD systems such as SolidWorks from Dassault Systèmes, while Nao supports internet-based 3D robot simulation through Catia WebBot. What made these possible is user-friendly online and offline programming.
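
For a sense of what this user-friendly programming looks like in plain code rather than in a visual workflow, here is a minimal sketch using Aldebaran's NAOqi Python SDK for Nao. The ALProxy class and the ALTextToSpeech and ALMotion services are part of that SDK, while the IP address and spoken phrase below are placeholders.

```python
# Minimal NAOqi Python SDK sketch for Nao (address and phrase are placeholders).
from naoqi import ALProxy

NAO_IP = "192.168.0.10"   # placeholder address of the robot on the local network
PORT = 9559               # default NAOqi port

# Text-to-speech and motion are exposed as simple named services.
tts = ALProxy("ALTextToSpeech", NAO_IP, PORT)
motion = ALProxy("ALMotion", NAO_IP, PORT)

motion.wakeUp()                        # stiffen joints and stand up
tts.say("Hello, I am ready to work")   # speak a short phrase
motion.rest()                          # relax back to a safe resting pose
```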

For those who are uncomfortable with programming, there are even robots that users can teach intuitively, as if teaching a child: you hold the robot's arm and show it how to do things by moving it as you like. They also carry dozens of sensors for safety and a kind of self-awareness, stopping automatically whenever they notice they have bumped into something.
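
A rough sketch of how such 'hold the arm and show it' (kinesthetic) teaching can be implemented on a ROS-based research arm such as Baxter is shown below: joint angles are sampled while a person guides the limb, then replayed. The baxter_interface calls follow Rethink's open SDK, but the recording loop, duration, and replay logic are illustrative assumptions rather than the vendor's built-in teaching mode.

```python
#!/usr/bin/env python
# Kinesthetic teaching sketch for a ROS/Baxter-style arm (illustrative only).
import rospy
import baxter_interface

def record_and_replay(duration=5.0, rate_hz=10):
    rospy.init_node('kinesthetic_teaching_demo')
    limb = baxter_interface.Limb('right')

    # Teaching phase: a person guides the arm by hand while we sample its pose.
    trajectory = []
    rate = rospy.Rate(rate_hz)
    end_time = rospy.Time.now() + rospy.Duration(duration)
    while rospy.Time.now() < end_time and not rospy.is_shutdown():
        trajectory.append(limb.joint_angles())  # dict of joint name -> angle
        rate.sleep()

    # Replay phase: move back through the recorded joint configurations.
    for angles in trajectory:
        limb.move_to_joint_positions(angles)

if __name__ == '__main__':
    record_and_replay()
```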

Robot cloud platforms

Through a cloud platform, a robot will be able to process sound and video remotely with advanced analysis tools, or even download new capabilities when needed.

Collective-intelligence services will soon be built on the massive databases a cloud network can support. These can provide a web service just for robots, with information about the external environment and objects, as well as data on the tasks they can perform.
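
To make the idea concrete, here is a deliberately simple sketch of a robot querying such a 'web service for robots' over HTTP with Python's requests library. The endpoint URL, resource path, and response fields are entirely hypothetical; they only illustrate the robotics-as-a-service pattern described here, not the API of any platform named below.

```python
# Hypothetical robot-cloud query sketch (endpoint and fields are made up).
import requests

CLOUD_URL = "https://example-robot-cloud.local/api/v1"  # placeholder service

def lookup_object(object_name):
    """Ask the cloud knowledge base about an object it has seen before."""
    resp = requests.get(CLOUD_URL + "/objects/" + object_name, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. {"grasp_points": [...], "weight_kg": 0.4}

if __name__ == "__main__":
    info = lookup_object("power_drill")
    print("Suggested grasp points:", info.get("grasp_points"))
```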

Such advances are ultimately expected to lead to a world where we can analyze, test, and control robots at an industrial site, from the very beginning of development, in a more user-friendly way (e.g. talking to them as if over a walkie-talkie).

RoboEarth

RoboEarth's cloud architecture (left) and the future of robot cloud platforms (right) (Source: http://arstechnica.com/; http://venturebeat.com/)

Various platforms are currently in operation or development, including ABB's MyRobot platform, Grid Robotics, the Rapyuta service, Robohub, RoboEarth, V3Nity, and PROFETA. These platforms provide diverse information to robots and their users (robot managers, programmers, etc.) under the motto of 'robotics as a service'.

The technology that Alan Turing, the "father of computers and artificial intelligence", dreamed of is now taking shape in today's Cobots. The key to Turing's insight was imitation and learning through the similarities among different objects and generalization between them.

Just as a Turing machine's general character lets it be used in many environments, machines can now imitate the human way of thinking through machine learning. Their imitation is advancing toward copying the way we see, hear, and move by adopting machine vision and bionic movement methods.

The Cobot, which already exceeds humans at calculation, is now striving to become more like us. Turing once asked whether machines would be able to think like humans. What, then, is that humanness robots aspire so much to reach?

Written by Seung Yup Lee, Researcher

[1] Unlike general industrial robots, which perform automated jobs on their own, collaborative and symbiotic robots focus on guiding and assisting human workers throughout a shared work procedure (e.g. holding a heavy object up while a human worker welds below it). Intelligence and stability are considered especially important. [back to the article]

[2] A French home humanoid from Aldebaran. It has the form of the 'home robot' we typically imagine and, unlike industrial robots, is designed to feel familiar to general consumers. It offers an open development environment, so users can add extra functions, and it can even be used as a toy robot in the style of Lego Mindstorms. [back to the article]
