i3 | May 03, 2022

An Array of Humanoid Robots to Help

by Gary Arlen

What should a robot look like?

Most of us have seen C-3PO in Star Wars; the replicants Roy Batty and Zhora (played by Rutger Hauer and Joanna Cassidy, respectively) in the original Blade Runner; Rosey from The Jetsons; Robby the Robot in Forbidden Planet; Gort in The Day the Earth Stood Still; or the T-800 Terminator. Or the disembodied Samantha (voiced by Scarlett Johansson) in Her, a sentient artificial intelligence (AI) on whom Theodore Twombly (Joaquin Phoenix) has a crush.

These science-fiction characters and scores of other robots have shaped our visions of what a humanoid or hybrid anthropomorphic device “looks like.” It’s hard to forget those classic humanoids, even in a world of purpose-built machines: dedicated robotic arms for industrial and medical work, mobile robots for food service, and first-responder rescue and safety devices.

Sophia the Robot
Sophia is a realistic humanoid robot capable of displaying humanlike expressions and interacting with people. It’s designed for research, education, and entertainment, and helps promote public discussion about AI ethics and the future of robotics.

We will not examine semi-humanoid robots such as Pepper (developed by SoftBank Robotics), the social robot Jibo or the emotional companion robot Buddy. We’re also skipping some newcomers, such as the Italian iCub and the Indian robot Shalu, an educational humanoid that can speak dozens of languages.

This proliferation of robotic devices speaks volumes about the advancement of AI and micro-mechanics and the appeal of quasi-human collaboration. Amid the expanding activity in this field, the appeal of human-functioning robots continues to evolve. The new robotics boom is encouraging technologies that are spilling into alternative uses, such as exoskeletons for medical prosthetics and therapeutics, and for extra strength in manual-labor tasks.

David Hanson, founder and CEO of Hanson Robotics Ltd., an AI and robotics firm that makes the popular Sophia humanoid robot, says, “Designers and psychologists find that we are biased to read human-like faces and human-like forms. It is a more natural and intuitive form of communication and we can communicate better through it.” Headquartered in Hong Kong, Hanson Robotics is developing anthropomorphic beings for entertainment, health and business applications, and has exhibited robots at CES since 2007.

“Robots have been accepted in our lives as humanoids,” he says. “The use of humanoid and human-like robots in theme parks is successful around the world. The world class manufacturing and improvement of features will make these robots even more useful and ubiquitous.”

Hanson believes human-like machines are useful in situations where “you want to hold people’s attention, to engage with people, to communicate, to entertain and to put people at ease.”

He adds, “When people have face-to-face interactions, they remember the interactions better and it’s less exhausting.” He cites robots’ value in establishing narratives, including for educational applications, pointing to a University of Memphis study showing that students retain knowledge better when involved in “human-like interaction.”

PediaDroid
PediaDroid realistically simulates the jittery movements of a child reluctant to receive treatment, or a sudden change in medical condition. It can reproduce in detail expressions of emotion such as anxiety, fear and resistance, along with changes in facial color, pupils and breathing sounds.

Robots in Health Care

Similarly, in health and medical applications, including therapy, a humanoid “can facilitate a more naturalistic engagement,” Hanson adds.

TMSUK, a Kyoto, Japan, robotics developer, is focusing on health care-related humanoids, including the PediaDroid shown at CES. TMSUK is designing the robot to train doctors and dentists, to educate patients, and to serve hospitals and teaching institutions.

“I would like to create a future where humans and robots are co-existing,” says TMSUK Chairman Yoichi Takamoto.

The company’s PediaDroid simulates human vital signs and responses, such as pupil and eye movements, heart and breath sounds, pulse rate and body movements, explains Yusuke Ishii, a director at TMSUK. “By freely combining these, it is possible to reproduce various sudden changes and disorders in physical conditions,” he says.

The pediatric patient-type simulator is used in pre-clinical training for medical, dental and nursing students, as well as in preparation for licensing tests. Ishii says the PediaDroid can also educate parents and teachers “to deal with sudden [health or medical] changes.”

The company also is developing “DentaRoid,” a dental-patient simulator for clinical practice and training. This robot features voice recognition and can open its mouth according to the dentist’s instructions and perform actions such as coughing. TMSUK says the DentaRoid can “realistically reproduce complex actions, such as jaw closing and vomiting reflex.” 

Both droids can express general convulsive movements and changes in facial and pupil color, and can reproduce heart and respiratory sounds. Their skin is made of special soft resins, and an elastic silicone material is used for soft tissues such as the tongue. Sensors generate “unexpected head movements,” such as a twinge when pain is felt, Ishii says.

Robot capabilities, such as sensitive tactile functions, can be used for advanced technology services, including “next-generation prosthetics and artificial body parts,” explains Kimo Johnson, co-founder of GelSight, a Waltham, Massachusetts, firm that is productizing imaging-based tactile intelligence developed at the Massachusetts Institute of Technology (MIT). The company’s elastomeric technology gives robots “intelligent digital touch,” which enables complex object manipulation and the performance of dexterous tasks, he adds.

The GelSight sensor digitizes how a soft, deformable skin conforms to objects. It provides higher spatial resolution than the human finger and can differentiate objects based on hardness or softness, a capability that can distinguish bone or dense tissue beneath soft tissue. Johnson cites MIT research showing that human-like tactile capabilities allow robots to grasp delicate objects, including fruit.
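To make the idea concrete, here is a minimal Python sketch of how an imaging-based tactile signal might be read. It is not GelSight’s actual pipeline; the function names, the intensity threshold and the stiffness heuristic are hypothetical, meant only to illustrate the principle of inferring contact and hardness from how a soft skin deforms.

    import numpy as np

    def contact_mask(reference, touch, threshold=12.0):
        """Flag pixels where the elastomer skin has visibly deformed.

        reference -- grayscale frame of the gel at rest (H x W, 0-255)
        touch     -- grayscale frame while an object presses into the gel
        threshold -- hypothetical intensity change counted as real contact
        """
        diff = np.abs(touch.astype(np.float32) - reference.astype(np.float32))
        return diff > threshold

    def relative_stiffness(indentation_depths, applied_forces):
        # A hard object indents the soft skin less per unit of force than
        # a soft one, so the slope of depth vs. force is a crude relative
        # hardness cue (this heuristic is ours, not GelSight's method).
        slope, _ = np.polyfit(applied_forces, indentation_depths, 1)
        return 1.0 / slope  # larger value suggests a harder object

In practice, sensors of this kind go further and reconstruct the full 3D contact geometry from the deformation image, which is where the better-than-fingertip spatial resolution comes from.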

Born Smart or Constantly Evolving?

As humanoid robots stride into more sectors of our lives, does the “rise of the robots” mean the devices will continually learn new functions, or are they fully formed when they are activated?

To develop a life-like robot, an array of physical characteristics is necessary. For example, a realistic ankle should move in multiple directions (forward, twist, sideways) and a “normal” reaching motion often involves bending the torso, not just a mechanized extension of an arm. These characteristics raise the robotic equivalent of questions about “nature vs. nurture.” Will humanoid robots be built with such characteristics, or should they “learn” them from experience or upgraded software?

Robotic scientists are also grappling with issues such as AI, machine learning and other capabilities that may change the shape and features of next-generation robots. Mathew Schwartz, an assistant professor at the New Jersey Institute of Technology’s College of Architecture and Design, has been working on TOCABI (Torque Controlled compliAnt BIped) for eight years. His team began working on the robot’s lower legs in 2013 and the upper body design in 2016. They realized the big red humanoid avatar needed a new head, which prompted a debate over LiDAR (light detection and ranging) sensors versus cameras. The team, collaborating from Newark, NJ, and Seoul, has blended aesthetics and industrial design with real-world engineering, manufacturing, research and prototyping to create TOCABI (a Transformers lookalike), a finalist in the ANA Avatar XPRIZE competition, sponsored by the Japanese airline ANA, in September.

Schwartz says the XPRIZE underscores how much technology goes into a humanoid robot. It’s more than “a walking algorithm,” he says, citing the complexity of torque control, which goes beyond positioning to require an understanding of motor power, gravity compensation and similar factors. Schwartz says humanoid robots can perform in ways just like the humans they are imitating.
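For a sense of what “beyond positioning” means, here is a textbook-style Python sketch of one tick of a torque-control loop with gravity compensation. It is a generic illustration under our own assumptions, not TOCABI’s controller; the gains and names are hypothetical.

    import numpy as np

    def torque_command(q, dq, q_des, gravity_torque, kp=80.0, kd=8.0):
        """Compute one tick of joint torques for a torque-controlled limb.

        q, dq          -- measured joint angles (rad) and velocities (rad/s)
        q_des          -- target joint angles
        gravity_torque -- torque needed to hold the pose against gravity,
                          normally derived from the robot's mass model
        kp, kd         -- hypothetical gains; real robots tune these per joint
        """
        # The PD feedback term pulls each joint toward its target pose...
        tau = kp * (q_des - np.asarray(q)) - kd * np.asarray(dq)
        # ...while the gravity feed-forward cancels the limb's own weight,
        # so the gains can stay low and the limb remains compliant.
        return tau + np.asarray(gravity_torque)

The design point is that cancelling gravity in the feed-forward term lets the feedback gains stay low, which is what keeps a torque-controlled limb soft and safe when a person pushes on it.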

“It’s about the perception of other people,” he says. “If you have a human-like form, people will expect it to perform in a human-like way, such as reaching out for a coffee cup rather than swinging around to grab it. That’s why it has anthropometrics.” He cites the difference between the human knee and ankle and their movements: TOCABI has two motors in each ankle but only one at each knee, matching the range of motion each joint requires.

“There is the ability to autonomously identify a task such as walking across unlevel terrain and understand the robot’s body has to compensate,” he explains. “That’s a computer science question.”

That leads to the issue of humanoid robot intelligence and emotion. Schwartz says that, at the outset, robot designers have to tell the machine what to learn. “A robot doesn’t have a sense of personal space,” he says. But he is fascinated by the development of a “Sim2Real” simulator, which could enable “reinforcement learning,” a type of AI training that could teach the robot “policies about how it should act and behave so it can learn millions or billions of actions to take in the real world.”
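As a rough illustration of the trial-and-error training Schwartz describes, here is a toy tabular Q-learning loop in Python. The ten-state “terrain,” the reward and every constant are invented for this sketch and are unrelated to TOCABI’s Sim2Real work; real humanoid training uses far richer simulators and policies.

    import numpy as np

    rng = np.random.default_rng(0)
    N_STATES = 10                      # toy 1-D "terrain"; state 9 is the goal
    ACTIONS = (-1, +1)                 # step left or step right
    Q = np.zeros((N_STATES, len(ACTIONS)))
    alpha, gamma, epsilon = 0.1, 0.95, 0.1

    def simulator_step(state, action_index):
        # Hypothetical physics: move one cell, reward 1 only at the goal.
        nxt = min(max(state + ACTIONS[action_index], 0), N_STATES - 1)
        return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

    for episode in range(300):
        s = 0
        for _ in range(200):           # step cap so a bad episode cannot hang
            # Explore randomly sometimes (or while the values are still tied),
            # otherwise exploit the best-known action.
            if rng.random() < epsilon or Q[s, 0] == Q[s, 1]:
                a = int(rng.integers(len(ACTIONS)))
            else:
                a = int(Q[s].argmax())
            s_next, r = simulator_step(s, a)
            # Core update: nudge the chosen action's value toward the
            # reward plus the discounted value of the best next action.
            Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
            s = s_next
            if s == N_STATES - 1:
                break

After a few hundred simulated episodes, the greedy choice at every state in this sketch is to step toward the goal, which is the sense in which a simulator can teach a robot “policies about how it should act.”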

Beyond Imagination’s approach to humanoid robots comes in the form of “Beomni,” which debuted at CES 2022. The thin blue-and-white device moves on wheels and can be controlled by a person anywhere in the world wearing a VR headset and gloves. But Beomni’s autonomous, AI-driven future extends much further. CEO and Co-founder Harry Kloor says Beomni can learn multiple tasks, from opening a bottle or adding a pinch of salt while preparing a meal to manipulating heavy objects in an industrial setting. It can help a person garden, handle dangerous tasks such as electrical repairs or assist with an exercise workout program.

The lobes of Beomni’s “brain” are designed to process and execute specific skills, and this separation will prevent Beomni from becoming fully sentient. “What we’re teaching the robot to do would never enable it to become ‘The Terminator,’” Kloor says. “It’s learning to do tasks. It’s not gaining consciousness.” Beomni 2.0 will add features such as mobility and the ability to sense its surroundings, and plans include haptic technology delivered through the VR gloves. The Colorado company also plans to place Beomni on a space mission next year.

ElliQ
ElliQ is a proactive, voice-operated care companion designed to enhance independence and support social, mental and physical wellbeing. 

Semi-human?

Ben Shneiderman, the recently retired University of Maryland professor of computer science and founding director of its Human-Computer Interaction Laboratory, contends that AI and its implementations, such as robots, should be human-centered “to better serve human needs” rather than technology centered. He cites several varieties of “social robots” (anthropomorphic, humanoid, android, bionic, bio-inspired) and, in his new book “Human-Centered AI” calls for policies that “further human values, rights, justice and dignity by building safe, reliable and trustworthy systems.”

Shneiderman adds, “A common theme in designs for robots and advanced technologies is that human-human interaction is a good model for human-robot interaction, and that the emotional attachment to embodied robots is an asset.”

Nonetheless, developers are exploring alternatives to the full-sized humanoid robot, focusing on specific human use cases. In March, Intuition Robotics released ElliQ, a faceless “character-based person” consisting of two semi-circular shapes (like a head on shoulders) with a small video monitor on a side tray.

ElliQ is designed as a companion, intended to resolve the “emptiness” of living alone, the company’s CEO and Co-founder Dor Skuler told The Washington Post. The robot is not a humanoid because the target audience — mostly older individuals who live alone — felt threatened by the eyes, robotic audio and the “command-and-control” features of voice-activated personal assistants. This body-less, tabletop companion learns its person’s needs and develops a rapport to assist with sleep, pain, anxiety, depression, mindfulness, stress and health activities.

As these anthropomorphic robots evolve, how will humans embrace their new companions? Tampere University in Finland found a key factor is “making eye contact.” The study found that establishing eye contact with a robot “may have the same effect as eye contact with another person.” It added that people may perceive qualities such as “knowledgeability, sociability and likeability” in robots based on how they look or behave.

“Even though we know the robot is a lifeless machine, we treat it instinctively as if it could see us,” explains Tampere psychology professor Jari Hietanen, director of the project. “It’s as if it had a mind which looked at us.”

Hanson believes these features will encourage the acceptance of life-sized humanoid robots, including in technical fields such as telepresence and avatar technology. “As artificial intelligence is progressing rapidly, humanoid robots are going to become far more valuable,” Hanson predicts. “You can expect the future is going to look a lot like a science fiction novel.” Or perhaps a film.

Tesla Bot

Musk Envisions ‘Optimus’ Robots

In March, Elon Musk announced the “Tesla Bot,” a humanoid robot that includes technology from his car company. He also vowed to deliver a prototype by year’s end, with production models available in 2023.

“Optimus” will be able to represent the personality of its owner, Musk told Business Insider. A customer would download traits and the robot would behave like the owner.

Optimus is a general-purpose worker-droid, Musk said, explaining it would be ideal for tasks that are repetitive, boring or dangerous: things that “people don’t want to do.” Initial concepts call for it to deadlift 150 pounds, carry about 45 pounds and walk at a brisk, near-human pace of about five miles per hour. Musk promised the droids won’t harm humans, adhering to science-fiction guru Isaac Asimov’s First Law of Robotics: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”

Musk also predicted that the humanoid robot market will eventually be bigger than the Tesla automotive business.