The proportion of retirees to workers is rising everywhere in the developed world. One of the greatest challenges facing public health planners in the coming decades is how to care for those ever more numerous armies of elders without breaking the bank. Enter the robot. The first priority of these robots will be to ascertain the needs of their human charges: what must they do to keep them healthy and to make their days brighter? For these vital tasks, they will need to be equipped with sensors to help them determine their course of action.
Sensors Connect Robots to the World
OMRON Corporation has developed the OKAO vision sensing technology system, which can evaluate a person’s gestures and the direction of his or her gaze. A package of hardware and software, it can also estimate an individual’s mental state, such as whether he or she is angry, happy, or sad.
It is also envisioned that capacitance touch detectors can be enhanced to detect changes in the skin, allowing a robot to infer emotional and physiological state via touch. Other sensors under development will gauge not only the pressure of a senior’s grip, but also how that pressure changes over time. Changes in blood pressure and heart rate reported by wearable devices will also help the robot evaluate its human charge.
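To make this concrete, here is a minimal Python sketch of how such readings might be combined. The sensor values, units, thresholds, and function names are invented for illustration and do not describe any particular product.

```python
# Hypothetical sketch: combining grip-pressure readings over time with
# wearable vital signs to flag a possible decline. All names, units, and
# thresholds here are illustrative assumptions.

def grip_trend(pressures_kpa):
    """Average change per reading across a series of grip-pressure samples."""
    deltas = [b - a for a, b in zip(pressures_kpa, pressures_kpa[1:])]
    return sum(deltas) / len(deltas)

def assess(pressures_kpa, heart_rate_bpm, systolic_mmhg):
    """Return a coarse list of concerns based on grip trend and vitals."""
    concerns = []
    if grip_trend(pressures_kpa) < -1.0:   # grip weakening over the session
        concerns.append("declining grip strength")
    if heart_rate_bpm > 100:               # elevated resting heart rate
        concerns.append("elevated heart rate")
    if systolic_mmhg > 140:                # hypertensive reading
        concerns.append("elevated blood pressure")
    return concerns or ["no flags"]

print(assess([32.0, 31.2, 29.8, 28.1], heart_rate_bpm=88, systolic_mmhg=150))
```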
Aldebaran, the company that built Romeo, has collaborated with SoftBank to build another emotion-reading robot called Pepper. Like Romeo, Pepper relies heavily on cameras and depth sensors to read body language and facial expressions. Each of Pepper’s hands contains a touch sensor, and its torso holds three more. The expectation is that visual, distance, and touch cues together will let the unit’s AI draw reliable inferences about a human’s emotional state.
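One simple way to combine cues like these is so-called late fusion, in which each modality makes its own guess about the person’s state and the guesses are averaged. The Python sketch below illustrates the idea; the weights and scores are assumptions for illustration, not details of Pepper’s actual software.

```python
# Hypothetical sketch of late fusion: each modality (camera, depth, touch)
# produces its own emotion scores, and the robot averages them with weights.
# The weights and readings below are invented.

MODALITY_WEIGHTS = {"vision": 0.5, "depth": 0.2, "touch": 0.3}

def fuse(per_modality_scores):
    """Weighted average of per-modality emotion scores -> (top label, scores)."""
    fused = {}
    for modality, scores in per_modality_scores.items():
        weight = MODALITY_WEIGHTS[modality]
        for emotion, p in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * p
    return max(fused, key=fused.get), fused

readings = {
    "vision": {"happy": 0.6, "sad": 0.1, "neutral": 0.3},  # facial expression
    "depth":  {"happy": 0.3, "sad": 0.2, "neutral": 0.5},  # posture, body language
    "touch":  {"happy": 0.7, "sad": 0.0, "neutral": 0.3},  # e.g., a pat on the hand
}
label, scores = fuse(readings)
print(label, scores)
```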
What’s Next
In the near term, the human who interacts with the robot might best be the one wearing the sensor. Researchers have developed very thin sensors, based on carbon nanotubes, that are exquisitely sensitive to even the faintest movement of the skin. Software can then decode the totality of the facial movement to determine whether happiness, sadness, or even pain is being expressed, and the robot, through its AI, can determine an appropriate response. Using these methods, the robot can get a far better handle on its charge’s emotional state than would be possible from camera-derived cues alone.
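As a rough illustration of that decoding step, the following sketch matches an invented strain “signature” against per-expression templates using a nearest-centroid rule. Real systems would learn such templates from many channels of training data; everything here is an assumption.

```python
# Hypothetical sketch: classifying skin-strain signals from thin-film
# sensors into expressions. The feature vectors and templates are invented.

import math

# Invented templates: (cheek strain, brow strain, jaw strain) per expression,
# each value a rough mean absolute strain for that channel.
CENTROIDS = {
    "happiness": (0.8, 0.1, 0.3),  # raised cheeks, relaxed brow
    "sadness":   (0.2, 0.6, 0.1),  # furrowed brow
    "pain":      (0.5, 0.9, 0.7),  # brow and jaw both tense
}

def classify(features):
    """Return the expression whose template is nearest to the feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

print(classify((0.75, 0.15, 0.25)))  # -> happiness
```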
Sensors that can detect smells are an active field of research. Since the first task that roboticists envision for their inventions is the care of the elderly and infirm, such sensors could be an important tool in deciding when it is time for intervention by human medical specialists (see Electronic Sensors Monitor the Environment).
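One plausible shape for such a decision, sketched below in Python under invented baselines and thresholds: the robot tracks a handful of gas-sensor channels against a per-person baseline and flags a large deviation for human follow-up.

```python
# Hypothetical sketch of an "electronic nose" alert. Channels, baseline
# values, and the threshold are all illustrative assumptions.

BASELINE = {"ammonia": 0.10, "acetone": 0.05, "sulfur": 0.02}  # arbitrary units
ALERT_THRESHOLD = 3.0  # total relative deviation before alerting a human

def deviation_score(reading):
    """Sum of per-channel relative deviations from the baseline."""
    return sum(abs(reading[ch] - base) / base for ch, base in BASELINE.items())

def should_alert(reading):
    return deviation_score(reading) > ALERT_THRESHOLD

sample = {"ammonia": 0.45, "acetone": 0.06, "sulfur": 0.02}  # elevated ammonia
print(should_alert(sample), round(deviation_score(sample), 2))
```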
So, while robots will not be replacing human caretakers, much less professional nurses, any time soon, the foundations have been laid. And based on present demographic trends, there isn’t much time to lose.