Are we humans or robots? One day we may not know the difference

Researchers in Scotland have developed expertise in robotics that positions the country as a leader in the field. Technologies that make robots sensitive to touch, that could allow them to express human-like emotions, and that enable them to recognise their surroundings, are being developed here.

This week, Heriot-Watt University and the Edinburgh Centre for Robotics, a joint project with the University of Edinburgh, are co-hosting the European Robotics Forum (ERF 2017) at the EICC. Delegates include researchers, engineers, managers, entrepreneurs, and public and private sector investors in robotics research and development. The forum’s objectives are to identify the potential of robotics applications for business, job creation and society, discuss breakthroughs in applications, learn about new business opportunities and initiatives, influence decision makers, and strengthen collaboration in the robotics community.

At Heriot-Watt, Professor Oliver Lemon and his team are developing technology to enable robots and computers to interact naturally with humans, using combinations of speech, gesture, movement and facial expressions. The team’s Interaction Lab is one of the few places in the world that has pioneered the use of machine learning methods to develop conversational language interfaces. This means that machines can now learn how to have dialogues with people and better understand and generate natural human language.

Ultimately, this research will develop new interfaces and robots which are easier, quicker, and more enjoyable to use. The Interaction Lab has worked with companies including Yahoo, Orange, and BMW to solve problems in conversational interaction with machines. It is breaking new ground in developing robots that can learn from human language and learn how to interact socially with humans. The team was successful in a 2015 European Commission Horizon 2020 funding call, which has resulted in ‘Pepper’, Heriot-Watt’s resident robot. Pepper exhibits behaviour that is “socially appropriate, combining speech-based interaction with non-verbal communication and human-aware navigation”. It is one of several initiatives indicative of Scotland’s expertise.
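To give a flavour of what learning how to have dialogues can mean in practice, the sketch below is a deliberately tiny illustration in Python, not taken from the Heriot-Watt team’s systems: a toy classifier learns to map a user’s words to a dialogue act and then picks a canned reply. Every utterance, label and reply in it is invented for the example.

```python
# Illustrative sketch only: a toy dialogue-act classifier of the kind a
# machine-learned conversational interface might build on. All utterances,
# labels and replies below are invented for this example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-made training set: user utterances paired with dialogue acts.
utterances = [
    "hello there", "hi robot", "what is the weather like",
    "will it rain today", "goodbye", "see you later",
]
acts = ["greet", "greet", "ask_weather", "ask_weather", "farewell", "farewell"]

# Learn the mapping from words to dialogue acts rather than hand-writing rules.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(utterances, acts)

# Canned replies for each predicted act (purely illustrative).
replies = {
    "greet": "Hello! How can I help?",
    "ask_weather": "Let me check the forecast for you.",
    "farewell": "Goodbye!",
}

user_input = "hi, will it rain later?"
predicted_act = model.predict([user_input])[0]
print(predicted_act, "->", replies[predicted_act])
```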

A team at Glasgow University has devised a solar-powered electronic skin that provides tactile and haptic feedback, for use in robotics, prosthetics and wearable systems. The technology was conceived and developed by Dr Ravinder Dahiya, a reader in electronic and nanoscale engineering, as part of a project funded by the Engineering and Physical Sciences Research Council (EPSRC). It began life in 2014 as a response to the limitations of robot technologies then in existence.

“Interfacing the multidisciplinary fields of robotics and nanotechnology, this research on ultra-flexible tactile skin will open up whole new areas within both robotics and nanotechnology,” said Dahiya at the time of the EPSRC award. “So far, robotics research has focused on using dexterous hands, but if the whole body of a robot is covered with skin, it will be able to carry out tasks like lifting an elderly person.

“Today’s robots can’t feel the way we feel. But they need to be able to interact the way we do. As our demographic changes over the next 15-20 years, robots will be needed to help the elderly.” Robots would have skin so that they could feel whether a surface is hard or soft, rough or smooth. They would be able to feel weight, gauge heat and judge the amount of pressure being exerted in holding something or someone.

“In the nanotechnology field, it will be a new paradigm whereby nanoscale structures are used not for nanoscale [small] electronics, but for macroscale [large] bendable electronics systems. This research will also provide a much-needed electronics engineering perspective to the field of flexible electronics.” The research is aligned with wider work on flexible electronics: the creation of bendable displays for computers, tablets, mobile phones and health monitors.

It has led to a breakthrough in materials science: the ability to produce large sheets of graphene – the ‘wonder’ material that is a single atom thick but is flexible, stronger than steel and capable of efficiently conducting heat and electricity – using a commercially available type of copper that is 100 times cheaper than the specialist type currently required. The resulting graphene also displayed ‘stark’ improvements in performance, making possible artificial limbs capable of providing sensation to their users.

Dahiya and his team have also succeeded in integrating photo-voltaic cells into the skin. “The real challenge was ‘how can we put skin on top of photo-voltaic and yet allow light to pass through the skin?’ That’s what we have done.” The optical transparency of the graphene allows about 98% of the light that strikes its surface to pass directly through it, making it ideal for gathering energy from the sun to generate power. The addition of solar power means there would be no need for an external battery to power the skin’s sensors. The technology could also increase the functionality of robots, allowing them to have a better understanding of what they touch and interact with, said Dahiya.
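As a rough, illustrative calculation rather than the Glasgow team’s own figures, the sketch below shows why that 98% transparency matters: it estimates how much power a small cell sitting under the skin might harvest in bright sunlight. Only the 98% figure comes from the research described here; the irradiance, cell efficiency and patch size are assumed values.

```python
# Back-of-the-envelope estimate, not the Glasgow team's figures. Only the 98%
# transmittance comes from the article; the other values are assumptions.
irradiance_w_per_m2 = 1000    # assumed bright sunlight
transmittance = 0.98          # quoted optical transparency of the graphene skin
cell_efficiency = 0.15        # assumed efficiency of the photovoltaic cell beneath
patch_area_m2 = 0.01          # assumed 10 cm x 10 cm patch of skin

harvested_power_w = irradiance_w_per_m2 * transmittance * cell_efficiency * patch_area_m2
print(f"Estimated harvested power: {harvested_power_w:.2f} W")  # about 1.47 W
```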

Last year, Dahiya received an award from the Institute of Electrical and Electronics Engineers that honours researchers who have made an outstanding technical contribution to their chosen field, as documented by publications and patents. The award is the latest in a string of recent accolades for Dr Dahiya, which also include the 2016 International Association of Advanced Materials (IAAM) Medal, the 2016 Microelectronic Engineering Young Investigator Award, and inclusion in the list of 2016 Scottish 40UNDER40.

Our goal is a robot you buy and when you tell it what to do, it will react the same way a human would

Work is also underway to make robots more human-like in their behaviour. Dr Oli Mival, a principal research fellow at Edinburgh Napier University, is an internationally recognised expert in human-computer interaction, working in education, healthcare, industry and government. At a conference in St Andrews last month, organised by VisitScotland’s business events team, Mival spoke about how user experience design in computing could change simple interactions with robots into something more human-like.

Much of his work over the past decade has been focussed on the study of software in digital assistants and companions. He cited four which dominate the market: Amazon’s Alexa, “an embodiment in a physical object”, and Microsoft’s Cortana, Apple’s Siri and Google Now, voice-driven assistants on mobile phones. “That’s all great,” he said, “but what about if we move from software to the notion of embodied AI [artificial intelligence]?”

Human evolution has wired us to define what we think of as natural, said Mival, and we are used to the idea of brains being inside bodies. We respond differently to artificial personalities, whether they are represented physically or are screen based. With a robot, its representation of a human body puts it closer to being a peer: “The challenge then becomes,” said Mival, “how do we make robots less robotic?” He used the example of the perceived difference in intelligence between cats and dogs. From a neurological and cognitive perspective, a cat is far less advanced than a dog; it lacks the fundamental neurological constructs that a dog has.

“However, it has a behaviour that we attribute intelligence to. And this is a construct that humans use also, knowingly or unknowingly. The attribution that we give about people’s intelligence comes from extrapolation based on their behaviour.” Mival said that this knowledge can be used to make a robot appear more human-like in its behaviour through so-called affective computing, where a machine can recognise, interpret, process and simulate human emotion. “We can start to have things that respond based on our understanding and the context of the world,” said Mival.
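As a purely illustrative sketch of the idea, and not Mival’s or anyone else’s actual system, the snippet below shows the kind of mapping an affective-computing layer might make from a detected emotional state and some context to a response style. The states, contexts and behaviours are all invented for the example.

```python
# Illustrative sketch only: a hand-written mapping from a detected emotional
# state and context to a response style. The states, contexts and behaviours
# are invented for this example; real affective computing would learn and
# refine such mappings rather than hard-code them.
def choose_behaviour(detected_emotion: str, context: str) -> str:
    """Pick a response style given a detected emotion and situational context."""
    if detected_emotion == "frustrated":
        # Back off and simplify rather than pressing on with the task.
        return "apologise, slow the speech rate, offer to repeat the last step"
    if detected_emotion == "happy" and context == "social_chat":
        return "mirror the positive tone and add expressive gestures"
    if detected_emotion == "neutral" and context == "task":
        return "stay concise and task-focused"
    return "ask a clarifying question in a neutral tone"

print(choose_behaviour("frustrated", "task"))
```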

Also at the conference was Dr Philip Anderson, of the Scottish Association for Marine Science (SAMS), which has developed techniques to probe the winter-time polar atmosphere, including low-power autonomous remote systems on the earth’s surface and kite, blimp and rocket instrument platforms in the earth’s atmosphere. Over the last 10 years, Anderson has pioneered the use of small robotic aircraft for measuring the structure of the atmosphere near the surface. At SAMS, the techniques aid understanding of sea-ice dynamics in the Arctic and help explain the dramatic reduction in summer-time sea-ice coverage.

Another speaker, Dr Kasim Terzic, a teaching fellow at the University of St Andrews, outlined work being carried out on computer vision: the automatic extraction, analysis and understanding of information from a single image or a sequence of images. Terzic has published extensively on visual scene understanding, a field of research central to the ability of robots to understand and communicate with the world around them.

He said there is a big gap between the complexity of tasks that robots can perform, and their ability to understand the physical context they inhabit: “Why can’t we have robots perceive the world the way we do? Part of it is that humans are so good at understanding visual clues; we are capable of a deep semantic understanding when presented with a scene. Computers are not very good at doing this.”

A huge amount of work has been done over the years on object recognition, he said, more recently supported by the amount of visual data available on the web. Making this process more accurate and less draining in terms of processing power is an ongoing challenge. But the next step is to progress to a high level of reasoning, where recognised objects can be placed in context to provide a semantic understanding of what is happening. Recently, Terzic and colleagues succeeded in programming a robot to recognise a favourite toy and follow it, ignoring other toys.

“We are getting good at object recognition,” he said. “It’s not solved, but we are getting to the point where we really need to leverage context. We are not going to solve it by throwing more algorithms at it; there is the potential to apply deep learning [machine learning inspired by the human brain]. And [then] there is [the] transition to a cognitive robot which understands and learns. Our goal is where you can have a service robot that you buy and when you talk to it, and tell it what to do, it will react the same way a human would do.”
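To illustrate what “leveraging context” might look like at its very simplest, and without claiming this is how Terzic’s group does it, the sketch below re-ranks a recogniser’s raw label scores for one object using an assumed prior over what is plausible in the surrounding scene. All scores and the prior are invented for the example.

```python
# Illustrative sketch only: combining a recogniser's appearance scores with an
# assumed scene prior, so context can settle an otherwise ambiguous decision.
# All scores and the prior are invented for this example.
recogniser_scores = {"toy_car": 0.40, "shoe": 0.35, "mug": 0.25}

# Assumed prior: the robot believes it is looking at a child's play area,
# so some labels are more plausible in this scene than others.
scene_prior = {"toy_car": 0.70, "shoe": 0.20, "mug": 0.10}

# Multiply appearance evidence by contextual plausibility and renormalise.
combined = {label: recogniser_scores[label] * scene_prior[label]
            for label in recogniser_scores}
total = sum(combined.values())
posterior = {label: score / total for label, score in combined.items()}

best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 2))  # context pushes the decision towards "toy_car"
```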