Programming “Appropriate” Behaviours for an Elevator Riding Robot

Imagine a scenario where an autonomous robot needs to ride an elevator to complete a delivery task. When the robot navigates to the elevator, it encounters a person who is waiting to ride it. If the robot can’t ride the elevator with someone for safety reasons, what should the robot do?

If the robot was on an urgent delivery mission, would this influence your decision? What if the person wasn’t waiting for the elevator, but was already riding it? The person could also be just a regular Joe, someone in a wheelchair, or someone who is carrying a box full of heavy objects.

We conducted an online survey to find out how we should design our algorithms to regulate the robot’s behaviour. We generated twelve survey questions from this scenario by varying factors such as the urgency of the robot’s task and the state of the person. For each question, participants ranked four possible behaviours by appropriateness. The survey results were used to train a machine learning algorithm (Q-learning) implemented on the PR2 robot at the University of British Columbia.
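To make the pipeline concrete, here is a minimal sketch of how survey-derived appropriateness scores could drive a tabular Q-learning update. Everything in it is illustrative: the state factors (`URGENCIES`, `PERSON_STATES`), the four candidate actions, and the numeric rewards are made-up stand-ins, not the study’s actual survey design or data.

```python
import random

# Hypothetical state factors and candidate behaviours (not the study's actual design).
URGENCIES = ["normal", "urgent"]
PERSON_STATES = ["waiting", "riding", "wheelchair", "carrying_box"]
ACTIONS = ["yield_and_wait", "ask_to_share", "take_stairs_route", "request_priority"]


def reward(state, action):
    """Illustrative appropriateness score, standing in for mean survey rankings."""
    urgency, person = state
    base = {
        "yield_and_wait": 2.5,
        "ask_to_share": 1.0,
        "take_stairs_route": 0.5,
        "request_priority": 0.2,
    }[action]
    if urgency == "urgent" and action == "request_priority":
        base += 3.0  # urgency makes assertive behaviour more acceptable
    if person == "wheelchair" and action == "request_priority":
        base -= 2.0  # but never displace a person in a wheelchair
    return base


def train(episodes=8000, alpha=0.1, epsilon=0.2, seed=0):
    """Bandit-style Q-learning: each elevator encounter is a one-step episode."""
    rng = random.Random(seed)
    states = [(u, p) for u in URGENCIES for p in PERSON_STATES]
    Q = {(s, a): 0.0 for s in states for a in ACTIONS}
    for _ in range(episodes):
        state = rng.choice(states)
        # Epsilon-greedy action selection.
        if rng.random() < epsilon:
            action = rng.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        # One-step update toward the observed appropriateness score.
        Q[(state, action)] += alpha * (reward(state, action) - Q[(state, action)])
    return Q


def best_action(Q, state):
    return max(ACTIONS, key=lambda a: Q[(state, a)])
```

Under these toy rewards, the learned policy yields to a waiting person by default, requests priority only when its task is urgent, and still yields when the person is in a wheelchair, mirroring the kind of context-dependent norms the survey was meant to capture.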

The video above illustrates our concept.

With this example, we incorporated stakeholders’ moral and social norms into the behaviour design process. The actions the robot takes in a given situation are based on the survey results. This means that your input through the survey, on what you consider the most appropriate robot behaviour, directly affects how the robot behaves. It is not necessarily the best approach, but it is one of many ways to let our moral and social norms inform robot behaviours.

Here is our discussion of the work presented at the We Robot 2013 Conference hosted at Stanford University:

Related Publication(s):
