Humanoid robots & natural language programming

September 17, 2019

By Natalie Khoo

Today, Pepper the humanoid robot started serving ice cream to customers at a new retail store in Federation Square, Melbourne. There are no humans behind the counter; Pepper is able to converse with customers, complete tasks and process orders with help from other robotic friends.

Commissioned by Niska Robotics, Avion is thrilled to have driven the human-robot interaction component of this project, working on natural language programming and conversation design. We consider this collaboration a flagship piece of work that represents Avion’s niche capabilities in the emerging technology space. To celebrate the launch of Pepper’s ice cream store, I’m here to provide a brief overview of human-robot technology, and cover some of the challenges involved with rolling out a commercially viable humanoid robot project.

Artificial intelligence (AI) & the uncanny valley

Bina48 was released by Hanson Robotics (now headquartered in Hong Kong) in 2010. (BINA stands for Breakthrough Intelligence via Neural Architecture.) She spoke as a panellist at South by Southwest in 2012 and has since been featured in many TEDx presentations and TV shows. Her more modern counterpart Sophia, also developed by Hanson Robotics and activated in 2016, has made headlines as well.

Image from the Hanson Robotics website

What makes Bina48 and Sophia so interesting is the visceral, emotional reaction we have when we see them. Their lifelike appearance unsettles us – and this is what we call "the uncanny valley". In short, the theory is that the more a robot looks, talks and moves like a human being, the more uncomfortable we feel.

Image courtesy of Simple Wikipedia

When I was exploring trends at South by Southwest in Texas earlier this year, I had the pleasure of listening to Aleksandra Przegalinska, who holds a PhD in artificial intelligence. Her presentation was called Will Machines Be Able To Feel? If you're interested in emotional data, I highly recommend seeking out a recording of her talk.

Other (less terrifying) types of humanoid robots

The uncanny valley is exactly why humanoid robots designed to provide emotional, social or service support DO NOT closely resemble human beings. Take Pepper, our ice cream serving robot, for example. Pepper was designed by SoftBank Robotics. People find Pepper cute because, while it has big eyes and mimics human gestures, it features a hard exterior – like something we'd see in Star Wars or a Disney film – not a rubbery layer of skin akin to ours. Other notable humanoid robots include Honda's ASIMO, UBTECH's Walker, and two other SoftBank projects, Romeo and Nao.

The future of retail & service industries

Over the last few years, airports around the world including Munich, Montreal, Taipei and Los Angeles have been deploying Pepper to help travellers navigate their way around. Humanoid robots are also finding their way into retail stores, helping customers find the items they're looking for. The hotel industry is quickly adopting this technology too; one of the first movers was the Hilton Group, which launched Connie the robot concierge in partnership with IBM in 2016.

Another industry that’s madly trying to keep up with AI is financial services. HSBC has successfully launched Pepper in its New York branch. In Australia, CBA has been exploring how Chip from PAL Robotics could be used, but we’re yet to see this come to life (no pun intended).

What IS happening right now, however, is an explosion of investment in chatbots, voice technology and virtual assistants. We're all familiar with Jess from Jetstar, for example. A new (and very fun) character we've fallen in love with is Arlo the Koala for NRMA, thanks to our friends at Nuance Communications. Currently, our copywriters at Avion are very excited to be working on dialogue scripts for a range of chatbots across several brands. Another agency that's been making waves is Soul Machines from New Zealand. One of their most recent projects is Jamie, the virtual assistant for ANZ. While I'm unsure how this technology will play out with real customers, it's incredible to see what's possible.

5 conversation design challenges

Asking the right questions at the start of any project is crucial to success. Effective conversation design doesn’t happen in isolation – it factors in all aspects related to customer experience. Thankfully, the team at Avion had the experience to bring all the moving parts of Pepper’s ice cream shop together.

Want to know our secrets? Ok, I’ll spill the beans. If you’re looking to launch a commercially viable project and create scripts for a humanoid robot, here are some of the things you’ll want to ask.

1. Will the robot have sensory awareness (e.g. voice or facial recognition) activated at launch, or will people choose options from a screen?

If so, you will need to give your programmers clear instructions about what happens when your robot hears certain words or picks up certain expressions. For example, if X happens (it hears 'hello'), then Y happens (it responds with, 'hi, how are you?'). The challenge is addressing all the different ways a human can ask the same question: you must account for all kinds of synonyms and slang, and then draft multiple responses so that no two conversations sound exactly the same.
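To make the if-X-then-Y idea concrete, here's a minimal sketch in Python of how trigger phrases and response pools might be scripted. All intent names, phrases and responses are illustrative inventions, not the actual scripts we wrote for Pepper, and a production system would use a proper NLU engine rather than substring matching:

```python
import random
from typing import Optional

# Hypothetical intent definitions: each intent lists trigger phrases and
# a pool of responses, so no two conversations sound exactly the same.
INTENTS = {
    "greeting": {
        "triggers": ["hello", "hi there", "hey", "g'day", "good morning"],
        "responses": [
            "Hi, how are you?",
            "Hello! What can I get you?",
            "Hey there! Craving some ice cream?",
        ],
    },
    "order": {
        "triggers": ["ice cream", "scoop", "cone", "gelato"],
        "responses": [
            "Great choice! Which flavour would you like?",
            "Sure! Cup or cone?",
        ],
    },
}

def match_intent(utterance: str) -> Optional[str]:
    """Return the first intent whose trigger phrase appears in the utterance.

    Naive substring matching - enough to show the if-X-then-Y structure
    of the scripts handed to programmers.
    """
    text = utterance.lower()
    for name, intent in INTENTS.items():
        if any(trigger in text for trigger in intent["triggers"]):
            return name
    return None

def respond(utterance: str) -> str:
    """Pick a randomised response for the matched intent, or a fallback."""
    intent = match_intent(utterance)
    if intent is None:
        return "Sorry, I didn't catch that. Could you say it another way?"
    return random.choice(INTENTS[intent]["responses"])
```

Listing several trigger phrases per intent is how you cover synonyms and slang, and drawing responses randomly from a pool is how you keep repeat customers from hearing the exact same lines.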

2. How does the robot move?

Humanoid robots come with different mechanisms. Consider how it moves its head and limbs, then marry it up with how a human might behave in conversation. You must match your intents and responses (questions and answers) with a wide array of gestures before handing your scripts to the programmers.
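A script handed to programmers might pair each line of dialogue with gesture cues, along these lines. The gesture names below are illustrative placeholders, not SoftBank's actual animation identifiers:

```python
# Hypothetical script format: each scripted response carries gesture
# cues that a programmer can translate into the robot's motion API.
SCRIPT = [
    {"intent": "greeting",
     "response": "Hi, how are you?",
     "gestures": ["wave_right_arm", "tilt_head"]},
    {"intent": "order",
     "response": "Which flavour would you like?",
     "gestures": ["open_palms", "lean_forward"]},
    {"intent": "farewell",
     "response": "Enjoy your ice cream!",
     "gestures": ["wave_both_arms", "nod"]},
]

def gestures_for(intent: str) -> list:
    """Collect the gesture cues scripted for a given intent."""
    return [g for line in SCRIPT if line["intent"] == intent
            for g in line["gestures"]]
```

Keeping gestures in the script itself, rather than leaving them to the programmers, is what makes the robot's movement feel matched to what it's saying.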

3. What’s the budget for conversation design?

A human-robot interaction could go in a million different directions (and go on forever!), so it's important to know where to draw the line. The amount of funding you have will determine how much time you can invest in drafting effective intents and responses (questions and answers) that facilitate an entire transaction (e.g. from a greeting through to taking an order and helping a customer with collection). If you're starting out with a proof of concept, be realistic. A small suite of conversations executed really well from start to finish beats a broad but poor user experience.

4. Does the robot integrate with other systems?

If your robot is in a retail setting, you might sync it with point-of-sale (POS) software. But how does your robot take money? Does someone tap their card, or does the robot need to take cash? Design the conversation carefully so that people feel safe handing over something as personal as payment. Your robot may also need to speak to other robots in the store, so partner with the technicians on your project to understand which triggers are required for certain actions to take place.
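One way to think about those triggers is as named events flowing between systems. Here's a minimal sketch in Python, where all event and action names are hypothetical and stand in for whatever the POS or other robots actually send:

```python
from typing import Callable, Dict, List

class TriggerBus:
    """Route named events (e.g. from the POS) to robot actions."""

    def __init__(self):
        self._handlers: Dict[str, List[Callable[[dict], None]]] = {}

    def on(self, event: str, handler: Callable[[dict], None]) -> None:
        """Register an action to run when the named event fires."""
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event: str, payload: dict) -> None:
        """Fire an event, invoking every registered handler in order."""
        for handler in self._handlers.get(event, []):
            handler(payload)

bus = TriggerBus()
spoken = []  # stand-in for the robot's text-to-speech queue

# When the POS confirms payment, the robot tells the customer
# where to collect their order.
bus.on("payment_confirmed",
       lambda p: spoken.append(f"Thanks! Order {p['order_id']} is on its way."))

bus.emit("payment_confirmed", {"order_id": 42})
```

Agreeing on this kind of event vocabulary with the technicians up front is what lets the conversation designers script around moments like "payment confirmed" or "scoop ready" without knowing the hardware details.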

5. What’s the bricks-and-mortar fit-out?

People are intrigued by robots, and there is every chance that someone could vandalise the robot if they get close enough. So it's good to know whether the robot will be behind a counter or a screen, or free to roam – and how this could affect the way people interact with it. For an overview of our work with Niska Robotics, check out our case study Giving a voice to an ice cream-selling robot.

More resources on natural language programming for AI

The Avion team is passionate about emerging technology and we’re always keen to share our knowledge. If you’d like to read more, check out our other blog posts.

If you’re thinking of launching a chatbot, voicebot, virtual assistant or humanoid robot, please contact us. Avion would love to help. And don’t forget to visit Pepper in Federation Square! You can find open hours on the Niska website.