Robots are learning hand gestures by watching hours of TED talks

Pepper the robot delivers a speech accented with hand gestures learned from TED videos


We say a lot with our hands. We spread them wide to indicate size, stab the air for emphasis and reach out to draw people in. Waving our hands about when we speak makes us appear less robotic – and that’s true for robots too.

Youngwoo Yoon at the Electronics and Telecommunications Research Institute in Daejeon, South Korea, and his colleagues trained a machine learning system to match hand gestures to different words and phrases by showing it 52 hours of TED talks …
