Robots are learning hand gestures by watching hours of TED talks

Pepper the robot delivers a speech accented with hand gestures learned from TED videos


We say a lot with our hands. We spread them wide to indicate size, stab the air for emphasis and reach out to draw people in. Waving our hands about when we speak makes us appear less robotic – and that’s true for robots too.

Youngwoo Yoon at the Electronics and Telecommunications Research Institute in Daejeon, Korea, and his colleagues trained a machine learning system to match hand gestures to different words and phrases by showing it 52 hours of TED talks …
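The researchers' actual system is a neural network trained end to end on speech transcripts paired with gesture poses extracted from the TED videos. As a toy illustration of the underlying word-to-gesture mapping idea only (not the team's method), the sketch below learns, from hypothetical annotated pairs, which gesture label most often co-occurs with each word; all function names, gesture labels, and training pairs here are invented for illustration.

```python
from collections import Counter, defaultdict

def train_gesture_map(paired_data):
    """Count which gesture label co-occurs with each word.
    A frequency-count stand-in for the neural model described in the article."""
    counts = defaultdict(Counter)
    for words, gesture in paired_data:
        for w in words:
            counts[w.lower()][gesture] += 1
    # For each word, keep its most frequently co-occurring gesture
    return {w: c.most_common(1)[0][0] for w, c in counts.items()}

def gesture_for(model, word, default="rest"):
    """Look up a gesture for a word, falling back to a neutral pose."""
    return model.get(word.lower(), default)

# Hypothetical (transcript words, annotated gesture) training pairs
data = [
    (["this", "huge", "idea"], "spread_wide"),
    (["a", "huge", "problem"], "spread_wide"),
    (["come", "join", "us"], "reach_out"),
]
model = train_gesture_map(data)
print(gesture_for(model, "huge"))  # spread_wide
print(gesture_for(model, "join"))  # reach_out
print(gesture_for(model, "data"))  # rest (unseen word)
```

A real system must also time gestures to speech and produce smooth joint trajectories rather than discrete labels, which is why the researchers used a sequence model over whole sentences instead of a per-word lookup like this one.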
