RAD ROBOTS

Written by Tshepo Mokoena
Photos and illustrations by Max Aguilera-Hellweg, Arthur Caranta
Sunday 18th September 2011

The Unnecessary Human Voice in Sports News Reporting

First off, we've got a little piece of bot software from across the pond that's being used to churn out news stories. Just when writers thought the job market couldn't get any tougher, some auto-bots come along to push you off the press desk. Narrative Science, an Illinois start-up, has been trialling software that can produce entire articles from crunched data, mostly financial reports and sports statistics. And all in about a minute. Word broke a few months ago about a similar piece of software, but it hadn't been fully refined yet; it was still too easy to detect that clinical lack of human voice.

Well, these Narrative Science people seem to have nailed it, with one of their sports pieces running completely undetected in the news summary section of the Big Ten Network (a collaboration between Fox and the Big Ten Conference). So what does this really mean for writers? Well, that publications with editorial budgets too strained to hire new talent can simply add themselves to the Narrative Science client list, then not have to worry about paying actual people to write about events that other people will want to read about. Simple, if a tad harsh for all the sports byline hopefuls out there.

Word of Weird, Prosthetic Mouth

To move smoothly from the written word to the spoken, researchers are also on a mission to perfect the robot's speaking 'organs'. We're talking about the whole shebang here, from the tongue and larynx to the vocal cords and the air that pushes through them when producing sound. We've all come to accept the tinny and slightly jarring standard robot voices, but those have mostly just come from speech synthesisers. Anyone who's tried to get the Adobe read-aloud function to recite an entire journal article (in an attempt to multi-task) will know exactly the halting, totally unnatural speaking style we're on about. So now scientists have figured that the next challenge should be getting the robot to physically imitate our speech technique, not only to look weird but also to function as a tool for the hearing-impaired.

Engineers at Japan's Kagawa University put together this piece of robot anatomy, complete with its own imitation nasal cavity, vocal cords, resonance tube and air pump. It can listen to human speech too, tracking the lilts and movements of intonation with an algorithm that is later used to try to iron out inconsistencies suffered by those who are hard of hearing. It's nice to know the nightmarish invention has a noble purpose at its core, at least. Watch it try to sing a song, above.

Trickery, Androids and Electric Sheep

The hardest part of getting bots to really communicate like people is getting them to think. And we mean beyond sorting algorithms and that sort of thing: the realm of reflective, self-aware thought. So far, a little guy known as HERB (the Home Exploring Robotic Butler, below) seems to be one of the furthest along in reaching this goal.

Developed jointly by Intel Labs Pittsburgh and the Robotics Institute at Carnegie Mellon University, HERB seems capable of projecting future thought into his 'mind' by mapping out possible scenarios and figuring out how he'd react in them. According to Siddhartha Srinivasa, HERB's builder and a Robotics Institute professor, "...the robot is actually visualising itself doing something", and that's about as close to dreaming as a bot has gone before. When given a simple task to complete by students at the Institute, HERB scopes out his surroundings with his laser/camera eyes, visualises the task on an inner screen, then actually carries it out. While from the outside it might just look like a slow robot (slow-bot? Perhaps not) taking a really long time to pick up a box, the complexities behind the motion are pretty extraordinary.

Unfortunately though, HERB doesn't quite measure up in terms of looking like an actual butler. For that sort of thing, we turn to the world of androids: robots designed to look like us. While most of them just cover the territory of being both horrifying and unconvincing, there is some hardcore work being done in Japan to make them impossible to tell apart from a real person within the next decade. The man at the centre of a lot of this work is scientist Hiroshi Ishiguro (above, right. I think) from the Intelligent Robotics Laboratory at Osaka University. He's pushing to break down the barriers of human-robot interaction (HRI) so that, in the future, robots will be able to seamlessly integrate themselves into human social environments and take on roles as caretakers, childminders and more.

Ishiguro's already made himself an android twin with his trademark furrowed brow, slight scowl and kinda menacing eyes. In fact, the robot's eyes are definitely more menacing. He's inspired the likes of Kokoro Company and their Actroid-DER range of fem-bots, and also designed the Geminoid-F, who's appeared in plays in Tokyo alongside real actors. She was playing an android, granted, but it's another step towards HRI. Only when bots like the Geminoid can combine their looks with the smarts of guys like HERB will we really have something to worry about. So far, androids generally need to be run by people in control rooms or they just get all jerky, awkward and un-human. Phew.
 
