Robot dogs are a dime a dozen — well, not quite, the latest Sony Aibo goes for about $1,700 — but the point is they’ve become pretty common in the overpriced toy market.
A researcher at the University of Washington, though, is working on a version of a robot dog that promises to do more than sit, bark and (though real dogs seldom do this) play music that you program into it.
Normally, when we hear the phrase artificial intelligence we think of intelligence that mimics that of a human.
Kiana Ehsani and colleagues have gathered a unique data set of canine behavior and used it to train an AI system to make dog-like decisions, according to MIT Technology Review.
They say their approach opens up a new area of AI research that studies the capabilities of other intelligent beings on our planet, which strikes me as a good thing — given how humans often botch things up.
To gather their initial data, the team fitted a dog with inertial measurement units on its legs, tail, and body to record its movements. They also fitted a GoPro camera to the dog’s head to record the visual scene, sampled at a rate of five frames per second, and a microphone on the dog’s back to record sound.
The setup gathered about 24,500 video frames with synchronized body position and movement data, which the team used to study how dogs act, plan and learn, and to try to predict a dog’s future movements from those recordings.
The researchers say the system got to the point where it could accurately predict the next five movements after seeing a sequence of five images.
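That prediction task, mapping a short window of visual frames to a short window of upcoming movements, can be framed as sequence-to-sequence regression. The sketch below is only a toy illustration of that framing with randomly generated data; the feature sizes, the simple linear model, and all names here are invented for the example and are not the researchers’ actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (invented for illustration): each frame is a 16-dim
# visual feature vector; each movement is a 4-dim joint-angle vector.
FRAME_DIM, MOVE_DIM, WINDOW = 16, 4, 5

# Fake "dataset": 200 samples of 5 input frames -> 5 future movements.
X = rng.normal(size=(200, WINDOW * FRAME_DIM))  # flattened frame windows
Y = rng.normal(size=(200, WINDOW * MOVE_DIM))   # flattened movement windows

# Fit one linear map from frame window to movement window (least squares).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict_moves(frames: np.ndarray) -> np.ndarray:
    """Predict the next 5 movement vectors from 5 frame feature vectors."""
    return (frames.reshape(1, -1) @ W).reshape(WINDOW, MOVE_DIM)

pred = predict_moves(X[0].reshape(WINDOW, FRAME_DIM))
print(pred.shape)  # (5, 4): five predicted movement vectors
```

The real system would use learned visual features and a far more expressive model, but the input/output shape of the problem is the same: five frames in, five movements out.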
No actual dog robot was built, just an AI system, but the long-term goal appears to be a robot dog that could do everything a real dog does, up to and including sniffing out a trail and helping the blind.
Of course we already have an abundance of dogs with a built-in knack for those kinds of things but, human intelligence being what it is, we want to duplicate it in machine form. And more to the point, there are things to be learned in doing so.
The team outfitted the dog, a Malamute named Kelp M. Redmon, with sensors to record its movements, a camera to capture its viewpoint, and a microphone.
They recorded hours of activities — walking in various environments, fetching things, playing at a dog park, eating — syncing the dog’s movements to what it saw.
The resulting data was used to train a new AI agent.
Their work so far gathered data from just one dog, and it focused primarily on what the dog saw and heard and the movements it made. Much more baseline data would be needed to get anywhere, and giving a robot a nose able to sniff out all that dogs do would surely be daunting, if it's even doable.
But the research is continuing, and the researchers feel the approach could be used to better understand the intelligence of other animals as well, TechCrunch reported.
“We hope this work paves the way towards better understanding of visual intelligence and of the other intelligent beings that inhabit our world,” Ehsani said.