Drawing on the acoustic navigation capabilities of submarines and bats, Prof. Parham Aarabi of the Edward S. Rogers Sr. Department of Electrical and Computer Engineering is developing electronic devices that can locate themselves and navigate using the sounds around them.

He has developed a method by which a device fitted with an array of microphones combines information from surrounding sounds to locate and orient itself, much as an animal uses its ears. The method achieves the same result as radar, he adds, but adapts more readily to different technologies.

“It is very possible that within five to 10 years, you will have cellphones or hand-held computers that locate you, ‘listen’ only toward your direction – thereby reducing the noise that is picked up from other sources in other directions – and then respond to your commands after speech recognition,” says Aarabi.
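The idea of combining information across microphones can be illustrated with a minimal sketch: estimating a sound source's bearing from the time-difference of arrival (TDOA) between two microphones via cross-correlation. All names and parameters here (sample rate, microphone spacing, the `angle_of_arrival` function) are illustrative assumptions, not a description of Prof. Aarabi's actual system.

```python
import numpy as np

# Illustrative parameters -- not from the article.
FS = 44_100             # sample rate (Hz)
MIC_SPACING = 0.2       # distance between the two microphones (m)
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def angle_of_arrival(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate the source bearing (degrees from broadside) from two channels."""
    # Cross-correlate the channels; the lag of the peak is the TDOA in samples.
    # With correlate(right, left), a positive lag means the right channel is a
    # delayed copy of the left one, i.e. the sound reached the left mic first.
    corr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(corr)) - (len(left) - 1)
    tdoa = lag / FS  # seconds
    # Far-field geometry: tdoa = d * sin(theta) / c  =>  theta = asin(c*tdoa/d)
    sin_theta = np.clip(SPEED_OF_SOUND * tdoa / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Simulate a 440 Hz tone arriving from roughly 30 degrees: the right
# microphone hears it slightly later than the left one.
t = np.arange(0, 0.05, 1 / FS)
tone = np.sin(2 * np.pi * 440 * t)
delay_samples = int(round(MIC_SPACING * np.sin(np.radians(30)) / SPEED_OF_SOUND * FS))
left = np.concatenate([tone, np.zeros(delay_samples)])
right = np.concatenate([np.zeros(delay_samples), tone])
print(f"estimated bearing: {angle_of_arrival(left, right):.1f} degrees")
```

A real array would use more than two microphones and steer toward the estimated direction ("listening" only that way, as the quote describes), but the delay-based geometry above is the core of the idea.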