A Phone That Can See Its Surroundings

Posted by Phil Vialoux, 18 Nov 2014

One of the limitations of smart phones is that they don't know what's around them: the camera can only see out the front or the back. So if you forget to set the ringer to vibrate and walk into a meeting, the phone can go off and irritate everyone. A phone can't tell that it's being used while driving, can't tell whether you're touching the screen with a stylus or a finger, and can't pick up gestures from across the room.


That’s now changed. Xing-Dong Yang, a graduate student in the Department of Computing Science at the University of Alberta, has made his phone (an HTC, as it happens) able to see its surroundings and learn what it sees. He calls the device Surround-See.

“We can train the phone,” Yang told DNews. “It takes a number of pictures of the environment, as samples.” At that point, its machine-learning algorithm can compare what the phone sees around it against the pictures in that sample library. The algorithm itself is off the shelf, but nobody had combined it with a smart phone and a panoramic lens before.
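As an illustration only, the workflow Yang describes, capturing a handful of labelled sample pictures and then matching new panoramic frames against them with an off-the-shelf classifier, could look roughly like the Python sketch below. The colour-histogram features, the nearest-neighbour classifier, and the environment labels are all assumptions for the sake of the example, not details of Surround-See itself.

```python
# Minimal sketch of the train-then-match idea described above.
# Feature choice and classifier are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def colour_histogram(image, bins=16):
    """Reduce an RGB image (H x W x 3 uint8 array) to a normalised per-channel histogram."""
    channels = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
                for c in range(3)]
    feat = np.concatenate(channels).astype(float)
    return feat / feat.sum()   # normalise so image size doesn't matter

# Stand-in "training snapshots": random arrays in place of real panoramic frames.
rng = np.random.default_rng(0)
samples = [(rng.integers(0, 256, (120, 480, 3), dtype=np.uint8), label)
           for label in ("office", "car", "meeting_room")
           for _ in range(5)]

X = np.array([colour_histogram(img) for img, _ in samples])
y = [label for _, label in samples]

# Off-the-shelf nearest-neighbour model: new frames are matched
# against the library of labelled samples.
model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

new_frame = rng.integers(0, 256, (120, 480, 3), dtype=np.uint8)
print(model.predict([colour_histogram(new_frame)])[0])
```

With real camera frames in place of the random arrays, the same few lines would let the phone guess which of its trained environments it is currently sitting in.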

Now Yang’s phone can see when he makes gestures from across the room, so he can turn it on or off or answer a call. The phone even warns him when he is in a car, and asks him whether he wants to take it with him when he leaves the room. He has a video showing some of the phone’s capabilities here.
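In practice, those behaviours boil down to mapping whatever context the phone has recognised onto an action. A rough sketch of that dispatch step is below; the context labels, the gesture name, and the actions are hypothetical stand-ins, since the article does not describe Surround-See’s internals.

```python
# Hypothetical context-to-action dispatch; labels and actions are
# illustrative stand-ins, not the actual Surround-See behaviour.
def react_to_context(context, gesture=None):
    """Return the action the phone should take for a recognised context."""
    if context == "car":
        return "warn: you appear to be driving"
    if context == "meeting_room":
        return "set ringer to vibrate"
    if gesture == "wave":
        return "answer the incoming call"
    if context == "leaving_room":
        return "ask: take the phone with you?"
    return "no action"

if __name__ == "__main__":
    for context, gesture in [("car", None), ("office", "wave"), ("leaving_room", None)]:
        print(context, "->", react_to_context(context, gesture))
```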


“Humans are really smart about their environments, because we can see them,” Yang said. “This makes the phones smarter.”

Yang is thinking of commercializing it, though he hasn’t filed for patents yet.
