Smartphone-based laser depth-sensing system to work outdoors

New York, March 31 (IANS) Researchers from the Massachusetts Institute of Technology (MIT) are set to present a new infrared depth-sensing system, built from a smartphone with a $10 laser attached to it, that works outdoors as well as indoors.

The team envisions that smartphones with cheap, built-in infrared lasers could be snapped into personal vehicles, such as golf carts or wheelchairs, to help render them autonomous.

A version of the system could also be built into small autonomous robots, such as the package-delivery drones proposed by Amazon, whose wide deployment in unpredictable environments would rule out expensive laser rangefinders.

“My group has been strongly pushing for a device-centric approach to smarter cities, versus today's largely vehicle-centric or infrastructure-centric approach,” said Li-Shiuan Peh, professor of electrical engineering and computer science.

This is because phones have a more rapid upgrade-and-replacement cycle than vehicles.

“Cars are replaced in the timeframe of a decade, while phones are replaced every one or two years. This has led to drivers just using phone GPS today as it works well. I believe the device industry will increasingly drive the future of transportation,” Peh added.

Traditional infrared depth sensors come in several varieties, but they all emit bursts of laser light into the environment and measure the reflections.

Infrared light from the sun or man-made sources can swamp the reflected signal, rendering the measurements meaningless.

The system built by Peh and graduate student Jason Gao performs several measurements, timing them to the emission of low-energy light bursts.

In their prototype, the researchers used a phone with a 30-frame-per-second camera, so capturing four images imposed a delay of about an eighth of a second.
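The idea of timing measurements to the laser's bursts can be illustrated with a toy sketch: if one frame is captured with the laser on and another with it off, subtracting the two cancels steady ambient infrared (sunlight, lamps) and leaves only the laser's reflection. The function name and pixel values below are illustrative assumptions, not the researchers' actual implementation.

```python
# Hypothetical sketch of ambient-light rejection by frame differencing.
# Frames are modelled as flat lists of pixel intensities.

def ambient_rejected(frame_laser_on, frame_laser_off):
    """Subtract the laser-off frame from the laser-on frame, pixel by
    pixel, so constant ambient infrared cancels and only the laser's
    reflection survives (clamped at zero)."""
    return [max(on - off, 0) for on, off in zip(frame_laser_on, frame_laser_off)]

# Toy 1-D "frames": ambient level 80 everywhere; the laser's reflection
# adds 40 counts at pixel index 2.
ambient = [80, 80, 80, 80]
with_laser = [80, 80, 120, 80]
print(ambient_rejected(with_laser, ambient))  # [0, 0, 40, 0]
```

At 30 frames per second, each captured frame costs 1/30 of a second, which is why the prototype's four-image sequence takes roughly an eighth of a second.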

The system uses a technique called active triangulation. The laser, which is mounted at the bottom of the phone in the prototype, emits light in a single plane.

The angle of the returning light can thus be gauged from where it falls on the camera's 2D sensor.
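The triangulation geometry can be sketched with the standard pinhole-camera relation: with the laser a known baseline below the camera, the pixel offset of the reflection on the sensor determines depth. The parameter names and values below are assumptions for illustration, not the prototype's actual calibration.

```python
# Illustrative active-triangulation depth calculation (pinhole model).

def depth_from_pixel(pixel_offset, focal_length_px, baseline_m):
    """Return depth in metres from the reflection's pixel offset on the
    sensor: depth = baseline * focal_length / pixel_offset. A larger
    offset means the reflection returned at a steeper angle, i.e. a
    closer surface."""
    if pixel_offset <= 0:
        raise ValueError("reflection must fall off the optical axis")
    return baseline_m * focal_length_px / pixel_offset

# Example: a reflection 50 px from the axis, with a 1000 px focal
# length and a 5 cm laser-to-camera baseline.
print(depth_from_pixel(50, 1000.0, 0.05))  # 1.0 (metre)
```

The inverse relationship between pixel offset and depth is why such systems are most precise at close range, where small depth changes shift the reflection by many pixels.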

The team is set to present their findings at the International Conference on Robotics and Automation (ICRA) in Stockholm in May.
