GCN : May 2016
EMERGING TECH

BY PATRICK MARSHALL

ONE OF THE BIGGEST obstacles to making affordable mobile robots is the high cost of equipment for sensing and navigating the environment. The current go-to technology is lidar (light detection and ranging), which measures the time it takes rapid bursts of laser light to reflect back to the emitting device.

The downside of lidar is cost. The device used in Google's self-driving vehicles, for example, is reported to run $70,000. According to researchers at MIT, however, the job can be performed at a fraction of that price using an off-the-shelf smartphone with a $10 laser attachment.

The Smartphone LDS (laser distance sensor) prototype, developed by a team at MIT's Computer Science and Artificial Intelligence Laboratory, uses a smartphone's camera to record reflections of low-energy light bursts from the laser, which is mounted at the bottom of the phone. The position of objects is determined by computing where the light falls on the camera's sensor.

The prototype is fast and accurate enough to guide vehicles moving at speeds of almost 10 miles per hour, said Jason Gao, a doctoral student and member of the Smartphone LDS team. In fact, at a range of 3 to 4 meters, the system gauges depth to an accuracy measured in millimeters, while at 5 meters, the accuracy declines to 6 centimeters.

Gao said the team believes it can improve the accuracy and resolution to levels sufficient for ordinary driving. "Perhaps it's not an appropriate sensor for something like a race car," he added. "Our system is really suited for a more mass-market class of applications where we expect to see a lot more slower-moving or medium-speed autonomous vehicles out there doing all sorts of tasks for humans."

Given the advances in the quality of processors and smartphones, the team expects the capabilities of Smartphone LDS to improve rapidly as well. "We actually are limited by the processing right now," Gao said.
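The geometry behind this approach is classic laser triangulation: with the laser mounted a fixed distance from the camera, the farther away an object is, the closer the reflected spot falls to the camera's optical axis. The sketch below illustrates that similar-triangles relationship. The function name and all numeric values are illustrative assumptions, not the MIT team's actual parameters.

```python
def triangulate_distance(pixel_offset, focal_length_px, baseline_m):
    """Estimate distance (meters) to a projected laser spot.

    pixel_offset    -- displacement of the spot's reflection from the
                       camera's optical axis, in pixels
    focal_length_px -- camera focal length expressed in pixels
    baseline_m      -- laser-to-camera separation in meters
    """
    if pixel_offset <= 0:
        raise ValueError("spot must be offset from the optical axis")
    # Similar triangles: distance / baseline = focal_length / pixel_offset
    return focal_length_px * baseline_m / pixel_offset

# Assumed values: 1400 px focal length, 10 cm baseline. A spot observed
# 40 px from the axis implies an object roughly 3.5 m away.
distance = triangulate_distance(40, 1400.0, 0.10)
```

Because the same one-pixel quantization error corresponds to a larger distance change as the spot nears the optical axis, absolute accuracy degrades with range, which is consistent with the article's figures of millimeter accuracy at 3 to 4 meters but 6 centimeters at 5 meters.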
The cameras on the current generation of smartphones are able to take higher-resolution images than the existing processors can handle in a timely fashion, he added. "Processing higher-resolution images would improve our angular resolution and our range resolution or accuracy as well."

The prototype device uses a phone with a 30-frames-per-second camera. Smartphones with 240-frames-per-second cameras are available, and their use would further enhance the system. A faster shutter speed allows the use of higher-powered laser bursts, Gao said, and that would enhance ranging capabilities, though again, processing power could be the limiting factor.

Smartphone LDS is in the process of being patented, and Gao said the team has begun receiving inquiries from the private sector. The team expects the technology to be used in diverse kinds of robots, such as self-driving vehicles, drones for package delivery and robots that pick up trash.

"Lidar has been a critical sensor for all classes of robotics that need to navigate the real world," Gao said. "This is essentially providing the same kind of data as lidar but at much lower cost."

Can your smartphone drive your car? By adding a laser sensor to a smartphone, researchers at MIT have created a low-cost device that can deliver the high-resolution distance sensing required for robotic navigation.