Your pathetic human eyes can only see what’s right in front of them, and it’s not hard to fool people. Robots, on the other hand, are already learning how to see around corners. This will no doubt come in handy during the robot apocalypse when machines need to hunt down fleeing humans, but in the meantime, it could be a real boon to self-driving cars and other autonomous technology.
Researchers have, in the past, used computational methods to detect large objects around a corner. But professor Ioannis Gkioulekas from Carnegie Mellon's Robotics Institute says this is the first time anyone has been able to resolve millimeter- and micrometer-scale shapes without line-of-sight. It's a new frontier for so-called non-line-of-sight (NLOS) imaging.
Most of the light your eye or a camera sees is reflected directly off a surface, but some light scatters and bounces several times before it reaches the observer. Usually, these multi-bounce signals are too faint to carry any information, but the NLOS techniques developed at Carnegie Mellon use ultra-fast pulsed lasers to tease useful detail out of them.
The team fired the laser at a flat surface, such as a wall, letting the light reflect off it and illuminate a hidden object. The computer knows when each pulse fires, so it can measure how long the light takes to reach the wall, bounce off the hidden object, and reflect back to a sensor, and from that timing work out distances. It's similar to the time-of-flight principle behind the lidar sensors in current self-driving cars.
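The timing arithmetic is simple in principle, even if the hardware is not. Here is a minimal sketch of the time-of-flight idea (illustrative only, not the CMU team's actual algorithm): convert a pulse's round-trip time into a path length, and note how fine the timing must be to resolve millimeter-scale shapes.

```python
# Speed of light in a vacuum, in meters per second.
C = 299_792_458.0

def round_trip_path_length(pulse_time_s: float) -> float:
    """Total distance a light pulse traveled, given its round-trip time."""
    return C * pulse_time_s

def depth_resolution(timing_resolution_s: float) -> float:
    """One-way distance resolution for a given timing resolution.

    The pulse travels out and back, so the one-way depth uncertainty
    is half the path-length uncertainty.
    """
    return C * timing_resolution_s / 2.0

# A pulse that returns after ~6.67 nanoseconds has traveled about 2 meters
# of total path, i.e. roughly 1 meter out and 1 meter back.
print(round_trip_path_length(6.67e-9))   # ~2.0 meters

# Resolving features at the millimeter scale demands picosecond-level
# timing: 1 ps of timing resolution corresponds to ~0.15 mm of depth.
print(depth_resolution(1e-12))           # ~0.00015 meters
```

This back-of-the-envelope math is why NLOS systems lean on ultra-fast lasers and detectors: millimeter-scale reconstruction is only possible if arrival times can be resolved to trillionths of a second.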
The method devised by the Carnegie Mellon team is based entirely on the geometry of the hidden object. The algorithm allows them to measure curvatures accurately, resulting in very precise small-scale details. In the laboratory, the team created reasonable approximations of jugs, vases, ball bearings, and a US quarter from around a corner. You can see above how the NLOS image (left) compares with a direct scan (right). It’s impressively close.
Currently, the technique developed at Carnegie Mellon's Robotics Institute only works in the laboratory. The range of the NLOS sensor is about one meter, so it's not yet practical for real-world use. This is only a first attempt, though. The technology could eventually help autonomous vehicles avoid collisions by detecting hazards around the next corner, and the team thinks it could also assist with ultrasound imaging and seismic measurements.
Top image credit: Getty Images