Current vision systems for robots and drones rely on 3D sensors that, although powerful, do not always keep up with the fast-paced, unpredictable movement of the real world. These sensors typically cannot measure an object's velocity directly, instead inferring it from successive frames, and are often too bulky and expensive for everyday use. Now, in a paper published in the journal Nature, scientists report a 4D imaging sensor on a chip that builds 3D maps of an environment while simultaneously tracking the speed of moving objects.
The researchers built a focal plane array (FPA): a grid of 61,952 stationary pixels etched onto a single silicon chip. Each pixel is a tiny transceiver that emits laser light toward the scene and detects the reflected signal.
To “see” its surroundings, the chip takes laser light from an external source and routes it through a network of on-chip optical switches that sequentially direct it to groups of pixels. Each pixel then uses a technique called frequency-modulated continuous-wave (FMCW) LiDAR to measure the returning signal, which is then processed to determine both distance and speed. In many LiDAR systems, one set of pixels sends the light and another receives it; here, every pixel both sends and receives, making the system much more compact.
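
The principle behind FMCW LiDAR can be sketched with a few lines of arithmetic. In an FMCW system, the laser's frequency is swept (chirped) up and down; mixing the returning light with the outgoing light produces a "beat" tone whose frequency encodes the round-trip delay, while a moving target adds a Doppler shift. Comparing the beat tones from the up-chirp and down-chirp separates the two effects, yielding distance and radial speed in a single measurement. The sketch below uses generic textbook parameters (chirp slope, 1550 nm carrier), not the actual settings of the chip described in the paper:

```python
# Minimal sketch of the FMCW LiDAR math each pixel relies on.
# The chirp slope and carrier frequency are illustrative assumptions,
# not the parameters used in the Nature paper.
C = 3.0e8             # speed of light, m/s
SLOPE = 1.0e13        # chirp slope, Hz/s (e.g. 1 GHz swept over 100 microseconds)
F_CARRIER = 193.4e12  # optical carrier frequency at ~1550 nm, Hz

def beat_frequencies(range_m, velocity_ms):
    """Beat tones a pixel would observe on the up- and down-chirp."""
    f_range = 2 * SLOPE * range_m / C             # delay-induced beat
    f_doppler = 2 * velocity_ms * F_CARRIER / C   # Doppler shift from motion
    return f_range - f_doppler, f_range + f_doppler

def range_and_velocity(f_up, f_down):
    """Invert the two beat tones back into distance and radial speed."""
    f_range = (f_up + f_down) / 2     # common part -> distance
    f_doppler = (f_down - f_up) / 2   # differential part -> speed
    return C * f_range / (2 * SLOPE), C * f_doppler / (2 * F_CARRIER)

# Round trip: a target 50 m away, approaching at 2 m/s.
f_up, f_down = beat_frequencies(50.0, 2.0)
r, v = range_and_velocity(f_up, f_down)
print(r, v)  # recovers ~50.0 m and ~2.0 m/s
```

This is why a single FMCW pixel can report both distance and speed at once, where a conventional time-of-flight sensor would need to compare positions across successive frames to estimate velocity.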









