Business

Ouster’s color lidar aims to merge vision and depth

Misryoum reports on Ouster’s Rev8 “native color lidar,” combining 3D depth and color imagery in one sensor as robotics demand accelerates.

A new kind of lidar is trying to end a long-running debate in robotics: why use separate sensors for depth and vision when one device could do both.

Ouster, the San Francisco-based lidar company, unveiled its Rev8 lineup built around what it calls “native color lidar,” a sensor that captures three-dimensional depth and color imagery at the same time. For the industry, the appeal is clear: pairing camera-like color detail with point-cloud depth in a single capture stream could reshape how perception systems are built.

In a market where self-driving and robotic platforms have increasingly relied on sensor fusion, Ouster argues that combining cameras and lidar has often meant heavy calibration and prolonged engineering to make different data streams work together. The company’s CEO framed Rev8 as the culmination of years of development, with the goal of simplifying deployment by reducing the need to align and fuse separate sensor outputs.
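To see why that calibration burden exists, here is a minimal sketch (not Ouster’s API; the function name, transform, and intrinsics are illustrative) of what a conventional camera-plus-lidar rig must do before any fusion: transform each lidar point into the camera frame via an extrinsic calibration, then project it through the camera’s intrinsic model to find which pixel’s color it should inherit.

```python
import numpy as np

def project_to_image(points_xyz, T_cam_lidar, K):
    """Project Nx3 lidar points into camera pixel coordinates.

    T_cam_lidar : 4x4 extrinsic transform (lidar frame -> camera frame),
                  the matrix that per-rig calibration must estimate
    K           : 3x3 camera intrinsic matrix (pinhole model)
    """
    n = points_xyz.shape[0]
    homog = np.hstack([points_xyz, np.ones((n, 1))])   # Nx4 homogeneous points
    cam = (T_cam_lidar @ homog.T).T[:, :3]             # Nx3 in the camera frame
    in_front = cam[:, 2] > 0                           # keep points ahead of the lens
    uvw = (K @ cam[in_front].T).T                      # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3], in_front          # pixel coords + validity mask

# Hypothetical rig: identity extrinsics, 500px focal length,
# principal point at (320, 240).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
pix, mask = project_to_image(np.array([[0.0, 0.0, 2.0]]), np.eye(4), K)
```

If the extrinsic estimate drifts (mounting flex, temperature, vibration), colors get painted onto the wrong 3D points — which is the alignment work a single-sensor capture is meant to eliminate.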

Meanwhile, this launch lands during a busy phase for lidar itself. Misryoum notes that lidar providers have been consolidating, while robotaxis and robotics platforms are expanding their operational footprint. At the same time, investment interest across robotics is pulling more attention and spending toward sensing technology, including both established players and newer entrants exploring alternative sensing modalities.

What’s particularly significant is the product architecture behind Rev8. Rather than treating lidar and camera capture as two separate systems that later get stitched together, Ouster is using a “digital lidar” approach on a custom chip, leveraging SPAD detectors to collect both depth and color information. The idea is to deliver data already formatted as a pre-fused 3D colorized point cloud, while still allowing customers to use depth-only or color-only streams depending on their perception pipeline.
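A rough sketch of what a “pre-fused” output implies for downstream code (the field layout is illustrative, not Ouster’s actual data format): every return carries geometry and color together, so depth-only or color-only views become simple slices rather than a cross-sensor fusion step.

```python
import numpy as np

# Hypothetical per-point record: geometry and color in one capture stream.
point_dtype = np.dtype([
    ("x", np.float32), ("y", np.float32), ("z", np.float32),  # depth / geometry
    ("r", np.uint8), ("g", np.uint8), ("b", np.uint8),        # per-point color
])

# A tiny four-point cloud as it might arrive from the sensor.
cloud = np.zeros(4, dtype=point_dtype)
cloud["x"] = [1.0, 2.0, 3.0, 4.0]
cloud["r"] = [255, 128, 64, 32]

# Depth-only view for a geometry pipeline...
xyz = np.stack([cloud["x"], cloud["y"], cloud["z"]], axis=1)
# ...or a color-only view for an appearance pipeline, from the same capture,
# with no extrinsic alignment in between.
rgb = np.stack([cloud["r"], cloud["g"], cloud["b"]], axis=1)
```

The design point is that color-to-depth association happens at the source, so the two slices are guaranteed to index the same physical returns.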

Insight: In perception stacks, time and compute are often spent not just on sensing, but on interpreting multiple streams reliably. A pre-fused output can shift value from post-processing effort toward faster system integration and potentially simpler scaling across fleets or product lines.

Ouster says it has shipped samples to existing customers and is now taking orders for the Rev8 family, including models labeled OS0, OS1, and OSDome, alongside its OS1 Max long-range sensor. The company positions OS1 Max as a step up for long-range detection and points to applications where range and performance matter, such as high-speed robotic trucking, robotaxi use cases, and drone operations.

Insight: If color lidar adoption grows, it could influence not only hardware purchases but also how robotics teams design software. Easier fusion and more integrated sensing may lower barriers for new deployments, benefiting both established platforms and emerging robotics programs.

As Misryoum understands it, Ouster is not alone in pursuing “color lidar.” Competitors have also signaled their own efforts in the space, underscoring that the race is moving from whether lidar is useful to how best to package richer scene understanding into deployable hardware. Rev8 is Ouster’s answer: merge the sensing functions at the source, then let robotics teams build with cleaner, more direct inputs.