CES 2026 Day 1: Can You See Me Now?
- Adam Bernard
- Jan 5
- 3 min read
Updated: Jan 7

Today I had the opportunity to learn about two relevant (and potentially complementary) technologies tied to the currently hot topic of automated driving: lidar and radar.
MicroVision, which is celebrating its 30th anniversary of delivering “advanced perception solutions in autonomy and mobility” this year, briefed our #HouseofJournalists on its latest developments in lidar technology. Their target is impressive: a $200 short-range unit (and a $300 long-range unit) on the market by 2029, with a cost that depends not on scale but on technological innovation. With radar units at less than $50 and cameras at less than $100, that’s still a premium-priced item; longer term, they’d like to get below $100 to enable the mass adoption needed to support automotive ADAS technology.
Their strategy relies on three pillars. The first is solid-state technology, replacing the current electromechanical hardware, resulting in a lower-cost unit around the size of a deck of cards. The second is an integrated SoC (system-on-a-chip) and full-stack capability. The third—and so far, unique—feature is an open software framework that allows an automaker full access to program the hardware to their needs and more easily integrate into their systems.
This technology ties into the tri-lidar architecture introduced at IAA, which replaces a single roof-mounted lidar unit that must do everything (which is expensive and complicated) with two short-range and one long-range unit. In the spirit of portfolio diversification, they are also exploring opportunities for this technology in both industrial and security/defense applications. In the former, they can replace up to $20,000 worth of hardware in a forklift with a set of more capable components that costs less than $6,000. In the latter, a lidar-equipped drone can extend the perception of a surface-based military vehicle.
When asked if they saw Tesla-style vision-based systems crowding out lidar technology, CEO Glen DeVos said the likely path is continuous improvement of lidar, radar, and cameras (especially given that no automaker has stepped forward to license Tesla’s system). As a slide shared during the briefing indicated, there’s no perfect technology to support automated driving, so it seems to be in the customer’s best interest to deploy a multimodal system. We are still a few years away from this hardware hitting the streets, but the potential impact seems quite impressive.
In contrast, Atomathic (formerly Neural Propulsion Systems) is a much younger firm at seven years and is just now transitioning from an R&D company to a product company. Their goal is to make the invisible visible, and their premise is that the limits of radar image accuracy are a math problem, not a hardware problem. They have developed software that operates at 15-20 frames per second, and, for each frame, does the following:
- Generates 6-7 hypotheses (i.e., “What am I looking at?”)
- Applies the constraints of physics (i.e., “Is this physically possible?”)
- Checks causality over time, keeping only consistent image tracks
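To make the three steps above concrete, here is a minimal sketch of how such a per-frame hypothesize-constrain-track loop could work. All names, thresholds, and data shapes here are my own illustrative assumptions, not Atomathic's actual implementation.

```python
# Illustrative sketch of a hypothesize / constrain / track radar filter.
# Names and thresholds are assumptions for demonstration only.
from dataclasses import dataclass, field

MAX_SPEED_MPS = 90.0        # assumed physical limit for a road-scene object
MIN_CONSISTENT_FRAMES = 3   # frames a track must persist before it is trusted

@dataclass
class Hypothesis:
    track_id: int
    position: tuple          # (x, y) in meters
    speed_mps: float         # inferred speed of the candidate object

@dataclass
class Tracker:
    # track_id -> number of consecutive frames the track has appeared in
    history: dict = field(default_factory=dict)

    def physically_possible(self, h: Hypothesis) -> bool:
        # Step 2: discard returns no real object could produce
        return 0.0 <= h.speed_mps <= MAX_SPEED_MPS

    def process_frame(self, hypotheses: list) -> list:
        # Step 1 happens upstream: the radar stack hands us 6-7 hypotheses.
        plausible = [h for h in hypotheses if self.physically_possible(h)]
        confirmed, seen = [], set()
        for h in plausible:
            self.history[h.track_id] = self.history.get(h.track_id, 0) + 1
            seen.add(h.track_id)
            # Step 3: only tracks consistent across frames survive,
            # which suppresses one-frame "ghosts" and flicker.
            if self.history[h.track_id] >= MIN_CONSISTENT_FRAMES:
                confirmed.append(h)
        # Tracks that vanished this frame lose their consistency streak
        for tid in list(self.history):
            if tid not in seen:
                del self.history[tid]
        return confirmed
```

Under this sketch, a detection must reappear for three consecutive frames before it is reported, and a physically impossible return (say, 500 m/s) is dropped immediately, which is one simple way the “math, not hardware” framing could play out.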
This software is intended to eliminate the ghosts, blindness, and flickering that are not uncommon in radar imaging, resulting in far more accurate imagery. Atomathic admits that lidar is still better at shape definition, so this improvement is in line with the MicroVision CEO’s expressed desire for “better radar,” while radar still maintains its advantage in fog and rain.
The good news is that this software appears to be deliverable via an over-the-air update, and vehicles that now offer Level 3 or Level 4 automated driving likely have the computing power to support this new math; in fact, it requires much less power than computer vision. They are hoping to set up a deal or two in the next several months and then, at some unspecified (but not too distant) point in the future, this technology could be on the road, enabling a significantly safer driving experience.
I know that some people aren’t quite comfortable letting a computer handle all the driving (and people who know me are aware I’m perfectly fine with it), but it’s hard not to be impressed with how quickly this technology is evolving. We are still quite some distance away from that “let me finish this crossword puzzle on the way to work” level of driving, but we are getting there…