
California-based startup Helm.ai, backed by Honda Motor, has unveiled a new vision-based perception system designed for self-driving cars. Called Helm.ai Vision, the camera-first system interprets complex urban environments in real time and is aimed at helping automakers integrate autonomous capabilities into mass-market vehicles.
Helm.ai is currently collaborating with Honda to bring this technology to the Honda 0 Series of electric vehicles, due in 2026. These upcoming models are expected to offer hands-free driving and, under specific conditions, even allow drivers to take their eyes off the road.
The system fuses images from multiple cameras into a high-resolution bird's-eye view of the vehicle's surroundings. This top-down representation feeds downstream decision-making and vehicle control, improving safety and responsiveness. Helm.ai Vision is also optimized for automotive hardware platforms from major chipmakers such as Nvidia and Qualcomm, easing integration with existing vehicle architectures.
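The article does not describe how Helm.ai Vision builds its bird's-eye view internally. As a rough illustration of the general idea, the sketch below uses a classic inverse-perspective mapping: each cell of a ground-aligned BEV grid is projected into each camera using that camera's (hypothetical, placeholder) intrinsics and extrinsics, and the corresponding pixel color is sampled. This is a generic geometric technique, not Helm.ai's method.

```python
"""Illustrative sketch only: stitches multiple camera images into one
bird's-eye-view (BEV) grid via inverse-perspective mapping (IPM).
All camera parameters are hypothetical placeholders."""
import numpy as np

def camera_to_bev(image, K, R, t, bev_size=(200, 200), cell_m=0.25):
    """Project ground-plane (z=0) BEV cells into one camera and sample colors.

    image : (H, W, 3) uint8 array from this camera
    K     : (3, 3) camera intrinsics
    R, t  : rotation (3, 3) and translation (3,) mapping vehicle -> camera frame
    """
    h_bev, w_bev = bev_size
    bev = np.zeros((h_bev, w_bev, 3), dtype=np.uint8)

    # Vehicle-frame ground coordinates for every BEV cell (ego at grid center).
    ys, xs = np.mgrid[0:h_bev, 0:w_bev]
    gx = (xs - w_bev / 2) * cell_m          # lateral offset, metres
    gy = (h_bev / 2 - ys) * cell_m          # longitudinal offset, metres
    ground = np.stack([gx, gy, np.zeros_like(gx)], axis=-1).reshape(-1, 3)

    # Transform into the camera frame and project with the pinhole model.
    cam = ground @ R.T + t                   # (N, 3) points in camera frame
    in_front = cam[:, 2] > 0.1               # keep points ahead of the camera
    pix = cam @ K.T
    pix = pix[:, :2] / np.clip(pix[:, 2:3], 1e-6, None)

    u, v = pix[:, 0].astype(int), pix[:, 1].astype(int)
    H, W = image.shape[:2]
    valid = in_front & (u >= 0) & (u < W) & (v >= 0) & (v < H)

    flat = bev.reshape(-1, 3)
    flat[valid] = image[v[valid], u[valid]]  # sample image color into BEV cell
    return flat.reshape(h_bev, w_bev, 3)

def fuse_cameras(per_camera_bev):
    """Overlay per-camera BEV layers; later cameras fill cells still empty."""
    fused = np.zeros_like(per_camera_bev[0])
    for layer in per_camera_bev:
        empty = fused.sum(axis=-1) == 0
        fused[empty] = layer[empty]
    return fused
```

Production perception stacks typically learn the camera-to-BEV transformation end to end rather than assuming a flat ground plane, but the sketch conveys the basic geometry of combining several camera views into a single top-down picture.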
While some industry experts continue to argue that lidar and radar are essential for low-visibility conditions and fail-safe operation, Helm.ai maintains a vision-first approach similar to Tesla's. The startup does offer foundation models that can incorporate additional sensor inputs, but its core technology is built around cameras.
To date, Helm.ai has raised $102 million from investors including Goodyear Ventures, Sungwoo HiTech, and Amplo. With rising interest in cost-effective autonomous driving, the company is positioning itself as a key supplier of software for intelligent mobility.