Helm.ai, a developer of advanced driver assistance system (ADAS) software, autonomous driving software, and robotics, has announced Deep Neural Network (DNN)-based foundation models to predict vehicular and pedestrian behaviors in complex urban scenarios. These models can also anticipate the path of an autonomous vehicle.
The DNN models build on Helm.ai’s surround-view, full-scene semantic segmentation and 3D detection system, enabling intent prediction and path planning. The DNNs learn the multifaceted aspects of urban driving directly from data, taking a sequence of observed images along with the autonomous vehicle's path as input and predicting possible outcomes and candidate paths.
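To make the input-output relationship concrete, the sketch below shows a minimal image-sequence-to-path prediction model. It is purely illustrative and not Helm.ai's architecture: every module, dimension, and parameter name (e.g., `PathPredictionSketch`, `num_modes`, `horizon`) is an assumption introduced for this example.

```python
# Minimal illustrative sketch of a behavior/path-prediction DNN of the kind
# described above. NOT Helm.ai's actual system; all names, dimensions, and
# design choices are assumptions for illustration only.
import torch
import torch.nn as nn

class PathPredictionSketch(nn.Module):
    """Encodes a short sequence of camera frames and predicts several
    candidate future paths (x, y waypoints) for the ego vehicle."""

    def __init__(self, num_frames=8, num_modes=3, horizon=12):
        super().__init__()
        self.num_modes = num_modes      # number of candidate paths proposed
        self.horizon = horizon          # waypoints per predicted path
        # Per-frame image encoder (a stand-in for a perception backbone such
        # as a segmentation / 3D-detection network feeding the predictor).
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Temporal model over the observed frame sequence.
        self.temporal = nn.GRU(input_size=32, hidden_size=64, batch_first=True)
        # Heads: one set of waypoints per candidate path, plus confidences.
        self.path_head = nn.Linear(64, num_modes * horizon * 2)
        self.score_head = nn.Linear(64, num_modes)

    def forward(self, frames):
        # frames: (batch, num_frames, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.frame_encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, hidden = self.temporal(feats)
        summary = hidden[-1]                                   # (batch, 64)
        paths = self.path_head(summary).view(b, self.num_modes, self.horizon, 2)
        scores = self.score_head(summary).softmax(dim=-1)
        return paths, scores          # candidate paths + their confidences

# Example: 8 observed 128x128 frames -> 3 candidate 12-step paths per sample.
model = PathPredictionSketch()
paths, scores = model(torch.randn(2, 8, 3, 128, 128))
print(paths.shape, scores.shape)  # torch.Size([2, 3, 12, 2]) torch.Size([2, 3])
```

The key point the sketch captures is the announced formulation: observed image sequences go in, and multiple possible future paths with associated likelihoods come out, learned end to end rather than hand-specified.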
Helm.ai claims the models enable large-scale learning of complex urban driving scenarios and can propose safe paths for autonomous vehicles. The approach removes the need for physics-based simulators and hand-coded rules, allowing the models to capture the complexity of real-world driving directly from data.
Its scalable AI approach can also be applied to robotics domains beyond self-driving vehicles.