Ansys announced a collaboration with Sony Semiconductor Solutions Corporation to improve perception system validation in Advanced Driver-Assistance Systems (ADAS) and Autonomous Vehicles (AVs). The partnership leverages Ansys AVxcelerate Sensors to provide real-time multispectral lighting simulation capabilities. This enables thorough evaluation of different lighting scenarios and weather conditions, including rain, snow, and fog.
ADAS and AV systems rely heavily on perception systems built on camera, radar, and LiDAR sensors to accurately assess their surroundings and make navigational decisions. Reliable validation of these systems is crucial to mitigate safety issues, address regulatory challenges, and build trust in autonomous technology. Ansys and Sony aim to address these challenges by enabling high-fidelity simulation of camera sensors, improving performance, and accelerating development times.
The AVxcelerate Sensors platform generates a virtual environment with varied lighting, weather, and material conditions and simulates light propagation through that environment, through the camera lens, and onto the imager. Coupled with Sony's sensor model, the simulation can reproduce the pixel characteristics, signal processing functions, and system functions of Sony's HDR imager with high predictive accuracy. This enables robust, scenario-based testing of perception systems built on Sony's HDR imagers.
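To illustrate the general idea behind simulating an HDR imager's pixel response, the sketch below shows a minimal multi-exposure HDR capture model: each exposure integrates incoming radiance, clips at the pixel's full-well capacity, and quantizes to the sensor's bit depth, and the merge step recovers a wide dynamic range by taking, per pixel, the longest non-saturated exposure. This is purely a conceptual example with made-up parameters (`full_well`, `bit_depth`, exposure times); it is not the Ansys AVxcelerate pipeline or Sony's proprietary sensor model.

```python
import numpy as np

def capture(radiance, exposure_s, full_well=10000.0, bit_depth=12):
    """Simulate one exposure: integrate radiance over the exposure time,
    clip at the full-well capacity, and quantize to the sensor bit depth.
    All parameters here are illustrative, not real sensor specs."""
    electrons = np.clip(radiance * exposure_s, 0.0, full_well)
    return np.round(electrons / full_well * (2 ** bit_depth - 1))

def hdr_merge(radiance, exposures, full_well=10000.0, bit_depth=12):
    """Merge several simulated exposures into one HDR radiance estimate.
    Iterating from shortest to longest exposure, each pixel keeps the
    estimate from the longest exposure that did not saturate."""
    max_dn = 2 ** bit_depth - 1
    merged = np.zeros_like(radiance)
    for t in sorted(exposures):
        dn = capture(radiance, t, full_well, bit_depth)
        unsaturated = dn < max_dn
        estimate = dn / max_dn * full_well / t  # back to radiance units
        merged = np.where(unsaturated, estimate, merged)
    return merged

# A scene spanning ~3 orders of magnitude of radiance: no single exposure
# captures all three pixels, but the merged result recovers each one.
scene = np.array([100.0, 5000.0, 200000.0])
recovered = hdr_merge(scene, exposures=[0.001, 0.01, 0.1])
```

In a production tool, this simple clip-and-quantize model would be replaced by a detailed model of pixel noise, color filters, and the on-chip signal processing chain, which is precisely the fidelity the Sony sensor model contributes.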