Benefits

  • Provides full-system validation before hardware investment
  • Ensures that the network, memory, compute and power architectures meet real-time control deadlines
  • Identifies sensor, asynchronous-event and scheduling bottlenecks
  • Protects battery and power-delivery systems from spikes and unsafe transient loads
  • Enables joint safety + failure modeling of autonomy functions under corner cases
  • Supports rapid trade-off exploration across performance, reliability and power

The Autonomous Driving System / Autonomy library in VisualSim Architect provides a complete system-level modeling framework for sensor-fusion-based automation and real-time maneuvering and control without human intervention. It enables the design and simulation of autonomous functions by combining hardware architecture, sensor data flow, ECU/on-board computing, real-time scheduling, network throughput, and memory and power constraints in a single executable model.

The objective is to ensure that timing deadlines, throughput guarantees and power budgets are met simultaneously while maintaining functional safety and failure resilience. The library supports both automotive ADAS platforms and aerospace autonomy systems with the same reusable modeling methodology.

Overview

The Autonomous Driving System / Autonomy library combines three primary architectural domains:

  • Hardware and Sensor Architecture
    Cameras, radar, lidar, infrared, sonar/ultrasonic, IMUs, GNSS, wheel sensors, vehicle health signals, environmental sensors and domain-specific payload sensors.
  • On-Board Computing and Real-Time Scheduling
    ECU clusters, multi-core processors, accelerators (AI/DNN/ML), GPUs and dedicated sensor-fusion pipelines, with software tasks mapped to cores via scheduling tables.
  • Network, Memory and Power Architecture
    In-vehicle networks, inter-processor communication, memory hierarchy, buffer sizing, and full energy/power-budget allocation.

These elements are tested under both nominal and disturbed operating conditions, including failures, overloads, power spikes, delayed scheduling and incorrect sensor data.
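As a minimal sketch of the nominal-versus-disturbed deadline testing described above (illustrative Python only, not VisualSim's own API; the stage names, latency values and deadline are hypothetical), a perception-to-control pipeline can be checked against its cycle budget with and without an injected scheduling disturbance:

```python
# Illustrative sketch only: hypothetical per-stage latencies and deadline,
# not VisualSim's modeling API.

NOMINAL_LATENCY_MS = {      # worst-case latency per pipeline stage, nominal case
    "perception": 18.0,
    "localization": 6.0,
    "planning": 12.0,
    "control": 4.0,
}
CYCLE_DEADLINE_MS = 50.0    # end-to-end autonomy-cycle timing budget

def cycle_meets_deadline(latencies, deadline_ms, disturbance_ms=0.0):
    """True if total pipeline latency, plus any injected scheduling
    disturbance, fits within the cycle deadline."""
    total = sum(latencies.values()) + disturbance_ms
    return total <= deadline_ms

# Nominal conditions: 18 + 6 + 12 + 4 = 40 ms, within the 50 ms budget.
assert cycle_meets_deadline(NOMINAL_LATENCY_MS, CYCLE_DEADLINE_MS)

# Disturbed conditions: a 15 ms scheduling delay breaks the budget.
assert not cycle_meets_deadline(NOMINAL_LATENCY_MS, CYCLE_DEADLINE_MS,
                                disturbance_ms=15.0)
```

In the full model this check would run per cycle across the whole task graph; the sketch only shows the pass/fail logic at a single cycle.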

Key Parameters

  • Sensor_Set — list of sensors active in the autonomy cycle
  • Sensor_Data_Rates — resolution, frequency and payload size per sensor
  • Task_Graph — sequence of perception, localization, planning and control algorithms
  • Schedule_Table — static or dynamic task allocation to cores and accelerators
  • Deadline_Profile — timing budget per task and per autonomy cycle
  • Network_Throughput_Target — worst-case bandwidth for sensor + perception traffic
  • Memory/Buffer_Capacity — maximum allowed queue depth per pipeline stage
  • Power_Budget — energy and peak-power constraints for the whole autonomy cycle
  • Safety_Action_Set — fallback actions triggered by abnormal conditions
  • Failure_Profile — hardware/software/network/power failure likelihood and MTBF/MTTR
  • Environment_Profile — rain, dust, fog, vibration, temperature, motion shock (optional)

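A hypothetical parameter set mirroring the list above might look like the following Python dictionary. The field names and values are illustrative only, assumed for this sketch rather than taken from VisualSim's configuration syntax:

```python
# Hypothetical autonomy-model parameter set; keys mirror the Key Parameters
# list above and are illustrative, not VisualSim configuration syntax.
autonomy_params = {
    "Sensor_Set": ["camera_front", "radar_front", "lidar_roof", "imu", "gnss"],
    "Sensor_Data_Rates": {"camera_front": {"hz": 30, "payload_bytes": 2_764_800}},
    "Task_Graph": ["perception", "localization", "planning", "control"],
    "Schedule_Table": {"perception": "gpu0", "localization": "core0",
                       "planning": "core1", "control": "core2"},
    "Deadline_Profile": {"cycle_ms": 50, "control_ms": 5},
    "Network_Throughput_Target": {"worst_case_mbps": 900},
    "Memory_Buffer_Capacity": {"perception_queue_frames": 4},
    "Power_Budget": {"avg_w": 120, "peak_w": 250},
    "Safety_Action_Set": ["slow_to_stop", "pull_over", "handover_alert"],
    "Failure_Profile": {"lidar_roof": {"mtbf_h": 20_000, "mttr_h": 2}},
    "Environment_Profile": {"rain": True, "fog": False},
}

# Simple consistency check: every task in the graph has a core assignment.
assert all(task in autonomy_params["Schedule_Table"]
           for task in autonomy_params["Task_Graph"])
```

Checks like the final assertion catch mismatches between the task graph and the schedule table before any simulation is run.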
Applications

  • Autonomous driving and ADAS systems for automotive OEMs
  • Autonomy platforms for aerospace / UAV / eVTOL / rotorcraft
  • Defense robotic systems and unmanned ground / air vehicles
  • Autonomous marine, mining and industrial mobile robotics
  • Full vehicle HIL-equivalent architecture validation in a virtual prototype

Integrations

  • Communication System for sensor data flow and perception channel bandwidth
  • Stochastic and Functional Safety for emergency-response behavior and sensor anomalies
  • Failure Analysis for degraded operation and fallback-mode performance
  • Scheduling / RTOS for deterministic software execution order and deadlines
  • Thermal and Power Architecture for battery and power-rail constraints
  • Traffic / Scenario Modeling for environmental and mission-profile simulation
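The failure-analysis and functional-safety integrations above can be sketched as a small mode-selection routine: when a required sensor drops out, the model switches to a degraded mode and triggers a fallback action from the Safety_Action_Set. All names below are hypothetical, not VisualSim's API:

```python
# Illustrative failure-injection sketch (hypothetical names, not VisualSim's
# API): a failed required sensor forces degraded mode plus a safety action.

SAFETY_ACTIONS = {"lidar": "slow_to_stop", "camera": "handover_alert"}

def select_mode(healthy_sensors, required_sensors):
    """Return ('nominal', None) if all required sensors are healthy,
    otherwise ('degraded', action) for the first failed sensor found."""
    for sensor in required_sensors:
        if sensor not in healthy_sensors:
            return "degraded", SAFETY_ACTIONS.get(sensor, "slow_to_stop")
    return "nominal", None

# All required sensors healthy: nominal operation, no fallback.
assert select_mode({"lidar", "camera", "radar"},
                   ["lidar", "camera"]) == ("nominal", None)

# Lidar failure injected: degraded mode with its configured safety action.
assert select_mode({"camera", "radar"},
                   ["lidar", "camera"]) == ("degraded", "slow_to_stop")
```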
