Applied Intuition partners with companies of all sizes and across industries, including the automaker Nissan, the construction and mining equipment manufacturer Caterpillar, and the aerial mobility service provider Supernal (a part of Hyundai Motor Group). Our team also maintains relationships with research universities and supports startups with their autonomous vehicle (AV) development efforts.
Today, we are excited to feature a guest post from one of our startup customers, driveblocks, on the Applied blog. The driveblocks team spent the past five years developing full-stack autonomous driving algorithms at the Technical University of Munich, Germany, and recently founded driveblocks to enable commercial vehicle autonomy. driveblocks uses Applied’s simulation products Simian, Spectral, and Orbis to extensively test and validate its AV stack and boost its agile development processes.
Read the guest post below to learn how a new startup with a deep international research background uses Applied’s solutions to master the challenges of commercial AV development.
We founded driveblocks earlier this year to enable commercial vehicle autonomy—whether on the road, in mining and agriculture, or in container logistics. Before founding driveblocks, our team spent five years developing full-stack autonomous driving algorithms under real-world conditions at the Technical University of Munich (TUM) in Germany. During that time, we published 24 research papers and won the Indy Autonomous Challenge—the first full-scale autonomous racing competition with top speeds of 270 km/h and prize money of $1M.
At driveblocks, our AV stack’s key features are high modularity and compatibility with open-source software and open standards. We do not aim to develop a monolithic software product but rather a modular software toolbox (Figure 1) that empowers our customers to focus on their autonomy applications’ specific challenges while relying on our software modules and know-how where needed.
Building on our longstanding work at TUM, we bring solid international research experience and are working toward certification with strong industry partners. One of these industry partners is Applied Intuition. The company’s simulation software allows us to extensively test and validate our AV stack and boost our agile development processes.
When we founded driveblocks, we needed to shift our focus away from highly dynamic autonomous racing functionalities and toward commercial applications, with a specific emphasis on certification and safety. This shift required some functional adaptations in our AV development workflows. For example, we needed to set up an advanced simulation environment to better support our AV software testing and validation efforts.
Our previous work in the autonomous racing space involved the precise simulation of vehicle dynamics and opponent race car behavior in an unstructured environment. Our new focus area—commercial autonomy applications—instead involves explicit and implicit traffic rules and a variety of traffic participants exhibiting different behaviors in both structured and unstructured environments. Furthermore, topics like certification and functional safety are vital for autonomy applications on public roads.
We decided to collaborate with Applied because the company’s high-fidelity simulation software would allow us to test and validate our AV stack at scale. By collaborating with Applied early on, we have leveraged the company’s comprehensive simulation solutions from the beginning, allowing us to focus entirely on developing autonomous driving functionality. The straightforward integration process and the support from Applied’s engineering team enable us to create multiple traffic scenarios quickly. We then run these scenarios in full-stack simulations both on local machines and in continuous integration/continuous deployment (CI/CD) pipelines on our cloud service. As a result of having early access to such a powerful solution, we were able to demonstrate a full-stack AV prototype after only about six months of development.
Our AV stack consists of four modules: perception, environment model, planning, and control. Applied’s simulation products Simian, Spectral, and Orbis help us develop and test each module individually. They also allow us to run all modules concurrently, just as they would interact once the AV stack is deployed in the real world (Figure 2).
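To make the module chain concrete, here is a minimal sketch of how four such modules could be wired together for one simulation tick. All class names, message types, and the stubbed logic are hypothetical illustrations, not driveblocks’ actual interfaces or Applied’s APIs:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Placeholder message types passed between modules (illustrative only).
@dataclass
class SensorFrame:
    lidar_points: List[Tuple[float, float, float]] = field(default_factory=list)

@dataclass
class EnvironmentModel:
    objects: List[str] = field(default_factory=list)

@dataclass
class Trajectory:
    waypoints: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class ControlCommand:
    steering: float = 0.0
    throttle: float = 0.0

class Perception:
    def process(self, frame: SensorFrame) -> EnvironmentModel:
        # Stub: report an object whenever any lidar return is present.
        return EnvironmentModel(objects=["vehicle"] if frame.lidar_points else [])

class Planner:
    def plan(self, env: EnvironmentModel) -> Trajectory:
        # Stub: plan a slower trajectory when an object is detected.
        speed = 0.5 if env.objects else 1.0
        return Trajectory(waypoints=[(0.0, 0.0), (speed, 0.0)])

class Controller:
    def control(self, traj: Trajectory) -> ControlCommand:
        # Stub: derive a throttle command from the planned forward progress.
        dx = traj.waypoints[-1][0] - traj.waypoints[0][0]
        return ControlCommand(steering=0.0, throttle=dx)

def step(frame: SensorFrame) -> ControlCommand:
    """Run the modules in sequence, as in one full-stack simulation tick."""
    env = Perception().process(frame)
    traj = Planner().plan(env)
    return Controller().control(traj)
```

The value of this structure is that each stage consumes only its predecessor’s output, so any module can be swapped for simulator-provided ground truth during testing.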
Simian is Applied’s core simulation product. It provides highly customizable ways to generate traffic scenarios on imported custom maps. Our team conducts tests on an imported map consisting of highway, rural, and urban roads near our headquarters just north of Munich (Figure 3). With Simian, we have started to test our software granularly. In the first stage, both planning and control modules are required to maneuver the vehicle. The input data into our planning module comes directly from Simian and represents ground truth (see Figure 2, purple line). In the next step, the input data from Simian goes through our environment model first (see Figure 2, green line) before reaching the planning module. Here, the simulation data is very similar to what we expect real-world inputs into the perception module to look like during later development stages.
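The two-stage setup described above amounts to switching the planner’s input path between raw simulator ground truth and the environment model’s output. The following sketch shows one way such a toggle could look; the function names and the range-filtering stub are assumptions for illustration, not Simian’s actual interface:

```python
from typing import Callable, List, Tuple

# Ground truth here is modeled as a list of (x, y) object positions the
# simulator would publish each tick (illustrative only).
GroundTruth = List[Tuple[float, float]]

def identity_path(gt: GroundTruth) -> GroundTruth:
    # Stage 1: feed simulator ground truth straight into the planner.
    return gt

def environment_model_path(gt: GroundTruth) -> GroundTruth:
    # Stage 2: route ground truth through the environment model first,
    # e.g. dropping objects beyond a 100 m tracking range (stubbed logic).
    return [obj for obj in gt if obj[0] ** 2 + obj[1] ** 2 < 100.0 ** 2]

def planner_input(gt: GroundTruth, with_env_model: bool) -> GroundTruth:
    path: Callable[[GroundTruth], GroundTruth] = (
        environment_model_path if with_env_model else identity_path
    )
    return path(gt)
```

Keeping the toggle at the data-path level means the planner itself needs no changes between the two test stages.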
Using Simian, we can generate a variety of test scenarios quickly on custom maps and purpose-built road segments. We can also choose different options for obtaining ground truth sensor data from Simian. This helps us build individual data publishers that address one or multiple concurrent software modules in the loop. As a result, we can implement different levels of sensor data quality as well as different types of required input data.
Applied’s sensor simulation product Spectral allows us to reuse tests already specified in Simian and run them with the perception module in the loop. With Simian and Spectral, we can simulate our entire AV stack, running all four modules concurrently. During the simulations, each module relies solely on its respective preceding module. As the perception module is the only module to gather data about the vehicle’s surroundings (see Figure 2, yellow line), the ability to simulate different weather conditions and times of day is vital for robust perception algorithm development. In addition to perception module and full-stack testing, we also create synthetic datasets to train machine learning (ML) algorithms on aspects such as camera-based lane marking detection or lidar point cloud object detection (Figure 4).
Overall, Spectral allows us to achieve two main goals: (1) testing and validating our entire AV stack early in development using realistic scenarios, and (2) generating synthetic data to train our perception algorithms. We can achieve both goals without investing in costly real-world testing or data collection. This allows us to save resources, which we can allocate to high-priority AV development work instead.
Applied’s testing automation platform Orbis integrates with both Simian and Spectral. It provides a workflow for large-scale automated testing of pre-defined scenarios. Our CI/CD pipeline features commit-triggered and nightly tests that help us detect potential issues quickly and track and evaluate our development progress. Recurring large-scale simulations of our fast-growing scenario library help us drive hundreds of virtual test kilometers overnight. This enables us to boost the development pace even further and at a low cost before gaining access to a significant number of prototype vehicles. By leveraging such an extensive test and validation workflow, we can develop our AV stack in a resource-efficient way and maintain high standards for the quality of our code.
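A nightly batch run over a scenario library can be pictured as fanning simulations out across workers and aggregating a pass/fail report. The sketch below illustrates that pattern with Python’s standard library; `run_scenario` is a stand-in for the real simulator invocation, and the stubbed pass/fail logic and scenario names are invented for the example, not Orbis’s actual API:

```python
import concurrent.futures
from typing import Dict, Iterable, List

def run_scenario(scenario_id: str) -> Dict:
    # In practice this would launch a simulation and collect metrics;
    # here the result is stubbed so the sketch stays self-contained.
    passed = not scenario_id.endswith("cut_in_07")
    return {"scenario": scenario_id, "passed": passed}

def nightly_run(scenario_ids: Iterable[str]) -> Dict:
    """Run all scenarios in parallel and summarize failures."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(run_scenario, scenario_ids))
    failures: List[str] = [r["scenario"] for r in results if not r["passed"]]
    return {"total": len(results), "failures": failures}
```

A commit-triggered job would run a small smoke-test subset with the same machinery, while the nightly job iterates over the full scenario library.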
By leveraging Applied’s simulation software, we can test our AV stack in a multi-step approach that scales to our needs—from trajectory planning based on ground truth simulation data to full-stack AV testing with simulated vehicle sensors.
In the future, we plan to set up our hardware-in-the-loop (HIL) simulator by connecting the vehicle’s target hardware to Applied’s simulation software. This will prepare our AV stack for precise prototype deployment. We will be able to front-load integration tasks and minimize the time and effort spent on integrating different tools. We also look forward to digging deeper into the generation of synthetic training data for our ML-based perception algorithms. Finally, we will extend our automated testing capabilities by quickly growing the number of scenarios and leveraging large-scale parameter sweeps.
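A large-scale parameter sweep expands one base scenario into a grid of variants, one per combination of parameter values. The following sketch shows the idea with the standard library; the parameter names and the base-scenario name are hypothetical, not Simian’s actual scenario schema:

```python
import itertools
from typing import Dict, Iterator, List

def sweep(base_name: str, params: Dict[str, List[float]]) -> Iterator[Dict]:
    """Yield one scenario variant per combination of parameter values."""
    keys = sorted(params)
    for values in itertools.product(*(params[k] for k in keys)):
        yield {"scenario": base_name, **dict(zip(keys, values))}

variants = list(sweep("highway_cut_in", {
    "ego_speed_mps": [20.0, 25.0, 30.0],
    "cut_in_gap_m": [10.0, 20.0],
}))
# 3 speeds x 2 gaps -> 6 scenario variants
```

Even a handful of parameters with a few values each multiplies quickly, which is why automated batch execution of the resulting variants matters.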
Applied’s sophisticated, reliable simulation workflows and the company’s comprehensive automated testing platform are critical success factors for the driveblocks team. Simian, Spectral, and Orbis empower us to focus on our AV software development and generate rapid progress toward validating a high-quality, safe commercial AV stack. We are thrilled that Applied is supporting us on our path to building a modular AV stack, and we are excited to see where this journey will lead us.