Yoshiya Okoyama, CEO and Takashi Kawahara, Chairman
With fully autonomous driving (AD) capabilities still years away, manufacturers are racing to develop next-generation ADAS and AD systems that will put them ahead in the market. Although these systems are expensive and time-consuming to develop, most are built merely to meet regulatory requirements and fail to represent real-world scenarios such as uncontrollable weather and accident situations, while physical testing incurs huge costs. Over the years, virtual testing has become a standard tool from function development to validation, driven by increasingly complex driving functions and their networking in future automobiles. This is where VERTechs has successfully carved a niche for itself by providing 3D-CG databases of Japan for creating virtual simulation environments and model-based simulations that enable the development of ADAS and AD systems, along with AI training data and OpenDRIVE data. The company's virtual environments range from urban cities to warehouses and vineyards, wherever autonomous vehicles will be needed in the future. Its experts have also created digital twins of urban areas, from a few hundred metres to as large as 5 kilometres, for several OEMs and Tier 1 suppliers in Japan. "As experts of Unreal Engine, we are able to provide a realistic visual model of the real-world location and combine it with accurate road information and a navigable scene for Japanese automotive companies," says Yoshiya Okoyama, CEO of VERTechs. Since its establishment in 2016 under the Advanced-Data Control group, VERTechs has provided virtual environments for clients in the automotive industry for AI training data creation, reinforcement learning, virtual sensor models, and software-in-the-loop simulations (SiLS).
VERTechs' AUTOCity platform fulfils the industry's requirements for simulating a realistic driving environment. It is optimized for physically based rendering (PBR) and real-time operation, and can be used for a variety of scenarios where edge cases need to be tested. Experts at VERTechs have created the AUTOCity 3DCG database to provide high-fidelity environments, various sensor models for functional verification, and support for building a model-based simulation environment of a Japanese urban junction.
VERTechs developed its AUTOCity Deep Learning plugins, based on Unreal Engine 4, to efficiently improve object detection and recognition for autonomous driving AI.
The AUTOCity Deep Learning plugin can simultaneously output camera, segmentation, and depth images, and its playback/recording feature simplifies scenario creation. A crucial element of ADAS and AD systems is reliable and robust environmental perception using sensors. The AUTOCity Sensor package comprises the representative sensors that make up the perception system of an autonomous vehicle: a set of camera, lidar, radar, and IMU sensors, modelled in Unreal Engine 4, that collect virtual data for use in simulations. Users can replay scenarios encountered during testing and virtually enhance them or add variations to probe edge cases using virtual data. In addition, VERTechs provides a database of NCAP dummies, called 'AUTOCity Euro NCAP Asset', for performing automobile safety tests based on the European New Car Assessment Programme in an Unreal Engine 4 virtual environment.
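Simulator output of this kind is typically consumed as aligned per-frame tuples, so that every training sample bundles the camera image with its matching segmentation mask and depth map. The sketch below illustrates that pairing step in Python; the file-naming scheme is hypothetical and not the plugin's actual output format.

```python
from collections import defaultdict

# Hypothetical per-frame filenames as a simulator might emit them;
# this naming scheme is illustrative, not the plugin's actual format.
files = [
    "frame_0001_camera.png", "frame_0001_segmentation.png", "frame_0001_depth.png",
    "frame_0002_camera.png", "frame_0002_segmentation.png", "frame_0002_depth.png",
]

def pair_modalities(filenames):
    """Group per-frame outputs so each training sample bundles all three modalities."""
    frames = defaultdict(dict)
    for name in filenames:
        stem = name.rsplit(".", 1)[0]            # drop the extension
        _, frame_id, modality = stem.split("_")  # e.g. ("frame", "0001", "camera")
        frames[frame_id][modality] = name
    # Keep only frames where every modality is present, so incomplete
    # frames never become malformed training samples.
    wanted = {"camera", "segmentation", "depth"}
    return {fid: mods for fid, mods in frames.items() if set(mods) == wanted}

samples = pair_modalities(files)
```

Dropping incomplete frames up front is a common safeguard: a segmentation mask without its camera image is useless as a supervised training pair.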
To further illustrate the efficacy of VERTechs, Okoyama cites a case study in which the company combined ASAM OpenDRIVE data with a high-fidelity virtual environment. VERTechs created a digital twin of the Hamamatsu area in southern Japan for an OEM that tested driving scenarios in the Prescan simulation software. When the OEM entrusted VERTechs with this project, the client was looking for ground truth against which to test its driving algorithm. Besides providing a visually realistic twin city in FBX format, experts at VERTechs decided to incorporate ASAM OpenDRIVE data for geo-referencing, using RoadRunner 2019.1.4 to create the OpenDRIVE data and Maya to create the FBX data. VERTechs converted ADAS map data into FBX and ASAM OpenDRIVE data spanning a total of 5 kilometres. Beyond the road network, the team also created other vital, location-specific elements of the environment necessary for autonomous driving, such as foliage, buildings, street lamps and more. The client then imported the FBX and ASAM OpenDRIVE data into Prescan to solve the localization problem. In the end, owing to VERTechs' efforts, the client could successfully drive an autonomous vehicle in a virtual environment with verified data and its choice of simulation software.
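ASAM OpenDRIVE is an XML schema: the file header carries a proj-style geo-reference string (which is what anchors a twin like the Hamamatsu model to real-world coordinates), and each road element describes its reference-line geometry. The following minimal sketch reads those two pieces of information with Python's standard library; the embedded sample document is illustrative only, not VERTechs' actual data.

```python
import xml.etree.ElementTree as ET

# Illustrative minimal OpenDRIVE document (not VERTechs' actual data):
# a proj-style geo-reference in the header and one road with a line geometry.
SAMPLE_XODR = """<?xml version="1.0"?>
<OpenDRIVE>
  <header revMajor="1" revMinor="4" name="sample">
    <geoReference><![CDATA[+proj=tmerc +lat_0=34.7 +lon_0=137.7 +datum=WGS84]]></geoReference>
  </header>
  <road name="MainStreet" length="120.5" id="1">
    <planView>
      <geometry s="0.0" x="0.0" y="0.0" hdg="0.0" length="120.5">
        <line/>
      </geometry>
    </planView>
  </road>
</OpenDRIVE>"""

def summarize(xodr_text):
    """Return the geo-reference string and per-road lengths from an OpenDRIVE XML string."""
    root = ET.fromstring(xodr_text)
    geo = root.find("header/geoReference")
    roads = {r.get("id"): float(r.get("length")) for r in root.findall("road")}
    return (geo.text if geo is not None else None), roads

geo_ref, road_lengths = summarize(SAMPLE_XODR)
```

A simulator such as Prescan uses the geo-reference to place the logical road network in the same coordinate frame as the visual FBX geometry, which is why the case study pairs the two formats.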
Today, automobile manufacturers are spending significant resources and time to harness the power of simulation and scenario-based environments to develop ADAS and AD systems that meet real-world requirements. As one of the first companies to provide real-time, game-engine-based solutions for OEM and Tier 1 companies, VERTechs is participating in the development of standardized formats as a member of the ASAM group, through its parent company, the Advanced-Data Control group. It is also one of the first companies to provide OpenDRIVE data in combination with 3DCG data of the Japanese road environment using the MathWorks road network design tool "RoadRunner". Additionally, the company uses PLATEAU data created by Japan's Ministry of Land, Infrastructure, Transport and Tourism to create environments and deliver accurate, real-world information. "Currently, VERTechs is developing an automated pipeline with which it can generate a high-fidelity road network using machine learning," concludes Takashi Kawahara, Chairman of VERTechs.