SPEAR: An Open-Source Photorealistic Simulator for Embodied AI

Intel Labs has developed the Simulator for Photorealistic Embodied AI Research (SPEAR) in collaboration with the Computer Vision Center (CVC), Kujiale, and the Technical University of Munich. This highly realistic, open-source simulation platform accelerates the training and validation of embodied AI systems in indoor domains.

SPEAR is available for download under an open-source MIT license, can be customised for any hardware, and can be applied to a variety of domestic navigation and manipulation tasks. SPEAR aims to drive research and commercial applications in household robotics and manufacturing, including human-robot interaction scenarios and digital twin applications.

To create SPEAR, Intel Labs worked with a team of professional artists for over a year to construct a collection of handcrafted interactive environments. SPEAR features a starter pack of 300 virtual indoor environments with more than 2,500 rooms and 17,000 objects that can be manipulated individually. These interactive training environments use detailed geometry, photorealistic materials, realistic physics and accurate lighting. New content packs targeting industrial and healthcare domains will be released soon.

By offering diverse and realistic environments, SPEAR supports the entire development cycle of embodied AI systems and enables agents trained purely in simulation to operate in the real world. SPEAR helps to improve accuracy on many embodied AI tasks, especially traversing and rearranging cluttered indoor environments. It aims to decrease the time to market for household robotics and smart warehouse applications, and to increase the spatial intelligence of embodied agents.
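
For context, work with a simulator of this kind typically follows the observation-action loop common in embodied AI research: an agent receives photorealistic sensor observations, issues an action, and repeats. The sketch below is purely illustrative and assumes a Gym-style Python interface; the package, class, and key names (spear, get_config, Env, the camera observation key, and the location action) are assumptions made for illustration, not details confirmed by this article.

    import numpy as np
    import spear  # assumed package name for SPEAR's Python bindings

    # Load a default configuration and launch one of the indoor environments
    # (get_config and Env are assumed, Gym-style entry points).
    config = spear.get_config(user_config_files=[])
    env = spear.Env(config)

    obs = env.reset()
    for _ in range(100):
        # Toy random-walk policy: apply a small planar displacement each step.
        action = {"add_to_location": np.random.uniform(-10.0, 10.0, size=3)}
        obs, reward, done, info = env.step(action=action)
        rgb = obs["camera.final_color"]  # photorealistic RGB frame (assumed key)
        if done:
            obs = env.reset()

    env.close()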

The CVC has contributed to this project by collaborating on the development and design of content for the SPEAR environments.

For more information about SPEAR:

Other links of interest (SPEAR in the media):