
Occipital and Inuitive present new integrated solution for AR/VR/MR headsets and robotics

Occipital, Inc. and Inuitive Ltd. announce a collaboration to bring to market a complete hardware and software solution that delivers highly efficient room-scale sensing and SLAM (simultaneous localization and mapping) to next-generation mixed reality, augmented reality and virtual reality (MR/AR/VR) headsets and robotics.

The joint offering enables manufacturers of AR/VR/MR headsets and home and industrial robots to easily integrate efficient, low-latency 3D sensing and SLAM into their next generation of products. The solution pairs Occipital’s Structure Core embeddable depth sensor with Inuitive’s NU3000 depth-processing chip.

For the past year, Occipital and Inuitive have worked together to tightly integrate Structure Core and the NU3000, achieving levels of performance that minimize host-system load while delivering an exceptional user experience. Together, Structure Core and the NU3000 offer high performance with low power consumption.

The market-leading solution senses depth from 30 cm to more than 5 m, with accuracy as high as ±0.17% RMS at 1 m (measured using a fit-to-plane method), which works out to roughly ±1.7 mm at that range. The integrated solution also keeps host CPU load below 25% of a recent dual-core ARM CPU during 6-DoF (six degrees of freedom) tracking.
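As a back-of-the-envelope illustration (not part of the announcement), the percentage spec can be turned into absolute figures across the stated sensing range. The sketch below assumes the RMS error scales linearly with distance, which is an optimistic simplification: stereo depth error typically grows faster than linearly with range.

    # Hypothetical helper: convert the percentage-of-distance RMS accuracy
    # spec into an absolute error estimate. Linear scaling with distance is
    # an assumption; real stereo depth error usually grows roughly with the
    # square of distance, so the far-range figure here is optimistic.

    RMS_ERROR_PCT = 0.17  # ±0.17% RMS at 1 m, per the announcement

    def rms_error_mm(distance_m: float, error_pct: float = RMS_ERROR_PCT) -> float:
        """Estimated RMS depth error in millimetres at the given range."""
        return distance_m * 1000.0 * (error_pct / 100.0)

    for d in (0.3, 1.0, 5.0):  # sample ranges within the stated 30 cm to 5 m span
        print(f"{d:4.1f} m -> +/-{rms_error_mm(d):.1f} mm (if linear scaling held)")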

System latency, from camera capture to fully tracked pose, is just 10 ms. Even at this level of performance, power consumption for the depth and visible-light streams is between 1.3 W and 2.0 W, depending on the configuration selected.

Along with Structure Core, Occipital also offers Bridge Engine, an advanced MR software engine and development platform. Available on multiple platforms, Bridge Engine allows manufacturers to accelerate development of market-ready devices with advanced position-tracked VR and mixed reality capabilities.

The combination of Structure Core and the NU3000 is designed for use in the new generation of home and industrial robots as well as headsets. In particular, Structure Core’s dual infrared cameras can provide stereo depth sensing in ambient sunlight that would otherwise blind robotic navigation systems relying on time-of-flight or structured-light depth sensors.
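For context, passive stereo recovers depth by triangulating the disparity between the two infrared views, which is why it keeps working when direct sunlight washes out a projected pattern or floods a time-of-flight sensor. A minimal sketch of the underlying pinhole-stereo relation follows; the focal length and baseline are hypothetical placeholders, not Structure Core specifications.

    # Minimal stereo-triangulation sketch: depth = focal_length * baseline / disparity.
    # Both camera parameters below are assumed values for illustration only.

    FOCAL_LENGTH_PX = 700.0  # focal length in pixels (assumed)
    BASELINE_M = 0.08        # separation between the two IR cameras (assumed)

    def depth_from_disparity(disparity_px: float) -> float:
        """Depth in metres from the pixel disparity of a matched feature."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

    # With these parameters, a feature shifted 11.2 pixels between the left
    # and right IR images lies roughly 5 m away.
    print(f"{depth_from_disparity(11.2):.2f} m")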
