Occipital and Inuitive present new integrated solution for AR/VR/MR headsets and robotics

Occipital, Inc. and Inuitive Ltd. announce a collaboration to bring to market a complete hardware and software solution that delivers highly efficient room-scale sensing and SLAM (simultaneous localization and mapping) to next-generation mixed reality, augmented reality and virtual reality (MR/AR/VR) headsets and robotics.

The joint offering enables manufacturers of AR/VR/MR headsets and home & industrial robots to easily integrate efficient, low-latency 3D sensing and SLAM into their next generation of products. The solution merges Occipital’s Structure Core embeddable depth sensor with Inuitive’s NU3000 depth processing chip.

For the past year, Occipital and Inuitive have worked together to tightly integrate Structure Core and the NU3000 to achieve levels of performance that minimize host system load while delivering an exceptional user experience. The combined Structure Core and NU3000 solution offers high performance with low power consumption.

The market-leading solution senses depth from 30cm to greater than 5m with an accuracy as high as ±0.17% RMS at 1m (using fit-to-plane). The integrated solution reduces CPU load during 6-DoF tracking to less than 25% of a recent dual-core ARM CPU.

System latency (from camera to fully tracked pose) is just 10ms. Even at this level of performance, power consumption for depth plus visible-light sensing is between 1.3W and 2.0W, depending on the configuration selected.
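To put the quoted ±0.17% RMS accuracy figure in absolute terms, the sketch below converts it to millimeters at a few ranges. Note that the announcement only quotes the figure at 1m; the assumption that the percentage error stays roughly constant across the working range is ours, for illustration only.

```python
# Hedged sketch: convert a percent-RMS depth accuracy spec into absolute
# error. Constant-percentage scaling with range is an assumption, not a
# claim from the announcement (which quotes +/-0.17% RMS at 1m only).

def rms_error_mm(depth_m: float, pct: float = 0.17) -> float:
    """Absolute RMS depth error in millimeters at the given range."""
    return depth_m * 1000.0 * pct / 100.0

for d in (0.3, 1.0, 5.0):
    print(f"{d:>4} m -> +/-{rms_error_mm(d):.1f} mm RMS")
```

At the quoted 1m range this works out to roughly ±1.7mm RMS.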

Along with Structure Core, Occipital also offers Bridge Engine, an advanced MR software engine and development platform. Available on multiple platforms, Bridge Engine allows manufacturers to accelerate development of market-ready devices with advanced position-tracked VR and mixed reality capabilities.

The combination of Structure Core and NU3000 is designed for use in the new generation of home & industrial robots as well as headsets. In particular, Structure Core’s dual infrared cameras can be used for stereo depth sensing in ambient sunlight that would otherwise blind robotic navigation systems relying on time-of-flight or structured-light depth sensors.
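The stereo depth sensing mentioned above relies on triangulation between the two infrared cameras rather than on an emitted signal, which is why it tolerates sunlight. A minimal sketch of the underlying geometry, with entirely hypothetical camera parameters (not Structure Core specs):

```python
# Hedged sketch of passive stereo triangulation. Because depth comes from
# matching features between two images rather than from an emitted IR
# pattern or pulse, bright sunlight does not wash out the signal the way
# it can for structured-light or time-of-flight sensors.
# The focal length and baseline below are made-up example values.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 500 px focal length, 5 cm baseline.
# A 10 px disparity then corresponds to a point 2.5 m away.
print(stereo_depth_m(500, 0.05, 10))
```

Closer objects produce larger disparities, so with the same hypothetical rig a 50 px disparity corresponds to 0.5 m.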


Liat
