
Europe Leaps Ahead in Global AI Arms Race, Joining $20 Million Investment in NeuReality to Advance Affordable, Carbon-Neutral AI Data Centers

Europe’s crucial investment in NeuReality through its EIC Fund helps reshape the global AI landscape, challenging chip scarcity and positioning the region to outpace other nations with greener, more affordable AI Inference

  • NeuReality cracks the code of AI chip scarcity, shifting the paradigm from mere AI chip proliferation to holistic AI data center efficiency — disrupting norms for BigTech and AI startups alike
  • Amid the Generative AI and LLM surge, NeuReality’s latest raise accelerates its deployment to more regions and market segments – supercharging AI Inference for advanced applications with unmatched performance while slashing costs by 90%

NeuReality, a technology leader re-imagining AI Inference and data center infrastructure for the age of AI, announced today that it has raised $20 million in new funds from the European Innovation Council (EIC) Fund, Varana Capital, Cleveland Avenue, XT Hi-Tech and OurCrowd. The new capital will accelerate the deployment of the company’s NR1™ AI Inference Solution to more customers and users – enabling NeuReality to shift faster from its early deployment phase to growth across other markets, regions, and generative AI.

This $20 million raise brings NeuReality’s total funding to $70 million. It marks another significant vote of confidence since TSMC delivered the company’s 7nm AI inference server-on-a-chip, the NR1 NAPU™ (Network Addressable Processing Unit), last year — the fundamental component of its overall NR1™ AI Inference Solution. NeuReality’s efficient AI-centric system architecture enables companies to run generative AI applications and large language models (LLMs) without overinvesting in scarce and underutilized GPUs.

“In order to mitigate GPU scarcity, optimization at the system and datacenter level are key,” said Naveen Rao, VP of Generative AI at Databricks. Rao, a NeuReality Board member and early investor in the startup added: “To enable greater access to compute for generative AI, we must remove market barriers to entry with a far greater sense of urgency. NeuReality’s innovative system — engineered at the data center architecture level — represents that tipping point.”

Enterprises face other major challenges in deploying trained AI models and applications, a stage known as the AI Inference process. Running live data through AI models to solve a task can be complex and costly, with a record of poor scalability from AI accelerators and system bottlenecks caused by CPUs.

“Our disruptive AI Inference technology is unbound by conventional CPUs, GPUs, and NICs. We didn’t try to just improve an already flawed system. Instead, we unpacked and redefined the ideal AI Inference system from top to bottom and end to end, to deliver breakthrough performance, cost savings, and energy efficiency,” said NeuReality’s CEO Moshe Tanach, pointing to the paltry 30-40 percent utilization rate of AI accelerators.

“Investing in more and more DLAs, GPUs, LPUs, TPUs…won’t address your core issue of system inefficiency,” remarked Tanach. “It’s akin to installing a faster engine in your car to navigate through traffic congestion and dead ends — it simply won’t get you to your destination any faster. NeuReality, on the other hand, provides an express lane for large AI pipelines, seamlessly routing tasks to purpose-built AI devices and swiftly delivering responses to your customers, while conserving both resources and capital.”

NeuReality Brings a Clear Alternative to CPU Limitations

NeuReality’s NR1-M™ and NR1-S™ systems, full-height PCIe cards that integrate easily into server racks, drive 100 percent AI accelerator utilization. Each system houses internal NAPUs that work with any AI accelerator and operate independently of the CPU, eliminating the CPU requirement altogether. By connecting directly to Ethernet, NR1 efficiently manages AI queries from vast data pipelines originating from millions of users and billions of devices. Compatible server configurations were demonstrated by AMD, IBM, Lenovo, Qualcomm, and Supermicro at NeuReality’s product launch at the SC23 international conference in Denver last November.

Since its Series A funding in 2022, NeuReality has taken delivery of its NR1 NAPUs. The firm is driving early AI deployments with select cloud service customers and enterprise customers in financial services, business services, and government, focusing on current natural language processing, automated speech recognition, recommendation systems, and computer vision. The additional $20 million propels NeuReality for broader AI Inference deployment as both conventional and generative AI applications surge in demand.

Funding from the EU Supports NeuReality’s AI Inference and GenAI Deployments

The investment by the EIC Fund, the venture arm of the European Commission’s EIC Accelerator program, reflects its support for a solution that brings not only optimized performance but also improved energy efficiency to AI deployment in the marketplace. The EIC investment further addresses two important industries for Europe – advanced semiconductors and AI, both of which are expected to be major drivers of economic growth in the coming years.

“We recognize the increasing importance of the European Union’s leadership in AI and blazing a different path of high efficiency versus high spending,” said Svetoslava Georgieva, Chair of the EIC Fund Board. “NeuReality’s vision and disruptive technology align with our commitment to fortifying the region’s deep tech and AI investments for a sustainable, AI-powered future.”

NeuReality had already secured a substantial grant from the EIC Accelerator program last year to support various development steps of the firm’s innovation aimed at solving the cost and complexity of AI Inference at its core architectural level.

“We appreciate this tremendous vote of confidence from the EU and recognize that the backing of influential investors and the European Union will drive us forward as we pioneer next-gen semiconductor advancements in AI Inference,” Tanach said. “It’s a significant step towards a greener, more democratized AI future.”


Credit: Aviv Kurt

Danit

