About a year ago, TechCrunch wrote about a little-known company developing AI accelerator chips to take on hardware from industry titans like Nvidia, AMD, Microsoft, Meta, AWS, and Intel. Its mission seemed a bit ambitious at the time, and it still does. But it must be recognized that the startup, EnCharge AI, is alive and well, and just raised $22.6 million in a new funding round.
VentureTech Alliance, the strategic venture capital partner of semiconductor giant TSMC, led the round with participation from RTX Ventures, ACVC Partners, Anzu Partners and Schams Ventures. The new capital, which brings EnCharge's total raised to $45 million, will go toward growing the company's 50-person team across the US, Canada and Germany, and bolstering the development of EnCharge's AI chips and "end-to-end" AI solutions, according to co-founder and CEO Naveen Verma.
"EnCharge's mission is to provide broader access to AI for the 99% of organizations that cannot afford to deploy today's expensive, power-hungry AI chips," Verma said. "Specifically, we are enabling new AI use cases and form factors that operate sustainably, from both an economic and environmental perspective, to unleash the full potential of AI."
Verma, director of Princeton's Keller Center for Innovation in Engineering Education, launched EnCharge last year with Echere Iroaga and Kailash Gopalakrishnan. Gopalakrishnan was until recently at IBM, where he had worked for almost 18 years. Iroaga previously led semiconductor company Macom's connectivity business unit, first as vice president and then as general manager.
EnCharge has its roots in federal grants Verma received in 2017 along with collaborators at the University of Illinois at Urbana-Champaign. As part of DARPA's Electronics Resurgence Initiative, which aims to advance a range of computer chip technologies, Verma led an $8.3 million effort to research new types of non-volatile memory devices.
Unlike the “volatile” memory prevalent in today's computers, non-volatile memory can retain data without a continuous power supply, making it theoretically more energy efficient.
DARPA also funded Verma's research on in-memory computing ("in-memory" here referring to running calculations in RAM to reduce the latency introduced by storage devices).
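The data-movement argument behind in-memory computing can be sketched in ordinary software terms. The snippet below is a loose Python analogy only — EnCharge's chips do this in custom silicon, and every function name here is invented for illustration, not part of any EnCharge product:

```python
# Illustrative analogy: the cost of re-fetching weights from slow storage on
# every inference vs. keeping them resident where the compute happens.
# All names are hypothetical; this is not EnCharge's actual architecture.

def matvec(weights, x):
    """Plain matrix-vector multiply: y[i] = sum_j weights[i][j] * x[j]."""
    return [sum(w * v for w, v in zip(row, x)) for row in weights]

def storage_bound_inference(load_weights, x, n_runs):
    """Conventional flow: weights are fetched from storage on every run,
    so data movement dominates as n_runs grows."""
    results = []
    for _ in range(n_runs):
        weights = load_weights()  # simulated slow fetch, repeated each run
        results.append(matvec(weights, x))
    return results

def in_memory_inference(load_weights, x, n_runs):
    """In-memory flow: weights are loaded once and stay resident next to
    the compute, so repeated runs avoid the fetch entirely."""
    weights = load_weights()  # fetched a single time, then reused
    return [matvec(weights, x) for _ in range(n_runs)]
```

Both paths compute identical results; the difference is purely where the weights live between runs, which is the energy and latency cost in-memory architectures target.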
EnCharge was launched to commercialize Verma's research. By using in-memory computing, EnCharge hardware can accelerate AI applications on servers and "network edge" machines, Verma says, while reducing power consumption relative to standard computer processors.
"Current AI computing is expensive and energy-intensive; currently, only the most well-capitalized organizations are innovating in AI. For most, AI is not yet achievable at scale in their organizations or products," he stated. "EnCharge products can provide the processing power the market demands while addressing the extremely high power requirements and cost obstacles organizations face."
Lofty language aside, it's worth noting that EnCharge hasn't started mass-producing its hardware yet and so far has only "several" customers lined up. In another challenge, EnCharge faces well-funded competition in the already saturated AI accelerator hardware market. Axelera and GigaSpaces are developing in-memory hardware to accelerate AI workloads, and NeuroBlade has raised tens of millions in venture capital funding for its in-memory inference chip for data centers and edge devices.
It's also difficult to take EnCharge's performance claims at face value, given that third parties haven't yet had a chance to benchmark the startup's chips. But EnCharge's investors are standing behind those claims, for what it's worth.
"EnCharge is solving critical issues around computing power, accessibility and costs that limit AI today and are inadequate to handle the AI of tomorrow," Kai Tsang of VentureTech Alliance said via email. "The company has developed computing beyond the limits of current systems with a technologically unique architecture that adapts to today's supply chain."