April 24, 2024

Quantum computing’s role in AI evolution: Streamlining industrial and enterprise operations


Technology’s evolution is often a matter of combining existing capabilities. One such pairing in today’s computing environment is artificial intelligence and quantum computing, with the latter supplying efficiency gains and reduced resource usage.

At the forefront of this intersection is quantum computing software company Multiverse Computing S.L.

“The real idea was, OK, we have AI systems that are expensive to train — they are large and so on, and could we just prepare something that is not only smaller, because you have to fit the system somewhere, but also consumes less energy,” said Enrique Lizaso (pictured), co-founder and chief executive officer of Multiverse Computing. “We analyzed the problem and said, ‘Yes, this is a particular problem that fits into quantum computing.’ Then, at some point … somebody inside the company suggested, ‘OK, we have done that in the manufacturing line … can we apply that to LLMs?'”

Lizaso spoke with theCUBE Research principal analyst Rob Strechay at the “Supercloud 6: AI Innovators” event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the transformative potential of harnessing AI and quantum computing to streamline industrial and enterprise operations.

Europe’s efficiency and accuracy ambitions with AI and quantum computing

Often seen as trailing technology powerhouses such as the United States and China, Europe is now asserting its presence in the quantum and AI domains. Multiverse Computing’s recent successful funding round highlights the continent’s commitment to pushing boundaries in these fields.

The company’s ethos sheds light on the potential for quantum-inspired techniques to reshape industries beyond the realm of tech giants. From defect detection in manufacturing to real-time patient monitoring in healthcare and injury prediction in sports, the applications are vast and transformative, according to Lizaso. The core principle, however, remains constant: making AI more accurate, efficient and cost-effective.

“I think that two days ago, there was some news in the Financial Times about [Ireland] using 20% of the electricity production just for data centers,” Lizaso said. “The demand for new data centers and also for more performance from the data centers is escalating super fast. So, they say, ‘OK, we have to apply the data centers to use green energy.’ [But] the AI electricity consumption is escalating so fast that even if you put all the resources of green electricity you have now, you are not going to cope with the demand.”

A pressing concern in AI deployment is the integrity of models, particularly in scenarios such as customer service, where misinformation can have severe consequences. Multiverse Computing’s upcoming solution promises to address this by enabling models to selectively forget information, mitigating the need for costly retraining after data errors or privacy violations, according to Lizaso.

“We thought that, OK, those quantum techniques make a representation of the distribution of the knowledge inside a [large language model],” he said. “From a different point of view, I mean the systems authorization … you can just extract, eliminate some parts of information with more precision. This is super important — [our new solution] is going to appear in the next months. So, it’s not only about compressing and making cheap things cheaper, but also to do that.”

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of the “Supercloud 6: AI Innovators” event: