Multiverse Computing has demonstrated how quantum computers with only a few qubits can already perform calculus on arbitrary multidimensional functions with remarkable efficiency.
Multiverse Computing has developed a “quantum calculator” algorithm intended to transform a quantum computer into a mathematical tool for executing the complex calculations, such as optimization problems, that are currently carried out by specialized software. According to Multiverse, a deep-tech company headquartered in San Sebastian, Spain, with subsidiaries in Toronto, Paris and Munich, the results are on par with those obtained by the highest-performance conventional computers and will improve as quantum computing capability grows, demonstrating that quantum computers can be useful now.
Quantum computing promises extraordinary increases in computational speed and power relative to today’s computers, yielding benefits in fields like pharmaceuticals, healthcare, manufacturing, cybersecurity and finance. For example, quantum computing can accelerate financial portfolio management models, such as the Monte Carlo model, for calculating likely outcomes and attendant risks. The computers can carry out multiple sophisticated calculations simultaneously, which is especially helpful for factorizations and could advance the development of decryption technology.
Quantum computers can run challenging simulations, as they can mimic highly complex systems at speeds well beyond the reach of traditional computers. Molecular simulations, which are crucial in the creation of prescription drugs, might benefit from this capability. And quantum computers’ promised optimization results, with their ability to process enormous volumes of complex data, could revolutionize artificial intelligence and machine learning.
Research collaborations and investments by large tech companies have advanced the technology in recent years to the point where the first commercial quantum computers have been built and brought to market, though these early models are constructed from sparse sets of noisy qubits. Future quantum processors will likely be more powerful and noise-resistant, if historical trends provide any indication. In the meantime, the task is to put today’s noisy intermediate-scale quantum (NISQ) devices to productive use.
Optimization is an important area in which today’s NISQ devices can start to add value. Hybrid quantum-classical solutions can deploy modern quantum annealers to address challenging optimization problems in practical settings.
An optimization problem entails maximizing or minimizing a function over a set that represents the range of options available in a particular circumstance. Framing a problem this way enables comparison of the various options to identify which may be the best.
Optimization problems can be classified according to the types of variables they deal with:
• In continuous optimization, all the variables in the problem can take continuous values.
• In discrete optimization, all the variables in the problem can take values only from a discrete set, such as the integers or a finite list of choices.
• In mixed optimization, some of the variables are continuous and others are discrete.
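The distinction between continuous and discrete optimization can be illustrated with a toy example. The sketch below minimizes a hypothetical one-variable cost function, first over the reals (where the quadratic's optimum is known analytically) and then over a finite set of integers:

```python
# Toy illustration of continuous vs. discrete optimization,
# using a hypothetical one-variable cost f(x) = (x - 2.3)**2.

def f(x):
    return (x - 2.3) ** 2

# Continuous: the variable may take any real value; for this
# simple quadratic the minimizer is known analytically.
x_cont = 2.3
print(f"continuous optimum: x = {x_cont}, f = {f(x_cont)}")

# Discrete: the variable is restricted to a finite set, so we
# enumerate the candidates and keep the best one.
candidates = range(-5, 6)          # integers -5 .. 5
x_disc = min(candidates, key=f)
print(f"discrete optimum:   x = {x_disc}, f = {f(x_disc):.2f}")
```

Note how the discrete optimum (x = 2) is necessarily no better than the continuous one; restricting the feasible set can only raise the minimum, which is one reason discrete problems are generally harder.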
Additionally, an optimization problem is said to be a linear program or linear programming problem (LP) if both the function to be optimized and the functions describing the constraints are linear. Conversely, a nonlinear program or nonlinear programming problem (NLP) is an optimization problem in which at least one of these functions is nonlinear.
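A minimal sketch of a linear program, with an illustrative objective and constraints chosen for this example only: maximize 3x + 4y subject to x + y ≤ 4, x + 3y ≤ 6, x ≥ 0, y ≥ 0. Because an LP's optimum lies at a vertex of the feasible region, a two-variable instance can be solved by enumerating the intersections of constraint boundaries (real solvers use the simplex or interior-point methods instead):

```python
# Tiny 2-variable linear program solved by vertex enumeration.
# Maximize 3x + 4y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
from itertools import combinations

# Each constraint a*x + b*y <= c is stored as the triple (a, b, c).
constraints = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]

def objective(x, y):
    return 3 * x + 4 * y

def intersect(c1, c2):
    """Intersection point of two boundary lines, or None if parallel."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(x, y):
    return all(a * x + b * y <= c + 1e-9 for a, b, c in constraints)

# Candidate vertices are feasible intersections of constraint pairs.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(*p)]
best = max(vertices, key=lambda p: objective(*p))
print(best, objective(*best))  # (3.0, 1.0) 13.0
```

Replacing the objective with, say, 3x + 4y² would make this a nonlinear program: the optimum need no longer sit at a vertex, and this enumeration strategy breaks down.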
A typical combinatorial optimization problem, important in theoretical computer science and operations research, is the traveling salesman problem (TSP). The cities the salesman must visit are the points in the problem statement, and the goal of the TSP is to find the shortest route that visits every city exactly once and returns to the starting point, thereby minimizing travel time, distance and cost.
This seemingly simple example can be formulated as an integer linear program, but no polynomial-time algorithm is known for solving the TSP exactly; all known exact algorithms have exponential worst-case complexity. The TSP is therefore said to be an NP-hard problem (NP stands for nondeterministic polynomial time).
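The exponential blow-up is easy to see in a brute-force exact solver, which must examine (n−1)! tours. The sketch below uses a hypothetical symmetric distance matrix for four cities, fixing city 0 as the start:

```python
# Brute-force exact TSP on a toy instance: enumerating all (n-1)!
# tour orderings is what exponential complexity looks like in practice.
from itertools import permutations

# Hypothetical symmetric distance matrix for 4 cities (0..3),
# chosen purely for illustration.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(order):
    """Total length of the closed tour 0 -> order... -> back to 0."""
    route = (0, *order, 0)
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

# Fix city 0 as the start and try every ordering of the rest.
best = min(permutations(range(1, len(dist))), key=tour_length)
print("best tour:", (0, *best, 0), "length:", tour_length(best))
```

With 4 cities this checks only 6 orderings, but at 20 cities the count exceeds 10^17, which is why exact solvers give way to heuristics, and why optimization is seen as a promising workload for quantum annealers.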
In the case of nonlinear programming, where either the objective function or a constraint function is nonlinear (i.e., includes nonlinear operations), the complexity spirals and the optimization problem grows even more challenging. As a result, approximate linear models are usually chosen, even when a nonlinear formulation would better fit the conditions.
Check out the full article by Stefano Lovati on EE Times Europe here.