How Quantum-Inspired Tensor Networks Help Solve Real-World Problems

Sam Mugel, Ph.D., is the CTO of Multiverse Computing, a global leader in developing value-driven quantum solutions for enterprises.

Quantum mechanics, the study of the behavior of subatomic particles, offers a way to improve the use of machine learning to solve inherently complex problems in optimization, product development and real-time decision making.

This quantum-inspired enhancement, known as a tensor neural network, helps companies reduce training costs for a wide range of applications, including autonomous vehicle simulations, natural language processing and even generative AI like ChatGPT.

Tensor networks address one of the constraints faced by companies investing in machine learning: the scale of resources needed to model and train systems representing complex, multi-dimensional entities like fluid dynamics or financial systems. The more complexity a model captures, the more energy, memory and computation time traditional neural networks require for training on classical computers (see “Compute Trends Across Three Eras of Machine Learning”).

Quantum-inspired tensor networks

One of the main characteristics of a neural network is its robustness: when trained on slightly different data sets, a robust network performs similarly. Think of it as wanting your self-driving car to know what to do if a red car is parked on the freeway, even though it has only seen blue cars parked on the freeway during its training.

One way to add robustness is to make neural networks highly redundant by adding more nodes (the building blocks of neural networks), connections and layers. Very complex networks tend to behave the same way even if they have been trained slightly differently. Yet this redundancy increases the number of variables to fit, creating a trade-off between a model’s reliability and the additional time and computational memory required for training.
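
To get a rough feel for that cost, the short calculation below (with hypothetical layer widths, not any particular production model) shows how quickly the number of parameters grows when a small fully connected network is widened for redundancy.

```python
# Rough illustration of how redundancy inflates training cost
# (hypothetical layer widths, not a real production model).
def mlp_param_count(widths):
    """Number of weights and biases in a fully connected network."""
    return sum(w_in * w_out + w_out for w_in, w_out in zip(widths, widths[1:]))

base = mlp_param_count([784, 512, 512, 10])      # ~670,000 parameters
wide = mlp_param_count([784, 1024, 1024, 10])    # ~1,860,000 parameters
print(base, wide, round(wide / base, 2))         # doubling the width nearly triples the cost
```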

Fortunately, tensor networks can be added to the neural network architectures used in deep learning on classical computers. Tensor networks are a mathematical tool that can improve the efficiency of deep learning, performing better on unseen data while being less sensitive to the vagaries of individual data sets. The result is less time and memory required for training, which reduces computational and energy costs while maintaining accuracy and robustness. They are considered quantum-inspired because the method behind them comes from the compression techniques used in quantum mechanics to simulate quantum physical systems.
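
As a concrete, deliberately simplified illustration of the idea, the sketch below replaces a dense layer’s weight matrix with a product of two thin factors, the simplest member of the tensor-network family; full tensor-train layers generalize this to chains of small cores. The class name, sizes and rank are hypothetical choices made for illustration, not Multiverse Computing’s implementation.

```python
import torch.nn as nn

class LowRankLinear(nn.Module):
    """Dense layer whose m x n weight matrix is factored into m x r and r x n pieces.

    A full layer stores roughly m * n weights; the factored layer stores
    about r * (m + n), which is far smaller when the rank r is small.
    """
    def __init__(self, in_features, out_features, rank):
        super().__init__()
        self.first = nn.Linear(in_features, rank, bias=False)   # in -> rank
        self.second = nn.Linear(rank, out_features)             # rank -> out

    def forward(self, x):
        return self.second(self.first(x))

# Drop-in replacement for one dense hidden layer in an ordinary classifier.
model = nn.Sequential(
    nn.Flatten(),
    LowRankLinear(784, 1024, rank=32),
    nn.ReLU(),
    nn.Linear(1024, 10),
)

dense_params = 784 * 1024 + 1024  # what a full nn.Linear(784, 1024) would store
factored_params = sum(p.numel() for p in model[1].parameters())
print(dense_params, factored_params)  # roughly 800,000 vs 59,000 parameters
```

In a full tensor-train layer, the weight tensor is split into a chain of small cores rather than just two factors, but the payoff is the same: far fewer variables to fit during training.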

When added to the architecture of a neural network, tensor networks work by compressing it. They allow users to identify and eliminate irrelevant states, limiting the simulation to represent only what really matters to describe the modeled system. This means using fewer variables to describe complex systems. More importantly, tensor networks don’t just remove irrelevant nodes; they provide an optimal way to reduce redundancy without sacrificing robustness.
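
One way to picture “eliminating irrelevant states” is the truncated singular value decomposition, a close cousin of the truncation tensor networks perform: components with small singular values contribute little to the modeled system and can be dropped. The toy NumPy sketch below uses made-up sizes purely for illustration; it is not the actual compression pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "trained" weight matrix: approximately low rank plus a little noise,
# standing in for the redundancy of an over-parameterized layer.
m, n, hidden_rank = 512, 512, 16
W = rng.standard_normal((m, hidden_rank)) @ rng.standard_normal((hidden_rank, n))
W += 0.01 * rng.standard_normal((m, n))

# Keep only the dominant singular values -- the "states that really matter" --
# and discard the rest, analogous to truncating a tensor network's bond dimension.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
keep = 16
W_compressed = (U[:, :keep] * s[:keep]) @ Vt[:keep, :]

original_params = W.size                # 262,144 numbers
compressed_params = keep * (m + n + 1)  # 16,400 numbers
rel_error = np.linalg.norm(W - W_compressed) / np.linalg.norm(W)
print(original_params, compressed_params, round(rel_error, 4))
```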

Real-world apps

Tensor neural networks make even more sense for problems whose scale exceeds the capacity of today’s deep learning supercomputers. And when incorporated into commonly used dense networks, tensorized networks have been shown to solve problems with the same accuracy in less training time.

Real-world applications for tensor neural networks are already in play. For example, they are currently used in pricing and hedging models for complex financial instruments. Able to adapt to many complex variables and dynamic conditions while remaining controllable, they have been advocated by none other than Igor Halperin, vice president of the AI Asset Management Center at Fidelity Investments and Risk.net’s 2022 Buy-Side Quant of the Year.

Large companies are already making significant investments in tensor neural networks. The technology is currently offered to the general business community through extensions that tensorize popular deep learning suites. Applicable wherever neural networks are used, it is also accessible through a subscription-based model, with providers pushing data or predictions to users.

Tensor neural networks could speed up everything from training to prediction. They are a transitional technology: one that can outperform purely classical approaches today but that will not outperform fully fault-tolerant quantum computers when those come into play in the next five to 10 years.

In the meantime, however, quantum-inspired tensor networks offer companies a way to save time and money on machine learning using today’s mainstream computers. This provides a clear competitive advantage in addressing some of the fundamental issues that are depressing the bottom line.

