# FPGA + Multicore MCU = World's Most Energy-Efficient Quantum Simulation

This is the story of how a $99 embedded supercomputer can be used to perform state-of-the-art quantum physics simulations.

The first time I tried to run a quantum mechanical calculation on a computer, I used a humble Sinclair ZX Spectrum+. These days, GFLOPs-scale computational capabilities are available to us in products as unassuming as a smartphone.

Quantum simulations that previously required a huge and expensive supercomputer can now be executed efficiently on an affordable embedded device. This new computational power is a key enabler for STEM (Science, Technology, Engineering, & Mathematics) education. (See also Cabe Atwell's blog: Single-Board Supercomputing Comes Home.)

This allows for a new approach to physics -- not from a purely mathematical point of view, but from a very simple algorithmic focus. We can think of this as an analog vs. digital approach. From my own experience, I can assure you that simulating complex quantum systems by running a little program on your own computer is a truly enlightening experience.

While attending the EE Live! 2014 conference and exhibition earlier this year, I had the opportunity to become acquainted with Adapteva's CEO, Andreas Olofsson. Adapteva broke into the mainstream EE scene with its Parallella board Kickstarter project. This credit card-sized, $99 open-hardware system is widely considered to be the world's most energy-efficient supercomputer.

You can imagine how happy I was to discover that Max Maxfield, my editor at EETimes, was one of the backers of the Parallella project. I became even happier when Max said he would loan me his own brand-new Parallella supercomputer -- the only condition being that I should do something cool with it. The point is that I already knew what I wanted to do with the Parallella.

As a proof of concept, my mission was to run the most energy-efficient quantum simulation ever performed on a classical (non-quantum) computer.

**The Standard Model**

In the context of particle physics, the Standard Model is a theory concerning the electromagnetic, weak, and strong nuclear interactions that mediate the dynamics of the known subatomic particles.

Now take a look at the mug in the photograph below. This is the mug from which I drink my coffee every morning. The equation on this mug reflects a Lagrangian formulation of the Standard Model.

Everything we know about physics (except gravitation) is embodied in this apparently simple equation. Of course, you also need to understand the underlying mathematical and physical concepts, such as the QFT (quantum field theory) framework and the Lagrangian formulation, in order to extract useful information from this expression.

The first line describes the dynamics of all the force fields -- the gauge bosons which carry the force; e.g., the photon, which is the massless particle behind the electromagnetic field.

The second line describes the matter fields. It accounts for fermions and anti-fermions and their coupling to the bosonic fields; e.g., the electron -- the particle from which all our EE technology arises.

The third and fourth lines represent the coupling of matter fields with the Higgs field and the dynamics of the Higgs field itself, respectively. The Higgs field not only accounts for the mass of both the gauge bosons and the matter fermions, but it also hides some other secrets of our universe.
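To make the structure above a little more concrete, here is the QED piece of the Lagrangian written out explicitly (a standard textbook result, in my own notation rather than copied from the mug):

```latex
\mathcal{L}_{\mathrm{QED}}
  = \bar{\psi}\left(i\gamma^{\mu} D_{\mu} - m\right)\psi
  - \tfrac{1}{4} F_{\mu\nu} F^{\mu\nu},
\qquad
D_{\mu} = \partial_{\mu} + i e A_{\mu},
\qquad
F_{\mu\nu} = \partial_{\mu} A_{\nu} - \partial_{\nu} A_{\mu}
```

The first term yields the Dirac equation for the electron field, the covariant derivative couples the electron to the photon field, and the field-strength term reproduces Maxwell's equations.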

**Computational physics**

From the Lagrangian formulation of the Standard Model, we can derive algorithms that calculate predictions that can be verified experimentally. This is how our most accurate theories are tested in huge experimental facilities such as CERN.

The problem is that simulating a quantum field on a classical computer is a very difficult task; it requires massive parallel floating-point computation capabilities. Fortunately, many of the calculations that previously required a supercomputer can now be performed on a conventional desktop or laptop machine.

As an example, the image below is a snapshot from my own Ubuntu workstation running a Python script implementing the Dirac equation. Both the Dirac and Maxwell equations can be derived from the QED (quantum electrodynamics) sector of the Standard Model Lagrangian and expressed as an algorithm in no more than 100 lines of code. Despite this simplicity, the code accounts for spin, quantum mechanics, relativistic effects, anti-matter, and so forth.
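My full script is not reproduced here, but to give you a flavor of the approach, below is a minimal, self-contained sketch (my own illustrative code, not the script from the screenshot) that propagates a free Dirac wave packet in one spatial dimension. In 1+1 dimensions the free Dirac Hamiltonian per Fourier mode is a 2x2 matrix, so each mode can be advanced exactly with a spectral propagator:

```python
import numpy as np

# Free Dirac equation in 1+1 dimensions (natural units, hbar = c = 1):
#   i d(psi)/dt = (alpha * p + beta * m) psi,  alpha = sigma_x, beta = sigma_z
# Each Fourier mode evolves under H(k) = k*sigma_x + m*sigma_z, which we
# exponentiate analytically: exp(-i H dt) = cos(E dt) I - i sin(E dt) H / E,
# with E = sqrt(k^2 + m^2).

N, L, m, dt = 256, 40.0, 1.0, 0.05          # grid points, box size, mass, step
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)  # momentum grid

# Gaussian wave packet with momentum +2 in the upper spinor component
psi = np.zeros((2, N), dtype=complex)
psi[0] = np.exp(-x**2) * np.exp(1j * 2.0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * (L / N))  # normalize to 1

E = np.sqrt(k**2 + m**2)                    # E > 0 everywhere since m > 0
c, s = np.cos(E * dt), np.sin(E * dt) / E

def step(psi):
    """Advance the spinor one time step exactly in momentum space."""
    u, v = np.fft.fft(psi[0]), np.fft.fft(psi[1])
    u2 = c * u - 1j * s * (m * u + k * v)   # rows of exp(-i H dt)
    v2 = c * v - 1j * s * (k * u - m * v)
    return np.array([np.fft.ifft(u2), np.fft.ifft(v2)])

for _ in range(100):
    psi = step(psi)

# The evolution is unitary, so the total probability must stay at 1
norm = np.sum(np.abs(psi)**2) * (L / N)
print(round(norm, 6))
```

The per-mode matrix exponentials are independent, which is exactly the kind of embarrassingly parallel floating-point workload that maps well onto many-core hardware like the Parallella's Epiphany chip.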

I've tested this piece of code on a variety of ARM, i386, and AMD64-based machines, which demonstrates that it can run on any processor with hardware floating-point support, including the one that powers your smartphone. Furthermore, the code can easily be modified to exploit specialized data-crunching hardware (OpenCL, CUDA, etc.).

At this point, it's quite clear what I want to do -- use the awesome power of the Parallella to run an optimized QED simulation.