Researchers at the University of Waterloo have developed a method for measuring the performance of quantum computers, and it could help establish universal standards for the machines.
The new method, called cycle benchmarking, allows researchers to assess how well a quantum computing platform will scale. The method can also be used to compare different quantum platforms against each other.
Joel Wallman is an assistant professor at Waterloo’s Faculty of Mathematics and Institute for Quantum Computing.
“This finding could go a long way toward establishing standards for performance and strengthen the effort to build a large-scale, practical quantum computer,” said Wallman. “A consistent method for characterizing and correcting the errors in quantum systems provides standardization for the way a quantum processor is assessed, allowing progress in different architectures to be fairly compared.”
Cycle benchmarking helps quantum computing users compare competing hardware platforms and increase each platform's capability to deliver solutions for the problems they are working on.
The quantum computing race is now apparent all around the world. The number of cloud quantum computing platforms and offerings is rising, and major companies like Microsoft, IBM, and Google are constantly developing new technology.
The cycle benchmarking method works by determining the total probability of error for any given quantum computing application when that application is implemented through randomized compiling. Cycle benchmarking provides the first cross-platform means of measuring and comparing the capabilities of quantum processors, and it can be customized to the applications that users are working on.
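As a rough illustration of the idea behind this style of benchmarking (a sketch, not the paper's actual protocol), one can estimate a per-cycle error rate by measuring how a signal decays as a fixed cycle of gates is repeated more and more times. Under randomized compiling, errors behave like depolarizing noise, so the signal decays roughly exponentially. The numbers and names below are hypothetical:

```python
import math

# Hypothetical per-cycle process fidelity we want to recover.
TRUE_FIDELITY = 0.98

# Simulated survival probabilities after m repetitions of the cycle:
# with depolarizing-like noise the signal decays as f**m.
depths = [4, 8, 16, 32]
signals = [TRUE_FIDELITY ** m for m in depths]

# Recover the per-cycle fidelity with a log-linear least-squares fit.
xs = depths
ys = [math.log(s) for s in signals]
n = len(xs)
slope = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2
)
estimated_fidelity = math.exp(slope)

print(f"estimated per-cycle fidelity: {estimated_fidelity:.4f}")
# The total error probability of the cycle is roughly 1 - fidelity.
print(f"estimated per-cycle error: {1 - estimated_fidelity:.4f}")
```

Because the decay rate is fit rather than measured at a single depth, the estimate is robust to state-preparation and measurement errors, which is one reason decay-based benchmarks are favored in practice.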
Joseph Emerson is a faculty member at IQC.
“Thanks to Google’s recent achievement of quantum supremacy, we are now at the dawn of what I call the ‘quantum discovery era,’” Emerson said. “This means that error-prone quantum computers will deliver solutions to interesting computational problems, but the quality of their solutions can no longer be verified by high-performance computers.
“We are excited because cycle benchmarking provides a much-needed solution for improving and validating quantum computing solutions in this new era of quantum discovery.”
Emerson and Wallman founded Quantum Benchmark Inc., an IQC spin-off that has licensed the technology to world-leading companies in the quantum computing field, including Google’s Quantum AI effort.
Quantum mechanics is what makes quantum computers such powerful machines: they are capable of solving complex problems more efficiently than traditional digital computers.
Qubits are the basic processing units of a quantum computer, but they are extremely fragile. Any imperfection or source of noise in the system can introduce errors that cause incorrect solutions during a quantum computation.
The first step toward practical quantum computing is gaining control over a small-scale quantum computer with one or two qubits. A larger quantum computer could perform more complex tasks such as machine learning or the simulation of complex systems, which could lead to advancements like the discovery of new pharmaceutical drugs. The problem is that engineering a larger quantum computer is more challenging, and the possibility of error grows as qubits are added and the quantum system scales.
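A simple back-of-the-envelope model (an illustrative assumption, not the researchers' error model) shows why scaling is so hard: if each gate fails independently with some small probability, the chance that an entire circuit runs without error shrinks exponentially with circuit size.

```python
# Toy model: each gate fails independently with probability p,
# so a circuit of n gates succeeds with probability (1 - p)**n.
gate_error = 0.001  # hypothetical 0.1% error per gate

for n_gates in (100, 1_000, 10_000):
    success = (1 - gate_error) ** n_gates
    print(f"{n_gates:>6} gates -> circuit success probability {success:.4f}")
```

Even a 0.1% per-gate error leaves a 10,000-gate circuit with almost no chance of finishing correctly, which is why characterizing and suppressing errors is central to scaling up.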
Characterizing a quantum system produces a profile of its noise and errors, which indicates whether the processor is performing the calculations it is asked to do. All significant errors need to be characterized in order to understand the performance of a quantum computer or to scale it up.
Wallman, Emerson, and a group of researchers at the University of Innsbruck came up with a method to assess all error rates affecting a quantum computer. The new technique was implemented on the ion-trap quantum computer at the University of Innsbruck, and it found that error rates do not rise as that quantum computer scales up.
“Cycle benchmarking is the first method for reliably checking if you are on the right track for scaling up the overall design of your quantum computer,” said Wallman. “These results are significant because they provide a comprehensive way of characterizing errors across all quantum computing platforms.”
AlphaZero Algorithm Applied to Quantum Computing
Quantum computing has become more of a focus over the last few years. Researchers and companies throughout the world are constantly working on developing this technology, which can solve extremely complicated problems that are too advanced for classical computers.
Quantum computers utilize quantum mechanics, which is a branch of physics that focuses on the smallest building blocks of our universe. One of the fundamental rules is that a system can exist in more than one state at a time.
These rules get translated into computer language, and a quantum computer is able to perform multiple calculations at the same time. This means that a quantum computer can perform much faster than regular computers.
The theory of quantum computers has been established, but there has yet to be a full-scale quantum computer created.
AlphaZero is capable of learning on its own without any intervention from humans. Because of this, the algorithm has been able to defeat both humans and complex computer programs in difficult games like Go, shogi, and chess. AlphaZero accomplished this by competing against itself and improving over time.
The algorithm was able to beat the leading chess program Stockfish after playing against itself for just four hours. After that impressive performance, Danish grandmaster Peter Heine Nielsen compared AlphaZero to a superior alien species.
A research group at Aarhus University has used computer simulations to demonstrate how AlphaZero can be applied to three different control problems that could eventually arise in a quantum computer.
“AlphaZero employs a deep neural network in conjunction with deep lookahead in a guided tree search, which allows for predictive hidden-variable approximation of the quantum parameter landscape. To emphasize transferability, we apply and benchmark the algorithm on three classes of control problems using only a single common set of algorithmic hyperparameters,” according to the study.
The team's research was published in npj Quantum Information.
Lead Ph.D. student Mogens Dalgaard spoke about how the team was impressed with AlphaZero’s ability to quickly teach itself.
“When we analyzed the data from AlphaZero we saw that the algorithm had learned to exploit an underlying symmetry of the problem that we did not originally consider. That was an amazing experience.”
The real breakthrough came from pairing AlphaZero, which is an extremely impressive algorithm on its own, with a specialized quantum optimization algorithm.
According to Professor Jacob Sherson, “This indicates that we are still in need of human skill and expertise, and that the goal of the future should be to understand and develop hybrid intelligence interfaces that optimally exploit the strengths of both.”
The group wants to quicken the pace of development within the field, so they released the code and made it openly available. The move generated a lot of interest.
“Within a few hours I was contacted by major tech companies with quantum laboratories and internationally leading universities to establish future collaboration,” Jacob Sherson said, “so it will probably not be long until these methods find use in practical experiments across the world.”
DeepMind is a UK-based Google sister company responsible for both AlphaZero and AlphaGo. These systems are now showing their importance in other areas, including quantum computing.
Optical Switch Can Reroute Light Between Chips Extremely Fast
Researchers at the National Institute of Standards and Technology (NIST) have developed an optical switch that is capable of rerouting light between computer chips within 20 billionths of a second. The new device is faster than any similar devices, and it could be integrated into low-cost silicon chips due to its low voltages. When it redirects light, the chip suffers very low signal loss.
The new chip will have big implications for computing, helping to develop a computer that processes information using light rather than electricity. Using photons to transport data has several advantages, including faster travel and greater energy efficiency. Electricity heats up computer components, which wastes energy and limits computer performance.
The newly developed switch uses nanometer-scale gold and silicon optical, electrical, and mechanical components, densely packed to steer light into and out of a channel, controlling its speed and direction of travel.
The device was described by the NIST-led international team in Science.
According to co-author Christian Haffner of NIST, ETH Zurich and the University of Maryland, the switch has a lot of potential applications. It could be used in driverless vehicles to redirect light beams that scan a roadway in order to measure the distance to other vehicles and pedestrians. The switch could also be used within neural networks, utilizing more powerful light-based circuits rather than electricity-based ones.
One of the major benefits of the new switch is that it uses very little energy to redirect light signals, which could be extremely important in quantum computing. A quantum computer processes data using fragile relationships between pairs of subatomic particles. Because of this fragility, the computer needs to operate at extremely low temperatures and low power so the particle pairs are not disturbed. Since the newly developed switch requires far less energy, it could prove to be an important component of quantum computing.
Challenging Long Held Beliefs
According to Haffner and his colleagues Vladimir Aksyuk and Henri Lezec of NIST, the new findings contradict many long-held beliefs within the scientific community. Many researchers believed that these types of switches would not be practical because they would be too bulky and would require high voltages, causing slow performance.
The setup includes a tube-shaped channel called a waveguide, and a light beam travels inside of it. There is an off-ramp where some of the light exits into a cavity that is a few nanometers away.
The switch also utilizes a thin gold membrane suspended a few tens of nanometers above a silicon disk, into which the cavity is etched. As light travels around the disk, some of it leaks out and hits the membrane, inducing groups of electrons on the membrane's surface to oscillate. These oscillations, called plasmons, are a hybrid of a light wave and an electron wave. Because the oscillating electrons have a shorter wavelength, researchers can manipulate the plasmons over nanoscale distances, which helps the optical switch remain extremely compact.
If the researchers change the gap between the silicon disk and the gold membrane by a few nanometers, the phase of the hybrid light wave is delayed or advanced. When this wave recombines with the light traveling in the tube-shaped channel, the two beams interfere so that the light is either blocked or continues in its original direction. This allows the light to be redirected to other computer chips at will.
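The switching mechanism described above is two-beam interference. As a minimal sketch (a textbook idealization, not the NIST device's actual equations), the fraction of light transmitted when two equal beams recombine follows cos²(Δφ/2), where Δφ is the phase shift imposed on the cavity wave:

```python
import math

# Idealized two-beam interference: two equal-amplitude waves recombine,
# and the transmitted intensity fraction is cos^2(phase_shift / 2).
def transmitted_fraction(phase_shift_rad: float) -> float:
    return math.cos(phase_shift_rad / 2) ** 2

# In phase: the beams add constructively and the light continues.
print(transmitted_fraction(0.0))
# Half a wavelength out of phase: destructive interference blocks the light.
print(transmitted_fraction(math.pi))
```

Moving the membrane by a few nanometers is enough to sweep Δφ between these two extremes, which is what lets such a tiny mechanical motion act as an on/off switch for the beam.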
The team’s next steps involve shortening the distance between the silicon disk and the gold membrane in order to make the device smaller. This would help further reduce signal loss, making the switch even more useful to different industries.
Reporting Shows Google Has Claimed to Achieve “Quantum Supremacy”
In what could be the biggest development in quantum computing, Google has claimed that it reached “quantum supremacy.” This comes as John Martinis, along with a group of researchers from Google, demonstrated their new abilities. The news was first picked up by the Financial Times. The original paper that detailed these new developments was first published on a NASA website, but it was quickly taken down before being picked up by the media.
Quantum supremacy is defined as the point at which a quantum computer is able to perform tasks that are well beyond the capabilities of the most powerful conventional supercomputer. According to the paper that was published, “this experiment marks the first computation that can only be performed on a quantum processor.”
The title of the published paper was “Quantum supremacy using a programmable superconducting processor.”
According to MIT Technology Review, Google and NASA entered into an agreement last year that allowed Google to use supercomputers held by NASA. The reporting says the published paper stated that Google’s quantum processor was able to calculate and process data at an incredible rate: a calculation that would take Summit, the world’s most advanced supercomputer, about 10,000 years was completed by Google’s quantum processor in 3 minutes and 20 seconds. Google is the only known company to have achieved this milestone.
The reason for the increased focus on quantum machines and their power is that they are capable of harnessing quantum bits (qubits), rather than just classical bits. Classical bits are either a 1 or a 0, while quantum bits are able to be a combination of both at the same time. Along with other features, this enables quantum computers to be able to process extraordinary amounts of data in a small amount of time. Regular supercomputers are required to go through the process in sequence. This has been a focus of scientists for years, as they recognize the enormous potential that these machines bring to our world.
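A minimal sketch of this idea (standard textbook notation, unrelated to Google's processor specifically): a qubit is described by two amplitudes whose squared magnitudes give the measurement probabilities, and describing n qubits classically requires 2**n amplitudes, which grows explosively.

```python
import math

# A single qubit as two amplitudes (alpha, beta), normalized so that
# |alpha|**2 + |beta|**2 == 1. These values give an equal superposition.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1
print(p0, p1)  # 0.5 each: the qubit is "both" values until measured

# Describing n qubits classically takes 2**n amplitudes, which is why
# conventional supercomputers struggle to simulate large quantum systems.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```

The exponential growth of the state description, combined with interference between amplitudes, is what gives quantum processors their potential edge on certain problems; it is not simply a matter of trying all answers in parallel.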
There has been disagreement in the technology community over whether this is as big a deal as it seems, as reported by the Financial Times. There is no doubt that Google has built the first quantum computer able to outperform the most powerful supercomputer in the world, but some believe it is not as big a milestone as it is being made out to be.
According to Dario Gil, director of IBM Research, Google's claim is indefensible and wrong; he said the work done by Google is “a laboratory experiment designed to essentially — and almost certainly exclusively — implement one very specific quantum sampling procedure with no practical applications.”
There are also supporters of the claim of “quantum supremacy,” like Chad Rigetti, a former IBM researcher who founded Rigetti Computing. He says these developments are a profound moment for both quantum computing and humanity.
The major tech companies like IBM, Google, Intel, and Microsoft have all been investing huge amounts of money and resources into this field. They are all looking to be the leader, as Google has claimed they have done. This technology will have huge implications in almost every aspect of society and industry including artificial intelligence, security, communication, and science.
One can say that Google's exact claim of achieving “quantum supremacy” is as striking as the language itself. With tech companies gaining ever more power, they hold considerable sway over our societies. Whether or not Google has truly achieved “quantum supremacy,” there is no doubt that whoever does will hold the keys to the future of quantum computing, artificial intelligence, and processing thousands of years of data in minutes.