Researchers at the National Institute of Standards and Technology (NIST) have developed an optical switch that can reroute light between computer chips in 20 billionths of a second. The new device is faster than any comparable switch, and because it operates at low voltages it could be integrated into low-cost silicon chips. It also loses very little signal when it redirects light.
The new chip could have big implications for computing, bringing researchers closer to a computer that processes information with light rather than electricity. Transporting data with photons offers several advantages, including faster travel and greater energy efficiency. Electricity heats up computer components, which wastes energy and limits performance.
The newly developed switch uses nanometer-scale gold and silicon components, optical, electrical, and mechanical, that are densely packed together. They steer light into and out of a channel, altering its speed and direction of travel.
The device was described by the NIST-led international team in Science.
According to co-author Christian Haffner of NIST, ETH Zurich and the University of Maryland, the switch has a lot of potential applications. It could be used in driverless vehicles to redirect light beams that scan a roadway in order to measure the distance to other vehicles and pedestrians. The switch could also be used within neural networks, utilizing more powerful light-based circuits rather than electricity-based ones.
One of the major benefits of the new switch is that it uses very little energy to redirect light signals, which could prove extremely important in quantum computing. A quantum computer processes data using fragile relationships between pairs of subatomic particles, so it must operate at extremely low temperatures and low power to avoid disturbing those pairs. Because the newly developed switch requires far less energy, it could become an important component of quantum computing hardware.
Challenging Long Held Beliefs
According to Haffner, along with his colleagues Vladimir Aksyuk and Henri Lezec of NIST, the new findings contradict long-held beliefs within the scientific community. Many researchers had assumed that switches of this kind would be impractical: too bulky, and requiring high voltages that would make them slow.
The setup includes a tube-shaped channel called a waveguide, through which a light beam travels. At an off-ramp, some of the light exits into a cavity that is just a few nanometers away.
The switch also uses a thin gold membrane suspended a few tens of nanometers above a silicon disk, into which the cavity is etched. As light circulates in the cavity, some of it leaks out and strikes the membrane, causing groups of electrons on the membrane’s surface to oscillate. These oscillations, called plasmons, are a hybrid of a light wave and an electron wave. Because the plasmons have a much shorter wavelength than the original light, researchers can manipulate them over nanoscale distances, which keeps the optical switch extremely compact.
By changing the gap between the silicon disk and the gold membrane by just a few nanometers, the researchers can delay or advance the phase of the hybrid light wave. When this wave recombines with the light still traveling in the waveguide, the two beams interfere, either blocking the light or letting it continue in its original direction. This lets the light be routed to other computer chips at will.
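The switching mechanism described above amounts to two-beam interference controlled by a phase shift. The following toy sketch (illustrative only; the equal amplitudes and normalization are assumptions, not measured device parameters) shows how the relative phase decides whether light passes or is blocked:

```python
import numpy as np

def output_power(phase_shift_rad):
    """Normalized output power when two equal-amplitude beams recombine."""
    # phase shift 0  -> constructive interference: light continues onward
    # phase shift pi -> destructive interference: light is blocked
    combined = 1.0 + np.exp(1j * phase_shift_rad)
    return np.abs(combined) ** 2 / 4.0  # scaled so full transmission = 1.0

print(output_power(0.0))    # 1.0 (switch "on")
print(output_power(np.pi))  # ~0.0 (switch "off")
```

Nanometer-scale changes in the gap translate into phase shifts between these two extremes, which is what makes such a small mechanical motion sufficient to switch the light.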
The team’s next steps involve shortening the distance between the silicon disk and the gold membrane in order to make the device smaller. This would help further reduce signal loss, making the switch even more useful to different industries.
Researchers Develop Method for Measuring Quantum Computers
Researchers at the University of Waterloo have developed a method for measuring the performance of quantum computers, and it could help establish universal standards for the machines.
The new method is called cycle benchmarking, and researchers use it to assess a machine’s potential for scalability and to compare different quantum platforms against each other.
Joel Wallman is an assistant professor at Waterloo’s Faculty of Mathematics and Institute for Quantum Computing.
“This finding could go a long way toward establishing standards for performance and strengthen the effort to build a large-scale, practical quantum computer,” said Wallman. “A consistent method for characterizing and correcting the errors in quantum systems provides standardization for the way a quantum processor is assessed, allowing progress in different architectures to be fairly compared.”
Cycle benchmarking lets quantum computing users compare competing hardware platforms and increase each platform’s capability to solve the problems they are working on.
The quantum computing race is now apparent around the world. The number of cloud quantum computing platforms and offerings is rising, and major companies like Microsoft, IBM, and Google are constantly developing new technology.
The cycle benchmarking method works by determining the total probability of error for any given quantum computing application when that application is implemented through randomized compiling. Cycle benchmarking provides the first cross-platform means of measuring and comparing the capabilities of quantum processors, and it can be customized to the applications users are working on.
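The general idea behind this style of benchmarking can be illustrated with a toy decay fit (a hedged sketch, not the published protocol; the fidelity model and all numbers here are invented for illustration): repeat a cycle of operations m times, measure how fidelity decays with m, and fit an exponential to extract an error rate per cycle.

```python
import numpy as np

rng = np.random.default_rng(0)
true_p = 0.98  # hypothetical per-cycle process fidelity

# Simulated measurements: fidelity decays toward 0.5 as cycles repeat.
lengths = np.array([2, 4, 8, 16, 32, 64])
fidelities = 0.5 + 0.5 * true_p ** lengths
fidelities += rng.normal(0, 0.002, size=lengths.size)  # measurement noise

# Fit log(F - 0.5) = log(0.5) + m * log(p) by least squares.
slope, _ = np.polyfit(lengths, np.log(fidelities - 0.5), 1)
estimated_error_per_cycle = 1.0 - np.exp(slope)
print(f"estimated error per cycle: {estimated_error_per_cycle:.4f}")
```

The appeal of decay-based estimates like this is that they characterize the error rate of a whole cycle at once, rather than requiring every individual gate to be measured separately.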
Joseph Emerson is a faculty member at IQC.
“Thanks to Google’s recent achievement of quantum supremacy, we are now at the dawn of what I call the ‘quantum discovery era’,” Emerson said. “This means that error-prone quantum computers will deliver solutions to interesting computational problems, but the quality of their solutions can no longer be verified by high-performance computers.
“We are excited because cycle benchmarking provides a much-needed solution for improving and validating quantum computing solutions in this new era of quantum discovery.”
Emerson and Wallman founded Quantum Benchmark Inc., an IQC spin-off that has licensed the technology to world-leading companies in the quantum computing field, including Google’s Quantum AI effort.
Quantum mechanics is what makes quantum computers such powerful machines: they can solve complex problems far more efficiently than traditional digital computers.
Qubits are the basic processing unit of a quantum computer, but they are extremely fragile. Any imperfection or source of noise in the system can introduce errors that lead to incorrect solutions during a quantum computation.
The first step toward advancing quantum computing is gaining control over a small-scale quantum computer with one or two qubits. A larger quantum computer could perform more complex tasks such as machine learning or the simulation of complex systems, which could lead to advances like the discovery of new pharmaceutical drugs. The problem is that engineering a larger quantum computer is more challenging, and the possibility of error grows as qubits are added and the quantum system scales.
Characterizing a quantum system produces a profile of its noise and errors, indicating whether the processor is actually performing the calculations it is asked to do. All significant errors must be characterized in order to understand a quantum computer’s performance or to scale it up.
Wallman, Emerson, and a group of researchers at the University of Innsbruck developed a method to assess all error rates affecting a quantum computer. The new technique was implemented on the ion-trap quantum computer at the University of Innsbruck, where it showed that error rates do not rise as the quantum computer scales up.
“Cycle benchmarking is the first method for reliably checking if you are on the right track for scaling up the overall design of your quantum computer,” said Wallman. “These results are significant because they provide a comprehensive way of characterizing errors across all quantum computing platforms.”
Reporting Shows Google Has Claimed to Achieve “Quantum Supremacy”
In what could be the biggest development in quantum computing to date, Google has claimed that it reached “quantum supremacy.” The claim comes from John Martinis and a group of Google researchers, and the news was first picked up by the Financial Times. The paper detailing the results was briefly published on a NASA website and quickly taken down, but not before being picked up by the media.
Quantum supremacy is defined as the point at which a quantum computer can perform tasks well beyond the capabilities of the most powerful conventional supercomputer. According to the published paper, “this experiment marks the first computation that can only be performed on a quantum processor.”
The title of the published paper was “Quantum supremacy using a programmable superconducting processor.”
According to MIT Technology Review, Google and NASA entered into an agreement last year that allowed Google to use NASA’s supercomputers. According to the published paper, Google’s quantum processor completed in 3 minutes and 20 seconds a calculation that would take Summit, the world’s most advanced supercomputer, about 10,000 years. Google is the only known company to have achieved this milestone.
The increased focus on quantum machines stems from their ability to harness quantum bits (qubits) rather than just classical bits. A classical bit is either a 1 or a 0, while a qubit can be in a combination of both at the same time. Along with other features, this enables quantum computers to process extraordinary amounts of data in a short time, where regular supercomputers must work through the same process in sequence. Scientists have pursued this capability for years, recognizing the enormous potential these machines bring to our world.
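The contrast between classical bits and qubits can be made concrete with a minimal statevector sketch (a simplified illustration, not production quantum software): n qubits carry amplitudes over all 2**n basis states at once, whereas n classical bits hold exactly one of those states.

```python
import numpy as np

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # register starts in the single classical state |000>

# Applying a Hadamard gate to every qubit spreads the amplitude
# evenly over all 2**n basis states simultaneously.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
for qubit in range(n):
    op = np.array([[1.0]])
    for k in range(n):
        op = np.kron(op, H if k == qubit else np.eye(2))
    state = op @ state

probs = np.abs(state) ** 2
print(probs)  # each of the 8 outcomes has probability 0.125
```

The exponential size of `state` is also why classical simulation of quantum computers breaks down at scale: doubling the qubit count squares the amount of memory needed.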
There has been disagreement within the technology community as to whether this is as big of a deal as it seems, as reported by the Financial Times. There is little doubt that Google has built the first quantum computer able to outperform the world’s most powerful supercomputer on this task, but some believe the milestone is not as significant as it is being made out to be.
According to Dario Gil, the head researcher at IBM, what Google is doing is both indefensible and wrong, and he said that the work done by Google is “a laboratory experiment designed to essentially — and almost certainly exclusively — implement one very specific quantum sampling procedure with no practical applications.”
There are also supporters of the “quantum supremacy” claim, like Chad Rigetti, a former IBM executive, who says these developments are a profound moment for both quantum computing and humanity.
Major tech companies like IBM, Google, Intel, and Microsoft have all been investing huge amounts of money and resources into this field, each looking to lead it, as Google now claims to have done. This technology will have huge implications for almost every aspect of society and industry, including artificial intelligence, security, communication, and science.
One could say that Google’s claim of “quantum supremacy” is as unsettling as the term itself. As tech companies gain ever more power, they hold considerable sway over our societies. Whether or not Google has truly achieved “quantum supremacy,” there is no doubt that whoever does will hold the keys to the future of quantum computing, artificial intelligence, and computations that would otherwise take thousands of years.
Scientists Working On Bringing Quantum Computer Properties to Classic Computers
A group of scientists from Linköping University has demonstrated how a quantum computer works by simulating its properties on a classical computer.
“Our results should be highly significant in determining how to build quantum computers,” Professor Jan-Åke Larsson said.
Sweden, Europe, and other parts of the world are investing substantial resources and research effort into creating superfast, powerful quantum computers. A Swedish quantum computer is expected to be built within ten years, and the EU has designated quantum technology as one of its major projects.
Currently, few useful algorithms exist for quantum computers. Even so, the technology is expected to be extremely important for simulating biological, chemical, and physical systems that are too complex for even the most powerful computers available today. In a classical computer, a bit takes the value one or zero, but a quantum bit can exist in a superposition of both. As a result, quantum computers need far fewer operations for each calculation.
Professor Jan-Åke Larsson and his doctoral student Niklas Johansson, from the Division for Information Coding at the Department of Electrical Engineering, Linköping University, have figured out much of why a quantum computer is more powerful than a classic one. They have also looked into what happens within a quantum computer.
The results of the research have been published in the scientific journal Entropy.
“We have shown that the major difference is that quantum computers have two degrees of freedom for each bit. By simulating an additional degree of freedom in a classical computer, we can run some of the algorithms at the same speed as they would achieve in a quantum computer,” says Jan-Åke Larsson.
The team has created a simulation tool called Quantum Simulation Logic (QSL), which allows them to simulate the operation of a quantum computer on a classical computer. QSL captures one specific property, and only one, that a quantum computer has and a classical computer does not: one extra degree of freedom for each bit that takes part in the calculation.
“Thus, each bit has two degrees of freedom: it can be compared with a mechanical system in which each part has two degrees of freedom — position and speed. In this case, we deal with computation bits — which carry information about the result of the function, and phase bits — which carry information about the structure of the function,” Jan-Åke Larsson explains.
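A toy rendition of that idea (the class and gate names below are illustrative inventions, not the actual QSL implementation) pairs each simulated bit with a second, phase degree of freedom and lets different gates act on one or the other:

```python
from dataclasses import dataclass

@dataclass
class TwoDofBit:
    value: int  # computation bit: carries the result of the function
    phase: int  # phase bit: carries the structure of the function

def x_gate(b: TwoDofBit) -> TwoDofBit:
    """Bit flip: acts on the computation degree of freedom."""
    return TwoDofBit(b.value ^ 1, b.phase)

def z_gate(b: TwoDofBit) -> TwoDofBit:
    """Phase flip: acts only on the extra phase degree of freedom."""
    return TwoDofBit(b.value, b.phase ^ 1)

b = z_gate(x_gate(TwoDofBit(value=0, phase=0)))
print(b)  # TwoDofBit(value=1, phase=1)
```

In a plain classical simulation only `value` would exist; tracking `phase` alongside it is the single extra ingredient the researchers identify.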
The team has used the QSL tool to study some of the quantum algorithms that manage the structure of the function. Many of those algorithms run as fast in the simulations as they would on a quantum computer.
“The result shows that the higher speed in quantum computers comes from their ability to store, process and retrieve information in one additional information-carrying degree of freedom. This enables us to better understand how quantum computers work. Also, this knowledge should make it easier to build quantum computers, since we know which property is most important for the quantum computer to work as expected,” says Jan-Åke Larsson.
The team has also built a physical version from electronic components, using gates similar to those in quantum computers. The toolkit simulates how a quantum computer works, allowing students and others to explore quantum cryptography and quantum teleportation, among other aspects of quantum computing.
This new research can add to the increasing crossover between quantum computing and artificial intelligence. One of these crossovers is feature mapping. Other research conducted by IBM Research, MIT, and Oxford scientists has shown that as quantum computers become more powerful, they will be able to perform feature mapping on highly complex data structures, something classical computers can’t do. Feature mapping is important within machine learning, and it can lead to more effective AI that could identify patterns in data that classical computers are unable to detect.
As research in these fields continues, the crossover between the two areas will only grow.