Artificial intelligence (AI) systems require enormous amounts of computing power, a demand that is roughly doubling every three to four months. The cloud-computing data centers that run AI and machine learning applications already consume more electricity per year than some small countries, and many researchers warn that this trajectory is unsustainable.
A team led by the University of Washington has developed a potential solution: new optical computing hardware for AI and machine learning that is faster and far more energy efficient than conventional electronics. The design also addresses the 'noise' inherent in optical computing, which can interfere with computing precision.
The research was published on January 21 in Science Advances.
Using Noise as Input
In the paper, the team demonstrated how an optical computing system for AI and machine learning could use some of that noise as input to enhance the creative output of the artificial neural network (ANN) within the system.
Changming Wu is a UW doctoral student in electrical and computer engineering and lead author of the paper.
“We've built an optical computer that is faster than a conventional digital computer,” said Wu. “And also, this optical computer can create new things based on random inputs generated from the optical noise that most researchers tried to evade.”
Optical computing noise is caused by stray light particles, or photons. These are produced by the lasers within the device and background thermal radiation. In order to target noise, the team connected their optical computing core to a generative adversarial network (GAN). They then tested different noise mitigation techniques, such as using some of the generated noise as random inputs for the GAN.
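In a GAN, the generator's random latent input is exactly the role the optical noise can play. The following is a minimal pure-Python sketch of that idea only (the weights are untrained stand-ins, and the Gaussian samples merely simulate the photon noise; this is not the team's hardware or model): different noise vectors fed through the same generator produce different outputs.

```python
import random
import math

def generator(noise, weights, bias):
    """Toy 'generator': one linear layer plus tanh, mapping a
    random noise vector (the latent input) to an output vector."""
    out = []
    for row, b in zip(weights, bias):
        z = sum(w * n for w, n in zip(row, noise)) + b
        out.append(math.tanh(z))
    return out

random.seed(0)
latent_dim, out_dim = 4, 3

# Fixed (untrained) weights stand in for a trained generator.
weights = [[random.gauss(0, 1) for _ in range(latent_dim)]
           for _ in range(out_dim)]
bias = [0.0] * out_dim

# In the optical system this randomness would come from stray
# photons; here we simulate it with Gaussian samples.
noise_a = [random.gauss(0, 1) for _ in range(latent_dim)]
noise_b = [random.gauss(0, 1) for _ in range(latent_dim)]

sample_a = generator(noise_a, weights, bias)
sample_b = generator(noise_b, weights, bias)
# Distinct noise inputs yield distinct generated samples.
```

The key point is that the randomness a GAN needs anyway is supplied "for free" by the hardware, rather than having to be generated and then filtered out.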
The team tasked the GAN with learning to handwrite the number '7' the way a person would: it first observed visual samples of handwriting, then practiced over and over until it could generate digital images in a style resembling those samples.
Mo Li is a UW professor of electrical and computer engineering and senior author of the paper.
“Instead of training the network to read handwritten numbers, we trained the network to learn to write numbers, mimicking visual samples of handwriting that it was trained on,” said Li. “We, with the help of our computer science collaborators at Duke University, also showed that the GAN can mitigate the negative impact of the optical computing hardware noises by using a training algorithm that is robust to errors and noises. More than that, the network actually uses the noises as random input that is needed to generate output instances.”
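Li mentions a training algorithm that is robust to errors and noise. One common way to achieve this (an assumption here for illustration, not necessarily the team's exact algorithm) is noise injection: perturbing the model's computations during training so the learned weights tolerate the same imprecision at inference time. A toy sketch, fitting y = 2x with a simulated noisy multiply:

```python
import random

random.seed(1)

def noisy_mul(w, x, sigma=0.05):
    """Simulate an imprecise (e.g. optical) multiply:
    the true product plus a small Gaussian perturbation."""
    return w * x + random.gauss(0, sigma)

# Fit y = 2*x by gradient descent, with hardware-like noise
# injected into every forward pass during training.
w = 0.0
lr = 0.1
data = [(x, 2.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

for _ in range(200):
    for x, y in data:
        pred = noisy_mul(w, x)          # noisy forward pass
        grad = 2.0 * (pred - y) * x     # squared-loss gradient
        w -= lr * grad

# Despite the injected noise, w settles near the true value 2.
```

Because the model only ever sees noisy computations during training, it cannot rely on exact arithmetic, which is what makes the learned solution robust on imprecise hardware.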
As the GAN continued practicing, it developed its own distinctive writing style and was eventually able to write the numbers one through 10 in computer simulations.
Building a Larger-Scale Device
The team now plans to build the device at a larger scale using current semiconductor manufacturing technology, which should improve performance and allow it to tackle more complex tasks.
“This optical system represents a computer hardware architecture that can enhance the creativity of artificial neural networks used in AI and machine learning, but more importantly, it demonstrates the viability for this system at a large scale where noise and errors can be mitigated and even harnessed,” Li said. “AI applications are growing so fast that in the future, their energy consumption will be unsustainable. This technology has the potential to help reduce that energy consumption, making AI and machine learning environmentally sustainable — and very fast, achieving higher performance overall.”