AI & AR are Driving Data Demand – Open Source Hardware is Meeting the Challenge


Data is the lifeblood of the digital economy, and as new technologies emerge and evolve, the demand for faster data transfer rates, lower latencies, and higher compute power at data centers is increasing rapidly. New technologies are pushing the boundaries of data transmission and processing, and adopting open source technologies can help data center operators maximize their current operations and prepare for the future. Here are some examples of the technologies driving demand for high-end compute, and some of the ways open source technology, communities, and standards are helping address that demand at scale in a sustainable way.

Artificial Intelligence and Machine Learning

Artificial intelligence (AI) and machine learning (ML) technologies are revolutionizing various domains such as natural language processing, computer vision, speech recognition, recommendation systems, and self-driving cars. AI and ML enable computers to learn from data and perform tasks that normally require human intelligence.

However, AI and ML also require massive amounts of data and compute power to train and run complex models and algorithms. For example, GPT-3, one of the most advanced natural language models in the world, has 175 billion parameters and was trained on a corpus filtered from roughly 45 terabytes of raw text. To process data sets and models at this scale efficiently, AI and ML applications need high-performance computing (HPC) systems that can deliver high-speed data transfer rates, low latencies, and high compute power.
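To make that scale concrete, here is a rough, back-of-the-envelope sketch of why a 175-billion-parameter model cannot live on a single commodity machine. The per-accelerator memory figure is an assumption for illustration, not a vendor spec:

```python
# Rough, illustrative estimate of why a GPT-3-scale model needs HPC-class
# hardware. Figures are approximations, not vendor specifications.

PARAMS = 175e9          # parameters in the model
BYTES_FP16 = 2          # bytes per parameter at 16-bit precision
GPU_MEMORY_GB = 80      # assumed memory of a single high-end accelerator

weights_gb = PARAMS * BYTES_FP16 / 1e9
min_gpus = weights_gb / GPU_MEMORY_GB

print(f"Weights alone: ~{weights_gb:.0f} GB at fp16")
print(f"Accelerators needed just to hold the weights: ~{min_gpus:.0f}")
# Training additionally needs optimizer state and activations, typically
# several times the weight footprint, plus fast interconnects between nodes.
```

Even before training begins, the model weights alone run to hundreds of gigabytes, which is why these workloads are spread across many accelerators connected by high-bandwidth links.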

One of the emerging trends in HPC is the use of specialized processors such as GPUs or TPUs that are optimized for the parallel processing and matrix operations common in AI and ML workloads. For example, NVIDIA’s Grace CPU is an Arm-based processor designed specifically for HPC and AI applications and built to work in tandem with NVIDIA’s GPUs, with NVIDIA claiming up to 10 times the performance of today’s fastest servers on large AI workloads. Grace also supports fast interconnects such as NVLink that enable high-speed data transfer between CPUs and GPUs.
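As a simple illustration of the work these accelerators speed up, the sketch below times a dense matrix multiply, the core primitive of ML workloads. It assumes PyTorch and a CUDA-capable GPU are available; on typical hardware the GPU run is dramatically faster than the same operation on a CPU.

```python
# Minimal sketch of the dense matrix work GPUs accelerate,
# assuming PyTorch and a CUDA-capable GPU are available.
import time
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

start = time.perf_counter()
c = a @ b                      # dense matrix multiply, the core ML primitive
if device == "cuda":
    torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to finish
elapsed = time.perf_counter() - start

print(f"4096x4096 matmul on {device}: {elapsed * 1000:.1f} ms")
```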

Augmented Reality and Virtual Reality

The Apple Vision Pro made waves at its unveiling. Augmented reality (AR) and virtual reality (VR) are two of the most immersive and interactive technologies transforming industries such as entertainment, education, health care, and manufacturing. AR overlays digital information on top of the real world, while VR creates a fully simulated environment that users experience through a headset.

However, these technologies also pose significant challenges for data transfer and processing. Because the Apple Vision Pro was only recently announced, detailed specifications are still scarce; other VR headsets have been on the market for some time, however, so we can make reasonable assumptions. For example, VR headsets such as the Oculus Quest 2 require a high-speed connection to a PC or a cloud server to stream high-quality video and audio content, as well as tracking and input data from the headset and controllers. The video bitrate, which is the amount of data transferred per second, depends on how quickly the GPU can encode the signal on the PC or server side and how quickly the Quest 2 processor can decode it on the headset side.

According to Oculus, the recommended bitrate for VR streaming is between 150 Mbps and 500 Mbps, depending on the resolution and frame rate. This means that VR streaming requires a much higher data transfer rate than other online activities such as web browsing or streaming music. Moreover, VR streaming also requires low latency, which is the time it takes for a signal to travel from one point to another. High latency can cause laggy or jittery gameplay, which ruins the immersion and can cause motion sickness.
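The arithmetic below shows why those bitrates, and the hardware video encoders behind them, matter. The resolution and refresh rate are approximate Quest 2 figures used purely for illustration:

```python
# Illustrative bandwidth arithmetic for PC-to-headset VR streaming.
# Resolution and refresh rate are approximate Quest 2 figures, used only
# to show why hardware video compression is essential.

width, height = 1832, 1920     # pixels per eye (approximate)
eyes = 2
fps = 90                       # refresh rate in Hz (approximate)
bits_per_pixel = 24            # uncompressed RGB

raw_bps = width * height * eyes * fps * bits_per_pixel
encoded_mbps = 150             # low end of the recommended streaming bitrate

print(f"Uncompressed video: ~{raw_bps / 1e9:.1f} Gbps")
print(f"Compression ratio needed at {encoded_mbps} Mbps: "
      f"~{raw_bps / (encoded_mbps * 1e6):.0f}x")
```

Uncompressed, the video signal would be on the order of 15 Gbps, so even the 150–500 Mbps streaming range already assumes the encoder is squeezing the picture by a factor of roughly 30 to 100.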

The latency depends on several factors such as the network speed, the distance between the devices, and the encoding and decoding algorithms. According to Oculus, the ideal latency for VR streaming is below 20 milliseconds. However, achieving this level of performance is not easy, especially over wireless connections such as Wi-Fi or 5G.
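The sketch below walks through a hypothetical motion-to-photon latency budget. The per-stage numbers are illustrative assumptions, not measurements, but they show how quickly a 20-millisecond target is consumed:

```python
# Hypothetical motion-to-photon latency budget for wireless VR streaming.
# Individual figures are illustrative assumptions, not measurements; the
# point is that every stage competes for the same ~20 ms target.

budget_ms = {
    "tracking and game render": 8.0,
    "video encode (GPU)": 4.0,
    "Wi-Fi / network transit": 4.0,
    "video decode (headset)": 3.0,
    "display scan-out": 2.0,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:28s} {ms:4.1f} ms")
print(f"{'total':28s} {total:4.1f} ms  (target: < 20 ms)")
```

Even with modest assumptions for each stage, the total already overshoots the target, which is why wireless links such as Wi-Fi or 5G make this budget so hard to hit.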

Open Source Technologies for Data Center Optimization

As new technologies drive the demand for faster data transfer rates, lower latencies, and higher compute power at data centers, data center operators face challenges such as rising power consumption, more demanding cooling requirements, constrained space, growing operational costs, and a rapid pace of hardware innovation and refresh. To address these challenges, operators need to optimize their current infrastructure and adopt new standards and technologies that can enhance efficiency and scalability.

This is the goal of the Open19 Project, an initiative of the Sustainable and Scalable Infrastructure Alliance (SSIA) that is now part of the Linux Foundation. Open19 is an open standard for data center hardware based on common form factors, providing highly efficient next-generation power distribution, reusable componentry, and opportunities for emerging high-speed interconnects. The SSIA’s mission and the open standards created through the Open19 Project are in step with the larger industry drive toward efficiency, scalability, and sustainability for the infrastructure that powers our digital lives and communities. The Open Compute Project (OCP) is another effort to efficiently support the growing demands on compute infrastructure. It similarly fosters community-driven collaboration among industry partners to develop data center solutions, with a focus on the 21-inch server racks typically used by large colocation providers and hyperscalers. OCP’s scope also extends to the data center facility itself, as well as the internal IT components of the servers.

Conclusion

New technologies are driving the demand for faster data transfer rates, lower latencies, and higher compute power at data centers, while communities, governments, and companies focus on resource management and growing sustainability concerns around water usage, power management, and other carbon-intensive aspects of how technology is built, deployed, and used. Adopting open source technologies developed in community-driven forums like the SSIA and the Linux Foundation can help data center operators maximize their current operations and prepare for a more sustainable future as they meet the demands of these exciting new applications.

Zac Smith, Community Board Member, SSI Alliance.

Zac has been a regular innovator in the cloud and datacenter industry for over 20 years. Most recently, Zac led the digital infrastructure side of Equinix after the company acquired Packet, a company he co-founded in 2014 to provide automated datacenter capabilities to leading digital businesses. Prior to Packet, Zac was an early member of Voxel, a Linux-based cloud company acquired by Internap in 2011.