Taking Stock of New Data Center Computing Paradigm


Maurice Steinman (Source: Lightelligence)

The semiconductor industry is once again at an inflection point, enjoying a renaissance fueled by disruptive artificial intelligence (AI) applications, such as autonomous driving and large language models (LLMs). LLMs like OpenAI's ChatGPT have suddenly become prevalent in everyday conversations, as the chatbot's sophistication seems to have crossed a threshold of conversational capability. The potential for integrating this technology into various services is seemingly limitless.

This level of intelligence requires analyzing and extracting information from data at enormous scale, demanding powerful storage, transmission, and processing capabilities that strain existing data centers and edge devices. Finances Online estimates that global data consumption grew from 74 zettabytes (one zettabyte equals one sextillion bytes) in 2021 to 94 zettabytes in 2022, and will reach 118 and 149 zettabytes in 2023 and 2024, respectively. The numbers are staggering, and current data-center compute power is struggling to keep up.
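To put those figures in perspective, a quick back-of-the-envelope calculation (using the Finances Online estimates cited above; the 2023 and 2024 values are projections) shows the implied compound annual growth rate:

```python
# Global data consumption estimates cited in the article, in zettabytes.
# The 2023 and 2024 entries are projections, not measurements.
data_zb = {2021: 74, 2022: 94, 2023: 118, 2024: 149}

# Compound annual growth rate (CAGR) over the 2021-2024 span.
span_years = max(data_zb) - min(data_zb)
cagr = (data_zb[2024] / data_zb[2021]) ** (1 / span_years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 26% per year
```

Growth of roughly 26 percent per year means data volume doubles about every three years, which helps explain why incremental improvements in conventional data-center hardware are falling behind.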

Read More from EE Times at:
