ZEROGPU OFFERS FREE NVIDIA GPUS FOR NEURAL NETWORKS

Hugging Face, known for its open solutions in the field of artificial intelligence, has announced the launch of the ZeroGPU program. With a budget of $10 million, the initiative aims to make computing capacity based on older NVIDIA graphics processors available for public use. The primary goal of the program is to ease the financial burden that small developer teams face when building AI models.

Hugging Face CEO Clem Delangue personally announced the launch of ZeroGPU, pointing out that the open-source community lacks the resources available to large tech companies, which is why applications like ChatGPT remain the most popular.

“We are launching ZeroGPU to provide independent and academic developers with infrastructure to deploy and demonstrate their work on the Spaces platform at zero financial cost,” Delangue stated.

Established in 2016, Hugging Face has emerged as a key source of open AI models optimized for a wide range of devices, thanks to its close collaboration with industry giants like NVIDIA, Intel, and AMD.

ZeroGPU will be accessible through Hugging Face's Spaces application platform and will run on older NVIDIA A100 accelerators. This sets it apart from traditional cloud providers, which often require long-term commitments that can be difficult for small teams.

A notable aspect of Hugging Face's approach is its focus on AI inference rather than model training, since training requires substantial computing resources that the program does not fully cover. The ZeroGPU documentation indicates that GPU functions are limited to 120 seconds, which is not enough for full-scale training.
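For context, a ZeroGPU-backed Space typically wraps only its inference function in the publicly documented `spaces` decorator, so the GPU is attached just for the duration of each call. The following is a minimal sketch under that assumption; the model ID and the 120-second duration value are illustrative, not taken from the announcement.

```python
# Minimal sketch of a ZeroGPU-style Space (assumes the public `spaces`
# package; the model ID and 120 s budget below are illustrative).
import gradio as gr
import spaces
import torch
from diffusers import DiffusionPipeline

# The pipeline is loaded once at startup; the GPU itself is only attached
# while a decorated function is executing.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # illustrative model ID
    torch_dtype=torch.float16,
)
pipe.to("cuda")

@spaces.GPU(duration=120)  # request up to 120 seconds of GPU time per call
def generate(prompt: str):
    """Run a single inference pass; the A100 is released when this returns."""
    return pipe(prompt).images[0]

gr.Interface(fn=generate, inputs=gr.Textbox(), outputs=gr.Image()).launch()
```

The short per-call time budget fits inference workloads well, which is consistent with the program's stated focus on deployment and demos rather than training.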

Delangue suggested that the system can “efficiently hold and release the GPU as needed,” although the specifics of this process remain unclear. Technologies such as time slicing for running tasks concurrently and NVIDIA's Multi-Instance GPU (MIG) feature could improve the availability of computing resources.
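Hugging Face has not published how this scheduling works, but the "hold and release" idea itself is straightforward: a small pool of GPU slots is shared across many requests, each request borrows a slot for a bounded time, and the slot is returned the moment the work finishes. The sketch below is purely conceptual, not Hugging Face's scheduler; the pool size and time budget are invented for illustration.

```python
# Conceptual sketch only: a pool of shared GPU slots with a per-call time
# budget, acquired before inference and always released afterwards.
import threading
from contextlib import contextmanager

GPU_SLOTS = threading.BoundedSemaphore(value=4)  # e.g. 4 shared A100 slots
MAX_SECONDS = 120                                # illustrative per-call budget

@contextmanager
def gpu_lease(timeout: float = 30.0):
    """Acquire a GPU slot, yield to the caller, and always release it."""
    if not GPU_SLOTS.acquire(timeout=timeout):
        raise RuntimeError("no GPU slot became available in time")
    try:
        yield
    finally:
        GPU_SLOTS.release()

def run_inference(job_id: int) -> None:
    with gpu_lease():
        # Real code would move tensors to the device and run the model here,
        # enforcing MAX_SECONDS with a watchdog or the framework's own limit.
        print(f"job {job_id}: running on a borrowed GPU slot")

# Many concurrent requests share the small pool of slots.
threads = [threading.Thread(target=run_inference, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Time slicing and MIG would refine this picture by letting several such leases run on fractions of a single physical A100 at the same time, rather than one lease per whole card.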

Amid a shortage of graphics processors, in which companies like Lambda and CoreWeave use their hardware as collateral to purchase additional accelerators, Hugging Face's initiative could bring significant relief to startups developing AI applications.

Recently, Hugging Face secured $235 million in a Series D financing round with participation from tech giants including Google, Amazon, Nvidia, and Intel. Ironically, many of Hugging Face's main backers are developing proprietary models that could overshadow small AI startups.


Sources: reports, release notes, official announcements.