Yandex has made its YaFSDP library publicly available; it can accelerate the training of large language models (LLMs) by up to 25%, the company's press service reported.

Yandex has opened access to its YaFSDP library

Yandex has published the YaFSDP library on GitHub. It speeds up the training of the company's own LLMs as well as third-party open-source models; the amount of speedup YaFSDP provides depends on the parameters and architecture of the neural network.

In addition to reducing training time, the library will help reduce GPU resource consumption by up to 20%.

YaFSDP is aimed primarily at LLMs, but it is also suitable for neural networks that generate images. The library optimizes resource consumption at all stages of training, from pre-training to alignment.
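For context, YaFSDP follows the fully sharded data parallel (FSDP) approach, in which a model's parameters, gradients, and optimizer state are split across GPUs rather than replicated on each one. The sketch below uses PyTorch's built-in FullyShardedDataParallel rather than YaFSDP's own API; the model, hyperparameters, and launch setup are illustrative assumptions, and actual YaFSDP usage should follow the instructions in its GitHub repository.

# Minimal sketch of FSDP-style sharded training in plain PyTorch.
# This illustrates the general technique YaFSDP builds on; it does not
# use YaFSDP's API. Run with torchrun, one process per GPU.
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

def main():
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    # Placeholder model standing in for an LLM.
    model = torch.nn.Transformer(d_model=512, num_encoder_layers=6).cuda()

    # Wrapping in FSDP shards parameters, gradients, and optimizer state
    # across all processes, reducing per-GPU memory use.
    model = FSDP(model)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    # One illustrative training step on random data.
    src = torch.rand(10, 32, 512, device="cuda")
    tgt = torch.rand(20, 32, 512, device="cuda")
    loss = model(src, tgt).mean()
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()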

YaFSDP was created during the training of YandexGPT 3. The developers also tested the library on third-party open-source neural networks.

Author: Natalia Gormaleva

Source: RB
