Bloomberg has introduced a new language model called BloombergGPT, which has been specifically designed for a variety of business and financial management tasks.
Released on March 30, BloombergGPT is a 50-billion-parameter language model trained on the media company’s extensive database. According to first reviews, the highly specialized neural network outperforms its competitors at solving specific financial problems.
Among the possible applications of BloombergGPT, its creators highlight the generation of news headlines, the search for information about various companies and their leaders, as well as financial analytics and tracking trends in various stocks. Unlike language models trained on general-purpose data without a specific focus, Bloomberg’s neural network shows a higher rate of correct answers and a generally better understanding of specialized financial queries.
Technically, BloombergGPT outperforms OpenAI’s GPT-3 and Google’s LaMDA language models, but falls short of the larger LLaMA and PaLM models. The developers spent 53 days training the neural network, during which it processed 569 billion tokens.
Author:
Grigory Shcheglov
Source: RB
