The first machine in the network, CG-1, is located in California and is already showing its strength with 4 ExaFLOPs of FP16 performance and 54 million cores. Because the system is built around large language models and generative AI, training becomes simpler: it removes the need for the complex distributed programming that conventional GPU clusters require.
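For context, the "complex distributed programming" the article refers to usually looks something like the sketch below on a conventional multi-GPU cluster. This is a generic PyTorch-style illustration, not Cerebras' API; the model and dataset passed in are hypothetical placeholders, and the point is simply the amount of process-group and data-sharding boilerplate a single large system is pitched as avoiding.

```python
# Generic illustration of conventional multi-GPU training boilerplate.
# NOT Cerebras' API; `model` and `dataset` are hypothetical placeholders.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def train_distributed(model, dataset, epochs=1):
    # Every process must join a process group before any collective ops.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    local_device = rank % torch.cuda.device_count()
    torch.cuda.set_device(local_device)

    # Gradients are synchronized across processes on every backward pass.
    ddp_model = DDP(model.cuda(), device_ids=[local_device])

    # Each rank only sees its own shard of the data.
    sampler = torch.utils.data.distributed.DistributedSampler(dataset)
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.AdamW(ddp_model.parameters(), lr=1e-4)
    for epoch in range(epochs):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for batch, target in loader:
            loss = torch.nn.functional.cross_entropy(
                ddp_model(batch.cuda()), target.cuda()
            )
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    dist.destroy_process_group()
```

On a single system large enough to hold the whole model, the same training logic reduces to an ordinary loop without process groups, samplers, or launcher scripts.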
The CG-2 and CG-3 are due in 2024, followed by six more supercomputers around the world, bringing the network's total computing power to an impressive 36 ExaFLOPs.
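A quick sanity check of the figures quoted above: CG-1 delivers 4 ExaFLOPs, and the full plan covers nine machines (CG-1 through CG-3 plus six more), which is consistent with the stated 36 ExaFLOPs total. The snippet below only restates that arithmetic.

```python
# All values are taken from the article itself.
cg1_fp16_exaflops = 4   # CG-1: 4 ExaFLOPs of FP16 compute
planned_systems = 9     # CG-1, CG-2, CG-3, plus six more machines
total_exaflops = cg1_fp16_exaflops * planned_systems
print(total_exaflops)   # 36 ExaFLOPs, matching the stated network total
```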
The Condor Galaxy project aims to revolutionize the field by democratizing AI and shortening learning curves. Andrew Feldman, CEO of Cerebras Systems, says the network will cut the time needed to train a generative AI model from months to minutes.
Source: Ferra
