OpenAI has conducted two major training runs for GPT-5. The first proceeded more slowly than expected, and scaling it up proved time-consuming and costly. While GPT-5 outperforms its predecessors in some areas, its gains so far fall short of the leap seen between earlier versions.
To improve its training data, OpenAI is trying new approaches, including hiring people to write code and solve math problems, and using synthetic data generated by another of its models, o1.
OpenAI declined to comment, but has previously said it will not release GPT-5 this year.
Source: Ferra