The new model, called GPT-4, follows GPT-3, the groundbreaking text-generation model that OpenAI announced in 2020 and that was later adapted to power ChatGPT. The new model scores better on a number of benchmark tests, makes fewer errors, and can accept images as part of a query (previously only text input was supported).

Unfortunately, GPT-4 still tends to “hallucinate” and can reproduce problematic social biases (racism, stereotyped “thinking,” and so on) when given adversarial prompts, Wired notes.

“It will take a long time before you want to run your nuclear power plant with a GPT,” says Oren Etzioni, a professor at the University of Washington.

Source: Ferra

