OpenAI CEO says AI needs small amounts of high-quality data


Sam Altman, CEO of OpenAI, dropped a bombshell at Davos: forget Big Data, the future of AI lies in selective sips, not gulps.

“There’s this misconception that we need everything under the sun to train our models. Actually, that’s not the case. We do not want to train on the New York Times data, for example,” Altman said at the World Economic Forum.

This may come as a surprise, given OpenAI’s recent legal tussle with The New York Times, which accused the lab of data poaching and prompted OpenAI to publish a statement in response. But Altman insists quality trumps quantity:

“A lot of our research is focused on learning more from smaller amounts of very high-quality data.”

OpenAI’s shift in focus reflects a growing view in the AI community that data quality matters more than data volume: a biased tweet can skew an entire model, while a carefully chosen scientific paper can unlock new capabilities.

This doesn’t mean OpenAI is avoiding partnerships altogether. Altman revealed that the company is currently in talks with news outlets such as CNN, Fox, and Time, focusing on collaboration rather than data mining.

OpenAI’s new approach could reshape AI. Less data means less energy consumption, fewer ethical concerns, and potentially smarter, more reliable AI. And it could ease tensions with publishers, opening doors for mutually beneficial collaborations.

More here.
