AI nonprofit boss says 'closed nature' of most AI research hinders innovation
February 25, 2025 01:39 AM

A year before Elon Musk co-founded OpenAI in San Francisco, Microsoft co-founder Paul Allen had already set up a non-profit artificial intelligence research lab in Seattle.

The Allen Institute for Artificial Intelligence (Ai2) aimed to advance AI for humanity's benefit. Over a decade later, Ai2 may not be as well-known as OpenAI, the creators of ChatGPT, but it is still pursuing "high-impact" AI, as envisioned by Mr Allen, who passed away in 2018.

Ai2's latest AI model, Tulu 3 405B, rivals those of OpenAI and China's DeepSeek on several benchmarks. Unlike OpenAI, Ai2 says it is developing "truly open" AI systems that others can build upon. The institute has been led since 2023 by CEO Ali Farhadi, who joined after a stint at Apple.

He said: "Our mission is to drive AI innovation and breakthroughs to solve some of humanity's most pressing problems. The biggest threat to AI innovation is the closed nature of the practice. We've been pushing very strongly towards openness. Consider open-source software: the core idea is that I should be able to understand what you did, modify it, fork from it, use part of it, half of it, or all of it. And once I build something, I put it out there, and you should be able to do the same."

The debate around open-source AI is a hot topic right now. To us, open source means being able to understand exactly what was done. Open-weights models such as Meta's are useful, because people can simply take the weights and build on them, but they do not qualify as open source.

Open source means having access to every piece of the puzzle. If I were to speculate, some of the training data for these models might contain questionable material. But the training data is the actual IP; it is probably the most valuable part of these systems, and many people agree.

Data plays a crucial role in enhancing your model and altering its behaviour. It's a laborious and challenging process. Numerous companies invest heavily in this area and are reluctant to share their findings. As AI matures, I believe it's gearing up to be taken seriously in critical problem domains such as scientific discovery.

A large portion of some disciplines involves a complex search for solutions, whether that is a gene structure, a cell structure or specific configurations of elements. Many of these problems can be formulated computationally. There's a limit to what you can achieve by merely downloading a model trained on internet text and fine-tuning it. Our aim is to enable scientists to train their own models.

© Copyright @2025 LIDEA. All Rights Reserved.