
Scientists Used to Be Skeptical of ChatGPT, Now AI Research Assistants Are Writing Papers

By James Morales

Key Takeaways

  • A growing number of AI research tools help scientists review literature, design experiments and even write papers.
  • Autoscience recently announced that papers authored by its AI research assistant, Carl, had passed a double-blind peer review.
  • Researchers have increasingly embraced chatbots as the technology has advanced.

In the months after ChatGPT was released in November 2022, the scientific community was abuzz with discussion about its potential—both good and bad. 

As time has passed, the role of AI in scientific research has expanded. Today, chatbots are capable of writing papers that can pass peer review.

Chatbot Hesitancy Wanes Among Scientific Community

Early conversations around generative AI chatbots generally treated the technology with suspicion.

For instance, in March 2023, The Lancet Digital Health ran an editorial titled “ChatGPT: Friend or Foe?” Reflecting the views of the time, the article pointed to potential use cases for the emerging technology while warning about the risk of errors and plagiarism.

Such concerns prompted academic bodies, such as the World Association of Medical Editors (WAME), to introduce new guidelines.

Significantly, WAME’s first recommendation was that “Chatbots cannot be authors.” But less than two years later, the notion of AI authors doesn’t seem so outlandish.

Carl the AI Scientist Passes Peer Review

On March 3, Autoscience announced that papers authored by its AI research assistant Carl had passed double-blind peer review and been accepted for workshops at The Thirteenth International Conference on Learning Representations (ICLR).

The startup boasted that the process involved “limited human intervention,” some of which it said won’t be needed in the future.

The example Carl paper shared by Autoscience is short—a few pages, including diagrams. It isn’t especially groundbreaking, either. The paper merely compares existing machine-learning techniques.

Nonetheless, the AI researcher has achieved a degree of sophistication and originality that was unheard of not so long ago. And with the right human partners, it is conceivable that Carl and its peers could soon be making important contributions to scientific literature. 

The Rise of AI Research Assistants

While researchers were among the first to experiment with general-purpose chatbots, dedicated AI research assistants, like Carl, have only recently hit the market.

In October 2024, Google DeepMind announced plans for a new “AI lab assistant” powered by its Gemini large language model (LLM).

Meanwhile, Autoscience is among a crop of startups developing AI tools for researchers that streamline and enhance the scientific process. 

This emerging roster of new tools promises to help researchers review literature, design experiments and write papers. And as the technology advances, the developers behind AI research assistants are striving for greater autonomy.

“These tools are going to become more and more fully autonomous, capable of coming up with ideas and actually running experimentation,” Potato AI founder and CEO Nick Edwards told CCN.

Like Autoscience, Google DeepMind and other AI developers in the space, Potato is focused on using increasingly agentic language models to speed up the research process.

Skepticism is warranted, Edwards observed. “However, it’s undeniable at this point that these tools are going to make a difference and are going to improve and accelerate science.”

James Morales is CCN’s blockchain and crypto policy reporter. He has been working in the news media since 2020, writing about topics such as payments, banking and financial technology. These days, he likes to explore the latest blockchain innovations and the evolving landscape of global crypto regulation. With an educational background in social anthropology and media studies, James uses his platform as a journalist to explore how new technologies work, why they matter and how they might shape our future.