Logo for Dr Anna Clemens PhD who teaches scientific writing courses for researchers

Why generative AI is overrated for writing scientific papers

This blog post on generative AI for writing scientific papers is based on a podcast episode I recorded. To listen or watch the podcast, click play on the YouTube video above or listen to the episode on Spotify or Apple Podcasts.

One of the hottest topics inside the Researchers’ Writing Academy right now is whether and how to use ChatGPT or any of the many other generative AI tools for writing scientific papers. And I can imagine that you are having some of the same questions so let’s dive into it!

I’m more critical toward using generative AI than many others who have spoken about it. In this blog post, I’m making an argument for why I think generative AI is overrated when it comes to assisting with writing scientific papers — and why it may even backfire for you.

I would love to know whether you agree with me, or where you think I’ve drawn the wrong conclusions. (As always, I’m opinionated but never against getting my mind changed.) Also, this is a field that’s changing fast so my opinions will likely evolve over time too.

This blog post is structured in the following way: First, I will analyse the most common expectations we have of AI in academic writing and research. Then, I will give you 8 specific recommendations for using generative AI to write scientific papers so you can get the most out of the technology without running into any of the pitfalls.

Definition of generative AI

Let’s start with a quick definition of what generative AI is. Generative AI is short for generative artificial intelligence: software that uses large language models (LLMs) to produce text. LLMs are neural networks that have been trained on vast amounts of text to process and generate language.
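To build a rough intuition for “trained on vast amounts of text to generate language”, here is a deliberately tiny toy sketch: a bigram table that “learns” which word follows which in a miniature training text, then generates by repeatedly picking a plausible next word. Real LLMs use deep neural networks trained on billions of documents, but the generate-the-next-token loop has the same basic shape:

```python
import random
from collections import defaultdict

# Toy "training data" — a real LLM is trained on billions of documents.
corpus = "the model predicts the next word and the next word follows the model".split()

# "Training": record which words followed which in the corpus.
following = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    following[current].append(nxt)

def generate(start: str, length: int, seed: int = 0) -> str:
    """Generate text by repeatedly sampling a word that followed
    the previous word in the training text."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        candidates = following.get(words[-1])
        if not candidates:  # dead end: no known continuation
            break
        words.append(rng.choice(candidates))
    return " ".join(words)

print(generate("the", 5))
```

The output is always locally plausible (every word pair occurred in the training text) but carries no understanding, which is also why the fluent-sounding-but-hollow problem discussed later in this post arises.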

Generative AI is such a disruptive invention for writing because it can produce any text we want in a matter of seconds. Which leads us to the obvious question: Should we use generative AI for our academic writing?

Should we use ChatGPT to write scientific papers?

Let us answer this question by breaking the matter down into the expectations, or hopes, or goals that we have for generative AI in terms of assisting us with our writing.

From how I see it, there are three main goals we have when we use ChatGPT and co for our scientific papers:


1) We want generative AI to save us time: We hope for generative AI to allow us to write and edit scientific papers faster, and we want it to provide shortcuts to reading the literature by e.g., getting summaries of articles.

2) We want generative AI to increase the quality of our writing: We hope for generative AI to correct structural, style and language mistakes we made in our manuscripts. We also hope for it to spot gaps in our literature search and find scientific papers we have missed.

3) We want generative AI to enhance our own creativity: We hope that generative AI can help us connect the research out there, identify gaps in knowledge and research opportunities we may have missed. We also may want it to act as a thought partner to spark ideas.

Okay, let’s examine how generative AI is holding up in terms of achieving these three goals:

Goal 1: We want generative AI to save us time writing and editing a scientific paper

I have never tried using ChatGPT to help me write a scientific paper but I have tried using it to write marketing copy. I like the idea of having to spend less time on marketing, and more time on the work I want to be doing: Helping researchers write better research papers in less time.

Unfortunately, ChatGPT didn’t save me the time I had hoped for when writing said marketing texts. To get decent results, I quickly learned, I had to provide it with a lot of context about what I was writing and how I wanted it written (style, tone etc.). Many users of generative AI emphasise that it’s an iterative process: you report back what you liked about the generated text and what you didn’t, and the writing improves version by version.

The consequence is that it takes time for generative AI to deliver a text the way you imagined it. ChatGPT did fairly recently gain a memory, so you no longer have to repeat general background information about your preferences. However, I still find that it often forgets previous prompts, especially when you are working on a longer text.

Speeding up reading the scientific literature

Apart from writing text, using generative AI for literature search seems popular at the moment. One way is having it provide summaries of research articles to save you time reading every article that promises to have useful information. I think incorporating AI into your literature search in this way is actually smart. Ideally, the abstract of a paper should give you the summary you are looking for but, unfortunately, their quality varies, so getting an AI summary really can save time.

The only word of warning I want to give here is that you have no guarantee that the summary the AI delivers is accurate and, perhaps more importantly, that it captures the information that matters for your own research. Remember: Summaries are never objective.

Is faster really the goal?


As a general rule, I highly encourage you to be self-critical about whether you really are saving time when using generative AI. I think, as humans, we like not having to do the hard work of writing on our own but instead being able to engage with a thought partner — even, or sometimes especially, if that thought partner is a robot. I think, the way we’re wired, it often feels easier to have a dialogue with someone to get to a solution than to tease it out of our brain all on our own. (And that’s fair enough!)

But, more generally, I’m wondering: Is faster really the goal? Is the goal to publish more scientific papers faster OR is the goal to publish better scientific papers? The default in our society is speed over quality. I’m aware, of course, of the pressures in the current academic environment. I know that a lot of researchers are being pushed to publish a lot with very little time and resources. This isn’t an individual critique but a more philosophical thought.

Real scientific breakthroughs take time

Here’s what I’m wondering: Maybe we’re inventing and getting obsessed with tools to solve a perceived problem (i.e., that we don’t have enough time to do all the things on our to-do list) that isn’t in fact the real problem (i.e., that we are trying to do too many things and losing focus of what makes an actual difference).

When we do a lot of things, it’s hard to put proper focus into each of them. If we are required to write six scientific articles to get tenure, the scientific breakthroughs reported in these will most likely be less significant than if we had put all of our focus, effort and other resources into one or two research articles.

All that being said, I do think that there is room for saving time in the writing process of most researchers. I wouldn’t turn to ChatGPT and friends though. If you want to write scientific papers more time-efficiently, then I encourage you to learn the streamlined scientific paper writing process.

Am I using ChatGPT to avoid deep work?

On a cognitive level, it’s easier to engage with ChatGPT than actually deeply focusing on typing words on a page. I think there is a very important question we all should ask ourselves from time to time: Am I using AI to avoid deep work?

By deep work, I mean the deep focus that writing brings: thinking deeply about a problem, struggling with it for extended lengths of time. Deep work is cognitively demanding and takes time. It’s slow, hard work with delayed, but often much, much bigger rewards.


And I think what our dopamine-flooded brains tend to forget is that periods of deep, concentrated focus during the day actually make us feel really good — calm and grounded, and, from my experience, happier.

Goal 2: We want generative AI to increase the quality of our scientific writing

Okay, let’s move on to the second hope for generative AI that I identified: We expect generative AI to improve the quality of our academic writing. I see that especially those researchers who don’t feel their English is good enough to write high-quality papers rely on generative AI for help. And that makes sense, of course! I’m glad that there are tools out there now that give researchers whose first language isn’t English better access to scientific publishing.

Here’s the thing though: Generative AI doesn’t inherently know what good scientific writing is. The scientific literature contains both good and bad writing and the AI is trained on both without being able to distinguish between them.

The general quality of the writing present in the scientific literature is, in fact, so bad that I made it one of our missions with the Researchers’ Writing Academy to increase the quality of the scientific literature so that other researchers (and also journalists and the public) are better able to understand what is being communicated. We teach researchers not to dumb down their research but to write clearly, concisely and compellingly. And that is exactly what generative AI isn’t able to do.

AI generated writing is often bloated and hollow

The writing generative AI produces often sounds good when skim-reading but if you try understanding what’s actually being said, you notice how hollow it usually is. So, if your English isn’t good enough to spot bloated or unclear writing, then using generative AI is risky.

Even more troubling is that generative AI has no way to tell whether a fact is correct or not. It can and does produce false statements, a phenomenon called hallucination. And asking it not to hallucinate doesn’t work! It simply doesn’t know how not to. Remember that you, as the author, are liable for the accuracy of what you publish.


Be aware of these drawbacks when using generative AI for literature search

In terms of using generative AI for literature search, there are some major drawbacks I want you to be aware of:

  • Generative AI may introduce citation bias. Firstly, it cannot access all papers, only those that are available online without a paywall or that publishers have granted it access to. Secondly, it can’t distinguish between good and bad research — to the AI, the claims in each paper carry equal weight regardless of sample size or the quality of the research methods.
  • Generative AI can’t analyse papers critically. It isn’t able to tell if the authors’ conclusions match the data provided.
  • Generative AI can hallucinate references, i.e., invent research papers that don’t exist!
  • The literature search results you get from generative AI will inherently be outdated, depending on when the model has been last updated.

I still believe that there are use cases for generative AI to help you elevate the quality of your writing and literature search. You can find my recommendations for using it further down below.

Goal 3: We want generative AI to enhance our own creativity when writing scientific papers

And lastly, let’s examine the third hope we have for generative AI in terms of research and writing scientific papers: We hope for it to connect the knowledge out there to identify gaps and research opportunities we may have missed with our human brains — or to act as a thought partner to spark new ideas.

Do use generative AI to come up with catchy names and phrases!

My favourite way to use ChatGPT is to have it help me name things or come up with fun phrases. For example, I recently made some branded mugs to give to members of the Researchers’ Writing Academy who completed a 2-week Guided Writing Sprint we just hosted. ChatGPT helped me come up with “Sip. Write. Submit.”, which I printed on the front — see the pics of our RWA mug below!

Mockup of our Researchers' Writing Academy mug that features the text "Sip. Write. Submit."
Backside of our Researchers' Writing Academy mug featuring our logo.

I have always been notoriously bad at this type of creativity, so it’s immensely helpful to have ChatGPT provide inspiration. I’ve used it several times in this way, and I almost never end up using any of the suggestions exactly, BUT they usually either spark new ideas or I end up liking a combination of the suggestions. So, I highly recommend using ChatGPT to help you name things!

A prompt-engineering tip that seems to work well is to begin your request with: “Imagine you are the best [and/or another adjective] [profession matching the request] in the world.” For example, for the mug design, I started my prompt with: “Imagine you are the best and cleverest copywriter in the world.”
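If you find yourself reusing this persona framing, a tiny helper can prepend it consistently. This is only a sketch of building the prompt text itself — the function is hypothetical, not part of any tool’s API; it just assembles the string you would paste or send to whichever chat tool you use:

```python
def build_prompt(adjectives: str, profession: str, request: str) -> str:
    """Prepend the 'Imagine you are the ...' persona framing to a request.

    Hypothetical helper for illustration — it simply formats the prompt
    string; sending it to a chat tool is up to you.
    """
    return (
        f"Imagine you are the {adjectives} {profession} in the world. "
        f"{request}"
    )

prompt = build_prompt(
    "best and cleverest",
    "copywriter",
    "Suggest five short, catchy slogans for a mug for academic writers.",
)
print(prompt)
```

The same framing works for other naming tasks by swapping the profession, e.g. `build_prompt("most creative", "branding expert", ...)`.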

Generative AI’s creativity is inherently limited

Now comes the “but” you’ve likely been expecting at this point. 😁 Here it goes: Generative AI is NOT creative. At least not in the sense we’re often hoping for. Generative AI is only capable of convergent thinking (i.e., finding familiar solutions), not divergent thinking (i.e., coming up with out-of-the-box ideas).

This means that its creativity is sufficient for finding a clever phrase to put on a mug (it uses all the information it has to come up with a suitable combination of words). But it cannot identify what’s missing in the pool of data it has access to. Generative AI is not able to identify a gap in the web of knowledge, come up with a research question for your research topic or storyboard your scientific research paper.


I’m also critical of this goal we have for generative AI to help us with our creativity more generally. If you are a scientist, academic or researcher, then thinking is your unique skill. You went into your field because you wanted to make a contribution. This is your profession. We need your sharp, critical thinking skills! We need you to develop them so science can move forward in significant and meaningful ways.

8 recommendations for using generative AI to write scientific papers

This all being said, I’m not totally against using generative AI. But I advise keeping in mind the following tips for best practice when using generative AI to write scientific papers:

1) Use a generative AI tool as your assistant but stay in the driver’s seat as the author and researcher. This means, for example, that you have to be able to tell whether a text is written clearly, concisely and compellingly, i.e., whether it is of high quality. Don’t ever just copy text whose quality you can’t appraise. (Remember, ChatGPT is notorious for hollow, bloated writing.)

2) Analyse critically whether AI is really saving you time when using it. Use a tool such as Toggl to track the time you spend on working on your scientific research paper so you have data on which step takes how much time.

3) Self-reflect whether you’re using generative AI to avoid deep work. Chatting to AI usually feels easier than deep thinking and writing but it’s the latter skills you need to sharpen so you can produce meaningful and significant breakthroughs in your research.

4) When reviewing the literature, use generative AI as a complementary tool to conventional literature search tools. Don’t solely rely on generative AI because it can introduce citation bias, is not always up to date and even hallucinates references.

5) DO outsource time-consuming tasks to (generative) AI that you don’t need your scientist brain for, such as correcting spelling, grammar and formatting.

6) Always check your target journal’s guidelines on whether they permit using generative AI during the research and writing process. Guidelines are changing fast so you need to make sure you’re up to date.

7) Make sure you know what the AI tool does with your inputs before using it. Some tools may save and use your data (e.g., share it with other users), so there is a risk of getting scooped or of making confidential material public.

8) If you struggle with academic writing, invest in learning that skill. This is an investment in your career that will pay off! You invest time learning the skill once, and then you have it forever. You then won’t have to rely on generative AI to write for you. If you don’t know where to start, check out my free 60-minute writing training.

Alright, these were my thoughts on using generative AI like ChatGPT for academic writing and research. I hope this was helpful! I’d be curious to hear how this resonated with you, please comment on the accompanying YouTube video and share your thoughts there!


© Copyright 2018-2025 by Anna Clemens. All Rights Reserved. 


Photography by Alice Dix