Critical thinking is the process of carefully examining ideas and information. Rather than just accepting what you read or are told, it means asking questions, checking facts, and looking at different sides of an issue before deciding what to believe or do. It's like being a detective with your thoughts - always questioning and evaluating to make smart decisions.
Critical thinking is important because it helps you make better decisions. When you think critically, you don't just accept information as true - you ask questions, check facts, and look at issues from different angles. This means you're less likely to be fooled, and you can solve problems more effectively, both in college and in everyday life.
Researchers are interested in the effects of using generative AI tools on people's brains. The first published studies seem to show that using generative AI tools leads to an erosion of critical thinking skills. Based on the paragraphs above, that would mean that if you use generative AI tools a lot, you'll eventually be more easily fooled, make worse decisions and have more difficulty solving problems.
Gerlich (2025) surveyed and interviewed 666 people of different ages and educational backgrounds and found that people who used AI more frequently had weaker critical thinking skills than less-frequent AI users.
Lee et al. (2025) carried out an in-depth survey of 319 frequent generative AI users and determined that those with greater confidence in generative AI tools experienced a decline in critical thinking skills, while those with higher confidence in their own skills demonstrated a greater ability to use critical thinking. The researchers also found that some aspects of critical thinking - understanding and idea generation - were more negatively affected by frequent AI tool use.
Kosmyna et al. (2025) split 54 subjects into three groups and asked them to write essays using ChatGPT, Google, or nothing other than their brains. ChatGPT users "consistently underperformed" and "got lazier with each subsequent essay they were asked to write".
Do you remember important phone numbers, birthdays and special occasions, or do you use your phone to remind you of these? Chances are that you have them stored on your phone. The phrase "digital amnesia" has been coined to describe the phenomenon of outsourcing your memory to digital devices. A related phenomenon is called the "Google effect" - where you start to forget things that you can easily look up using Google (or other search engines).
There's some low-level evidence for the existence of digital amnesia, but some scientists disagree about whether it is real. The phrase was devised by a cybersecurity firm that sells products to protect your digital information - so they have a vested interest in getting people to believe digital amnesia is real - while the research that first described the "Google effect" (Sparrow et al., 2011) has not been fully reproduced by other researchers. Reproducibility - the ability to repeat someone's experiments and get the same results - is very important in research. If research can't be repeated, it becomes more difficult to believe it's correct. Digital amnesia and the Google effect are interesting ideas, but the research to verify their existence is still in its early stages.
Whether real or not, digital amnesia and the Google effect are examples of cognitive offloading. Cognitive offloading is when you use tools or strategies to help your brain with tasks, like writing things down, using a calculator, or setting reminders on your phone. It reduces the amount of thinking or memory work you have to do on your own. The problem is, once you become accustomed to offloading these tasks, it becomes more difficult to make a strenuous mental effort when you need to. If you go to the gym regularly to lift weights, what's going to happen when you stop going? Lifting weights becomes harder. The same thing can happen with cognitive offloading: your brain needs exercise just like your muscles do. Using generative AI tools is another example of cognitive offloading, and one that reduces your critical thinking skills, as described in the work of Gerlich, Lee et al. and Kosmyna et al.
Other studies - reviewed by Zhai et al. (2024) - indicate that use of generative AI tools has a negative effect not just on critical thinking but also on analytical thinking (the ability to break down complex information or problems into smaller, more manageable parts) and decision-making, "as individuals increasingly favor fast and optimal solutions over slow ones".
There are many ways that you can use generative AI in your studies, but let's be very clear: getting ChatGPT to write your essays for you should be completely off the table.
ChatGPT and other tools make stuff up all the time. Wouldn't it be embarrassing to hand in an essay that says bees live in your computer and perform calculations, because you saw AI say it on the Internet?

Come on, this is absolute nonsense. Granted, you're unlikely ever to be asked to write an essay on the topic of computer bees, so you're unlikely to Google it, and you probably know that there are no microscopic bees living in your computer.
In case you think this is a one-off, did you know hippopotamuses can be trained to perform complex medical procedures?

Ok, again, you might recognise this as nonsense, but the fact is, you simply can't rely on the answers that generative AI tools give you being correct. It doesn't matter that this is Google's AI Overview and not ChatGPT: all these tools work the same way. ChatGPT and the like operate within very narrow parameters - "does this word often follow that word?" - not "is this answer correct?"
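To see what that means in practice, here's a deliberately tiny sketch in Python. It's a toy example invented for this page and nothing like ChatGPT in scale or sophistication, but it shows the principle: each word is picked because it often followed the previous word in the training text, never because the resulting sentence is true.

```python
from collections import Counter, defaultdict

# A toy "predict the next word" model trained on a tiny made-up text.
# This is NOT how ChatGPT is built - it's a simplified illustration of the
# principle of choosing words by how often they follow other words.

corpus = (
    "bees make honey . bees make honey . bees make wax . "
    "computers make calculations . computers make errors . "
    "computers run programs ."
).split()

# Count how often each word follows each other word (a bigram table).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return whichever word most often followed `word` in the corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else "."

# Generate a sentence starting from the word "computers".
word, sentence = "computers", ["computers"]
while word != "." and len(sentence) < 8:
    word = predict_next(word)
    sentence.append(word)

print(" ".join(sentence))
# Prints: computers make honey .
# The sentence is fluent and grammatical - and completely false. At no point
# did the program ask "is this statement correct?", only "what word tends to
# come next?"
```

Real tools learn from vastly more text and use far richer statistics, which is why their output sounds so much more convincing - but the question they answer is essentially the same.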
It's important to remember that ChatGPT's output is programmed to sound plausible. It can sound plausible enough to make experts doubt themselves. This is a scientist on Twitter talking about looking at ChatGPT when it first appeared:
The text of the first Tweet says:
Then I decided to ask ChatGPT about something that I knew didn’t exist: a cycloidal inverted electromagnon. I wrote my thesis about electromagnons, but to be double sure, I checked there was no such thing (it's been ca. 7 years since my defense). ChatGPT thought differently:
I wanted to drill down on physics. And here it became very spooky: somehow ChatGPT hallucinated an explanation of a non-existing phenomenon using such a sophisticated and plausible language that my first reaction was to actually consider whether this could be true!
"My first reaction was to actually consider whether this could be true!" ChatGPT's output was so plausible-sounding it caused an expert in the field to doubt herself. If it can cause experts to doubt themselves, then maybe it can fool you too, especially if the topic is something you don't know so much about.
So maybe there could be bees in your computer and maybe hippos could be trained to perform ultrasound, but don't take a generative AI tool's word for it: check what it says elsewhere. A good rule of thumb for assessing the truth of what you see on the Internet is to check to see what other sites say. Are there any sites that say there are bees in your computer, or do the search results talk about how silly the idea is? Use the SIFT technique to verify the output you get from generative AI tools.
The bottom line is that AI tools cannot replace your critical thinking, your evidence-based arguments or your subject knowledge, and if you don't bother to learn anything yourself, you won't be able to ask good questions or judge what a good answer looks like. Try to be more mindful about your AI usage and strive to balance it with traditional learning methods. This can help maintain cognitive engagement and prevent your critical thinking skills from declining.
Gerlich, M. (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies, 15, 6. Last accessed 7th March 2025
Kosmyna, N., Hauptmann, E., Yuan, Y.T., Situ, J., Liao, X.H., Beresnitzky, A.V., Braunstein, I. and Maes, P. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. arXiv preprint arXiv:2506.08872
Lee, H.P.H., Sarkar, A., Tankelevitch, L., Drosos, I., Rintel, S., Banks, R. and Wilson, N. (2025). The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects From a Survey of Knowledge Workers. Last accessed 7th March 2025
Sparrow, B., Liu, J. and Wegner, D.M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), pp. 776-778. Last accessed 7th March 2025
Zhai, C., Wibowo, S. and Li, L.D. (2024). The effects of over-reliance on AI dialogue systems on students' cognitive abilities: a systematic review. Smart Learning Environments, 11, 28. Last accessed 14th July 2025