AI and Your Learning: A Guide for Students
In this guide, we provide information and guidelines to help you make informed decisions about navigating AI tools.
Generative AI tools, such as ChatGPT, Claude, and Gemini, have evolved quickly over the past few years, and their use has become increasingly widespread. While these tools can be exciting, they can also raise questions and concerns. Whether you’re using these tools already or might want to explore them further, making decisions about AI tools as a student can be complicated. Perhaps you’ve wondered: Can I use AI? Should I use AI? How might using AI tools impact my learning?
Can I Use Generative AI for Assignments or Research?
AI policies vary dramatically from one class to the next, often because courses have distinct learning goals. Before you start using AI as a learning or research tool, make sure to review your instructor’s course policy. It is important for you to know if AI is allowed in your course and if so, how. Which ways of using AI are permissible and which are not?
You also need to understand that Stanford’s Generative AI Policy Guidance treats any use of generative AI as analogous to receiving help from another person, and that using generative AI to substantially complete an assignment is always prohibited. By default, assume AI use is not allowed unless the syllabus or assignment indicates otherwise, and disclose any use of generative AI for help with your assignments. In addition, your instructor might have a specific way they want you to document how and when you used AI tools. If you are ever unclear about whether, when, or how AI use is appropriate, consult your instructor.
On a final note, sometimes you may be using AI-related tools for research. In that case, clarify the rules for using AI in your lab, department, discipline, and/or the specific journal that you might be submitting to. They may have specific AI-related policies that you are expected to follow.
In summary, when in doubt: ask and document!
How Might the Design of AI Tools Influence My Learning?
Perhaps you’re thinking about using a tool such as ChatGPT to quiz yourself on course concepts, clarify points of confusion, or get feedback on your writing. While this may seem no different from the ways in which you might learn and interact with a tutor or friend, learning with AI is a bit more complicated. Knowing how AI tools work can help you understand their strengths and limitations and ultimately make informed decisions about when and how you use them to support your learning.
On a basic level, generative AI tools such as ChatGPT, Claude, and Google Gemini are called “generative” because they produce text, images, audio, or other content in response to a prompt. Through a complex process of machine learning, which involves training the AI model on vast amounts of data and then fine-tuning it, an AI tool becomes able to generate responses that resemble human language. If you ask ChatGPT a question, it is able to respond not because it “knows” the answer, but because it has identified patterns and relationships within its training data and can reassemble language from that data to produce an answer.
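To make the idea of pattern-based generation concrete, here is a deliberately tiny toy sketch: it “trains” on a short text by counting which words follow which, then chains those patterns together to generate new text. This is only an illustration of the general principle described above; real systems like ChatGPT use neural networks trained on vastly more data, and the corpus and prompt word here are invented for the example.

```python
# Toy illustration of pattern-based text generation (NOT how production
# models actually work): learn word-pair patterns from a tiny corpus,
# then reassemble them into new text.
from collections import defaultdict
import random

corpus = "the cat sat on the mat the cat ate the fish".split()

# "Training": record which words follow which in the corpus.
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

# "Generation": starting from a prompt word, repeatedly pick a
# statistically plausible next word based on the learned counts.
random.seed(0)
word = "the"
output = [word]
for _ in range(5):
    candidates = next_words.get(word)
    if not candidates:  # no known continuation; stop generating
        break
    word = random.choice(candidates)
    output.append(word)

print(" ".join(output))
```

The program never “understands” the sentence it produces; it only recombines patterns it has seen, which is also why such systems can confidently produce fluent but wrong output.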
The generative nature of AI tools offers a number of strengths: the tools are easy to use, they can take on a number of roles, and they provide novel ways to learn and create. However, their underlying structure can also lead to problems. Because these tools “guess” at statistically plausible output, they will sometimes produce responses that are incorrect, known as “hallucinations.”
Additionally, researchers have documented a variety of biases that can exist in training data (e.g. lack of geographical and population diversity) and note that the algorithms that are then applied to this data can further exacerbate these biases (Mehrabi et al., 2021). Research has also shown that AI tools have a tendency to misattribute sources, or in some cases, create a citation for a source that doesn’t exist (Jaźwińska & Chandrasekar, 2025).
Finally, it’s important to note that while interacting with an AI tool can feel lifelike and similar to talking with a human, these tools have serious limitations in their ability to provide holistic support. Whether you’re facing challenges in your learning or in your broader life as a student, connecting with human support resources such as tutors, academic coaches, advisors, and counselors remains essential. Given these limitations, you also need to think critically about the output of AI tools, as not everything they generate will be accurate or useful.
Can AI Tools Help Me Meet My Learning Goals?
A frequently cited quote from Herbert Simon, one of the founders of the field of cognitive science, describes how learning takes place: “Learning results from what the student does and thinks and only from what the student does and thinks. The teacher can advance learning only by influencing what the student does to learn” (quoted in Lovett et al., 2023, p. 1). By extension, the specific strategies you use while studying, because they shape what you do and think about during the learning process, also play an important role in whether your efforts at learning are successful.
Strategies that involve thinking deeply about the material, such as comparing and contrasting concepts, explaining ideas in your own words, and self-quizzing, tend to be more effective than superficial strategies such as highlighting or rereading material (Dunlosky et al., 2013).
Whether generative AI supports or undermines your learning depends on how you use it. Does a specific use of generative AI facilitate, or take away, an opportunity for deeper learning? The answer to this question may not be straightforward. For example, effective active reading strategies include previewing the reading and asking yourself questions that might be answered in the text (McGuire, 2018). While generative AI could be used to help with these strategies by generating a summary and questions, does it take away an opportunity for you to engage directly with the text by previewing it yourself? And do you risk introducing hallucinations into any summaries or questions generated by AI? If you use AI as a study aid, what’s important is that you’re using effective strategies to facilitate your learning, not delegating your learning process to an AI tool.
The following framework suggests questions for you to consider when using generative AI as a learning tool.
Questions to Consider When Using Generative AI for Learning
Assignments and Research
- How do I know whether I’m allowed to use AI in my assignment or research?
- How am I allowed to use AI in my assignment or research, if at all?
- How should I document my use of AI?
- Am I required to disclose my use of AI?
Accuracy and Bias
- How will I evaluate the accuracy of AI output?
- In what ways can I monitor AI output for bias?
- How will I identify which sources are contributing to AI output and whether they are being used accurately? How will I properly attribute sources?
AI and Your Learning
- Does using generative AI take away an opportunity for me to engage more deeply with the material? If so, are there alternative learning strategies I could use? If you’re not sure, a CTL academic coach can help you find new approaches to studying.
- Alternatively, does using generative AI deepen my engagement with the material or otherwise help me reach my learning goals? If so, how?
- In addition to the considerations described above, what other factors (e.g., environmental impact, data privacy) are important to me as I make decisions about using AI tools?
Additional AI Resources
- Responsible AI at Stanford: Stanford UIT’s guide to using AI tools and models while keeping your and Stanford's data safe.
- OCS Generative AI Policy Guide: Stanford Office of Community Standards' guidance on the honor code implications of generative AI tools.
- Stanford's AI Playground: Stanford University hosted platform that allows users to test out multiple AI models in a safe, university-supported environment.
- Student Guide to Artificial Intelligence: Guide published by Elon University providing an overview of key considerations for student use of AI, including skill-building, ethics, academic integrity, and more.
References
Dunlosky, J., Rawson, K., Marsh, E., Nathan, M., & Willingham, D. (2013). Improving Students’ Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology. Psychological Science in the Public Interest, 14(1), 4-58. https://journals.sagepub.com/doi/full/10.1177/1529100612453266
Jaźwińska, K. & Chandrasekar, A. (2025, March 6). AI search has a citation problem. Columbia Journalism Review. https://www.cjr.org/tow_center/we-compared-eight-ai-search-engines-theyre-all-bad-at-citing-news.php
Lovett, M., Bridges, M., DiPietro, M., Ambrose, S., & Norman, M. (2023). How Learning Works: 8 Research-Based Principles for Smart Teaching. (2nd ed.). Jossey-Bass.
McGuire, S. (2018). Teach Yourself How to Learn: Strategies You Can Use to Ace Any Course at Any Level. Stylus Publishing.
Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A survey on bias and fairness in machine learning. ACM Computing Surveys (CSUR), 54(6), 1-35.