AI Teaching Strategies

While the AIMES Library of Examples allows you to filter and explore real course policies, assignments, and other teaching artifacts, this page provides a brief overview of strategies that may be helpful when assigning, limiting, and prohibiting AI use. 

Introduction

Generative AI has permeated higher education, with rapid adoption by students and with instructors wondering whether, and how, to incorporate its use in teaching and learning. Research on educational uses of generative AI is still nascent, and we do not yet have extensive, stable, or discipline-specific bodies of evidence to draw from. That said, Stanford instructors have been exploring, testing, and critiquing uses of generative AI in teaching and learning on campus. The emerging practices discussed here, synthesized from Stanford examples, are consistent with broader educational research findings.

Educational Approaches to AI

Instructors have adopted a variety of approaches to use of AI in their courses. Here are three approaches that have emerged. Click to jump to an approach, or scroll down for pedagogical strategies that apply across all approaches, followed by more details about each one.

AI-Assigned

When you want students to learn AI skills such as prompting and verifying, practice AI tasks in a particular field or domain, and explore or critique AI.

AI-Limited

When you want students to navigate AI responsibly, make and justify decisions about AI use, and disclose and cite AI use alongside other kinds of resources.

AI-Prohibited

When you want students to develop skills independently and in collaboration with peers and instructors.

Pedagogical Strategies Across Approaches to AI

Set and communicate a detailed and specific course policy

  • No matter what approach you take to use of AI in your course, write a clear course policy in the syllabus, with enough detail to address the specific kinds of work students will do in the course.
  • Provide examples of applying the course policy in scenarios that you anticipate and that students ask about.
  • Revisit the course policies periodically throughout the course, especially when students encounter a new form of coursework, assignment, or assessment for the first time.
  • The CTL Syllabus Template includes sample policies for limited and prohibited AI use, but you will likely benefit from tailoring them to your specific course.

Address transparency and motivation 

  • Share with students why you have chosen your approach to AI and how it is informed by the course learning outcomes. What skills do you want them to develop? How will those skills serve them in their studies, life, and work?
  • Revisit your reasoning behind the approach to AI throughout the course, for example in the context of learning goals, class activities, assignments, and assessments.
  • Emphasize students’ progress on goals and skills, whether on AI-related tasks or on tasks they complete on their own, through individual feedback and discussion with the whole class.

Emphasize alignment and process

  • Ensure that course policies are aligned with the goals and reasons you have for students to use, limit, or avoid generative AI.
  • Require that students show their processes of doing the work of the course (e.g., problem solving process, writing process, analysis process, coding process, reasoning process, prompting process, revision process) as components of assignments.
  • For major assignments such as projects, long papers, and capstones, include several stages that provide scaffolding and feedback on the way to the completed product. Logs or journals that accompany major projects and are reviewed with the instructor or TA periodically can also be effective.

AI-Assigned

Strategies When Assigning AI Use:

In addition to strategies relevant to all approaches to AI:

Make AI Use Transparent and Motivational

  • Share with students why you are asking them to use AI in this course or assignment. How and why does AI enhance what is possible in this context, while still giving students practice and helping students advance their own thinking and skills?
  • Revisit and check in with students about their use of AI. Asking students to reflect, through critical thinking and metacognition, helps ensure that AI is incorporated as part of students’ own learning, not to replace it.
  • Provide a course policy that specifies how and why AI use is assigned in the course.

Provide instruction and practice on responsible use of AI

  • Guide students to go to the Stanford AI Playground. Discuss the risks of sharing sensitive data or information. Open up a conversation about ethical use of AI in ways that tap into disciplinary perspectives represented in the course.
  • Consider whether an alternative assignment or approach may be necessary in cases where a student has a strong ethical objection to using AI tools.
  • Model how you expect students to integrate AI into their work through in-class examples and guided assignments.
  • Discuss and model examples of prompting and fact-checking AI results in ways that are specific and appropriate to your discipline, course, and assignment. 

Assess students’ use of AI along with other objectives

  • Articulate criteria and create a rubric that illustrates the important AI-related skills you will evaluate in students’ work.
  • Require that students show their process of using AI in the course, e.g., by saving and turning in full histories of prompts and outputs, and by explaining how and why they chose specific generative AI tools (if given a choice).
  • Follow up on assignments in which students use AI tools, e.g., by asking students to explain their reasoning, discuss an example from their work, explain a technique they used, and articulate their approach to working with AI tools.
  • Give students feedback on their use of AI as well as other components of assignments.

Further Reading

The following publications highlight recent discussion about incorporating AI use in higher education courses. This list is not a comprehensive bibliography.

  • AI Pedagogy Project (n.d.). Assignments. aipedagogy.org/assignments/
  • Bowen, J. A. & Watson, C. E. (2024). Teaching with AI: a Practical Guide to a New Era of Human Learning. Johns Hopkins University Press. Stanford University Libraries Ebook (login required).
  • Dungo, C. A. B., Beltran, Z. L. E., Declaro, B. C., Dela-Cruz, J. J. C., & Viray, R. U. (2025). Students’ level of awareness on the environmental implications of generative AI. Journal of Education in Science, Environment and Health, 11(2), 93-107. doi.org/10.55549/jeseh.777
  • Perkins, Furze, Roe & MacVaugh (2024). The AI Assessment Scale. aiassessmentscale.com/ (includes several peer-reviewed publications using the scale)
  • Yang, T., Cheon, J., Cho, M. H., et al. (2025). Undergraduate students’ perspectives of generative AI ethics. International Journal of Educational Technology in Higher Education, 22, 35. doi.org/10.1186/s41239-025-00533-1

AI-Limited

Strategies When Limiting AI Use:

In addition to strategies relevant to all approaches to AI:

Clarify what AI uses are permitted, what uses are prohibited, and why

  • Create a specific course policy about use of AI. Where some uses are allowed and others are not, additional detail may be needed.
  • Consider providing a table or list that breaks down and compares different circumstances where AI may and may not be used.
  • Share with students why they should follow your guidance about when it is and is not appropriate to use AI in your course or assignment, and how these policies connect to course learning goals, to skills that are useful in their studies, life, and work, and to students’ future goals.
  • Be explicit about the kinds of activities AI could productively enhance, and those it would detract from, in this particular academic context.
  • Consider ways in which limited use of AI may support students with learning differences.

Engage with students on AI-limited tasks

  • Model the kind of thinking, analysis, discussion, problem-solving, and other academic work that your course is designed to help students achieve.
  • Demonstrate and discuss the drawbacks of relying on forms of AI assistance that are limited in your class.
  • Check in with students about how they are experiencing the different AI use cases in their learning.

Guide students on allowed use of AI

  • Guide students on how to disclose and cite AI use. Some instructors require a disclosure with every assignment; the disclosure can also include reflection on how and why students approached their work on the assignment and what kinds of tools were most and least helpful.
  • Be upfront about how AI could limit students’ learning, as well as where AI may assist students in deep learning, and how to tell the difference.
  • Guide students to go to the Stanford AI Playground for instances when they do use AI in the course or assignment. Discuss the risks of sharing sensitive data or information. Open up a conversation about ethical use of AI in ways that tap into disciplinary perspectives represented in the course.
  • Discuss and model examples of prompting and fact-checking AI results in ways that are specific and appropriate to your discipline, course, and the AI use cases that you allow.

Assess students fairly and accurately

  • Articulate assessment criteria and create rubrics that include appropriate disclosure and citation of AI use, alongside the important learning goals for the assignment or exam.
  • Make sure there are no hidden penalties for using AI in ways that are permitted. Some research in workplace settings has indicated an inequitable "competence penalty," in which evaluators gave lower ratings when they believed work products were created with AI assistance (Acar et al., 2025).

Further Reading

The following publications highlight recent discussion related to limiting AI use in higher education courses. This list is not a comprehensive bibliography.

AI-Prohibited

Strategies When Prohibiting AI Use:

In addition to strategies relevant to all approaches to AI:

Communicate with students

  • Discuss with students your reasons for prohibiting AI use. These could include perspectives from the course or the broader discipline, emphasis on learning objectives that can best be developed by students working independently, and objections to various aspects of generative AI development and sustainability (e.g., intellectual property used in training, environmental impacts, and more).
  • Ask students about their experience with using AI in similar courses so that you understand what will be familiar and what will be new in this academic setting.
  • Make room for discussion. While staying true to your course policies and approaches to AI, giving students a chance to ask questions and discuss their perspectives may increase their investment, motivation, and understanding.

Be explicit about the importance of learning without AI

  • Normalize cognitive effort and struggle, which are crucial for learning, but can be shortchanged by use of AI. Discuss what it is like to be unsure, confused, and to grapple with concepts in this course, and why those challenging experiences are important for learning.
  • Use feedback and discussion about students’ work to increase their engagement and accountability for the development of their own ideas and lead to strong final products without AI assistance.
  • Bring some of the important intellectual and academic work into class, where you can provide guidance and modeling. For example, have students generate and refine ideas through short, individual writing and discussion with peers and instructors in class. Doing so may help students feel more ownership and be less likely to turn to AI when they are working on their own outside of class.

Address AI-related changes (even while not allowing its use)

  • Become familiar with where and how students may inadvertently encounter AI assistance built into software, searches, and other platforms they use for coursework. Demonstrate and discuss the drawbacks of relying on AI assistance that is prohibited in your class and advise students about how to disable AI in the applications they use. Seek support from university technology offices to learn more about AI in campus applications.
  • Recognize that incoming students may be accustomed to relying on AI assistance. If students have offloaded crucial skills and habits of mind to AI in prior learning environments, modeling and guidance may be necessary.
  • Provide a “safety valve” for students to admit to making mistakes in following the AI policy without it being catastrophic to their grade. For example, some instructors ask students to disclose AI use on every assignment, even when it is prohibited, and then discuss what went wrong or led to the inappropriate use of AI, in order to support students in planning a revised approach next time. You could limit the number of times students may use such a safety valve.

Redesign assignments and assessments

  • Follow up on assignments completed outside of class, e.g., by asking students to explain their reasoning, discuss an example from their work, or explain a technique they used. If students do not seem to understand their work, they may need to redo the assignment with your guidance on how to develop a deeper understanding, whether or not they used AI assistance.
  • Especially on lower-stakes assignments or those leading up to longer projects and papers, provide feedback that focuses on progress toward learning goals and how to improve, rather than only on getting the right answer.
  • Consider conducting assessments in class (e.g., "blue books" or other in-class written formats). As of fall 2025, proctored exams are only allowed as part of the Academic Integrity Working Group Proctoring Pilot, but additional courses may request to join the pilot each quarter.
  • Help students prepare for in-class and/or proctored assessments. Remind them of the conditions they will encounter and encourage them to practice under similar conditions before the exam. Scaffold learning experiences leading up to timed, in-class writing so that a higher-stakes assessment is not the first time students experience the format. 

Further Reading

The following publications highlight recent discussion about prohibiting AI use in higher education courses. This list is not a comprehensive bibliography.