Proctor Library

AI in Higher Education: Ethics, Pedagogy, and Policy

A Faculty Professional Development LibGuide

Using AI Tools as Teachable Moments

The most effective approach to AI in the classroom is to treat it as a teachable moment rather than a hidden or prohibited practice. By explicitly addressing how to use AI tools ethically, responsibly, and transparently, faculty can help students develop the discernment and skills they’ll need for academic success and lifelong learning. This guidance aligns with the library’s role as a partner in cultivating information-literate learners who can discover, evaluate, and ethically use information in any format.

Coaching Students on Ethical and Productive AI Use

1. Promote an Open Dialogue
Start the semester with a class conversation about AI. Ask students:

  • What AI tools have they used, and for what purposes?
  • What benefits and challenges have they experienced?
  • Where do they see ethical gray areas?

These discussions can help normalize conversations around AI, reduce misinformation, and establish shared expectations.

2. Emphasize a “Citation Mindset”
Encourage students to treat AI tools as sources—if they use them, they must acknowledge how and why.

  • Explain that citations are not only about avoiding plagiarism but also about maintaining academic transparency.
  • Provide examples of AI citations (APA, MLA, Chicago) and statements of use.

3. Keep the “Human in the Loop”
Make it clear that AI works best as a collaborative partner rather than a replacement for thinking. Students should:

  • Guide AI with clear, specific prompts.
  • Critically evaluate AI-generated content for accuracy, bias, and relevance.
  • Integrate AI-generated insights with their own analysis, creativity, and voice.

4. Teach Critical Evaluation Skills
AI tools are only as reliable as the data and algorithms behind them. Help students recognize:

  • The difference between authoritative and unverified AI output.
  • The risk of “hallucinations” (fabricated or inaccurate information).
  • The potential for bias in AI-generated responses.

Possible Classroom Discussions

  • Ethical Boundaries: “What’s the difference between using AI to check your grammar and using it to write the entire essay?”
  • Responsibility and Accountability: “What are a student’s ethical obligations when using AI? What responsibilities do faculty have?”
  • Redefining Originality: “How might the widespread use of AI reshape how we define ‘original work’?”
  • Bias Awareness: “What kinds of bias might an AI tool introduce into a research project?”
  • Transparency and Trust: “How might acknowledging AI use affect how your work is perceived by professors, peers, or employers?”

Practical Strategies

  • Model Ethical AI Use in your own work and share examples with students.
  • Incorporate AI Evaluation Exercises into research assignments—e.g., compare AI-generated bibliographies with library database results.
  • Use AI as a Springboard for discussion and critical thinking rather than as a shortcut to final answers.