Setting and Managing Boundaries for Students’ Use of Artificial Intelligence

The presentation and faculty conversation took place on February 28, 2024.

Panel Presentation

This session tackled the growing presence of Artificial Intelligence (AI) in student work. Two professors at Syracuse University, Roger Hallas (Associate Professor of English) and Laura Lisnyczyj (Assistant Teaching Professor, English to Speakers of Other Languages), offered their perspectives on how to encourage authentic learning while acknowledging the potential of AI as a tool.

Roger Hallas’s Approach:

  • Setting Expectations: Clearly communicate what constitutes acceptable use of AI in assignments.
  • AI-Resistant Design: Craft assignments that are more difficult for AI to complete, such as:
    • Close reading tasks that require deep analysis.
    • Activities that draw on personal experiences.
    • Informal writing exercises to establish a baseline for student writing style.
  • Revision is Key: Encourage students to save drafts and revise their work independently. This allows instructors to track progress and identify areas where AI might have been used.
  • Grammarly vs. ChatGPT:
    • Permissible: Tools like Grammarly for proofreading and editing.
    • Discouraged: Content generators like ChatGPT that create original text.

Laura Lisnyczyj’s Approach (for Non-Native Speakers):

  • Transparency is Paramount: Encourage students to be upfront about how they’re using AI.
  • AI for Comprehension: Translation tools can be helpful for understanding source materials, but students must then express those ideas in their own English.
  • Revision and Drafts: Similar to Roger, Laura emphasizes the importance of saving drafts and tracking revisions.
  • Variable Restrictions:
    • The level of AI restriction will depend on the specific assignment and the students’ proficiency level.
    • Detailed prompts articulate instructor expectations clearly, while vaguer prompts can be useful because they push students to clarify their understanding with the instructor. However, vague prompts may also lead to more cases of unethical use if students do not ask for that clarification.
  • Proactive Discussions: From the beginning of the semester, Laura has open conversations with her students about appropriate AI use in all assignments. This establishes clear expectations and creates a foundation of trust.
  • Direct Inquiry: When she suspects AI misuse, she initiates a conversation with the student. She asks them to explain their writing process and the steps they took to complete the assignment. This allows her to diagnose the issue and provide guidance.
  • Specific Concern about QuillBot: Laura identifies QuillBot as a tool she discourages because it doesn’t demonstrate genuine learning or comprehension. She sees it functioning more like a content generator than a writing aid. By pinpointing specific tools, Laura can tailor her guidance to address potential issues.

In conclusion, both presenters emphasized the need for a thoughtful approach to AI in education. Upholding academic integrity remains crucial, but instructors can also explore ways to leverage AI as a learning tool for students. The line between content generation and writing tools isn’t black and white, but rather a spectrum. Paraphrasing tools like QuillBot could be problematic in certain courses and for specific learning objectives that prioritize originality and critical thinking. However, in other contexts, such as practicing sentence structure or exploring different ways to phrase an idea, these tools could be perfectly appropriate. As educators navigate this evolving landscape, it’s important to strike a balance that fosters genuine learning while acknowledging the potential benefits that AI can offer.

Question and Answer

Participants engaged in a Q&A where they exchanged thoughts on AI writing tools in academic settings. Educators grappled with concerns about plagiarism and a decline in critical thinking if students misuse these tools. The discussion explored ways to navigate this challenge, with some advocating for clear policies and others suggesting integration with learning objectives. Here are the key takeaways from the discussion:

Challenges of AI Writing Tools:

  • Plagiarism and Lack of Critical Thinking: There’s concern that students might misuse AI paraphrasing tools or rely on AI-generated content without properly understanding it. This could lead to unintentional plagiarism and a lack of critical engagement with the material. Instructors worry that students will become reliant on AI for tasks essential to academic development, like developing arguments and analyzing sources.
  • Difficulty in Detection: Distinguishing between acceptable tools like grammar checkers and problematic AI generation tools can be tricky. This makes it difficult for instructors to create effective policies or catch instances of misuse.

Potential Benefits and Solutions:

  • Targeted Policies and Open Communication: Some instructors advocate for clear and specific policies in syllabi outlining acceptable uses of technology. Others believe a more nuanced approach is needed, encouraging open communication about AI tools and integrating them into learning objectives. Emphasizing transparency and clearly outlining the goals and skills students should develop through assignments can make it less likely they’ll view AI tools as a shortcut.
  • Focus on Higher Order Thinking: A key point is that AI tools should not replace the development of critical thinking skills. Instructors can design assignments that require students to analyze, synthesize, and evaluate information, skills that AI cannot replicate.
  • Teaching Responsible Use: Some argue that AI writing tools are becoming ubiquitous and that students will encounter them in their future careers. By teaching students to use these tools responsibly and to evaluate AI-generated content critically, instructors can prepare them for the evolving technological landscape, much as they would by teaching industry-standard software.