Artificial Intelligence (AI) offers our community an opportunity to creatively explore how to engage with this new technological revolution. This page brings together the resources and supports currently available to help instructors respond to the challenges and opportunities of AI at SFU.
Instructor Guidelines for Learning and Teaching with AI
These guidelines (download PDF) are designed to help you, as an SFU instructor, make informed decisions about when and how to integrate artificial intelligence (AI) into your teaching.
Recognizing there is no one-size-fits-all approach, the guidelines offer practical directions to help you use AI responsibly and effectively to support student learning.
AI Selection: Key Considerations for AI Use
Your use of any AI tools in your teaching remains optional, unless it is a curriculum requirement.
Clearly communicate your expectations regarding AI use for each assignment and in the classroom: state them in the syllabus, on the first day of class, and at intervals throughout the semester.
- Provide explicit details about instructor use, student assessments, and assignments involving AI.
- Where possible, offer students guidance on how AI tools are being used within your field, discipline, or relevant professional contexts to help them understand the practical implications and potential benefits or drawbacks.
- Be open with students about any use of AI in preparing teaching materials to promote a transparent learning environment.
Engage learners in open conversations about AI use and academic integrity in various learning contexts and at multiple points throughout the learning experience.
Do not use AI detectors for grading decisions or academic misconduct investigations because they are unreliable, biased, and may unintentionally compromise students’ learning and well-being. Any form of detection (AI or otherwise) is subject to the “balance of probabilities” test.
- In addition, AI detection tools require a Privacy Impact Assessment (PIA) and informed student consent. Inputting student work into these tools without consent could violate both privacy legislation (if done without authority) and copyright protections.
Academic Integrity
- Approach academic integrity holistically by considering and addressing the reasons why students use AI tools in ways that are not permitted (i.e., build in AI literacy, clear guidelines, and other pedagogical strategies).
- Establish explicit expectations about when, how, and if AI may be used for specific assignments and activities.
- When AI is not permitted, review the existing curriculum and course pedagogy to redesign assessments and/or move them into the classroom, so that the course's educational goals are not compromised by prohibited use of AI.
- When AI use is permitted, ensure your students understand they are responsible for verifying the accuracy of AI generated content.
- Guide students to always disclose the use of AI and reference the nature of that use in assignments, exams, and research papers (e.g., in the “acknowledgements” section).
- Promote understanding of AI tools’ limitations, biases, and ethical considerations and explain the purpose of using AI in assessments/assignments.
- In cases where there is reasonable evidence of a violation of course or assignment expectations, begin by having a conversation with the student about your concerns before reporting them in accordance with academic integrity procedures, which use a “balance of probabilities” standard.
- Provide students with sample language on how to acknowledge AI use (e.g., permitted, limited, or prohibited) so expectations are concrete and consistent.
Privacy, Copyright, and Consent
- Ensure you use AI tools approved by SFU in ways that protect students' privacy, data security, and intellectual property. Verify that these tools support effective teaching practices, disciplinary standards, and institutional policies by confirming they have undergone the necessary reviews, such as a PIA.
- Students should always be clearly informed when their data will be collected or used within AI systems, and explicit student consent must be obtained to protect both privacy and copyright rights of students. Data input into AI tools must be limited to only what is necessary, avoiding sensitive personal information unless specifically approved.
AI Adoption: Integrating AI Tools into your Teaching
- Select AI tools that are relevant and align with AI use in students' field, discipline or potential workplace.
- Select AI tools that do not create barriers (e.g., costs, service availability) for learners or instructors.
- Choose freely available (or open source) AI tools, or ensure the cost is in line with SFU’s values and policies.
- Prioritize the use of institutionally supported AI tools, or ones that are intuitive, accessible, and designed for a diverse student body (i.e., consider various cultures, different knowledge systems, multilingual learners, neurodivergent learners, and learners with differing abilities).
- Ensure that AI tools are compatible with assistive technologies (e.g., screen readers) and meet accessibility standards.
- Consider the number and complexity of AI tools in the context of all educational technology in a course to avoid cognitive overload.
Assignments and Assessment
- Whether or not AI use is permitted, design assessments and learning activities that emphasize human-centered approaches focused on “process,” not “product,” to support students' thinking and learning.
- Adapt grading practices and rubrics to assess critical engagement rather than just content accuracy.
- Approved AI tools can improve the grading process by providing timely, detailed, and personalized feedback on assignments; however, instructors and teaching assistants remain responsible for all feedback and grading, and must not rely on AI as students' only source of feedback.
- Make explicit whether generative media (images, audio, code, video) are allowed in assignments and guide students on how to evaluate and credit them responsibly.
Pedagogical Considerations
- Foster critical thinking by encouraging deep analysis and reflection both with and beyond AI-generated content.
- Apply pedagogical approaches that enable higher student achievement with AI assistance than without (e.g., critically analyze AI-generated content for accuracy, bias, and ethical implications; use AI for concept clarification and summarization, but have students develop personal insights before, or after, completing assignments, depending on pedagogical goals).
- Where tool capabilities permit, approved AI tools can help personalize the content focus and level of challenge for each student.
- Instructors are encouraged to consult the Centre for Educational Excellence (CEE) for resources, sample syllabus language, and workshops on AI in teaching.
- Remember that AI should augment—not replace—core teaching practices and your professional judgment.
AI Framework for Unit Leaders: Developing unit-level learning and teaching guidelines
As a unit leader (e.g., department chair, school or program director, graduate or undergraduate program chair), you are responsible for guiding how your area responds to the integration of artificial intelligence (AI) in teaching and learning. SFU recognizes that the use of AI will differ across disciplines, programs, and pedagogical approaches. In alignment with the AI Teaching and Learning principle Disciplinary Contexts, the university acknowledges that a single, standardized policy cannot account for the diverse expectations, values, and practices found across academic units. This guideline, or framework (download PDF), is designed to support you in leading a structured and inclusive process to develop AI guidance that reflects your unit’s unique disciplinary context and instructional goals.
Before developing your unit’s guidelines, it is important to understand SFU’s current institution-wide guidance. The Learning and Teaching with AI Guidelines are structured around three groups in the academic community:
1. Teaching & AI
These guidelines are designed to help instructors make informed decisions about whether, when, and how to incorporate AI tools into their teaching. They offer practical strategies for aligning AI use with course educational goals, redesigning assessments to account for AI-generated work, and clearly communicating expectations to students. The guidelines also address considerations such as tool selection, transparency in instructional practices, academic integrity, and adherence to privacy legislation. Instructors are encouraged to reflect on how AI can either enhance or complicate learning and to set appropriate boundaries that support critical thinking and originality.
2. Learning & AI
These guidelines help students understand how to use AI in their academic work responsibly, and in compliance with institutional expectations. They provide clarity on what constitutes acceptable use in different contexts and emphasize students' responsibility to disclose AI use, verify outputs, and maintain academic integrity. Topics include citing AI-generated content, understanding data privacy implications, avoiding overreliance on AI tools, and respecting intellectual property rights. The guidelines also remind students that course-level rules may vary, and that they are expected to follow the specific guidance provided by their instructors.
3. Graduate Students & AI
These guidelines focus on the use of AI tools in graduate research, thesis and project writing, and academic publishing. They recognize that graduate-level work often involves more complex questions about authorship, originality, research ethics, and data protection. The guidelines emphasize that any use of AI in research must comply with departmental and Faculty of Graduate Studies policies and be approved by a student’s supervisor or supervisory committee. Graduate students are expected to understand how AI may intersect with methodological choices, disciplinary standards, and institutional policies around data security, Indigenous data sovereignty, and academic publishing. These guidelines serve as a foundation for thoughtful, transparent, and discipline-appropriate decision-making at the graduate level.
Using These Guidelines in Your Unit
As a unit leader, you are expected to use the institutional guidelines as a foundation when creating local guidelines. You should adapt these baseline expectations to your program's disciplinary context and pedagogical goals. Your department’s guidelines should clearly define what AI use is permitted, restricted, or encouraged in coursework, and how instructors and students are expected to follow these expectations. These guidelines are living documents and should be reviewed and updated regularly in response to emerging technologies, teaching practices, and student needs.
Decision Framework: A Deep Reflection Tool
The Decision Framework (PDF) provides a structured method for units to reflect on their academic values and teaching goals when deciding how to implement unit-level AI guidelines. The framework begins with a single critical question: Are AI tools allowed in your courses? Based on your answer, it prompts a set of guiding questions across five core domains. These questions help departments consider the practical and pedagogical implications of their AI decisions.
Key Questions to Guide Your Unit’s AI Planning:
What educational goals matter most in our courses, and how does AI support or undermine them?
How is AI currently used in our field, and what expectations do students bring from industry or research?
Are all students equally able to access and use the AI tools we allow, or would some be unfairly disadvantaged?
How will instructors redesign assessments, assignments, or feedback processes to reflect our position on AI?
What privacy, copyright, and institutional policies apply to the tools and practices we’re considering?
Have we assessed potential risks of AI tools by contacting SFU’s Archive and Records Management Office or considered using frameworks such as the Government of Canada's Algorithmic Impact Assessment tool?
Have (existing or new) instructors been given the guidance and resources they need to implement our decisions consistently?
Implementation Process
Step 1: Understand your local context
Begin by reviewing how AI is currently being used in your unit. You may want to review recent syllabi, survey instructors and students about their experiences, or audit assessment types across courses. You can work with LEAP to develop and analyze surveys or other tools for collecting data. The goal is to build a shared understanding of your starting point before making decisions.
Step 2: Facilitate informed, inclusive discussions
Bring together students (undergraduate and graduate), instructors, advisors, teaching assistants, and graduate supervisors to explore how AI should be used or restricted in your unit. These discussions should surface disciplinary norms, identify pedagogical concerns, and raise questions about privacy, access, or ethics. You should document the outcomes of these discussions and use the Decision Framework to support and guide your planning.
Step 3: Draft and approve your unit’s AI guidelines
Based on your consultations, develop a written document that outlines your unit’s guidelines. It is recommended that the guidelines include a short rationale for your approach; clear expectations for instructors on how to communicate AI rules; guidance for students on disclosure, citation, and academic integrity; and any special considerations for graduate students working on research or theses. The document should also include sample syllabus statements and a plan for periodic review and revision. The Centre for Educational Excellence (CEE) can assist with developing syllabus language, while Learning Experiences Assessment and Planning (LEAP) can support the design of a review and revision process. Finally, ensure your local guidelines align with SFU’s academic regulations, privacy protocols, and AI guidelines.
Step 4: Put the guidelines into practice
Once your guidelines are ready, support instructors as they apply them. This may involve revising syllabi, redesigning assessments, or clarifying expectations with students. You should encourage instructors to consult the CEE for additional support, and consider assigning faculty lead(s) to coordinate implementation and collect feedback. You may also want to establish a process for updating the guidelines based on feedback or new developments.
Monitoring and Evaluation
SFU has begun to incorporate AI-related questions in the Course Experience Survey (CES). These questions will allow students to reflect on whether AI use was clearly explained and appropriately integrated into the course.
Departments are encouraged to review these responses regularly. It is recommended to use feedback from the CES and internal discussions to improve your unit’s guidelines and instructional practices. If you identify successful strategies or challenges worth sharing, report them to the Office of the Vice-Provost, Learning and Teaching. This will contribute to a broader university-wide understanding of how AI is being used in learning and teaching.
For more information, contact: Paul Kingsbury, AVPLT, avpltsec@sfu.ca
- SFU Centre for Educational Excellence website: Generative AI in Teaching
- Syllabus statement examples: A collection of syllabus statements (PDF) from SFU instructors showcasing various positions on AI use in coursework and learning.
- Additional examples of syllabus statements that instructors can use to communicate their course policy on AI use can be found on the Student Services “Sample text for syllabus” page.
Student Academic Integrity Tutorial: Add SFU’s Academic Integrity Online Tutorial to your course, which addresses student use of AI. Learn more.
Working with graduate students: Guidance for supervisors on helping graduate students navigate the use of AI tools in academic research can be found in the AI for Graduate Supervisors online course.