How Can I Use AI in Teaching?
AI tools can be valuable for co-developing teaching materials. For example, they can help with:
- drafting course maps, learning outcomes, and lesson plans;
- evaluating alignment with learning outcomes;
- performing data analysis; and
- creating H5P content, rubrics, slides, case studies, videos, diagrams, and other resources.

Instructors and staff may choose to use AI tools in their teaching practice, unless specific department, program, or school guidelines prohibit it. If a program is purchasing a tool for faculty or student use, it must first be approved by the Governance Executive Group Information Technology (GEGIT) committee.
- We recommend the use of Microsoft Copilot, as it offers enhanced privacy and security through institutional login.
- Before using AI tools, review their Terms of Service and Privacy Policies to ensure you are comfortable with their data collection practices.
- Some AI tools let you opt out of having your inputs used to train their models; we recommend doing so for privacy. For instance, opting out of saving chat history typically means inputs will not be used for further training.
Any content generated by AI for teaching purposes must be thoroughly reviewed for accuracy, appropriateness, bias, and potential harm before being shared with students.
It’s important to demonstrate ethical and responsible use of AI in student-facing materials. Any content generated entirely or partially by AI should be properly cited. See AI at NIC on the NIC Library website to learn more.
Several resources are available to assist with crafting useful prompts for creating teaching materials like lesson plans, rubrics, practice questions, case studies, and H5P activities.
There are currently no Privacy Impact Assessments (PIAs) completed for AI detectors for use at NIC. The use of AI detectors is strongly discouraged due to concerns about their accuracy, reliability, bias, privacy, and security. Do not submit original student work to any tool without a completed and approved PIA, as doing so may breach student privacy or violate intellectual property rights.
AI tools must undergo a Privacy Impact Assessment (PIA) review and be approved before being used for grading purposes. Instructors should also consider how using AI for feedback and grading may affect student-instructor relationships and students' perceptions of course value. If a PIA is completed and approved, educators must disclose the use of AI for feedback and grading; instructors remain accountable for student grades and should review any feedback or grades generated by AI tools.