AI FAQ

At NIC, AI tools are transforming teaching, learning, and work. This resource brings together important questions, insights, and key considerations for instructors and students to navigate the use of AI in the classroom.

It’s crucial to establish clear and ongoing communication with students about the evolving role of AI tools in your course. Students will encounter different expectations across courses and assignments, so transparency is key. Include a course-level statement in your outline using NIC’s Course Outline Chart [Word document], along with assignment-specific guidelines aligned with NIC’s six-laneway approach.

Things to consider when drafting your statement:

  • If, how, and when AI tools are allowed
  • The rationale behind these decisions and how the use of AI supports or conflicts with course goals and learning outcomes
  • Any student responsibilities, such as citation requirements

In addition to providing these written statements, review your course- and assignment-level AI guidelines with students to ensure understanding and to create opportunities for dialogue and questions.

Currently, Microsoft Copilot with Enterprise data protection (distinct from Microsoft 365 Copilot) is the only AI tool approved for use at NIC; however, this may change as additional tools undergo review. It is available to all faculty, staff, and students through NIC’s M365 license.

To access this tool, users can log in at https://copilot.cloud.microsoft/ or use the Microsoft Edge browser sidebar with their NIC credentials. A shield icon or the user’s profile name indicates a correct login; note that enterprise data protection applies only while users are logged in with their NIC credentials.

If you permit AI use in coursework, you should also ensure that students know how to appropriately acknowledge use of these tools. See UBC’s citation guide and NIC’s citation guide. You may also choose to have students provide an appendix to their work showing prompts and outputs or complete a Student AI Disclosure Form [Word] [PDF] to attach to assessment submissions.

The use of Microsoft Copilot and other AI tools does not automatically constitute academic misconduct at NIC. Whether AI tools are permitted in a course is a decision made at the course or program level. Instructors should clearly communicate expectations with students early in the term, such as in the syllabus.

If an instructor prohibits the use of AI tools for assignments, using them would be considered academic misconduct. If AI tools are permitted, instructors should specify limitations, proper acknowledgment, and acceptable usage. If AI usage has not been addressed, it may be considered unauthorized, as per academic misconduct policies (e.g., using unapproved resources or plagiarism). Students should seek clarification from their instructor if it’s not specified.

Students should not assume all technologies are permitted. If unsure about AI tools, they must ask their instructor for guidance.

NIC’s academic misconduct policy addresses actions that give unfair academic advantage, and AI tools could fall under this if used improperly. Instructors should regularly address academic integrity and provide opportunities to discuss expectations throughout the semester.

NIC strongly advises against using AI detection tools on student work due to legal, pedagogical, and practical concerns. Instructors should not upload student work or personal information to unapproved tools, as this may violate the Freedom of Information and Protection of Privacy Act (FIPPA) and the Copyright Act.

Key concerns include:

  • Accuracy: AI detection tools often have high false positive rates, leading to unjust accusations and stress for students.
  • Bias: These tools may disproportionately flag non-native English speakers, raising equity issues.
  • Evasion: Detection tools can be easily bypassed, making results unreliable.
  • Rapid Advancement: AI technology evolves quickly, and detection tools struggle to keep up.
  • Transparency: Most tools don’t explain why content is flagged, leaving students with no recourse.
Currently, NIC does not support AI detection tools, and faculty are encouraged to design assessments that focus on process and originality to maintain academic integrity.

Instructors who suspect that a student has used AI tools contrary to expectations should follow the standard academic misconduct process, just as they would for any other misconduct allegation. Instructors should not rely on AI detectors to form the basis of an allegation of academic misconduct. If you have any questions, comments, or concerns, please email the Chair of the Academic Integrity Committee.

The only way to know if AI is permitted for your course assignments is to check with your instructor and consult your course outline. If you are unsure, do not assume AI use is allowed. As AI technology evolves, it is increasingly integrated into various tools, such as Copilot in Microsoft Word. While using AI for non-academic tasks like drafting emails or creating resumes may be appropriate, its use in academic work—such as assignments, essays, or exams—is not allowed unless explicitly stated by your instructor in the syllabus or exam instructions.

Even if your instructor permits the use of AI, you must properly cite it to maintain academic integrity. Transparency about the tools you use is essential: keep a record of the prompts you entered to generate AI output for your coursework, and properly cite any content generated entirely or partially by AI. See AI at NIC on the NIC Library website for guidance on citing AI in APA and MLA.

In every course, your instructor sets learning outcomes and your grade reflects how well you achieve them. NIC’s six laneways for assessment outline different levels of AI use, ranging from no AI assistance to full integration. If AI is not permitted for an assignment, using it is considered cheating and may result in disciplinary action under NIC’s Code of Conduct Policy #3-06. This is because it interferes with your instructor’s ability to assess your knowledge and is unfair to students who follow the rules. Always check your course syllabus or ask your instructor to determine what level of AI use, if any, is allowed.

AI can be a valuable tool for efficiently gathering information from large datasets, but its output may contain errors, outdated information, false references, or biases. Users must critically evaluate AI-generated content to ensure accuracy. Students permitted to use AI in their coursework should verify information by using multiple prompts to gain different perspectives and be mindful of potential biases. As with any tool, AI is most effective when used responsibly.

Confidential information is data not intended for public use. Personal information (PI) includes any recorded data that identifies an individual, such as names, contact details, student numbers, academic history, and financial information. Student assignments may also contain PI and constitute intellectual property.

AI tools may collect and store various types of data, including log data (IP address, browser settings), usage data (location, content generated), and session interactions. Any personal information entered may be stored, used for AI training, or shared with third parties—often outside Canada—raising privacy concerns.

Copyrighted information includes works you do not own or have permission to use, such as journal articles, textbooks, and teaching materials. Uploading such content into AI tools may violate copyright laws.