What Every Canadian Teacher Needs to Know About AI and Student Data Privacy
In January 2025, Canadian parents received letters they never expected: their children's personal information had been stolen in a cyberattack on PowerSchool, the education software platform used by school boards across Canada.
The breach affected approximately 62 million individuals across North America. The Toronto District School Board reported that data stretching back to 1985 — covering approximately 1.49 million current and former students — may have been compromised.
When Ontario and Alberta's privacy commissioners released their joint investigation findings in November 2025, the conclusion was clear: governance failures worsened the breach.
If you are a Canadian teacher using AI tools in 2026, this is the context in which every technology decision you make will be evaluated.
The Canadian Privacy Landscape: A Quick Guide
Unlike the United States, which relies primarily on a single federal student privacy law (FERPA), Canada governs student privacy through layered federal and provincial legislation.
Publicly funded schools: Covered by provincial public-sector privacy legislation, such as MFIPPA in Ontario and FIPPA in British Columbia.
Private schools: The federal PIPEDA may apply, or substantially similar provincial private-sector laws.
The key issue for AI tools: Entering student data into an AI platform constitutes disclosure to a third party. If that data leaves Canada, you may be violating provincial legislation — particularly in British Columbia, where public bodies face strict data residency requirements.
What This Means in Practice
You cannot paste student names, grades, or behaviour notes into ChatGPT. When you enter data into cloud-based AI tools, it is transmitted to private company servers — often in the United States.
"But I only used first names" is not a defence. Under Canadian law, personal information includes any information about an identifiable individual.
Uploading student essays requires caution. Some AI tools retain input data for training purposes. Always review retention policies before uploading student work.
Approved tool lists matter. If your board maintains an approved tools list, use only those tools for any student-facing activity.
How to Use AI Safely
- Use AI for content creation, not student data processing.
- De-identify information before entering it into any tool.
- Check data residency and server location.
- Review data retention policies.
- Document your AI use practices.
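The de-identification step above can be sketched as a simple pre-processing pass run locally before any text reaches an AI tool. This is an illustrative sketch, not a vetted product: the `deidentify` function, the `Student1`-style placeholders, and the sample roster are all hypothetical, and boards should prefer approved de-identification tools over ad-hoc scripts.

```python
import re

def deidentify(text: str, roster: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each known student name with a stable placeholder.

    Returns the scrubbed text plus a local mapping so the teacher can
    re-identify results after the AI tool responds. The mapping stays
    on the teacher's machine; only the scrubbed text is ever sent out.
    """
    mapping: dict[str, str] = {}
    scrubbed = text
    for i, name in enumerate(roster, start=1):
        placeholder = f"Student{i}"
        mapping[placeholder] = name
        # Whole-word, case-insensitive match so "Avery" doesn't hit "bravery".
        pattern = rf"\b{re.escape(name)}\b"
        scrubbed = re.sub(pattern, placeholder, scrubbed, flags=re.IGNORECASE)
    return scrubbed, mapping

note = "Avery struggled with fractions; Jordan helped Avery during review."
scrubbed, mapping = deidentify(note, ["Avery", "Jordan"])
print(scrubbed)
# Student1 struggled with fractions; Student2 helped Student1 during review.
```

Note that removing names alone is not always enough: under Canadian law, information is personal if the individual is identifiable from it, so small-class details (the only student with a given accommodation, for example) may still need to be generalized or removed by hand.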
What School Boards Should Be Doing
School boards should be conducting privacy impact assessments, strengthening vendor contracts, and providing written AI usage guidelines.
They should invest in privacy literacy — not just AI training — and leverage procurement power to ensure vendors meet Canadian legal standards.
The Privacy Advantage
Canada's stronger privacy framework is not just a burden — it is an advantage.
When Canadian boards demand Canadian data storage and legal compliance, they are building a model of responsible AI adoption that other jurisdictions may eventually follow.
Teaching students about digital rights and responsible technology use is itself a form of AI literacy.
AIForEdu.ai provides Canadian-specific guidance on AI privacy and responsible technology use.
Subscribe at aiforedu.ai for weekly analysis →
Privacy workshops: [email protected]