Ottawa’s AI Sprint: What’s at Stake

Canada just spun up a 30-day sprint to shape its next AI strategy. Minister Evan Solomon assembled 26 experts (mostly industry and academia) to advise on research, adoption, commercialization, safety, skills, and infrastructure.

Public consultations run through October 31. Recommendations due in November.



On paper, it’s a pivot moment. In practice, it’s already drawing fire. Too much weight on scaling, not enough on governance. Too many boardrooms, not enough frontlines. Too much Ottawa, not enough ground truth.

BC + AI sees this differently.

This is Canada’s chance to reset the DNA of its AI ecosystem.

But only if we choose regeneration over extraction, sovereign data governance over corporate capture, and community benefit over narrow interests.


If Ottawa’s foot is on the gas, BC is the brake and steering: ethics, open protocols, and land-rooted practice in every build.


The Problem With The Task Force

The group’s stacked with expertise. But critics flag the imbalance. Where’s healthcare? Where’s civil society beyond token representation? Where are the people who’ll feel AI’s impact first: frontline workers, artists, community organizers?

The worry: Commercialization and scaling overshadow public trust, governance, and equitable outcomes. Again.

The numbers back this up: Only 24% of Canadians have AI training. Just 38% feel confident in their knowledge. Nearly two-thirds see potential harm. 71% would trust AI more under public regulation.

We’re building a national strategy on a foundation of low literacy and eroding trust. That’s not a recipe for sovereignty. That’s a recipe for capture.


Principles for a National AI Strategy: What BC + AI Stands For

Sovereign data governance comes first: Relational Data, Responsible AI Futures

Data isn’t “oil”… it’s relationship and responsibility. Canada’s AI strategy must centre Indigenous-led, community-governed frameworks (UNDRIP-aligned), using OCAP® and CARE principles plus BC’s FOIPPA to decide what’s collected, where it lives, who accesses it, and how benefits flow.

In BC, the First Nations Technology Council is operationalizing this: community-controlled data residency, consent protocols with revocation, audit logs, benefit-sharing MOUs, and Indigenous data trusts so Nations hold the keys, not vendors.

Programs like Indigenomics AI add algorithmic impact reviews and value-sharing to contracts so AI serves sovereignty, not extractive pipelines.

No sovereignty, no legitimacy: without this foundation, no national strategy deserves the name.

Community compute, not corporate capture: GPU co-ops, ethical data lakes, Indigenous tech hubs.

Canada’s $2B Sovereign Compute Strategy must build regional compute commons: GPU co-ops, ethical data lakes, Indigenous tech hubs.

That means infrastructure in Prince George, Kelowna, and on Nation lands, not just hyperscale centres locked up by private vendors.

Compute is public infrastructure. Keep a substantial share of that capacity in the commons so Canadian labs and SMEs can build here.

Research-to-deployment bridges: Turn UBC and SFU wins into working pilots that stay here and scale here.

With BC’s Gail Murphy (UBC) on the Task Force, we have a champion to close the gap between labs and lived impact.

Funding must support applied research that flows into public health pilots, wildfire response tools, and Indigenous language revitalization projects, not just patents that exit Canada.

Turn UBC and SFU wins into working pilots that stay here and scale here.

Public-interest procurement: No black boxes. No closed‑door vendor lock‑in.

Every AI system procured with public funds must be auditable, transparent, and standards-based. No black boxes. No closed‑door vendor lock‑in. Procurement should explicitly favour Canadian vendors using open standards.

If public money buys it, the public should be able to read it. Model cards. Dataset provenance. Energy use. Limits in plain language.

AI literacy for all: Give people real on-ramps.

BC’s approach is multi‑literacy: blending code with ethics, art, and history.

Only 24% of Canadians have any AI training, and just 38% feel confident in their knowledge.

Open‑licensed curriculum, rural outreach, and training pipelines for under‑represented communities must be funded at scale.

Give people real on-ramps. Paid fellowships on live projects beat glossy workshops every time. Train-the-trainer PD and micro-credentials co-designed with educators, parents, and youth build capacity fast.

Make consent-first the default, respecting Indigenous data sovereignty, with offline/low-bandwidth options and open models, so every district can adopt, audit, and adapt.

Guardrails for safety and trust: Publish the safety case up front or do not ship.

Adopt Responsible AI certification for people, products, and services, covering algorithmic bias, environmental impact, and sovereign data governance. Public oversight boards (not just industry committees) must hold veto power over high-risk AI use.

Canada should build on efforts like LawZero (Yoshua Bengio’s new safety nonprofit) to set global standards.

Human in the loop for dual-use. Publish the safety case up front or do not ship.

Climate and ecological accountability: We cannot manage what we do not measure.

Every federal AI initiative should disclose its compute footprint. Introduce green compute credits for models and infrastructure that minimize carbon, water use, and ecological impact.

We cannot manage what we do not measure. Start with hourly energy mix and water use and reduce from there.
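As a rough illustration of what hourly disclosure enables (the numbers below are hypothetical, not from any real data centre), a facility’s carbon total is just each hour’s energy draw multiplied by that hour’s grid carbon intensity:

```python
# Hypothetical hourly disclosure for one small cluster (illustrative numbers only).
# energy_kwh: electricity drawn that hour.
# intensity_g_per_kwh: grid carbon intensity that hour, which swings as the
# renewable share of the grid shifts.
hourly = [
    {"energy_kwh": 120.0, "intensity_g_per_kwh": 35.0},   # hydro-heavy hour
    {"energy_kwh": 140.0, "intensity_g_per_kwh": 210.0},  # gas-peaker hour
    {"energy_kwh": 110.0, "intensity_g_per_kwh": 40.0},
]

def carbon_kg(hours):
    """Total emissions: sum of (energy x intensity) per hour, converted g -> kg."""
    return sum(h["energy_kwh"] * h["intensity_g_per_kwh"] for h in hours) / 1000.0

print(round(carbon_kg(hourly), 1))  # kg CO2e across the three hours
```

The point of hourly granularity is visible in the sketch: the single gas-peaker hour dominates the total, so demand-flexible workloads that shift out of that window cut emissions far more than an annual average would ever reveal. The same accounting pattern extends to water use.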

Clean energy data centres: BC has clean hydro. Build the AI factories here and wire them for heat reuse and transparency from day one.

AI can’t run on dirty power. Canada must:

  • Designate Clean Compute Zones co-located with renewables
  • Incentivize waste heat reuse for community heating, greenhouses, and industry
  • Require demand-flexible compute that shifts workloads to match renewable supply
  • Build public dashboards tracking energy, carbon, and water use
  • Partner with Indigenous governments to co-develop clean, sovereign compute nodes

Open tools and commons: Open stacks other people can pick up and run with.

BC’s strength is mycelial innovation: cross-disciplinary, open, community-built. Canada should invest in shared public AI tools (wildfire prediction, Indigenous language, healthcare analytics) rather than siloed contracts to foreign vendors.

We do not need more glossy demos. We need open stacks other people can pick up and run with.


Why This Matters for BC and Canada

BC is already proving what’s possible: monthly AI meetups in Vancouver drawing hundreds of creators, Indigenous tech hubs building sovereignty‑first frameworks, and researchers at SFU and UBC pushing applied AI in climate and creative industries. Our model is DIY over gatekeepers, grassroots over bureaucrats, regeneration over extraction.

If Canada’s AI strategy leans only on commercialization and scaling, we’ll repeat past mistakes: extraction without equity, innovation without accountability. But if we follow a BC‑led path, Canada can pioneer an AI ecosystem that is ethical, sustainable, and globally competitive.


A Call to Minister Solomon

Minister Solomon: come join a BC + AI gathering. Sit in circle with our builders, Elders, and artists. See firsthand how we’re shaping an AI ecosystem as accountable to land and people as it is to code.

Canada’s AI strategy must answer one question:

Will this serve the commons or just the corporations?


What We’re Asking For

  1. Regional Compute Commons: Public GPUs with transparent model cards and fair queue rules. Prioritize climate, health, and language projects led by communities and Nations.
  2. Open Procurement Templates: Publish docs showing dataset provenance, energy use, and limits in plain language.
  3. Community Review Boards: Fund bias and ecological audits. Give local boards veto power over high-risk deployments.
  4. Indigenous Data Governance First: No sovereignty, no project. Full stop.
  5. Public Budgets and Decisions: Transparent, hype-proof, DIY-first so the ecosystem learns together.

The Bottom Line


If Ottawa’s foot is on the gas, BC must be the brake and steering: ethics, open protocols, and land-rooted practice baked in.


If Ottawa wants proof of concept, BC can deliver working, accountable demos (from wildfire prediction to Indigenous language AI) without selling our soul.

The question isn’t whether AI will transform Canada. The question is: who benefits, and who pays the price?

BC + AI has our answer. Now let’s see if Ottawa’s listening.



Canada’s AI Task Force

Research & Talent

Gail Murphy: UBC VP Research & Innovation; Vice-Chair, Digital Research Alliance. Vancouver brain trust, deep software engineering roots.

Diane Gutiw: VP Global AI Research, CGI; Co-Chair, Advisory Council on AI.

Michael Bowling: UAlberta professor; Amii fellow; CIFAR AI Chair; RL legend.

Arvind Gupta: U of T comp sci prof; long-time national science/innovation builder.

Adoption (industry & government)

Olivier Blais: Co-founder, Moov.AI; Co-Chair, Advisory Council on AI. Standards & responsible AI tooling.

Cari Covent: Tech exec; scaled enterprise AI (Canadian Tire).

Dan Debow: Chair of the Board, Build Canada (policy lab for pro-builder ideas).

Commercialization

Louis Têtu: Executive Chair, Coveo (search/recommendation AI).

Michael Serbinis: CEO, League; Board Chair, Perimeter Institute.

Adam Keating: CEO, CoLab (hardware design collaboration).

Scaling champions & investment

Patrick Pichette: GP, Inovia (ex-Google CFO).

Ajay Agrawal: Rotman prof; founded CDL & NEXT Canada.

Sonia Sennik: CEO, Creative Destruction Lab.

Ben Bergen: President, Council of Canadian Innovators.

Safety & public trust

Mary Wells: Dean of Engineering, UWaterloo. Talent + ethics lens.

Joelle Pineau: Chief AI Officer, Cohere (ex-Meta AI). Fresh mandate after Cohere’s August fundraise.

Taylor Owen: Founding Director, Centre for Media, Technology & Democracy (McGill).

Education & skills

Natiea Vinson: CEO, First Nations Technology Council (BC). Indigenous data sovereignty + skills.

Alex LaPlante: VP, Cash Management Technology (Canada), RBC; Mitacs board. Bridges talent to jobs.

C. David Naylor: U of T professor; led Canada’s 2017 Fundamental Science Review.

Infrastructure

Garth Gibson: Chief Technology & AI Officer, VDURA (RAID co-inventor; ex-Vector CEO). Storage stack for AI.

Ian Rae: President & CEO, Aptum (Canadian multi-cloud/data centre).

Marc-Étienne Ouimette: Chair, Digital Moment; OECD One AI expert; ex-AWS global AI policy.

Security

Shelly Bruce: Distinguished Fellow, CIGI; former CSE Chief. National-security chops.

James Neufeld: Founder/CEO, samdesk (real-time crisis intelligence).

Sam Ramadori: Co-President & ED, LawZero (Bengio’s new AI-safety non-profit).