Canada’s debate over artificial intelligence sovereignty is accelerating, but it is currently trapped in a dangerously narrow lane.
Policymakers have correctly identified that digital power is the new national currency. The latest federal budget underscored this ambition by committing nearly $1 billion to develop sovereign digital infrastructure. But as the government builds out the hardware and negotiates for GPU clusters, a deeper, more existential question remains unresolved: Who governs the data systems that underpin those technologies?
A new report from the AI Competitiveness Project at the Munk School of Global Affairs and Public Policy, Sovereign by Design, argues that Canada faces a narrowing window to shape its technological future.
Authored by Sean Mullin and Jaxson Khan, the report’s central thesis is a dose of strategic realism: sovereignty in the AI era is about “freedom from coercion.” It depends less on building individual models than on securing influence across the entire technology stack — from physical chips to the cloud environments and data pipelines that fuel them.
The report identifies critical vulnerabilities, noting that Canada remains heavily dependent on foreign cloud providers and advanced hardware. It also highlights that data centres physically located in Canada may still fall under foreign jurisdiction depending on the ownership structures of the companies operating them.
These concerns are hardly new. Over the past year they have been raised repeatedly in policy discussions, industry analysis, and national security debates, and have now begun to register in government thinking as well.
The report’s most sobering takeaway, however, is a fiscal reality check: even with $1 billion in new funding, Canada cannot, and should not, achieve total self-sufficiency.
The Reality of Managed Dependency
We often hear about the nearly $1 billion allocated in Budget 2025 to expand domestic AI compute, but the Munk report puts that figure into a brutal global perspective.
The authors are blunt: “A billion-dollar investment, while a necessary down payment on our digital future, is dwarfed by the hundreds of billions being spent by global hyperscalers and foreign states.”
This isn’t an argument for more spending; it’s an argument for better design.
In a world where a single tech giant spends more on one data centre than Canada spends on its entire national AI strategy, the report concludes that “the sheer capital requirements of the global hardware and cloud markets mean that isolationism is a fiscal impossibility.”
This is the pivot point for Canadian policy. If we cannot buy our way to self-sufficiency, our goal must shift from the “illusion of total independence toward a reality of managed dependency and strategic leverage.”
For Canada, sovereignty is not a wall; it is a seat at the table. It is about ensuring that even when we use foreign-built hardware, we have enough leverage — through our energy, our minerals, and our domestic data governance — to remain free from external coercion.
Yet, this is where the current public debate stops. We are fixated on the “middle” of the stack, the compute and the cables, while ignoring the “intelligence layer” at the top.
The Infrastructure Gap
Canada can build as many data centres as it wants and fill them with the most advanced hardware on the planet. But if the data flowing through those systems — the geospatial, infrastructure, and sensor data that dictates how the country functions — is processed through platforms governed outside our borders, national autonomy remains conditional.
Without addressing the governance layer, investments in sovereign compute solve only the hardware half of the challenge. The missing piece of this debate is visible in how Canada currently manages its digital information. Outside of protected categories like health records and Statistics Canada data, our data governance is fragmented.
As data governance scholar Prof. Tracey Lauriault pointed out in an earlier GoGeomatics interview, open data is frequently pulled into foreign repositories and used in corporate AI projects.
“Think of GenAI. Do we want Statistics Canada data in someone else’s GenAI project?” she asked, explaining that while the government may put safeguards in place, no law or regulation requires companies to do the same.
“We have very little consumer protection, and probably very little corporate protection, over how Canadian data is produced and reused, whether by Canadian companies, Canadian citizens, governments, or nonprofits,” she said.
The result is a cycle where Canadian-funded data is harvested by foreign platforms to train proprietary models and analytical systems — from generative AI to predictive analytics and digital twin platforms — which are then sold back to Canadian agencies as “intelligence.” We are effectively subsidizing the digital sovereignty of others while compromising our own.
The implications extend beyond economics. When the platforms managing industrial sensors, electrical grids, transportation networks, and infrastructure monitoring systems operate under foreign jurisdiction, the operational data that underpins them can also fall under foreign legal frameworks.
In a crisis, our “sovereign” infrastructure may still be subject to foreign subpoenas, surveillance regimes, and policy shifts.
Where Sovereignty Becomes Operational
To understand why “strategic leverage” is necessary, one must look at the systems that actually run the state. Across sectors ranging from national defence and Arctic surveillance to wildfire management and infrastructure planning, the Canadian government depends on spatial data to understand real-world activity. AI and machine learning are now used to detect wildfire ignition points, monitor deforestation, and track shipping patterns in the North. These signals form the “Intelligence Stack” that triggers national policy and operational responses.
Sovereignty is not simply about where a server sits; it is about who governs the pipeline that informs a fire chief where to deploy resources. If that pipeline is foreign-owned and trained on data that Canada does not legally control, then the ability to manage our own territory has been effectively outsourced.
The implications for the geospatial sector are significant. Earth observation systems and spatial analytics are becoming part of the operational infrastructure of the state. As AI becomes embedded in these workflows, the governance of geospatial data pipelines moves from a technical IT concern to a national security priority.
From Data Suppliers to Strategic Guardians
This shift will eventually force a reckoning for the commercial industry. For years, companies in the geospatial and data sectors have operated as simple vendors. That era is ending.
As AI becomes embedded in national workflows, companies providing satellite imagery, environmental monitoring, and mapping are becoming providers of strategic national infrastructure.
But in doing so, it will no longer be enough to provide accurate data; the industry will have to answer where that intelligence lives, who can access it, and whose laws govern it.
Public agencies will face growing pressure to ensure that critical geospatial datasets are governed within domestic legal frameworks. For the private sector, this is both an opportunity and a responsibility — companies must now decide whether they are mere data vendors or partners in national resilience.
Closing the Governance Gap
The challenge before Canada is not primarily an engineering one. We can buy chips and build data centres, but we cannot “buy” sovereignty.
Sovereignty must be designed into the governance layer. It requires us to move beyond the hardware-centric debate and start asking who owns the intelligence derived from our data. Without a sovereign governance layer that dictates how data is stored, reused, and integrated into national decision-making, our investment in compute is simply a down payment on a system we do not truly own.
Canada is at a crossroads. We can continue as a “data colony,” exporting raw information and importing finished intelligence, or we can build a complete sovereign stack that governs how our data is turned into national decisions.
The window is narrowing. It is time to stop talking about the chips and start talking about the governance of the intelligence they produce.