NextFin News - Artificial intelligence systems are systematically absorbing Canadian journalism to power their responses while failing to attribute or compensate the original creators in more than 80% of cases, according to a landmark study released Monday. The report, authored by researchers at McGill University’s Centre for Media, Technology and Democracy, describes a parasitic relationship in which AI models such as ChatGPT, Gemini, Claude, and Grok ingest news archives and deliver derivative content that effectively renders the original news source obsolete for the end user.
The McGill team tested 2,267 Canadian news stories against the four leading AI models and found that all of them demonstrated "extensive knowledge" of domestic current events. Yet when the systems were queried about specific news from their training data, they failed to provide any source attribution 82% of the time. Even with web access enabled, the platforms frequently provided enough detail in their synthesized answers to satisfy a reader’s curiosity, removing any incentive to click through to the publisher’s website. While half of the responses included at least one Canadian link, the name of the Canadian news organization appeared in only 28% of instances.
This data arrives at a critical juncture for the Canadian media landscape. A coalition of the country’s largest news organizations—including the Globe and Mail, CBC/Radio-Canada, and Postmedia—is currently suing OpenAI in an Ontario court, alleging copyright infringement and the unauthorized profiting from their intellectual property. The McGill report provides the empirical backbone for these legal challenges, arguing that AI companies are extracting value at every stage of the process: from training on archives to delivering answers that cannibalize the traffic of the very outlets they rely on for facts.
The economic implications are stark. Unlike the previous era of social media, where platforms like Facebook and Google captured advertising revenue by aggregating attention around links, AI models are "absorbing the substance" of the reporting itself. The researchers noted that this shift accelerates the economic decline of journalism by making the consumer’s visit to the source unnecessary. This is no longer a battle over who gets the ad dollar for a click; it is a battle over the ownership of the information that constitutes the answer.
U.S. President Donald Trump has signaled a preference for tech-sector deregulation, but the Canadian government is moving in a different direction. Speaking at a national summit in Banff on Monday, Culture Minister Mark Miller and Artificial Intelligence Minister Evan Solomon both acknowledged the "legitimate questions" surrounding copyright and data mining. Solomon said creators require "guardrails" to ensure AI development does not come at the expense of the cultural and news industries. The federal government is currently consulting on a national AI strategy that may eventually mirror the Online News Act, which already requires tech giants to pay for news content.
The tension between Silicon Valley’s "move fast and break things" ethos and the survival of local reporting has reached a breaking point. If AI models continue to serve as a "black box" that outputs facts without credit, the financial incentive to produce those facts will vanish. The McGill study suggests that without a mandatory licensing framework or technical barriers to scraping, the very data that makes AI "intelligent" about the world will eventually dry up as newsrooms continue to shrink or shutter entirely.
Explore more exclusive insights at nextfin.ai.