Meta ban has been rough, but Google ban would be worse, say small news outlets, analysts
Federal officials hold the line as C-18's full-effect date approaches
Small news outlets and media and internet experts say the Online News Act, also known as Bill C-18, has had a serious impact so far, and it may be about to get much worse.
"We're losing, and that means the community is losing," said Theresa Blackburn, owner of the River Valley Sun, which covers daily news from Perth-Andover to Nackawic in western New Brunswick online and also prints a monthly paper with a circulation of about 6,000.
The four-year-old publication found itself cut off from readers and viewers in July, when Meta blocked Canadian news on its platforms in response to new federal legislation that was supposed to force big internet companies to pay for the news content they make available.
The transition has gone fairly well since the paper launched its own website four months ago, said Blackburn, but reader engagement has fallen dramatically.
On Facebook, River Valley Sun stories used to get 800,000 likes, shares or comments a month.
On rivervalleysun.ca, there are 60,000 visits a month, but stories can't be shared to Facebook and the paper lacks the resources to allow comments, which would require constant monitoring.
Canadian users seeking Canadian news content have been blocked from viewing it. (Meta)
It's nice having local control, Blackburn said, but the paper isn't able to do as many live reports.
That hurts its bottom line, she said, because it used to get thousands of dollars in revenue from live-streaming events for local businesses and organizations.
It also hurts the journalistic product, said Blackburn, and puts public safety at risk.
"At some point in time someone isn't going to get the information they need to be safe," she said.
Many outlets affected
The Sun is not alone. A group of 20 other news outlets across Canada, including the New Brunswick Media Co-Op, say the Facebook ban has been "a big crisis," affecting how they reach viewers, readers and listeners.
They've formed a new collaborative news platform called Unrigged, hoping to jointly benefit from a critical mass of their pooled material and share the costs of a website.
Meta, on the other hand, has lost little or no audience or advertising since it banned Canadian news, said Chris Waddell of Carleton University's school of journalism, formerly of CBC News and the Globe and Mail.
It's also been spared a lot of trouble dealing with things such as disinformation, misinformation and inflammatory comments, said Waddell.
He's pretty sure that even if Bill C-18 were to be killed, Meta wouldn't bring news back.
The survival of the River Valley Sun this long, through the pandemic and the Meta news ban, goes to show the importance of local news, said Blackburn, but she's not sure how they'll cope with what may be coming next.
The Google logo is seen on a computer in this photo illustration in Washington, DC, on July 10, 2019. (Alastair Pike/AFP/Getty Images)
Google has said it will remove links to news from its products in Canada when the Online News Act takes full effect, which will be no later than Dec. 19, 180 days after the law received royal assent.
A member of the company's media relations department, Shay Purdy, told CBC News those plans are still accurate.
Google said in its submission on the draft regulations that the new law is "unworkable" because free linking is the foundation of the open web.
It maintained that as a company it already supports journalism by linking people to Canadian news sites, to the greater benefit of news companies and Canadians than to its own bottom line.
It called the act "deeply discriminatory" because it's the only company being asked to pay — an estimated $172 million annually, a minimum of four per cent of its Canadian revenues, while only two per cent of its searches are for news.
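For rough context, a back-of-the-envelope reading of those figures is sketched below. It assumes the $172-million estimate is calculated as exactly four per cent of Google's Canadian revenues, a breakdown the company has not published, so the implied revenue base is illustrative only.

```latex
% Hypothetical arithmetic, for illustration only: if the estimated $172M payment
% equals 4% of Google's Canadian revenue, the implied revenue base would be
\[
  \text{implied Canadian revenue} \approx \frac{\$172\ \text{million}}{0.04} = \$4.3\ \text{billion per year},
\]
% while Google says news accounts for only about 2% of its searches:
\[
  \frac{\text{news-related searches}}{\text{all searches}} \approx 2\%.
\]
```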
It advocated more flexibility and suggested several amendments to the legislation so that among other things, it would only have to pay for "displaying news content," not for simply linking to it, and video and ad platforms would be excluded.
Google recently reached a deal to pay publishers in Germany the equivalent of about CA$4.8 million a year. The publishers had been seeking more than $600 million.
Chris Waddell is an academic and former journalist specializing in business and finance. (Matthew Usherwood)
The River Valley Sun relies on Google searches for 47 per cent of its website traffic, said Blackburn, and without YouTube or Facebook, she's not sure how they'd get their videos out.
Waddell has no idea if Google is really going to follow suit with a news ban, but he believes the consequences of that would be "a much more dramatic loss for all news organizations in Canada, both big and small."
He places much of the blame for the situation on large, established news organizations, including CBC.
Instead of leveraging the traffic they got from tech platforms by making their websites more user-friendly and engaging, they cluttered their stories and videos with ads and lobbied government to force Google and Facebook to give them money, he said.
But according to Blayne Haggart, Google and Meta are the parties mostly responsible for what's happening.
Blayne Haggart is an academic and former journalist and economist whose research focuses on intellectual property rights and data governance. (Submitted by Blayne Haggart)
Haggart, an associate professor of political science at Brock University, has written some articles about Bill C-18 for the Centre for International Governance Innovation and recently published a book with Natasha Tusikov called The New Knowledge: Information, Data and the Remaking of Global Power.
"It's tantamount to holding the country hostage," he said of existing and threatened news bans.
"It's a coercive use of power designed to bring the Canadian government and a democratic, legitimate legislature to heel," said Haggart.
The tech companies have set themselves up as essential infrastructure for the delivery of information — including news — and want all of the benefits, including ad revenue, without any of the responsibilities, he said.
Haggart isn't sure the government's approach is the best way to promote and safeguard a healthy information ecosystem, but it is a "legitimate" one, he said: the law was passed by Parliament with the support of three parties, and a similar approach has been implemented successfully in other countries such as Australia, where it has led to the hiring of more journalists.
Google has said the Australian legislation is different because it only applies to designated companies. It also created an incentive for the various parties to reach voluntary agreements, so it hasn't been necessary yet to designate any companies, including Google.
In Canada, the big internet companies aren't being asked for much, said Haggart — basically, to negotiate payment agreements with Canadian media companies that must meet some basic conditions set by the CRTC, and not to unjustly discriminate against or give unreasonable preference to any Canadian news media companies.
"That would be an enormous win for Canada and Canadians," he said, whereas a news ban by Google would be a big loss.
"Social media is one thing, but everybody depends on search," said Haggart, noting Google has about 90 per cent of the Canadian search engine market.
"It's basically how people find information."
The silver lining, he said, would be if people were driven to other platforms, so Google didn't have such a stranglehold.
"The fact they're able to threaten an entire Canadian industry and Canadians' access to information, which is vital to a democracy, is proof that they have far too much power and have been given far too much leeway for far too long," said Haggart.
Blackburn said she isn't surprised that big corporations "don't care about the little guy," but she does want and expect the federal government to care.
She was hoping the standoff would be resolved by now and is receptive to the idea of the legislation being softened.
'Constructive discussions' continue
The federal government doesn't seem to be backing down. An emailed statement from the Canadian Heritage Minister's office said it is "open to proposals that make the regulations stronger."
Canadians expect "tech giants" to "pay their fair share for news," it said.
"These tech platforms have to act responsibly and support the news sharing they and Canadians both benefit from," said a statement attributed to Minister Pascale St-Onge.
The minister noted that hundreds of newsrooms and thousands of jobs in journalism have been lost in the last decade across the country.
"This has had a big impact on the capacity of Canadians to get high-quality, fact-based news and information," she said.
The minister's office said it continues to have "constructive discussions with platforms" and it is optimistic the Online News Act will help make news available to Canadians in a sustainable way.
"I believe we share the goal of ensuring quality access to information and news for Canadians," said St-Onge.
Final regulations will be provided "in due time," it said.
The CRTC said the bargaining process for news outlets and the big internet companies to negotiate compensation is only expected to begin late next year or in early 2025.
Blackburn remarked with a sense of irony that the River Valley Sun will have to change from a sole proprietorship to a corporation in order to be eligible for payments.
Clarifications
- A previous version of this story contained the following paragraph: In Canada, the big internet companies aren't being asked for much, said Haggart — basically, to pay into a fund to be overseen by the CRTC and to not unduly discriminate against any particular news outlet by downranking its content, making its stories harder to find. Haggart clarified his remarks to say: In Canada, the big internet companies aren't being asked for much — basically, to negotiate payment agreements with Canadian media companies that must meet some basic conditions set by the CRTC, and not to unjustly discriminate against or give unreasonable preference to any Canadian news media companies.
Nov 26, 2023 11:51 AM AT
With files from Information Morning Fredericton
Supreme Court shields Twitter from liability for terror-related content and leaves Section 230 untouched
The Supreme Court handed Silicon Valley a massive victory on Thursday as it protected online platforms from two lawsuits that legal experts had warned could have upended the internet.
The twin decisions preserve social media companies’ ability to avoid lawsuits stemming from terrorist-related content – and are a defeat for tech industry critics who say platforms are unaccountable.
In so doing, the court sided with tech industry and digital rights groups who had claimed exposing tech platforms to more liability could break the basic functions of many websites, and potentially even create legal risk for individual internet users.
In one of the two cases, Twitter v. Taamneh, the Supreme Court ruled Twitter will not have to face accusations it aided and abetted terrorism when it hosted tweets created by the terror group ISIS.
The court also dismissed Gonzalez v. Google, another closely watched case about social media content moderation – sidestepping an invitation to narrow a key federal liability shield for websites, known as Section 230 of the Communications Decency Act. Thursday’s decision leaves a lower court ruling in place that protected social media platforms from a broad range of content moderation lawsuits.
The Twitter decision was unanimous and written by Justice Clarence Thomas, who said that social media platforms are little different from other digital technologies.
“It might be that bad actors like ISIS are able to use platforms like defendants’ for illegal – and sometimes terrible – ends,” Thomas wrote. “But the same could be said of cell phones, email, or the internet generally.”
Thomas’ opinion reflected the court’s struggle to identify in oral arguments what kinds of speech ought to trigger liability for social media, and what kind deserved protections.
“I think the court recognized the importance of these platforms for billions of people for communicating, and stepped back from interfering with that,” said Samir Jain, vice president of policy at the Center for Democracy and Technology, a group that filed briefs in support of the tech industry.
For months, many legal experts had viewed the Twitter and Google cases as a sign the court might seek sweeping changes to Section 230, a law that has faced bipartisan criticism in connection with tech companies’ content moderation decisions. Thomas in particular has expressed vocal interest in hearing a Section 230 case.
Expectations of a hugely disruptive outcome in both cases prompted what Kate Klonick, a law professor at St. John’s University, described as an “insane flood” of friend-of-the-court briefs.
As oral arguments unfolded, however, and as justices visibly grappled with the complexities of internet speech, the likelihood of massive changes to the law seemed to recede.
“I think it slowly started to creep into the realm of possibility that … maybe the Court has no idea what the hell these cases are about and had MAYBE picked them to be activist, but weren’t ready to be THIS activist,” Klonick tweeted.
Daphne Keller, director of the Program on Platform Regulation at Stanford University, agreed.
“I do think this vindicates all of us who were saying, ‘the Supreme Court took the wrong case, these ones did not present the issues they actually wanted,’” Keller told CNN.
The justices may soon have another opportunity to weigh in on social media. The court is still deciding whether to hear a number of cases dealing with the constitutionality of state laws passed by Texas and Florida that restrict online platforms’ ability to moderate content. But its handling of the Twitter and Google cases suggests it may approach any new cases carefully.
“The very fact that the justices are proceeding cautiously is a good sign and suggests a more nuanced understanding of these issues than many feared,” said Evelyn Douek, an assistant professor at Stanford Law School.
In Thursday’s Twitter decision, the court held that Twitter’s hosting of general terrorist speech does not create indirect legal responsibility for specific terrorist attacks, effectively raising the bar for future such claims.
“We conclude,” Thomas wrote, “that plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack.”
He stressed that the plaintiffs have “failed to allege that defendants intentionally provided any substantial aid” to the attack at issue, nor did they “pervasively and systemically” assist ISIS in a way that would render them liable for “every ISIS attack.”
Twitter v. Taamneh focused on whether social media companies can be sued under US antiterrorism law for hosting terror-related content that has only a distant relationship with a specific terrorist attack.
The plaintiffs in the case, the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017, alleged that social media companies including Twitter had knowingly aided ISIS in violation of federal antiterrorism law by allowing some of the group’s content to persist on their platforms despite policies intended to limit that type of content.
“Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” said Halimah DeLaine Prado, Google’s general counsel, in a statement. “We’ll continue our work to safeguard free expression online, combat harmful content, and support businesses and creators who benefit from the internet.”
Twitter did not immediately respond to a request for comment.
Court sidesteps Google challenge, leaving Section 230 untouched
In a brief opinion, the court declined to rule on the merits of the case against Google, sending the lawsuit that accuses its subsidiary YouTube of aiding and abetting terrorism back to the lower court and leaving the Section 230 liability shield untouched.
The outcome will likely come as a relief not only for Google but for the many websites and social media companies that urged the Supreme Court not to curtail legal protections for the internet.
The opinion was unsigned, and the court said: “We decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief. Instead, we vacate the judgment below and remand the case for the Ninth Circuit to consider plaintiffs’ complaint in light of our decision in Twitter.”
No dissents were noted.
The case involving Google zeroed in on whether it can be sued because of its subsidiary YouTube’s algorithmic promotion of terrorist videos on its platform.
The family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris, alleged that YouTube’s targeted recommendations violated a US antiterrorism law by helping to radicalize viewers and promote ISIS’s worldview.
The allegation sought to carve out content recommendations so that they do not receive protections under Section 230, potentially exposing tech platforms to more liability for how they run their services.
Google and other tech companies have said that that interpretation of Section 230 would increase the legal risks associated with ranking, sorting and curating online content, a basic feature of the modern internet. Google claimed that in such a scenario, websites would seek to play it safe by either removing far more content than is necessary, or by giving up on content moderation altogether and allowing even more harmful material on their platforms.
Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others suggested that the stakes were not limited to algorithms and could also end up affecting virtually anything on the web that might be construed as making a recommendation. That might mean even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several volunteer Reddit moderators.
Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued to the court that Congress’ intent in passing the law was to give websites broad discretion to moderate content as they saw fit.
The Biden administration also weighed in on the case. In a brief filed in December, it argued that Section 230 does protect Google and YouTube from lawsuits “for failing to remove third-party content, including the content it has recommended.” But, the government’s brief argued, those protections do not extend to Google’s algorithms because they represent the company’s own speech, not that of others.