The assassination attempt on former President Donald Trump starkly highlights the razor-thin line between profit and ethics that platforms must walk as the presidential race heats up.
In the chaotic aftermath of the incident at Trump’s campaign rally in Pennsylvania on Saturday, July 13, a chorus of opportunists rushed to cash in on the swirling hysteria, speculation, and conspiracy theories.
Political advertisers on Meta platforms have already begun using the shooting to sell assassination-related merchandise ranging from T-shirts and shot glasses to trading cards. Coffee mugs and tumblers show members of the Secret Service pulling the former president off stage along with the text “Legends Never Die.” T-shirts with the same image feature text like “Fight For America,” “Not Today,” and “I get shot and I get up again!”
All these examples were shown in Meta’s ads library, according to Digiday’s analysis of the publicly available record of all political ads bought on the platform. Many of the shooting-related ads were small media buys of less than $100, with Meta’s ad library showing each ad receiving only a few thousand impressions or fewer. However, they illustrate how even small right-wing e-commerce websites, political sites, and political affiliates are seeking to capitalize on the attack — and making Meta a little money in the process.
Other shooting-related products being marketed on Meta include picture frames, golf balls, and a shot glass with Trump’s image underneath “take your shot.” There’s also a whiskey glass showing a bloody Trump with his fist in the air along with the text “Impeached, arrested, convicted, shot, still standing.” (That advertiser, Old Southern Brass, was also fined last year by the Federal Trade Commission for making false claims about products being “Made in the USA” and having veteran affiliations.)
Beyond selling products, other political ads make false claims about the shooting — including some with misinformation. New ads paid for by Conservative Voices of America falsely suggest the attack was orchestrated by Biden and the “deep state” — and then invite Trump supporters to fill out a survey in exchange for a free “Trump flag and gold coin.” (One of the same group’s ads features a video message from Alex Jones, who is permanently banned from Meta platforms.) Another ad, bought by a group called Remember In November, wrongly claims the shooter was a “far left progressive Democrat” despite the shooter being reportedly registered to vote as a Republican. Other examples include an ad supporting Missouri State Rep. Brian Seitz — paid for by Seitz Conservative Coalition — which suggests the shooting “was not just an isolated event.”
It’s too soon to predict how platforms will respond to this, especially regarding the more objectionable ads. Will they block them entirely or simply throttle their reach?
Meta did not respond to Digiday’s request for comment by press time, but the ads library shows at least one ad has already been removed for not including proper political disclaimers. In November 2023, Meta’s president of global affairs, Nick Clegg, outlined the platform’s game plan for elections, citing around 40,000 experts working on safety and security across its family of apps and some $20 billion invested in those efforts since 2016.
Whatever their approach, platforms will have a lot to contend with. They have to police what’s happening on their platforms to a degree that feels sufficient while still respecting free speech, and police them enough that advertisers remain happy to spend their dollars in increasingly risky environments.
“The weekend’s events must have put advertisers on high alert,” noted Jasmine Enberg, vp and principal analyst of social media and the creator economy at eMarketer. “The platforms are also likely walking on eggshells, as they try to size the extent to which this election cycle could damage their businesses and determine what — if anything — they can do to prevent both monetary and real-world harm.”
Those concerns were heightened last week when Meta rolled back restrictions on Trump’s accounts that had been in place since 2021, after he praised those who joined the deadly storming of the U.S. Capitol. It’s the climax of a slow rehabilitation that kicked off last year when Trump’s accounts were reinstated.
Nevertheless, reinstating Trump is a hot potato, bound to attract ongoing scrutiny as the fallout continues. Platforms argue they’re balancing free speech, responsibility, user safety and their own business interests. Critics counter that they should take a hardline stance against someone they believe erodes the truth. The assassination attempt on Trump will push both sides to their limits.
“These platforms will need to get really aggressive and proactive to combat the misinformation and fraud,” said Jacek Chrusciany, CEO and co-founder of Adfidence. “They need to publicize clear guidelines on how they’re moderating content and enforcing standards to build trust with both users and advertisers, if they want to keep ad dollars coming to these platforms. Behind the scenes, they’re going to need to deploy a mix of AI and human moderation to stay on top of it all.”
Meta isn’t the only platform with a game plan for the election year. TikTok chimed in with a similar announcement in January, stating that the platform is consulting with more than 50 experts, including its safety and content advisory councils, for this historic election year. In the same month, Snap took a cheeky swipe at rivals in its own election planning announcement, claiming “We don’t boost misinformation or Groups. Instead, we moderate content before it spreads and showcase news from trusted sources worldwide.”
The message from the platforms is clear: trust us, we are taking the election year seriously. But are the actions they are taking actually enough?
“They’re [social platforms] all taking those steps to ensure election integrity,” said Anastasia Nairne, consulting and strategy, senior principal at PMG, who caveated that they can only do so much. She noted that while the 2020 election was contentious, U.S. consumers have been feeling a growing sense of unease for a while now, so much so that brands have been asking since Q1 how they should position themselves during an election year, knowing that they’ll be held to higher standards this time around.
Indeed, the landscape has changed dramatically. During the 2020 election, the main conversations took place across Meta platforms and then-Twitter. Now TikTok, vertical video and even AI have thrown huge curveballs into the mix. As a result, more and more people’s frame of reference is shaped more by the algorithms behind the content they consume on X, Instagram, TikTok et al. than by the actual events themselves. These algorithms — and thus that frame of reference — evolve as new information is posted, regardless of its accuracy.
Platforms will only go so far in moderating this — they always do. It often falls to regulators to push them the extra mile. This happened in Europe with the Digital Services Act, which aims to “create a safer digital space” by enhancing transparency, monitoring content, and curbing monopolistic practices of what it deems “large online platforms.”
However, in the U.S. no such regulation exists — at least not yet. Instead, free speech and other legal protections limit what the government can do when it comes to reining platforms in.
The impact on advertisers will vary significantly across platforms. X, heavily reliant on a few major advertisers, often sees spending pauses after negative news due to its shaky spot in media plans. In contrast, Meta, with over 10 million advertisers who rely on Facebook and Instagram, faces different challenges. These advertisers know their ads might occasionally pop up next to questionable content. Plus, Facebook and Instagram’s traffic is less driven by current events, reducing their exposure to negative headlines compared to X.
“Responsible platform operators should gear algorithms towards accuracy and broader perspective rather than sensationalism,” said Jamie MacEwan, senior research analyst at Enders Analysis. “Those standards should also be expected of professional media output. Without these precautions, there is always the risk that platforms and the media themselves become part of the story.”