{"id":37898,"date":"2025-11-06T12:31:09","date_gmt":"2025-11-06T12:31:09","guid":{"rendered":"https:\/\/agooka.com\/news\/business\/scam-ads-are-flooding-social-media-these-former-meta-staffers-have-a-plan\/"},"modified":"2025-11-06T12:31:09","modified_gmt":"2025-11-06T12:31:09","slug":"scam-ads-are-flooding-social-media-these-former-meta-staffers-have-a-plan","status":"publish","type":"post","link":"https:\/\/agooka.com\/news\/business\/scam-ads-are-flooding-social-media-these-former-meta-staffers-have-a-plan\/","title":{"rendered":"Scam Ads Are Flooding Social Media. These Former Meta Staffers Have a Plan"},"content":{"rendered":"<p>When billionaire Dutch TV producer John de Mol sued Facebook in 2019 over its alleged failure to stop scammers from using his image in deceptive ads, the social media company sent Rob Leathern to Amsterdam to meet with de Mol\u2019s team and to speak with the media.<\/p>\n<p>\u201cThe people who push these kinds of ads are persistent, they are well funded, and they are constantly evolving their deceptive tactics to get around our systems,\u201d Leathern told Reuters at the time.<\/p>\n<p>During his four years at the company now known as Meta, Leathern was in many ways the public face of its effort to fight scam ads. He led the business integrity unit tasked with preventing scammers and other bad actors from abusing Meta\u2019s ad products. He regularly spoke to the media about scam ads. Leathern also oversaw transparency efforts like the Meta Ad Library, the industry\u2019s first free and searchable repository of digital ads, and the launch of identity verification for political advertisers.<\/p>\n<p>But since leaving Meta at the end of 2020, Leathern has watched as criminals deployed deepfakes and used artificial intelligence to craft more convincing scam ads. 
He said he became alarmed as major platforms failed to invest in teams and technology at the rate needed to fight such exploitative ads.<\/p>\n<p>\u201cThe technology and the progress has stagnated the last five years,\u201d Leathern said in an interview. \u201cI also feel like we just don&#039;t really know how bad it&#039;s gotten or what the current state is. We don&#039;t have objective ways of knowing.\u201d<\/p>\n<p>Leathern has teamed up with Rob Goldman, Meta\u2019s former vice president of ads, to launch CollectiveMetrics.org, a nonprofit aimed at bringing more transparency to digital advertising in order to fight deceptive ads. The goal is to use data and analysis to measure things such as the prevalence of online scam ads and to lift the veil on the opaque ad systems that generate hundreds of billions of dollars in revenue for companies like Meta.<\/p>\n<p>Their effort comes as losses due to scams have skyrocketed around the world. The Global Anti-Scam Alliance, an organization that researches scam trends and includes leaders from Meta, Google, and other platforms on its advisory board, estimates that victims collectively lost at least a trillion dollars last year. Its 2025 Global State of Scams report found that 23 percent of people have lost money to a scam.<\/p>\n<p>The report said that many victims fail to report scams due to feeling ashamed or because they don\u2019t know who to tell. Of those who did report a scam, more than a third said that \u201cno action was taken by the platform after reporting it.\u201d<\/p>\n<p>Leathern said that it\u2019s impossible to know exactly how many scam ads there are on platforms like Facebook and YouTube because the companies don\u2019t make data accessible for independent research.<\/p>\n<p>\u201cI want there to be more transparency. 
I want third parties, researchers, academics, nonprofits, whoever, to be able to actually assess how good of a job these platforms are doing at stopping scams and fraud,\u201d Leathern said. \u201cWe&#039;d like to move to actual measurement of the problem and help foster an understanding.\u201d<\/p>\n<p>As a first step, they commissioned an online survey of 1,000 American adults to gauge how consumers view efforts by platforms to fight deepfakes and scam ads. Almost half of people (47 percent) said that TikTok is doing a poor or very poor job, the highest of the platforms polled. Facebook and Instagram were the next worst. Thirty-eight percent of respondents said Facebook was poor or very poor at preventing deepfakes and scam ads, while 33 percent of people said the same of Instagram. People over 55 had the most negative view of TikTok and Meta\u2019s efforts, with 61 percent saying that TikTok does a poor or very poor job, and 47 percent and 43 percent saying the same of Facebook and Instagram.<\/p>\n<p>The low numbers for TikTok and for two Meta products suggest that consumers have an overall negative perception of the companies\u2019 anti-scam efforts, according to Leathern.<\/p>\n<p>\u201cPeople seem quite more negative than I would have expected,\u201d he said.<\/p>\n<p>He added: \u201cThere&#039;s been a loss of institutional knowledge at some of these companies. 
I just think we&#039;re in for a hard time, and I don&#039;t see the mechanisms in place for much accountability yet.\u201d (Leathern\u2019s wife currently works in product marketing at Meta.)<\/p>\n<p>Melanie Bosselait, a TikTok spokesperson, said in an email that the company\u2019s Community Guidelines prohibit \u201cattempts to scam, trick or defraud people.\u201d TikTok also offers educational resources, including an article entitled \u201cHow We Fight Scams and Fraud on TikTok.\u201d Bosselait said that TikTok uses a mix of automated and human systems to enforce its rules, and that it regularly reviews and strengthens such systems.<\/p>\n<p>Meta spokesperson Daniel Roberts said the company has continued to invest in fighting scams since Leathern left the company.<\/p>\n<p>\u201cWe aggressively fight scams on our platforms, and as scammers have grown in sophistication in recent years, so have our efforts,\u201d Roberts said in an emailed statement. \u201cIn fact, since this former employee left Meta a half-decade ago, we have expanded our multi-layered approach to combatting scams by launching global awareness campaigns that help people spot scams, collaborating with cross-industry partners to disrupt these networks, and rolling out facial recognition technology to detect and remove celeb-bait ads.\u201d<\/p>\n<p>Roberts said that Meta has seen a more than 50 percent decline in user reports about scam ads since the summer of 2024, and removed more than 134 million scam ads this year.<\/p>\n<p>Meta is currently being sued in California by Australian billionaire Andrew Forrest, who alleges that the company\u2019s automated ad systems assisted investment scammers in placing ads that impersonated him. 
In a court filing, Meta disclosed that it had hosted roughly 230,000 scam ads that featured Forrest\u2019s likeness since 2019.<\/p>\n<p>An October report from the Tech Transparency Project found that Meta has recently earned at least $49 million from scam advertisers that often used deepfakes of public figures like Donald Trump, Elon Musk, and Alexandria Ocasio-Cortez.<\/p>\n<p>Leathern said one potential reason that scam ads are still widespread on platforms is that the companies worry that \u201ctoo much good revenue will get flushed out if they are more aggressive about getting rid of the bad.\u201d<\/p>\n<p>Roberts disagreed.<\/p>\n<p>\u201cWe fight fraud and scams because people on our platforms don\u2019t want this content, legitimate advertisers don\u2019t want it, and we don\u2019t either,\u201d he said. \u201cThat&#039;s why we\u2019re always looking for new ways to stop them and take them down.\u201d<\/p>\n<p>CollectiveMetrics.org\u2019s survey data shows that consumers generally believe that platforms and governments have a responsibility to prevent scam ads. But only 36 percent of respondents said digital platforms are doing a very or somewhat good job fighting deepfakes and scam ads.<\/p>\n<p>\u201cConsumers in the US definitely expect both tech companies and the government to help protect them from the potential negative effects of deepfakes,\u201d Leathern said. \u201cAnd also they don&#039;t feel like platforms are doing a great job yet in terms of preventing scams and deepfakes.\u201d<\/p>\n<p>Just under 50 percent of respondents aged 18 to 54 said it\u2019s \u201cvery important\u201d for the government to pass laws to stop deepfake ads. 
People over 55 were even more supportive of government action, with 65 percent saying it\u2019s very important.<\/p>\n<p>Sixty-seven percent of respondents aged 55 and older said it was \u201cvery important for online platforms to prevent fraudulent ads,\u201d compared to 55 percent of those aged 54 and under.<\/p>\n<p>\u201cI think the older users are disproportionately getting targeted by scams and problematic offers,\u201d Leathern said.<\/p>\n<p>The survey showed that people think TikTok and Meta are doing the worst job preventing deepfake scam ads. But Leathern said we lack real data to understand how such platforms are actually performing.<\/p>\n<p>\u201cLet&#039;s have some independent third parties be able to look at whether you have more fraud and scams than YouTube does. Because, look, I&#039;ve worked at both Google and at Meta, and people tell me all the time, the ads on Google ads are terrible,\u201d said Leathern, who worked on privacy products at Google from 2021 to 2023. \u201cI&#039;d love to have that conversation with real data.\u201d<\/p>\n<p>The challenge is that it\u2019s currently impossible for researchers, governments, and other third parties to fully assess the performance of platforms. Even the Digital Services Act in the European Union, which mandates additional data transparency and reporting by major platforms, hasn&#039;t resulted in the kind of data that\u2019s needed to perform large-scale audits of ads and advertisers, according to Leathern.<\/p>\n<p>\u201cI think it&#039;s super well intentioned,\u201d he said of the DSA. \u201cI think that they aren&#039;t necessarily requiring the right metrics to be surfaced or the right information to be provided to the public. So I think those laws need to evolve.\u201d<\/p>\n<p>Leathern said that the ideal scenario is for platforms to see scam prevention as a competitive advantage and to protect users by investing in new features and systems. 
He recently proposed that platforms should notify users who clicked on an ad that was later removed for violating policies against scams and fraud.<\/p>\n<p>\u201cThese scammers aren&#039;t getting people&#039;s money on day one, typically. So there&#039;s a window to take action,\u201d he said.<\/p>\n<p>Leathern also said that platforms should have to donate or otherwise disgorge the money earned from scam ads placed via their systems. As of today, Meta, Google, TikTok, and other companies remove scam ads but keep the money that was spent to run them.<\/p>\n<p>\u201cIt certainly shouldn&#039;t necessarily be enriching companies if there&#039;s scammy ads being run,\u201d he said. \u201cThe revenue could also be used in other ways to fund nonprofits to educate people about how to recognize these kinds of scams or problems. There&#039;s lots that could be done with funds that come from these bad guys.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When billionaire Dutch TV producer John de Mol sued Facebook in 2019 over its alleged failure to stop scammers from using his image in deceptive ads, the social media company sent Rob Leathern to Amsterdam to meet with de Mol\u2019s team and to speak with the media. 
\u201cThe [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":37899,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[],"class_list":{"0":"post-37898","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business"},"_links":{"self":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/37898","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/comments?post=37898"}],"version-history":[{"count":0,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/posts\/37898\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media\/37899"}],"wp:attachment":[{"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/media?parent=37898"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/categories?post=37898"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/agooka.com\/news\/wp-json\/wp\/v2\/tags?post=37898"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}