Privacy and AI policies to watch in 2024

This article is part of a series exploring trends in marketing, media and media buying for 2024.

As policymakers and businesses continue to discuss possible regulation around AI, 2024 is also shaping up to be another big year for data privacy.

It will be a busy year for anyone tasked with tracking new privacy laws or proposed legislation on an international, national and state level — compounded by rules related to other issues like AI and antitrust.

Regulatory discussions are concurrent with Google’s plan to officially begin deprecating third-party cookies. On Jan. 4, the company began testing its new Tracking Protection feature to block third-party cookies and limit cross-site tracking before a full phase-out takes place in the second half of 2024.

There is also rising tension between competition and privacy. One example is Epic Games and its separate lawsuits against Apple and Google. Epic alleged anticompetitive practices in the respective app stores, while both Apple and Google argued their app stores’ policies provided stronger privacy and security for consumers.

Although many new state privacy laws are quite similar, privacy lawyers say differences in the laws also create new compliance challenges. Jane Horvath, an attorney at the law firm Gibson Dunn, noted that part of the reason the European Union passed its General Data Protection Regulation was to ensure existing EU privacy rules were interpreted consistently across member countries.

“Most companies think in terms of country borders, not state borders,” said Horvath, who previously was Apple’s chief privacy officer. “When you’re running a global company, what you’re trying to find is some common standard that you can apply. It’s very hard if you’ve got to apply 100 different privacy laws. You’re trying to find a course set and I think the U.S. is going to be increasingly more difficult with the free flow of data.”

Companies aren’t entirely ready to handle more privacy regulations. When the law firm Womble Bond Dickinson surveyed executives in April and May of 2023, only 45% said they were “very prepared” to address U.S. laws and regulations — down from 59% in 2022. In the U.S., only 42% of execs had completed comparisons of state privacy law frameworks, while 60% had a hard time tracking state privacy bills or knowing the differences between state privacy laws. Of those doing business in Europe and the UK, 35% said they were “moderately prepared” to follow EU and British privacy laws, while 53% said they were “very prepared.”

Here’s a look at some of the international, national and state issues to expect in 2024 related to privacy, AI and antitrust:

State laws

In 2023, new privacy laws went into effect in California, Colorado, Connecticut, Virginia and Utah, with Utah’s only in effect since Dec. 31. Fifteen other states passed privacy legislation. Meanwhile, lawmakers in eight other states introduced new bills or statutes that failed to pass. (A recent Politico report detailed how L.L. Bean fought a proposed privacy law in Maine.)

In 2024, new privacy rules take effect in Texas in July, followed by Montana and Oregon in October. And in 2025, at least three more will take effect in Iowa, Tennessee and Delaware.

In California, the recently passed Delete Act will raise transparency and accountability standards for data brokers and create a way for state residents to request brokers delete their data. It won’t go into effect until 2026, but some privacy lawyers think the Delete Act could have less impact than expected.

Many states have focused on passing new laws to protect health-care and biometric data, which legal experts expect to continue as a theme in 2025. Other states are aiming to add more protections for children. (Utah’s new law aims to curb kids’ use of social media apps by requiring age verification as well as parental consent.)

While Europe has taken the route of requiring that consumers opt in to targeted ads, new U.S. state laws have added ways for consumers to more easily opt out. But if future state laws include new opt-in rules, that could ratchet up existing standards.

Many state privacy bills are generally similar, but there are some differences in definitions and other details. The platform-agnostic approach of many of the laws will also require marketers and tech teams to more deeply examine their data, according to Jessica Lee, chair of the privacy, security and data innovations practice at the law firm Loeb & Loeb.

“Depending on what platforms you’re using, the answer to that question might be different,” Lee said. “This is going to require kind of digging at a technical level to understand how data is coming into your systems.”

As the state privacy landscape becomes more robust — and more complex — adtech groups have sought to help companies navigate all the changes. Last fall, the Interactive Advertising Bureau debuted a new privacy program and new multi-state framework to help companies comply with the myriad state laws.

U.S. regulations — What’s on the horizon

Despite calls for a new U.S. privacy law, efforts in Congress are still stuck. And with 2024 being an election year, some experts think lawmakers will find it too risky to rock the boat. But even without Congressional action, federal agencies are also looking for ways to strengthen privacy rules. 

Just last month, the Federal Trade Commission proposed updates to the Children’s Online Privacy Protection Act (COPPA), which would include sweeping changes for social media, games, education platforms and other apps. The proposal would place new limits on how companies monetize data for kids under 13, such as turning off ad targeting by default and restricting push notifications.

Privacy concerns related to kids are much more acute, said Joseph W. Guzzetta, a California-based attorney with the law firm Grellas Shah.

“Privacy is incredibly popular,” Guzzetta said. “Although people don’t want to pay for it, they’re very bullish on privacy in the abstract. When you say, ‘This privacy law is going to cost you $12,’ they say, ‘No way.’ People are willing to pay more and might be willing to do that with respect to their children, but not necessarily themselves.”

There are also ongoing efforts from the White House on the AI front, including President Joe Biden’s executive order, which could have implications for how companies develop AI systems and how government agencies use them.

Another agency to watch is the Consumer Financial Protection Bureau, which recently proposed new rules around sharing consumer financial data.

International laws

Perhaps one of the most prominent international laws to impact privacy is actually one focused on AI. In December, European Union officials reached an agreement on the AI Act, marking a major policy milestone that could impact more than just AI. First proposed in 2021, the AI Act would enact a sweeping set of new rules to regulate how AI systems are developed and used. 

Although the AI Act doesn’t directly address advertising, privacy experts point out it could still have a major impact on how companies collect and use data for their AI systems.

Other laws to keep an eye on include the EU’s Digital Services Act, which regulates online and social platforms to prevent illegal and harmful activities and went into effect last year. Just last month, a new complaint was filed against X — formerly known as Twitter — alleging it violated the DSA by allowing an advertiser to target people with ads based on sensitive data like political and religious information.

AI and antitrust: The privacy world’s wildcards?

Other regulatory efforts and antitrust lawsuits could also play a role in privacy debates, both directly and indirectly. New laws and court battles in the U.S. and Europe could change how personal data is used for everything from training AI models to deploying them.

There are additional concerns about AI that go beyond generative AI. Consumer advocates say governments should curb algorithmic feeds on social media, while some researchers warn that large language models pose a range of risks for consumer data and child safety. (In the UK, regulators recently sent a letter to Snap about how the company vetted its “My AI” chatbot to make sure it’s safe for kids to use.)

Last year, at least 10 states in the U.S. added new AI regulations inside broader consumer privacy laws, according to the Electronic Privacy Information Center. More will likely be introduced in 2024 at both the state and local level.

“We will see a collision of privacy regulation with other domains – from competition policy, AI governance, and trade policy to free speech, national security, and safety,” said Caitlin Fennessy, IAPP’s vp and chief knowledge officer. “Privacy rules and requirements have long bumped up against other policy priorities. In 2024, they will crash head on.”
