OPINION – Digital deception: Bad actors already want a piece of the 2020 election, and we still haven’t dealt with 2016

By ANN M. RAVEL

Amid the flurry of debates, ads and candidate speeches for the 2020 presidential election, another hallmark of the campaign season has appeared: digital deception. It seems almost a foregone conclusion that foreign powers or other bad actors will try to use social media to spread false information and influence political campaigns.

From a fake Joe Biden website to bots amplifying false and divisive conspiracies about Kamala Harris’s background, political actors are manipulating stories and spreading false information online in efforts to influence voters. All of this will inevitably rob voters of their ability to understand who is trying to influence them, their power to compare candidates and their trust in the political process.

I first raised this issue as vice chair of the Federal Election Commission in 2014. Under current law, digital political ads still don’t require on-ad disclaimers showing who paid for them, individuals can still be targeted based on incredibly specific criteria — with hardly any disclosure of who is doing the targeting and how — and practices like purchasing social media bots or using fake accounts to amplify political messages (known as astroturfing) still go mostly unchecked.

LOTS OF HOLES
While Facebook, Twitter and Google have all introduced political ad transparency databases, these are full of holes.

Facebook’s system, for example, still makes it very difficult to understand who is actually paying for ads; Twitter does not list advertisers, so you must already know whom to search for to see who is spending money on the platform; and Google defines “election ads” so narrowly that only a small subset of political ads even makes it into the database. These efforts at transparency show that self-regulation can’t take the place of government intervention when it comes to protecting democracy.

ROADBLOCK IN SENATE
It doesn’t have to be this way, though. Several pieces of legislation have been proposed in Congress to address these issues.

For example, the For the People Act, a comprehensive package of reforms that includes regulations to create greater digital transparency and prevent foreign election interference, was introduced in Congress in January. As one of the first bills the House passed this year, the act spoke to many representatives’ commitment to enact legislation that would protect the integrity of our democracy.

Yet, despite Senate Majority Leader Mitch McConnell’s recent support for boosting election security funding in the states, federal reforms like this one still stand little chance of passing in the Republican-controlled Senate, with McConnell adamant he won’t even bring it to the floor.

A similar fate apparently awaits:

• The Honest Ads Act, which would require online platforms with more than 50 million monthly viewers to disclose who paid for political advertisements and any targeting criteria used in those ads, as well as to maintain a public database of ads.

• The CONSENT Act, which would vastly increase internet users’ privacy protections and control over their personal data — reducing the ability of advertisers to engage in invasive targeting practices — will also almost certainly languish without ever seeing a vote.

• The Bot Disclosure and Accountability Act, which would simply require automated social media accounts to be publicly identified as bots, will probably not make it out of committee.

These bills are our best chance at limiting the spread of disinformation. Meanwhile, far from supporting these bills or introducing other measures to protect American democracy, the president has actually invited election interference, saying that he would potentially welcome foreign “help.”

In this environment, deceptive political communication flourishes with nearly no transparency.

URGENT THREAT TO OUR DEMOCRACY
Our current shortcomings in dealing with digital disinformation raise enormous concerns about how we will deal with deceptive techniques yet to come, such as deepfake audio and video.

For instance, when a video of Nancy Pelosi, doctored to make her appear drunk, rapidly went viral, Facebook left the content up, raising questions about how the platform handles false content.

How will we address digital deception as artificial intelligence and other computing technologies advance if we can’t handle today’s threats? Congress must come together to pass laws that address the urgent threat that digital deception poses to our democracy.

In April, FBI Director Christopher Wray stated, chillingly, that the tactics used in the 2018 midterm elections were merely a “dress rehearsal” for the 2020 election. If our inaction so far is any indication, we are far from ready for the full show.

(Ann M. Ravel is the Digital Deception Project director at MapLight and previously served as chair of the Federal Election Commission. The opinions expressed in this commentary are her own. Reprinted from CNN Business.)
