• Privacy Rules Won't Fix Real Problem in Facebook Scandal

    The Facebook-Cambridge Analytica revelations have prompted renewed focus on privacy rules, although investors don't seem too concerned, judging by the nearly $50 billion increase in Facebook's market value after it announced its first-quarter earnings. What should concern us all, however, is not Wall Street's views on privacy rules, but that Congress and so many others believe privacy is the key policy area to focus on in response to the incident.

    By its own admission, Facebook did not show sufficient interest in policing the privacy standards it promised its users. But the policy responses generally being considered (new privacy rules based on the European Union's stricter standards) are not directly relevant to the issue at hand, because they would have done little, if anything, to prevent the damage the Cambridge Analytica incident helped foster.

    Nobody among the 87 million users whose data Cambridge Analytica obtained appears to have been harmed directly. That shouldn't be surprising, since Facebook does not generally collect sensitive information such as health and financial data.

    But that does not mean the incident was harmless. 

    The real harm came from the ability of bad actors — the Russian government in particular — to use social media to promote misinformation in an effort to suppress voter turnout and change votes, as Robert Mueller so carefully laid out in his indictment of three organizations and 13 Russian nationals. For example, Mueller notes that in July 2016, “[d]efendants and their co-conspirators purchased advertisements on Facebook to promote the ‘March for Trump’ and ‘Down with Hillary’ rallies.”

    Thus, while the users whose data was taken were not directly harmed, anyone whose voting behavior changed because of misinformation targeted with the help of that data was harmed. These private harms aggregate into larger social harms if they affected the outcome of any election — not just the presidential election, but state and local elections as well.

    Regardless of whether one believes European-style privacy rules would be a net benefit, they are not a response to the problem at hand. After all, strict privacy rules did not prevent similar election interference in Europe. 

    To its credit, Facebook has announced its intention to require more transparency about the identity of buyers of political ads, much as political ads in traditional media include a line saying, "I am politician so-and-so, and I approve this message." But this change, beneficial though it may be, may be difficult to enforce, especially if political messages are disguised as news or other supposedly non-political posts. We may also see pushback against this rule from U.S. politicians themselves when they find themselves unable to instantly post campaign ads in the next election cycle.

    A famous cliché says that it takes a theory to beat a theory. And I have no good suggestions for what the right policy solutions are. Still, it is useful to reframe the debate so that it focuses on ways to address the issue rather than on ways to implement a separate agenda that is only tangentially related. 

    We will probably never know whether the misinformation campaigns affected the outcome of any election. But we should want to make such campaigns more difficult to carry out in the future. Economic regulation was never intended as a tool to protect our social choice mechanisms from well-financed targeted attacks, and we should not allow the Facebook-Cambridge Analytica incident to obscure the reluctance of the Trump administration and Congress to respond properly to attacks on election integrity.

    Let Facebook eat crow. Let's have a robust debate on privacy based on empirical evidence about how much people truly value their privacy, in word and deed. That conversation needs to include the costs and benefits of different policy approaches to regulating the data-driven economy. But that is a separate debate.

    We must remember that what the Facebook-Cambridge Analytica incident reveals is how easy it was for the Russian government and others to rapidly spread misinformation through advertising channels in an attempt to affect an election's outcome. This problem is larger than the ad network of a single platform, but Facebook should be responsible for the potential and dangers of its own technology, and the administration and Congress should not feign ignorance of election interference in the information age.

    Scott Wallsten is president and senior fellow at the Technology Policy Institute.


  • Post-Scandal, Facebook Isn't the Only Social Media Site in a Downward Spiral

    Facebook and the shady data marketing firm Cambridge Analytica are dealing with the blowback of a recent scandal. And all of social media is feeling the heat.

    As the story broke, millions of Americans realized that by taking a simple quiz on Facebook, they had given up their personal data to feed an algorithm that was then used for political propaganda.

    While the scandal is definitely the biggest blow to Facebook's reputation to date, it's certainly not the first. In the past, the company has been involved in controversies over the spread of fake news, the dissemination of racist content, and the live streaming of homicides. But this time, the PR crisis had a tangible cost — $60 billion, or 11.4 percent of the company's market value, went up in smoke in the two days after the story broke.

    This massive financial loss isn't just the consequence of a scandal; it's a symptom of a deeper crisis: investors know that the trust users once placed in Facebook and other social platforms cannot be restored.

    Mark Zuckerberg's mealy-mouthed attempt at damage control (taking responsibility for the mishandling of user data, charting a path forward) is unlikely to regain the confidence of users, or of investors.

    The U.S. Federal Trade Commission, which makes sure companies don't violate their privacy policies, could slap Facebook with a multi-million-dollar fine if it finds the company breached its privacy commitments, Bloomberg reports. The same could happen to other companies with lax privacy practices, and investors were shaking in their boots. As a result, Twitter shares tanked along with Facebook's, dropping as much as 11 percent on Tuesday, the most since July 2017, according to Bloomberg. Snap's shares also fell nearly 3.7 percent over the past five days, after a big plunge on Tuesday.

    It’s unlikely that users will drop Facebook altogether, as WhatsApp founder Brian Acton urged. Facebook is too embedded in the daily lives of billions of people. For some, going on the internet is synonymous with logging onto Facebook.

    But people's attitudes toward social media will now be different. Most users already knew a bit about the lack of privacy on social media, but up until now the risk was mostly theoretical, abstract.

    Now that people know that their page likes, quiz answers, and other frivolities may have played a part in electing the president, they will take all of that more seriously. And all social media platforms, without exception, will have to reckon with that.




