Another day, another horrific Facebook privacy scandal. We know what comes next: Facebook will argue that the loss of so much of our data proves that bad third-party actors are the real problem, and that we should trust Facebook to make even more decisions about our data in order to protect us from them. If history is any indication, that’ll work. But if we finally wise up, we’ll respond to this latest crisis with serious action: passing America’s long-overdue federal privacy law (with a private right of action) and forcing interoperability on Facebook so that its user/hostages can escape its walled garden.
In January 2021, Motherboard reported on a bot that was selling records from a 500 million-plus person trove of Facebook data, offering phone numbers and other personal information. Facebook said the data had been scraped by exploiting a vulnerability that existed as early as 2016, and which the company claimed to have patched in 2019. Last week, a dataset containing 553 million Facebook users’ data—including phone numbers, full names, locations, email addresses, and biographical information—was published for free online. (It appears to be the same dataset Motherboard reported on in January.) More than half a billion current and former Facebook users are now at high risk of various kinds of fraud.
At noon on January 20, 2021, Joseph R. Biden, Jr. was sworn in as the 46th President of the United States, and he and his staff took over the business of running the country.
The tradition of a peaceful transfer of power is as old as the United States itself. But by the time most of us see this transition on January 20th, it is mostly ceremonial. The real work of a transition begins months before, usually even before Election Day, when presidential candidates start thinking about key hires, policy goals, and legislative challenges. After the election, the Presidential Transition Act provides the president-elect’s team with government resources to lay the foundation for the new Administration’s early days in office. Long before the inauguration ceremony, the president-elect’s team also organizes meetings with community leaders, activists, and non-profits like EFF to hear about our priorities for the incoming Administration.
Should an obviously fake Facebook post—one made as political satire—end with a lawsuit and a bill to pay for a police response to the post? Of course not, and that’s why EFF filed an amicus brief in Lafayette City v. John Merrifield.
In this case, Merrifield made an obviously fake Facebook event satirizing right-wing hysteria about Antifa. The announcement specifically poked fun at the well-established genre of fake Antifa social media activity, used by some to drum up anti-Antifa sentiment. However, the mayor of Lafayette didn’t get the joke, and now Lafayette City officials want Merrifield to pay for the costs of policing the fake event.
One of the most important aspects of cryptocurrencies from a civil liberties perspective is that they can provide privacy protections for their users. But EFF is concerned that the U.S. government has increasingly taken steps to undermine the anonymity of cryptocurrency transactions and to import the widespread financial surveillance of the traditional banking system into the cryptocurrency ecosystem.
On Friday, the Department of the Treasury’s Financial Crimes Enforcement Network (FinCEN) announced a proposed regulation that would require money service businesses (which includes, for example, cryptocurrency exchanges) to collect identity data about people who transact with their customers using self-hosted cryptocurrency wallets or foreign exchanges. The proposed regulation would require them to keep that data and turn it over to the government in some circumstances (such as when the dollar amount of transactions in a day exceeds a certain threshold).
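To make the mechanics concrete, here is a minimal sketch of the kind of daily-aggregation logic such a rule implies: per-customer transaction totals are summed by day and flagged once they cross a dollar threshold. The $10,000 figure and the function name are illustrative assumptions, not values taken from the proposed regulation.

```python
from collections import defaultdict
from datetime import date

# Hypothetical reporting threshold for illustration only; the proposal's
# actual thresholds and triggering conditions are defined in its text.
REPORT_THRESHOLD = 10_000

def flag_reportable(transactions):
    """transactions: iterable of (day, customer_id, usd_amount) tuples.

    Returns the set of (day, customer_id) pairs whose summed daily
    transaction amounts exceed the threshold.
    """
    daily_totals = defaultdict(float)
    for day, customer, amount in transactions:
        daily_totals[(day, customer)] += amount
    return {key for key, total in daily_totals.items() if total > REPORT_THRESHOLD}

txns = [
    (date(2020, 12, 18), "alice", 6_000),
    (date(2020, 12, 18), "alice", 5_000),  # same day: total 11,000
    (date(2020, 12, 18), "bob", 4_000),
]
print(flag_reportable(txns))  # alice's daily total exceeds the threshold; bob's does not
```

Note that aggregation like this is exactly why the rule would require exchanges to collect and retain identity data tied to each counterparty wallet: the threshold can only be enforced if every transaction is attributed to a known person.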
The proposal appears designed to be a midnight regulation pushed through before the end of the current presidential administration, as its 15-day comment period is unusually short and coincides with the winter holiday. The regulation’s authors write that this abbreviated comment period is required to deal with the “threats to United States national interests” posed by these technologies, but they provide no factual basis for this claim.
Although EFF is still reviewing the proposal, we have several initial concerns. First, the regulation would mean that people who store cryptocurrency in their own wallets (rather than using a professional service) would effectively be unable to transact anonymously with people who store their cryptocurrency with a money service business. The regulation will likely chill the ability to use self-hosted wallets to transact with the privacy of cash.
Facebook has recently launched a campaign touting itself as the protector of small businesses. This is a laughable attempt by Facebook to distract you from its long track record of anticompetitive behavior and privacy scandals as it tries to derail pro-privacy changes from Apple that are bad for Facebook’s business.
Facebook’s campaign is targeting a new AppTrackingTransparency feature on iPhones that will require apps to request permission from users before tracking them across other apps and websites or sharing their information with third parties. Requiring trackers to request your consent before stalking you across the Internet should be an obvious baseline, and we applaud Apple for this change. But Facebook, having built a massive empire on tracking everything you do and letting applications sell and share your data across a shady set of third-party companies, would like users and policymakers to believe otherwise.
Make no mistake: this latest campaign from Facebook is one more direct attack against our privacy and, despite its slick packaging, it’s also an attack against other businesses, both large and small.
Every three years, the public has an opportunity to chip away at the harm inflicted by an offshoot of copyright law that doesn’t respect traditional safeguards such as fair use. This law, Section 1201 of the Digital Millennium Copyright Act, impedes speech, innovation, and access to knowledge by threatening huge financial penalties to those who simply access copyrighted works that are encumbered by access restriction technology. To mitigate the obvious harm this law causes, Americans have the right to petition for exemptions to Section 1201, which last for three years before the whole process starts over.
The liability created by Section 1201 can attach even to those who aren’t infringing copyright, because their access is in service of research, education, criticism, remix, or other fair and noninfringing uses. The law allows rightsholders to enforce their business models in ways that have nothing to do with the rights actually granted to copyright holders. A willful and commercial act of circumvention can even result in criminal charges and jail time, and the Department of Justice takes the position that there doesn’t need to be any connection to actual copyright infringement for them to prosecute.
EFF is representing Matthew Green and bunnie Huang in a First Amendment challenge to Section 1201, based on its failure to respect copyright’s traditional boundaries, including safeguards like fair use. At the same time, we’re participating in the rulemaking process in hopes of winning some exemptions that will mitigate the law’s harms. In the past, we’ve won exemptions for remix videos, jailbreaking personal computing devices, repairing and modifying car software, security research, and more.
Last week, users of macOS noticed that attempting to open non-Apple applications while connected to the Internet resulted in long delays, if the applications opened at all. The interruptions were caused by a macOS security service attempting to reach Apple’s Online Certificate Status Protocol (OCSP) server, which had become unreachable due to internal errors. When security researchers looked into the contents of the OCSP requests, they found that these requests contained a hash of the developer’s certificate for the application that was being run, which was used by Apple in security checks. The developer certificate contains a description of the individual, company, or organization which coded the application (e.g. Adobe or Tor Project), and thus leaks to Apple that an application by this developer was opened.
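The privacy problem stems from how OCSP identifies the certificate being checked. Under RFC 6960, the CertID structure in a plaintext OCSP request carries hashes of the issuer’s name and public key plus the certificate’s serial number. The sketch below illustrates those fields using made-up placeholder bytes, not a real Apple Developer ID certificate:

```python
import hashlib

# RFC 6960: an OCSP request's CertID identifies the certificate being
# checked via hashes of the issuer's distinguished name and public key,
# plus the certificate's serial number. The issuer bytes and serial
# below are placeholders for illustration.
issuer_name_der = b"CN=Example Developer ID Certification Authority"
issuer_key_der = b"example-issuer-public-key-bytes"
serial_number = 0x1234ABCD

cert_id = {
    "hashAlgorithm": "sha1",
    "issuerNameHash": hashlib.sha1(issuer_name_der).hexdigest(),
    "issuerKeyHash": hashlib.sha1(issuer_key_der).hexdigest(),
    "serialNumber": hex(serial_number),
}

# Because these requests were sent over plain HTTP, anyone on the network
# path -- not just Apple -- could observe which certificate (and hence,
# roughly, which developer's software) was being checked at launch time.
for field, value in cert_id.items():
    print(f"{field}: {value}")
```

The hashes are stable identifiers, so even without decrypting anything, an observer who has precomputed them for known developer certificates can match a request back to the developer whose application was just opened.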
Here at the Electronic Frontier Foundation, we have a guiding motto: “I Fight For the Users.” (We even put it on t-shirts from time to time!) We didn’t pick that one by accident (nor merely because we dig the 1982 classic film “Tron”), but because it provides such a clear moral compass when we sit down to work every day.
Some of the most important work we do at EFF is building technologies that protect users’ privacy and security, and giving developers tools to make the entire Internet ecosystem safer and more secure. Every day, EFF’s talented and dedicated computer scientists and engineers create and improve our free, open source extensions, add-ons, and software to solve the problems of creepy tracking and unreliable encryption on the web.
Joining EFF this week to direct and shepherd these technology projects is internationally-recognized cybersecurity and encryption expert Jon Callas. He will be working with our technologists on Privacy Badger, a browser add-on that stops advertisers and other third-party trackers from secretly tracking users’ web browsing, and HTTPS Everywhere, a Firefox, Chrome, and Opera extension that encrypts user communications with major websites, to name a few of EFF’s tech tools.
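HTTPS Everywhere works from ruleset files that map insecure URLs onto their secure equivalents using regular-expression rewrites. Here is a minimal sketch of that idea in Python; the rule itself is hypothetical, not copied from a real ruleset file:

```python
import re

# A hypothetical HTTPS Everywhere-style rewrite rule: a "from" regex
# matches an http:// URL and a "to" template produces the https://
# equivalent, preserving captured groups such as an optional "www." prefix.
RULE_FROM = r"^http://(www\.)?example\.com/"
RULE_TO = r"https://\1example.com/"

def rewrite(url: str) -> str:
    """Apply the rewrite rule; URLs that don't match are left unchanged."""
    return re.sub(RULE_FROM, RULE_TO, url)

print(rewrite("http://www.example.com/page"))  # https://www.example.com/page
print(rewrite("https://already.secure/"))      # unchanged: rule only matches http://
```

The real extension ships thousands of such rules, curated per site, because a blind http-to-https substitution would break sites that serve different content (or nothing at all) over HTTPS.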
For years, free speech and press freedoms have been under attack in Turkey. The country has the distinction of being the world’s largest jailer of journalists and has in recent years been cracking down on online speech. Now, a new law, passed by the Turkish Parliament on July 29, introduces sweeping new powers and takes the country another giant step towards further censoring speech online. The law was ushered through parliament quickly, without allowing for opposition or stakeholder input, and aims for complete control over social media platforms and the speech they host. The bill was introduced after a series of allegedly insulting tweets aimed at President Erdogan’s daughter and son-in-law, and ostensibly aims to eradicate hate speech and harassment online. Turkish lawyer and Vice President of the Ankara Bar Association IT, Technology & Law Council Gülşah Deniz-Atalar called the law “an attempt to initiate censorship to erase social memory on digital spaces.”
Once ratified by President Erdogan, the law would require social media platforms with more than a million daily users to appoint a local representative in Turkey, which activists are concerned will enable the government to conduct even more censorship and surveillance. Failure to do so could result in advertisement bans, steep penalty fees, and, most troublingly, bandwidth reductions. Shockingly, the legislation introduces new powers for courts to order Internet providers to throttle social media platforms’ bandwidth by up to 90%, practically blocking access to those sites. Local representatives would be tasked with responding to government requests to block or take down content. Companies would be required to remove content that allegedly violates “personal rights” and the “privacy of personal life” within 48 hours of receiving a court order or face heavy fines. The law also includes provisions that would require social media platforms to store users’ data locally, prompting fears that providers would be obliged to transmit that data to the authorities, which experts expect to aggravate the already rampant self-censorship of Turkish social media users.
While Turkey has a long history of Internet censorship, with several hundred thousand websites currently blocked, this new law would establish unprecedented control of speech online by the Turkish government. When introducing the new law, Turkish lawmakers explicitly pointed to the controversial German NetzDG law and a similar initiative in France as positive examples.
Germany’s Network Enforcement Act, or NetzDG for short, claims to tackle “hate speech” and illegal content on social networks; it passed into law in 2017 and has been tightened twice since. Hastily passed amidst vocal criticism from lawmakers, academics, and civil society experts, the law requires covered social media platforms to name a local representative authorized to act as a focal point for law enforcement and to receive content takedown requests from public authorities. It also mandates that social media companies with more than two million German users remove or disable content that appears to be “manifestly illegal” within 24 hours of being alerted to it. The law has been heavily criticized in Germany and abroad, and experts have suggested that it conflicts with the EU’s central Internet regulation, the e-Commerce Directive. Critics have also pointed out that the strict time window for removing content does not allow for a balanced legal analysis. Evidence is indeed mounting that NetzDG’s conferral of policing powers on private companies routinely leads to takedowns of innocuous posts, thereby undermining freedom of expression.