Facebook to delay full E2EE rollout until 'sometime in 2023'

The company formerly known as Facebook is delaying the rollout of end-to-end encryption (E2EE) across all its services until "sometime in 2023", according to an op-ed penned by Meta's global head of safety, Antigone Davis, in the British newspaper the Telegraph this weekend.

While Facebook-owned WhatsApp has had E2EE everywhere since 2016, most of the tech giant's services do not ensure that only the user holds the keys for decrypting messaging data -- meaning those services can be subpoenaed or hit with a warrant compelling them to hand messaging data to public authorities.
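The core property at stake can be sketched in a few lines: with E2EE, the devices at either end negotiate a key the service never sees, so the relay server holds only ciphertext. The toy cipher below (a SHA-256-based keystream, not a real cipher -- production E2EE such as WhatsApp's Signal-protocol implementation uses vetted primitives and a proper key exchange) is an illustrative sketch of that property only:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: repeatedly hash key||nonce||counter. NOT secure --
    # real E2EE apps use vetted ciphers negotiated via protocols like Signal's.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple:
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# Alice and Bob share a key known only to their devices; the server
# relaying (nonce, ciphertext) never holds it.
shared_key = secrets.token_bytes(32)
nonce, ct = encrypt(shared_key, b"meet at noon")
assert decrypt(shared_key, nonce, ct) == b"meet at noon"
# A subpoena served on the relay yields only ciphertext, not content.
```

Without E2EE, by contrast, the service itself can decrypt (or simply reads) message content, which is what makes it responsive to a warrant.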

But back in 2019 -- in the wake of global attention to the Cambridge Analytica data misuse scandal -- founder Mark Zuckerberg announced the company would work toward universally implementing end-to-end encryption across all its services as part of a claimed "pivot to privacy".

Zuckerberg did not give a firm timeline for completing the rollout but, earlier this year, Facebook suggested it would complete the rollout during 2022.


Now the tech giant is saying it won't get this done until "sometime" the following year. Which sounds distinctly like a can being kicked down the road.

Davis said the delay is the result of the social media giant wanting to take its time to ensure it can implement the technology safely -- in the sense of retaining the ability to pass information to law enforcement to assist in child safety investigations.

"As we do so, there’s an ongoing debate about how tech companies can continue to combat abuse and support the vital work of law enforcement if we can’t access your messages. We believe people shouldn’t have to choose between privacy and safety, which is why we are building strong safety measures into our plans and engaging with privacy and safety experts, civil society and governments to make sure we get this right," she writes, saying it will use "proactive detection technology" to ID suspicious patterns of activity, along with enhanced controls for users and the ability for users to report problems.

Western governments, including the U.K.'s, have been leaning hard on Facebook to delay or abandon its plan to blanket services in the strongest level of encryption altogether -- ever since it made the public announcement of its intention to "e2ee all the things" over two years ago.

The U.K. has been an especially vocal critic of Facebook on this front, with Home Secretary Priti Patel very publicly (and repeatedly) warning Facebook that its plan to expand E2EE would hamper efforts to combat online child abuse -- casting the tech giant as an irresponsible villain in the fight against the production and distribution of child sexual abuse material (CSAM).

So Meta's op-ed appearing in the favored newspaper of the British government looks no accident.

"As we roll out end-to-end encryption we will use a combination of non-encrypted data across our apps, account information and reports from users to keep them safe in a privacy-protected way while assisting public safety efforts," Davis also writes in the Telegraph, adding: "This kind of work already enables us to make vital reports to child safety authorities from WhatsApp."

She goes on to suggest that Meta/Facebook has reviewed a number of historic cases -- and concluded that it "would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted" -- adding: "While no systems are perfect, this shows that we can continue to stop criminals and support law enforcement."

How exactly might Facebook be able to pass data on users even if all comms on its services were end-to-end encrypted?

Users are not privy to the exact details of how Facebook/Meta joins the dots of their activity across its social empire -- but while Facebook's application of E2EE on WhatsApp covers messaging/comms content, for example, it does not extend to metadata (which can provide plenty of intel on its own).

The tech giant also routinely links accounts and account activity across its social media empire -- passing data like a WhatsApp user's mobile phone number to its eponymous service, following a controversial privacy U-turn back in 2016. This links a user's (public) social media activity on Facebook (if they have or have had an account there) with the more bounded form of socializing that typifies activity on WhatsApp (i.e. one-to-one comms, or group chats in a private E2EE channel).

Facebook can thus leverage its vast scale (and historical profiling of users) to flesh out a WhatsApp user's social graph and interests -- based on things like who they are speaking to; who they're connected to; what they've liked and done across all its services (most of which aren't yet E2EE) -- despite WhatsApp messaging/comms content itself being end-to-end encrypted.
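To illustrate how much a service can infer from metadata alone, consider a hypothetical relay-side log in which message bodies are opaque ciphertext but sender, recipient and timestamp are visible (the field names and data here are invented for illustration; they do not describe WhatsApp's actual internal schema):

```python
from collections import defaultdict

# Hypothetical relay-side log: bodies are E2EE ciphertext, but the
# envelope (sender, recipient, timestamp) is plaintext to the service.
metadata_log = [
    {"from": "alice", "to": "bob",   "ts": 1638300000, "body": b"<ciphertext>"},
    {"from": "bob",   "to": "alice", "ts": 1638300060, "body": b"<ciphertext>"},
    {"from": "alice", "to": "carol", "ts": 1638303600, "body": b"<ciphertext>"},
]

def social_graph(log):
    # Build an undirected contact graph purely from plaintext metadata --
    # no message content is decrypted or read.
    graph = defaultdict(set)
    for msg in log:
        graph[msg["from"]].add(msg["to"])
        graph[msg["to"]].add(msg["from"])
    return graph

graph = social_graph(metadata_log)
assert graph["alice"] == {"bob", "carol"}
```

Join that graph with account data from a non-E2EE service (likes, group memberships, linked phone numbers) and the profile gets richer still -- which is the mechanism Davis' op-ed appears to be gesturing at.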

(Or as Davis' op-ed puts it: "As we roll out end-to-end encryption we will use a combination of non-encrypted data across our apps, account information and reports from users to keep them safe in a privacy-protected way while assisting public safety efforts. This kind of work already enables us to make vital reports to child safety authorities from WhatsApp.")

Earlier this fall, Facebook was stung with a major fine in the European Union related to WhatsApp transparency obligations -- with data protection authorities (DPAs) finding it had failed to properly inform users what it was doing with their data, including in relation to how it passes information between WhatsApp and Facebook.

Facebook is appealing against the GDPR sanction but today it announced a tweak to the wording of the privacy policy shown to WhatsApp users in Europe in response to the regulatory enforcement -- although it claimed it has not made any changes to how it processes user data.

Returning to E2EE specifically, last month Facebook whistleblower Frances Haugen raised concerns over the tech giant's application of the technology -- arguing that since it's a proprietary (rather than open source) implementation, users must take Facebook/Meta's security claims on trust, as independent third parties are unable to verify the code does what it claims.

She also suggested there is no way for outsiders to know how Facebook interprets E2EE -- adding that for this reason she's concerned about its plan to expand the use of E2EE -- "because we have no idea what they’re going to do", as she put it.

"We don’t know what it means, we don’t know if people’s privacy is actually protected," Haugen told lawmakers in the U.K. parliament, further warning: "It’s super nuanced and it’s also a different context. On the open source end-to-end encryption product that I like to use there is no directory where you can find 14-year-olds, there is no directory where you can go and find the Uighur community in Bangkok. On Facebook it is trivially easy to access vulnerable populations and there are national state actors that are doing this."

Haugen was careful to speak up in support of E2EE -- saying she's a supporter of open source implementations of the security technology, i.e. where external experts can robustly interrogate code and claims.

But in the case of Facebook, where its E2EE implementation is not open to anyone to verify, she suggested regulatory oversight is needed to avoid the risk of the tech giant making misleading claims about how much privacy (and therefore safety from potentially harmful surveillance, such as by an authoritarian state) users actually have.


Davis' op-ed -- which is headlined "we'll protect privacy and prevent harm" -- sounds intended to soothe U.K. policymakers that they can "have their cake and eat it"; concluding with a promise that Meta will "continue engaging with outside experts and developing effective solutions to combat abuse".

"We’re taking our time to get this right and we don’t plan to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023," Davis adds, finishing with another detail-light soundbite that it is "determined to protect people’s private communications and keep people safe online".

While the U.K. government will surely be delighted with the line-toeing quality of Facebook's latest public missives on a very thorny topic, its announcement that it's delaying E2EE in order to "get this right" -- following sustained pressure from ministers like Patel -- is only likely to increase concerns about what "right" means in such a privacy sensitive context.

Certainly the wider community of digital rights advocates and security experts will be closely watching what Meta does here.

The U.K. government recently splashed almost half a million of taxpayers' money on five projects to develop scanning/filtering technologies that could be applied to E2EE services -- to detect, report or block the creation of CSAM -- after ministers said they wanted to encourage innovation around "tech safety" through the development of "alternative solutions" (i.e. solutions that would not require platforms to abandon E2EE but instead to embed some form of scanning/filtering technology into the encrypted systems to detect/combat CSAM).

So the U.K.'s preferred approach appears to be to use the political cudgel of concern for child safety -- which it's also legislating for in the Online Safety Bill -- to push platforms to implement spyware that allows for encrypted content to be scanned on users' devices regardless of any claim of E2EE.

Whether such baked-in scanner systems essentially sum to a backdoor in the security of robust encryption (despite ministers' claims otherwise) will surely be the topic of close scrutiny and debate in the months/years ahead.

Here it's instructive to look at Apple's recent proposal to add a CSAM detection system to its mobile OS -- where the technology was slated to scan content on a user's device prior to it being uploaded to its iCloud storage service.
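The basic shape of such on-device scanning can be sketched simply: hash each file before upload and compare it against a blocklist of known-bad hashes shipped with the OS. (Apple's actual proposal was considerably more elaborate -- a perceptual hash it called NeuralHash, private set intersection, and a match threshold before any human review -- so the exact SHA-256 matching below is a simplified stand-in for illustration only, with invented function and constant names):

```python
import hashlib

# Hypothetical on-device blocklist of known-bad file hashes, seeded here
# with a placeholder entry. Real systems use perceptual hashes so that
# re-encoded or resized copies of an image still match.
KNOWN_BAD_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def scan_before_upload(file_bytes: bytes) -> bool:
    """Return True if the file matches the on-device blocklist,
    i.e. the check happens before the file is encrypted or uploaded."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

assert scan_before_upload(b"known-bad-image-bytes") is True
assert scan_before_upload(b"holiday-photo-bytes") is False
```

The critics' point is precisely that this check runs on plaintext, on the user's own device, before any encryption applies -- so whoever controls the blocklist controls what gets flagged.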

Apple initially took a bullish stance on the proactive move -- claiming it had developed "the technology that can balance strong child safety and user privacy".

However after a storm of concern from privacy and security experts -- as well as those warning that such systems, once established, would inexorably face "feature creep" (whether from commercial interests to scan for copyrighted content; or from hostile states to target political dissidents living under authoritarian regimes) -- Apple backtracked, saying after less than a month that it would delay implementing the system.

It's not clear when/whether Apple might revive the on-device scanner.

While the iPhone maker has built a reputation (and very lucrative business) as a privacy-centric company, Facebook's ad empire is the opposite beast: Synonymous with surveillance for profit. So expecting the social media behemoth -- whose founder (and all-powerful potentate) has presided over a string of scandals attached to systematically privacy-hostile decisions -- to hold the line in the face of sustained political pressure to bake spyware into its products would be for Facebook to deny its own DNA.

Its recent corporate rebranding to Meta looks a whole lot more superficial than that.