
Tag: PGP

Morgan Marquis-Boire resigned from Citizen Lab back in September 2017.
In early October, a story was published by the Wall Street Journal alleging Kaspersky Lab software was used to siphon classified data from an NSA employee's home computer system.

To assist independent investigators, and to answer all the people who have been asking us whether those allegations were true, we decided to conduct an internal investigation to attempt to answer a few questions we had related to the article, and some others.
Adobe suffered, at a minimum, a PR black eye on Friday when one of its private PGP keys was inadvertently published to its Product Security Incident Response Team (PSIRT) blog.
The firm's security team failed in a spectacular fashion.
The since-deleted post gave the public and private key for the Adobe incident response team.
Change the name to A-d'oh!-be: an absent-minded security staffer just accidentally leaked Adobe's private PGP key onto the internet.…
Meanwhile, Huawei gets the green light despite a failure to verify source code. UK spy agency GCHQ's cyber security arm, CESG, was left without PGP encryption for more than four months, according to a government report.…
Second of 2 AlphaBay sellers arrested in 2016 pleads guilty: Abdullah Almashwali.
Custom PGP BlackBerry smartphones allegedly used by criminal gangs for secure messaging have suffered a setback.
The security and privacy community was abuzz over the weekend after Google said it was open-sourcing E2Email, a Chrome plugin designed to ease the implementation and use of encrypted email. While this is welcome news, the project won't go anywhere i...

The security researcher cited in the report acknowledged that the word 'backdoor' was probably not the best choice.

WhatsApp this week denied that its app provides a "backdoor" to encrypted texts.

A report published Friday by The Guardian, citing cryptography and security researcher Tobias Boelter, suggests a security vulnerability within WhatsApp could be used by government agencies as a backdoor to snoop on users.

"This claim is false," a WhatsApp spokesman told PCMag in an email.

The Facebook-owned company will "fight any government request to create a backdoor," he added.

WhatsApp in April turned on full end-to-end encryption—using the Signal protocol developed by Open Whisper Systems—to protect messages from the prying eyes of cybercriminals, hackers, "oppressive regimes," and even Facebook itself.

The system, as described by The Guardian, relies on unique security keys traded and verified between users in an effort to guarantee communications are secure and cannot be intercepted. When any of WhatsApp's billion users get a new phone or reinstall the program, their encryption keys change—"something any public key cryptography system has to deal with," Open Whisper Systems founder Moxie Marlinspike wrote in a Friday blog post.

During that process, messages may back up on the phone, waiting their turn to be re-encrypted.

According to The Guardian, that's when someone could sneak in, fake having a new phone, and hijack the texts.

But according to Marlinspike, "the fact that WhatsApp handles key changes is not a 'backdoor,' it is how cryptography works.

"Any attempt to intercept messages in transit by the server is detectable by the sender, just like with Signal, PGP, or any other end-to-end encrypted communication system," he wrote.

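To make the mechanism concrete, here is a minimal sketch, in Python, of how a messaging client can pin the last identity key it saw for a contact and flag any change, which is the property Marlinspike is pointing to. The class and method names are hypothetical; this is not WhatsApp's or Signal's actual code.

```python
import hashlib


class ContactKeyStore:
    """Pins the last-seen identity key per contact and flags any change.
    Hypothetical sketch; names and structure are illustrative only."""

    def __init__(self):
        self._pinned = {}  # contact_id -> last-seen identity public key (bytes)

    def fingerprint(self, key_bytes: bytes) -> str:
        # Short, human-comparable digest, similar in spirit to a "safety number".
        return hashlib.sha256(key_bytes).hexdigest()[:16]

    def check(self, contact_id: str, offered_key: bytes) -> str:
        pinned = self._pinned.get(contact_id)
        if pinned is None:
            self._pinned[contact_id] = offered_key  # first contact: trust on first use
            return "new-contact"
        if pinned == offered_key:
            return "ok"                             # same key as before
        # Key changed: a reinstall or new phone, or possibly a man-in-the-middle.
        # Mirroring the notify-and-continue behavior, accept it but report it.
        self._pinned[contact_id] = offered_key
        return "key-changed"


store = ContactKeyStore()
store.check("alice", b"alice-key-v1")          # 'new-contact'
store.check("alice", b"alice-key-v1")          # 'ok'
print(store.check("alice", b"alice-key-v2"))   # 'key-changed' -> client shows a warning
```
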
"We appreciate the interest people have in the security of their messages and calls on WhatsApp," co-founder Brian Acton wrote in a Friday Reddit post. "We will continue to set the record straight in the face of baseless accusations about 'backdoors' and help people understand how we've built WhatsApp with critical security features at such a large scale.

"Most importantly," he added, "we'll continue investing in technology and building simple features that help protect the privacy and security of messages and calls on WhatsApp."

In a blog post, Boelter said The Guardian's decision to use the word "backdoor" was probably not "the best choice there, but I can also see that there are arguments for calling it a 'backdoor.'" But Facebook was "furious and issued a blank denial, [which] polarized sides.

"I wish I could have had this debate with the Facebook Security Team in...private, without the public listening and judging our opinions, agreeing on a solution and giving a joint statement at the end," Boelter continued.
In an earlier post, Boelter said he reported the vulnerability in April 2016, but Facebook failed to fix it.

Boelter—a German computer scientist, entrepreneur, and PhD student at UC Berkeley focusing on security and cryptography—acknowledged that resolving the issue in public is a double-edged sword.

"The ordinary people following the news and reading headlines do not understand or do not bother to understand the details and nuances we are discussing now. Leaving them with wrong impressions leading to wrong and dangerous decisions: If they think WhatsApp is 'backdoored' and insecure, they will start using other means of communication. Likely much more insecure ones," he wrote. "The truth is that most other messengers who claim to have 'end-to-end encryption' have the same vulnerability or have other flaws. On the other hand, if they now think all claims about a backdoor were wrong, high-risk users might continue trusting WhatsApp with their most sensitive information."

Boelter said he'd be content to leave the app as is if WhatsApp can prove that "1) too many messages get [sent] to old keys, don't get delivered, and need to be [re-sent] later and 2) it would be too dangerous to make blocking an option (moxie and I had a discussion on this)."

Then, "I could actually live with the current implementation, except for voice calls of course," provided WhatsApp is transparent about the issue, for example by adding a notice that key-change notifications can be delayed.
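
The disagreement ultimately reduces to one policy decision: when a recipient's key changes, should the client deliver queued messages and then notify, as described above, or hold them until the user verifies the new key? A hypothetical sketch of that toggle (the function and its parameters are illustrative only):

```python
def deliver(last_known_key: bytes, offered_key: bytes, message: str,
            block_on_key_change: bool = False) -> str:
    """Hypothetical delivery policy. The default mirrors the non-blocking
    behavior described in the article (deliver, then notify); the flag
    enables the stricter hold-until-verified option Boelter argues for."""
    if offered_key != last_known_key:
        if block_on_key_change:
            return "held: waiting for the user to verify the new key"
        print("Security notice: contact's identity key changed.")
    # ...encrypt under offered_key and hand off to the transport (omitted)...
    return "sent"


print(deliver(b"key-v1", b"key-v2", "hello"))                            # sent, with a notice
print(deliver(b"key-v1", b"key-v2", "hello", block_on_key_change=True))  # held
```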

Google announced an early prototype of Key Transparency, its latest open source effort to ensure simpler, safer, and more secure communications for everyone.

The project’s goal is to make it easier for applications and services to share and discover public keys for users, but it will be a while before it's ready for prime time. Secure communication should be de rigueur, but it remains frustratingly out of reach for most people, more than 20 years after the creation of Pretty Good Privacy (PGP).

Existing methods, in which users must manually find and verify recipients’ keys, are time-consuming and often complicated. Messaging apps and file sharing tools are limited in that users can communicate only within the service because there is no generic, secure method to look up public keys. “Key Transparency is a general-use, transparent directory, which makes it easy for developers to create systems of all kinds with independently auditable account data,” Ryan Hurst and Gary Belvin, members of Google’s security and privacy engineering team, wrote on the Google Security Blog. Key Transparency will maintain a directory of online personae and associated public keys, and it can work as a public key service to authenticate users.

Applications and services can publish their users’ public keys in Key Transparency and look up other users’ keys.

An audit mechanism keeps the service accountable.

The security benefit is that everyone can be confident they are seeing the same published key, and any malicious attempt to modify the record with a different key will be immediately obvious. “It [Key Transparency] can be used by account owners to reliably see what keys have been associated with their account, and it can be used by senders to see how long an account has been active and stable before trusting it,” Hurst and Belvin wrote. The idea of a global key lookup service is not new; PGP previously attempted something similar with its Global Directory.
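
As a rough illustration of what such a directory provides, consider the toy sketch below. It is not Google's actual Key Transparency API, which is built on Merkle trees, signed tree heads, and independent auditors; it only shows the core idea of an append-only log of account-to-key bindings with a running hash, so that any two parties who see the same log head are seeing the same key for every account.

```python
import hashlib
import json


class TransparentKeyDirectory:
    """Toy append-only directory of account -> public-key bindings.
    A sketch only; the real system uses Merkle trees and auditors,
    not this flat hash chain."""

    def __init__(self):
        self._log = []                  # ordered list of published bindings
        self._head = hashlib.sha256(b"empty-log").hexdigest()

    def publish(self, account: str, public_key: str) -> str:
        """Append a new binding and advance the running log hash."""
        entry = json.dumps({"account": account, "key": public_key}, sort_keys=True)
        self._head = hashlib.sha256((self._head + entry).encode()).hexdigest()
        self._log.append(entry)
        return self._head

    def lookup(self, account: str):
        """Return the most recently published key for an account, if any."""
        for entry in reversed(self._log):
            record = json.loads(entry)
            if record["account"] == account:
                return record["key"]
        return None

    def head(self) -> str:
        """Clients and auditors that see the same head see the same log, and
        therefore the same key for every account; serving one user a different
        key would change the head and be immediately visible."""
        return self._head


directory = TransparentKeyDirectory()
directory.publish("alice@example.com", "alice-public-key-1")
print(directory.lookup("alice@example.com"))   # alice-public-key-1
print(directory.head())                        # compared out-of-band by auditors
```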

The service still exists, but very few people know about it, let alone use it. Kevin Bocek, chief cybersecurity strategist at certificate management vendor Venafi, called Key Transparency an "interesting" project, but expressed some skepticism about how the technology will be perceived and used. Key Transparency is not a response to a serious incident or a specific use case, which means there is no actual driving force to spur adoption.

Compare that to Certificate Transparency, Google’s framework for monitoring and auditing digital certificates, which came about because certificate authorities had repeatedly issued fraudulent or mistaken certificates. Google seems to be taking a “build it, and maybe applications will come” approach with Key Transparency, Bocek said. The engineers don’t deny that Key Transparency is in the early stages of design and development. “With this first open source release, we're continuing a conversation with the crypto community and other industry leaders, soliciting feedback, and working toward creating a standard that can help advance security for everyone,” they wrote. While the directory would be publicly auditable, the lookup service will reveal individual records only in response to queries for specific accounts.

A command-line tool would let users publish their own keys to the directory; even if the actual app or service provider decides not to use Key Transparency, users can make sure their keys are still listed. “Account update keys” associated with each account—not only Google accounts—will be used to authorize changes to the list of public keys associated with that account. Google based the design of Key Transparency on CONIKS, a key verification service developed at Princeton University, and integrated concepts from Certificate Transparency.
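
The account update keys suggest a simple gatekeeping rule: the directory accepts a new binding only if the change is authorized by a key already tied to the account. The sketch below is hypothetical and uses an HMAC as a stand-in for what would in practice be an asymmetric signature, purely so the example runs on Python's standard library.

```python
import hashlib
import hmac


class AuthorizedKeyDirectory:
    """Toy directory where changes must be authorized by the account's
    update key. HMAC stands in for a real public-key signature scheme."""

    def __init__(self):
        self._update_keys = {}   # account -> account update key (a secret, in this toy)
        self._bindings = {}      # account -> current public key

    def register(self, account: str, update_key: bytes, public_key: str) -> None:
        """First registration establishes the account update key."""
        self._update_keys[account] = update_key
        self._bindings[account] = public_key

    def update(self, account: str, new_public_key: str, authorization: bytes) -> bool:
        """Accept a new key only if the change is authorized with the update key."""
        expected = hmac.new(self._update_keys[account],
                            new_public_key.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(expected, authorization):
            return False                       # unauthorized change is rejected
        self._bindings[account] = new_public_key
        return True


directory = AuthorizedKeyDirectory()
directory.register("bob@example.com", update_key=b"bob-update-key", public_key="bob-key-1")

good = hmac.new(b"bob-update-key", b"bob-key-2", hashlib.sha256).digest()
print(directory.update("bob@example.com", "bob-key-2", good))     # True
print(directory.update("bob@example.com", "evil-key", b"bogus"))  # False
```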

As a user client, CONIKS integrates with individual applications and services whose providers publish and manage their own key directories, said Marcela Melara, a second-year doctoral fellow at Princeton University’s Center for Information Technology Policy and the main author of CONIKS.

For example, Melara and her team are currently integrating CONIKS to work with Tor Messenger.

CONIKS relies on individual directories because people can have different usernames across services. More important, the same username can belong to different people on different services. Google changed the design to make Key Transparency a centralized directory. Melara said she and her team have not yet decided if they are going to stop work on CONIKS and start working on Key Transparency. One of the reasons for keeping CONIKS going is that while Key Transparency’s design may be based on CONIKS, there may be differences in how privacy and auditor functions are handled.

For the time being, Melara intends to keep CONIKS an independent project. “The level of privacy protections we want to see may not translate to [Key Transparency’s] internet-scalable design,” Melara said. On the surface, Key Transparency and Certificate Transparency seem like parallel efforts, with one providing an auditable log of public keys and the other a record of digital certificates. While public keys and digital certificates are both used to secure and authenticate information, there is a key difference: Certificates are part of an existing hierarchy of trust with certificate authorities and other entities vouching for the validity of the certificates. No such hierarchy exists for digital keys, so the fact that Key Transparency will be building that web of trust is significant, Venafi’s Bocek said. “It became clear that if we combined insights from Certificate Transparency and CONIKS we could build a system with the properties we wanted and more,” Hurst and Belvin wrote.