
Tag: Federal Bureau of Investigation

Our view from press row. Credit: Sam Machkovech

AUSTIN, Texas—In his keynote address at the 2016 South By Southwest conference, President Barack Obama responded directly to a question about cybersecurity in light of the ongoing Apple v. FBI case, with answers that favored the American government's current position. President Obama even called out Edward Snowden's disclosure of classified documents in 2013. When asked by moderator and Texas Tribune founder Evan Smith where he came down on the question of digital-device privacy versus national security, President Obama began his response by saying, "I can't comment on that specific case." Yet the President's lengthy response revolved around that case's core issues of encryption to a point that it appeared unmistakably related. Obama continued by reminding the audience that law enforcement agencies can obtain a warrant, show up on a citizen's doorstep, and "rifle through your underwear to see if there's evidence of wrongdoing"—and they've been able to do so since well before smartphones were invented. What followed were repeated assurances that "we don't want government looking through phones willy-nilly" and an admission that "the whole Snowden disclosure episode elevated people's suspicions." Even so, Obama took the opportunity to claim that "the Snowden issue vastly overstated the dangers to US citizens in terms of spying.

The fact of the matter is, our intelligence agencies are pretty scrupulous about US persons, people on US soil. What those disclosures did identify were excesses overseas with respect to people not in this country." Obama then identified a key challenge faced by his administration: a desire for strong encryption to protect state secrets and infrastructure meeting an equal desire to be able to break encryption in specific circumstances. "If it's technologically possible to make an impenetrable device or system, where encryption is so strong that there's no key—there's no door—then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot? What mechanisms do we have to even do things like tax enforcement? If you can't crack that at all, if government can't get in, then everyone's walking around with a Swiss bank account in their pocket, right?"

In short: Help us before Congress ruins everything

After briefly describing the complaints from the likes of Tim Cook and cybersecurity experts about a key that could unlock any device and reminding the crowd that "I am not a software engineer," Obama made his beliefs wholly clear. "You cannot take an absolutist view on this.
If your view is strong encryption no matter what and we can and should create black boxes, that does not strike the balance that we've lived with for 200 or 300 years.

And it's fetishizing our phones above every other value.

That can't be the right answer." Obama told the crowd that his dream scenario would involve robust encryption that could be unlocked by a key that was "accessible by the smallest number of people possible for a subset of issues that we agree are important." He next reminded the crowd that, again, "how we design [such a system] is not something I have the expertise to do." That's when the president made it clear that this view was partially fueled by fears of the larger American government machine taking such regulations over, and the tech sector's smartest people should join up now or risk greater digital-privacy consequences. "We need the tech community to help us solve [this problem]," Obama said. "What will happen is, if everybody goes to their respective corners—if the tech community says, 'either we have strong, perfect encryption, or it's a Big Brother Orwellian world'—what you'll find is, after something really bad happens, the politics will swing.
It'll be sloppy, it'll be rushed, and it'll go through Congress in ways we haven't thought through.

Then we'll have something really dangerous."
Oncology patients' diagnoses, treatment details slurped

US cancer clinic 21st Century Oncology has admitted that a breach on its systems may have exposed private information on 2.2 million patients and employees. Unidentified hackers were able to access sensitive patient and employee data, including names, SSNs, diagnosis and treatment details, and insurance information after breaking into the clinic’s network. The clinic was informed of the breach by the FBI in November 2015, but the Feds asked 21st Century to hold off from disclosing the incident until a thorough investigation had been completed.

This explains why the clinic only went public in admitting the breach this week. Hackers accessed the systems at the beginning of October last year. In its statement, 21st Century apologised for the incident while trying to quieten concerns by stating that there’s no evidence that the leaked data has been misused.

The clinic added that it had “taken additional steps to enhance internal security protocols to help prevent a similar incident in the future”. We have no indication that the information has been misused in any way; however, out of an abundance of caution, we are notifying the affected patients and offering them a free one-year credit protection service. We also recommend that patients regularly review the explanation of benefits that they receive from their health insurer.
If they see services that they did not receive, please contact the insurer immediately. We deeply regret any concern this may cause our patients, and we want to emphasize that patient care will not be affected by this incident. The incident marks the second time 21st Century Oncology learned of a data breach from federal authorities.
In 2013, federal law enforcement informed the clinic of an insider breach allegedly linked to a tax refund fraud scheme, as databreaches.net reports. “The fact that 21st Century Oncology has been breached should set off alarm bells to other companies in the healthcare industry,” said Kevin Watson, chief exec at Florida-based Netsurion, a provider of remotely-managed security services. “We know that hackers are in constant pursuit of highly sensitive, personal data and that they are equipped with sophisticated methods to gain access to it. “It appears that diagnosis and treatment information might have been exposed, which could unlock the potential for significant medical fraud.

And if insurance plan information was stolen along with identity information, data thieves would have a good indicator on which identities hold a higher value, based on the value of the insurance plan.”
Not that kind of crack. Credit: Geoff Parsons

The custom firmware that the FBI would like Apple to produce in order to unlock the San Bernardino iPhone would be the most straightforward way of accessing the device, allowing the federal agency to rapidly attempt PIN codes until it found the one that unlocked the phone. But it's probably not the only way to achieve what the FBI wants.

There may well be approaches that don't require Apple to build a custom firmware to defeat some of the iPhone's security measures. The iPhone 5c used by the San Bernardino killers encrypts its data using a key derived from a combination of an ID embedded in the iPhone's processor and the user's PIN.

Assuming that a 4-digit PIN is being used, that's a mere 10,000 different combinations to try out. However, the iPhone has two protections against attempts to try every PIN in turn.

First, it inserts delays to force you to wait ever longer between PIN attempts (up to one hour at its longest).
Second, it has an optional capability to delete its encryption keys after 10 bad PINs, permanently cutting off access to any encrypted data. The FBI would like to use a custom firmware that allows attempting multiple PINs without either of these features.
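The two protections can be sketched as a toy state machine. The delay schedule below is invented for illustration; only the one-hour cap and the 10-attempt wipe come from the description above, and none of this is Apple's actual implementation.

```python
# Toy model of the iPhone 5c's two brute-force protections: escalating
# delays between failed PINs (capping at one hour) and an optional wipe
# after 10 bad attempts. Delay values are made up for illustration.

class PinLock:
    DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600]  # seconds; caps at one hour
    WIPE_THRESHOLD = 10

    def __init__(self, correct_pin, wipe_enabled=False):
        self.correct_pin = correct_pin
        self.wipe_enabled = wipe_enabled   # off by default, as on real devices
        self.failed = 0
        self.wiped = False

    def delay_before_next_attempt(self):
        return self.DELAYS[min(self.failed, len(self.DELAYS) - 1)]

    def try_pin(self, guess):
        if self.wiped:
            raise RuntimeError("encryption keys destroyed; data unrecoverable")
        if guess == self.correct_pin:
            self.failed = 0
            return True
        self.failed += 1
        if self.wipe_enabled and self.failed >= self.WIPE_THRESHOLD:
            self.wiped = True              # keys deleted after the 10th bad PIN
        return False

lock = PinLock("7412")
lock.try_pin("0000")
lock.try_pin("0001")
print(lock.delay_before_next_attempt())   # -> 0 (delays kick in later)
```

With `wipe_enabled=True`, the tenth bad guess destroys the keys and every later attempt raises; that is the behavior the FBI's requested firmware would bypass.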

This custom firmware would most likely be run using the iPhone's DFU mode.

Device Firmware Update (DFU) mode is a low-level, last-resort mode that can be used to recover iPhones that are unable to boot.

To use DFU mode, an iPhone must be connected via USB to a computer running iTunes. iTunes will send a firmware image to the iPhone, and the iPhone will run that image from a RAM disk.

For the FBI's purposes, this image would include the PIN-attack routines to brute-force the lock on the device. Developing this firmware should not be particularly difficult—jailbreakers have developed all manner of utilities to build custom RAM disks to run from DFU mode, so running custom code from this environment is already somewhat understood—but there is a problem.

The iPhone will not run any old RAM disk that you copy to it.
It first verifies the digital signature of the system image that is transferred. Only if the image has been properly signed by Apple will the phone run it. The FBI cannot create that signature itself. Only Apple can do so.
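That gatekeeping logic can be sketched as follows. A real iPhone verifies a public-key signature chained to Apple's root key; the HMAC below is only a standard-library stand-in for "a signature only Apple can produce", so the key name and mechanism here are illustrative, not Apple's.

```python
import hashlib
import hmac

# Toy model of SecureROM's image check. An HMAC key stands in for Apple's
# private signing key: whoever lacks the key cannot forge a valid
# signature, which is the property the article describes.

APPLE_SIGNING_KEY = b"known-only-to-apple"   # hypothetical stand-in

def apple_sign(image: bytes) -> bytes:
    """Only the holder of the signing key can compute this value."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def secure_rom_boot(image: bytes, signature: bytes) -> str:
    """Refuse to run any RAM disk whose signature does not verify."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return "refused: bad signature"
    return "booting RAM disk"

fbi_image = b"custom firmware with PIN brute-forcer"
print(secure_rom_boot(fbi_image, b"\x00" * 32))            # forgery fails
print(secure_rom_boot(fbi_image, apple_sign(fbi_image)))   # works once signed
```

The second call only succeeds because the signer holds the key, which is exactly why the FBI cannot produce a bootable image on its own.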

This also means that the FBI cannot even develop the code itself.

To test and debug the code, it must be possible to run the code, and that requires a signature.

This is why it is asking for Apple's involvement: only Apple is in a position to do this development.

Do nothing at all

The first possibility is that there's simply nothing to do.

Erasing after 10 bad PINs is optional, and it's off by default.
If the erase option isn't enabled, the FBI can simply brute force the PIN the old-fashioned way: by typing in new PINs one at a time.
It would want to reboot the phone from time to time to reset the one-hour delay, but as tedious as the job would be, it's certainly not impossible. It would be a great deal slower on an iPhone 6 or 6s.
In those models, the running count of failed PIN attempts is preserved across reboots, so resetting the phone doesn't reset the delay period.

But on the 5c, there's no persistent record of bad PIN trials, so restarting the phone allows an attacker to short-circuit the delay.

Why it might not work

Obviously, if the phone is set to wipe itself, this technique wouldn't work, and the FBI would want to know one way or the other before starting.
It ought to be a relatively straightforward matter for Apple to tell, as the phone does have the information stored in some accessible way so that it knows what to do when a bad PIN is entered. But given the company's reluctance to assist so far, getting them to help here may be impossible.

Update: It turns out that this bug was fixed in iOS 8.1, so it probably wouldn't work after all.

Acid and laser beams

One risky solution that has been discussed extensively already is to use lasers and acid to remove the outer layers of the iPhone's processor and read the embedded ID. Once this embedded ID is known, it's no longer necessary to try to enter the PIN directly on the phone itself.
Instead, it would be possible to simply copy the encrypted storage onto another computer and attempt all the PINs on that other computer.

The iPhone's lock-outs and wiping would be irrelevant in this scenario.

Why it might not work

The risk of this approach is not so much that it won't work, but that if even a tiny mistake is made, the hardware ID could be irreparably damaged, rendering the stored data permanently inaccessible.

Jailbreak the thing

The iPhone's built-in lockouts and wipes are unavoidable if running the iPhone's operating system... assuming that the iPhone works as it is supposed to.
It might not.

The code that the iPhone runs to enter DFU mode, load a RAM image, verify its signature, and then boot the image is small, and it should be simple and quite bullet-proof. However, it's not impossible that this code, which Apple calls SecureROM, contains bugs.
Sometimes these bugs can enable DFU mode (or the closely related recovery mode) to run an image without verifying its signature first. There are perhaps six known historic flaws in SecureROM that have enabled jailbreakers to bypass the signature check in one way or another.

These bugs are particularly appealing to jailbreakers, because SecureROM is baked into hardware, and so the bugs cannot be fixed once they are in the wild: Apple has to update the hardware to address them.

Exploitable bugs have been found in the way SecureROM loads the image, verifies the signature, and communicates over USB, and in all cases they have enabled devices to boot unsigned firmware. If a seventh exploitable SecureROM flaw could be found, this would enable jailbreakers to run their own custom firmwares on iPhones.

That would give the FBI the power to do what it needs to do: it could build the custom firmware it needs and use it to brute force attack the PIN.
Some critics of the government's demand have suggested that a government agency—probably the NSA—might already know of such a flaw, arguing that the case against Apple is not because of a genuine need to have Apple sign a custom firmware but merely to give cover for their own jailbreak.

Why it might not work

Of course, the difficulty with this approach is that it's also possible that no such flaw exists, or that even if it does exist, nobody knows what it is.

Given the desirability of this kind of flaw—it can't be fixed through any operating system update—jailbreakers have certainly looked, but thus far they've turned up empty-handed.

As such, this may all be hypothetical.

Ask Apple to sign an FBI-developed firmware

Apple doesn't want to develop a firmware to circumvent its own security measures, saying that this level of assistance goes far beyond what is required by law.

The FBI, however, can't develop its own firmware because of the digital signature requirements. But perhaps there is a middle ground.

Apple, when developing its own firmwares, does not require each test firmware to be signed.
Instead, the company has development handsets that have the signature restriction removed from SecureROM and hence can run any firmware.

These are in many ways equivalent to the development units that game console manufacturers sell to game developers; they allow the developers to load their games to test and debug them without requiring those games to be signed and validated by the console manufacturer each time. Unlike the consoles, Apple doesn't distribute these development phones.
It might not even be able to, as it may not have the necessary FCC certification.

But they nonetheless exist.
In principle, Apple could lend one of these devices to the FBI so that the FBI would then be responsible for developing the firmware.

This might require the FBI to do the work on-site at Cupertino or within a Faraday cage to avoid FCC compliance concerns, but one way or another this should be possible. Once it had a finished product, Apple could sign it.
If the company was truly concerned with how the signed firmware might be used, it might even run the firmware itself and discard it after use. This would relieve Apple of the burden of creating the firmware, and it could be argued that it would weaken Apple's First Amendment argument against unlocking the phone. While source code is undoubtedly expressive and protected by the First Amendment, it seems harder to argue that a purely mechanical transformation such as stamping a file with a digital signature should be covered by the same protection.

Why it might not work

Apple may very well persist in saying no, and the courts may agree.

Stop the phone from wiping its encryption keys

The way the iPhone handles encryption keys is a little more complex than outlined above.

The encryption key derived from the PIN combined with the hardware ID isn't used to encrypt the entire disk directly.
If it were, changing the PIN would force the entire disk to be re-encrypted, which would be tiresome to say the least.
Instead, this derived key is used to encrypt a second key, and that key is used to encrypt the disk.

That way, changing the PIN only requires re-encryption of the second key.
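That key hierarchy can be sketched with the standard library. PBKDF2 and a toy XOR "wrap" stand in for the real derivation and key-wrapping algorithms, and the UID and iteration count are invented; the point is only that changing the PIN rewraps 32 bytes, not the whole disk.

```python
import hashlib
import os

# Two-level key hierarchy: PIN + hardware UID derive a key-encryption key
# (KEK), which wraps the real disk key. XOR stands in for real key
# wrapping so the example stays stdlib-only.

HARDWARE_UID = os.urandom(32)   # burned into the processor; never leaves it

def derive_kek(pin: str) -> bytes:
    # Mixing in the UID forces PIN guesses to run on this specific device
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), HARDWARE_UID, 100_000)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

disk_key = os.urandom(32)                     # actually encrypts the disk
wrapped = xor(disk_key, derive_kek("1234"))   # only this is stored on flash

# Changing the PIN rewraps the 32-byte disk key; the disk itself is untouched
wrapped = xor(xor(wrapped, derive_kek("1234")), derive_kek("9876"))

assert xor(wrapped, derive_kek("9876")) == disk_key   # new PIN unwraps it
```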

The second key is itself stored on the iPhone's flash storage. Normal flash storage is awkward to securely erase, due to wear leveling.

Flash supports only a limited number of write cycles, so to preserve its life, flash controllers spread the writes across all the chips. Overwriting a file on a flash drive may not actually overwrite the file but instead write the new file contents to a different location on the flash drive, potentially leaving the old file's contents unaltered. This makes it a bad place to store encryption keys that you want to be able to delete.
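Wear leveling's effect on deletion can be seen in a toy flash controller, entirely illustrative: the "overwrite" lands in a fresh physical page, and a raw chip dump still shows the old bytes.

```python
# Toy flash controller showing why wear leveling makes secure erasure
# hard: an "overwrite" may be written to a different physical page,
# leaving the old contents recoverable by reading the raw chip.

class WearLeveledFlash:
    def __init__(self, pages=8):
        self.physical = [None] * pages   # raw pages, as a chip reader sees them
        self.mapping = {}                # logical page -> physical page
        self.next_free = 0

    def write(self, logical, data):
        # Always use a fresh page to spread wear; the old page is not erased
        self.physical[self.next_free] = data
        self.mapping[logical] = self.next_free
        self.next_free += 1

flash = WearLeveledFlash()
flash.write(0, b"SECRET-KEY")
flash.write(0, b"0000000000")            # "overwrite" the key

print(flash.physical[flash.mapping[0]])  # logical read: b'0000000000'
print(flash.physical[0])                 # raw dump still holds b'SECRET-KEY'
```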

Apple's solution to this problem is to set aside a special area of flash that is handled specially.

This area isn't part of the normal filesystem and doesn't undergo wear leveling at all.
If it's erased, it really is erased, with no possibility of recovery.

This special section is called effaceable storage. When the iPhone wipes itself, whether due to bad PIN entry, a remote wipe request for a managed phone, or the built-in reset feature, this effaceable storage area is the one that gets obliterated. Apart from that special handling, however, the effaceable area should be readable and writeable just like regular flash memory, which means that in principle a backup can be made and safely squirreled away.
If the iPhone then overwrites it after 10 bad PIN attempts, it can be restored from this backup, and that should enable a further 10 attempts.
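The backup-and-restore loop just described can be sketched as follows. The phone model is purely illustrative; a real attack would reflash the desoldered chip rather than reassign a Python attribute.

```python
import os

# Simulation of the restore attack: snapshot the effaceable storage, let
# the phone wipe it after 10 bad PINs, write the snapshot back, and
# repeat until the right PIN turns up.

class Phone:
    def __init__(self, pin):
        self.pin = pin
        self.effaceable = os.urandom(32)   # wraps the encryption keys
        self.fails = 0

    def try_pin(self, guess):
        if self.effaceable is None:
            raise RuntimeError("keys wiped")
        if guess == self.pin:
            return True
        self.fails += 1
        if self.fails >= 10:
            self.effaceable = None         # 10th bad PIN: effaceable area erased
            self.fails = 0
        return False

phone = Phone("4951")
backup = phone.effaceable                  # image taken from the desoldered chip

found = None
for guess in ("%04d" % n for n in range(10_000)):
    if phone.effaceable is None:
        phone.effaceable = backup          # restore the image: 10 more tries
    if phone.try_pin(guess):
        found = guess
        break

print(found)  # -> 4951
```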

This process could be repeated indefinitely. This video from a Shenzhen market shows a similar process in action (we came at it via 9to5Mac after seeing a tweet in February and further discussion in March). Here, a 16GB iPhone has its flash chip desoldered and put into a flash reader.

A full image of that flash is made, including the all-important effaceable area.
In this case, the chip is then replaced with a 128GB chip, and the image restored, with all its encryption and data intact.

The process for the FBI's purposes would simply use the same chip every time. By restoring every time the encryption keys get destroyed, the FBI could—slowly—perform its brute force attack.
It would probably want to install a socket of some kind rather than continuously soldering and desoldering the chip, but the process should be mechanical and straightforward, albeit desperately boring. A more exotic possibility would be to put some kind of intermediate controller between the iPhone and its flash chip that permitted read instructions but blocked all attempts to write or erase data. Hardware write blockers are already routinely used in other forensic applications to prevent modifications to SATA, SCSI, and USB disks that are being used as evidence, and there's no reason why such a thing could not be developed for the flash chips themselves.

This would allow the erase/restore process to be skipped, requiring the phone to be simply rebooted every few attempts.

Why it might not work

The working assumption is that the iPhone's processor has no non-volatile storage of its own, so it simply doesn't remember that it is supposed to have wiped its encryption keys and will offer another ten attempts once the effaceable storage area is restored (or that, even if it does remember, it doesn't care).

This is probably a reasonable assumption; the A6 processor used in the iPhone 5c doesn't appear to have any non-volatile storage of its own, and allowing restoration means that even a securely wiped phone can be straightforwardly restored from backup by connecting it to iTunes. For newer iPhones, that's less clear.

Apple implies that the A7 processor—the first to include the "Secure Enclave" function—does have some form of non-volatile storage of its own. On the A6 processor and below, the time delay between PIN attempts resets every time the phone is rebooted. On the A7 and above, it does not; the Secure Enclave somehow remembers that there has been some number of bad PIN attempts earlier on.

Apple also vaguely describes the Secure Enclave as having an "anti-replay counter" for data that is "saved to the file system." It's not impossible that this is also used to protect the effaceable storage in some way, allowing the phone to detect that it has been tampered with.
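If such an anti-replay counter does protect the effaceable area, the restore trick fails. The model below is speculative, built only from Apple's vague description; the names and mechanism are invented for illustration.

```python
# Speculative model of an anti-replay counter. The enclave keeps a
# counter in storage an attacker cannot roll back and stamps flash
# writes with it; restoring an old flash image presents a stale stamp,
# which the enclave can treat as tampering.

class Enclave:
    def __init__(self):
        self.counter = 0                # internal non-volatile storage

    def stamp_write(self):
        self.counter += 1
        return self.counter             # stamp stored alongside the flash data

    def flash_is_current(self, stamp):
        return stamp == self.counter    # stale stamp => rolled-back image

enclave = Enclave()
flash_stamp = enclave.stamp_write()     # phone writes stamp 1 to flash
backup = flash_stamp                    # attacker images the chip

flash_stamp = enclave.stamp_write()     # a later write bumps the stamp to 2
flash_stamp = backup                    # attacker restores the old image

print(enclave.flash_is_current(flash_stamp))  # -> False: tampering detected
```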

Full restoration is similarly still likely to be possible. There is also some risk to disassembling the phone, but if the process is reliable enough for Shenzhen markets, the FBI ought to be able to manage it reliably enough. This last technique in particular should be quite robust.

There's no doubt that Apple's assistance would help a great deal; creating a firmware to allow brute-forcing the PIN would be faster and lower risk than any method that requires disassembly.

But if the FBI is truly desperate to bypass the PIN lockout and potential disk wipe, there do appear to be options available to it that don't require Apple to develop the firmware.
Credit: Richard Ying and Tangui Morlier

French parliamentarians have adopted an amendment to a penal reform bill that would punish companies like Apple that refuse to provide decrypted versions of messages their products have encrypted.

The Guardian reports: "The controversial amendment, drafted by the rightwing opposition, stipulates that a private company which refuses to hand over encrypted data to an investigating authority would face up to five years in jail and a €350,000 (£270,000) fine." This is only the bill's first reading, and the final fate of the amendment is uncertain.

Earlier this year, the French government rejected crypto backdoors as "the wrong solution." "Given the government’s reluctance to take on the big phone companies in this way, it remains to be seen whether the thrust of the amendment can survive the lengthy parliamentary process that remains before the bill becomes law," The Guardian writes. Amendment 90 (original in French) is just one of several proposals that sought to impose stiff penalties on companies that refused to cooperate with the authorities.

As the French site Numerama notes, even harsher proposals were rejected.

For example, Amendments 532 and 533 suggested imposing a fine of €1,000,000 (£770,000) on companies that refused to decrypt messages. Amendment 221 proposed a fine of €2,000,000 (£1,550,000) and wanted "all relevant" information to be handed over, not just the decrypted messages.

Finally, Amendment 51 suggested that if a company refused to help the authorities decrypt messages, its executives would be considered as "accomplices to terrorism."

While politicians in France fall over themselves to paint Apple and others as terrorist sympathizers for defending encryption, support for the US company has come from the UN's high commissioner for human rights, Zeid Ra’ad Al Hussein. On Friday, he issued a statement urging the US authorities "to proceed with great caution in the ongoing legal process involving the Apple computer company and the Federal Bureau of Investigation (FBI), given its potentially negative ramifications for the human rights of people all over the world." Al Hussein said he was concerned the case risked "unlocking a Pandora’s Box that could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security." He went on: "It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered." This post originated on Ars Technica UK
CryptoWall most prevalent nasty – survey

File-encrypting ransomware has eclipsed botnets to become the main threat to enterprises, according to Trend Micro. During the fourth quarter of 2015, 83 per cent of all data extortion attacks were made with the use of crypto-ransomware. CryptoWall topped the list of 2015’s most notorious ransomware families, with a 31 per cent share.

According to FBI statistics released last June, CryptoWall managed to generate more than $18m for its creators in a little over a year. These revenues – traced by monitoring Bitcoin wallets and similar techniques – provide evidence that a growing percentage of organisations affected by ransomware attacks are paying up. Healthcare is the most affected sector when it comes to cyber-attacks more generally, according to other findings from the 2015 edition of Trend Micro’s Annual Security Roundup Report.

Throughout 2015, almost 30 per cent of all data breaches happened in the healthcare sector, followed by the education and government sectors (17 per cent and 16 per cent, respectively). Elsewhere, businesses are at increased risk from Internet of Things (IoT) attacks, which are no longer something only consumers need to think about as wearables and the like enter the workplace, Trend warns. Given their susceptibility to attacks, IoT devices within the enterprise ecosystem can become liabilities. Unlike Android devices, which already have fragmentation problems of their own, IoT devices run on several different platforms, making device and system updates as well as data protection more complex than ever. More from Trend’s study, published on Tuesday, can be found here.
It's "precrime" meets "thoughtcrime." China is using its substantial surveillance apparatus as the basis for a "unified information environment" that will allow authorities to profile individual citizens based upon their online behaviors, financial transactions, where they go, and who they see.

The authorities are watching for deviations from the norm that might indicate someone is involved in suspicious activity.

And they're doing it with a hand from technology pioneered in the US. As Defense One's Patrick Tucker reports, the Chinese government is leveraging "predictive policing" capabilities that have been used by US law enforcement, and it has funded research into machine learning and other artificial intelligence technologies to identify human faces in surveillance video.

The Chinese government has also used this technology to create a "Situation-Aware Public Security Evaluation (SAPE) platform" that predicts "security events" based on surveillance data, which includes anything from actual terrorist attacks to large gatherings of people. The Chinese government has plenty of data to feed into such systems.

China invested heavily in building its surveillance capabilities in major cities over the past five years, with spending on "domestic security and stability" surpassing China's defense budget—and turning the country into the biggest market for security technology.

And in December, China's government gained a new tool in surveillance: anti-terrorism laws giving the government even more surveillance powers, and requiring any technology companies doing business in China to provide assistance in that surveillance. The law states that companies “shall provide technical interfaces, decryption and other technical support and assistance to public security and state security agencies when they are following the law to avert and investigate terrorist activities”—in other words, the sort of "golden key" that FBI Director James Comey has lobbied for in the US.

For obvious reasons, the Chinese government is particularly interested in the outcome of the current legal confrontation between the FBI and Apple over the iPhone used by Syed Farook. Bloomberg reports that China is harnessing all that data in an effort to perform behavioral prediction at an individual level—tasking the state-owned defense contractor China Electronics Technology Group to develop software that can sift through the online activities, financial transactions, work data, and other behavioral data of citizens to predict which will perform "terrorist" acts.

The system could watch for unexpected transfers of money, calls overseas by individuals with no relatives outside the country, and other trigger events that might indicate they were plotting an illegal action.

China's definition of "terrorism" is more expansive than that of many countries. At a news conference in December, China Electronics Technology Group chief engineer Wu Manqing told reporters, "We don’t call it a big data platform, but a united information environment… It’s very crucial to examine the cause after an act of terror, but what is more important is to predict the upcoming activities."
On Tuesday, The Guardian reported that the Federal Bureau of Investigation (FBI) has changed its rules regarding how it redacts Americans’ information when it takes international communications from the National Security Agency’s (NSA) database.

The paper confirmed the classified rule change with unnamed US officials, but details on the new rules remain murky. The new rules, which were approved by the secret US Foreign Intelligence Surveillance Court (FISC), deal with how the FBI handles information it gleans from the National Security Agency (NSA).

Although the NSA is technically tasked with surveillance of communications involving foreigners, information on US citizens is inevitably sucked up, too.

The FBI is then allowed to search through that data without any “minimization” from the NSA—a term that refers to redacting Americans’ identifiable information unless there is a warrant to justify surveillance on that person. The FBI enjoys privileged access to this information trove that includes e-mails, texts, and phone call metadata that are sent or received internationally.

Recently, the Obama administration said it was working on new rules to allow other US government agencies similar access to the NSA’s database. But The Guardian notes that the Privacy and Civil Liberties Oversight Board (PCLOB), which was organized by the Obama administration in the wake of the Edward Snowden leaks, took issue with how the FBI accessed and stored NSA data in 2014. "As of 2014, the FBI was not even required to make note of when it searched the metadata, which includes the ‘to' or ‘from' lines of an e-mail,” The Guardian wrote. "Nor does it record how many of its data searches involve Americans’ identifying details."

However, a recent report from PCLOB suggested that the new rules approved by FISC for the FBI involve a revision of the FBI's minimization procedures.
Spokespeople from both the FBI and PCLOB declined to comment on that apparent procedure change, saying it was classified, but PCLOB’s spokesperson, Sharon Bradford Franklin, told The Guardian that the new rules "do apply additional limits.” A spokesperson for the Office of the Director of National Intelligence said that the new procedures may be publicly released at some point.
The 25th anniversary edition of the annual RSA Conference was held from Feb. 29 to March 4 in San Francisco's Moscone Center, showcasing the best and the worst that the security world has to offer, ranging from new products (check out eWEEK's slide sho...
More Inception than legal argument at this point

The US Department of Justice has appealed a decision by a New York judge to refuse the FBI access to an iPhone: one part in a wider legal battle between law enforcement and Apple. The New York case is separate from the San Bernardino case in California, over which Apple and the FBI have been very publicly fighting. However, the decision by a New York magistrate last month to shoot down the FBI's demand that Apple help agents access a locked iPhone, and his rationale for doing so, have been widely cited and referenced, not least by Apple. In New York, the iPhone belongs to alleged drug dealer Jun Feng, whereas the San Bernardino phone belonged to mass killer Syed Farook. In particular, magistrate judge James Orenstein concluded that the FBI did not have the legal authority to compel Apple to help them bypass the phone's passcode and, critically, said the powerful All Writs Act was the wrong legal instrument to use.

The FBI is using that same act to argue for access in the San Bernardino case. Judge Orenstein wrote: The implications of the government's position are so far-reaching – both in terms of what it would allow today and what it implies about Congressional intent in 1789 – as to produce impermissibly absurd results. He added that to give the FBI and DEA the powers they requested would greatly expand governmental powers and put the All Writs Act's constitutionality in doubt. He also declared that since Apple has no responsibility for Feng's wrongdoing, he could not justify "imposing on Apple the obligation to assist the government's investigation against its will." The New York case was addressed by FBI director James Comey at a Congressional hearing on the Apple case last week, where he acknowledged that the FBI had lost. He tried to play down its importance by suggesting it was just one fight in a much larger battle. Regardless, the decision is important, so prosecutors have asked district judge Margo Brodie to look at it and grant them the court order that Orenstein denied. The FBI argues that Orenstein looked at the question too broadly and focused on possible future abuse rather than the actual case he was considering.

The filing then effectively accuses him of overreach, saying his ruling "goes far afield of the circumstances of this case and sets forth an unprecedented limitation on federal courts' authority." It also argues – as it has done in the San Bernardino case – that the request is device-specific and so does not constitute blanket approval for the FBI to break into any iPhone. As for Apple, unsurprisingly it is in favor of Orenstein's judgment, with a spokesman saying that the company "shares the judge's concern" that use of the All Writs Act in these cases is a dangerous path and a "slippery slope".
The government argues the iPhone 5s in question runs an older operating system that has been cracked before. The U.S. Justice Department has asked a New York federal court to overturn a recent ruling that protects Apple from having to unlock an iPhone involved in a drug case. Last week, a Brooklyn judge rejected the government's request to compel Cupertino to crack an iPhone 5s seized in 2014 from accused drug trafficker Jun Feng, who eventually pleaded guilty to conspiracy.

Despite the guilty plea, however, the government claimed access to his phone was still necessary, because it might lead to criminal accomplices. "Ultimately, the question to be answered in this matter, and in others like it across the country, is not whether the government should be able to force Apple to help it unlock a specific device," Magistrate Judge James Orenstein said at the time. "It is instead whether the All Writs Act resolves that issue and many others like it yet to come.
I conclude that it does not." The move was welcomed by the tech titan, which is also fighting a very public battle against the FBI over its request to access an iPhone 5c used by a terrorist in the San Bernardino attack. In the New York case, prosecutors filed a 45-page brief on Monday, arguing that Feng's iPhone 5s runs an older operating system—iOS 7—that Apple has agreed to breach in the past. "This case in no way upends the balance between privacy and security," prosecutors wrote in the new filing, as reported by The Wall Street Journal. Judge Orenstein's ruling "goes far afield of the circumstances of this case and sets forth an unprecedented limitation on federal courts' authority," the brief said. Apple disagrees. "Judge Orenstein ruled the FBI's request would 'thoroughly undermine fundamental principles of the Constitution' and we agree," a company spokesman said in a statement. "We share the judge's concern that misuse of the All Writs Act would start us down a slippery slope that threatens everyone's safety and privacy." Cupertino boss Tim Cook has referenced that same slippery slope in the tech titan's fight with the FBI, claiming that the requested iOS backdoor will inevitably end up in the wrong hands.

Apple is even willing to take its fight to the Supreme Court, where it would have the support of numerous industry heavyweights. Apple is due back in court on the San Bernardino case on March 22. The DOJ did not immediately respond to PCMag's request for comment.
Encryption, bug bounties and threat intel dominated the mindshare of the cybersecurity hive mind at RSAC last week. SAN FRANCISCO, CALIF. – RSA Conference 2016 -- With one of the biggest crowds ever to hit Moscone for RSA Conference USA, the gathering last week of 40,000 security professionals and vendors was like a convergence of water cooler chatterboxes from across the entire infosec world. Whether at scheduled talks, in bustling hallways or cocktail hours at the bars nearby, a number of definite themes wound their way through discussions all week. Here's what kept the conversations flowing. Encryption Backdoors The topic of government-urged encryption backdoors was already promising to be a big topic at the show, but the FBI-Apple bombshell ensured that this was THE topic of RSAC 2016.

According to Bromium, a survey taken of attendees showed that 86% of respondents sided with Apple in this debate, so much of the chatter was 100 different ways of explaining the inadvisability of the FBI's mandate. One of the most colorful quotes came from Michael Chertoff, former head of U.S.

Department of Homeland Security: "Once you’ve created code that’s potentially compromising, it’s like a bacteriological weapon. You’re always afraid of it getting out of the lab.” Bug Bounties In spite of the dark cast the backdoor issue set over the Federal government's relations with the cybersecurity industry, there was plenty of evidence of positive public-private cooperation.

Exhibit A: the "Hack the Pentagon" bug bounty program announced by the DoD in conjunction with Defense Secretary Ash Carter's appearance at the show. While bug bounty programs are hardly a new thing, the announcement of the program shows how completely these programs have become mainstream best practices. "There are lots of companies who do this,” Carter said in a town hall session with Ted Schlein, general partner at Kleiner Perkins Caufield & Byers. “It’s a way of kind of crowdsourcing the expertise and having access to good people and not bad people. You’d much rather find vulnerabilities in your networks that way than in the other way, with a compromise or shutdown.” Threat Intel There was no lack of vendors hyping new threat intelligence capabilities at this show, but as with many hot security product categories threat intel is suffering a bit as the victim of its own success.

The marketing machine is in full gear, touting threat intel capabilities for any feature that even remotely resembles them; one vendor lamented to me off the record, "most threat intel these days is not even close to being real intelligence." In short, threat intel demonstrated at the show that it was reaching the peak of the classic hype cycle pattern. RSAC attendees had some great evidence of that hanging around their necks. Just a month after the very public dismantling of Norse Corp., the show's badge holder necklaces still bore the self-proclaimed threat intelligence vendor's logos.

But as Robert Lee, CEO of Dragos Security, capably explained over a month ago in the Norse fallout, this kind of failure (and additional disillusionment from customers led astray by the marketing hype) is not necessarily a knock on the credibility of threat intel as a whole.
It is just a matter of people playing fast and loose with the product category itself. "Simply put, they were interpreting data as intelligence," Lee said. "There is a huge difference between data, information, and intelligence.
So while they may have billed themselves as significant players in the threat intelligence community they were never really accepted by the community, or participating in it, by most leading analysts and companies.

Therefore, they aren’t a bellwether of the threat intelligence industry." Ericka Chickowski specializes in coverage of information technology and business innovation. She has focused on information security for the better part of a decade and regularly writes about the security industry as a contributor to Dark Reading.
The U.S. Department of Justice has appealed an order by a court in New York that turned down its request that Apple be compelled to extract data from the iPhone 5s of an alleged drug dealer. The case in New York is seen as having a bearing on another high-profile case in California, where Apple is contesting an order that would require the company to assist the FBI, including by providing new software, in its attempts to brute-force the passcode of an iPhone 5c running iOS 9. The phone was used by one of the two terrorists in the San Bernardino killings on Dec. 2, and the FBI wants Apple to disable the auto-erase feature on the phone, which would erase all data after 10 unsuccessful passcode attempts, if the feature was activated by the terrorist. The DOJ argues in the New York case as well that it is unable to access the data on the phone, which runs iOS 7, because it is locked with a passcode.

By trying repeated passcodes, the government risks permanently losing all access to the contents of the phone, because when an iPhone is locked it is not apparent whether the auto-erase feature is enabled, according to Monday's filing. But Apple can help extract the data from the iPhone 5s, as it has done dozens of times in the past for similar iPhone versions at the government's request, the DOJ argues.
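The government's dilemma described above can be modeled in a few lines. This is an illustrative sketch only, not Apple's actual implementation (the class, threshold handling, and wipe mechanism here are invented): an attempt counter that destroys access after 10 consecutive failures, where the attacker cannot tell in advance whether the policy is active.

```python
# Illustrative model (not Apple's implementation) of why brute-forcing
# a passcode is risky when an auto-erase policy may be active: after 10
# consecutive wrong guesses the data becomes permanently inaccessible,
# and the attacker can't observe beforehand whether the policy is on.

class LockedPhone:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode, auto_erase=True):
        self._passcode = passcode
        self._auto_erase = auto_erase   # invisible to an attacker
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess):
        """Return True on success; may irreversibly wipe the device on failure."""
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._auto_erase and self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True           # all data permanently lost
        return False

phone = LockedPhone("4821", auto_erase=True)
for guess in (f"{n:04d}" for n in range(10)):   # naive brute force
    phone.try_passcode(guess)
print(phone.wiped)  # True: ten wrong guesses destroyed the data
```

A four-digit passcode has only 10,000 possibilities, so without the attempt limit a brute-force search would be trivial; the 10-try wipe is what forces the government to seek Apple's assistance instead.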

For versions of the operating system that predate iOS 8, Apple has the capability to bypass the passcode feature and access the unencrypted contents of the phone, it wrote in the filing. Invoking user privacy and safety, Apple has said that its previous acquiescence to similar judicial orders does not mean it consents to the process.

But the government, which has a warrant, claims that Apple made a U-turn in this particular instance, having first agreed to extract the data, when it responded that "Upon receipt of a valid search warrant pursuant to the instructions laid out in [the legal process guidelines], we can schedule the extraction date within a 1-2 week time frame." At no time during the communications before the government sought an order did Apple object to the propriety of the proposed order directing its assistance, according to the DOJ filing Monday. Magistrate Judge James Orenstein of the U.S. District Court for the Eastern District of New York ruled recently that Apple can’t be forced to extract the data from the iPhone.

The government's reading of the All Writs Act, a statute enacted in 1789 and commonly invoked by law enforcement agencies to get assistance from tech companies on similar matters, would change the purpose of the law “from a limited gap-filling statute that ensures the smooth functioning of the judiciary itself into a mechanism for upending the separation of powers by delegating to the judiciary a legislative power bounded only by Congress's superior ability to prohibit or preempt.” But the government argues that the residual authority of the court under the All Writs Act is particularly important where legislation lags behind technology or risks obsolescence. The government argues that courts have relied on the All Writs Act to mandate third-party assistance with search warrants even in circumstances far more burdensome than what is requested in the New York case.

Whether an order imposes an unreasonable burden on the third party is a key criterion when a judge considers a request under the Act.

Apple has argued that the burden on the company would be the impact on its ability to protect customer information, which could threaten the trust customers place in the company and tarnish the Apple brand. The DOJ is now asking the court to review the magistrate judge's decision. "Judge Orenstein ruled the FBI’s request would 'thoroughly undermine fundamental principles of the Constitution’ and we agree,” Apple said in a statement on the new filing by the DOJ.

The company said it shared the judge’s concern “that misuse of the All Writs Act would start us down a slippery slope that threatens everyone’s safety and privacy."