ZDNet reports on an increasing trend of malicious actors leveraging code-signing certificates to bypass perimeter security appliances and infect their targets:

Code-signing certificates are designed to give your desktop or mobile app a level of assurance by making apps look authentic. Whenever you open a code-signed app, it tells you who the developer is and provides a high level of integrity to the app that it hasn’t been tampered with in some way. Most modern operating systems, including Macs, only run code-signed apps by default.

But not only does code-signing have an effect on users who inadvertently install malware; code-signed apps are also harder for network security appliances to detect. The research said that hardware using deep packet inspection to scan network traffic can “become less effective when legitimate certificate traffic is initiated by a malicious implant.”

That’s been picked up by some hackers, who are selling code-signing certificates for as little as $299. Extended validation certificates, which are meant to go through a rigorous vetting process, sell for $1,599.

The certificates, the researchers say, were obtained from reputable certificate authorities, like Comodo, and Symantec and Thawte, both of which are now owned by DigiCert.

Organizations need to consider blacklisting certificates known to be leveraged by attackers. Oftentimes, due to the price of the certs, the same code-signing certificate will be used across multiple malware variants developed by a single group. This makes it much easier to block multiple attacks in one fell swoop.

It is standard operating procedure for cyber defense teams to block malware based on the hash and filename, among other data points. By blocking the certificate during the code-signing validation phase, endpoint defense systems may be able to prevent the malware from running: if the cert matches a blacklisted one, validation fails and the attack is thwarted.
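As a sketch of the kind of check an endpoint agent could perform during signature validation, here is a minimal blocklist lookup keyed on the SHA-256 thumbprint of the signing certificate. The function names and the blocklist itself are illustrative, not any vendor’s actual API:

```python
import hashlib

def thumbprint(cert_der: bytes) -> str:
    """SHA-256 thumbprint of a DER-encoded signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def is_signature_blocked(cert_der: bytes, blocked_thumbprints: set) -> bool:
    """Treat code-signing validation as failed when the certificate is blocklisted.

    One blocklisted certificate can stop every malware variant signed with it,
    which is the "one fell swoop" advantage described above.
    """
    return thumbprint(cert_der) in blocked_thumbprints
```

In practice the thumbprint would come from the parsed signature block of the binary; the same lookup also works on the certificate’s serial number plus issuer if that is what the threat intel feed provides.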

So I would argue that being able to block code-signing certificates makes it a lot easier to stop a variety of malware from a single attack group. Blacklisting hashes and filenames is an extremely important ingredient in cyber defense, and blocking certificates should be as well.

The Trump Administration today stated the US government does not need FISA court approval to ask for encryption backdoors to be built into software developed by the technology industry:

The US government does not need the approval of its secret surveillance court to ask a tech company to build an encryption backdoor.

The government made its remarks in July in response to questions posed by Sen. Ron Wyden (D-OR), but they were only made public this weekend.

The implication is that the government can use its legal authority to secretly ask a US-based company for technical assistance, such as building an encryption backdoor into a product, but can petition the Foreign Intelligence Surveillance Court (FISC) to compel the company if it refuses.

In its answers, the government said it has “not to date” needed to ask the FISC to issue an order to compel a company to backdoor or weaken its encryption.

The government would not say, however, if it’s ever asked a company to add an encryption backdoor.

Unbelievable yet unsurprising.

How will a Trump administration tackle the encryption and surveillance policy issues started by Bush and continued under Obama?

“I imagine (Trump) is going to be a guy who is probably going to mandate back doors,” said Hank Thomas, chief operating officer at Strategic Cyber Ventures and a veteran of the National Security Agency. “I don’t think he’s ultimately going to be a friend to privacy, and the fearful side of me says he will get intelligence agencies more involved in domestic law enforcement.”

Just how Trump acts, and the bombastic remarks he makes with nary a pause, lead me to the same conclusion as Thomas. I sense a reckoning is coming, and there will be a tough battle between the Trump administration and Silicon Valley.

This is a must read on why Signal is the clear secure messaging choice over WhatsApp:

In short, if a government demands that Open Whisper Systems hand over the content or metadata of a Signal message or a user’s contact list, it has nothing to hand over. And that government will have just as little luck requesting backups of Signal messages from Google or Apple.

From a user privacy perspective, Signal is the clear winner, but it’s not without its downsides.

Compared to WhatsApp’s 1 billion users, Signal’s user base is minuscule. Marlinspike said that they don’t publish statistics about how many users they have, but Android’s Google Play store reports that Signal has been downloaded between 1 and 5 million times. The iPhone App Store does not publish this data.

Too bad almost nobody uses Signal. Convincing people to switch their text-messaging app is almost like trying to convert them to a new religion.

Apple goes all in on encryption despite FBI concerns:

As part of the new system, developers building software for Apple’s devices will be able to opt for users’ information to have no encryption, single-key encryption, or multi-key encryption “with per-file keys for file data and a separate key for sensitive metadata” – comparable to leaving a door unlocked, using one key, or using two keys.

In its documentation of APFS, Apple explains that full disk encryption has been available on OS X since version 10.7 Lion. APFS differs in that it encrypts files individually rather than as one unit, similar to other encryption mechanisms Apple introduced to its iOS platform in 2010. It also encrypts related metadata – the basic summary attached to each file – and will keep data secure even when the device has been physically hacked.
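The per-file-key design can be sketched as a small key hierarchy: one volume secret, a separately derived key for sensitive metadata, and a distinct derived key per file, so compromising one file key exposes nothing else. This is an HMAC-based illustration of the structure Apple describes, not APFS’s actual derivation scheme:

```python
import hashlib
import hmac
import os

def derive_key(volume_key: bytes, label: bytes) -> bytes:
    """Derive a purpose-specific 256-bit subkey from the volume secret (HKDF-style sketch)."""
    return hmac.new(volume_key, label, hashlib.sha256).digest()

volume_key = os.urandom(32)                       # master secret for the volume
metadata_key = derive_key(volume_key, b"meta")    # separate key for sensitive metadata
file_key_a = derive_key(volume_key, b"file:a")    # per-file keys: distinct per file
file_key_b = derive_key(volume_key, b"file:b")
```

Because each derivation is one-way, handing out (or losing) a single file key reveals neither the volume secret nor any sibling key.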

Since its battle with the FBI, Apple has made a number of important changes to increase security and tighten encryption. Apple itself couldn’t decrypt information the agency demanded, but the company did have the keys to access information stored in the shooter’s iCloud account. The company is now reportedly considering a system that wouldn’t allow it to access iCloud data.

Potent essay in favor of strong encryption, even though the US intelligence apparatus would like Americans to believe terrorists use it to hide their communications from law enforcement (a claim demonstrably false in certain circumstances, such as Paris):

People who protect liberty have to take care not to imply, much less acknowledge, that the draconian anti-liberty measures advocated by the surveillance state crowd are justified, tactically or morally, no matter what the circumstances. Someday a terrorist will be known to have used strong encryption, and the right response will be: “Yes, they did, and we still have to protect strong encryption, because weakening it will make things worse.”

Why? Because encryption is actually a straightforward matter, no matter how much fear-mongering law enforcement officials and craven, willfully ignorant politicians spout about the need for a backdoor into protected communications. The choice is genuinely binary, according to an assortment of experts in the field. You can’t tamper this way with strong encryption without making us all less secure, because the bad guys will exploit the vulnerabilities you introduce in the process. This isn’t about security versus privacy; as experts have explained again and again, it’s about security versus security.

Moreover, as current and former law enforcement officials lead a PR parade for the surveillance-industrial complex, pushing again for pervasive surveillance, they ignore not just the practical problems with a “collect it all” regime — it drowns the spies in too much information to vet properly — but also the fundamental violation of liberty that it represents. These powers are always abused, and a society under surveillance all the time is a deadened one, as history amply shows.

Of course we need some surveillance, but in targeted ways. We want government to spy on enemies and criminal suspects, but with the checks and balances of specific judicial approval, not rubber stamps for collect-it-all by courts and Congress. The government already has lots of intrusive tools at its disposal when it wants to know what specific people are doing. But our Constitution has never given the government carte blanche to know everything or force people to testify against themselves, among other limits it establishes on power.

The shortsighted Federal Bureau of Investigation considered taking Apple to court over the encryption capabilities built into iMessage, FaceTime, and iOS devices:

The clash with Cupertino was reportedly sparked by an investigation this summer — “involving guns and drugs” — in which a court order was obtained, demanding that Apple provide real time iMessages exchanged by iPhone-using suspects. Due to the stringent security measures featured on iOS 8, Apple responded that it could not comply due to the advanced encryption used by the company.

Thankfully, the decision was taken not to pursue legal action. However, the case once again demonstrates the opposition that exists within government to Apple’s stance on user privacy.

In a previous open letter, F.B.I. director James Comey argued that the top-notch security on devices like the iPhone has the potential to aid terrorist groups like ISIS.

Tim Cook, meanwhile, has argued that Apple is taking a moral stance by not mining user data.

The Department of Justice continues to proclaim the sky is falling on encryption and is now calling for a “balance” that includes law enforcement needs, even though technical experts keep saying it is just not possible (emphasis added):

Beginning in late 2014, FBI and DOJ officials have sounded alarms about encryption, saying law enforcement agencies are increasingly “going dark” in criminal and terrorism investigations because subjects’ data is unavailable, even after a court-issued warrant. Apple and Google both announced new end-to-end encryption services on their mobile operating systems, in part as a response to leaks about massive surveillance programs at the National Security Agency.

One recent criminal defendant described end-to-end encryption as “another gift from God,” Deputy Attorney General Sally Quillian Yates said during a speech last month. “But we all know this is no gift—it is a risk to public safety,” she said then.

Several encryption and security experts, as well as digital rights groups, have criticized the DOJ and FBI calls for encryption workarounds. “If it’s easier for the FBI to break in, then it’s easier for Chinese hackers to break in,” Senator Ron Wyden, an Oregon Democrat, said last month. “It’s not possible to give the FBI special access to Americans’ technology without making security weaker for everyone.”

The Intercept’s Jenna McLaughlin looks at the many things wrong with Manhattan District Attorney Cyrus Vance Jr.’s anti-encryption op-ed in the New York Times (emphasis added):

It’s true that when law enforcement asks for information that is encrypted with the user’s passcode, Apple and Google cannot actually deliver it. But that’s typically not the whole story.

For one: Apple, for instance, copies a lot of that data onto its own cloud servers during Wi-Fi backups, where the company can in fact access it and turn it over to law enforcement.

Plenty of other data is still available from the phone companies: SMS text messages, phone numbers called and phone calls received, and location information.

And then there’s the ability to break in. Responding to Tuesday’s op-ed, ACLU technologist Christopher Soghoian tweeted: “If law enforcement can’t hack the hundreds of millions of Android phones running out-of-date, vulnerable software, they’re not trying.”

Following the rollout of iOS 8, Lee Reiber, a cell phone forensics expert at AccessData, told Mashable that “As secure as the device can be, there’s always going to be some vulnerability that can be located and exploited.” Reiber said it’s “cat and mouse.”

No matter how hard they have tried, technologists and security experts have been unable to find a workable solution to FBI Director James Comey’s ostensible “going dark” problem (emphasis added):

Short of outlawing cryptography, which would ensure that only outlaws have crypto, some of the solutions on the table call for either key escrow or building access for law enforcement into key servers.

“There’s no assurance that something like this would not be abused for mass surveillance,” Green said.

The FBI’s Comey, as recently as a month ago, eased off demands for exceptional access, and instead told technology companies they need to try harder to find a solution to the problem. Key escrow, where trusted parties share keys, was part of Comey’s solution.
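A minimal sketch of what “trusted parties share keys” means in practice: split a decryption key into shares so that every escrow party must cooperate to reconstruct it. Real escrow proposals involved threshold schemes and tamper-resistant hardware; this toy XOR version only shows the splitting idea:

```python
import os

def split_key(key: bytes, parties: int) -> list:
    """Split a key into one share per party; all shares are needed to rebuild it."""
    shares = [os.urandom(len(key)) for _ in range(parties - 1)]
    last = key
    for share in shares:
        # XOR the key with each random share so the shares sum back to the key.
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_shares(shares: list) -> bytes:
    """XOR all shares together to recover the original key."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key
```

The security argument against escrow is visible even in this sketch: the key now exists in reconstructible form outside the device, so the scheme is only as trustworthy as every party holding a share.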

“I’ve heard that it’s too hard, that there’s no solution. Really?” Comey said during a Congressional hearing July 8, mentioning Silicon Valley by name. “Maybe it is too hard, but given the stakes, we’ve got to give it a shot and I don’t think it’s been given an honest hard look.

“We want people to be in position to comply with judges’ orders in the U.S. We want creative people to figure out how to comply with court orders,” Comey said. “You shouldn’t be looking at the FBI director for innovation.”

Green and Denaro pointed out during today’s session a number of technical issues that make exceptional access a bad idea, in particular the fact that this issue has no geographic borders. Should Apple, for example, build in a backdoor for U.S. law enforcement, how does it say no to other countries, including leaders in oppressive or sanctioned nations?

“Once we have the capability to eavesdrop, even if you build in a legal safeguard to make sure it’s not abused, what happens when you send this to repressive governments that don’t have a First Amendment?” Green said. “Build it here to chase [criminals] and give that same technology to oppressive governments to own devices? If ISIS needs encryption, it will get it. It will stop relying on iMessage pretty quickly if it’s backdoored.”

I am just not buying this whole “going dark” problem. The FBI just wants these tools to make their jobs easier, which I can totally relate to and maybe even sympathize with. However, by blatantly disregarding the security implications inherent in backdoors, the FBI is positioning the US to be less safe than anywhere else.

As a parallel, who were the only people with alcohol during Prohibition? Gangsters and people uninterested in following the law. So guess what happened? There was a lot of violence around the alcohol trade, many unnecessary deaths, and meanwhile people kept drinking alcohol. Long story short: prohibition did not prohibit anything. What makes the FBI think the outcome will be any different this time around?

The Obama administration’s ostensible war against Apple and Google providing device encryption on by default has gotten uglier with the prospect that the companies could potentially be held liable for providing material support to terrorists (emphasis added):

Benjamin Wittes, editor-in-chief of the LawFare blog, suggested that Apple could in fact face that liability if it continued to provide encryption services to a suspected terrorist. He noted that the post was in response to an idea raised by Sen. Sheldon Whitehouse, D-R.I., in a hearing earlier this month.

“In the facts we considered,” wrote Wittes and his co-author, Harvard law student Zoe Bedell, “a court might — believe it or not — consider Apple as having violated the criminal prohibition against material support for terrorism.”

FBI Director James Comey and others have said that end-to-end encryption makes law enforcement harder because service providers don’t have access to the actual communications, and therefore cannot turn them over when served with a warrant.

Wittes and Bedell argue that Apple’s decision to “move aggressively to implement end-to-end encrypted systems, and indeed to boast about them” after being “publicly and repeatedly warned by law enforcement at the very highest levels that ISIS is recruiting Americans” — in part through the use of encrypted messaging apps — could make the company liable if “an ISIS recruit uses exactly this pattern to kill some Americans.”

The blog compares Apple’s actions to a bank sending money to a charity supporting Hamas — knowing that it was a listed foreign terrorist organization.

“The question ultimately turns on whether Apple’s conduct in providing encryption services could, under any circumstances, be construed as material support,” Wittes and Bedell write. The answer, they say, “may be unnerving to executives at Apple.”

As most security experts will agree, FBI Director James Comey’s call for backdoors in encryption will never solve the ostensible “going dark” problem he so loves to cry wolf about (emphasis added):

Imagine that Comey got what he wanted. Imagine that iMessage and Facebook and Skype and everything else US-made had his backdoor. The ISIL operative would tell his potential recruit to use something else, something secure and non-US-made. Maybe an encryption program from Finland, or Switzerland, or Brazil. Maybe Mujahedeen Secrets. Maybe anything. (Sure, some of these will have flaws, and they’ll be identifiable by their metadata, but the FBI already has the metadata, and the better software will rise to the top.) As long as there is something that the ISIL operative can move them to, some software that the American can download and install on their phone or computer, or hardware that they can buy from abroad, the FBI still won’t be able to eavesdrop.

And by pushing these ISIL operatives to non-US platforms, they lose access to the metadata they otherwise have.

Convincing US companies to install backdoors isn’t enough; in order to solve this going dark problem, the FBI has to ensure that an American can only use backdoored software. And the only way to do that is to prohibit the use of non-backdoored software, which is the sort of thing that the UK’s David Cameron said he wanted for his country in January.

Read the entire article to see why this call for backdoor-able encryption is just outright dangerous, not only for the public but for law enforcement as well.

Thanks to the advocacy of many industry and privacy groups, the Obama Administration has finally listened and is rewriting its controversial zero-day export policy (emphasis added):

For two months, security researchers have been fighting a controversial export policy known as the Wassenaar Arrangement — and now it looks like they may have won a crucial battle in that fight. In a closed-door meeting this morning, a Commerce Department representative said the agency’s Wassenaar-inspired export controls were currently being rewritten after the comment period ended last week. The new version will be “quite different,” according to a Commerce official quoted by PoliticoPro, and will be followed by a second round of public comments.

First laid out in May, the Department of Commerce’s new export rules were controversial from the start, with many in the security community saying the rules would make it impossible to develop and deploy benign security tools. Companies also raised concerns that the rules would hamper international bug bounties, which are now a common security practice among software vendors. Commerce held a two-month comment period on the proposed rules, in which time Google, Facebook, and dozens of other companies filed comments critical of the regulations as written. Now that the comment period is closed, it appears Commerce took those criticisms to heart.

The outstanding question is this: how much of the policy will be rewritten to address the many real concerns with the original draft?

In what most cyber security experts would say is a surprising change of heart, former Homeland Security Secretary Michael Chertoff publicly discloses his disagreement with FBI Director James Comey on the government’s desire to backdoor encryption (emphasis added):

I think that it’s a mistake to require companies that are making hardware and software to build a duplicate key or a back door even if you hedge it with the notion that there’s going to be a court order. And I say that for a number of reasons and I’ve given it quite a bit of thought and I’m working with some companies in this area too.

First of all, there is, when you do require a duplicate key or some other form of back door, there is an increased risk and increased vulnerability. You can manage that to some extent. But it does prevent you from certain kinds of encryption. So you’re basically making things less secure for ordinary people.

The second thing is that the really bad people are going to find apps and tools that are going to allow them to encrypt everything without a back door. These apps are multiplying all the time. The idea that you’re going to be able to stop this, particularly given the global environment, I think is a pipe dream. So what would wind up happening is people who are legitimate actors will be taking somewhat less secure communications and the bad guys will still not be able to be decrypted.

The third thing is that what are we going to tell other countries? When other countries say great, we want to have a duplicate key too, with Beijing or in Moscow or someplace else? The companies are not going to have a principled basis to refuse to do that. So that’s going to be a strategic problem for us.

Finally, I guess I have a couple of overarching comments. One is we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information. We often make trade-offs and we make it more difficult. If that were not the case then why wouldn’t the government simply say all of these [takes out phone] have to be configured so they’re constantly recording everything that we say and do and then when you get a court order it gets turned over and we wind up convicting ourselves. So I don’t think socially we do that.

And I also think that experience shows we’re not quite as dark, sometimes, as we fear we are. In the 90s — when encryption first became a big deal — there was a debate about a Clipper Chip that would be embedded in devices or whatever your communications equipment was to allow court ordered interception. Congress ultimately and the President did not agree to that. And, from talking to people in the community afterwards, you know what? We collected more than ever. We found ways to deal with that issue.

These are all the exact same arguments security experts have been making in opposition to the idea of some kind of unicorn-dust magic key that will unlock every form of encryption available on the planet. It is very curious to see Chertoff have this change of heart considering his background, and especially since he used to have to toe the governmental party line on law enforcement capabilities.

That he hits the nail directly on the head with respect to why the FBI’s notion of a backdoor is such a bad idea is quite noteworthy. Chertoff is quite possibly the first former federal law enforcement senior leader to publicly disagree with Comey and the Obama administration on its ardent desire to make it easy for the FBI and other federal agencies to spy on Americans.

Hopefully Chertoff’s influence extends beyond his own nose and others in his sphere come to the realization that this idea of the government holding a magic backdoor key is nothing but a pure pipe dream.

Ransomware is one of the more evil malware types, primarily leveraged by crime syndicates looking to extort money from people who do not know any better. A new version of TeslaCrypt has been released with changes to the encryption scheme, potentially to make the malware appear more intimidating (emphasis added):

“Why use this false front? We can only guess – perhaps the attackers wanted to impress the gravity of the situation on their victims: files encrypted by CryptoWall still cannot be decrypted, which is not true of many TeslaCrypt infections,” Fedor Sinitsyn of Kaspersky Lab wrote in an analysis of the new ransomware.

But the more significant modification in version 2.0.0 is the inclusion of an updated encryption method. TeslaCrypt, like many other ransomware variants, encrypts the files on victims’ machines and demands a payment in order to obtain the decryption key. The payment typically must be in Bitcoin and the attackers using crypto ransomware have been quite successful in running their scams. Estimates of the revenue generated by variants such as CryptoLocker run into the millions of dollars per month.

Researchers have had some success in finding methods to decrypt files encrypted by ransomware, specifically TeslaCrypt. But the change to the malware’s encryption method may make that more difficult.

“The encryption scheme has been improved again and is now even more sophisticated than before. Keys are generated using the ECDH algorithm. The cybercriminals introduced it in versions 0.3.x, but in this version it seems more relevant because it serves a specific purpose, enabling the attackers to decrypt files using a ‘master key’ alone,” Sinitsyn said.

“Each file is encrypted using the AES-256-CBC algorithm with session_priv as a key. An encrypted file gets an additional extension, ‘.zzz’. A service structure is added to the beginning of the file, followed by encrypted file contents.”
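The “master key” property Sinitsyn describes falls out of any Diffie-Hellman-style exchange: the attackers hold one long-term keypair, each infection generates an ephemeral keypair, both sides of the exchange arrive at the same shared secret, and so the master private key alone can later recompute every session key. Here is a toy finite-field sketch of that relationship — deliberately weak parameters and plain integer DH rather than the real ECDH, purely to show why the master key suffices:

```python
import hashlib

P = 2**127 - 1   # a Mersenne prime; toy modulus, far too small for real use
G = 3

def public_key(secret: int) -> int:
    return pow(G, secret, P)

def session_key(shared: int) -> bytes:
    """Hash the shared DH value into a symmetric key."""
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

# Long-term "master" keypair; only the public half A ships with the implant.
a = 0x1F2E3D4C5B6A7988
A = public_key(a)

# Each infection makes an ephemeral keypair, derives a session key from A,
# encrypts files with it, and stores only the public value B in the header.
b = 0x123456789ABCDEF
B = public_key(b)
victim_session = session_key(pow(A, b, P))

# Later, the master secret alone recomputes the same session key from B --
# no per-victim state needs to be kept.
recovered_session = session_key(pow(B, a, P))
```

Both sides compute G^(ab) mod P, which is why a single master private key can unlock every infection’s files — the “specific purpose” Sinitsyn notes in the quote above.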