Category Archives: Opinion

Oh dear! Oh dear! I shall be too late! – The White Rabbit

29 October 2017

WannaCry, NotPetya, and now: Bad Rabbit. The good news is that Bad Rabbit isn’t spreading as fast as WannaCry and NotPetya did. According to a DARKReading report from October 25th, the outbreak already appears to be dying down.

The bad news is that it happened again. Like the White Rabbit in Alice’s Adventures in Wonderland, IT departments seem only to mutter “Oh dear! Oh dear! I shall be too late!” instead of raising the security baseline of their company networks.

Bad Rabbit uses techniques similar to those of WannaCry and NotPetya to spread through networks:

Open SMB shares, Mimikatz-like credential dumping from the affected systems, a hardcoded list of credentials, … For more technical details, see this post from Malwarebytes Labs.

The methods to avoid this are well known, and they are easy and cheap to implement:

  • Run a user awareness campaign.
  • Reduce the number of users and administrators working with permanent administrative privileges to zero. This is a leadership task!
  • Apply the measures to mitigate Pass-the-Hash attacks to all Windows systems and networks.
  • Limit the functionality of technical users to local systems and the lowest possible privileges. Use individual passwords, eliminate default passwords.
  • Review all firewall rules. Question every required connection. Limit the use of the SMB protocol as far as possible. Eliminate the use of unsecured protocols as far as possible. Patch the systems at the endpoints of firewall rules.

The above list is not exhaustive, but if implemented, the attacker’s ability to explore the network is clearly reduced.
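The firewall review in the last bullet above can be partially automated. The sketch below flags rules that allow SMB/NetBIOS or other legacy protocols so they can be questioned; the rule export format and port list here are my own illustrative assumptions, not any particular firewall vendor's format:

```python
# Flag firewall rules that allow SMB/NetBIOS or unencrypted legacy
# protocols. The dictionary-based rule format is hypothetical.

RISKY_PORTS = {
    139: "NetBIOS Session Service (SMB over NetBIOS)",
    445: "SMB",
    23: "Telnet (unencrypted)",
    21: "FTP (unencrypted)",
}

def review_rules(rules):
    """Return (rule name, reason) pairs that need explicit justification."""
    findings = []
    for rule in rules:
        if rule.get("action") != "allow":
            continue
        port = rule.get("dst_port")
        if port in RISKY_PORTS:
            findings.append((rule["name"], RISKY_PORTS[port]))
    return findings

rules = [
    {"name": "fileserver-smb", "action": "allow", "dst_port": 445},
    {"name": "web-https", "action": "allow", "dst_port": 443},
    {"name": "legacy-telnet", "action": "allow", "dst_port": 23},
]

for name, reason in review_rules(rules):
    print(f"{name}: allows {reason} - question this connection")
```

Such a script does not replace the manual review, but it quickly produces the list of connections that must be justified or removed.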

It appears to me that everyone is waiting for Windows 10 to solve some of these issues. This, however, is the wrong approach. Windows 10 cannot be introduced with a big bang. In the production, lab, and building automation domains in particular, it will take a few years until we can shut down Windows XP/7 completely. And during these years, our networks are at risk.

With this, there is no time to lose. The White Rabbit returns.

Have a great week.


To panic, or not to panic, that is the question: A simple panic calculator

22 October 2017

Last week, I had some interesting discussions about when to panic if a new vulnerability is published. With the concept of critical vulnerabilities in mind, this is an easy task:

My panic level calculator.

To be honest, the panic in the media about the WPA2 / KRACK vulnerability published last week appears somewhat exaggerated. CVE-2017-11292, however, a remote code execution vulnerability in Flash Player published on 16 October 2017, was hardly discussed in the media at all, although Kaspersky had already found an exploit in the wild on 10 October 2017.

Please keep in mind that critical vulnerabilities must be mitigated before an exploit is available on the market. The Flash Player vulnerability shows that immediate action is required.
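The decision logic behind such a panic calculator can be sketched in a few lines. This is my own reading of the criteria discussed here, not the actual calculator from the image: panic is warranted when a vulnerability is critical and an exploit is already on the market before the patch is applied.

```python
def panic_level(critical, exploit_available, patch_applied):
    """A toy panic calculator; the inputs and levels are illustrative."""
    if not critical:
        return "keep calm"
    if patch_applied:
        return "keep calm"
    if exploit_available:
        return "panic - patch immediately"
    return "act now - patch before an exploit appears"

# KRACK: serious, but it requires proximity to the victim's WLAN and
# allows no arbitrary remote code execution -> not critical as defined here.
print(panic_level(critical=False, exploit_available=False, patch_applied=False))

# CVE-2017-11292: critical, exploit seen in the wild before patching.
print(panic_level(critical=True, exploit_available=True, patch_applied=False))
```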

Have a great week!

Top secret information about Australia’s military hacked – SMEs overstretched with Cyber Security Frameworks

15 October 2017

Lisa Martin’s report “Top secret information about Australia’s military hacked”, published on October 12th, 2017 at news.com.au, about a year-old attack on an Australian defense contractor, is another example that small businesses are technically and organizationally overstretched by the challenges of cyber security.

The best approach for SMEs would be to set up a cyber security framework like the NIST Cyber Security Framework or an ISO 27001 based framework. But for small businesses, the effort to do this is simply too high.

For SMEs to stay ahead of the cyber security curve, a light version of such frameworks is required, with the focus on actively managing risk.

The Strategies to Mitigate Cyber Security Incidents of the Australian Signals Directorate (ASD) put the focus on the basics. If carefully implemented and regularly assessed, the security level goes up and attacks of this kind become far harder. Even large businesses can raise their security level by implementing the ASD’s recommendations.

But when it comes to critical infrastructures, the full implementation of a cyber security framework is the only way to survive in the long term. By the way, the first task in the NIST CSF core is asset management…

Have a great week.

Congress hearing brings light into the Equifax darkness – Trust in the results of vulnerability scans a possible cause for the Equifax data breach?

8 October 2017

On October 3rd, 2017 a hearing titled “Oversight of the Equifax Data Breach: Answers for Consumers” was conducted by the Subcommittee on Digital Commerce and Consumer Protection of the Committee on Energy and Commerce of the 115th U.S. Congress.

Sarah Buhr’s report “Former Equifax CEO says breach boiled down to one person not doing their job” posted on October 3rd, 2017 on TechCrunch gives a good summary of the hearing results.

And a surprisingly plain cause for the data breach:

‘“The human error was that the individual who’s responsible for communicating in the organization to apply the patch, did not,” Smith, who did not name this individual, told the committee.’

To be honest, this sounds somewhat too easy.

In the witness statement of Mr. Smith one reads:

[1] “On March 9, Equifax disseminated the U.S. CERT notification internally by email requesting that applicable personnel responsible for an Apache Struts installation upgrade their software. Consistent with Equifax’s patching policy, the Equifax security department required that patching occur within a 48 hour time period.

[2] “We now know that the vulnerable version of Apache Struts within Equifax was not identified or patched in response to the internal March 9 notification to information technology personnel.”

[3] “On March 15, Equifax’s information security department also ran scans that should have identified any systems that were vulnerable to the Apache Struts issue identified by U.S. CERT. Unfortunately, however, the scans did not identify the Apache Struts vulnerability. Equifax’s efforts undertaken in March 2017 did not identify any versions of Apache Struts that were subject to this vulnerability, and the vulnerability remained in an Equifax web application much longer than it should have.”

[4] “The company knows, however, that it was this unpatched vulnerability that allowed hackers to access personal identifying information.”

From [1] we learn that Equifax has a good vulnerability management process in place.

From [2] and [3] we learn that Equifax’s vulnerability scanner did not find the systems with the vulnerable version installed, although the company knew that the vulnerable Apache Struts framework was installed on some systems.

There are many reasons for this. Just two examples:

  1. Vulnerability scanners must be updated regularly to keep pace with the threat market. But there is often a gap between the release of a scanner update and the disclosure of a vulnerability.
  2. Even up-to-date vulnerability scanners do not achieve 100% detection. In addition, the results are often somewhat puzzling.

From my point of view, misplaced trust in the results of the vulnerability scanners is the root cause of the data breach. Because of the gap between the disclosure of a vulnerability and the release of a scanner update, an up-to-date asset repository, at least for the systems facing the internet, is desperately needed for fast identification of vulnerable systems.
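The point about the asset repository can be made concrete. Instead of waiting for a scanner signature, a lookup against a maintained software inventory identifies affected systems the moment a vulnerability is disclosed. A minimal sketch, with a hypothetical inventory format and hostnames; for simplicity the version comparison ignores the separate 2.3.x fix line (2.3.32) and treats everything below 2.5.10.1 as affected:

```python
# Match a newly disclosed vulnerable version range against an asset
# repository. Inventory format, hostnames, and versions are illustrative.

ASSETS = {
    "web-frontend-01": {"apache-struts": "2.3.30"},
    "web-frontend-02": {"apache-struts": "2.5.10"},
    "intranet-portal": {"apache-struts": "2.5.11"},
}

def parse_version(v):
    return tuple(int(x) for x in v.split("."))

def affected_systems(assets, product, fixed_version):
    """Return systems running 'product' in a version below the fix."""
    fixed = parse_version(fixed_version)
    return sorted(
        host for host, software in assets.items()
        if product in software and parse_version(software[product]) < fixed
    )

# CVE-2017-5638 was fixed in Apache Struts 2.5.10.1.
print(affected_systems(ASSETS, "apache-struts", "2.5.10.1"))
```

With such a lookup, the list of systems to patch is available within minutes of the disclosure, independent of any scanner update cycle.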

The reduction to a single human error is just too plain. This sounds more like a systematic error. And it provides some interesting food for thought for the next week.

Have a great week.

Keep calm and ignore Illusion Gap

5 October 2017

While I was on a cycling trip through the Eifel National Park last week, a new weakness called Illusion Gap was extensively discussed in the media.

Security researchers at CyberArk detected a feature in the Windows SMB Server that allows attackers to bypass Windows Defender, and possibly other anti-malware products, when serving an executable from a file share. For more details please see Kasif Dekel’s excellent post at the CyberArk Threat Research blog.

CyberArk notified Microsoft of this vulnerability, but Microsoft did not view it as a security issue:

“Thanks for your email. Based on your report, successful attack requires a user to run/trust content from an untrusted SMB share backed by a custom server that can change its behavior depending on the access pattern. This doesn’t seem to be a security issue but a feature request which I have forwarded to the engineering group.”

In my opinion, the effort to successfully exploit Illusion Gap appears to be somewhat too high:

First of all, an attacker must convince a user to execute a program that installs a specially crafted SMB server service on a Windows system. Since administrative privileges are required to do this, the perfect victim should either work with permanent administrative rights or at least have access to an administrative account that can be leveraged for UAC elevation. Finally, the attacker must install a malicious and a clean version of the executable on the newly created file share and trick a user into running the executable from the share.

Since the attack complexity is high and authentication is required, the likelihood of rapid detection is high. This is aggravated, from the attacker’s point of view, by the fact that the execution of programs from file shares is often used as an indicator of compromise.

With this, we should not waste our time on Illusion Gap.

Have a great weekend.

Critical vulnerabilities require immediate action – How to prevent Equifax like attacks

23 September 2017

Critical vulnerabilities

  • are exploitable from the network (Access Vector: Network),
  • require only low or medium skills to exploit (Access Complexity: Low or Medium),
  • require no authentication (Authentication: None),
  • cause great damage (Severity: High), and
  • allow remote attackers to execute arbitrary code on the victim’s computer.

Among the vulnerabilities with CVSS vector (AV:N/AC:L/Au:N) or (AV:N/AC:M/Au:N) that cause great damage, the last property makes the difference.
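The criteria above can be expressed as a small classifier over the CVSSv2 base vector. Whether a vulnerability allows arbitrary code execution is not encoded in the vector itself, so it is passed in as a separate flag here; the function is a sketch of the definition used in this post, not an official CVSS scoring routine:

```python
# Classify a vulnerability as "critical" in the sense of this post,
# based on its CVSSv2 base vector plus a remote-code-execution flag.

def parse_cvss2(vector):
    """Parse a CVSSv2 vector like '(AV:N/AC:M/Au:N/C:C/I:C/A:C)'."""
    return dict(part.split(":") for part in vector.strip("()").split("/"))

def is_critical(vector, allows_code_execution):
    metrics = parse_cvss2(vector)
    return (
        metrics.get("AV") == "N"             # exploitable from the network
        and metrics.get("AC") in ("L", "M")  # low or medium complexity
        and metrics.get("Au") == "N"         # no authentication required
        and allows_code_execution            # arbitrary code execution
    )

# CVE-2017-5638 (Apache Struts, used in the Equifax attack)
print(is_critical("(AV:N/AC:L/Au:N/C:C/I:C/A:C)", allows_code_execution=True))
```

Running such a filter over a CVE feed yields exactly the small set of vulnerabilities that demand immediate action.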

The infographic below shows that the number of critical vulnerabilities (320) is very small compared to the total number of vulnerabilities in 2016.

Critical vulnerabilities 2016.

Nevertheless, immediate action is required because the reach of attacks is technically unlimited if critical vulnerabilities can be exploited.

Once an attacker has exploited a critical vulnerability on a server in the DMZ, he is able to execute arbitrary code on this computer. With this, he can probe the network for other computers with critical vulnerabilities, or leverage Windows built-in weaknesses, configuration issues, and tools to explore the network until he finally reaches a computer that has a connection across a firewall to the company network.

Both NotPetya and WannaCry exploited critical vulnerabilities. While WannaCry was just annoying, NotPetya caused multi-million-dollar damage in companies across the world.

Mitigation

The TEAM approach for handling risks shows the direction for dealing with critical vulnerabilities.

Transfer: No insurer will take the risk because in the case of a critical vulnerability on a server in the DMZ both the probability of occurrence and the impact are high.

Eliminate: Not possible, because this would result in loss of business.

Accept: No option because the probability of occurrence and the impact are high.

Mitigate: Patching is the only possible response in this case. Isolation of the system from the network will result in loss of business.

Urgency

Under normal conditions, patches are available at the time of disclosure.

Rule: Critical vulnerabilities should be patched faster than exploits show up on the market.

With this, immediate action is required, because very often exploits are already available at the time of disclosure. In addition, we cannot expect that only ethical hackers publish vulnerabilities.

Equifax

Critical vulnerabilities mitigation process.

In the Equifax attack the critical vulnerability CVE-2017-5638 in the Apache Struts framework was used. A patch was available at the time of disclosure but apparently not applied.

Patching the Apache Struts framework is a challenging job.

Firstly, it is a challenge to identify the systems with the vulnerable framework installed.

Secondly, patches must be carefully tested prior to implementation to avoid business loss.

Finally, the patches must be implemented manually because automated patch management is not available.

Thus, an up-to-date asset repository, a current QA system, and up-to-date automated test routines are required to get the job done in the required short time frame.

To be honest, the Equifax attack remains a mystery to me. The IT shop of a billion-dollar company should be able to deal with critical vulnerabilities in the required short time. Perhaps someone simply underestimated the risk.

For more details on the Equifax attack see Steven Bellovin’s post Preliminary Thoughts on the Equifax Hack published at CircleID.

Have a great weekend.

A lesson in Phishing and Two Factor Authentication

13 August 2017

The post ‘Hackers Hijack Popular Chrome Extension to Inject Code into Web Developers’ Browsers’, published on August 3, 2017 by Graham Cluley at the Tripwire blog ‘The State of Security’, gives another good reason for the use of Two Factor Authentication.

Since phishing emails are becoming better and better, it is not surprising that even professionals can be tricked.

Thus I can fully accept the developer’s answer ‘I stupidly fell for a phishing attack on my Google account.’ to the question ‘Any idea how this could have happened?’.

But I cannot understand why the Google account was not secured with Two Factor Authentication (TFA), in particular because Google’s Push Notification makes life with TFA really easy.

With TFA enabled, this cyber attack could have been prevented.

Have a great week, and activate TFA for your Google account.