Verifying webserver compression - BREACH attack

BREACH attack

A few lines of Bash script let you check which compression methods are supported by an SSL/TLS-enabled webserver.

target=URL-OF-TARGET
for compression in compress deflate exi gzip identity pack200-gzip br \
  bzip2 lzma peerdist sdch xpress xz; do
  curl -ksI -H "Accept-Encoding: ${compression}" "https://${target}" | grep -i "${compression}"
done
If you see any output (meaning the server supports at least one of these compression algorithms), the site might be vulnerable to a BREACH attack. Might, because an attacker has to 'inject' content into the output (and have some control over it): this is called a chosen plaintext attack.
By carefully injecting certain content into the page, an attacker is able to deduce parts of the page content by merely observing the size of the (compressed) response. An attacker therefore also has to be able to observe the server's responses. A third prerequisite is that the secret (that an attacker wants to steal) is contained in the server response's body, and not 'just' in the response's headers. Cookies are therefore out of scope.
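
To see why the response size can leak content, consider a minimal sketch of the size oracle, with everything hypothetical: it assumes a parameter q that is reflected into the compressed response body, and reuses the target variable from above. A guess that matches part of a secret in the same body compresses slightly better:

# hypothetical size oracle: q is assumed to be reflected into the response body
for guess in token=a token=b token=c; do
  size=$(curl -ks -H "Accept-Encoding: gzip" "https://${target}/?q=${guess}" | wc -c)
  echo "${guess}: ${size} bytes"
done

The guess with the smallest response is the most likely match; repeating this character by character recovers the secret.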

The easiest mitigation is to disable HTTP compression completely. Other, less practical mitigations are adding random content to each page (which changes the compressed size per request), rate limiting the …

more ...

Automating OAuth 2.0 / JWT token retrieval for pentests

OAuth 2.0

Recently I was pentesting a complex API which used the OAuth 2.0 framework for authentication. Each API call needed an Authorization: Bearer header, containing a valid JSON Web Token (JWT).

To access the API I needed a lot of JWT tokens, as the tokens had a very short expiry time. To facilitate the quick generation of tokens I created a basic script that automates the OAuth authorization: it logs on to a domain, requests an authorization code, and exchanges that code for an access token.
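
That last step is the standard authorization code grant from RFC 6749; a minimal curl sketch, in which the token endpoint, client credentials and code are all placeholders rather than the actual API's values:

# exchange an authorization code for an access token (all values are placeholders)
curl -s https://AUTH-SERVER/oauth/token \
  -d grant_type=authorization_code \
  -d code=AUTHORIZATION-CODE \
  -d redirect_uri=https://CLIENT/callback \
  -d client_id=CLIENT-ID \
  -d client_secret=CLIENT-SECRET

The JSON response contains the bearer token (the JWT) that goes into the Authorization header.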

One or more of these steps can be circumvented by command line options (e.g. by specifying valid cookies), to speed up the process.

Another feature of the script is that it automatically performs GETs, POSTs, PUTs and DELETEs with valid tokens against a list of API endpoints (URLs). This preloads all API calls into an (attacking) proxy, which sped up the pentest tremendously.
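
The preloading itself boils down to replaying every endpoint through an intercepting proxy. A rough sketch, assuming a Burp-style proxy on 127.0.0.1:8080, an endpoint list in urls.txt, and a hypothetical get_token helper that prints a fresh JWT:

token=$(get_token)   # hypothetical helper that prints a valid JWT
while read -r url; do
  for method in GET POST PUT DELETE; do
    curl -ks -x http://127.0.0.1:8080 -X "${method}" \
      -H "Authorization: Bearer ${token}" "${url}" -o /dev/null
  done
done < urls.txt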

JSON Web Tokens

A JWT is basically a string, representing a collection of one or more claims. Claims are name/value pairs which state information about a user or subject. The claims are either signed (JSON Web Signature, JWS) or encrypted (JSON Web Encryption, JWE). JWTs serve …

more ...

Getting to know your pentesting tools - curl and the HTTP/1.0 protocol

Tooling is important

When pentesting, it's always handy to have a bunch of automated scanners do the grunt work for you. Using automated tools saves time and can help in spotting potential vulnerabilities. Usually I run analyze_hosts.py, a wrapper around the open source tools droopescan, nmap, nikto, Wappalyzer and WPscan, with a bit of intelligence built in.

Recently I had a pentesting engagement where nikto flagged an IIS server as leaking the internal IP address (see https://support.microsoft.com/en-us/kb/218180 for more information).

This is a very common issue with older, unhardened IIS servers. The issue is triggered when an HTTP/1.0 request is made to the server without supplying a Host header. The resulting Content-Location header will contain the server's (private) IP address, thereby leaking information which can subsequently be used for other attacks.
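
Such a request can be sketched with curl. Note that curl normally adds a Host header itself, even for HTTP/1.0 requests, so the header has to be stripped explicitly by passing an empty value (TARGET is a placeholder):

# HTTP/1.0 request without a Host header; -H "Host:" removes curl's own header
curl -sI --http1.0 -H "Host:" http://TARGET/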

Example of a server's partial response:

HTTP/1.1 200 OK
Content-Location: http://1.1.1.1/Default.htm

I ran into an interesting observation when I needed to reproduce this using curl. Curl is a great tool to do all kinds of HTTP requests on the fly, and it's very well suited for scripting. It has flags to specify the protocol (e.g …

more ...

Automating repetitive git / setup tasks

repetitive work

Imagine you work on a large number of projects. Each of those projects has its own git repository and accompanying notes file outside of the repo. Each git repository has its own githooks and custom setup.

Imagine having to work with multiple namespaces on different remote servers. Imagine setting up these projects by hand, multiple times a week.

Automation to the rescue ! Where I usually use Bash shell scripts to automate workflows, I'm moving more and more towards Python. It's cross-platform and sometimes easier to work with, as you have a large number of libraries at your disposal.

I wrote a simple Python script that does all of those things more or less automatically. Feed the script the name of the repository you want to clone, and optionally a namespace, patchfile and templatefile variable (either on the command line or using a configuration file). The script will then (a rough shell sketch follows the list):

  • clone the repository
  • modify the repository (e.g. apply githooks)
  • optionally modify the repository based on given variables
  • create a new notes file from a template
  • optionally modify the notes file based on given variables
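
The script itself is Python, but the workflow it automates can be sketched in a few lines of shell; the remote location, hooks directory and template file below are illustrative assumptions, not the script's actual layout:

# clone the repository, apply githooks, and create a notes file from a template
git clone "git@REMOTE:${namespace}/${repo}.git"
cp githooks/* "${repo}/.git/hooks/"
sed "s/PROJECT/${repo}/g" notes-template.md > "notes-${repo}.md"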

The advantage is that you can use a configuration file containing the location of the remote git repository, the patchfile …

more ...

Bash vs Python (dependency hell)

For a number of years I maintained a small collection of open source security scripts, written in Bash. The main purpose of these scripts was to act as a wrapper around other open source tools. Why try to remember long and awkward command line parameters, when you can ask a script to do that for you ?

Bash was chosen, as it was distribution-independent. It works almost everywhere (although sometimes OSX support is troublesome, due to outdated Bash versions).

After more and more (requested) features crept in, the analyze_hosts.sh Bash script became more and more complex. That's why I decided to port the script to Python. In my experience, it's at least as portable, and the usage of third party (pip) packages means that less time is spent on re-inventing the wheel, and more on the actual functionality.

Yes, sometimes people talk about the dependency hell of Python, and in some cases, the usage of third party packages means you have to be careful of what you're doing.
However, when using virtual environments, each Python script and its dependencies can be safely separated from the 'main' Python installation. For example, the following commands create a separate virtual environment for the security scripts repo …
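
A typical setup along those lines looks something like this, with the path and requirements file as illustrative assumptions rather than the commands from the post:

# create and activate an isolated environment for the security scripts
virtualenv ~/virtualenvs/security-scripts
source ~/virtualenvs/security-scripts/bin/activate
pip install -r requirements.txt
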
more ...

Open secure redirect

left or right

Aren't those RFC docs amazing ? Reading up on standards ?

I needed to spend plenty of time on them, as I encountered some interesting issues. As it turned out, some websites / loadbalancers are overly optimistic in encrypting all the things - actually, in redirecting all the things.

TL;DR

Never trust HTTP(S) clients, and be careful when setting up redirection rules. A non-RFC-compliant client can trigger a (difficult to exploit) open redirect vulnerability, due to a non-RFC-compliant server.

This vulnerability can be tested using

analyze_hosts.py --http TARGET

See https://github.com/PeterMosmans/security-scripts/ for the latest version of analyze_hosts.py

Be warned, long post ahead: A while ago I came across some servers that, when being sent insecure requests, responded with a redirect to the secure version.

Request:

% curl -sI http://VICTIM/

Response:

HTTP/1.1 301 Moved Permanently
Connection: close
Location: https://VICTIM/

So far so good, nothing fancy going on here. In fact, this is excellent behaviour. Insecure requests are immediately upgraded to secure requests.

However, the server seemed to be overly happy in redirecting, as it listened to the client-supplied Host header:

% curl -s -I -H "Host: MALICIOUS" http://VICTIM/

And the server responded with:

HTTP/1 …
more ...

Automatic XML validation when using git

XML
Recently I worked on a project which involved manually editing a bunch of XML files. Emacs is my favorite ~operating system~ editor, and it has XML validation built in (using the nXML mode). It highlights validation errors while you type. Unfortunately, even with Emacs showing potential issues in RED COLOR, I managed to commit a number of broken XML files to my local git repository. Subsequently, when I pushed my errors to the remote 'origin' git repository, the errors broke builds.

Of course this can be completely prevented by locally using pre-commit hooks. If your local git repository validates XML files before you can commit them, and denies invalid XML files, then one part of the problem is solved.

A pre-receive hook on the receiving server side can do the same as a pre-commit hook locally: Validate XML files before letting somebody push a commit which can break the build process.
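
To illustrate the idea, here is a minimal shell sketch of such a pre-commit hook, assuming xmllint (from libxml2) is available on the committing machine; the actual hooks mentioned below are written in Python:

#!/usr/bin/env bash
# reject the commit when any staged .xml file fails to parse
status=0
for file in $(git diff --cached --name-only --diff-filter=ACM -- '*.xml'); do
  # validate the staged version of the file, not the working copy
  git show ":${file}" | xmllint --noout - || status=1
done
exit ${status}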

I looked around the Internet but couldn't find a quick, lightweight script to do only and exactly that. That's the reason I whipped up a basic pre-commit and pre-receive hook, written in Python.

You can find the very basic and rough code at https://github.com/PeterMosmans/git-utilities.
By changing the …
more ...

Preparing your team for a CTF competition - Defcon style

Defcon

Playing Capture The Flag with a team on location is something completely different than performing penetration tests, security assessments or even trying to solve CTF challenges over the Internet.

At Defcon 23 I joined a team of really knowledgeable, nice and friendly people for the OpenCTF competition. It was an exhilarating ride, from setting up all the equipment to the glorious finish. Playing Capture The Flag at Defcon was educational but foremost fun, fun and fun.

So why would you spend a good chunk of 48 hours sitting in a chair behind a screen while there is so much more to see and experience at Defcon ? In one word: the indescribably exciting atmosphere of playing during a conference, of competing against all these bright people from all over the world, desperately trying to solve the challenges.

Here are some of my personal notes on how to get the most out of competing in an OpenCTF competition with a team:

  • Allow plenty of time before the competition to set up (and harden - don't be a fool like me) your machine. Make sure you have all necessary tools and notes.
  • Make sure beforehand that all team members have one communication channel (e.g. IRC …
more ...

Defcon 23 was great - people are great

Defcon 23
For quite a while now, I have been working in the security industry. One of the things I do is provide companies with security advice on all sorts of guidelines, policies and hardening stuff. Web penetration testing is also something I do very regularly. In other words, a disclaimer before you read on: I should have known better...
VirtualBox, Packer, Vagrant and Ansible are tools that I use a lot. These four tools make virtualizing and provisioning really easy. You can create new machines, experiment with them and test different setups in a repeatable and automated way.
As I sometimes organize pentesting workshops, I have several virtual machines with Kali (a penetration testing distribution) installed on them readily available.
So, I connected my laptop to the network of the 23rd Defcon conference in Las Vegas while one of these standard Kali virtual machines was (still) running as a guest on my machine. Not only was Kali running, the guest was also configured to run in bridged networking mode. This means that Kali got its own network IP address assigned.
What I hadn't changed on that machine was Kali's default root password. To make matters worse, what I had changed was the ssh server …
more ...

The future is here: HTTP/2

Last month I held a number of presentations on the latest and greatest HTTP/2 protocol. It's an area where there's currently a lot of demand for knowledge and practical tips. Most people are surprised to find out that they're actually already using it on a daily basis.

If you're interested you could check out an Ansible role which installs a number of client-side and server-side tools, all HTTP/2-enabled:

  • curl - A data transferring tool with HTTP/2 support
  • h2load - A benchmarking tool for HTTP/2 and SPDY servers
  • nghttp - An HTTP/2 client with SPDY support
  • nghttpd - An HTTP/2 server with SPDY support
  • nghttpx - A transparent HTTP/2 proxy with SPDY support
  • openssl - A cryptographic library with ALPN support (1.0.2-chacha)

The following libraries will be installed:

  • libcrypto - OpenSSL
  • libcurl - CURL library
  • libnghttp2 - An HTTP/2 and HPACK C library
  • libspdylay - A SPDY library
  • libssl - OpenSSL

You can find the role at https://github.com/PeterMosmans/ansible-role-http2
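
Once the role has run, checking whether a server actually negotiates HTTP/2 is a one-liner with the installed tools (example.com is just a placeholder):

# request HTTP/2 explicitly and show the negotiated protocol line
curl -sI --http2 https://example.com/ | head -n 1

# fetch the same page with nghttp: -v prints the HTTP/2 frames, -n discards the body
nghttp -nv https://example.com/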

more ...