When to sharpen, and when to cut

Cut down a tree

When performing a task for the first time, I consider whether it's a one-off, or whether it will become a recurring task. Python scripts, for example, can be developed blazingly fast, and a little bit of automation can go a long way.


...but sometimes an automated solution that looked so simple beforehand becomes a wild ride from one rabbit hole into the other. Missing dependencies, compile errors, functions that don't lend themselves very well to automation; everything that can go wrong will go wrong.

That's why I like the Pomodoro Technique [1] so much, where you work in discrete time chunks of say 25, or 30 minutes. You decide upon the maximum cost of the implementation beforehand. Given the expected return, what is a sane investment? If the time is up, then it's back to the original task at hand.
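The time-boxed chunk itself is trivial to sketch in Python - a minimal illustration of the idea, not a full Pomodoro app:

```python
import time

def pomodoro(minutes=25):
    """Run one time-boxed work chunk; return the seconds actually spent.

    A minimal sketch of the time-boxing idea - real Pomodoro tools add
    breaks, notifications and logging on top of this.
    """
    start = time.monotonic()
    deadline = start + minutes * 60
    while time.monotonic() < deadline:
        time.sleep(0.1)  # in a real session: do (part of) the work here
    return time.monotonic() - start
```

When the function returns, the budget is spent - back to the original task.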

I have learned the hard way to always budget some time for documenting the (partial) solution, so that at least there's the profit of knowledge gained. Or, another record of a failed attempt...

more ...

Rebase OpenSSL 1.0.2-chacha to use TLS 1.3


Since its inception in 2014, the OpenSSL 1.0.2-chacha fork [1] has been used as the standard OpenSSL distribution for numerous SSL/TLS pentesting tools. Compared with the vanilla 1.0.2 branch, it includes default support for ciphers that are deemed insecure, and has extensive starttls support.

However, even though 1.0.2 is deemed a Long Term Supported (LTS) version, no new ciphers or functionality will be added to it.

The initial reason to start the fork was a lack of ChaCha20 / Poly1305 support in the 1.0.2 branch. After that, more and more features and insecure ciphers were added or ported back in from other branches.

As ChaCha20 / Poly1305 support has been added to the 1.1.1 branch, which also contains (preliminary) TLS 1.3 support, it might be time for the insecure OpenSSL version to be rebased onto a new branch. The initial goals will still be the same:

  • Add as many ciphers and as much functionality as possible
  • Keep the source aligned as much as possible to the vanilla version
  • Keep the patches atomic, transparent and maintainable
  • Write as little custom code as possible
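As a quick sanity check, Python's ssl module can show which cipher suites the OpenSSL build it is linked against actually offers - handy for verifying whether ChaCha20 / Poly1305 made it into a given build:

```python
import ssl

# Which OpenSSL is Python linked against?
print(ssl.OPENSSL_VERSION)

# List the ChaCha20 suites (empty on builds without ChaCha20 support)
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
chacha = [c["name"] for c in context.get_ciphers() if "CHACHA20" in c["name"]]
print(chacha)
```

Whether the list is empty depends entirely on the OpenSSL build behind the interpreter.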

This will be quite the challenge, as the architecture and …

more ...

Tools for setting, tracking and achieving long term goals


Immediately after reading an article on David Allen and his brainchild Getting Things Done, I started implementing his methodology. I loved it. I still love it - especially the Getting Things Done concepts of inbox ZERO, maintaining lists, and periodic reviews.

Inbox ZERO for me is not so much about having empty email inboxes as about making sure that input is collected from multiple locations and stored in one dedicated location. An inbox can also be a notebook, or note taking software like Google Keep.

Electronically stored lists have the benefit of being available on a multitude of devices, the ability to synchronize between them, backups, and their biggest advantage - providing dynamic views.


Both tools that I have been using so far (the open source Java application ThinkingRock [1], and Emacs in Org mode [2]) for maintaining lists of actionable items and projects were great in that respect. Using those tools for periodic reviews was a different story. After trying numerous configurations I never got the hang of using ThinkingRock and Emacs for that purpose. Items become abstract letters on a screen. Views never fully captured what was important or which project served which goal.

Periodically reviewing projects and …

more ...

Automating OAuth 2.0 / JWT token retrieval for pentests

OAuth 2.0

Recently I was pentesting a complex API which used the OAuth 2.0 framework for authentication. Each API call needed an Authorization: Bearer header, containing a valid JSON Web Token (JWT).

To access the API I needed a lot of JWT tokens, as the tokens had a very short expiry time. To facilitate the quick generation of tokens I created a basic script that automated the OAuth authorization: it logs on to a domain, requests an authorization code, and exchanges that code for an access token.

One or more of these steps can be circumvented by command line options (e.g. by specifying valid cookies), to speed up the process.
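The final step - exchanging the authorization code for an access token - can be sketched with the standard library alone. The endpoint URL is a placeholder (substitute the target's real token endpoint, and add a client secret if the API requires one):

```python
import json
from urllib import parse, request

# Placeholder - substitute the target's real token endpoint.
TOKEN_URL = "https://login.example.com/oauth2/token"

def token_request(code, client_id, redirect_uri, token_url=TOKEN_URL):
    """Build the code-for-token exchange request (RFC 6749, section 4.1.3)."""
    body = parse.urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
    }).encode()
    return request.Request(token_url, data=body, method="POST")

def fetch_access_token(code, client_id, redirect_uri):
    """Send the exchange request and return the bearer token."""
    with request.urlopen(token_request(code, client_id, redirect_uri)) as r:
        return json.loads(r.read())["access_token"]
```

The returned token then goes into the Authorization: Bearer header of each API call.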

Another feature of the script is that it automatically performs GET, POST, PUT and DELETE requests with valid tokens against a list of API endpoints (URLs). This preloads all API calls into an (attacking) proxy, which sped up the pentest tremendously.
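The preloading idea can be sketched as follows; the proxy address is an assumption (the Burp/ZAP default), and the helper names are mine, not the script's:

```python
from urllib import request

def authorized_request(url, method, token):
    """Build a request carrying a valid Authorization: Bearer header."""
    return request.Request(url, method=method,
                           headers={"Authorization": "Bearer %s" % token})

def preload(endpoints, token, proxy="http://127.0.0.1:8080"):
    """Replay GET/POST/PUT/DELETE against each endpoint through an
    intercepting proxy, so every call gets recorded there."""
    opener = request.build_opener(
        request.ProxyHandler({"http": proxy, "https": proxy}))
    for url in endpoints:
        for method in ("GET", "POST", "PUT", "DELETE"):
            try:
                opener.open(authorized_request(url, method, token), timeout=10)
            except OSError:
                pass  # failures are fine - the proxy has seen the call
```

Error responses don't matter here: the point is that every call shows up in the proxy history, ready for manual inspection and replay.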

JSON Web Tokens

A JWT is basically a string representing a collection of one or more claims. Claims are name/value pairs which state information about a user or subject. The claims are either signed (JSON Web Signature, JWS) or encrypted (JSON Web Encryption, JWE). JWTs serve …
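For a signed (JWS compact form) token, the header and claims are just base64url-encoded JSON and can be inspected in a few lines - without verifying the signature, so for inspection only:

```python
import base64
import json

def decode_jwt(token):
    """Return the header and claims of a signed JWT (JWS compact form),
    without verifying the signature - for inspection only."""
    def b64url_decode(part):
        # re-add the padding that base64url strips
        return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))
    header_b64, claims_b64, _signature = token.split(".")
    return (json.loads(b64url_decode(header_b64)),
            json.loads(b64url_decode(claims_b64)))
```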

more ...

Automating repetitive git / setup tasks

repetitive work

Imagine you work on a large number of projects. Each of those projects has its own git repository and accompanying notes file outside of the repo. Each git repository has its own githooks and custom setup.

Imagine having to work with multiple namespaces on different remote servers. Imagine setting up these projects by hand, multiple times a week.

Automation to the rescue! Where I usually use Bash shell scripts to automate workflows, I'm moving more and more towards Python. It's cross-platform and sometimes easier to work with, as you have a large number of libraries at your disposal.

I wrote a simple Python script that does all of those things more or less 'automated'. Feed the script the name of the repository you want to clone, and optionally a namespace, patchfile and templatefile variable (either command-line or using a configuration file). The script will then:

  • clone the repository
  • modify the repository (e.g. apply githooks)
  • optionally modify the repository based on given variables
  • create a new notes file from a template
  • optionally modify the notes file based on given variables
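The core of those steps fits in a handful of lines. The hook contents and the notes location below are illustrative (the real script takes them from its configuration file):

```python
import pathlib
import shutil
import subprocess

def setup_project(repo_url, target, template):
    """Clone a repository, install a githook and create a notes file
    next to (not inside) the working copy."""
    subprocess.run(["git", "clone", repo_url, target], check=True)
    # install a hook - the hook body here is hypothetical
    hook = pathlib.Path(target, ".git", "hooks", "pre-commit")
    hook.write_text("#!/bin/sh\nexec ./run-checks.sh\n")
    hook.chmod(0o755)
    # create the notes file from a template, outside of the repo
    shutil.copy(template, pathlib.Path(target).with_suffix(".md"))
```

Applying a patchfile or substituting template variables would slot in after the clone and copy steps.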

The advantage is that you can use a configuration file containing the location of the remote git repository, the patchfile …

more ...


If you're like me, you don't want to spend your precious memory on remembering awkward command line parameters. However, lots of tools require exactly that: awkward command line parameters.

To simplify scanning of hosts for network vulnerabilities I wrote a simple wrapper script around several open source security tools. The script lets you analyze one or several hosts for common misconfiguration vulnerabilities and weaknesses.
My main objective in writing the script was to make it as easy as possible to perform generic security tests without any heavy prerequisites, to make the script as informative as possible, and to make use of open source tools.
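The wrapper pattern itself is simple: run each underlying tool and keep its output reviewable. A sketch of the idea (not the actual analyze_hosts code):

```python
import shlex
import subprocess

def run_tool(command, logfile):
    """Run one underlying tool and append the invocation plus its output
    to a logfile, so results stay reviewable afterwards."""
    result = subprocess.run(shlex.split(command),
                            capture_output=True, text=True)
    with open(logfile, "a") as log:
        log.write("$ %s\n%s%s\n" % (command, result.stdout, result.stderr))
    return result.returncode

# e.g.: run_tool("nmap -sV --open www.example.com", "scan.log")
```

The wrapper then only has to know which commands to run for which scan type - the user no longer does.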

Note that the latest version is the Python version - please use that one.

How to install

Clone the git repository using the command

git clone https://github.com/PeterMosmans/security-scripts.git


The script runs under Linux, and needs nmap. The following tools are optional:


  • curl
    for fingerprinting and to test for TRACE
  • dig
    to test for recursive DNS servers
  • git
    to update the script
  • nikto
    for webscanning
  • testssl.sh
    to check the SSL configuration
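The TRACE check that is delegated to curl can also be sketched in a few lines of Python; this helper is hypothetical, not part of the script:

```python
import http.client

def trace_enabled(host, port=80, timeout=5):
    """Return True when the web server answers TRACE with 200 - a possible
    cross-site tracing (XST) weakness."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("TRACE", "/")
        return conn.getresponse().status == 200
    finally:
        conn.close()
```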


Oh, the irony - the command line parameters for the tool:

usage: analyze_hosts.sh [OPTION]... [HOST]

Scanning options:
 -a, --all perform all basic scans
 --max perform all advanced scans (more thorough)
 -b, --basic …
more ...