It’s been a while since I had a dedicated Linux server in my home. In the early days of my career, I maintained a small “data centre” in my basement. It included BSD-based network storage via FreeNAS, a LAMP installation, a Linux-based firewall and directory server, and a Windows domain controller. I spent a lot of time testing my ideas and messing around with open source software. Over time I replaced these machines with commercial off-the-shelf products or moved services to the cloud. Eventually everything was replaced. The move seemed to coincide with my then employer’s outsourcing of network and system administration and application development. In the back of my mind I was thinking that it was pointless to learn to do something I couldn’t use. It’s like learning to play a sport but never getting on the field. I got better at using my “soft” skills even as my hard skills atrophied.
I recently started consulting independently (again) and realised that my knowledge wasn’t as current as I wanted it to be. While it’s great to have the business skills clients value for bridging the communications gap between non-technical and technical staff, I wanted to stay sharp. I also realised that I missed my early days in information security, when I was responsible for vulnerability management. I wanted back in, and I especially wanted to develop and hone a penetration testing skillset. I felt it was time to rebuild my lab.
I have two Raspberry Pi (RPi) devices on my home network and two Macs. The Macs are running OS X Yosemite and the RPis are running Raspbian. I consider the iMac and MacBook Air capable workstations, but I think they are inadequate as servers. The RPi is too underpowered and too limited by memory and storage constraints. I installed and configured ownCloud on one of the RPi machines, but performance was terrible. I spent two days getting ownCloud up and running on the RPi, then removed the software and reconfigured the machine after only one day of use. I decided that a used server might be a better solution.
My intention was to install a set of open source security tools, including network and system vulnerability scanning, security event monitoring, intrusion detection, file integrity monitoring, and some sort of configuration management system. I wanted something powerful enough to handle the software stack, with enough storage to let me install, configure and test other software. After scouring eBay for a week I purchased a Dell PowerEdge 1950 server for $160.

Information on the Dell PowerEdge:
- Two Intel Xeon 3.0GHz Dual Core CPU
- 4GB of RAM
- Two 146GB SAS 15K RPM Hard Drives
- Dual Power Supplies
- Dual Gigabit network cards
- VGA/Serial/USB Ports
- CD-ROM Drive
When the server arrived — it is larger and heavier than I expected — I went to the basement to set it up. But … I had no power cords, no keyboard and no display. Over the years I was spoiled by Apple products. With the exception of the Mac mini and Mac Pro, all Macs ship with a display and a keyboard. This was a frustrating setback, but a few weeks later I now have a power cord, keyboard, and a display. The power cord and keyboard were donated by an office colleague from the excess he had sitting in a drawer. The display, a Dell P1913S, was purchased used from eBay. It has a small tear, but the price, $50, was good for my budget. It supports VGA, HDMI and DisplayPort.

Acquiring the peripherals took some time, but I had everything in place last night when I installed Ubuntu. When I booted the device I noticed a BIOS error. After poking around in the BIOS I realized that one of the two 146GB drives had failed. I tried rebuilding the drive, but that failed too, so I pulled the drive from the chassis.
Building the Install Media
My intention was to install Ubuntu from a flash drive. The general procedure to install Ubuntu from a USB flash drive is:
- Acquire the correct Ubuntu installation files (‘the ISO’)
- Put Ubuntu onto your USB flash drive
- Configure your computer to boot from USB flash drive and boot from it
- Install Ubuntu to your internal drive (hard disk drive or solid state drive).
I thought I could just download a supported ISO for the Dell PowerEdge 1950 from Canonical’s website and use the OS X Disk Utility (DU) app to create a bootable USB. This didn’t work, and I am not sure why. Some Google-fu revealed that I first needed to convert the ISO to an IMG file and then take a few more steps to create a bootable USB flash installer on OS X.
:~$ sudo hdiutil convert -format UDRW -o ubuntu-10.04.4-server-amd64.img ubuntu-10.04.4-server-amd64.iso
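For reference, writing the converted image to a flash drive on OS X generally follows this pattern. The device node (/dev/disk2 here) is an assumption; check yours with diskutil list first, because dd will happily overwrite the wrong disk. Note that hdiutil usually appends a .dmg extension to the output, so the image may end up named ubuntu-10.04.4-server-amd64.img.dmg:

```shell
:~$ diskutil list
:~$ diskutil unmountDisk /dev/disk2
:~$ sudo dd if=ubuntu-10.04.4-server-amd64.img.dmg of=/dev/rdisk2 bs=1m
:~$ diskutil eject /dev/disk2
```

Writing to the raw device (rdisk2 rather than disk2) speeds things up considerably.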
After converting the ISO I followed the instructions to write the IMG to the flash drive. The imaging seemed to take forever, and I lost my patience after about 20 minutes. I wanted to get started right away, so while the USB flash drive was being imaged I tried burning the ISO to DVD with DU. That failed too. After a few minutes scratching my head, I burned the IMG I had just created to DVD with DU. This finished before the USB flash drive was ready. Time for the OS install.
Install the OS
I booted the Dell from the install DVD, answered a bunch of questions, and created the root and a standard user account. Once the server booted into Ubuntu I made sure that the SSH daemon was running, then sat on the couch with my MacBook Air to complete the initial security configuration. This is one thing I love about UNIX/Linux: almost anything can be accomplished from the terminal, remote or local. I used apt to install missing OS patches, but after doing that I realized I should do an OS release upgrade instead.
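The patching pass itself is just the standard pair of apt commands, refreshing the package lists and then applying whatever updates are pending:

```shell
:~$ sudo apt-get update
:~$ sudo apt-get upgrade
```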
:~$ sudo do-release-upgrade
The release upgrade seemed to take forever, but once it was complete I configured the server firewall using UFW. I remember when I had to create Linux iptables firewall rules by hand; UFW makes changing the firewall rules trivial. I edited /etc/default/ufw to make sure IPv6 support was enabled (IPV6=yes) and started creating firewall rules. At a minimum, I needed a way to secure remote access to the server and to allow web services. From a security perspective I wanted to follow my practice of “that which is not explicitly allowed is denied”. I enabled access to SSH on port 22 and to web services on ports 80 and 443 with UFW.
:~$ sudo ufw allow ssh
:~$ sudo ufw allow www
:~$ sudo ufw allow 443
:~$ sudo ufw enable
:~$ sudo ufw logging on
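UFW denies inbound traffic by default, but making the policy explicit matches the deny-by-default stance, and a status check confirms what is actually allowed:

```shell
:~$ sudo ufw default deny incoming
:~$ sudo ufw default allow outgoing
:~$ sudo ufw status verbose
```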
My next steps are to install other security software on the Ubuntu server. I took an early stab at installing Tripwire and OpenVAS, but I’ll need more time to understand how to configure them correctly.
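For what it’s worth, getting Tripwire to a first baseline on Ubuntu is short; it’s the policy tuning afterwards that takes the time. A sketch of the initial steps (the package install prompts for site and local passphrases, which sign the policy and database files):

```shell
:~$ sudo apt-get install tripwire
:~$ sudo tripwire --init
:~$ sudo tripwire --check
```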