Removing SSH keys from authorized_keys file

If you have a server that receives connections from various computers, you may have set up key-based authentication. This is a good move, but occasionally you may want to remove a key so as to revoke access from one of those computers.

To do this you simply need to remove the public key from the authorized_keys file on the server. You can do this in a text editor, but it can get messy if there are dozens of entries in the file and you just want to remove one.

This is where sed is your friend. With this one-liner you can delete the entry for the hostname you want to revoke, while keeping a backup of the file in case you make a mistake:

sed -i.tmp '/hostname/d' ~/.ssh/authorized_keys

“hostname” is the hostname of the computer connecting to the server. If you don’t have the exact hostname you can use a unique part of the public key instead; it works the same way.

The command keeps a backup of the original file as authorized_keys.tmp and rewrites authorized_keys in place without the entry you matched.
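Here is a worked example against a throwaway copy of the file, so you can see the behaviour before pointing it at your real authorized_keys (the key material and comments are made up for illustration):

```shell
# Sketch: remove one key from a sample authorized_keys by matching
# the key's comment field (usually user@hostname).
mkdir -p /tmp/sshdemo
cat > /tmp/sshdemo/authorized_keys <<'EOF'
ssh-rsa AAAAB3FakeKeyOne alice@laptop
ssh-rsa AAAAB3FakeKeyTwo bob@desktop
EOF

# Delete the entry matching "bob@desktop", keeping a .tmp backup
sed -i.tmp '/bob@desktop/d' /tmp/sshdemo/authorized_keys

grep -c 'ssh-rsa' /tmp/sshdemo/authorized_keys      # 1 entry left
grep -c 'ssh-rsa' /tmp/sshdemo/authorized_keys.tmp  # 2 entries in the backup
```

If the deleted entry was a mistake, restoring is just a matter of moving authorized_keys.tmp back over authorized_keys.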

Should I use the keywords meta tags?

In case you are in a hurry, the short answer is No.

Google stopped using the keywords meta tag years ago, yet people still seem to be trying to leverage it to get their websites “up the rankings”.

The reason Google stopped using them is that people generally used them to game the rankings. They rarely provided much genuine information that Google couldn’t already get from analysing the content of the page itself.

Google does, however, use the meta description tag, often (if you have written a good quality description) to populate the small excerpt you see on the search results page.
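If you are not sure what that looks like, the description tag sits in the page’s head section and might read something like this (the wording here is just an example):

```html
<head>
  <!-- Often shown as the snippet under your link in search results -->
  <meta name="description" content="A practical guide to backing up your family photos at home.">
</head>
```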

So go and spend some time improving and updating the content of your site, but please don’t waste any of that time coming up with a list of keywords.

Home backup strategy

Backup strategy is something I am passionate about and something I have dealt with for a lot of my working life. This post comes at it from the angle of a home user, with options to suit the average person who wishes to protect their data.

For most people it is great to have so many photos of friends and family floating around, but stop to think what it would be like if your computer died and you lost it all. It’s worth giving some thought to how you can protect your data.

I wrote a post about backing up your family photos. This post follows on from that and gives a much more bullet proof solution, which you can apply to your whole household.

3-2-1 backup strategy

The concept of the 3-2-1 backup strategy is that you should always have 3 copies of your data, two of which can be in the same place (but on DIFFERENT platforms) and one must be off site.

This can sound a little over the top at first glance, but when you analyse the strategy it makes a lot of sense.

Let’s take a look…

Firstly, what are we trying to protect against?

The answer is loss of data, which can come about in a number of ways, but they fall into two flavours:

Physical loss of data (fire, flood, theft etc).

To mitigate this factor we must have an OFF SITE copy. This is one of your 3 copies, and it can either be kept in the cloud (Dropbox, Google Drive, Crashplan etc.) or it can be taken away on a hard drive (or whatever medium) and stored elsewhere. The latter only works if the data is static; if you add to it you need to be quick to take a new copy off site. For most of us an automated cloud backup solution is better.

Digital loss of data

This can be data corruption, scratches (in the case of DVDs etc.) or other digital factors. Take DVDs as an example. If you have your data on two DVDs and they get scratched then you have lost your data (yes, you probably haven’t taken enough care, but nevertheless), or at the very least you are reliant on the cloud version, which you may or may not have tested. There have been cases where people have gone to their cloud version and found it was not up to date, because their Internet connection was slow or for other reasons. The cloud is a great backup, but it shouldn’t be your only one.

That’s why the best scenario is one where the two copies you keep locally are on different platforms. This could be a hard drive and DVD, or two systems like Time Machine and a backup to a local NAS drive.

An example of a good backup system

I often get asked for an example of the perfect setup. I don’t think a perfect system exists, as it all depends on your needs and how much data you have, but here is a good starter for six!

Backup Number 1 – Time Machine

I use a Mac, so Time Machine is built into the operating system. If I plug a hard drive into my Mac it asks me if I want to set it up as a backup. It then backs up all my data to this external drive as I create it. You can even get wireless versions that sit on your network. There is an added advantage of being able to retrieve deleted items and earlier versions of changed files.

Backup Number 2 – The NAS

If you have a home NAS (some routers allow you to plug in a hard drive to create shared storage) then you can schedule your Mac or PC to sync your crucial data to this shared storage either at set intervals or before you shut down. If you don’t want to do this then you can simply copy your data to this location as and when you need it (e.g. when you have downloaded new photos from your camera). Sometimes people use this second backup just to store critical stuff like family photos.

Backup Number 3 – Crashplan (The Cloud)

Crashplan can be set to run in the background and back up your data to the Crashplan servers. There is a charge for the product, but if you have a friend you can set it up to back up to each other’s computers and it won’t cost you a penny beyond the cost of the hard drive storage.

Another option for this is to simply use Dropbox, but you have to be careful not to exceed your storage allowance.


There you have it, a simple 3-2-1 backup strategy that gives you peace of mind that your data is safe in case of disaster.

Choosing a NAS drive

More and more data is being captured in various ways, from music to photos to applications, on all sorts of devices. In the average household there are at least 2 mobile phones full of photos, often digital cameras, laptops, netbooks and tablets, all of which hold vast amounts of data. A NAS (Network Attached Storage) device is a great way to keep some of this data centrally and to back it up from its primary location, be it your phone or another device.

But what about the cloud?

The cloud is great for certain data, but for some things you can’t beat having a local copy. Often the files are big and the Internet connection is not fast enough to get you access to your data as quickly as you need it. Often you have hundreds or thousands of files on your hard drive that you want to offload. This is often easier to scan through on your home network than via a cloud interface.

Let’s just say the cloud is great and has its place, but for the purpose of this post let’s focus on NAS systems.

NAS systems are not all equal

In order to explore this statement it is necessary to ask the following questions:

  • What do I need a NAS for?
  • Does it matter if the data gets lost?
  • Do I just need it to store files or do I want it to give me more value?
  • Do I need to access it from outside my home?

First of all I am going to tackle the second point, “Does it matter if the data gets lost?”. This is crucial in terms of choosing a NAS. If the answer is yes then you MUST get a NAS with some form of redundant drive system (RAID or similar). Some NAS systems allow you to use 2 drives and use the total space (e.g. 1TB + 1TB = 2TB). This is useless if a drive fails. What you need is a configuration where 1TB + 1TB = 1TB, but if a drive fails you still have your data.
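On a self-built Linux server, for example, that kind of two-drive mirror can be set up with the standard mdadm tool; the device names below are placeholders for your own disks, so treat this as a sketch rather than a recipe:

```shell
# Create a RAID 1 mirror: two 1TB drives give 1TB of usable space,
# but the data survives the failure of either single drive.
# /dev/sdb and /dev/sdc are example device names - check yours first!
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc

# Watch the mirror build and check its health
cat /proc/mdstat
```

Off-the-shelf NAS boxes do the equivalent through their web interface, usually labelled RAID 1 or “mirror”.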

There are plenty of NAS devices that offer this, but this is where it gets tricky. Some NAS devices use proprietary systems, meaning the drives and the data are dependent on the hardware and the operating system. The drives may be fine, but if the hardware they are connected to fails then you are without your data until you can replace it with the same hardware (or similar from the same manufacturer). I have seen this before, where someone had a NAS for years, the hardware failed and they could no longer get hold of compatible hardware.

The solution to this is to do one of the following:

  • Build your own Windows/Linux server (requires time and knowledge)
  • Buy spare hardware in case of failure
  • Use a software RAID system that is compatible with most hardware

The third option is quite popular, and my preferred solution is NAS software called UNRAID. This software allows you to connect any number (well, a LOT) of hard drives together with the largest one as the parity drive. Once you have your parity in place, the total storage of your other drives is your NAS capacity. There are details of how it works on their site, but it’s really pretty simple and works well. If you lose a drive you take it out, put in a new one and it rebuilds the data. Even if you lose 2 drives you have only lost the data on those drives; the data on the remaining drives stays in place. If you lose your hardware you just put the drives in a new PC or server, and you can either access the data directly or install UNRAID on that server and you’re good to go again.

I will do a full post on UNRAID another time, but take a look at their site if you want more info.

To go back to the first bullet points, if you want more value or access from outside your home then some of the Synology kit is great, as is UNRAID or FreeNAS. They all provide plugins or apps that enable you to share out your music and access your videos from smart TVs or even over the Internet. UNRAID has a neat Docker system that lets you install Plex, so you can access all your movies from all your devices (phones, tablets, laptops and smart TVs). If your Internet connection is up to it you can even view them when you are out of the house.


Choosing a NAS is really down to your preference and needs. Depending on who you are you may choose a different one. Here are a couple of options:

For techies (or geeks?)

UNRAID is definitely my choice. It gives you great data protection and the ability to run all sorts of apps, all on your own hardware. Pair it with an HP Microserver (for just over £100) and you have a great little NAS box that can even run virtual machines!

For less techie people

For people who just want it to work, something like the Synology NAS systems is good. They do some of what the UNRAID system does, but you are reliant on their hardware. They are a good brand though, so you shouldn’t get caught out by them going out of business.

One last thought… if your data is important to you it is not enough to have it in one place; you should have it in 2 places, or preferably 3. It’s also preferable to have an off site backup, which is where Crashplan comes in, but that’s another post in itself.

Force browsers to the HTTPS version of your site

After writing my posts about the benefits of HTTPS I thought it might be a good idea to write a short post letting you know how to easily send visitors to the HTTPS version of your site rather than the HTTP version.

Open your .htaccess file in the root of your website (where your main index file is). If you don’t see one then you either need to tell your FTP client to display hidden files or you may need to create a new .htaccess file.

Enter the text as follows, customising it to the URL of your site:

RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.yoursite.com/$1 [R=301,L]

That’s it! All visitors to any page on your site should now be served the HTTPS version.
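You can check the redirect is working from the command line with curl (substitute your own domain for the example one below):

```shell
# Fetch only the response headers for the HTTP version of the site.
# A working redirect shows a 301 status and a Location header
# pointing at the https:// URL.
curl -sI http://www.yoursite.com/
```

If you see a 200 response instead of a 301, the rewrite rules are not being picked up; check that your host allows .htaccess overrides and that mod_rewrite is enabled.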