Should I use the keywords meta tags?

In case you are in a hurry, the short answer is No.

Google stopped using the keywords meta tag years ago, yet people still seem to be trying to leverage it to get their websites “up the rankings”.

The reason Google stopped using them is that people generally stuffed them with terms to game the rankings. They rarely provided much in the way of genuine information that Google couldn’t already get from analysing the content of the page.

Google does, however, use the meta description tag: if you have a good-quality description, it is often used to populate the small excerpt you see on the Google results page.

So go and spend some time improving and updating the content of your site, but please don’t bother coming up with a list of keywords; it’s a waste of time.

Uptime, and the 99.99% scam

If you have looked at the websites of hosting providers on the web you can’t fail to have seen claims of 99.9%, 99.99% or even 99.999% uptime. The higher the number, the better the deal, right?

The answer is “not necessarily”.

It all boils down to the SLA (Service Level Agreement). This is an agreement that the hosting business gives to its customer to keep their site online, or rather it is an agreement that the downtime will not exceed a certain threshold.

So what exactly is 99.9%, or 99.99%?

Availability Downtime per year Downtime per month
99% 3.65 days 7.3 hours
99.9% 8.76 hours 43.8 mins
99.99% 52.6 mins 4.38 mins
99.999% 5.26 mins 26.3 seconds

Looking at the figures above you can see how little time per month a site can be down in order not to go over the SLA.
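The figures follow directly from the availability percentage. Here is a quick sketch in Python, taking a month as a twelfth of a year (so the rounding differs slightly from some published tables):

```python
def downtime_minutes(availability_pct, period_minutes):
    """Minutes of downtime an SLA allows over a given period."""
    return period_minutes * (1 - availability_pct / 100)

MINUTES_PER_YEAR = 365 * 24 * 60           # 525,600
MINUTES_PER_MONTH = MINUTES_PER_YEAR / 12  # 43,800

for pct in (99, 99.9, 99.99, 99.999):
    year = downtime_minutes(pct, MINUTES_PER_YEAR)
    month = downtime_minutes(pct, MINUTES_PER_MONTH)
    print(f"{pct}%: {year:.1f} min/year, {month:.2f} min/month")
```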

Ask yourself: when can they reboot a server, apply security patches and so on? In enterprise environments (banks, medical systems and the like) there is a LOT of infrastructure in place: failover servers, mirrored storage, even backup data centres! In all but the most expensive hosting solutions you cannot expect the same level of SLA… yet some places advertise it anyway.

So… what’s the catch? Is it too good to be true?

In a word, yes. Most of these places cannot promise uptime at that level. Realistically, most months they will probably manage it easily, but when something goes wrong (as it does for all companies, even Microsoft) there will be downtime, and it will be more than 4.38 mins in a month!

So how can companies advertise an uptime level they cannot hit? Are they lying?

This is the clever (or sneaky?) part. An SLA is an “agreement”, not a promise. It means that if the agreed level is breached then the business will compensate the customer, be it in service credits or a cash refund. The level of compensation is completely dependent on the business’s terms and conditions (or contract).

I have seen some hosting companies (no names) state “we will compensate downtime that is in excess of our SLA at our standard rate”. In this case it was a cloud provider that charged by the minute. In effect they were just saying that for every minute your server is offline you will be refunded a minute’s charge. On that deal they may as well advertise a 100% uptime SLA; it makes no difference.
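To see why that clause is toothless, run the numbers (the per-minute rate here is made up purely for illustration):

```python
# Hypothetical figures, for illustration only.
rate_per_minute = 0.002   # what the provider charges, in £/min
outage_minutes = 120      # a two-hour outage

refund = outage_minutes * rate_per_minute    # the "compensation"
unbilled = outage_minutes * rate_per_minute  # what you'd have paid anyway

# The refund is exactly the charge for the downtime itself: you are
# simply not billed for the minutes the server was offline.
print(refund == unbilled)  # True
```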

In enterprise situations there are vast penalties for exceeding SLAs, so a lot of thought and planning goes into meeting them. Your general Internet hosting provider, on the other hand, usually has small print to avoid high penalties, so they can get away with boasting about massive uptime SLAs to bring in new business, without worrying about what happens if things go wrong.

So are we all doomed?

No, not at all. Servers are generally quite reliable, and month on month most providers will hopefully give you 100% uptime. The way to judge a good hosting provider is not by the uptime SLA they promise, but by how they react when things go wrong. Do they react quickly? Do they keep you informed? Do they solve the problem and explain why it happened if you ask?

A good web host is worth their weight in gold, just not for the simple reason of a 99.999% SLA.

Backing up your family photos

It’s a while since I have posted about backup strategy but it’s such an important topic I thought it was worth a revisit.

If you are like me you probably have all sorts of data around your house, across multiple computers, phones and devices. While everyone is different, I think most people would agree that the most devastating loss, in the event of losing their data en masse, would be their family photos.

While the risk of losing photos to water damage, sunlight, or general wear and tear is much lower these days, the biggest risk now is hard drive failure. I have seen many examples of people taking a dead hard drive to their designated “techie friend” in the vain hope that it can be recovered. Sometimes it can, sometimes it can’t, and even when it is possible it often involves significant cost.

Companies are pushing bigger and bigger hard drives, NAS devices, USB sticks and so on upon us, and it’s great that we can now store years and years of data on them, but in actual fact the situation (and the risk) is getting much worse. Where your photos used to be spread over four, five or six devices, now they can all fit on a single hard drive… so why not? The answer is simple: if that one device fails you lose EVERYTHING!

Yes, it is possible to have two hard drives and back everything up twice, but in reality how many people do that?

NAS devices also let you configure the disks in RAID 1 (mirroring), where two disks hold identical copies and your data survives if one of them fails. The problem is you can also configure them to use the full capacity of both disks (RAID 0, striping), which looks great on paper… more disk space than you can shake a stick at. But in that mode you’re back to losing everything if a single disk fails.
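The trade-off can be made concrete with a toy calculation (a simplified two-disk model; real arrays have more modes and some overhead):

```python
def usable_gb(disk_gb, mode):
    """Usable capacity of a simplified two-disk array."""
    if mode == "RAID0":   # striped: all the space, no redundancy
        return 2 * disk_gb
    if mode == "RAID1":   # mirrored: half the space, survives one dead disk
        return disk_gb
    raise ValueError(f"unknown mode: {mode}")

print(usable_gb(2000, "RAID0"))  # 4000 -- but one failed disk loses everything
print(usable_gb(2000, "RAID1"))  # 2000 -- one disk can fail with no data loss
```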

The other issue is how often you actually back up your data. Most people find that when they have a failure it’s been “quite a while” since they last backed up.

The best approach is to back up automatically, with a system you don’t have to think about. Services such as CrashPlan offer exactly this if you have a reasonable internet connection. It’s great because it keeps your data safely off-site, so if you have a flood or fire you can always get your data back. CrashPlan also allows you to set up your own servers (or back up peer-to-peer with a friend), so you can back up to those instead (or as well!).

The ideal situation is to back up to multiple places, keeping several copies of your data in multiple locations. That is a lot to ask of most people, which is why I often recommend CrashPlan: it is simple to use and doesn’t cost the earth. If you don’t want to go that route then by all means continue to use hard drives, but please consider buying a second one, using that as well, and keeping it in a different location from the first.

If you have any questions or thoughts, please leave a comment.

OCZ Vertex 2 SSD – The need for speed

I have been running a 17″ i7 MacBook Pro since it was released earlier this year. Ever since I got it I have been waiting for the price of SSDs to come down so I could swap out the 500GB hard drive for an SSD and move the 500GB drive into the optical bay.

Yesterday my 120GB OCZ Vertex 2E SSD was delivered and all I can say is WOW, this thing is fast. OS X now boots in just over 10 seconds and Photoshop launches in 3 seconds flat! This thing is EPIC!

SSDs are amazingly fast but the price is still quite high, so the best way to configure the system is to put your operating system and applications on the SSD, and keep your home directory (profile) on the normal hard drive. This means you can load the original hard drive up with your massive video and picture files, while using the SSD for the things that really benefit from the speed.

If you are really short of memory (RAM) then it is always worth upgrading, but provided you have a relatively modern computer (with a decent disk controller), an SSD is far and away the best bang for the buck!

How to stop iWeb putting your site in a subfolder

First of all, feel free to skip to the end for the solution. Please leave a comment if this helped you in any way (or not).

My Dad recently bought a Mac and designed a website in iWeb. When it came to publishing the site, I configured it and hit “publish”. Up popped a message telling me that the site was now live at “<sitename>.com/site”. Hmm, I thought, I must have made a mistake when configuring it. I went back, checked the settings, and found that in actual fact this is “just the way it works”.

Rubbish!

I Googled the situation and found that the only way people were getting around this issue was to upload their website with an FTP client like Cyberduck. All well and good, but hardly the tidiest solution.

I decided to go about this a different way, by telling the server that instead of looking in the “www” folder as the root of the site, it should instead look inside the “site” folder.

This was achieved by creating a “.htaccess” file inside “www” (or “public_html”, as the case may be). You will need to edit the next bit to fit your site, but you get the idea:

RewriteEngine on
# Only rewrite requests for your domain ([NC] makes the match case-insensitive)
RewriteCond %{HTTP_HOST} yoursite\.com [NC]
# Skip requests that are already inside the "site" folder
RewriteCond %{REQUEST_URI} !/site
# Serve everything else from the "site" folder ([L] stops further rewriting)
RewriteRule (.*) /site/$1 [L]

Note: make sure you keep the slashes and backslashes exactly as they are; they look odd, but they are needed.
This simply tells the server to look inside the iWeb-created folder whenever anyone visits your site. Set it up once and you should be able to forget about it from then on; it just works.
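If you want to sanity-check what the rules do, here is a rough Python model of the logic (a simplified sketch that ignores Apache’s full matching behaviour; “yoursite.com” is a placeholder, as in the rules themselves):

```python
import re

def rewrite(host, uri):
    """Rough model of the .htaccess rules above."""
    on_our_domain = re.search(r"yoursite\.com", host, re.IGNORECASE)
    already_in_site = "/site" in uri
    if on_our_domain and not already_in_site:
        return "/site" + uri   # serve the page from the iWeb "site" folder
    return uri                 # leave the request alone

print(rewrite("www.yoursite.com", "/index.html"))       # /site/index.html
print(rewrite("www.yoursite.com", "/site/index.html"))  # unchanged
```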

I hope this helps people resolve an issue that really should not be there in the first place. Please leave a comment with your thoughts.

Jailbreak for ALL iPhones coming soon

It looks like a member of the Dev Team called pod2g has discovered a vulnerability in the booting mechanism for the iPhone, iPad and iPod Touch devices meaning that a jailbreak for iOS 4.1 is just around the corner.

The Dev Team have been producing jailbreaks for the iOS devices for a long while, but what makes this exploit extra special is it is hardware based, meaning that Apple cannot simply update the Operating System and fix it. Any of the aforementioned devices can (and will always be able to) be jailbroken.

This is a major coup for the jailbreaking community, as there has been talk of Apple clamping down on jailbreaking and making it much harder to do.

One thing that may hamper future jailbreaking efforts is the advent of online gaming on iOS. Gaming services have enforced this sort of thing in the past: once a device is detected as being exploited it can (in theory) be blacklisted from the gaming service. That said, if they have the ability to do that with games, then they have the same ability with apps, but as yet Apple has not played that card.

Some cynical people also believe that behind the scenes Apple don’t really mind the jailbreaking going on, provided the majority keep toeing the Apple line.

Rumour has it the jailbreak will be with us within a day or so.

New iTunes 10 logo – by Paul, age 5 and a half

I sat down to watch the Apple presentation last week, interested to see what new products and software releases Apple are coming out with. I nearly fell off my chair when I saw the new iTunes 10 logo.

Now, I am not one to get all upset about these things like a lot of people on Twitter/Facebook/forums etc., but as soon as it came up on the big screen I just LOL’d (yes, it really WAS out loud!). It looks like it was designed by an (underachieving) school kid. The first “web 2.0” tutorial I saw on the web did a better job of making a logo look good.

I can see why they wanted a new logo. The old one has a picture of a CD, and let’s face it, when was the last time we consumed music via CD (yes, I know there’s always one!)?

I could understand if they replaced the CD with an iPod or something. I can even understand the use of a musical note, but this is your bog standard “gel button” with a flat note dumped on it with a bit of glow added for good measure.

I saw a tweet where someone recommended a replacement icon, and you can immediately see it took more than 5 minutes to design:

http://dribbble.com/shots/51446-iTunes-10-Replacement-Icon

At first glance you can see straight away it is a lot more modern (dare I say it, “funky”!) and also encompasses the social media angle of the new Ping service. This icon would not look out of place on any Mac Dock.

I was trying to think how on earth this gel icon got the thumbs up, and the only thing I can think of is that Steve designed it. I have seen it time and time again where a client thinks they know best; I am sure a lot of you have too. You know the conversation… “can you make my logo BIGGER?!”.

This has to be a similar situation, and we all know that at Apple nobody argues with Steve Jobs (not for long anyway).

I’d love to hear people’s views on this, as I am sure there are many, but looking at the number of blog posts that have popped up discussing it, it is quite clear what people think of it.

I just find it bizarre that a company founded on (and obsessed with) the fit and finish of their products can produce something as laughable as this.

I sense an iTunes 10.1 around the corner, with a slightly more polished logo 😀

How to stop people linking to your images

It is often infuriating to find people linking to your images online. Firstly, they are stealing your bandwidth, and while this may or may not be a problem depending on your host (and the popularity of the site doing the stealing), it is just not polite.

Secondly, there is the issue of copyright theft: people using your images without permission, often giving the impression that they own them, or even created them themselves. Again, depending on the site or person doing the stealing, it may or may not be an issue.

There are a few ways of dealing with this; I will go into just a few of them in this post.

Firstly, you really want to establish whether the person linking your image is doing it deliberately, and whether they know it is wrong. The first move is usually to simply ask them to remove it.

If you establish this is a deliberate act there are some options open to you:

  • Legal (mostly over the top and expensive)
  • Pass on the responsibility (i.e. ask their ISP to remove the site for copyright infringement)
  • Revenge (will go into this later)
  • Prevention (Instructions below)

I will explain the above points one by one:

Legal

I am not a fan of premature legal action, I think it is often over the top and unnecessary. It is nice to have the option to hand but I would personally keep it as a last resort.

Pass on the responsibility

This is often a good low-effort way of dealing with things. ISPs don’t like lawyers, and they also don’t “usually” know whether you are one. If you drop them an email asking them to remove the offending site for copyright infringement (or, even better, send them a letter) then they will often act pretty quickly. This really does depend on the ISP though, so results are mixed.

Revenge

If you are SURE they are deliberately stealing your images, then follow these steps to make their eyes water (or just to give them some bad press):

  1. Rename your image
  2. Change the link in your HTML to match the new name (so it looks the same on your site)
  3. Change the image (e.g. boat.jpg) to an image of something “different” (you know what I mean)

As long as the linked image has the same filename then people going to the offending site will no longer see the picture of a “boat” but will either see a jpg with your own message or an image of your choice 😉

Prevention

This is usually the best way of preventing people from linking your images (it won’t stop them downloading and copying them, but that is another issue).

As long as your site is on an Apache server (most are), put a file called “.htaccess” (don’t forget the dot!) into your images folder.

Add the following lines to it, and save it:

RewriteEngine on
# Allow requests with an empty referer (direct visits, some browsers/proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests whose referer is your own site ([NC] = case-insensitive)
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourwebsite\.com/.*$ [NC]
# Refuse any other request for a gif or jpg ([F] sends 403 Forbidden)
RewriteRule \.(gif|jpg)$ - [F]

With this in place, any gif or jpg images linked from anywhere other than your site will fail to display.
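For anyone curious, the referer check behaves roughly like this Python model (a simplified sketch that ignores Apache’s full matching behaviour; “yourwebsite.com” is a placeholder, as in the rules themselves):

```python
import re

# Mirrors the RewriteCond pattern above; "yourwebsite.com" is a placeholder.
ALLOWED_REFERER = re.compile(r"^http://(www\.)?yourwebsite\.com/.*$", re.IGNORECASE)

def should_block(referer, path):
    """True if the request would be refused with 403 Forbidden."""
    if not re.search(r"\.(gif|jpg)$", path):
        return False   # the rule only covers gif and jpg files
    if referer == "":
        return False   # an empty referer is explicitly allowed (!^$)
    return ALLOWED_REFERER.match(referer) is None

print(should_block("http://some-other-site.com/page", "/images/boat.jpg"))  # True
print(should_block("http://www.yourwebsite.com/page", "/images/boat.jpg"))  # False
```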

Like doctors often say, prevention is best. I do agree with this, but I also believe “revenge is sweet” 😀

Is Facebook overstepping the privacy mark?

Facebook statuses and Twitter feeds have been set alight over the past couple of weeks by Facebook’s “Instant personalisation pilot programme” and “What your friends can share about you” settings.

For those who don’t already know, Facebook has changed the default settings: not only do you get to share (or not share) your information with the world, but unless you actively go in and change the defaults, your friends can share your information on your behalf.

It is all part of Facebook’s plan to know everything about everyone, based on a person’s friends and their taste in music, clothes and so on. The problem is that it is impossible to build such a model without some form of invasion of privacy, and Facebook seems to have battered down the privacy door in order to move forward.

It is one thing to have the ability to share your information with the world, but defaulting it to on (and adding default-on settings to previously secured accounts) is not good.

Facebook has been criticised for having an overcomplicated security model, such that you have to dig deep into menus to find the settings you need to turn off. It has improved a little over the past year, but not nearly enough.

The funny thing is, I read “The Accidental Billionaires” (the story of Facebook) a few months back, and given the way Mark Zuckerberg is portrayed (rightly or wrongly), it is absolutely not surprising that Facebook acts this way.

I wonder whether, if there were a viable alternative to Facebook, their numbers would take a hit. I know a lot of key figures who are cancelling their Facebook accounts, but I guess the general public is just not aware that their privacy is being given away. It is no wonder Zuckerberg defends Facebook by stating that they are loved by the public, and that it is just the bloggers and the media who are on their back.

Anyway…

If you want to secure your account you need to do the following:

Go into:

account > privacy settings > Apps and settings > What your friends can share about you

and untick all the boxes (if you don’t want your friends sharing your info).

Then go back to privacy settings, open “Instant personalisation pilot programme” and untick the box.

Now, to be completely secure you must visit each of these pages and select “block” (on the left):

  • Microsoft Docs.com
  • Pandora
  • Yelp

That should do it for now. If Facebook sneak in any more defaults I will post what you need to do to turn them off.

Macbook Pro – Delivered (finally)

I ordered a 17″ MacBook Pro on the day the new models were released. Because I specced it with different hardware it had to be custom-built in China. Not a problem usually, but then again there isn’t “usually” a volcano spitting ash all over Europe.

Deliveries halted last week, and seeing that my MacBook Pro was sitting air-side in Shanghai, I had resigned myself to not getting my new machine until next week. However, UPS seem to have come up trumps and my MBP is now sitting at home, just over 24 hours after it was on the other side of the world.

It took quite a route to get here though, and I am astounded how it managed to get here so quickly.

I am quick to bash companies for bad service, but on this occasion “well done UPS” 🙂