Step 1 - Denial

SEO gurus are always banging on about on-page optimisation (making the most of what's on your website, i.e. not relying on links) and keyword density (how often a phrase appears in a text, relative to the text's total word count).
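If you want to put a number on it, keyword density is just a simple ratio. Here's a minimal Python sketch - the sample text is obviously made up:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the text's words that belong to occurrences of the phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    # Count how often the phrase appears as a consecutive run of words
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100 * hits * n / len(words) if words else 0.0

text = "All work and no play makes Jack a dull boy. " * 3
print(f"{keyword_density(text, 'makes Jack a dull boy'):.1f}%")  # 50.0%
```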

Imagine I made a page with the URL andrewgirardin.com/all-work-no-play-makes-jack-dull-boy, targeting the phrase 'all work and no play makes Jack a dull boy.' Now imagine every sentence in the body text was 'All work and no play makes Jack a dull boy'. Google might start to think I was a crazy person. But more importantly, that page would never rank, simply because it'd be such a brazen effort at 'keyword stuffing'.

On the other hand, if that's really the keyword I want to rank on Google for, I would be wise to include it at least once.

The question, then, is: how many times should we include keywords in our texts? Once? Twice? 30 times?

Generally speaking, I've brushed that question aside since I started making niche websites. When I write, I write for the reader, and then maybe I'll rewrite a headline or some subheadings to include the keyword.

I told myself that while it's clearly bad to over-optimise a page (i.e. to stuff it full of the target keyword), keyword density could only ever be a partial ranking factor and wasn't worth the energy of doing anything differently. Someone like Matt Diggity will go to the top 5 pages, work out their keyword density, and make sure his own keyword density matches the average of those five. In this way, he's bound to produce something acceptable to Google.
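In code, his approach might look something like this - a rough sketch reusing the keyword_density helper from above, where the competitor texts are placeholders you'd have to scrape yourself:

```python
def target_density(competitor_texts: list[str], phrase: str) -> float:
    """Average the keyword density of the top-ranking pages to get a 'safe' target."""
    densities = [keyword_density(t, phrase) for t in competitor_texts]
    return sum(densities) / len(densities)

# In reality these would be the scraped body texts of the current top 5 results
top_pages = ["...page one text...", "...page two text...", "...page three text..."]
print(f"Aim for roughly {target_density(top_pages, 'best widget for kids'):.2f}%")
```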

That makes sense, but my whole shtick is making money with the least effort possible.

Step 2 - Botherment

21st January 2017

If I've learned one thing in the last six months, it's the value of tracking things. (I've started to track the effects of new links on my pages, for example.) But most of all I track where my sites rank for their keywords, and that's how I noticed the stark decline in one of my pages.
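Nothing fancy is needed for the tracking itself - a dumb CSV log is plenty. A minimal sketch, where the file name and the rank value are placeholders and the rank comes from whatever tracker or manual check you use:

```python
import csv
from datetime import date

def log_rank(path: str, keyword: str, rank: int) -> None:
    """Append today's rank for a keyword to a CSV log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), keyword, rank])

# The rank has to come from your tracker or a manual SERP check
log_rank("rankings.csv", "best widget for kids", 55)
```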

The page in question is on my tech site and has never really pulled up any trees. It targets the term 'best [widget] for kids'.

I think it ranked at about number 20 when I bought the site, and I thought I could get it into the top 10, so one day I sat down and did an epic rewrite, making the page bigger and better according to my understanding of SEO at the time. That was about a year ago; the changes didn't help, and I moved on to other things.

A few months after the rewrite it ranked #50 and I gave it up as a bad job, but recently it slipped into the 60s, 70s, and in mid-January out of the top 100 altogether.

My site is far from a heavy hitter in the niche, but if I put up a decent page on most topics, I'm going to get into the top 30. I was a bit pissed, thinking about all the effort I'd put into the rewrite.

So what went wrong? I wasn't totally sure, but since the rest of the site was doing great, it seemed like Google was penalising that page.

Could it be this keyword density thing?

Step 3 - Acceptance

I went to the 'best widget for kids' page, did CTRL-F, typed 'kids', and found I'd used that term 52 times. I'd also used 'children' 30+ times.
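If CTRL-F gets old, the same count is easy to script. A rough sketch using requests and BeautifulSoup, with a made-up URL:

```python
import re
import requests
from bs4 import BeautifulSoup

def term_counts(url: str, terms: list[str]) -> dict[str, int]:
    """Fetch a page and count whole-word occurrences of each term (case-insensitive)."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    return {t: len(re.findall(rf"\b{re.escape(t)}\b", text)) for t in terms}

# The URL is a placeholder - point it at your own page
print(term_counts("https://example.com/best-widget-for-kids", ["kids", "children"]))
```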

So I started de-optimising it (and nothing else - no new links). I replaced as many instances of the words 'kid' and 'child' as I could with synonyms like 'little angels' and 'nippers'. Some sentences I switched to the passive voice so they didn't need a subject at all, and others I came at from the reverse angle and made about parents.

e.g. instead of 'Kids can use the widget to pacify cats' > 'The widget can be used to pacify cats'.
e.g. instead of 'Kids love playing with the widget's cat pacification feature' > 'Parents will be happy to know there's a cat pacification feature.'

I took instances of 'kids' from 52 to 33, and 'children' from 30+ to 20.

Basically, my intention was to signal to Google: 'Hey! I'm not trying to trick you by using this word a bajillion times.' It's also my understanding that Google's algorithm likes to see synonyms and related terms, because they help it understand what a page is about.

Note that I didn't check the keyword density of any rival pages - that seems like a step towards nutjobbery, to be avoided unless all your money comes from one page and it has to be absolutely pixel-perfect.

While I was there, I also added outbound links to authority sites like the BBC - previously there were only internal links and links to Amazon. That step stops this from being a purely scientific test, but hey, I want to make money from this page if I can.


Results

2nd March, 2017

After its de-optimisation, the page came back in at number 96, then fell out of the top 100 again. (The image below shows the rankings every 3 or 4 days.)

As you can see, it then stormed back in, rising up to #55 as of yesterday.

So I find that very interesting. Just by tweaking the keyword density, you can get dramatic results. Of course, being #55 is practically the same as being #infinity, but that's not the point.


Takeaways:

  • Track your keywords so you can tweak and test
  • Keyword density is a thing