This is a transcript of the first ever Optimisey event speaker, Andrew Martin’s, talk.
Hands-up, the transcribing is by me. I tried to record the audio but failed (I need to work on doing this better; the sound quality was lousy). In short, any mistakes are likely to be mine, not Andrew M's.
Andrew’s talk was great and, happily, dovetails with something I wrote recently: 7 Basic SEO Fundamentals. The fact both have seven is entirely coincidental but this does lead on nicely from that piece.
Like all the best performers Andrew (and Optimisey) is best experienced live – so you’ll just have to imagine the gifs moving, the interaction and the great Q&As afterwards etc. Come to the next Cambridge SEO event and see for yourself.
Pros and cons – you can’t hyperlink real speech but I can when I type it here. Enjoy.
- Download the slide deck: SEO First Steps 7 Must Dos
I’m Andrew Martin – yes, abuse me on Twitter whilst I’m up here… and vulnerable at @AndrewDoesSEO. Go for it.
I’m going to talk about the first seven steps to take in SEO but before I do that who am I? Why am I up here?
I learned to code HTML, at home, in the late 90s when the internet was a little bit quieter, a little bit simpler and a little bit slower and you had to dial-up to it.
We had BBC News pages like this – with a President impeached…
Animated gifs and this was my phone at the time, a Nokia 6110. You couldn’t even get the internet on that.
I’ve worked for these guys in various different web marketing and SEO roles. I’m currently SEO Manager for these guys, BoilerJuice – a heating oil price comparison site.
Actually – if anyone here is really into PPC come and see me later – we’ve got a vacancy.
I’m an SEO Manager with UX (user experience) tendencies.
For my seven steps I would want you to ask yourself:
Who is it for?
Search engines want to get users to the best answer, in the fastest possible time.
And users expect search engines to get them to the best answer in the fastest possible time.
So, obviously, produce content for humans.
If you're writing for humans you need to write naturally: avoid long words and complicated language, use proper sentences, punctuation, paragraphs, bullet points and lists.
Add images, with captions, link to pages contextually so that means putting a link in the paragraph of text rather than “Here’s my Christmas dinner menu click here” – where the click here is the link.
Search engines are constantly evolving and they’re constantly learning – desperate to become more useful, like Siri, Alexa, Cortana – and they’re desperate to be human.
They’re trying to predict human behaviour.
Now here, I’ve done a query in Google Chrome, just typing away and it tries to help me… with some search queries – and I can promise you none of those was what I was searching for.
I was actually looking for: 'Why was Myleene Klass's teeth really white' – I don't know. Honestly.
Writing for search engines – specifically for search engines – might work for a bit but the rules change, often with little to no notice, meaning you could be faced with a really horrible SEO penalty which could take you out of search completely in the worst-case scenario, or push you right down underneath your competitors.
If you're writing just for search engines, humans will not enjoy your keyword-stuffed content.
And here, is a real-life example from one of my employers’ competitors. I’ll read it out for those in the cheap seats.
This is live and it’s created by an automated script – they have around 400 of these pages which they’ve actively created, live right now.
It swaps out the place-name and some of the geographical content and it pulls other stuff in, it looks like from Wikipedia.
I’ll read it for you – see how you feel about this:
“If you live in Beacons Bottom, a hamlet located on the A40 road between the villages of Piddington and Stokenchurch, and three miles north of the M40, and you require domestic heating oil in Beacons Bottom then [BLANK] is your local domestic heating oil supplier for Beacons Bottom.”
Now, to me, that is really difficult to read – I almost suffocated reading that. I would just remove the padded spam from there and just go: “If you live in Beacons Bottom, Buckinghamshire and you require heating oil then [BLANK] is your local supplier.”
Google (and other search engines) know where Beacons Bottom is. They know that it's in Buckinghamshire, they know what it's like, they know there are roads nearby – the A40, the M40 – they understand that. They don't need all that stuff stuffed into that paragraph.
And that paragraph is one of about 25 per page… nobody likes that.
Number two:
Go mobile
Search engines started to favour mobile-optimised sites in roughly April 2015 – at least, that's when they admitted to it.
And it triggered this Mobilegeddon which frightened a lot of SEO bloggers and SEO experts.
This was in response to a change in user behaviour.
Search engines realised there were more mobile search engine users than there were desktop search engine users.
So obviously they had to change up their rules and how they serve up search results because they knew that users would go on search on the train, on the toilet, whilst they were still in bed… sitting in conference audiences – they knew they had to change.
Mobile friendly sites began to rank higher than those mobile unfriendly sites.
The days of pinch and pull, thinking "What the hell does that say?" or trying to zoom in on your screen – those kinds of sites are being pushed down in search results.
Sites using responsive web design were faring better.
They could adapt to the size of the screen of the device and also the orientation in which you were holding that device.
It's designed from the smallest screen upwards and it means you have one version of your site, rather than lots of different versions.
These responsively designed sites benefited from appearing mobile friendly in search results, and it also means they'll be ready for when Google goes to mobile-first indexing.
But… if you go mobile you have to…
Go fast
And you also need to: go faster.
Speed is so important. This is number three.
A one second delay – and this is one of the reasons you want to go for speed. So if you've got anyone who says "Oh, we're not going to invest in making our site faster", these are some reasons you can take back to whoever's managing or blocking that for you.
A one second delay means a 7% reduction in conversions.
79% of customers say that if they’re dissatisfied with a website’s performance they’re less likely to buy from them.
47% of customers expect a page to load in two seconds or less and if it takes over three seconds you can expect a 20% bounce rate.
Now, bounce rate means someone has come from another website – perhaps search results or a referral site – to your site and doesn't look at any other pages; they just hit it and bounce back off.
That’s obviously not very good for you.
So speed, being fast, will help you reduce that.
Some people call that pogo-sticking and there is some debate as to whether it affects your search rankings by signalling that your site is of less value – I think it does affect search results.
A slow site will also affect your crawl budget and indexation.
Now, crawl budget is the amount of time a search engine will spend when it comes to your site.
It finds the site, considers its links and what it finds, and then the indexers may add pages to search results – so if you have a slow site, a crawler will get through less of it in the time it spends with you.
In order to go fast one of the things you can do is compress the hell out of your images.
What I mean by that is the file size – megabytes and kilobytes.
You can compress and minify the actual code of your site: the HTML, the CSS and the JavaScript.
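To get a feel for what minification does, here's a toy Python sketch – the CSS is made up, and real minifiers (cssnano, terser and the like) do far more than this:

```python
import re

def naive_minify_css(css: str) -> str:
    """A toy CSS minifier: strips comments and collapses whitespace.
    Real minifiers are far more thorough and parse the code properly."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
    css = re.sub(r"\s+", " ", css)                        # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)          # trim around punctuation
    return css.strip()

before = """
/* site styles */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
after = naive_minify_css(before)
print(after)  # body{margin:0;font-family:sans-serif;}
```

The same idea – strip comments, collapse whitespace – applies to HTML and JavaScript too, though those need proper parsers to minify safely.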
Also, try to reduce redirects.
Now, one of my former employers had some very old websites. One of them was about 15 years old.
They had them tucked away on some old server somewhere and this old site linked to that one and this to that and another one after that – and then they thought ‘Oh, we need a new one’.
So anyone using the old URLs was bouncing between all these sites – being redirected.
Users can't see that – it just seems really slow until they get to the correct site.
It slows down search crawlers too, which is not good for SEO.
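You can sketch that redirect-chain problem in a few lines of Python – the URLs and the redirect map here are hypothetical, standing in for those old servers:

```python
def redirect_hops(url: str, redirects: dict[str, str], max_hops: int = 10) -> int:
    """Count how many redirects a URL goes through before reaching its
    final destination, given a url -> url redirect map."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or chain too long")
        seen.add(url)
    return hops

# Hypothetical chain of legacy sites, each redirecting to the next:
redirects = {
    "http://old-site.example/page": "http://older-site.example/page",
    "http://older-site.example/page": "http://new-site.example/page",
}
print(redirect_hops("http://old-site.example/page", redirects))  # 2
```

Ideally every old URL would point straight at its final destination – one hop, not a chain.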
You could use browser caching to speed up delivery.
Browser caching means that, if you go to a website, that website downloads some files on to your computer; if you go to it again, it can save some time by loading the files it saved earlier, which makes a repeat visit faster.
If you've ever cleared your cache on your computer, that's the kind of stuff you're clearing – which can be images from websites.
So if you use browser caching you can set an expiry date.
If you set a date of, say, six days on your company logo and a user downloads it on day one, then when they visit again on day three they don't have to download it again because it's within those six days – which helps speed things up.
You can also up your server capacity, so improve your technology to serve these pages more quickly and efficiently.
There are tools – help is at hand.
This is from Google and it’s free – it’s called Google Page Speed Insights. If you Google ‘Page Speed Insights’ you’ll get something like this.
You can put anyone’s URL in here to check the speed score.
I just so happened to have looked at the BBC here and I got this.
I just did their main homepage – you can only do it page by page so it can be a bit slow, but it is really detailed and gives great advice.
It splits it by mobile and desktop and will give you a score for each one.
You can see the score there: Poor, 61 out of 100.
And it grades it. It’s not going to say ‘this page took two seconds to load’ it’s looking at it as a whole and gives a grade to go with it.
Each one of these you can expand and it will show which things or which code are causing it to be slow to load.
There’s tonnes of information so do give it a try.
Number four:
Go local
To come here tonight I just put into Google ‘Eagle Cherry Labs’ – I didn’t quite know where I was going – and unsurprisingly there was a whole load of nice search results.
But over on this side there's a nice information box, a Knowledge Box, which gave me photographs of the venue, a map, a preview of the Street View, reviews, the company website and the address, from which I can get directions.
When I took this screenshot it was actually closed because it was late at night – but it would tell me which hours it’s open. It actually doesn’t have a phone number so there’s a tip if the venue manager is listening in.
That’s great visibility for a local business if searchers kind of know the business name.
If you can fill this out completely then it will help to rank for those searches.
In order to sort out your local SEO you need to go looking for yourself.
That sounds like a mindfulness course – but use search queries around your current and old addresses, all those old phone numbers, business name variants.
Take a look at directory listings sites like Yell and Scoot – like the phone book, all those usual suspects – and also your own site and your social channels, and sort out your NAP.
NAP stands for Name, Address, Phone number.
Basically, you’re looking for all of those variants and you want them to say exactly the same thing.
Pick one format for your address and update it everywhere you can so it matches that.
That's like a testimonial to your business because it's saying, on all of these different websites: this is the business, this is where it's based.
There are paid-for tools from the likes of Whitespark and Moz Local – and I'm sure SEMrush has one as well – that enable you to find those listings and addresses and also help you update them.
Or you can just use a spreadsheet, which is what I do. A whole load of search queries and sites and you can track them all.
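If you do go the spreadsheet route, a small script can help spot mismatched listings. Here's a rough Python sketch – the business name, address and phone number are made up, and real citation tools do much fuzzier matching:

```python
import re

def normalise_nap(name, address, phone):
    """Reduce Name, Address, Phone fields to a comparable form so
    variants of the same listing can be spotted. A toy sketch -
    real citation tools do much fuzzier matching than this."""
    def norm(s):
        # lowercase, strip punctuation, collapse runs of whitespace
        return re.sub(r"\s+", " ", re.sub(r"[^a-z0-9 ]", "", s.lower())).strip()
    return norm(name), norm(address), re.sub(r"\D", "", phone)

# Two hypothetical directory listings for the same (made-up) business:
a = normalise_nap("BoilerJuice Ltd.", "1 High St, Cambridge", "01223 123456")
b = normalise_nap("Boilerjuice ltd", "1  High St  Cambridge", "(01223) 123-456")
print(a == b)  # True - same business, despite the formatting differences
```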
Don’t forget to claim your business. You can do that on Google My Business and Bing – anyone remember Bing? Bing Places they have one as well.
And then you can have a nice listing like this – which is a great shop – so you can enhance your listing.
Who doesn’t want to support a local business?
Go meta
– number five.
Meta tags are little tiny tags that live in the HTML, in the head of each page and they kind of tell search engines the information about that page.
The first one is the page title which, technically, isn’t a meta tag but it kind of gets lumped in there with all the others.
But the title is really important.
You need a descriptive title for each individual page. Each of your pages needs to have one of these titles.
It often includes the brand name, so if you do a search you'll often see the name of the page and then a pipe symbol or a hyphen and then the name of the business.
Titles should be under 60 characters – if a title goes over 60 characters it just gets truncated.
It also appears in the tabs on your browser. So if you have a lot of tabs open in Chrome or whatever you’ll see page titles in those tabs.
They’re really prominent in search results too.
Next up, meta descriptions.
These are a unique description of each and every single page.
You’ve got 156 characters to play with and again, if it’s longer it gets truncated and it will be out of sight of search engine users.
If you’ve got some really juicy keywords that you want to be found or a really attractive message that you want to catch people’s eyes, make sure it’s early on in the meta description.
That goes the same for the title – get the really juicy stuff early on.
Meta description is visible in search results, I’ll show you where in a second.
Lastly, meta keywords.
Now these used to be significant but they were spammed to bits. Most search engines don’t consider them. People used to put all sorts of things and keywords into here just to get them to rank. They cheated the system so the search engines just kind of turned them off.
Now this is making me hungry but this is a search result I did for pizza recipes.
You can see the words in the lower portion of the text in bold – it's highlighting the things I was searching for.
This is some of the code from that page and some of that code was used to build that search result.
Here you can see the page title – 'Pizza Recipes', pipe symbol, 'Jamie Oliver' – which turns up as the big piece of clickable text in the search result.
Next down you can see meta name description and then it has the content, which is a line and a half of code here which is the non-clickable piece of text in the search result.
For some reason they’ve then got jamieoliver.com in there too – I wouldn’t bother doing that.
Interestingly they’ve also put meta keywords in too: Jamie Oliver and jamieoliver.com – again I probably wouldn’t bother doing that.
That’s how you get that information to appear there.
Sometimes it’s not what’s in the page title or the description, sometimes the search engines like to do something a little bit different but if you state them here it’s the best bet you’ve got that they’ll appear in the search results.
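If you want to audit your own pages, the title and description are easy to pull out with Python's built-in HTML parser. A minimal sketch – the page head below is invented, loosely modelled on the example above:

```python
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    """Pull the <title> and meta description out of a page's HTML so the
    truncation limits mentioned above (60 / 156 characters) can be checked."""
    def __init__(self):
        super().__init__()
        self.title, self.description = "", ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A made-up page head, loosely modelled on the example above:
html = '''<head>
<title>Pizza Recipes | Jamie Oliver</title>
<meta name="description" content="Our pizza recipes are easy to follow.">
</head>'''

checker = MetaChecker()
checker.feed(html)
print(len(checker.title) <= 60)          # True - safe from truncation
print(len(checker.description) <= 156)   # True
```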
Number six:

Go robots
Meta robots advise search engines on what to do with your web pages on an individual level, on each page, but you can also set them on your server to make big sweeping statements.
Those kind of statements can cover ‘no index’ – which says to the crawlers ‘I know you’ve seen me but… you haven’t seen me. Don’t remember me, don’t put me in the search results.’
There’s also ‘no follow’ which says ‘I know you’ve seen me and you’ve seen these links but don’t consider these links, don’t associate my site with those links’.
And ‘no archive’.
No archive means, when a crawler sees noarchive it sees the page but you’re saying ‘don’t keep a copy of this in your cache archives.’
I tend to find, if you've got a page that's turned up in the search results that you don't want in the search results – and that can happen for lots of legitimate reasons – I put a noindex and a noarchive on it to make sure it gets back out of the search results.
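Checking what a page is actually telling crawlers is simple enough to script. A minimal Python sketch – the tag below is an example, and note that robots directives can also arrive via the X-Robots-Tag HTTP header, which this doesn't cover:

```python
from html.parser import HTMLParser

class RobotsMetaReader(HTMLParser):
    """Collect the directives from a page's robots meta tag so you can
    see what it is telling crawlers (noindex, nofollow, noarchive...)."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives |= {d.strip().lower() for d in content.split(",")}

reader = RobotsMetaReader()
reader.feed('<meta name="robots" content="noindex, noarchive">')
print("noindex" in reader.directives)   # True
print("nofollow" in reader.directives)  # False
```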
Next is headings.
So headings can help search engines understand content and also the context of the content. And they can really help users by breaking up the content into a hierarchical order.
H1 is the most important, H1 stands for Heading one and it indicates the main topic of the page.
As it's the main topic there should only be one H1 – yet so many sites have multiple H1s.
Visually on the page these are often the main headline on the page – there’s lots of examples on newspaper websites that do that.
Next down is H2, heading two which is a sub-subject of the H1 and H3 is a sub-subject of the H2 so you’re kind of breaking down the sections of your page or article.
I find that H tags are often misused by CMS users, which drives me insane – which means I can show this really naff meme of an SEO and a CMS user.
It really annoys me how many people don't use heading tags correctly; they just pick them for how they look.
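The multiple-H1 problem is easy to catch automatically. A rough Python sketch – the HTML here is invented – that counts headings per level:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Count heading tags per level, to spot pages with no H1 or with
    multiple H1s. A rough sketch, not a full document-outline checker."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.counts[tag] = self.counts.get(tag, 0) + 1

audit = HeadingAudit()
audit.feed("<h1>Main topic</h1><h2>Sub</h2><h2>Sub</h2><h1>Oops</h1>")
print(audit.counts)                     # {'h1': 2, 'h2': 2}
print(audit.counts.get("h1", 0) == 1)   # False - more than one H1
```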
Number seven, we’re almost there.
Links
As every spider knows the web can only be a web if there are links.
Be careful who or what you link to. Not all links are created equal.
You need to watch out for spam sites: low-quality sites with thin content – very vague, poorly written – or sites with lots of duplicate pages with the same content on them.
Perhaps advert-laden pages – you go to these sites and all these adverts appear, overlays and stuff – or spam-filled comments. If there are spam-filled comments on the site you're linking to, it could be a strong spam signal to search engines.
Also, if you’ve got spam comments on your site that can also be a signal to search engines.
And also malware sites which, when you visit them are ‘Ah, yes, let’s download this for you.’
You can easily get penalised so only link out when it offers genuine value to the user. And link, as well, to content on your own site.
Lots of people overlook this: internal linking.
If you've got a page on your site and you have another page over here, people forget to link them – proper internal links can help the crawlers discover more of your content, so you can get those pages indexed as well.
There’s some tools you can use to check for spammy sites. There’s a good one from Moz (links to all these sites will be available after this so you don’t need to frantically write this down) – that’s Open Site Explorer.
There’s a way to check sites and get their spam score before you link to it.
Also from Moz there’s the SEO Toolbar which you can add to your Chrome browser.
There’s also a good article from Ahrefs on Bad Links.
My personal opinion: don’t buy links.
Earn them.
Because you want to have the best content on the best website that’s well optimised.
So that the search users can get to the best possible answer in the fastest possible time.
- Download the slide deck: SEO First Steps 7 Must Dos
Cambridge SEO MeetUp
If you found the above useful (perhaps even interesting?) you will love attending the free Cambridge SEO MeetUps in person.
Great speakers, great learning, great advice to help get more of the right kind of organic traffic to your website.
There’s free drinks, snacks and great people to network with too.