It’s a debate we have been having for a little while, as we often see better SEO results from newer, fresher, simpler websites than from those that are starting to get a little outdated.
So we decided to see whether we could find anyone else who would back us up on the theory that high-quality website design leads to better and quicker SEO results.
HTML is King
A few years ago there was a big trend for building pretty-looking websites in Flash or Java. They may look quite impressive, but it soon became clear that search engines aren’t capable of crawling and digesting all the backflips your images are doing for the human eye. Despite advances in crawling technology, HTML is still king when it comes to being indexed by Google or any of the other search engines, and there are other ways to fit in those fancy images and designs. The other point to mention about Flash is that it isn’t supported by certain systems: if you’re using a device designed by Apple, Flash won’t display properly, and as mobile browsing increases, it isn’t really welcome there either.
Link Structures
Friendly navigation is good for users, and search engines take quite kindly to it too. As well as having content to crawl, Google needs to be able to navigate through your pages easily, and internal links need to be in place to make sure that pages are indexed.
If there isn’t a link to a page, in Google’s eyes it effectively doesn’t exist: when Google’s spider crawls your site, there is no path to reach it. In the infographic we can see that the sub-pages of A and B will be crawled, but C will not. If you’re trying to rank one of these unreachable pages, no amount of keyword targeting, quality content or good links is going to make any difference at all.
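As a minimal sketch (the file names here are hypothetical), a plain HTML anchor is all the spider needs to discover a sub-page, while a page with no inbound link at all is invisible to the crawl:

```html
<!-- Reachable: the homepage links to this sub-page, so the
     spider can follow the anchor and index the target. -->
<a href="/services/web-design.html">Our web design services</a>

<!-- Unreachable: if no page on the site contains a link to
     /services/orphan-page.html, the spider never finds it,
     however well optimised the page itself may be. -->
```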
Other than simply forgetting to place internal links (which should never happen if you’ve got a good website design and knowledgeable SEO professionals), there are a few other reasons why pages may not be crawlable:
- Content following a form submission
If you need to enter your details or submit a piece of information before accessing a page, search engines won’t be able to crawl the content
- Links in JavaScript, Flash, etc.
These links cannot be followed by Google’s spiders, so the pages they point to may never be indexed
- Robots.txt and meta robots
These directives are optionally added by the webmaster to prevent spiders from crawling certain pages; be warned, however, that a mistake here may unintentionally stop your website from being crawled properly (see the sketch after this list)
- Pages with thousands of links
Search engines will not crawl an infinite number of links, so if your page is full to the brim, be aware that Google may not choose to go through each and every one.
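To make the robots.txt warning concrete, here is a minimal sketch (the folder name is hypothetical). One stray rule is the difference between hiding a single directory and blocking your whole site:

```
# Blocks all crawlers from one folder only
User-agent: *
Disallow: /private/

# By contrast, this single rule would block the ENTIRE site:
# User-agent: *
# Disallow: /
```

The page-level equivalent is a meta robots tag placed in a page’s head section, for example `<meta name="robots" content="noindex">`.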
When you’re linking internally, make sure it makes sense too – relevancy is yet again key, so use sensible anchor text to link from page to page, as in the example below.
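For instance (hypothetical page names again), descriptive anchor text tells both users and crawlers what the target page is about:

```html
<!-- Vague: gives the spider no clue what the target covers -->
<a href="/wedding-cakes.html">click here</a>

<!-- Descriptive: the anchor text is relevant to the target page -->
<a href="/wedding-cakes.html">our range of wedding cakes</a>
```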
Site Speed
Back in 2010 an SEO factor was added to the Google algorithm concerning website load speeds. Making sure your website loads in a decent amount of time is one way of boosting your rankings in the search engines, and careful website design is needed to bring this time down. Tips are to remove any unwanted code, minify your markup and CSS (as much as possible on one line per div/class), reduce the amount of JavaScript and don’t let files grow overly large. You’ll see a decrease in load time and an increase in rankings.
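One way to picture those tips (file names hypothetical): combining and minifying your assets cuts both the number of requests and the bytes the browser has to download:

```html
<!-- Before: several separate, unminified files to fetch -->
<link rel="stylesheet" href="/css/header.css">
<link rel="stylesheet" href="/css/layout.css">
<script src="/js/slider.js"></script>
<script src="/js/tracking.js"></script>

<!-- After: one minified stylesheet and one combined script;
     fewer requests and smaller files mean a faster load -->
<link rel="stylesheet" href="/css/site.min.css">
<script src="/js/site.min.js"></script>
```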
Although we aren’t saying that an old website will never rank well for relevant keywords, it is probably going to take more work to optimise the site. Following some of these tips and looking at how your current website is built can save you a great deal of time in the long run.
Hi Rebecca,
Great tips on how to optimize one’s website in terms of SEO techniques. I agree with you that a fast load time is crucial, as it determines whether your visitors stay on your blog or move on to another site – many of us are impatient and leave a site if it takes too long to load.
Now Google may crawl even some JS and AJAX, so it is a step ahead.
On the other hand, it becomes harder to hide any kind of link or text from indexing.