

Quality of Code and SERP

azseoguy

Hello all,
What do you guys think about the quality/cleanliness of code in a website's structure/pages and its effect on SE rankings?

For instance, let's assume we have a good title, heading(s), copy, and anchor text for a given keyword phrase, but the structure (HTML) of the page is messy. Would this have a negative effect on SERPs?

(The page looks good to the viewer but not so pretty for the crawler)



"Messy"
inline styles, nested tables, JavaScript in the head, etc.
 
The page looks good to the viewer but not so pretty for the crawler

If that's the case then it's not good for SERPs, as the crawler can't properly crawl the site and index it.
 
If that's the case then it's not good for SERPs, as the crawler can't properly crawl the site and index it.

I don't think so. If the page has the proper metadata set up and the spider can still read the text/content of the page, I see no reason why it wouldn't rank well. But if the on-page stuff (metadata) is not properly done, you would obviously have major problems.
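To illustrate why the copy usually survives messy markup, here is a rough sketch of a crawler-style text extractor using Python's stdlib `html.parser` (the messy sample HTML and the `TextExtractor` name are made up for the example; real spiders are far more sophisticated):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style, roughly the way a crawler might."""
    def __init__(self):
        super().__init__()
        self.skip = 0       # depth inside <script>/<style>
        self.chunks = []    # visible text fragments

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

# "Messy" page: JS in the head, nested tables, inline styles.
messy = """<html><head><script>var x=1;</script></head>
<body><table><tr><td><table><tr><td style="color:red">
Widget reviews and buying guide</td></tr></table></td></tr></table></body></html>"""

p = TextExtractor()
p.feed(messy)
print(" ".join(p.chunks))  # → Widget reviews and buying guide
```

Despite the nesting and inline styles, the actual copy comes out cleanly; the mess mostly costs bytes and parse effort, not the text itself.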
 
The first thing Google says in their webmaster guidelines is that the page should be of high technical quality. They are referring to compliance with HTML or XHTML standards. So you should ensure your pages pass W3C error checking.

The W3C Markup Validation Service

Also, the more links the spiders can follow, the faster your pages will be indexed/reindexed. So avoid navigation that bots cannot follow, e.g. Flash.

You should also ensure none of your alt attributes or name attributes are left blank.
 
Thanks for the info, and the link, surrey. Will check the standard. Yeah, the metadata is in the correct place; it is the structural code that is outdated and/or "messy".

For example, a clean page would have the content starting around line 50-100, but the messy page doesn't have it starting 'til line 300+.

My thought was that, by having to pass through so much more code before it reaches the body, the crawler might rank the page lower.
 
Well-coded sites tend to load fast, and a fast-loading site has an advantage over a poorly coded, bloated site.
 
I have read that you should get the page content as high as possible in the HTML. Many sites now put the main navigation bar at the bottom of the HTML and use CSS to position it at the top, so that the first thing a spider sees is the main copy.

I am skeptical of this, though, as it's obvious that spiders do read the whole page, and I can't see why they would weight the top of the HTML as more important, considering that the first thing in most sites' HTML is the navigation.

Overall I think that as long as the spider can parse the page (you don't have mismatching tags), the search engines will index it.

I don't think search engines will be too bothered about W3C strict validation, as most of the search engines' own sites are not valid XHTML.
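On the "mismatching tags" point: here is a rough sketch of how a parser could flag them, again with Python's stdlib `html.parser` (the `TagBalanceChecker` name and the void-tag list are my own for the example; a real crawler is more forgiving and tries to recover):

```python
from html.parser import HTMLParser

# Void elements never get a closing tag, so they are not pushed on the stack.
VOID = {"br", "img", "hr", "meta", "link", "input", "area", "base",
        "col", "embed", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Naive open/close tag matcher; records close tags that don't match the stack top."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            # e.g. </div> arriving while an unclosed <p> is still on the stack
            self.errors.append(f"unexpected </{tag}>")

checker = TagBalanceChecker()
checker.feed("<div><p>hello</div>")  # the <p> is never closed
checker.close()
print(checker.errors)  # → ['unexpected </div>']
```

A page that trips a checker like this still usually gets indexed; the risk is that the parser's error recovery guesses a different structure than the one you intended.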
 
I have read that you should get the page content as high as possible in the HTML. Many sites now have the main navigation bar at the bottom of the HTML and use CSS to position at the top so that the first thing a spider sees is the main copy....

Thanks Matt, I have considered this too. With CSS positioning the code doesn't have to be written in the order it will be displayed. Curious to know whether that would affect the ranking. I am in the process of converting the site into a table-less format, so the pages are going to be shorter and cleaner (if nothing else).
 
The first thing Google says in their webmaster guidelines is that the page should be of high technical quality. ...

Completely agree with you. I am facing errors in my blog's code too, but I have now forwarded the details to my programmer. I have also discussed it with some professionals, and they told me that the errors shown by the W3C validation service do affect your site's performance and ranking.
 
Hello all,
what do you guys think about the quality/cleanliness of code in a website structure/page and its effect on SE rankings? ...

Of course it matters. Long, invalid pages cannot be parsed easily by search engines. And if the HTML is broken you will have other problems as well:
you will not be able to get the same result across different browsers, the site will load slower, and blind users will not be able to read your site.
 
Search engine spiders have difficulty reading messy, outdated code. XHTML and CSS are best. PHP with static URLs is also good and can be easier to manage. As mentioned earlier, W3C guidelines should be followed, though this can be difficult given the different hacks needed to make pages display correctly in IE.

Spiders also look at the code-to-content ratio. If you're using tables for non-tabular data, the search engine gets confused. But as with anything it's cost/benefit, and you must take a holistic view of SEO. Don't spend 80% of your time on an aspect that will only produce small benefits if your time and budget are limited; and whose time and budget are not limited?
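The code-to-content ratio mentioned above can be sketched as visible-text bytes divided by total page bytes. A minimal version with Python's stdlib `html.parser` (the `content_ratio` helper and sample pages are my own for illustration; this crude version also counts script/style text as content, which a real tool would exclude):

```python
from html.parser import HTMLParser

class RatioMeter(HTMLParser):
    """Tally the bytes of text content seen while parsing."""
    def __init__(self):
        super().__init__()
        self.text_bytes = 0

    def handle_data(self, data):
        # Crude: counts all text nodes, including any inside <script>/<style>.
        self.text_bytes += len(data.strip())

def content_ratio(html: str) -> float:
    """Fraction of the page that is visible text rather than markup."""
    m = RatioMeter()
    m.feed(html)
    return m.text_bytes / max(len(html), 1)

lean = "<p>Widget reviews</p>"
bloated = ("<table><tr><td style='x'><table><tr><td>"
           "Widget reviews</td></tr></table></td></tr></table>")

# Same copy, very different ratios: the lean page is mostly content,
# the nested-table page is mostly markup.
print(content_ratio(lean), content_ratio(bloated))
```

Both pages carry identical copy, so any ranking difference would come from the markup overhead, not the content.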
 