Tuesday, March 11, 2008

SEO, JavaScript and AJAX

Javascript: Friend or Foe?

If you frequent different SEO forums you may have received mixed signals on things like Flash or JavaScript. You may be wondering if it’s safe to use these and if so, what the impact will be. In this article, I attempt to address your concerns over JavaScript.

A brief history of JavaScript

In 1995, Netscape developers wanted an easy way to make Java applets more accessible to non-Java programmers and web designers. While the language was plagued in its early days by problems and errors that were difficult to diagnose, this simple scripting language has hung on and has only grown more popular over time.

Because of its cross-browser compatibility, in that most modern browsers now support it, and its relative ease of use and implementation, JavaScript has become popular among designers looking to give their websites a more dynamic edge.

So, is JavaScript bad?

In a nutshell, no. If it is used properly, and a few basic rules are followed, then JavaScript is perfectly acceptable.

The biggest flaw in many sites that use JavaScript is that the developer embeds the navigation inside JavaScript menus. This renders the links invisible to search engine crawlers, so the crawlers won't follow them.

But once you take the navigation out of the JavaScript, it becomes a very powerful scripting tool for achieving effects that HTML alone cannot.
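To make the distinction concrete, here is a sketch of the two approaches. The first menu writes its links from inside a script, so a crawler that doesn't run JavaScript never sees them; the second keeps ordinary anchors in the HTML, which a script can still enhance. The URLs and the id are invented placeholders.

```html
<!-- Crawler-hostile: the links exist only after the script runs -->
<script type="text/javascript">
  document.write('<a href="/products.html">Products</a>');
  document.write('<a href="/contact.html">Contact</a>');
</script>

<!-- Crawler-friendly: plain HTML links that a script can still style
     or animate without hiding them from crawlers -->
<ul id="nav">
  <li><a href="/products.html">Products</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```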

JavaScript is also handy in helping you reduce code bloat.

Code bloat is what we call it when a regular HTML file approaches the file-size limits imposed by the search engines. While there is debate over the actual threshold, Google has said it can easily index files up to about 100 kB (kilobytes) but may have trouble with larger files.

Sometimes, there are aspects within a file that must exist for that page to render properly. And sometimes JavaScript can be used to help maintain that functionality but also help reduce the actual file size.

But before I get to that, let us start with the basics.

The basics – externalizing JavaScript

Let's say your website has some JavaScript in it. What should you do?

Well, usually we recommend removing the script and referencing it externally, especially if there are many lines of JavaScript. If you've only got three or four lines on the whole page, I wouldn't worry about externalizing it; such a small amount won't matter one way or the other.

Externalizing the script is quite simple: look in the code for the opening and closing script tags, and copy everything between them into a Notepad file. Then save your file as something.js, where "something" is whatever filename you want and ".js" is the file extension.

Note: if you are using Notepad, be sure to change the file type (the lower drop-down box in the "Save As" dialogue) to "All Files" so that Notepad doesn't append .txt to your filename. If you don't do this, the file will be saved as something.js.txt and won't function properly.

Once you have the file properly saved, upload it to your webserver. Now you are ready to reference it using the following code:

<script language="JavaScript" type="text/javascript" src="something.js"></script>

What this does is refer to the external something.js file and run its contents as if they resided within the HTML.
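A minimal before-and-after sketch of the whole process. The openPopup function here is an invented stand-in for whatever script your pages actually carry; the point is that the inline block moves into something.js and is replaced by a single reference.

```html
<!-- Before: the script lives inline, repeated in every page -->
<script type="text/javascript">
  function openPopup(url) {
    window.open(url, 'popup', 'width=400,height=300');
  }
</script>

<!-- After: the same function sits in something.js on the server,
     and every page carries only this one-line reference -->
<script language="JavaScript" type="text/javascript" src="something.js"></script>
```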

As you may have guessed, this can remove hundreds or even thousands of lines of JavaScript from a page, reducing its file size substantially.

I remember one site I worked on where I was able to externalize over 1,100 lines of JavaScript into 4 external files. That resulted in much smaller files which also loaded faster.

So what are the disadvantages of JavaScript?

As with all great ideas, this one has a catch.

As you know, search engine crawlers will not execute the JavaScript; that's why we went to all this trouble. But a visitor whose browser doesn't execute JavaScript won't see what is "hidden" in the script either. So if an essential navigation element lives inside that JavaScript, such a visitor likely won't be able to navigate your site.

However, most browsers today do support JavaScript. It can be turned off, but so many other sites rely on scripts, however simple, to render their pages that few visitors browse that way; virtually every major web portal uses some JavaScript. Therefore I don't think you have too much to worry about.

However, if you are one of those people who must have your site compatible with as many browsers as possible, then I would recommend against the above tactics.

On the other hand, if you are more concerned with supporting the 99.9% of your site's visitors, and are willing to sacrifice the 0.1% who may not support JavaScript in exchange for improved search engine rankings, then perhaps this is one tactic you can employ.
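One common middle ground, not covered above, is the noscript element: visitors with scripting enabled get the scripted menu, while the tiny minority without it still get a plain link they can follow. The filenames here are invented for illustration.

```html
<!-- Menu generated by script for the vast majority of visitors -->
<script type="text/javascript" src="menu.js"></script>

<!-- Plain-HTML fallback, rendered only when scripts are off -->
<noscript>
  <a href="/sitemap.html">Site map</a>
</noscript>
```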

Now before you go thinking that this is a magic bullet that will shoot you to the top of the rankings, think again.

This is only one tactic in an arsenal available to help your site improve its rankings, and there are many things more valuable to you in that regard (such as effective content and good link building). But if you are looking to squeeze every bit of performance out of your site, and want absolutely everything that could have a positive impact on your rankings, then you should consider externalizing any non-essential code such as JavaScript.


AJAX, for those who haven't heard the term, is a new-age way of bringing software-like usability to the web. It uses existing technologies but pulls them together usefully to create powerful online applications. The most famous examples of AJAX, or Web 2.0, are Google Suggest and Google Maps.

In a nutshell, it's a means of using asynchronous JavaScript and XML to query a remote data source and render the content without refreshing or reloading the page.
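The pattern just described can be sketched in a few lines of browser JavaScript. This is a minimal illustration, not any particular site's code: the "/suggest" URL and the "q" parameter are invented placeholders, and the XMLHttpRequest object shown here is the browser-supplied one (older versions of Internet Explorer used ActiveXObject instead).

```javascript
// Pure helper: build the query string for the request.
function buildQuery(baseUrl, params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return baseUrl + "?" + parts.join("&");
}

// Fire an asynchronous request and hand the response text to a
// callback, without refreshing or reloading the page.
function fetchSuggestions(term, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", buildQuery("/suggest", { q: term }), true); // true = asynchronous
  xhr.onreadystatechange = function () {
    // readyState 4 = done; status 200 = OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseText); // render into the page, no reload needed
    }
  };
  xhr.send(null);
}
```

The key point for the discussion that follows is that nothing in this flow changes the page's URL, TITLE, or headers: the browser stays on the same address while the content underneath it changes.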

The danger of this, from a search engine optimisation point of view, is that we do away with unique URLs and have few options for manipulating TITLE tags and page headers, which are key requirements of an effective SEO campaign.

As much as SEO is dismissed as secondary to design, a website is a business tool and should, first and foremost, reach its market.

How to search engine optimise Ajax

The technique I wish to propose consists of two main rules and will fundamentally change the way the AJAX application is constructed.

Rule #1

The initial load of the AJAX application must contain the optimised elements, such as the TITLE and headers, and must also be reachable at a fixed address.

Rule #1 relies on the following two techniques:

* Server side script
* .htaccess mod_rewrite
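The second bullet can be sketched with a few lines of .htaccess configuration. This assumes an Apache server with mod_rewrite enabled; the script name, directory, and parameter are invented for illustration. The idea is that a crawlable, fixed address maps onto the server-side script that renders the initial, optimised page load before any AJAX takes over.

```apache
# .htaccess -- assumes mod_rewrite is available and enabled
RewriteEngine On

# Map a clean, linkable URL such as /products/blue-widgets to the
# server-side script that renders the initial page, complete with
# its own TITLE tag and headers.
RewriteRule ^products/([a-z0-9-]+)$ product.php?item=$1 [L]
```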

Promotion and marketing of AJAX

This is the second important rule, and it needs to be followed in order to get a good listing for the products or resources within an AJAX application.

Rule #2

As with all websites, we need inbound links with keyphrase-descriptive anchor text that point to a particular resource, not just the home page.
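For illustration, a link of the kind Rule #2 asks for might look like this, with the domain and path invented as placeholders. Note that it targets the fixed, deep address established under Rule #1, and its anchor text describes the resource rather than saying "click here".

```html
<!-- Keyphrase-descriptive inbound link to a specific resource,
     not the application's home page -->
<a href="http://www.example.com/products/blue-widgets">blue widgets</a>
```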
