Tuesday, March 11, 2008
Page Rank Bleed Down
I have often seen posts on other forums and blogs in the SEO / making-money-online industry about PageRank bleed down and, more importantly, how to kick-start this process on sub-pages to build inner-page PageRank. Why this could be important is beyond me, as Google is very strict against selling PageRanked links, so don't say I didn't warn you!
But if you just want to know for the sake of knowing, all you need to do is keep reading! Before we get started, let's talk about what PR bleed down is: PR bleed down is how the PageRank from your home page is bled down from page to page. Every sub-page you create has the potential to gain PageRank, and even to out-rank your home page's PR if you do it right!
So when we talk about PR bleed down, what exactly does that mean? Basically, it's the way the PageRank accumulated by the homepage is distributed among the different sub-pages within your site. A sub-page can gain PR from external links pointing directly at it, from the homepage linking to it, or both. Your PageRank power comes from applying both: a direct link from the homepage and external links pointing at that new sub-page. Without both, the page won't be very powerful and won't rank well for search terms.
So that raises the ever-important question: how can I kick-start my sub-pages to get PR?
• Link To Them From The Home Page
• Submit Them To Social Bookmark Sites
• Cross Link The Sub-pages
If you do those 3 things, you will more than likely end up with a PageRank no less than 1 PR below your homepage's. Meaning, if your homepage is a PR 5, your sub-page should be a 4 at the next update. After that, point a few external links at each sub-page and be ready to push its PageRank past the homepage's PR in some cases!
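To make steps 1 and 3 concrete, here is a minimal HTML sketch; the page names are placeholders of my own, not anything prescribed.

On the homepage (index.html), link directly to each new sub-page:

<a href="/services.html">Our Services</a>
<a href="/articles.html">SEO Articles</a>

And on each sub-page, cross-link to the other sub-pages:

<!-- on services.html -->
<a href="/articles.html">SEO Articles</a>
<a href="/index.html">Home</a>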
What about you? Can you think of any other tips or ideas for getting your sub-pages' PageRank higher? Tell us about them in the comments below!
SEO, JavaScript and AJAX
JavaScript: Friend or Foe?
If you frequent different SEO forums you may have received mixed signals on things like Flash or JavaScript. You may be wondering if it’s safe to use these and if so, what the impact will be. In this article, I attempt to address your concerns over JavaScript.
A brief history of JavaScript
In 1995, Netscape developers realized they needed an easy way to make Java applets more accessible to non-Java programmers and web designers. While it was plagued in its early days by problems and errors that were hard to diagnose, this simple scripting language has hung on and grown more popular over time.
Because of its cross-browser compatibility (most modern browsers now support JavaScript) and its relative ease of use and implementation, JavaScript has become popular among designers looking to give their websites a more dynamic edge.
So, is JavaScript bad?
In a nutshell, no. If it is used properly, and some basic rules are followed, then JavaScript is perfectly acceptable.
The biggest flaw in many sites that use JavaScript is that the developer embeds navigation inside JavaScript menus, which renders the links invisible to search engine crawlers; the crawler therefore won't follow those links.
But if you keep the navigation out of the JavaScript, it becomes a very powerful scripting tool that helps you achieve various effects that HTML cannot.
JavaScript is also handy in helping you reduce code bloat.
Code bloat is what we call it when a regular HTML file approaches the file size limits imposed by the search engines. While there is debate on the actual size, Google has come out and said that it can easily index files up to about 100 kB (kilobytes) but may have problems with files which are larger.
Sometimes, there are aspects within a file that must exist for that page to render properly. And sometimes JavaScript can be used to help maintain that functionality but also help reduce the actual file size.
But before I get to that, let us start with the basics.
The basics – externalizing JavaScript
Let's say your website has some JavaScript in it. What should you do?
Well, usually we recommend removing the script and referencing it externally, especially if there are many lines of JavaScript. If you've only got 3 or 4 lines throughout the page, I wouldn't worry about externalizing it; such a small amount won't matter one way or the other.
Externalizing the script is quite simple: look in the code for the opening and closing <script> tags and copy everything between them into a Notepad file. Then save the file as something.js, where "something" is the filename you want and ".js" is the file extension.
Note: if you are using Notepad, be sure to change the file type (the lower drop-down box in the "Save As" dialogue) to "All Files" so that Notepad doesn't append .txt to your file. If you don't do this, the file will be saved as something.js.txt and won't function properly.
Once you have the file properly saved, upload it to your webserver. Now you are ready to reference it using the following code:
<script language="JavaScript" type="text/javascript" src="something.js"></script>
What this does is refer to the external something.js file and run the contents of the JavaScript as if they reside within the HTML.
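To make that concrete, here is a minimal sketch; the function name and element id are placeholders of my own, not anything prescribed.

Before, inline in the page:

<script type="text/javascript">
function showDate() {
  // write today's date into the element with id "today"
  document.getElementById("today").innerHTML = new Date().toDateString();
}
</script>

After, that same function sits in something.js, and the page carries only the one-line reference shown above; if your page had hundreds of lines of script, they all move out of the HTML the same way.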
As you may already guess, this can help remove hundreds or even thousands of lines of JavaScript, reducing the file size substantially.
I remember one site I worked on where I was able to externalize over 1,100 lines of JavaScript into 4 external files. The result was much smaller HTML files, which also loaded faster.
So what are the disadvantages of JavaScript?
As with all great ideas, this one has a catch.
As you know, search engine crawlers will not execute the JavaScript; that's why we went to all this trouble. But if a visitor's browser doesn't execute JavaScript, then of course they won't see what is "hidden" in the script either. Therefore, if you have an essential navigation element inside that JavaScript, it is likely that a visitor who doesn't execute JavaScript won't be able to navigate your site.
However, most browsers today do support JavaScript. While it is something that can be turned off, so many other sites rely on scripts, however simple, to help render their pages that few visitors browse that way; virtually every major web portal uses some JavaScript. Therefore I don't think you have too much to worry about.
However, if you are one of those people who must have your site compatible with as many browsers as possible, then I would recommend not using the above tactics.
On the other hand, if you are more concerned with supporting 99.9% of your site's visitors and are willing to sacrifice the 0.1% who may not support JavaScript in exchange for improved search engine rankings, then perhaps this is one tactic you can employ.
Now before you go thinking that this is a magic bullet that will shoot you to the top of the rankings, think again.
This is only one of many in an arsenal of tactics available to help your site improve its rankings. There are many other things which are more valuable to you in terms of rankings (such as effective content and good link building). But, if you are looking to squeeze every bit of performance out of your site as possible, and looking for absolutely everything which could have a positive impact on your rankings, then you should consider externalizing any non-essential code such as JavaScript.
Ajax
AJAX, for those who haven't heard the term, is a new-age way of bringing software-like usability to the web. It uses existing technologies but pulls them together in useful ways to create powerful online applications. The most famous examples of AJAX, or Web 2.0, are Google's Suggest feature and Google's Maps tool.
In a nutshell, it's a means of using asynchronous JavaScript and XML to query a remote data source and render the content without refreshing or reloading the page.
The danger of this, from a search engine optimisation point of view, is that we do away with unique URLs and have few options to manipulate TITLE tags and page headers, the main requirements for an effective SEO campaign.
However much SEO is dismissed as secondary to design, a website is fundamentally a tool for business, and it should reach its market first and foremost.
How to search engine optimise Ajax
The technique I wish to propose contains two main rules and will fundamentally change the way the AJAX application is constructed.
Rule #1
The initial load of the AJAX application must contain the optimised elements, such as TITLE and headers, and must also be reachable from a fixed address.
Rule #1 is all about using the following two techniques.
* Server side script
* .htaccess mod_rewrite
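As a minimal sketch of how those two techniques combine (the path and parameter name below are my own placeholders, not a prescribed scheme), a rule like this gives each AJAX-loaded resource a fixed, crawlable address served by a server-side script:

RewriteEngine On
# give each AJAX-loaded resource a fixed, crawlable address,
# served by the server-side script that renders the optimised initial page
RewriteRule ^products/([0-9]+)$ /index.php?product=$1 [L]

The server-side script at index.php then outputs the proper TITLE tag and headers for that product before the AJAX layer takes over.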
Promotion and marketing of AJAX
This is the second important rule that needs to be followed in order to get a great listing for the products or resources within an AJAX application.
Rule #2
As with all websites, we need inbound links with keyphrase-descriptive anchor text that point to particular resources, not just the home page.
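For example, reusing the placeholder address from the sketch above, a hypothetical inbound link like this points a descriptive keyphrase at a deep resource rather than the homepage:

<a href="http://www.yoursite.com/products/123">blue widget gadgets</a>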
Monday, March 10, 2008
Sandbox
What is the Sandbox?
The Sandbox Effect is a theory used to explain certain behaviours observed with the Google search engine. The theory holds that websites with newly registered domains, or domains with frequent ownership or nameserver changes, are placed in a sandbox (a holding area) within Google's index until Google deems it appropriate for them to rank. Webmasters have claimed that such sites will only show for keywords that are not competitive. The effect does not appear to apply to new pages unless the domain itself is in the sandbox.
There are many different opinions about it, including the view that the Sandbox Effect doesn't actually exist and that the search ranking behaviour can be explained as a result of a mathematical algorithm, rather than a decided policy.
Those who believe the sandbox exists observe that it can sometimes take up to a year or longer for a website to be promoted from the Google sandbox, while those who do not believe in a sandbox explain this duration as simply the time it takes for Google to calculate PageRank using an "eigenpairs interpretation of nodes".
How will you know that you're sandboxed by Google?
There are some signs that you might be sandboxed:
1. A drop in the number of website visitors referred by google.com.
2. A sudden drop in the Google PageRank of all your website's pages.
3. When querying Google on your specific keywords, your website appears in the last 2-3 pages of the search results.
4. Your website is banned from Google's listings.
If you wish to check whether this applies to you, try the following methods:
Method I:
Use this web address to check the indexing of your website pages against a specific keyword: http://www.searchenginegenie.com/sandbox-checker.htm
Method II:
Open your web browser, go to http://www.google.com, and type the following into the search box:
www.yourwebsite.com -asdf -asdf -asdf -fdsa -sadf -fdas -asdf -fdas -fasd -asdf -asdf -asdf -fdsa -asdf -asdf -asdf -asdf -asdf -asdf -asdf
If your website appears in these search results with good keyword rankings, even though it does not rank in a normal search, then your website is in Google's sandbox.
Method III:
Open your web browser, go to http://www.google.com, and type: site:www.yourwebsite.com
If no results are found, then your website is out of Google's index entirely. The difference between non-indexed fresh websites and sandboxed ones is that for the latter you will not see Google's usual suggestion: "If the URL is valid, try visiting that web page by clicking on the following link: www.yourwebsite.com".
How to get out of the Google Sandbox - part I
Emergency steps - do these first
1. Check whether you are using meta-refresh or JavaScript redirects on your website. For example:
<meta http-equiv="refresh" content="5; url=http://www.website.com/filename.php">
If so, remove them, because Google's bot treats them as spam.
2. Check that the websites linking to you return the HTTP response code 200 OK:
2.1 In the Google search box, type: allinurl: http://www.yoursite.com
2.2 Check every website other than yours with a server-header checker and look for the HTTP response code 200 OK (a quick curl sketch follows step 2.3).
2.3 If any of them return a 302 response code, try to contact the administrator of the problematic website to fix the problem. If you think they are stealing your PageRank, report them on Google's spam-report page with the "Deceptive redirects" box checked.
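A quick way to check a header yourself, assuming you have the curl command-line tool installed (the URL below is just a placeholder):

curl -I http://www.linkingsite.com/page-that-links-to-you.html

The first line of the output shows the status, e.g. HTTP/1.1 200 OK or HTTP/1.1 302 Found.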
A few more advanced tips for getting out of the sandbox
- redirect your website's www URLs to non-www URLs (a mod_rewrite sketch follows this list)
- keep your website structure no deeper than 3 levels (i.e. don't bury your content more than 3 links away from the homepage; beyond that, the crawler/spider may stop before reaching it)
- convert your web pages from dynamic to static-looking URLs using .htaccess
- rewrite all the meta tags and explicitly mark the pages that must not be indexed
- ask the crawling machines to slow down slightly; this is especially important if your hosting server doesn't have fast bandwidth.
In your robots.txt file put:
User-agent: *
Crawl-Delay: 20
You can adjust the Crawl-Delay value (the number of seconds between crawler requests) to suit your server.
- remove duplicate or invalid pages from your website that are still in Google's index/cache.
First make a list of all the invalid pages, then go to Google's page for urgent URL removal requests:
http://services.google.com/urlconsole/controller
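For the www-to-non-www redirect in the first tip above, a minimal .htaccess sketch (assuming Apache with mod_rewrite enabled; yoursite.com is a placeholder):

RewriteEngine On
# permanently redirect any www request to the non-www version of the same URL
RewriteCond %{HTTP_HOST} ^www\.yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://yoursite.com/$1 [R=301,L]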
That's it. I hope that by following these steps, your website will be re-indexed soon.
Link Building Tips
Every one of us wants to rank high in Google, and to get good SERPs you must focus on backlink building. But backlink building has to be done with great care so that you don't get penalized.
Here are some simple tips which will definitely help:
1. Don't buy (or sell) sitewide links for SERPs. Buy links from individual pages*.
2. Don't chase high-PR links only - start your campaign for a new site with PR 0-1 links and slowly move up each week as you see your rankings improve.
3. Control the speed of link placement. Remember Google's "too many links at once" filter.
4. Control the number of outbound links on your pages and on the pages where you place your links; avoid link farms and links to gambling, pills, etc.
5. A link from a relevant website is excellent, but a link from a quality site in another niche or another language can also boost your ranking, so don't ignore such links.
6. Make links look natural. Use different link anchors, and promote not only your main page but your deep pages too.
7. Don't do reciprocal link exchanges.
* - Links from individual pages are far more effective for improving SERPs (search engine ranking positions), not PR. They are more natural and organic. Sitewide links can easily be filtered.
Selling sitewide links also means losing a lot of money: if your site has 1,000 PR 2-3 pages, you could earn hundreds of dollars a month just from selling ONE outbound link per page, compared to some $20-30 for selling a sitewide link.
Just remember that your site's ranking depends on your efforts and skills; it doesn't depend on your site's PR alone.
Sunday, March 9, 2008
SE Friendly URLs
Search engine friendly URLs are another important thing to get right. Many of us have dynamic sites, meaning sites whose content changes daily, so a URL might be in a format like this:
http://www.yoursite.com/xyz/pqr.php?t=123&f=90
Google can cache links like this, but simplifying them so they look like static pages helps a lot with crawling and caching.
The link above can be converted into the following form:
http://www.yoursite.com/xyz/123/90/<title of page>.htm
This link is much more friendly than the old one.
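As a minimal sketch of how that conversion can be done in .htaccess (assuming Apache with mod_rewrite, and reusing the placeholder parameters from the dynamic URL above):

RewriteEngine On
# serve /xyz/123/90/any-page-title.htm from the underlying dynamic script
RewriteRule ^xyz/([0-9]+)/([0-9]+)/.*\.htm$ /xyz/pqr.php?t=$1&f=$2 [L]

The visitor and the search engine see the friendly, static-looking URL, while the server quietly runs the same dynamic script.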
Converting links this way is done with mod_rewrite; here is a link where you can get a fuller explanation:
http://www.workingwith.me.uk/articles/scripting/mod_rewrite
If you want more links, please do reply and I will post more.
Thanks
60 Forums for Do Follow
This is a list of do-follow forums: forums where you can register, start posting, and give out your link in your signature. I found this list somewhere else, but I am posting it with correct and proper links.
1. http://www.phpbbstyles.com
2. http://www.sitepoint.com/forums
3. http://www.thefreead.net/forums/
4. http://acapella.harmony-central.com/forums
5. http://forums.seroundtable.com
6. http://www.submitexpress.com/bbs
7. http://www.startups.co.uk
8. http://www.v7n.com/forums
9. http://www.webmaster-talk.com
10. http://forums.comicbookresources.com
11. http://bzimage.org/
12. http://www.clicks.ws
13. http://www.earnersforum.com/forums/
14. http://www.acorndomains.co.uk
15. http://forums.onlinebookclub.org
16. http://www.davidcastle.org/BB
17. http://6pins.com/blog/
18. http://iq69.com/forums/
19. http://forum.hot4s.com.au/
20. http://forums.microsoft.com/MSDN/
21. http://forums.mysql.com/
22. http://forums.cnet.com/
23. http://forums.oracle.com/forums/
24. http://paymentprocessing.cc/
25. http://www.warriorforum.com/forum/
26. http://forums.webcosmo.com
27. http://forums.digitalpoint.com
28. http://forums.searchenginewatch.com
29. http://www.webmasterforumsonline.com/
30. http://www.webmastertalk.in
31. http://www.ozzu.com/
32. http://www.webmasterworld.com/
33. http://www.webmastershelp.com/
34. http://talk.iwebtool.com/
35. http://www.webmaster-forum.net/
36. http://www.dnlodge.com/
37. http://forum.vbulletinsetup.com
38. http://forums.ukwebmasterworld.com/
39. http://www.webmastertalk.co.uk/
40. http://forums.seochat.com/
41. http://www.australianwebmaster.com
42. http://www.cambodiaxp.com/
43. http://www.highrankings.com/forum/
44. http://www.searchguild.com/
45. http://www.ihelpyou.com/forums/
46. http://www.seo-guy.com/forum/
47. http://www.searchengineforums.com/
48. http://www.marketingchat.org/forum/
49. http://www.webworkshop.net
50. http://www.webdesignforum.com/
51. http://www.ukbusinessforums.co.uk
52. http://www.businessforum.net/
53. http://www.domainforums.com
54. http://www.namepros.com
55. http://www.domainstate.com
56. http://www.nameslot.com
57. http://www.dnforum.com
58. http://www.webhostingtalk.com
59. http://www.freewebspace.net
60. http://siteownersforums.com/index.php
61. http://forum.admob.com
So you can register at these forums and post with your link in your signature. If you know a good forum which is not listed here, please let me know so I can add it.
Thanks
Deep Links
We all know backlinks, right? Deep links are the same as backlinks; the only difference is that instead of linking back to the main page, like http://www.yoursite.com, we link back to an inner page of the site, like http://www.yoursite.com/fold/fold2/xyz_file.htm.
This is done so that our backlinks look natural. If we only ever link to the main page, all the importance flows there, but the link profile should be balanced; to Google, backlinks should look natural.
There are many free and paid deep-link directories. A few that I know:
http://www.deeperlinks.com
http://www.deeplink.us
http://www.deeplinkindex.com
http://www.addurllisting.com
Go to Google and search for "deep links" and you will find a list of deep-link directories.
Deep links aren't strictly necessary, but they help anyhow.
If you want any more help, let me know.
Thanks