Tuesday, March 11, 2008

Page Rank Bleed Down

I have often seen threads on other forums and blogs in the SEO / making-money-online industry about Page Rank Bleed Down, and more importantly how to kick-start this process so that inner pages gain PageRank. Why this matters so much to some people is beyond me, since Google is very strict about selling PageRanked links, so don't say I didn't warn you!

But if you just want to know for the sake of knowing, then read on! Before we get started, let's talk about what PR Bleed Down is: it is the way the PageRank from your home page is bled down from page to page. Every sub-page you create has the potential to gain PageRank, and can even out-rank your home page if you do it right!

So when we talk about PR Bleed Down, what exactly does that mean? Basically, it's the way PageRank flows from the homepage and is distributed among the different sub-pages within your site. A sub-page can gain PR from external links pointing directly to it, from the homepage linking to it, or both. The real power comes when you combine a direct homepage link with external links pointing at that new sub-page. Without both, the page won't be very strong and won't rank well for its search terms.

So that raises the ever-important question: how can I kick-start my sub-pages to get PR?

• Link To Them From The Home Page
• Submit Them To Social Bookmark Sites
• Cross Link The Sub-pages

If you do those three things, you will more than likely end up with a PageRank no more than one point below your homepage. Meaning, if your homepage is a PR 5, your sub-page should be a 4 after the next update. After that, point a few external links at each sub-page and be ready for its PageRank to climb, in some cases even past the homepage PR!
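To make the mechanics concrete, here is a toy sketch in JavaScript of the commonly cited simplified PageRank model. This is not Google's actual algorithm; the 0.85 damping factor, the function name and the page names are all illustrative assumptions.

```javascript
// Toy model: a page's PageRank is split equally among the pages it
// links to, scaled by a damping factor (commonly quoted as 0.85).
function distributePR(pagePR, outboundLinks) {
  var d = 0.85; // damping factor (an assumption, per the usual simplified model)
  var share = (d * pagePR) / outboundLinks.length;
  var passed = {};
  for (var i = 0; i < outboundLinks.length; i++) {
    passed[outboundLinks[i]] = share;
  }
  return passed;
}

// A homepage with PR 5 linking to three sub-pages: each sub-page
// receives 0.85 * 5 / 3, roughly 1.42, of "link juice".
var passed = distributePR(5, ["about.htm", "services.htm", "contact.htm"]);
console.log(passed);
```

The point of the sketch: the fewer links a page spreads its rank across, and the more pages feeding rank into a sub-page, the stronger that sub-page becomes, which is why a homepage link plus external links works better than either alone.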

What about you? Can you think of any other tips or ideas for getting your sub-pages PageRank higher? Tell us about them below by commenting!

SEO, JavaScript and AJAX

Javascript: Friend or Foe?

If you frequent different SEO forums you may have received mixed signals on things like Flash or JavaScript. You may be wondering if it’s safe to use these and if so, what the impact will be. In this article, I attempt to address your concerns over JavaScript.

A brief history of JavaScript

In 1995, Netscape developers realized they needed an easy way to make Java applets more accessible to non-Java programmers and web designers. While it was plagued in its early days with problems and errors that were hard to diagnose, this simple scripting language has hung on and has only grown more popular over time.

Because of its cross-browser compatibility (most modern browsers now support JavaScript) and its relative ease of use and implementation, JavaScript has become popular among designers looking to give their websites a more dynamic edge.

So, is JavaScript bad?

In a nutshell, no. If it is used properly, and some basic rules are followed, then JavaScript is perfectly acceptable.

The biggest flaw in many sites that use JavaScript is that the developer embeds the navigation inside JavaScript menus, which renders the links invisible to search engine crawlers, so the crawler won't follow those links.

But if you keep the navigation out of the JavaScript, it becomes a very powerful scripting tool for achieving effects that HTML alone cannot.

JavaScript is also handy in helping you reduce code bloat.

Code bloat is what we call it when a regular HTML file approaches the file size limits imposed by the search engines. While there is debate on the actual size, Google has come out and said that it can easily index files up to about 100 kB (kilobytes) but may have problems with files which are larger.

Sometimes, there are aspects within a file that must exist for that page to render properly. And sometimes JavaScript can be used to help maintain that functionality but also help reduce the actual file size.

But before I get to that, let us start with the basics.

The basics – externalizing JavaScript

Let's say your website has some JavaScript in it. What should you do?

Well, usually we recommend removing the script and referencing it externally, especially if there are many lines of JavaScript. If you've only got three or four lines throughout the page, I wouldn't worry about externalizing it; such a small amount won't matter one way or the other.

Externalizing the script is quite simple: look in the code for the opening script tag, and copy everything between the script tags into a Notepad file. Then save your file as something.js, where "something" is the filename you want and ".js" is the file extension.

Note: if you are using Notepad, be sure to change the file type (the lower drop-down box in the "save as" dialogue) to "all files" so that Notepad doesn't add .txt to the end of your filename. If you don't do this, the file will be saved as something.js.txt and won't function properly.

Once you have the file properly saved, upload it to your webserver. Now you are ready to reference it using the following code:

<script language="JavaScript" type="text/javascript" src="something.js"></script>

What this does is refer to the external something.js file and run its contents as if they resided within the HTML.

As you may already guess, this can help remove hundreds or even thousands of lines of JavaScript, reducing the file size substantially.
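To make the process concrete, here is a hypothetical before-and-after; the showDate function is made up purely for illustration, and the "after" page assumes that function has been saved into something.js as described above:

```html
<!-- Before: the script is embedded directly in the HTML page -->
<script type="text/javascript">
function showDate() {
  document.getElementById("today").innerHTML = new Date().toDateString();
}
</script>

<!-- After: the same function lives in something.js, and the page
     carries only this one-line external reference -->
<script type="text/javascript" src="something.js"></script>
```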

I remember one site I worked on where I was able to externalize over 1,100 lines of JavaScript into 4 external files. The result was much smaller pages that also loaded faster.

So what are the disadvantages of externalizing JavaScript?

As with all great ideas, this one has a catch.

As you know, search engine crawlers will not execute the JavaScript; that's why we went to all this trouble. But a visitor whose browser doesn't execute JavaScript won't see what is "hidden" in the script either. So if an essential navigation element lives inside that JavaScript, a script-less visitor likely won't be able to navigate your site.
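One common workaround, sketched below, is to pair the script menu with a plain-HTML fallback inside a noscript block, so crawlers and script-less visitors still get followable links. The file name menu.js and the page names here are hypothetical:

```html
<!-- A script-driven menu with a plain-HTML fallback -->
<script type="text/javascript" src="menu.js"></script>
<noscript>
  <!-- Crawlers and visitors without JavaScript still get real links -->
  <a href="index.htm">Home</a>
  <a href="articles.htm">Articles</a>
  <a href="contact.htm">Contact</a>
</noscript>
```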

However, most browsers today do support JavaScript. And while it is something that can be turned off, too many other sites rely on scripts, however simple, to help render their pages; virtually every major web portal uses some JavaScript. So I don't think you have too much to worry about.

However, if you are one of those people who must have your site as compatible as possible with as many browsers as possible, then I would recommend not using the above tactics.

On the other hand, if you are concerned more with supporting 99.9% of your site's visitors, and are willing to sacrifice the 0.1% of visitors who may not support JavaScript in exchange for improved search engine rankings, then perhaps this is one tactic you can employ.

Now before you go thinking that this is a magic bullet that will shoot you to the top of the rankings, think again.

This is only one of many in an arsenal of tactics available to help your site improve its rankings. There are many other things which are more valuable to you in terms of rankings (such as effective content and good link building). But, if you are looking to squeeze every bit of performance out of your site as possible, and looking for absolutely everything which could have a positive impact on your rankings, then you should consider externalizing any non-essential code such as JavaScript.


AJAX, for those who haven't heard the term, is a newer way of bringing software-like usability to the web. It uses existing technologies but pulls them together to create powerful online applications. The most famous examples of AJAX, or "Web 2.0", are Google's Suggest feature and Google Maps.

In a nutshell, it's a means of using asynchronous JavaScript and XML to query a remote data source and render the content without refreshing or reloading the page.

The danger of this, from a search engine optimisation point of view, is that we do away with unique URLs and have few options to manipulate TITLE tags and page headers, the main requirements for an effective SEO campaign.

As much as SEO is dismissed as secondary to design, a website is a tool for business and should, first and foremost, reach its market.

How to search engine optimise Ajax

The technique I wish to propose contains two main rules and will fundamentally change the way the AJAX application is constructed.

Rule #1

The initial load of the AJAX application must contain the optimised elements such as TITLE and headers and also must be reachable from a fixed address.

Rule #1 is all about using the following two techniques.

* Server side script
* .htaccess mod_rewrite
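As a rough sketch of how those two techniques combine: the .htaccess rule below maps a fixed, crawlable address to a hypothetical server-side script that renders the optimised initial load. The script name product.php and the URL pattern are assumptions for illustration, not part of any standard.

```apache
# Give each AJAX "view" a fixed address a crawler can reach.
# product.php is a hypothetical server-side script that outputs the
# optimised TITLE, headers and initial content for that product.
RewriteEngine On
RewriteRule ^products/([0-9]+)/[a-z0-9-]+\.htm$ product.php?id=$1 [L]
```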

Promotion and marketing of AJAX

This is the second important rule that must be followed to get a great listing for the products or resources within an AJAX application.

Rule #2

As with all websites, we need keyphrase-descriptive inbound links that point to each particular resource, not just the home page.

Monday, March 10, 2008


What is the Sandbox?

The Sandbox Effect is a theory used to explain certain behaviours observed in the Google search engine: that websites with newly registered domains, or domains with frequent ownership or nameserver changes, are placed in a "sandbox" (holding area) within Google's index until it is deemed appropriate for them to rank. Webmasters have claimed that such sites will only show up for keywords that are not competitive. The effect does not appear to touch new pages unless the domain itself is in the sandbox.

There are many different opinions about it, including the view that the Sandbox Effect doesn't actually exist and that the search ranking behaviour can be explained as a result of a mathematical algorithm, rather than a decided policy.

Those who believe the sandbox exists observe that it can sometimes take up to a year or longer for a website to be promoted from the Google sandbox, while those who do not believe in a sandbox explain this duration as simply the time it takes for Google to calculate PageRank using an "eigenpairs interpretation of nodes".

How will you know that you're sandboxed by Google?

There are some signs that you might be sandboxed:
1. A drop in the number of visitors referred by google.com.
2. A sudden drop in the Google PageRank of all of the website's pages.
3. When querying Google on your specific keywords, your website appears only in the last 2-3 pages of the search results.
4. Your website is banned from Google's listings.

If you wish to check whether this applies to you, try the following methods.

Method I:
Use this web address to check the indexing of your website's pages against a specific keyword: http://www.searchenginegenie.com/sandbox-checker.htm

Method II:
Open your web browser and go to http://www.google.com

Then type in the search box:
www.yourwebsite.com -asdf -asdf -asdf -fdsa -sadf -fdas -asdf -fdas -fasd -asdf -asdf -asdf -fdsa -asdf -asdf -asdf -asdf -asdf -asdf -asdf

If your website appears in these search results with good keyword ranking (even though it doesn't in a normal search), then your website is in Google's sandbox.

Method III:
Open your web browser, go to http://www.google.com
and type: site:www.yourwebsite.com

If no results are found, then your website is out of Google's indexing database altogether.

How to get out of Google Sandbox - part I

Emergency steps - do these first

1. Check whether your website uses meta-refresh or JavaScript redirects. For example:
<meta http-equiv="refresh" content="5; url=http://www.website.com/filename.php">
If so, remove them, because Google's bot treats them as spam.

2. Check that websites linking to you return HTTP response code 200 OK.
2.1 In the Google search box, type: allinurl:http://www.yoursite.com

2.2 Check every website other than yours with an HTTP header checker and look for response code 200 OK.

2.3 If any of them return a 302 response code, try to contact the administrator of the problematic website to fix the problem. If you think they are stealing your PageRank, report them via Google's spam-report page with the "Deceptive redirects" box checked.

A few more advanced tips for getting out of the sandbox

- Redirect your website's www URLs to non-www URLs.

- Keep your website structure no deeper than three levels (i.e. don't bury content more than three links away from the homepage; deeper than that, the crawler/spider may stop following).

- Use .htaccess to convert dynamic page URLs into static-looking ones.

- Rewrite all your meta tags, and explicitly mark the pages that must not be indexed.

- Put a slight delay on the crawling machines; this is especially important if your hosting server doesn't have much bandwidth.

In your robots.txt file put:

User-agent: *
Crawl-delay: 20

You can also adjust the crawl-delay time.

- Remove duplicate or invalid pages from your website that are still in Google's index/cache.

First, make a list of all the invalid pages. Then go to Google's page for urgent URL removal requests.

I hope that by following these steps your website will be re-indexed soon.
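Going back to the first of the advanced tips, the www-to-non-www redirect can be sketched in .htaccess. This assumes Apache with mod_rewrite enabled; replace yoursite.com with your own domain:

```apache
# 301-redirect all www URLs to the non-www host, so only one
# version of each page gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://yoursite.com/$1 [R=301,L]
```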

Link Building Tips

Every one of us wants to rank high in Google, and to get good SERPs you must focus on building backlinks. But backlink building has to be done with great care, so as not to get penalized.

Here are some simple tips which will definitely help:

1. Don't buy (or sell) sitewide links for SERPs. Buy links from individual pages*.

2. Don't look for high-PR links only - start your campaign for a new site with PR 0-1 and slowly move up each week as you see your rankings improve.

3. Control your link placement speed. Remember Google's "too many links at once" filter.

4. Control the number of outbound links on your pages and on the pages where you place your links; avoid link farms and links to gambling, pills, etc.

5. A link from a relevant website is excellent, but a link from a quality site in another niche or another language can also boost your ranking - so don't ignore such links.

6. Make links look natural. Use different link anchors, and promote not only your main page but your deep pages too.

7. Don't do reciprocal link exchanges.

* Links from individual pages are far more effective for improving SERPs (Search Engine Ranking Positions), as opposed to PR. They look more natural and organic; sitewide links can easily be filtered.
Selling sitewide links also means losing a lot of money: if your site has 1,000 pages at PR 2-3, you could earn hundreds of dollars a month just from selling ONE outbound link per page, compared to some $20-30 for selling a single sitewide link.

Just remember that your site's ranking depends on your efforts and skills; it doesn't depend on your site's PR alone.

Sunday, March 9, 2008

SE-Friendly URLs

Search-engine-friendly URLs are another essential piece. Many of us run dynamic sites, meaning sites whose content changes daily, so a URL might look something like this (the script and parameter names here are just an illustration):

http://www.yoursite.com/index.php?sec=xyz&id=123&page=90

Google can cache links like this, but simplifying them, or making them look like static pages, helps a lot with caching.

The link above can be converted into the following form:
http://www.yoursite.com/xyz/123/90/<title of page>.htm

This link is much friendlier than the old one.
Converting links this way is called mod-rewriting, and I will provide some links where you can get more explanation of it.
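As a sketch of the mod-rewrite idea, assuming Apache with mod_rewrite and a hypothetical index.php script taking the numeric parameters, a rule like this would map the friendly form back to the dynamic script:

```apache
# Map /xyz/123/90/some-page-title.htm to the underlying dynamic script.
# The parameter names (id, page) are hypothetical; adjust the pattern
# to match your own site's URLs.
RewriteEngine On
RewriteRule ^xyz/([0-9]+)/([0-9]+)/[^/]+\.htm$ index.php?id=$1&page=$2 [L]
```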


If you want more links, please do reply and I will post more.


60 Forums for Do Follow

Do-follow forums are forums where your signature links pass PageRank: you can register, start posting, and put your link in your signature. I found this list somewhere else, but I'm posting it with corrected, proper links.

1. http://www.phpbbstyles.com
2. http://www.sitepoint.com/forums
3. http://www.thefreead.net/forums/
3. http://acapella.harmony-central.com/forums
4. http://forums.seroundtable.com
5. http://www.submitexpress.com/bbs
6. http://www.startups.co.uk
7. http://www.v7n.com/forums
8. http://www.webmaster-talk.com
9. http://forums.comicbookresources.com
10. http://bzimage.org/
11. http://www.clicks.ws
12. http://www.earnersforum.com/forums/
13. http://www.acorndomains.co.uk
14. http://forums.onlinebookclub.org
15. http://www.davidcastle.org/BB
16. http://6pins.com/blog/
17. http://iq69.com/forums/
18. http://forum.hot4s.com.au/
19. http://forums.microsoft.com/MSDN/
20. http://forums.mysql.com/
21. http://forums.cnet.com/
22. http://forums.oracle.com/forums/
23. http://paymentprocessing.cc/
24. http://www.warriorforum.com/forum/
25. http://forums.webcosmo.com
26. http://forums.digitalpoint.com
27. http://forums.searchenginewatch.com
28. http://www.webmasterforumsonline.com/
29. http://www.webmastertalk.in
30. http://www.ozzu.com/
31. http://www.webmasterworld.com/
32. http://www.webmastershelp.com/
33. http://talk.iwebtool.com/
34. http://www.webmaster-forum.net/
35. http://www.dnlodge.com/
36. http://forum.vbulletinsetup.com
37. http://forums.ukwebmasterworld.com/
38. http://www.webmastertalk.co.uk/
39. http://forums.seochat.com/
40. http://www.australianwebmaster.com
41. http://www.cambodiaxp.com/
42. http://www.highrankings.com/forum/
43. http://www.searchguild.com/
44. http://www.ihelpyou.com/forums/
45. http://www.seo-guy.com/forum/
46. http://www.searchengineforums.com/
47. http://www.marketingchat.org/forum/
48. http://www.webworkshop.net
49. http://www.webdesignforum.com/
50. http://www.ukbusinessforums.co.uk
51. http://www.businessforum.net/
52. http://www.domainforums.com
53. http://www.namepros.com
54. http://www.domainstate.com
55. http://www.nameslot.com
56. http://www.dnforum.com
57. http://www.webhostingtalk.com
58. http://www.freewebspace.net
59. http://siteownersforums.com/index.php
60. http://forum.admob.com

So you can register at these forums and post with links in your signature.
If you know a good forum which is not listed here, please let me know
so I can add it.


Deep Links

Well, we all know backlinks, right? Deep links are the same as backlinks, with one difference: instead of linking back to your main page, like http://www.yoursite.com , a deep link points to an inner page of the site, like http://www.yoursite.com/fold/fold2/xyz_file.htm .
This is done so that your backlinks look natural. If everything links to the main page, all the importance goes there, but the link profile should be balanced; to Google, backlinks should look natural.
There are many free and paid deep-link directories.


Go to Google and search for "deep links" and you will find a list of deep-link directories.

So it's not strictly necessary to build deep links, but they do help.

If you want any more help, let me know.


Back Links

Google basically works on backlinks. With backlinks, many other questions arise: what is PageRank? What are deep links? And sometimes, what is a backlink in the first place?
Well, this is the most important part of SEO.

A backlink is a link to you from another site (and it must be crawlable by robots). Getting backlinks doesn't mean you must advertise on that site; you can advertise, but you should use a plain HTML link rather than an ad-format link.

And your PageRank depends on the number of links you get. You can check your PageRank at http://www.findpagerank.com
I mention this one because many other sites on the internet show fake rank and page strength.

Now, if you want to see how many sites link back to you, go to google.com and search: link:http://www.yoursite.com

So there are many ways to get backlinks: submitting to directories, social bookmarking, using forum signatures, commenting on blogs, and so on.

Just go to a social bookmarking site, submit your site, and encourage users to bookmark it or give it a thumbs-up. As far as I know, social bookmarking is the best way to get linkbacks. You can find lists of 150 social bookmarking sites. I don't mean you should register with every community; just register and submit to a few, like del.icio.us, StumbleUpon, digg.com, furl.net, spurl.net, Technorati, Newsvine, etc.

After that, submit your site to directories with proper categories, tags and descriptions.
You can find some free directories here: http://www.best-web-directories.com/free-directories.htm
If you search Google you will also find paid links, but it's not necessary to buy them; you can find many good directories for FREE.

Now for forum signatures. Register at a few good forums that are well indexed by Google, like http://forums.digitalpoint.com , and keep links to your site or blog there. This brings you visitors as well as backlinks.
The same goes for blogs: when you visit a blog, leave a good comment and, at the end, a link to your blog or site.

Still, this is not enough.
Before submitting to directories, buying links or posting on a forum, check the PR (PageRank) of the directory, and try to submit to PR 3 or greater. If you get a backlink from a PR 6 or greater, that's a quality link for you. The more quality links you get, the higher the PageRank and the better the chances of coming up in Google.

When you submit your site, don't expect it to get indexed, or its PR to increase, the same day. Remember, as far as I know, PR updates take place every two months on average.

If you want, you can buy good link packages: for example, we are selling 100 manual submissions to PR 3+ directories for just $10. We also offer 1,000 StumbleUpon thumbs-ups for just $5 per link. If you want a package, just send me a mail or reply via comment with your email address.

Suppose you submit your links to many sites and your PR gets upgraded to, let's say, PR 2. If you then stop submitting, or your old links disappear, don't expect your rank to stay the same; it may drop.
So the bottom line is: keep submitting as much as you can, which is why many SEO services charge on a monthly basis.

There are many more tips and tricks yet to post. Will surely post them soon.


150 Social Bookmarking

Social bookmarking sites are a popular way to store, classify, share and search links through the practice of folksonomy (an Internet-based information retrieval methodology consisting of collaboratively generated, open-ended labels that categorize content such as Web pages, online photographs, and Web links) techniques on the Internet.

This list was originally from www.feedbus.com/bookmarks but I've added more to it, and I encourage you to do the same.

Please note: This is not for people to submit links to their hubs or websites, strictly social bookmarking sites only.

1. Stumble Upon
2. Lensroll.com
3. del.icio.Us
4. Digg.com
5. BlinkList
6. http://www.twitter.com/
7. Technorati
8. Gather
9. Ma.gnolia.com
10. Newsvine/

11. Netscape.com
12. Blogmarks.net
13. CiteULike.org
14. Furl
15. HubPages
16. Bluedot.us
17. Clipmarks
18. Ning
19. Simpy
20. Squidoo

21. Spurl.net
22. Wink
23. MyWeb Yahoo
24. Zlitt
25. Rollyo: Roll your search engine
26. PlugIM
27. iLike
28. BibSonomy
29. Blabb
30. Bookmark-Manager

31. Backflip
32. Aboogy
33. A1-WebMarks
34. Connotea
35. Google Bookmarks
36. Kaboodle
37. MindDeposit
38. LiveFavorites
39. RawSugar
40. Rojo

41. Shadows
42. Icerocket
43. Spotback
44. Askville
45. BlinkBits
46. Band Buzzer
47. BlinkPro
48. Bookmark Buddy
49. Blogmemes
50. BookMark Commando

51. Sync2It
52. BuddyMarks
53. Bookmark Magic

61. del.lirio.us
62. Social Annotation
63. Easybookmarks
64. Dogear
65. Dohat
66. freelink.org/
67. Frassle
68. Finety
69. Complore
70. GlobusPort

71. fungow
72. Guicookies
73. GetBoo
74. Humdigg
75. Dude Check This Out
76. hamiltontechgroup.com
77. i89.us
78. HyperLinkomatic
79. iFaves
80. ikeepbookmarks.com/

81. Jots
82. kmfavorites.com/
83. Linkroll
84. links2go - Targetted Directory
85. favmark
86. favoor
87. LinkGogo Society
88. Linksnarf
89. Lookmarks
90. Markaboo

91. Link2Mark
92. Mobleo.net
93. murl.com/
94. Lounge
95. memFrag
96. WaveRight
97. myHq
98. My BookMarks
99. myhotlist.com
100. My Bookmark Manager

101. linkatopia.com
102. mywebdesktop.net
103. My Link Vault
104. Netvouz
105. OnlineBookmark
106. PeerMark
107. openBM
108. Philippine Bookmarks
109. Powermarks
110. Riffs

111. Plug Web Edition
112. SearchFox
113. Shoppers Base
114. Shakk.us
115. Searchles
116. Share Your Links
117. Scuttle
118. SiteJot
119. Smarking
120. SiteBar

121. StartAid
122. Sports slister
123. socialbookmarking.org
124. an.geli.ca
125. Smelis.com
126. URL Blaze
127. Unalog
128. URLex
129. Taggly
130. SyncOne

131. Whitelinks.com
132. Wists
133. What link
134. Weeb-Feeds
135. womcat.org
136. World Wide Wisdom
137. wobblog.com
138. wURLdBook
139. yourMarks
140. zurpy

141. Zoogim.com
142. lilisto
143. Spotplex.com
144. SpotBack
145. CoRank
146. Openserving
147. TagTooga
148. 30DayTags
149. plime
150. Sphere

151. Mog
152. Vimeo
153. NewsWeight
154. Reddit
155. CrowdFound
156. Starton
157. GaddiPosh

If you have any suggestions or a site to be added, reply here or mail me.



Robots.txt

Robots.txt is a file kept in the main folder of your site, e.g. http://www.yoursite.com/robots.txt
The file is used for blocking or allowing search engines' and robots' access to any file or folder. You can find more info about it at http://www.robotstxt.org
Still, for the sake of readers, I will explain it in more detail.
Use any text editor like Notepad, save the file as robots.txt, and write rules into it as follows.

User-agent: *
Allow: /
This allows all user agents to access all files and folders.
"User-agent: *" means the rules apply to all robots and user agents; instead of * you can name a specific crawler's user agent, such as Googlebot or Slurp.

The next thing is to choose whether to allow or disallow that user agent's access. Our example above allows all files.
Here is an example which blocks access to the /search folder:
User-agent: *
Disallow: /search

Keeping this file is not compulsory, but keep it anyway for the sake of well-behaved bots (robots).

The next part of this post is blocking a link by another method: when you have a link on your site, you can tell bots and robots not to follow it.
e.g. <a href="http://www.google.com/webmaster/" rel="nofollow">Test link</a>

The rel="nofollow" attribute tells robots not to follow the link; this is another way to restrict bots. Using the attribute doesn't mean a bot can't access the link at all: it can still fetch the target, but it won't follow the link for ranking purposes or necessarily cache it.

If you have any questions, do post them. Or just post a thanks, so I can gauge how interested you are.


Site Maps

What are Site Maps ?
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.

As said above, a sitemap is an XML file stored on your site, so I will show you how to make well-formed sitemaps. The file must be UTF-8 encoded and use XML tags.

SiteMap File Sample

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
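The sample above shows only the opening urlset tag. A complete minimal sitemap file looks like this; the URL, date and values are placeholders to replace with your own (the loc, lastmod, changefreq and priority tags are part of the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2008-03-09</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Add one url block per page you want the engines to know about.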

If you want more details about sitemaps and the protocol, see http://www.sitemaps.org

With the help of this, just make the XML file and upload it to your site's folder.
Then, using your Google account, you can submit it via Google Webmaster Tools.
After submitting, Google will download and crawl it and report any errors, such as invalid or missing files.

In the same way, you can submit the XML file to other engines like Yahoo and MSN.

Remember, if you want to block some links from indexing, see our robots.txt post.

You can also create ready-made sitemaps using online generator tools.

Now, if you want to see which pages Google has indexed, search Google for site:www.yoursite.com ; this way you can see the pages Google has cached.

If you have any questions about submitting sitemaps or about errors, you can post them here.
Also check out our older posts for other SEO tips and topics.



Saturday, March 8, 2008

Free SEO Tools

We are posting here a few links to SEO tools. Remember, these are simply tools I recommend; I am not a reseller or anything like that. I haven't listed this section in the index, but I am still posting it for your sake.

If you find them useful, do be sure to reply or comment on them.

PageRank Checker

http://www.iwebtool.com/search_listings_preview = Creates a preview of how your site's listing will look on Google, MSN and Yahoo.

http://www.iwebtool.com/pagerank_prediction = This predictor tool does what it says: it predicts your future Google PageRank.

http://www.iwebtool.com/link_popularity = Link popularity checker.

http://www.iwebtool.com/search_engine_position = Search engine position checker.

More tools will be posted on their proper page.

We are offering 100 directory link submissions at $8. All of them are done manually.
If you are interested, don't forget to mail me.



Title Tags

Google gives more importance to title tags than to meta tags, so we must also keep our keywords in the title. Here is an example: Mobile Videos
Here the user has put the keywords into the title as well. This helps the site very much to rank for those keywords.
If your site is dynamic, then generate dynamic title tags so that each page shows a different title. As in the example above, you can check that forum: each section shows different title tags (and meta tags too), so each page can come up for its proper keywords instead of using the same tags on all pages.
Also, when Google shows its list of results, it shows the title as the link to your site, so use a title that really defines your site.
The next thing Google looks at is the first paragraph of your page, where it tries to find your keywords. We will see that in the next post.
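To tie the title-tag advice together, here is a small illustrative sketch; the site and the keywords are made up for the example:

```html
<head>
  <!-- Hypothetical example: the page's main keywords appear in the title -->
  <title>Mobile Videos - Download Free Mobile Video Clips</title>
</head>
```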



Meta Tags

Meta tags are "data about data": they define things about the page, like keywords, description, encoding, etc. But nowadays many search engines, like Google, don't make much use of meta tags; still, many of us define them for the other search engines. There are lots of sites offering online meta-tag analyzers, which check whether your tags are OK, but I don't think they're much needed; we can check them manually. We write these tags in the head section of the page. Here are a few tags to include, with details.

This line defines the page encoding, or page type:
<META http-equiv="Content-Type" content="text/html;charset=ISO-8859-1">

This tag holds the keywords for the page, by which a search engine can find it.
The following tags are from my blog; think up good keywords for your own site and use them.
<meta name="Keywords" content="free seo,learn seo,seo for free,seo,seo tips, tips for seo">
The description tag describes the page or site, so that search engines can show this information to users in brief when they search:
<meta name="Description" content="Blog for learning SEO, helping with SEO-related problems. We also make SEO easy to learn. SEO is covered in all its aspects here.">

There are also meta tags to control whether the page is cached by the browser, but those are not useful for us here. Basically, Google doesn't use meta tags much; it gives importance to the title tag and the first paragraph of the page. So we will also look at those, and at how to maintain them.



Introduction to SEO

SEO means Search Engine Optimization. Making your blog or site search-engine friendly, so that it can be indexed properly, is what SEO is about. There are many other aspects that come under SEO, like backlinks, meta tags, and title-tag improvement. We will cover them all step by step, and you will also find programming topics for SEO here, like URL rewriting.
SEO is not an easy task; it takes time and hard work. Many people still haven't understood SEO fully. One day I was one of them, but slowly I learned it all. Here are a few things we will cover on this blog:
1. Meta Tags
2. Title Tags
3. Context for search engine (how to make friendly).
4. Site Maps
5. Robots.txt
6. Back Links
7. Deep Links
8. Search Engine Friendly URL's
9. Submitting Sites
10. Links to some awesome forums and sites

If you have questions or difficulties along the way, be sure to ask by posting.
Please don't reply with your URL just to get backlinks; such posts will be deleted.
Please make sure to note that.

