Sunday, 16 December 2012

About AdSense for your blog

Blogger provides a simple way for you to make money with your blog. AdSense is Google's content-targeted advertising program. If you use AdSense, you don't have to select keywords or categories for your ads. Instead, Google's servers determine what your posts are about and display the most relevant ads to your readers. For example, if you blog about baseball, you might see ads for “Major League Baseball memorabilia” next to your post. If you blog about painting, you might see ads for “Art Supplies.”
Blogger requests access to your AdSense account so that we can automatically create and place ad code on your page through our layouts and template tools. As a result, you might notice that Blogger is receiving "0%" of your AdSense earnings; this means that you will receive the same amount for clicks or impressions as you would by creating the ad code from your AdSense account.
To view any partners that have requested access to your AdSense account, and their associated revenue shares, please log in to your AdSense account, click the Home tab, choose the Account settings sub-tab and go to “3rd-party access.” If you see an "enable access" link next to blogger.com, you must click this link in order to create ads with Blogger tools.
To take full advantage of other AdSense options and settings, such as managing the types of ads that appear on your blog, you can sign in to the AdSense site and have a look around. That's where you can see how much money your ads are earning and manage your account.

Troubleshooting

Could not retrieve earnings stats
This error message in Blogger can occur when third-party access has been disabled in your AdSense account. To re-enable Blogger's access to your AdSense account, log in to your AdSense account and click the Account Settings link. You'll see an "enable access" link next to blogger.com; you must click this link in order to create ads with Blogger tools. You will then be able to log back into blogger.com and continue setting up AdSense from the Earnings tab.
AdSense settings

How to use AdSense with CMS

We've provided some useful links for implementing AdSense on several common web publishing platforms. Note that some content management systems (CMS) use third-party plugins or add-ons to insert AdSense ads on the webpages generated by the system. Some do not.
CMS sites that do not require plugins:
  • Blogger
  • Google Sites
  • Hubpages.com
  • Xomba
  • Weebly
  • Yola
CMS sites that require plugins:
The following content management systems use third-party plugins or add-ons to insert AdSense ads on the webpages generated by the system. Google does not endorse or support these plugins. However, you can use these plugins/add-ons as long as the ad code displayed on your webpage remains unchanged and is not altered by the plugin in any way.
  • WordPress (self-hosted)
  • Drupal
  • MediaWiki
  • Moodle
  • Pligg
If you need assistance with a plugin, please visit the official support forum of your service or the developer of the plugin. Please note that Google is not responsible for malicious third-party plugins.

Use my linked AdSense account to show ads on my own site

If you signed up for AdSense through a host partner and you’d like to show ads on your own non-host-partner website, you’ll need to provide us with the URL of the site you want to monetize. You can do this via a one-time application form. Here’s how:
  1. Sign in to your AdSense account.
  2. On the Home tab, visit the Account settings page.
  3. In the "Access and authorization" section, next to "Only host sites are allowed to show ads for your account," click edit.
  4. On the “Show ads on other websites” page that appears, enter the URL of the site where you plan to show ads.
  5. Click Submit.
  6. Finally, implement the AdSense ad code on the URL that you provided above, on a page that receives traffic.
  7. Once your request is approved, you’re welcome to place your ad code on any website that you own without any further approvals. If your application is not approved, be assured that ad serving on your host partner site will not be affected. You can continue to monetize your hosted content as normal.

Enable AdSense on your Google Site

To add AdSense to your Google Site, just follow these steps:
  1. Click the More actions drop-down menu, and select Manage site.
  2. Click Monetize on the left side of the page.
  3. Click the Monetize this site button.
  4. Create a new AdSense account or use an existing account. Follow the instructions, depending on the radio button you select.
Once you've enabled AdSense, go to any of your site's pages, click Insert, and select AdSense to insert ads within your page. You can also choose to have ads displayed in your global sidebar.

Best and Trusted PTC Sites

Paid To Click is an online business model that draws online traffic from people aiming to earn money from home. Paid-To-Click, or simply PTC, websites act as middlemen between advertisers and consumers: the advertiser pays to display ads on the PTC website, and part of this payment goes to the viewer when he views the advertisement. There are hundreds of PTC sites, but roughly 95 percent are scams that do not pay you. Only a few trusted PTC sites really pay. Below are the best PTC sites that pay, and these are the ones worth joining; the rest will only waste your time. The trusted PTC sites are listed below.

1. Neobux
As a member you can earn simply by viewing the advertisements Neobux displays.
- Effortless income
- Earn from home
- Guaranteed ads daily
- Detailed statistics
- Upgrade opportunities
- A dedicated community
- AdPrize + Offers
Payza and PayPal withdrawals, $2 minimum.

Click here to sign up for free on Neobux.

2. ClixSense

• Browser Toolbar With Instant Notification of New Ads Available
• Earn Up To $0.02 Per PTC Ad Click and Up To $0.008 Per Referral Click
• Earn Even More With Our Affiliate Program
• Payments Via Check, Payza, PayPal and Liberty Reserve
• Win ClixGrid Daily Prizes and Participate in Weekly Contests
• Unlimited Direct Referrals and Guaranteed PTC Ads Daily

Click here to sign up for free on ClixSense.

3. Clicksia
  • Get Paid to Visit Websites
  • Get Paid to Complete Offers
  • Traffic Exchange
  • Get Paid To Promote Clicksia
  • Up to 100% Referral Earnings
  • $1.00 Minimum Payout
  • Fast payments through PayPal or AlertPay

Click here to sign up for free on Clicksia.

4. Incentria
  • Get paid to visit websites.
  • Get paid to complete simple offers.
  • Exchange Traffic with other members for free.
  • Promote Incentria and get up to $0.20 CPM.
  • Request a payment at $1.
  • Fast payments through PayPal or AlertPay
  • Earn up to 100% from your referrals.
  • Online over 2 years!

Click here to sign up for free on Incentria.

Earn money with Incentria

Click here to sign up for free.

At Incentria you can earn an extra stream of income by completing simple tasks. Some advertisers will pay you real cash simply to view their websites, and others will pay you even more to join their various programs!

Time is money, and to turn your time into money most effectively, you need a strategy. One important part of a good strategy is making sure that any get-paid-to website you join has the qualities needed for effective earning. They are confident that you'll find Incentria sets the standard for effective earning!
They have many years of experience operating and participating in successful get-paid-to programs; as such, Incentria has a reputation to keep and will go out of its way to keep members happy. That means reliable payments, quick support, and excellent website reliability and uptime.

Are you ready to start making money online with little effort? Incentria is a get-paid-to website that you can trust. It has been around for a while, and it actually pays! Don't waste your time with scam sites; join Incentria today and start earning real cash now!


Earnings per ad:      $0.001 – $0.005
Cashout minimum:      $1
Processing time:      7 days
Payout methods:       PayPal, AlertPay
Ads per day:          will vary
Referrals:            direct, 1 level (unlimited)
Referral click share: 10%
Membership:           Free, Elite

Click here to sign up for free.

How do I earn money with ClixSense

Click here to sign up for free.

Hi guys, as everyone knows, ClixSense is the number one PTC site worldwide, and it pays more than any other PTC site. But how can you earn so much from it? Here I will share some tips for earning a living from ClixSense; yes, a LIVING. Some ClixSense users make more than $500 per month just by completing tasks.
Earning from ClixSense is not easy; you need to dedicate your full attention to it to earn seriously. Below are tips that should at least get you to the payment threshold within a month. Following these steps and mastering them can earn you more and more money. The main key is also upgrading your account from standard to premium membership: upgrading can roughly double your earnings, so if you earn $80+ per month, you could reach $150+.
These steps are not all easy. Some, like clicking the ads or the ClixGrid, are simple, but others are more difficult; doing them, however, can earn you good money.

So here are the steps:

*  Click all ads daily, without fail. There is another way to earn from ad clicks: log in to your account whenever you are free and check whether new ads are available. Some of you may think that once ads are clicked, they can only be clicked again after 24 hours, but on ClixSense you "get paid every 30 seconds". So try to log in and check the ads whenever you can; it adds a little to your earnings. With the toolbar from ClixSense itself you can get notifications of new ads, like the "AdAlert" from Neobux (ever tried it? it gives the best results).

*  Click the ClixGrid ads every day. It can win you between $0.10 and $5.00. New users may find it difficult to win, but hopefully you will win something.

* Complete the available tasks. This is the main way to earn more from ClixSense. Sponsored by CrowdFlower, ClixSense provides many tasks for users from all countries, but the US gets the most. You can access the tasks the US gets by using a US VPN server, which you can download from here. Try to complete more tasks; the ones labeled 'Google search...' pay $0.21 per completion, so completing 20 a day earns $0.21 × 20 = $4.20 per day. You can also complete others paying $0.14, $0.07, etc.
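
The task arithmetic above is easy to check with a quick sketch. The $0.21 rate is the one quoted above; the 20 tasks per day and the 30-day month are my own example figures, computed in cents to avoid floating-point drift:

```python
# Hypothetical earnings from the "Google search..." tasks described above.
TASK_RATE_CENTS = 21   # $0.21 per completed task (rate quoted above)
TASKS_PER_DAY = 20     # assumption: completions per day
DAYS_PER_MONTH = 30    # assumption: days worked per month

daily_cents = TASK_RATE_CENTS * TASKS_PER_DAY
monthly_cents = daily_cents * DAYS_PER_MONTH

print(f"${daily_cents / 100:.2f} per day")      # $4.20 per day
print(f"${monthly_cents / 100:.2f} per month")  # $126.00 per month
```

At those assumed rates, tasks alone would clear the payment threshold well within a month.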

Click here to sign up for free.

How to earn money with Profitclicking.com

Click here to sign up for free and get a $10 bonus on joining.

Get your free $10 bonus for a jump start to fast income.
It's simple and fun! Start earning today: collect your free $10 and start clicking now. Paid Daily • Huge Referral Bonuses • Daily Withdrawals.
Accepted payment methods: Payza, EgoPay, SolidTrustPay, Liberty Reserve, etc.
 Create Your Free Account
Collect Your Complimentary Traffic Package
Add Additional Traffic Packages to Maximize Your Earnings (Optional)
Earn 2% Weekdays/1% Weekends on Money Spent to Purchase Traffic Packages
Set your account to “auto repurchase”
Earn Daily Cash for 88 Days!
Over the next 88 days, your traffic packages will produce 150% in total earnings, effortlessly! After 4 traffic packages expire at the end of the 88 days, you will receive 1 Matrix Package. Over time, each Matrix Package will earn an additional $60, for 300% in total earnings overall.
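
The 150% figure can be sanity-checked with a short simulation. This is my own sketch; it assumes the 2% weekday / 1% weekend rates quoted above and an 88-day term starting on a Monday, so the exact total shifts slightly with the start day:

```python
# Accrual on one traffic package: 2% on weekdays, 1% on weekends.
TERM_DAYS = 88

total_pct = 0
for day in range(TERM_DAYS):
    is_weekday = day % 7 < 5   # days 0-4 of each week are Mon-Fri (Monday start)
    total_pct += 2 if is_weekday else 1

print(total_pct)  # 152, close to the advertised 150%
```
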
The best thing about ProfitClicking is that you get a free $10 bonus in your account for joining, and joining is absolutely free of cost.
Click the link below to create a free account on ProfitClicking and get your $10 bonus.
Click here to sign up for free and get your $10 bonus.

Saturday, 15 December 2012

Clicksia: Make Profit Easy, Quality Advertising and Fast Payouts

Clicksia is one of the PTC program providers. Clicksia offers the following services:


# Make Profit Easy
Get paid to do a handful of simple activities such as clicking links, reading ads and signing up for websites. We pay you competitively for each of these things which means maximum profit for your time!

# Quality Advertising
We strive to offer our advertisers the best of the best advertising methods! With our specialised anti-cheat system and a large memberbase, you can be sure that with Clicksia you get the most out of your advertising dollars.

# Fast Payouts
At Clicksia, you not only earn money very rapidly, but you also get paid fast! You can withdraw your earnings at only $1 with absolute minimum fees. We offer payouts for both Alertpay and Paypal users.

# Global Community
Clicksia is a global business providing service to 61,337 members in over 190 different countries. It doesn't matter if you live in The United States, Brazil, or India: you can start earning with Clicksia today!


Like other PTC programs, being a member of Clicksia has 2 kinds of benefits: as an affiliate and as an advertiser.

1. Earning Opportunity: Affiliate Benefits
Get paid for completing simple tasks online!

Here in the world of Clicksia, you can earn an extra stream of income by completing simple tasks. We have advertisers that will pay you real cash simply to view their websites, and other advertisers will pay you even more to join their various programs!

Affiliate Benefits

* Paid to Click
* Paid to Sign Up
* Traffic Exchange
* PTP $0.10 CPM
* $1.00 Minimum Payout
* 10% Downline Earnings

2. Quality Advertising: Advertiser Benefits
Increase your traffic! Increase your sales!

Advertiser Benefits:

* Cheap Advertising!
* 24 Hours Unique Hits
* Free Live Stats
* Many Ways To Advertise
* Automated Transactions
* Try It Out Today!

Click here to sign up for free on Clicksia.

Fast Cash Mega Investment Plan

DAILY PROFIT SYSTEM

The daily profit system is based on daily earnings and cycler spots. Basically, you buy units that pay 3% daily for 60 days (180% total). I know there are sites that claim to pay 10% or 15% daily, but I think we all know that's impossible long term. We prefer to keep it sustainable with a real system that works for everyone, short term and long term.

How do we manage to pay the 3% daily, you may ask. Well, first of all, we generate income by trading forex. We have been trading forex for over 7 years now and use it for other sites we run too (like Hybrid Cash System and Passive Income Pool). Next to the forex income, we have a system in place that enables us to keep this running. Here's how it works:

You buy units for $10 per unit which pay 3% daily for 60 days. When you cash out there will be 30% reserved in your repurchase balance and 70% paid out. With your repurchase balance you can only buy cycler spots. Since cycler spots start at $10 (the $10 line) you'll need at least $10 in your repurchase balance to buy a cycler spot. You can also let your repurchase balance grow till you have enough to buy a $20, $40, $80 spot etc.
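
As a quick illustration of the rules above, here is a sketch in integer cents; the figures follow directly from the stated 3% daily rate, 60-day term, and 70/30 cashout split:

```python
UNIT_COST_CENTS = 1000   # one $10 unit
DAILY_RATE_PCT = 3       # pays 3% daily
TERM_DAYS = 60           # for 60 days (180% total)

total_cents = UNIT_COST_CENTS * DAILY_RATE_PCT * TERM_DAYS // 100
paid_out_cents = total_cents * 70 // 100         # 70% paid out on cashout
repurchase_cents = total_cents - paid_out_cents  # 30% reserved for cycler spots

print(total_cents, paid_out_cents, repurchase_cents)  # 1800 1260 540
```

So a $10 unit returns $18.00 over the term: $12.60 paid out and $5.40 reserved in the repurchase balance, which is why roughly two units' worth of cashouts are needed before the repurchase balance reaches the $10 cost of a cycler spot.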

Wednesday, 12 December 2012

EARN OVER $200 US DOLLARS A MONTH

Click here to sign up for free.

ProTypers is a conglomerate of data entry specialists. We work primarily on converting scanned documents from image-to-text for institutions in North America and Europe. We also offer our services to Neural Network Text Imaging developers and provide CHALLENGE IMAGE decoding for the visually impaired (blind).

We’re currently hiring data entry personnel from all over the world. The only requirement to work for us is to have a computer, an internet connection and the ability to type over 30 Words per Minute. You decide when to work and for how long. The faster you type, the more money you earn.

ProTypers.com is ideal for:

  • Mothers that stay at home.
  • Parents that need a second job.
  • Students.
  • People in between jobs.

How much you earn depends on how much you work. Our top typers earn between $100 and $250 each month! Our rates start at $0.50 per 1,000 words typed and can go as high as $1.50 per 1,000 words typed. We pay through debit cards, bank checks, PayPal, WebMoney, Liberty Reserve, and Western Union. The best payment option is Liberty Reserve: once you have earned $3, you can withdraw through Liberty Reserve every Monday.
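
As a rough sketch of how typing speed translates into pay: the rate band is the one quoted above, while the 40 WPM speed, 2 hours per day, and $1.00 per 1,000 words are my own example assumptions:

```python
WPM = 40                    # assumption: typing speed in words per minute
HOURS_PER_DAY = 2           # assumption: time spent typing per day
RATE_PER_1000_WORDS = 1.00  # dollars; quoted band is $0.50 - $1.50

words_per_day = WPM * 60 * HOURS_PER_DAY
daily_dollars = words_per_day / 1000 * RATE_PER_1000_WORDS
monthly_dollars = daily_dollars * 30

print(f"${daily_dollars:.2f} per day, ${monthly_dollars:.2f} per month")
```

At these assumed figures the result lands inside the $100–$250 monthly range claimed above.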

Click here to sign up for free.

Tuesday, 11 December 2012

Earning Money with Neobux for FREE- Tips and Tricks



Click here to sign up for free.

Hi friends, how's life going? I am posting after a long time. As per many requests from my friends, I am sharing some tips and tricks about Neobux for noobs and professionals.

I have myself earned more than $73 without investing a penny when I started. I clicked daily (not missing even one day). Within a few weeks, when I reached $2, I rented referrals and kept on clicking, and so on. Within 1.5 months I was able to make a good profit. I am really happy with Neobux. Neobux is 100% genuine, and I am sure you will also make a good profit.
If you want to earn some money online by spending just 5 to 10 minutes, please use my referral link to register on this site.
Click here to sign up for free.

Fastcashmega.com, Multi-stage 300% Cycler - AP, LR, STP

Click here to sign up for free.

THE 10 STRAIGHT LINE CYCLERS IN MORE DETAIL
Fast Cash Mega is based on passive income. The 10 straight line cyclers are the core of the system. The 10 lines are:
$10 line
$20 line
$40 line
$80 line
$160 line
$320 line
$640 line
$1,250 line
$2,500 line
$5,000 line

Every line pays 300% per position when you cycle out. To make sure everyone reaches that 300% as soon as possible, positions are filled in 3 ways:
1. Top Position: Every time a new position is bought in a line, a percentage of the cost of that position is added to the top position. When that top position reaches 300% it is full and cycles out.
2. Referral Purchases: A percentage of the cost of a position is added directly to the position of the sponsor of the referral that bought it, but only if the sponsor has an active position in that line. In this way recruiters get more out of their recruiting efforts because they cycle faster. If the sponsor of the referral that bought a position doesn't have an active position in that line, the percentage reserved for the sponsor is added to the top position in that line (just like in a regular cycler).
3. Revenue Sharing: Every time a new position is bought, the next 10 positions in the line each get 1% of the cost of that position. This means all positions in the line get their share at some point, and then it starts over again from the top. In this way positions cycle faster than in a regular cycler.

THE “OVERFLOW” SYSTEM
When a position reaches 300% it is full and cycles out. Our unique “overflow” system will place a new position at the bottom of the next line and deduct the cost of that position from the cycle earnings. The remainder is added to that member's available balance. Only in the last line (the $5,000 line) does this not happen, because there is no next line after the last line, of course.

REFERRAL CYCLE BONUSES
From the $160 line to the $5,000 line sponsors earn a referral cycle bonus when their referrals cycle one of those lines and they have an active position in that line. The referral cycle bonus is 10% of the cost of a position in that line. If the sponsor doesn’t have an active position in that line they will not get this referral cycle bonus.

AUTOMATIC REENTRY OPTION
In all lines it is possible to enable automatic reentry. This means that when you cycle out, a new position in the same line is bought automatically, so you can maximize your earnings from the same investment. It is also perfect for making sure you qualify for the referral cycle bonus, since the requirement is at least one active position in the line your referral cycled. You can enable automatic reentry per line, so you can check in which lines your referrals have positions and then enable automatic reentry only for those lines if you want.

A CLOSER LOOK AT EARNINGS IN THE 10 LINES:
$10 Line
Cost per position: $10
Cycle earnings: $30
From the $30 cycle earnings $20 is used to buy a new position in the $20 line. The remaining $10 is added to available balance (investment of $10 back)

$20 Line
Cost per position: $20
Cycle earnings: $60
From the $60 cycle earnings $40 is used to buy a new position in the $40 line. The remaining $20 is added to available balance.

$40 Line
Cost per position: $40
Cycle earnings: $120
From the $120 cycle earnings $80 is used to buy a new position in the $80 line. The remaining $40 is added to available balance.

$80 Line
Cost per position: $80
Cycle earnings: $240
From the $240 cycle earnings $160 is used to buy a new position in the $160 line. The remaining $80 is added to available balance.

$160 Line
Cost per position: $160
Cycle earnings: $480
Referral Cycle Bonus: $16
From the $480 cycle earnings $320 is used to buy a new position in the $320 line. The remaining $160 is added to available balance. If a sponsor has an active position in the line they will earn $16 when their referrals cycle.

$320 Line
Cost per position: $320
Cycle earnings: $960
Referral Cycle Bonus: $32
From the $960 cycle earnings $640 is used to buy a new position in the $640 line. The remaining $320 is added to available balance. If a sponsor has an active position in the line they will earn $32 when their referrals cycle.

$640 Line
Cost per position: $640
Cycle earnings: $1,920
Referral Cycle Bonus: $64
From the $1,920 cycle earnings $1,250 is used to buy a new position in the $1,250 line. The remaining $670 is added to available balance. If a sponsor has an active position in the line they will earn $64 when their referrals cycle.

$1,250 Line
Cost per position: $1,250
Cycle earnings: $3,750
Referral Cycle Bonus: $125
From the $3,750 cycle earnings $2,500 is used to buy a new position in the $2,500 line. The remaining $1,250 is added to available balance. If a sponsor has an active position in the line they will earn $125 when their referrals cycle.

$2,500 Line
Cost per position: $2,500
Cycle earnings: $7,500
Referral Cycle Bonus: $250
From the $7,500 cycle earnings $5,000 is used to buy a new position in the $5,000 line. The remaining $2,500 is added to available balance. If a sponsor has an active position in the line they will earn $250 when their referrals cycle.

$5,000 Line
Cost per position: $5,000
Cycle earnings: $15,000
Referral Cycle Bonus: $500
This is the last line so that means the full $15,000 cycle earnings will be added to available balance. If a sponsor has an active position in the line they will earn $500 when their referrals cycle.

Members can buy positions in any of the 10 lines directly, and the “overflow” system will pick it up from there (overflowing to the next line). Only in the last line (the $5,000 line) is no overflow possible, because there is no next line, so the full $15,000 is added to available balance.
After completing all 10 lines, the total profit per $10 position comes to $20,040 (the sum of all amounts added to available balance, minus the initial $10).
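
The overflow arithmetic can be checked mechanically. The sketch below sums the per-line figures listed above (each line pays 300% of its cost, and the next line's position is bought out of the cycle earnings). Note that the $640 line leaves a $670 remainder because the next position costs $1,250 rather than $1,280, which nudges the grand total:

```python
# Line costs in dollars, from the table above.
lines = [10, 20, 40, 80, 160, 320, 640, 1250, 2500, 5000]

balance = 0
for i, cost in enumerate(lines):
    earnings = 3 * cost             # every line pays 300% per position
    if i + 1 < len(lines):
        earnings -= lines[i + 1]    # overflow buys the next line's position
    balance += earnings             # the remainder goes to available balance

profit = balance - lines[0]         # subtract the initial $10 position
print(profit)                       # 20040
```
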

Sunday, 9 December 2012

Promoting your site step by step



   In this section, I will explain how I use SEO to promote my own sites. It is a systematic summary in which I briefly recap the previous sections. Naturally, I use the SEO Administrator software extensively in my work, so I will show how I use it in this example.

   To be able to start working with a site, you need some basic SEO knowledge, which can be acquired quite quickly. The information presented in this document is perfectly adequate, and I must stress that you do not have to be an optimization guru to achieve results. Once you have this basic knowledge, you can start working, experimenting, getting sites to the top of the search listings, and so on. That is where SEO software tools are useful.

   1. Firstly, we create an approximate list of keywords and check their competition rate. We then evaluate our chances against the competition and select words that are popular enough and have average competition rate. Keywords are selected using the keyword suggestion tool. This is also used to perform a rough check of their competition rate. We use the PageRank Analyzer module to perform a detailed analysis of search results for the most interesting queries and then make our final decision about what keywords to use.

   2. Next, we start composing text for our site. I write part of it on my own, but I entrust the most important parts to specialists in technical writing. Actually, I think the quality and attractiveness of the text is the most important attribute of a page. If the textual content is good, it will be easier to get inbound links and visitors.

   3. In this step, we start using the HTML Analyzer module to create the necessary keyword density. Each page is optimized for its own keyword phrase.

   4. We submit the site to various directories. There are plenty of services to take care of that chore for us. In addition, Seo Administrator will soon have a feature to automate the task.

   5. After these initial steps are completed, we wait and check search engine indexation to make sure that various search engines are processing the site.

   6. In this step, we can begin to check the positions of the site for our keywords. These positions are not likely to be good at this early stage, but they will give us some useful information to begin fine-tuning our SEO work.

   7. We use the Link Popularity Checker module to track and work on increasing the link popularity.

   8. We use the Log Analyzer module to analyze the number of visitors and work on increasing it. We also periodically repeat steps 6–8.

SEO software review


   In previous chapters, we explained how to create your own site and what methods are available to promote it. This last section is devoted to SEO software tools that can automate much of the SEO work on your site and achieve even better results. We will discuss the SEO Administrator software suite, which is available for download.

Ranking Monitor
   Any SEO specialist is faced with the regular task of checking the positions of his sites in the search engines. You could check these positions manually, but if you have several dozen keywords and 5–7 search engines to monitor, the process becomes a real chore.

   The Ranking Monitor module will do everything automatically. You are able to see information on your site ratings for any keywords and in a variety of search engines. You will also see the dynamics and history of your site positions as well as upward and downward trends in your site position for your specified keywords. The same information is also displayed in a visual form.

Link Popularity Checker
   This program will automatically poll all available search engines and create a complete duplicate-free list of inbound links to your resource. For each link, you will see important parameters such as the link text and PageRank of the referring page. If you have studied this article, you will know how important these parameters are. As well as viewing the overall list of inbound links, you can track how the inbound links change over time.

Site Indexation Tool
   This useful tool will show you all pages indexed by a particular search engine. It is a must-have tool for anybody who is creating a new web resource. The PageRank value will be displayed for each indexed page.

Log Analyzer
   All information about your visitors is stored in the log files of your server. The Log Analyzer module presents this information in convenient visual reports. Displayed information includes:
   - Originating sites
   - Keywords used
   - Visitors' countries of origin
   - Much more…

PageRank Analyzer
   This utility collects a huge amount of competitive information on the list of sites that you specify. For each site it automatically determines parameters such as Google PageRank, the number of inbound links and the presence of each site in the DMOZ and Yahoo directories. It is an ideal tool for analyzing the competition rate of a particular query.

Keyword Suggestion Tool
   This tool gathers relevant keywords for your site and displays their popularity (the number of queries per month). It also estimates the competition rate of a specified keyword phrase.

HTML Analyzer
   This application analyzes the HTML code of a page. It estimates the weight and density of keywords and creates a report on the correct optimization of the site text. It is useful during the creation of your own site and is also a great tool for analyzing your competitors' sites. It allows you to analyze both local HTML pages and online projects.
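
Keyword density itself is a simple ratio: the keyword's occurrences divided by the total word count of the page. The sketch below shows the kind of calculation such a tool performs (the function name and the naive tokenization are my own simplifications):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words on the page, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

page_text = "Art supplies for painters: buy art supplies and painting kits online."
print(round(keyword_density(page_text, "art"), 1))  # 18.2
```

A real analyzer would also weight keywords by where they appear (title, headings, link text), but the underlying density figure is just this percentage.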

Changing the site address

   You may need to change the address of your project. Maybe the resource was started on a free hosting service and has developed into a more commercial project that should have its own domain. Or maybe the owner has simply found a better name for the project. In any case, moving a project to a new address is a difficult and unpleasant task: for starters, you will have to start promoting the new address almost from scratch. However, if the move is inevitable, you may as well make the change as useful as possible.

   Our advice is to create your new site at the new location with new and unique content. Place highly visible links to the new resource on the old site to allow visitors to easily navigate to your new site. Do not completely delete the old site and its contents.

   This approach will allow you to get visitors from search engines to both the old site and the new one. At the same time, you get an opportunity to cover additional topics and keywords, which may be more difficult within one resource.

Selecting a domain and hosting

   Currently, anyone can create a page on the Internet without incurring any expense. Also, there are companies providing free hosting services that will publish your page in return for their entitlement to display advertising on it. Many Internet service providers will also allow you to publish your page on their servers if you are their client. However, all these variations have serious drawbacks that you should seriously consider if you are creating a commercial project.

   First, and most importantly, you should obtain your own domain for the following reasons:

   - A project that does not have its own domain is regarded as a transient project. Indeed, why should we trust a resource if its owners are not even prepared to invest in the tiny sum required to create some sort of minimum corporate image? It is possible to publish free materials using resources based on free or ISP-based hosting, but any attempt to create a commercial project without your own domain is doomed to failure.

   - Your own domain allows you to choose your hosting provider. If necessary, you can move your site to another hosting provider at any time.

    Here are some useful tips for choosing a domain name.

   - Try to make it easy to remember and make sure there is only one way to pronounce and spell it.

   - Domains with the extension .com are the best choice to promote international projects in English. Domains from the zones .net, .org, .biz, etc., are available but less preferable.

   - If you want to promote a site with a national flavor, use a domain from the corresponding national zone. Use .de – for German sites, .it – for Italian sites, etc.

   - In the case of sites containing two or more languages, you should assign a separate domain to each language. National search engines are more likely to appreciate such an approach than subsections for various languages located on one site.

   A domain costs $10-20 a year, depending on the particular registration service and zone.

   You should take the following factors into consideration when choosing a hosting provider:

   - Access bandwidth.
   - Server uptime.
   - The cost of traffic per gigabyte and the amount of prepaid traffic.
   - The site is best located in the same geographical region as most of your expected visitors.

   The cost of hosting services for small projects is around $5-10 per month.

   Avoid “free” offers while choosing a domain and a hosting provider. Hosting providers sometimes offer free domains to their clients. Such domains are often registered not to you, but to the hosting company. The hosting provider will be the owner of the domain. This means that you will not be able to change the hosting service of your project, or you could even be forced to buy out your own domain at a premium price. Also, you should not register your domains via your hosting company. This may make moving your site to another hosting company more difficult even though you are the owner of your domain.

Creating correct content

   The content of a site plays an important role in site promotion for many reasons. We will describe some of them in this section. We will also give you some advice on how to populate your site with good content.

   - Content uniqueness. Search engines value new information that has not been published before. That is why you should compose your own site text and not plagiarize excessively. A site based on materials taken from other sites is much less likely to reach the top in search engines. As a rule, the original source material always ranks higher in search results.

   - While creating a site, remember that it is primarily created for human visitors, not search engines. Getting visitors to visit your site is only the first step and it is the easiest one. The truly difficult task is to make them stay on the site and convert them into purchasers. You can only do this by using good content that is interesting to real people.

   - Try to update information on the site and add new pages on a regular basis. Search engines value sites that are constantly developing. Also, the more useful text your site contains, the more visitors it attracts. Write articles on the topic of your site, publish visitors' opinions, create a forum for discussing your project. A forum is only useful if the number of visitors is sufficient for it to be active. Interesting and attractive content guarantees that the site will attract interested visitors.

   - A site created for people rather than search engines has a better chance of getting into important directories such as DMOZ and others.

   - An interesting site on a particular topic has much better chances to get links, comments, reviews, etc. from other sites on this topic. Such reviews can give you a good flow of visitors while inbound links from such resources will be highly valued by search engines.

   - As a final tip, there is an old German proverb, "A shoemaker sticks to his last," which means, "Do what you can do best." If you can write breathtaking and creative prose for your website, then that is great. However, most of us have no special talent for writing attractive text and should rely on professionals such as journalists and technical writers. Of course, this is an extra expense, but it is justified in the long term.

Seo tips, assumptions, observations

This section provides information based on an analysis of various seo articles, communication between optimization specialists, practical experience and so on. It is a collection of interesting and useful tips, ideas and suppositions. Do not regard this section as written in stone, but rather as a collection of information and suggestions for your consideration.

   - Outbound links. Publish links to authoritative resources in your subject field using the necessary keywords. Search engines place a high value on links to other resources based on the same topic.

   - Outbound links. Do not publish links to FFA sites and other sites excluded from the indexes of search engines. Doing so may lower the rating of your own site.

   - Outbound links. A page should not contain more than 50-100 outbound links. Exceeding this number will not harm your site rating, but any links beyond it will not be recognized by search engines.

   - Inbound site-wide links. These are links published on every page of the site. It is believed that search engines do not approve of such links and do not consider them while ranking pages. Another opinion is that this is true only for large sites with thousands of pages.

   - The ideal keyword density is a frequent seo discussion topic. The real answer is that there is no ideal keyword density. It is different for each query and search engines calculate it dynamically for each search query. Our advice is to analyze the first few sites in search results for a particular query. This will allow you to evaluate the approximate optimum density for specific queries.

   - Site age. Search engines prefer old sites because they are more stable.

   - Site updates. Search engines prefer sites that are constantly developing. Developing sites are those in which new information and new pages periodically appear.

   - Domain zone. Search engines prefer sites that are located in the zones .edu, .mil, .gov, etc. Only the corresponding organizations can register such domains so these domains are more trustworthy.

   - Search engines track the percent of visitors that immediately return to searching after they visit a site via a search result link. A large number of immediate returns means that the content is probably not related to the corresponding topic and the ranking of such a page gets lower.

   - Search engines track how often a link is selected in search results. If a link is only occasionally selected, it means that the page is of little interest and the rating of such a page gets lower.

   - Use synonyms and derived word forms of your keywords; search engines will appreciate this (keyword stemming).

   - Search engines consider a very rapid increase in inbound links to be artificial promotion, and this results in a lowering of the rating. This is a controversial topic because this method could be used to lower the rating of one's competitors.

   - Google does not take into account inbound links if they are on the same (or similar) hosts. This is detected using host IP addresses. Pages whose IP addresses are within the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 are regarded as being on the same host. This opinion is most likely rooted in the fact that Google has expressed this idea in its patents. However, Google employees claim that no IP-address limitations are imposed on inbound links, and there is no reason not to believe them.

   - Search engines check information about the owners of domains. Inbound links originating from a variety of sites all belonging to one owner are regarded as less important than normal links. This information is presented in a patent.

   - Search engines prefer sites with longer term domain registrations.
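   One of the tips above concerns keyword density. As a rough, hypothetical illustration (real engines weight term positions, markup and much more, and evaluate density per query), a naive density estimate for a page text can be sketched like this:

```python
import re

def keyword_density(text, keyword):
    """Return the share of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w == keyword.lower()) / len(words)

sample = "Art supplies for painting. Buy art supplies and painting tools online."
print(round(keyword_density(sample, "painting"), 3))  # 2 of 11 words -> 0.182
```

   Running a sketch like this over the top-ranking pages for a query, as suggested above, gives a feel for the density range those pages occupy.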

Google LocalRank

   On February 25, 2003, Google patented a new algorithm for ranking pages called LocalRank. It is based on the idea that pages should be ranked not by their global link citations, but by how they are cited among pages that deal with topics related to the particular query. The LocalRank algorithm is not used in practice (at least, not in the form described in the patent). However, the patent contains several interesting innovations we think any seo specialist should know about. Nearly all search engines already take into account the topics to which referring pages are devoted. The algorithms they actually use probably differ from LocalRank, but studying the patent will give us a general idea of how such ranking may be implemented.

   While reading this section, please bear in mind that it contains theoretical information rather than practical guidelines.

   The following three items comprise the main idea of the LocalRank algorithm:

   1. An algorithm is used to select a certain number of documents relevant to the search query (let it be N). These documents are initially sorted by some criteria (this may be PageRank, relevance or a group of other criteria). Let us call the numeric value of this criterion OldScore.

   2. Each of the N selected pages goes through a new ranking procedure and gets a new rank. Let us call it LocalScore.

   3. The OldScore and LocalScore values for each page are multiplied, to yield a new value – NewScore. The pages are finally ranked based on NewScore.

   The key procedure in this algorithm is the new ranking procedure, which gives each page a new LocalScore rank. Let us examine this new procedure in more detail:

   0. An initial ranking algorithm is used to select N pages relevant to the search query. Each of the N pages is allocated an OldScore value by this algorithm. The new ranking algorithm only needs to work on these N selected pages.

   1. While calculating LocalScore for each page, the system selects those pages from N that have inbound links to this page. Let this set of pages be M. At the same time, any pages from the same host (as determined by IP address) and pages that are mirrors of the given page are excluded from M.

   2. The set M is divided into subsets Li. These subsets contain pages grouped according to the following criteria:
   - Belonging to one (or similar) hosts. Thus, pages whose first three octets in their IP addresses are the same will get into one group. This means that pages whose IP addresses belong to the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 will be considered as belonging to one group.
   - Pages that have the same or similar content (mirrors)
   - Pages on the same site (domain).

   3. Each page in each Li subset has rank OldScore. One page with the largest OldScore rank is taken from each subset; the rest of the pages are excluded from the analysis. Thus, we get a subset of pages K referring to this page.

   4. Pages in the subset K are sorted by the OldScore parameter, then only the first k pages (k is some predefined number) are left in the subset K. The rest of the pages are excluded from the analysis.

   5. LocalScore is calculated in this step. The OldScore values of the remaining k pages are combined, as shown in the following formula:

   LocalScore(i) = OldScore(1)^m + OldScore(2)^m + … + OldScore(k)^m
   Here m is some predefined parameter that may vary from one to three. Unfortunately, the patent for the algorithm in question does not describe this parameter in detail.

   After LocalScore is calculated for each page from the set N, NewScore values are calculated and pages are re-sorted according to the new criteria. The following formula is used to calculate NewScore:

   NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)

   i is the page for which the new rank is calculated.

   a and b are numeric constants (the patent gives no further details about these parameters).

   MaxLS is the maximum calculated LocalScore value.

   MaxOS is the maximum OldScore value.

   Now let us put the math aside and explain these steps in plain words.

   In step 0) pages relevant to the query are selected. Algorithms that do not take into account the link text are used for this. For example, relevance and overall link popularity are used. We now have a set of OldScore values. OldScore is the rating of each page based on relevance, overall link popularity and other factors.

   In step 1) pages with inbound links to the page of interest are selected from the group obtained in step 0). The group is whittled down by removing mirror and other sites in steps 2), 3) and 4) so that we are left with a set of genuinely unique sites that all share a common theme with the page that is under analysis. By analyzing inbound links from pages in this group (ignoring all other pages on the Internet), we get the local (thematic) link popularity.

   LocalScore values are then calculated in step 5). LocalScore is the rating of a page among the set of pages that are related by topic. Finally, pages are rated and ranked using a combination of LocalScore and OldScore.
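   The procedure above can be sketched in code. This is a toy illustration only: the page names, the tiny link graph, and the parameter values are all invented, and the patent itself leaves a, b and m unspecified.

```python
def local_rank(pages, links, k=3, m=2, a=1.0, b=1.0):
    """Toy sketch of the LocalRank re-ranking steps described above.

    pages: {name: {"old": OldScore, "host": host-group identifier}}
    links: set of (source, target) pairs among the candidate pages.
    """
    local = {}
    for page in pages:
        # Step 1: referrers from the result set, excluding same-host pages.
        referrers = [s for (s, t) in links
                     if t == page and pages[s]["host"] != pages[page]["host"]]
        # Steps 2-3: keep only the best-scoring referrer per host group.
        best = {}
        for r in referrers:
            h = pages[r]["host"]
            if h not in best or pages[r]["old"] > pages[best[h]]["old"]:
                best[h] = r
        # Step 4: keep the top k referrers; step 5: sum OldScore^m over them.
        top = sorted(best.values(), key=lambda r: pages[r]["old"], reverse=True)[:k]
        local[page] = sum(pages[r]["old"] ** m for r in top)

    max_ls = max(local.values()) or 1.0   # guard against division by zero
    max_os = max(p["old"] for p in pages.values()) or 1.0
    # NewScore(i) = (a + LocalScore(i)/MaxLS) * (b + OldScore(i)/MaxOS)
    return {p: (a + local[p] / max_ls) * (b + pages[p]["old"] / max_os)
            for p in pages}

pages = {
    "A": {"old": 0.9, "host": "h1"},
    "B": {"old": 0.5, "host": "h2"},
    "C": {"old": 0.6, "host": "h3"},
}
links = {("A", "B"), ("C", "B"), ("A", "C")}
scores = local_rank(pages, links)
```

   In this invented example, page B has the lowest OldScore but the most topical referrers, so it ends up ranked first: local citations outweigh the initial ordering.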

Google SandBox

   At the beginning of 2004, a new and mysterious term appeared among seo specialists – Google SandBox. This is the name of a new Google spam filter that excludes new sites from search results. The work of the SandBox filter results in new sites being absent from search results for virtually any phrase. This even happens with sites that have high-quality unique content and which are promoted using legitimate techniques.

   The SandBox is currently applied only to the English segment of the Internet; sites in other languages are not yet affected by this filter. However, this filter may expand its influence. It is assumed that the aim of the SandBox filter is to exclude spam sites – indeed, no search spammer will be able to wait for months until he gets the necessary results. However, many perfectly valid new sites suffer the consequences. So far, there is no precise information as to what the SandBox filter actually is. Here are some assumptions based on practical seo experience:

   - SandBox is a filter that is applied to new sites. A new site is put in the sandbox and is kept there for some time until the search engine starts treating it as a normal site.

   - SandBox is a filter applied to new inbound links to new sites. There is a fundamental difference between this and the previous assumption: the filter is not based on the age of the site, but on the age of inbound links to the site. In other words, Google treats the site normally but it refuses to acknowledge any inbound links to it unless they have existed for several months. Since such inbound links are one of the main ranking factors, ignoring inbound links is equivalent to the site being absent from search results. It is difficult to say which of these assumptions is true; it is quite possible that both are.

   - The site may be kept in the sandbox from 3 months to a year or more. It has also been noticed that sites are released from the sandbox in batches. This means that the time sites are kept in the sandbox is not calculated individually for each site, but for groups of sites. All sites created within a certain time period are put into the same group and they are eventually all released at the same time. Thus, individual sites in a group can spend different times in the sandbox depending where they were in the group capture-release cycle.

   Typical indications that your site is in the sandbox include:

   - Your site is normally indexed by Google and the search robot regularly visits it.
   - Your site has a PageRank; the search engine knows about and correctly displays inbound links to your site.
   - A search by site address (www.site.com) displays correct results, with the correct title, snippet (resource description), etc.
   - Your site is found by rare and unique word combinations present in the text of its pages.
   - Your site is not displayed in the first thousand results for any other queries, even for those for which it was initially created. Sometimes, there are exceptions and the site appears among 500-600 positions for some queries. This does not change the sandbox situation, of course.

   There are no practical ways to bypass the SandBox filter. There have been some suggestions about how it might be done, but they are no more than suggestions and are of little use to a regular webmaster. The best course of action is to continue seo work on the site content and structure and wait patiently until the site leaves the sandbox, after which you can expect a dramatic increase in ratings, of up to 400-500 positions.

Choosing keywords

Initially choosing keywords
   Choosing keywords should be your first step when constructing a site. You should have the keyword list available to incorporate into your site text before you start composing it. To define your site keywords, you should first use the keyword services offered by search engines. Sites such as www.wordtracker.com and inventory.overture.com are good starting places for English language sites. Note that the data they provide may sometimes differ significantly from the keywords that are actually best for your site. You should also note that the Google search engine does not give information about the frequency of search queries.

   After you have defined your approximate list of initial keywords, you can analyze your competitor’s sites and try to find out what keywords they are using. You may discover some further relevant keywords that are suitable for your own site.

Frequent and rare keywords
   There are two distinct strategies – optimize for a small number of highly popular keywords or optimize for a large number of less popular words. In practice, both strategies are often combined.

   The disadvantage of keywords that attract frequent queries is that the competition rate is high for them. It is often not possible for a new site to get anywhere near the top of search result listings for these queries.

   For keywords associated with rare queries, it is often sufficient just to mention the necessary word combination on a web page or to perform minimum text optimization. Under certain circumstances, rare queries can supply quite a large amount of search traffic.

   The aim of most commercial sites is to sell some product or service or to make money in some way from their visitors. This should be kept in mind during your seo (search engine optimization) work and keyword selection. If you are optimizing a commercial site then you should try to attract targeted visitors (those who are ready to pay for the offered product or service) to your site rather than concentrating on sheer numbers of visitors.

   Example. The query “monitor” is much more popular and competitive than the query “monitor Samsung 710N” (the exact name of the model). However, the second query is much more valuable for a seller of monitors. It is also easier to get traffic from it because its competition rate is low; there are not many other sites owned by sellers of Samsung 710N monitors. This example highlights another possible difference between frequent and rare search queries that should be taken into account – rare search queries may provide you with fewer visitors overall, but more targeted visitors.

Evaluating the competition rates of search queries
   When you have finalized your keywords list, you should identify the core keywords for which you will optimize your pages. A suggested technique for this follows.

   Rare queries are discarded at once (for the time being). In the previous section, we described the usefulness of such rare queries but they do not require special optimization. They are likely to occur naturally in your website text.

   As a rule, the competition rate is very high for the most popular phrases. This is why you need to get a realistic idea of the competitiveness of your site. To evaluate the competition rate you should estimate a number of parameters for the first 10 sites displayed in search results:
   - The average PageRank of the pages in the search results.
   - The average number of links to these sites. Check this using a variety of search engines.
   Additional parameters:
   - The number of pages on the Internet that contain the particular search term, the total number of search results for that search term.
   - The number of pages on the Internet that contain exact matches to the keyword phrase. The search for the phrase is bracketed by quotation marks to obtain this number.

   These additional parameters allow you to indirectly evaluate how difficult it will be to get your site near the top of the list for this particular phrase. As well as the parameters described, you can also check the number of sites present in your search results in the main directories, such as DMOZ and Yahoo.

   The analysis of the parameters mentioned above and their comparison with those of your own site will allow you to predict with reasonable certainty the chances of getting your site to the top of the list for a particular phrase.

   Having evaluated the competition rate for all of your keyword phrases, you can now select a number of moderately popular key phrases with an acceptable competition rate, which you can use to promote and optimize your site.

Refining your keyword phrases
   As mentioned above, search engine services often give inaccurate keyword information. This means that it is unusual to obtain an optimum set of site keywords at your first attempt. After your site is up and running and you have carried out some initial promotion, you can obtain additional keyword statistics, which will facilitate some fine-tuning. For example, you will be able to obtain the search results rating of your site for particular phrases and you will also have the number of visits to your site for these phrases.

   With this information, you can clearly define the good and bad keyword phrases. Often there is no need to wait until your site gets near the top of all search engines for the phrases you are evaluating – one or two search engines are enough.

   Example. Suppose your site occupies first place in the Yahoo search engine for a particular phrase. At the same time, this site is not yet listed in MSN or Google search results for this phrase. However, if you know the percentage of visits to your site from various search engines (for instance, Google – 70%, Yahoo – 20%, MSN search – 10%), you can predict the approximate amount of traffic for this phrase from these other search engines and decide whether it is suitable.

   As well as detecting bad phrases, you may find some new good ones. For example, you may see that a keyword phrase you did not optimize your site for brings useful traffic despite the fact that your site is on the second or third page in search results for this phrase.

   Using these methods, you will arrive at a new refined set of keyword phrases. You should now start reconstructing your site: Change the text to include more of the good phrases, create new pages for new phrases, etc.

   You can repeat this seo exercise several times and, after a while, you will have an optimum set of key phrases for your site and considerably increased search traffic.
   Here are some more tips. According to statistics, the main page takes up to 30%-50% of all search traffic. It has the highest visibility in search engines and it has the largest number of inbound links. That is why you should optimize the main page of your site for the most popular and competitive queries. Each site page should be optimized for one or two main word combinations and, possibly, for a number of rare queries. This will increase the chances of the page getting to the top of search engine lists for particular phrases.

Indexing a site

Before a site appears in search results, a search engine must index it. An indexed site will have been visited and analyzed by a search robot, with relevant information saved in the search engine database. If a page is present in the search engine index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display information from the page.

   Most average sized sites (with dozens to hundreds of pages) are usually indexed correctly by search engines. However, you should remember the following points when constructing your site. There are two ways to allow a search engine to learn about a new site:

   - Submit the address of the site manually using a form associated with the search engine, if available. In this case, you are the one who informs the search engine about the new site and its address goes into the queue for indexing. Only the main page of the site needs to be added, the search robot will find the rest of pages by following links.

   - Let the search robot find the site on its own. If there is at least one inbound link to your resource from other indexed resources, the search robot will soon visit and index your site. In most cases, this method is recommended. Get some inbound links to your site and just wait until the robot visits it. This may actually be quicker than manually adding it to the submission queue. Indexing a site typically takes from a few days to two weeks depending on the search engine. The Google search engine is the quickest of the bunch.

   Try to make your site friendly to search robots by following these rules:

   - Try to make any page of your site reachable from the main page in not more than three mouse clicks. If the structure of the site does not allow you to do this, create a so-called site map that will allow this rule to be observed.

   - Do not make common mistakes. Session identifiers make indexing more difficult. If you use script navigation, make sure you duplicate these links with regular ones because search engines cannot read scripts (see more details about these and other mistakes in section 2.3).

   - Remember that search engines index no more than the first 100-200 KB of text on a page. Hence, the following rule – do not use pages with text larger than 100 KB if you want them to be indexed completely.

   You can manage the behavior of search robots using the file robots.txt. This file allows you to explicitly permit or forbid them to index particular pages on your site.
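   For example, a minimal robots.txt placed at the site root might look like the following sketch (the paths and the extra Googlebot rule are hypothetical):

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /drafts/
```

   An empty Disallow value, or no robots.txt at all, permits indexing of the whole site.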

   The databases of search engines are constantly being updated; records in them may change, disappear and reappear. That is why the number of indexed pages on your site may sometimes vary. One of the most common reasons for a page to disappear from indexes is server unavailability. This means that the search robot could not access it at the time it was attempting to index the site. After the server is restarted, the site should eventually reappear in the index.

   You should note that the more inbound links your site has, the more quickly it gets re-indexed. You can track the process of indexing your site by analyzing server log files where all visits of search robots are logged. We will give details of seo software that allows you to track such visits in a later section.
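   As a minimal sketch of tracking such visits, robot hits can be counted by matching user-agent strings in the log; the log lines, IP addresses and robot list below are invented examples, and real log formats vary:

```python
import re

# User-agent substrings of some well-known search robots (an illustrative,
# incomplete list).
BOT_PATTERN = re.compile(r"Googlebot|bingbot|Slurp", re.IGNORECASE)

def robot_visits(log_lines):
    """Return the number of log lines whose user-agent field matches a robot."""
    return sum(1 for line in log_lines if BOT_PATTERN.search(line))

log = [
    '66.249.66.1 - - [10/Dec/2012] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '198.51.100.7 - - [10/Dec/2012] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(robot_visits(log))  # -> 1
```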

Increasing link popularity and SEO

Increasing link popularity
Submitting to general purpose directories
   On the Internet, many directories contain links to other network resources grouped by topics. The process of adding your site information to them is called submission.

   Such directories can be paid or free of charge; they may require a backlink from your site or have no such requirement. The number of visitors to these directories is not large, so they will not send a significant number to your site. However, search engines count links from these directories and this may enhance your site's search result placement.

   Important! Only those directories that publish a direct link to your site are worthwhile from a seo point of view. Script-driven directories are almost useless. This point deserves a more detailed explanation. There are two methods for publishing a link. A direct link is published as a standard HTML construction ("<a href=...>", etc.). Alternatively, links can be published with the help of various scripts, redirects and so on. Search engines understand only those links that are specified directly in HTML code. That is why the seo value of a directory that does not publish a direct link to your site is close to zero.
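   For instance, of the two links below, only the first is a direct link in the sense described above (the URL and the openLink script function are placeholders):

```
<!-- Direct link: plain HTML, followed and counted by search engines -->
<a href="http://www.example.com/">Example site</a>

<!-- Script-based link: generally invisible to search engines -->
<a href="javascript:openLink('example.com')">Example site</a>
```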

   You should not submit your site to FFA (free-for-all) directories. Such directories automatically publish links related to any search topic and are ignored by search engines. The only thing an FFA directory entry will give you is an increase in spam sent to your published e-mail address. Actually, this is the main purpose of FFA directories.

   Be wary of promises from various programs and seo services that submit your resource to hundreds of thousands of search engines and directories. There are no more than a hundred or so genuinely useful directories on the Net – this is the number to take seriously and professional seo submission services work with this number of directories. If a seo service promises submissions to enormous numbers of resources, it simply means that the submission database mainly consists of FFA archives and other useless resources.

   Give preference to manual or semiautomatic seo submission; do not rely completely on automatic processes. Submitting sites under human control is generally much more efficient than fully automatic submission. The value of submitting a site to paid directories or publishing a backlink should be considered individually for each directory. In most cases, it does not make much sense, but there may be exceptions.

   Submitting sites to directories does not often result in a dramatic effect on site traffic, but it slightly increases the visibility of your site for search engines. This useful seo option is available to everyone and does not require a lot of time and expense, so do not overlook it when promoting your project.

DMOZ directory
    The DMOZ directory (www.dmoz.org) or the Open Directory Project is the largest directory on the Internet. There are many copies of the main DMOZ site and so, if you submit your site to the DMOZ directory, you will get a valuable link from the directory itself as well as dozens of additional links from related resources. This means that the DMOZ directory is of great value to a seo aware webmaster.

   It is not easy to get your site into the DMOZ directory; there is an element of luck involved. Your site may appear in the directory a few minutes after it has been submitted or it may take months to appear.

   If you submitted your site details correctly and in the appropriate category, then it should eventually appear. If it does not appear after a reasonable time, you can try contacting the editor of your category with a question about your request (the DMOZ site gives you such an opportunity). Of course, there are no guarantees, but it may help. DMOZ directory submissions are free of charge for all sites, including commercial ones.

   Here are my final recommendations regarding site submissions to DMOZ. Read all site requirements, descriptions, etc. to avoid violating the submission rules. Such a violation will most likely result in a refusal to consider your request. Please remember, presence in the DMOZ directory is desirable, but not obligatory. Do not despair if you fail to get into this directory. It is possible to reach top positions in search results without this directory – many sites do.

Link exchange
   The essence of link exchanges is that you use a special page to publish links to other sites and get similar backlinks from them. Search engines do not like link exchanges because, in many cases, they distort search results and do not provide anything useful to Internet users. However, it is still an effective way to increase link popularity if you observe several simple rules.

   - Exchange links with sites that are related by topic. Exchanging links with unrelated sites is ineffective and unpopular.

   - Before exchanging, make sure that your link will be published on a “good” page. This means that the page must have a reasonable PageRank (3-4 or higher is recommended), it must be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.

   - Do not create large link directories on your site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics. You will have a topic category for each listed site. However, when trying to optimize your site you are looking for link quality rather than quantity, and there are some potential pitfalls. No SEO-aware webmaster will publish a quality link to you if he receives a worthless link from your directory “link farm” in return. Generally, the PageRank of pages from such directories leaves a lot to be desired. In addition, search engines do not like these directories at all. There have even been cases where sites were banned for using such directories.

   - Use a separate page on the site for link exchanges. It must have a reasonable PageRank and it must be indexed by search engines, etc. Do not publish more than 50 links on one page (otherwise search engines may fail to take some of the links into account). This will help you to find other SEO-aware partners for link exchanges.

   - Search engines try to track mutual links. That is why you should, if possible, publish backlinks on a domain/site other than the one you are trying to promote. The best variant is when you promote the resource site1.com and publish backlinks on the resource site2.com.

    - Exchange links with caution. Webmasters who are not quite honest will often remove your links from their resources after a while. Check your backlinks from time to time.
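   The periodic backlink check suggested above is easy to automate. The sketch below is a minimal illustration using only Python's standard library; the helper name has_backlink is my own, not from the text. It parses a partner page's HTML and reports whether it still contains a direct link to your domain.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def has_backlink(page_html, your_domain):
    """Return True if the page contains at least one direct link to your domain."""
    collector = LinkCollector()
    collector.feed(page_html)
    return any(urlparse(href).netloc.endswith(your_domain)
               for href in collector.hrefs)
```

   In practice you would fetch each partner's link page (for example with urllib.request) and run this check on a schedule, flagging partners whose pages no longer link back.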

Press releases, news feeds, thematic resources
   This section is about site marketing rather than pure SEO. There are many information resources and news feeds that publish press releases and news on various topics. Such sites can supply you with direct visitors and also increase your site's popularity. If you do not find it easy to create a press release or a piece of news, hire copywriters – they will help you find or create something newsworthy.

   Look for resources that deal with similar topics to your own site. You may find many Internet projects that are not in direct competition with you, but which share the same topic as your site. Try to approach the site owners. It is quite possible that they will be glad to publish information about your project.

   One final tip for obtaining inbound links – try to create slight variations in the inbound link text. If all inbound links to your site have exactly the same link text and there are many of them, the search engines may flag it as a spam attempt and penalize your site.

External ranking factors and SEO

External ranking factors


Why inbound links to sites are taken into account
   As you can see from the previous section, many factors influencing the ranking process are under the control of webmasters. If these were the only factors then it would be impossible for search engines to distinguish between a genuine high-quality document and a page created specifically to achieve high search ranking but containing no useful information. For this reason, an analysis of inbound links to the page being evaluated is one of the key factors in page ranking. This is the only factor that is not controlled by the site owner.

   It makes sense to assume that interesting sites will have more inbound links. This is because owners of other sites on the Internet will tend to have published links to a site if they think it is a worthwhile resource. The search engine will use this inbound link criterion in its evaluation of document significance.

   Therefore, two main factors influence how pages are stored by the search engine and sorted for display in search results:

    - Relevance, as described in the previous section on internal ranking factors.

    - Number and quality of inbound links, also known as link citation, link popularity or citation index. This will be described in the next section.

Link importance (citation index, link popularity)
   You can easily see that simply counting the number of inbound links does not give us enough information to evaluate a site. It is obvious that a link from www.microsoft.com should mean much more than a link from some homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as number of links.

   Search engines use the notion of citation index to evaluate the number and quality of inbound links to a site. Citation index is a numeric estimate of the popularity of a resource expressed as an absolute value representing page importance. Each search engine uses its own algorithms to estimate a page citation index. As a rule, these values are not published.

   As well as the absolute citation index value, a scaled citation index is sometimes used. This relative value indicates the popularity of a page relative to the popularity of other pages on the Internet. You will find a detailed description of citation indexes and the algorithms used for their estimation in the next sections.

Link text (anchor text)
   The link text of any inbound site link is vitally important in search result ranking. The anchor (or link) text is the text between the HTML tags <A> and </A> and is displayed as the text that you click in a browser to go to a new page. If the link text contains appropriate keywords, the search engine regards it as an additional and highly significant recommendation that the site actually contains valuable information relevant to the search query.

Relevance of referring pages
   As well as link text, search engines also take into account the overall information content of each referring page.

   Example: Suppose we are using SEO to promote a car sales resource. In this case a link from a site about car repairs will have much more importance than a similar link from a site about gardening. The first link is published on a resource having a similar topic, so it will be more important for search engines.

Google PageRank – theoretical basics
   Google was the first company to patent a system that takes inbound links into account. The algorithm was named PageRank. In this section, we will describe this algorithm and how it can influence search result ranking.

   PageRank is estimated separately for each web page and is determined by the PageRank (citation) of other pages referring to it. It is a kind of “virtuous circle.” The main task is to find the criterion that determines page importance. In the case of PageRank, it is the possible frequency of visits to a page.

   I shall now describe how a user's behavior when following links to surf the network is modeled. It is assumed that the user starts viewing sites from some random page. Then he or she follows links to other web resources. There is always a possibility that the user may leave a site without following any outbound link and start viewing documents from a random page. The PageRank algorithm estimates the probability of this event as 0.15 at each step. The probability that our user continues surfing by following one of the links available on the current page is therefore 0.85, assuming that all links are equal in this case. If he or she continues surfing indefinitely, popular pages will be visited many more times than the less popular pages.

   The PageRank of a specified web page is thus defined as the probability that a user may visit the web page. It follows that the sum of probabilities for all existing web pages is exactly one, because the user is assumed to be visiting at least one Internet page at any given moment.

   Since it is not always convenient to work with these probabilities, the PageRank can be mathematically transformed into a more easily understood number. For instance, we are used to seeing a PageRank number between zero and ten on the Google Toolbar.

   According to the ranking model described above:
   - Each page on the Net (even if there are no inbound links to it) initially has a PageRank greater than zero, although it will be very small. There is a tiny chance that a user may accidentally navigate to it.
   - Each page that has outbound links distributes part of its PageRank to the referenced page. The PageRank contributed to these linked-to pages is inversely proportional to the total number of links on the linked-from page – the more links it has, the lower the PageRank allocated to each linked-to page.
   - A “damping factor” is applied to this process so that the total distributed PageRank is reduced by 15%. This is equivalent to the probability, described above, that the user will not visit any of the linked-to pages but will navigate to an unrelated website.
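   The random-surfer model described in these points can be sketched as a simple power iteration. This is an illustrative implementation of the classic published algorithm, not Google's production code, and the link graph passed in is a made-up example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a link graph.

    `links` maps each page to the list of pages it links to.
    A surfer follows a link with probability `damping` (0.85),
    otherwise jumps to a random page (the 15% case described above).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # every page gets the "random jump" share first
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # distribute this page's rank equally over its outbound links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page with no links: spread its rank over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank


# Tiny example graph: "a" receives the most inbound links,
# so it ends up with the highest rank; all ranks sum to 1.
ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

   Note how the code mirrors the three points above: the (1 - damping) / n term gives every page a small baseline rank, each page divides its contribution by the number of its outbound links, and only 85% of the rank is passed along at each step.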

   Let us now see how this PageRank process might influence the process of ranking search results. We say “might” because the pure PageRank algorithm just described has not been used in the Google algorithm for quite a while now. We will discuss a more current and sophisticated version shortly. There is nothing difficult about the PageRank influence – after the search engine finds a number of relevant documents (using internal text criteria), they can be sorted according to the PageRank since it would be logical to suppose that a document having a larger number of high-quality inbound links contains the most valuable information.

   Thus, the PageRank algorithm "pushes up" those documents that are most popular outside the search engine as well.

Google PageRank – practical use
   Currently, PageRank is not used directly in the Google algorithm. This is to be expected since pure PageRank characterizes only the number and the quality of inbound links to a site, but it completely ignores the text of links and the information content of referring pages. These factors are important in page ranking and they are taken into account in later versions of the algorithm. It is thought that the current Google ranking algorithm ranks pages according to thematic PageRank. In other words, it emphasizes the importance of links from pages with content related by similar topics or themes. The exact details of this algorithm are known only to Google developers.

   You can determine the PageRank value for any web page with the help of the Google ToolBar that shows a PageRank value within the range from 0 to 10. It should be noted that the Google ToolBar does not show the exact PageRank probability value, but the PageRank range a particular site is in. Each range (from 0 to 10) is defined according to a logarithmic scale.

   Here is an example: each page has a real PageRank value known only to Google. To derive a displayed PageRank range for their ToolBar, they use a logarithmic scale as shown in this table:

           Real PR             ToolBar PR
           1-10                    1
           10-100                  2
           100-1,000               3
           1,000-10,000            4
           etc.


   This shows that the PageRank ranges displayed on the Google ToolBar are not all equal. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.
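   Assuming the base-10 scale from the table above (the base Google actually uses is not published, so this is purely illustrative), the mapping from a hypothetical real PageRank to a ToolBar value can be written as:

```python
import math


def toolbar_pr(real_pr, base=10):
    """Map a hypothetical "real" PageRank onto a ToolBar-style 0-10 scale.

    Uses the base-10 logarithmic ranges from the table above; the actual
    scale Google uses is not published.
    """
    if real_pr < 1:
        return 0
    return min(10, int(math.log(real_pr, base)) + 1)
```

   On this scale, moving from ToolBar PR 1 to 2 requires roughly a tenfold increase in real PageRank, which is why each further step is so much harder than the last.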

   In practice, PageRank is mainly used for two purposes:

   1. Quick check of a site's popularity. PageRank does not give exact information about referring pages, but it allows you to quickly and easily get a feel for a site's popularity level and to follow trends that may result from your SEO work. You can use the following rule-of-thumb measures for English-language sites: PR 4-5 is typical for most sites of average popularity. PR 6 indicates a very popular site, while PR 7 is almost unreachable for a regular webmaster; you should congratulate yourself if you manage to achieve it. PR 8, 9 and 10 can only be achieved by the sites of large companies such as Microsoft, Google, etc. PageRank is also useful when exchanging links and in similar situations: you can compare the quality of the pages offered in the exchange with pages from your own site to decide whether the exchange should be accepted.

   2. Evaluation of the competitiveness level for a search query is a vital part of seo work. Although PageRank is not used directly in the ranking algorithms, it allows you to indirectly evaluate relative site competitiveness for a particular query. For example, if the search engine displays sites with PageRank 6-7 in the top search results, a site with PageRank 4 is not likely to get to the top of the results list using the same search query.

   It is important to recognize that the PageRank values displayed on the Google ToolBar are recalculated only occasionally (every few months) so the Google ToolBar displays somewhat outdated information. This means that the Google search engine tracks changes in inbound links much faster than these changes are reflected on the Google ToolBar.

Common SEO mistakes

Common seo mistakes
Graphic header
   Very often sites are designed with a graphic header. Often, we see an image of the company logo occupying the full page width. Do not do it! The upper part of a page is a very valuable place where you should insert your most important keywords for best SEO. With a graphic image, that prime position is wasted, since search engines cannot make use of images. Sometimes you may come across completely absurd situations: the header contains text information, but to make its appearance more attractive, it is rendered as an image. The text in it cannot be indexed by search engines and so it will not contribute toward the page ranking. If you must present a logo, the best way is a hybrid approach: place the graphic logo at the top of each page and size it so that it does not occupy the entire width, then use a text header to make up the rest of the width.

Graphic navigation menu
   The situation is similar to the previous one – internal links on your site should contain keywords, which will give an additional advantage in seo ranking. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If it is not possible to avoid using a graphic menu, at least remember to specify correct ALT attributes for all images.

Script navigation
   Sometimes scripts are used for site navigation. As an seo worker, you should understand that search engines cannot read or execute scripts. Thus, a link specified with the help of a script will not be available to the search engine, the search robot will not follow it and so parts of your site will not be indexed. If you use site navigation scripts then you must provide regular HTML duplicates to make them visible to everyone – your human visitors and the search robots.

Session identifier
   Some sites use session identifiers. This means that each visitor gets a unique parameter (&session_id=) when he or she arrives at the site. This ID is added to the address of each page visited on the site. Session IDs help site owners to collect useful statistics, including information about visitors' behavior. However, from the point of view of a search robot, a page with a new address is a brand new page. This means that, each time the search robot comes to such a site, it will get a new session identifier and will consider the pages as new ones whenever it visits them.

   Search engines do have algorithms for consolidating mirrors and pages with the same content. Sites with session IDs should, therefore, be recognized and indexed correctly. However, it is difficult to index such sites and sometimes they may be indexed incorrectly, which has an adverse effect on seo page ranking. If you are interested in seo for your site, I recommend that you avoid session identifiers if possible.
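   One defensive measure is to serve canonical addresses with the session parameter stripped (for example, in links exposed to search robots). Below is a minimal sketch using Python's standard library, with session_id as the parameter name from the example above:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse


def strip_session_id(url, param="session_id"):
    """Return the URL with the session-tracking query parameter removed,
    so every visit maps to one canonical address."""
    parts = urlparse(url)
    # keep every query parameter except the session identifier
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))
```

   The same idea applies regardless of language: a robot that always sees one stable URL per page has no duplicate addresses to consolidate.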

Redirects
   Redirects make site analysis more difficult for search robots, with resulting adverse effects on seo. Do not use redirects unless there is a clear reason for doing so.

Hidden text, a deceptive seo method
   The last two issues are not really mistakes but deliberate attempts to deceive search engines using illicit seo methods. Hidden text (when the text color coincides with the background color, for example) allows site owners to cram a page with their desired keywords without affecting page logic or visual layout. Such text is invisible to human visitors but will be seen by search robots. The use of such deceptive optimization methods may result in banning of the site. It could be excluded from the index (database) of the search engine.

One-pixel links, seo deception
   This is another deceptive seo technique. Search engines consider the use of tiny, almost invisible, graphic image links just one pixel wide and high as an attempt at deception, which may lead to a site ban.