Wednesday, July 15, 2009

AdSense Alternatives - SEO FIRM DELHI

Has anyone ever checked the alternatives to Google AdSense? That is, besides Google AdSense, is there anything else? Most online marketers try different alternatives to earn their online income, and running Google AdSense ads on their websites/blogs is one of the most preferred ways. However, there are other alternatives for monetizing blogs or sites besides Google AdSense.

The quickest way to earn an online income through a website is to find products, memberships, or services that pay recurring commissions.

The most popular way bloggers earn their income is by the contextual ad program from Google called AdSense. However, other alternatives are also available to increase this income.

Below are some other alternatives for monetizing a blog or website besides Google AdSense:

- WidgetBucks
- Chitika’s eMiniMalls
- Tribal Fusion
- Azoogle Ads
- Adbrite
- AdHearUs
- Kanoodle
- Clicksor
- BidVertiser
- FastClick
- ValueClickMedia

Also, the following affiliate programs can be tried out:

- ClickBank
- Commission Junction
- LinkShare
- Amazon

In conclusion, anyone who wants to increase online earnings through a website or blog can look into the resources mentioned above.

Monday, July 13, 2009

AdWords New Interface: Introducing Spreadsheet Editing

To make work on the AdWords interface easier and faster, Google has introduced a new feature in its new AdWords interface called “spreadsheet editing”. Advertisers can now make changes directly in the AdWords interface in the same way they used to through AdWords Editor. These changes include daily bid adjustments, creating keyword lists, adjusting destination URLs, etc.






Also, we can simply copy and paste work from another spreadsheet into the AdWords spreadsheet. This will save a lot of time, and advertisers will no longer have to go offline to use AdWords Editor.

Rosetta Stone Files Suit Against Google over AdWords (RST, GOOG)

Rosetta Stone, Inc., a leading company in language-learning and translation software, has sued Google Inc., seeking to prevent it from infringing on its trademarks. The suit alleges that Google allows third parties to purchase the right to use Rosetta Stone trademarks, or other confusingly similar terms, in Google’s AdWords advertising program.

Rosetta Stone filed the case because Google allows PPC advertisers to use trademarked terms in their ad text without owning the trademark and without permission from the trademark holder. Advertisers are allowed to use these terms within the text or title of paid advertisements.

We have encountered the same issue with one of our PPC clients, Manas Hosting. Trademark terms such as "GoDaddy" were being used as keyword triggers for paid advertisements. On similar grounds, GoDaddy filed a complaint with Google regarding Manas Hosting's usage of their trademarks.

Trademark and copyright policies need to be read thoroughly before such terms are used in a keyword list. We should also educate the client about these policies.

Thursday, July 9, 2009

Custom Search with Automatic transliteration

Typing in one language can be harder than in another. When we're searching for content in a specific language, it is often convenient to think in that language but type in another, e.g., English; that is, one can get search results in a language different from the language of the search term.

Google made this easier to do in Custom Search. They have enabled transliteration in Custom Search for a set of languages. This will make it easier to find news in Arabic, Indian news in Hindi, one's favourite Bollywood song lyrics, or local content in a bunch of other Indic languages - Kannada, Malayalam, Tamil and Telugu. Soon this feature will be enabled for other languages also.

Transliteration is available to use in your search box in the following languages:

· Arabic

· Hindi

· Kannada

· Malayalam

· Tamil

· Telugu

Wednesday, July 8, 2009

Search Pad Launched By Yahoo! - Personal Research Tool

Yahoo! has launched a new note-taking research tool called Search Pad. This tool will automatically detect research intent among people using Yahoo! search.

Once the tool detects this intent, it prompts searchers with an invitation to use Search Pad. The tool supports drag and drop and also lets users write free-form notes.

The tool has rolled out to several countries. I personally haven't received a Search Pad prompt yet, but I am looking forward to it. Here are the countries Search Pad has rolled out to:

  • United States

  • Canada

  • Australia

  • New Zealand

  • Singapore

  • Malaysia

  • Philippines

  • United Kingdom

  • France

  • Spain

  • Italy

  • Germany

  • Brazil

  • Mexico

  • Argentina

Source: http://ashutoshsachan.blogspot.com

Thursday, June 4, 2009

Google Rankings Falling - Check This Out >> SEO FIRM DELHI

Google's new -50 penalty: is it true?

Many webmasters have observed an anomaly in Google's results that might be an indicator of a new Google penalty. We will try to figure out which websites may have been penalized and what factors Google's algorithm might be taking into consideration.

The issue under consideration
Many webmasters have reported that their websites tumbled from top positions on Google down to position 50 and below.
The problem is that the usual metrics show the websites are fine. For example, a website that was two years old, with the same number of inbound links and the same Google PageRank, got penalized; likewise, a website older than ten years also suffered.

Reasons for the penalty

Several theories come up as we dig deeper:

1. Spam techniques on alias websites
This theory states that a website could have suffered the penalty because the webmaster was involved in spammy methods on another of his domains.
Even if your websites are not linked to each other, Google can still work out that both have the same owner; WHOIS information and Google accounts are two ready ways to find this out.

2. Linking to spammy sites
You might not even be aware that you have linked to a spammy site, but it can happen. Widgets, counters and other third-party plug-ins might contain links to other sites; once this third-party code gets into your website, your website also ends up linked to the spammy sites.
Your server might also have been hacked, leaving a link on your website that you don't want there. Stay focused and aware.

3. Paid links from .edu domains
The use of paid links from .edu domains may also be an important theory to discuss here; these links can appear on many hacked .edu servers.
Getting paid links from other sites might also be the origin of the problem. Be warned: your competitors might harm your site by purchasing paid links to it from such .edu pages, resulting in a drop in your website's rankings.

Should alarms be raised?

It's likely that Google is working on its paid-link filters and that paid links from .edu domains and other sites are the culprits.

Keep away from paid links. Google doesn't like them, and chances are your website will run into problems if you use them. It's better to focus on high-quality organic links.

Wednesday, May 27, 2009

Google For Advertisers- SEO FIRM DELHI

Google has announced the launch of Google for Advertisers, a place where information about the broad range of Google’s marketing solutions can be found. Advertisers can also explore and discover the combination of tools that best meets their own objectives.

Following are some ways to get more from the website:

1. Learn more about media platforms: it gives straightforward descriptions of all of Google's platforms (like search, TV, the Content Network or mobile) and the various supporting tools. Anyone can learn “how to reach the audience in relevant and useful ways across devices, locations and languages”.

2. Marketing strategies for different business domains: here Google demonstrates the solutions in the context of “how they can be applied across all the stages of creating an effective advertising campaign”. This will help advertisers drive better ROI through their online marketing campaigns.

3. Marketing Examples. Marketing Examples on “How Google created wealth for businesses through Online Marketing Programs”.

4. Build a personal 'toolkit': advertisers can browse the site, find Google tools according to their interests, and add them to their online toolkit.

PPC Services Delhi, India: Generate Quality Traffic & Leads


PPC SERVICES NEW DELHI INDIA

With the growth of the Internet, every advertiser is looking to make use of online advertising to reach targeted customers. There are many forms of online marketing, among which PPC is the best option for advertisers: good traffic can be earned for their products and services by reaching the targeted audience in a hassle-free manner. Several online marketing companies provide PPC services in the Delhi NCR region of India to make their clients' websites rank at the top of the search engines.

Pay-per-click ads can be found on the right-hand side of search result pages, listed under the featured or sponsored listings. Hiring an online PPC service for PPC management is beneficial for an advertiser because these companies apply all the optimization techniques of PPC advertising to bring an ad to the top of a search engine. Payment needs to be made only when someone clicks on the PPC ad.

Major search engines like Google, Yahoo, and MSN provide the best platforms for online marketing companies to place their PPC ads. Advertisers looking for an online marketer to provide PPC management should consider Indian PPC services; they are effective, targeted and generate more leads.

PPC is mainly the placement of an ad in the sponsored section of a search engine for a specific set of keywords. The campaign succeeds once users are attracted to click on the keyword-targeted ads.

If you want to know more about Pay-Per-Click services in India, contact at ondemand.seo@gmail.com.


Thursday, May 21, 2009

Enhanced Search Query Performance Report - SEO FIRM DELHI

Google's search query report has now been made more comprehensive and detailed, to analyze a campaign’s performance more effectively.

To analyze the search queries that trigger ads on Google, Search Query Performance Report is the best way.

Until yesterday, the Search Query Performance report grouped some traffic under a line item called "other unique queries." This line encompassed queries with very low volume, where the ads were triggered only once or twice. However, some advertisers found that a significant portion of their spend was grouped under this heading, which made it difficult to manage keyword variations.

As stated by Google:

“From today, the SQR will show all queries that resulted in a click, where the referrer URL was not specifically blocked by the user. In other words, this includes all queries that are logged in the server logs or in a tool like Google Analytics. In requiring that the referrer URL be present, we are upholding our commitment to user privacy.”

This update will result in longer lists of queries in the Search Query Performance reports, though most of them will have very low traffic. This data can be analyzed and decisions on keyword variations made accordingly.

Google also made an announcement that:
”This update makes us excited and we believe that it will help make the Search Query Performance report even more useful. To learn more about running Search Query Performance reports, you can visit our AdWords help center.”

# Data Source: .google.com/support/bin/answer.py?hl=en&answer=68034

Wednesday, May 20, 2009

SEO Basic Facts 1: SEO FIRM DELHI

Search Engine Marketing, or Search Marketing, is a form of Online Marketing which comprises only two categories:
  • Organic Listings
  • Paid or Sponsored Listings

Other forms of marketing, such as viral, affiliate, email, and pro blogging, come under Online Marketing.

___________________________________________________

Search engines are software tools that search through their database to return results relevant to the query being made, according to their algorithms. A search engine has three parts:
• Algorithm: The algorithm takes various factors into consideration in determining the results.
• Crawler: It crawls the web to list the URLs of websites and takes a snapshot of each URL.
• Indexer: It indexes the URLs into appropriate categories.
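As a rough illustration of the crawler's link-listing step described above, here is a minimal sketch in Python using only the standard library; the sample HTML and URLs are invented for the example:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

# A real crawler would fetch pages over HTTP; here we feed static HTML.
page = '<a href="/about.html">About</a> <a href="http://example.org/">Out</a>'
extractor = LinkExtractor("http://www.example.com/")
extractor.feed(page)
print(extractor.links)
```

A full crawler repeats this over each extracted URL, storing a snapshot of every fetched page for the indexer.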

____________________________________________________

  1. In India, Google has its data center in Mumbai under VSNL supervision. At present, Google's data centers number approximately 40 (C-blocks).
  2. Total Yahoo data centers: 16 active.
  3. INKTOMI was the only search engine that had its own database but did not show results to users directly, providing its results to Yahoo instead. Yahoo has since taken over Inktomi.
  4. CUIL (pronounced "cool") has the biggest database, at over a trillion pages.
  5. WolframAlpha: a new search engine on the market, still in its testing phase, but with good potential.
____________________________________________________

Wednesday, May 13, 2009

Static URLs Vs Dynamic URLs- SEO FIRM DELHI

One can understand that updating pages with static URLs can be time-consuming, especially if the amount of information grows quickly, since every single page has to be hard-coded. This is why designers suggest making the URLs dynamic.

Also, I agree that search engines can crawl dynamic URLs as well as they can crawl static ones. However, static URLs still offer various advantages, as listed below:

  • Higher click-through rates in the SERPs, as users can easily read the static urls and draw the conclusion.
  • Higher keyword prominence and relevancy.
  • Easier to copy, paste and share on or offline.
  • Easy to remember and thus, usable in branding and offline media.
  • Creates an accurate expectation from users of what they're about to see on the page.
  • Can be made to contain good anchor text to help the page rank higher when linked-to directly in URL format
  • All 3 of the major search engines (and plenty of minor engines) generally handle static URLs more easily than dynamic ones, particularly if there are multiple parameters.

In addition, there are certain disadvantages of dynamic URLs that I would also like to share with you:

  • Lower click-through rate in the search results, on forums/blogs where they're pasted.
  • A greater chance of cutting off the end of the URL, resulting in a 404 or other error when copying/pasting.
  • Lower keyword relevance and keyword prominence.
  • Challenging (if not impossible) to manually remember
  • Does not typically create an accurate expectation of what the user will see prior to reaching the page

Moreover, if we change all the static URLs into dynamic ones, then a 301 redirect has to be in place for all the changed URLs. There is a chance of missing the 301 redirect on some of the changed URLs, which may result in the loss of the value that has been built up so far.
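Whenever URLs change form, each old URL needs a 301 to its replacement. As a hedged sketch (the file name, `id` parameter and URL pattern here are hypothetical, not from the original post), an .htaccess rule mapping a dynamic URL to a static one could look like this:

```apache
Options +FollowSymlinks
RewriteEngine on
# Permanently (301) redirect e.g. /product.php?id=15 to /product-15.html;
# the trailing "?" in the target drops the original query string.
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$ [NC]
RewriteRule ^product\.php$ /product-%1.html? [R=301,L]
```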

Monday, April 20, 2009

Geo Targeting - Different Aspects of Google & Yahoo

On locally targeting same-language content across various domains: Yahoo does not filter duplicate content out of its results when the same content is found on multiple ccTLD domains. Google behaves the same way; having the same content on, for example, seo-firm-delhi.com, seo-firm-delhi.co.uk and seo-firm-delhi.com.au is perfectly acceptable and shouldn't trigger removal for duplicate content (assuming those sites are properly targeting their individual geographic regions). The exceptions are potentially spammy or manipulative sites. Using content across shared-language domains in this fashion is perfectly acceptable.

However, to redirect the alias site, the following code can be used. Place it in the .htaccess file:

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^seo-firm-delhi\.co\.uk [NC]
RewriteRule ^(.*)$ http://www.seo-firm-delhi.com/$1 [R=301,L]

Sunday, March 15, 2009

PPC platform of MSN - SEO FIRM DELHI

We learned that Google has not been the pioneer of all the features available to advertisers running PPC campaigns. Below are the points that were covered:

1. When users sign up for MSN, they fill out their age, gender, income, education, and location. This allows MSN to create a tool where advertisers can target specific groups. For example, if your product’s target audience is women 25 to 35 years of age in Chicago, then you can advertise only to them. You can even bid differently for the various demographic criteria (for example, different bids for men and women, or different bids for the different age groups). (In response, Google added demographic targeting to Site Targeting (CPM) ads.)

2. MSN gives you access to this data so you can research the market for keywords. You can enter a keyword and see who searches for it. This lets you test your keywords before you buy advertising. You can then apply this information to your other PPC campaigns, such as Google and Overture. You can also enter a website URL and it will give you a list of keywords and traffic for those words & you can also upload lists of keywords.

3. You can select the gender and age groups: 18-25, 25-35, 35-50, 50-65, and 65+. For each group, you can set the bids.

4. You can also see the geographical location of those searches.

5. Microsoft’s PPC introduced day parting as a standard tool. This allowed users to set the time for ads, such as only during business hours. Google followed MSN here also.

6. MSN adCenter uses Vickrey bidding, similar to Google AdWords. This means you’ll pay only 1¢ more than your competitor’s bid.

7. The adCenter account lets you add additional projects within it, each with their own credit card. You can set up accounts for each department within your company, or if you’re an ad agency, you can set up accounts for clients, each with their own credit card.
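The second-price rule from point 6 can be sketched in Python as below. This is a deliberate simplification (real ad auctions also weigh factors such as quality score), and the one-cent increment follows the description above:

```python
def winning_price(bids):
    """Second-price (Vickrey-style) rule: the highest bidder wins,
    but pays one cent more than the next-highest bid."""
    if len(bids) < 2:
        raise ValueError("need at least two competing bids")
    ordered = sorted(bids, reverse=True)
    return round(ordered[1] + 0.01, 2)

# A bidder offering $1.50 against rivals at $0.75 and $0.40 pays $0.76.
print(winning_price([1.50, 0.75, 0.40]))
```

The design point is that your maximum bid caps what you can pay, but the actual price is set by the competition just below you.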

Google allows further relaxation for advertising gambling websites

Google has confirmed that gambling affiliates and comparison websites will now be able to bid on terms such as bingo, casino, poker, etc.

In fact, this update has been in place since January 15th.

This news means that gambling-related websites that are not registered with the Gaming Association are now free to bid. Back in November 2008, Google decided to allow registered advertisers to start bidding; this latest move means you'll likely see an influx of advertisers competing for online gamblers as well.

This is further evidence that Google is keen to extract as much revenue from this lucrative channel as possible.

Saturday, March 7, 2009

SEO Quiz - SEO FIRM DELHI

[Answers are marked in green]

Question 1

What does the term “Sandbox” mean in SEO?

a. The box with paid ads that appear when you perform a search.

b. The first 10 search results for a particular keyword.

c. This is where sites are kept till they get mature enough to be included in the top rankings for a particular keyword.

d. A special category of sites that are listed in kid-safe searches.


Question 2

When was the Big Daddy Google update completed?

a. 2006.

b. 2005.

c. 2004.


Question 3

You plan to launch a new site. Does it make sense to register the domain early, when you are not ready with the site itself? Check all that apply.

a. No, this is a waste of money. I will pay for the domain, when I am ready with the site.

b. Yes, early registration helps to have a mature site sooner and I will have to wait less till I get out of the sandbox.

c. Yes, by registering early, the site will get older sooner and since the age of the domain is important, this will help.

d. Yes, early registration will help me keep competitive without spending much on advertising.

Question 4

What is meant by “Google bombing”?

a. Submitting the site again and again in order to include it in Google's index.

b. Multiple sites linking to the same site, with the same anchor text in order to get high ranking for the keyword in the anchor text.

c. Extensively using the word “Google” on your pages in order to get high rankings with Google.

Question 5

Which of the following techniques is best for dealing with duplicate content?

a. Rewriting the title and the headings.

b. Rearranging the placement of paragraphs.

c. Changing the directory in which a file resides or renaming it in order to make the URL different.

d. Using synonyms in each sentence.

e. Re-wording the text.


Question 6

You have just launched a new site. Unfortunately, nobody visits it, even search engines' spiders don't notice it. What can you do for its SEO success? Check all that apply.

a. Get some fancy fonts for the titles and headings.

b. Add gorgeous Flash movies to the site.

c. Submit the URL of the site to search engines and search directories.

d. Get some free links from the greatest hacker sites because backlinks always boost rankings.

Question 7

Will you use an ordinary html sitemap with Google?

a. Any sitemap is OK to use it with Google.

b. No, I will not because it is duplicate content.

c. I'd rather use the special XML format that Google uses for sitemaps.


Question 8

Why are metatags important? Check all that apply.

a. Because search engines still use them for estimating search relevancy.

b. Because if you leave metatags empty search engines will not index your site.

c. Because keyword-rich metatags lead to top positions in search engines.

d. Because it is a professional approach to Web design to have complete and accurate metatags.

e. Metatags are not important at all today.

Question 9

What will the search string “link:www.webconfs.com” show?

a. The list of backlinks from other sites on the Web to webconfs.com.

b. The list of backlinks from other sites on the Web to webconfs.com that are available in the index of the particular search engine.

c. The home page of webconfs.com.

d. The anchor text of the backlinks that link to the webconfs.com site.


Thursday, March 5, 2009

Factors responsible for Non-indexing of a Web Page

  • Spam penalties. If you've been caught violating the search engines' terms of service (Spamming), they'll drastically scale back the pages in the index until you beg for reinclusion.

  • Hidden links. If the navigation on your site is hidden within JavaScript, Flash, or other non-HTML methods, the search engine spiders are unlikely to be able to follow it.

  • Dynamic URLs. If your URLs are excessively long, or have many parameters, or contain ID or session parameters, the search engines might elect not to index them.

  • Incorrect robots.txt file. Your robots.txt file tells the search spider which pages to include and exclude from the crawling--if you've coded the file incorrectly, you might be excluding lots of pages you meant to include.

  • Incorrect robots tagging. Just like the robots.txt file, a robots metatag tells the spider to include or exclude an individual page--you might be telling the spider to exclude the page by mistake.

  • Poor quality pages. If your page is excessively long, contains HTML coding errors, or uses frames, it's unlikely to be indexed correctly.

  • Improper redirects. If your page uses a meta refresh or JavaScript redirect, spiders ignore it and don't index the page.

  • User interaction required. If your page launches a pop-up window, or demands that a form be filled out, spiders won't be able to comply.
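To illustrate the robots.txt point above, here is a minimal sketch (the /private/ path is a made-up example); a mistake in either of these places can silently exclude pages you meant to include:

```text
# robots.txt (served from the site root): allow everything except one directory
User-agent: *
Disallow: /private/
```

And on an individual page, the robots metatag controls that page alone, for example `<meta name="robots" content="index, nofollow">` to index the page but not follow its links.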

Specific Use of Twitter, StumbleUpon & LinkedIn to Improve Traffic + Pros & Cons of the 3 Main Streams of Online Marketing (SEO/SMO/PPC)

Brief About These Sites:

Twitter: Twitter is a microblogging platform that takes blogging to a new level. The idea behind Twitter is to post two-liners answering "What are you doing?". Since it is a microblogging platform, it lets you stay connected anywhere and everywhere: a person can post messages from a desktop or from a mobile device. It enables people to subscribe to message feeds, which helps them receive updates anywhere.

StumbleUpon: StumbleUpon is a bookmarking-cum-social-networking site that enables you to make friends and join communities. Its unique feature is that it lets you bookmark a site and share a web page with friends, which becomes an instant source of traffic.

LinkedIn: LinkedIn is a professional social networking site that enables people from different professions to come together and share information. It can become an important source of business contacts, which can help in getting business while also generating traffic.

These sites can be an important source of relevant traffic, since the people on your friends list are people who share the same interests.

Unlike PPC, where you pay for each and every click, SMO provides free, instant and relevant traffic. It is a time-intensive task, as building communities takes time and you have to be an expert in your field to provide unique content that is useful to users. Once that is done, SMO is a useful source of traffic.

In SEO there is a fixed number of searches for a particular keyword, beyond which a site cannot be optimized further, so traffic becomes saturated. In Social Media Optimization, by contrast, the potential is huge, as you can make an unlimited number of contacts.

PPC campaigns are costly and basically focus on conversions and instant traffic, whereas SEO is a long-term, time-consuming approach that provides free but limited traffic. SMO is initially a time-intensive task but provides free, instant traffic and is also used for branding.

Orkut Integrates YouTube and Google Videos- SEO FIRM DELHI

Google has now integrated YouTube and Google Video into Orkut by creating a new video page where users can share all of their favorite videos. Users do not need to embed code into the page; they can just copy and paste the link, and the video will be added to the list. If you like a video on a friend's list, you can add it to your own by hitting the "Add to my favorites" button.

After the video has been added to your list, you can also change its title and description. Unlike MySpace Videos and Bebo TV, Orkut is missing a community page that lists the most popular videos added by users.

Positive Off-Page factors-- SEO FIRM DELHI

  1. Total incoming links, which play a role in enhancing PR
  2. Anchor text of the incoming links from other sites
  3. Age of the incoming links: the older, the better
  4. Popularity of the referring page
  5. Number of outgoing links on the referring page
  6. Keyword density on the referring page
  7. Site age: an old site shows stability
  8. Google's directory comes straight from the DMOZ directory, so you should try to get into DMOZ.

Negative On-Page Factors - SEO FIRM DELHI

1. Text represented graphically is invisible to search engines.

2. Excessive keyword repetition (keyword stuffing) may get you the over-optimization penalty (OOP). Overuse of H1 tags has been mentioned, as has meta-tag stuffing.

3. Don't link to link farms; also, don't forget to periodically check the Google status of everyone you link to. A site may go "bad", and you can end up being penalized.

4. Targeting too many unrelated keywords on a page detracts from theming and reduces the importance of your really important keywords.

5. Most search engine spiders can't read Flash content. Provide an HTML alternative, or experience lower SERP positioning.

6. Never use hidden text.

7. Do not create doorway/gateway pages.

8. Do not use phrases that have been associated and correlated with known spamming techniques, or you will be penalized.

Wednesday, March 4, 2009

Google Adwords In News Search- SEO FIRM DELHI

Google has announced that it will start displaying AdWords ads in Google News search. The ads are initially being shown to US visitors, and this is likely to be extended to other geographic locations soon. It is rather surprising that Google, which has been highly innovative in finding new ways to expand the online space for its ads, did not utilize one of the most obvious spaces within its own network. For now, those in the US can start seeing ads in news search too. The relevance and effectiveness of these ads, however, will be determined by the users. As of now, Google will be delivering text ads as in the regular search pages.

Because of the diversity of the news results and the nature of the content that is listed, we need to wait and see how AdWords ads perform on news search pages. When people search on the news search page, 90% of the time they are not looking for a service or a product; they are more likely looking for useful information, and placing ads on such pages can deplete advertisers' budgets with clicks that do not produce any conversions. As we all know, traffic for traffic's sake is no good to any website; only if the conversion level increases will it be useful.

We can easily understand Google's hesitation in implementing AdWords ads on news results pages, and we will have to wait to see whether any webmasters protest this decision. Even Google is not spared by the current economic trends; it has to strive hard to stay profitable and to make the best use of all its online properties and unutilized spaces to maximize profit.

From what has been observed so far, the ads on news search pages are not highly effective in terms of their relevance to the news items that are listed. Google is experimenting with various ad formats to see what will work best for the news search page.

In Google's own words, from Josh Cohen, Business Product Manager: "In recent months we've been experimenting with a variety of different formats, like overlay ads on embedded videos from partners like the AP. We've always said that we'd unveil these changes when we could offer a good experience for our users, publishers and advertisers alike, and we'll continue to look at ways to deliver ads that are relevant for users and good for publishers, too."

If Google continues to operate with the best interests of its advertisers in mind, it will certainly need to find effective ways of displaying ads on news search pages to ensure that advertisers' budgets are not depleted.

SEO Tools- Important for SE optimisation By SEO FIRM DELHI

As discussed earlier in this series, SEO has moved beyond being a mechanical process to being something that is far more social and subjective. While some tools focus on misinforming the user into thinking that there is a secret keyword density ratio to target (or some other bunk), the truth is you need to build organic links to help make your site compete. No amount of raw analysis and number crunching is going to make good links automatically appear.

Tools do have some value. They can help you save time while researching:
- how strong competing sites (and pages) are based on their link profile and site age
- how much search traffic each site gets (based on value and volume)
- where a page ranks in the search results
- what the on page optimization for a page looks like
- the relative values of different keywords

A recent SEO tool sales letter exclaimed, "Long gone are the days of clunky, standalone research tools." And in writing that, it was correct. Rather than paying to download some clunky desktop software, we thought you should be able to have access to the best SEO tools at your fingertips, free of charge.

 

Tuesday, March 3, 2009

Best ON-Page SEO Practices - SEO FIRM DELHI

There are "over 200 SEO factors" that Google uses to rank pages in the Google search results (SERPs).

Here is the speculation - educated guesses by SEO webmasters on top webmaster forums. Should you wish to achieve a high ranking, the various confirmed and suspected on-page optimization practices are listed below.

Best ON-Page SEO Practices:

- Keyword in URL: First word is best, second is second best, etc.
- Keyword in domain name: Same as in page-name-with-hyphens.
- Keyword in Title tag: Important keyword close to the beginning. The Title tag should be approximately 60 - 90 characters.
- Keyword in Description tag: Shows the theme; approximately 200 characters, with important keywords close to the beginning.
- Keyword in Keyword tag: Shows the theme; approximately 200 characters. Every word in this tag MUST appear somewhere in the body text; if not, it can be penalized for irrelevance. No single word should be repeated, since that might be considered spam. Google purportedly no longer uses this tag, but other engines do.
- Keyword density in body text: 3 - 7% (all keywords/total words).
- Keyword in H1, H2 and H3: Use keywords in header tags wherever possible.
- Keyword font size: "Strong is treated the same as bold, italic is treated the same as emphasis" - Matt Cutts, July 2006.
- Keyword phrase order: Does word order on the page match word order in the query? Try to anticipate the query and match its word order.
- Keyword prominence (how early in the page/tag): Can be important at the top of the page, in bold and in a large font.
- Keyword in alt text: Should describe the graphic. Do NOT fill it with spam.
- Keyword in links to site pages (anchor text): Use keywords as anchor text when linking site pages.
- All internal links valid: Validate all links to all pages on the site.
- Efficient, tree-like structure: Try for two clicks to any page; no page deeper than four clicks.
- Intra-site linking: Appropriate links between lower-level pages.
- Linking to external pages: Google patent - link only to good sites, and do not link to link farms. BE CAREFUL: links can and do go bad, resulting in site demotion, so you must devote the time necessary to police your outgoing links - they are your responsibility.
- Outgoing link anchor text: Google patent - should be on topic and descriptive.
- Fewer than 100 outgoing links in total: Google says to limit links to 100 per page.
- Domain name extension (Top Level Domain, TLD): .gov sites seem to have the highest status; .edu and .org sites also seem to be given high status; .com encompasses most of the spam/crud sites, resulting in the highest scrutiny and action from Google.
- File size: Try not to exceed 100K page size; smaller files (under 40K) are preferred.
- Hyphens in URL: The preferred method for indicating a space where there can be no actual space. One or two are excellent for separating keywords (i.e., pet-smart, pets-mart); four or more start to look spammy; ten mark you as a spammer for sure, and demotion is probable.
- Freshness of pages: Google patent - changes over time; the newer the better if you are in news, retail or auctions. Google likes fresh pages.
- Freshness (amount of content change): New pages, and the ratio of old pages to new pages.
- Freshness of links: Google patent - may be good or bad; excellent for high-trust sites, but perhaps not so good for newer, low-trust sites.
- Frequency of updates: Frequent updates = frequent spidering = a newer cache.
- Page theming: A clearly exhibited page theme carries greater weight than general consistency.
- Keyword stemming: Use natural variations. Example - stem, stems, stemmed, stemmer, stemming, stemmist, stemification.
- URL length: Keep it short - use somewhat less than the 2,000 characters IE allows; less than 100 is good, and less still is even better.
- Site size: Google likes big sites. Larger sites are presumed to be better funded, better organized, better constructed, and therefore better. This has resulted in the advent of machine-generated 10,000-page spam sites - size for the sake of size - and Google has caught on, dumping millions of pages or making them supplemental.
- Site age: Google patent - old is best; old is golden.
- Age of page vs. age of site: Newer pages on an older site will get faster recognition.
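The keyword-density figure above (all keywords divided by total words) is easy to compute yourself. A minimal Python sketch, assuming the page text has already been stripped of HTML; the function name and sample text are illustrative only:

```python
def keyword_density(text, keyword):
    """Percentage of total words that are the keyword (all keywords / total words)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

body = "seo tools help seo research because seo moves fast"
density = keyword_density(body, "seo")
print(round(density, 1))  # 3 hits out of 9 words -> 33.3
```

On a real page, with far more text, the 3 - 7% range from the table above is the figure to aim for.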


Make search your own: SEO FIRM DELHI

Have you ever wanted to mark up Google search results? Maybe you're an avid hiker and the trail map site you always go to is in the 4th or 5th position and you want to move it to the top. Or perhaps it's not there at all and you'd like to add it. Or maybe you'd like to add some notes about what you found on that site and why you thought it was useful. With SearchWiki you can do all this and tailor Google search results to best meet your needs.

SearchWiki provides a way for you to customize search by re-ranking, deleting, adding, and commenting on search results. With just a single click you can move the results you like to the top or add a new site. You can also write notes attached to a particular site and remove results that you don't feel belong. These modifications will be shown to you every time you do the same search in the future. SearchWiki is available to signed-in Google users; Google stores your changes in your Google Account. If you are wondering whether you are signed in, you can always check whether your username appears in the upper right-hand side of the page.

The changes you make only affect your own searches. But SearchWiki also is a great way to share your insights with other searchers. You can see how the community has collectively edited the search results by clicking on the "See all notes for this SearchWiki" link.




This new feature is an example of how search is becoming increasingly dynamic, giving people tools that make search even more useful to them in their daily lives.


Monday, March 2, 2009

5 Tips to Effective SEO Keyword Research Analysis- SEO FIRM DELHI

Keyword research and analysis, when done correctly, can be a daunting task, and expert keyword research is the foundation of a successful SEO campaign. Many new website owners think the keyword research process is easy: that free tools, such as the Overture Search Term Suggestion Tool, are a profit pill that will bring instant results.

Unfortunately, the free tools will only give you a rough guide and a quick indication whether a hunch is worth further research. These free keyword research tools are limited to basic information. When performed correctly, expert keyword research exposes so much more - all the gems that are tucked away deep.

Real keyword research requires research AND analysis. There are so many aspects to the process that cannot be left to chance. Attempting to do the keyword research on your own is like going to a veterinarian to fix your car.

Following are 5 tips for effective keyword research analysis:

1. Latent Semantic Indexing (LSI) - Use multi-word phrases
Latent Semantic Indexing (LSI) is a vital element in Search Engine Optimization (SEO) for better keyword rankings in search results. LSI is based on the relationship, the “clustering” or positioning, the variations of terms and the iterations of your keyword phrases.

Knowing LSI well - how it can be most useful for your SEO, and the importance it has in the algorithm updates at search engines like Google, MSN and Yahoo - will benefit your keyword research and best-practice SEO.

LSI is NOT new. Those doing keyword research over the years have always known to use synonyms and "long tail" keyword terms, which is a simpler way of explaining LSI. More often than not, these long-tail, less generic terms bring more traffic to your site than the main keyword phrases. The real bottom line is that Latent Semantic Indexing is currently a MUST in keyword research and SEO.

2. Page Specific Keyword Research - Target your niche keyword phrases for each site page
Probably the most common mistake in keyword research is using a plethora of keywords and pasting the same meta keyword tag on every page of the site. This is SO not effective! Your keyword research needs to be page specific, focusing on only 2 to 5 keywords per page. It's more work, but combined with best-practice SEO it gives each page a chance for higher ranking on its own.

3. Country Specific Keyword Research and Search Engine Reference
Keep in mind that keyword search terms can be country specific. Even though a country is English speaking, there are different keyword terms you must research - and then reference that country’s search engine when doing your initial keyword research. For instance, UK and Australia may have different expressions, terminology and spellings (i.e. colour, personalised). Referencing the terms in the corresponding search engine is an important element to keyword research that is often forgotten. So for example, be sure to check the search terms on google.co.uk or au.yahoo.com. And, of course, if you have 3 to 4 really comprehensive research tools in your arsenal, you will be able to search for historical, global and country specific search terms easily and effectively.

4. Keyword Analysis - Cross referencing in the search engines
Once the majority of the keyword research has been done for a site page, it’s time to plug those terms into the search engines to determine:

* Whether it is really the desired niche keyword for that page
* The competitiveness of your keywords, along with the strength of the competition
* Whether the other sites listed for your keywords are truly your competitors
* Whether the sites listed for your keyword are even related to your industry, products or services

These critical analyses of keyword phrases are often forgotten. Since the keyword research and analysis is the foundation of a successful SEO campaign, you certainly don’t want to build your on-page optimization on the wrong niche keywords!

5. Ongoing Keyword Research - Repeat your keyword research on a consistent basis
While you may think that you have completed your keyword research analysis and laid a solid foundation for your SEO, you need to keep monitoring your keywords and tweak as necessary. Keywords can change from month to month as keyword search terms change, genres change and/or if your niche is within social portal networking sites - to name just a few. Maintaining ongoing keyword research is essential for best practice SEO.

Most Successful Strategy to Streamline Your Keyword Research Efforts:

Yes, many website owners will opt to do the keyword research and analysis themselves with only a marginal effect on an SEO campaign. It’s not the most successful strategy to use for the most effective results.

To be certain of your keyword data, accurate keyword analysis should be performed - and cross referenced - across multiple expert keyword tools.

Effective keyword research lays the ground work for effective SEO results and can help you kick-start the ranking process - perhaps even giving you a step up on your competitors.

The most successful strategy to streamline your keyword research efforts is to hire an expert. Focus your business efforts on your strengths and expertise and allow the SEO experts to effectively perform the keyword research analysis correctly.

Is Organic Traffic Better Than PPC? - SEO FIRM DELHI

Probably nothing gets a good flame war going between SEO experts and PPC advocates faster than the quality and quantity of traffic from these two sources. We have worked on both sides of this issue, and we tell clients all the time that they have to compete on both sides of the search engine results page.

Sunday, March 1, 2009

50 SEO FAQS SEO FIRM DELHI

50 FAQs




1.)Google/Yahoo/MSN Reported Links to My Site - Why so different?

Google, Yahoo and MSN all supply estimates of inbound links and then try to find as many as they can in response to the query. They do not guarantee, or even try, to return every link they have found to a specific page.

___________________________________________________________________________________________________



2.)How come I get a different amount of links when I use www.example.com
and example.com?

In the search engines' eyes these are two different domains, and therefore they have different numbers of sites linking to them. You should 301 redirect one version to the other. If your site is hosted on a Linux server, you can do the following:

Open your .htaccess file (in the root directory of your site); if you don't see one, create a .htaccess file. Then add the following:

Code:

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]


Of course you will need to change domain.com in both instances to your domain name.
Make sure if you are creating the .htaccess file you save it as .htaccess or it will not work.
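When editing .htaccess is not an option, the same canonicalization logic can live in application code. A minimal Python sketch of what the rewrite rule does, assuming the bare domain should 301 to the www version; the function name is illustrative:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_www(url, domain="domain.com"):
    """Mirror the .htaccess rule: bare-domain requests get a 301 to www."""
    parts = urlsplit(url)
    if parts.netloc.lower() == domain:
        target = urlunsplit(
            ("http", "www." + domain, parts.path, parts.query, parts.fragment)
        )
        return 301, target   # permanent redirect to the www version
    return 200, url          # already canonical, serve normally

print(canonical_www("http://domain.com/page?x=1"))
# (301, 'http://www.domain.com/page?x=1')
```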

___________________________________________________________________________________________________



3.)What's a Link Farm?

A set of web pages that have been built for the sole purpose of increasing the number of incoming links to a web site. This is done in order to increase link popularity and search engine rankings. Link farms usually require a reciprocal link from sites seeking listings. Link farms are a known spam tactic and sites that participate in them are likely to be penalized or banned from the major search engines.

Please keep in mind that links from these sites do not harm you; what harms your site is you linking OUT to these sites, NOT them linking in. However, these types of links will not benefit you either, so don't waste your time with them.

___________________________________________________________________________________________________


4.) What is "Anchor Text"?

Anchor text is the text that you use within an HTML link.


The anchor text that you use should reflect the keywords for which you are trying to get a higher listing. It should also accurately describe what information will be found on the page the link leads to.

Your anchor text should also be varied: do not use the same anchor text for all the links you control. If you use a variety of anchor text, your overall link strength will be stronger and you will rank for many more phrases than if you focused on only one keyword.


___________________________________________________________________________________________________


5.) What is an "IBL"?

An "IBL" is an 'In Bound Link'. When a site links to yours, that link is called an IBL. The same thing as a BL (Backlink).

Backlinks play a major role in ranking a site in the search engines, specifically Google and MSN. Google also puts a great deal of value on the quality of a link: where is it placed, and is the site related to yours? A link on a content-related page is worth a whole lot more than a link on a links page.

___________________________________________________________________________________________________


6.) What is Reciprocal linking or link exchange?

Link exchanging, or reciprocal linking, is where two sites exchange links. Usually this occurs when a webmaster emails another webmaster requesting a link on their site. Because of how easy it is to get a link this way, Google has greatly devalued these types of links. They are still worth something, but not as much as they were in the past.

Some tips if you do exchange links...
-Only exchange links with related sites; don't reciprocate with sites that are irrelevant, even if they have a high PageRank.

-Don't reciprocate with sites that link to anything; only exchange links with sites that are particular about who they link to.

-Don't reciprocate with sites which have categories for every industry or hobby ever invented. Again, keep it relevant.

-Don't reciprocate with sites whose link pages contain hundreds of links.

-Before reciprocating, view the source of the page (CTRL + U in Firefox | View > Source in IE) and check that NONE of the links on the page contain a nofollow attribute. A quick way to check is to use the find tool and search for "nofollow".

-Also go to http://www.domainnameofsite.com/robots.txt to see whether they have excluded their links pages. And do a search in Google for cache:domain.com/linkspage to see if the page your link will be placed on is indexed.

Those are tips you definitely should follow if you do decide to exchange links.
___________________________________________________________________________________________________


7.) What is Linkbait?


Linkbait is when you create a tool, write an article (or many articles), or otherwise do something that works as a link magnet. Website owners start linking to your pages without even being asked; many times the sites linking to you are authorities in your industry, so a good linkbait can have an extremely large positive effect on your rankings.

How do I create a linkbait?
-Talk about a hot subject - preferably before anyone else has written extensively on it.

-Take a recent event in your niche and write an extensive article detailing what it is about, what is good and bad about it etc.

-Be controversial - Perhaps one of the best ways to create a link bait.

-Be contrary... take a point that most, if not all, of the experts in your niche agree on and disagree with it. Preferably provide evidence that you are right. If you do, it should create quite a buzz.
Some good tips can be found here:
http://www.seobook.com/archives/001936.shtml


___________________________________________________________________________________________________


8.) What is cross linking?

When multiple sites link to each other for the purpose of increasing link popularity.
___________________________________________________________________________________________________


9.) How long after I am listed in DMOZ should I start to gain the search engine benefit?


The days of DMOZ boosting search engine rankings are over. Submit and forget, but don't expect any search engine value from the listing even if you do get it. If you can get in, that's great, but if not, don't sweat it.


___________________________________________________________________________________________________


10.) Would I be penalized if a "Link Farm" or a "Bad Neighborhood" linked to my site?


No. You have no control over who links to you, and this would otherwise be an easy way for a competitor to sink your site. What you SHOULD be careful of is linking to them: this CAN hurt your site and should be avoided.

Don't be afraid to link out to sites that are related to yours but never link to a site because they guarantee search engine value if you do.



Search Engine Optimization


11.) How do I use a Robots file or do you have an example of one?

You mean robots.txt. It's a file in the main directory of a website which tells the search engine spiders where they are allowed to go.

A basic, spider-can-go-anywhere file is:


User-agent: *

Disallow:

You can disallow folders or files by putting this into the file:



User-agent: *

Disallow: /foldername/
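Python's standard library can verify rules like these before you deploy them. A small sketch using urllib.robotparser, assuming the file above was fetched as text (the example.com URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /foldername/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())  # feed the robots.txt lines to the parser

print(rp.can_fetch("*", "http://www.example.com/page.html"))     # True
print(rp.can_fetch("*", "http://www.example.com/foldername/x"))  # False
```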

__________________________________________________________________________________________________

12.) Should I use a Site Map? and does anyone have an example of one?

Every site should have a sitemap of some sort. It provides a way for visitors to navigate your site, as well as a shortcut for bots to crawl all your pages. Google for "sitemap" and you'll find nice examples, although it would be advisable to follow Google's official guidelines and keep the number of links per page under 100.
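For the XML sitemap that bots read (as opposed to the visitor-facing page), the standard library is enough. A minimal sketch following the sitemaps.org format, assuming a short list of page URLs; optional tags such as lastmod and priority are omitted for brevity:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-style XML document from a list of URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
print(sitemap)
```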


13.) How do I tell how many people are searching for my keywords?

The Search Term Suggestion Tool will show you the number of searches per month on Overture's network. You can also use Digital Point's Keyword Suggestion Tool.

___________________________________________________________________________________________________

14.) What are SERPs?

Search Engine Result Pages

___________________________________________________________________________________________________

15.) I am number 5 in Yahoo, but not in the top 1,000 in Google. Why?

Yahoo and Google use two different algos to come up with their SERPs: Yahoo favors on-page optimization, such as H1 tags and keyword density, while Google favors off-page factors like inbound links. If you have done more on-page work, Yahoo will rank you higher, and vice versa. If you are now addressing your off-page factors, you need more backlinks with anchor text, plus time for those links to age (the Google lag).

______________________________________________________________________________

16.) Is there a way I can find out where I rank?

There are two sites you can use if you are just checking a few keywords: www.googlerankings.com for Google and www.yahoosearchrankings.com/ for Yahoo. If you are looking to track multiple keywords over time, the best (and free) tool is Digital Point's Keyword Monitor at
http://www.digitalpoint.com/tools/keywords (Google Only)

___________________________________________________________________________________________________

17.) What is an "ALGO" ?

An 'algo' is an abbreviation of 'algorithm' (in short, the mathematical formula or calculation that Google uses in order to rank websites).

___________________________________________________________________________________________________

18.) What is Hidden Text?

Text that is visible to search engine spiders but not to site visitors. Hidden text is primarily used to add extra keywords to a page without actually adding content, often on a site that mostly consists of images. Most search engines will penalize websites that use hidden text.

___________________________________________________________________________________________________

19.) What is cloaking?

Determining which search engine spider is visiting a web page and then giving each spider a page optimized for its particular algorithm. This method of search engine optimization can result in a website being removed from the major search engines.

___________________________________________________________________________________________________

20.) What is a doorway page?

A method of search engine optimization considered to be SPAM. These are pages created for nothing but the purpose of ranking first for a particular keyword phrase. These pages usually lack any significant content and do not reflect the tone of the rest of the website. If a search engine determines a website has used this tactic then the website will be removed from the search engine permanently.

___________________________________________________________________________________________________

21.) What is Geo Targeting?

The distribution of ads to a particular geographical area. For example, you can use a place name in your keyword, such as "Texas Web Design". Some search engines allow you to target specific countries – and languages – without using keyword relevance.

___________________________________________________________________________________________________

22.) Do XHTML pages rank better than HTML-pages?

No, they have the same value. Google does, however, put more weight on well-structured content, and thus XHTML is in most cases the more suitable option.

___________________________________________________________________________________________________

23.) Does using CSS affect my rankings?

Not directly. Googlebot does not (at the moment) read CSS files. Using CSS makes pages lighter and page structure more organized, so using CSS could lead to better rankings. When adding CSS styles to a page, it is recommended to put them in an external file (using the link tag).

___________________________________________________________________________________________________

24.)Should I use submission software to make sure all the search engines have
me included?

Most submission programs that promise to "Submit Your Site to 1000's of Search Engines" are used to harvest your email address for spammers. The truth is there are only a handful of Search Engines used by enough people to justify a submission and these can all be submitted by hand. Even better yet is to get other sites to link to you. Search Engine spiders follow these links and you will be indexed naturally. Submission programs are unnecessary and may actually hurt you.

___________________________________________________________________________________________________

25.) I submitted to DMOZ but didn't hear anything. Should I resubmit?

You should only submit once, as resubmitting deletes your original request and lengthens your wait. Resubmitting too many times can get you classified as SPAM and blocked. You can also submit to a "normal" category and a "regional" one if it fits; you must be physically located somewhere to be able to submit to regional, and read the guidelines, as certain site types like real estate are exceptions. DMOZ has set up a forum as a way of letting you know your status, and it is a much under-posted and under-used resource - the address is http://www.resource-zone.com/forum/ . Just remember to take the time to follow the posted directions.

___________________________________________________________________________________________________

26.) What is URL canonicalization?

For most websites, the www and non-www versions both resolve to the same content but can be browsed independently, especially if internal links are used without the full domain name.

Similarly it's often possible to view both yoursite.com/ and yoursite.com/index.html

The following URLs could contain different content, but in most cases it is the same:

  • www.yoursite.com/

  • yoursite.com/

  • www.yoursite.com/index.html

  • yoursite.com/index.html

  • yoursite.com/index.php



This gives rise to URL canonicalization, whereby Google has to decide which URL best represents the page from the possible options. Usually the version with the most backlinks (internal and external) is displayed in the search results, so for most webmasters the problem remains hidden.

The most significant disadvantage of canonical URL problems is that any link juice is split between the several possible options, so the page never reaches its full ranking potential.
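One way to see the duplication is to normalize each variant to a single canonical form. A rough Python sketch, assuming the www prefix and index files are the duplicates to fold together (real canonicalization may involve more rules, such as query-string handling):

```python
from urllib.parse import urlsplit

def canonicalize(url):
    """Fold the common duplicate forms (www prefix, index files) into one URL."""
    parts = urlsplit(url if "//" in url else "http://" + url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[len("www."):]
    path = parts.path or "/"
    for index in ("/index.html", "/index.php"):
        if path.endswith(index):
            path = path[: -len(index)] + "/"
    return host + path

variants = [
    "www.yoursite.com/",
    "yoursite.com/",
    "www.yoursite.com/index.html",
    "yoursite.com/index.html",
    "yoursite.com/index.php",
]
print({canonicalize(v) for v in variants})  # collapses to a single form
```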

___________________________________________________________________________________________________



Google General

27.) What is the "Sandbox Theory"?

The sandbox theory proposes that Google has a unique ranking element in its algorithm that affects sites whose major SEO efforts began after March 2004 (this would include all sites registered after that date). The criteria, elements and solution of the 'sandbox' have not yet been identified or publicized by the SEO community, but ongoing discussion and testing suggest that the two most likely issues creating the sandbox are:



  • Age of backlinks - older backlinks now carry greater weight at Google

  • SEO specific filters - Google is actively attempting to filter sites out of their index that are actively optimizing (link-building, etc) or appear to have un-natural attributes (optimized pages, links, anchor text, etc.)


If your site is ranking well for specific keyword phrases at engines like Yahoo!, Teoma and MSN, and for the allin sets of searches at Google (allinanchor:, allintitle:, allintext:, etc.), but is not listed in the top 50-100 results at Google, the sandbox effect may be at work. However, keep in mind that this phenomenon is an 'unknown' piece of Google's algorithm, and standard optimization tactics, even from some of the best and most experienced SEOs, have yet to consistently beat this effect.


___________________________________________________________________________________________________

28.) When does Google update their SERPs (When is the "dance")?

There is no longer a Google "dance" per se as SERPs tend to be updated constantly.

___________________________________________________________________________________________________

29.) I now use the link: command all the time, is there a list of the
other commands I can use in Google?

link:www.domain.com (returns a list of backlinks that google has indexed. Only a small selection is listed however.)

related:www.domain.com (returns a list of sites that google sees as related to the topic on your page. At present, not very accurate.)

allinurl:keyword (returns a list of pages and sites that contain the 'keyword' in their url.)

site:www.domain.com (returns all pages of the domain that google has crawled and indexed.)

allinanchor:keyword (returns a list of pages and sites that contain the keyword as anchor text in their backlinks.)

cache:www.domain.com (will show the current cache that google has for the page)

info:www.domain.com (will return information that google has for the page)

allintitle:keyword (returns webpages that have the specified keywords in the title)

intitle:keyword (returns webpages that have the specified keyword in their title.)



You can find all the Operators at Google Advanced Operators

___________________________________________________________________________________________________

30.) Am I banned in Google?

You can tell whether you are banned or not indexed in Google by using the http://www.selfseo.com/google_ban_tool.php site, and also by using the 'site:' command in Google's search.

(ie. site:www.domain.com)

__________________________________________________________________________________________________

31.) What is the first thing you would check for if you saw your site slip down the listings at Google? ( i.e. I'm not on top anymore, what happened? )

A drop in backlinks. When you are not aggressively acquiring new backlinks from day to day, existing backlinks can deteriorate in value: the referring site may disappear, be changed, lose relevance or lose PR itself, reducing the PR that is passed on to you.

___________________________________________________________________________________________________

32.) When I look up my site in Google, I am #23, but when my friend in California looks it up I am #31. Which one is right?

They both are! A more accurate explanation is that Google uses multiple data centers to deliver the results you see, and which data center is used depends on your geographical location. It is very common to see a small difference in results depending on where you are - sometimes results can differ when you move just a few miles!

___________________________________________________________________________________________________

33.) Why does Google hate me/pick on me?

"Google" does not pick on individual sites but instead allows complex mathematical equations to decide the value of your site in any particular search. Although it may sometimes feel like they are picking on you, most times a ban or bad results can be traced to a small piece of bad or misguided SEO.

___________________________________________________________________________________________________

34.) Should I submit my website to Google every month?

No, you only need to do it once. In most cases you will not even need to submit your website to Google. If you are active in building links to your website then Googlebot will frequent your site soon, and thus include it in the index.

___________________________________________________________________________________________________


35.) My site is not cached in Google. I've submitted it to Google numerous times, but I'm not seeing Googlebot crawl my entire site, and I'm seeing an incomplete listing.

Sometimes Googlebot will visit and leave, only to come back at a later time to crawl all of your pages. If a site has no cache, or an incomplete listing, it has not been crawled and indexed properly yet. The best thing you can do is get more inbound links to your site, so that Googlebot arrives at your site from more than one source.





Page Rank



36.) I am changing my domain but want to keep my PR, how do I do it?

A 301 (permanent) redirect is the most efficient and spider/visitor-friendly strategy around for websites hosted on servers running Apache (check with your hosting service if you aren't sure). It's not hard to implement, and it should preserve your search engine rankings for the redirected pages. If you *have* to change file names or move pages around, it's the safest option. If you are unfamiliar with editing your .htaccess file, ask your hosting company for help.
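As a minimal sketch (assuming Apache with mod_rewrite enabled, and using placeholder domain names), a site-wide 301 in the old domain's .htaccess might look like this:

```apache
# .htaccess on the OLD domain: permanently redirect every request
# to the same path on the new domain (hostnames here are hypothetical)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

The `R=301` flag is what tells spiders the move is permanent, so they can transfer rankings to the new URLs.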


__________________________________________________________________________________________________

37.) I have a new site but it has no PR. I have backlinks, etc. When does it update?

PageRank is recalculated internally every day or so (every time Google's spider follows a link through to your site), so you may see a difference in the SERPs fairly soon. However, the backlinks will not appear in a link:www.domain.com command search until Google does an official backlink update.

___________________________________________________________________________________________________

38.) I noticed my PR in the Google directory is different from my PR on the toolbar, Why?

This is because the Google Directory gets its listings from DMOZ. As with Google's PR and backlink updates, the directory is only refreshed every so often, so your site may be showing an old PR.

There are also a host of "theories", such as "Google's Directory is on a scale of 0 to 8" and "the Directory PR is actually a comparison between you and the other sites in the Directory". Whatever you believe, the important point is that the two values rarely match, and this is VERY common.


___________________________________________________________________________________________________

39.) When does Google update backlinks and PR?

There isn't a set schedule, and updates are often skipped or late. Backlinks are updated around once a month on average, while PR seems to have moved to quarterly. The best way to "guess" when the next update will be is to use the SEOChat Calendar.



40.) How do I remove a Google penalty?

Here are a few important points to remember about Google penalties:

* Manual penalties are usually related to off-site over-optimization, i.e. aggressive link-building campaigns.
* Algorithmic filtering usually takes place with on-site over-optimization.
* A manual penalty may only be lifted in slow steps.
* An algorithmic penalty is often removed automatically; it can also disappear for no apparent reason.

Also remember: if a competitor reports your site for violating Google's guidelines, you may undergo a manual review.



41.) How do I avoid a Google penalty?

1. Make sure that your website does not link to any spam, gambling, adult, banned, or unrelated websites.

2. Check whether your server's IP address has been blacklisted. If it has and you cannot get it removed, move to a different server.

3. Don't stuff keywords into your content and meta tags; use them only where they are needed.

4. Don't create excessive interlinks between pages of the same website. Extensive interlinking of all your websites, particularly if they are on the same C-class IP address, can affect all of them.

5. Avoid hidden text or links.

6. Do not participate in any link farm. If you want to generate revenue through advertising, go ahead, but add the nofollow attribute to those links.

7. Excessive reciprocal link exchange will get you in trouble; exchange links only when they are relevant and useful.

8. Build one-way links to your website slowly, and vary the anchor text; don't use the same keywords for all backlinks.

If you are sure that you haven't violated any of Google's guidelines, request reconsideration of your site – see this link for more details:

http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35843
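Point 6 above mentions the nofollow attribute. A minimal example (with a hypothetical advertiser URL) of marking a paid link so that it passes no PageRank:

```html
<!-- rel="nofollow" tells search engines not to count this link
     as an editorial endorsement -->
<a href="http://advertiser.example.com/" rel="nofollow">Sponsor site</a>
```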

___________________________________________________________________________________________________



42.) What is Google Base?



Google has introduced a new product named Google Base. It was live only for a short while and was then taken down, probably for a later launch. We predict it will be another great product from Google.

This is what Google says about Google Base:

Google Base is Google's database into which you can add all types of content. We host your content and make it searchable online for free. Examples of items you can find in Google Base: a description of your party-planning services, articles on current events from your website, a listing of your used car for sale, a database of protein structures.

And this is what Google says about the product on the official Google blog:

You may have seen stories today reporting on a new product that we're testing, and speculating about our plans. Here's what's really going on. We are testing a new way for content owners to submit their content to Google, which we hope will complement existing methods such as our web crawl and Google Sitemaps.





43.) Do Google's and Yahoo's regional search engines use a different algorithm than the primary .com search engines?

The answer is of course yes. Geotargeted search engines need to operate on a different algorithm than the .com ones in order to identify potential regional links and regional domains. Various factors affect geotargeted rankings: where the domain is hosted, backlinks, the domain name itself, visitor tracking, the trust network among geotargeted domains, topical directories, and so on.
So if you want to rank in regional results, be prepared to do something different; don't expect the same work on a .com domain and a regional-extension domain to earn two different rankings.



44.) What is viral marketing?

Viral marketing is a marketing technique that encourages people to pass along a marketing message willingly. Viral promotions come in the form of video clips, interactive Flash games, ebooks, branded software, images, or text messages. In its basic form, viral marketing is not particularly sustainable.



SEO Advanced Topics



45.) What is Mod Rewrite and when should I use it?

Mod Rewrite (mod_rewrite) is an Apache module, usually configured through .htaccess files, which allows for all sorts of nifty tricks with URLs, error pages, etc.

It is most commonly used to change dynamic links, such as:

http://www.sample.com/company.php?cat=4&id=131

into search engine friendly static-looking URLs, such as:

http://www.sample.com/company/4/131.html

While the dynamic link won't get indexed easily (and likely not at all without external sites linking directly to that URL), the static link will get indexed with simple internal linking. Mod Rewrite is a very powerful tool for creating static-looking links for content management systems, forums, and the like.
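A sketch of the rule behind the example above, assuming Apache's mod_rewrite is enabled and that company.php takes the cat and id parameters shown in the dynamic URL:

```apache
# .htaccess: serve /company/4/131.html from company.php?cat=4&id=131
# Visitors and spiders see only the clean URL; the script still runs.
RewriteEngine On
RewriteRule ^company/([0-9]+)/([0-9]+)\.html$ company.php?cat=$1&id=$2 [L]
```

Note that the rule actually maps the clean URL back to the dynamic script internally; you then link to the clean URLs throughout your site so that those are what get indexed.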



46.) What is the difference between IP delivery and cloaking?

IP delivery means serving results based on the visitor's IP address. Cloaking means showing different pages to search engines than to users.

IP delivery includes things like "users from Britain get sent to the .co.uk site, users from France get sent to the .fr site". This is fine; even Google does it. It becomes cloaking when you do something special or out of the ordinary for Googlebot. For example, cloaking would be "if the visitor is Googlebot, send it to our Google-only optimized text pages."

So IP delivery is fine, but don't do anything special for Googlebot. Just treat it like a typical user visiting the site.
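The distinction can be sketched in a few lines of Python (the country-to-host mapping and hostnames are hypothetical, not Google's actual logic): the key property is that the routing decision depends only on the visitor's location, never on whether the visitor is a bot.

```python
# Hypothetical mapping from country code to regional host.
REGIONAL_HOSTS = {
    "GB": "www.example.co.uk",
    "FR": "www.example.fr",
}

def regional_host(country_code, user_agent):
    """Pick the host for a visitor based on country.

    The user agent is deliberately ignored: Googlebot is routed exactly
    like any other visitor from the same place, which is what keeps this
    IP delivery rather than cloaking.
    """
    return REGIONAL_HOSTS.get(country_code, "www.example.com")

# A human and Googlebot from the same country get identical treatment.
print(regional_host("GB", "Mozilla/5.0"))
print(regional_host("GB", "Googlebot/2.1"))
```

The moment a branch like `if "Googlebot" in user_agent:` appears in code like this, you have crossed into cloaking territory.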



47.) What are potential reasons for sites crawled but not indexed?

Many of us have noticed that sometimes search engines keep visiting a site regularly but neither index it nor show it for the site: command.

There are various reasons for this:

1. The domain is an expired domain. If a domain expires and is not re-registered for a certain period of time, Google imposes an expired-domain penalty: the domain is left to suffer for a number of months, during which Googlebot keeps visiting the site but does not index it or show it for the site: command.

2. The domain is a new domain. With new domains, the crawler sometimes visits the site regularly, yet the site doesn't show up in the index for a long time. There is nothing wrong with this; Google's index is probably just taking longer to expand, and you simply have to wait until Google updates it.

3. The site has been banned from the search engines for a particular on-page factor. In that case the search engines periodically check whether the on-page spam tactic has been removed, and as soon as they see it is gone they may re-include the site in the index. So if your site was previously indexed and listed in Google but suddenly disappeared from the index while Googlebot keeps visiting, take a good look at your on-page work for spam tactics like hidden text, cloaking, or keyword stuffing.

4. The site has been permanently banned from the search engines. Even then, crawlers visit the site by following existing links to it, but they don't index it because it is banned. This is common with Slurp, Yahoo's robot, which is known to keep visiting banned sites without indexing them.





48.) How can I remove content from Google?

If you have access to the site, then use this tool

http://services.google.com/urlconsole/controller

Note that this tool does not actually delete the URL from the index; it only removes it from the results for 180 days, after which the URL may return to the index if it is still found (it will continue to be crawled during this time). You will need to create an account to use the tool. To remove the page or pages permanently, you must either make sure the page returns a 404 status code, or block it via the robots.txt file or the robots meta tag of the page.
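For the blocking route, a minimal robots.txt sketch (the /private/ path is a hypothetical example) looks like this:

```
# robots.txt at the site root: stop compliant crawlers from
# fetching anything under /private/
User-agent: *
Disallow: /private/
```

Alternatively, to de-index a single page while leaving it crawlable, you would place `<meta name="robots" content="noindex">` in that page's `<head>` section.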





49.) Do search engines index flash?

Flash is not good for search engines. Search engines usually have difficulty parsing the complex code inside Flash files, although Google has recently been reported to follow links from Flash .SWF files; we have seen Google read text within a Flash file and follow links from it.

Yahoo, however, does not read Flash, and neither do MSN, Gigablast, and many other search engines. Your best bet is to avoid Flash-only sites if you are planning to do search engine optimization.

Flash has always been a hindrance for search engines, so it is better to avoid designing entire sites with it.



50.) Which is the joint venture by search engines for sitemap submission?

Google, Yahoo and MSN jointly support one sitemap protocol (http://www.sitemaps.org/), so you can prepare a single sitemap of your site's pages that all the major search engines understand.
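A minimal sitemap in the sitemaps.org format (the URL and dates are placeholders; save the file as sitemap.xml in your site root):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

One `<url>` entry per page; `<lastmod>` and `<changefreq>` are optional hints to the crawlers.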





Sources:

  1. http://www.searchenginegenie.com

  2. http://forums.searchenginewatch.com

  3. http://www.seobook.com

  4. http://websearch.about.com

  5. http://www.highrankings.com


