Archive for the ‘SEO’ Category

Quality Content

Friday, February 18th, 2011

There has been a sudden burst of content on the internet over the last few years, with every other person hosting a website or blog and contributing content. This has led to a lot of duplicated content, with the overall quality going downhill.

Internet 2009 in numbers

[Image: internet and website statistics for 2009]

Internet 2010 in numbers

[Image: internet statistics for 2010]

The internet is all about content, and everything, whether search, websites, or anything else, revolves around it. Over the last year or so there has been growing awareness of content quality over quantity. In my assignments over the past year, I have seen a trend of companies putting a lot of effort into defining quality content, including its meta information. A typical scenario is engaging creative agencies, technical copywriters, or content gurus to help define the right content. This has manifold advantages, including better ranking with search engines, reader satisfaction, and a better bond with readers that encourages them to return to the site.

So why not drive our blogs and sites towards quality content? The internet has lots of content and really needs quality around it.

Is Ajax making sites less reachable?

Thursday, May 17th, 2007

We have been talking about Ajax and Web 2.0 technologies for some time now. On one hand Ajax is becoming an asset from a usability point of view, but on the other hand it is making sites less search-engine friendly. Search engines, for now, are not intelligent enough to trigger the Ajax request to the server and retrieve the information to index. So it boils down to usability vs. search-ability. Which one to achieve?

What is the best way to achieve a balance between the two, making my site usable as well as search-able?

Here is one possible approach:

Step 1: Design your site without any Ajax or JavaScript.
Step 2: Then modify your website to add small Ajax/JavaScript components, making sure that any content they hide is also available elsewhere in the site in a form that does not rely on Ajax.
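The two steps above amount to what is now called progressive enhancement. A minimal sketch (hypothetical page and script, assuming a `/news.html` page exists) could look like this:

```html
<!-- Step 1: plain, crawlable markup. The link works without JavaScript,
     so a search engine can follow it and index /news.html normally. -->
<a id="news-link" href="/news.html">Latest news</a>
<div id="news"></div>

<script type="text/javascript">
// Step 2: layer Ajax on top for script-enabled browsers only.
document.getElementById('news-link').onclick = function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/news.html', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('news').innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
  return false; // suppress the full-page navigation when the script runs
};
</script>
```

A browser with JavaScript gets the Ajax experience; a crawler (or a browser without JavaScript) simply follows the plain link, so no content is lost to the search engine.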

Search Engine: Is Content Separation a new SEO?

Thursday, May 3rd, 2007

We are very familiar with the concept of separation of code from content. Yahoo recently tried to separate content itself into actual content and common content. What I mean is separating parts of a page that do not relate to the main content, such as navigation, menus repeated across the entire site, boilerplate text, or even advertising. Interesting!!!

Yahoo introduces a ‘robots-nocontent’ class which can be added to any HTML tag. By introducing this class within a tag, the idea is to:

• Focus on the main content
• Exclude the marked sections when matching the page to a search query
• Improve the abstract shown for the page by omitting unrelated info

How to use it?
<div class="robots-nocontent">Header of the site</div>
<span class="robots-nocontent">Navigation of the site</span>
<p class="robots-nocontent">Footer of the site</p>
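Putting the pieces together, a page using the class (with hypothetical content) might look like this, where only the article is left unmarked so that it alone drives ranking and the search abstract:

```html
<html>
<body>
  <!-- Sections the crawler should not treat as the page's content -->
  <div class="robots-nocontent">Header and site-wide banner</div>
  <span class="robots-nocontent">Home | Products | About | Contact</span>

  <!-- The main article carries no class, so it is the part
       used for indexing and for building the abstract -->
  <div>
    <h1>Article title</h1>
    <p>The actual article text the page should rank for...</p>
  </div>

  <p class="robots-nocontent">Copyright footer</p>
</body>
</html>
```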

My thoughts:
• Good way to segregate the main/relevant content from general content
• More focused content

• Why “robots-nocontent”, why not “robot-content”?
• Is defining a class the right approach? Most sites do not have well-formed HTML, so think of a situation where an unclosed tag makes the class cover the whole page content :)
• Currently, keyword density and keyword placement are among the prime SEO techniques. What will happen if websites start using this tagging?
• Does it not raise content-security concerns? Someone can place non-viewable content within these tags, the search engine will happily ignore it, and the site can still appear among the top results :)
• Needs to be standardized before people start working towards it

What are your thoughts?