Cherishing the happy moments, learning from the difficult ones, and looking forward to a successful 2012, we say goodbye to 2011!
May 2012 bring fun, happiness, prosperity and success to you and your family! Wishing you and your family a very happy year 2012!
Archive for the ‘Search’ Category
Recently, Google released another way to help you personalize your results: by blocking the sites you don't want to see. This is another angle Google has introduced around CEM (Customer Experience Management), putting more control in the hands of the end user.
Here is an extract from Google’s blog:
You’ve probably had the experience where you’ve clicked a result and it wasn’t quite what you were looking for. Many times you’ll head right back to Google. Perhaps the result just wasn’t quite right, but sometimes you may dislike the site in general, whether it’s offensive, pornographic or of generally low quality. For times like these, you’ll start seeing a new option to block particular domains from your future search results. Now when you click a result and then return to Google, you’ll find a new link next to “Cached” that reads “Block all example.com results.”
This feature makes the quality of your content all the more important, because once a user has blocked your site in search results it will be tough to win them back. On the face of it the feature sounds pretty exciting, but it may also hide quality content from users, and managing the list of blocked sites will sooner or later become a burden in itself, calling for tools of its own.
There has been a sudden burst of content on the Internet in the last few years, with every other person hosting a website or blog and contributing content. This has led to a lot of duplication, with the quality of content going downhill.
Internet 2010 in numbers
The Internet is all about content, and everything revolves around it, whether search, websites or anything else. Over the last year or so there has been growing awareness of content quality over quantity. In my assignments over the past year, I have seen a trend where companies put a lot of effort into defining quality content, including its meta information. Typical scenarios include engaging creative agencies, technical copywriters or content gurus to help define the right content. This has manifold advantages: better ranking with search engines, reader satisfaction, and a better bond with readers that encourages them to return to the site.
So, why not drive our blogs and sites towards quality content? The Internet has plenty of content and really needs quality around it.
In recent times there has been a lot of speculation around personalized and targeted content, whether we talk about search or websites. A lot of data gathering happens when we are on the Internet, and it helps serve the content we are interested in.
And, as always, Google is pretty good in this area. Recently, Google added a new dimension by launching Oscar Search Trends (http://oscartrends.appspot.com/), which provides an in-depth summary of search trends for last year's Oscars and a way to predict this year's Oscar winners.
From Google Blog:
“John Batelle once described search trends as “a massive database of desires, needs, wants and likes.” Looking at Insights for Search data, we were intrigued to find that this “database of intentions” shows consistent search patterns among Best Picture winners for the last three years. Each year, the winning film has shown an upward trend in search volume for at least four weeks, as well as highest regional interest from New York (The Hurt Locker, Slumdog Millionaire and No Country for Old Men).”
I can see such apps getting more and more popular in the times to come. What do you think?
Recently, ICANN (the Internet Corporation for Assigned Names and Numbers) approved the introduction of complete Internet domain names in non-Latin scripts (non-Latin characters in domain names).
Here is an extract from one of the press releases:
“Up to now, domain names had to use the 26 Latin letters in the English alphabet as well as 10 numerals and the hyphen. Technical efforts have enabled display of parts of Internet addresses in other scripts, but the two-letter suffixes had to be made up of those 37 characters. The approval for non-Latin characters applies for now only to domain names connected with the two-letter country codes, like .ru for Russia and .cn for China. Languages that could become available in 2010 for Internet-site names include Arabic, Mandarin Chinese, Cyrillic, Hebrew, Hindi and Korean. The so-called generic top-level-domain suffixes, like .com, .net and .gov, will remain Latin-characters-only for now.”
The above announcement could bring a number of changes to the current industry. Here are a few of them:
- Web Content Management: With new non-Latin domains getting registered every day, demand for WCM will increase like never before, with a focus on multilingual support. In my experience with WCM, I have not come across many implementations that support multiple languages, so this is going to be a challenge for both product vendors and system integrators. It is important to test current implementations with non-Latin languages, and this will become a must-have feature for any implementation going forward. The same holds good for e-commerce.
- Translators: The market for translators will grow and become more prominent. Content will become more localized to regions, and translators will come to the rescue in exposing such localized information to the outside world. There is a good chance that translation becomes one of the standard offerings of WCM products.
- Search: Local and regional search will start occupying the search space. This might lead to the emergence of many localized search engines, providing a stiff challenge to some of the bigger players in the industry today.
- Social Networking: With Web 2.0 and social networking being key in today's world, I can imagine local versions of Facebook, Twitter and the like emerging in the market. Localized social search is another area to watch.
- Migration tools: Emergence of tools that help migrate your existing site to a localized version.
The announcement will:
- Bring in new business opportunities and a whole new dimension for non-Latin countries
- Increase the number of Internet users exponentially
- Make the Internet the preferred medium of communication in non-Latin countries through localized email and social networking sites
- Grow the local market for system integrators in non-Latin countries many fold
- Add more challenges for big brands, as they will be pressured to register their current domain names in any number of non-Latin scripts to prevent fake sites
- Another potential issue highlighted is that some characters in non-Latin scripts look similar to Latin-alphabet characters. Substituting a non-Latin character for a similar-looking Latin character creates a unique URL, and with it the potential for site-spoofing
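The spoofing concern above can be seen directly in how internationalized domain names (IDNs) are encoded. A minimal sketch, using only Python's built-in IDNA codec: a lookalike domain containing a Cyrillic "а" (U+0430) encodes to a completely different "xn--" (punycode) domain than its Latin twin.

```python
def to_punycode(domain: str) -> str:
    """Encode a (possibly non-Latin) domain name into its ASCII form."""
    return domain.encode("idna").decode("ascii")

# The Latin "paypal.com" and a lookalike whose second letter is the
# Cyrillic "а" (U+0430) are different strings, so they resolve to
# entirely different registered domains.
latin = "paypal.com"
lookalike = "pаypal.com"  # second character is Cyrillic U+0430

print(to_punycode(latin))      # unchanged: paypal.com
print(to_punycode(lookalike))  # a distinct xn--... domain
```

The two names render almost identically in a browser address bar, which is exactly why brands may feel pressured to defensively register lookalike domains.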
What are your thoughts about it?
Google has added a new dimension to blog search: "Search your blog world". The new Blogger search, which is in the draft stage, uses Google AJAX Search powered by a Linked Custom Search Engine. It searches your own blog's content as well as anything you have linked to in your blog posts, including link lists and blogrolls. The search results follow the CSS rules of the blog.

This search widget can be configured in your blog's Template | Page Elements tab, in the "Add a Page Element" popup.
Some days back, Google launched a new feature on Google Translate where the search query is in one language and the results can come from web pages in another language.
From Google: How does this work?
1. Search for Dubai tours from English to Arabic.
2. We translate your query into “جولات دبي” and find Arabic web page results.
3. Finally, we translate the Arabic web page results back into English for you.
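The three steps above can be sketched as a simple pipeline. This is an illustrative toy, not Google's implementation: the dictionaries and the tiny "Arabic index" below are made-up stand-ins for the real translation service and search index.

```python
# Toy stand-ins for the translate and search steps (assumed data).
TOY_DICTIONARY = {"dubai tours": "جولات دبي"}
ARABIC_INDEX = {"جولات دبي": ["مثال: صفحة جولات دبي"]}
REVERSE_DICTIONARY = {"مثال: صفحة جولات دبي": "Example: Dubai tours page"}

def cross_language_search(query: str) -> list:
    # 1. Translate the query into the target language.
    translated_query = TOY_DICTIONARY.get(query.lower(), query)
    # 2. Search the foreign-language pages with the translated query.
    foreign_results = ARABIC_INDEX.get(translated_query, [])
    # 3. Translate the results back into the user's language.
    return [REVERSE_DICTIONARY.get(r, r) for r in foreign_results]

print(cross_language_search("Dubai tours"))
```

Each stage is independent, which is what lets the same flow work for any language pair the translation service supports.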
This will bring a whole new experience to the Internet world. It gives an opportunity to explore all the hidden information that was inaccessible before due to language barriers. I will expand this article later to cover how it will change the CMS world.
From what I understand of Web 2.0 so far, it emphasizes Search Engine Optimization as one of the main areas of concern. Let me try to list a few Search Engine Optimizations from a CMS implementation point of view:
A utility to detect duplicate content at the time of content publish
I do not think there is any WCM in the market (commercial or open source) that provides a utility for duplicate content detection.
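For illustration, here is one way such a publish-time check could work, sketched with word shingles and Jaccard similarity. This is an assumed approach, not a feature of any particular product; the `threshold` value is an arbitrary example.

```python
def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def is_duplicate(new_doc: str, published: list, threshold: float = 0.8) -> bool:
    """Flag the new document if it is too similar to anything published."""
    return any(similarity(new_doc, doc) >= threshold for doc in published)
```

A CMS could run `is_duplicate` in the publish workflow and warn the author before near-identical content goes live.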
A spell-checker to ensure that the content submitted online doesn’t have any spelling mistakes
This has become an integral part of WYSIWYG editors like FCKeditor, and most WCM tools adopt such content editors.
Content structure that enforces the alt attribute for image tags, titles, meta descriptions and keyword tags, with templates that pick up that information while rendering the page
This is more of a design and rendering consideration, so let's leave it to the individual design and implementation. It might be a good idea if, in future, a CMS provided something out of the box for such considerations.
What You See Is What You Get (WYSIWYG) editors that give content editors a facility to format text content
Most WCM products provide WYSIWYG editors with basic formatting such as h1, h2, b, etc. OpenCms has its own built-in editor.
An editor that can clean up unwanted HTML tags and make it W3C compliant
This is again part of the WYSIWYG editors.
An editor that gets integrated with your CSS of the site
Most text editors provide this facility. FCKeditor is one that supports CSS for better integration with the website. WCMs are adopting these editors for their content-editing front ends.
A utility to help detect duplicate page titles or a mechanism to generate unique page title
I cannot recall any CMS in the market that provides such a utility. At present it is handled at the individual implementation level rather than provided by the CMS out of the box.
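A sketch of how a CMS could handle this at the implementation level, assuming it keeps a set of already-used titles: detect the collision and generate a unique variant by appending a numeric suffix. The naming scheme here is an arbitrary example.

```python
def unique_title(proposed: str, existing: set) -> str:
    """Return `proposed` unchanged if unused, else append '-2', '-3', ..."""
    if proposed not in existing:
        return proposed
    n = 2
    while f"{proposed}-{n}" in existing:
        n += 1
    return f"{proposed}-{n}"

titles = {"About Us", "Contact"}
print(unique_title("Pricing", titles))   # Pricing
print(unique_title("About Us", titles))  # About Us-2
```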
A utility to detect broken links at the time of content publish
OpenCms is one product that does provide this functionality of validating links and reporting the broken ones, but I do not know many others that do. FatWire provides this in a slightly different way: if it doesn't find the link at rendering time, it removes the hyperlink and renders the content anyway. But I think most CMS products don't really provide this out of the box.
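Publish-time validation of internal links could look something like this sketch: extract the hrefs from the content and report any that do not resolve to a page the CMS knows about. The regex and the `known_pages` set are illustrative assumptions, not how OpenCms or FatWire actually implement it.

```python
import re

HREF_RE = re.compile(r'href="([^"]+)"')

def broken_links(html: str, known_pages: set) -> list:
    """Return internal links that do not resolve to a known page."""
    links = HREF_RE.findall(html)
    return [l for l in links if l.startswith("/") and l not in known_pages]

known = {"/index.html", "/about.html"}
content = '<a href="/about.html">About</a> <a href="/missing.html">Oops</a>'
print(broken_links(content, known))  # ['/missing.html']
```

Running this in the publish workflow lets the author fix the link before the page goes live, instead of silently dropping the hyperlink at render time.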
Looking at my list above, I feel most Search Engine Optimization techniques are taken care of at the text-editor level, and those editors have become powerful over time. From a CMS perspective, providing a few utilities like duplicate content detection and broken link checking would surely make them Web 2.0 compliant.
Just wondering how much the name of a product means. This struck me while reading about Search Engine Optimization, when I came across how a domain name plays a vital role in optimization (I will cover SEO in my next posts). Let's look at a few products from each space and try to understand what they really mean.
Let's start with the portals space:
Art Technology Group: a group that created art for the Internet world by providing a common face to all the applications within an organization.
BroadVision = enterprise-wide vision? Nicely framed, but their vision is really going haywire.
Vignette: a decorative design or small illustration used on the title page of a book or at the beginning or end of a chapter. Here, a decorative design for capturing your information from beginning to end, from all systems across the organization.
Liferay: a ray of life for all the vendors who cannot afford commercial portal products.
Anyone out there, help me with this one. I have no clue about it.