In the beginning the Internet was but a confusing mass of documents. Finding information on the World Wide Web was nearly impossible until the first search engines came along. Early search engines were like rudimentary life-forms (remember Yahoo! Search and AltaVista?). They got the job done, but as more people gained access to the web, their limitations showed: they depended on the information content creators put in so-called meta tags (strings of text in a page's source code that described its content), which made them easy to abuse.
Into this fast-expanding yet messy space came Google. To cut a long story short, what Google did was organise information a lot better. Instead of depending on the text of a web page to determine the search results, it determined the page's influence using a variety of factors, including how many other pages linked to it and how important those linking pages themselves were. From those early days, Google made searching the Internet reliable, and over the past decade it has grown into the behemoth it is today.
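The link-based idea can be captured in a few lines of code. The sketch below is purely illustrative, not Google's production algorithm; the function name, damping factor and toy link graph are assumptions. Each page repeatedly passes a share of its score to the pages it links to, so pages that many others point to float to the top.

```python
# Illustrative sketch of link-based ranking in the spirit of PageRank.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, then receives shares from its in-links.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # simplification: pages with no out-links pass nothing on
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] = new_rank.get(target, 0.0) + share
        rank = new_rank
    return rank

# Toy web of three pages: "b", which both other pages link to, ends up ranked highest.
print(pagerank({"a": ["b"], "b": ["c"], "c": ["b"]}))
```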
But there is more to the Internet than Google, and more to search as well. Microsoft's CEO Steve Ballmer told BT five years ago that Google was just a flash in the pan as Microsoft readied its (then) latest search engine. Microsoft did not get very far then, but it laid the foundation for an effort that produced Bing, the search engine it launched last year. Bing uses a process similar to Google's 'PageRank' algorithm and gets a boost from technology tailored to surface relevant results. The result: it has garnered a one-fourth share of the search market in the United States.
Google is not standing still. "In the early days, much of the Internet was static; your 'web crawlers' would go out every few days, sometimes even weeks, to find information. Today, in the age of Twitter, the Internet happens in 'real time' and we have to reflect that," said Amit Singhal, a Google Fellow, at the company's recent Science of Search conference in Tokyo.
The way it delivers this is complex: new search technologies look for statistical patterns to determine the importance of one particular tweet over thousands of others, weighing signals such as the number of followers the person who wrote it has. Set that against the volume of tweets, 2.7 million every hour, and the brute force of the technology and servers that power it becomes apparent. That, and Google's focus on search, helps it stay top dog despite challenges in China (where Baidu is No. 1) and Japan (where Yahoo! is ahead in popularity).
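One can imagine a real-time scoring rule along these lines; the weights, field names and formula below are assumptions for illustration only, not Google's actual signals. The idea is simply that an author's reach counts, but a fresh tweet beats a stale one.

```python
import math
import time

# Hypothetical scoring rule for ranking tweets in real-time search:
# blend the author's reach with how recently the tweet was posted.
def tweet_score(followers, retweets, posted_at, now=None):
    now = now or time.time()
    age_hours = max((now - posted_at) / 3600.0, 0.01)
    reach = math.log1p(followers) + 2 * math.log1p(retweets)  # diminishing returns on big numbers
    freshness = 1.0 / age_hours                                # newer tweets score higher
    return reach * freshness

# A tweet from an account with 50,000 followers, retweeted 120 times, posted an hour ago.
print(tweet_score(followers=50_000, retweets=120, posted_at=time.time() - 3600))
```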
Google founders Larry Page and Sergey Brin have spoken of their fears of being ousted by a newcomer. That challenge could come from the likes of Wolfram Alpha, which queries a structured database for answers. So, ask it about the 16th President of the United States and it throws up not a link but a page of facts on Abraham Lincoln. The scope of the engine's results is limited, given it is still a work in progress. Example: a search for "World Cup" assumes it is a gene and returns a reference genetic sequence. Still, structured searches are being closely watched.
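The contrast with link-based search is that the answer is looked up in curated data rather than found on a web page. The toy lookup below is purely illustrative; the fact table and matching rule are assumptions, not Wolfram Alpha's design.

```python
# Toy "structured search": return facts from a curated table instead of links.
FACTS = {
    ("president of the united states", 16): {
        "name": "Abraham Lincoln",
        "term": "1861-1865",
        "birthplace": "Hodgenville, Kentucky",
    },
}

def structured_query(entity, ordinal):
    # Normalise the entity name and look up the (entity, ordinal) pair directly.
    return FACTS.get((entity.lower(), ordinal), "no structured data available")

print(structured_query("President of the United States", 16))
```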
Elsewhere, to catch up with Google, Microsoft and Yahoo! are touting new contextual search services that give you results based on the page you are on. For instance, if you are reading about Barack Obama's response to the BP oil spill in the Gulf of Mexico, highlighting "Obama" will not trigger a generic search for Obama but one for the US President and the oil spill. For now, this is a beta (test) service rolled out to a few users.
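A minimal sketch of the idea, assuming the query is built by combining the highlighted term with the most frequent words on the page; the stop-word list and "top three keywords" rule are illustrative choices, not how Microsoft or Yahoo! actually implement it.

```python
import re
from collections import Counter

# Sketch of contextual search: enrich the highlighted term with keywords
# drawn from the page the reader is currently on.
STOP_WORDS = {"the", "a", "of", "in", "and", "to", "on", "is", "for"}

def contextual_query(highlighted, page_text, keywords=3):
    words = [w for w in re.findall(r"[a-z]+", page_text.lower())
             if w not in STOP_WORDS and w != highlighted.lower()]
    top = [w for w, _ in Counter(words).most_common(keywords)]
    return " ".join([highlighted] + top)

page = "BP oil spill in the Gulf of Mexico: the spill and the federal response"
print(contextual_query("Obama", page))  # e.g. "Obama spill bp oil"
```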