
FEATURE 

Search – The Evolution Continues

The growth of vertical search, social media and mobile computing has forced search engines to adapt to stay relevant. As the search engines have changed their processes, writes Amanda Watlington, publishers have had to rethink theirs too, or risk sliding down the rankings or being de-ranked altogether.

By Amanda Watlington

The continuing evolution of search depends on how the technology companies that develop and own search technology respond to the changing habits of users and the new modalities those users employ to interact with information on the web. In the good old days of just a few years ago, the interaction was simple. A somewhat naive user placed a query in a public search engine, such as Yahoo! or Google, and received back a list of textual links that the search engine deemed relevant to the query. The user’s task was to scan the pre-selected list and find the source that provided the desired information. The activity itself has stayed the same: search is still a quest, a hunt for information. But the users and their tools are evolving. The incredibly rapid growth in mobile computing is changing how and when searchers interact with the web, and searchable information is now constantly at the fingertips of increasingly sophisticated and interconnected users, who expect their favourite information sources to be always available on their medium of choice. For publishers and their search marketers, this evolution presents challenges and opens many new opportunities.

The Evolution of Trust

In the early days of the web, searchers expected and relied upon search engines to deliver them the best sites. Extensive research has shown that searchers often regarded the sites with top listings on a search engine as, for whatever reason, the “best” sources, whether for information or for commerce. A top placement on a search engine added a lustre to the site, a halo of credibility. Some of this halo of credibility still persists, but searchers and search engines alike have come a long way in developing a more sophisticated approach to evaluating whether a search engine listing meets their information needs and whether the resource should be trusted.

The searcher, confronted with a near blizzard of spam and many top-rated bad links that lead to almost worthless information, has been forced to consider other sources and other signals of value and trustworthiness. If search engines are to remain the point of departure for the search quest, they must provide a credible list of results. Offering poor-quality results is bad business for a search engine, which can ill afford to lose the searcher’s esteem as a credible resource. Searchers are fickle and will migrate to other search properties that provide better, more credible results. A few short years ago, Yahoo! enjoyed the overwhelming share of the search query market, only to be rapidly eclipsed by Google. When searchers move on to the next best thing, advertising revenues inevitably decline, and these are the lifeblood of the search business. Although the development of search technology and new search engines appears to have slowed, there are still significant business threats to the large public search engines.

Today, the public search engine is just one starting place. Searchers have expanded their horizons to include many more vertical sources of information. Amazon.com, for example, is in effect a search engine for books, and the large travel sites are search engines for travel information. YouTube is a giant video search engine, and Flickr is a searchable photo archive. As searchers develop a familiarity with these sites, they develop loyalty to them, based on their trust that the sites will continue to perform as reliable, dependable sources for the information they are seeking.

With the advent of the social web, the stakes have grown even larger. Searchers now access and rely on their social networks to validate their choices. They trust the wisdom of the crowd and give their social network a halo of credibility. By using systems that capture the sentiment of site visitors, site owners can tap into the power of the social network. Users seek out and read reviews and look for signals that other readers or purchasers have approved of the selection. This need to provide validation has fuelled a huge growth in the amount of review and social data available, as well as systems to capture the data.

The task for publishers is how to meaningfully leverage data from the social web to develop their standing as a trusted source of information. The searcher today is curious about how other readers reacted to information, articles and even forum posts. The simplest tactic to implement is to give readers a way of indicating whether they liked or disliked the material; this validates the reader’s choice. Herd behaviours are part of the social web, and users are also curious to see what information other users of the same source consumed. This can be served by systems that show articles that are trending in popularity. Also, by aggregating and including other popular articles on the same topic at the end of an article or in the side rails of the page, publishers can encourage readers to delve more deeply into the trustworthy information available on their sites. Recommendation engines can be used to tell the reader, “those who read this article also read this other article.” They provide a double-barrelled approach, for they cleverly use proprietary data to lend credibility to the publisher’s own selections.
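To make the mechanics concrete, the short Python sketch below shows one simple way a “readers who read this also read” feature could be built: counting, from a page-view log, which articles tend to be consumed by the same readers. The log, the article identifiers and the also_read helper are illustrative assumptions, not a description of any particular publisher’s or vendor’s recommendation engine.

# A minimal sketch, assuming a hypothetical page-view log of
# (reader_id, article_id) pairs collected by the publisher.
from collections import defaultdict
from itertools import combinations

view_log = [
    ("reader_1", "panda-update-explained"),
    ("reader_1", "kindle-spam-on-the-rise"),
    ("reader_2", "panda-update-explained"),
    ("reader_2", "kindle-spam-on-the-rise"),
    ("reader_2", "social-signals-and-rank"),
    ("reader_3", "panda-update-explained"),
    ("reader_3", "social-signals-and-rank"),
]

# Group the articles each reader has viewed.
articles_by_reader = defaultdict(set)
for reader, article in view_log:
    articles_by_reader[reader].add(article)

# Count how often each pair of articles is read by the same person.
co_reads = defaultdict(lambda: defaultdict(int))
for articles in articles_by_reader.values():
    for a, b in combinations(sorted(articles), 2):
        co_reads[a][b] += 1
        co_reads[b][a] += 1

def also_read(article, top_n=3):
    """Return the articles most often read alongside the given one."""
    related = co_reads.get(article, {})
    return sorted(related, key=related.get, reverse=True)[:top_n]

# Suggest companion reading for a given article.
print(also_read("panda-update-explained"))

In practice the counts would be updated incrementally and weighted by recency, but even a crude co-occurrence table of this kind captures the herd-behaviour signal described above.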

Improving the Offering to Gain Trust

Public search engines such as Google, Yahoo! and Bing must maintain a high-quality offering if they are to retain the loyalty of an ever more sophisticated audience. This is particularly true given that much of that audience relies heavily on social networking sites such as Facebook, where, within the walled garden, users can conduct a large portion of their online life.

Google recently made a significant update to its algorithmic search that clearly affects publishers. The so-called “Panda Update” or “Farmer Update,” rolled out in the United States in late February and globally in April, targeted sites with weak or shallow content for de-ranking. The update focused attention on the large content farms that have sprung up in recent years, but its results show that many other sites suffered de-ranking as well. The content farms, the key target of the update, have created large volumes of low-grade content on hyper-specific topics, often designed to attract clicks from searchers looking for specific information. The sheer size and rapid development of the content farms have created a mass of noise and clutter in the search results. The update introduced new signals to de-rank sites that offer weak or duplicate content. If visitors fail to bookmark the content, or if it is essentially unoriginal, poorly written and spread across multiple sources, it is deemed poor quality. There are other markers as well. A site that is poorly laid out or suffers from bad engineering or poor uptime is downgraded because of the poor user experience it provides. Some quality sites have been downgraded in spite of offering good, well-written content; this can often be attributed to poorly executed internal linking structures. Sites that offer high-quality, well-written, original content combined with solid site structure will continue to prosper.
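Google has never disclosed the precise signals behind the update, but the general flavour of a duplicate-content check can be illustrated with a standard, publicly documented technique: breaking pages into overlapping word “shingles” and measuring their Jaccard overlap. The Python sketch below is a minimal illustration of that idea, with assumed tokenisation, shingle size and example texts; it is not Google’s algorithm.

# Illustrative only: shingle the text of two pages and compare the overlap
# of their shingle sets. A very high overlap suggests duplicated content.
import re

def shingles(text, size=3):
    """Break text into overlapping word n-grams ('shingles')."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 0.0 (distinct) to 1.0 (identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "Publishers must offer original, well-written content on a well-engineered site."
scraped = "Publishers must offer original well written content on a well engineered site."
fresh = "Vertical search engines such as Amazon and YouTube now compete for the searcher's trust."

print(jaccard(shingles(original), shingles(scraped)))  # high overlap: likely a duplicate
print(jaccard(shingles(original), shingles(fresh)))    # low overlap: distinct content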

Not all good content will be ranked as equal in value by the search algorithm, so it is critically important that publishers follow a well-plotted strategy for getting their articles highly ranked. That strategy must incorporate the value the search engines now ascribe to the wisdom of the crowds. If articles are not bookmarked, talked about in social media, tweeted and re-tweeted, their value may be missed. A publisher that expects an article to rank highly in Google will need to socialise it as well as publish it on a well-engineered, user-friendly site that users reward as such. This is a tall order for any publisher to meet.

New Modalities Bring New Concerns

Search spam is like a relentless garden weed: no sooner is it rooted out in one corner of the garden than it appears in another. Amazon is a vertical search engine for books, so it is not immune to the guile of the spammer. With the rapid growth in popularity of Amazon’s Kindle e-reader, yet another form of spam has emerged: the e-book created essentially to capture revenue and listings on Amazon. With its huge popularity and reputation, Amazon brings a decided search boost to items available in the Kindle store. The Kindle Publishing Platform makes it possible for almost anyone with a word processor to publish an e-book, and the availability of Kindle Reader Apps gives Kindle books very wide circulation, reaching the iPad, desktop computers and even phones.

Because the Kindle Publishing Platform allows publishers to use PLR (private label rights) content, many books are being created at very low cost. Once a book gains a few sales and some traction in the Kindle store, there is little to prevent the author from slightly altering its title and republishing the very same book. The platform’s plagiarism detection is weak, and there is nothing to stop a would-be Kindle spammer from using copyright-protected material in their e-books. Upon publication, the “e-book publisher” simply has to check a box confirming that they have the rights to the material, which for the dishonest spammer is a minimal deterrent; if caught, the spammer simply takes the book down. For publishers, this presents a new area where vigilance is called for. Book publishers will need to be on the lookout for unauthorised Kindle versions of books under their copyright, and other publications will need to be mindful that their material may well be aggregated and republished in an unauthorised version. As the footprint of search expands into new modalities, so too must publishers be aware of how this affects them. Initially, the worry was simply how the publication was portrayed within the blue links of a few public search engines; then the concern spread to user-generated content and social networking sites. Today, new user modalities are adding yet another layer of impact and complexity.