
A Changing Approach to SEO in the Aftermath of Keyword (Not Provided)

By Federico Einhorn
Filed Under Search Engine News

On September 23, 2013, the Internet was made privy to Google's master plan to encrypt 100% of organic search activity. Marketers were outraged, as Google's own Matt Cutts had previously stated that keyword encryption would affect less than 10% of searches overall.

[Chart: Google (not provided) keyword share, increase over one year]

SSL Encryption and Google: A Brief History

Google initially launched an encrypted search option back in May 2010. Since October 2011, the search engine has automatically redirected signed-in users to its encrypted URL. It was around this time that the "(not provided)" entry began to appear in Google Analytics and other web metrics programs.

Since then, encryption has been adopted across various other entry points, including the Chrome omnibox, Mozilla Firefox, and Apple Safari on iOS 6, steadily increasing the total percentage of obscured keyword data. Very soon, this encryption will apply across the board.

Webmasters and SEOs have been able to watch these disconcerting events unfold live over the past two years via the (Not Provided) Count graph on notprovidedcount.com. There was a noticeable leap in withheld terms around September 4, and today the percentage of withheld data is up to around 80%.

 

Does Google Have Ulterior Motives for Withholding its Data?

When questioned by Search Engine Watch, Google claimed its decision to implement full encryption was made to provide "extra protection" for its users by obscuring potentially sensitive search strings. 

However, there has been some speculation among webmasters as to several other possible motives:

  • Accusations of cooperating with the NSA: Since June of this year, Google has been one of several companies accused of cooperating with the NSA to provide private user information to the government through the PRISM initiative. Google denied these accusations – the encryption may be an effort to manage the company's reputation.
  • Competition from smaller search engines: For years, Google has enjoyed the lion's share of the search engine market. But with growing public concern over Big Brother-style surveillance, smaller search engines such as Duck Duck Go have found some success by guaranteeing privacy to their users. Perhaps Google is working to retain this segment of the market by acknowledging users' growing desire for more protection online.
  • Promoting AdWords: Finally, some SEOs have asserted that the keyword encryption is simply motivated by monetary gain, as Google makes certain search data available only to customers of its AdWords program.

 

What Does Keyword (Not Provided) Mean for SEO?


The encryption of keyword data prevents marketers from tracking visitors by referring keyword in their analytics programs, so webmasters are no longer able to segment users according to search terms.
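To put that loss in concrete terms, here is a minimal sketch of the keyword segmentation that used to be possible. It assumes a hypothetical CSV export of organic traffic with "keyword" and "sessions" columns (names you would adjust to your own analytics export) and reports how much of the total now disappears into (not provided):

```python
import csv
from collections import Counter

def keyword_breakdown(path):
    """Aggregate organic sessions by referring keyword from a CSV export.

    The 'keyword' and 'sessions' column names are assumptions; adjust them
    to match the fields in your own analytics export.
    """
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["keyword"]] += int(row["sessions"])

    total_sessions = sum(totals.values())
    hidden = totals.get("(not provided)", 0)
    share = hidden / total_sessions if total_sessions else 0.0
    return totals, share

if __name__ == "__main__":
    totals, share = keyword_breakdown("organic_keywords.csv")  # hypothetical file name
    print(f"(not provided) share of organic sessions: {share:.0%}")
    for keyword, sessions in totals.most_common(10):
        print(f"{sessions:>8}  {keyword}")
```

Run against a recent export, the (not provided) row will dominate the report, which is exactly the problem.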

Without referral data, you have no way of knowing what searchers were looking for when they clicked through to your site, and no way of knowing whether the page they landed on met their needs – aside, of course, from telltale indicators such as a high bounce rate.

Additionally, webmasters are no longer able to measure which of their organic keywords are converting customers, making the task of optimizing pages a process of trial and error.

The keyword data encryption is certainly bad news for webmasters who have, until now, relied entirely on Google Analytics to supply the data for their content marketing efforts.

 

The Problem with Webmaster Tools Referral Data

Keyword referral data for paid and organic searches is currently available for viewing on the Dimensions tab in AdWords. However, the organic data comes from Google Webmaster Tools, which is notoriously unreliable.

Google Webmaster Tools provides the top 2000 keyword terms per day for the last 90 days (Google states it will extend archiving to one year at some point in the future).

Unfortunately, Webmaster Tools "buckets" its data, which really means it approximates its numbers. While at times there seems to be some degree of correlation between GWT and Analytics data, the consensus among webmasters is that the Webmaster Tools data is generally inaccurate. 

This aggregate data would perhaps best be used to determine where a site is performing best from a broad perspective; however, because it cannot be expected to provide the precision that was previously available in Google Analytics, it's hardly a viable solution for smaller businesses. Even for larger businesses, with a maximum of only 2000 terms, the keyword long tail is no longer up for grabs.
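If you do fall back on Webmaster Tools, the "Top queries" report can at least be downloaded and rolled up over several days. The sketch below is a rough illustration only: the file names and the "Query", "Impressions", and "Clicks" column names are assumptions about the export format, and the bucketed figures should be treated as approximations rather than exact counts.

```python
import csv
from collections import defaultdict

def top_queries(paths, limit=20):
    """Merge several daily 'Top queries' CSV exports into one ranked list.

    The 'Query', 'Impressions' and 'Clicks' column names are assumptions
    about the export format. Webmaster Tools buckets its figures, so the
    totals below are rough estimates, not exact counts.
    """
    impressions = defaultdict(int)
    clicks = defaultdict(int)
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                query = row["Query"]
                impressions[query] += int(row["Impressions"])
                clicks[query] += int(row["Clicks"])

    ranked = sorted(clicks, key=clicks.get, reverse=True)[:limit]
    return [(q, clicks[q], impressions[q]) for q in ranked]

# Hypothetical daily export files downloaded from Webmaster Tools.
for query, c, i in top_queries(["queries_day1.csv", "queries_day2.csv"]):
    print(f"{c:>6} clicks  {i:>8} impressions  {query}")
```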

Other proposed (and still less than ideal) solutions include:

  • Using keyword data available from Bing and Yahoo, which hold only 18% and 11% of the market share respectively.
  • Tracking correlations between page optimization and general increases in organic search traffic.
  • Measuring overall performance by monitoring rankings.
  • Tracking on-site keywords used within search applications (a minimal sketch of this approach follows this list).
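The last of these options is worth illustrating, since on-site search is one of the few places where real user queries remain visible. Below is a minimal sketch that counts internal search terms from a standard web server access log; the log file name, the "/search" path, and the "q" parameter are all assumptions to be replaced with whatever your own site-search application uses.

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Matches the request line of a common/combined format access log entry.
REQUEST_RE = re.compile(r'"GET (?P<url>\S+) HTTP')

def onsite_search_terms(log_path, search_path="/search", param="q"):
    """Count internal site-search terms found in a web server access log.

    The search path and query parameter name are assumptions; replace them
    with whatever your own site-search application actually uses.
    """
    terms = Counter()
    with open(log_path) as f:
        for line in f:
            match = REQUEST_RE.search(line)
            if not match:
                continue
            url = urlparse(match.group("url"))
            if url.path == search_path:
                for term in parse_qs(url.query).get(param, []):
                    terms[term.strip().lower()] += 1
    return terms

for term, count in onsite_search_terms("access.log").most_common(15):
    print(f"{count:>6}  {term}")
```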

 

The New Face of SEO – A Holistic Approach to Search Engine Optimization

So how should we as webmasters respond to the keyword (not provided) crisis?

It's important to remember that as long as optimized content continues to appear at the top of the search results and organic rankings continue to drive 85% of search engine traffic, SEO remains an important element of a successful online marketing strategy – as well as a competitive one.

Rather than focusing on when the data will be 100% gone, we must adapt to changing climates and be willing to reevaluate our ideas of what it means to do successful SEO.

For starters, keep the following objectives in mind.

  • Use trial and error to improve landing pages: Analytics data such as page views, time spent on page, bounce rate, and conversion rate are still available and can be used to improve relevance and user experience at the page level. While the task has certainly become more difficult, SEOs can salvage some of their traffic by using this remaining data to improve landing pages with high bounce rates and optimize conversion rates (a minimal sketch follows this list).
  • Quality first, and the rest will follow: SEOs should strive to adopt an integrated approach to online marketing which places the focus on serving the needs of their market by providing quality content rather than interacting with customers behind the curtain of micro-details such as keyword data and hyper-analytics. 
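As promised above, here is a minimal sketch of the trial-and-error approach applied to landing pages. It assumes a hypothetical CSV export with "landing_page", "entrances", and "bounces" columns, and simply flags pages whose bounce rate exceeds a chosen threshold so they can be reviewed and improved first.

```python
import csv

def high_bounce_pages(path, min_entrances=100, threshold=0.6):
    """Flag landing pages whose bounce rate exceeds a chosen threshold.

    The 'landing_page', 'entrances' and 'bounces' column names, the traffic
    floor and the threshold are all assumptions; tune them to your own data.
    """
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            entrances = int(row["entrances"])
            if entrances < min_entrances:
                continue  # too little traffic to judge the page fairly
            bounce_rate = int(row["bounces"]) / entrances
            if bounce_rate > threshold:
                flagged.append((row["landing_page"], bounce_rate, entrances))
    return sorted(flagged, key=lambda item: item[1], reverse=True)

for page, rate, entrances in high_bounce_pages("landing_pages.csv"):  # hypothetical export
    print(f"{rate:.0%} bounce rate  ({entrances} entrances)  {page}")
```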

 

Think Customer-Oriented, Not Data-Oriented

Some in the SEO community are referring to the keyword data encryption as an apocalyptic event. For many veterans, however, it's simply a temporary setback.

Choosing the right keywords comes from knowing your target market and their needs, which can only be achieved through engaging your audience with a content marketing strategy that vigorously incorporates social media and other channels. Lasting organic search rankings and high-quality editorial links are a byproduct of these processes rather than the ultimate objective.

Those webmasters who welcome change as just another opportunity will continue to win out over the competition in the ever-changing game of SEO.



About the author
Federico Einhorn
I'm the Founder and CEO at FullTraffic. Since 2005, FullTraffic has evolved to become one of the most important traffic providers worldwide for small to medium-sized businesses. http://federicoeinhorn.com
