Hotel Online
News for the Hospitality Executive






Hoteliers: Be Prepared for Penguin 2.0

Google Head of Web Spam Quoted as Saying "Updates will be Jarring and Jolting"


by Brandon Dennis
May 2013

Hotel websites are still recovering from last year’s Penguin update, but a new update, loosely called ‘Penguin 2.0’ by the industry, is looming on the horizon. Now is the time for hotel marketers to make changes to their website SEO to prevent any penalties the new update might levy. While no one really knows what will be in Google’s upcoming update, here are a few likely scenarios you can prepare for.

First, Some Context

Last year, Google released an update to their algorithm called Penguin, which severely penalized websites that practice bad SEO. Tragically, hotel websites were some of the worst hit by these penalties, as hoteliers have a history of paying unscrupulous SEO agencies to ‘do’ SEO for them. Hoteliers therefore remain vulnerable to further penalties unless they change their SEO strategies and move away from vendors whose techniques are outdated.


Google gave this warning through Webmaster Tools to a hotel marketer who approached me recently with questions about SEO. It took a while, but Google eventually penalized their hotel website for practicing bad SEO: specifically, for purchasing backlinks from article directories and blog comments. The images that follow use data from this hotel website’s SEO.

Soon, Google will update Penguin, and eventually incorporate it into their broader algorithm. Google’s head of web spam Matt Cutts spilled the beans about the looming update when he commented on a popular SEO blog:

 “…expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact.” (Emphasis mine)

In other recent interviews, Cutts said the updates will be “jarring and jolting”, and told those looking forward to the next Penguin update, “You don’t want the next penguin update”.

8 Steps to Prepare for Penguin 2.0

Note: I am indebted to many thought leaders in the SEO industry, including Search Engine Watch, QuickSprout, and State of Search for performing excellent SEO research and giving me much to think about when composing this guide.

Also note: No one truly knows what Google plans to do. This guide is a prediction of what Google will likely focus on with Penguin 2.0, based on intuition, industry discussions, and public statements from Google.

1.) Manipulated Links

As Google’s technology advances, they become increasingly sophisticated at discovering manipulated backlinks. Google valued backlinks in the first place only because backlinks are harder to ‘game’ than other outdated SEO signals, like meta tags. But, just as happened with meta tags, Google will rely on backlinks less and less with each algorithm update. It’s quite possible that someday, Google won’t place any value on backlinks at all.

For Penguin 2.0, the only links Google will value are links placed with editorial intent—that is, placed with purpose because they give benefit to the reader.

To this end, Google will become much stricter about the links they value. Google can easily detect where links appear on a page, and they will discount the following links completely:
  • Site-wide navigation links, including footer and sidebar links
  • Links in author signatures
  • Links in blog comments
  • Links in forum signatures
  • Links in author profiles
  • Sponsored links
  • Blogroll links
  • Links from embedded media, like apps, widgets, images, infographics, and video.
  • Generic boiler-plate links
  • Article directories
  • Links from press releases
  • Much more
In the image above, we see that the majority of backlinks for this website come from forum signatures (dark blue), blog comments (teal), and short text paragraphs (pink), all hallmarks of purchased links and ‘spun’ content. We also see that the links from the more trusted websites, found further out in the disc, are from public sites like YouTube and Blogspot, while the majority of links come from poorly trusted websites closer to the center of the disc. If I can find rich data like this about a hotel website’s manipulated backlink strategy, how much more easily can Google?

In short, if Google can detect that a link was not consciously placed by the author as an editorial comment, reference, or source, Google will discount the link.

Action Item: Don’t pay for backlinks. If you have paid for backlinks in the past, verify that none of the links you received were from these low-quality websites. If you have links from these kinds of websites, you may want to have them removed.
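
If you are not sure where your backlinks come from, export the list from Webmaster Tools or your backlink tool and tally it by source type. Here is a minimal Python sketch of that tally; the links and categories below are invented for illustration only.

    from collections import Counter

    # Hypothetical backlink export: (linking page, source type).
    # In practice, load this from the CSV your backlink tool produces.
    backlinks = [
        ("http://example-forum.com/thread/123", "forum signature"),
        ("http://example-blog.com/post#comment-9", "blog comment"),
        ("http://example-articles.com/seo-tips", "article directory"),
        ("http://seattle-travel-blog.example.com/best-hotels", "editorial link"),
        ("http://example-forum.com/thread/456", "forum signature"),
    ]

    counts = Counter(source_type for _, source_type in backlinks)
    total = len(backlinks)

    for source_type, count in counts.most_common():
        print(f"{source_type}: {count} ({count / total:.0%})")

    # If forum signatures, blog comments, and article directories dominate,
    # the profile looks manipulated and is worth cleaning up.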

2.) Domain Trust

If a website repeatedly practices suspicious linking, then Google will decrease the domain authority they give to that website over time. Websites like this are easy to detect, as they share similar characteristics. If Google sees that the majority of your website’s backlinks come from low quality sites like these, then they may conclude you’re buying backlinks.

Google will likely introduce stricter penalties with Penguin 2.0 for websites they have caught receiving links from ‘bad neighborhoods’, or sending links to them.


In the above example, we see that the majority of this website’s backlinks come from websites with low authority. The dots on the outer edge of the spiral are websites with higher authority; the closer the dots get to the middle, the lower the website’s authority. Hovering over one of the dots, we see the kind of poor quality, unrelated website this hotel has backlinks from: it’s a public forum for free video games.

Action Item: If you actively build backlinks for your hotel website, make sure you only solicit high quality websites that are thought leaders in your niche. If your agency has solicited low quality links, try to have them removed.
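
For a rough sense of how healthy your own profile is, score each referring domain with an authority metric from a third-party SEO tool and see what share falls below a cut-off. The sketch below is only an illustration: the domains, scores, and threshold are invented, and the 0-100 ‘authority’ scale comes from third-party tools, not from Google.

    # Hypothetical referring domains with third-party authority scores (0-100).
    referring_domains = {
        "free-video-games-forum.example.com": 8,
        "article-directory.example.net": 12,
        "seattle-travel-blog.example.com": 54,
        "tripadvisor.com": 93,
    }

    LOW_AUTHORITY_THRESHOLD = 30  # arbitrary cut-off, for illustration only

    low_quality = [domain for domain, score in referring_domains.items()
                   if score < LOW_AUTHORITY_THRESHOLD]
    share = len(low_quality) / len(referring_domains)

    print(f"{share:.0%} of referring domains look low-authority: {low_quality}")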

3.) Social Signals

Google has been using social signals to supplement backlinks as a way to judge a website’s quality. But even social signals like Facebook likes can be ‘gamed’, or purchased, and Google is getting better at detecting that kind of manipulation.


In the example above, we see a graph of one website’s Facebook likes, broken into clusters. Each gray dot represents one person who liked the website, and the links represent connections to other people who have liked the content.

The left side of the graph looks normal. The page has likes from various people, who are connected to each other through long chains of interlinking friends. A healthy social graph depicts an interlinking web of likes from friends connected to other people through various degrees of removal, many of whom the website author has never met.

On the right, we see some suspicious activity. We find four big lumps of likes from people connected only to friends within small groups. This is highly suspicious, as it is unlikely for a community of friends to be connected to each other yet have no connections outside their clique. That is, the people who liked the page are all friends with each other only, and have no friends outside those circles. This makes sense, because it is unlikely for a real person on Facebook to send a friend request to a fake Facebook account created by a robot to give likes to a website.

Google could use data like this to help detect spam, and penalize websites.
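
To make the ‘isolated clique’ idea concrete, here is a toy sketch using the networkx graph library in Python, with invented data. It treats each liker as a node and each friendship as an edge, then flags small groups that are densely connected to each other but cut off from everyone else. The thresholds are arbitrary and illustrate the reasoning above, not anything Google has disclosed.

    import networkx as nx  # third-party graph library

    # Each node is a person who liked the page; each edge is a friendship.
    # The data is invented for illustration.
    likes_graph = nx.Graph()
    likes_graph.add_edges_from([
        # A healthy chain of loosely connected likers.
        ("ann", "ben"), ("ben", "cara"), ("cara", "dev"), ("dev", "elle"),
        # A suspicious lump: five accounts that are all friends with each
        # other and with no one else.
        ("bot1", "bot2"), ("bot1", "bot3"), ("bot1", "bot4"), ("bot1", "bot5"),
        ("bot2", "bot3"), ("bot2", "bot4"), ("bot2", "bot5"),
        ("bot3", "bot4"), ("bot3", "bot5"), ("bot4", "bot5"),
    ])

    for component in nx.connected_components(likes_graph):
        subgraph = likes_graph.subgraph(component)
        # Density of 1.0 means everyone in the group is friends with everyone else.
        if len(component) <= 10 and nx.density(subgraph) > 0.9:
            print("Suspicious cluster:", sorted(component))
        else:
            print("Looks organic:", sorted(component))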

Since Google does not have access to all the data third-party social websites like Facebook and Twitter have, Google created their own social network, Google+. It is likely that Google will place greater trust in social signals from Google+ than in those from other social networks, as they continue to aggressively encourage users to adopt the new social network. They will do this because they have access to all of the social networking data for users of their own network, which gives them an easy way to detect suspicious activity from fake Google accounts. This will give Google the ability to reliably place trust and value on +1s (the Google equivalent of a Facebook ‘like’) and Google+ social shares while weeding out the chaff.

Action Item: Optimize your hotel website to accept Google +1s (see my guide to +1s for more info). Use the social network to build up a community of Google+ users with whom to share hotel information.

4.) Authorship

Google has clearly said they will give greater weight to content tied to verified online identities than to anonymous content. Google can verify authorship through Google+ and verified Twitter profiles, but Google will probably rely more and more on Google+ authorship in the future as that network matures. Hotel websites will receive a boost in the rankings if they are tied to a verified online identity. Read our guide to Google+ for hotels and our article on setting up Google+ authorship and publisher markup for more information.

Action Item: Make sure Google+ authorship is installed correctly.
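
Authorship is declared by linking your pages to your Google+ profile with rel="author" markup. If you want a quick sanity check across your site, a small script can fetch a page and confirm the markup is present. This is a rough sketch only, assuming the Python requests library and a hypothetical URL; it is not a substitute for Google’s own testing tools.

    import re
    import requests  # third-party HTTP library

    def has_authorship_markup(url):
        """Rough check that a page carries rel="author" markup pointing at Google+."""
        html = requests.get(url, timeout=10).text
        has_rel_author = re.search(r'rel=["\']author["\']', html) is not None
        links_to_google_plus = "plus.google.com" in html
        return has_rel_author and links_to_google_plus

    # Hypothetical usage:
    # print(has_authorship_markup("http://www.your-hotel-website.example.com/blog/some-post"))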

5.) Relevancy

For a long time now, Google has warned us that getting a slew of links from a random assortment of websites might not be the best idea, as they give greater weight to links from websites in your niche. With Penguin 2.0, they will probably become stricter with the authority they give to backlinks from non-relevant websites.


In the above example, we see that this hotel website gets the vast majority of its backlinks from article directories (pink), blog comments (green), and web directories (red), some of the lowest quality websites possible. A healthy backlink profile would instead have the majority of these coming from travel-related websites.

For hotels, this means that the backlinks worth the most are backlinks from travel websites, like travel blogs, TripAdvisor, local travel businesses, and so on.

Action Item: Discover the kinds of websites your backlinks come from. If the majority come from article directories, blog comments, or other unrelated websites, consider having them removed.

6.) Rich Anchor Text

Links with rich or ‘exact-match’ anchor text are links whose anchor words exactly match the search engine query the recipient hopes to rank for. The idea goes that a hotel website will get 100 backlinks with anchor text that says ‘great Seattle hotels’. Google will index these links, see that website owners think the hotel is relevant to ‘great Seattle hotels’, and then rank the hotel website well for that keyphrase.

This type of manipulation was all but killed with Penguin last year, and it was a lesson learned the hard way. This, more than anything, was the cause of so many hotel websites being severely penalized or even completely removed from Google’s index. While having some rich anchor text is still fine for a website, the majority of backlinks should use the hotel brand name, or generic phrases like ‘click here’, to be deemed safe and ‘natural’.

Recent reports indicate Google is reducing the percentage of rich anchor text backlinks that is acceptable. With Penguin 2.0, we could see even more websites fall out of the index or get penalized for having rich anchor text.

Action Item: If your website has paid backlinks, or links from friends, family, or colleagues that include rich anchor text, ask them to replace the anchor text with something safer, like your hotel brand name.
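
Before asking anyone to change anchor text, measure your current mix. The sketch below classifies anchors as branded, generic, or keyword-rich and prints the percentages; the brand names, anchor list, and categories are made up for illustration.

    from collections import Counter

    BRAND_TERMS = {"example hotel", "examplehotel.com"}           # your brand names
    GENERIC_TERMS = {"click here", "here", "website", "read more"}

    # Hypothetical anchor texts pulled from a backlink export.
    anchors = [
        "Example Hotel", "great seattle hotels", "click here",
        "great seattle hotels", "examplehotel.com", "great seattle hotels",
    ]

    def classify(anchor):
        text = anchor.lower()
        if text in BRAND_TERMS:
            return "branded"
        if text in GENERIC_TERMS:
            return "generic"
        return "keyword-rich"

    distribution = Counter(classify(anchor) for anchor in anchors)
    total = len(anchors)
    for label, count in distribution.most_common():
        print(f"{label}: {count / total:.0%}")

    # A profile dominated by keyword-rich anchors (50% in this toy data) is
    # the pattern Penguin targets; branded and generic anchors should lead.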

7.) Traffic Metrics

Metrics that demonstrate how popular and interesting your website is have always been a part of the Google algorithm, but they were emphasized with Google Panda. Now, the way your viewers act on your website can determine how well it ranks.

Google will look at how quickly your website loads, how long guests stay on your website, their ‘bounce rate’, how much of your content is read, how many broken links your website has, and so on. Now more than ever, it is essential to perfect the technical side of your website’s SEO.
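
Most of these metrics come from your analytics package, but broken links are easy to check yourself. Here is a minimal sketch that requests each page in a hypothetical list and reports anything that returns an error status; it assumes the Python requests library, and a real audit would walk your full sitemap.

    import requests  # third-party HTTP library

    # Hypothetical pages to check; in practice, pull the list from your sitemap.
    pages = [
        "http://www.your-hotel-website.example.com/",
        "http://www.your-hotel-website.example.com/rooms",
        "http://www.your-hotel-website.example.com/old-specials-page",
    ]

    for url in pages:
        try:
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                print(f"BROKEN ({response.status_code}): {url}")
            else:
                print(f"OK ({response.status_code}): {url}")
        except requests.RequestException as error:
            print(f"UNREACHABLE: {url} ({error})")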

Action Item: If your website is outdated or hasn’t been touched for years, it’s important to check on it—it may need to be rebuilt. Partnering with a digital marketing system is a smart move, as they will make sure your website is always up-to-date and innovative.

8.) Co-Citations / Co-Occurrence

Co-citation is a theory that websites can obtain authority from other websites even when they aren’t linked to by those websites, based on mentions and links from other websites in the network. This can be a tad complicated, but I’ll try to explain with diagrams.


Above, hotel A is linking to both B & C (blue arrow). Hotel B is linking to hotel A. But even though hotel B is not linking to hotel C, B is giving C a co-citation (black arrow) because of its relationship to A. That is, Google connects C’s relationship to B through A and B’s mutual relationship.


Above, hotel A and hotel B are only linking to each other. However, both websites are talking about hotel C, by mentioning hotel C in a blog article where they link to each other, for example. Hotel C gets a co-citation from each website, even though there are no actual links.


Here, no one is actually linking to anyone. However, each website mentions one of the others. By citing each other, Google can figure out there is a connection between them all, and pass along co-citations where necessary.

It’s possible that Google is resorting to alternative trust indicators like co-citations to convey website and article authority, and rank content.
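
In the classic sense of the term, two websites are co-cited whenever a third site mentions or links to both of them. The toy sketch below, with invented site names, shows how co-citation pairs can be derived from simple mention data along the lines of the diagrams above.

    from collections import Counter
    from itertools import combinations

    # Which other sites each site mentions or links to; invented data
    # loosely mirroring the diagrams above.
    mentions = {
        "hotel-a.example.com": ["hotel-b.example.com", "hotel-c.example.com"],
        "hotel-b.example.com": ["hotel-a.example.com", "hotel-c.example.com"],
    }

    # Two sites mentioned together by the same source share a co-citation.
    co_citations = Counter()
    for source, cited in mentions.items():
        for pair in combinations(sorted(set(cited)), 2):
            co_citations[pair] += 1

    for (site_one, site_two), count in co_citations.items():
        print(f"{site_one} and {site_two} are co-cited by {count} source(s)")

    # Notice that hotel-c.example.com earns co-citation associations even
    # though it never links out itself.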

Action Item: Continue to develop thought leadership in your niche by advertising, creating deals, guest blogging, and building relationships with other leaders in your space, in order to get people talking about you. Even if they don’t give you backlinks, your website may still receive a ranking benefit simply by being cited by other websites in your niche.

Conclusion

I realize that this information is pretty thick and can be overwhelming. My goal is to demonstrate how far away we are from the old days of SEO, when stuffing keywords into meta tags and cramming backlinks into website footers was considered clever SEO. We now have to work in a world where Google’s algorithm is increasingly difficult to game, and any SEO designed to manipulate the algorithm that actually works, if it even exists, is far beyond the talents of most SEO agencies.

Google Penguin has always been designed to take out sites that use manipulative techniques to improve search ranking. As long as your efforts improve content quality to benefit guests with better information, there is little to worry about. If, however, you have used SEO agencies in the past that proved unreliable, or that used black-hat tactics, then you may have your work cut out for you to prepare for Penguin 2.0.

Original article may be found by clicking here:
http://www.buuteeq.com/blog/penguin-two-point-o/
________________________


About the Author

Brandon Dennis is the Technical Marketing Manager at buuteeq, the digital marketing system for hotels. He manages buuteeq’s SEO, paid media channels, social outreach, and the company blog. You can connect with him on Twitter @buuteeq.

Contact:

Brandon Dennis
Twitter @buuteeq

Google+ profile for author connection: https://plus.google.com/100588133120378013221/posts
