Google: “Our Algorithms Have Gotten Pretty Good at Recognizing Similar Content”
There was an interesting new blog post on the Google Webmaster Tools blog yesterday discussing duplicate content. In short, Google doesn't look fondly upon websites that carry the same information as other websites, because they make for a poor user experience.
I hate pulling large quotes from an article, but I think this one is important. Read the full post for more information, but according to the blog post:
“Some less creative webmasters, or those short on time but with substantial resources on their hands, might be tempted to create a multitude of similar sites without necessarily adding unique information to any of these. From a user’s perspective, these sorts of repetitive sites can constitute a poor user experience when visible in search results. Luckily, over time our algorithms have gotten pretty good at recognizing similar content so as to serve users with a diverse range of information. We don’t recommend creating similar sites like that; it’s not a good use of your time and resources.”
This isn't really new information, and it's not surprising, but it's something domain owners need to take into consideration when developing their domain names. A lot of people have been asking about duplicate content lately, so this is certainly a good read.
Reach out to Elliot: Twitter | Google + | Facebook | Email