Let’s start with one of the original ways people tried to influence SEO rankings: keyword stuffing. It is far less common today than it was 10 or 15 years ago, but some people who are either unaware of the penalties or simply don’t care still try to get away with it. In short, keyword stuffing is when you stuff keywords into a page that is stuffed with keywords, sort of like what I’m doing in this sentence.
The goal of this practice is to influence your search ranking by repeating the phrases people are searching for as many times as possible, in “unnatural” ways. Google and other search engines have cracked down on practices like these, handing out stiff penalties to anybody trying to game the system.
This isn’t meant to keep certain websites or people from ranking higher; it is meant to give proper recognition to the people who are putting in the extra work. The hope is that by discouraging keyword stuffing, websites will create more original, informative content for their users.
Picture this: you’re filling out the last couple pages of your website. It’s been a long week, and you are exhausted from describing your company or product over and over again on different pages. So, to save time, you copy a few well-written sentences from your homepage and pepper them into two other pages. What’s the harm in that?
Google almost never displays duplicate content in search results, so right there you’ve cut out two thirds of those pages, because all three say the same thing. Or maybe you quoted another website’s content word for word in your own. When Google recognizes duplicate content, it traces it back to the original source. If that source isn’t you, you’re out.
The solution here is simple: be creative. Write unique content for each page of your website, and use your own words instead of someone else’s. Google rewards originality.
Sounds eerie, doesn’t it? It’s actually a very common practice, and it’s happening to websites right now. Google uses bots called “crawlers” or “spiders” that go through your website and index your data. That crawl data is how you end up ranked on Google. But what if Google can’t index your data easily? It simply won’t rank you.
If your site isn’t optimized for crawlability, you’ll drop in the rankings or, if it’s really bad, end up not ranked at all. Crawlers are derailed by things like incoherent sitemaps that make your pages or links impossible to find, broken links that lead nowhere or to the wrong page, 404 errors, and dead-end pages on your website.
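If an incoherent sitemap is the culprit, it helps to know what a clean one looks like. Here is a minimal sketch of a sitemap.xml file following the standard sitemaps.org protocol; the URLs and dates are placeholders for illustration, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawlers to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Keeping a file like this current, and submitting it through Google Search Console, gives crawlers an up-to-date map of every page you want indexed instead of leaving them to guess.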
Maybe you have a page or a link that is broken or outdated, and you just haven’t gotten around to fixing it. After all, who is going to click on that one page anyway? Crawlers will. And if those bots aren’t happy with what they find, all the keyword optimization in the world won’t help your rankings.
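You don’t have to find those broken links by hand. As a rough sketch of the idea, here is a tiny link checker using only Python’s standard library; the `check_links` helper and the URL list are hypothetical examples, not a production tool:

```python
# Minimal broken-link checker sketch using only the standard library.
# Feed it the URLs from your own sitemap to spot crawler dead ends.
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    """Map each URL to 'ok' or a short description of the problem."""
    results = {}
    for url in urls:
        try:
            # HEAD avoids downloading the whole page; note that some
            # servers reject HEAD, so a GET fallback may be needed.
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=timeout):
                results[url] = "ok"  # urlopen raises on 4xx/5xx
        except urllib.error.HTTPError as e:
            results[url] = f"HTTP {e.code}"  # e.g. 404: a dead end
        except (urllib.error.URLError, TimeoutError) as e:
            results[url] = f"unreachable ({e})"  # DNS failure, refused, timeout
    return results
```

Running something like this over your sitemap once a month is a cheap way to catch the 404s and dead ends before the crawlers do.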
Any current SEO strategy can become outdated overnight, but the takeaway stays the same: SEO rewards effort. All three mistakes I shared with you today can be corrected by putting in a little extra work.
Avoid keyword stuffing and re-used content by creating unique, informative, and engaging content. That benefits both you and your users. Help Google index your data by combing your website for dead ends and broken links. A little “spring cleaning” on your website could very well give you a boost you didn’t expect.
You shouldn’t have to keep up with all the latest SEO trends, so we do that for you at Dotlogics. Speak to our specialists today and learn how to improve your ranking.