Top 10 SEO Errors and Solutions
Even as search engines embrace machine learning and a mobile-first index, many businesses remain stuck in the mud with old “black hat” practices of manipulative link building and broad match keyword research.
The sheer competition among today’s SEO practitioners is impossible for small businesses competing against established brands to ignore. It used to be that a website could simply insert a keyword into its title tag, publish some thin, irrelevant content, and watch the traffic flow in. Search engines are now vastly more intelligent than the websites they index, and following their ranking signals is a must.
If your website is struggling to stay competitive in search marketing, use this as a checklist of potential website SEO errors. Any one of these commonly overlooked mistakes could drastically affect your website’s rankings, traffic, and conversion rate, turning your SEO campaign into a lost investment. Avoiding these pitfalls will give your content the best possible chance of ranking high and attracting quality links.
Inefficient Keyword Strategy
Keywords are the foundation on which search engines match your content to user intent. The most common mistake businesses make here is competing for the wrong keywords. Your content is already peppered with the keywords for the products, information, or services you want to market to customers; finding and choosing the right ones is the challenge.
Before undertaking any content marketing campaign, it’s important to consult keyword tools such as Google AdWords Keyword Planner or SEMrush to discover keywords that are easy to rank for and match your content topic. Competing for broad keywords may yield more traffic to your website, but that traffic doesn’t necessarily convert. If your business offers specialized services, you will most likely want to compete for long-tail keywords that are specific to user intent and mirror natural speech.
Consult your analytics software to see which keywords are driving conversions for your website. Avoid keyword stuffing, and avoid targeting highly competitive, broad match keywords that are dominated by established brands. Keywords that drive less traffic but rank higher often produce more conversions because they map more closely to user intent.
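If your keyword tool can export its data, even a few lines of scripting can help you sift long-tail candidates out of a large list. Below is a minimal Python sketch; the keywords.csv file, its column names, and the volume and difficulty thresholds are all hypothetical placeholders, so adapt them to whatever your tool actually exports.

```python
import csv

# Hypothetical export from a keyword tool with columns: keyword, volume, difficulty.
# Adjust the file name, column names, and thresholds to match your own data.
def long_tail_candidates(path, max_difficulty=30, min_volume=50):
    candidates = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            keyword = row["keyword"].strip()
            volume = int(row["volume"])
            difficulty = int(row["difficulty"])
            # Long-tail heuristic: three or more words, modest volume, low difficulty.
            if len(keyword.split()) >= 3 and volume >= min_volume and difficulty <= max_difficulty:
                candidates.append((keyword, volume, difficulty))
    # Sort easiest-to-rank-for first.
    return sorted(candidates, key=lambda item: item[2])

for keyword, volume, difficulty in long_tail_candidates("keywords.csv"):
    print(f"{keyword}: volume={volume}, difficulty={difficulty}")
```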
Duplicate Content
In terms of on-site errors, duplicate content is the most common mistake on any website. Duplicate content can result from creating a printer-friendly copy of a webpage, inserting blog tags, hosting syndicated RSS feeds, or URL parameters that create duplicate copies of a link. Google clusters all duplicate URLs, consolidates the internal signals contained within that cluster, and assigns them to a single URL; that is the URL it indexes and ranks.
Within Google Search Console, you can adjust URL parameter settings to indicate the preferred domain and URLs you wish to have indexed. 301 redirects can also be placed on duplicate URLs to point them to a specific link. It’s important to insert a “rel=canonical” tag on all duplicate pages so the preferred URL still receives the internal signals they carry. Otherwise, you risk forfeiting the value of any external or internal links that point back to a duplicate URL.
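If you want to spot-check that your duplicate URLs actually declare the preferred canonical, a short script can do it. The sketch below assumes the third-party requests and beautifulsoup4 packages and uses made-up example URLs; swap in your own preferred page and its known duplicates.

```python
import requests
from bs4 import BeautifulSoup  # assumes requests and beautifulsoup4 are installed

# Hypothetical URLs: a preferred page plus parameterized duplicates of it.
PREFERRED = "https://www.example.com/widgets"
DUPLICATES = [
    "https://www.example.com/widgets?sort=price",
    "https://www.example.com/widgets?utm_source=newsletter",
]

def canonical_of(url):
    """Return the href of the page's rel=canonical link, or None if it is missing."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag["href"] if tag and tag.has_attr("href") else None

for url in DUPLICATES:
    canonical = canonical_of(url)
    if canonical != PREFERRED:
        print(f"WARNING: {url} canonicalizes to {canonical!r}, expected {PREFERRED}")
```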
Terrible Mobile Experience
With Google moving toward a mobile-first index, creating a fast, valuable mobile experience is more important than the desktop experience. Slow load times caused by heavy image rendering and long-scrolling content will increase your bounce rate. While creating a separate mobile domain for your website can alleviate some of these concerns, it can also split your link equity, consume more resources, and divert traffic from the original URL.
In 2017, your website should offer a mobile experience through AMP, a Progressive Web App, or responsive design that delivers the same experience as a desktop device. The cheapest option may be enabling responsive design across your domain so that all of your existing signals are still identified by search engines. Keep in mind that mobile content should generally be shorter and more concise, and that CTAs should be visible and easy to tap on smaller screens.
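One quick way to spot pages that were never built with mobile in mind is to check for a responsive viewport meta tag. The tag is only one signal of responsive design, but its absence is a fast red flag. This sketch assumes the requests and beautifulsoup4 packages and uses placeholder URLs.

```python
import requests
from bs4 import BeautifulSoup  # assumes requests and beautifulsoup4 are installed

# Hypothetical list of pages to spot-check for a responsive viewport meta tag.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services",
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    if viewport is None:
        print(f"{url}: no viewport meta tag - unlikely to render responsively")
    else:
        print(f"{url}: viewport = {viewport.get('content', '')}")
```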
Poor Linking Quality
Businesses often fall for the idea that link quantity trumps link quality. It does not, and since Google’s Penguin update, suspicious linking practices can result in major penalties for your domain. Earning quality links from authoritative websites will boost your rankings far more than a pile of irrelevant ones.
Seeking to gather quality links for your content? Produce content that is worth linking to, and consider pitching it to trusted websites to build linking relationships. Continually audit your external links: some will become outdated or broken, and you may even discover that others were irrelevant when you first inserted them.
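Auditing external links by hand gets tedious, so a small scripted check helps. The sketch below (again assuming requests and beautifulsoup4, with a hypothetical page URL) flags outbound links that no longer respond or return error codes.

```python
import requests
from bs4 import BeautifulSoup  # assumes requests and beautifulsoup4 are installed
from urllib.parse import urlparse

# Hypothetical page whose outbound links we want to audit.
PAGE = "https://www.example.com/blog/some-post"

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
own_host = urlparse(PAGE).netloc

for a in soup.find_all("a", href=True):
    href = a["href"]
    # Only audit absolute links pointing to other domains.
    if not href.startswith("http") or urlparse(href).netloc == own_host:
        continue
    try:
        status = requests.head(href, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken or unreachable external link: {href} (status: {status})")
```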
Bad Sitemap
While many website owners view a sitemap as outdated technology, it’s an effective way to help search engines crawl your website and understand its structure. A good sitemap also alerts search engines to changes on your website and increases the speed at which new webpages are indexed. A bad sitemap can cause crawling issues for your website and will affect your rankings.
Consult Google Search Console to check for any sitemap issues, such as routing and link errors, that could be affecting your site’s indexation. Setting up a quality sitemap will also help establish your site’s information architecture and guide your internal linking strategy.
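If your CMS doesn’t generate a sitemap for you, even a simple script that follows the sitemaps.org protocol is better than nothing. The sketch below uses only Python’s standard library and a hypothetical, hard-coded URL list; in practice the URLs would come from your CMS or a crawl.

```python
from datetime import date
from xml.etree.ElementTree import Element, SubElement, ElementTree

# Hypothetical list of URLs to include; in practice this would come from your CMS or a crawl.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services",
    "https://www.example.com/contact",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in URLS:
    entry = SubElement(urlset, "url")
    SubElement(entry, "loc").text = url
    SubElement(entry, "lastmod").text = date.today().isoformat()

# Write sitemap.xml, then reference it in robots.txt and submit it in Google Search Console.
ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```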
Not Optimizing for Local Search
The most common local SEO mistake businesses make is a lack of location-specific webpages. Search engines are becoming increasingly capable of identifying local search intent. Providing separate webpages with location-specific information can improve your CTR and make it easier for consumers to contact you. Businesses should be registered in all relevant local directories and keep active Facebook, Yelp, Foursquare, and Google profiles in order to leverage local search and respond to user comments.
Take advantage of structured data markup, which helps search engines build rich snippets for your listings. It can allow your business to promote upcoming events and earn sitelinks for specific user searches. Optimizing for local search gathers the right kind of traffic: the kind that drives conversions.
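Structured data is usually added as JSON-LD in a page’s HTML. As a rough illustration, the sketch below builds a schema.org LocalBusiness object in Python; the business details are invented placeholders, and you should validate the final markup with Google’s structured data testing tools.

```python
import json

# Hypothetical business details; the schema.org LocalBusiness type is real,
# but fill in your own name, address, phone number, and hours.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Widgets Co.",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
    },
    "openingHours": "Mo-Fr 09:00-17:00",
}

# Embed the output in your page inside a <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```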
Slow Page Load
Page load speed has a major impact on rankings and user experience. A majority of visitors will bounce from a website that takes more than three seconds to load. If your website contains a lot of images, videos, CSS, JavaScript, or Flash, then it is likely to suffer slow page loads. Optimize all on-page elements and files for speed, and consult Google’s own PageSpeed tools to make sure your website is operating at peak efficiency. It may be worth stripping out some of these elements if they are dragging down page speeds. If your website runs on shared hosting, consider moving it to a dedicated server for optimal page speeds.
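For a rough first pass before running Google’s PageSpeed tools, you can simply time how long the raw HTML takes to download. This only measures the HTML response, not rendering or assets, so treat it as a coarse signal; the URLs below are placeholders and the script assumes the requests package.

```python
import time
import requests  # assumes the requests package is installed

# Hypothetical pages to spot-check; this only times the HTML download,
# so use Google's PageSpeed tools for a full audit of rendering and assets.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products",
]

for url in PAGES:
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    size_kb = len(response.content) / 1024
    flag = "  <-- slow" if elapsed > 3 else ""
    print(f"{url}: {elapsed:.2f}s, {size_kb:.0f} KB downloaded{flag}")
```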
Poor Meta Data
Metadata is crucial for earning clicks. Oftentimes websites place their site name in every title tag, which limits their ability to rank for specific keywords. Title tags should be optimized with a specific keyword at the beginning of the title, followed by relevant information. Truncated title tags, title tags without content titles, or irrelevant title tag keywords will all hurt your rankings. The title tag is not only the clickable link users see first in a SERP; it is also the title that appears when a link is shared or bookmarked.
Meta descriptions matter because they persuade users to click through to your SERP listing. A concise and persuasive meta description goes a long way toward increasing your CTR. Go through all of the snippets you’ve written for your content and consider rewriting them for webpages that are not converting or have a low CTR.
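A quick audit of title and description lengths across key pages can reveal snippets that are likely to truncate. The 60- and 155-character limits below are common rules of thumb rather than hard rules (Google truncates by pixel width), the URLs are placeholders, and the script assumes requests and beautifulsoup4.

```python
import requests
from bs4 import BeautifulSoup  # assumes requests and beautifulsoup4 are installed

# Hypothetical pages to audit for missing or over-long titles and descriptions.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services",
]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    description_tag = soup.find("meta", attrs={"name": "description"})
    description = description_tag.get("content", "").strip() if description_tag else ""

    if not title or len(title) > 60:
        print(f"{url}: title missing or likely to truncate ({len(title)} chars): {title!r}")
    if not description or len(description) > 155:
        print(f"{url}: meta description missing or likely to truncate ({len(description)} chars)")
```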
Not Utilizing Correct Anchor Text in Hyperlinks
You’ve almost certainly read an article that said “click here for more information.” This is a widely missed opportunity, and it equally hurts the websites being linked to. Anchor text labels a hyperlink with a specific keyword or phrase, giving users a general understanding of what they are clicking through to and what the linked content is about. Anchor text is also instrumental in helping search engines identify the keywords you are trying to rank for and in indexing those links according to their topical relevance.
Some websites use the same anchor text for every one of their hyperlinks, which is just as wasteful. Be sure to vary your anchor text so that your links can rank for a basket of different keywords and attract more potential visitors. Anchor text is also a good way to establish your own information architecture, separating internal links by keyword topic.
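To see how much generic or repetitive anchor text has crept into a page, you can tally the visible text of its links. The sketch below assumes requests and beautifulsoup4, and both the page URL and the list of “generic” phrases are illustrative placeholders.

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup  # assumes requests and beautifulsoup4 are installed

# Hypothetical page to audit and a placeholder list of generic phrases that waste anchor text.
PAGE = "https://www.example.com/blog/some-post"
GENERIC = {"click here", "here", "read more", "more", "this link", "learn more"}

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
anchors = [a.get_text(strip=True).lower() for a in soup.find_all("a", href=True)]

for text, count in Counter(anchors).most_common():
    if text in GENERIC:
        print(f"Generic anchor text used {count} time(s): {text!r}")
    elif count > 3:
        print(f"Same anchor text repeated {count} times: {text!r} - consider varying it")
```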
Redirect Misuse
Multiple chained 301 redirects to a particular URL can signal a configuration error to search engines and can create loops in your URL structure. Website operators should also be wary when using robots.txt files, since mistakes there cause indexation errors. Robots.txt files can be used to keep crawlers away from sensitive information and duplicate content you don’t want found. Unfortunately, redirect misuse can squander all of the internal signals contained on these webpages and cause massive indexation errors for your site.
Assess all of the redirects you have in place and consult Google Search Console to set up appropriate URL parameters. If you’ve ever rebuilt your site, chances are there are 404 errors hidden within it that need to be redirected to relevant webpages. Make sure that pages blocked by robots.txt aren’t cutting off internal links you still need, and that there is a structure in place so the pages you do want indexed can still be crawled.
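Redirect chains and loops are easy to surface with a short script, since an HTTP client records every hop it follows. The sketch below assumes the requests package and uses hypothetical old URLs; point it at the legacy URLs left over from a rebuild.

```python
import requests  # assumes the requests package is installed

# Hypothetical URLs to check, e.g. old URLs left over from a site rebuild.
OLD_URLS = [
    "https://www.example.com/old-page",
    "https://www.example.com/products/legacy-item",
]

for url in OLD_URLS:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.TooManyRedirects:
        print(f"{url}: redirect loop detected")
        continue
    hops = [r.url for r in response.history]  # each intermediate redirect followed
    if len(hops) > 1:
        print(f"{url}: redirect chain of {len(hops)} hops -> {response.url}")
    if response.status_code == 404:
        print(f"{url}: returns 404 - point it at a relevant page with a single 301")
```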