By the time you finish reading this sentence, about 40 people will have hit publish on their blog posts for the day. That’s because roughly 83,000 posts go up every hour. That’s over a thousand a minute. That is why you need your local SEO to be dialed in.
SEO stands for Search Engine Optimization.
SEO is the process used to increase the number of visitors to a website, and to have that website rank higher than others in the search results. Search engines like Google and Bing use complex algorithms to find the most relevant website for any given search.
I am willing to bet an espresso that if you have a business website, it’s for doing business where you live.
Sure, some of you are looking to be found outside your area, but even then you are probably more interested in targeting a specific location than the whole world. Right?
Especially if you have a brick-and-mortar location. Basically, if you found this page, it is probably because you are interested in local SEO.
When people search Google for something, they either include their location in the search or Google pulls it from their settings.
A person in Deer Park is going to get a different list of auto mechanics than someone living in the valley. Some of the ways you can enhance your local SEO are by working on the following …
So where should we start? Citations? Content? Research / Analytics?
Get some direction before you start running
A buyer’s persona is an invented character that represents your typical customer. The more you know about who is looking for your services or products, the easier it will be to provide content that answers the questions they may have. Telling someone about an amazing solar watch when they are looking for new tires is annoying and a huge waste of time and energy. It is better to understand what they are looking for and then provide for that. Here are some ways to learn more about your customers.
Knowing where your buyer is on their buyer journey is important when targeting content for them at that stage. You can look at this as marketing or SEO depending on whether you are analyzing this information or applying the information.
There are some expensive tools out there that you can use to analyze your site and begin to figure out what keywords you should start targeting. But… I feel you can get much of the same information by using some free tools and common sense.
Google for instance.
It makes sense to use Google since that is the search engine on which you are trying to show improvement. So let’s start with what you can get out of Google.
One of the first things I like to do is Google my base search term, not my local keyword. So for something like Spokane SEO, I would start with just SEO. When you type that into the search box, Google will attempt to guess what you are searching for and will autocomplete your phrase with some suggestions.
If you scroll down to the bottom of your search results you will find some keyword suggestions there as well. So if you were looking for some keyword suggestions for “Spokane Dentist” Google would provide the following suggestions:
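If you want to pull those autocomplete suggestions in bulk instead of typing each seed term by hand, here is a small sketch. It uses Google’s unofficial suggest endpoint — an assumption on my part, since it is not a supported API and may change or be throttled at any time — so the sample response below is made-up data, not a live result.

```python
import json
from urllib.parse import urlencode

# Unofficial Google autocomplete endpoint (assumption: not a supported
# API; it may change or be rate-limited at any time).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_suggest_url(term: str) -> str:
    """Build the autocomplete request URL for a seed keyword."""
    return SUGGEST_URL + "?" + urlencode({"client": "firefox", "q": term})

def parse_suggestions(raw_json: str) -> list:
    """The endpoint returns [query, [suggestion, ...]]; keep the suggestions."""
    payload = json.loads(raw_json)
    return payload[1]

# Sample response shape (made-up data, not a live result):
sample = '["spokane dentist", ["spokane dentist reviews", "spokane dentist open saturday"]]'
print(build_suggest_url("spokane dentist"))
print(parse_suggestions(sample))
```

Fetch each URL with your HTTP client of choice and feed the suggestions back in as new seed terms to widen the list.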
Since we are talking local SEO, one of the most helpful things Google provides is the local map results. Take a look at what words those companies rank for and consider using the same words. In the following example, the top results rank for the word “Dental” instead of “Dentist”.
Name. Address. Phone number. Website. [Your local citations.]
There are a lot of local networks your business should probably be listed on. Above is an example of the citations one should strive for. The graph shows a particular company’s citations: incomplete citations are in grey, inconsistent ones in red, and duplicates in orange. The goal is to have them accurate and identical across the board. Keep in mind that the networks’ requirements differ in places, like how many photos you use and which categories are available.
Check your addresses and make sure they are the same. If you use a suite number, it should be “Suite” everywhere, not “STE” in one place and “Suite” in another. Your hours should be the same. Make sure you have the correct number of images uploaded for each citation.
What I like to do is use moz.com/local and enter the business whose citations I want to check. You can type in the name of the company and see if it comes up, or add the zip code and find the company that way.
After the graph comes up you will see citations you can add and citations you will need to fix or edit. What you are looking for is incomplete, duplicate, and inconsistent citations. When you click on a citation you will see what kinds of things need work. You may simply need more photos, or to add a category or two.
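The consistency check itself boils down to normalizing each listing’s name, address, and phone number and comparing them. Here is a sketch of that idea; the listing data is hypothetical, and in practice you would export it from a tool like Moz Local.

```python
# Sketch of a NAP (name, address, phone) consistency check across
# citation listings. All listing data below is hypothetical.

def normalize_phone(phone: str) -> str:
    """Keep digits only so (509) 555-0100 matches 509-555-0100."""
    return "".join(ch for ch in phone if ch.isdigit())

def normalize_address(address: str) -> str:
    """Lowercase and expand the 'STE' abbreviation so variants match."""
    return address.lower().replace("ste ", "suite ").strip()

def find_inconsistencies(listings):
    """Return the field names whose normalized values differ across listings."""
    bad = []
    for field in ("name", "address", "phone"):
        values = set()
        for listing in listings:
            value = listing[field]
            if field == "phone":
                value = normalize_phone(value)
            elif field == "address":
                value = normalize_address(value)
            else:
                value = value.lower().strip()
            values.add(value)
        if len(values) > 1:
            bad.append(field)
    return bad

listings = [
    {"name": "Acme Dental", "address": "10 Main St Suite 2", "phone": "(509) 555-0100"},
    {"name": "Acme Dental", "address": "10 Main St STE 2", "phone": "509-555-0100"},
    {"name": "Acme Dental LLC", "address": "10 Main St Suite 2", "phone": "5095550100"},
]
print(find_inconsistencies(listings))
```

Here the addresses and phone numbers normalize to the same values, but the business name varies, which is exactly the kind of inconsistency the red entries in the graph represent.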
Siloing your navigation
There is a saying that goes something like this: “You are trying to rank each page, not the whole site.” What this axiom leaves out is what each page is trying to rank for and how that relates to the homepage. The ideal site structure, in my opinion, is one that leverages LSI (latent semantic indexing, i.e., closely related) keywords to add importance to the homepage. We are funneling page rank from the LSI keyword pages to the homepage.
Your homepage should be focused on your main keyword and at the same time explain to people what you do and what your services are.
The other pages on your site should target LSI keywords and direct traffic back to the homepage. If your site is about dentistry, your LSI pages could cover topics like dentures and braces, for instance.
I mentioned in the research and analytics section a way of using Google to locate LSI keywords. Another method is to use LSIgraph. You just put in your main keyword and it will spit out a list of LSI keywords you can use.
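To make that structure concrete, here is a minimal sketch of a siloed setup in HTML. The business, page names, and URLs are all hypothetical placeholders:

```html
<!-- Homepage targets the main keyword; supporting pages target LSI
     keywords and funnel link equity back to the homepage. -->
<nav>
  <a href="/">Spokane Dentist</a>       <!-- main keyword page -->
  <a href="/dentures/">Dentures</a>     <!-- LSI keyword page -->
  <a href="/braces/">Braces</a>         <!-- LSI keyword page -->
</nav>

<!-- On /dentures/ and /braces/, a contextual link points back home: -->
<p>Read more about our <a href="/">Spokane dental practice</a>.</p>
```

The nav gets every page one click from home, and the contextual links on the LSI pages do the funneling.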
You should strive for a few links per page pointing to relevant material on your own site. I wouldn’t use more than, say, five per page though; you don’t want to overdo it.
“You can add schema to your HTML to improve the way your page is represented in SERPs.” – moz.com
Notice how one of these results has the rating and date with it? Little things like that grab attention and earn more clicks. That click-through rate is one key to a higher ranking. Your click-through rate should be in the 20-40% range; anything below 7% shows that your listing needs some work.
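As a sketch of how such markup can look, here is a JSON-LD block for a local business. Every name and value below is a hypothetical placeholder; swap in your own details and validate the result with Google’s Structured Data Testing Tool.

```html
<!-- JSON-LD structured data for a local business (placeholder values). -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Acme Dental",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "10 Main St Suite 2",
    "addressLocality": "Spokane",
    "addressRegion": "WA",
    "postalCode": "99201"
  },
  "telephone": "+1-509-555-0100",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "32"
  }
}
</script>
```

The aggregateRating portion is what can surface as the star rating in the search results; only include it if you genuinely have those reviews.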
Your robots.txt file should be placed in the root directory of your site.
Using your robots.txt file, you should prevent irrelevant pages on your website from being crawled. Pages like your login, staff, and testimonials pages, for starters.
Disallow: Tells the bot crawling your site to not crawl the page.
Noindex: Tells the bot crawling your site to not include the page in the search results. (Be aware that noindex inside robots.txt was never an officially supported directive; the meta tag is the dependable way to do this.)
You can also use meta tags on specific pages to prevent bots from indexing them.
Careful, though: if you tell the bot not to crawl a page, it will never read a noindex meta tag on that page, so the page can still show up in the search results (for example, via links from other sites).
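For reference, the noindex meta tag itself is a single line:

```html
<!-- In the <head> of any page you want kept out of the search results: -->
<meta name="robots" content="noindex">
```

Just remember the rule above: the page must remain crawlable for the bot to ever see this tag.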
You can also tell the bots to not follow links, but that is a topic for another section.
User-agent: *
Disallow: /
Disallow: /login/
Allow: /blog/
In this example the rules apply to every type of bot; the * symbol is a wildcard meaning all of them. Disallow: / tells the bots not to crawl the site at all, and Disallow: /login/ calls out the login area specifically (redundant given the line above, but harmless). Allow: /blog/ then carves out an exception so the bots can still crawl the blog.
You can also add your sitemap to the robots.txt file; in fact, I would recommend it.
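For example, the sitemap line sits alongside the other directives (the domain here is a placeholder):

```text
User-agent: *
Disallow: /login/
Sitemap: http://www.domainname.com/sitemap.xml
```

Use the full absolute URL for the Sitemap line, not a relative path.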
One of the first things you can check is whether your site redirects to the correct version. If you want www first and then your domain name, that is just fine; just make sure the non-www version redirects to the www version. My site is a good example: if you remove the www from the domain name, it will redirect back to it. That way Google is not reading http://www.domainname.com as a copy of http://domainname.com
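If your site runs on Apache, a sketch of that redirect in .htaccess might look like this. I am assuming mod_rewrite is enabled, and the domain is a placeholder for your own:

```apacheconf
# Send non-www requests to the www version with a permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domainname\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so the www version inherits the ranking signals.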
© Copyright 2017 by Timothy Eberly. All rights reserved. Site design by Jest.Ninja