What Is The Best Time To Start SEO

If you are a website owner who wants to get the best out of SEO, you need to understand that the timing of your SEO work is just as important as the work itself.

Whether you are launching a new website or redesigning an existing one, the best time to start SEO is before the site is released to search engines for indexing and ranking. In other words, SEO work and website design should start at the same time.

Some aspects of SEO are unique to a new website, some are unique to a redesigned website, and others are common to both cases. Nik Chaykovskiy, an expert at Semalt Digital Services, explains the features of doing SEO for each purpose.

SEO For New Websites

For new websites, SEO should run concurrently with web design and user experience (UX) work. This means the SEO strategy will determine the kind of content that will be used and where it will be placed on the website. Success in SEO is, therefore, the result of intertwining web design, UX, and SEO. Starting SEO during the web design process accelerates results.

However, this has another implication for your business: your UX and web design experts must understand SEO, and your SEO specialists must understand UX and web design. These experts might need training in four critical areas (design, UX, front-end development, and SEO), but the effort is definitely worthwhile. Brainstorming becomes easier, and the chances of the new website succeeding increase significantly.

SEO Tips For Website Redesign

If you've already launched your site, the best thing to start with is an SEO audit. An SEO audit helps you identify your SEO strengths, such as the pages that search engines rank highly. It also tells you which SEO aspects make those pages perform well, so you can preserve or replicate those strengths as you redesign the website.

Also important during a website redesign is having a 301 redirect plan. As the old URLs change, the redesign team should carefully redirect traffic to the new URLs so that traffic is not lost once the redesigned website goes live. Forgetting this step can be, and for a few businesses has been, lethal. Imagine what would happen if visitors were greeted by a “404 – Page Not Found” error when they tried to access your site's pages.
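In practice, a redirect plan is just a mapping of old URLs to new URLs, and it can be verified automatically once the redesigned site is on a staging server. Below is a minimal sketch in Python of what such a check might look like; the redirects.csv file name and its two-column layout (old URL, new URL) are assumptions for illustration, and the third-party requests library must be installed.

```python
# Minimal sketch: verify that every old URL returns a 301 pointing at the
# intended new URL. Assumes a hypothetical redirects.csv with rows of
# old_url,new_url and that the requests library is installed.
import csv
import requests

def check_redirects(mapping_file="redirects.csv"):
    with open(mapping_file, newline="") as f:
        for old_url, new_url in csv.reader(f):
            resp = requests.head(old_url, allow_redirects=False, timeout=10)
            location = resp.headers.get("Location", "")
            if resp.status_code != 301:
                print(f"{old_url}: expected 301, got {resp.status_code}")
            elif location.rstrip("/") != new_url.rstrip("/"):
                print(f"{old_url}: redirects to {location or '(nothing)'}, expected {new_url}")
            else:
                print(f"{old_url}: OK -> {new_url}")

if __name__ == "__main__":
    check_redirects()
```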

SEO Is An Ongoing Activity

For both new websites and redesigned ones, SEO is an ongoing activity. In today's fast-paced technological atmosphere, most SEO aspects change within short time spans. At one time you'll need to install an SEO plugin, and at another, reshape your content according to the current trend. Failing to keep up with SEO trends can be a death sentence for a business, especially one whose sales rely heavily on digital marketing.

Know the right SEO steps to take now to make your SEO strategy and website more successful. If your site is already running, engage an SEO professional and make sure a thorough SEO audit is done. This is the best way to know what actions best suit your website with regard to SEO success. And if you are planning on building a new site, involve your SEO team right from the start. SEO done during the early stages of website development helps to avoid SEO mishaps in the structure of the site that could cost you time and money in the future.

Why Good Content Cannot Replace SEO

Lately, it has become a common belief that as long as you have good content, the rest will take care of itself. It is true that SEO and content marketing are inextricably linked, and good content is necessary for SEO. However, a successful optimization campaign needs much more than good content alone.

The Customer Success Manager of Semalt Digital Services, Nik Chaykovskiy explains why quality content is not enough for running SEO efficiently.

In theory, the idea is accurate. All search engines strive to provide their users with the best content, so their algorithms rank good content higher. By producing more content, you create more indexable topics that cover more search requests. Furthermore, as long as the content is good, more users will visit your site.

On the other hand, if you have no content at all, you do not stand a chance of running efficient SEO. If the material is poor and untrustworthy, your results will be similar. For your content to count as good, it has to meet a variety of standards, ranging from uniqueness and practicality to relevance and entertainment value.

Let's assume that your content is good and that you produce it on a regular basis. That content is futile as long as it is not visible. If users are not aware of your work, they cannot read or view it. For all its advances, Google still relies on user feedback to help rank the quality of content. If users cannot see your work, Google cannot judge its quality.

Much of this feedback is provided through shares and links, which Google considers trustworthy signals. By earning lots of links, you come to be seen as a good source of content and, as a result, you rise in the search rankings. However, these links are not earned by good content alone. You have to take the initiative by promoting and syndicating your content, and sometimes by building some links manually as well.

Having good content on your site is a good start. However, you should never neglect the technical factors that are necessary for your site to rank highly in search engine results. Most template platforms, like WordPress and Wix, come equipped with a technical structure that makes them easier for search engines to index.

This, however, is still not enough. You will need to create meta data and title tags, improve the security of your site, update your robots.txt file, create and maintain a sitemap, and increase the speed of your site if you want it to be in fighting shape.
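To make a few of those items concrete, here is a rough Python sketch that spot-checks a page's title tag, meta description, robots.txt, sitemap, and response time. The domain is a placeholder, the regex checks are deliberately simplistic, and a dedicated crawler or audit tool will go much deeper.

```python
# Rough spot-check of a few technical basics: title tag, meta description,
# robots.txt, sitemap.xml, and response time. The domain is a placeholder.
import re
import requests

def basic_technical_check(base_url="https://www.example.com"):
    resp = requests.get(base_url, timeout=10)
    html = resp.text

    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    meta_desc = re.search(r'<meta[^>]+name=["\']description["\']', html, re.I)

    print("Response time (s):", round(resp.elapsed.total_seconds(), 2))
    print("Title tag:", title.group(1).strip() if title else "MISSING")
    print("Meta description:", "present" if meta_desc else "MISSING")

    for path in ("/robots.txt", "/sitemap.xml"):
        status = requests.get(base_url + path, timeout=10).status_code
        print(path, "->", "OK" if status == 200 else f"HTTP {status}")

if __name__ == "__main__":
    basic_technical_check()
```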

You can only tap into the true power of content if you integrate it with a variety of other marketing strategies. For instance, you can use email marketing and social media marketing to amplify your content. By using these strategies in conjunction with each other, you stand a much better chance of making the most of your content. SEO is a complex strategy; it cannot be boiled down to a single focus.

How The Amount Of Funds Invested In SEO Correlates With The Benefits Gained

The amount of funds you should invest in SEO varies because of the range of prices offered by agencies. However, practice shows that a high cost of services does not always guarantee the success of your SEO campaign. Does a larger initial investment correspond to greater benefits? Is there a certain threshold businesses must reach to reap the rewards?

Nik Chaykovskiy, a leading expert at Semalt Digital Services, explains what factors influence the return on investment in SEO.

The amount of time and money spent on a strategy affects both the quantity and the quality of the work produced. The compounding payoff depends largely on the following elements, so increasing the input of one or more of them increases the rate at which returns grow, as the toy model after the list illustrates.

  • Quantity is the amount of work done on a regular basis. More quality links increase domain authority, while more quality content increases the number of pages in Google's index.
  • Quality depends on the time dedicated to the work. A single link from a high-authority source creates more value than several links from low-ranking sources.
  • Time. Returns on an investment are not instantaneous; they snowball over time.
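To see why these inputs compound rather than simply add up, consider the toy model below. The numbers are made up purely for illustration: monthly traffic is modeled as the product of accumulated pages (quantity) and an authority multiplier that grows with link quality.

```python
# Toy model with made-up numbers: traffic as the product of accumulated
# pages (quantity) and an authority multiplier that grows with link quality.
def simulate(monthly_pages, link_quality, months=24):
    pages, authority = 0, 1.0
    for _ in range(months):
        pages += monthly_pages                   # quantity: content added each month
        authority *= 1 + 0.02 * link_quality     # quality: better links compound authority
    return round(pages * authority)              # time: the product snowballs

print(simulate(monthly_pages=4, link_quality=1))   # smaller, steady investment
print(simulate(monthly_pages=8, link_quality=2))   # doubling both inputs more than doubles the result
```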

Competition Factor

A partial investment by a business in a market with many competing sites will not yield as much as it would for a company that has fully invested in developing a content marketing strategy. However, there are ways to increase a site's visibility with a minimal budget. A company can focus its content on a particular market niche or a local geographic area, optimizing for the target audience found in those segments. The result is an increase in the relevance of the content provided.

Quality Threshold

Another point of consideration is the level of quality that content needs to satisfy before being released. Poorly written or thin content can trigger a Panda penalty that will significantly drop rankings in search results. It also threatens the brand's reputation, visitor traffic to the website, and any relationships with publishers who aid in link building. A minimum investment ensures the quality and growth potential a business needs to stay level with the competition. Ranking higher than competitors requires a content strategy that surpasses theirs and draws on knowledge of best practices in content marketing.

Complexity of Aspects

Taking into account the points discussed above, we can draw the following conclusions:

  • A business gains disproportionately greater returns as its initial investment increases; the payoff is not a one-to-one exchange
  • Competition limits the visibility a small investment can buy, but does not make visibility impossible
  • Reaching a certain quality threshold for content and link building ensures that a business gains momentum

In conclusion, SEO is not an all-or-nothing strategy that demands an entire marketing budget before showing a significant effect. Investing time and effort helps in achieving quality. The more one invests, the greater the returns; smaller investments are also effective if spent in the right way.

Why Incoming Links Are Crucial For Your SEO

Often, when Google releases information about the procedures that regulate website rankings, I advise clients to change nothing at all. Our company has never been involved in search engine spam tactics such as creating insignificant, low-quality incoming links. Creating exclusive content and quality links has always been our focus.

Still, we deal with cases where clients come to us saying that a Google update, such as the Penguin 2.1 refresh, affected their rankings. Clients usually think it points to some problem with their own process, but we discover that most of them had bad incoming links created by the previous SEO company they worked with.

Incoming links are an essential element in determining a website's ranking in Google. In the past, a website with the highest number of incoming links would rank above a competing website with fewer backlinks. Firms would create lots of incoming links regardless of their quality or whether they made any sense, since Google didn't care about the relationship between the links and the websites. Then Google changed course. Panda was first introduced in early 2011, and Penguin was released in 2012. All of a sudden, the quality of incoming links mattered more than the quantity, and the firms that had collected large numbers of backlinks were also found to be using the wrong techniques in content creation. According to Semalt expert Andrew Dyhan, an online marketing specialist, what Panda started, Penguin finished. Many websites witnessed a drastic fall in rankings, revenue, and traffic, and one of our clients was caught up in it. This was a great shock for the affected webmasters whose sites got hit.

Bad incoming links look harmless until the day they hurt a great deal. Clients don't realize the urgency of getting rid of bad backlinks until we show them an analysis of their account. In the early stages, removing bad links is not very time-consuming or expensive; later, once the losses are bad enough, removing them becomes an urgent priority you will be ready to pay almost any amount for. We typically see positive results soon after we embark on bad link removal, but unfortunately some clients expect the results to be immediate.

To retain clients, any SEO firm should look out for bad links immediately and eliminate them. We are now upfront with potential customers about bad link removal and, where necessary, advise them to invest in it from the start. The larger lesson is to understand where Google is headed and steer clients in that direction. For SEO customers, staying up to date with the trends is no longer optional if they want to succeed. Likewise, ignoring the need for assistance from an SEO firm will very likely hand the win to your competitors, so working with professionals is important.

What Links Cause SEO Penalties?

Search engine optimization (SEO) experts understand the importance of link building. Essentially, link building is one of the cornerstones of an effective SEO strategy, because Google's algorithm relies heavily on inbound links to determine a website's authority, which in turn influences organic search rankings.

Links are a critical factor in increasing brand visibility and referral traffic. In spite of this, a recent survey indicates that only 62 percent of marketers engage in link building. So why do some marketers avoid this strategy? Andrew Dyhan, the Customer Success Manager of Semalt Digital Services, explains the factors that make link building a critical aspect of SEO.

Fear of Google penalties is the primary reason many marketers avoid building links. The fear is understandable; however, in most cases the threat is overrated. Google's penalties are rooted in its Penguin update: if you build links that violate Google's terms of service, the search engine responds by burying your website in a deep sea of content where users won't find you. This translates into less traffic and lower rankings. So, what are the unhealthy links that earn you penalties?

Links from bad sites

Links from low-authority sources and spammy sites are the first type of links you want to avoid. At the most basic level, the value of a link is determined by the authority of the site it emanated from. In other words, if you source links from high authority sites, you command more authority on your site. On the other hand, if you build links from a questionable or spammy site, the authority of your domain takes a beating.

Contextually inappropriate links

Unlike in the past, Google's algorithms are now advanced enough to detect how content fits the needs of the audience and whether language is used naturally. In simple words, if you link to content that has nothing to do with the piece, Google will flag it and punish you for trying to mislead users.

Keyword-stuffed links

Initially, it was common practice to include keywords in the anchor text of your links. Today, doing that might get you penalized, because SEO enthusiasts started abusing the practice by stuffing keywords into links where they did not belong. You can still optimize your anchor text; however, it must be contextually appropriate for the link.

Spammed links

Spammy links include posting comments on a forum with just a link to your website and no other content. Why? Because the main goal of such a link is to drive traffic to your site without giving any value to readers. In addition, Google can penalize you if you place links on the same pages of the site repeatedly.

Links from schemes

Any link you build with the sole intention of driving traffic to your site without giving the user any valuable information is suspect and subject to Google penalties. There are a number of such links including reciprocal links and link wheels where the intention is to pass authority to sites within the wheel. To find out what Google considers as link schemes, read their article on the subject to avoid getting in trouble with the search engine.

Other techniques of manipulating site rankings

Normally, Google's main aim is to reduce the possibility of SEO enthusiasts manipulating their site rankings using links. As long as you are using links in a way that benefits users, there is nothing to worry about. However, if you are trying to use underhanded methods to drive traffic and manipulate rankings, you are setting your site up for a Google penalty.

Ultimately, an official Google penalty is a manual action similar to blacklisting. This is what strikes fear into every webmaster, but most of the time Google's heavy hand only comes down on intentional offenders. Webmasters often panic and think they have been penalized when their site experiences a drop in traffic. But if you avoid running afoul of Google's rules, or work with a specialized SEO services provider that tracks your website's performance, you will have nothing to worry about.

Content Planning for Local SEO

Content is the foundation of digital marketing success, no matter the channel on which that content appears. Content determines which businesses social media fans and followers choose to associate with, how visitors choose to engage with your website, and which keywords search engines find your site relevant for.

If you’re like most small business owners, you probably have no trouble talking to friends, relatives, business partners, and prospective customers about your business: the kinds of people you help, the pride you take in your work, what customers value about your business, and so on.

But it’s tough to find time to write about your business. It can be a struggle to find the exact right words to describe your business to the World Wide Web.

Fear not! Your website content doesn’t need to be perfect. In fact, it will appear more authentic to your customers—and probably be more useful to them—if it isn’t filled with refined marketing language, and actually answers their questions about your company’s products or services.

With that in mind, here are some ideas to get you started with creating content:

What are the top things users look for?

Google and Bing both provide a very simple method for researching the key phrases your prospective customers are interested in: visit either search engine's homepage and start typing a search. Before hitting “return” on your keyboard, look at the list of suggested terms related to what you typed. These are generally the most popular words or phrases related to your query. Make a list of these terms and be sure to target a page on your website at each one. Repeat this process several times to develop a comprehensive list of subjects to start your content process.
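If you want to collect these suggestions in bulk rather than one search at a time, the sketch below uses Google's unofficial autocomplete endpoint. That endpoint is not a supported API and can change or be rate-limited without notice, and the seed terms shown are just placeholders.

```python
# Sketch: pull autocomplete suggestions for a few seed terms. Uses Google's
# unofficial suggest endpoint, which is unsupported and may change at any time.
import requests

def autocomplete_suggestions(seed_term):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed_term},
        timeout=10,
    )
    return resp.json()[1]  # second element of the response holds the suggestion list

for term in ("emergency plumber", "water heater repair"):  # placeholder seed terms
    print(term, "->", autocomplete_suggestions(term))
```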

Research keywords using Google Trends

Google Trends can provide you with a few more specifics around the relative search volume of each keyphrase that Google or Bing suggests. You can even zoom in to your specific geographic area to see just how popular certain phrases are in your state, metro area, or in some cases even your city. Google will also suggest even more key phrases related to these phrases next to the geographic overlay, so don’t ignore these.
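For repeatable research, the community-maintained pytrends library (pip install pytrends) wraps Google Trends in Python. It is unofficial, so it can break whenever Google changes its endpoints; the keywords and the US-MN geo code below are placeholders, and this is only a sketch of the idea.

```python
# Sketch using pytrends, an unofficial wrapper around Google Trends.
# Keywords and the geo code are placeholders.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
keywords = ["emergency plumber", "water heater repair"]
pytrends.build_payload(keywords, timeframe="today 12-m", geo="US-MN")

print(pytrends.interest_over_time().tail())   # relative search volume over the past year
related = pytrends.related_queries()          # per-keyword 'top' and 'rising' suggestions
print(related[keywords[0]]["top"])
```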

What are the top questions your customers ask you?

This is a great tip from Aaron Weiche of Spyder Trap Marketing. If customers are asking you the same questions over and over again offline, they probably have the same questions online as well—and may even type these questions directly into a search engine. Each of your top questions should have a full page devoted to it to maximize your ranking potential for each question.

What’s unique about the areas you serve?

From an SEO standpoint, it’s a best practice to create a page for each town, county, or region that you serve. For example, if you’re a suburban plumber looking for business in the major city in your metro area, you could talk about the history of the sewage and water system of that market on its own page, highlight subdivisions or condo buildings that have a higher incidence of plumbing issues, or list lawsuits that have occurred over faulty pipes in that market. The more local the “scent” of a given page, the more likely Google and Bing are to rank that page.

Case studies of previous projects

You can also start a little bit closer to home, so to speak, and feature projects you’ve worked on in a particular market. Be as explicit as you can about the services you performed, or how your products helped the customers achieve their goals. Case studies are one of the things that make your business unique, so stay away from using canned marketing-speak, and focus on telling stories that will help future customers relate to previous ones.

Customer interviews and transcriptions

The best way to help future customers relate to previous customers is through video interviews and testimonials. In the era of smartphones, it’s super-simple to film and upload video interviews to YouTube and embed them on your website. The personality of your clients and customers will really shine through the video. Make sure you include a text transcript of your conversation below the video so that you get keyword “credit” from the search engines also.

For more information on writing great content that is compelling to both humans and search engines, see Moz’s complete Beginner’s Guide to SEO.

Local Search Data Providers

Many local business owners are surprised by the information that appears when they (and their customers) come across their business listings on Google and Bing. Often, incorrect or out-of-date information shows up with no explanation of where it comes from.

In some cases, even business owners who have already claimed their listings at major search engines like Google and Bing continue to see improper information displayed about their businesses, which understandably just adds to their frustration.

The reason this happens is that these search giants pull in business information from a variety of other sources, in addition to maintaining their own business databases. They both do the best they can to match the data that comes in from these other sources with what they have in their own index, but sometimes that doesn’t happen properly.

If the information is different enough from the correct listing, search engines might think it’s a different business—or they might even feel that the wrong information appears so many times in the other places from which they get their data that the info might actually be “right.”

The sources that Google and Bing pull information from vary from country to country. Each has its own set of important players, known as data aggregators.

These aggregators have typically accumulated their business databases by scanning and transcribing things like phone records, utility records, business registration websites, and printed yellow pages directories.

Google also crawls the web looking for business information wherever it can find it: online yellow pages directories, review sites, local newspaper sites, and blogs. Many of these sources get their information from the same aggregators that Google does—just one more reason you need to make sure your business information is correct at those handful of primary providers in your country. If your data is wrong at those aggregators, it’s likely to be wrong in many places across the web, including Google.

The data aggregators of the future

Factual is a relatively new player on the scene; they were hardly on anyone’s radar less than two years ago. And yet today, if you visit their homepage, you see a who’s who of local search portals, including Yelp, Bing, and TripAdvisor. It’s clear they’re a force to be reckoned with, especially globally.

The fragmentation of the location-based app market is only going to increase, and like Factual, Foursquare has turned its sights on becoming “the location layer for the Internet.” Its developer service is extremely reliable and it surely counts a large percentage of web developers among its ~40 million users. Foursquare is now enlisting users in a quest to provide extremely fine-grained venue data. The ability to layer user-generated data on top of business information is clearly the direction this ecosystem is heading. Google’s Mapmaker tool, Open Street Maps, and Foursquare position those entities to remain at the forefront of this trend.

Making sense of it all

Even for experts, the local search ecosystem is incredibly confusing! But hopefully browsing the local search ecosystem graphic relevant to your country will give you a better understanding of how these local data sites fit together, and identify places to clean up incorrect listing information you might not otherwise have known about.

How to Perform the Ultimate Local SEO Audit

Every business that competes in a local market and for localized results in SERPs will likely need to conduct a local SEO audit at some point. Whether you've hired an SEO in the past or not, the best way to beat the competition is to know where you stand and what you need to fix, and then develop a plan to win based on the competitive landscape.

While this may seem like a daunting task, the good news is that you can do this for your business or your client using this Ultimate Local SEO audit guide.

This guide was created as a complete checklist and will show you what areas you should focus on, what needs to be optimized, and what you need to do to fix any problems you encounter. To make things easier, I have also included many additional resources for further reading on the topics below.

In this guide I am going to cover the top areas we review for clients, whether they want to know how they can improve or need a full local SEO audit. To make it easier, I have included detailed explanations of the topics and an Excel template you can use to conduct the audit.

Also, since the Pigeon update, local search has started to weigh organic factors more heavily, so I have included them in this audit. However, if after reading this you're looking for an even deeper organic SEO audit, you should also check out Steve Webb's article, “How to Perform the World's Greatest SEO Audit.”

Who is this guide for?

This guide is intended for businesses that already have an existing Google My Business page. It's also mostly geared towards brick-and-mortar stores. If you don't have a public address and you're a service-area business, you can ignore the parts where I mention publishing your physical address. If you don't have a listing set up already, it's a little harder to audit. That said, new businesses can use this as a road map.

What we won’t cover

The local algorithm is complicated and ever evolving. Although we can look at considerations such as proximity to similar businesses or driving directions requests, I have decided to not include these since we have limited control over them. This audit mainly covers the items the website owner is in direct control over.

A little background

Being ready and willing to adopt change in online marketing is an important factor on the path to success. Search changes, and you have to be ready to change with it. The good news is that if you're constantly trying to do the right thing while being the least imperfect, your results will only get better with updates.

Some goons will always try to cheat the systems for a quick win, but they will get caught and penalized eventually. However, if you stick with the right path you can sleep easier at night knowing you don’t have to worry about penalties.

But why are audits so important?

At my company we have found, through a lot of trial and error, that we provide the best results for our clients when we start a project with a complete understanding of it, as opposed to just bits and pieces. If we have a full snapshot of their SEO efforts along with their competition, we can create a plan that is much more effective and sustainable.

We now live in a world where marketers not only need to be forward thinking with their strategies but they must also evaluate and consider the work done by prior employees and SEOs who have worked on the website in the past. If you don’t know what potential damage has been done, how could you possibly be sure your efforts will help your client long term?

Given the impact and potential severity of penalties, it’s irresponsible to ignore this or participate in activities that can harm the client in the long run. Again, sadly, this is a lesson I have learned the hard way.

What aspects does this local SEO audit cover?

Knowing what to include in your audit is a great first step. We have broken our audit down into several different categories we find to be essential to local SEO success. They are:

1) Google My Business page audit

2) Website & landing page audit

3) Citation analysis

4) Organic link & penalty analysis

5) Review analysis

6) Social analysis

7) Competition analysis

8) Ongoing strategy

Analyzing all of these factors will allow you to develop a strategy with a much better picture of the major problems and what you’re up against as far as the competition is concerned. If you don’t have the full picture with all of the details, then you might uncover more problems later.

Before we get started, a disclaimer

In this guide I am going to try to break things down to make it easy for beginners and advanced users. That being said, it’s a wise idea to seek advice or read more about a topic if you don’t quite understand it. If something is over your head, please don’t hesitate to reach out for clarification. It’s always better to be safe than sorry.

How to use this guide for your local SEO audit

This guide is broken into two parts: this post and a spreadsheet. The written part you are reading now corresponds to the spreadsheet, which allows you to collect pertinent client intake information, record problems, and serve as an easy reference for your ultimate goal on each item.

To use the spreadsheet you can click the link and then go to File > Make A Copy.

The complete spreadsheet includes five tabs that each serve a different purpose. They are:

Current info – This tab allows you to record the information the customer submits and compare it against the Google My Business information you find. It also allows you to record your notes for any proposed changes. This will help you when it comes time to report on your findings.

Questions to ask – These are some basic questions you can ask your clients up front that may save a lot of time in the long run.

Competitor information – You can use this tab to track your competitors and compare your metrics side by side.

Top 50 citations audit – This is the list of the top 50 citation sources as provided by Whitespark.

Audit steps – For the more advanced user I took everything in this long document and condensed it to this easy to use spreadsheet with an audit checklist and some small notes on what you’re checking for.

Get your audit shoes on. Now let’s get started

Step 1: Gather the facts

Whether you’re conducting this audit for a client or your own business it’s important to start off with the right information. If clients fill out this information properly, you can save a lot of time and also help identify major issues right off the bat. Not only can we help identify some of the common local SEO issues like inconsistent NAP with this information, we can also have it recorded in the spreadsheet I mentioned above.

Since this is an audit, the spreadsheet has a column for the current information and a column for proposed changes for the client. Later, these will be used as action items.

The first tab in this spreadsheet has everything we need to get started under the company information tab. This includes all of the basic information we will need to be successful.

This information should be provided by the client up front so that we can compare it to the information already existing on the web. You can use the audit spreadsheet and enter this under the “Provided Information” column. This will help us identify problems easily as we collect more information.

The basic information we will need to get started includes the NAP details (business name, address, and phone number) along with a few other items.
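Once that information is in the spreadsheet, even a quick automated comparison can flag obvious NAP mismatches. The sketch below uses hypothetical data and a deliberately crude normalization; in practice you would compare against listings pulled from the citation sources covered later in the audit.

```python
# Sketch: flag NAP inconsistencies between client-provided details and
# listings found on the web. All data here is hypothetical.
def normalize(value):
    return "".join(ch for ch in value.lower() if ch.isalnum())

provided = {
    "name": "Acme Plumbing, Inc.",
    "address": "123 Main St, Springfield, IL 62701",
    "phone": "(555) 555-0100",
}

found_listings = [
    {"source": "Google My Business", "name": "Acme Plumbing Inc",
     "address": "123 Main Street, Springfield, IL 62701", "phone": "555-555-0100"},
    {"source": "Old directory", "name": "Acme Plumbing & Heating",
     "address": "45 Oak Ave, Springfield, IL 62702", "phone": "555-555-0199"},
]

for listing in found_listings:
    for field in ("name", "address", "phone"):
        if normalize(listing[field]) != normalize(provided[field]):
            print(f"{listing['source']}: {field} mismatch -> {listing[field]!r}")
```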

Questions to ask up front

Once we have the basic company information we can also ask some questions. Keep in mind that the goal here is to be the least imperfect. While some of these factors are more important than others, it’s always good to do more and have a better understanding of the potential issues rather than taking shortcuts. Shortcuts will just create more work later.

Feel free to edit the spreadsheet and add more questions to your copy based on your experience.

1) Have you ever been penalized or think you may have been? The client should have a good idea if they were penalized in the past.
2) Have you ever hired anyone to build citations for you? If they hired anyone to build citations for them they should have some documentation which will make the citation audit portion of the audit easier.
3) Have you ever hired an SEO company to work with you? If they hired an SEO in the past it’s important to check any work they completed for accuracy.
4) Have you ever hired anyone to build links for you? If they have hired anyone in the past to build links they will hopefully have documentation you can review. If you see bad links you know you will have your work cut out for you.
5) What are the primary keywords you want to rank for? Knowing what the client wants and developing a strategy based off this is essential to your local SEO success.
6) Have you ever used another business name in the past? Companies that used a different name or that were acquired can lead to NAP inconsistencies.
7) Is your business address a PO Box? PO Boxes and UPS boxes are a no-no. It's good to know this up front before you get started.
8) Is your phone number a landline? Some local SEOs claim that landlines may provide some benefit. Regardless, it's good to know where the phone number is registered.
9) Do other websites 301 redirect to your website? If other websites redirect to their domain you may need to do an analysis on these domains as well. Specifically for penalty evaluation.
10) Did you ever previously use call tracking numbers? Previously used call tracking numbers can be a nightmare as far as local SEO is concerned. If a client previously used call tracking numbers you will want to search for these when we get to the citation portion of this document. Cleaning up wrong phone numbers, including tracking numbers, in the local ecosystem is essential to your local success.


Local SEO audit phase 1: Google My Business page

The new Google My Business Dashboard has a lot of useful information. Although I reference the Google Guidelines below, be sure to check them often. Google does change these sometimes and you won’t really get any official notice. This happened rather recently when they started allowing descriptive words in the business name. Keep in mind that if any changes were recently made to your Google My Business page they may not show in the live version. It may take up to three days for these to show in the search results.

Any information collected below should be put in the “Current Info” tab on the spreadsheet under the Google My Business Information. This will also help us identify discrepancies right away when we look at the spreadsheet.

1. Locate the proper Google My Business page we should be working with

We can't really get started with an audit unless we know the proper page we're working with. Usually, if a client hires you, they already have this information.

How to do this: If your client already has a Google My Business login, log in to their dashboard using the proper credentials. The back end of the dashboard should show the businesses associated with this account. Copy this URL and confirm with the business owner that this is the page they intend to use. If it's not their primary one, we will correct this a bit later below.

Goal: We want to find and record the proper Google My Business URL in our Local SEO Audit Spreadsheet.


2. Find and destroy duplicate pages

Editor’s Note: Since the publication of this post, Google has shut down Mapmaker. For a current list of best practices for managing duplicate GMB listings, read: https://moz.com/blog/delete-gmb-listing.

Duplicate Google My Business listings can be one of the greatest threats to any local SEO campaign.

How to: There are several ways to find possible duplicate pages, but I have found the easiest way is to use Google MapMaker. To do this, log in to your Google account and visit http://www.google.com/mapmaker or https://plus.google.com/local. From either page you can search for the business phone number, such as 555-555-5555, or the business name. If you see multiple listings you didn't know about, a major priority is to record those URLs and delete the duplicates.

I personally see a lot of issues when dealing with attorneys where each attorney has their own profile or in the case where an office has moved. There should only be one listing and it should be 100% correct.

You can also read my previous MOZ article.

Goal: Make sure there are no duplicate listings. Kill any duplicates.


3. Ensure that the local listing is not penalized (IMPORTANT!)

Figuring out Google penalties in the local landscape is not usually a walk in the park. In fact, there are a lot of variables to consider, and this is an even bigger deal post-Pigeon, as more organic signals are involved. We will look at other types of penalties later in this guide. Unlike organic penalties, Google does not notify businesses of local penalties unless the account is suspended, in which case a big red warning appears on the back end of your My Business page.

According to Phil Rozek from Local Visibility System “My first must-look-at item is: is the client’s site or Google Places page being penalized, or at risk of getting penalized?”

How to do this: If your keyword is “Los Angeles personal injury attorney,” search for it on both Google Maps and in regular Google Search results. If your business listing appears on the Maps side, in position C for example, but does not appear at all in the local results of a normal Google search, it's likely there is a penalty in place. Sometimes you see listings that are not suppressed on the Maps side but are suppressed on the Places side. This is an easy way to take a look.

Goal: Do your best to determine whether the listing is penalized. If it is, consult a penalty expert for further guidance.


4. Is the Google My Business page associated with an email address on the customer’s domain?

In my experience, it's best practice to keep the login information for the business under an email address associated with the domain name. This also ensures that the client has primary control of their listing. As an example, if you run Moz.com and have local listings, your Google My Business login should be something@moz.com instead of something@gmail.com. This helps establish that you are indeed the business owner.

How to: If someone else owns your Google My Business page you can transfer it to yourself. Read Google’s Transfer Ownership guide.

Goal: The Google My Business login should be on an email address on the customer's domain.

9 Things You Need to Know About Google’s Mobile-Friendly Update

Rumors are flying about Google’s upcoming mobile-friendly update, and bits of reliable information have come from several sources. My colleague Emily Grossman and I wanted to cut through the noise and bring online marketers a clearer picture of what’s in store later this month. In this post, you’ll find our answers to nine key questions about the update.

1. What changes is Google making to its algorithm on April 21st?

Answer: Recently, Google has been rolling out lots of changes to apps, Google Play, the presentation of mobile SERPs, and some of the more advanced development guidelines that impact mobile; we believe that many of these are in preparation for the 4/21 update. Google has been downplaying some of these changes, and we have no exclusive advance knowledge about anything Google will announce on 4/21, but based on what we have seen and heard recently, here is our best guess of what is coming (on 4/21 or soon thereafter):

We believe Google will launch a new mobile crawler (probably with an Android user-agent) that can do a better job of crawling single-page web apps, Android apps, and maybe even Deep Links in iOS apps. The new Mobile-Friendly guidelines that launched last month focus on exposing JS and CSS because Android apps are built in Java, and single-page web apps rely heavily on JavaScript for their fluid, app-like experience.

Some example sites that use Responsive Design well in a single-page app architecture are:

Also, according to Rob Ousbey of Distilled, Google has been testing this kind of architecture on Blogspot.com (a Google property).

Google has also recently been pushing for more feeds from Trusted Partners, which are a key component of both mobile apps and single-page web apps since Phantom JS and Prerender IO (and similar technologies) together essentially generate crawlable feeds for indexing single-page web apps. We think this increased focus on JS, CSS, and feeds is also the reason why Google needs the additional mobile index that Gary Illyes mentioned in his “Meet the Search Engines” interview at SMX West a couple weeks ago, and why suddenly Google has been talking about apps as “first class citizens,” as called out by Mariya Moeva in the title of her SMX West presentation.

A new mobile-only index to go with the new crawler also makes sense because Google wants to index and rank both app content and deep links to screens in apps, but it does not necessarily want to figure them into the desktop algorithm or slow it down with content that should never rank in a desktop search. We also think that the recent increased focus on deep links and the announcement from Google about Google Play's new automated and manual review process are related. This announcement indicates, almost definitively, that Google has built a crawler capable of crawling Android apps. We believe that this new crawler will also be able to index more than one content rendering (web page or app screen data-set) to one URL/URI, and it will probably focus more on feeds, schema, and sitemaps for its own efficiency. Most of the native apps that would benefit from deep linking are driven by data feeds, and crawling the feeds instead of the apps would give Google the ability to understand the app content, especially for iOS apps (which they are still likely unable to crawl), without having to crawl the app code. Then, it can crawl the deep-linked web content to validate the app content.

FYI: Gary Illyes mentioned that Google is retiring its old AJAX indexing instructions, but did not say how they would be replaced, except to specify in a Google+ post that Google would not click links to get more content; instead, it would need an OnLoad event to trigger further crawling. These webmaster instructions for making AJAX crawlable were often relied on as a way to make single-page web apps crawlable, and we think that feeds will play a role here, too, as part of the replacement. Relying more heavily on feeds also makes it easier for Google to scrape data directly into SERPs, which it has been doing more and more. (See the appendix of this slide deck, starting on slide 30, for lots of mobile examples of this change in play already.) This will probably include the ability to scrape forms directly into a SERP, à la the form markup for auto-complete that Google just announced.

We are also inclined to believe that the new “Mobile-Friendly” designation in mobile SERPs may be temporary, lasting only as long as SEOs and webmasters need the incentive to make their CSS and JavaScript crawlable and get into the new mobile index. “Mobile-Friendly” in the SERP is a bit clunky and takes up a lot of space, so Google may decide to switch to something else, like the “slow” tag shown to the right, originally spotted in testing by Barry Schwartz. In fact, showing the “Slow” tag might make sense later in the game, after most webmasters have made the updates and Google instead needs a more serious and impactful negative incentive for the stragglers. (This is Barry's image; we have not actually seen this one yet.)

In terms of the Mobile-Friendly announcement, it is surprising that Google has not focused more on mobile page speed, minimizing redirects and avoiding mobile-only errors—their historical focus for mobile SEO. This could be because page speed does not matter as much in the evaluation of content if Google is getting most of its crawl information from feeds. Our guess is that things like page speed and load time will rebound in focus after 4/21. We also think mobile UX indicators that are currently showing at the bottom of the Google PageSpeed tool (at the bottom of the “mobile” tab) will play into the new mobile algorithm—we have actually witnessed Google testing their inclusion in the Mobile-Friendly tool already, as shown below, and of course, they were recently added to everyone’s Webmaster Tools reports. It is possible that the current focus on CSS and JavaScript is to ensure that as many pages are in the new index as possible at launch.

2. If my site is not mobile-friendly, will this impact my desktop rankings as well?

Answer: On a panel at SMX Munich (2 weeks after SMX West) Zineb from Google answered ‘no’ without hesitation. We took this as another indication that the new index is related to a new crawler and/or a major change to the infrastructure they are using to parse, index, and evaluate mobile search results but not desktop results. That said, you should probably take some time soon to make sure that your site works—at least in a passable way—on mobile devices, just in case there are eventual desktop repercussions (and because this is a user experience best practice that can lead to other improvements that are still desktop ranking factors, such as decreasing your bounce rate).

3. How much will mobile rankings be impacted?

Answer: On the same panel at SMX Munich (mentioned above), Zineb said that this 4/21 change will be bigger than the Panda and Penguin updates. Again, we think this fits well with an infrastructure change. It is unclear if all mobile devices will be impacted in the change or not. The change might be more impactful for Android devices or might impact Android and iOS devices equally—though currently we are seeing significant differences between iOS and Android for some types of search results, with more significant changes happening on Android than on iOS.

Deep linking is a key distinction between mobile SERPs on the Android OS and SERPs on iOS (currently, SERPs only display Android app deep links, and only on Android devices). But there is reason to believe this gap will be closing. For example, in his recent Moz post and in his presentation at SMX West, Justin Briggs mentioned that a few sample iOS deep links were validating in Google’s deep link tool. This may indicate that iOS apps with deep links will be easier to surface in the new framework, but it is still possible that won’t make it into the 4/21 update. It is also unclear whether or not Google will maintain its stance on tablets being more like desktop experiences than they are like mobile devices, and what exactly Google is considering “mobile.” What we can say here, though, is that Android tablets DO appear to be including the App Pack results, so we think they will change their stance here, and start to classify tablets as mobile on 4/21.

How to Target Multiple Keywords with One Page – Next Level

Welcome to our newest installment of our educational Next Level series! In our last episode, Jo Cameron taught you how to whip up intelligent SEO reports for your clients to deliver impressive, actionable insights. Today, our friendly neighborhood Training Program Manager, Brian Childs, is here to show you an easy workflow for targeting multiple keywords with a single page. Read on and level up!


For those who have taken any of the Moz Training Bootcamps, you’ll know that we approach keyword research with the goal of identifying concepts rather than individual keywords. A common term for this in SEO is “niche keywords.” I think of a “niche” as a set of related words or concepts that are essentially variants of the same query.

Example:

Let’s pretend my broad subject is: Why are cats jerks?

Some niche topics within this subject are:

  • Why does my cat keep knocking things off the counter?
  • Why does my cat destroy my furniture?
  • Why did I agree to get this cat?

I can then find variants of these niche topics using Keyword Explorer or another tool, looking for the keywords with the best qualities (Difficulty, Search Volume, Opportunity, etc).

By organizing your keyword research in this way, it conceptually aligns with the search logic of Google’s Hummingbird algorithm update.

Once we have niche topics identified for our subject, we then dive into specific keyword variants to find opportunities where we can rank. This process is covered in-depth during the Keyword Research Bootcamp class.

Should I optimize my page for multiple keywords?

The answer for most sites is a resounding yes.

If you develop a strategy of optimizing your pages for only one keyword, this can lead to a couple of issues. For example, if a content writer feels restricted to one keyword for a page they might develop very thin content that doesn’t discuss the broader concept in much useful detail. In turn, the marketing manager may end up spreading valuable information across multiple pages, which reduces the potential authority of each page. Your site architecture may then become larger than necessary, making the search engine less likely to distinguish your unique value and deliver it into a SERP.

As recent studies have shown, a single high-ranking page can show up in dozens — if not hundreds — of SERPs. A good practice is to identify relevant search queries related to a given topic and then use those queries as your H2 headings.

So how do you find niche keyword topics? This is the process I use that relies on a relatively new SERP feature: the “People also ask” boxes.

How to find niche keywords

Step 1: Enter a relevant question into your search engine

Question-format search queries are great because they often generate featured snippets. Featured snippets are the little boxes that show up at the top of search results, usually displaying a one- to two-sentence answer or a list. Recently, when featured snippets are displayed, there is commonly another box nearby showing “People also ask.” This second box allows you to peer into the logic of the search algorithm: it shows you what the search engine “thinks” are closely related topics.

Step 2: Select the most relevant “People also ask” query

Take a look at those initial “People also ask” suggestions. They are often different variants of your query, representing slightly different search intent. Choose the one that most aligns with the search intent of your target user. What happens? A new set of three “People also ask” suggestions will populate at the bottom of the list that are associated with the first option you chose. This is why I refer to these as choose-your-own-adventure boxes. With each selection, you dive deeper into the topic as defined by the search engine.

Step 3: Find suggestions with low-value featured snippets

Every “People also ask” suggestion is a featured snippet. As you dig deeper into the topic by selecting one “People also ask” after another, keep an eye out for featured snippets that are not particularly helpful. This is the search engine attempting to generate a simple answer to a question and not quite hitting the mark. These present an opportunity. Keep track of the ones you think could be improved. In the following example, we see the Featured Snippet being generated by an article that doesn’t fully answer the question for an average user.

Step 4: Compile a list of “People also ask” questions

Once you’ve explored deep into the algorithm’s contextually related results using the “People also ask” box, make a list of all the questions you found highly related to your desired topic. I usually just pile these into an Excel sheet as I find them.

Step 5: Analyze your list of words using a keyword research tool

With a nice list of keywords that you know are generating featured snippets, plug the words into Keyword Explorer or your preferred keyword research tool. Now just apply your normal assessment criteria for a keyword (usually a combination of search volume and competitiveness).
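If your tool lets you export those metrics to CSV, a few lines of Python can rank the list by whatever blend of volume and competitiveness you prefer. The file name, column names, and the 0-100 difficulty scale below are assumptions, and the scoring formula is just one simple option.

```python
# Sketch: rank exported keywords by a simple blend of volume and difficulty.
# File name, column names, and the 0-100 difficulty scale are assumptions.
import csv

def prioritize(csv_path="keyword_export.csv"):
    rows = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):            # expects columns: keyword, volume, difficulty
            volume = float(row["volume"])
            difficulty = float(row["difficulty"])
            row["score"] = volume * (1 - difficulty / 100)
            rows.append(row)
    return sorted(rows, key=lambda r: r["score"], reverse=True)

for row in prioritize()[:10]:                    # top ten candidates
    print(round(row["score"]), row["keyword"])
```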

Step 6: Apply the keywords to your page title and heading tags

Once you’ve narrowed the list to a set of keywords you’d like to target on the page, have your content team go to work generating relevant, valuable answers to the questions. Place your target keywords as the heading tags (H2, H3) and a concise, valuable description immediately following those headings.
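As a final illustration of that structure, the sketch below turns a page title and a handful of target questions (borrowed from the cat example earlier) into a bare-bones HTML outline. Real pages obviously need full answers under each heading; this only shows where the keywords land.

```python
# Sketch: turn target questions into a bare-bones HTML outline with the
# keywords placed in the title and H2 heading tags.
page_title = "Why Are Cats Jerks? A Complete Guide"
questions = [
    "Why does my cat keep knocking things off the counter?",
    "Why does my cat destroy my furniture?",
]

outline = [f"<title>{page_title}</title>", f"<h1>{page_title}</h1>"]
for question in questions:
    outline.append(f"<h2>{question}</h2>")
    outline.append("<p>Concise, genuinely useful answer goes here.</p>")

print("\n".join(outline))
```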