Google Search Console Tutorial


In this post, based on our YouTube video, we give you top tricks and techniques for Google Search Console (GSC). Whether you’re a beginner or an advanced user, you’ll get practical value. Make sure you read to the end, because one of the techniques we’ll describe generated $5,000 a month.

 

Why Use Search Console?

Google Search Console, at least for beginners, is back-end access to Google itself, unlike third-party tools such as Ahrefs or SEMrush.

With those tools, we’re relying on external databases that try to interpret what’s going on, whereas with Google Search Console you can directly access what Google actually sees about your website.

The biggest, most fundamental difference between GSC and third-party tools is real-time keyword data.

 

Should You Trust Ahrefs More than GSC?

Ahrefs has to buy up data sources to estimate search volume for keywords. All those traffic metrics you see come from estimating what percentage of the traffic each position gets and cross-referencing that with the estimated search volume.

If you’re in position 3, that position is estimated to get a certain fraction of the traffic, and your keyword has 500 searches per month, then Ahrefs might estimate you’re getting around 200 visits per month from it.

That’s intelligent, but there are limitations; there’s no way it can be fully accurate.

Important for SEO - best practices

❗❗❗Ahrefs underestimates figures by about 20 to 30% most of the time, and sometimes by much more. On top of this, their keyword database takes a long time to update and refresh.

If you’re in an emerging industry, you may find that some of your keywords simply don’t exist in the database and that’s exactly what happened with that $5,000-a-month website two years ago.

I was brand-jacking a new product, and none of its keywords were showing up in Ahrefs: not the review, not the product versus a competitor, not the free trial or discount code searches. People have had a lot of success in the past with what they call zero-volume keywords. Just because Ahrefs shows zero volume doesn’t mean the searches don’t exist, and these keywords tend to be much less competitive than the terms that do appear in the tools.

 

EXAMPLE

I’ve had a certain client for quite a while and we’ve gotten great results. They went from barely any traffic to a very significant volume of traffic through our content recommendations and the high-quality backlinks we built for them. When we checked, GSC said they’d had 4,530 clicks in the past 28 days, whereas Ahrefs was only estimating 731 clicks.

How can you balance the two? You just have to accept that these third-party tools are not perfect by any means, but they do allow you to compare like with like. Here, it’s perfectly reasonable for me to say we’ve multiplied their traffic by five, because we can see the trend. It doesn’t matter which tool you use; what we’re looking for is the trend, and the same trend should show up in Search Console.

 

Why Use a Special Tool and Not Search Console?

  • Well, the main reason is competitors. You’ve only got access to your own Search Console that you control, so you don’t get an idea of competitors’ keywords.
  • Everything in the Search Console is an average.
  • It’s difficult to pin down individual rank tracking with Google Search Console because you always get an aggregate over a longer period, whereas Ahrefs clearly shows your position over time for specific keywords.

For longer-term tracking, it doesn’t matter which you use, as long as you stick to the same one. When clients ask me ‘Why is the data in your report so different from our Search Console?’ I say ‘It’s because we’re using Ahrefs. That’s our established method, and we want to compare consistently.’ The data can be quite shocking: Ahrefs says 750 visitors per month, whereas we know it’s 4,500.

IMPORTANT:

Just because a tool says a keyword gets 20-30 searches per month doesn’t mean the figure is accurate. It could be 60 to 100, so treat these tools as an estimated indication rather than an exact figure.

Use GSC for better ranking

 

Getting Started with Search Console

To get started, Google ‘Search Console’. As we’ve seen from the latest Google update, if you have a large portfolio of sites, it’s recommended that you break those up between multiple Search Console accounts. Some of my more suspicious websites have unique Gmail accounts, and I access them through an incognito window, but generally I’ve got a couple of URLs on each of my Google accounts. When you get into GSC, you’ll see a ‘Property Type’ option where you can choose ‘URL Prefix’ or ‘Domain’. The URL prefix is much easier to set up: just add the web address with https://www. (or similar) in front of it.

 

IMPORTANT

When you launch a website, there are multiple versions of it: one with and without www, and one with and without HTTPS. The benefit of the URL-prefix option is that you can take an HTML tag, slap it on your homepage in WordPress, and the property should get verified.

The problem with it is that you only get that one version. Ideally, Google shouldn’t detect multiple versions of your site, but sometimes it does, and when that happens you want to be capturing the data from each of them. If you’ve only put one version into Search Console, there’s a good chance you’re missing out on the data from the other versions.

 

DNS Records

The Domain property is tougher to verify, but it catches everything to do with that domain, whether it’s the www version, the non-www version, and so on. The way you verify it is through your DNS records.

To be honest, that’s often easier than the prefix method anyway. I’ll paste a tag onto my homepage HTML and Search Console just won’t pick it up, which is annoying on top of the data not being as complete as I’d like. With the domain method, I can log straight in to my server, or wherever my DNS records are kept, put in a TXT record, and it’s done. We’ve got everything, and it stays there. An HTML tag, by contrast, has to stay on the page to keep Search Console verified.

If you’ve just put it into your theme through the editor and your theme updates, you’ll lose that HTML tag and you’ll no longer have access to Search Console. You can re-verify and get the data back, but it’s still a pain. Just use the DNS version.
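If you want to double-check that the TXT record has actually propagated before you hit ‘Verify’, a quick script can save some back and forth. Below is a minimal sketch using the dnspython library; the domain is a placeholder, and the record format is the standard google-site-verification= value that Search Console gives you.

```python
# Minimal sketch: confirm the Google verification TXT record is live.
# Assumes the dnspython library (pip install dnspython); "example.com" is a placeholder.
import dns.resolver

def has_gsc_verification(domain: str) -> bool:
    """Return True if any TXT record starts with the google-site-verification= prefix."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return False
    for rdata in answers:
        for chunk in rdata.strings:  # each TXT record is a tuple of byte strings
            if chunk.decode().startswith("google-site-verification="):
                return True
    return False

if __name__ == "__main__":
    print(has_gsc_verification("example.com"))
```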

As an example, let’s use the website for my book, The Power Lever Method: SEO for Coaches. Ahrefs says it has 44 keywords and 2 traffic, but the data in Search Console says otherwise.

Google Search Console Tutorial and Power Lever Method

What’s in GSC is not a huge amount of traffic, but we see 45 clicks in the last 28 days, and we’ll probably find that’s because of people searching my name. My name isn’t much of a keyword yet: only around 20 searches for it. If we go to Ahrefs and put in my name, I’ve got 90 globally, but generally I’m not a big keyword in the Ahrefs database because my brand has only blown up in the last six months. Ahrefs probably isn’t up to date with people searching my name.

And according to the back-end data in Google Search Console, ‘Vickers’ has been searched 19 times in the last 28 days. So, provided your page is ranking and racking up impressions, the impression count is a good indicator of true search volume.

 

URL Inspection

Here’s another feature of Google Search Console that provides great information on your website’s indexing. If you paste in a URL, GSC will tell you whether (and where) it’s indexed, and if it’s not indexed, it will tell you why.

To illustrate, I used a landing page I was running ads to; obviously, it was indexed by Google. If, however, you run a test and it comes back as not indexed, you can request indexing. This is also useful when you’re making changes to a page.
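If you have a lot of URLs to check, the same inspection data is also available programmatically through the Search Console API’s URL Inspection endpoint. Here’s a rough sketch, assuming the google-api-python-client library and a service account that has been added as a user on the property; the service-account.json filename, site, and page URL are all placeholders.

```python
# Rough sketch: check a URL's index status via the Search Console API (URL Inspection).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE = "sc-domain:example.com"                   # placeholder: your verified property
PAGE = "https://www.example.com/landing-page/"   # placeholder: URL to inspect

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE, "siteUrl": SITE}
).execute()

status = result["inspectionResult"]["indexStatusResult"]
# coverageState mirrors the wording in the UI, e.g. "Submitted and indexed"
# or "Crawled - currently not indexed".
print(status.get("coverageState"), "|", status.get("googleCanonical"))
```

As far as I can tell, the API only reads the inspection result; the ‘Request Indexing’ button itself still lives in the Search Console interface.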

 

 

Does Clicking “Request Indexing” Work Anymore?

In the past, requesting indexing like this only took a couple of hours. There have been times when I published a blog post, requested indexing immediately, and within a few hours could see where that page was ranking.

This is great if you’re doing any testing, like trying out different meta titles or trying to claim Featured Snippets. It used to be that you could go through this process methodically, requesting indexing and checking back in a few hours, and you’d quickly see whether the change had paid off. Having done it for a while now, it doesn’t seem to work so well anymore.

Important for SEO - best practices

❗❗❗You can try clicking the button, but it’s not a strategy I focus on too much anymore. What you’ll often find is that your page gets discovered, not indexed. That’s been a big problem lately, with Google detecting pages but choosing not to index them.

If you want to know what your options are and how you can speed up indexing, check out the video: How to Index Your Website & Blog Posts in Minutes (30 minutes).

 

Why Are Canonicals Useful?

The canonicals at the bottom of the page are useful, particularly for those with big e-commerce sites.

The canonicals are useful because you can find issues with things like localization and keyword cannibalization. If you’ve got two categories that are similar, or the same products listed in different categories, you’ll often find Google declares its own canonical. What that means is Google doesn’t want to index two versions of the same thing, so it’ll just take whichever one it thinks is better.

You can also get a broader view of indexing by clicking ‘Indexing’ in the GSC menu, then ‘Pages’. This report always looks bad when you have a lot of pages and only a minority of them are indexed.

So my clients tend to email me, asking ‘What’s going on here?’

 

Don’t Index Your Entire Website

Google Search Console Tutorial - all about indexing

You don’t want the whole of your site to be indexable. Most of the time, it’s best to de-index your tag pages, category pages, and archive pages where there’s no real content. It’s an important point, because Google sees them as duplicate content.

There’s probably an exception with e-commerce sites, where you might have lots of text to try and build up the category pages you want to rank. But if you’ve got a blog and, two years ago, you added a tag called ‘cute puppies’, WordPress will have automatically created a whole tag archive page for ‘cute puppies’, and every single blog post tagged with cute puppies will appear in that feed.

Now, what a lot of beginners unfortunately do is slap on a bunch of tags, thinking anything is relevant, not realizing that every new tag creates another archive page. Actually, that wouldn’t be a terrible thing if you stopped at just a few tags and used them consistently.

 

Example:

For instance, if you have a tech blog with a ‘Laptop’ category, then you could use a tag to separate Macs from PCs whilst keeping them both in the same laptop category. So you could have 20 blog posts in each tag and that’d be fine.

The point is you’ll have loads of these ‘not indexed’ pages, and generally they tend to be those kinds of tag and category pages, data feeds, and things like that. That’s stuff you don’t want indexed anyway.

 

 

What Does “Not Indexed” Mean, and Should You Fix It?

Why Google ignores indexing requests

Excluded by ‘noindex’

Sometimes developers will accidentally leave a ‘noindex’ tag on your site, and then you’ll never get any traffic. But generally, the pages excluded this way should just be author pages, tag pages, and so on.
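If you suspect a stray ‘noindex’ has crept in, you can check a page directly rather than waiting for Search Console to report it. Here’s a small sketch, assuming the requests and beautifulsoup4 libraries; the URL is a placeholder.

```python
# Sketch: look for an accidental "noindex" on a page, in both the X-Robots-Tag
# response header and the robots meta tag. Assumes requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

def noindex_reasons(url: str) -> list[str]:
    """Return the reasons a page would be excluded by noindex (empty list = none found)."""
    reasons = []
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        reasons.append(f"X-Robots-Tag header: {header}")
    soup = BeautifulSoup(resp.text, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        content = (meta.get("content") or "").lower()
        if "noindex" in content:
            reasons.append(f'meta robots tag: "{content}"')
    return reasons

if __name__ == "__main__":
    print(noindex_reasons("https://www.example.com/"))  # placeholder URL
```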

What I was saying about www and non-www versions applies in much the same way to a trailing slash at the end of your URL.

Ideally, you only want one version, and a lot of the time your site will be configured to redirect one to the other. So we inspected a URL that currently says it’s not indexed because it’s a page with a redirect, and we saw that’s because the user-declared canonical is the version with the slash on the end.

Purists would probably say we should tidy that up. Why is there even a non-slash version showing up in Search Console? But it’s absolutely fine. It’s redirecting correctly.

 

Crawled – Currently Not Indexed

You’ll also find this in GSC, and it tends to be a bigger concern. The obvious explanation is that Google has been stingy about indexing lately, because everyone’s piling AI content onto the internet, so a lot of content just isn’t being indexed naturally.

 

Removals

Imagine someone reaches out saying you’ve published an incorrect story, or that you’ve mentioned a brand and they want it removed. You go to ‘Removals’ and submit a new request. You can put the URL in there, but you’ll find it takes a while for Google to pick up that change and remove the page from the index.

 

 

Page Experience

You’ll often get warnings in Gmail about your text being too small and all sorts of other problems. I don’t use this feature of GSC a lot, and I generally don’t find these issues have any major impact on rankings.

 

Manual Actions

Now, of course, we have the Manual Actions tab. If you find you’ve lost a lot of traffic very quickly, then you probably want to check here. I’ve never had a manual action; you have to work quite hard to get one. What you’ll see most of the time is a massive drop in your chart, which is a very clear sign you’ve been penalized, yet when you come in here it still says ‘No issues detected’. That’s because algorithmic penalties do not show up here, and you might never be told you’ve got one.

 

 

Legacy Tools

If you go into the legacy tools and click on ‘Links’, you’ll see your backlinks coming through into Search Console. Again, there’s not much use for this here, because we’re mainly in GSC for traffic numbers.

At our agency, SEO Jesus, we know there’s a HUGE difference between links. We put a lot of time and effort into screening each website. We build links so our clients get maximum power and maximum results without compromising on safety.

A lot of agencies will build links and not bother getting them indexed. And as I already mentioned, loads of pages can be ignored by Google.

When we’re building links, we use third-party tools like Omega’s Invite to make sure everything we build for clients is registered with Google and therefore shows up under Legacy Tools in GSC. But if you’ve not done that, this report is a good way to check whether your links are indexed.

 

 

Case Study: Making $5,000 Per Month with Hidden Keywords

Now, as promised at the beginning, here’s the keyword research technique I used in Search Console that allowed me to make $5,000 in a single month, two years ago.

To make this work, we’re taking advantage of the fact that Ahrefs has limitations in its database. We tend to think of keywords as the things people have typed into Google.

In reality, almost every search term people type is unique in some way. There are millions and millions of variations, and the tools have to consolidate those into individual trackable keywords. The truth is, some keywords are too small or too new for keyword research tools to pick up, which means your competitors don’t know about them.

So, what I tend to do here is just scroll through the queries on GSC and see what’s giving us impressions.

We’re looking for alarm bells: queries we’ve never targeted before; some business coaching keywords, maybe. We look at a 28-day period to find secret, hidden keywords that Ahrefs doesn’t know about. You won’t always find them, but it’s worth checking occasionally.

So then we open up a keyword like ‘most popular coaching niches’, and we see around 30 impressions in the last month, meaning roughly 30 people have searched for it.

Then we go to ‘Pages’, and we can see the exact URL, and therefore the keyword we originally targeted. In this case, we wrote ‘The Most Profitable Life Coach Niches on The Planet.’

Now, I would say that’s a different search intent compared to ‘Most Popular Coaching Niches.’ For this method to work, we’re looking for where keywords have been shoehorned towards the content we’ve written, as that’s a clear sign there is not enough supply to meet the demand for this particular search term.

We’re validating that there’s meaningful traffic. Using the impressions, we know we already have topical relevance for that keyword, because Google is choosing to rank our page for it even though the page itself isn’t all that relevant.

That means if we go after that specific keyword that we’ve not covered yet, there’s a very good chance we will rank very well for it.
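If scrolling through the Queries report gets tedious, you can pull the same data in bulk through the Search Console API and filter it yourself. The sketch below is one way to surface queries that are earning impressions but barely any clicks over a 28-day window; the property, dates, and thresholds are placeholders and assumptions to adjust, not fixed rules.

```python
# Sketch: pull 28 days of query/page data and flag impression-earning, low-click
# queries - candidates for "hidden" keywords the third-party tools miss.
# Assumes google-api-python-client and a service account with access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE = "sc-domain:example.com"  # placeholder property

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

report = service.searchanalytics().query(
    siteUrl=SITE,
    body={
        "startDate": "2024-05-01",   # placeholder 28-day window
        "endDate": "2024-05-28",
        "dimensions": ["query", "page"],
        "rowLimit": 5000,
    },
).execute()

for row in report.get("rows", []):
    query, page = row["keys"]
    # Plenty of impressions with few clicks often means Google is ranking a page
    # that was never written for that exact search.
    if row["impressions"] >= 20 and row["clicks"] <= 2:
        print(f'{row["impressions"]:>4} impressions | pos {row["position"]:.1f} | {query} -> {page}')
```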

 

Practical Example

So, when I made the $5,000, I was reviewing a software product. It was a new product that didn’t show up in Ahrefs, but that one review was showing up for loads of related keywords: the product versus competitors, free trial, discount codes, and so on. I was able to go through and tick off all these different questions, queries, and searches for that specific product that didn’t show up anywhere in Ahrefs.

I was making about $80 to $100 per sale on that product, so I wanted every click I could get. At that point, I was ranking number one for the review keyword, and that review blog post was ranking maybe three or four for all these other keywords. Then I’d write content specifically targeting each of those individual keywords, a unique blog post for each, and all of those posts could go to number one for their individual keywords.

Important for SEO - best practices

You’ve got to watch out for cannibalization, but generally, this technique works well.

To further prove the point, I ran a search for interesting keywords that are out of sync with what the blog post is about. I found one with 20 searches per month: “Do online coaches and consultants need SEO?” I’ve got some pages ranking for it, but none of them specifically addresses the question of whether online coaches and consultants need SEO.

 

Advanced Techniques in Google Search Console Tutorial

So, to take advantage of this, I could write a blog post specifically for that and then probably be number one for those 20 impressions per month.

AI makes this even easier, because we can scrape all of these mismatches: the random, erroneous keywords for which Google has no choice but to serve one of our less relevant blog posts.

We can scrape all of those, use a tool like AutoBlog.AI, and fulfil them all at once. Loads of them won’t stick, but some will work well, and that’s how you can achieve true topical authority: by going into Google and finding all these microscopic niche keywords that add up to a significant volume, rather than just using Ahrefs and ticking off the keywords everyone else can see.
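As one way to automate spotting those mismatches, here’s a small sketch that takes query/page rows (in the shape returned by the Search Analytics query above) and flags queries whose words barely appear in the URL slug of the page ranking for them. The overlap threshold is an arbitrary assumption for illustration, not a rule.

```python
# Sketch: flag queries that look "shoehorned" onto a page, i.e. the page's URL slug
# shares few words with the query. These are candidates for brand-new posts.
from urllib.parse import urlparse

def slug_words(url: str) -> set[str]:
    """Rough word set from the last path segment of a URL."""
    path = urlparse(url).path.strip("/")
    last = path.split("/")[-1] if path else ""
    return {w for w in last.replace("_", "-").split("-") if w}

def is_mismatch(query: str, page: str, max_overlap: float = 0.6) -> bool:
    """True when less than max_overlap of the query's words appear in the slug.
    The 0.6 threshold is an arbitrary assumption - tune it to your site."""
    q_words = set(query.lower().split())
    if not q_words:
        return False
    overlap = len(q_words & slug_words(page)) / len(q_words)
    return overlap < max_overlap

# Example rows in the shape returned by the Search Analytics query above.
rows = [
    {"keys": ["most popular coaching niches",
              "https://www.example.com/most-profitable-life-coach-niches/"]},
]
for row in rows:
    query, page = row["keys"]
    if is_mismatch(query, page):
        print(f'New post candidate: "{query}" (currently served by {page})')
```

From there, the flagged queries can be fed into whatever content workflow you use, whether that’s a writer’s brief or an AI drafting tool.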

If you want to keep learning and ultimately rank much higher, watch the SEO Jesus channel on YouTube.

Even better, SUBSCRIBE NOW, and get notifications each time we publish more golden nuggets for you to grow.

 
