Google Search Console is a must-use tool if you want to track how your website is performing. You can keep an eye on keywords, page impressions, click-through rates, rankings, mobile usability, top linking sites and so much more.
It’s a fantastic companion to Google Analytics and completely free to use.
It can be a bit daunting to start with, which is why I’ve put together this beginner’s guide to the various reports and features to give you a handy starting point.
My big tip is to spend some time playing with the different features and filters in the Performance report (more about that below) to understand more about how you can use the data to boost your SEO efforts.
What is Google Search Console?
Google Search Console began its life as Google Webmaster Tools.
Essentially, it’s a free platform from Google that lets you see how the search engine views your website so that you’re better able to optimise your organic presence.
Using Google Search Console, you can tap into important information such as:
- Queries/search terms used to find your site in search engine results pages (SERPs)
- Average ranking position for keywords/phrases
- Impressions and clicks for specific pages
- Errors that need correcting
- Top linked-to pages (internally and externally)
- Which websites are linking to your site
On 9th September 2019, Google fully switched over to the ‘new’ Google Search Console interface that it had been beta testing for two years and removed access to the old version.
This guide covers the new version.
How to add your website to Google Search Console
If you don’t have access to Google Search Console already, your first step is to set it up.
1. Go to Google Search Console and click on Start now.
2. Sign in using your Google account. If you have a personal and a business account, choose your business account. (If you don’t yet have a Google Account, hit the Create account option and follow the instructions.)
3. You’ll be taken to the Google Search Console dashboard. Click on Add a property.
4. On the Select property type pop-up that appears, you’ll be given the choice between adding a Domain or a URL prefix.

Google gives an explanation about what these two options mean that I’d recommend reading.
Essentially though, the Domain option covers all subdomains (e.g. www.example.com, blog.example.com, m.example.com) and protocols (i.e. http, https) associated with your website.
It’s a way of consolidating all of your data from your whole domain into one view.

The alternative – the URL prefix option – was the only method available in older versions of Google Search Console.
It only covers URLs under the entered address and specified protocol, e.g. https://www.example.com.

If you use several subdomains and protocols, you would be better off choosing Domain.
Please note: Currently, Google Analytics will only link to URL-Prefix properties.
My advice would be to set up a Domain property for your entire domain but also create a URL-Prefix property for your primary site URL and protocol, e.g. http://www.example.co.uk. This would be the primary version of the site that’s indexed with Google.
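If it helps to see the difference in concrete terms, here's a minimal sketch in Python (my own illustration, not anything Google provides) of which URLs each property type would cover. The example.com URLs and helper names are just placeholders:

```python
# A rough illustration (not a Google tool) of which URLs a URL-prefix
# property covers compared with a Domain property.
from urllib.parse import urlparse

def covered_by_url_prefix(url, prefix):
    # A URL-prefix property only covers URLs under the exact protocol,
    # subdomain and path you entered.
    return url.startswith(prefix)

def covered_by_domain(url, domain):
    # A Domain property covers every protocol and every subdomain.
    host = urlparse(url).hostname or ""
    return host == domain or host.endswith("." + domain)

for url in [
    "https://www.example.com/page",   # matches both
    "http://www.example.com/page",    # different protocol
    "https://blog.example.com/post",  # different subdomain
]:
    print(url,
          "| URL prefix:", covered_by_url_prefix(url, "https://www.example.com/"),
          "| Domain:", covered_by_domain(url, "example.com"))
```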
5. Click Continue.
6. As Google Search Console will give you access to confidential data about your website’s performance, you will need to pick a way to verify that you own the website you’re trying to add.
If you’ve chosen the Domain option, you will see the following pop-up:

Follow the instructions given to verify your ownership of the domain.
If you’ve chosen the URL Prefix option, you will see a different verification screen:

Pick your chosen verification method and follow the instructions given.
Google Search Console will start to collect and display data about your site as soon as it is verified.
Note: Google will let you re-add a web property that you’ve removed in the past without you having to go through a second verification process if the property still has one verified owner.
To re-add a website/domain, follow steps 1-5 above and you should be automatically re-verified.
Link Google Search Console with Google Analytics
As I mentioned above, Google Analytics will only link to URL-Prefix properties.
To link Google Search Console with Google Analytics you will need to complete the following steps:
- Log into Google Analytics
- Click on Admin, which you’ll find next to the gear symbol at the bottom of the main left-hand menu.
- Click on Property Settings in the menu under the name of the website to which you want to link Google Search Console (middle column).
- This menu will move to the left of the screen and a larger Property Settings panel will open – scroll down until you see the Adjust Search Console option. Click it.
- Click on Add.
- Scroll down until you find your website address. Tick the box and hit Save.
Your Google Search Console and Google Analytics should now be linked.
Creating a sitemap
Once your website is verified in Google Search Console, you should create and submit a sitemap, as this tells Google which pages on your website you would like it to crawl.
There are lots of different ways to generate a sitemap.
If you have a WordPress site, you can create an XML sitemap using the fantastic Yoast SEO plugin for WordPress or a dedicated XML sitemap plugin, such as Google XML Sitemaps.
If you don’t have a WordPress site, Google offers advice about the different sitemap formats and general guidelines for building and submitting a sitemap.
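If you'd rather roll your own, here's a minimal Python sketch of what a bare-bones XML sitemap looks like. The URLs are placeholders, and a real sitemap would usually include extras such as lastmod dates:

```python
# A bare-bones XML sitemap generator; the URLs are placeholders and a real
# sitemap would usually include extras such as <lastmod> dates.
from xml.sax.saxutils import escape

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/seo-basics/",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append(f"  <url><loc>{escape(url)}</loc></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```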
Once you have generated a site map:
- Click on Sitemaps under the Index heading in the main Google Search Console menu to the left of the screen.
- Under Add a new sitemap, enter the URL for your sitemap.
- Click Submit.
Your Google Search Console Overview screen

On your Overview screen, you’ll notice that the main menu runs down the left-hand side.
There is also an at-a-glance view of:
- How many web search clicks your website has had over the past three months (find out more with the Performance report – see below)
- How many valid pages Google has currently indexed from your site and how many pages have errors (find out more with the Coverage report – see below)
- Mobile usability problems (find out more with the Mobile Usability report – see below)
- Which rich results Google found on your property, and whether or not Google could read them (find out more with the Sitelinks searchbox report – see below)
Let’s take a look at what the different options in the left-hand menu mean.
Performance
The Performance section of Google Search Console is where you can find incredibly valuable information about your website’s overall search performance in Google.

As you can see from the screenshot above, Performance gives you clear information about your website:
- Total number of click-throughs from SERPs
- Total number of impressions in SERPs (i.e. the number of times your pages showed up in search results)
- Average click-through rate (CTR)
- Average position in searches, overall and for each search term
You can view all four metrics together, or select any combination of the four tabs, depending on what data you want to view.
Dimensions and filters
In addition, Google Search Console lets you group your data by five different Dimensions:
- Queries (see all the searches for which your web pages have ranked over the given time period)
- Pages (see all the pages that have ranked in searches)
- Countries (see where your audience is based by country)
- Devices (see how your traffic is split across desktop, mobile devices and tablets, and whether the device on which SERPs are viewed influences your click-through rates)
- Search Appearance (this will show up pages that contain rich snippets or Accelerated Mobile Pages (AMP) so you may not see many results listed here)
It’s also possible to filter specific information in or out, helping you drill down to exactly the data you need (I’ll cover this in more detail below!)
The Performance report replaces the Search Analytics report in the older version of Google Search Console.
One of the most significant differences between the old and new reports is that you can now see a fantastic 16 months’ worth of data instead of just 90 days. This helps you to build up a longer-term view of how your keywords are performing.
Ways you can use the Performance report to boost your organic SEO
- Identify your highest click-through rate queries
Click-through rate (CTR) is defined as the percentage of impressions that turn into click-throughs to your website.
CTR is an important metric to keep an eye on because it’s an indication of relevance between search terms and your content. Google is likely to view a high CTR as a positive ranking signal, especially if your CTR is higher than for other pages in the same SERP.
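As a quick worked example of the maths (using the click and impression figures from the low-CTR example later in this guide):

```python
# CTR is simply clicks divided by impressions, expressed as a percentage
def ctr(clicks, impressions):
    return 0.0 if impressions == 0 else clicks / impressions * 100

print(round(ctr(29, 1169), 2))  # about 2.48%, i.e. roughly 2.5 clicks per 100 impressions
```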
So, what are your highest CTR queries? Here’s how you can use Google Search Console to find out:
- Click on Performance in the main menu.
- Click on the Queries tab below the graph (although the Performance page usually defaults to this view).
- Click on the Date display to the top-left of the graph and change the date range to Last 12 months (or whatever range you’d prefer) and then hit Apply.
- Make sure the Average CTR tab is selected.
- On the line above the list of search queries, click on the arrow next to CTR to sort the results from highest to lowest. This will show you which of your search terms have the highest CTRs.
Of course, this information doesn’t give us the complete picture. A search term could have a 100% CTR but should you spend your energies on optimising that search term if it has only appeared once in a SERP?
To get a more complete understanding of your best performing search terms, click the Total Impressions tab too.
Which search terms stand out for having a high number of total impressions and a high CTR?
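If you prefer to crunch this outside the interface, you can export the Queries data as a CSV and sort it yourself. Here's a rough Python sketch; the column names ('Top queries', 'Impressions', 'CTR') and the Queries.csv filename are assumptions based on a typical export, so adjust them to match your file:

```python
# Rank an exported Queries CSV by CTR, ignoring terms with very few
# impressions. Column names are assumptions; check them against your export.
import csv

MIN_IMPRESSIONS = 50  # arbitrary threshold to weed out one-off impressions

rows = []
with open("Queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"])
        ctr = float(row["CTR"].rstrip("%"))
        if impressions >= MIN_IMPRESSIONS:
            rows.append((row["Top queries"], impressions, ctr))

# Highest CTR first
for query, impressions, ctr in sorted(rows, key=lambda r: r[2], reverse=True)[:20]:
    print(f"{query}: {impressions} impressions, {ctr}% CTR")
```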
Note: If you’re wondering what makes a good CTR, a good starting point is the Advanced Web Ranking CTR Study. This regularly updated chart shows the organic click-through rates for searches coming from over 2.8 million keywords for 53,000+ websites.
This chart shows that CTRs are directly influenced by ranking position.
A further helpful step here can be to highlight the Average Position tab too. Make a note of search terms that have high impressions, high CTR but rank in position 5 or below in SERPs (i.e. halfway down page one and lower).
You could try building targeted content around these search terms to attract more traffic for these queries and boost them up to the top half of page one.
- Find pages with a low click-through rate
It can be a red flag if you are getting loads of impressions for a search term but the click-through rate is low. This can be a sign that you need to revise your meta title and description, for example, to grab attention and show searchers/Google that your content is relevant to particular search terms.
With this in mind, Google Search Console can help you to pinpoint where you need to improve click-through rates.
Try the following:
- Click on the Average CTR and Average Position tabs.
- We want to find the pages that rank at number five on Google or lower (the bottom half of page one) and have a poor CTR. This means we need to filter out higher ranking pages that have an average position of 1-4 in SERPs.
- To do this, click on the Filter button to the top right of the list of search terms (and below the graph) and tick the Position checkbox.
- This will bring up a new line above the list of search terms where you’ll see the word Equals next to a small triangle indicating a dropdown menu. Open the dropdown menu, click on Greater than and type 4.9 in the Filter by Position space. This will filter the results so you only see pages with an average ranking position of 5 or lower.
- At the time of writing, the latest Advanced Web Ranking Click-through Rate Study figures show that websites ranking in position 5 on Google should have a CTR of 7.5% for mobile searches and 4.17% for desktop searches. We can use 4.17% as our baseline. This means we need to filter the list of search terms further to find those with a CTR lower than 4.17%. Click on the Filter icon again and tick the CTR checkbox.
- Now go back to the dropdown menu where it says Equals, choose Smaller than and type 4.17 in the filter box. You now have a list of low ranking search terms with low CTRs.
- Next, click on the Total Impressions and Total Clicks tabs, as these will show you how many times your low ranking/low CTR pages have been seen in SERPs and how many searchers have clicked through to your site. In the example image above, the highlighted line stands out: the search term has been seen 1,169 times and yet there have only been 29 click-throughs.
- We need to take this line of investigation further by understanding which pages are showing up in SERPs for the search term with the low CTR. To do this, click on the individual search query that you want to explore. This will isolate all the data to that particular search term.
- Click on the Pages tab. This will give you the page or list of pages on your website that rank for the search term. From this information, you can add to the picture you’re building about specific keywords and pages. For example, do you have lots of pages all ranking – and competing – for the same keyword? But the picture still isn’t complete.
- It could be that a page on your website has a low CTR for the search term you initially identified but ranks better and has a much higher CTR for a different search term. There’s no point optimising to improve the CTR for one keyword if it might negatively affect keywords that are already performing well for the page in question. To check this, go to the top of the screen, click on New then Page and enter the URL for the page you want to investigate. You now have a list of all the keywords the page ranks for. Which of the keywords have the best CTR? Which have the highest number of impressions? Depending on what you see, you might decide to tweak the page in question and its meta data to optimise it for the greatest amount of search traffic. One of your priorities should be to make the meta title and meta description as enticing as possible. Remember, these are what people see in search results. They need to act as a compelling call to action, encouraging searchers to click through to your website.
- Monitor the results! Having tweaked your meta data and optimised your low CTR pages, it may take a few days for Google to recrawl the updated content. Check Google Search Console regularly over the next couple of weeks to see whether the CTR improves.
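If you'd like to repeat this filtering on an exported Queries CSV rather than in the interface, here's a rough sketch. Again, the column names and percentage formats are assumptions about a typical export, so tweak them to suit:

```python
# Reproduce the filtering above on an exported Queries CSV: average position
# of 5 or worse and a CTR below the 4.17% desktop baseline. Column names and
# value formats are assumptions; adjust them to your export.
import csv

results = []
with open("Queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        position = float(row["Position"])
        ctr = float(row["CTR"].rstrip("%"))
        impressions = int(row["Impressions"])
        if position > 4.9 and ctr < 4.17:
            results.append((row["Top queries"], impressions, ctr, position))

# Highest-impression offenders first: these are the metas most worth rewriting
for query, impressions, ctr, position in sorted(results, key=lambda r: r[1], reverse=True):
    print(f"{query}: position {position:.1f}, {impressions} impressions, {ctr}% CTR")
```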
- Identify your highest-traffic queries
Google Search Console can also help you identify which keywords bring the most search traffic to your website.
To find this data:
- Go to the Performance report.
- Click on the Queries tab if it’s not your default view.
- Click on the Date range button to choose the time period you want to view (e.g. last 28 days)
- Make sure the Total Clicks tab is the only one selected in the graph at the top of the screen. This relates to the actual number of searchers who clicked through to your website from a Google search, i.e. the total number of visitors arriving on your website from an organic search.
- Click on the small downward arrow next to Clicks to sort from the highest to the lowest.
- Click on each high performing search term and then the Pages tab to see which pages are ranking for the term.
You can use this information in a number of ways:
- Optimise the ranking pages for conversion (e.g. more bookings or sales)
- Update the ranking pages so that they maintain their position
- Support the pages with paid promotions such as Google or Facebook ads
- Link from them to low-ranking but relevant pages
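For larger sites, you may prefer to pull this data programmatically. The sketch below uses the Search Console API via the google-api-python-client library, as I understand it to work; it assumes you've set up a service account, added its email address as a user on your property, and saved its key as service-account.json (all placeholders):

```python
# Pull the top queries by clicks via the Search Console API (as I understand
# it); assumes google-api-python-client and google-auth are installed and the
# service account has been added as a user on the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2019-08-01",  # placeholder dates covering your chosen range
    "endDate": "2019-08-28",
    "dimensions": ["query"],
    "rowLimit": 25,
}
# Use "sc-domain:example.com" instead of the URL for a Domain property
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body
).execute()

# Rows come back ordered by clicks, highest first
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"],
          f"{row['ctr'] * 100:.2f}%", round(row["position"], 1))
```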
- Track ranking increases and decreases
Once you have implemented SEO changes to increase CTRs, for example, you will want to keep an eye on whether your rankings are going up or down for your target keywords.
Google Search Console helps us with this:
- Go to the Performance report.
- Click on the Queries tab if it isn’t your default view.
- Click on the Date option to the top-left of the Performance graph and then choose the Compare tab.
- Select two equivalent date periods to compare (e.g. Compare last 3 months to previous period) and then hit Apply.
- This will show whether your total clicks, total impressions, average CTR and average position have gone up or down between this period and the previous one. At this stage, you can view the data in Google Search Console or export it as a CSV file or to Google Sheets. To export the data, click on the Export/Download icon and choose how you want to save it.
- For a view of ranking changes in Google Search Console, make sure that you only have the Average Position tab selected. This will bring up three columns of data – Last XX days/months position, Previous XX days/months position and Difference. Slightly confusingly, the lines with a negative difference (e.g. -0.9) represent an increase in rankings between the last and previous periods, whereas the lines with a positive difference represent a decrease in rankings.
You can see some examples of this in the screenshot below:

If there are some significant increases or decreases between this period and the last, you may need to investigate the cause.
Sometimes, you’ll notice a big variation because the search wasn’t made during the previous period. This will be indicated with a 0 in the relevant column. It could be that this is the first period during which your site has ranked for a search term, in which case you won’t need to worry about the increase or decrease.
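If you export two separate period reports to CSV, a short script can flag the movers for you, skipping any terms that didn't rank in the previous period. The filenames and column names below are assumptions, so adjust them to your exports:

```python
# Compare two exported Queries CSVs (current vs previous period) and flag
# ranking movement; filenames and column names are assumptions. A lower
# position number means the ranking improved.
import csv

def load_positions(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Top queries"]: float(row["Position"]) for row in csv.DictReader(f)}

current = load_positions("queries_last_3_months.csv")
previous = load_positions("queries_previous_3_months.csv")

for query, pos_now in sorted(current.items()):
    pos_before = previous.get(query)
    if pos_before is None:
        continue  # the term didn't rank in the previous period, so nothing to compare
    if pos_now < pos_before:
        print(f"IMPROVED {query}: {pos_before:.1f} -> {pos_now:.1f}")
    elif pos_now > pos_before:
        print(f"DROPPED  {query}: {pos_before:.1f} -> {pos_now:.1f}")
```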
My advice here is to begin building up a picture month-on-month of your highest ranking keywords. If you suddenly notice a nosedive for a usually productive search term then you will want to explore this further.
- Are your competitors targeting the same term?
- When did you last update/refresh your content?
- Could you have been hit with a Google penalty?
These are all questions you might want to consider.
- Find ‘Opportunity’ keywords
You can use the Performance data to find keywords that rank between positions 8-20 in SERPs and get a good number of impressions. These are sometimes referred to as ‘Opportunity keywords’, i.e. words and phrases that reveal an opportunity to rank highly in SERPs.
With ‘Opportunity’ keywords, you already have a page ranking for the search term, just not as highly as it could. With a little care and attention, you could boost the relevance of that page and bump up its rankings.
But how do you find ‘Opportunity’ keywords?
- Set the date range to the last 28 days.
- Filter the report to show keywords with a position Greater than 7.9 (this will show everything ranking at position 8 or lower).
- Sort by Impressions (largest to smallest) and look for key phrases with a good number of impressions and a ranking average somewhere between 8 and 20.
- See which page ranks for this keyword by clicking on the search query and choosing the Pages tab
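As with the earlier examples, you can also do this filtering on an exported Queries CSV. A rough sketch, with assumed column names and an arbitrary impressions threshold:

```python
# Surface 'Opportunity' keywords from an exported Queries CSV: a decent
# number of impressions and an average position between 8 and 20.
# Column names and the impressions threshold are assumptions.
import csv

MIN_IMPRESSIONS = 100  # arbitrary; raise or lower to suit your traffic levels

with open("Queries.csv", newline="", encoding="utf-8") as f:
    candidates = [
        (row["Top queries"], int(row["Impressions"]), float(row["Position"]))
        for row in csv.DictReader(f)
        if 8 <= float(row["Position"]) <= 20 and int(row["Impressions"]) >= MIN_IMPRESSIONS
    ]

for query, impressions, position in sorted(candidates, key=lambda r: r[1], reverse=True):
    print(f"{query}: position {position:.1f}, {impressions} impressions")
```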
Once you’ve identified the pages you could improve to see a quick increase in rankings, you’ll want to turn your attention to the page in question.
- Look at ways to add more detail by covering as much as you can about the topic.
- Add in a video to keep people viewing the page for longer.
- Provide step-by-step instructions.
- Link to connected content.
- Share on social media and build backlinks to the updated page.
As we can see, the potential within the Performance report is MASSIVE!
URL Inspection
Next in the main menu to the left-hand side of the screen is the URL Inspection tool.
You can look at individual URLs on your website to:
- Check the current index status of the URL
- Determine whether a specific URL can be indexed
- Request that Google crawls – or recrawls – a page
- View how Googlebots see a page
- View a loaded resources list, JavaScript output and page code
Note: This isn’t a live test. Instead, this tool gives us a view of the most recently indexed version of the URL. To test the live version of the page, you will need to click on the Test Live URL button to the top right of the screen.
To check a URL, simply:
- Click on URL Inspection in the main menu.
- Enter the URL you want to inspect in the search box at the top of the screen.
- Review the results.

You’ll notice that the Coverage, Mobile Usability and Sitelinks searchbox sections each have a little arrow to the right of them. Click on these to expand the section for more information.
Within these boxes, you can see when a URL was last crawled by Google and whether it was crawled as a Googlebot smartphone or desktop, for example.
You can check out the user-selected canonical URL and whether this matches the Google-selected canonical (a great way to hunt out duplicate content!)
The URL Inspection tool will also flag up errors such as:
- Problems with the page’s coding, structure, rich snippets, etc.
- Page can’t be indexed
- There is a duplicate version of a canonical page
- The content is password protected or has a noindex tag
When problems are indicated, Google Search Console will give you appropriate information about how to fix the errors.
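As a quick supplementary check outside Search Console, you can look for the most common indexing blockers yourself, i.e. a noindex robots meta tag or an X-Robots-Tag header. This is my own rough illustration and the URL is a placeholder:

```python
# A rough local check for the most common indexing blockers: a noindex robots
# meta tag or an X-Robots-Tag header. The URL is a placeholder.
import urllib.request

url = "https://www.example.com/some-page/"

with urllib.request.urlopen(url) as response:
    x_robots = response.headers.get("X-Robots-Tag", "")
    html = response.read().decode("utf-8", errors="ignore").lower()

print("X-Robots-Tag header:", x_robots or "(none)")
print("noindex in X-Robots-Tag:", "noindex" in x_robots.lower())
# Very rough meta check; a proper HTML parser would be more reliable
print("robots meta tag with noindex:", 'name="robots"' in html and "noindex" in html)
```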
Index>Coverage
Click on the Index>Coverage option in the main left-hand menu.
This report shows the indexing state of all URLs that Google has visited – or tried to visit – in your web property.
The summary page groups the results by status (error, warning, or valid) and the specific reason for that status, such as Submitted URL has crawl issue or Submitted URL marked ‘noindex’.

Click on the Error row you want to investigate for a list of all of the URLs affected by the same error.
You’ll notice on the screenshot above that you can sort the Coverage summary screen by four different tabs, which show the following:
- Error: URLs that haven’t been indexed because of an error. (If you’re not sure how to fix a particular error, click on the Help question mark icon to the top right-hand side of the screen and Google Search Console will bring up a list of errors and their probable solutions.) You may want to give these pages your attention straight away.
- Valid with warnings: These are pages that Google may or may not have indexed, depending on specific factors. For example, a page may have been indexed while being blocked by robots.txt. Google may not be sure whether it was your intention to block the page, so it flags it up with a warning. Any URLs in this category will need your attention.
- Valid: These are URLs that have been submitted to Google and indexed with no problems.
- Excluded: URLs that Google feels it shouldn’t index. This could cover pages with a ‘noindex’ tag, Not found (404) URLs, pages with redirects, duplicates of a page with a proper canonical tag, and many others. Again, you can click on the row that gives the reason for exclusion to see which URLs fall under that heading.
Note: Google Search Console highlights that you probably won’t need to use the Coverage report if your website has fewer than 500 pages. It says it’s far simpler, in the case of smaller sites, to search for your site on Google by entering site:example.co.uk, where example.co.uk is your site’s homepage URL without the http:// or https:// prefix.
For example, I would enter site:seo-plus.co.uk
The search results show pages that Google knows about on your site. You can add search terms to find specific pages on your site – for example, I could enter site:seo-plus.co.uk local SEO to bring up all of the indexed pages about local SEO.
Index>Sitemaps
This report shows sitemaps that you have submitted to Google for indexing related to the current web property.
You can submit image, video and news URLs in your sitemap but the Sitemaps report doesn’t currently show any data for those types of URLs.
I covered how to create and submit a sitemap earlier in the guide.
The report will tell you whether or not your sitemap was read and processed successfully, or whether there were any problems that require your attention.
Click on the Help question mark icon to the top right-hand side of the screen while you’re in the Sitemaps report for a helpful guide to troubleshooting common sitemap problems.
Enhancements>Mobile Usability
The Mobile Usability report shows which pages on your website have problems when viewed on mobile devices.

As we can see from the above screenshot, Google will either mark a crawled page as having an Error or being Valid, based on an internal mobile usability score. If a URL has multiple errors, you should notice that it’s listed under each error type that affects it.
To see which pages you need to look at to fix an error, click on the Error row in the Details list (see the screenshot above) and you’ll get a list of every URL with the same error.
Pages that are listed as Valid have met the minimum mobile usability score set by Google. They may still have some mobile usability issues but not enough to flag up in this report. To double-check for more minor issues, you can use Google’s Mobile-Friendly Test Tool.
If you have a large number of pages with mobile usability errors, Google recommends that you fix them in the default order given in the report (i.e. most common causes first).
If yours is a very big website, it’s worth noting that Google will only list the first 1,000 pages found to have the same error in the Details list; other pages may also have errors.
Once you have fixed a mobile usability error on a page, you should return to the error listing and click on the Validate fix button. Google will then recrawl the page and change its status to Valid if it meets the minimum mobile usability score.

Enhancements>Sitelinks searchbox
If you have rich results or structured data markup (see my Schema Made Easy Guide) on your website, this report will help you to troubleshoot any errors in how Google sees and processes this information.

As with the Coverage and Mobile Usability reports, errors are shown in red, valid with warnings in amber and valid pages in green.
Some people find that they’re unable to see this report. There can be a number of reasons for this.
At the time of writing, Google only provides reports for 11 types of rich data: datasets, events, FAQs, fact check, how-to, job posting, logo, product, Q&A page, recipe, and sitelinks searchbox. It could be that your rich data is not currently supported.
Google can only look at rich data on crawled pages and doesn’t support this report in every location yet.
If you think Google Search Console is missing your rich data, you can use the Rich Results Test Tool (currently in beta testing) to see the rich result code.
Again, Google Search Console’s Help feature has some great pointers for troubleshooting if you do have errors highlighted within this report.
Security & Manual Actions>Manual Actions
Google issues a manual action against a site when a human reviewer at Google has found something on the site that they consider to breach Google’s webmaster quality guidelines. This is often where ‘black hat’ SEO techniques attempting to ‘game’ Google’s rankings will show up.
If your website has been hit by a penalty and you’ve noticed a sudden drop in rankings (or a page has completely disappeared from searches), this is the report to check first.
If you do have manual actions against your site, they will be listed in this report with details of the affected URLs. Google will expect you to fix the highlighted issues on all affected pages, not just a selection, before you will see an improvement in your rankings.
When all of the issues have been fixed on all pages, you can select to Request Review. Google says that in your Reconsideration Request, you should:
- Explain the exact quality issue on your site
- Describe the steps you’ve taken to fix the issue
- Document the outcome of your efforts
You should receive an email to confirm that your Reconsideration Request has been received but it may take one or two weeks before the review is complete.
Common manual actions include:
- Thin content
- Unnatural links to/from your site
- User-generated spam
- Cloaked text or images
- Keyword stuffing
- Sneaky redirects
Security & Manual Actions>Security Issues
The security issues report will flag up if your web property has been hacked in some way. This can include types of malware, content or code injections, or social engineering (phishing) scams that attempt to trick web users into doing something dangerous, such as revealing confidential information or downloading harmful software.
As with the other reports, Google Search Console will walk you through the steps you need to take if any issues are flagged up. This could include checking third-party resources on your site, removing content or following the hacking recovery process.
Legacy tools & reports
Google Search Console’s legacy tools and reports are those that don’t yet have a full replacement in the new search console. Google says its Search Console team is “working on a replacement strategy for these items”.
For the time being, the following legacy tools and reports are available:
- International Targeting
If your website can be viewed in multiple languages, the International Targeting report will let you monitor hreflang errors and choose a country that should be prioritised in your search results.
I found a good guide to this tool at Direct Online Marketing.
- Removals
The URL Removal tool is intended as a first step for content that you urgently need blocked from Google searches — for example, if it contains confidential data that was accidentally exposed.
Using the tool for other purposes might cause problems for your site.
Google says that if your site has been hacked, you should only use the URL Removal tool to block URLs the hacker may have added to your site. You shouldn’t use the tool to take your site offline while you clean up the hacking.
You can find more information about the URL Removal tool in the Search Console help.
- Crawl Stats
The Crawl Stats report provides information on Googlebot’s activity on your site for the last 90 days, taking into account all of the content types that Google downloads (e.g. CSS, JavaScript, Flash, images, and PDF files).
Generally speaking, the graphs should look fairly level over the 90 days with minor peaks and troughs. If you add more content to your site, you would expect the graph levels to increase.
If you notice a sudden, significant drop in crawl rates across the three charts, you may need to investigate further.
Have you added any fresh content recently? Google crawls unchanging sites less frequently than regularly updated sites.
Have you added a new robots.txt rule that is blocking your content from being crawled?
Do you have image-only pages? Google will not crawl these.
If you’ve recently added a new sitemap for Google to crawl, you may notice a swift spike in crawl rate.
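If you suspect a robots.txt change is behind a drop in crawl activity, a quick check with Python's standard library will tell you whether Googlebot is still allowed to fetch your key URLs (the example.com URLs are placeholders):

```python
# Check whether Googlebot is still allowed to fetch key URLs under your
# current robots.txt; the example.com URLs are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    print(url, "allowed for Googlebot:", parser.can_fetch("Googlebot", url))
```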
- Messages
Here, Google Search Console will flag up any reports you might want to look at or send messages about features and tools available to you.
- URL Parameters
URL parameters are most commonly used on e-commerce sites where there are multiple pages for very similar products, e.g.
- https://example.com/products/women/dresses/green.html
- https://example.com/products/women?category=dresses&color=green
- https://example.com/products/women/dresses/green.html?limit=20&sessionid=123
This tool reduces the crawling of duplicate URLs but it should only be used if you know how URL Parameters work. Incorrectly excluding URLs can cause them to disappear from searches.
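To see why parameterised URLs create duplication, here's a small illustration that strips out the parameters that don't change the content (which parameters count as 'meaningful' here is just an assumption for the example):

```python
# Parameterised URLs often point at much the same content; stripping the
# parameters that don't change the page makes the duplication obvious.
# Which parameters are ignored here is just an assumption for the example.
from urllib.parse import urlparse, parse_qs

urls = [
    "https://example.com/products/women?category=dresses&color=green",
    "https://example.com/products/women?color=green&category=dresses&sessionid=123",
]

IGNORED = {"sessionid", "limit"}

for url in urls:
    parsed = urlparse(url)
    params = {k: v for k, v in parse_qs(parsed.query).items() if k not in IGNORED}
    print(parsed.path, sorted(params.items()))  # both URLs reduce to the same page
```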
- Web Tools
This link will take you through to a number of Google’s Web Tools such as:
- Structured Data Testing Tool
- Structured Data Markup Helper
- PageSpeed Insights
- Google My Business
- And many more!
Links
This report gives you an at-a-glance view of:
- Your most linked-to web pages from external sources
- Your most linked-to web pages from internal links
- The top linking sites to your content
- The text people use to link to your site
If you spot that one page is attracting a lot of backlinks, you can click on the URL to see all of the websites linking back to it.
This is a great way of building up a picture of who’s connecting with your content and sharing it with their own communities.
The information you gather from this report can help you identify guest blogging opportunities, influencers in your sector, content that you might want to link to, similar audiences and much more.
The report can also help you spot spammy links back to your site, which can sometimes be the cause of a manual action (see above) being logged. If you do find spammy backlinks, you could ask the publisher to remove them or disavow the links.
With the internal links information, you can build up a better view of how your content is connected across your site.
Are there pages that you link to regularly? This could be your cornerstone content, i.e. the pillars on which your website is built. If you use the Yoast SEO plugin, it has a feature that allows you to mark cornerstone content for Google. This is a way of telling Google which content is evergreen, in-depth and important to your audience.
Settings
Finally, the Settings feature lets you check that your web property has been verified. You can also check approved users/owners and learn more about the indexing crawler for your domain.
Now, over to you.
Are you a longstanding user of Google Search Console or has this guide inspired you to give it a go?
Which Performance report do you plan to look at first? Or are you more interested in who’s linking to your site? Do you have errors that need fixing? Have you identified good keywords to target?
Leave a Comment to let me know.
Hazel Jarrett, director of SEO at SEO+, is well-known in the SEO space, has won many awards during her 20-year career and has been published on various well-known sites. Through her services and training programs, her SEO strategies have generated 10s of millions of sales for her clients, earning her a big reputation for delivering the results that matter.
Want to follow Hazel on social media? You’ll find her via the icons below.