How to Submit a Sitemap in Google Search Console

Google is the most commonly used search engine, and if you have a large site — or are working on making it large — you definitely want your sitemaps submitted to it.

A sitemap is a file that lists the webpages of your site and tells Google and other search engines how your site's content is organized. Search engine web crawlers like Googlebot read this file so that your site is crawled intelligently.

To understand how sitemaps provide information, think of a sitemap as an ordered list of content. When a new post is created, it is added to the top of the list. If an existing post is updated, it is removed from its original position and added back to the top of the list.
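
The move-to-front ordering described above can be sketched in a few lines of Python. This is an illustrative in-memory model, not RebelMouse's actual implementation; the `touch` helper and the `(url, lastmod)` tuple shape are assumptions for the sketch.

```python
from datetime import datetime, timezone

def touch(sitemap_entries, url):
    """Move a new or freshly updated URL to the top of the sitemap list.

    `sitemap_entries` is a hypothetical in-memory list of (url, lastmod)
    tuples kept newest-first, mirroring the ordering described above.
    """
    # Drop the URL from its old position if it is already listed.
    sitemap_entries[:] = [e for e in sitemap_entries if e[0] != url]
    # Re-insert it at the top with a fresh lastmod timestamp.
    sitemap_entries.insert(0, (url, datetime.now(timezone.utc).isoformat()))

entries = [("https://example.com/a", "2022-01-01T00:00:00+00:00"),
           ("https://example.com/b", "2022-01-02T00:00:00+00:00")]
touch(entries, "https://example.com/a")  # updating an existing post moves it to the top
```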


Your sitemap can also contain valuable metadata associated with the pages you list in it. Metadata is information about a web page, such as when the page was last updated, how often the page is changed, and the importance of the page relative to other URLs on your site.
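
As a rough illustration of what that metadata looks like on the wire, here is a minimal Python sketch that serializes a `<urlset>` with the standard `loc`, `lastmod`, `changefreq`, and `priority` fields from the sitemaps.org protocol. The page dictionary keys and values are placeholders.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Serialize a minimal <urlset> carrying per-page metadata.

    `pages` is a list of dicts with illustrative keys: loc, lastmod,
    changefreq, priority.
    """
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for page in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        for field in ("loc", "lastmod", "changefreq", "priority"):
            if field in page:
                ET.SubElement(url, f"{{{NS}}}{field}").text = str(page[field])
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([{"loc": "https://example.com/post",
                      "lastmod": "2022-03-01",
                      "changefreq": "weekly",
                      "priority": 0.8}])
```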

If your site's pages are properly linked, Google's web crawlers can usually discover most of your site. Even so, a sitemap can improve the crawling of your site, particularly if your site meets one of the following criteria:

  • Your site is really large. As a result, Google's web crawlers are more likely to overlook some of your new or recently updated pages.
  • Your site has a large archive of content pages that are isolated or not linked well to each other. If your site pages do not naturally reference each other, you can list them in a sitemap to ensure that Google does not overlook some of your pages.
  • Your site is new and has few external links pointing at it. Googlebot and other web crawlers crawl the web by following links from one page to another. As a result, Google might not discover your pages if no other sites link to them.
  • Your site uses rich media content, is shown in Google News, or uses other sitemap-compatible annotations. Google can take additional information from sitemaps into account for search, where appropriate.

Below is a list of sitemaps currently supported on RebelMouse:

  • /sitemap.xml (for published posts)
  • /sitemap_news.xml (for posts published within the last two days)
  • /sitemap_video.xml (for published posts with video in the lead media)
  • /sitemap_pages.xml (for layouts)
  • /sitemap_sections.xml (for sections)
  • /sitemap_custom_pages.xml (for intersection)
  • /sitemap_section_content/section.xml (for a section)
  • /sitemap_section_content/parent_section/child_section.xml (for a child section)
  • /sitemap_tags.xml (for tags)
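
Each of these feeds is an XML file that a crawler reads entry by entry. As a rough sketch of how that works, here is how you might extract the page URLs from a regular sitemap in Python; the sample payload below is hypothetical.

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def extract_urls(sitemap_xml):
    """Pull the <loc> of every <url> entry out of a regular sitemap."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/first-post</loc></url>
  <url><loc>https://example.com/second-post</loc></url>
</urlset>"""

extract_urls(sample)  # ["https://example.com/first-post", "https://example.com/second-post"]
```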

Regular sitemaps, news sitemaps, and video sitemaps don't include all posts. Here's a list of posts that are excluded from sitemaps:

  • Articles with an "Unpublished" status:
    1. Removed Posts
    2. Drafts
    3. Community Posts
  • Private posts. A post is private if all of the sections that are assigned to it are set up as private.
  • Posts with direct links out. These are posts with source URLs (i.e., original URLs) that link directly to an external site.
  • Posts that are excluded from search results.
  • Suspicious posts.
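
A filter that applies these exclusion rules might look like the sketch below. The post fields (`status`, `sections`, `source_url`, and so on) are illustrative names, not RebelMouse's actual schema.

```python
def is_in_sitemap(post):
    """Return True if a post should appear in the regular sitemap.

    `post` is a hypothetical dict; the key names here are assumptions
    made for this sketch.
    """
    if post["status"] != "published":      # removed posts, drafts, community posts
        return False
    # Private: every section assigned to the post is set up as private.
    if post["sections"] and all(s["private"] for s in post["sections"]):
        return False
    if post.get("source_url"):             # direct link out to an external site
        return False
    if post.get("exclude_from_search") or post.get("suspicious"):
        return False
    return True

is_in_sitemap({"status": "published", "sections": [], "source_url": None})  # True
```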

Images can also be included in regular sitemaps. You can learn more about this in our post about how to serve images.
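
For reference, image entries in a regular sitemap use Google's image sitemap extension namespace. Below is a minimal sketch of a single `<url>` entry that carries one image; the page and image URLs are placeholders.

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG = "http://www.google.com/schemas/sitemap-image/1.1"

def url_with_image(page_url, image_url):
    """Build one <url> entry with an embedded image, per Google's
    image sitemap extension."""
    ET.register_namespace("", SM)
    ET.register_namespace("image", IMG)
    url = ET.Element(f"{{{SM}}}url")
    ET.SubElement(url, f"{{{SM}}}loc").text = page_url
    image = ET.SubElement(url, f"{{{IMG}}}image")
    ET.SubElement(image, f"{{{IMG}}}loc").text = image_url
    return ET.tostring(url, encoding="unicode")

entry = url_with_image("https://example.com/post",
                       "https://example.com/photo.jpg")
```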

Here's how to submit a sitemap in Google Search Console:

Sign in to Google Search Console

You must have a Google account in order to connect your site to Google Search Console (formerly known as Google Webmaster Tools). You can sign in here.

Choose your site in Google Search Console and click the Sitemaps button at the bottom of the page.

Add a new sitemap and click the SUBMIT button.

You can enter different types of sitemaps as needed. You can see if they have been successfully submitted in the Status column.

At some point you may also need to remove a sitemap. In the left-hand navigation menu in Search Console, click on Sitemaps and select the sitemap you would like to remove.

Then, click the vertical ellipsis icon in the upper right-hand corner of the screen and select Remove sitemap.

More on Google Search Console

Click here to learn how to set up your Google Search Console.

Click here to learn more about our SEO Keyword Win feature, which shows you what keywords you've won in the search engines so that you can adjust your URL slug and related articles to improve SEO even further.
