Published on Jul 02, 2020
How to add a sitemap to a Sphinx project?
A sitemap is an essential part of making your website more visible to search engines. It is usually represented by a sitemap.xml file that lists the URLs of all pages on the website, translations of pages in other languages, and so on. The sphinx-sitemap extension can easily generate a sitemap for your Sphinx documentation project.

As an example, have a look at this website's sitemap.xml.
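For a rough idea of what the generated file contains, here is a minimal sketch of a sitemap.xml (the page URLs are made up for this illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://documatt.com/blog/index.html</loc>
  </url>
  <url>
    <loc>https://documatt.com/blog/about.html</loc>
  </url>
</urlset>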
Add and configure sphinx-sitemap
All the hard work is done by the amazing sphinx-sitemap extension.
Install:
pip install sphinx-sitemap
Add or append sphinx_sitemap to the extensions list in conf.py:

extensions = [
    # ...
    'sphinx_sitemap',
]

If you haven't set it already, enter your documentation's public URL, e.g.:
html_baseurl = 'https://documatt.com/blog'
Build the docs! The output directory will contain an automatically generated sitemap.xml.
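The build itself is a normal Sphinx build; for example, assuming your sources are in the current directory and you build into _build/html:

# a plain HTML build; sitemap.xml appears next to the generated pages
sphinx-build -b html . _build/html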
Update robots.txt
robots.txt is somewhat similar to a sitemap. Both talk to search engine crawlers: a sitemap is a list of pages to index, while robots.txt tells crawlers which pages to ignore (not index). robots.txt is expected at the root of your website, e.g., https://documatt.com/robots.txt.
One way to “announce” the sitemap to search engines is to reference it in robots.txt with the Sitemap: directive.
Create or update robots.txt in the project root (the folder with conf.py). If you don’t have pages to exclude, it may look like this:

User-agent: *
Sitemap: https://documatt.com/blog/sitemap.xml
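If you do want to exclude some pages, add Disallow lines; for instance, a hypothetical /drafts/ directory could be hidden like this:

User-agent: *
Disallow: /drafts/
Sitemap: https://documatt.com/blog/sitemap.xml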
Add the html_extra_path = ["robots.txt"] option to conf.py, or append "robots.txt" if the option already exists. html_extra_path is a list of paths to be copied to the root of the documentation output.
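Put together, the sitemap-related part of conf.py might look like this sketch (your base URL will differ):

# conf.py
extensions = [
    # ... your other extensions ...
    'sphinx_sitemap',
]

# public URL of the published docs; page URLs in sitemap.xml are built from it
html_baseurl = 'https://documatt.com/blog'

# copy robots.txt from the project root to the root of the built HTML
html_extra_path = ['robots.txt']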