Welcome back. Now that we understand the importance of technical SEO, let's talk about one of the key elements of technical SEO: the sitemap. As a visitor to a website, you may have encountered sitemaps. But did you know that there are sitemaps created just for search engine robots? In this lesson, we'll learn about the difference between HTML sitemaps and XML sitemaps and see why it is important to include both types of sitemaps in your site design. In the last part of this lesson, we'll create an XML sitemap using a crawling tool called Screaming Frog. Let's get started. Sitemaps help point search engines to existing pages on your site, which helps ensure pages are not overlooked or missed by crawlers. When SEOs refer to sitemaps, we are referring to either an HTML sitemap or an XML sitemap. The main difference between the two versions is that HTML sitemaps are easily read and understood by users, while XML sitemaps are created for search engines. An XML sitemap is a file intended to be read by search engine robots. This file includes a lot of behind-the-scenes information about a webpage. This can include unique information about each URL, which will provide search engines with additional data about that page. For example, you can include information such as when the page was last updated, how often the page changes, and how important the page is in relation to other pages on your site. This information allows search engine robots to analyze the content on your site in a more logical and intelligent manner. An XML sitemap is especially useful for new sites that have not yet been discovered by search engines. Since search engines discover pages on the web by following links, it may take a while for a new site to be discovered. By creating a free account in Google Search Console or Bing Webmaster Tools and uploading your sitemap, you are able to inform search engines of the presence of your new site and what pages exist.
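As an illustration, here is what a single entry in an XML sitemap looks like under the sitemaps.org protocol. The URL is made up for this example; the last-updated date, change frequency, and relative importance described above correspond to the `lastmod`, `changefreq`, and `priority` tags:

```xml
<url>
  <loc>https://www.example.com/about/</loc>
  <lastmod>2024-01-15</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.8</priority>
</url>
```

Only `loc` is required by the protocol; the other three tags are optional hints that search engines may or may not act on.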
An HTML sitemap is a simple page that contains links to important pages within the site that a user may want to find, and can be considered a general overview. For example, if you were looking for a particular page within a website and clicked on their sitemap, you would likely be able to find the page you are looking for there. Smaller sites will generally have an HTML sitemap that is one page. Larger sites will generally split up their HTML sitemap content into different categories. This is done to better organize the content within their site. Ideally, a website should have both an HTML and an XML sitemap available. For the purposes of this lesson, however, we will discuss creating XML sitemaps, as these are more specific to SEO and search engines. If you perform a search for XML sitemap creation tools, you'll receive a variety of options. One option is through a site called XML-sitemaps.com. This will allow you to create a free sitemap, but only for up to 500 pages. Most free online sitemap creation tools will have similar limitations. The preferred method is to use a crawling tool such as Screaming Frog to help create a sitemap. Screaming Frog will also only crawl 500 pages with its free version. This is fine for learning how to create a sitemap in this course. Eventually, you will want to consider purchasing the full version; it's a very useful tool for analyzing a website. For now, let's go ahead and create a sitemap using Screaming Frog to see how to do so and what a sitemap file looks like. In this demo, we will discuss how to use a crawling tool such as Screaming Frog to create an XML sitemap. This is the Screaming Frog interface. To begin, let's crawl a website so we can create an XML sitemap file of that site. I'm going to use UC Davis as an example. Once you've entered your URL, you can click Start or simply hit Enter. The tool will begin crawling and listing the pages within the site.
You can see the progress of the crawl here to the right, as well as a list of pages the tool has crawled below. The tool will show you the address of each page, the type of content the page is, such as whether it's an HTML file, an image, or code such as JavaScript, along with status codes, error codes, title tags, and more. It's worthwhile to crawl a website and spend a few minutes familiarizing yourself with the interface and the information available. Since this crawl can take a while, we're going to stop the crawl and use the information we have in our example. Once you're done crawling a site, you can create an XML sitemap by going to the Sitemaps menu and choosing Create XML Sitemap. Screaming Frog will give you a variety of options that you generally want to leave at their defaults. However, there are additional options here should you need something specific. For example, you can choose to include pages you would not like indexed. You can also choose to include canonicalized versions of pages, as well as paginated URLs, which are pages in a series, such as page 1, 2, 3, and so forth. You can also choose whether or not you would like PDF files included. If the content in your PDF files is unique, it may not be a bad idea to allow search engines to discover and crawl them so they can be included in the index. There are a variety of other tabs here with additional options you can choose. The last modified tab allows you to tell search engines when the page was last modified; you can choose to go by the server response or set a custom date. The priority tab allows you to set the priority of specific pages within your site. You can also set the change frequency to show how often a page is updated, such as daily, weekly, monthly, or something different, and you can choose whether or not to include images. Generally, this is unnecessary, as Google will discover important images related to the content on your site as it crawls the page.
Once you have this information selected, we can go ahead and create the sitemap file by clicking Next. Screaming Frog will then ask where we want to save the file. In this example, I'm going to choose to save it to my desktop. Once you've selected where you want to save it, just name the sitemap file and click Save. I'm just going to keep the default sitemap name for now. Once it has been saved, you can provide this to your web admins so they can upload it to the server, or, if you want, go in and upload it yourself. You can get an idea of what an XML sitemap file looks like. Let's open it up and take a quick look. You can see some syntax here at the top that provides additional information to search engines. Then you can see each page listed here, along with some additional information such as the URL of the page, how often it's updated, and the priority of the page. Depending on the settings you chose, you may see additional information here, such as when the page was last modified. This concludes the demonstration. You should now understand how to crawl a website and how to create a sitemap.
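For reference, a complete (if minimal) sitemap file like the one saved in the demo has the general shape below. The opening lines are the extra syntax mentioned above: an XML declaration and the sitemaps.org namespace that tell search engines how to read the file. The URLs here are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Each `url` block corresponds to one page entry you saw when opening the file, and optional tags such as `lastmod` appear only if you enabled them when generating the sitemap.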