For my site, I write all demos and blog posts in Markdown format and then use Eleventy to create static web pages.
Until now, I haven't attempted to have search engines index the site, but I'll do that soon.
I knew you needed a file named `robots.txt`, but I couldn't explain how it worked, so I asked my friend, ChatGPT:
Q: explain `robots.txt` in 3 sentences
A: The robots.txt file is a text file at the root of a website used to instruct web crawlers about which pages or sections of the site should not be crawled and indexed. It specifies rules for different user agents (web crawlers), indicating allowed and disallowed paths within the site to manage and optimize search engine indexing. However, adherence to these instructions is voluntary, and not all crawlers respect the robots.txt directives, making it an advisory rather than a security tool.
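For concreteness, a minimal `robots.txt` might look like the following; the disallowed path and the sitemap URL are placeholders, not my actual setup:

```text
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers to skip this (hypothetical) folder
Disallow: /drafts/

# Advertise where the sitemap lives (placeholder domain)
Sitemap: https://example.com/sitemap.xml
```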
Similarly for `sitemap.xml`:
Q: explain `sitemap.xml` in 3 sentences
A: A `sitemap.xml` file is an XML document that lists the URLs of a website along with optional metadata about each URL, such as the last modification date, change frequency, and priority in relation to other pages. It serves as a guide for search engines to more effectively crawl the site, ensuring that important content is discovered and indexed. By providing a structured overview of a site's pages, a `sitemap.xml` helps improve the visibility of a website in search engine results, aiding in search engine optimization (SEO).
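Here's what a tiny `sitemap.xml` could look like; the URLs and dates are made-up placeholders that just illustrate the schema:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/first-post/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```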
Thanks to a couple of useful posts, I added `robots.txt` and `sitemap.xml` files to the site.
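Since the site is built with Eleventy, a natural way to generate the sitemap is a template that loops over `collections.all`. This is just a sketch of that approach, not necessarily what the posts above describe, and `example.com` stands in for the real domain:

```njk
---
permalink: /sitemap.xml
eleventyExcludeFromCollections: true
---
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{%- for page in collections.all %}
  <url>
    {# Prefix each page's URL with the production domain (placeholder) #}
    <loc>https://example.com{{ page.url }}</loc>
    <lastmod>{{ page.date.toISOString() }}</lastmod>
  </url>
{%- endfor %}
</urlset>
```

A static `robots.txt` can then be copied into the build output as-is with `eleventyConfig.addPassthroughCopy("robots.txt")` in the Eleventy config file.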