# User-agent: * applies to all robots
User-agent: *

# Allow crawling of all content by default.
# This line is often implied if no Disallow rules are present,
# but explicitly stating it can improve clarity.
Allow: /

# Disallow specific directories or files that should not be crawled.
# Replace these with actual paths on your site if they exist.
# Examples:
# Disallow: /admin/      # Blocks access to an admin folder
# Disallow: /temp/       # Blocks access to a temporary files folder
# Disallow: /private/    # Blocks access to a private section
# Disallow: /*.bak$      # Blocks access to all files ending with .bak
# Note: the "*" and "$" wildcards are extensions honored by major crawlers
# such as Googlebot and Bingbot; they are not part of the original robots
# exclusion standard, so some less common robots may ignore them.

# Path to your XML sitemap(s). Replace with the actual URL(s) of your sitemap.
# You might have one or more sitemaps depending on your site structure.
# If you don't have a sitemap yet, create one!
Sitemap: https://stukadoorsbedrijfkicken.nl/sitemap.xml
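
# If your site has more than one sitemap, list each on its own Sitemap line.
# A minimal sketch follows; the second URL is hypothetical and only shown
# for illustration, so replace it with a real sitemap URL if one exists:
# Sitemap: https://stukadoorsbedrijfkicken.nl/sitemap-images.xml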