Sitemaps & Robots.txt

Unfold CMS automatically generates an XML sitemap and a robots.txt file for search engine crawlers. The sitemap is regenerated synchronously whenever content is created, updated, or deleted.

XML Sitemap

Overview

The XML sitemap is generated using the spatie/laravel-sitemap package and is available at:

https://yourdomain.com/sitemap.xml

The sitemap includes:

  • Blog posts — All published posts with their last modified date
  • Pages — All published static pages
  • Categories — Blog category archive pages
  • Homepage — The site's root URL
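For reference, each entry in the generated file follows the standard sitemap protocol: a `<url>` element with the page's location and last modified date inside a `<urlset>` root. The URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/blog/example-post</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```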

Configuration

Configure sitemap behavior in Settings > SEO in the admin panel.

Automatic Updates

The sitemap regenerates synchronously when:

  • A blog post is published, updated, or deleted
  • A page is created, updated, or deleted
  • A category is added or removed

There is no need to manually trigger sitemap regeneration. The CMS handles this automatically without queue workers.
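Conceptually, the save path for content triggers a full synchronous rebuild of the sitemap. The sketch below illustrates that flow in Python; it is not the CMS's actual code (which uses spatie/laravel-sitemap and Eloquent model events), and all names are illustrative:

```python
# Illustrative sketch: regenerating the sitemap synchronously from a
# content-save hook, with no queue worker involved.

posts = []  # stands in for the published-content table


def regenerate_sitemap(entries):
    """Build the full sitemap XML in one synchronous pass."""
    urls = "\n".join(
        f"  <url><loc>{e['url']}</loc><lastmod>{e['lastmod']}</lastmod></url>"
        for e in entries
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n"
        "</urlset>"
    )


def publish_post(url, lastmod):
    """Saving content immediately triggers regeneration."""
    posts.append({"url": url, "lastmod": lastmod})
    return regenerate_sitemap(posts)


sitemap_xml = publish_post("https://yourdomain.com/blog/hello", "2024-01-15")
```

Because the rebuild happens inside the same request that saves the content, the sitemap is never stale and no background worker is required.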

Search Engine Notification

When new content is published, Unfold CMS can notify search engines:

Engine   Setting          Default
Bing     seo.ping_bing    true

Google is not pinged directly; it discovers the sitemap automatically through the Sitemap directive in robots.txt.
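When seo.ping_bing is enabled, the notification amounts to a GET request to Bing's ping endpoint with the sitemap URL passed as a URL-encoded query parameter. A minimal sketch of constructing that request URL (the HTTP call itself is omitted, and this is not the CMS's internal code):

```python
from urllib.parse import urlencode


def bing_ping_url(sitemap_url):
    # Bing's ping endpoint takes the sitemap URL as a
    # URL-encoded "sitemap" query parameter.
    return "https://www.bing.com/ping?" + urlencode({"sitemap": sitemap_url})


ping = bing_ping_url("https://yourdomain.com/sitemap.xml")
```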

Robots.txt

The robots.txt file is dynamically generated and available at:

https://yourdomain.com/robots.txt

Default Content

User-agent: *
Disallow: /admin/*
Disallow: /api/*

Sitemap: https://yourdomain.com/sitemap.xml
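Since robots.txt is generated dynamically, the response is assembled from the configured disallow paths with the sitemap URL appended at the end. The sketch below shows the general shape of that assembly in Python; the function and parameter names are assumptions for illustration, not the CMS's actual internals:

```python
# Illustrative sketch: assemble a robots.txt response from configured
# disallow paths, appending the sitemap URL automatically.

def build_robots_txt(disallow_paths, sitemap_url, custom_rules=""):
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in disallow_paths]
    if custom_rules:
        lines += ["", custom_rules.strip()]
    lines += ["", f"Sitemap: {sitemap_url}"]
    return "\n".join(lines) + "\n"


robots = build_robots_txt(
    ["/admin/*", "/api/*"],
    "https://yourdomain.com/sitemap.xml",
)
```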

Configuration

Customize the robots.txt disallow paths and custom rules through Settings > SEO in the admin panel. The sitemap URL is automatically appended.

Custom Rules

Add custom directives through the seo.robots_custom setting:

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Crawl-delay: 5
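With the custom rules above configured, the served robots.txt combines the default rules, the custom block, and the sitemap line. The exact ordering may differ, but the result looks roughly like:

```
User-agent: *
Disallow: /admin/*
Disallow: /api/*

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Crawl-delay: 5

Sitemap: https://yourdomain.com/sitemap.xml
```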

Shared Hosting Notes

Sitemap generation runs synchronously — no cron job or queue worker is needed for sitemap updates. The sitemap is regenerated on content changes and cached for the configured duration.

The only cron job needed is for scheduled post publishing:

* * * * * cd /path-to-project && php artisan schedule:run >> /dev/null 2>&1