At the moment, any outdated or removed pages need to be identified and deleted manually, which can be time-consuming.
We could add functionality where the crawler periodically re-syncs its pages against the sitemap. For example, if you could upload or link to the sitemap directly, the crawler could automatically remove any pages that are no longer listed there. This would keep the knowledge base up to date without requiring manual intervention.
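A minimal sketch of how such a sync could work, assuming a flat sitemap (a single urlset, not a sitemap index), a placeholder sitemap URL, and a hypothetical set of currently indexed URLs standing in for whatever the crawler already holds:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Assumption: the sitemap location is known (uploaded or linked by the user).
SITEMAP_URL = "https://example.com/sitemap.xml"
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def fetch_sitemap_urls(sitemap_url: str) -> set[str]:
    """Download the sitemap and return the set of <loc> URLs it lists."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return {loc.text.strip() for loc in tree.iter(f"{SITEMAP_NS}loc") if loc.text}


def find_stale_pages(indexed_urls: set[str], sitemap_url: str) -> set[str]:
    """Return indexed URLs that no longer appear in the sitemap (removal candidates)."""
    live_urls = fetch_sitemap_urls(sitemap_url)
    return indexed_urls - live_urls


if __name__ == "__main__":
    # Hypothetical example: these stand in for the pages the crawler has already indexed.
    indexed_urls = {
        "https://example.com/docs/getting-started",
        "https://example.com/docs/old-page",  # imagine this page was removed from the site
    }
    for url in find_stale_pages(indexed_urls, SITEMAP_URL):
        # A real implementation would delete the page from the knowledge base here.
        print(f"Would remove stale page: {url}")
```

Run on a schedule (e.g. daily), this would flag pages that have dropped out of the sitemap so they can be removed automatically rather than hunted down by hand.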