# This robots.txt file controls crawling of URLs

User-agent: *
Disallow: /api/content-delivery
Disallow: /api/search-service
Allow: /