TractorCow\Robots\Robots
Provides robots.txt functionality
- Author: Damian Mooyman
Synopsis
class Robots
extends RequestHandler
{
- // members
- private static string $sitemap = 'sitemap.xml';
- private static bool $disallow_unsearchable = true;
- private static array $disallowed_urls = [];
- private static array $allowed_urls = [];
- // methods
- protected bool isPublic()
- public HTTPResponse index()
- protected string renderSitemap()
- protected string renderDisallow()
- protected string renderAllow()
- protected array disallowedUrls()
- protected array allowedUrls()
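Since the class extends RequestHandler, it needs a route before index() can serve requests. The module presumably wires this up itself; a rule of the following shape (the exact rule is an assumption, shown here for orientation) is how SilverStripe's Director would map `robots.txt` to this handler:

```yaml
# Illustrative routing rule — the module likely ships its own equivalent
SilverStripe\Control\Director:
  rules:
    'robots.txt': 'TractorCow\Robots\Robots'
```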
Hierarchy
Extends
- SilverStripe\Control\RequestHandler
Members
private
- $allowed_urls — array
  List of allowed URLs
- $disallow_unsearchable — bool
  Hide unsearchable pages
- $disallowed_urls — array
  List of disallowed URLs
- $sitemap — string
  Path to sitemap.xml to look for
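The private statics above are SilverStripe configuration properties, so they can be overridden from project YAML via the standard Configuration API. The property names come from the synopsis; the file name and URL paths below are illustrative assumptions, not values shipped by the module:

```yaml
# app/_config/robots.yml (illustrative file name)
TractorCow\Robots\Robots:
  sitemap: 'sitemap.xml'
  disallow_unsearchable: true
  disallowed_urls:
    - '/admin/'       # example path, not from the module
  allowed_urls:
    - '/public-area/' # example path, not from the module
```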
Methods
protected
- allowedUrls() — Returns an array of allowed URLs
- disallowedUrls() — Returns an array of disallowed URLs
- isPublic() — Determines if this is a public site
- renderAllow() — Renders the list of allowed pages, if any
- renderDisallow() — Renders the list of disallowed pages
- renderSitemap() — Renders the sitemap link reference
public
- index() — Generates the response containing the robots.txt content
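Judging from the method names, index() presumably assembles the response body from renderSitemap(), renderDisallow(), and renderAllow() in turn. A plausible robots.txt body (the URLs and exact line order are illustrative assumptions; actual output depends on site configuration) might look like:

```text
Sitemap: https://example.com/sitemap.xml
Disallow: /admin/
Allow: /public-area/
```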