Feature: Robots.txt Manager #
Dynamic `robots.txt` generation with multiple security profiles, a bot database, AI crawler policies and sitemap integration.
---
Installation and Configuration #
Via FTP: Upload the `robotstxt/` directory into `/plugins/`.
Via package manager: Select `robotstxt` from the available plugins list.
Paths:
- Main plugin: `/plugins/robotstxt/robotstxt.php`
- Configuration: `/plugins/robotstxt/conf/robotstxt.conf.inc.php`
- Handlers: `/plugins/robotstxt/handlers/`
- Classes: `/plugins/robotstxt/lib/RobotsTxtGenerator.php`, `BotDatabase.php`
- CSS: `/plugins/robotstxt/css/`
Configuration parameters:
| Variable | Default | Description |
|---|---|---|
| `$basedisplevel` | `BASE_LEVEL_HIGHUSER` | Minimum user level required to access the plugin |
| `$ftype` | `3` | Plugin type (system) |
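A minimal sketch of `conf/robotstxt.conf.inc.php` with these defaults; only the two documented parameters are shown, and any further settings are assumptions:

```php
<?php
// Sketch of conf/robotstxt.conf.inc.php -- only documented parameters shown.

// Minimum user level required to access the plugin interface.
$basedisplevel = BASE_LEVEL_HIGHUSER;

// Plugin type: 3 marks this as a system plugin.
$ftype = 3;
```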
---
Usage #
Security profiles #
| Profile | Description |
|---|---|
| `default` | Standard configuration |
| `private` | Blocks most crawlers |
| `open_research` | Allows academic crawlers |
| `corporate` | Strict enterprise configuration |
| `aegis_sovereign` | Maximises control |
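As an illustration of how the profiles differ, a `private` deployment might emit output along these lines; the actual rules are derived from the profile and the bot database, so treat this as a sketch only:

```
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /
Crawl-delay: 10
```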
AI policies #
| Policy | Description |
|---|---|
| `allow` | Allows AI crawlers |
| `block` | Blocks known AI crawlers |
| `honeypot` | Traps undeclared crawlers |
| `ignore` | No specific AI rules |
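With `block`, the generator can emit a deny rule for each AI crawler known to the bot database. The excerpt below is illustrative; the real user-agent list comes from the database:

```
# AI policy: block (illustrative excerpt)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```

The `honeypot` policy typically advertises a decoy path in the file; a crawler that requests that path anyway has ignored the rules and can be flagged as undeclared.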
Workflow #
1. Select a profile and an AI policy
2. Configure crawl delay and elements to include (sitemap, honeypot)
3. Preview the generated file
4. Deploy (writes `robots.txt` to the site root)
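A hypothetical sketch of steps 3 and 4 in code, assuming `RobotsTxtGenerator` exposes setters and a `generate()` method; none of these method names are documented, only the class and its path are:

```php
<?php
require_once '/plugins/robotstxt/lib/RobotsTxtGenerator.php';

// Hypothetical API -- method names are assumptions, not the documented interface.
$generator = new RobotsTxtGenerator();
$generator->setProfile('private');   // one of the security profiles above
$generator->setAiPolicy('block');    // one of the AI policies above
$generator->setCrawlDelay(10);       // seconds between requests
$generator->includeSitemap(true);

$preview = $generator->generate();   // step 3: preview the generated file
echo $preview;

// Step 4: deploy by writing robots.txt to the site root.
file_put_contents($_SERVER['DOCUMENT_ROOT'] . '/robots.txt', $preview);
```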
Bot database migration #
The `run_migration` tool updates the database of known bots and categorises its entries.
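The underlying schema is not documented here; as a sketch, each entry might pair a user-agent token with a category that the security profiles and AI policies can filter on:

```php
<?php
// Illustrative layout only -- the real schema lives behind Beamreactor\Database\SQL.
$bots = [
    ['agent' => 'Googlebot', 'category' => 'search'],
    ['agent' => 'GPTBot',    'category' => 'ai'],
    ['agent' => 'AhrefsBot', 'category' => 'seo'],
];
```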
---
Hooks and Entry Points #
Main interface: `?obj=robotstxt.php`
POST actions:
| Action | Description |
|---|---|
| `generate_preview` | Generate a preview |
| `deploy` | Write the `robots.txt` file |
| `run_migration` | Update the bot database |
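Assuming the action is passed as a POST field named `action` (an assumption; the field name is not documented) and that the caller is already authenticated, a preview request might look like:

```php
<?php
// Hypothetical preview request; session handling is omitted for brevity.
$ch = curl_init('https://example.com/?obj=robotstxt.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, ['action' => 'generate_preview']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);
```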
---
Dependencies #
- `Beamreactor\Database\SQL`: bot database access
- Write permission on `robots.txt` at the site root
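Before deploying, the write-permission dependency can be checked up front; using `DOCUMENT_ROOT` as the site root is an assumption:

```php
<?php
// Verify that robots.txt at the site root can be written before deploying.
$target   = $_SERVER['DOCUMENT_ROOT'] . '/robots.txt';
$writable = file_exists($target) ? is_writable($target)
                                 : is_writable(dirname($target));
if (!$writable) {
    echo "Cannot write $target -- fix permissions before deploying.\n";
}
```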