Customize your site's robots.txt and include remote content in it
Author: Viktor Szépe (profile at wordpress.org)
WordPress version required: 4.7
WordPress version tested: 4.9.18
Plugin version: 0.4.0
Added to WordPress repository: 14-09-2014
Last updated: 17-02-2018
Rating: 66% (rated by 3 users)
Plugin URI: https://github.com/szepeviktor/multipart-robo...
Total downloads: 53,018
Active installs: 2,000+

Warning! This plugin has not been updated in over 2 years. It may no longer be maintained or supported and may have compatibility issues when used with more recent versions of WordPress.
This plugin needs more documentation!
You can edit your robots.txt and add remote content to it.
For example, if you run several sites, you can use a centralized robots.txt shared by all of them.
Features
- Include or exclude WordPress’ own robots.txt (core function)
- Include or exclude output that other plugins – e.g. sitemap plugins – add to robots.txt (filter output)
- Include or exclude a remote text file (the common part)
- Include or exclude custom records from the settings page (the site specific part)
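The four features above boil down to concatenating enabled parts in order. Here is a minimal sketch of that idea in Python – not the plugin's actual code, and the function name and sample values are made up for illustration:

```python
# Sketch: assemble a multipart robots.txt from a core part, a remote
# "common" part shared by all sites, and site-specific custom records.
# Each part can be switched off (the "include or exclude" features).

def assemble_robots_txt(core_part, common_part, custom_records, include_core=True):
    """Join the enabled, non-empty parts into one robots.txt payload."""
    parts = []
    if include_core and core_part:
        parts.append(core_part.strip())
    if common_part:          # e.g. fetched from a central URL
        parts.append(common_part.strip())
    if custom_records:       # entered on the settings page
        parts.append(custom_records.strip())
    return "\n\n".join(parts) + "\n"

core = "User-agent: *\nDisallow: /wp-admin/"
common = "Disallow: /private/"
custom = "Sitemap: https://example.com/sitemap.xml"
print(assemble_robots_txt(core, common, custom))
```

In the real plugin the remote part would be fetched over HTTP and the result served as the robots.txt response.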
Where is robots.txt?
WordPress handles robots.txt as a virtual URL, in the same way it handles posts and pages: there is no physical file on disk.
When you browse to https://example.com/robots.txt,
WordPress generates the robots.txt response on the fly.
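For reference, the core-generated robots.txt (the "core function" part the plugin can include or exclude) typically looks like this on a recent WordPress install:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```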
TODO
- add more description here
- add a video too
- add an admin notice for subdir installs (robots.txt is useless in a subdir)
- ‘At least one “Disallow” field must be present in the robots.txt file.’ – check for that
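Regarding the last TODO item: under the original robots.txt convention, every record needs at least one Disallow field, and an empty value means nothing is disallowed. A minimal record that satisfies the rule while blocking nothing would be:

```
User-agent: *
Disallow:
```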
Links
Development of this plugin takes place on GitHub.
FAQ
ChangeLog