DB Robots.txt is an easy, automated solution for creating and managing a robots.txt file for your site. You can create a robots.txt file without FTP access.
If the plugin detects one or more XML sitemap files, it will include them in the robots.txt file.
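For illustration, a generated file with a detected sitemap might look like the following sketch (the domain, sitemap name, and rules are assumptions, not the plugin's actual output):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```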
Installation
- Upload the bisteinoff-robots-txt folder to the /wp-content/plugins/ directory
- Activate the plugin through the ‘Plugins’ menu in WordPress
Will it conflict with any existing robots.txt file?
If a physical robots.txt file exists in the site root, the web server serves it directly and WordPress never handles the request, so there will be no conflict.
Will this work for sub-folder installations of WordPress?
Out of the box, no. Because WordPress is installed in a sub-folder, it won’t “know” when someone requests the robots.txt file, which must be located at the root of the site.
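One common workaround — an illustration, not a feature of this plugin — is to forward root-level requests for robots.txt to the sub-folder installation. Assuming an Apache server with mod_rewrite enabled and a hypothetical sub-folder named /blog/, the root .htaccess could contain:

```
# Root .htaccess — forward robots.txt requests to the
# WordPress installation in /blog/ (assumed sub-folder name)
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^robots\.txt$ /blog/robots.txt [L]
</IfModule>
```

On nginx or other servers the equivalent rewrite would differ; adjust the sub-folder name to match your installation.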
Contributors & Developers
“DB Robots.txt” is open source software. The following people have contributed to this plugin.
Changelog
- Security issues
- Compatible with WordPress 6.3
- Security issues
- Compatible with multisites
- Fixed errors in the plugin’s translation functions
- Translations are now downloaded automatically from https://translate.wordpress.org/projects/wp-plugins/db-robotstxt/. If there is no translation into your language, please don’t hesitate to contribute!
- Compatible with GlotPress
- New options to rename or delete the existing robots.txt file
- New option to disable the rules for Yandex
- Redesigned the settings page in the admin panel
- New basic default rules for Googlebot and Yandex
- More ways to manage your robots.txt: you can now add custom rules for Googlebot, Yandex, and other user agents
- More information about your robots.txt on the settings page
- Added a settings page for custom rules in the admin panel
- Tested with WordPress 6.2.
- Optimized the code
- Added robots directives for the WebP and AVIF image formats
- Fixed Sitemap option
- Tested with WordPress 5.5.
- Added wp-sitemap.xml
- Tested with WordPress 5.0.
- Removed the obsolete Host directive, which is no longer supported by Yandex.
- The robots directives are improved and updated.
- Added robots directives preventing the indexing of duplicate URLs with UTM, Openstat, From, GCLID, YCLID, and YMCLID parameters
- Initial release.