| Package Data | |
|---|---|
| Maintainer Username: | OwenMelbz |
| Maintainer Contact: | owenmelbz@gmail.com (Owen Melbourne) |
| Package Create Date: | 2017-05-27 |
| Package Last Update: | 2019-10-02 |
| Home Page: | |
| Language: | PHP |
| License: | MIT |
| Last Refreshed: | 2024-12-22 03:12:04 |
| Package Statistics | |
|---|---|
| Total Downloads: | 1,097 |
| Monthly Downloads: | 0 |
| Daily Downloads: | 0 |
| Total Stars: | 14 |
| Total Watchers: | 2 |
| Total Forks: | 0 |
| Total Open Issues: | 0 |
An automatically generated robots.txt that discourages indexing of folders, with additional Blade meta tag directives for in-page exclusions.
Install via Composer: `composer require owenmelbz/laravel-robots-txt`
Register the service provider, typically done inside the `app.php` `providers` array, e.g. `OwenMelbz\RobotsTxt\RobotsTxtServiceProvider::class` (see the sketch below).
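A minimal sketch of that registration, assuming a standard Laravel `config/app.php` (on newer Laravel versions package auto-discovery may register the provider for you):

```php
<?php

// config/app.php (excerpt) — add the package's service provider
// so its robots.txt route and Blade directive are booted.
return [

    // ...

    'providers' => [

        // Other framework and application service providers...

        OwenMelbz\RobotsTxt\RobotsTxtServiceProvider::class,

    ],

];
```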
Add `BLOCK_ROBOTS=true` to your application environment config, e.g. `.env`
Within your `<head>` area you can include the Blade directive `@robotsMeta` to pull through a `noindex, nofollow` meta tag, as shown in the sketch below.
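A minimal Blade layout sketch showing where the directive goes; the file name and surrounding markup are illustrative, not part of the package:

```blade
{{-- resources/views/layouts/app.blade.php (illustrative file name) --}}
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <title>My App</title>

    {{-- Directive provided by the package: pulls through a noindex, nofollow meta tag --}}
    @robotsMeta
</head>
<body>
    @yield('content')
</body>
</html>
```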
If you publish the package via `php artisan vendor:publish --provider="OwenMelbz\RobotsTxt\RobotsTxtServiceProvider"` you can use a custom robots.txt template file to include extra rules.
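The published template's exact location and placeholder syntax are not documented here, so the following is only a generic robots.txt illustration of the kind of extra rules you might add to such a template:

```
# Illustrative robots.txt rules only — not the package's template syntax
User-agent: *
Disallow: /admin/
Disallow: /checkout/

User-agent: Googlebot-Image
Disallow: /uploads/private/
```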