| Package Data | |
| --- | --- |
| Maintainer Username: | rkeppner |
| Maintainer Contact: | russell.keppner@gmail.com (Russell Keppner) |
| Package Create Date: | 2016-04-09 |
| Package Last Update: | 2016-04-13 |
| Home Page: | |
| Language: | PHP |
| License: | MIT |
| Last Refreshed: | 2024-10-30 15:19:10 |
| Package Statistics | |
| --- | --- |
| Total Downloads: | 2,225 |
| Monthly Downloads: | 0 |
| Daily Downloads: | 0 |
| Total Stars: | 1 |
| Total Watchers: | 1 |
| Total Forks: | 0 |
| Total Open Issues: | 0 |
This is a fork of ellisthedev/laravel-5-robots, which was a fork of jayhealey/Robots, which was based on earlier work.
The purpose of this fork is to create a set-it-and-forget-it package that can be installed without much effort. It is therefore highly opinionated and not built for configuration.
When enabled, it allows access to all clients and serves up the sitemap.xml. Otherwise, it operates almost identically to Laravel's default configuration, denying access to all clients.
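The behavior described above amounts to a single route that returns robots.txt content based on one flag. As a rough sketch (this illustrates the described behavior and is not the package's actual source; the exact response text is assumed):

```php
<?php

use Illuminate\Support\Facades\Route;

// Sketch only: serve robots.txt from a route, toggled by one config flag.
Route::get('robots.txt', function () {
    if (config('robots.allow')) {
        // Allow all clients and advertise the sitemap.
        $content = "User-agent: *\nDisallow:\nSitemap: " . url('sitemap.xml');
    } else {
        // Deny all clients, as Laravel does by default.
        $content = "User-agent: *\nDisallow: /";
    }

    return response($content)->header('Content-Type', 'text/plain');
});
```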
Via Composer command line:
$ composer require infusionweb/laravel-robots-route
Or add the package to your `composer.json`:
{
    "require": {
        "infusionweb/laravel-robots-route": "~0.1.0"
    }
}
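After editing composer.json by hand, tell Composer to install the new requirement:

$ composer update infusionweb/laravel-robots-route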
Laravel ships with a default robots.txt which disallows all clients. It must be removed for the package's route to take effect, since a static file in `public/` is served before any route is matched.
$ rm public/robots.txt
Add the service provider to your `config/app.php`:
'providers' => [
    //
    InfusionWeb\Laravel\Robots\RobotsServiceProvider::class,
];
Publish the package config file:
$ php artisan vendor:publish --provider="InfusionWeb\Laravel\Robots\RobotsServiceProvider"
You may now allow clients via robots.txt by editing the `config/robots.php` file, opening up the site to search engines:
return [
    'allow' => env('ROBOTS_ALLOW', true),
];
Or simply set the `ROBOTS_ALLOW` environment variable to `true`, via the Laravel `.env` file or your hosting environment:
ROBOTS_ALLOW=true
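Conversely, setting it to `false` blocks all clients again, which is useful for staging environments that should stay out of search indexes:

ROBOTS_ALLOW=false

To confirm what is actually being served, request the file directly (hostname assumed):

$ curl http://localhost/robots.txt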
The MIT License (MIT). Please see License File for more information.