| Package Data | |
| --- | --- |
| Maintainer Username | ganey |
| Maintainer Contact | git@ganey.co.uk (Michael Gane) |
| Package Create Date | 2016-11-01 |
| Package Last Update | 2016-11-01 |
| Language | PHP |
| License | Unknown |
| Last Refreshed | 2025-02-05 03:10:45 |
| Package Statistics | |
| --- | --- |
| Total Downloads | 21 |
| Monthly Downloads | 0 |
| Daily Downloads | 0 |
| Total Stars | 0 |
| Total Watchers | 1 |
| Total Forks | 0 |
| Total Open Issues | 0 |
This is a fork of https://github.com/jayhealey/Robots/, updated for Laravel 5.
The original Robots class was written by dragonfire1119 of TutsGlobal.com: http://tutsglobal.com/topic/15-how-to-make-a-robotstxt-in-laravel-4/
The class itself (`Robots.php`) will work on any PHP 5.4+ site, and could easily be modified for PHP 5.2 by removing the namespace. This repository offers easy integration via Composer and includes a service provider and a facade for Laravel 5, alongside a set of PHPUnit tests. Check out the `Robots.php` class for a full understanding of the functionality.
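Because the class has no hard Laravel dependency, you could also use it in plain PHP. Here is a minimal sketch, assuming the class exposes the same methods the Laravel facade uses later in this README (`addUserAgent`, `addSitemap`, `generate`); check `Robots.php` for the exact API:

```php
<?php

require 'vendor/autoload.php';

use Ganey\Robots\Robots;

// Sketch only: method names are inferred from the facade calls
// shown further down in this README.
$robots = new Robots();
$robots->addUserAgent('*');
$robots->addSitemap('sitemap.xml');

header('Content-Type: text/plain');
echo $robots->generate();
```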
As usual with Composer packages, there are two ways to install:
You can install via Composer from the command line, picking "master" as the version of the package:

```
composer require Ganey/robots
```
Or add the following to the `require` section of your `composer.json` and then run `composer update` to install it:

```json
{
    "require": {
        "Ganey/robots": "~2.0"
    }
}
```
Once installed via Composer, you need to add the service provider. Do this by adding the following to the `providers` section of the application config (`config/app.php` in Laravel 5):

```php
Ganey\Robots\RobotsServiceProvider::class,
```
Then add the facade to the `aliases` section of the same file:

```php
'Robots' => \Ganey\Robots\RobotsFacade::class,
```
Note that the facade allows you to use the class by simply calling `Robots::doSomething()`.
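Taken together, the relevant parts of `config/app.php` would look roughly like this (a sketch showing only the two entries added above; your file will contain many more):

```php
<?php

// config/app.php (sketch: only the entries relevant to this package are shown)
return [

    'providers' => [
        // ... the framework's own service providers ...
        Ganey\Robots\RobotsServiceProvider::class,
    ],

    'aliases' => [
        // ... the framework's own aliases ...
        'Robots' => \Ganey\Robots\RobotsFacade::class,
    ],

];
```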
The quickest way to use Robots is to set up a callback-style route for `robots.txt` in your routes file (`app/Http/routes.php` in Laravel 5.0–5.2, or `routes/web.php` in 5.3+):
```php
<?php

Route::get('robots.txt', function () {
    // If on the live server, serve a nice, welcoming robots.txt.
    if (App::environment() == 'production') {
        Robots::addUserAgent('*');
        Robots::addSitemap('sitemap.xml');
    } else {
        // If you're on any other server, tell everyone to go away.
        Robots::addDisallow('*');
    }

    return Response::make(Robots::generate(), 200, array('Content-Type' => 'text/plain'));
});
```
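Assuming the class emits standard `robots.txt` directives, the production response generated above would look something like:

```
User-agent: *
Sitemap: sitemap.xml
```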
And that's it! You can serve different `robots.txt` content per environment, keeping it as simple or as detailed as you need.
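For instance, a more detailed production block might disallow specific paths. This is only a sketch: the `/admin` and `/private` paths are hypothetical, and it assumes repeated `addDisallow` calls accumulate rules:

```php
Robots::addUserAgent('*');
Robots::addDisallow('/admin');    // hypothetical path
Robots::addDisallow('/private');  // hypothetical path
Robots::addSitemap('sitemap.xml');
```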
License: MIT