ARCANEDEV / Robots by arcanedev

Robots.txt generator
Package Data
Maintainer Username: arcanedev
Maintainer Contact: arcanedev.maroc@gmail.com (ARCANEDEV)
Package Create Date: 2015-05-14
Package Last Update: 2015-10-23
Home Page:
Language: PHP
License: MIT
Last Refreshed: 2024-11-11 15:21:33
Package Statistics
Total Downloads: 600
Monthly Downloads: 3
Daily Downloads: 3
Total Stars: 7
Total Watchers: 4
Total Forks: 0
Total Open Issues: 0

Robots.txt Generator for PHP

By ARCANEDEV©

Features

  • Framework agnostic package.
  • Well documented & IDE Friendly.
  • Well tested with maximum code quality.
  • Laravel 4.2 supported.
  • Laravel 5 supported.
  • Made with :heart: & :coffee:.

Requirements

  • PHP >= 5.5.9

INSTALLATION

Composer

You can install this package via Composer. Add this to your composer.json:

{
    "require": {
        "arcanedev/robots": "~2.0"
    }
}

Then install it via composer install or composer update.
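
Alternatively, the same requirement can be added from the command line (a sketch assuming a global Composer installation):

composer require arcanedev/robots:~2.0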

Laravel

Setup

Once the package is installed, you can register the service provider in config/app.php in the providers array:

'providers' => [
    ...
    Arcanedev\Robots\RobotsServiceProvider::class,
],

And the facade in the aliases array:

'aliases' => [
    ...
    'Robots' => Arcanedev\Robots\Facades\Robots::class,
],

USAGE

Laravel

The quickest way to use Robots is to set up a callback-style route for robots.txt in your app/routes.php file.

Route::get('robots.txt', function() {

    // If on the live server, serve a nice, welcoming robots.txt.
    if (App::environment() == 'production') {
        Robots::addUserAgent('*');
        Robots::addSitemap('sitemap.xml');
    }
    else {
        // If you're on any other server, tell everyone to go away.
        Robots::addDisallow('*');
    }

    return Response::make(Robots::generate(), 200, ['Content-Type' => 'text/plain']);
});
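
Assuming the facade maps addUserAgent() and addSitemap() to the standard User-agent and Sitemap directives, the production branch above would return something along these lines:

User-agent: *
Sitemap: sitemap.xml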

Hard Coded

Add a rule to your .htaccess for robots.txt that points to a new script (or something else) that generates the robots.txt file.
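
On Apache, for example, the rewrite rule might look like the following minimal sketch (assuming mod_rewrite is enabled and the generator script is saved as robots.php, a hypothetical name):

RewriteEngine On
RewriteRule ^robots\.txt$ robots.php [L]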

The PHP script itself would look something like:

require_once __DIR__ . '/../vendor/autoload.php';

use Arcanedev\Robots\Robots;

$robots = new Robots;

// Allow Google to crawl, but keep it out of the private areas.
$robots->addUserAgent('Google');
$robots->addDisallow(['/admin/', '/login/', '/secret/']);
$robots->addSpacer();
$robots->addSitemap('sitemap.xml');

// robots.txt must be served as plain text.
header('HTTP/1.1 200 OK');
header('Content-Type: text/plain');

echo $robots->generate();
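
Assuming addDisallow() expands the array into one Disallow line per path and addSpacer() inserts a blank line, that script would emit something like:

User-agent: Google
Disallow: /admin/
Disallow: /login/
Disallow: /secret/

Sitemap: sitemap.xml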

TODOS

  • [ ] Documentation

DONE

  • [x] Framework agnostic package.
  • [x] Examples
  • [x] Laravel v4.2 Support.
  • [x] Laravel v5.0 Support.
  • [x] Laravel v5.1 Support.

Contribution

Any ideas are welcome. Feel free to submit issues or pull requests, but please check the contribution guidelines first.