πŸ“½ Depictr

A middleware for rendering static pages when crawled by search engines.

πŸ’° Is this useful to you?

Consider sponsoring me on GitHub! πŸ™

πŸ’Ύ Installation

composer require juhlinus/depictr
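
Depictr works as HTTP middleware, so after installing you will typically attach it to your web routes. A minimal sketch for app/Http/Kernel.php, assuming the middleware class is Depictr\Middleware (the class name is an assumption; skip this step if the package's service provider registers it automatically):

protected $middlewareGroups = [
    'web' => [
        // ...
        \Depictr\Middleware::class, // assumed class name; verify against the package source
    ],
];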

πŸ“ Config

You can publish the config file by running the vendor:publish Artisan command like so:

php artisan vendor:publish --provider="Depictr\ServiceProvider"
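
This should copy the package configuration into your application's config directory (by Laravel convention, config/depictr.php).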

πŸ•· Crawlers

The following crawlers are defined out of the box:

return [
    'crawlers' => [
        /*
        |--------------------------------------------------------------------------
        | Search engines
        |--------------------------------------------------------------------------
        |
        | These are the regular search engines that crawl your website on a
        | regular basis; letting them render your pages is crucial if you
        | want good SEO.
        |
        */
        'googlebot',            // Google
        'duckduckbot',          // DuckDuckGo
        'bingbot',              // Bing
        'yahoo',                // Yahoo
        'yandexbot',            // Yandex

        /*
        |--------------------------------------------------------------------------
        | Social networks
        |--------------------------------------------------------------------------
        |
        | Allowing social networks to crawl your website helps them create
        | "social cards", which are what people see when your website is
        | linked on those networks.
        |
        */
        'facebookexternalhit',  // Facebook
        'twitterbot',           // Twitter
        'whatsapp',             // WhatsApp
        'linkedinbot',          // LinkedIn
        'slackbot',             // Slack

        /*
        |--------------------------------------------------------------------------
        | Other
        |--------------------------------------------------------------------------
        |
        | For posterity's sake, you want to make sure that your website can
        | be crawled by Alexa. This will archive your website so that future
        | generations may gaze upon your craftsmanship.
        |
        */
        'ia_archiver',          // Alexa
    ],
];
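
The entries above look like User-Agent substrings, so detection presumably works by matching them against the User-Agent header of incoming requests. Under that assumption, serving rendered pages to an additional crawler is just a matter of extending the list in your published config, for example:

return [
    'crawlers' => [
        // ... the defaults listed above ...
        'applebot',         // Apple (Siri and Spotlight suggestions)
    ],
];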

β›” Exclusion

Depictr comes with the option of excluding an array of URLs that shouldn't be processed.

This is useful for plain-text files like sitemap.txt, which Panther would otherwise wrap in a stripped-down HTML document. Wildcards are permitted.

By default, the admin route and its sub-routes are excluded in the config file.
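
A minimal sketch of what the exclusion list might look like in the published config (the 'excluded' key name and the exact wildcard syntax are assumptions based on the behaviour described above):

return [
    // ...
    'excluded' => [
        'admin*',       // the admin route and all of its sub-routes (the default)
        'sitemap.txt',  // plain-text files that should be served as-is
    ],
];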

🏞 Environments

You can specify which environments Depictr should run in. The default is testing and production.
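
A minimal sketch of the corresponding config entry (the 'environments' key name is an assumption; check the published config file for the actual key):

return [
    // ...
    'environments' => [
        'production',
        'testing',
    ],
];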