Completed
Pull Request — master (#165), created by Brent at 07:57

Profile::shouldCrawlCallback() (rated A)

Complexity: Conditions 1, Paths 1
Size: Total Lines 4, Code Lines 2
Duplication: Lines 0, Ratio 0%
Importance: Changes 0
Metric   Value
dl       0
loc      4
rs       10
c        0
b        0
f        0
cc       1
eloc     2
nc       1
nop      1
<?php

namespace Spatie\Sitemap\Crawler;

use Spatie\Crawler\CrawlProfile;
use Psr\Http\Message\UriInterface;
use Spatie\Robots\Robots;

class Profile extends CrawlProfile
{
    /** @var callable */
    protected $profile;

    /** @var \Spatie\Robots\Robots */
    protected $robots;

    public function __construct()
    {
        $this->robots = Robots::create();
Documentation Bug introduced by this assignment: it seems like \Spatie\Robots\Robots::create() of type object<self> is incompatible with the declared type object<Spatie\Robots\Robots> of property $robots. Our type inference engine has found an assignment to a property that is incompatible with the declared type of that property. Either this assignment is in error, or the assigned type should be added to the documentation/type hint for that property.

    }

    public function shouldCrawlCallback(callable $callback)
    {
        $this->profile = $callback;
    }

    /*
     * Determine if the given url should be crawled.
     */
    public function shouldCrawl(UriInterface $url): bool
    {
        $mayIndex = config('sitemap.ignore_robots', false) || $this->robots->mayIndex($url);

        return $mayIndex && ($this->profile)($url);
    }
}
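
The documentation bug flagged above appears to come from Robots::create() being inferred as returning object<self>; since the call is made on \Spatie\Robots\Robots itself, self and the declared property type refer to the same class, so the warning looks like a false positive. If one wanted to satisfy the analyzer anyway, a minimal sketch (not part of this pull request) would be to make the concrete type explicit with an inline @var annotation on the assignment:

    public function __construct()
    {
        /** @var \Spatie\Robots\Robots $robots */
        $robots = Robots::create();

        $this->robots = $robots;
    }

Another route would be adjusting the return documentation of Robots::create() upstream in the spatie/robots package, assuming that is where the self type is declared.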
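
For context, here is a minimal usage sketch of this profile. The callback body, the example URL, and the use of GuzzleHttp\Psr7\Uri as the PSR-7 implementation are illustrative assumptions, not taken from the pull request. Note that shouldCrawl() invokes the registered callback unconditionally once the robots check passes, so a callback must be set via shouldCrawlCallback() before crawling starts:

    use GuzzleHttp\Psr7\Uri;
    use Psr\Http\Message\UriInterface;
    use Spatie\Sitemap\Crawler\Profile;

    $profile = new Profile();

    // Register the callback that shouldCrawl() will call for every discovered URL.
    $profile->shouldCrawlCallback(function (UriInterface $url): bool {
        // Illustrative rule: skip everything under /admin.
        return strpos($url->getPath(), '/admin') !== 0;
    });

    // The final answer combines the robots rules with the callback: the callback
    // only runs when the URL may be indexed, or when sitemap.ignore_robots is true.
    $allowed = $profile->shouldCrawl(new Uri('https://example.com/blog'));

Since shouldCrawl() uses Laravel's config() helper to read sitemap.ignore_robots, this sketch assumes it runs inside a Laravel application where the sitemap config is loaded.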