
Add some efficiencies to prevent unnecessary requests #42

Merged
merged 3 commits into spatie:main on Apr 19, 2024

Conversation

@tsjason (Contributor) commented on Apr 17, 2024

There are two changes in this PR that will help me use the package more efficiently.

  1. Allow Robots to be given a RobotsTxt object directly. I have access to the robots.txt file in memory and don't want to write it to disk first. With this change, I can create a RobotsTxt object from the string in memory and construct a Robots instance with it.

  2. Calling mayIndex() and mayFollowOn() on a Robots instance caused two requests to the remote server. This change pulls the file_get_contents() call up one level, so RobotsMeta and RobotsHeader no longer each make their own request.
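A rough sketch of the first change described above. RobotsTxt::create() is taken from the package's documented API; the exact signature Robots::create() gained in this PR (an optional RobotsTxt argument) is an assumption for illustration, not confirmed here:

```php
<?php

use Spatie\Robots\Robots;
use Spatie\Robots\RobotsTxt;

// robots.txt content already held in memory (e.g. fetched earlier),
// so there is no need to write it to disk first
$contents = "User-agent: *\nDisallow: /admin";

// Build a RobotsTxt object straight from the in-memory string
$robotsTxt = RobotsTxt::create($contents);

// Purely local check against the parsed rules, no network request
$robotsTxt->allows('/admin'); // disallowed by the rule above

// Assumed shape of the new capability: hand the prepared RobotsTxt
// to Robots directly instead of pointing it at a file or URL
$robots = Robots::create('', $robotsTxt);
```

The second change is internal: the single file_get_contents() response is shared with RobotsMeta and RobotsHeader, so one mayIndex()/mayFollowOn() round trip replaces two.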

@freekmurze freekmurze merged commit a676cac into spatie:main Apr 19, 2024
6 checks passed
@freekmurze (Member)

Thank you! Very nice!
