It looks like the scan_level parameter was responsible for this earlier, but it has been deprecated. What can I do to limit the depth of the crawl?
The crawl-depth feature was removed because it was incompatible with the new algorithm. You can try subclassing the crawler class and see what happens.
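The thread doesn't show the library's internal crawler API, so here is only a minimal sketch of the depth-limiting logic you would inject when subclassing: track each URL's depth and stop following links past a cutoff. The `crawl_limited` function, the `get_links` callback, and the in-memory `site` graph are all hypothetical stand-ins, not part of the library.

```python
from collections import deque

def crawl_limited(start, get_links, max_depth):
    """Breadth-first crawl that stops following links past max_depth.

    A subclassed crawler could apply the same check before scheduling
    any link it discovers (assumption: the real class exposes such a hook).
    """
    seen = {start}
    queue = deque([(start, 0)])  # (url, depth) pairs
    order = []
    while queue:
        url, depth = queue.popleft()
        order.append(url)
        if depth >= max_depth:
            continue  # at the limit: save this page, but follow no links
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return order

# In-memory link graph standing in for real pages (hypothetical data).
site = {
    "/": ["/a", "/b"],
    "/a": ["/a/x"],
    "/b": [],
    "/a/x": ["/a/x/deep"],
    "/a/x/deep": [],
}
print(crawl_limited("/", site.__getitem__, 1))  # → ['/', '/a', '/b']
```

With `max_depth=1` only the start page and its direct links are visited; raising the limit to 2 also reaches `/a/x` but still prunes `/a/x/deep`.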
What does that mean, @rajatomar788? Mirroring a whole website, like wget does, is the main feature.