Cannot set timeout for each URL when batch import with parameter -L #1139
Comments
This is intended and that's how the feature works. I will consider adding something like
When will v0.4.4 be released? The --max-target-time parameter is needed.
I'm currently having exams, so I will probably start working on this in late December or January. Sorry for having to delay it.
This occurs during batch scanning, so could the URL be automatically discarded when it gets stuck? One target raised `RecursionError: maximum recursion depth exceeded while calling a Python object` and the scan stayed stuck until the program exited.
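As a general pattern (a sketch, not dirsearch's actual code), isolating each target in its own try/except lets a batch continue past a crashing URL instead of staying stuck; `scan_batch_skipping_failures` and `scan` are hypothetical names used for illustration:

```python
def scan_batch_skipping_failures(urls, scan):
    """Isolate each target so one crashing URL cannot stall the batch."""
    results = {}
    for url in urls:
        try:
            results[url] = scan(url)
        except RecursionError:
            # e.g. a pathological response blowing the stack deep in a
            # library call; record the failure and move on to the next target.
            results[url] = "skipped (RecursionError)"
    return results
```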
@huan-cdm I don't think it's an issue with dirsearch but with the difflib library. It isn't related to this GitHub issue either; can you please open another GitHub issue for it so I can take a look later? Thanks!
OK, thank you.
What is the feature?
Cannot set timeout for each URL when batch import with parameter -L
What is the use case?
If the --max-time timeout parameter and the -l batch parameter are used together, the first URL's scan consumes the entire time budget when it times out, and the remaining URLs are never scanned.