All requests from the queue have been processed, the crawler will shut down.
# crawlee-js
m
I'm working on a news web crawler and I set `purgeOnStart=false` so that I don't scrape duplicate news. However, sometimes I get the message "All requests from the queue have been processed, the crawler will shut down." and the crawler doesn't run. Any suggestions for fixing this?
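For context, a minimal sketch of the setup being described, assuming a `CheerioCrawler` and Crawlee's `Configuration` API; the crawler class and start URL are placeholders, not from the conversation:

```ts
import { CheerioCrawler, Configuration } from 'crawlee';

// purgeOnStart: false keeps the stored request queue between runs,
// so URLs handled in a previous run are not scraped again.
const config = new Configuration({ purgeOnStart: false });

const crawler = new CheerioCrawler({
    async requestHandler({ request, $ }) {
        console.log(`${request.url}: ${$('title').text()}`);
    },
}, config);

await crawler.run(['https://example.com/news']); // hypothetical start URL
```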
h
The message means that all requests in your requestQueue have already been handled, so there is no point in processing them again.
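One way to confirm this is to inspect the queue's counters before running. A hedged sketch, assuming `RequestQueue.getInfo()` returning count fields, as in recent Crawlee versions:

```ts
import { RequestQueue } from 'crawlee';

const queue = await RequestQueue.open();
const info = await queue.getInfo();

// When handledRequestCount equals totalRequestCount, the crawler has
// nothing left to do and shuts down right after starting.
console.log(`handled ${info?.handledRequestCount} of ${info?.totalRequestCount} requests`);
```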
m
Yes, I know that, but how do I add more URLs to the requestQueue? The domains I'm collecting data from have more news.
h
You can add the same URL, but then you need to specify a different uniqueKey for each request. By default, the uniqueKey is the same as the URL.
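Something like the following, assuming Crawlee's `crawler.addRequests()`; the URL and the date-based uniqueKey are just illustrative choices for making a revisit count as a new request:

```ts
// The queue deduplicates on uniqueKey, which defaults to the normalized URL.
// Giving a revisit a fresh uniqueKey makes it count as a new request.
await crawler.addRequests([
    {
        url: 'https://example.com/news', // hypothetical URL
        uniqueKey: `https://example.com/news#${new Date().toISOString().slice(0, 10)}`, // e.g. one visit per day
    },
]);
```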
m
Alright, I'll try that.
m
Thanks