When I press Ctrl+C, newer versions of Crawlee pause the scraper and let the in-flight requests finish cleanly (like a graceful abort on the Platform). However, the outer CLI process (apify run) exits immediately, so I get the shell prompt back, and only then are the remaining output lines written, leaving a mess in my terminal.
Linux Mint, fish shell 3.7.1, apify-cli/0.20.6, node v20.5.1
When I run the scraper with npm run start, it works correctly: Crawlee's clean exit happens first, and only after that is the shell prompt shown in my terminal. I think the issue is with this SIGINT handler in the CLI.
Process tree while the scraper is running:
Process tree after Ctrl+C, before the scraper finishes (notice that the whole thing is no longer under the terminal/shell at all):
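For illustration, here is a minimal sketch of the behaviour I would expect from the wrapper. This is not the actual apify-cli code, and the child command (npm run start) is just a stand-in: the point is that the parent should not exit on SIGINT, but should keep owning the terminal until the child's graceful shutdown completes.

```ts
// Hypothetical sketch, not the real apify-cli implementation.
import { spawn } from 'node:child_process';

// Run the scraper as a child process, sharing the terminal.
const child = spawn('npm', ['run', 'start'], { stdio: 'inherit' });

// Ctrl+C sends SIGINT to the whole foreground process group, so the child
// already receives it and starts Crawlee's graceful shutdown on its own.
// The parent should only note the signal and keep waiting; calling
// process.exit() here is what hands the prompt back too early.
process.on('SIGINT', () => {
    console.error('Received SIGINT, waiting for the scraper to finish gracefully...');
});

// Only exit once the child has actually finished, propagating its exit code.
child.on('exit', (code, signal) => {
    process.exit(code ?? (signal ? 1 : 0));
});
```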
(Originally a Slack thread, https://apify.slack.com/archives/CD0SF6KD4/p1725443996011839; everything important from there is copied above.)