Describe the bug
I'm watching the SDK crawl the website from my previous bug report, and due to the size of the site it's pulling an awful lot of data back down. There are 54 pages, and polling for status every second is causing an explosion in memory use.
It feels like there needs to be a backoff algorithm based on the number of items the response says are being crawled, and/or I need to be able to pass in a different wait interval so I can better control the timing.
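For illustration, here is a minimal sketch of what a caller-side backoff could look like. The `CrawlStatus` record and the `checkStatus` delegate are hypothetical stand-ins, not the SDK's actual types; the real status call and response shape would come from the Firecrawl .NET SDK.

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical status shape; the SDK's real response type may differ.
public record CrawlStatus(bool Completed, int Total, int CompletedCount);

public static class CrawlPoller
{
    // Poll a crawl job, widening the interval as the job turns out to be large.
    // checkStatus is a placeholder for whatever status call the SDK exposes.
    public static async Task<CrawlStatus> WaitForCrawlAsync(
        Func<Task<CrawlStatus>> checkStatus,
        TimeSpan? initialInterval = null,
        TimeSpan? maxInterval = null)
    {
        var interval = initialInterval ?? TimeSpan.FromSeconds(1);
        var cap = maxInterval ?? TimeSpan.FromSeconds(30);

        while (true)
        {
            var status = await checkStatus();
            if (status.Completed)
                return status;

            var pending = Math.Max(1, status.Total - status.CompletedCount);

            // Grow the interval exponentially, floor it by a heuristic based on
            // how many pages are still outstanding, and cap it. A 54-page crawl
            // ends up polling far less often than a 2-page one.
            var grown = interval.TotalSeconds * 1.5;
            var floor = pending / 10.0; // hypothetical heuristic: ~1s per 10 pending pages
            interval = TimeSpan.FromSeconds(Math.Min(cap.TotalSeconds, Math.Max(grown, floor)));

            await Task.Delay(interval);
        }
    }
}
```

Exposing `initialInterval` and `maxInterval` as optional parameters would cover the "let me control the timing" half of the request while keeping the current one-second default for small crawls.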
Steps to reproduce the bug
Use the .NET sample code to crawl https://burnrate.io.
Use Fiddler to watch how many times the Firecrawl API is hit and how much data comes back.
Expected behavior
I would expect either a measure of control over the refresh interval, and/or a backoff algorithm based on the number of items we're ultimately waiting for.
Screenshots
No response
NuGet package version
Latest
Additional context
No response