[Feature Request]: Add max_scroll_steps parameter to CrawlerRunConfig #1117
ruanjunmin started this conversation in Feature requests
Replies: 1 comment 1 reply
I have implemented this feature, and it is in the
What needs to be done?
Proposed Solution:
```
class CrawlerRunConfig:
    def __init__(
        self,
        ...,
        max_scroll_steps: Optional[int] = None,  # New parameter
        scroll_delay: float = 0.2,               # Existing
    ):
        """
        max_scroll_steps: Maximum scroll operations (None = unlimited)
        scroll_delay: Existing delay between scrolls
        """
```
Validation Approach:
Can reuse the session reuse mechanism and multi-step interaction pattern already proven in:
```
async with AsyncWebCrawler() as crawler:
    await crawler.arun(config=CrawlerRunConfig(max_scroll_steps=3))  # Test case
```
What problem does this solve?
Documentation Alignment:
The change will finally implement what is already documented in the Lazy-Loading section:
"Heavier Pages: Adjust scroll_delay or the max scroll steps as needed"
Target users/beneficiaries
No response
Current alternatives/workarounds
No response
Proposed approach
No response
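As a rough illustration of the proposal (this is not the actual crawl4ai implementation; `scroll_with_cap` and `scroll_once` are hypothetical names for this sketch), the capped scroll loop could look like:

```python
import asyncio
from typing import Awaitable, Callable, Optional

async def scroll_with_cap(
    scroll_once: Callable[[], Awaitable[bool]],
    max_scroll_steps: Optional[int] = None,
    scroll_delay: float = 0.2,
) -> int:
    """Scroll until scroll_once() reports no new content, or until
    max_scroll_steps scrolls have been performed (None = unlimited)."""
    steps = 0
    while max_scroll_steps is None or steps < max_scroll_steps:
        steps += 1
        if not await scroll_once():  # page stopped growing
            break
        await asyncio.sleep(scroll_delay)
    return steps

# Fake infinite-scroll page: content "never stops loading".
async def infinite_scroll() -> bool:
    return True

print(asyncio.run(scroll_with_cap(infinite_scroll, max_scroll_steps=3, scroll_delay=0)))  # prints 3
```

With `max_scroll_steps=None` the loop keeps the current unlimited behavior, so existing users are unaffected; the cap only matters on pages that would otherwise scroll indefinitely.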