
[Bug]: TypeError: 'async_generator' object is not iterable when processing crawl requests with multiple URLs #1066

Open
@zinodynn

Description


crawl4ai version

0.6.0-r2

Expected Behavior

When I send a crawl request containing multiple URLs, the response should include a result for each URL.

Current Behavior

Instead, I encounter the following error:

Traceback (most recent call last):
  File "/workspace/deploy/docker/api.py", line 445, in handle_crawl_request
    "results": [result.model_dump() for result in results],
                                                  ^^^^^^^
TypeError: 'async_generator' object is not iterable
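The traceback shows a plain list comprehension iterating over `results`, which fails because with `"stream": true` the crawler returns an async generator, and those are not iterable with a regular `for`. A minimal sketch reproducing the error and the fix (the `crawl_results` generator here is a hypothetical stand-in for the streamed crawl results, not the actual crawl4ai API):

```python
import asyncio

# Hypothetical stand-in for the async generator returned in streaming
# mode; each yielded item mimics one crawl result.
async def crawl_results():
    for url in ["https://example.com", "https://www.google.com"]:
        yield {"url": url}

async def collect_streamed():
    # A plain list comprehension raises the reported error:
    #   [r for r in crawl_results()]
    #   -> TypeError: 'async_generator' object is not iterable
    # An async comprehension (or an `async for` loop) drains it correctly:
    return [r async for r in crawl_results()]

results = asyncio.run(collect_streamed())
print(len(results))  # 2
```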

This happens when I send a request with multiple URLs for a task, like this:

curl -X POST http://localhost:11235/crawl \
  -H "Content-Type: application/json" \
  -d '{"urls":["https://example.com","https://www.google.com"],"crawler_config":{"type":"CrawlerRunConfig","params":{"scraping_strategy":{"type":"WebScrapingStrategy","params":{}},"exclude_social_media_domains":["facebook.com","twitter.com","x.com","linkedin.com","instagram.com","pinterest.com","tiktok.com","snapchat.com","reddit.com"],"stream":true}}}'
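Since the request sets `"stream": true`, a defensive server-side handler could accept either result shape. A sketch of that idea, assuming only that streamed results arrive as an async generator and batch results as a list (`stream_results` and `normalize` are hypothetical names, not crawl4ai code):

```python
import asyncio
import inspect

# Hypothetical streamed results, standing in for the stream=True case.
async def stream_results():
    yield {"url": "https://example.com"}
    yield {"url": "https://www.google.com"}

async def normalize(results):
    # Accept either an async generator (streaming mode) or a plain
    # iterable (batch mode) and always return a list.
    if inspect.isasyncgen(results):
        return [r async for r in results]
    return list(results)

print(len(asyncio.run(normalize(stream_results()))))  # 2
print(len(asyncio.run(normalize([{"url": "https://example.com"}]))))  # 1
```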

Is this reproducible?

Yes

Inputs Causing the Bug

Steps to Reproduce

Code snippets

OS

Linux

Python version

3.12.10

Browser

No response

Browser version

No response

Error logs & Screenshots (if applicable)

No response

Metadata


Labels

🐞 Bug (Something isn't working), 📌 Root caused (identified the root cause of bug)
