
Crawling site with maxDepth > 2 causes hang #376

Open
@TheTFo

Description


I'm crawling a small site with maxDepth === 2, and things crawl fine. As soon as I bump it up to 3 or more, the crawler hangs. I don't see onError or onSuccess called, or any errors. Looking through Fiddler, I don't see any requests firing aside from the first batch. How should I troubleshoot this?

What is the current behavior?
The crawler hangs with no error when maxDepth > 2, even though it's a rather small site.

If the current behavior is a bug, please provide the steps to reproduce
Queue a site with maxDepth > 2; the crawl hangs after the first batch of requests.
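
A minimal reproduction sketch of the kind of setup described, assuming the headless-chrome-crawler API (the library isn't named above, so the require, callbacks, and the placeholder URL are assumptions):

```js
const HCCrawler = require('headless-chrome-crawler');

(async () => {
  const crawler = await HCCrawler.launch({
    // Neither callback fires once maxDepth is raised above 2.
    onSuccess: result => console.log('success:', result.options.url),
    onError: error => console.error('error:', error),
  });

  // https://example.com/ stands in for the actual (small) site.
  await crawler.queue({ url: 'https://example.com/', maxDepth: 3 });

  await crawler.onIdle(); // Never resolves when the crawl hangs.
  await crawler.close();
})();
```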

What is the expected behavior?
The crawl should complete regardless of the maxDepth value.

Please tell us about your environment:

  • Version: 1.8.0
  • Platform / OS version: MacOS 11
  • Node.js version: 12.8.4
