yujiosaka / headless-chrome-crawler · Issue #376
Closed
Issue created Dec 18, 2020 by Tony Foster (@TheTFo)

Crawling site with maxDepth > 2 causes hang

I'm crawling a small site with maxDepth === 2, and the crawl completes fine. As soon as I bump it up to 3 or more, the crawler hangs. I don't see onError or onSuccess called, nor any errors. Watching traffic in Fiddler, I don't see any requests fire after the first batch. How should I troubleshoot this?

What is the current behavior? The crawler seems to hang with no error when maxDepth > 2, even though the site is rather small.

If the current behavior is a bug, please provide the steps to reproduce. Queue a small site with maxDepth > 2 and the crawl hangs; a minimal sketch of my setup is below.
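
Roughly (the URL here is a placeholder for the actual site, and the handlers are trimmed down):

```js
const HCCrawler = require('headless-chrome-crawler');

(async () => {
  const crawler = await HCCrawler.launch({
    // Neither of these fires once the crawl stalls
    onSuccess: result => console.log('crawled:', result.options.url),
    onError: error => console.error('error:', error),
  });
  // Completes fine with maxDepth: 2; hangs with 3 or more
  await crawler.queue({ url: 'https://example.com/', maxDepth: 3 });
  await crawler.onIdle(); // never resolves when the crawler hangs
  await crawler.close();
})();
```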

What is the expected behavior? The crawl should complete at any depth, just as it does with maxDepth === 2.

Please tell us about your environment:

  • Version: 1.8.0
  • Platform / OS version: macOS 11
  • Node.js version: 12.8.4