Make crawler process run continuously in Docker #2

@jfalken

Description

The Docker container was initially created for ease of use in trial/demo scenarios. After some discussion, it turns out people want to use it in their actual deployments, rather than running the base Python scripts directly.

This works fine as is; however, the current crawler process runs only once. When it completes, supervisord will try to restart it, but the resulting behavior is undefined: it depends on whether you still have GitHub API credits remaining for that hour and on how many restarts have already occurred.

This feature request is to add the ability for the Docker container version of this script to re-run the crawler every X hours, at an interval specified by the user.
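One possible approach (a sketch only, not the project's actual implementation): wrap the crawler in a long-lived loop so that supervisord manages a single process that never exits, instead of repeatedly restarting a one-shot script. The entry point `crawler.py` and the `CRAWL_INTERVAL_HOURS` environment variable below are hypothetical placeholders.

```python
import os
import subprocess
import time


def get_interval_seconds(default_hours: float = 6.0) -> float:
    """Read the re-run interval from a hypothetical CRAWL_INTERVAL_HOURS
    environment variable, falling back to a default."""
    hours = float(os.environ.get("CRAWL_INTERVAL_HOURS", default_hours))
    return hours * 3600


def run_forever() -> None:
    """Run the crawler, sleep for the configured interval, repeat --
    so supervisord sees one long-lived process instead of a script
    that exits after a single crawl."""
    while True:
        # "crawler.py" is a placeholder for the real crawler entry point.
        # check=False: a failed crawl (e.g. exhausted API credits) should
        # not kill the loop; we simply wait and try again next cycle.
        subprocess.run(["python", "crawler.py"], check=False)
        time.sleep(get_interval_seconds())


if __name__ == "__main__":
    run_forever()
```

Sleeping between runs (rather than letting supervisord restart a crashed process immediately) also gives the hourly GitHub API rate limit time to reset before the next crawl.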
