Crawlab
Distributed web crawler management platform
Flexibility
Run web crawlers written in any programming language, including Python, Go, and Java, or built with web crawling frameworks such as Scrapy, Colly, and Selenium. A minimal example of such a spider is sketched below.
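As a sketch of the kind of crawler Crawlab can manage, here is a minimal Scrapy spider in Python. The spider name "quotes" and the target site quotes.toscrape.com are illustrative only; Crawlab itself simply runs whatever command the spider project defines (e.g. invoking Scrapy), so any equivalent crawler in Go or Java would work the same way.

```python
import scrapy


class QuotesSpider(scrapy.Spider):
    # Illustrative spider: name and start URL are placeholders, not Crawlab requirements.
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]

    def parse(self, response):
        # Extract quote text and author from each quote block on the page.
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow pagination links until there are no more pages.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse)
```

Run locally with `scrapy runspider spider.py -o quotes.json`; when uploaded to Crawlab, the platform schedules and executes the same command on its worker nodes.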
Scalability
Crawlab's distributed architecture allows users to easily manage hundreds of distributed spiders and execute million-scale crawling tasks.
Powerful Features
Crawlab provides powerful features such as Spider Auto-Deployment, Log Monitoring, Git Integration, an Online File Editor, and Notifications.