Scrapinghub: Cloud Backend Engineer (remote allowed)
Posted: Feb 20, 2019
About the job:
You’ll be working on the Platform team, building and maintaining our customer-facing application and the tools that make the world a better place for web crawler developers.
We have established products with product-market fit, where you’ll help grow the business and keep pace with market demands. At the same time, you’ll iterate quickly on testing new opportunities to help determine which are worth continued investment.
We’re a data-driven team that defines success by business results rather than completion of tasks. Finally, as a completely remote company with team members across many time zones, you’ll excel in this role as an independent thinker who can always find a way to move projects forward, even if you’re the only team member online at the time.
Responsibilities:
- Take ownership of projects and independently drive them from prototype to completion
- Build composable, reusable components for our complex SPA
- Design and improve the backbone of a large scale web crawling platform
- Strive to build easy-to-maintain systems and improve existing ones
- Be proactive in bringing forth new ideas and solutions to problems
- Be a strong team player and share knowledge freely and easily with your co-workers
- Write code carefully for critical and production environments
Requirements:
- Good knowledge of Python, MySQL, and HBase
- Backend web development experience using Django and Flask
- Experience with any distributed messaging system (RabbitMQ, Kafka, etc.)
- Strong knowledge of Linux & system programming
- Docker container basics
- Understanding of different ways of solving problems, and the ability to wisely choose between a quick hotfix, a long-term solution, or a design change
- Comfort with Git and team-based Git workflows
- Excellent communication skills, both written and verbal, in English
- Experience developing RESTful web APIs
- Experience with real-time communication in web apps
- Experience using Celery
- Asynchronous programming experience using Python (asyncio, Twisted, etc.)
- Familiarity with techniques and tools for crawling, extracting, and processing data, asynchronous communication, and distributed systems
- Familiarity with Apache Mesos, Kubernetes, RabbitMQ, Kafka, Zookeeper
Bonus points for:
- Experience working remotely or with a distributed team
- Experience with ASGI and Django Channels