- Bachelor's degree or higher in a quantitative/technical field such as Computer Science or Engineering.
- 3-6 years of data pipeline software development experience.
- Exceptional skills in at least one high-level programming language (Scala, Java, Go, Python, or equivalent).
- Active use of, and a strong understanding of, big data technologies such as Kafka, Spark, and the Databricks toolkit.
- Experience with dataflow orchestration tools such as Google Cloud Dataflow, Airflow, or Conductor.
- Experience with AWS services including EMR, S3, Redshift, and RDS.
- Understanding of the full lifecycle of an IP address, from IANA allocation to the end user (DNS, networking).
- Excellent communication skills to collaborate with cross-functional partners and independently drive projects and decisions.
- Previous experience working in distributed teams. We are a remote-first company!
- Fast hiring process: short interviews spread over a few days.
- Company computer and 500 USD for your home office.
- Creation of a personalized career plan.
- Permanent role with a long-term contract.
- Vacation plus "unlimited" days off based on performance.
- A leading and growing company in the industry.