Buy. Sell. Simple.
C2C marketplace • mobile commerce • e-commerce • local commerce • engineering
201–500 employees
💰 Secondary Market on 2021-08
April 12
Airflow
Android
AWS
Cloud
Distributed Systems
GCP
Hadoop
HBase
iOS
Kafka
Open Source
PagerDuty
Python
SQL
Terraform
• Design and develop applications to process large amounts of critical information to power analytics and user-facing features.
• Monitor and resolve data pipeline and data integrity issues.
• Work across multiple teams to understand their data needs.
• Maintain and expand our data infrastructure.
• 3+ years of professional software development experience
• Strong background in distributed systems for large-scale data processing
• Ability to communicate technical information effectively to technical and non-technical audiences
• Proficiency in SQL and Python
• Experience leveraging open source data infrastructure projects such as Airflow, Kafka, Avro, Parquet, Hadoop, Hive, HBase, Presto, or Druid
• Experience building scalable data pipelines and real-time data streams
• Experience building software on AWS or a similar cloud environment
• Experience with AWS services such as Kinesis, Firehose, Lambda, and SageMaker is a big plus
• Experience with GCP services such as BigQuery and Cloud Functions is a big plus
• Experience with operational tools such as Terraform, Datadog, and PagerDuty is a big plus
• Computer Science or Engineering degree required; master's degree preferred
• Excellent written and spoken communication skills (fluency in English required)
• Excellent compensation package
• Opportunity to work on cutting-edge data infrastructure projects
• Work with a dynamic and innovative team
• Opportunity to build scalable data pipelines and real-time data streams