Senior Data Engineer, DevX


10 hours ago


Basis Technologies

Advertising • SaaS • Artificial Intelligence

Basis Technologies is a company focused on streamlining and innovating the digital advertising industry. They offer a comprehensive platform that integrates and automates various aspects of digital advertising, including demand-side platforms (DSP), cross-device targeting, brand safety, fraud protection, and private marketplaces. The platform is designed to reduce operational complexity and manual effort by consolidating disparate systems and automating workflows. Basis Technologies also provides services such as media strategy, activation, consulting, and onboarding, along with education programs through their AdTech Academy. They emphasize efficiency, productivity, and collaboration within advertising operations, catering to enterprises looking for effective programmatic advertising solutions.

501 - 1000 employees

☁️ SaaS

🤖 Artificial Intelligence

💰 $25M Private Equity Round on 2021-04

📋 Description

• Integrate diverse data sources and vendor products, including databases, APIs, and third-party services, to make data available for analytical and operational use.

• Automate repetitive and complex ETL deployment tasks to improve team efficiency and reduce manual intervention.

• Develop, deploy, and maintain scalable data pipelines for large volumes of data.

• Implement monitoring solutions to track data pipeline performance and quickly identify and resolve issues.

• Conduct code reviews to ensure adherence to best practices and standards.

• Participate in the team's support rotation.

• Adhere to the data team's established development and CI/CD practices.

🎯 Requirements

• Strong computer science fundamentals with 5+ years of relevant experience

• Strong programming skills in Python or Java

• Proficiency with build management tools (Bazel)

• Proficiency with SQL and relational databases

• Proficiency with Snowflake and its services (development, performance tuning)

• Proficiency with workflow orchestration engines (Apache Airflow, Argo Workflows)

• Proficiency with CI/CD engines (Harness, CircleCI, Argo CD, Jenkins)

• Proficiency with at least one database versioning tool (Flyway, Sqitch, or similar)

• Proficiency with source control systems (Git)

• Experience with containerization and orchestration tools (Docker, Kubernetes)

• Experience developing for BI tools (Looker, ThoughtSpot, Power BI)

• Experience working independently and managing multiple tasks

• Ability to lead through technical excellence

• Commitment to creating inclusive, respectful environments where all voices are valued and supported

• A thoughtful approach to collaboration, design, and decision-making that prioritizes equity, access, and continuous learning

🏖️ Benefits

• Flexible work week

• 401k/RRSP matching

• Mental health support

• Paid sabbaticals

• Generous parental leave

• Flexible work options


Similar Jobs

19 hours ago

Software Engineer III responsible for designing and supporting AWS serverless applications at Modivcare. Collaborating with product teams on data engineering solutions in the healthcare industry.

AWS • Cloud • DynamoDB • ETL • NoSQL • Python • SQL

20 hours ago

Senior Data Engineer developing data infrastructure for real-time data ingestion and visualization. Leveraging cloud platforms to create impactful dashboards and support business intelligence.

JavaScript • Python • SQL

20 hours ago

Data Engineer designing, building, and maintaining data processing architectures for Humana. Collaborating to transform data into insights and enabling new capabilities for the Medicaid business.

Azure • Cloud • Oracle • PySpark • SQL

22 hours ago

Manager, Data Engineering leading Keeper Security's Data Engineering team to architect and maintain data infrastructures and pipelines. Collaborating closely with data scientists and business stakeholders.

Amazon Redshift • AWS • Cloud • NoSQL • Python • SQL • Tableau

Yesterday

Data Engineer assisting in the execution of data engineering projects on Stryker's Customer Intelligence team. Involves building ETL pipelines and working with cross-functional teams.

AWS • Azure • Cloud • ETL • Google Cloud Platform • Python • Spark • SQL

Built by Lior Neu-ner. I'd love to hear your feedback — Get in touch via DM or support@remoterocketship.com