Senior Data Engineer

Job not on LinkedIn

August 14

🇨🇴 Colombia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer


Ansira

B2B • Marketing • Enterprise

Ansira specializes in helping brands synchronize their distributed ecosystems through a comprehensive platform and professional services. With over 100 years of experience, Ansira empowers brands to coordinate their partner networks, drive efficiencies, and stimulate brand-to-local growth. The company offers a suite of services including media planning, strategic consultancy, and targeted marketing incentives, supporting over 300 brands across various industries. Ansira's solutions are designed to unleash creativity, motivate partner investments in marketing, and engage consumers with personalized campaigns. The company is a recognized leader in distributed growth, providing the tools and tactics brands need for effective promotion.

1001 - 5000 employees

🤝 B2B

🏢 Enterprise

💰 Private Equity Round on 2012-03

📋 Description

• Senior Data Pipeline Developer role on the Ansira Connect SaaS platform; optimize and enhance the data pipelines powering customer data management
• Join a cross-functional team developing, maintaining, and improving fast, scalable, cloud-native data solutions capable of processing millions of customer records in real time
• Build and monitor fast, secure, cost-effective data workflows that ingest customer information, including addresses, emails, and custom targeting fields, through real-time streaming pipelines
• Ensure data quality via deduplication, address standardization, and email validation; maintain reliability across varying data volumes (a minimal illustrative sketch follows this list)
• Collaborate with product, engineering, and other teams to enhance cloud-native solutions using Spring Cloud Data Flow, Kafka, CockroachDB, and Kubernetes
• Opportunities for learning, mentorship, and career growth while working in a fun, diverse team
• Participate in the full development life cycle: design, development, and implementation of large-scale distributed systems
• Design, develop, and maintain data pipeline processing functionality; extend data warehouses and data lakes with data from diverse sources
• Proactively identify data quality issues and recommend improvements
• Write scalable, tested code; contribute to documentation and knowledge sharing
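For context, the data-quality steps named above (streaming deduplication and email validation) often reduce to per-record filtering logic like the following minimal Java sketch. The `CustomerRecord` shape, field names, and regex are illustrative assumptions, not Ansira's actual implementation:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.regex.Pattern;

// Illustrative sketch only: the record shape, field names, and rules below
// are assumptions for demonstration, not Ansira's pipeline code.
public class CustomerRecordCleaner {

    // Hypothetical customer record as it might arrive off a streaming topic.
    record CustomerRecord(String customerId, String email, String address) {}

    // Simplified email check; production systems typically use a stricter
    // validator or an external verification service.
    private static final Pattern EMAIL =
            Pattern.compile("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");

    private final Set<String> seenIds = new HashSet<>();

    // Returns true if the record passes validation and has not been seen before.
    boolean accept(CustomerRecord r) {
        if (r.email() == null || !EMAIL.matcher(r.email()).matches()) {
            return false;                    // drop records with invalid emails
        }
        return seenIds.add(r.customerId());  // false => duplicate, filtered out
    }

    public static void main(String[] args) {
        var cleaner = new CustomerRecordCleaner();
        var batch = List.of(
            new CustomerRecord("c1", "ana@example.com", "Cra 7 # 12-34"),
            new CustomerRecord("c1", "ana@example.com", "Cra 7 # 12-34"), // duplicate
            new CustomerRecord("c2", "not-an-email", "Calle 10")          // invalid
        );
        batch.stream().filter(cleaner::accept)
             .forEach(r -> System.out.println("accepted: " + r));
    }
}
```

In a real Spring Cloud Data Flow / Kafka deployment the dedup state would live in a windowed state store rather than an unbounded in-memory set, but the filtering contract is the same.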

🎯 Requirements

• Bachelor's or Master's degree in computer science, computer engineering, statistics, math, a related field, or equivalent experience
• 5+ years of hands-on experience in application development using cloud technologies
• 5+ years of hands-on architecture experience: data pipelines and distributed computing engines
• 5+ years of hands-on experience developing and running ETL or ELT processes
• Expertise in using ETL or ELT to ingest data from diverse sources (RDBMS, NoSQL, REST APIs, flat files, streams, time-series data, proprietary formats)
• Expertise in designing and implementing pluggable, reusable platform components for data analytics and ingestion
• Expertise in consuming web services (REST, SOAP)
• Expertise in developing software involving caching, queuing, concurrency, and network programming
• Expertise in CI/CD and DevSecOps best practices
• Expertise in running workloads in containers (Docker or Kubernetes)
• Expertise in analyzing production workloads and developing strategies to run data systems at scale and with efficiency
• Proficiency in SQL/PL/SQL, data manipulation, and query development and optimization
• Proficiency in troubleshooting and resolving performance issues at the database and application levels
• Proficiency with flow charts, UML, or C4 models
• Proficiency with Unix and command-line tools
• Proficiency in Test-Driven Development (TDD) or automated testing, including unit, functional, stress, and load testing
• Proficiency in OWASP security principles and an understanding of accessibility and security compliance
• Competency in data security and data protection strategies
• Experience with the entire Software Development Life Cycle (SDLC) and Agile development, Scrum, or Extreme Programming methodologies
• A passion for solving problems and providing workable solutions, with the flexibility to learn new technologies that meet business needs
• Strong communication skills (English) and experience mentoring and educating peers
• Expertise in one or more programming languages such as Java, PHP, Python, or Go. Emphasis on Java (8+) and Python
• Expertise in one or more ETL/ELT tools such as Spring Cloud Data Flow, Google Dataflow, Apache Beam, or Apache Airflow. Emphasis on Spring Cloud Data Flow (Spring 4+ and Spring Boot 2+); see the processor sketch after this list
• Expertise in one or more version control systems such as Git, SVN, CVS, or Team Foundation Server. Emphasis on Git
• Expertise in one or more message-oriented middleware such as RabbitMQ, JMS, Kafka, or Pulsar. Emphasis on Apache Kafka
• Proficiency in one or more public cloud providers (AWS, Azure, GCP, etc.). Emphasis on Google Cloud Platform
• Proficiency in one or more cloud DWH platforms such as BigQuery, Snowflake, Redshift, Cloudera, or Azure Data Lake Store. Emphasis on BigQuery
• Proficiency in full-stack observability principles (tracing, metrics, logging) and one or more observability tools such as Apache SkyWalking, Prometheus, Grafana, Graylog, or Stackdriver
• Competency in one or more RDBMS such as PostgreSQL, MySQL, Oracle, or SQL Server. Emphasis on PostgreSQL
• Competency in developing queries and stored procedures in SQL, PL/SQL, or T-SQL
• Fluency in data visualization techniques using tools such as PLX Dashboards, Google Data Studio, Looker, Tableau, or similar technologies. Emphasis on Looker
• Fluency in distributed or NoSQL databases such as CockroachDB, MongoDB, Cassandra, Couchbase, DynamoDB, or Redis. Emphasis on CockroachDB
• Understanding of one or more large-scale data processing platforms such as Apache Spark, Apache Storm, Apache Flink, or Hadoop
• Understanding of cloud object storage such as S3 or GCS. Emphasis on GCS
• Understanding of HTML and JavaScript
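Because the stack centers on Spring Cloud Data Flow over Kafka, a stream processor in that model is typically just a `java.util.function.Function` bean in a Spring Boot application. Here is a minimal, hypothetical sketch of an address-standardizing processor, assuming the Kafka binder is on the classpath; the application name and normalization rule are invented for illustration:

```java
import java.util.function.Function;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Hypothetical Spring Cloud Stream processor; not Ansira's actual service.
@SpringBootApplication
public class AddressStandardizerApplication {

    public static void main(String[] args) {
        SpringApplication.run(AddressStandardizerApplication.class, args);
    }

    // With a binder on the classpath, this function is bound to input/output
    // destinations (standardize-in-0 / standardize-out-0 by default) and can
    // be registered as a processor app in Spring Cloud Data Flow.
    @Bean
    public Function<String, String> standardize() {
        // Toy normalization: trim, collapse whitespace, uppercase.
        return address -> address.trim().replaceAll("\\s+", " ").toUpperCase();
    }
}
```

Once packaged and registered, Spring Cloud Data Flow can compose an app like this into a `source | processor | sink` stream definition and deploy it to Kubernetes.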

🏖️ Benefits

• Opportunities for learning, mentorship, and career growth
• Collaborative and diverse team environment
• Exposure to cloud-native technologies and modern data platforms

Apply Now

Similar Jobs

August 9

Data Architect with a consultative mindset for a leading consulting firm in financial solutions. Design, model, and visualize data; 100% remote work with international clients.

🗣️🇪🇸 Spanish Required

AWS

Azure

Cloud

August 1

Topsort is seeking a Senior Data Engineer to build data pipelines and optimize data solutions.

🇨🇴 Colombia – Remote

💰 $8.5M Seed Round on 2022-03

⏰ Full Time

🟠 Senior

🚰 Data Engineer

Airflow

Apache

AWS

BigQuery

Cloud

ETL

Kafka

MySQL

Postgres

Python

Scala

Spark

SQL

July 31

Responsible for managing company data and developing efficient data solutions. Focus on scalable, maintainable solutions in remote technology sector.

🇨🇴 Colombia – Remote

💵 $6M - $8M / month

⏰ Full Time

🟠 Senior

🚰 Data Engineer

🗣️🇪🇸 Spanish Required

DynamoDB

Node.js

Python

SQL

July 30

Artefact

51 - 200 employees

As a Data Engineer at Artefact, work remotely to transform data strategies for clients.

🗣️🇪🇸 Spanish Required

June 28

Work for a US technology company as a Senior Data Engineer specializing in ETL, Node, and GCP. Join a team building reliable digital solutions for independent retailers.

🇨🇴 Colombia – Remote

⏰ Full Time

🟠 Senior

🚰 Data Engineer

BigQuery

Cloud

ETL

Google Cloud Platform

JavaScript

Kubernetes

Node.js
