Senior Data Engineer

March 19


Tkxel

Artificial Intelligence • Cloud Engineering • SaaS

Tkxel is a technology company that specializes in transforming businesses through innovative software solutions and services. They offer a comprehensive range of services including Digital Transformation, AI Software Development, Cloud Engineering, Application Development, and UI/UX Design. With over 15 years of expertise and a dedicated team, Tkxel aims to enhance operational efficiency for clients across various sectors by leveraging the latest technologies such as Machine Learning, Generative AI, and Cybersecurity.

501 - 1000 employees

Founded 2008

🤖 Artificial Intelligence

☁️ SaaS

📋 Description

• This is a remote position.
• Data Pipeline Development: Builds an understanding of the client's data needs and designs, constructs, installs, tests, and maintains highly scalable data management systems using the Microsoft Fabric suite (Azure Data Factory, Azure Synapse Analytics, etc.) and other relevant technologies to meet those needs efficiently and effectively.
• Data Availability: Performs data availability assessments and implements robust processes to ensure the timely identification, collection, and validation of the data sets the client requires.
• ETL Processes: Develops ETL processes to extract data from various sources, transform it according to business rules, and load it into a centralized data repository, ensuring data accuracy and availability.
• Data Lake: Implements and manages data storage solutions using Azure OneLake and ensures an optimal data storage architecture for ease of access and analysis.
• Data Integration: Integrates data from various business systems into a unified data platform, enabling a consolidated view of information across the organization.
• Data Quality and Governance: Ensures data accuracy and quality by implementing data governance and quality control measures, including data validation and cleansing.
• Performance Optimization: Monitors, tunes, and reports on the performance of data pipelines and databases to ensure they meet functional and performance requirements.
• Security and Compliance: Implements security measures to protect data integrity and ensure compliance with data protection regulations and company policies.
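The ETL responsibilities above follow the standard extract → transform (validate/cleanse) → load pattern. As a rough illustration only — not part of the posting — here is a minimal, library-agnostic Python sketch of that flow; the field names (customer_id, amount) and in-memory source/target are hypothetical stand-ins for the Azure services named above.

```python
from datetime import date

def extract(rows):
    """Extract: pull raw records from a source system (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Transform: apply business rules and validate/cleanse each record."""
    cleaned = []
    for row in rows:
        # Data quality check: drop records missing the required key.
        if not row.get("customer_id"):
            continue
        cleaned.append({
            "customer_id": row["customer_id"],
            "amount": round(float(row.get("amount", 0)), 2),  # normalize to 2 decimals
            "load_date": date.today().isoformat(),            # audit column
        })
    return cleaned

def load(rows, target):
    """Load: append validated records to a centralized repository (here, a list)."""
    target.extend(rows)
    return len(rows)

# Run the pipeline end to end.
source = [
    {"customer_id": "C1", "amount": "19.999"},
    {"customer_id": None, "amount": "5.0"},  # fails validation, dropped
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

In a Fabric or Data Factory setting, each stage would map to a pipeline activity or PySpark job rather than plain functions, but the validation-before-load structure is the same.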

🎯 Requirements

• Azure Data Factory
• Azure Synapse Analytics
• Azure Data Lake or Apache Delta Lake
• Apache Spark or Databricks
• Knowledge of implementing Apache Spark jobs using PySpark
• Knowledge of working with Azure Blob Storage
• ETL development
• SQL and data modeling
• Data quality management
• Security practices
• Performance tuning
• Testing and troubleshooting
• Clear notes and documentation


Similar Jobs

March 11

Experienced Data Engineer involved in migrating data infrastructure to Snowflake and building data pipelines. Collaborating with cross-functional teams to ensure data quality and implementing CI/CD processes for deployments.

AWS • Azure • Cloud • ETL • Oracle • SQL

March 10

Develop data models and ETL pipelines for IRS initiatives using Palantir technology. Collaborate with federal stakeholders on high-impact projects.

ETL • Node.js • Python • SQL • Tableau

March 10

Remote Data Architect role at Pivotal Solutions. Focus on designing ETL processes with AWS and data analytics.

Airflow • Amazon Redshift • AWS • Azure • Cloud • ETL • Google Cloud Platform • Java • MS SQL Server • Node.js • Python • Scala • Spark • SQL

February 23

Join an international organization as a Senior Azure Data Engineer, working remotely in IT Services.

Azure • Cloud • Node.js • SQL

February 19

Seeking (Senior) Azure Data Engineer for an international organization in New York. Remote role requiring Azure Cloud expertise.

Azure • Cloud • ETL • Node.js • SQL
