Data Engineer


Conduent

B2B • eCommerce • Government

Conduent is a leading provider of technology-led solutions aimed at enhancing customer experiences and improving operational efficiency for businesses and government agencies. The company offers a wide range of services, including customer experience management, finance and accounting solutions, human capital management, integrated digital solutions, and specialized services for healthcare and public sector clients. By leveraging automation and analytics, Conduent helps organizations streamline processes and drive business success.

10,000+ employees

Founded 2017


💰 Venture Round on 2009-01

📋 Description

• Engineer robust Azure Data Factory pipelines using Linked Services and Datasets to ingest high-volume, diverse data from disparate source systems into Azure Blob Storage

• Design and implement scalable data pipelines using Azure Databricks and Azure Data Factory, adhering to Medallion architecture principles to ensure data integrity, consistency, and performance

• Develop complex transformation logic using PySpark (Python) and user-defined functions (UDFs) within Azure Databricks to design and implement data processing and ETL workflows, enabling efficient transformation of heterogeneous source data into an integrated Benefit and Eligibility system

• Create modular and reusable notebooks in Azure Databricks using functions, parameters, and widgets for dynamic data processing, along with Python libraries such as pandas, NumPy, PySpark SQL, and SQLAlchemy

• Develop highly optimized Databricks notebooks using PySpark (Python and SQL) for advanced joining, filtering, pre-aggregation, and processing of large data sets

• Work with SQL Server and Oracle, writing optimized queries with CTEs, ANSI SQL, window functions, views, materialized views, stored procedures, and Oracle-specific syntax to perform data profiling, validation, balancing, and reconciliation, ensuring data accuracy (data quality) and completeness

• Solve complex data engineering challenges, including Delta merges over hundreds of millions of records, deduplication, performance tuning, and improving ETL run-time efficiency with partitioning strategies, Z-Ordering, and similar techniques

• Work closely with the client and business analysts to define and assess complex data conversion and data integration requirements, translating business needs into technical solutions
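The "Delta merge with deduplication" responsibility above can be sketched in miniature. The snippet below is a plain-Python illustration (no Spark required) of the upsert semantics behind a Delta MERGE: incoming records are first deduplicated per key, keeping the latest by timestamp, then merged into the target (update if the key exists, insert if not). All record fields and function names here are hypothetical, chosen for illustration only — they are not from the posting.

```python
def dedupe_latest(records, key="id", ts="updated_at"):
    """Keep only the most recent record per key (dedup before merge)."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())


def merge_upsert(target, incoming, key="id", ts="updated_at"):
    """Upsert deduplicated incoming records into target, keyed by `key`."""
    merged = {rec[key]: rec for rec in target}
    for rec in dedupe_latest(incoming, key, ts):
        merged[rec[key]] = rec  # matched -> update; not matched -> insert
    return sorted(merged.values(), key=lambda r: r[key])


target = [{"id": 1, "updated_at": 1, "val": "a"}]
incoming = [
    {"id": 1, "updated_at": 3, "val": "b"},
    {"id": 1, "updated_at": 2, "val": "stale"},  # older duplicate, dropped
    {"id": 2, "updated_at": 1, "val": "c"},
]
print(merge_upsert(target, incoming))
```

In Databricks this same logic would typically be expressed as a Delta Lake `MERGE INTO` after a window-function dedup, with the engine handling it at scale across partitions.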

🎯 Requirements

• Bachelor's degree in Computer Science or a related field

• 10+ years of total experience as a data analyst/engineer

• Federal or state government experience

• 2 years of experience with Health and Human Services programs (SNAP and TANF)

🏖️ Benefits

• Health insurance coverage

• Voluntary dental and vision programs

• Life and disability insurance

• Retirement savings plan

• Paid holidays

• Paid time off (PTO), vacation, or sick time

• Flexible working conditions

