
Recruitment • Telecommunications • Finance
Parvana is an IT recruitment specialist, focusing on the fields of Software Development, Finance, and Telecommunications. With a presence in both South Africa and the United Kingdom, Parvana offers a global perspective in delivering personalized and efficient recruitment services. Their dedicated team of skilled professionals assists candidates with CV preparation, interview coaching, and career advice, ensuring a value-added experience in the job market.
August 13
This is a remote position.

About our client:
Our client is a software and data solutions provider serving a range of industries, offering plug-and-play products, customizable solutions, and both cloud-based and hosted deployments.

What you will be doing:
• Analyze complex customer data to determine integration needs.
• Develop and test scalable data integration and transformation pipelines using PySpark, SparkSQL, and Python (illustrated in the short sketch below).
• Contribute to the codebase through coding, code reviews, validation, and complex transformation logic.
• Automate and maintain data validation and quality checks.
• Collaborate with FPA, data engineers, and developers to align solutions with financial reporting and business objectives.
• Participate in solution architecture and technical discussions, refining user stories and acceptance criteria.
• Use modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks).
• Partner with the product team to ensure customer data is accurately reflected and provide feedback based on data insights.

What our client is looking for:
• A Data Analytics Engineer with 5+ years of experience.
• Strong Python, PySpark, Notebook, and SQL coding skills, especially with Databricks and Delta Lake.
• Proven ability to build and deploy scalable ETL pipelines to cloud production environments using CI/CD.
• Experience with Agile/Scrum, data quality concepts, and excellent communication is essential.
• Cloud environment (Azure, AWS) and Infrastructure as Code (Terraform, Pulumi) experience is beneficial.
• Telecoms industry or consulting experience, plus accounting knowledge, is a plus.
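Purely as an illustrative sketch of the kind of pipeline work described above, the snippet below shows a PySpark aggregation with a simple data-quality gate and a Delta Lake write. Every path, table, and column name here is a hypothetical placeholder, not something specified by the client.

```python
# Minimal sketch: transform raw usage events, validate, and write to Delta Lake.
# All paths, schemas, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("usage-pipeline-sketch").getOrCreate()

# Read raw events stored as Parquet (illustrative path only).
raw = spark.read.parquet("s3://example-bucket/raw/usage_events/")

# Transform: aggregate usage per customer per day.
daily_usage = (
    raw
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("customer_id", "event_date")
    .agg(
        F.sum("data_mb").alias("total_data_mb"),
        F.count("*").alias("event_count"),
    )
)

# Simple data-quality check: fail fast if any row has a null customer
# or a negative usage total before anything is written downstream.
bad_rows = daily_usage.filter(
    F.col("customer_id").isNull() | (F.col("total_data_mb") < 0)
).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} invalid rows")

# Persist the validated result as a Delta table
# (assumes delta-spark is available on the cluster, e.g. Databricks).
daily_usage.write.format("delta").mode("overwrite").save(
    "s3://example-bucket/curated/daily_usage/"
)
```

In practice a pipeline like this would typically run as a Databricks notebook or job and be promoted to production through a CI/CD workflow, in line with the requirements listed above.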
Apply Now