Engineering Your Vision
Custom Software Development • Mobile App Development • Enterprise solutions • Team Extension • Dedicated Teams
51 - 200
April 5
• A biotechnology research project that earned recognition as a trusted provider of clinical genetic testing and an ideal collaborator for developing precision medicine solutions. Patients are central to the project's mission, driving partnerships with numerous non-profit organizations to support patients and healthcare professionals in their quest for answers. The main focus is clinical cancer diagnostics; however, the project is designed to expand into additional areas such as molecular diagnostic testing, cardiogenetics, and neurogenetics.
• Develop connectors for Kafka to streamline the syncing of updates from source data repositories.
• Establish partitioned Kafka topics to optimize the synchronization of updates to destination data marts.
• Leverage Apache Flink for crafting intricate data analytics workloads to enable real-time monitoring and transformations.
• Deploy dashboards using Datadog and CloudWatch to uphold system health and fulfil user needs.
• Institute schema registries to uphold data governance standards while catering to diverse data requirements.
• Collaborate closely with a West Coast-based scrum team, contributing to daily pull request submissions, code reviews, documentation maintenance, backlog management, and build validation across environments within sprint cycles lasting 2-4 weeks.
• Coordinate with other scrum teams to ensure coherence on data contracts, API specifications, and deployment timelines.
• Architect database schemas with a focus on query access patterns.
• Establish and manage CI/CD pipelines using infrastructure-as-code principles.
• Gradually migrate on-premises ETL jobs from PHP to AWS Flink and Glue processes.
• Work alongside QA Engineers to develop automated test suites.
• Engage with end-users to troubleshoot service interruptions and champion our data product offerings.
• Maintain vigilant oversight of data quality, promptly addressing discrepancies, latency issues, and defects.
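The partitioned-topic responsibility above hinges on one idea: all change events for a given record must land, in order, on the same partition, so downstream data marts apply updates in sequence. A minimal sketch of keyed routing — simplified in that Kafka's default partitioner actually uses a murmur2 hash rather than CRC32, and the record keys here are hypothetical:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Stable hash of the record key, modulo the partition count, so every
    # update for one entity is routed to the same partition (and therefore
    # arrives at the consumer in produced order).
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Route change events by primary key; events sharing a key share a partition.
events = [("record-42", "update"), ("record-7", "insert"), ("record-42", "delete")]
routed: dict[int, list[tuple[str, str]]] = {}
for key, op in events:
    routed.setdefault(partition_for(key, 6), []).append((key, op))
```

In a real producer this routing is handled by the Kafka client itself when a key is supplied; the sketch only shows why keying the messages preserves per-record ordering across a partitioned topic.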
• Proficiency in Apache Kafka (preferably the MSK flavour), Debezium, Python, Apache Flink or PySpark Streaming, MySQL (preferably RDS flavours), CDK or Terraform, Athena, Glue, Lambda, AppFlow, HANA/4, PHP, Redis, Docker, and JavaScript.
• At least 6 years of hands-on experience collaborating within professional scrum teams, or an equivalent educational background.
• A minimum of 3 years of practical experience in designing and indexing relational databases.
• At least 2 years of practical experience in constructing and managing real-time data streams.
• At least 1 year of experience in developing monitoring dashboards.
Nice to have:
• A Master's degree in computer science, data science, mathematics, or life sciences.
• A foundational grasp of genomic concepts and terminology.
• Flexibility in availability.
• Experience in constructing data APIs and providing Data as a Service.
• Proficiency in integrating with SaaS platforms such as SAP and Salesforce.
• Familiarity with PHP MVC frameworks such as Symfony, or readiness to acquire such skills.
• Knowledge of Atlassian products, including Jira, Confluence, and Bamboo.
• Proficiency in utilising system diagramming tools such as Miro, Lucidchart, or Visio.
• 36 paid absence days per year for the work-life balance of each specialist, plus 1 additional day for each following year of cooperation with the company
• Up to 10 unused absence days can be added to income after 12 months of cooperation
• Health insurance compensation
• Depreciation coverage for personal laptop usage for project needs
• Udemy courses of your choice
• Regular soft-skills training
• Excellence Centers meetups
Apply Now
February 3
501 - 1000
🇨🇴 Colombia – Remote
💰 $80M Private Equity Round on 2018-09
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer