Senior Data Solutions Architect EU

August 5


Carbon60

Cloud Computing • Technology • Security

Carbon60 is a leading Canadian managed cloud services provider that specializes in simplifying cloud management for organizations. Its services include managed private, public, and edge clouds; cloud backup and disaster recovery; cloud security; and cloud migration. Carbon60 is recognized for its expertise in managing complex performance, security, and compliance requirements across the financial services, healthcare, public, and technology sectors. Its cloud solutions are designed to be secure, compliant, and optimized, backed by 24x7x365 support and focused on delivering reliable, flexible cloud-based solutions that empower clients' cloud journeys.

51 - 200 employees

🔐 Security

💰 Private Equity Round on 2019-01

📋 Description

• Provide deep technical expertise and leadership across a range of cloud data technologies.
• Lead the design and implementation of data platforms to meet customers' business requirements.
• Identify and communicate technical risks as they emerge over the course of a project.
• Lead teams of data engineers to execute project roadmaps.
• Maintain a close working relationship with the customer as a "Trusted Advisor".
• Establish credibility and build impactful relationships with customers.
• Design and deliver complex ETL/ELT processes, large-scale batch and real-time stream processing solutions, and optimized data models and schemas for Data Lakes.
• Build AI-based solutions for NLP, speech and video recognition, and anomaly and fraud detection.
• Deliver end-to-end data solutions spanning ingestion pipelines, cataloging, analysis, and sharing of insights.
• Lead migrations and transformations of data services such as relational databases, NoSQL databases, and Data Warehouses.

🎯 Requirements

• 5+ years of experience in public cloud environments (AWS preferred)
• 5+ years of experience developing code in Python, Java, or Scala
• Experience building complex data engineering pipelines and ETL/ELT tasks
• Experience with relational databases (for example Oracle, SQL Server, MySQL, Postgres, MariaDB) or managed cloud RDBMS services (such as Amazon Aurora)
• Experience with NoSQL databases such as MongoDB, Cassandra, CouchDB, Elasticsearch, Neo4j, DynamoDB, or Bigtable
• Experience building batch processing and real-time processing systems
• Experience building Data Warehouses and Data Lakes
• Experience with 3+ of the following:
  - Building real-time streaming solutions leveraging technologies such as Kafka, Kinesis, Pub/Sub messaging, event buses, or Spark Streaming
  - Designing and implementing analytics and BI solutions such as MicroStrategy, Tibco, QlikView, or Tableau
  - Building AI/ML solutions, including operations such as ongoing training, optimization, and model deployment
  - Designing, implementing, training, and optimizing ML and AI models
  - Building solutions leveraging caching systems (for example Redis) and object stores (for example Amazon S3, Amazon Glacier)
• Proven ability to discuss and design solutions using concepts such as data lineage, data quality gates, data anonymization, data governance, data security, data replication, data caching, data lifecycle management, data catalogs, and networking
• Understanding of the requirements and impacts of compliance frameworks such as HIPAA, PCI, GDPR, and PIPEDA
• Strong written and verbal communication and presentation skills
• Experience writing technical documentation
• Experience in technical mentoring
• Experience working in a customer-facing delivery role in a consulting or professional services environment

🏖️ Benefits

• Flexibility & time off
• Remote-first work environment
• Independent contractor engagement
