
B2B • Cybersecurity • SaaS
SYMVOS® is a leading provider of global people and process solutions, focused on improving business operations through digital transformation. The company emphasizes innovation, diversity, and inclusion across its services, which range from business process outsourcing and staffing to cybersecurity and digital marketing. Committed to fostering growth opportunities for clients, SYMVOS applies cutting-edge technologies to solve challenges and build meaningful relationships between brands and their customers.
11 - 50 employees
Founded 2020
🤝 B2B
🔒 Cybersecurity
☁️ SaaS
December 9, 2024
AWS
Azure
Cloud
Cybersecurity
Docker
ETL
Google Cloud Platform
Hadoop
Java
Kafka
Kubernetes
MySQL
Node.js
NoSQL
Postgres
Python
Scala
Spark
SQL

This is a remote position.
Job Role: Data Engineer

Overview: As a Data Engineer, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure for our organization. You will work closely with data scientists, analysts, and other stakeholders to ensure optimal data flow and integration for analytics, machine learning, and business intelligence. This role requires a deep understanding of data architecture, ETL processes, and data modeling, along with proficiency in programming and scripting languages.

Key Responsibilities:
• Data Pipeline Development: Design, implement, and maintain scalable, efficient data pipelines to ingest, transform, and store large volumes of data from various sources (e.g., databases, APIs, logs). Optimize pipelines for performance, reliability, and scalability.
• Data Integration and ETL Processes: Develop and implement ETL processes to cleanse, transform, and integrate data into data warehouses or data lakes. Ensure data quality and consistency across data sources.
• Data Modeling: Design and implement data models and schemas to support analytics and reporting requirements. Collaborate with data analysts and scientists to understand data requirements and translate them into technical solutions.
• Database Management: Manage and optimize SQL and NoSQL databases for performance and scalability. Implement schema changes, indexes, and optimizations as needed.
• Data Infrastructure: Design and deploy infrastructure for data storage and processing, balancing availability, reliability, and cost-effectiveness (e.g., on cloud services such as AWS, Azure, or GCP).
• Collaboration and Communication: Work closely with cross-functional teams (e.g., data scientists, analysts, software engineers) to support their data infrastructure needs.
• Monitoring and Support: Monitor data pipelines and infrastructure to ensure data availability, integrity, and performance. Troubleshoot and resolve issues related to data processing and storage.
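To give candidates a concrete feel for the pipeline work described above, here is a minimal, hypothetical ETL sketch in Python. The stubbed API payload, the `events` table, and the in-memory SQLite target are illustrative assumptions only; in production the same extract/transform/load shape would sit on sources and warehouses like those named in this posting.

```python
import sqlite3

# --- Extract: pull raw records from a source (stubbed API payload here) ---
def extract():
    # In practice this might call a REST API, read Kafka, or tail logs.
    return [
        {"user_id": "1", "event": "login", "ts": "2024-12-09T10:00:00"},
        {"user_id": "2", "event": "", "ts": "2024-12-09T10:05:00"},  # bad row
        {"user_id": "3", "event": "logout", "ts": "2024-12-09T10:10:00"},
    ]

# --- Transform: cleanse and normalize records before loading ---
def transform(rows):
    cleaned = []
    for row in rows:
        if not row["event"]:  # drop records that fail a data-quality check
            continue
        cleaned.append((int(row["user_id"]), row["event"].upper(), row["ts"]))
    return cleaned

# --- Load: write cleansed records into a warehouse table (SQLite stand-in) ---
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (user_id INTEGER, event TEXT, ts TEXT)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)
    return conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(run_pipeline(conn))  # 2 rows survive the quality check
```

The same three-stage structure scales up by swapping each function for a framework component (e.g., Spark jobs for transform, a managed warehouse for load) while keeping the pipeline's contract intact.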
Requirements:
• 5+ years of proven experience as a Data Engineer or in a similar role.
• Strong programming skills in Python, Java, Scala, or a similar language for data manipulation and scripting.
• Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
• Experience with big data technologies and frameworks (e.g., Hadoop, Spark, Kafka).
• Familiarity with cloud platforms and services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes).
• Understanding of data warehousing concepts and architectures.
• Knowledge of data modeling, ETL tools, and data integration techniques.
• Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
• Relevant certifications (e.g., AWS Certified Big Data - Specialty) are a plus.
Apply Now