Data Architect

Job not on LinkedIn

October 13

Nymphis Technologies

B2B • eCommerce • Enterprise

Nymphis Technologies is an Agile software consultancy and development firm based in Cluj-Napoca, Romania, that provides solution architecture, cloud transformation, DevOps and eCommerce development services. Its team of Java developers, product owners and testers delivers scalable, future-proof software through iterative Agile processes, covering design, CI/CD, QA, test automation, maintenance and support for global B2B clients. The company focuses on building reliable technical solutions, proofs of concept and ongoing support for enterprise-grade projects.

📋 Description

• Develop technology specifications and ensure that new technology solutions are designed for optimal access and usefulness, leveraging existing technologies where possible
• Apply architectural and engineering concepts to design solutions that meet operational requirements such as scalability, maintainability, security, reliability, extensibility, flexibility, availability and manageability
• Identify the technologies to employ based on business-requirements coverage, price, performance, service agreements and organisational constraints
• Participate in and lead research and development efforts (proofs of concept, prototypes) as a subject matter expert when introducing new technologies
• Provide technical expertise for level-of-effort (LOE) estimates, work breakdown structures and technical resource planning for proposed and current work
• Provide input to management throughout the project management lifecycle
• Ensure technology solutions are production-ready, meet the defined specifications, and can be maintained via production-support methodologies and resources
• Provide technical guidance to coach and mentor team members; provide performance feedback to manager(s)

🎯 Requirements

• Good analytical skills (skilled in analysing and documenting business requirements and the corresponding solution specifications)
• Good knowledge of software design methods and techniques
• Experience with BI/DWH design and maintenance
• Experience with ETL development and data modelling, preferably on several platforms and with different tools
• Very good knowledge of databases, e.g. Oracle, MySQL, Teradata, MS SQL
• Experience developing stored procedures, triggers and views
• Experience with the Big Data ecosystem
• Experience with data warehouses in cloud ecosystems
• Good knowledge of SQL, PL/SQL and T-SQL
• Good knowledge of SQL optimisation
• Good knowledge of Unix shell scripting
• Good knowledge of data modelling concepts, methodologies and processes
• Basic database administration skills
• Leadership experience is an advantage
• Fluency in written and spoken English; German is a plus
• Willingness to travel abroad

Nice to have:
• Basic understanding of machine learning and deep learning fundamentals
• Implementation experience with machine learning algorithms and applications
• Programming skills in at least one object-oriented programming language (Java, Scala, C++, Python, etc.)
• Implementation experience with at least one modern distributed ML framework such as TensorFlow, PyTorch, Caffe or MXNet

🏖️ Benefits

• Flexible working arrangements
• Professional development opportunities

Similar Jobs

September 28

Senior Data Engineer building Looker/LookML models and BigQuery pipelines for 3Pillar Global. Deliver scalable data models, dashboards, and analytics to support product-focused clients.

BigQuery • Cloud • SQL

September 28

Senior Data Engineer building ETL and real-time pipelines for analytics at 3Pillar Global. Collaborates with clients and engineers on scalable data platforms.

Airflow • Apache • AWS • Azure • Cloud • Docker • ETL • Google Cloud Platform • NoSQL • Python • Redis • SQL

August 9

Trimble Inc.

10,000+ employees

Join Transporeon as a Data Engineer to implement scalable data solutions on AWS platforms.

Amazon Redshift • AWS • Azure • Cloud • Docker • DynamoDB • ETL • Kubernetes • Postgres • Python • SQL • Terraform

April 26

Join Tecknoworks as a Data Engineer. Develop robust data pipelines on AWS and Azure.

Airflow • Amazon Redshift • Apache • AWS • Azure • Cloud • Docker • ETL • Kafka • Kubernetes • Python • Spark • SQL
