Data Architect

27 minutes ago

Dynatron Software, Inc.

Automotive • SaaS • Analytics

Dynatron Software, Inc. combines advanced analytics software with expert coaching to help automotive service departments maximize their revenue opportunities. Its product suite includes PriceSmart for optimizing labor and parts pricing, FileSmart for enhancing warranty labor rates and parts markup, SellSmart for increasing service sales to existing customers, and MarketSmart for boosting service department traffic through strategic marketing campaigns. With a focus on increasing profitability, Dynatron works closely with dealerships to identify and capture hidden revenue streams using its proprietary analytics and extensive repair order database.

51-200 employees

Founded 1999

📋 Description

• Design scalable conceptual, logical, and physical data models supporting OLTP, OLAP, real-time analytics, and ML workloads
• Architect modular, domain-driven data structures for multi-domain analytics
• Apply modern modeling techniques, including 3NF, Dimensional Modeling, Data Vault, Medallion Architecture, and Data Mesh principles
• Define canonical models, conformed dimensions, and enterprise reference datasets
• Architect real-time ingestion and event-driven pipelines using Kafka, Kinesis, Pulsar, or Azure Event Hubs
• Design low-latency, high-throughput streaming architectures for operational and analytical use cases
• Design ML-ready datasets, feature stores, and reproducible data pipelines
• Architect for drift detection, data quality monitoring, lineage visibility, retraining workflows, and model governance
• Design scalable architectures using Snowflake, Databricks, or other cloud-native platforms

🎯 Requirements

• 7-10+ years of experience as a Data Architect or Senior Data Engineer in enterprise-scale environments
• Deep hands-on experience with Snowflake, Databricks, Azure Data Factory, AWS Glue, Bedrock, Redshift, BigQuery, or Teradata
• Strong SQL and Python/Scala skills, with expertise in schema design and metadata management
• Experience building streaming architectures with Kafka, Kinesis, or Event Hubs
• Knowledge of ML/AI pipelines, feature stores, vector databases, and modern AI platform tooling
• Expertise in encryption, masking, tokenization, IAM, and RBAC
• Understanding of PII/PHI requirements and regulatory standards
• Experience implementing secure patterns across cloud platforms
• Experience designing distributed systems across AWS, Azure, or GCP

🏖️ Benefits

• Comprehensive health, vision, and dental insurance
• Employer-paid short- and long-term disability and life insurance
• 401(k) with competitive company match
• Flexible vacation policy and 11 paid holidays
• Remote-first culture

Similar Jobs

7 hours ago

Senior Data Engineer at ProducePay, designing and optimizing the data platform for analytics needs. Owning the data lifecycle and ensuring the reliability, security, and scalability of the data architecture across teams.

Airflow, Apache, AWS, EC2, Linux, Pandas, Postgres, Python, Scala, Spark, SQL, Terraform

12 hours ago

Data Engineer contributing to scalable data infrastructure and pipelines following the Rhino + Jetty merger. Collaborating with cross-functional teams on data integration and analytics capabilities.

Airflow, BigQuery, Cloud, Python, SQL, Tableau

22 hours ago

Data Engineer building and maintaining data pipelines and systems for healthcare analytics at Podimetrics. Collaborating with cross-functional teams to enhance data reliability and usability.

BigQuery, Cloud, Python, SQL

Yesterday

Sr Principal Data Architect responsible for defining data and information architecture at GE Aerospace. Leading enterprise data lake strategy and ensuring high data quality across corporate functions.

Airflow, AWS, Azure, Cyber Security, ETL, Google Cloud Platform, Informatica, Java, Python, Scala, SQL, Vault

2 days ago

Data Engineer building technology strategy for Conduent by designing and implementing data pipelines and transformations. Collaborating with business analysts and clients to meet complex data integration needs.

Azure, ETL, Numpy, Oracle, PySpark, Python, SQL