
Artificial Intelligence • B2B • Enterprise
AGENTIC is a B2B conference and event series focused on the autonomous AI era, convening enterprise leaders, builders, policymakers, and vendors to explore and operationalize AI agents, generative AI, and automation. The event offers keynotes, workshops, executive roundtables, and an interactive "Vibe Lounge" to showcase tools, enable hands-on demos, and drive measurable, outcomes-focused connections between buyers and solution providers across industries like healthcare, finance, retail, manufacturing, media, and government. AGENTIC emphasizes trust-driven growth, responsible AI governance, and practical deployment strategies to help organizations adopt AI at scale.
November 9

• Design, build, and maintain scalable ETL/ELT pipelines from diverse ERP sources into centralized data lakes and warehouses
• Develop connectors for structured/semi-structured data using Python, SQL, APIs, or middleware solutions
• Implement bronze, silver, and gold layers for ingestion, cleaning, and curated datasets (see the PySpark sketch after this list)
• Organize data structures for optimized use in Power BI and AI systems
• Align global units of measure (lbs, kg, packaging, linear feet) across products and regions
• Execute data deduplication, enrichment, and harmonization from disparate systems
• Work closely with the Data Architect Lead on schema definitions, partitioning strategies, and infrastructure design
• Set up and maintain sandbox/staging environments for safe testing
• Provide ready-to-use, clean datasets to support BI dashboards and AI/ML use cases
• Clearly document pipeline architectures, data transformation logic, and integration points
• Ensure adherence to data governance policies and assist with metadata management
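The bronze/silver/gold responsibilities above describe a medallion-style lakehouse flow. The PySpark sketch below is illustrative only: it lands raw ERP extracts in a bronze layer, deduplicates and harmonizes units of measure in a silver layer, and publishes a curated gold table. The paths, column names (order_id, line_id, uom, weight), and business keys are assumptions, not details from the posting.

```python
# Minimal bronze/silver/gold sketch with PySpark.
# Paths, columns, and keys are hypothetical; Delta/Databricks storage would be analogous.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("erp_lakehouse_sketch").getOrCreate()

# Bronze: land raw ERP extracts as-is, stamped with load metadata.
bronze = (
    spark.read.json("/lake/raw/erp/orders/")            # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").parquet("/lake/bronze/orders")

# Silver: deduplicate on assumed business keys and harmonize units (lbs -> kg).
silver = (
    spark.read.parquet("/lake/bronze/orders")
    .dropDuplicates(["order_id", "line_id"])
    .withColumn(
        "weight_kg",
        F.when(F.col("uom") == "lbs", F.col("weight") * 0.453592)
         .otherwise(F.col("weight")),
    )
)
silver.write.mode("overwrite").parquet("/lake/silver/orders")

# Gold: curated aggregate ready for Power BI dashboards or ML features.
gold = (
    silver.groupBy("region", "product_id")
          .agg(F.sum("weight_kg").alias("total_weight_kg"))
)
gold.write.mode("overwrite").parquet("/lake/gold/order_weights")
```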
• 7+ years of experience in enterprise-scale data engineering
• Strong proficiency in SQL (advanced) and Python for data processing
• Spark or Databricks for distributed data workflows
• Cloud platforms such as Azure Data Lake/Blob, Synapse, or equivalents
• ETL orchestration tools such as Azure Data Factory (ADF), Airflow, or dbt (see the Airflow sketch after this list)
• API integrations and data ingestion from ERP systems (e.g., NetSuite, QuickBooks, Salesforce, RF Smart)
• Demonstrated experience with master data frameworks, unit conversion, and ERP-to-warehouse mapping
• Handling both structured and unstructured data
• Data modeling best practices (star schema, snowflake schema, etc.)
• Fluent English (C1 level), required for daily client calls and clear technical documentation
• Strong interpersonal and collaboration skills to work with cross-functional teams (BI, QA, DevOps, business analysts)
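The orchestration requirement above (ADF, Airflow, or dbt) could be met by chaining the same bronze → silver → gold steps in a scheduler. Below is a minimal, hypothetical Airflow 2.x sketch; the DAG id, schedule, and task callables are placeholders, not part of the role description.

```python
# Minimal Airflow 2.x DAG chaining bronze -> silver -> gold tasks.
# dag_id, schedule, and callables are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_bronze():
    """Placeholder: pull raw extracts from the ERP API into the bronze layer."""


def refine_silver():
    """Placeholder: deduplicate, enrich, and harmonize units into the silver layer."""


def publish_gold():
    """Placeholder: build curated, BI-ready tables in the gold layer."""


with DAG(
    dag_id="erp_to_lakehouse",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # 'schedule' requires Airflow 2.4+
    catchup=False,
) as dag:
    bronze = PythonOperator(task_id="ingest_bronze", python_callable=ingest_bronze)
    silver = PythonOperator(task_id="refine_silver", python_callable=refine_silver)
    gold = PythonOperator(task_id="publish_gold", python_callable=publish_gold)

    bronze >> silver >> gold
```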
• Professional development
November 9
Online Data Engineer Manager at Home Depot handling team leadership and project execution. Focused on developing advanced analytics for strategic company objectives.
🇺🇸 United States – Remote
💵 $125k - $220k / year
💰 Debt Financing on 2007-07
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
Airflow
BigQuery
ETL
JavaScript
PySpark
Python
React
SQL
November 8
Distinguished Data Engineer at Capital One driving the adoption of modern data technologies. Collaborating on engineering issues and promoting a culture of excellence in cloud-based solutions.
🇺🇸 United States – Remote
💵 $239.9k - $273.8k / year
💰 Post-IPO Equity on 2023-05
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
Airflow
AWS
Java
Kafka
Python
Spark
SQL
Go
November 8
Sr. Snowflake Data Engineer responsible for designing, implementing, and optimizing Snowflake data solutions. Engaging with clients to understand requirements and leading high-quality data platform deliveries.
Airflow
AWS
Azure
Cloud
Google Cloud Platform
Informatica
Python
SQL
November 7
Senior Data Engineer building scalable systems to analyze cryptocurrency transactions at TRM Labs. Collaborating with engineers and data scientists to develop critical algorithms for blockchain analysis.
🇺🇸 United States – Remote
💰 $70M Series B on 2022-11
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
Apache
Hadoop
Python
Spark
SQL
November 7
Senior Manager, Data Engineering at CrowdStrike leading a global team of data engineers. Focused on optimizing data ingestion and processing pipelines in a fast-paced SaaS environment.
🇺🇸 United States – Remote
💵 $145k - $220k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
Airflow
Kafka
Python
SFDC