
Agriculture • Finance • Marketplace
ProducePay is a company dedicated to transforming the agricultural industry by providing farmers and produce companies with financial resources, market insights, and trading solutions. They aim to streamline and stabilize the supply chain for fresh produce by offering pre-season financing, quick-pay options, and their signature Predictable Commerce Programs. By providing access to capital, global trading networks, and real-time market intelligence, ProducePay helps growers, marketers, and retailers improve predictability and manage risks in the volatile produce industry. Their mission is to eliminate economic and food waste, and ensure a sustainable supply chain, ultimately contributing to feeding the world more sustainably.
19 minutes ago

• Design and implement robust data governance, quality, and observability frameworks to ensure accuracy across all data products.
• Own the architecture, modeling, and optimization of the data warehouse (e.g., dbt transformations, dimensional modeling) to support business intelligence and data science initiatives.
• Build, deploy, and monitor high-volume, reliable data pipelines using appropriate workflow orchestration tools (e.g., Apache Airflow, Dagster); see the sketch after this list.
• Review and analyze data sets to identify patterns or trends that may have business implications.
• Assist with and provide feedback on new and existing data models and databases.
• Provide consultation on data management and integrity issues to other members of the company.
• Recommend changes to existing databases to improve performance or resolve issues.
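A minimal sketch of the kind of orchestrated pipeline referenced in the Airflow/Dagster bullet above, assuming Apache Airflow 2.4+ and its TaskFlow API; the DAG id, task logic, and staging table name are hypothetical placeholders.

```python
# Minimal sketch of a daily ELT pipeline, assuming Apache Airflow 2.4+ (TaskFlow API).
# The DAG id, task bodies, and staging table name are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def produce_market_elt():
    @task
    def extract() -> list[dict]:
        # In practice this would call a REST API or read files from object storage.
        return [{"commodity": "avocado", "price_usd": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Light cleanup only; heavier modeling is left to dbt in the warehouse.
        return [r for r in rows if r["price_usd"] is not None]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a warehouse load (e.g., COPY into a Snowflake staging table).
        print(f"loading {len(rows)} rows into staging.market_prices")

    load(transform(extract()))


produce_market_elt()
```

Heavier transformations would typically be pushed down to dbt models in the warehouse rather than run inside the DAG itself.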
• 5+ years of professional experience in Data Engineering.
• At least a Bachelor’s Degree in Computer Science, Engineering, or a related quantitative field.
• Familiar with developing ELT pipelines using Python, REST, and SQL in a Linux environment (illustrated in the sketch after this list).
• Passionate about data and the power of data.
• Willing to step outside of comfort and immediate responsibility zones to drive results and desired outcomes.
• Committed to making data a key focus of overall company strategy, growth, and product development.
• Deep expertise in advanced SQL and data modeling concepts (e.g., Dimensional Modeling, 3NF).
• Hands-on production experience with Snowflake and PostgreSQL.
• Proficiency in Python or Scala, including experience with data processing libraries (e.g., Pandas, Spark).
• Production experience building and maintaining data services on AWS (e.g., S3, EC2, Lambda, Kinesis/MSK).
• Excellent communication and presentation skills.
• Highly organized with good time management skills.
• Strong attention to detail.
• Strong understanding of Infrastructure as Code (IaC) principles; experience with Pulumi or Terraform is highly preferred.
• Exceptional problem-solving ability.
• Ability to succeed in a collaborative work environment and work cross-functionally.
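As a sketch of the Python/REST/SQL ELT experience called out above, the snippet below pulls records from a REST endpoint with requests, cleans them with pandas, and appends them to a PostgreSQL staging table via SQLAlchemy; the API URL, table name, and connection string are hypothetical placeholders.

```python
# Sketch of a small Python/REST/SQL ELT step, assuming pandas, requests, and SQLAlchemy.
# The API URL, table name, and connection string are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine


def extract(api_url: str) -> pd.DataFrame:
    """Pull raw records from a REST endpoint into a DataFrame."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleanup: drop duplicates and normalize column names."""
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df


def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
    """Append the cleaned records to a PostgreSQL staging table."""
    engine = create_engine(conn_str)
    df.to_sql(table, engine, schema="staging", if_exists="append", index=False)


if __name__ == "__main__":
    raw = extract("https://api.example.com/v1/market-prices")  # placeholder URL
    load(transform(raw), "market_prices", "postgresql://user:pass@localhost/dw")
```

In production this logic would normally run under an orchestrator such as Airflow or Dagster and load into a warehouse such as Snowflake.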
• Health Care Plan (Medical, Dental & Vision)
• Retirement Plan (401k)
• Life Insurance (Basic, Voluntary & AD&D)
• Paid Time Off (Vacation, Sick & Public Holidays)
• Family Leave (Maternity, Paternity)
• Short Term & Long Term Disability
• Training & Development
• Work From Home
• Wellness Resources
5 hours ago
Data Engineer contributing to scalable data infrastructure and pipelines at the merged Rhino + Jetty organization. Collaborating with cross-functional teams on data integration and analytics capabilities.
🇺🇸 United States – Remote
💵 $135k - $175k / year
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
Airflow
BigQuery
Cloud
Python
SQL
Tableau
15 hours ago
Data Engineer building and maintaining data pipelines and systems for healthcare analytics at Podimetrics. Collaborating with cross-functional teams to enhance data reliability and usability.
🇺🇸 United States – Remote
💵 $125k / year
⏰ Full Time
🟡 Mid-level
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
BigQuery
Cloud
Python
SQL
Yesterday
Sr Principal Data Architect responsible for defining data and information architecture at GE Aerospace. Leading enterprise data lake strategy and ensuring high data quality across corporate functions.
Airflow
AWS
Azure
Cyber Security
ETL
Google Cloud Platform
Informatica
Java
Python
Scala
SQL
Vault
2 days ago
Data Engineer building technology strategy for Conduent by designing and implementing data pipelines and transformations. Collaborating with business analysts and clients to meet complex data integration needs.
🇺🇸 United States – Remote
💵 $96.3k - $125k / year
💰 Venture Round on 2009-01
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
Azure
ETL
Numpy
Oracle
PySpark
Python
SQL
2 days ago
Data Migration Consultant delivering end-to-end implementations of Ridgeline software for investment management. Collaborating with clients and teams to ensure successful data migration projects.
🇺🇸 United States – Remote
💵 $128k - $153k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
ETL
SQL