
SaaS • Real Estate • B2B
Second Nature is a Resident Experience Platform that helps property managers personalize and automate resident onboarding and benefits. Its software combines Resident Onboarding, customizable Resident Benefits Packages, and an orchestration engine called Maestro to streamline leases, move-in tasks, credit-building, renters insurance, utilities, and other resident services. Delivered as a cloud platform for property management companies, Second Nature aims to improve resident retention, reduce delinquencies and maintenance costs, and increase operational efficiency.
201 - 500 employees
Founded 2012
☁️ SaaS
🏠 Real Estate
🤝 B2B
💰 $16.4M Series C on 2020-03
Yesterday

• Build and scale data infrastructure in GCP.
• Write production-grade code using Python and SQL.
• Integrate data from high-priority business systems.
• Monitor and optimize existing infrastructure.
• Own projects end-to-end from design to deployment.
• Champion data quality and observability.
• Collaborate across teams to deliver data solutions.
• 7+ years experience in data engineering or a related backend development role.
• Expert-level skills in Python and SQL for complex data processing and pipeline development.
• Deep, hands-on experience with the Google Cloud Platform (GCP) data stack.
• Proven experience with core technologies like BigQuery, Dataflow, and Cloud Composer.
• Comfortable and excited to use AI-assisted development tools.
• Hands-on experience with containerization (Docker) and CI/CD pipelines.
• Strong understanding of cloud security best practices.
• A track record of building scalable data pipelines.
• Excellent communication skills for stakeholder management.
• Strong sense of accountability and ability to adapt in a fast-paced environment.
• Health insurance
• 401(k) plan
• Open PTO and sick days
• Diverse, inclusive, supportive, and growth-focused culture
2 days ago
Senior Data Engineer managing and transforming data pipelines at Experian. Utilizing dbt and Snowflake for product development and research in fraud detection.
🇺🇸 United States – Remote
💵 $115.7k - $208.3k / year
⏰ Full Time
🟠 Senior
🚰 Data Engineer
🦅 H1B Visa Sponsor
2 days ago
Data Architect designing a data foundation to support AI/ML applications at Leidos. Responsible for creating big data systems and ensuring data access and documentation.
🇺🇸 United States – Remote
💵 $104.7k - $189.2k / year
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor
2 days ago
Data Engineer building and scaling cloud-native data pipelines using Snowflake and dbt for investment research. Solving data challenges and delivering trusted data to stakeholders across the business.
2 days ago
Data Engineer at Cayuse responsible for designing and maintaining scalable data pipelines. Collaborating with health programs on data specifications and ensuring data integrity and governance.
2 days ago
Cloud Data Engineer responsible for managing data architecture and pipelines at GOVX. Collaborating cross-functionally to enhance data reporting and access for stakeholders.
🇺🇸 United States – Remote
💵 $120k - $140k / year
⏰ Full Time
🟠 Senior
🔴 Lead
🚰 Data Engineer
🦅 H1B Visa Sponsor