
B2B • Marketplace
Upwork is a leading online platform that connects freelancers with businesses, offering a wide range of professional services such as design, writing, programming, and more. It facilitates collaboration between independent professionals and clients through job postings, project catalogs, and consultations. Upwork aims to enable freelancers to find work and clients to find the best talent, providing tools to hire, manage, and pay remote talent worldwide.
October 22
AWS
Azure
Cloud
DynamoDB
Google Cloud Platform
Kubernetes
MongoDB
MySQL
NoSQL
Oracle
Perl
Postgres
Python
Ruby
SQL
Terraform

• Join Upwork's Data Infrastructure team within the Data Platform Services (DPS) organization, responsible for designing, operating, and automating all database systems (Postgres, MySQL, DynamoDB, MongoDB) across Upwork’s global infrastructure.
• You’ll orchestrate complex systems spanning Terraform, RDS, Presto, and Rancher to solve challenges like:
  • Zero-downtime migrations and cross-region replication
  • End-to-end database provisioning (infrastructure deployment, user/access configuration, service integration)
  • Vulnerability management and security hardening at scale
  • Incident response for high-severity database alerts (24/7 on-call rotation)
• Design and implement Python-based automation frameworks (not scripts) for database lifecycle management (see the illustrative sketch after this list)
• Collaborate with infrastructure teams to integrate systems via APIs (AWS, Kubernetes, HashiCorp)
• Optimize Postgres performance, replication, and backup strategies (Postgres covers ~99% of relational DB use cases)
• Participate in LATAM time zone-friendly on-call shifts with weekend coverage
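The posting itself contains no code; purely as an illustration of the kind of Python-based database automation described above, here is a minimal sketch that inventories RDS Postgres instances through the AWS API. It assumes boto3 is installed and AWS credentials are configured; the function name, default region, and report format are hypothetical, not part of the role description.

import boto3

def check_postgres_instances(region="us-east-1"):
    """List RDS Postgres instances and their status (illustrative only)."""
    rds = boto3.client("rds", region_name=region)
    report = []
    # describe_db_instances is paginated, so walk every page
    paginator = rds.get_paginator("describe_db_instances")
    for page in paginator.paginate():
        for db in page["DBInstances"]:
            # Keep only Postgres-family engines
            if db["Engine"] not in ("postgres", "aurora-postgresql"):
                continue
            report.append({
                "id": db["DBInstanceIdentifier"],
                "status": db["DBInstanceStatus"],
                "multi_az": db["MultiAZ"],
            })
    return report

if __name__ == "__main__":
    for db in check_postgres_instances():
        flag = "" if db["status"] == "available" else "  <-- needs attention"
        print(f"{db['id']:40s} {db['status']:12s} MultiAZ={db['multi_az']}{flag}")

In the kind of framework the posting describes, a check like this would be one reusable building block behind provisioning, replication, and backup workflows rather than a standalone script.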
• Hybrid expertise: Deep experience in both database engineering/administration and software development. Candidates who have transitioned from database engineering to software development (or vice versa) are especially encouraged.
• Programming skills: Strong background in Python (required); ability to develop robust automation beyond basic scripting. Experience with Ruby or Perl is acceptable if you can quickly adapt to Python.
• Database expertise: 7+ years of professional experience with relational databases, with a strong preference for Postgres. Experience with MySQL or Oracle is also valued. NoSQL experience (e.g., DynamoDB) is a plus but not required.
• SQL proficiency: Solid understanding of SQL; experience with procedural languages (PL/pgSQL for Postgres or PL/SQL for Oracle) is beneficial but not essential, as most automation is done in Python.
• Cloud & DevOps familiarity: Experience with Terraform and related infrastructure-as-code tools is a plus, but not a core requirement. Familiarity with cloud environments (AWS, GCP, Azure) is helpful.
• Automation mindset: Passion for automating repetitive tasks and improving operational efficiency.
• Ownership & accountability: Proactive, resourceful, and able to take full responsibility for solving problems and delivering outcomes.
• Collaboration: Strong communication skills; able to work effectively in a distributed, multicultural team.
• Work on challenging, high-impact automation projects at the heart of Upwork’s business.
• Collaborate with experienced engineers in a supportive, global team environment.
• Gain exposure to a wide array of technologies and complex systems orchestration.
• Opportunity to shape and improve the core data infrastructure of the world’s leading work marketplace.
Apply Now