
Crypto • Finance • Fintech
Coinbase is a leading cryptocurrency exchange platform that allows individuals and institutions to buy, sell, and trade various crypto assets such as Bitcoin and Ethereum. The company offers advanced trading tools, institutional solutions, and a self-hosted wallet for storing and managing cryptocurrencies. With a strong focus on security and transparency, Coinbase provides a trusted platform used by millions globally. It supports various features including staking, earning rewards, and spending crypto through their cards. Additionally, Coinbase provides developer tools and APIs for building onchain applications, making it a comprehensive hub for engaging in the crypto economy.
1001 - 5000 employees
Founded 2012
₿ Crypto
💸 Finance
💳 Fintech
💰 $21.4M Post-IPO Equity on 2022-11
August 17

The Analytics Engineering team bridges the gap between data engineering, data science, and business analytics by building scalable, impactful data solutions. We transform raw data into actionable insights through robust pipelines, well-designed data models, and tools that empower stakeholders across the organization to make data-driven decisions.

Our team combines technical expertise with a deep understanding of the business to unlock the full potential of our data. We prioritize data quality, reliability, and usability, ensuring stakeholders can rely on our data to drive meaningful outcomes.

What We Do:
• Trusted Data Sources: Develop and maintain foundational data models that serve as the single source of truth for analytics across the organization.
• Actionable Insights: Empower stakeholders by translating business requirements into scalable data models, dashboards, and tools.
• Cross-Functional Collaboration: Partner with engineering, data science, product, and business teams to ensure alignment on priorities and data solutions.
• Scalable Data Products: Build frameworks, tools, and workflows that maximize efficiency for data users while maintaining high standards of data quality and performance.
• Outcome-Focused Solutions: Use modern development and analytics tools to deliver value quickly while ensuring long-term maintainability.

What you'll be doing:
An Analytics Engineer is a hybrid Data Engineer/Data Scientist/Business Analyst role with the expertise to understand data flows end to end, and the engineering toolkit to extract the most value from them either indirectly (building tables) or directly (solving problems, delivering insights).
• Be the expert: Quickly build subject matter expertise in a specific business area and data domain. Understand data flows from creation and ingestion through transformation and delivery. Examples: Step into a new line of business and work with Engineering and Product partners to deliver the first data pipelines and insights. Communicate with engineering teams to fix data gaps for downstream data users. Take initiative and accountability for fixing issues anywhere in the stack.
• Generate business value: Interface with stakeholders on data and product teams to deliver the most commercial value from data, directly or indirectly. Examples: Build out a new data model that lets multiple downstream DS teams more easily unlock business value through experimentation and ad hoc analysis. Combine engineering details of the algo engine with statistics and data expertise to propose feasible ways for Engineering to improve the algorithm. Work with PMs to tie new x-PG and x-Product data into one holistic framework to optimize key financing product business metrics.
• Focus on outcomes, not tools: Use a variety of frameworks and paradigms to identify the best-fit tools to deliver value. Examples: Develop new abstractions (e.g., UDFs, Python packages, dashboards) to support scalable data workflows and infrastructure. Stand up a framework for building data apps internally, enabling other DS teams to quickly add value. Use established tools with mastery (e.g., Google Sheets, SQL) to deliver impact quickly when speed is the top priority.
• Data Modeling Expertise: Strong understanding of best practices for designing modular and reusable data models (e.g., star schemas, snowflake schemas).
• Prompt Design and Engineering: Expertise in prompt engineering and design for LLMs (e.g., GPT), including creating, refining, and optimizing prompts to improve response accuracy, relevance, and performance for internal tools and use cases.
• Advanced SQL: Proficiency in advanced SQL techniques for data transformation, querying, and optimization.
• Intermediate to Advanced Python: Expertise in scripting and automation, with experience in object-oriented programming (OOP) and building scalable frameworks.
• Collaboration and Communication: Strong ability to translate technical concepts into business value for cross-functional stakeholders. Proven ability to manage projects and communicate effectively across teams.
• Data Pipeline Development: Experience building, maintaining, and optimizing ETL/ELT pipelines using modern tools like dbt, Airflow, or similar.
• Data Visualization: Proficiency in building polished dashboards using tools like Looker, Tableau, Superset, or Python visualization libraries (Matplotlib, Plotly).
• Development Tools: Familiarity with version control (GitHub), CI/CD, and modern development workflows.
• Data Architecture: Knowledge of modern data lake/warehouse architectures (e.g., Snowflake, Databricks) and transformation frameworks.
• Business Acumen: Ability to understand and address business challenges through analytics engineering.
• Data Savvy: Familiarity with statistics and probability.

Bonus Skills:
• Experience with cloud platforms (e.g., AWS, GCP).
• Familiarity with Docker or Kubernetes.
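To give a flavor of the data-modeling and SQL skills listed above, here is a minimal illustrative sketch of a star schema: a fact table joined to a dimension table to produce an analytics rollup. All table and column names here are hypothetical, invented for illustration, and do not reflect Coinbase's actual data models.

```python
import sqlite3

# Toy star schema: one fact table (fact_trades) and one dimension
# table (dim_asset). Names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_asset (asset_id INTEGER PRIMARY KEY, symbol TEXT)")
cur.execute("CREATE TABLE fact_trades (trade_id INTEGER, asset_id INTEGER, usd_volume REAL)")
cur.executemany("INSERT INTO dim_asset VALUES (?, ?)", [(1, "BTC"), (2, "ETH")])
cur.executemany(
    "INSERT INTO fact_trades VALUES (?, ?, ?)",
    [(101, 1, 500.0), (102, 1, 250.0), (103, 2, 100.0)],
)

# A typical analytics rollup: total volume per asset, joining the
# fact table to the dimension on its surrogate key.
rows = cur.execute("""
    SELECT d.symbol, SUM(f.usd_volume) AS total_volume
    FROM fact_trades AS f
    JOIN dim_asset AS d USING (asset_id)
    GROUP BY d.symbol
    ORDER BY total_volume DESC
""").fetchall()
print(rows)  # [('BTC', 750.0), ('ETH', 100.0)]
```

Keeping descriptive attributes in the dimension and measures in the fact table is what makes such models modular and reusable: new facts can join to the same dimensions without duplicating reference data.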
Apply Now