Data Analytics Engineer at The (Remote)

June 13, 2024

Job Description
The Platform-as-a-Service (PaaS) removes the complexities of cloud infrastructure management and optimizes development-to-production workflows, reducing the time it takes to build and deploy applications. Delivering efficiency, reliability, and security gives development teams both control and peace of mind. Built for developers, by developers.

Adopted and loved by 16,000+ developers and 7,000 customers, and proven over the last 8 years, the platform provides out-of-the-box capabilities that serve as the launchpad for creative development teams’ out-of-the-box thinking.

We provide 24×7 support, managed cloud infrastructure, and automated security and compliance with an all-in-one PaaS. We give our customers complete control over their data by keeping applications secure and available around the clock.

Position Summary:

The Data Engineering and Analytics team is in search of our next Data Analytics Engineer to contribute to our robust data modeling network, which forms the foundation of our data platform.

Your mission will involve shaping robust data models, optimizing our data infrastructure, and collaborating seamlessly with your peers. Join us to transform our data strategies and unleash the full potential of our analytics capabilities through your expertise and innovative mindset.

This role directly reports to the Director of Data Engineering and Analytics.

What to expect:

  • Gain a deep understanding of the data emitted by our product and internal systems on the path to creating meaningful data models for our internal stakeholders.
  • Collaborate with partner teams, both within and outside of engineering, to gather requirements, align methodologies, and ensure best practices in data modeling and consumption.
  • Work with data at petabyte scale in an efficient, performant manner that enables widespread data-driven decision making.
  • Foster analytical curiosity to understand the breadth of data and bring standardization to its representation while understanding the unique impact it has on each business unit.
  • Bring efficiency to our operations and codebase by providing repeatable, well-documented solutions.
  • Understand the implications of various ETL/ELT methodologies on the scale and use of our data.

What you bring:

  • A caring mindset and an empathetic spirit. The ability to see things from other perspectives is a skill we ask of all platformers.
  • Advanced proficiency in SQL; some experience with Python is beneficial.
  • The ability to weigh the trade-offs between different approaches to a given problem, and to clearly communicate the reasoning behind a proposed solution.
  • Familiarity with data tooling, specifically data infrastructure tooling offered by GCP and/or AWS, as well as data visualization tools.
  • Comfort working in a remote environment, managing your time and contributions efficiently without direct supervision.

Bonus points for:

  • Experience with integrating enterprise systems, such as NetSuite, Salesforce, or Marketo.
  • Past use of dbt, Hightouch, and/or Fivetran.
  • Knowledge of CI/CD workflows.
  • Classical training in software development.