Senior Data Engineer, Messaging Platforms

  • Company The New York Times
  • Employment Full-time
  • Location 🇺🇸 United States nationwide
  • Posted 3 days ago · Updated 14 hours ago

About the Role

The New York Times’ Core Platforms mission is looking for a data engineer to join the fast-growing Messaging Platforms group, where your work will directly help grow customer acquisition, engagement, and reader loyalty. You will build data pipelines, data models, applications, and infrastructure for cloud platforms that process billions of messages a month, supporting data analysis and bringing insights and improvements to the user messaging journey. You will work alongside other engineers, designers, and product managers as a member of an organization that values empathy, collaboration, transparency, diversity, and learning.

This role requires limited on-call hours. An on-call schedule will be determined when you join, taking into account team size and other variables.

Responsibilities:

  • Create ETL data pipelines to support NYT analysts, marketers, advertisers and editors
  • Contribute to a shared data platform by developing new capabilities using tech like BigQuery, dbt, Go, Python, Java, Airflow and Apache Beam
  • Assemble large and complex datasets from multiple data sources into formats that can be used by the business
  • Automate batch and streaming data transformation pipelines
  • Implement data processing capabilities like user behavior models, metric aggregations, audience management and data science features
  • Build reliable, scalable, maintainable, and cost-efficient data products
  • Demonstrate support and understanding of our value of journalistic independence and a strong commitment to our mission to seek the truth and help people understand the world

Basic Qualifications:

  • 5 years of professional experience in Data Engineering
  • 5 years of experience with SQL and query optimization
  • 3 years of overall experience with cloud infrastructure architecture and configuration, especially with cloud-native data-warehousing solutions like Snowflake or BigQuery
  • 3 years of experience writing production-quality Python (or other scripting language)
  • 3 years of experience building and orchestrating ETL pipelines (e.g. Apache Airflow, dbt Cloud, BigQuery)
  • Experience with cloud computing platforms (e.g. GCP, AWS)
  • Familiarity with Git and CI/CD systems such as Drone or Jenkins
  • Strong experience with Terraform or a similar Infrastructure-as-Code platform

Preferred Qualifications:

  • Some experience with at least one distributed system, such as a NoSQL database, stream processing system, or message broker
  • Experience with the modern data stack (Fivetran / Snowflake / dbt / Looker / Hightouch or equivalents)
  • Familiarity with BI tools (e.g. Looker, Mode, Tableau) and experience sharing data insights through reports and dashboards

#LI-Remote

REQ-017235
