Data Platform Architect - Data Engineering, Field CTO Office

  • Company: Snowflake
  • Employment: Full-time
  • Location: 🇺🇸 United States, California
  • Posted 1 month ago · Updated 10 hours ago

Build the future of the AI Data Cloud. Join the Snowflake team.

There is only one Data Cloud. Snowflake’s founders started from scratch and designed a data platform built for the cloud that is effective, affordable, and accessible to all data users. But it didn’t stop there. They engineered Snowflake to power the Data Cloud, where thousands of organizations unlock the value of their data with near-unlimited scale, concurrency, and performance. This is our vision: a world with endless insights to tackle the challenges and opportunities of today and reveal the possibilities of tomorrow. 

Our Sales Engineering organization is seeking a Data Platform Architect to join our Field CTO Office. This leader will work with both technical and business executives to design and architect the Snowflake Cloud Data Platform as a critical component of their enterprise data architecture and overall ecosystem.

In this role, you will work with sales teams, product management, and technology partners, leveraging your expertise, best practices, and reference architectures to highlight Snowflake’s Cloud Data Platform capabilities across Data Warehouse, Data Lake, and Data Engineering workloads.

As a Data Platform Architect, you must share our passion and vision in helping our customers and partners drive faster time to insight through Snowflake’s Cloud Data Platform, thrive in a dynamic environment, and have the flexibility and willingness to jump in and get things done. You are equally comfortable in both a business and technical context, interacting with executives and talking shop with technical audiences. 

IN THIS ROLE YOU WILL GET TO: 

  • Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners

  • Partner with sales teams and channel partners to understand the needs of our customers, strategize on how to navigate and accelerate winning sales cycles, provide compelling value-based enterprise architecture deliverables and working sessions to ensure customers are set up for success operationally, and support strategic enterprise pilots / proof-of-concepts 

  • Collaborate closely with our Product team to effectively influence the Cloud Data Platform product roadmaps based on field team and customer feedback

  • Partner with Product Marketing teams to spread awareness and support pipeline building via customer roundtables, conferences, events, blogs, webinars, and whitepapers

  • Contribute to the creation of reference architectures, blueprints, best practices, etc. based on field experience to continue to up-level and enable our internal stakeholders

ON DAY ONE, WE WILL EXPECT YOU TO HAVE: 

  • 10+ years of architecture and data engineering experience within the Enterprise Data space

  • Deep hands-on technical expertise in one or more of the following (preferably several): Data Warehouse modernization/migrations, Data Lakes, Data Engineering

  • Development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies

  • Prior knowledge of data engineering tools for ingestion, transformation, and curation

  • Familiarity with real-time or near-real-time use cases (e.g., CDC) and technologies (e.g., Kafka, Flink), plus a deep understanding of integration services and tools for building ETL and ELT data pipelines and of orchestration technologies such as Matillion, Fivetran, Airflow, Informatica, etc.

  • 3+ years of cloud provider experience, including certifications in AWS, GCP, and/or Azure, with detailed knowledge of cloud storage and services supporting data ingestion, transformation, pipelines, and stream processing

  • Working knowledge of open table formats such as Hudi or Iceberg; experience with at least one big data project for a data lake or data engineering is a plus

  • Hands-on experience with database change management and with DevOps processes and technologies such as GitHub, GitLab, Jenkins, and Terraform

  • Strong architectural expertise in data engineering, enabling you to confidently present and demo to business executives and technical audiences and to handle impromptu questions effectively

  • Bachelor’s degree required; Master’s degree in computer science, engineering, mathematics, or a related field, or equivalent experience, preferred

 

Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

The application window is expected to be open until December 13, 2024. This opportunity will remain posted based on business needs, which may be before or after the specified date.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?
