Snowflake

Principal Software Engineer - Data Lake

Bellevue, WA US
Java C++ AWS Azure GCP Spark SQL
Description

Build the future of data. Join the Snowflake team.

The Snowflake Data Lake team’s mission is to power open standards with Snowflake innovation. Our customers want to bring more data to Snowflake to support a variety of data lake use cases with large data sets, but they face the common challenges of control, cost, and interoperability. This team aims to address these challenges and enable customers to benefit from Snowflake’s rich features and integrated platform capabilities while embracing their choice of open table standards (e.g., Apache Iceberg), file formats (e.g., Apache Parquet), storage solutions, and third-party open source tool sets (e.g., Apache Spark). We are early in the journey to build the best data lake solutions for any workload at scale.

We are looking for outstanding Principal Software Engineers: technical leaders who know the internals of query engines, data warehouses, and big data open source projects. You will help define strategies, set technical direction, design and execute, and deliver innovative Snowflake data lake solutions to thousands of enterprise customers.

AS A PRINCIPAL SOFTWARE ENGINEER AT SNOWFLAKE, YOU WILL:

  • Understand customer requirements and define product strategies
  • Design, develop, and operate highly reliable large scale data lake systems
  • Embrace Snowflake innovations with open source standards and tool sets
  • Be an active influencer for the direction of open source standards
  • Partner closely with Product teams to understand requirements and design cutting edge new capabilities that go directly into customer’s hands
  • Analyze and solve fault-tolerance and high-availability issues as well as performance and scale challenges
  • Ensure operational excellence of the services and meet commitments to our customers regarding reliability, availability, and performance
  • Set technical directions and influence cross-functional teams

AN IDEAL CANDIDATE WILL HAVE MOST OF THE FOLLOWING QUALIFICATIONS:

  • 15+ years of hands-on experience with the internals of large-scale, data-intensive distributed systems, especially query engines, object storage, data warehouses, data lakes, data analytics, SQL/NoSQL databases, distributed file systems, and data platform infrastructure
  • Proven track record of leading and delivering large, complex, multi-year projects across organizations
  • Strong development skills in Java and C++
  • An active PMC (Project Management Committee) member of or committer to open source projects such as Apache Iceberg, Parquet, Spark, Hive, Flink, Delta, Presto, Trino, and Avro
  • A growth mindset and excitement about breaking the status quo by seeking innovative solutions
  • An excellent team player who consistently makes everyone around them better
  • Experience with public clouds (AWS, Azure, GCP)
  • BS/MS/PhD in Computer Science

Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.
