PepsiCo

Senior Data Engineer

Remote · Hyderabad, India
Machine Learning · Azure SQL · Python · AWS · Kubernetes
Description
Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world.

PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving a business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT. The key to winning in these areas is the ability to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is responsible for developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.
What PepsiCo Data Management and Operations does:
  • Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
  • Own day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.

Senior Data Engineer

As a Senior Data Engineer, you will be the key technical expert overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, land data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company.

As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users, in a hybrid environment spanning in-house, on-premises data sources as well as cloud and remote systems.

Candidates must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis, depending on product and project requirements.
Responsibilities

  • Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks.
  • Enable and accelerate standards-based development, prioritizing code reuse and adopting test-driven development, unit testing, and test automation with end-to-end observability of data.
  • Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance and cost.
  • Implement best practices around systems integration, security, performance, and data management.
  • Collaborate with internal clients (product teams, sector leads, data science) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
  • Oversee work with internal clients and external partners to evolve the architectural capabilities and maturity of the data platform, engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
  • Define and manage SLAs for data products and processes running in production.
  • Support large-scale experimentation by data scientists.
  • Prototype new approaches and build solutions at scale.
  • Create documentation for learnings and knowledge transfer to internal associates.

Qualifications

  • 9+ years of overall technology experience, including at least 4 years of hands-on software development, data engineering, and systems architecture.
  • 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
  • 3+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or another popular RDBMS.
  • 3+ years of experience in Python and PySpark/Scala programming on big data platforms like Databricks.
  • 3+ years of cloud data engineering experience in Azure or AWS; fluent with Azure cloud services. Azure Data Engineering certification is a plus.
  • Experience integrating multi-cloud services with on-premises technologies.
  • Domain knowledge of the CPG industry, with a supply chain/GTM background, is preferred.
  • Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
  • Experience with data profiling and data quality tools like Great Expectations.
  • Experience building/operating highly available, distributed systems for extraction, ingestion, and processing of large data sets.
  • Experience with at least one MPP database technology such as Databricks, Redshift, Synapse or Snowflake.
  • Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
  • Experience with version control systems like ADO and GitHub, and with CI/CD tools for deployment.
  • Experience with Azure Data Factory, Azure Databricks and Azure Machine Learning tools.
  • Experience with statistical/ML techniques is a plus.
  • Experience building solutions in the retail or supply chain space is a plus.
  • Understanding of metadata management, data lineage, and data glossaries is a plus.
  • Familiarity with business intelligence tools (such as Power BI).
  • BA/BS in Computer Science, Math, Physics, or another technical field.

Candidates must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, with the in-office days coordinated with their immediate supervisor.
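To give a concrete feel for the monitoring responsibility described in the posting (capturing quality and cost KPIs for each pipeline run), here is a minimal sketch in plain Python. All names and thresholds are illustrative assumptions, not part of the role or any PepsiCo system:

```python
from dataclasses import dataclass, field
import time


@dataclass
class RunMetrics:
    """Operational KPIs captured for one pipeline run (illustrative)."""
    pipeline: str
    rows_read: int = 0
    rows_written: int = 0
    started_at: float = field(default_factory=time.time)

    @property
    def drop_rate(self) -> float:
        """Fraction of input rows that did not reach the target."""
        if self.rows_read == 0:
            return 0.0
        return 1.0 - self.rows_written / self.rows_read


def check_quality(m: RunMetrics, max_drop_rate: float = 0.01) -> list[str]:
    """Return a list of alert messages; an empty list means the run is healthy."""
    alerts = []
    if m.rows_read == 0:
        alerts.append(f"{m.pipeline}: source returned no rows")
    elif m.drop_rate > max_drop_rate:
        alerts.append(
            f"{m.pipeline}: drop rate {m.drop_rate:.1%} exceeds {max_drop_rate:.1%}"
        )
    return alerts


# Example: a hypothetical Teradata-to-Databricks migration run that lost rows.
run = RunMetrics(pipeline="teradata_sales_to_databricks",
                 rows_read=1_000_000, rows_written=930_000)
print(check_quality(run))
```

In a production framework the same idea would typically emit these metrics to a monitoring backend and alerting system rather than printing them; the point here is only the shape of the check, not a specific toolchain.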

