Western Digital

Principal Engineer, Data Analytics Engineering

Bengaluru, India
GCP · Spark · Kafka · Java · Python · Streaming · SQL · Hadoop · AWS · Azure

Company Description

At Western Digital, our vision is to power global innovation and push the boundaries of technology to make what you thought was once impossible, possible.

At our core, Western Digital is a company of problem solvers. People achieve extraordinary things given the right technology. For decades, we’ve been doing just that. Our technology helped people put a man on the moon.

We are a key partner to some of the largest and highest growth organizations in the world. From energizing the most competitive gaming platforms, to enabling systems to make cities safer and cars smarter and more connected, to powering the data centers behind many of the world’s biggest companies and public cloud, Western Digital is fueling a brighter, smarter future.

Binge-watch any shows, use social media or shop online lately? You’ll find Western Digital supporting the storage infrastructure behind many of these platforms. And, that flash memory card that captures and preserves your most precious moments? That’s us, too.

We offer an expansive portfolio of technologies, storage devices and platforms for businesses and consumers alike. Our data-centric solutions comprise the Western Digital®, G-Technology™, SanDisk® and WD® brands.

Today’s exceptional challenges require your unique skills. It’s You & Western Digital. Together, we’re the next BIG thing in data.

Job Description

  • Architect end-to-end solutions, lead teams by example, and work with cross-functional stakeholders to productise the solutions
  • Design, build and support platforms for customer analytics needs using Java and/or Python as the programming language
  • Design, build and support batch-processing and streaming data pipelines (a minimal sketch follows this list)
  • Write and maintain extract, transform and load (ETL) scripts
  • Work with cross-functional stakeholders to understand their requirements
  • Own data quality for allocated areas of ownership
  • Guide team members in overcoming technical challenges
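
For illustration only, the sketch below shows the kind of streaming ETL pipeline the role describes: a minimal PySpark Structured Streaming job that reads JSON events from a Kafka topic and lands them as Parquet. The broker address, topic name, schema and output paths are assumptions rather than details from this posting, and running it would also require the spark-sql-kafka connector package.

    # Minimal sketch of a Kafka-to-Parquet streaming ETL job in PySpark.
    # Topic, broker, schema and paths below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = (
        SparkSession.builder
        .appName("streaming-etl-sketch")
        .getOrCreate()
    )

    # Assumed schema for the incoming JSON messages.
    schema = StructType([
        StructField("device_id", StringType()),
        StructField("event_type", StringType()),
        StructField("event_time", TimestampType()),
    ])

    # Extract: read a stream from a hypothetical Kafka topic "events".
    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "events")
        .load()
    )

    # Transform: parse the Kafka value bytes as JSON into typed columns.
    parsed = (
        raw.select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Load: write the parsed stream out as Parquet with checkpointing.
    query = (
        parsed.writeStream
        .format("parquet")
        .option("path", "/tmp/events_parquet")
        .option("checkpointLocation", "/tmp/events_checkpoint")
        .start()
    )

    query.awaitTermination()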

Qualifications

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science or Software Engineering
  • Proficient in computer science fundamentals, algorithms and data structures
  • Proficient in programming languages such as Java and/or Python
  • Experience with database technologies (SQL/NoSQL) and big data technologies (Hadoop, Hive)


Preferred Qualifications

  • Experience with cloud platforms such as AWS, GCP and Azure
  • Proficient in architecting and designing batch-processing and streaming data pipelines using Spark and Kafka
  • Familiar with working in an agile software development framework with test-driven development approaches
  • Excellent communication, analytical and problem-solving skills

Additional Information

All your information will be kept confidential according to EEO guidelines.
