Key Responsibilities:
- Develop scalable and robust code for batch processing systems. This includes working with technologies like Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
- Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem (an illustrative sketch follows this list)
- Leverage GCP for scalable big data processing and storage solutions
- Implement automation and DevOps best practices such as CI/CD and Infrastructure as Code (IaC)
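For illustration, a minimal sketch of the kind of batch workflow orchestration described above, assuming Airflow 2.x; the DAG name, jar path, job class, and Hive table are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_batch_pipeline",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit a Spark (Java) batch job; the jar path and main class are placeholders.
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command=(
            "spark-submit --class com.example.BatchJob "
            "/opt/jobs/batch-job.jar {{ ds }}"
        ),
    )

    # Load the job's output into a Hive table for downstream consumers.
    load_to_hive = BashOperator(
        task_id="load_to_hive",
        bash_command=(
            "hive -e \"LOAD DATA INPATH '/data/output/{{ ds }}' "
            "INTO TABLE analytics.daily_results PARTITION (ds='{{ ds }}')\""
        ),
    )

    run_spark_job >> load_to_hive
```

A production DAG would typically add retries, alerting, and SLAs, but the overall shape of the orchestration is the same.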
Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
- Experience with GCP managed services and an understanding of cloud-based batch processing systems are critical.
- Proficiency in Oozie, Airflow, MapReduce, and Java
- Strong programming skills with Java (specifically Spark), Python, Pig, and SQL
- Expertise in public cloud services, particularly in GCP.
- Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce
- Familiarity with Bigtable and Redis (see the read-path sketch after this list)
- Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
- Ability to tackle complex challenges and devise effective solutions, applying critical thinking to approach problems from various angles and propose innovative solutions.
- Experience working effectively in a remote setting, with strong written and verbal communication skills and the ability to collaborate with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
- Proven experience in engineering batch processing systems at scale.
- Hands-on experience in public cloud platforms, particularly GCP. Additional experience with other cloud technologies is advantageous.
- Google Associate Cloud Engineer certification or another Google Cloud Professional-level certification
- 5+ years of experience in customer-facing software/technology or consulting
- 5+ years of experience with “on-premises to cloud” migrations or IT transformations
- 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
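For the Bigtable and Redis item above, a minimal read-path sketch, assuming the standard google-cloud-bigtable and redis-py clients; the project, instance, table, column family, and key scheme below are hypothetical:

```python
from typing import Optional

import redis
from google.cloud import bigtable

# Redis cache and Bigtable table handles; connection details and IDs are placeholders.
cache = redis.Redis(host="localhost", port=6379)
bt_client = bigtable.Client(project="my-gcp-project")
table = bt_client.instance("batch-results").table("daily_metrics")


def get_metric(user_id: str) -> Optional[bytes]:
    """Return a precomputed metric for a user, caching Bigtable reads in Redis."""
    cache_key = f"metric:{user_id}"
    cached = cache.get(cache_key)
    if cached is not None:
        return cached  # served from the Redis cache

    row = table.read_row(user_id.encode())  # row-key scheme is hypothetical
    if row is None:
        return None

    # Column family "m" / qualifier b"value" are placeholders for this sketch.
    value = row.cells["m"][b"value"][0].value
    cache.set(cache_key, value, ex=3600)  # cache the result for one hour
    return value
```

The pattern keeps hot lookups in Redis while Bigtable remains the durable store for batch-computed results.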