Bank of America

Big Data Platform Engineer

Denver, CO / Chicago, IL
Kafka, Kubernetes, Docker, Hadoop, Java, Shell, Ansible, Spark, Git, Python

Job Description:

At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day.

One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being.

Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization.

Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

We invite you to join the GIS team at Bank of America as a Big Data Platform Engineer. We are a tight-knit, supportive community passionate about delivering the best experience for our customers while remaining sensitive to their unique needs.

In this role, you will help set up the architecture for Big Data platforms and distributed systems, provide automated solutions to deliver analytical capabilities, and build new data pipelines that support the cyber security team.

You will also be responsible for troubleshooting and ensuring optimal performance of the Cyber Data Platform.

Required Skills:

  • 4+ years of professional experience as a Big Data Engineer
  • Experience working with the Hadoop/Big Data ecosystem and distributed systems
  • Hands-on experience in at least one programming language, e.g., Python, Java, and/or shell scripting
  • Experience and proficiency with the Linux operating system is a must
  • Ability to adapt and continually learn new technologies
  • Experience with a configuration management tool such as Ansible

Desired Skills:

  • Working experience with tools such as Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, MapReduce, etc.
  • Working experience with container orchestration platforms such as Kubernetes, and experience with Docker
  • Hands-on programming experience in Python or shell scripting
  • Experience in the end-to-end design and build process of near-real-time and batch data pipelines
  • Experience working in an Agile development process and a deep understanding of the various phases of the Software Development Life Cycle
  • Experience using source code and version control systems such as SVN, Git, etc.
  • Self-starter who works with minimal supervision and can collaborate in a team of diverse skill sets
  • Ability to comprehend customer requests and provide the correct solution
  • Strong analytical mind for taking on complicated problems
  • Desire to resolve issues and dig into potential problems


1st shift (United States of America)

Hours Per Week: 

