Capco

Junior Python + DevOps Engineer

Bengaluru, India
Description

 

Job Description: DevOps Engineer (2+ Years of Experience)

Position: Python + DevOps Engineer
Experience: 2+ Years
Location: Bengaluru / Pune

 


Joining Capco means joining an organization that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, is critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table – so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form or you can mention it directly to your recruiter at any stage and they will be happy to help.

ABOUT CAPCO

Capco is a global technology and business consultancy focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry. You will work on engaging projects with some of the world’s largest banks, projects that will transform the financial services industry.



Job Overview:

We are seeking a Python + DevOps Engineer with 3+ years of experience to join our dynamic team. The role will focus on the deployment, integration, and maintenance of BigID across various environments, ensuring compliance with data retention and legal hold requirements. The candidate will also be responsible for API development, database scanning, and developing Python scripts for integration with Google Cloud Platform (GCP) for data storage and further utilization.

 

Must-have skills: Python, DevOps, Linux, CI/CD, Docker, Git/version control.

Good to have: BigID, GCP

Key Responsibilities:

  • Deployment and Integration:
    • Deploy and integrate BigID across multiple environments (Dev, QA, PreProd, Prod).
    • Address issues related to authentication, connectivity, and integration with the existing infrastructure.
    • Ensure successful deployment in both production and development instances by configuring the necessary environments.
  • Data Retention and Legal Hold:
    • Implement BigID’s retention module to manage legal holds and apply retention rules for both structured and unstructured data in compliance with regulatory obligations.
  • API Development:
    • Develop robust APIs for data storage and processing based on project requirements.
  • Database Scanning:
    • Configure BigID to connect with relevant data platforms, including NAS, for performing initial scans.
    • Support the data retention use case by ensuring accurate scanning, reporting, and identification of data subject to retention policies.
  • Python Script Development for GCP Integration:
    • Develop Python scripts to automate data transfer from BigID to GCP, ensuring secure storage and availability for further processing or analytics.
    • Support future use cases by enabling the movement of data from BigID to GCP for various business or regulatory purposes.
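The retention and GCP-transfer responsibilities above can be sketched in a few lines of Python. This is a hypothetical illustration only: the record fields, retention periods, and bucket name are assumptions for the example, not details from BigID's actual API or this role's configuration.

```python
"""Hypothetical sketch of the retention-then-transfer step described above."""
from datetime import date, timedelta

# Assumed retention policy (days); real policies come from legal/compliance.
RETENTION_DAYS = {"structured": 365 * 7, "unstructured": 365 * 3}

def records_to_archive(records, today=None):
    """Return records whose retention window has expired and that are not
    under legal hold, i.e. candidates for transfer to GCP."""
    today = today or date.today()
    out = []
    for rec in records:
        if rec.get("legal_hold"):
            continue  # a legal hold always overrides retention rules
        limit = timedelta(days=RETENTION_DAYS[rec["kind"]])
        if today - rec["created"] > limit:
            out.append(rec)
    return out

def gcs_object_path(rec, bucket="bigid-archive"):
    """Deterministic destination path in GCS, partitioned by year.
    The actual upload would use the google-cloud-storage client."""
    return f"gs://{bucket}/{rec['created'].year}/{rec['source']}/{rec['id']}.json"
```

In a real pipeline the upload itself would be a separate, authenticated call to the GCS client; keeping the selection logic pure, as above, makes it easy to unit-test.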

Required Skills and Experience:

  • Development and Scripting: Strong experience in development using Python, Bash/Shell scripting, and other scripting languages on Linux-based operating systems.
  • Linux Expertise: Proficiency in managing and deploying applications on Linux servers and operating systems.
  • CI/CD Tools: Strong experience with Jenkins, GitLab CI, CircleCI, or similar tools for continuous integration and continuous deployment.
  • Configuration Management: Proficiency with Ansible, Terraform, Chef, or Puppet for automating deployment and infrastructure configuration.
  • Containerization & Orchestration: Experience with Docker, Kubernetes, and Helm for deploying, managing, and scaling containerized applications.
  • Cloud Platforms: Expertise in GCP and other cloud platforms (AWS, Azure) with hands-on experience in deploying, managing, and integrating cloud services.
  • Version Control: Strong understanding and experience with Git or other version control systems.
  • Monitoring & Logging: Familiarity with monitoring and logging tools.
  • Networking and Security: Knowledge of networking, firewall configurations, SSL/TLS, and implementing secure authentication mechanisms.
  • Experience with BigID: Experience with BigID or similar data discovery, privacy, and governance tools.
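The secure-authentication bullet above can be illustrated with a minimal Python sketch using only the standard library. The endpoint URL is a placeholder; the pattern shown (enforcing TLS and attaching a bearer token) is one common approach, not a requirement stated in the posting.

```python
import urllib.request

def authed_request(url, token):
    """Build an HTTPS request carrying a bearer token.
    Refuses plain-HTTP endpoints so credentials never travel unencrypted."""
    if not url.startswith("https://"):
        raise ValueError("refusing non-TLS endpoint")
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    return req

# Example (placeholder URL): authed_request("https://api.example.com/scan", token)
```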

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in DevOps or related roles.
  • Excellent problem-solving and communication skills.
  • Ability to work collaboratively with cross-functional teams.

 

WHY JOIN CAPCO?

You will work on engaging projects with some of the world’s largest banks, projects that will transform the financial services industry.

We offer:

  • A work culture focused on innovation and building lasting value for our clients and employees
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
  • A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
  • A diverse, inclusive, meritocratic culture
  • Enhanced and competitive family-friendly benefits, including maternity / adoption / shared parental leave and paid leave for sickness, pregnancy loss, fertility treatment, menopause and bereavement

NEXT STEPS

If you would like to progress your career with us, please do not hesitate to apply. We look forward to receiving your application.


#LI-Hybrid

#LI-RA1

 
