Equifax

Big Data Engineer - Intermediate

Costa Rica
Skills: GCP, API, MySQL, Microservices, Java, SQL, Spark, Python, Git, AWS, Hadoop, Oracle, Scala, Azure, PostgreSQL
Description

What you’ll do

  • Perform general application development activities, including unit testing, code deployment to the development environment, and technical documentation. Work on one or more projects, contributing to unfamiliar code written by team members.

  • Participate in the estimation process, use case specification, reviews of test plans and test cases, requirements gathering, and project planning. Diagnose and resolve performance issues.

  • Document code and processes so that any other developer can dive in with minimal effort.

  • Develop and operate high-scale applications from the backend to the UI layer, focusing on operational excellence, security, and scalability.

  • Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.

  • Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Work within a tight-knit engineering team that employs agile software development practices.

  • Triage product or system issues and debug, track, and resolve them by analyzing their sources and their impact on network or service operations and quality.

  • Write, debug, and troubleshoot code in mainstream open-source technologies. Lead the effort on sprint deliverables and solve problems of medium complexity.

What experience you need

  • Bachelor's degree in Computer Science, Systems Engineering, or equivalent experience

  • 3+ years of experience in Data Engineering using programming languages such as Python, Java, or Scala; SQL is a must

  • 3+ years of experience with ETL (Extract, Transform, Load) procedures

  • 3+ years of experience with Big Data frameworks such as Apache Spark, Apache Beam, or equivalent (a minimal ETL sketch follows this list)

  • 2+ years of experience with workflow management technologies such as Apache Airflow or equivalent (see the DAG sketch after this list)

  • 1+ year of experience with software build tools such as Maven or Gradle

  • 1+ year of experience with Cloud technology: GCP, AWS, or Azure

  • English proficiency B2 or above
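
To make the ETL and Spark requirements concrete, here is a minimal sketch of a Spark batch job in Python; the bucket paths, column names, and schema are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read a raw CSV feed (hypothetical path).
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/events.csv")

# Transform: drop rows without an ID and derive typed columns.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream consumers (hypothetical path).
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/curated/events/"
)

spark.stop()
```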
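
And a minimal sketch of the kind of workflow orchestration the Airflow bullet refers to, assuming Airflow 2.x; the DAG id, schedule, and task bodies are illustrative placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from the source system")   # placeholder task body

def load():
    print("write to the warehouse")        # placeholder task body

with DAG(
    dag_id="example_daily_etl",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                     # `schedule_interval` before Airflow 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task              # run extract, then load
```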

What could set you apart

  • Data Engineering using GCP technologies such as BigQuery, Dataproc, Dataflow, and Composer (see the BigQuery sketch after this list)

  • Experience with other Big Data technologies such as Hadoop, Hive, or equivalent

  • Experience using encryption mechanisms for sensitive data in transit and at rest

  • Working with multiple data sources and structures, such as APIs, databases, JSON, CSV, XML, and text files

  • Relational databases (e.g. Oracle, PostgreSQL, SQL Server, MySQL)

  • Source control management systems (e.g., SVN, Git, GitHub)

  • Agile environments (e.g. Scrum, XP)

  • Atlassian tooling (e.g., Jira, Confluence) and GitHub

  • Automated Testing: JUnit, Selenium, LoadRunner, SoapUI

  • Cloud certification strongly preferred
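
As a concrete illustration of the GCP bullet above, a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # credentials are picked up from the environment

# Hypothetical table; replace with a real project.dataset.table reference.
query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.events)
```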

#LI-DU1
#LI-Hybrid

Primary Location:

CRI-Sabana

Function:

Tech Dev and Client Services

Schedule:

Full time
