Fidelity

Principal Software Engineer/Developer

Durham, NC US
PostgreSQL Docker Hadoop Spark AWS SQL Python Java Oracle
Position Description:

 

Designs and delivers Data Lakes and Warehouses using Snowflake, Amazon Web Services (AWS), Airflow, Oracle, Teradata, Postgres, and SQL Server. Integrates data using Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) development techniques and programs. Provides long-term solutions for data analytics platforms using Python and Java. Uses business knowledge to translate the vision for divisional initiatives into business solutions by developing complex or multiple software applications and conducting studies of alternatives. Analyzes and recommends changes in project development policies, procedures, standards, and strategies to development experts and management.

 

Primary Responsibilities: 

 

  • Participates in architecture design teams. 

  • Defines and implements application-level architecture. 

  • Develops applications on complex projects, components, and subsystems for the division. 

  • Recommends development testing tools and methodologies and reviews and validates test plans. 

  • Responsible for QA readiness of software deliverables. 

  • Develops comprehensive documentation for multiple applications or subsystems. 

  • Establishes full project life cycle plans for complex projects across multiple platforms. 

  • Responsible for meeting project goals on time and on budget. 

  • Advises on risk assessment and risk management strategies for projects. 

  • Plans and coordinates project schedules and assignments for multiple projects. 

  • Acts as a primary liaison for business units to resolve various project/technology issues. 

  • Provides technology solutions to daily issues and technical evaluation estimates on technology initiatives. 

  • Advises senior management on technical strategy. 

  • Mentors junior team members. 

  • Performs independent, complex technical and functional analysis for multiple projects supporting several divisional initiatives. 

  • Develops original and creative technical solutions to ongoing development efforts. 

 

Education and Experience: 

 

Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and five (5) years of experience as a Principal Software Engineer/Developer (or a closely related occupation) building complex data lake platforms to ingest data from multiple sources for analysis using Ingestion Frameworks.  

 

Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal Software Engineer/Developer (or a closely related occupation) building complex data lake platforms to ingest data from multiple sources for analysis using Ingestion Frameworks. 

 

Skills and Knowledge: 

 

Candidate must also possess: 

 

  • Demonstrated Expertise (“DE”) collaborating across enterprises to streamline data classification (consistent usage and definitions) for investment and retirement platforms (Defined Contributions/Defined Benefits) using Collibra and Alation within a financial or healthcare services domain. 

  • DE coordinating data transport for Cloud-native applications using Data Movement Services (DMS); performing cutover and gap analysis to define Role-Based Access Control (RBAC) models according to compliance and business-governance needs using Identity and Access Management (IAM) roles; building data platforms using AWS services, Airflow, and Data Ingestion Frameworks to ingest data into Snowflake; and migrating on-premises data analytics or reporting and insights platforms to the Cloud using Snowflake, Datadog, Docker, and AWS. 

  • DE building data warehouses and developing Big Data Hadoop solutions using Snowflake, Informatica, Airflow, Control-M, UNIX, Spark SQL, and Python. 

  • DE analyzing, profiling, mining, extracting, and cleansing data in large-scale data warehouses using Snowflake, Postgres, Netezza, AWS, SQL Server, and Oracle; and generating visual insights for business and end users by creating data models and data structures using Oracle Business Intelligence Enterprise Edition (OBIEE), BI Publisher, and Tableau. 

#PE1M2 

 

Certifications:

Category: Information Technology

Fidelity’s working model blends the best of working offsite with maximizing time together in person to meet associate and business needs. Currently, most hybrid roles require associates to work onsite all business days of one assigned week per four-week period (beginning in September 2024, the requirement will be two full assigned weeks). 


