UKG

Sr Data Engineer

Noida, India
Azure GCP SQL Python JavaScript Spark API Git
Description

Job Summary: The Software Engineer - Data will be responsible for designing, developing, and maintaining robust data pipelines and architectures on Google Cloud Platform. The ideal candidate will have extensive experience with orchestration and data engineering tools as well as cloud data warehouses. They will be adept at working with various data sources, including files, FTP, and third-party APIs such as Salesforce and Qualtrics, as well as critical systems like ERPs. The role requires a deep understanding of medallion architecture and star schema data modeling to transform raw data into a cohesive EDW structure. The candidate should be comfortable working in an agile/scrum environment, utilizing standardized frameworks, Git repositories, and CI/CD processes for deployment.
Key Responsibilities:
• Software Development: Write clean, maintainable, and efficient code for various software applications and systems. Design, develop, and maintain scalable data pipelines using modern cloud data applications such as Apache Airflow, Dataflow, Google BigQuery, Azure Databricks, and Delta Lake (Lakehouse). Ingest data from various sources, including files, FTP, third-party APIs (e.g., Salesforce, Qualtrics), and ERPs. Implement medallion architecture in data lakes (Azure Data Lake & Google Cloud Storage) and star schema data modeling to transform and integrate data into a unified EDW structure, utilizing advanced SQL and Python (PySpark).
• Collaboration: Work with cross-functional teams, business data analysts, product owners, and business users to understand business requirements and translate them into technical solutions.
• Governance: Ensure data quality, integrity, and security across all data pipelines and storage solutions.
• Technical Leadership: Contribute to the design, development, and deployment of complex software applications and systems, ensuring they meet high standards of quality and performance.  
• Project Management: Manage execution and delivery of features and projects, negotiating project priorities and deadlines to ensure successful, timely, high-quality completion. Participate in agile/scrum ceremonies, contributing to sprint planning, stand-ups, and retrospectives.
• Architectural Design: Participate in design reviews with peers and stakeholders, and contribute to the architectural design of new features and systems, ensuring scalability, reliability, and maintainability.
• Code Review: Diligently review code developed by other developers, provide feedback, and maintain a high bar of technical excellence, ensuring code adheres to industry-standard best practices: coding guidelines; elegant, efficient, and maintainable code; observability built in from the ground up; unit tests; etc.
• Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as a guide.
• Service Health and Quality: Maintain the health and quality of services, proactively identifying and resolving issues and incidents. Utilize service health indicators and telemetry to drive action, providing recommendations to optimize performance. Conduct thorough root cause analysis and drive the implementation of measures to prevent future recurrences. Maintain and support the data fabric / data platform that data analysts and scientists leverage for end-to-end data and information.
• DevOps Model: Work in a DevOps model, taking ownership from gathering requirements with product management through designing, developing, testing, deploying, and maintaining the software in production.
• Documentation: Properly document new features, enhancements, and fixes to the product, contributing to training materials.
• Develop data products, including web applications, API endpoints, and data visualizations, that deliver information and insights to business users.  
• Develop, maintain, and support the front-end Data Marketplace.
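The medallion flow described above (raw bronze data cleaned into silver, then shaped into gold star-schema tables) can be sketched in miniature. This is a simplified illustration only: plain Python dictionaries stand in for PySpark DataFrames, and all table and field names (orders, customers, amounts) are hypothetical, not taken from the posting.

```python
# Minimal medallion-style sketch: bronze (raw) -> silver (cleaned) ->
# gold (star-schema dimension + fact rows). Field names are illustrative.

def to_silver(bronze_rows):
    """Clean raw records: drop rows missing keys, normalize types."""
    silver = []
    for row in bronze_rows:
        if not row.get("order_id") or not row.get("customer"):
            continue  # a real pipeline would quarantine these for review
        silver.append({
            "order_id": str(row["order_id"]),
            "customer": row["customer"].strip().title(),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Build a star-schema shape: a customer dimension plus fact rows
    that reference it by surrogate key."""
    dim_customer, facts = {}, []
    for row in silver_rows:
        key = dim_customer.setdefault(row["customer"], len(dim_customer) + 1)
        facts.append({"order_id": row["order_id"],
                      "customer_key": key,
                      "amount": row["amount"]})
    return dim_customer, facts

bronze = [
    {"order_id": 1, "customer": " alice ", "amount": "10.5"},
    {"order_id": 2, "customer": "bob", "amount": 4},
    {"order_id": None, "customer": "eve"},           # dropped in silver
    {"order_id": 3, "customer": " alice ", "amount": 2},
]
dim, facts = to_gold(to_silver(bronze))
print(dim)    # {'Alice': 1, 'Bob': 2}
```

In a PySpark implementation the same layering would typically use DataFrame transformations (filters, casts, joins against dimension tables) persisted as Delta Lake tables at each layer.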


Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 6+ years of experience in data engineering, with a focus on Azure and/or Google cloud data technologies.
• Proficiency in cloud data warehousing and orchestration tools such as Azure Data Factory, Apache Airflow, Dataflow, Google BigQuery, Azure Databricks, and Delta Lake (Lakehouse).
• Strong knowledge of data ingestion techniques from various sources, including files, FTP, and third-party APIs.
• Experience with medallion architecture and star schema data modeling.
• Proven ability to monitor, troubleshoot, and tune data pipelines for optimal performance.
• Advanced skills in SQL, Python (PySpark), and JavaScript.
• Familiarity with agile/scrum methodologies and standardized frameworks.
• Experience with Git repositories and CI/CD processes.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
• Self-starter with a proactive approach to learning and development.
Preferred Qualifications:
• Experience in Apache Spark frameworks
• Experience with ML engineering concepts and with deploying and serving ML assets utilizing tools such as Vertex AI, Databricks ML (MLflow), and Kubeflow
• Experience with test automation frameworks and tools
