Upstox

Senior Data Engineer / Lead Data Engineer

Bengaluru, India
Skills: Python, Scala, Hadoop, Streaming, DynamoDB, Terraform, Spark, AWS, SQL
Description
The Upstox Story: Upstox is one of India's leading fintech companies, with a mission to simplify trading & investing and make them easily accessible to the masses. We aim to enable everyone, from new investors to seasoned traders, to invest across multiple categories with our state-of-the-art trade & investment platform and commission-free pricing. We offer numerous asset categories to invest in, such as stocks, IPOs, mutual funds, and more.
By focusing on our customers' needs and equipping them with personalised yet powerful tools, we grew our customer base by a steep 800%, from 25 thousand in 2017 to 2 lakh in 2019. After a further 1,500% growth in 2020, over 10 million customers now trust us with their investment decisions, setting us on course to become an industry leader in the country.
Our mission is simple: to break down the complexities of investing and make it effortless, accessible, and affordable for the masses to adopt. This key principle, infused with intuitive design and leading-edge technology, will help us empower every Indian to take control of their investments.
RKSV Securities was founded by Ravi Kumar and Shrinivas Viswanath in 2009, and Kavitha Subramanian joined as the third co-founder in 2016. Backed by Ratan Tata, Upstox raised $4 million in Series A funding led by Kalaari Capital in early 2016. The Series B round scaled multifold, with a $25 million investment from the US-based investment firm Tiger Global Management in September 2019.
We have a team of highly skilled technology and finance professionals, and we are currently looking for motivated domain experts to join our high-energy team.

Here is what you need to know about this role: Senior Data Engineer / Lead Data Engineer

Who you will work with: The Data Engineering team at Upstox is responsible for architecting, developing and maintaining a Lakehouse (Data Lake + Data Warehouse) that serves as a single source of truth for organization-wide data. The team is also responsible for governing data assets, generating business & customer insights, and securely & optimally exposing that data to downstream systems, services and end users.

What you will do: 
1. Create and maintain scalable big data ETL pipelines that feed organization-wide data into the Upstox Data Platform (UDP); a minimal illustrative sketch of such a pipeline follows this list.
2. Create and maintain modular, scalable big data processors that can be leveraged to derive business & customer insights, operational efficiency measures and other key business performance metrics.
3. Create and maintain scalable connectors that expose the data securely for consumption by downstream systems and services in near real time.
4. Collaborate with the DevOps & Infra teams to build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources.
5. Collaborate with the DevOps team to monitor and maintain the data platform components and ensure internal uptime SLAs are met.
6. Collaborate with the DBA & BI teams to set up data quality control processes.
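
For illustration only, here is a minimal PySpark sketch of the kind of batch ETL pipeline described in point 1, assuming a Spark environment with S3 access. The "trades" dataset, column names and bucket paths are hypothetical placeholders, not Upstox's actual schema or platform internals.

```python
# Minimal batch ETL sketch (illustrative only): extract raw events, apply a few
# transformations, and load the result into a curated, partitioned dataset.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("udp-trades-etl-sketch").getOrCreate()

# Extract: read raw JSON events from a hypothetical landing zone.
raw = spark.read.json("s3a://example-landing-zone/trades/")

# Transform: de-duplicate, derive a partition column, and drop invalid rows.
curated = (
    raw.dropDuplicates(["trade_id"])
       .withColumn("trade_date", F.to_date("executed_at"))
       .filter(F.col("quantity") > 0)
)

# Load: write the curated dataset partitioned by date. In production this layer
# would typically use an open table format (Hudi / Delta / Iceberg) so that it
# can be queried via engines such as AWS Athena.
(curated.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3a://example-curated-zone/trades/"))
```

In practice, a job like this would usually be scheduled and monitored through an orchestrator such as Apache Airflow, which ties this responsibility back to the tooling listed in the requirements below.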

You should apply if you have: 
1. Must have: Hands-on working experience with Python / Scala, Spark (or similar frameworks), Airflow, Apache Hudi / Delta / Iceberg (open table formats), AWS Athena, big data ETL pipelines, Hadoop (or similar distributed systems), advanced SQL and SQL query tuning.
2. Good to have: Working knowledge of streaming frameworks (Spark Streaming / Apache Flink or similar), workflow management tools / platforms (Apache Airflow / NiFi or similar solutions), DBT (data build tool), dimensional data modelling and cloud-native solutions (preferably on AWS).
3. Brownie points: Understanding of AWS services (Redshift, DynamoDB, Lambda, Glue, Athena, Lake Formation, IAM, SQS & SNS), experience creating CloudFormation / Terraform templates, and an understanding of DevOps & SRE.
4. At least 7 years of experience in a Data Engineering role, with hands-on exposure to large volumes of data.
5. Bachelor's / Master's degree in Computer Science or an equivalent field.
6. Resilience and determination to deliver to high standards.
7. A well-organized approach to problem solving and strong attention to detail.
8. A strong desire to work in a fast-paced dynamic environment.
9. Technical expertise to deliver scalable, enterprise-grade solutions and own them end to end.
10. Strong communication skills and effective collaboration with multiple stakeholders.

Location: Bengaluru
