Rekor Systems, Inc. (Rekor.ai) (NASDAQ: REKR) is a trusted global authority on Roadside Intelligence, providing innovative solutions with the mission to make the world safer, smarter, and more efficient. Rekor products combine IoT technology, connected vehicle telemetry data, computer vision data, and machine learning to enable the future of Public Safety, Urban Mobility, and Traffic Management. With our disruptive technology and integrated solutions, we deliver a transformative, global impact for customers and citizens every day.
Data Engineer
As a Data Engineer at Rekor, you will design, implement, and support complex features with a focus on performance, clean code, and high-quality delivery. This position is part of the newly formed Global Data and Analytics Organization, a diverse and growing team of talented scientists, analysts, and engineers who keep our data flowing from connected vehicles and AI-enabled Edge Cameras, and who serve up insights and predictions directly into our portfolio of customer-facing products.
The ideal candidate is highly motivated and comfortable with the rapidly changing nature of a startup environment: someone who can move relentlessly forward amid uncertainty and raises a hand to do what's needed. We're looking for someone who enjoys teamwork but also knows how to work independently.
Role and Responsibilities:
- Take ownership of our streaming data pipelines, which handle billions of events daily. Your role involves processing this data in real time, ensuring the seamless flow and timely generation of valuable insights for our customers and contributing to the dynamic, responsive nature of our data-driven services.
- You will play a key role in designing and implementing feature engineering batch jobs, processing vast volumes of data. Your focus will be on serving our research, analytics, models, and Business Intelligence functions, leveraging cutting-edge technologies to ensure the efficiency and excellence of our data processing workflows.
- You will be instrumental in designing and implementing the infrastructure for data pipelines that efficiently handle a diverse range of events from tens of external feeds. These feeds are critical sources of relevant data, forming the core foundation of our platform, and your role will ensure seamless integration and optimal performance.
- Collaborate with our Data Science team to transition machine learning models from the research phase, taking a lead role in designing and implementing robust, scalable solutions for real-time and batch mode model serving in production. Your responsibilities will include ensuring high scalability, fault tolerance, and overall reliability of the deployed models.
- Oversee and optimize the data lake and data warehouse infrastructure to better support our algorithm team's research, analytics, machine learning models, and Business Intelligence initiatives. Your role will be pivotal in ensuring the seamless flow and accessibility of data critical to the success of our diverse data-driven functions.
Requirements:
- 3+ years of proven backend development experience in a JVM environment (Kotlin an advantage)
- Experience in the data field
- Experience with Big Data and streaming technologies: Spark, Hadoop, EMR, Flink, Kafka, Kinesis, Firehose
- Experience with AWS data technologies: Glue, Athena, Data Pipeline, Lambda, Step Functions
- Proficient in business process analysis, data modelling, process flow, functional architecture documentation, and agile methodologies
- Ability to work across regions and cultures, collaborating effectively across business units and geography.
Advantages:
- Experience working with data workflow (ETL) tools and platforms (Glue, StreamSets, Airflow, Matillion, Informatica, Talend, etc.)
- Experience with a variety of database, warehouse, and lakehouse technologies (PostgreSQL, MySQL, S3/Athena, Redshift, Snowflake, etc.)
- Experience with NoSQL database technologies (DynamoDB, Redis, Elastic, DocumentDB, etc.)
- Experience with containerization and container orchestration technologies, like Docker, Kubernetes, and Argo
- Experience with Master Data Management, Data Catalogues, and Data Governance
- Broad and deep understanding of data engineering lifecycles and enablement
Education:
- BS/MS/PhD in Computer Science, Engineering, or an equivalent technical field
Job Location:
Tel-Aviv, Israel