Ardoq

Senior Data Engineer

Remote / Oslo, Norway
Python, TypeScript, Bash, PowerShell, Terraform, SQL
Description

Do you want to be a Senior Data Engineer for an exciting SaaS scale-up company?

Are you curious about what it's like to work for a growth company that's helping organizations make reliable, data-driven decisions faster? Then you could be the right candidate for Ardoq.

Ardoq is one of the fastest-growing European SaaS companies, backed by some of the most renowned technology investors, including EQT and One Peak. In 2022, we raised $125M in our Series D funding round, and in 2023, we were named a Leader in the Gartner® Magic Quadrant™ for Enterprise Architecture Tools for the third year running. Our cloud-native platform provides businesses with the insights they need to plan and execute change across their people, projects, processes, applications, infrastructure, and business capabilities. We empower businesses to steer their digital transformation and strategic change initiatives with clarity and confidence.

At Ardoq, we are committed to building a diverse and inclusive workforce, which has helped make Ardoq the Bold, Caring, and Driven company it is today. We pride ourselves on being an equal opportunity employer.

We have an award-winning platform and a reputation for our dedication to company culture. Ardoqians come from over 30 countries, sharing English as our working language. Headquartered in Oslo, we also have offices in Copenhagen, London, and New York.

Are we a good fit for your next career step? Apply today, get to know us, and find out.

 

Job Title: Senior Data Engineer

Reports to: VP of Engineering

We are committed to our future growth and to building a global team. Today, we're looking for a Senior Data Engineer to join us in our Oslo office.

Objectives for the Role

  • Manage and refine our core data infrastructure technologies for the extraction, transformation, and usage of data.
  • Consolidate data transformation processes into a cohesive, rationalized framework, eliminating unnecessary distributed repositories and tools.
  • Proactively work to implement missing data quality tests to monitor and maintain the health of our data.

Goals/Performance Metrics

  • Reduction in the overall cost of the tech stack and its maintenance.
  • Familiarity with our tools, technologies, and data platform architecture; high-level understanding of the product.
  • Number of data errors and discrepancies tracked, and data quality incidents reported and resolved.
  • Reduction in the number of different tools and repositories used for data transformation.
  • Percentage of data pipelines covered by automated data quality tests.
  • Implementation and coverage of monitoring tools across data pipelines.

Responsibilities

  • Work as part of the data team managing the data platform. Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements.
  • Design, build, and maintain scalable ETL (Extract, Transform, Load) pipelines (see the illustrative sketch after this list).
  • Implement and monitor data quality checks to ensure data integrity. Identify and resolve data discrepancies and anomalies.
  • Continuously improve the quality of the codebase as it develops. Document data workflows, schemas, and processes for reference and knowledge sharing.
  • Design and optimize database schemas. Manage and tune databases for performance and scalability.
  • Monitor data pipelines and systems for performance and reliability. Troubleshoot and resolve issues promptly to minimize downtime.
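
To give a concrete flavor of the ETL and data quality work described above, here is a minimal, purely illustrative sketch using Apache Airflow's TaskFlow API (one of the orchestration tools named in the skills below). It assumes Airflow 2.4+; the task names, data, and checks are hypothetical and not taken from Ardoq's actual pipelines.

```python
# Illustrative only: a tiny daily pipeline with an explicit data quality gate.
from datetime import datetime
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl_with_quality_check():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling raw records from a source system.
        return [{"account_id": 1, "mrr": 120.0}, {"account_id": 2, "mrr": None}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop incomplete records; real logic would live in a shared framework.
        return [row for row in rows if row["mrr"] is not None]

    @task
    def quality_check(rows: list[dict]) -> None:
        # Fail the run loudly if the transform produced no usable rows, so the
        # incident surfaces in monitoring instead of propagating silently.
        if not rows:
            raise ValueError("Data quality check failed: no rows after transform")

    quality_check(transform(extract()))


example_etl_with_quality_check()
```

In practice, checks like this would be consolidated into whichever testing and orchestration framework the team standardizes on, rather than written ad hoc per pipeline.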

Required Skills and Experiences

  • Proven experience in data engineering or a related field.
  • Strong understanding of data pipeline development and data warehouse infrastructure.
  • Proficiency in data modeling, SQL, and Python. Experience with TypeScript is a plus.
  • Proficiency in scripting for automation (e.g., Bash, PowerShell).
  • Experience with workflow orchestration tools like Apache Airflow or Prefect, and infrastructure as code like Terraform.
  • Strong analytical skills, attention to detail, and a proactive approach to code quality.
  • In-depth knowledge of relational databases and different types of indexes (e.g., B-tree, hash, full-text) and their appropriate use cases.
  • Proficiency with monitoring tools such as Grafana and with log management tools. Experience setting up and interpreting performance metrics and alerts to monitor the health and performance of data pipelines.
  • Strong communication skills, ability to take initiative, and a collaborative mindset.

The benefits you'll love:

  • Be a part of one of the fastest-growing B2B SaaS companies from the Nordics
  • Flexible and hybrid working wherever possible to support your work-life balance
  • Retirement and insurance benefits program, including travel, health, disability, and life insurance, to secure your future
  • Parental leave scheme that covers regular pay above 6G (G = the National Insurance Scheme basic amount)
  • Centrally located office set up for all Ardoqians to succeed
  • Employee stock option program
  • Personal learning budget for professional growth after six months of employment

At Ardoq, you will work with bold, caring, and driven people, bridging business and IT. So come build the future with us!

