Coveo

Data Analytics Developer

Quebec, Canada
SQL, Python, Java
Description

We are looking for you!

Coveo is the relevance company: we enable businesses to provide highly personalized interactions for their customers and users. Every user's journey can be tailored and made relevant. In the modern, experience-driven economy, this gives companies the competitive edge.

For customers to understand the value and performance of their solutions, they need meaningful, accurate, up-to-date data points, presented in the right context, that show how Coveo is helping them achieve their business objectives.

To achieve this, our commerce team delivers end-to-end features across the Coveo platform. The team derives valuable metrics from usage and performance data at scale and integrates them deeply into the Coveo Merchandising Hub, both in the UI and through other channels (e.g. exports and BI integrations).

How you can help:

Our platform is built on Snowflake, the cloud-based data-warehousing solution. You would join us in building new features, standardizing data formats, optimizing operations, preventing failures, and making sure we can handle the data volumes. Specifically, you would help us transform event data into a normalized, consistent Commerce Data Model that lets our customers perform their own analysis and reporting.
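
To give a sense of the kind of transformation involved, here is a minimal Snowflake SQL sketch. The table and column names (raw_events, commerce_events, payload, and so on) are hypothetical illustrations, not Coveo's actual schema; the real Commerce Data Model is richer and evolves with the product.

    -- Hypothetical sketch: flatten semi-structured events (a VARIANT column named
    -- "payload") into a normalized commerce fact table that downstream reporting can query.
    CREATE TABLE IF NOT EXISTS commerce_events (
        event_id    STRING,
        event_time  TIMESTAMP_NTZ,
        event_type  STRING,          -- e.g. 'click', 'purchase', 'search'
        visitor_id  STRING,
        product_id  STRING,
        revenue     NUMBER(12, 2)
    );

    INSERT INTO commerce_events
    SELECT
        payload:eventId::STRING,
        payload:ts::TIMESTAMP_NTZ,
        LOWER(payload:type::STRING),
        payload:visitorId::STRING,
        payload:productId::STRING,
        COALESCE(payload:revenue::NUMBER(12, 2), 0)
    FROM raw_events
    WHERE payload:type IS NOT NULL;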

What would your typical Coveo day look like?

Every day at Coveo is different and exciting. But here are some examples of what you might encounter on a typical day:

  • Work with the team to define the data format needed to implement a new data pipeline that gathers, analyzes, and learns from all the interactions happening in the clients’ applications.
    • You’ll be estimating a load of several million daily events!
  • Set up pre-calculations in Snowflake on a big dataset to optimize the load times of the commerce dashboards (a rough sketch of this kind of pre-aggregation follows this list).
  • Watch the highlights of Snowflake’s Data Cloud Summit and review with the Product Manager how to use ML models on their platform.
  • A member of the Usage Analytics team has a question about extracting the right columns from a dataset she needs for her analysis. You take ten minutes to discuss it with her, then get back to your code.
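
One common way to approach that kind of pre-calculation is to roll metrics up into a small summary table ahead of time, so dashboards scan a few thousand rows instead of millions of raw events. A minimal sketch, again with hypothetical table names (commerce_events, daily_product_metrics) rather than Coveo's real model:

    -- Hypothetical pre-aggregation: daily per-product metrics so the commerce
    -- dashboards query this summary table instead of the raw event data.
    CREATE OR REPLACE TABLE daily_product_metrics AS
    SELECT
        DATE_TRUNC('day', event_time)     AS event_day,
        product_id,
        COUNT_IF(event_type = 'click')    AS clicks,
        COUNT_IF(event_type = 'purchase') AS purchases,
        SUM(revenue)                      AS revenue
    FROM commerce_events
    GROUP BY 1, 2;

In practice a refresh like this would be scheduled, for example with a Snowflake task or an orchestrator such as Prefect, so the summary stays close to real time.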

What you bring:

  • 3-4 years of working experience with SQL, ETL, and data warehousing
  • Hands-on experience with cloud technologies

It will be a plus if you know about:

  • dbt
  • Prefect
  • At least one programming language such as Python or Java

And, while on the job, you will learn more about:

  • Data Lake Architecture
  • Best practices in CI/CD and DevSecOps

If you’re curious about our work, find more here:

You want to take on the challenge?

Send us your CV; we want to get to know you!
You don’t need to check every single box; passion goes a long way and we appreciate that skillsets are transferable.

Join the #Coveolife

We encourage all qualified candidates to apply regardless of, for example, age, gender, disability, gaps in CV, national or ethnic background. We know that applying for a new role is a lot of work and we really appreciate your time.
