Bigblue

Senior Data Scientist

Paris, France · Remote · Hybrid
GCP · Azure · Python · SQL
Description
Bigblue is the best fulfillment platform in Europe to run an e-commerce business. We handle hundreds of thousands of parcels every month for hundreds of merchants. For its first four years, our product team focused on building an operations platform, a merchant frontend, and an exceptional buyer experience with a team of full-stack software engineers. In January 2022, we introduced a more structured data practice inside Bigblue and are hiring to strengthen its ranks. We envision these engineers as the data counterparts to our software engineers, with the same goal: solve customer problems.

What data means for Bigblue

Being an operational business at heart, Bigblue has been data-driven from day one: data drives all our decisions. Because of our unique position at the center of our merchants' businesses, and our close connectivity to our fulfillment network, we have access to large amounts of fresh, real-time data. Data tells us how our merchants are doing and how we can help them grow. As our network grows in size and geographic span, the search for optimizations also opens up a whole new class of data problems. Because of this, we expect our data headcount to grow even more in 2024.

Missions

Solve customers' data problems

As a (still early) member of the data practice in the Bigblue product team, you will solve critical problems for our business and our merchants. You'll participate in our product development cycles, focusing on data problems and collaborating with designers, engineers, and stakeholders inside and outside Bigblue. You'll own the entire problem-solution-experimentation lifecycle: building proofs of concept and quickly launching features and experiments in front of merchants and buyers to reach strategic outcomes. Solutions can be anything from algorithms and data analyses to ML models.

Some things you might work on:
• Computing a reliable, real-time delivery ETA to communicate to buyers
• Iterating on our internal logistics algorithms to optimize specific parts of our operations (packaging, warehousing, ...)
• Finding the best and greenest routes in our European carrier network to deliver on time with a minimal carbon footprint

Empower all teams with data

Data is the true differentiator for Bigblue in the long run. We need your help to build out everyday data tools at Bigblue. You'll work with various internal teams across Engineering and Operations to help them become autonomous with their data needs. You will build and own the tooling, pipelines, and canonical datasets that power Bigblue's analytics, intelligence, and external data integrations. In short, you will make the core of Bigblue's datasets reliable and enable the entire team to make data-driven decisions every minute.

Some things you might work on:
• Surface unified data schemas and tables that provide a complete view of the business across our network
• Help operations data analysts set up and evolve their near-real-time monitoring dashboards to track warehouse and carrier operations closely
• Build a centralized experimentation platform to pipeline experiment metrics and compute descriptive statistics

Help build Bigblue's data team

As an early data team member, you will take part in designing and running the data hiring process for both the data team and our distributed team of Analysts. You will have the opportunity to mentor other data people, develop them into our future data leaders, and influence what data will mean for Bigblue in the coming years.

Requirements - must-haves

  • Strong interest in solving business-critical problems
  • Comfortable with ownership and a large bias towards action and experimentation
  • Hands-on experience with modern data stacks, from orchestration to online models (we use Fivetran, BigQuery, dbt)
  • Proficiency in Python and SQL
  • Solid foundations in Probability, Statistics, and ML theory

Requirements - nice-to-haves

  • Experience writing production-grade software in distributed systems
  • Experience with offline model evaluation and online evaluation via A/B testing
  • Good communication skills with diverse audiences (e.g. clients, ops teams)
  • Hands-on experience with cloud technologies such as Amazon Web Services, Google Cloud, or Azure

Interview process

  • [45min] Hiring manager phone screen
  • [1h] Live data problem-solving case study
  • [45min] Data: past project interview

  • Onsite:
    • [45min] Product team fit: chat with 2 Product team members
    • [45min] Founder interview
    • [15min] Conclusion with hiring manager

  • [1-2 days] Reference calls
  • 🎉 Offer

Package

  • Level D-E on our grid, 5-10 years of experience
  • 65 - 95 k€ annual salary
  • 35 - 53 k€ gross value of BSPCE grant
