The Challenge 

We are looking for a full-time (32–40 hours) Data Engineer. As a Data Engineer you will contribute to the development and maintenance of our modern data platform. 

Within the BI & Data Engineering team, part of the Group IT department, our Data Engineers focus mainly on gathering, processing and storing data that is critical to the success of the company and to meeting the information needs of its stakeholders. The Azure Data Platform you will be working on is a critical component of the Data Excellence strategy. It will be your responsibility to extract data from various internal and external sources, then connect, combine and model it to make it available to different departments. 

A growing part of the IT landscape is developed in Microsoft Azure. Azure Data Factory is used for orchestration and ETL, and some ETL is implemented in Databricks. We use Power BI for dashboarding on top of the Azure Lakehouse data. The Lakehouse has three layers on which you will be modelling. You will work in a multi-disciplinary team within a network organization and collaborate closely with the DevOps team to implement best practices for CI/CD and Infrastructure as Code. 
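To give a flavour of this layered modelling work: in a Lakehouse, raw ("bronze") data is typically cleaned and deduplicated into a "silver" layer before being modelled for reporting. The following is a minimal, purely illustrative Python sketch of such a cleaning step; all field names and rules are hypothetical and not part of this vacancy.

```python
# Hypothetical "bronze -> silver" cleaning step, illustrating the kind of
# transformation done between Lakehouse layers. Names are invented examples.

def to_silver(bronze_rows):
    """Drop malformed records, normalise fields, keep the latest row per key."""
    latest = {}
    for row in bronze_rows:
        # Skip records that lack a business key.
        if not row.get("customer_id"):
            continue
        cleaned = {
            "customer_id": row["customer_id"],
            "email": row.get("email", "").strip().lower(),
            "updated_at": row.get("updated_at", ""),
        }
        key = cleaned["customer_id"]
        # Keep only the most recent record per key (ISO dates sort lexically).
        if key not in latest or cleaned["updated_at"] > latest[key]["updated_at"]:
            latest[key] = cleaned
    return list(latest.values())

bronze = [
    {"customer_id": "c1", "email": " Ann@Example.com ", "updated_at": "2024-01-01"},
    {"customer_id": "c1", "email": "ann@example.com", "updated_at": "2024-03-01"},
    {"customer_id": None, "email": "broken@example.com", "updated_at": "2024-02-01"},
]
silver = to_silver(bronze)
```

In practice this logic would live in a dbt model or a Databricks notebook rather than plain Python, but the shape of the work is the same: standardise, deduplicate, and hand clean data to the next layer.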

Responsibilities: 

  • Maintain existing processes. 
  • Execute data transformations and modelling using dbt (Data Build Tool) to structure the data. 
  • Write efficient code in Python. 
  • Connect new data sources. 
  • Design, implement, and maintain scalable pipelines. 
  • Collaborate closely with Data Scientists to industrialize and deploy their algorithms and models into production.
  • Ensure that data is collected from many different structured and unstructured sources and transported to the platform in the right format, in line with data architecture principles. 

In this role, you work closely with other Data Engineers (four-eyes principle) and report to the IT Lead BI & Data Engineering. 

 

What can we find on your CV? 

  • At least five years of relevant working experience; 
  • Must-have tooling experience: Databricks (notebooks and clusters), dbt (Data Build Tool) for transformation work, and Azure Data Factory; 
  • Experience working in an Agile/Scrum environment; 
  • At least a Bachelor's degree, preferably in IT; 
  • Knowledge of CI/CD. 
      

This is you! 

  • You have strong communication skills and speak English fluently; 
  • You are a team player, are enthusiastic and know how to take ownership; 
  • You have a passion for data and want to get the most out of it; 
  • You are good at project management (limited to the smaller data integration/modelling projects you will be running). 

 
Our offer 

A challenging full-time job within an enthusiastic and committed team where cooperation and results come first. 

You will receive:

  • An extensive introduction program; 
  • A gross monthly salary of €4,800 to €6,300, depending on experience; 
  • Approx. 36 days off (23 holidays & 13 ATV days, based on a full-time job); 
  • A laptop and an iPhone; 
  • Good secondary employment conditions; 
  • Extensive development opportunities through the Academy. 

 

Please upload your CV in the fields below. We will be in contact very soon!