- Demonstrable experience in the design and/or development of ETL data flows or data pipelines, or with ODI/Oracle BI, SAS, Apache NiFi, Apache Spark, or similar tools
- Understanding of Agile (E2E agility; mindset; business value; openness to change, ...) and Scrum, and the ability to apply them
- You act as a subject-matter expert within the team and coach other team members along the way.
- Knowledge of NoSQL databases, unstructured data, and large data volumes, and how to handle them
- Knowledge of building a data warehouse (DWH) and of data modeling (3NF, Data Vault, dimensional)
- Thorough knowledge of DB programming languages such as PL/SQL
- Experience in tuning large data warehouses
- Proven knowledge of AWS in a data & analytics context is an asset. Relevant knowledge of e.g. Glue, Step Functions, Kinesis, CloudFormation, or Snowflake is an added value. You can substantiate that knowledge with the necessary certifications (for example AWS Practitioner, AWS Architect, AWS Developer, or AWS Data Analytics), relevant experience, or a specific learning path.
- Knowledge of cloud providers and their capabilities in the field of data
- Demonstrable experience of implementation in ODI (Oracle Data Integrator)
- Reporting via OBI - knowing what the possibilities are, what data is needed, and what the limitations of the tools are
- Reporting via MicroStrategy - knowing what the possibilities are, what data is needed, and what the limitations of the tools are
- Knowledge of Apache NiFi
- Understanding of Kafka - knowing the possibilities of this or similar solutions
- Dremio (data-as-a-service platform) - understanding the possibilities of this or similar solutions
- Event-driven architecture - understanding what this means in terms of data
- Knowledge of defining and creating dashboards, reports, and visualization options
- Languages: You speak Dutch and have a good command of English.
- Mission based in Brussels
- Long term mission
- Start: immediately