afarax is looking for a freelance Databricks Senior Data Engineer. We need you!
The project:
Our client in the insurance sector is seeking an experienced Databricks Data Engineer to strengthen their team.
Objective of the job:
Databricks Engineering & Implementation
- Align with the data architect on the target architecture and roadmap.
- Lead hands-on implementation of enterprise-grade solutions on the Databricks platform.
- Design and build robust data pipelines using Databricks notebooks, workflows, and Delta architecture.
- Implement scalable ingestion, transformation, and serving frameworks.
- Apply performance tuning and cost optimization techniques across Databricks workloads.
- Ensure reliable CI/CD and deployment practices for Databricks assets.
Lakehouse Migration & Modernization
- Support the migration from legacy Azure SQL / Data Vault platforms to a Lakehouse architecture.
- Translate existing data structures into efficient Delta-based models.
- Help modernize batch workloads and prepare for streaming workloads where relevant.
- Contribute pragmatic technical decisions during the transformation.
Storage & Data Engineering
- Implement data solutions using:
  - Azure Data Lake Storage (ADLS)
  - Delta Lake / Delta Tables
  - Medallion architecture principles (Bronze / Silver / Gold)
- Build reusable ingestion and transformation patterns.
Orchestration & Integration
- Use Azure Data Factory (ADF) for orchestration and integration with enterprise systems.
- Design maintainable scheduling, dependency, and monitoring patterns.
Governance & Security
- Guide implementation of security best practices on Databricks.
- Support setup and optimization of Unity Catalog structures, permissions, and governance models.
- Ensure alignment with enterprise standards for access control and compliance.
Technical Leadership
- Act as the senior technical reference for internal developers and engineering teams.
- Coach and mentor existing teams on Databricks engineering best practices.
- Promote reusable standards, coding discipline, and knowledge sharing.
- Collaborate effectively with architects, analysts, BI teams, and platform teams.
Reporting & Analytics Enablement
- Support integration with Power BI using Databricks data sources.
- Help optimize data consumption patterns for reporting and analytics users.
Is this you?
- Fluent in English
- Good knowledge of French and Dutch
- Experience with Data Vault migration scenarios.
- Experience with Power BI integration.
- Experience with Infrastructure as Code (Terraform, Bicep, ARM).
- Experience with DevOps pipelines (Azure DevOps / GitHub Actions).
- Exposure to streaming architectures or event-driven pipelines.
- Proven real-world Databricks implementation experience in enterprise environments.
- Strong hands-on expertise with:
  - Databricks Workspace
  - Spark / PySpark / SQL
  - Delta Lake / Delta Tables
  - Unity Catalog
  - Databricks Workflows
  - Cluster pools and serverless compute
- Strong experience with Azure Data Lake Storage (ADLS).
- Strong experience with Azure Data Factory (ADF).
- Strong understanding of performance optimization and Databricks pricing/cost management.
- Experience migrating legacy data platforms to modern cloud data platforms.
- Ability to define and implement engineering best practices.
- Ability to lead teams technically while remaining hands-on.
How does afarax support you?
- You benefit from our extensive network
- You will have access to projects that fit your expertise
- We help and support you throughout your project
- We offer the possibility to build a valuable and lasting partnership
Check out more projects on: https://afarax.be/jobs/type/freelance/