Overview
Reports to: Head of IT
My client is rebuilding their reporting platform around a governed Azure + Databricks Lakehouse — and they need a Power BI engineer to own the enterprise BI layer.
This is not a standard reporting role. You will build version-controlled semantic models and high-performance datasets that power decision-making across Operations, Finance, Logistics, Sales and Customer Care.
Why this role exists
To turn trusted Gold-layer Lakehouse data into certified, enterprise-grade Power BI insight — with governance, performance and adoption built in from day one.
What you’ll be doing
- Build PBIP datasets fully version-controlled in Git
- Manage CI/CD and PR workflows via Azure DevOps
Lakehouse-Aligned BI
- Build only from curated Gold-layer Databricks tables
- Partner with Data Engineering on contract-first schemas
- Validate, profile and optimise data using Databricks SQL
Performance & Governance
- Design high-performance models (aggregations, incremental refresh, hybrid DirectQuery/Import)
- Implement RLS / OLS, certifications, endorsements and metadata standards
- Monitor and tune models in Power BI Premium Per User (PPU) capacity
Adoption & UX
- Build clean, intuitive dashboards with strong UX standards
- Support BI adoption through training and documentation
What my client is looking for
Certifications: PL-300 | DP-600
Experience:
- Commercial Power BI development experience
- Strong DAX and semantic modelling
- Databricks SQL & Lakehouse experience
- Performance optimisation and Git-based workflows
- Strong analytical mindset and UX intuition
Nice to have: Python / PySpark, Delta Lake, Unity Catalog, data quality frameworks
If you want to move beyond “report building” and help define how enterprise BI is done — this is your role.
Contact Detail:
Head Resourcing Recruiting Team