At a Glance
- Tasks: Design and govern scalable solutions on the Databricks Lakehouse platform.
- Company: Join Axpo, Switzerland's largest renewable energy producer, driving sustainable innovation globally.
- Benefits: Enjoy a flexible workload with opportunities for remote work and collaboration across Europe.
- Why this job: Be part of a dynamic team transforming enterprise data with cutting-edge technology and impactful change.
- Qualifications: 5+ years in data engineering, 3+ years in architecture, with expertise in Databricks and Apache Spark.
- Other info: Fluency in English is essential; knowledge of other European languages is a plus.
The predicted salary is between £48,000 and £84,000 per year.
Location: Baden, Madrid, London | Workload: 80–100%

Who We Are
Axpo is driven by a single purpose: to enable a sustainable future through innovative energy solutions. As Switzerland's largest producer of renewable energy and a leading international energy trader, we leverage cutting-edge technologies to serve customers in over 30 countries. We thrive on collaboration, innovation, and a passion for driving impactful change.

About the Team
You'll report directly to the Head of Development and be part of our highly skilled data platform engineering team. Together, we are building a secure, scalable, and efficient data platform to empower Axpo's decentralized business hubs with self-service analytics and AI capabilities. We work in close collaboration with stakeholders across Europe, bringing together global innovation and local context.

What You Will Do
As a Databricks Solution Architect, you will play a pivotal role in Axpo's enterprise data transformation by designing and governing scalable and secure solutions on the Databricks Lakehouse platform. You will:
- Define architecture standards and patterns for Databricks-based solutions across ingestion, processing, and analytics.
- Lead the design of performant, secure, and cost-effective Lakehouse architectures aligned with enterprise needs.
- Collaborate with business stakeholders, engineers, and data scientists to design end-to-end solutions that enable innovation and data-driven decision making.
- Guide engineering teams on implementing best practices around data modeling, orchestration, CI/CD, and infrastructure-as-code.
- Promote and govern usage of Unity Catalog for access control, lineage, and metadata management.
- Champion platform observability, data quality, and operational monitoring across analytics pipelines.
- Evaluate new Databricks features (e.g., Delta Sharing, governance enhancements) and lead their integration into platform capabilities.
- Establish solution review processes and mentor engineers and analysts on architectural thinking and Databricks capabilities.
- Support security, compliance, and cost-optimization efforts in close collaboration with platform and cloud teams.

What You Bring & Who You Are
You are a strategic thinker with hands-on technical expertise and a strong focus on business value. You bring:
- A degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years in data engineering and 3+ years in architecture roles, with deep experience designing solutions on Databricks and Apache Spark.
- A strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog governance.
- Expertise in Python and SQL (optionally Scala); strong familiarity with dbt and modern ELT practices.
- Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs).
- Hands-on knowledge of CI/CD, GitOps, Terraform, and orchestration tools (e.g., Dagster, Airflow).
- A sound understanding of enterprise data architecture, data governance, and security principles (e.g., GDPR).
- Strong communication and stakeholder management skills, able to bridge technical and business domains.
- Fluency in English; other European languages are a plus.

Technologies You'll Work With
- Core: Databricks, Spark, Delta Lake, Unity Catalog, dbt, SQL, Python
- Cloud: Microsoft Azure (Data Lake, Synapse, Storage, Event Hubs)
- DevOps: Bitbucket/GitHub, Azure DevOps, Terraform
- Orchestration & Monitoring: Dagster, Airflow, Datadog, Grafana
- Visualization: Power BI
- Other: Confluence, Docker, Linux

Nice to Have
- Knowledge of Microsoft Fabric or Snowflake
- Familiarity with Dataiku or similar low-code analytics platforms
- Experience with enterprise metadata and lineage solutions
- Background in energy trading or related industries
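If you are brushing up on the ingestion-to-analytics flow the role describes, Lakehouse solutions are commonly layered as a medallion architecture (bronze raw data, silver cleaned data, gold business aggregates). A minimal, framework-free Python sketch of that idea follows; the meter-reading records and function names are invented for illustration only, and on Databricks each layer would typically be a Delta table processed with Spark rather than plain lists:

```python
# Illustrative medallion-architecture flow (bronze -> silver -> gold).
# Plain Python stands in for Spark/Delta so the concept stays self-contained.

# Bronze: raw records as ingested, including duplicates and bad rows.
bronze = [
    {"meter_id": "A1", "kwh": "10.5", "ts": "2024-01-01"},
    {"meter_id": "A1", "kwh": "10.5", "ts": "2024-01-01"},  # duplicate
    {"meter_id": "B2", "kwh": "bad",  "ts": "2024-01-01"},  # unparseable
    {"meter_id": "B2", "kwh": "7.0",  "ts": "2024-01-02"},
]

def to_silver(rows):
    """Clean and deduplicate: cast types, drop rows that fail validation."""
    seen, silver = set(), []
    for r in rows:
        try:
            rec = (r["meter_id"], float(r["kwh"]), r["ts"])
        except ValueError:
            continue  # a real pipeline would quarantine bad rows instead
        if rec not in seen:
            seen.add(rec)
            silver.append({"meter_id": rec[0], "kwh": rec[1], "ts": rec[2]})
    return silver

def to_gold(rows):
    """Aggregate cleaned rows into a per-meter consumption summary."""
    totals = {}
    for r in rows:
        totals[r["meter_id"]] = totals.get(r["meter_id"], 0.0) + r["kwh"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'A1': 10.5, 'B2': 7.0}
```

The point of the layering is that each hop adds guarantees (parsed types, deduplication, business-level aggregation), which is exactly the kind of standard an architect in this role would define and govern.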
Databricks Solution Architect (f/m/d) employer: JobLeads GmbH
Contact Detail:
JobLeads GmbH Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Databricks Solution Architect (f/m/d)
✨Tip Number 1
Familiarise yourself with the Databricks Lakehouse platform and its features. Understanding how Delta Lake and Unity Catalog work will give you a significant edge during discussions with our team.
✨Tip Number 2
Showcase your experience with Azure services, particularly in integrating Databricks with tools like Data Lake and Synapse. Be prepared to discuss specific projects where you've successfully implemented these technologies.
✨Tip Number 3
Highlight your knowledge of CI/CD practices and tools such as Terraform and GitOps. We value candidates who can demonstrate their ability to streamline data engineering processes through automation.
✨Tip Number 4
Prepare to discuss your approach to data governance and security principles, especially in relation to GDPR compliance. This is crucial for ensuring that our solutions meet regulatory standards.
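One governance pattern worth being able to discuss concretely is pseudonymising personal data before it reaches analytics layers. The sketch below uses a keyed hash so records can still be joined without exposing the raw identifier; the field names and key are purely hypothetical, and a real Databricks deployment would more likely use Unity Catalog column masks or a managed tokenisation service with keys held in a secrets manager:

```python
# Conceptual GDPR-minded pseudonymisation: deterministic keyed hashing
# lets downstream joins work while keeping raw PII out of analytics tables.
import hashlib
import hmac

SECRET_KEY = b"demo-only-key"  # hypothetical; sourced from a secret manager in practice

def pseudonymise(value: str) -> str:
    """Return a stable, non-reversible token for a PII value."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer_email": "jane@example.com", "kwh": 42.0}
safe = {**record, "customer_email": pseudonymise(record["customer_email"])}
# The same input always maps to the same token, so aggregation per
# customer still works, but the email itself never leaves the raw layer.
```

Being able to explain why the hash is keyed (an unkeyed hash of a known identifier set can be reversed by brute force) is the kind of detail that signals genuine governance experience in an interview.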
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering and architecture, particularly with Databricks and Apache Spark. Use specific examples that demonstrate your expertise in designing scalable solutions.
Craft a Compelling Cover Letter: In your cover letter, express your passion for innovative energy solutions and how your skills align with Axpo's mission. Mention your experience with technologies like Delta Lake and Unity Catalog, and how you can contribute to their data transformation goals.
Showcase Technical Skills: Clearly outline your technical skills in Python, SQL, and any other relevant tools or platforms. Provide examples of projects where you've successfully implemented CI/CD practices or worked with Azure services.
Highlight Collaboration Experience: Since the role involves working closely with stakeholders, emphasise your communication and collaboration skills. Share instances where you've bridged the gap between technical teams and business stakeholders to drive successful outcomes.
How to prepare for a job interview at JobLeads GmbH
✨Understand the Databricks Ecosystem
Make sure you have a solid grasp of the Databricks Lakehouse platform, including Delta Lake and Unity Catalog. Be prepared to discuss how these technologies can be leveraged for scalable and secure data solutions.
✨Showcase Your Architectural Skills
Be ready to demonstrate your experience in designing performant and cost-effective architectures. Prepare examples from your past work where you defined architecture standards or led design initiatives.
✨Communicate Effectively with Stakeholders
Highlight your strong communication skills by discussing how you've collaborated with business stakeholders and technical teams. Share specific instances where you bridged the gap between technical and business domains.
✨Prepare for Technical Questions
Expect questions on CI/CD practices, data governance, and security principles. Brush up on your knowledge of tools like Terraform, GitOps, and orchestration tools, as well as your programming skills in Python and SQL.