At a Glance
- Tasks: Design and govern scalable solutions on the Databricks Lakehouse platform.
- Company: Join Axpo, Switzerland's largest renewable energy producer, driving sustainable innovation globally.
- Benefits: Enjoy a collaborative culture, flexible workload, and opportunities for impactful change.
- Why this job: Be part of a skilled team transforming enterprise data with cutting-edge technologies.
- Qualifications: 5+ years in data engineering, 3+ years in architecture, and expertise in Databricks required.
- Other info: Fluency in English is essential; other languages are a bonus.
The predicted salary is between £48,000 and £72,000 per year.
Location: Baden, Madrid, London | Workload: 80–100%
Who We Are
Axpo is driven by a single purpose: to enable a sustainable future through innovative energy solutions. As Switzerland's largest producer of renewable energy and a leading international energy trader, we leverage cutting-edge technologies to serve customers in over 30 countries. We thrive on collaboration, innovation, and a passion for driving impactful change.
About the Team
You’ll report directly to the Head of Development and be part of our highly skilled data platform engineering team. Together, we are building a secure, scalable, and efficient data platform to empower Axpo’s decentralized business hubs with self-service analytics and AI capabilities. We work in close collaboration with stakeholders across Europe, bringing together global innovation and local context.
What You Will Do
As a Databricks Solution Architect, you will play a pivotal role in Axpo’s enterprise data transformation by designing and governing scalable and secure solutions on the Databricks Lakehouse platform. You will:
- Define architecture standards and patterns for Databricks-based solutions across ingestion, processing, and analytics.
- Lead the design of performant, secure, and cost-effective Lakehouse architectures aligned with enterprise needs.
- Collaborate with business stakeholders, engineers, and data scientists to design end-to-end solutions that enable innovation and data-driven decision making.
- Guide engineering teams on implementing best practices around data modeling, orchestration, CI/CD, and infrastructure-as-code.
- Promote and govern usage of Unity Catalog for access control, lineage, and metadata management.
- Champion platform observability, data quality, and operational monitoring across analytics pipelines.
- Evaluate new Databricks features (e.g., Delta Sharing, governance enhancements) and lead their integration into platform capabilities.
- Establish solution review processes and mentor engineers and analysts on architectural thinking and Databricks capabilities.
- Support security, compliance, and cost-optimization efforts in close collaboration with platform and cloud teams.
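To give a flavour of the Unity Catalog governance work described above, access control in Unity Catalog is typically expressed as SQL grants. The sketch below is illustrative only; the catalog, schema, table, and group names are invented for this example:

```sql
-- Hypothetical example: granting a business hub's analyst group
-- read access to a curated schema via Unity Catalog.
-- Catalog/schema/table/group names are illustrative, not Axpo's.
GRANT USE CATALOG ON CATALOG energy_data TO `hub_analysts`;
GRANT USE SCHEMA  ON SCHEMA  energy_data.curated TO `hub_analysts`;
GRANT SELECT      ON TABLE   energy_data.curated.spot_prices TO `hub_analysts`;

-- Review effective permissions on the table:
SHOW GRANTS ON TABLE energy_data.curated.spot_prices;
```

Governing grants centrally at the catalog and schema level, rather than per table, is one common pattern for keeping self-service access manageable across decentralized hubs.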
What You Bring & Who You Are
You are a strategic thinker with hands-on technical expertise and a strong focus on business value. You bring:
- A degree in Computer Science, Data Engineering, Information Systems, or related field.
- 5+ years in data engineering and 3+ years in architecture roles, with deep experience designing solutions on Databricks and Apache Spark.
- Strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog governance.
- Expertise in Python, SQL, and optionally Scala; strong familiarity with dbt and modern ELT practices.
- Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs).
- Hands-on knowledge of CI/CD, GitOps, Terraform, and orchestration tools (e.g., Dagster, Airflow).
- Sound understanding of enterprise data architecture, data governance, and security principles (e.g., GDPR).
- Strong communication and stakeholder management skills, able to bridge technical and business domains.
- Fluency in English; other European languages a plus.
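To make the dbt and modern-ELT expectation concrete, a minimal incremental dbt model on Databricks might look like the following sketch; the model, column, and source names are hypothetical:

```sql
-- models/curated/daily_spot_prices.sql (hypothetical dbt model)
-- Materialised incrementally as a Delta table on Databricks.
{{ config(materialized='incremental', unique_key='trade_date') }}

select
    trade_date,
    market,
    avg(price_eur_mwh) as avg_price_eur_mwh
from {{ ref('stg_spot_prices') }}
{% if is_incremental() %}
-- On incremental runs, only process dates newer than what's already loaded.
where trade_date > (select max(trade_date) from {{ this }})
{% endif %}
group by trade_date, market
```

In an interview, being able to explain choices like incremental materialisation and `unique_key` deduplication tends to matter more than the SQL itself.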
Technologies You’ll Work With
Core: Databricks, Spark, Delta Lake, Unity Catalog, dbt, SQL, Python
Cloud: Microsoft Azure (Data Lake, Synapse, Storage, Event Hubs)
DevOps: Bitbucket/GitHub, Azure DevOps, Terraform
Orchestration & Monitoring: Dagster, Airflow, Datadog, Grafana
Visualization: Power BI
Other: Confluence, Docker, Linux
Nice to Have
- Knowledge of Microsoft Fabric or Snowflake
- Familiarity with Dataiku or similar low-code analytics platforms
- Experience with enterprise metadata and lineage solutions
- Background in energy trading or related industries
Databricks Solution Architect (f/m/d) — Employer: Axpo Group
Contact: Axpo Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Databricks Solution Architect (f/m/d) role
✨Tip Number 1
Familiarise yourself with the Databricks Lakehouse platform and its features, especially Delta Lake and Unity Catalog. Understanding these technologies will help you demonstrate your expertise during discussions with the hiring team.
✨Tip Number 2
Network with professionals in the data engineering and architecture fields, particularly those who have experience with Databricks. Engaging with industry peers can provide insights and potentially lead to referrals.
✨Tip Number 3
Prepare to discuss real-world examples of how you've implemented scalable and secure data solutions in previous roles. Being able to share specific experiences will showcase your hands-on expertise and problem-solving skills.
✨Tip Number 4
Stay updated on the latest trends and advancements in data engineering, particularly around Azure services and CI/CD practices. Showing that you're proactive about learning can set you apart from other candidates.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering and architecture, particularly with Databricks and Apache Spark. Use specific examples that demonstrate your expertise in designing scalable solutions.
Craft a Compelling Cover Letter: In your cover letter, express your passion for sustainable energy solutions and how your skills align with Axpo's mission. Mention your experience with technologies like Delta Lake and Unity Catalog, and how you can contribute to their data transformation efforts.
Showcase Technical Skills: Clearly outline your technical skills in Python, SQL, and any other relevant tools or platforms. Provide examples of projects where you've successfully implemented CI/CD practices or worked with Azure services.
Highlight Collaboration Experience: Since the role involves working closely with stakeholders, emphasise your communication and collaboration skills. Share instances where you've bridged the gap between technical teams and business stakeholders to drive successful outcomes.
How to prepare for a job interview at Axpo Group
✨Understand the Databricks Ecosystem
Make sure you have a solid grasp of the Databricks Lakehouse platform, including Delta Lake and Unity Catalog. Be prepared to discuss how these technologies can be leveraged for scalable and secure data solutions.
✨Showcase Your Architectural Skills
Be ready to demonstrate your experience in designing performant and cost-effective architectures. Prepare examples from your past work where you defined architecture standards or led design initiatives.
✨Communicate Effectively with Stakeholders
Highlight your strong communication skills by discussing how you've collaborated with business stakeholders and technical teams. Share specific instances where you bridged the gap between technical and business domains.
✨Prepare for Technical Questions
Expect questions on data engineering principles, CI/CD practices, and orchestration tools. Brush up on your knowledge of Python, SQL, and Azure services, as these are crucial for the role.