At a Glance
- Tasks: Lead the design and delivery of cutting-edge data and AI solutions on the Databricks Lakehouse Platform.
- Company: Join a dynamic consultancy partnered with Databricks, driving innovation in data engineering.
- Benefits: Enjoy flexible working options, competitive salary, and opportunities for professional growth.
- Why this job: Be at the forefront of data technology, shaping solutions that impact businesses and communities.
- Qualifications: Expertise in Databricks, Apache Spark, and cloud platforms is essential; certifications are a plus.
- Other info: Ideal for seasoned professionals looking to mentor and lead in a collaborative environment.
The predicted salary is between £54,000 and £84,000 per year.
Techyard is supporting a growing Databricks-partnered consultancy in securing a Databricks Champion Solution Architect to lead the design and delivery of advanced data and AI solutions on the Databricks Lakehouse Platform. This strategic, high-impact role is ideal for a seasoned professional who can operate as both a hands-on architect and a trusted advisor—bridging business vision with technical excellence. You’ll play a pivotal part in driving innovation, embedding best practices, and shaping the consultancy’s data engineering capabilities as they scale their team and client base.
Key Responsibilities:
- Architect and implement end-to-end, scalable data and AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow).
- Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark.
- Champion the adoption of Lakehouse architecture (bronze/silver/gold layers) to ensure scalable, governed data platforms (see the sketch after this list).
- Collaborate with stakeholders, analysts, and data scientists to translate business needs into clean, production-ready datasets.
- Promote CI/CD, DevOps, and data reliability engineering (DRE) best practices across Databricks environments.
- Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows.
- Drive performance tuning, cost optimisation, and monitoring across data workloads.
- Mentor engineering teams and support architectural decisions as a recognised Databricks expert.
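For flavour, here is a minimal PySpark sketch of the kind of bronze/silver/gold flow these responsibilities describe. It is illustrative only: the table names, landing path, and columns (`bronze.orders`, `/landing/orders/`, `order_id`, `amount`) are assumptions for the example, not details taken from the role.

```python
# Minimal medallion-architecture sketch on Delta Lake (illustrative names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw events as-is, keeping ingestion metadata.
bronze = (
    spark.read.json("/landing/orders/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: cleanse and conform - deduplicate, enforce types, drop bad records.
silver = (
    spark.table("bronze.orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate ready for BI consumption.
gold = (
    spark.table("silver.orders")
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("lifetime_value"), F.count("*").alias("order_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_value")
```

In practice each layer would typically sit in its own governed schema, with the gold tables exposed to analysts and BI tools.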
Essential Skills & Experience:
- Demonstrable expertise with Databricks and Apache Spark in production environments.
- Proficiency in PySpark, SQL, and working within one or more cloud platforms (Azure, AWS, or GCP).
- In-depth understanding of Lakehouse concepts, medallion architecture, and modern data warehousing.
- Experience with version control, testing frameworks, and automated deployment pipelines (e.g., GitHub Actions, Azure DevOps).
- Sound knowledge of data governance, security, and compliance, including Unity Catalog (a brief sketch follows this list).
- Excellent communication, leadership, and problem-solving skills.
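As a small illustration of the Unity Catalog governance point above, the sketch below grants a hypothetical analyst group read access to a gold schema. The catalog, schema, and group names are invented for the example.

```python
# Hedged sketch: basic Unity Catalog grants issued from a Databricks notebook or job.
# `spark` is the session Databricks provides in notebooks/jobs;
# "analytics", "gold" and "data_analysts" are hypothetical names.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.gold TO `data_analysts`")
spark.sql("GRANT SELECT ON SCHEMA analytics.gold TO `data_analysts`")
```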
Desirable:
- Databricks certifications (e.g., Data Engineer Associate / Professional or Solutions Architect).
- Familiarity with MLflow, dbt, and BI tools such as Power BI or Tableau.
- Exposure to MLOps practices and deploying ML models within Databricks.
- Experience working within Agile and DevOps-driven delivery environments.
Databricks Solution Architect Champion employer: LinkedIn
Contact Details:
LinkedIn Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Databricks Solution Architect Champion role
✨Tip Number 1
Network with professionals in the Databricks community. Attend meetups, webinars, or conferences to connect with others in the field. This can help you gain insights into the latest trends and potentially lead to referrals.
✨Tip Number 2
Showcase your hands-on experience with Databricks and Apache Spark through personal projects or contributions to open-source initiatives. This practical demonstration of your skills can set you apart from other candidates.
✨Tip Number 3
Familiarise yourself with the specific tools and technologies mentioned in the job description, such as MLflow, dbt, and Airflow. Being able to discuss these tools confidently during interviews will demonstrate your readiness for the role.
✨Tip Number 4
Prepare to discuss real-world scenarios where you've implemented data solutions or optimised workflows. Use the STAR method (Situation, Task, Action, Result) to structure your responses and highlight your problem-solving abilities.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Databricks, Apache Spark, and cloud platforms. Emphasise your hands-on architectural skills and any relevant certifications to align with the job requirements.
Craft a Compelling Cover Letter: In your cover letter, showcase your understanding of Lakehouse architecture and your ability to bridge business needs with technical solutions. Use specific examples from your past experiences to demonstrate your expertise.
Highlight Relevant Projects: Include details about specific projects where you implemented data and AI solutions using Databricks. Discuss your role in these projects, the challenges faced, and how you overcame them to deliver successful outcomes.
Showcase Soft Skills: Since this role requires excellent communication and leadership skills, make sure to mention instances where you've mentored teams or collaborated with stakeholders. This will help illustrate your ability to lead and advise effectively.
How to prepare for a job interview at LinkedIn
✨Showcase Your Technical Expertise
Be prepared to discuss your hands-on experience with Databricks and Apache Spark. Highlight specific projects where you've implemented scalable data solutions, and be ready to explain the architecture and design choices you made.
✨Demonstrate Your Problem-Solving Skills
Expect scenario-based questions that assess your ability to tackle real-world challenges. Think of examples where you've optimised performance or resolved issues in data pipelines, and articulate your thought process clearly.
✨Communicate Effectively
As a Solution Architect, you'll need to bridge technical and business needs. Practice explaining complex concepts in simple terms, and be ready to discuss how you've collaborated with stakeholders to deliver successful outcomes.
✨Emphasise Your Leadership Experience
Since mentoring is part of the role, share instances where you've led teams or guided junior engineers. Discuss your approach to fostering a collaborative environment and promoting best practices in data engineering.