At a Glance
- Tasks: Design and implement big data solutions using the Databricks platform for diverse clients.
- Company: Join Databricks, a leading data and AI company with a global presence.
- Benefits: Enjoy competitive pay, comprehensive benefits, and opportunities for professional growth.
- Why this job: Make a real impact by solving big data challenges and transforming client projects.
- Qualifications: Experience in data engineering, coding in Python or Scala, and cloud ecosystems.
- Other info: Collaborative environment with a commitment to diversity and inclusion.
The predicted salary is between £36,000 and £60,000 per year.
Overview
As a Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements, helping them tackle their big data challenges using the Databricks Platform. You will deliver data engineering, data science, and cloud technology projects that require integration with client systems, training, and other technical work to help customers get the most value out of their data. Solutions Architects in this role are billable and know how to complete projects to specification with excellent customer service. You will report to the regional Manager/Lead.
The Impact You Will Have
- You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides and productionising customer use cases.
- Work with engagement managers to scope a range of professional services, with input from the customer.
- Guide strategic customers as they implement transformational big data projects, including end-to-end design, build and deployment of industry-leading big data and AI applications.
- Consult on architecture and design; bootstrap or implement customer projects, leading to the customer’s successful understanding, evaluation and adoption of Databricks.
- Provide an escalated level of support for customer operational issues.
- Collaborate with the Databricks technical team, Project Manager, Architect and Customer team to ensure the technical components of the engagement are delivered to meet the customer’s needs.
- Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues.
What We Look For
- Proficient in data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices.
- Comfortable writing code in either Python or Scala.
- Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one.
- Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals.
- Familiarity with CI/CD for production deployments.
- Working knowledge of MLOps.
- Experience designing and deploying performant end-to-end data architectures.
- Experience with technical project delivery - managing scope and timelines.
- Documentation and white-boarding skills.
- Experience working with clients and managing conflicts.
- Ability to build skills in new technical areas that support the deployment and integration of Databricks-based solutions for customer projects.
- Willingness to travel up to 10%, more at peak times.
- Databricks Certification.
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow.
Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit mybenefitsnow.com/databricks.
Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards.
Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Details
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Engineering and Information Technology
- Industries: Software Development
Big Data Solutions Architect, Spark (Professional Services) in London
Employer: Databricks
Contact Detail: Databricks Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Big Data Solutions Architect, Spark (Professional Services) role in London
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at meetups. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repo showcasing your projects, especially those involving Spark and data engineering. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by practising common technical questions related to big data and cloud ecosystems. We recommend doing mock interviews with friends or using online platforms to get comfortable.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive!
We think you need these skills to ace the Big Data Solutions Architect, Spark (Professional Services) role in London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Solutions Architect role. Highlight your experience with data engineering, cloud ecosystems, and Apache Spark. We want to see how your skills align with what we’re looking for!
Showcase Your Projects: Include specific examples of projects you've worked on that demonstrate your expertise in big data solutions. We love seeing how you’ve tackled challenges and delivered results for clients!
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences quickly.
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can’t wait to hear from you!
How to prepare for a job interview at Databricks
✨Know Your Tech Inside Out
Make sure you brush up on your data engineering and cloud technology skills, especially with Apache Spark. Be ready to discuss your past projects in detail, showcasing how you've tackled big data challenges and integrated solutions effectively.
✨Showcase Your Client Management Skills
Since this role involves working closely with clients, prepare examples of how you've successfully managed client relationships and resolved conflicts. Highlight your customer service approach and how it led to successful project outcomes.
✨Prepare for Technical Questions
Expect technical questions that test your knowledge of Python or Scala, as well as your understanding of CI/CD processes. Practise coding problems and be ready to explain your thought process clearly during the interview; a short warm-up sketch follows below.
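As a warm-up, here is a minimal PySpark sketch of the kind of transformation you might be asked to talk through in a Spark-focused interview; the dataset, column names and app name are purely illustrative and not taken from the job description.

```python
# Purely illustrative PySpark warm-up (hypothetical data, not from the job posting).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("interview-warmup").getOrCreate()

# Toy orders data standing in for a client dataset.
orders = spark.createDataFrame(
    [("books", 12.50), ("books", 7.99), ("games", 59.99)],
    ["category", "amount"],
)

# Aggregate revenue per category; be ready to explain how Spark distributes
# this work (the groupBy triggers a shuffle, i.e. a wide transformation).
summary = (
    orders.groupBy("category")
          .agg(F.sum("amount").alias("total_revenue"))
          .orderBy(F.desc("total_revenue"))
)
summary.show()

spark.stop()
```

Being able to narrate the execution behind a few lines like these is often more valuable in the interview than memorising API details.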
✨Demonstrate Your Collaborative Spirit
Collaboration is key in this role, so think of examples where you've worked with cross-functional teams. Be prepared to discuss how you’ve contributed to team success and how you handle feedback from peers and clients alike.