At a Glance
- Tasks: Design and optimise data pipelines using AWS and Databricks, driving best practices in data engineering.
- Company: Join ShareForce, a leader in data analytics and AI with a collaborative team.
- Benefits: Competitive daily rate, hybrid work model, and opportunities for professional growth.
- Why this job: Make an impact in cutting-edge data solutions while mentoring junior engineers.
- Qualifications: Experience with AWS, Databricks, Python, SQL, and a passion for data science.
- Other info: Exciting contract role with potential for extension and travel to client site.
The predicted salary is between £36,000 and £60,000 per year.
This is an exciting contract opportunity for an SC Cleared Databricks Engineer to join an experienced team in a new customer engagement working at the forefront of data analytics and AI. This role offers the chance to take a key role in the design and delivery of advanced Databricks solutions within the AWS ecosystem.
Responsibilities:
- Design, build, and optimise end-to-end data pipelines using AWS and Databricks, including Delta Live Tables.
- Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions.
- Drive best practices for data engineering.
- Help clients realise the potential of data science, machine learning, and scaled data processing within the AWS/Databricks ecosystem.
- Mentor junior engineers and support their personal development.
- Take ownership of the delivery of core solution components.
- Support planning, requirements refinement, and work estimation.
Skills & Experience:
- Proven experience designing and implementing data solutions in AWS using Databricks as a core platform.
- Hands-on expertise in Delta Lake, Delta Live Tables, and Databricks Workflows.
- Strong coding skills in Python and SQL, with experience in developing modular, reusable code in Databricks.
- Deep understanding of lakehouse architecture, with a solid grasp of data warehousing, data lakes, and real-time data processing.
- Good experience with CI/CD practices and tools for data platforms.
- Good knowledge of how to leverage AI to increase deployment productivity and quality.
- Excellent communication skills.
- Desirable: Databricks Data Engineer and/or AWS Data Engineer certification.
Additional Information:
- Rate offered: £500–£550 per day
- IR35 Status: Outside
- Location: Hybrid with 1 day per week travel to client site in the North East
- Start date: January
- Duration: 3-month initial contract with significant opportunity for extension
- Required: Active SC Clearance
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Engineering and Information Technology
Contact Detail:
Shareforce Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Databricks Engineer - SC Cleared role in Scotland
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who works at ShareForce. Building relationships can open doors that a CV just can't.
✨Show Off Your Skills
When you get the chance to chat with potential employers, don’t hold back! Talk about your hands-on experience with Databricks and AWS. Share specific examples of projects you've worked on, especially those involving Delta Live Tables.
✨Be Ready for Technical Challenges
Prepare yourself for technical interviews by brushing up on your Python and SQL skills. You might be asked to solve problems on the spot, so practice coding challenges related to data engineering and Databricks workflows.
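One classic warm-up in that vein is an upsert: merging change records into a snapshot keyed by an ID, the pattern Delta Lake's MERGE handles natively. As a hedged, Spark-free sketch you could practise with (function and field names here are purely illustrative, not taken from the role):

```python
def upsert(snapshot, changes, key="id"):
    """Merge change rows into a snapshot keyed by `key`.
    Later rows win, loosely mimicking Delta Lake MERGE semantics."""
    merged = {row[key]: row for row in snapshot}
    for row in changes:
        # Update an existing row in place, or insert a new one.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

snapshot = [{"id": 1, "city": "Leeds"}, {"id": 2, "city": "York"}]
changes = [{"id": 2, "city": "Durham"}, {"id": 3, "city": "Hexham"}]
print(upsert(snapshot, changes))
# → [{'id': 1, 'city': 'Leeds'}, {'id': 2, 'city': 'Durham'}, {'id': 3, 'city': 'Hexham'}]
```

Being able to talk through how the same logic maps onto a Delta Lake MERGE INTO statement is the kind of bridge interviewers tend to look for.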
✨Apply Through Our Website
Don’t forget to apply directly through our website! It’s the best way to ensure your application gets seen. Plus, it shows you're genuinely interested in the role and the company.
We think you need these skills to ace the Databricks Engineer - SC Cleared role in Scotland
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Databricks Engineer role. Highlight your experience with AWS and Databricks, especially any hands-on work with Delta Live Tables and data pipelines. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific projects where you've designed and implemented data solutions. If you've mentored junior engineers or driven best practices, let us know! Real-world examples can really make your application stand out.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Explain why you're excited about this role and how you can contribute to our team. We love seeing passion and enthusiasm, so don’t hold back!
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and we’ll be able to review your application quickly. Plus, it shows you’re serious about joining us!
How to prepare for a job interview at Shareforce
✨Know Your Databricks Inside Out
Make sure you brush up on your Databricks knowledge, especially around Delta Live Tables and workflows. Be ready to discuss how you've designed and implemented data solutions in AWS using Databricks, as this will likely come up during the interview.
✨Showcase Your Coding Skills
Prepare to demonstrate your coding prowess in Python and SQL. You might be asked to solve a problem or explain your approach to developing modular, reusable code in Databricks, so have some examples ready to share.
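One way to make "modular, reusable" concrete is a small parameterised query helper. A minimal sketch, using sqlite3 as a stand-in for a Spark SQL endpoint (the table and function names are illustrative assumptions, not part of the role):

```python
import sqlite3

def max_by_group(conn, table, group_col, value_col, limit=10):
    """Reusable aggregation: highest `value_col` per `group_col`.
    Identifiers are caller-trusted; the limit is parameterized."""
    sql = (
        f"SELECT {group_col}, MAX({value_col}) AS max_val "
        f"FROM {table} GROUP BY {group_col} ORDER BY {group_col} LIMIT ?"
    )
    return conn.execute(sql, (limit,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 10.0), ("north", 25.0), ("south", 7.5)])
print(max_by_group(conn, "sales", "region", "amount"))
# → [('north', 25.0), ('south', 7.5)]
```

The design point worth articulating in an interview is the separation: the helper owns the query shape, callers own the schema, so the same function serves many tables.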
✨Understand the Lakehouse Architecture
Familiarise yourself with lakehouse architecture and be prepared to discuss its benefits over traditional data warehousing. This understanding will show that you can drive best practices for data engineering and help clients leverage their data effectively.
✨Communicate Clearly and Confidently
Since excellent communication skills are essential for this role, practice articulating your thoughts clearly. Be ready to collaborate with stakeholders and mentor junior engineers, so showcasing your ability to communicate complex ideas simply will be key.