At a Glance
- Tasks: Lead the design and development of robust data pipelines using Databricks.
- Company: Join a forward-thinking company in the heart of London.
- Benefits: Short-term contract with immediate start and competitive pay.
- Why this job: Make an impact by working on cutting-edge data projects.
- Qualifications: Experience with Databricks, Python, SQL, and DataOps practices required.
- Other info: Great opportunity to enhance your skills in a dynamic environment.
The predicted salary is between £36,000 and £60,000 per year.
This is a Databricks Specialist role on a short-term contract with an immediate start. Key skills include Databricks, Python, SQL, and DataOps practices. Experience with ETL/ELT frameworks and cloud platforms (AWS, Azure, or GCP) is required.
Location: United Kingdom (London, England)
Start: Immediate
Responsibilities
- Lead the design and development of robust data pipelines.
- Integrate data from diverse sources (APIs, relational databases, files, etc.); a minimal ingestion sketch follows this list.
- Collaborate with business and analytics teams to understand data requirements.
- Ensure quality, reliability, security and governance of the ingested data.
- Follow modern DataOps practices such as code versioning, data tests, and CI/CD.
- Document processes and best practices in data engineering.
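To give a flavour of the ingestion work above, here is a minimal sketch of landing a relational source into a Delta table with PySpark. The host, credentials, and table names are all hypothetical; on a real Databricks workspace you would read credentials from a secret scope rather than hard-coding them.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical JDBC source -- swap in your own host, database, and credentials.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://example-host:5432/sales")
    .option("dbtable", "public.orders")
    .option("user", "reader")
    .option("password", "change-me")  # placeholder: use a secret scope in practice
    .load()
)

# Land the raw data as a Delta table so downstream jobs get ACID guarantees.
orders.write.format("delta").mode("overwrite").saveAsTable("bronze_orders")
```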
Requirements
- Proven experience building and managing large-scale data pipelines in Databricks (PySpark, Delta Lake, SQL); see the sketch after this list.
- Strong programming skills in Python and SQL for data processing and transformation.
- Deep understanding of ETL/ELT frameworks, data warehousing, and distributed data processing.
- Hands-on experience with modern DataOps practices: version control (Git), CI/CD pipelines, automated testing, infrastructure-as-code.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and related data services.
- Strong problem-solving skills with the ability to troubleshoot performance, scalability, and reliability issues.
- Proficiency in Git.
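To make the Databricks requirement concrete, here is a minimal, assumed sketch of the kind of Delta Lake upsert such pipelines often perform. The table names, merge key, and watermark are illustrative, not part of the role description.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Hypothetical incremental batch: rows changed since the last run.
updates = spark.table("bronze_orders").where(F.col("updated_at") >= "2024-01-01")

# Upsert into the curated table: update matched keys, insert new ones.
target = DeltaTable.forName(spark, "silver_orders")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```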
Databricks Specialist - Short-Term Contract employer: Data Freelance Hub
Contact Details:
Data Freelance Hub Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Databricks Specialist - Short-Term Contract role
✨Tip Number 1
Network like a pro! Reach out to your connections in the data world, especially those who work with Databricks or similar tech. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects with Databricks, Python, and SQL. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on DataOps practices and common data challenges. Be ready to discuss how you've tackled issues in the past, especially around ETL/ELT frameworks and cloud platforms.
✨Tip Number 4
Don’t forget to apply through our website! We’ve got loads of opportunities, and applying directly can give you a better chance of landing that short-term contract as a Databricks Specialist.
We think you need these skills to ace the Databricks Specialist - Short-Term Contract role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Databricks, Python, SQL, and DataOps practices. We want to see how your skills match the job description, so don’t be shy about showcasing relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for this role. Share specific examples of your work with ETL/ELT frameworks and cloud platforms like AWS, Azure, or GCP.
Showcase Your Problem-Solving Skills: In your application, highlight instances where you've tackled performance or reliability issues in data pipelines. We love seeing how you approach challenges and find solutions!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Data Freelance Hub
✨Know Your Databricks Inside Out
Make sure you brush up on your Databricks knowledge before the interview. Be ready to discuss your experience with PySpark, Delta Lake, and how you've built data pipelines in the past. Having specific examples at hand will show that you really know your stuff.
✨Show Off Your Python and SQL Skills
Prepare to demonstrate your programming prowess in Python and SQL. You might be asked to solve a problem or explain how you would approach a data transformation task. Practising coding challenges related to data processing can give you a leg up.
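For instance, a common warm-up task is deduplicating records to the latest row per key. A minimal PySpark sketch (the table and column names are made up for illustration):

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()
events = spark.table("bronze_events")  # hypothetical input table

# Keep the most recent event per user -- the PySpark equivalent of a SQL
# ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY event_ts DESC) filter.
latest_first = Window.partitionBy("user_id").orderBy(F.col("event_ts").desc())
deduped = (
    events.withColumn("rn", F.row_number().over(latest_first))
    .where(F.col("rn") == 1)
    .drop("rn")
)
```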
✨Familiarise Yourself with DataOps Practices
Since this role emphasises modern DataOps practices, make sure you can talk about your experience with version control, CI/CD pipelines, and automated testing. Being able to articulate how you've implemented these practices in previous roles will set you apart.
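If it helps to have something concrete to talk through, here is a minimal, assumed example of an automated data test using pytest and a local Spark session; the transformation under test is hypothetical.

```python
import pytest
from pyspark.sql import SparkSession, functions as F

@pytest.fixture(scope="session")
def spark():
    # Local session so the test can run in CI without a Databricks cluster.
    return SparkSession.builder.master("local[1]").getOrCreate()

def add_total(df):
    # Hypothetical transformation under test.
    return df.withColumn("total", F.col("quantity") * F.col("unit_price"))

def test_total_is_quantity_times_price(spark):
    df = spark.createDataFrame([(2, 3.0)], ["quantity", "unit_price"])
    assert add_total(df).first()["total"] == 6.0
```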
✨Cloud Platforms Are Key
Don’t forget to highlight your familiarity with cloud platforms like AWS, Azure, or GCP. Be prepared to discuss how you've used these services in your data projects, as this is crucial for the role. Showing that you understand the cloud ecosystem will definitely impress the interviewers.