At a Glance
- Tasks: Design and maintain data pipelines using Databricks and Apache Spark.
- Company: Join a forward-thinking company with a focus on data innovation.
- Benefits: Earn £400-500 per day, fully remote work, and flexible hours.
- Why this job: Make an impact by optimising data workflows and collaborating with analytics teams.
- Qualifications: 3+ years as a Data Engineer with strong Databricks and Python skills.
- Other info: Exciting opportunity for career growth in a dynamic, remote environment.
We are currently recruiting a Data Engineer for one of our clients. The role is outside IR35, pays £400-500 per day, and will initially run for 6 months. It is also fully remote.
Key Responsibilities
- Design, develop, and maintain batch and streaming data pipelines using Databricks (Apache Spark)
- Build and optimize ETL/ELT workflows for large-scale structured and unstructured data
- Implement Delta Lake architectures (Bronze/Silver/Gold layers; a brief sketch of this pattern follows this list)
- Integrate data from multiple sources (databases, APIs, event streams, files)
- Optimize Spark jobs for performance, scalability, and cost
- Manage data quality, validation, and monitoring
- Collaborate with analytics and ML teams to support reporting and model development
- Implement CI/CD, version control, and automated testing for data pipelines
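As a rough illustration of the Bronze/Silver/Gold (medallion) pattern referenced above, here is a minimal PySpark sketch of the kind of Databricks pipeline this role involves. The table names, columns, and landing path are hypothetical placeholders, and a real implementation would add streaming ingestion, schema enforcement, testing, and monitoring.

```python
# Minimal sketch of a Bronze/Silver/Gold Delta Lake flow (hypothetical names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

# Bronze: land raw files as-is, adding only ingestion metadata
raw = spark.read.format("json").load("/mnt/landing/orders/")  # hypothetical landing path
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").saveAsTable("bronze.orders"))

# Silver: deduplicate and apply basic quality rules
silver = (spark.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .filter(F.col("order_total") >= 0))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate for reporting and analytics teams
gold = (spark.table("silver.orders")
        .groupBy("customer_id")
        .agg(F.sum("order_total").alias("lifetime_value")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_value")
```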
Required Qualifications
- 3+ years of experience as a Data Engineer
- Strong experience with Databricks and Apache Spark
- Proficiency in Python (required) and advanced SQL
- Hands-on experience with AWS or Azure cloud services:
- AWS: S3, EMR, Glue, Redshift, Lambda, IAM
- Azure: ADLS Gen2, Azure Databricks, Synapse, Data Factory, Key Vault
Data Engineer in Hull employer: Searches @ Wenham Carter
Contact Detail:
Searches @ Wenham Carter Recruiting Team
StudySmarter Expert Advice
We think this is how you could land the Data Engineer role in Hull
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the hunt. You never know who might have a lead or can refer you to a great opportunity.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Databricks and Apache Spark. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical skills. Be ready to discuss your experience with ETL/ELT workflows and Delta Lake architectures. Practising common interview questions can help you feel more confident.
✨Tip Number 4
Don't forget to apply through our website! We make it easy for you to find roles that match your skills and interests. Plus, we're here to support you throughout the application process.
We think you need these skills to ace the Data Engineer role in Hull
Some tips for your application
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Databricks, Apache Spark, and any relevant cloud services like AWS or Azure. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background makes you a great fit for the role. Keep it concise but engaging; we love a good story!
Showcase Your Projects: If you've worked on any cool projects involving data pipelines or ETL workflows, make sure to mention them! We want to see your hands-on experience and how you've tackled challenges in the past.
Apply Through Our Website: Don't forget to apply through our website! It's the best way for us to keep track of your application and ensure it gets the attention it deserves. Plus, it's super easy: just a few clicks and you're done!
How to prepare for a job interview at Searches @ Wenham Carter
✨Know Your Tech Stack
Make sure you brush up on your knowledge of Databricks, Apache Spark, and the cloud services mentioned in the job description. Be ready to discuss how you've used these technologies in past projects, as this will show your practical experience and understanding.
✨Showcase Your Problem-Solving Skills
Prepare to talk about specific challenges you've faced in data engineering and how you overcame them. Use examples that highlight your ability to optimise ETL/ELT workflows or improve data quality, as this will demonstrate your critical thinking and technical prowess.
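For instance, one concrete optimisation worth being able to walk through is broadcasting a small lookup table so a large join avoids a full shuffle. The sketch below is illustrative only; the table and column names are hypothetical.

```python
# Hedged sketch: avoid a shuffle-heavy join by broadcasting a small dimension table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("silver.orders")        # large fact table (hypothetical)
countries = spark.table("silver.countries")  # small dimension table (hypothetical)

# Broadcast hint: ship the small table to every executor instead of shuffling both sides
enriched = orders.join(F.broadcast(countries), on="country_code", how="left")

# Partition the output by a commonly filtered column so downstream reads can prune files
(enriched.write.format("delta")
         .mode("overwrite")
         .partitionBy("country_code")
         .saveAsTable("gold.orders_by_country"))
```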
✨Familiarise Yourself with Delta Lake
Since Delta Lake architectures are a key part of the role, make sure you understand the Bronze/Silver/Gold layer concept. Be prepared to explain how you've implemented similar architectures in the past and the benefits they brought to your projects.
✨Collaboration is Key
This role involves working closely with analytics and ML teams, so be ready to discuss your experience collaborating with cross-functional teams. Highlight any successful projects where teamwork played a crucial role in achieving results, as this will show you're a great fit for their culture.