At a Glance
- Tasks: Join our Analytics & Data Team to build scalable data pipelines and deploy AI models.
- Company: We're a forward-thinking company based in Bristol, embracing innovation and collaboration.
- Benefits: Enjoy hybrid working, competitive pay, and the chance to work with cutting-edge technology.
- Why this job: Be part of a dynamic team that champions data-driven decision-making and fosters a culture of innovation.
- Qualifications: Strong Python skills, experience with Azure Databricks, and a passion for data quality are essential.
- Other info: This role offers flexibility and the opportunity to make a real impact in the data landscape.
The predicted salary is between £43,200 and £72,000 per year.
📍 Bristol - Hybrid, 1 day per week on site
💰 £600 per day (negotiable), Inside IR35
What You'll Be Doing
- Provide expertise to the newly formed Analytics & Data Team and support them in building tools and data processes.
- Build the Backbone: Design and maintain scalable, reusable data pipelines that power analytics and AI initiatives.
- Bridge IT & Data: Lead the integration of systems to unlock new platform capabilities and deliver innovative features.
- Operationalize AI: Deploy production-ready AI models with automated monitoring - from data ingestion to model outputs.
- Master the ELT: Focus on Extract & Load processes across diverse enterprise data sources (an illustrative sketch follows this list).
- Keep It Flowing: Monitor workflows, define SLIs, and set up alerts to ensure seamless data operations.
- Champion Governance: Apply best practices in data governance, maintaining data catalogues and dictionaries.
- Set the Standard: Develop and promote Python coding standards and drive continuous improvements in data quality.
- Model for Impact: Use dimensional data modeling to create robust data structures that support business intelligence.
- Drive a Data Culture: Empower teams with evidence-based insights and foster a data-driven decision-making environment.
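To give a flavour of the day-to-day work described above, here is a minimal, illustrative PySpark sketch of an Extract & Load step with a simple data-quality gate. It assumes a Databricks-style environment where Delta tables are available; the paths, table names and columns are hypothetical placeholders, not details taken from this role.

```python
# Illustrative only: a minimal Extract & Load step with a basic data-quality
# check. All paths, table names and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("orders_extract_load").getOrCreate()

# Extract: read raw order data from a (hypothetical) landing area.
raw_orders = spark.read.json("/mnt/landing/orders/")

# Light data-quality gate: drop rows missing the primary key, stamp ingestion time.
clean_orders = (
    raw_orders
    .filter(F.col("order_id").isNotNull())
    .withColumn("ingested_at", F.current_timestamp())
)

# Load: append into a Delta table for downstream analytics and AI workloads.
(
    clean_orders.write
    .format("delta")
    .mode("append")
    .saveAsTable("bronze.orders")
)
```

In practice, a step like this would sit inside a monitored workflow with SLIs and alerting, as outlined above.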
What we're looking for
- Strong Python skills, especially with PySpark.
- Deep experience with Azure Databricks and cloud-based data platforms.
- A flexible, adaptive mindset and a collaborative spirit.
- Excellent communication skills - able to translate tech into plain English.
- A logical, analytical approach to problem-solving.
- Familiarity with the modern data stack and integration best practices.
- A passion for data quality, innovation, and continuous improvement.
Data Engineer employer: Reed Technology
Contact Details:
Reed Technology Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network with professionals in the data engineering field, especially those who work with Azure Databricks and PySpark. Attend local meetups or online webinars to connect with potential colleagues and learn about their experiences.
✨Tip Number 2
Showcase your Python skills by contributing to open-source projects or creating your own data pipeline projects. This hands-on experience will not only enhance your skills but also provide you with tangible examples to discuss during interviews.
✨Tip Number 3
Familiarise yourself with the latest trends in data governance and best practices. Being able to discuss these topics confidently can set you apart from other candidates and demonstrate your commitment to maintaining data quality.
✨Tip Number 4
Prepare to explain complex technical concepts in simple terms. Practice articulating how you've solved problems in previous roles, as strong communication skills are essential for bridging IT and data teams effectively.
We think you need these skills to ace the Data Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your strong Python skills, experience with Azure Databricks, and any relevant projects that showcase your ability to build scalable data pipelines. Use keywords from the job description to align your experience with what the company is looking for.
Craft a Compelling Cover Letter: In your cover letter, express your passion for data quality and innovation. Mention specific examples of how you've operationalised AI or improved data processes in previous roles. This is your chance to show your personality and collaborative spirit.
Showcase Your Problem-Solving Skills: Provide examples in your application that demonstrate your logical and analytical approach to problem-solving. Discuss challenges you've faced in data engineering and how you overcame them, particularly in relation to ELT processes or data governance.
Highlight Communication Skills: Since excellent communication is key for this role, include instances where you've successfully translated technical concepts into plain English for non-technical stakeholders. This will show that you can bridge the gap between IT and data effectively.
How to prepare for a job interview at Reed Technology
✨Showcase Your Python Skills
Since strong Python skills, particularly with PySpark, are crucial for this role, be prepared to discuss your experience in detail. Bring examples of projects where you've used Python to build data pipelines or automate processes.
✨Demonstrate Your Cloud Experience
Highlight your familiarity with Azure Databricks and other cloud-based data platforms. Be ready to explain how you've leveraged these tools in past roles to enhance data operations or analytics capabilities.
✨Communicate Clearly
Excellent communication skills are essential for translating technical concepts into plain English. Practice explaining complex data processes or models in simple terms, as you may need to do this during the interview.
✨Emphasise Your Problem-Solving Approach
Prepare to discuss your logical and analytical approach to problem-solving. Share specific examples of challenges you've faced in data engineering and how you overcame them, focusing on your thought process and the outcomes.