At a Glance
- Tasks: Design and build scalable data pipelines and analytics platforms using modern cloud technologies.
- Company: Join a leading Financial Services firm driving innovative Data Modernisation initiatives.
- Benefits: Enjoy flexible hybrid/remote work options and a competitive salary package.
- Why this job: Be part of a collaborative culture that values innovation and professional growth in data transformation.
- Qualifications: Experience with ETL/ELT pipelines, Python, SQL, and data visualisation tools is essential.
- Other info: Opportunity to work with cutting-edge technologies and make a real impact in the industry.
The predicted salary is between £32,000 and £60,000 per year.
Location: Birmingham (Hybrid/Remote options available)
Salary: £40,000 - £75,000 (depending on experience)
The Opportunity
Join a forward-thinking Financial Services leader driving a groundbreaking Data Modernisation programme. We’re seeking talented Analytics Data Engineers to shape the future of data-driven decision-making in a fast-paced, innovative environment. This permanent role offers a competitive salary, flexible working, and the chance to work with cutting-edge cloud technologies to deliver impactful data solutions.
As an Analytics Data Engineer, you’ll design and build robust, scalable data pipelines and analytics platforms that empower business teams to unlock insights through self-service tools. You’ll work with modern cloud-native tools like Snowflake, Databricks, Python, and advanced visualisation platforms to create solutions that drive measurable business outcomes.
Key Responsibilities
- Develop Scalable Data Pipelines: Design and implement ETL/ELT workflows using Python, SQL, and cloud-native tools to support analytics and reporting needs.
- Enable Business Insights: Build intuitive, optimised data models for platforms like Power BI, Tableau, or Looker to enable self-service analytics for non-technical users.
- Ensure Data Excellence: Implement robust data governance, quality, and security practices within a decentralised Data Mesh framework.
- Collaborate Across Teams: Partner with data scientists, analysts, and business stakeholders to translate requirements into high-performance data products.
- Drive Continuous Improvement: Proactively identify opportunities to optimise data workflows, adopt emerging technologies, and enhance analytics capabilities.
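To make the first responsibility concrete, here is a minimal sketch of the extract–transform–load pattern the role centres on. All field names, records, and the in-memory "warehouse" are hypothetical illustrations, not part of the client's actual stack:

```python
# Minimal ETL sketch -- account/amount fields are hypothetical examples.
from collections import defaultdict

def extract(rows):
    """Extract: keep only well-formed source records."""
    return [r for r in rows if r.get("account") and r.get("amount") is not None]

def transform(rows):
    """Transform: aggregate transaction amounts per account."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["account"]] += float(r["amount"])
    return dict(totals)

def load(totals, target):
    """Load: write aggregates into the target store (a dict stands in here)."""
    target.update(totals)
    return target

raw = [
    {"account": "A1", "amount": 100.0},
    {"account": "A1", "amount": 50.0},
    {"account": "B2", "amount": 25.0},
    {"account": None, "amount": 10.0},  # malformed, dropped at extract
]
warehouse = load(transform(extract(raw)), {})
```

In production the same three stages would typically be orchestrated by a scheduler such as Apache Airflow, with the load step targeting a platform like Snowflake or Databricks rather than a dictionary.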
Requirements
- Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker).
- Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/GCP/Azure for scalable data solutions.
- Data Modelling Mastery: Strong understanding of dimensional and relational modelling techniques for analytics use cases.
- Stakeholder Engagement: Excellent communication skills to work with cross-functional teams and present solutions to technical and non-technical audiences.
- Innovative Mindset: A proactive approach to exploring new tools, staying ahead of industry trends, and driving best practices in data engineering.
Why Join Our Client?
- Work with cutting-edge technologies in a collaborative, innovative culture.
- Flexible hybrid/remote working options to suit your lifestyle.
- Opportunities for professional growth and impact in a high-visibility Data Transformation programme.
If you’re excited to shape the future of data in Financial Services, apply with your latest CV to join our mission!
Contact Details:
McCabe & Barton Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Analytics Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific cloud technologies mentioned in the job description, such as Snowflake and Databricks. Having hands-on experience or even completing relevant online courses can give you a significant edge during interviews.
✨Tip Number 2
Network with professionals in the data engineering field, especially those who work with financial services. Attend industry meetups or webinars to connect with potential colleagues and learn about their experiences, which can provide valuable insights.
✨Tip Number 3
Prepare to discuss your previous projects involving ETL/ELT workflows and data visualisation tools. Be ready to explain your thought process and the impact of your work on business outcomes, as this will demonstrate your practical knowledge and problem-solving skills.
✨Tip Number 4
Showcase your innovative mindset by staying updated on the latest trends in data engineering. Mention any new tools or methodologies you've explored recently, as this reflects your proactive approach and eagerness to contribute to continuous improvement.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in building ETL/ELT pipelines, using Python and SQL. Emphasise your familiarity with cloud platforms like Snowflake or Databricks, as well as your expertise in visualisation tools such as Power BI or Tableau.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your innovative mindset. Mention specific projects where you've successfully implemented data solutions and how they drove business outcomes.
Highlight Collaboration Skills: In your application, emphasise your ability to work with cross-functional teams. Provide examples of how you've effectively communicated technical concepts to non-technical stakeholders, showcasing your stakeholder engagement skills.
Showcase Continuous Learning: Demonstrate your proactive approach to learning by mentioning any recent courses, certifications, or technologies you've explored. This shows your commitment to staying ahead of industry trends and improving your data engineering capabilities.
How to prepare for a job interview at McCabe & Barton
✨Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with ETL/ELT pipelines, particularly using Python and SQL. Bring examples of past projects where you successfully implemented data solutions, as this will demonstrate your technical proficiency.
✨Understand the Tools
Familiarise yourself with the cloud platforms mentioned in the job description, such as Snowflake and Databricks. If possible, have a few insights or experiences ready to share about how you've used these tools in previous roles.
✨Communicate Clearly
Since you'll be collaborating with both technical and non-technical teams, practice explaining complex concepts in simple terms. This will show your ability to engage with diverse stakeholders effectively.
✨Demonstrate an Innovative Mindset
Prepare to discuss how you've proactively identified opportunities for improvement in your past roles. Highlight any new tools or technologies you've explored and how they enhanced your data engineering practices.