At a Glance
- Tasks: Develop and maintain data pipelines for impactful reporting and analysis.
- Company: Join one of the UK's fastest-growing businesses with a focus on data-driven decisions.
- Benefits: Up to £80,000 salary, bonus scheme, 25 days holiday, and medical insurance.
- Other info: Dynamic hybrid role with exposure to various teams and projects.
- Why this job: Be part of a team that values data and drives innovation across the business.
- Qualifications: Experience with Python, SQL, and data warehousing essential.
The predicted salary is between £68,000 and £92,000 per year.
I’m working with one of the UK’s fastest-growing businesses, currently building out their data team as they continue to scale. It’s a great opportunity to join a business where data is becoming more central to how decisions are made, with ongoing investment in tools and infrastructure.
They’re looking for a Data Engineer to support the development of their data platform, working on pipelines and datasets that are used across the business. You’ll be involved in getting data from source through to something clean and usable for reporting and analysis.
- Python
- SQL
- Bonus scheme
- 25 days holiday + bank holidays
- Birthday off
- Pension
- Medical insurance + death insurance
- Company discounts
- Experience working with Python or a similar language for data pipelines.
- Strong SQL skills.
- Understanding of data warehousing and data modelling.
- Experience pulling data from APIs and multiple sources.
- A good understanding of data quality and governance.
- Building and maintaining data pipelines using Python and SQL within GCP.
- Extracting, transforming and loading data from different sources using tools such as BigQuery and Dataform.
- Creating datasets and data models that can be used for reporting and analysis in Looker Studio.
- Working with teams across the business to understand data requirements and translate these into solutions.
- Supporting data projects from initial brief through to delivery, working with GCP services like Airflow, Cloud Functions and Cloud Run.
- Improving data quality, reliability and performance through monitoring, validation and optimisation within the data warehouse environment.
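To give a flavour of the day-to-day work described above, here is a minimal sketch of the clean-and-validate step of a pipeline. The record fields, validation rules and `transform_orders` helper are invented for illustration; in the role itself, the output rows would be loaded into BigQuery via its client library or modelled in Dataform.

```python
from datetime import datetime, timezone

def transform_orders(raw_records):
    """Normalise raw API records into rows ready for warehouse loading.

    Drops records that fail basic quality checks and standardises types,
    mirroring the clean-and-validate step before a BigQuery load.
    (Field names and rules here are hypothetical.)
    """
    rows = []
    for rec in raw_records:
        # Basic data-quality gate: skip records missing required fields.
        if not rec.get("id") or rec.get("amount") is None:
            continue
        rows.append({
            "order_id": str(rec["id"]),
            "amount_gbp": round(float(rec["amount"]), 2),
            # Fall back to the load time if the source omits a timestamp.
            "created_at": rec.get("created_at")
                          or datetime.now(timezone.utc).isoformat(),
        })
    return rows

# Example: two valid records and one that fails validation.
raw = [
    {"id": 101, "amount": "19.99", "created_at": "2024-05-01T10:00:00Z"},
    {"id": None, "amount": "5.00"},  # rejected: no id
    {"id": 102, "amount": 42},
]
clean = transform_orders(raw)
```

In a real pipeline this step would typically run inside an Airflow task or a Cloud Function, with the rejected-record count surfaced for monitoring.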
This role sits in a business that is investing properly in data rather than treating it as an afterthought. You will be working across a mix of data sources and teams, which gives you good exposure to how different areas of the business operate. It is not a siloed role where you are just handed tickets.
Data Engineer (f/m) in Liverpool employer: Forward Role Recruitment
Contact Details:
Forward Role Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (f/m) role in Liverpool
✨Tip Number 1
Network like a pro! Reach out to current employees on LinkedIn or attend industry meetups. It’s all about making connections that can help you get your foot in the door.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data projects, especially those involving Python and SQL. This gives you a chance to demonstrate your expertise beyond just a CV.
✨Tip Number 3
Prepare for the interview by brushing up on common data engineering questions. Be ready to discuss how you’ve tackled data quality and governance in past projects.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in the role.
We think you need these skills to ace the Data Engineer (f/m) role in Liverpool
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Data Engineer role. Highlight your experience with Python, SQL, and any data pipeline projects you've worked on. We want to see how you can contribute to our data team!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our needs. Don’t forget to mention your understanding of data quality and governance – it’s key for us!
Showcase Your Projects: If you've worked on any relevant projects, whether in a professional or personal capacity, make sure to include them. We love seeing practical examples of how you've built and maintained data pipelines or worked with GCP services.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen to join the team!
How to prepare for a job interview at Forward Role Recruitment
✨Know Your Tech Inside Out
Make sure you brush up on your Python and SQL skills before the interview. Be ready to discuss how you've used these languages in past projects, especially for building data pipelines or working with APIs. The more specific examples you can provide, the better!
✨Understand the Business Context
Research the company and its approach to data. Understand how they use data to drive decisions and what tools they invest in. This will help you tailor your answers to show that you're not just a tech whiz but also someone who gets the bigger picture.
✨Prepare for Scenario Questions
Expect questions that ask you to solve hypothetical problems related to data quality, governance, or pipeline optimisation. Think through some scenarios beforehand and be ready to explain your thought process and the steps you'd take to tackle these challenges.
✨Show Your Collaborative Spirit
Since this role involves working with various teams, be prepared to discuss how you've collaborated in the past. Share examples of how you've translated data requirements into solutions and how you’ve supported projects from start to finish. Highlighting your teamwork skills will set you apart!