At a Glance
- Tasks: Build and optimise data pipelines for a modern cloud data warehouse.
- Company: Join a rapidly expanding global retail and e-commerce brand.
- Benefits: Enjoy a salary up to £65,000 plus a competitive benefits package.
- Why this job: Take ownership of impactful projects in a dynamic and innovative environment.
- Qualifications: Strong experience with BigQuery, GCP, SQL, and Python is essential.
- Other info: This role is fully remote and open to UK-based applicants.
The predicted salary is between £39,000 and £65,000 per year.
Job Description
DATA ENGINEER
£65,000 + BENEFITS
Remote (UK based)
Looking for a role where you can take ownership of a modern cloud data warehouse, optimise pipelines, and make data more scalable, reliable, and cost-effective?
THE COMPANY:
I'm working with a global retail & e-commerce brand that's expanding rapidly across the UK. With a growing digital and physical presence, their central data team plays a key role in powering insights across e-commerce, retail, finance, and marketing.
THE ROLE:
This is a hands-on data engineering role where you'll focus on building, optimising, and scaling pipelines into BigQuery (GCP) from a wide range of sources. You'll work closely with BI developers and analysts to ensure clean, reliable data is available for reporting and decision-making.
- Manage and optimise pipelines from ERP, website, APIs, app stores, social media, and more.
- Work with BigQuery, Google Buckets, Python scripts, and Talend (transitioning to Dataform/Cloud workflows).
- Support migrations between GCP projects and ensure the warehouse is scalable and cost-efficient.
- Partner with BI teams to prepare and transform data for reporting in Power BI and Looker.
YOUR SKILLS AND EXPERIENCE:
A successful Data Engineer will have the following skills and experience:
- Strong hands-on experience with BigQuery & GCP (essential).
- Solid SQL skills, with a focus on optimisation and performance tuning.
- Experience working with APIs and data pipelines.
- Python for automation and bespoke data jobs.
- Knowledge of BI tools (Power BI / Looker) – nice to have for connecting and enabling reporting.
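To give a flavour of the day-to-day work described above – pulling records from an API, reshaping them in Python, and loading them into BigQuery – here is a rough sketch. The field names, sample record, and table ID are illustrative placeholders, not details from the actual role or its ERP/API sources:

```python
from datetime import datetime, timezone

def normalise_orders(raw_records):
    """Reshape raw API order records into rows ready for a warehouse load.

    All field names here are hypothetical examples, chosen only to
    illustrate typical cleaning steps: stringifying IDs, coercing and
    rounding amounts, lower-casing categorical fields, and stamping a
    load timestamp.
    """
    rows = []
    for rec in raw_records:
        rows.append({
            "order_id": str(rec["id"]),
            "amount_gbp": round(float(rec.get("amount", 0)), 2),
            "channel": rec.get("channel", "unknown").lower(),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return rows

# Illustrative sample record, as an API might return it.
sample = [{"id": 101, "amount": "49.991", "channel": "Web"}]
rows = normalise_orders(sample)
print(rows[0]["order_id"], rows[0]["amount_gbp"], rows[0]["channel"])

# A load step with the official BigQuery client would then be roughly:
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   client.insert_rows_json("project.dataset.orders", rows)
```

In interviews for roles like this, being able to talk through why each cleaning step exists (type coercion, defaults for missing fields, audit timestamps) tends to matter more than the specific code.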
THE BENEFITS:
You will receive a salary of up to £65,000 depending on experience, plus a competitive benefits package.
HOW TO APPLY:
Please register your interest by sending your CV to Molly Bird via the apply link on this page.
Data Engineer GCP employer: Harnham - Data & Analytics Recruitment
Contact Details:
Harnham - Data & Analytics Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer GCP role
✨ Tip Number 1
Familiarise yourself with Google Cloud Platform (GCP) and BigQuery. Make sure you can confidently discuss your experience with these tools, as they are essential for the role. Consider working on personal projects or contributing to open-source projects that utilise GCP to showcase your skills.
✨ Tip Number 2
Network with professionals in the data engineering field, especially those who work with GCP. Join relevant online communities or attend local meetups to connect with others. This can lead to valuable insights and potential referrals that could help you land the job.
✨ Tip Number 3
Prepare to discuss specific examples of how you've optimised data pipelines in the past. Be ready to explain the challenges you faced and how you overcame them, particularly with SQL and Python. This will demonstrate your hands-on experience and problem-solving abilities.
✨ Tip Number 4
Stay updated on the latest trends and best practices in data engineering, especially regarding cloud technologies. Follow industry blogs, podcasts, or webinars to enhance your knowledge. This will not only prepare you for interviews but also show your passion for the field.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with BigQuery and GCP, as well as your SQL skills. Use specific examples of projects where you've optimised data pipelines or worked with APIs to demonstrate your hands-on experience.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your skills align with the job requirements, particularly your experience with Python and BI tools like Power BI or Looker.
Showcase Relevant Projects: If you have any relevant projects or case studies, consider including them in your application. This could be a GitHub link or a portfolio that showcases your work with data engineering, especially in cloud environments.
Proofread Your Application: Before submitting, take the time to proofread your CV and cover letter. Check for any spelling or grammatical errors, and ensure that all information is clear and concise. A polished application reflects your attention to detail.
How to prepare for a job interview at Harnham - Data & Analytics Recruitment
✨ Showcase Your Technical Skills
Make sure to highlight your hands-on experience with BigQuery and GCP during the interview. Be prepared to discuss specific projects where you've optimised data pipelines or worked with APIs, as this will demonstrate your technical proficiency.
✨ Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving abilities in real-world scenarios. Think about challenges you've faced in previous roles, particularly around data optimisation and pipeline management, and be ready to explain how you overcame them.
✨ Familiarise Yourself with BI Tools
While knowledge of BI tools like Power BI and Looker is not mandatory, having a basic understanding can set you apart. Brush up on how these tools connect to data sources and prepare to discuss any relevant experience you have with them.
✨ Demonstrate Your Team Collaboration Skills
Since the role involves working closely with BI developers and analysts, be prepared to talk about your experience collaborating with cross-functional teams. Highlight instances where your teamwork led to successful project outcomes.