At a Glance
- Tasks: Lead data engineering projects using Google Cloud to create innovative data solutions.
- Company: Join a forward-thinking consultancy that values ingenuity and diversity.
- Benefits: Enjoy private medical insurance, generous leave, and a supportive work culture.
- Why this job: Make a real impact by transforming complex data into actionable insights.
- Qualifications: Experience with GCP tools and a passion for mentoring others.
- Other info: Dynamic team environment with endless opportunities for personal and professional growth.
The predicted salary is between £36,000 and £60,000 per year.
We’re an innovation and transformation consultancy that believes in the power of ingenuity to build a positive human future in a technology-driven world. Our diverse teams of experts combine innovative thinking with breakthrough technologies to progress further, faster. With a global network of FTSE 100 and Fortune 500 clients, we’ll offer you unrivalled opportunities for growth and the freedom to excel. Combining strategies, technologies and innovation, we turn complexity into opportunity and deliver enduring results, enabling you to build a lasting career.
As a Principal GCP Data Engineer, you will be a true subject matter expert in using the data processing and management capabilities of Google Cloud to develop data-driven solutions for our clients. You will typically lead a team or the solution delivery effort, demonstrating technical excellence through leading by example. You could be providing technical support, leading an engineering team or working across multiple teams as a subject matter expert who is critical to the success of a large programme of work. Your team members will look to you as a trusted expert and will expect you to define the end-to-end software development lifecycle in line with modern best practices.
As part of your responsibilities, you will be expected to:
- Develop robust data processing jobs using tools such as Google Cloud Dataflow, Dataproc and BigQuery (a minimal Beam sketch follows this list).
- Design and deliver automated data pipelines that use orchestration tools such as Cloud Composer (see the DAG sketch after this list).
- Design end-to-end solutions and contribute to architecture discussions beyond data processing.
- Own the development process for your team, building strong principles and putting robust methods and patterns in place across architecture, scope, code quality and deployments.
- Shape team behaviour for writing specifications and acceptance criteria, estimating stories, sprint planning and documentation.
- Actively define and evolve PA’s data engineering standards and practices, ensuring we maintain a shared, modern and robust approach.
- Lead and influence technical discussions with client stakeholders to achieve the collective buy-in required to be successful.
- Coach and mentor team members, regardless of seniority, and work with them to build their expertise and understanding.
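To give a flavour of the first responsibility above, here is a minimal, illustrative Apache Beam pipeline of the kind you might deploy to Dataflow: it reads CSV events from Cloud Storage, parses them and appends them to a BigQuery table. The project, bucket and table names (and the one-line schema) are hypothetical placeholders, not part of the role description.

```python
# Minimal Apache Beam pipeline sketch: Cloud Storage -> transform -> BigQuery.
# All resource names below are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Split a CSV line ("user_id,amount") into a BigQuery-ready row."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",            # use "DirectRunner" to test locally
        project="example-project",          # hypothetical project id
        region="europe-west2",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "Parse" >> beam.Map(parse_event)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same code runs as a Dataflow job or locally under the DirectRunner, which is part of why Beam is a common choice for the batch and streaming work described in the qualifications below.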
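On the orchestration side, here is a sketch of a minimal Cloud Composer (Apache Airflow) DAG that schedules a daily BigQuery aggregation; the DAG id, SQL and table names are again hypothetical.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: one daily BigQuery job.
# DAG id, SQL and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # {{ ds }} is Airflow's templated run date.
                "query": (
                    "INSERT INTO analytics.events_daily "
                    "SELECT user_id, SUM(amount) AS total, DATE('{{ ds }}') AS day "
                    "FROM analytics.events "
                    "WHERE DATE(event_ts) = '{{ ds }}' "
                    "GROUP BY user_id"
                ),
                "useLegacySql": False,
            }
        },
    )
```

In a real engagement the DAG would chain several such tasks (ingest, validate, transform, publish) with explicit dependencies.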
Qualifications
To be successful in this role, you will need to have:
- Experience delivering and deploying production-ready data processing solutions using BigQuery, Pub/Sub, Dataflow and Dataproc.
- Experience developing end-to-end solutions using batch and streaming frameworks such as Apache Spark and Apache Beam.
- Expert understanding of when to use a range of data storage technologies including relational/non-relational, document, row-based/columnar data stores, data warehousing and data lakes.
- Expert understanding of data pipeline patterns and approaches such as event-driven architectures, ETL/ELT, stream processing and data visualisation.
- Experience working with business owners to translate business requirements into technical specifications and solution designs that satisfy the data requirements of the business.
- Experience working with metadata management products such as Cloud Data Catalog or Collibra, and data governance tools such as Dataplex.
- Experience in developing solutions on GCP using cloud-native principles and patterns.
- Experience building data quality alerting and data quarantine solutions to ensure downstream datasets can be trusted (illustrated in the first sketch after this list).
- Experience implementing CI/CD pipelines using techniques including git code control/branching, automated tests and automated deployments (see the test sketch after this list).
- Comfortable working in an Agile team using Scrum or Kanban methodologies.
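To illustrate the data quality point above, here is a minimal sketch, assuming a hypothetical analytics.events table and a single validation rule, of a quarantine-and-alert step built on the google-cloud-bigquery client: rows that fail validation are copied to a quarantine table, removed from the trusted table, and counted for alerting.

```python
# Data-quality gate sketch: quarantine failing rows, then alert.
# Project, dataset, table and validation rule are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

RULE = "amount IS NULL OR amount < 0"  # hypothetical validation rule

# 1. Copy failing rows into the quarantine table with a timestamp.
client.query(
    "INSERT INTO analytics.events_quarantine "
    f"SELECT *, CURRENT_TIMESTAMP() AS quarantined_at FROM analytics.events WHERE {RULE}"
).result()

# 2. Remove them from the trusted table so downstream consumers never see them.
client.query(f"DELETE FROM analytics.events WHERE {RULE}").result()

# 3. Alert if anything was quarantined recently (wire into Cloud Monitoring,
#    PagerDuty or Slack in a real deployment).
rows = client.query(
    "SELECT COUNT(*) AS n FROM analytics.events_quarantine "
    "WHERE quarantined_at > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)"
).result()
n = next(iter(rows))["n"]
if n > 0:
    print(f"ALERT: {n} rows quarantined in the last 24 hours")
```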
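And for the CI/CD point, a small automated test of the kind such a pipeline would run on every branch, using Beam's built-in testing utilities to check the hypothetical parse_event transform from the earlier Dataflow sketch:

```python
# Unit test sketch for the parse_event transform, runnable under pytest.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def parse_event(line: str) -> dict:
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def test_parse_event():
    # The pipeline runs (and the assertion fires) when the context exits.
    with TestPipeline() as p:
        output = p | beam.Create(["u1,9.99", "u2,0.50"]) | beam.Map(parse_event)
        assert_that(
            output,
            equal_to(
                [
                    {"user_id": "u1", "amount": 9.99},
                    {"user_id": "u2", "amount": 0.5},
                ]
            ),
        )
```

A CI pipeline (Cloud Build, GitHub Actions or similar) would run tests like this on every push before an automated deployment is allowed to proceed.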
In addition to the above, we would be thrilled if you also had:
- Experience working on migrations of enterprise-scale data platforms, including Hadoop and traditional data warehouses.
- An understanding of the machine learning model development lifecycle, including feature engineering, training and testing.
- A good understanding of, or hands-on experience with, Kafka.
- Experience as a DBA or developer on an RDBMS such as PostgreSQL, MySQL, Oracle or SQL Server.
- Experience designing data applications to meet non-functional requirements such as performance and availability.
You are pragmatic and already understand that writing code is only part of what a data engineer does. You communicate clearly with both clients and peers, describing technical issues and solutions in writing as well as in meetings and workshops, and you can explain technical concepts to non-technical audiences at all levels of an organisation. You can influence and persuade senior and specialist client stakeholders, potentially across multiple organisational boundaries, without direct authority. You are a confident problem solver and troubleshooter, and you are generous in sharing your specialist knowledge, ideas and solutions. You are constantly learning and make others better by consciously teaching and unconsciously inspiring.
Additional information
Benefits package at PA:
- Private medical insurance.
- Interest-free season ticket loan.
- 25 days' annual leave, with the opportunity to buy 5 additional days.
- Company pension scheme.
- Annual performance-based bonus.
- Life and income protection insurance.
- Tax efficient benefits (cycle to work, give as you earn, childcare benefits).
- Voluntary benefits (dental, critical illness, spouse/partner life assurance).
PA is committed to building an inclusive and supportive culture where diversity thrives, and all of our people can excel. We believe that greater diversity stimulates innovation, enabling us to fulfil our purpose of ‘Bringing Ingenuity to Life’, supporting the growth of our people, and delivering more enduring results for our clients.
Google Cloud Platform Data Engineer in Bristol (employer: Astro Studios, Inc.)
Contact Details:
Astro Studios, Inc. Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Google Cloud Platform Data Engineer role in Bristol
✨Tip Number 1
Network like a pro! Reach out to current employees at PA on LinkedIn. Ask them about their experiences and any tips they might have for landing the GCP Data Engineer role. Personal connections can make a huge difference!
✨Tip Number 2
Prepare for the interview by brushing up on your technical skills. Make sure you can confidently discuss Google Cloud tools like BigQuery and Dataflow. Practise explaining complex concepts in simple terms, as you'll need to communicate effectively with both technical and non-technical stakeholders.
✨Tip Number 3
Showcase your problem-solving skills during interviews. Be ready to tackle hypothetical scenarios or case studies related to data engineering. This will demonstrate your ability to think critically and apply your knowledge in real-world situations.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, it shows you're genuinely interested in joining PA's innovative team.
We think you need these skills to ace the Google Cloud Platform Data Engineer role in Bristol
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the GCP Data Engineer role. Highlight your experience with Google Cloud tools like BigQuery and Dataflow, and don’t forget to mention any relevant projects that showcase your skills!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your expertise aligns with PA's purpose of ‘Bringing Ingenuity to Life’. Keep it engaging and personal!
Showcase Your Technical Skills: Be specific about your technical skills in your application. Mention your experience with data pipelines, CI/CD processes, and any cloud-native principles you've applied. We love seeing concrete examples of your work!
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to track your application status directly. Let’s get started on this journey together!
How to prepare for a job interview at Astro Studios, Inc.
✨Know Your GCP Tools Inside Out
Make sure you’re well-versed in Google Cloud tools like BigQuery, Dataflow, and Dataproc. Be ready to discuss how you've used these technologies in past projects, as this will show your practical experience and expertise.
✨Prepare for Technical Discussions
Expect to engage in technical discussions about data pipeline patterns and architectures. Brush up on event-driven architectures and ETL/ELT processes, and be prepared to explain your thought process clearly to both technical and non-technical stakeholders.
✨Showcase Your Leadership Skills
As a Principal GCP Data Engineer, you'll likely lead teams. Prepare examples of how you've coached or mentored team members in the past, and be ready to discuss how you foster collaboration and innovation within a team.
✨Demonstrate Problem-Solving Abilities
Be prepared to tackle hypothetical scenarios or case studies during the interview. Show how you approach problem-solving, from identifying issues to implementing solutions, and highlight your ability to communicate complex ideas simply.