At a Glance
- Tasks: Join us as a GCP Data Engineer to develop and optimise real-time data processing workflows.
- Company: Staffworx Limited is a UK recruitment consultancy supporting the global e-commerce and software sectors.
- Benefits: Enjoy a hybrid work model with potential for long-term extensions and career growth.
- Why this job: Be part of a digital transformation programme with cutting-edge technologies and impactful projects.
- Qualifications: Proficiency in Python, Java, and Spark, along with familiarity with GCP services, is essential.
- Other info: GCP certification is a strong advantage; this role offers a dynamic and innovative work environment.
The predicted salary is between £36,000 and £60,000 per year.
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme.
- Proficiency in programming languages such as Python, PySpark and Java.
- Develop ETL processes for data ingestion and preparation.
- Experience with Spark SQL, Cloud Run, Dataflow, Cloud Storage, BigQuery, and Data Studio.
- Unix/Linux platform experience.
- Version control tools (Git, GitHub) and automated deployment tools.
- Familiarity with Google Cloud Platform services such as Pub/Sub, BigQuery Streaming, and related technologies.
- Deep understanding of real-time data processing and event-driven architectures.
- Familiarity with data orchestration tools such as Cloud Composer.
- Google Cloud Platform certification(s) are a strong advantage.
- Develop, implement, and optimise real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming (see the sketch below for a flavour of this).
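For candidates who want a concrete feel for this kind of workflow, here is a minimal illustrative sketch using the Apache Beam Python SDK (the SDK behind Dataflow): it reads JSON events from Pub/Sub and streams them into BigQuery. The project, region, subscription, bucket, and table names are placeholders for illustration only, not details of this role.

```python
# Minimal illustrative sketch of a streaming Dataflow pipeline (Apache Beam, Python SDK).
# All resource names below are hypothetical placeholders, not details of this role.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions(
        project="example-project",              # placeholder GCP project
        region="europe-west2",                  # placeholder region
        runner="DataflowRunner",                # swap for DirectRunner to test locally
        temp_location="gs://example-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw messages (bytes) from a Pub/Sub subscription.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            # Decode and parse each message; fields are assumed to match the BigQuery schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Stream rows into a BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```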
6-month initial contract, with likely long-term extensions.
GCP Data Engineer (Java, Spark, ETL) employer: Job Traffic
Contact Detail:
Job Traffic Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer (Java, Spark, ETL) role
✨Tip Number 1
Familiarise yourself with Google Cloud Platform services, especially Dataflow, Pub/Sub, and BigQuery Streaming. Understanding these tools will not only help you in interviews but also demonstrate your commitment to mastering the technologies we use.
✨Tip Number 2
Engage with the GCP community through forums or local meetups. Networking with professionals in the field can provide insights into the role and may even lead to referrals, increasing your chances of landing a position with us.
✨Tip Number 3
Consider obtaining relevant Google Cloud certifications. While not mandatory, they can significantly enhance your profile and show us that you have a solid understanding of GCP services and best practices.
✨Tip Number 4
Practice coding challenges in Java and PySpark, focusing on ETL processes and real-time data processing. This will prepare you for technical interviews and showcase your problem-solving skills effectively.
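By way of illustration, a toy PySpark ETL practice exercise might look like the sketch below; the file paths, schema, and column names are invented purely for practice and are not part of this role.

```python
# Toy PySpark ETL exercise: read raw CSV, clean it, and write partitioned Parquet.
# File paths and column names are invented placeholders for practice purposes.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("practice-etl").getOrCreate()

# Extract: load raw orders from CSV (header row, inferred column types).
orders = spark.read.csv("data/raw/orders.csv", header=True, inferSchema=True)

# Transform: drop incomplete rows, parse the timestamp, and derive a date column.
cleaned = (
    orders
    .dropna(subset=["order_id", "amount"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write the result as Parquet, partitioned by order date.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet("data/curated/orders")

spark.stop()
```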
We think you need these skills to ace the GCP Data Engineer (Java, Spark, ETL) role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with GCP, Java, Spark, and ETL processes. Use specific examples of projects where you've implemented these technologies to demonstrate your skills.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your familiarity with Google Cloud services and how your background aligns with their digital transformation programme.
Showcase Relevant Projects: If you have worked on relevant projects, include them in your application. Detail your role, the technologies used, and the outcomes achieved, especially focusing on real-time data processing and event-driven architectures.
Highlight Certifications: If you hold any Google Cloud Platform certifications, make sure to mention them prominently in your application. This can set you apart from other candidates and show your commitment to the field.
How to prepare for a job interview at Job Traffic
✨Showcase Your Technical Skills
Make sure to highlight your proficiency in programming languages like Python, PySpark, and Java. Be prepared to discuss specific projects where you've developed ETL processes or worked with SparkSQL, as this will demonstrate your hands-on experience.
✨Understand the GCP Ecosystem
Familiarise yourself with Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming. During the interview, be ready to explain how you've used these tools in past projects and how they can benefit the company's digital transformation programme.
✨Discuss Real-Time Data Processing
Since the role involves real-time data processing, prepare to talk about your understanding of event-driven architectures. Share examples of how you've implemented real-time workflows and the challenges you faced, along with how you overcame them.
✨Highlight Your Certification
If you have any Google Cloud Platform certifications, make sure to mention them. Certifications can set you apart from other candidates, so be ready to discuss what you learned during the certification process and how it applies to the role.