At a Glance
- Tasks: Create and maintain regulatory reporting systems on Google Cloud Platform.
- Company: Join a global leader in the tech industry with a passion for innovation.
- Benefits: Enjoy remote work, competitive salary, eye care, flu vaccinations, and life assurance.
- Other info: Collaborative environment with opportunities for personal and professional growth.
- Why this job: Make an impact by developing cloud-native solutions and exploring AI-driven automation.
- Qualifications: Experience with GCP, SQL development, and data pipeline orchestration.
The predicted salary is between £50,000 and £70,000 per year.
As a Regulatory Data Developer, you will play a key role in creating, migrating, and maintaining regulatory reporting systems on Google Cloud Platform (GCP).
The Regulatory Data team is a crucial function of the Data Engineering department, responsible for building and maintaining near real-time and batch reporting systems to fulfil regulatory requirements. The team utilises complex business logic to transform internal data models into those required by industry regulators and third parties.
Using your passion for delivering innovative solutions, you will support our continued migration from on-premise SQL Server systems to GCP as the team creates new GCP reporting systems, while also developing new cloud-native solutions and exploring AI-driven development and automation to improve efficiency, consistency, and reliability across the team's processes.
Preferred Skills and Experience:
- Experience with Google BigQuery.
- SQL development experience, with the ability to write complex queries, stored procedures, and performance-tune data-intensive workloads.
- Familiarity with GCP services such as Cloud Storage, Cloud Functions, Pub/Sub, or Cloud Composer.
- Experience with data pipeline development (ETL/ELT) and orchestration tooling.
- Exposure to Infrastructure as Code, such as Terraform, and to version control and CI/CD tooling, such as Git and GitLab.
- Methodical, with high attention to detail and the ability to break down complex requirements into simple solutions.
- Interest in AI-assisted development tooling, such as Claude Code.
- Ability to work to deadlines in a fast-paced, reactive environment.
- SQL Server experience to support legacy systems during migration.
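Purely as an illustration of the SQL skills listed above, here is a minimal sketch of "latest state per key" logic, a window-function pattern common in regulatory reporting. The table and column names are made up, and Python's built-in sqlite3 stands in for BigQuery; BigQuery's Standard SQL supports the same `ROW_NUMBER() OVER (PARTITION BY …)` construct.

```python
import sqlite3

# In-memory SQLite stands in for BigQuery; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (
    trade_id TEXT,
    version  INTEGER,
    status   TEXT,
    notional REAL
);
INSERT INTO trades VALUES
    ('T1', 1, 'NEW',     100.0),
    ('T1', 2, 'AMENDED', 150.0),
    ('T2', 1, 'NEW',      75.0);
""")

# Window function picks the latest version of each trade --
# the kind of "latest state" logic regulatory submissions often need.
rows = conn.execute("""
    SELECT trade_id, status, notional FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY trade_id ORDER BY version DESC
               ) AS rn
        FROM trades
    ) WHERE rn = 1
    ORDER BY trade_id
""").fetchall()

print(rows)  # [('T1', 'AMENDED', 150.0), ('T2', 'NEW', 75.0)]
```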
What you will be doing:
- Developing and maintaining regulatory data submission systems, both existing on-premise and new cloud-based solutions in GCP/BigQuery.
- Building and maintaining automated data pipelines for ingestion, transformation, and loading of regulatory data.
- Supporting the migration of legacy SQL Server solutions to BigQuery, working closely with seniors and technical leads on architecture and data modelling.
- Implementing data validation, monitoring, and alerting to ensure accuracy, consistency, and reliability of regulatory submissions.
- Leveraging AI tooling, such as Claude Code, and contributing to the team's automation strategies to reduce manual effort and improve quality.
- Participating in code reviews and adhering to departmental standards and the development process.
- Conducting QA to ensure solutions are accurate, efficient, and performant.
- Creating and maintaining relevant documentation.
- Collaborating with other team members to deliver high-quality solutions within required timeframes.
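One of the responsibilities above, validating data before submission, can be sketched as simple rule checks that return human-readable errors for monitoring and alerting. The field names and rules here are entirely hypothetical, not bet365's actual validation logic.

```python
# Hypothetical validation rules for a regulatory record before submission.
REQUIRED_FIELDS = ("trade_id", "notional", "currency")

def validate(record: dict) -> list:
    """Return a list of human-readable validation errors (empty if valid)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            errors.append(f"missing field: {field}")
    if isinstance(record.get("notional"), (int, float)) and record["notional"] < 0:
        errors.append("notional must be non-negative")
    return errors

good = {"trade_id": "T1", "notional": 100.0, "currency": "GBP"}
bad = {"trade_id": "", "notional": -5.0, "currency": "GBP"}
print(validate(good))  # []
print(validate(bad))   # ['missing field: trade_id', 'notional must be non-negative']
```

In practice these checks would feed a monitoring dashboard or alert rather than a print statement.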
Bonus:
- Eye care and flu vaccinations
- Life Assurance
Life at bet365: We are a unique global operator with the passion and drive to be the best in the industry. Our values form the foundation of our culture and shape the unique way that we work. People are our superpower, and we support you to be the best you can be.
Data Engineer GCP (Remote) employer: bet365 Group
Contact Detail:
bet365 Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer GCP (Remote)
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, especially those already working at companies you're eyeing. A friendly chat can open doors and give you insider info that could help you stand out.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repo showcasing your projects, especially those related to GCP and data engineering. This gives potential employers a taste of what you can do beyond just your CV.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and scenarios. Practice explaining your thought process when tackling complex problems, as this will demonstrate your methodical approach and attention to detail.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining our team!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with GCP, SQL development, and any relevant projects that showcase your skills in building and maintaining data systems.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about regulatory data and how your background makes you a perfect fit for our team. Don’t forget to mention your interest in AI-driven development!
Showcase Your Technical Skills: Be specific about your technical skills in your application. Mention your experience with Google BigQuery, ETL processes, and any familiarity with tools like Terraform or CI/CD pipelines. We love seeing concrete examples!
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status directly!
How to prepare for a job interview at bet365 Group
✨Know Your GCP Stuff
Make sure you brush up on your Google Cloud Platform knowledge, especially BigQuery and other services mentioned in the job description. Be ready to discuss how you've used these tools in past projects or how you would approach specific tasks related to regulatory data systems.
✨Show Off Your SQL Skills
Prepare to demonstrate your SQL prowess. You might be asked to write complex queries or explain how you would performance-tune data-intensive workloads. Practise writing stored procedures and think about examples where you've optimised queries in the past.
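For the performance-tuning part of this tip, one concrete thing to practise explaining is how an index changes a query plan. This sketch uses SQLite's `EXPLAIN QUERY PLAN` as a stand-in for BigQuery's execution details, with a made-up table; the exact plan wording varies by SQLite version.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE submissions (ref TEXT, payload TEXT)")

# Without an index, a lookup by ref scans the whole table.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM submissions WHERE ref = ?", ("R1",)
).fetchall()
print(before[0][-1])  # e.g. "SCAN submissions" (wording varies by version)

# Adding an index turns the full scan into an index search.
conn.execute("CREATE INDEX idx_submissions_ref ON submissions (ref)")
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM submissions WHERE ref = ?", ("R1",)
).fetchall()
print(after[0][-1])  # e.g. "SEARCH submissions USING INDEX idx_submissions_ref (ref=?)"
```

Being able to narrate this before/after difference is a simple way to show you think about data-intensive workloads, even if the interview uses a different engine.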
✨Talk About Data Pipelines
Since the role involves developing data pipelines, be ready to discuss your experience with ETL/ELT processes and orchestration tools. Share specific examples of how you've built or maintained data pipelines, and highlight any challenges you faced and how you overcame them.
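The ETL/ELT experience this tip mentions can be demonstrated with a minimal extract-transform-load loop. The source records and target table below are invented for illustration, with sqlite3 again standing in for the warehouse.

```python
import sqlite3

# Extract: hypothetical raw records, e.g. parsed from a source feed.
raw = [
    {"id": "A1", "amount": "10.50", "ccy": "gbp"},
    {"id": "A2", "amount": "3.25",  "ccy": "usd"},
]

# Transform: coerce types and normalise values.
clean = [(r["id"], float(r["amount"]), r["ccy"].upper()) for r in raw]

# Load: write into a target table (SQLite stands in for the warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id TEXT, amount REAL, ccy TEXT)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 13.75
```

Real pipelines add scheduling (e.g. Cloud Composer), retries, and validation around this skeleton, which is exactly the kind of detail worth mentioning in an interview.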
✨Emphasise Collaboration and Attention to Detail
This role requires working closely with team members and adhering to high standards. Prepare to talk about how you ensure accuracy and reliability in your work, and give examples of how you've collaborated effectively in a team setting, especially under tight deadlines.