At a Glance
- Tasks: Stabilise and structure BigQuery for enterprise-wide data governance and scalability.
- Company: Join a dynamic company focused on innovative data solutions.
- Benefits: Competitive day rate, flexible working, and opportunities for professional growth.
- Why this job: Make a real impact by shaping the future of data architecture in a fast-paced environment.
- Qualifications: Strong experience with BigQuery, data modelling, and analytics architecture.
- Other info: Work 2 days in the office in Central London with a hands-on, outcome-driven approach.
The business is consolidating a growing number of data sources into BigQuery as a core enterprise data platform. Initial focus has been on DTC and ecommerce, with planned expansion across finance, operations, logistics, marketing and others. Current data sources include ecommerce platforms, subscription systems, customer service tools, personalisation platforms, and marketplace integrations. Data is actively consumed via SQL and AI-assisted analysis to power internal reporting applications built in Laravel.
The role exists to stabilise, structure, and future-proof the BigQuery environment so that it can support scale, governance, and enterprise-wide adoption. There is a preference for Infrastructure as Code (IaC), such as Terraform.
Primary objectives: BigQuery architecture and data model ownership
- Review the current BigQuery structure, ingestion patterns, and table design.
- Design and implement a scalable, well-governed data architecture suitable for a global enterprise business.
- Define and implement golden datasets with clear ownership, access rules, and change control.
- Introduce appropriate schema- and field-level controls to prevent uncontrolled changes and data drift.
- Ensure the data model supports downstream analytics, AI-driven querying, and application-level reporting.
- Produce clear documentation explaining the architecture, data model, and usage patterns for both technical and non-technical stakeholders.
- Work alongside the existing data engineering resource to review current data pipelines, models, and delivery practices.
- Assess the effectiveness of current ways of working, technical approaches, and delivery processes against current and future business needs.
- Provide an evidence-based view on strengths, gaps, and areas for improvement across data engineering capability and the operating model.
- Make pragmatic recommendations on role scope, process improvements, upskilling opportunities, and resourcing required to support the target state.
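The schema- and field-level controls mentioned in the objectives can be illustrated with a simple drift check: compare a declared "contract" schema against the schema actually observed in the warehouse. This is a minimal sketch only; the field names and types below are hypothetical examples, not taken from the role.

```python
# Illustrative sketch: detect schema drift between a declared contract
# schema and the schema observed in the warehouse. The example fields
# (order_id, placed_at, total_gbp, channel) are hypothetical.

EXPECTED = {"order_id": "STRING", "placed_at": "TIMESTAMP", "total_gbp": "NUMERIC"}

def schema_drift(expected: dict[str, str], observed: dict[str, str]) -> dict[str, list[str]]:
    """Return fields that were added, removed, or changed type."""
    return {
        "added": sorted(observed.keys() - expected.keys()),
        "removed": sorted(expected.keys() - observed.keys()),
        "retyped": sorted(
            f for f in expected.keys() & observed.keys()
            if expected[f] != observed[f]
        ),
    }

# Simulated "observed" schema: one new field, one silent type change.
observed = {"order_id": "STRING", "placed_at": "TIMESTAMP",
            "total_gbp": "FLOAT64", "channel": "STRING"}
print(schema_drift(EXPECTED, observed))
# -> {'added': ['channel'], 'removed': [], 'retyped': ['total_gbp']}
```

In practice a check like this would run against the live table metadata and fail a deployment when drift is detected, rather than merely printing a report.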
The required tech stack is Google BigQuery, plus Infrastructure as Code tooling and dbt (to be confirmed).
Key deliverables:
- Documented target-state BigQuery architecture and data model.
- Defined and implemented golden tables with clear ownership and governance.
- Standards for data ingestion, transformation, and consumption.
- A practical roadmap for scaling BigQuery usage across additional business functions.
- Clear documentation that enables confident use of data across the organisation.
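The golden-table governance described in the deliverables can be pictured, at its simplest, as a registry recording each curated table's owner, permitted consumer groups, and a version bumped under change control. This is a hedged sketch; every table, team, and group name below is hypothetical.

```python
# Illustrative sketch of a "golden dataset" registry: each curated table
# carries an accountable owner, the groups allowed to query it, and a
# version that is only bumped via change control. All names are made up.
from dataclasses import dataclass

@dataclass(frozen=True)
class GoldenDataset:
    table: str                 # fully qualified table name
    owner: str                 # accountable team or individual
    consumers: frozenset[str]  # groups allowed to query the table
    version: int = 1           # bumped on any approved schema change

REGISTRY = {
    "analytics.golden_orders": GoldenDataset(
        table="analytics.golden_orders",
        owner="data-engineering",
        consumers=frozenset({"finance", "marketing"}),
    ),
}

def can_query(group: str, table: str) -> bool:
    """Access rule: only registered consumer groups may query a golden table."""
    ds = REGISTRY.get(table)
    return ds is not None and group in ds.consumers

print(can_query("finance", "analytics.golden_orders"))    # True
print(can_query("logistics", "analytics.golden_orders"))  # False
```

In a real BigQuery deployment these rules would be enforced through IAM bindings and authorised views rather than application code; the registry simply makes ownership and access explicit and reviewable.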
Required skills and experience:
- Strong hands-on experience designing and operating BigQuery environments at scale.
- Deep understanding of data modelling, analytics architecture, and data governance.
- Experience working with complex, multi-source data environments, ideally including ecommerce and subscription data.
- Experience with data pipeline orchestration tools such as Cloud Composer, Airflow, or equivalent.
- Comfort working in fast-moving environments with imperfect starting points.
- Ability to balance best practice with pragmatism and delivery speed.
- Strong communication skills and ability to explain complex concepts clearly.
- Experience supporting AI-driven analytics or natural-language querying of data.
- Experience working closely with application teams consuming data directly in products or dashboards.
- Background in DTC, retail, or consumer brands.
- Hands-on and delivery-focused.
- Pragmatic and outcome-driven.
- 2 days in the office (Central London) per week.
- Comfortable operating with autonomy.
- Able to challenge existing approaches constructively.
- Focused on clarity, documentation, and long-term sustainability.
What success looks like:
- BigQuery is trusted as a scalable, governed enterprise data platform.
- Golden datasets (curated, business-validated tables that serve as the single source of truth) are clearly defined, locked down, and actively used.
- The business is unblocked to expand data usage across finance, operations, and other functions.
- There is clear visibility on the current operating model and what is required to support future growth.
Senior BigQuery Data Engineer - Contract in Slough employer: Augustinus Bader
Contact Detail:
Augustinus Bader Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior BigQuery Data Engineer - Contract in Slough
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with BigQuery. A friendly chat can lead to insider info about job openings or even referrals that could give you a leg up.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your BigQuery projects and any relevant data models you've designed. This gives potential employers a tangible look at what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled challenges in previous roles, especially around data governance and architecture. We want to see your problem-solving skills in action!
✨Tip Number 4
Don't forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior BigQuery Data Engineer role. Highlight your experience with BigQuery, data modelling, and any relevant tech stack you've worked with. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific projects where you've designed or operated BigQuery environments. Share details about the challenges you faced and how you overcame them. This helps us understand your hands-on experience and problem-solving skills.
Be Clear and Concise: When writing your application, keep it clear and to the point. Use straightforward language to explain complex concepts, as we value strong communication skills. Remember, clarity is key for both technical and non-technical stakeholders!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you're keen on joining our team at StudySmarter!
How to prepare for a job interview at Augustinus Bader
✨Know Your BigQuery Inside Out
Make sure you brush up on your BigQuery knowledge before the interview. Understand its architecture, data modelling, and governance principles. Be ready to discuss your hands-on experience with designing and operating BigQuery environments at scale, as this will be crucial for the role.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've tackled complex, multi-source data environments in the past. Highlight your ability to balance best practices with pragmatism, especially in fast-moving situations. This will demonstrate your capability to stabilise and future-proof the BigQuery environment.
✨Communicate Clearly and Confidently
Since strong communication skills are essential for this role, practice explaining complex concepts in simple terms. Think about how you would present your ideas to both technical and non-technical stakeholders. This will show that you can produce clear documentation and facilitate understanding across the organisation.
✨Be Ready to Discuss Governance and Scalability
Familiarise yourself with the concepts of data governance and scalability in relation to BigQuery. Be prepared to discuss how you would define and implement golden datasets, access rules, and change control. This will highlight your strategic thinking and ability to support enterprise-wide adoption of the data platform.