At a Glance
- Tasks: Own data engineering, build and maintain pipelines, and create insightful reports.
- Company: Join a dynamic content and social media business in London.
- Benefits: Competitive salary, bonus, training budget, and flexible working arrangements.
- Why this job: Lead your own projects and grow into a department leader.
- Qualifications: Proficient in Python, SQL, and Google BigQuery; experience with data tools is a plus.
- Other info: Exciting opportunity to work with social media data and modern tech.
The predicted salary is between £28,000 and £44,000 per year.
We’re looking for a skilled Data Engineer to join a growing content and social media business. You will own data engineering, so you’ll need to be confident operating as a stand-alone practitioner. With ownership and freedom in the role, you will work end to end across modelling, building and maintaining pipelines, warehousing, and reporting and insights. Long term, you will lead the department and grow with the business.
Key Responsibilities
- Develop, optimise, and maintain robust ETL/ELT data pipelines using BigQuery, SQL, and Python.
- Integrate data from APIs across Meta (Facebook & Instagram), TikTok, YouTube, Twitter/X, etc.
- Create and maintain models in Dataform/dbt.
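To give a flavour of the day-to-day work described above, here is a minimal, illustrative sketch of the "extract and transform" half of such a pipeline. All names (the field names, the `transform` function, the table name) are hypothetical, and the actual APIs and schemas would depend on the platforms involved:

```python
# Illustrative sketch only: normalise raw social-media API records into
# flat rows ready for loading into a BigQuery table. Field names and the
# table name are made up for the example.

def transform(posts):
    """Flatten raw API records into rows for a metrics table."""
    rows = []
    for p in posts:
        impressions = int(p.get("impressions", 0))
        engagements = int(p.get("engagements", 0))
        rows.append({
            "post_id": p["id"],
            "platform": p.get("platform", "unknown"),
            "impressions": impressions,
            # Guard against division by zero for posts with no impressions.
            "engagement_rate": round(engagements / max(impressions, 1), 4),
        })
    return rows

# In a real pipeline the rows would then be loaded with the official
# BigQuery client, e.g.:
#   from google.cloud import bigquery
#   bigquery.Client().insert_rows_json("dataset.post_metrics", rows)

raw = [{"id": "123", "platform": "tiktok",
        "impressions": 2000, "engagements": 150}]
print(transform(raw)[0]["engagement_rate"])  # 0.075
```

In practice the transformation layer would more likely live in Dataform or dbt as SQL models, with Python handling API extraction and orchestration.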
Required Skills & Experience
- Strong proficiency in Python for data processing and pipeline development.
- Solid experience with SQL, including writing efficient and scalable queries.
- Hands-on experience with Google BigQuery, including architecture, performance tuning, and cost optimisation.
- Familiarity with cloud-based data tools and modern data ecosystems.
- Understanding of data modeling, warehousing concepts, and best practices.
Nice-to-Have
- Experience with Airflow, dbt, or similar orchestration/modeling tools.
- Experience with BI tools.
- Knowledge of GCP services (Cloud Storage, Dataflow, Pub/Sub, etc.).
- Social media & content experience.
Starting salary between £35,000 and £55,000, plus bonus (regularly reviewed) and a training budget. Flexible working arrangements, ideally four days a week in the office.
Data Engineer in Slough employer: KnoWho
Contact Detail:
KnoWho Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Slough
✨Tip Number 1
Network like a pro! Reach out to people in the industry on LinkedIn or at local meetups. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data engineering projects, especially those involving Python, SQL, and BigQuery. This will give potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and practical tests. Practice coding challenges in Python and SQL to ensure you're ready to impress when it counts.
✨Tip Number 4
Don’t forget to apply through our website! We’ve got loads of opportunities that might just be the perfect fit for you. Plus, it’s a great way to get noticed by our hiring team.
We think you need these skills to ace the Data Engineer role in Slough
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Python, SQL, and BigQuery, and don’t forget to mention any relevant projects or achievements that showcase your skills.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your skills align with our needs. Be sure to mention your experience with ETL/ELT processes and any social media data integration you've done.
Showcase Your Projects: If you’ve worked on any cool data projects, make sure to include them in your application. Whether it’s a personal project or something from a previous job, showing off your hands-on experience can really set you apart from other candidates.
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and you’ll be able to keep track of your application status directly!
How to prepare for a job interview at KnoWho
✨Know Your Tech Inside Out
Make sure you brush up on your Python, SQL, and BigQuery skills. Be ready to discuss specific projects where you've built or optimised data pipelines. The more you can demonstrate your hands-on experience, the better!
✨Showcase Your Problem-Solving Skills
Prepare to talk about challenges you've faced in previous roles, especially around data integration from APIs or performance tuning. Companies love candidates who can think critically and come up with innovative solutions.
✨Familiarise Yourself with the Company’s Data Ecosystem
Do a bit of research on the tools and technologies the company uses. If they mention Airflow or dbt, have examples ready of how you've used similar tools. This shows you're proactive and genuinely interested in the role.
✨Ask Insightful Questions
Prepare some thoughtful questions about the team structure, future projects, or how they measure success in the data department. This not only shows your enthusiasm but also helps you gauge if the company is the right fit for you.