Junior Data Engineer - Build Reliable Data Pipelines (Remote) in London

London | Entry level | £30,000 – £40,000 / year (est.) | Home office possible
Smart Communications group

At a Glance

  • Tasks: Build and maintain reliable data pipelines for analytics and reporting.
  • Company: Join a leading tech company focused on meaningful customer conversations.
  • Benefits: Competitive salary, remote work, health insurance, and 25 days holiday plus your birthday off.
  • Other info: Dynamic, flexible work environment with excellent career growth opportunities.
  • Why this job: Gain hands-on experience with modern cloud data technologies and tackle real-world data challenges.
  • Qualifications: 1-3 years in data engineering, strong Python and SQL skills, eagerness to learn.

The predicted salary is between £30,000 and £40,000 per year.

We are looking for a Junior Data Engineer who’s excited to contribute to a growing Data Operations function within a SaaS business. In this role, you’ll help build and maintain reliable, high-quality data pipelines that support internal reporting, analytics, and operational needs across the company. Working closely with the Data Engineer and the wider Data Operations team, you’ll help develop scalable data processes, maintain trusted datasets, and support the smooth operation of our enterprise data platform.

You’ll assist in integrating new internal data sources, improving data quality, and helping to create well-governed data assets that enable accurate reporting, meaningful insights, and future analytical work.

This is an excellent opportunity for someone early in their data engineering career who wants hands-on experience with modern cloud data technologies, solid engineering best practices, and real-world data challenges. You’ll grow your technical skills while contributing to initiatives that help teams across the business make informed, data-driven decisions.

The responsibilities of the role include:

  • Build and maintain reliable data pipelines that load and transform data within our enterprise data lake.
  • Develop and maintain data transformations using PySpark and SQL to support internal reporting and analytics needs.
  • Ingest and integrate data from internal and approved external sources using a variety of integration patterns and APIs.
  • Apply and monitor data quality checks to help ensure accuracy, completeness, and consistency across datasets.
  • Support day-to-day data operations, including job monitoring, incident triage, reruns, backfills, and general platform maintenance.
  • Assist with root-cause analysis and post-incident improvements to enhance pipeline reliability.
  • Collaborate with stakeholders to gather clear data and reporting requirements, document outcomes, and translate needs into well-defined technical tasks.
  • Record and manage work items in JIRA, ensuring tasks are well-organized, up-to-date, and clearly documented.
  • Contribute to data governance activities, including data cataloging, metadata upkeep, documentation, and adherence to security and access standards.
  • Maintain clear technical documentation for data pipelines, workflows, and operational procedures.
  • Work with the Data Engineer and wider Data Operations team to support analytics, reporting initiatives, and internal data consumers.
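To give a flavour of the data quality checks described above (accuracy, completeness, consistency), here is a minimal pure-Python sketch. The column names (`order_id`, `amount`, `currency`) and rules are hypothetical, purely for illustration; in practice such checks would run inside the pipeline, e.g. as PySpark DataFrame filters or via a validation framework.

```python
# Minimal sketch of row-level data quality checks (completeness and
# consistency), using plain dicts in place of Spark rows.
# All column names and rules here are hypothetical examples.

REQUIRED = ("order_id", "amount", "currency")
VALID_CURRENCIES = {"GBP", "USD", "EUR"}

def check_row(row: dict) -> list[str]:
    """Return a list of data quality failures for a single record."""
    failures = []
    for col in REQUIRED:
        if row.get(col) in (None, ""):
            failures.append(f"missing:{col}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        failures.append("invalid:amount_negative")
    if row.get("currency") and row["currency"] not in VALID_CURRENCIES:
        failures.append("invalid:currency")
    return failures

def partition_rows(rows):
    """Split records into clean rows and quarantined (row, reasons) pairs."""
    clean, quarantined = [], []
    for row in rows:
        failures = check_row(row)
        if failures:
            quarantined.append((row, failures))
        else:
            clean.append(row)
    return clean, quarantined
```

Failed rows are kept with their failure reasons rather than silently dropped, which supports the incident triage and root-cause analysis work the role also covers.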

What we’re looking for:

Must have skills and experience:

  • 1–3 years’ experience in data engineering, analytics engineering, or a related role.
  • Bachelor’s degree (or equivalent experience) in computer science, data & analytics, or a related discipline.
  • Solid knowledge of Python and SQL.
  • Hands-on experience building or maintaining data pipelines.
  • Practical experience with PySpark (DataFrames and Spark SQL).
  • Familiarity with cloud storage concepts (e.g. Amazon S3).
  • Experience working with data tables and basic data modelling concepts.
  • Experience writing technical documentation.
  • Strong organizational and time management skills.
  • Strong problem-solving skills and attention to detail.
  • Eagerness to learn and grow in the field of data engineering.

Advantageous skills/experience:

  • Experience working with Databricks and Delta Lake.
  • Experience integrating data via REST APIs.
  • Exposure to data quality frameworks or validation techniques.
  • Basic understanding of data governance or access control concepts.
  • Experience with AWS-based data platforms.
  • Familiarity with BI and reporting tools such as Power BI, Tableau, or AWS QuickSight.
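As a taste of the REST API integration work listed above, here is a small sketch of paginated ingestion. The endpoint shape, field names (`items`, `next`), and bearer-token auth are assumptions for illustration only; real APIs vary.

```python
# Illustrative sketch of paginated REST API ingestion.
# The response structure ("items" list, "next" page URL) and the
# Authorization header scheme are hypothetical assumptions.
import json
from urllib.request import Request, urlopen

def parse_page(payload: dict):
    """Extract the records and the next-page URL from one API response."""
    return payload.get("items", []), payload.get("next")

def ingest(url: str, token: str, fetch=None) -> list[dict]:
    """Follow pagination links until exhausted, collecting all records.

    `fetch` can be injected for testing; by default it performs an
    authenticated HTTP GET and decodes the JSON body.
    """
    if fetch is None:
        def fetch(u):
            req = Request(u, headers={"Authorization": f"Bearer {token}"})
            with urlopen(req) as resp:
                return json.load(resp)
    records = []
    while url:
        items, url = parse_page(fetch(url))
        records.extend(items)
    return records
```

Separating the HTTP call from the pagination logic keeps the ingestion loop testable without network access, which fits the engineering best practices the role emphasises.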

We look for the following SMART values in everyone we hire at Smart Communications:

  • Speak Openly - We are positive, creative, helpful, kind and we have fun. We listen and provide constructive feedback. Through meaningful conversations we encourage each other to be the best that we can be. We’re not complainers; we’re problem solvers.
  • Make a Difference - We focus on the things that matter and prioritize the things that have the greatest impact. We celebrate success and hold ourselves accountable for our choices. We don’t sit on the sidelines.
  • Agile & Flexible - We are focused on evolving, improving and growing. We think differently and challenge the status quo with open minds. We ask ‘why?’ so that we can help remove complexity. We don’t allow hurdles to get in our way.
  • Results-Focused - We get stuff done by being efficient, working at pace and paying attention to detail. We focus on finding solutions and fixing things. We don’t believe in being busy for the sake of being busy; we focus on productivity.
  • Teamwork - We are stronger and better together. We collaborate, trust and support each other to deliver results for our company and our customers. We don’t want anyone to feel disengaged; we’re in this together!

What’s the deal? We will provide you with the tools, equipment and support to give you the best possible chance of success and of over-achieving your goals. Salary will depend on your experience and will be highly competitive. All our packages include an annual bonus based on the company’s performance, so we are all incentivized to over-achieve!

In addition to a friendly, flexible and fun working environment, we provide a range of other benefits, including extensive health insurance, income protection, life assurance, subsidised gym membership, leisure travel insurance, pension contribution, Cycle2Work and childcare vouchers, as well as 25 days’ holiday allowance plus an additional day off for your birthday!

Located in Covent Garden, our offices are comfortable, flexible, and always stocked with free beverages and fresh fruit. This role, however, is fully remote.

So, if we interest you, please let us know by applying for this position and tell us all about yourself.

Please note: we only consider applicants with current legal right to work in the countries in which our positions are based. All qualified applicants will receive consideration for employment regardless of race, colour, religion, sex, national origin, sexual orientation, age, disability, marital status or gender identity.

Junior Data Engineer - Build Reliable Data Pipelines (Remote) in London employer: Smart Communications group

Smart Communications is an exceptional employer that fosters a collaborative and innovative work culture, perfect for those looking to kickstart their data engineering career. With a strong focus on employee growth, you will gain hands-on experience with cutting-edge cloud technologies while enjoying a competitive salary, extensive benefits, and a flexible remote working environment. Located in the vibrant Covent Garden area, our team thrives on open communication, teamwork, and making a meaningful impact across the business.

Contact Detail:

Smart Communications group Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Junior Data Engineer - Build Reliable Data Pipelines (Remote) role in London

✨Tip Number 1

Network like a pro! Reach out to people in the industry, attend virtual meetups, and connect with current employees at Smart Communications. A friendly chat can sometimes lead to job opportunities that aren’t even advertised!

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your data projects, whether it’s building pipelines or working with SQL. Having tangible examples of your work can really impress hiring managers and set you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on common data engineering questions and scenarios. Practice explaining your thought process when solving problems, as this will demonstrate your analytical skills and eagerness to learn.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in joining the Smart Communications team!

We think you need these skills to ace the Junior Data Engineer - Build Reliable Data Pipelines (Remote) role in London

Data Engineering
Python
SQL
PySpark
Data Pipeline Development
Data Quality Checks
Data Integration
Technical Documentation
Cloud Storage Concepts
Data Modelling
Problem-Solving Skills
Organisational Skills
Time Management
Data Governance

Some tips for your application 🫡

Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Junior Data Engineer role. Highlight your experience with Python, SQL, and any data pipeline projects you've worked on. We want to see how you can contribute to our Data Operations team!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and why you're excited about joining us at Smart Communications. Be sure to mention specific projects or technologies that resonate with the job description.

Show Off Your Problem-Solving Skills: In your application, don’t shy away from showcasing your problem-solving abilities. Whether it’s through examples of past projects or challenges you’ve overcome, we love to see how you tackle real-world data issues!

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen to join our team!

How to prepare for a job interview at Smart Communications group

✨Know Your Data Tools

Make sure you brush up on your knowledge of Python, SQL, and PySpark before the interview. Be ready to discuss how you've used these tools in past projects or coursework, as they are crucial for building and maintaining data pipelines.

✨Show Your Problem-Solving Skills

Prepare to share examples of how you've tackled data-related challenges. Think about specific instances where you identified a problem, implemented a solution, and what the outcome was. This will demonstrate your analytical thinking and attention to detail.

✨Understand Data Governance

Familiarise yourself with data governance concepts and why they matter. Be prepared to discuss how you would ensure data quality and security in your work. This shows that you understand the importance of well-governed data assets.

✨Ask Insightful Questions

At the end of the interview, don’t hesitate to ask questions about the team dynamics, ongoing projects, or the company’s approach to data operations. This not only shows your interest but also helps you gauge if the role is the right fit for you.
