At a Glance
- Tasks: Join us as a Data Engineer to transform data infrastructure and support CRM integration.
- Company: Be part of a dynamic team driving critical data transformation for a leading client.
- Benefits: Enjoy flexible working options and the chance to work with cutting-edge technology.
- Why this job: This role offers hands-on experience, collaboration, and the opportunity to modernise legacy systems.
- Qualifications: Experience in data engineering, AWS, Python, and a willingness to tackle challenging tasks.
- Other info: Ideal for those looking to make an impact in a fast-paced environment.
The predicted salary is between £36,000 and £60,000 per year.
We are partnering with a client undergoing a critical transformation of their data infrastructure and CRM capabilities. They are seeking a hands-on Data Engineer with strong AWS experience to support two key initiatives: finalising a Braze CRM integration and migrating legacy R-based data pipelines to a modern cloud-native stack.
Phase 1: CRM Data Engineering (Month 1)
- Support the CRM team with data engineering requests.
- QA, deploy, and monitor data pipelines that push third-party formatted data into Braze.
- Manage ad hoc CRM data tasks including journey updates and API integrations.
- Work extensively within AWS using Lambda, API Gateway, and Python to maintain and enhance integrations.
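To picture the Phase 1 work, here is a minimal sketch of a Lambda function that pushes formatted records into Braze's `/users/track` endpoint. The endpoint URL, environment-variable names, and record fields (`customer_id`, `email`) are illustrative assumptions, not details from the role:

```python
import json
import os
import urllib.request

# Hypothetical instance URL and env var names -- adjust to your Braze setup.
BRAZE_ENDPOINT = os.environ.get("BRAZE_ENDPOINT", "https://rest.iad-01.braze.com")
BRAZE_API_KEY = os.environ.get("BRAZE_API_KEY", "")

def build_track_payload(records):
    """Map third-party-formatted records to Braze /users/track attributes."""
    return {
        "attributes": [
            {"external_id": r["customer_id"], "email": r["email"]}
            for r in records
        ]
    }

def handler(event, context):
    """Lambda entry point: push formatted records into Braze."""
    payload = build_track_payload(event.get("records", []))
    req = urllib.request.Request(
        f"{BRAZE_ENDPOINT}/users/track",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {BRAZE_API_KEY}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # real network call; stub in tests
        return {"statusCode": resp.status}
```

Keeping the payload-building logic in its own function makes the pipeline easy to QA without hitting the live Braze API.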
Phase 2: Legacy Pipeline Migration (Months 2-3)
- Analyse and understand existing R-based data pipelines created by data scientists.
- Migrate these pipelines into Airflow, dbt, and Terraform workflows.
- Modernise and scale legacy infrastructure running on AWS.
- Collaborate with engineering teams to ensure a smooth transition and system stability.
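Much of the Phase 2 work is translating R transformations into Python. As a hedged illustration (the column names and logic are invented, not from the role), a typical dplyr step might migrate to pandas like this:

```python
import pandas as pd

# Illustrative only: an R step such as
#   df %>% filter(status == "active") %>%
#     group_by(region) %>% summarise(revenue = sum(revenue))
# translates to pandas roughly as:
def summarise_revenue(df: pd.DataFrame) -> pd.DataFrame:
    """Keep active rows, then sum revenue per region."""
    return (
        df[df["status"] == "active"]
        .groupby("region", as_index=False)["revenue"]
        .sum()
    )
```

Once expressed in SQL-friendly terms like this, the same transformation is a natural candidate for a dbt model.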
Languages & Scripting:
- Python (primary scripting language for Lambda functions)
- SQL (BigQuery, Redshift)
- R (not essential but beneficial for interpreting existing scripts)
Cloud & Infrastructure:
- AWS services including Lambda, API Gateway, S3, CloudWatch, Kinesis Firehose
- Terraform for infrastructure as code
Orchestration & Transformation:
- Apache Airflow
- dbt
CRM & Marketing Tools:
- Braze (preferred)
- Familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is a plus
Candidate Profile
- Proven commercial experience as a data engineer; industry background is not critical.
- Hands-on, pragmatic, and able to deliver quickly with minimal supervision.
- Strong communicator, able to clearly explain technical decisions and project status.
- Willing to take on essential but sometimes tedious tasks without hesitation.
- Practical attitude, especially when working with legacy systems or imperfect code.
- Ideally, experience migrating legacy scripting environments (e.g., R to Python) to modern pipelines.
Contact Detail:
Recruiting Team, Harnham - Data & Analytics Recruitment
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer Contract role
✨Tip Number 1
Familiarise yourself with AWS services, especially Lambda and API Gateway, as these are crucial for the role. Consider building a small project that utilises these services to demonstrate your hands-on experience.
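A small demo project for this tip could be as simple as one Lambda behind an API Gateway proxy integration. A minimal handler sketch (the `name` query parameter is just an example, not part of the role) looks like this:

```python
import json

def handler(event, context):
    """Minimal Lambda for an API Gateway proxy integration.

    API Gateway delivers the HTTP request as a JSON event; the handler
    must return a dict in the proxy response shape
    (statusCode / headers / body).
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Deploying this with Terraform rather than the console would also let you demonstrate the infrastructure-as-code experience the role asks for.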
✨Tip Number 2
Brush up on your Python skills, particularly in the context of data engineering. You might want to explore libraries like Pandas or NumPy, which can be beneficial when working with data pipelines.
✨Tip Number 3
Gain a solid understanding of Apache Airflow and dbt, as these tools will be essential for migrating legacy pipelines. You could try setting up a simple workflow using Airflow to showcase your ability to manage orchestration.
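A simple workflow like the one this tip suggests can be a single DAG file. Here is a hedged sketch assuming Airflow 2.x; the task names and placeholder logic are invented, and the Airflow import is guarded so the pure-Python steps stay testable without Airflow installed:

```python
from datetime import datetime

def extract():
    """Placeholder extract step (e.g. pulling a CSV from S3)."""
    return [{"id": 1, "value": 10}]

def transform(rows):
    """Placeholder transform step (the kind of logic dbt would own)."""
    return [r for r in rows if r["value"] > 0]

# DAG wiring -- requires Apache Airflow 2.x; guarded so this module
# also imports cleanly in environments without Airflow installed.
try:
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    with DAG(
        dag_id="legacy_pipeline_demo",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(
            task_id="transform",
            python_callable=lambda: transform(extract()),
        )
        t1 >> t2  # run extract before transform
except ImportError:
    pass
```

Even a toy DAG like this shows you understand scheduling, task dependencies, and how orchestration separates from transformation logic.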
✨Tip Number 4
If you have any experience with CRM tools like Braze, make sure to highlight it. If not, consider exploring their documentation or tutorials to get a basic understanding of how they work, as this knowledge will be advantageous.
We think you need these skills to ace the Data Engineer Contract role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with AWS and Python. Include specific projects or tasks that demonstrate your ability to manage data pipelines and work with CRM systems like Braze.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your hands-on experience with AWS services and any relevant projects that align with the job description, such as migrating legacy systems or working with data integration.
Showcase Technical Skills: Clearly list your technical skills related to the job, such as proficiency in Python, SQL, and familiarity with tools like Apache Airflow and Terraform. Provide examples of how you've used these skills in previous roles to solve problems or improve processes.
Highlight Communication Abilities: Since strong communication is essential for this role, include examples in your application that demonstrate your ability to explain technical concepts clearly. This could be through past experiences where you collaborated with teams or presented project updates.
How to prepare for a job interview at Harnham - Data & Analytics Recruitment
✨Showcase Your AWS Expertise
Make sure to highlight your experience with AWS services, especially Lambda and API Gateway. Be prepared to discuss specific projects where you've used these tools to solve problems or improve processes.
✨Demonstrate Your Data Pipeline Knowledge
Be ready to talk about your experience with data pipelines, particularly in migrating legacy systems. Discuss any relevant projects where you've worked with Airflow, dbt, or Terraform, and how you approached the challenges involved.
✨Communicate Clearly
Since strong communication is key for this role, practice explaining complex technical concepts in simple terms. Think of examples where you've had to communicate project statuses or technical decisions to non-technical stakeholders.
✨Embrace the Tedious Tasks
Prepare to discuss your attitude towards essential but sometimes tedious tasks. Share examples of when you've taken on such responsibilities and how you maintained motivation and focus while doing so.