Real-Time Data Engineer – Kafka, AWS & Snowflake in Brighton

Brighton · Full-Time · £36,000 - £60,000 / year (est.) · No home office possible

At a Glance

  • Tasks: Build reliable data pipelines for innovative genAI products using Python and Kafka.
  • Company: Tech-driven data company based in Brighton with a focus on innovation.
  • Benefits: Competitive salary, flexible working hours, and opportunities for professional growth.
  • Why this job: Join a dynamic team and make an impact in the exciting world of data engineering.
  • Qualifications: Experience with Python and real-time data streaming platforms like Kafka.
  • Other info: Collaborative environment with a focus on data engineering standards and project ownership.

The predicted salary is between £36,000 and £60,000 per year.

A tech-driven data company in Brighton is seeking a skilled mid-level Data Engineer to build reliable data pipelines for cutting-edge genAI products. The ideal candidate will have strong experience with Python and real-time data streaming platforms like Kafka.

This role involves collaborating with cross-functional teams, contributing to data engineering standards, and taking ownership of well-defined projects.

If you're excited about data, apply even if you don't meet every single requirement!

Real-Time Data Engineer – Kafka, AWS & Snowflake in Brighton employer: Humara

Join a dynamic tech-driven data company in Brighton, where innovation meets collaboration. We offer a supportive work culture that prioritises employee growth through continuous learning opportunities and mentorship. With a focus on cutting-edge technology and a commitment to diversity, we empower our team to take ownership of impactful projects while enjoying the vibrant lifestyle that Brighton has to offer.

Contact Details:

Humara Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Real-Time Data Engineer – Kafka, AWS & Snowflake role in Brighton

Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or be able to refer you directly.

Tip Number 2

Show off your skills! Create a portfolio showcasing your projects, especially those involving Python, Kafka, and AWS. This will give potential employers a taste of what you can do and set you apart from the crowd.

Tip Number 3

Prepare for interviews by brushing up on your technical knowledge and soft skills. Practice common data engineering questions and be ready to discuss how you've tackled challenges in past projects.

Tip Number 4

Don't forget to apply through our website! We love seeing applications come directly from candidates who are excited about joining our team. Plus, it shows you're genuinely interested in what we do!

We think you need these skills to ace the Real-Time Data Engineer – Kafka, AWS & Snowflake role in Brighton

Python
Kafka
Real-Time Data Streaming
Data Pipeline Development
Collaboration
Data Engineering Standards
Project Ownership
Cross-Functional Teamwork

Some tips for your application 🫡

Show Your Passion for Data: When writing your application, let us see your enthusiasm for data engineering! Share any personal projects or experiences that highlight your love for working with data, especially in real-time environments like Kafka.

Tailor Your CV and Cover Letter: Make sure to customise your CV and cover letter to reflect the skills and experiences mentioned in the job description. Highlight your experience with Python and any relevant projects that showcase your ability to build reliable data pipelines.

Be Clear and Concise: We appreciate clarity! Keep your application straightforward and to the point. Use bullet points where possible to make it easy for us to see your key achievements and skills at a glance.

Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can’t wait to see what you bring to the table!

How to prepare for a job interview at Humara

Know Your Tech Stack

Make sure you brush up on your knowledge of Python, Kafka, AWS, and Snowflake. Be ready to discuss how you've used these technologies in past projects, as well as any challenges you faced and how you overcame them.

Show Your Collaborative Spirit

Since this role involves working with cross-functional teams, be prepared to share examples of how you've successfully collaborated with others. Highlight your communication skills and how you contribute to team dynamics.

Demonstrate Ownership

Think of specific projects where you took ownership and drove results. Be ready to explain your thought process, the steps you took, and the impact your work had on the project or team.

Ask Insightful Questions

Prepare a few thoughtful questions about the company's data engineering standards and the genAI products you'll be working on. This shows your genuine interest in the role and helps you assess if it's the right fit for you.
