At a Glance
- Tasks: Join our FinTech team to develop scalable data pipelines and ensure data quality.
- Company: A leading high-tech company with over 6,500 employees and 300 million active users.
- Benefits: Enjoy full remote work flexibility and competitive pay of £450 per day.
- Why this job: Be part of cutting-edge technology in AR, VR, and data analytics while making impactful decisions.
- Qualifications: 5+ years in SQL and Python, strong Apache Airflow experience required.
- Other info: Contract likely extends into 2027, offering long-term opportunities.
The predicted salary is between £54,000 and £90,000 per year.
Our client is a leading, fast-growing global high-tech company with over 6,500 employees across 20+ offices and more than 300 million active users on the platforms they have developed. They are looking for a Senior Data Engineer to join their FinTech team!
You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets.
- Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, data architecture, data warehousing
- Design and develop scalable ETL pipelines to automate data processes and optimize delivery
- Implement and manage data warehousing solutions, ensuring data integrity through rigorous testing and validation
- Lead, plan and execute workflow migration and data orchestration using Apache Airflow (see the sketch after this list)
- Focus on data engineering and data analytics
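For illustration only, and not part of the formal job spec: a minimal sketch of the kind of Airflow DAG this role involves, pairing a single load step with a row-count data-quality gate. The DAG id, the callables, and the hard-coded row count are hypothetical placeholders; it assumes Airflow 2.4+ (which accepts the `schedule` argument), and in practice the load and check would query the warehouse rather than print or return constants.

```python
# Illustrative only: one extract/load task followed by a simple data-quality check.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_daily_transactions(**context):
    # Placeholder for an ETL step, e.g. loading a daily partition into the warehouse.
    print(f"Loading partition for {context['ds']}")


def check_row_count(**context):
    # Placeholder quality gate: fail the run if the load produced no rows.
    row_count = 1_000  # in practice, query the loaded partition for its row count
    if row_count == 0:
        raise ValueError(f"No rows loaded for {context['ds']}")


with DAG(
    dag_id="fintech_daily_transactions",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="load_daily_transactions",
        python_callable=load_daily_transactions,
    )
    quality_check = PythonOperator(
        task_id="check_row_count",
        python_callable=check_row_count,
    )

    # The quality gate only runs after the load succeeds.
    load >> quality_check
```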
Requirements:
- 5+ years of experience in SQL
- 5+ years of development in Python
- MUST have strong experience in Apache Airflow
- Experience with ETL tools, data architecture, and data warehousing solutions
- Strong communication skills
This is a 6-month contract at £450 per day, inside IR35, with likely extensions into 2027.
Senior Data Engineer employer: Widen the Net Limited
Contact Detail:
Widen the Net Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Make sure to showcase your experience with SQL and Python in your conversations. Highlight specific projects where you've used these skills, especially in relation to data pipelines and ETL processes.
✨Tip Number 2
Familiarise yourself with Apache Airflow if you haven't already. Being able to discuss its features and how you've implemented it in past projects will demonstrate your expertise and make you stand out.
✨Tip Number 3
Prepare to discuss your approach to ensuring data quality and integrity. Be ready to share examples of how you've tested and validated data in previous roles, as this is crucial for the position.
✨Tip Number 4
Network with professionals in the FinTech space. Engaging with others in the industry can provide insights into the company culture and expectations, which can be beneficial during interviews.
We think you need these skills to ace the Senior Data Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with SQL, Python, and Apache Airflow. Use specific examples from your past roles that demonstrate your skills in developing scalable data pipelines and managing data warehousing solutions.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your relevant experience and how it aligns with the job requirements, particularly your expertise in ETL processes and data architecture.
Showcase Your Projects: If you have worked on notable projects involving data engineering or analytics, summarise these in your application. Highlight any specific technologies you used, such as BigQuery, Spark, or Hadoop, to demonstrate your hands-on experience.
Proofread Your Application: Before submitting, carefully proofread your application for any spelling or grammatical errors. A polished application reflects your attention to detail, which is crucial for a Senior Data Engineer role.
How to prepare for a job interview at Widen the Net Limited
✨Showcase Your Technical Skills
Be prepared to discuss your experience with SQL, Python, and Apache Airflow in detail. Bring examples of projects where you've developed scalable data pipelines or implemented ETL processes, as this will demonstrate your hands-on expertise.
✨Understand the Company’s Tech Stack
Familiarise yourself with the technologies mentioned in the job description, such as Big Query, Spark, and Hadoop. Showing that you have a grasp of their tech stack will impress the interviewers and show your commitment to the role.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving skills in real-world scenarios. Think about challenges you've faced in previous roles related to data quality and integrity, and be ready to explain how you overcame them.
✨Highlight Your Communication Skills
Since strong communication skills are a requirement, prepare to discuss how you've effectively collaborated with cross-functional teams. Share examples of how you've communicated complex data concepts to non-technical stakeholders.