At a Glance
- Tasks: Design and implement data pipelines, ensuring quality and integrity while solving integration challenges.
- Company: Join a company revolutionizing insurance with advanced data intelligence for small and medium-sized businesses.
- Benefits: Enjoy a fully remote role with flexible work options and the chance to impact the insurance industry.
- Why this job: Be part of a dynamic team that values collaboration and innovation in data analytics.
- Qualifications: Bachelor’s or Master’s in Computer Science; 3+ years in data engineering with strong SQL and Python skills.
- Other info: Reach out directly for more info or apply to kickstart your career!
The predicted salary is between £36,000 and £60,000 per year.
Data Engineer
Job Type: Permanent Position
Location: Fully Remote (UK Based)
Start Date: ASAP
About The Company:
We have partnered with a company that empowers underwriters to serve their insureds more effectively. Using advanced data intelligence tools, they are rebuilding the way underwriters share and exchange risk. Their current focus is on the small and medium-sized businesses that power our global economy and their niche insurance needs. By leveraging granular information on each policy, they deliver unprecedented insight into insurance pools, and their speciality portfolio is fully diversified with very low catastrophe, aggregation, or systemic risk.
The Role:
- Designing and implementing data pipelines and models, ensuring data quality and integrity.
- Solving challenging data integration problems by applying optimal patterns, frameworks, and query techniques, sourcing from a wide variety of data sources.
- Building, maintaining, and optimising our Data Warehouse to support reporting and analytics needs.
- Collaborating with product managers, business stakeholders and engineers to understand the data needs, representing key data insights in a meaningful way.
- Staying up-to-date with industry trends and best practices in data modelling, database development, and analytics.
- Optimising pipelines, frameworks, and systems to facilitate easier development of data artifacts.
You will be successful if you have:
- A Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in building data pipelines, models and maintaining Data Warehouses for reporting and analytics.
- Strong skills in SQL, Python, problem-solving and data analysis.
- Deep experience with Snowflake and AWS.
- Deep experience with dbt.
- Excellent communication and collaboration skills.
- An eagerness to learn quickly, collaborate with others, and work with little supervision.
If you would like to have a chat about this exciting opportunity, apply below or reach out directly to g.ndonfak@annapurnarecruitment.com
Data Engineer employer: Annapurna
Contact Detail:
Annapurna Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer
✨Tip Number 1
Make sure to showcase your experience with data pipelines and Data Warehouses in your conversations. Be ready to discuss specific projects where you implemented solutions using SQL, Python, Snowflake, or AWS.
✨Tip Number 2
Familiarize yourself with the latest trends in data modeling and analytics. Being able to discuss recent advancements or best practices can set you apart during discussions with product managers and engineers.
✨Tip Number 3
Prepare to demonstrate your problem-solving skills by thinking through potential data integration challenges. Consider how you would approach these issues and be ready to share your thought process.
✨Tip Number 4
Highlight your collaboration skills by sharing examples of how you've worked with cross-functional teams in the past. This will show that you can effectively communicate and understand the data needs of various stakeholders.
Some tips for your application 🫡
Understand the Role: Make sure to thoroughly read the job description for the Data Engineer position. Understand the key responsibilities and required skills, such as experience with SQL, Python, Snowflake, and AWS.
Tailor Your CV: Customize your CV to highlight relevant experience in building data pipelines and maintaining Data Warehouses. Emphasize your problem-solving skills and any specific projects that showcase your expertise in data analysis.
Craft a Compelling Cover Letter: Write a cover letter that connects your background to the company's mission of empowering underwriters. Mention your eagerness to learn and collaborate, and how your skills align with their needs.
Proofread Your Application: Before submitting, carefully proofread your application materials. Check for any grammatical errors or typos, and ensure that all information is accurate and clearly presented.
How to prepare for a job interview at Annapurna
✨Showcase Your Technical Skills
Be prepared to discuss your experience with SQL, Python, Snowflake, and AWS in detail. Bring examples of data pipelines or models you've built, and be ready to explain the challenges you faced and how you overcame them.
✨Understand the Company's Mission
Research the company’s focus on empowering underwriters and their approach to data intelligence. This will help you align your answers with their goals and demonstrate your genuine interest in their work.
✨Prepare for Problem-Solving Questions
Expect questions that assess your problem-solving abilities, especially related to data integration and optimization. Practice articulating your thought process clearly and logically when tackling complex data issues.
✨Emphasize Collaboration Skills
Highlight your ability to work with product managers, business stakeholders, and engineers. Share examples of past collaborations where you successfully communicated data insights and contributed to team projects.