At a Glance
- Tasks: Design and maintain data pipelines using Python and SQL while collaborating with teams.
- Company: Amaris Consulting is a global tech consulting firm with over 1,000 clients and 7,600 team members.
- Benefits: Enjoy a full-time role with opportunities for growth in a diverse and inclusive environment.
- Why this job: Join a dynamic team focused on innovation and impactful projects across various industries.
- Qualifications: Experience in data engineering, proficiency in Python and SQL, and familiarity with cloud platforms required.
- Other info: Entry-level position with a supportive recruitment process tailored to candidates.
The predicted salary is between £28,800 and £48,000 per year.
Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1,000 clients across the globe, we have been rolling out solutions in major projects for over a decade – this is made possible by an international team of 7,600 people spread across 5 continents and more than 60 countries. Our solutions focus on four different Business Lines: Information System & Digital, Telecom, Life Sciences and Engineering. We’re focused on building and nurturing a top talent community where all our team members can achieve their full potential. Amaris is your stepping stone to cross rivers of change, meet challenges and carry all your projects through to success.
We are seeking a talented Data Engineer to join our teams in São Paulo, Rio de Janeiro, and Curitiba. The ideal candidate will design, build, and maintain scalable data pipelines for batch and real-time processing, collaborating with data scientists, analysts, and stakeholders to deliver robust solutions.
Responsibilities
- Design, develop, and maintain ETL/ELT data pipelines using Python and SQL (a rough illustrative sketch follows the requirements list below).
- Implement Infrastructure as Code (IaC) with Terraform to manage cloud resources.
- Manage data integration on cloud platforms such as Azure with Databricks or GCP.
- Collaborate with data teams and stakeholders to ensure data quality and availability.
- Apply DevOps practices such as CI/CD, monitoring, and automation.
- Optimize data workflows for scalability, performance, and security.
- Ensure compliance with data governance and security policies.
Requirements
- Experience in data engineering or related roles.
- Proficiency in Python and SQL.
- Experience with Terraform for IaC.
- Hands-on experience with cloud platforms (Azure, GCP, or AWS).
- Knowledge of Databricks or BigQuery for big data processing.
- Familiarity with DevOps tools and methodologies.
- Fluent in English, both written and spoken.
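To make the pipeline bullets above more concrete, here is a minimal batch ETL sketch in Python and SQL. It is illustrative only: the table names (raw_orders, daily_order_totals) and the local SQLite connection are hypothetical stand-ins for whatever warehouse or Databricks/BigQuery environment a real project would use.

```python
# Minimal illustrative batch ETL sketch. Table names and the SQLite stand-in
# are hypothetical; a real pipeline would target Databricks, BigQuery, etc.
import sqlite3
from datetime import date

def extract(conn: sqlite3.Connection) -> list[tuple]:
    """Pull raw order rows for today's run (SQL does the filtering)."""
    cur = conn.execute(
        "SELECT order_id, customer_id, amount, order_date FROM raw_orders "
        "WHERE order_date = ?",
        (date.today().isoformat(),),
    )
    return cur.fetchall()

def transform(rows: list[tuple]) -> dict[str, float]:
    """Aggregate order amounts per customer, skipping malformed rows."""
    totals: dict[str, float] = {}
    for _order_id, customer_id, amount, _order_date in rows:
        if amount is None or amount < 0:
            continue  # basic data-quality guard
        totals[customer_id] = totals.get(customer_id, 0.0) + amount
    return totals

def load(conn: sqlite3.Connection, totals: dict[str, float]) -> None:
    """Upsert aggregated totals into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_order_totals "
        "(customer_id TEXT, run_date TEXT, total REAL, "
        "PRIMARY KEY (customer_id, run_date))"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO daily_order_totals VALUES (?, ?, ?)",
        [(c, date.today().isoformat(), t) for c, t in totals.items()],
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("example.db")  # hypothetical local database
    # Create the hypothetical source table so the sketch runs end to end.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders "
        "(order_id TEXT, customer_id TEXT, amount REAL, order_date TEXT)"
    )
    load(conn, transform(extract(conn)))
```

In practice, orchestration, scheduling, monitoring, and IaC (for example Terraform-managed cloud resources) would sit around a job like this.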
Amaris Consulting is committed to diversity and equal opportunity, welcoming applications from all qualified candidates regardless of gender, ethnicity, orientation, or other characteristics.
Seniority level: Entry level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and Consulting
Data Engineer employer: Amaris Consulting
Contact Detail:
Amaris Consulting Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as Python, SQL, and Terraform. Having hands-on experience or projects that showcase your skills in these areas can significantly boost your chances during the interview process.
✨Tip Number 2
Prepare to discuss your previous experiences with data pipelines and cloud platforms like Azure or GCP. Be ready to share specific examples of how you've designed or optimised data workflows, as this will demonstrate your practical knowledge and problem-solving abilities.
✨Tip Number 3
Since collaboration is key in this role, think about instances where you've worked with data scientists or analysts. Highlight your communication skills and ability to work in a team, as these are crucial for ensuring data quality and availability.
✨Tip Number 4
Research Amaris Consulting's culture and values to align your responses during interviews. Understanding their commitment to diversity and equal opportunity can help you articulate how you would fit into their team and contribute positively to their environment.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with Python, SQL, and cloud platforms like Azure or GCP. Use keywords from the job description to ensure your application stands out.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of Amaris Consulting's mission. Mention specific projects or experiences that align with the responsibilities outlined in the job description.
Showcase Technical Skills: In your application, emphasise your proficiency in ETL/ELT processes, Infrastructure as Code (IaC) with Terraform, and any experience with Databricks or BigQuery. Providing examples of past projects can strengthen your application.
Prepare for Interviews: Anticipate questions related to your technical skills and experience. Be ready to discuss how you have implemented DevOps practices and optimised data workflows in previous roles. Research Amaris Consulting's culture and values to demonstrate your fit.
How to prepare for a job interview at Amaris Consulting
✨Know Your Tech Stack
Make sure you are well-versed in Python, SQL, and Terraform, as these are crucial for the Data Engineer role. Brush up on your knowledge of cloud platforms like Azure and GCP, and be ready to discuss how you've used these technologies in past projects.
✨Prepare for Case Studies
Since Amaris Consulting may ask you to complete a case study, practice solving data engineering problems. Familiarise yourself with common scenarios involving ETL/ELT processes and be prepared to explain your thought process clearly.
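As one example of the kind of scenario worth practising (our suggestion, not a known Amaris exercise): keeping only the most recent version of each record when duplicates or late-arriving updates appear in a feed. A rough Python sketch:

```python
# Practice-scenario sketch: keep only the latest record per key.
# Field names ('id', 'updated_at') are hypothetical.
from operator import itemgetter

def latest_per_key(records: list[dict]) -> list[dict]:
    """Given records with 'id' and 'updated_at', keep the newest per id."""
    latest: dict[str, dict] = {}
    for rec in records:
        current = latest.get(rec["id"])
        if current is None or rec["updated_at"] > current["updated_at"]:
            latest[rec["id"]] = rec
    return sorted(latest.values(), key=itemgetter("id"))

if __name__ == "__main__":
    sample = [
        {"id": "a", "updated_at": "2024-01-01", "value": 1},
        {"id": "a", "updated_at": "2024-02-01", "value": 2},  # newer, should win
        {"id": "b", "updated_at": "2024-01-15", "value": 3},
    ]
    print(latest_per_key(sample))
```

Being able to explain why you deduplicate before aggregating, and how you would express the same logic in SQL, is the kind of reasoning interviewers tend to probe.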
✨Showcase Collaboration Skills
Collaboration is key in this role, so be ready to discuss how you've worked with data scientists, analysts, and other stakeholders in previous positions. Highlight any successful projects where teamwork led to improved data quality or availability.
✨Understand DevOps Practices
Familiarity with DevOps methodologies is important for this position. Be prepared to talk about your experience with CI/CD, monitoring, and automation, and how these practices can enhance data workflows and security.