At a Glance
- Tasks: Design and optimise data pipelines, automate workflows, and manage cloud-based data infrastructure.
- Company: Join Hard Rock Digital, a leader in online gaming and entertainment, passionate about innovation.
- Benefits: Enjoy flexible work hours, competitive pay, and a vibrant startup culture.
- Why this job: Be part of a passionate team building products for millions while advocating for DataOps best practices.
- Qualifications: Bachelor’s degree or equivalent experience, plus at least 3 years in DataOps; proficiency in Airflow and Snowflake required.
- Other info: We celebrate diversity and inclusivity, empowering you to bring your authentic self to work.
The predicted salary is between £43,200 and £72,000 per year.
What are we building?
Hard Rock Digital is a team focused on becoming the best online sportsbook, casino, and social casino company in the world. We’re building a team that shares a passion for learning, operating, and building new products and technologies for millions of consumers. We care about each customer’s interaction, experience, behaviour, and insight, and strive to ensure we’re always acting authentically.
Rooted in the kindred spirits of Hard Rock and the Seminole Tribe of Florida, the new Hard Rock Digital taps a brand known the world over as the leader in gaming, entertainment, and hospitality. We’re taking that foundation of success and bringing it to the digital space. Ready to join us?
What’s the position?
We are seeking a passionate DataOps Engineer who loves optimising pipelines, automating workflows, and scaling cloud-based data infrastructure.
Key Responsibilities:
- Design, build, and optimise data pipelines using Airflow, DBT, and Databricks.
- Monitor and improve pipeline performance to support real-time and batch processing.
- Manage and optimise AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake.
- Implement best practices for cost-efficient, secure, and scalable data processing.
- Enable and optimise AWS SageMaker environments for ML teams.
- Collaborate with ML, Data Science, and Reporting teams to ensure seamless data accessibility.
- Implement data pipeline monitoring, alerting, and logging to detect failures and performance bottlenecks.
- Build automation to ensure data quality, lineage tracking, and schema evolution management.
- Participate in incident response, troubleshooting, and root cause analysis for data issues.
- Advocate for DataOps best practices, driving automation, reproducibility, and scalability.
- Document infrastructure, data workflows, and operational procedures.
What are we looking for?
We are looking for a DataOps Engineer with experience supporting high-velocity data and development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, DBT, Databricks, and Snowflake while maintaining data integrity, security, and performance.
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
- Minimum of 3 years of experience in DataOps or a similar role.
- Proficiency in key technologies, including Airflow, Snowflake, and SageMaker.
- Certifications in AWS, Snowflake, or other relevant technologies are a plus.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment and manage multiple priorities effectively.
What’s in it for you?
We offer our employees more than just competitive compensation. Our team benefits include:
- Competitive pay and benefits
- Flexible vacation allowance
- Flexible work-from-home or office hours
- Startup culture backed by a secure, global brand
- Opportunity to build products enjoyed by millions as part of a passionate team
Roster of Uniques
We care deeply about every interaction our customers have with us, and trust and empower our staff to own and drive their experience. Our vision for our business and customers is built on fostering a diverse and inclusive work environment where, regardless of background or beliefs, you feel able to be authentic and bring all your talent into play. We want to celebrate you being you (we are an equal opportunity employer).
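To make the "data quality automation" responsibility above concrete, here is a minimal sketch in plain Python of the kind of batch validation a DataOps pipeline might run before loading data downstream. All column names and thresholds are hypothetical, not taken from the posting; in practice such checks would typically run inside Airflow tasks or DBT tests rather than standalone.

```python
# Hypothetical data-quality check: reject a batch of rows when a
# required column has too many nulls. Names and thresholds are
# illustrative only.

def validate_batch(rows, required_columns, max_null_ratio=0.05):
    """Return a list of human-readable failures for a batch of dict rows."""
    failures = []
    if not rows:
        return ["batch is empty"]
    for col in required_columns:
        missing = sum(1 for r in rows if r.get(col) is None)
        ratio = missing / len(rows)
        if ratio > max_null_ratio:
            failures.append(
                f"column '{col}': {ratio:.0%} nulls exceeds {max_null_ratio:.0%}"
            )
    return failures

# Example: a batch where 'bet_amount' is often missing fails the check.
batch = [
    {"user_id": 1, "bet_amount": 10.0},
    {"user_id": 2, "bet_amount": None},
    {"user_id": 3, "bet_amount": None},
]
print(validate_batch(batch, ["user_id", "bet_amount"]))
```

A real deployment would route failures to the alerting and logging layer mentioned in the responsibilities instead of printing them.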
Data DevOps Engineer employer: JobLeads GmbH
Contact Detail:
JobLeads GmbH Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data DevOps Engineer role
✨Tip Number 1
Familiarise yourself with the specific tools mentioned in the job description, such as Airflow, DBT, and Databricks. Having hands-on experience or projects showcasing your skills with these technologies can set you apart from other candidates.
✨Tip Number 2
Network with current or former employees of Hard Rock Digital on platforms like LinkedIn. Engaging in conversations about their experiences can provide valuable insights into the company culture and expectations, which you can leverage during interviews.
✨Tip Number 3
Stay updated on the latest trends in DataOps and cloud technologies, especially AWS and Snowflake. Being able to discuss recent developments or best practices in these areas during your interview will demonstrate your passion and commitment to the field.
✨Tip Number 4
Prepare to discuss specific examples of how you've optimised data pipelines or automated workflows in your previous roles. Concrete examples will help illustrate your problem-solving skills and ability to drive efficiency, which are key for this position.
We think you need these skills to ace the Data DevOps Engineer role
Some tips for your application 🫡
Understand the Role: Before applying, make sure you fully understand the responsibilities and requirements of the Data DevOps Engineer position. Familiarise yourself with the technologies mentioned, such as Airflow, DBT, Databricks, and Snowflake.
Tailor Your CV: Customise your CV to highlight relevant experience in DataOps, data infrastructure, and automation frameworks. Emphasise your proficiency with the key technologies listed in the job description and any certifications you may have.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of Hard Rock Digital's mission. Mention specific projects or experiences that demonstrate your ability to optimise data pipelines and collaborate with teams.
Showcase Soft Skills: In your application, don't forget to highlight your communication and interpersonal skills. Provide examples of how you've effectively managed multiple priorities in fast-paced environments, as this is crucial for the role.
How to prepare for a job interview at JobLeads GmbH
✨Showcase Your Technical Skills
Be prepared to discuss your experience with key technologies like Airflow, Snowflake, and SageMaker. Bring examples of how you've optimised data pipelines or automated workflows in previous roles.
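If you want a concrete, interview-ready example of "automated workflows", a simple pattern worth knowing is retrying a flaky pipeline step with exponential backoff before escalating. The sketch below uses only the standard library; the step name and delays are hypothetical.

```python
import time

# Hypothetical workflow-automation sketch: retry a flaky pipeline step
# with exponential backoff, raising only after all attempts fail.

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    """Run `step` (a zero-argument callable), retrying on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == max_attempts:
                raise RuntimeError(
                    f"step failed after {max_attempts} attempts"
                ) from exc
            # Delay doubles each attempt: base, 2*base, 4*base, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a step that fails twice with a transient error, then succeeds.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return "42 rows"

print(run_with_retries(flaky_extract))  # succeeds on the third attempt
```

Being able to explain when retries are safe (idempotent steps) versus dangerous (non-idempotent writes) tends to impress more than the code itself.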
✨Demonstrate Problem-Solving Abilities
Expect questions about troubleshooting and incident response. Share specific instances where you identified and resolved data issues, highlighting your analytical thinking and root cause analysis skills.
✨Emphasise Collaboration
Since the role involves working with ML, Data Science, and Reporting teams, be ready to talk about your experience collaborating across departments. Discuss how you ensure seamless data accessibility and communication.
✨Understand the Company Culture
Research Hard Rock Digital's mission and values. Be ready to express how your passion for learning and building new products aligns with their goal of enhancing customer experiences in the digital space.