Python and Kubernetes Software Engineer - Data, Workflows, AI/ML & Analytics in Douglas

Douglas | Full-Time | £60,000–£80,000 / year (est.) | Home office possible
Canonical Group Ltd

At a Glance

  • Tasks: Develop innovative software solutions using Python and Kubernetes for AI/ML and data analytics.
  • Company: Join Canonical, a leader in open source software and remote collaboration.
  • Benefits: Enjoy competitive pay, remote work, travel opportunities, and a personal development budget.
  • Other info: Experience a dynamic work culture with excellent career growth and global collaboration.
  • Why this job: Make a real impact in tech while working with cutting-edge tools and a global team.
  • Qualifications: Strong Python skills and a passion for technology; academic excellence is a plus.

The predicted salary is between £60,000 and £80,000 per year.

Home based - Worldwide

Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1000+ colleagues in 70+ countries and very few roles based in offices. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution. The company is founder led, profitable and growing.

We are hiring Python and Kubernetes Specialist Engineers focused on Data, Workflows, AI/ML and Analytics Solutions to join our teams building open source solutions for public cloud and private infrastructure. As a software engineer on the team, you'll collaborate on end-to-end data analytics and MLOps solutions composed of popular open source machine learning tools such as Kubeflow, MLflow, DVC, and Feast. You may also work on ETL, data governance and visualization tools like Apache Superset and dbt, workflow orchestration tools such as Airflow and Temporal, or data warehouse solutions such as Apache Trino or ClickHouse. These solutions may run on servers or in the cloud, on machines or on Kubernetes, on developer desktops, or as web services. We serve the needs of individuals and community members as much as the needs of our Global 2000 and Fortune 500 customers; we make our primary work available free of charge, and our Pro subscriptions are also available to individuals for personal use at no cost. Our goal is to enable more people to enjoy the benefits of open source, regardless of their circumstances.

This initiative spans many teams that are home-based and in multiple time zones. We believe in distributed collaboration but we also try to ensure that colleagues have company during their work hours! Successful candidates will join a team where most members and your manager are broadly in the same time zone so that you have the benefits of constant collaboration and discussion.

What your day will look like:

  • Develop your understanding of the entire Linux stack, from kernel, networking, and storage, to the application layer
  • Design, build and maintain solutions that will be deployed on public and private clouds and local workstations
  • Master distributed systems concepts such as observability, identity, tracing
  • Work with both Kubernetes and machine-oriented open source applications
  • Collaborate proactively with a distributed team of engineers, designers and product managers
  • Debug issues and interact in public with upstream and Ubuntu communities
  • Generate and discuss ideas, and collaborate on finding good solutions

What we are looking for in you:

  • Professional or academic software delivery using Python
  • Exceptional academic track record from both high school and university
  • Undergraduate degree in a technical subject or a compelling narrative about your alternative chosen path
  • Confidence to respectfully speak up, exchange feedback, and share ideas without hesitation
  • Track record of going above-and-beyond expectations to achieve outstanding results
  • Passion for technology evidenced by personal projects and initiatives
  • The work ethic and confidence to shine alongside motivated colleagues
  • Professional written and spoken English with excellent presentation skills
  • Experience with Linux (Debian or Ubuntu preferred)
  • Excellent interpersonal skills, curiosity, flexibility, and accountability
  • Appreciative of diversity, polite and effective in a multi-cultural, multi-national organisation
  • Thoughtfulness and self-motivation
  • Result-oriented, with a personal drive to meet commitments
  • Ability to travel twice a year for company events up to two weeks long

Additional skills that would be nice to have:

  • Proven track record of building highly automated machine learning solutions, data pipelines, or orchestrating workflows for the cloud
  • Hands-on experience with machine learning libraries or tools
  • Experience with container technologies (Docker, LXD, Kubernetes, etc.)
  • Experience with public clouds (AWS, Azure, Google Cloud)
  • Working knowledge of cloud computing
  • Passionate about software quality and testing
  • Experience working on an open source project

What we offer colleagues:

  • We consider geographical location, experience, and performance in shaping compensation worldwide.
  • We revisit compensation annually (and more often for graduates and associates) to ensure we recognise outstanding performance.
  • In addition to base pay, we offer a performance-driven annual bonus or commission.
  • We provide all team members with additional benefits, which reflect our values and ideals.
  • We balance our programs to meet local needs and ensure fairness globally.
  • Distributed work environment with twice-yearly team sprints in person
  • Personal learning and development budget of USD 2,000 per year
  • Annual compensation review
  • Recognition rewards
  • Annual holiday leave
  • Maternity and paternity leave
  • Employee Assistance Programme
  • Opportunity to travel to new locations to meet colleagues
  • Priority Pass and travel upgrades for long-haul company events

About Canonical:

Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence - in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since its inception in 2004.

Canonical is an equal opportunity employer. Equality and diversity are principal values of Canonical. We are committed to fair hiring throughout. Human Resources will review all applicant data in a methodical and confidential manner.

Python and Kubernetes Software Engineer - Data, Workflows, AI/ML & Analytics in Douglas - Employer: Canonical Group Ltd

Canonical is an exceptional employer that champions a distributed work culture, allowing you to collaborate with talented colleagues from around the globe while enjoying the flexibility of home-based work. With a strong focus on personal development, we offer a generous learning budget, annual compensation reviews, and unique opportunities for travel to exciting locations for team sprints. Join us in our mission to make open source accessible to everyone, and be part of a pioneering company that values diversity, innovation, and excellence.

Contact Detail:

Canonical Group Ltd Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Python and Kubernetes Software Engineer - Data, Workflows, AI/ML & Analytics in Douglas

✨Tip Number 1

Network like a pro! Reach out to folks in your field on LinkedIn or join relevant online communities. Engaging with others can lead to job opportunities that aren't even advertised yet.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your projects, especially those involving Python and Kubernetes. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by practising common questions and coding challenges. We recommend doing mock interviews with friends or using platforms that simulate real interview scenarios to boost your confidence.

✨Tip Number 4

Apply through our website! It’s the best way to ensure your application gets seen. Plus, it shows you're genuinely interested in joining our team at Canonical, which is always a bonus!

We think you need these skills to ace Python and Kubernetes Software Engineer - Data, Workflows, AI/ML & Analytics in Douglas

Python
Kubernetes
Data Analytics
MLOps
ETL
Data Governance
Apache Superset
dbt
Airflow
Temporal
Apache Trino
ClickHouse
Linux (Debian or Ubuntu)
Machine Learning Libraries
Container Technologies (Docker, LXD)

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the role of Python and Kubernetes Software Engineer. Highlight your experience with data analytics, MLOps, and any relevant open-source projects you've worked on. We want to see how your skills align with what we're looking for!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for technology and any personal projects that showcase your skills. Let us know why you're excited about working with Canonical and how you can contribute to our mission.

Showcase Your Communication Skills: Since we work in a distributed team, strong written communication is key. Make sure your application is clear, concise, and free of jargon. We appreciate candidates who can express their ideas effectively, so let your personality come through!

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, it shows us that you're proactive and genuinely interested in joining our team!

How to prepare for a job interview at Canonical Group Ltd

✨Know Your Tech Stack

Make sure you have a solid understanding of Python and Kubernetes, as well as the tools mentioned in the job description, such as Kubeflow and MLflow. Brush up on your knowledge of data pipelines and MLOps solutions, as these topics are likely to come up during the interview.

✨Showcase Your Projects

Be ready to discuss any personal projects or contributions to open source that demonstrate your passion for technology. This is a great way to show your hands-on experience and problem-solving skills, which are crucial for this role.

✨Prepare for Collaboration Questions

Since the role involves working with a distributed team, expect questions about collaboration and communication. Think of examples where you've successfully worked in a team, especially in a remote setting, and how you handled challenges.

✨Ask Insightful Questions

Prepare thoughtful questions about the company's culture, the team dynamics, and the specific projects you'll be working on. This shows your genuine interest in the role and helps you assess if it's the right fit for you.
