Company Overview
Hopsworks is a leading provider of machine learning solutions for enterprises. Our flagship product, the Hopsworks AI Lakehouse, enables organizations to build, manage, and serve machine learning features at scale, empowering data scientists, machine learning engineers, and data engineers to develop and deploy advanced machine learning models faster and more efficiently. Our platform is built on open-source technologies and offers robust capabilities for data ingestion, feature engineering, model training, model serving, and monitoring, all within a collaborative and secure environment.
We are a fast-growing, innovative company at the forefront of the data and machine learning revolution. Our customers span industries including finance, healthcare, e-commerce, and technology, and data-driven enterprises around the world trust our platform to drive business insights and outcomes. With a strong focus on customer success and cutting-edge technology, we are dedicated to helping our customers harness the full potential of their data to achieve their business goals.
Job Summary
As a DevOps Engineer at Hopsworks, you will play a pivotal role in maintaining and enhancing our MLOps platform infrastructure. You will work closely with our engineering team to ensure the seamless deployment and management of machine learning solutions. If you are passionate about DevOps, cloud technologies, and love working in a fast-paced, collaborative environment, we would love to talk to you!
Responsibilities
- Design, implement, and maintain infrastructure for the Hopsworks platform.
- Work with various cloud providers (AWS, Azure, GCP) to deploy Hopsworks in a timely and cost-effective way.
- Configure and maintain Kubernetes clusters for containerized deployments, both in the cloud and on-premises.
- Monitor system performance and troubleshoot issues, ensuring high availability and reliability.
- Collaborate with cross-functional teams to automate deployment pipelines and improve CI/CD processes.
- Implement security best practices to protect sensitive data and systems.
- Provide on-call support as needed for critical system issues.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- Proven experience in DevOps roles, preferably in a cloud-native environment.
- Strong knowledge of distributed data processing.
- Proficiency in working with multiple cloud providers (AWS, Azure, GCP).
- Hands-on experience with Kubernetes and container orchestration.
- Solid Linux system administration skills.
- Scripting and automation expertise (e.g., Bash, Python, or Go).
- Familiarity with monitoring tools (e.g., Prometheus, Grafana) and log management.
- Strong problem-solving skills and a proactive attitude.
- Excellent communication and collaboration skills.
Nice-to-Have
- Certification in cloud technologies (e.g., AWS Certified DevOps Engineer, Google Cloud Professional DevOps Engineer).
- Experience with machine learning and data science workflows.
- Knowledge of configuration management and infrastructure-as-code tools (e.g., Chef, Terraform).
- Familiarity with CI/CD pipelines and tools (e.g., Jenkins).
If you're passionate about DevOps and want to contribute to the success of the Hopsworks platform, we look forward to hearing from you!
How to Apply
Please send your CV or resume to [email protected]