Job description
Join our mission to soundtrack the world!
We are now looking for a Senior Data Platform Engineer to join our Data Infrastructure team to help us evolve our data platform and our tooling for data ingestion, processing and management.
The Data Infrastructure team is part of the Data Platform area, our division dedicated to building our data, infrastructure, and insights platform.
We work in a hybrid setup, spending at least three days a week at our offices in Stockholm.
How you will make an impact
As a Senior Data Platform Engineer in our Data Infrastructure team, you will play a pivotal role in building a scalable and robust data platform to enable us to leverage a broad range of data sources at the company.
You will continue building on and improving our data platform, which runs on Google Cloud Platform; our solution stack includes Airflow for data pipeline orchestration, Snowplow for event collection, BigQuery as the data warehouse, and Kubernetes as the underlying infrastructure.
What you can expect to do:
- Collaborate with a talented and diverse team to build and enhance our data platform to enable data collection, processing, access, usability and monitoring.
- Create architectural guidelines, solutions and best practices for our data pipelines in Airflow/Composer.
- Provision and manage resources in Google Cloud using Terraform.
- Integrate new and existing data sources into our ELT and reverse-ETL pipelines.
- Design technical solutions for data governance, data cataloging and discovery, and data access.
- Ensure data quality and reliability by implementing standardized monitoring, CI/CD, and testing practices.
- Work closely with stakeholders to understand data requirements and develop solutions that meet their needs.
- Continuously explore and adopt new technologies and best practices to improve our data infrastructure.
To thrive in this role, we believe you have:
- At least 4-5 years of experience working with data infrastructure, systems architecture, and GCP-related technologies.
- Hands-on experience building ingestion frameworks and working with orchestration tools.
- Hands-on experience with infrastructure-as-code tools such as Terraform.
- Experience with data security and compliance (PII handling and GDPR).
- Strong experience with data pipelines, distributed systems, and cloud infrastructure.
- Solid understanding of software engineering practices, including version control, CI/CD, testing and documentation.
- A love for teamwork and an ability to collaborate with Analytics Engineers, Data Scientists and Machine Learning Engineers.
- Proficiency in Python.
- Excellent communication skills with the ability to adapt technical topics for less technical audiences.
- A pragmatic mindset with a focus on delivering practical solutions and making an impact.
If you believe you're a strong match for this position, even if you don't meet every requirement perfectly, we encourage you to apply!
We value diverse experiences and understand that expertise can manifest in various ways.