Technical Specialist

About the position

Scope: Full-time
Published: 2024-09-10

We are seeking an experienced Technical Specialist to join our dynamic team at HCL Technology Sweden AB. The ideal candidate will have a strong background in Azure Cloud Data Architecture, Snowflake, and Kafka development. This role requires a strategic thinker who can design, architect, and implement robust data solutions while collaborating with Product Owners, Engineering Managers, Enterprise Architects, and business stakeholders.

Key Responsibilities:

1. Solution Design & Architecture:

  • Design and architect scalable and efficient data solutions that meet business needs.
  • Work closely with Enterprise Architects to align solutions with the overall architectural vision.
  • Collaborate with Product Owners and Engineering Managers to define data product requirements and deliverables.

2. Data Pipeline Development:

  • Design, build, and maintain ETL data pipelines using the Azure Cloud Data Engineering stack.
  • Leverage Azure Data Lake and Lakehouse architectures for effective data storage and processing.
  • Extract, transform, and load (ETL) data from various sources into data warehouses or data lakes.
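The posting does not prescribe specific tooling for these pipelines; as a minimal sketch of the extract-transform-load pattern the bullets describe, here is the shape in plain Python, with in-memory structures standing in for Azure Data Lake sources and warehouse tables (all names and fields are illustrative):

```python
from datetime import date

def extract(source_rows):
    """Extract: pull raw records from a source (an in-memory list here,
    standing in for a data lake file or database table)."""
    return list(source_rows)

def transform(rows):
    """Transform: clean and enrich records before loading."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # drop incomplete records
            continue
        cleaned.append({
            "customer": row["customer"].strip().lower(),
            "amount": round(float(row["amount"]), 2),
            "load_date": date.today().isoformat(),
        })
    return cleaned

def load(rows, warehouse):
    """Load: append transformed rows to the target store (a dict
    standing in for a warehouse table)."""
    warehouse.setdefault("sales", []).extend(rows)
    return len(rows)

warehouse = {}
raw = [{"customer": " Acme ", "amount": "10.50"},
       {"customer": "Beta", "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
```

In production this structure would typically be expressed as Azure Data Factory activities or Databricks notebooks rather than bare functions, but the extract/transform/load separation carries over directly.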

3. Data Governance & Quality:

  • Design, implement, and enforce data governance policies and practices.
  • Ensure data quality by implementing data validation and quality checks.
  • Maintain data accuracy and reliability for analytical purposes.
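The validation-and-quality-check work mentioned above can be sketched as a simple rule-based gate that splits records into valid and rejected sets; the rules and field names below are hypothetical examples, not a prescribed framework:

```python
def check_quality(rows, required=("customer_id", "amount")):
    """Apply basic data quality rules: required fields must be present
    and amounts must be non-negative. Returns (valid, invalid) where
    each invalid entry carries the reasons it was rejected."""
    valid, invalid = [], []
    for row in rows:
        problems = [f for f in required if row.get(f) is None]
        if not problems and row["amount"] >= 0:
            valid.append(row)
        else:
            invalid.append((row, problems or ["negative amount"]))
    return valid, invalid

valid, invalid = check_quality([
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": 5.0},
    {"customer_id": 2, "amount": -1.0},
])
```

Routing rejected rows to a quarantine table with their rejection reasons, rather than silently dropping them, is what keeps the resulting datasets trustworthy for analytics.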

4. SQL and Snowflake Expertise:

  • Develop and optimize complex SQL queries and stored procedures for data transformation and integration.
  • Utilize Snowflake for efficient data warehousing solutions, including building data products and designing secure data sharing methodologies.
  • Ensure data security and compliance with industry standards and regulations.
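A common integration step of the kind these bullets describe is upserting staged rows into a target table with a Snowflake `MERGE`. As a self-contained illustration, the helper below composes such a statement as a string (table and column names are hypothetical; a real pipeline would execute this through the Snowflake connector):

```python
def build_merge_sql(target, staging, key, columns):
    """Compose a Snowflake MERGE statement that upserts staged rows
    into a target table: update on key match, insert otherwise."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge_sql("analytics.orders", "staging.orders",
                      "order_id", ["amount", "status"])
```

Generating the statement from column metadata, rather than hand-writing it per table, is one way such transformations stay maintainable across many data products.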

5. Kafka Integration:

  • Design and implement ingestion frameworks to manage real-time and batch data ingestion from Kafka and similar message queuing systems.
  • Develop and maintain Kafka data streaming solutions.
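The batch half of such an ingestion framework boils down to draining a topic in micro-batches and delivering parsed records to a sink. The sketch below substitutes an in-memory mock for the broker so it runs standalone; a real implementation would poll a Kafka consumer (e.g. from confluent-kafka) in place of `MockBroker`:

```python
import json
from collections import deque

class MockBroker:
    """Stands in for a Kafka topic so this sketch runs without a
    broker; replace with a real consumer in production."""
    def __init__(self, messages):
        self.queue = deque(messages)

    def poll_batch(self, max_records):
        batch = []
        while self.queue and len(batch) < max_records:
            batch.append(self.queue.popleft())
        return batch

def ingest(broker, sink, batch_size=100):
    """Drain the topic in micro-batches, parse each JSON message,
    and append it to the sink. Returns the number of records ingested."""
    total = 0
    while True:
        batch = broker.poll_batch(batch_size)
        if not batch:
            break
        sink.extend(json.loads(m) for m in batch)
        total += len(batch)
    return total

broker = MockBroker([json.dumps({"event": i}) for i in range(5)])
sink = []
count = ingest(broker, sink, batch_size=2)
```

The same loop structure extends to real-time ingestion by polling continuously instead of stopping on an empty batch, with offset commits after each successful sink write.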

6. Collaboration & Communication:

  • Work with business stakeholders to understand data requirements and deliver actionable insights.
  • Collaborate with engineering teams to ensure seamless integration of data solutions.
  • Communicate effectively with cross-functional teams to drive data initiatives.

7. DevOps and Automation:

  • Implement DevOps practices to streamline data pipeline deployments and operations.
  • Automate data processing workflows to enhance efficiency and reliability.
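One reliability pattern that automation of this kind commonly relies on is retrying transient failures in pipeline or deployment steps. A minimal sketch (the `flaky` step is a stand-in for any operation that may fail transiently):

```python
import time

def run_with_retry(step, retries=3, delay=0.0):
    """Run a pipeline step, retrying on failure up to `retries`
    attempts -- a basic reliability pattern for automated workflows."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception:
            if attempt == retries:
                raise  # out of attempts: surface the failure
            time.sleep(delay)

calls = {"n": 0}
def flaky():
    """Simulated step that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "deployed"

result = run_with_retry(flaky)
```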

8. Streamlit:

  • Experience building customized apps using Streamlit, including integration with Snowflake.

Qualifications (Must Have):

  • Master's degree in Computer Science, Information Technology, or a related field.
  • Strong experience in data architecture and engineering.
  • Proven expertise in Azure Cloud Data Architecture, Azure Data Factory, Databricks, and ETL processes.
  • Strong proficiency in SQL and Snowflake for data transformation and integration.
  • Extensive experience with Kafka for real-time data streaming and batch processing.
  • Demonstrated experience in data governance and data quality management.
  • Excellent problem-solving skills and ability to design scalable data solutions.
  • Strong communication skills with the ability to work effectively with technical and non-technical stakeholders.

Preferred Skills:

  • Experience with DevOps practices and automation tools.
  • Familiarity with data product design and development.
  • Knowledge of modern data warehousing concepts and technologies.
  • Strong analytical and organizational skills.



More info

Duration: Permanent
Number of positions: 1
Salary: Fixed and variable salary
