Specialist, Data Operations

Job Summary
Oversee the development and maintenance of our ETL pipelines and ensure seamless data integration into our cloud environment. This role also manages data access controls, enforces governance standards in line with the company data policy, and supports and monitors compliance with personal data protection laws.


Job Responsibilities  

ETL Development and Maintenance:
•    Design, implement, and optimize ETL pipelines to integrate data from diverse systems into the company data cloud.
•    Integrate data from APIs, on-premises systems, and third-party services into the data cloud.
•    Automate and schedule ETL workflows to ensure timely data availability for analysis.
•    Troubleshoot ETL failures and implement robust error-handling mechanisms.

Data Quality Management:
•    Develop processes to validate data accuracy, completeness, and consistency.
•    Implement quality control checks to prevent data corruption during ETL processes.

Access Control and Security:
•    Collaborate with data users to define and implement role-based access control (RBAC).
•    Monitor and manage data access permissions to maintain confidentiality and compliance.

Governance and Compliance:
•    Implement governance frameworks for managing sensitive data and maintaining compliance with personal data protection laws.
•    Ensure that all ETL processes adhere to organizational and legal data policies.

Platform Administration:
•    Monitor system performance and scalability of Microsoft Fabric, optimizing configurations as needed.
•    Collaborate with IT teams to ensure data platform stability and timely updates.

Collaboration and Stakeholder Support:
•    Work with Data Analysts to ensure data pipelines deliver clean, structured, and accessible data.
•    Act as a technical liaison between IT, analytics, and business teams.

Documentation and Process Standardization:
•    Create and maintain comprehensive documentation for ETL pipelines, data dictionary, access control rules, and operational workflows.
•    Develop and enforce standardized processes for data operations and integration.


Qualifications:
Bachelor’s degree in computer science, data engineering, or a related field.


Relevant Experience & Years of Service:
3 or more years of relevant experience in a data operations or data engineering role.


Technical Skills & Professional Knowledge:
•    Strong expertise in ETL frameworks and query and scripting languages such as SQL.
•    Proficiency with cloud-based data platforms like Microsoft Fabric or Azure Data Factory.
•    Familiarity with data governance tools and practices.
•    Familiarity with data quality standards and rules.


Competencies:
•    Strong organizational and troubleshooting skills.
•    Ability to manage multiple tasks and prioritize effectively.
•    Attention to detail and commitment to maintaining data integrity.
•    Knowledge of advanced data security practices and compliance frameworks.