Posted on Monday, September 4, 2023
- Design, develop, and maintain robust, scalable data pipelines to acquire, process, and transform data from various sources, adhering to DataOps principles to foster seamless collaboration and efficient data flow.
- Monitor and troubleshoot data pipelines, identifying and resolving issues to minimize disruptions and maintain consistent data delivery.
- Implement and enforce data quality standards using DataOps practices, conducting regular audits and checks to identify and rectify data inconsistencies, errors, and anomalies.
- Collaborate with data engineers to set up automated data validation and testing frameworks, ensuring accurate and reliable data.
- Work closely with stakeholders to define and implement data governance policies grounded in DataOps principles, ensuring compliance with industry regulations and internal data management guidelines.
- Assist in establishing data access controls, data retention policies, and data masking/anonymization processes to protect sensitive information and enable secure collaboration.
- Embrace the data-centric nature of DataOps to foster collaboration across teams. Partner with data analysts, scientists, and business teams to understand data requirements and provide necessary data support.
- Continuously monitor and fine-tune data pipelines to ensure timely data availability, contributing to the reliability of data-driven decision-making.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in data engineering, data operations, or similar roles, preferably in a startup or fast-paced environment.
- Strong proficiency in SQL and scripting languages (Python, Bash, etc.), and experience with modern data processing and orchestration technologies such as Airflow, Kafka, dbt, and Spark.
- Familiarity with data integration tools, ETL processes, and data warehousing concepts.
- Solid understanding of data governance, data quality, and data security principles.
- Experience with cloud platforms (AWS) and related services for data storage and processing.
- Excellent problem-solving skills and attention to detail, with the ability to troubleshoot and resolve complex data-related issues.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Prior experience in the insurance or financial sector is a plus.