Design, implement, and manage end-to-end data pipelines (ETL, data streaming, and data warehousing) to make data easily accessible for analysis.
Design end-to-end ETL processes based on use cases and business requirements.
Ensure high data quality and integrity across data sources.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Troubleshoot complex data issues and perform root cause analysis to proactively resolve product and operational issues.
Qualifications
Bachelor's degree in Computer Science or a related IT field.
1 year of experience in data engineering, ETL development, or a related field.
Experience using Python and Informatica PowerCenter.
Experience using SSIS, SSAS, SSRS, PySpark, Airflow, and cloud platforms.
An understanding of Linux and Docker is a plus.
Strong communication skills, especially the ability to explain technical concepts to non-technical business leaders.