Create and develop data ingestion from various sources: RDBMS, REST API, Kafka, text files, and spreadsheets
Design, develop, optimize, and maintain data architecture and pipelines
Work with the Core Data Engineering / Data Warehousing team to utilize existing frameworks for the implementation of these data pipelines
Drive the prioritization, strategy, and focus to solve user problems
Maintain and optimize data pipelines
Participate in code reviews and follow best practices for developing and documenting data pipelines
Continuously learn and adapt to new technologies and methodologies within the data engineering landscape
Qualifications
Experience as a Data Engineer or Data Analyst is preferred.
Excellent command of programming languages, preferably Python.
Familiarity with managing a serverless data warehouse such as BigQuery or Redshift.
Familiarity with orchestration tools such as Airflow and data integration tools such as Airbyte.
Experience working with GitHub and Docker.
Deep knowledge of SQL database design (MySQL, Redshift, PostgreSQL).
Understanding of how to optimize data retrieval and how to develop dashboards, reports, and other visualizations for stakeholders.
Good communication skills to work across departments.
WFO: Lion Parcel Head Office (Kedoya, West Jakarta).