Job Description
Your daily routine will include:
- Iteratively developing and improving our data pipeline with new features and services.
- Developing tools and engines for data gathering/ETL solutions.
- Working with data analysts and fellow software/infrastructure engineers to deliver business impact.
- Executing successfully through lightweight planning, attention to detail, effective development design, and efficient decision-making.
Requirements
You will enjoy the role if you have:
- A Bachelor's degree in IT/Computer Science or a related major.
- 2 years of experience developing data pipeline systems.
- Proficiency with NoSQL and SQL databases, task/workflow schedulers, and ETL.
- Proven experience with code versioning tools such as GitHub.
- An understanding of internet programming, JavaScript or Python, database modeling, and cloud infrastructure.