Job Description
Responsibilities
- The Big Data ETL Engineer is responsible for leveraging both open-source and self-developed data virtualization tools to incorporate disparate data sources into our data lakehouse.
- After joining our team, you will develop ETL pipelines, data quality rules, and data masking tools to connect isolated data islands and deliver a consistent, convenient data analytics experience.
Requirements
- Bachelor’s degree or higher in Computer Science, Software Engineering or a related field, or equivalent functional experience in the area.
- 2+ years of experience in a big data-related area.
- A solid understanding of the big data ecosystem and familiarity with computation engines (Spark/Flink) and storage engines (Hive/Hudi/Doris).
- Strong Java/Scala/Python/SQL development skills.
- Strong oral and written communication skills.
- Experience with AWS services and concepts is a plus.
- Experience with monitoring, optimizing, and troubleshooting large-scale big data infrastructure is a plus.
To apply for this job please visit jobs.lever.co.