
Software Engineer, Data Backend (Ad Cloud)
- Taipei City
- Permanent
- Full-time
Responsibilities:
- Design, develop, and maintain RESTful APIs using Python.
- Build and manage robust data warehouses using ClickHouse, Trino/Presto, and Pinot.
- Design and develop data pipelines using Apache Airflow and Apache Spark.
- Work closely with cross-functional teams to develop automation tools that streamline daily operations.
- Implement state-of-the-art monitoring and alerting systems to ensure optimal system performance and stability.
- Respond to application queries in a timely and effective manner, ensuring high client satisfaction.
- Work on cloud platforms such as AWS and GCP, leveraging their capabilities to optimize data operations.
- Utilize Kubernetes (k8s) for container orchestration to facilitate efficient deployment and scaling of applications.
Qualifications:
- BS/MS degree in Computer Science
- 2+ years of experience in building and operating large-scale distributed systems or applications
- Experience with Kubernetes development and Linux/Unix
- Experience managing a data lake or data warehouse
- Expertise in developing data structures and algorithms on top of big data platforms
- Ability to operate effectively and independently in a dynamic, fluid environment
- Ability to work in a fast-moving team environment and juggle many tasks and projects
- Eagerness to change the world in a huge way by being a self-motivated learner and builder
Preferred Qualifications:
- Experience working with ClickHouse is a big plus
- Contributions to open-source projects are a huge plus (please include your GitHub link)
- Experience working with Python and Scala/Java is a plus
- Experience with Hadoop, Hive, Flink, Presto/Trino and related big data systems is a plus
- Experience with public clouds such as AWS or GCP is a plus