The Opportunity:
Lalamove is disrupting the logistics industry by connecting customers and drivers directly through our technology. We offer customers a lightning-fast and convenient way to book delivery and moving services, whether they are at home, at work, or on the go. People talk about O2O; we live it. Into our fourth year as a start-up, we now operate in Hong Kong, China, Taiwan, Thailand, Singapore, the Philippines, and Vietnam, and our aspirations don't stop there, as our model has the ability to transform how goods are moved in any city worldwide.
As a Software Engineer in Data at Lalamove, you will join our growing data team, which supports functional departments at headquarters as well as over 100 cities in a highly technology-oriented company. Working alongside other talented engineers, you will design and build the full technology stack for data architecture and pipelines that make data available and accessible to the entire company for data-driven strategic decisions. We strive for disruptive solutions to business opportunities and continuous improvement.
What we seek:
Quick learner: you demonstrate the ability to pick up new technologies and frameworks quickly.
Problem solver: you have strong critical-thinking skills and are willing to find creative solutions to difficult problems.
High autonomy: you are a self-organized, passionate self-starter with a can-do attitude who takes ownership of projects end to end, able to work independently while remaining team-oriented.
What you'll need:
- At least 2 years of work experience in a role related to data engineering or data infrastructure.
- Experience in data ETL processing or streaming; familiarity with tools such as Apache Spark and Apache Kafka.
- A solid grasp of database fundamentals.
- Programming skills in Scala or Go, or scripting in Python.
- Experience with cloud solutions.
- Fluency with software development tools and practices such as Git, CI/CD, and agile development cycles.
- Strong documentation and knowledge-sharing skills.
A plus, but not required:
- Knowledge of modern data lake architecture and data warehousing.
- Experience with workflow orchestration tools such as Airflow.
- Familiarity with Kubernetes, the Hadoop ecosystem, NoSQL, BI tools, and data governance.
- Ability to use a wide variety of open-source technologies; contributions to open-source software.
- Participation in coding competitions.
- Good command of English.