Vulcan Labs is one of the largest publishers on the App Store and Android platforms globally. We continuously develop and publish mobile applications focused on new technologies such as AI as we expand worldwide.
- Responsibilities:
As a Data Engineer on our team, you will:
- Develop, maintain, and optimize data pipelines and ETL processes to ensure efficient data flow and integration.
- Build and run data processing pipelines on Google Cloud Platform (GCP); a minimal illustrative sketch follows this list.
- Develop APIs using a Python framework for internal data transfer.
- Support and debug data pipelines.
- Implement DevOps automation for all parts of the data pipelines so they can be deployed from development to production.
- Work closely with our Data Analysts to implement new insights and statistical models.
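For illustration only, and not a description of Vulcan Labs' actual codebase, the sketch below shows the kind of pipeline this role involves: a minimal Airflow DAG (assuming Airflow 2.4+) with extract, transform, and load steps. All DAG, dataset, and table names are hypothetical placeholders.

```python
# Illustrative sketch only -- all names are hypothetical placeholders,
# not Vulcan Labs' real pipelines.
from datetime import datetime, timezone

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw events from a source system (placeholder logic).
    return [{"user_id": 1, "event": "install"}, {"user_id": 2, "event": "purchase"}]


def transform(ti, **context):
    # Light cleaning/shaping step before loading to the warehouse.
    rows = ti.xcom_pull(task_ids="extract")
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [{**row, "loaded_at": loaded_at} for row in rows]


def load(ti, **context):
    # In a real GCP pipeline this step would write to BigQuery, e.g. via the
    # google-cloud-bigquery client or a Google provider transfer operator.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows into analytics.events (placeholder).")


with DAG(
    dag_id="example_events_etl",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In a real deployment the load step would target the warehouse directly (for example BigQuery on GCP), and the DAG would be promoted from development to production through the CI/CD automation described above.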
3. Why you'll love working here:
- SALARY & BONUS: Additionally to the 13th-month salary, you will have an attractive remuneration package including bonuses based on the KPI and hot rewards for each project. Salary review occurs twice annually based on performance and contribution.
- BEING A GREAT CULTURE: You will participate in the development team of top chart applications globally at a cool office with young and friendly talents with a global approach.
- CHALLENGE YOURSELF: You have an opportunity to work with cutting-edge technology: Virtual Reality, Augmented Reality, Artificial Intelligence and challenge your career path with the meaning of “ Go Big or Go Home”.
- NO BARRIERS: You are independent to give ideas and design innovative products to serve millions of global users and products will stand on top of technological trends.
- OTHER BENEFITS: We have so many events such as company trips, team building, birthday party, year-end party
2. Job requirements:
Your skills and experience
Technical Skills
- Have 3+ years of experience as a Data Engineer.
- Proficient in advanced SQL and Python.
- Proficient with both SQL and NoSQL databases and their query languages.
- Strong cloud-based data engineering experience (at least one year) in one of AWS, Azure, or GCP; we mainly use GCP but are open to experience with other clouds.
- Good Docker/Kubernetes knowledge is a plus.
- Experience developing, testing, and deploying data pipelines, and with ETL frameworks/tools such as Airflow (or a similar DAG-based orchestrator).
- Work with different types of source systems (SQL/NoSQL/document-based; structured/semi-structured/unstructured; batch, near-real-time, and real-time,...).
- Proven successful design and implementation of large and complex data solutions (Data Warehouse, Data Lake) using various architectural patterns such as Microservices.
- Experience with developing and deploying CI/CD pipelines.
Non-Technical Skills
- Good communication skills - verbal and written.
- Proactive problem solver, eye for detail, process driven.
Option 1: Submit your application in person at the office:
Address: 74 Nguyễn Cơ Thạch, Phường An Lợi Đông, TP. Thủ Đức
Option 2: Submit your application by email:
Click the "Ứng tuyển ngay" (Apply now) button to submit.