TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and its offices include New York, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.

Why Join Us
Creation is the core of TikTok's purpose. Our platform is built to help imaginations thrive. This is doubly true of the teams that make TikTok possible. Together, we inspire creativity and bring joy - a mission we all believe in and aim towards achieving every day. To us, every challenge, no matter how difficult, is an opportunity: to learn, to innovate, and to grow as one team. Status quo? Never. Courage? Always. At TikTok, we create together and grow together. That's how we drive impact - for ourselves, our company, and the communities we serve. Join us.

As a data engineer on the Data Platform E-Commerce team, you will have the opportunity to build, optimize, and grow one of the largest data platforms in the world. You'll gain hands-on experience with all kinds of systems in the data platform ecosystem. Your work will have a direct and significant impact on the company's core products as well as hundreds of millions of users.

Responsibilities
- Design, implement, and support data warehouse / data lake infrastructure, and build general, robust data warehouse models that provide a single source of truth and enable key business stakeholders to perform data insight operations at low cost;
- Design and build data transformations efficiently and reliably for different purposes (e.g., reporting, growth analysis, multi-dimensional analysis);
- Design and implement reliable, scalable, robust, and extensible big data systems that support core products and the business;
- Establish solid design and engineering best practices for engineers as well as non-technical colleagues.
Minimum Qualifications:
- 3+ years of data engineering experience;
- BS or MS degree in Computer Science, a related technical field, or equivalent practical experience;
- Experience with Big Data technologies (Hadoop, MapReduce, Hive, Spark, Metastore, Presto, Flume, Kafka, ClickHouse, Flink);
- Experience with performing data analysis, data ingestion, and data integration;
- Experience with ETL (Extraction, Transformation & Loading) and architecting data systems;
- Experience with schema design, data modeling, and SQL queries;
- Experience dealing with large and complex data sets and performance tuning;
- Proficiency in one of the scripting languages - Python, Ruby, or similar;
- Passionate and self-motivated about technologies in the Big Data area.