Job description
Responsibilities
TikTok is the leading destination for short-form mobile video. We have several global offices, including Los Angeles, New York, London, Paris, Berlin, Dubai, Singapore, Jakarta, Seoul, and Tokyo.
Given the rapidly accelerating growth of TikTok around the world, we are building a revolutionary data platform. The Data Cycling Centre (DCC) uncovers white spaces within the organization to create breakthroughs in current processes and the strategic roadmap. Set on helping business owners make smarter decisions and providing industry-leading innovations that simplify and automate workflows, DCC uses quantitative and qualitative data to uncover insights, turning findings into real products that power exponential growth.
1. Partner with leadership, data project specialists, data product managers, and data scientists to understand data storage and analytics needs
2. Design the team's data architecture and set standard operating procedures (SOPs)
3. Develop custom storage, extract-transform-load (ETL), and reporting systems for existing and new data (both structured and unstructured), using a variety of traditional and large-scale distributed data systems (see the pipeline sketch after this list)
4. Maintain efficient, reliable, low-latency data pipelines, warehouses, online caches, and reporting systems that solve business and product problems
5. Work closely with analysts to productionize statistical and machine learning models through data processing pipelines
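To give a concrete flavor of the pipeline work in responsibilities 3 and 4, below is a minimal sketch of a daily ETL job orchestrated with Apache Airflow (one of the workflow engines named under Qualifications), assuming Airflow 2.4+. The DAG id, schedule, and the extract/transform/load stubs are illustrative assumptions, not a description of the team's actual systems.

```python
# Minimal sketch of a daily ETL DAG, assuming Apache Airflow 2.4+.
# The dag_id and the three stage stubs are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull the previous day's raw records from the source system (stub)."""
    print("extracting raw records")


def transform():
    """Clean and aggregate the extracted records (stub)."""
    print("transforming records")


def load():
    """Write the aggregated results into the reporting warehouse (stub)."""
    print("loading into warehouse")


with DAG(
    dag_id="daily_events_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Standard extract -> transform -> load dependency chain.
    extract_task >> transform_task >> load_task
```

In a real deployment, each stage would read from and write to the team's actual storage systems; the stubs here only show the dependency structure an orchestrator manages.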
Qualifications
1. Bachelor's degree in Computer Science, related technical field or equivalent practical experience
2. Experience writing and optimizing SQL, and with one general-purpose programming language (e.g., Java, C/C++, Python)
3. 5+ years of work experience with ETL, data architecture, and data modeling
4. Experience with data processing using traditional and distributed systems (e.g., Hadoop, Presto, Spark, Dataflow, Airflow); see the sketch after this list
5. Experience designing and building data models, data warehouses, and non-relational data storage systems (NoSQL and distributed database management systems)
6. Experience with Unix or GNU/Linux systems, and with workflow management engines (e.g., Dorado, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4)
Preferred Qualifications:
- Experience working as a DBO/DBA
- Well-versed in best practices for improving data discoverability, tracking data lineage, and ensuring data quality
- Experience with Hadoop, Alibaba Cloud, Aeolus, Dorado, Coral
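As an illustration of the distributed data processing named in qualification 4, the following PySpark sketch aggregates raw event records into a daily reporting table. The input path, output path, and column names (event_ts, event_type) are hypothetical placeholders, not references to any actual dataset.

```python
# Illustrative PySpark batch job: roll raw events up into daily counts.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Read raw structured event data (hypothetical location).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Derive a date column and count events per day and type.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the aggregate to a partitioned reporting table (hypothetical location).
daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/warehouse/daily_event_counts/"
)
```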