Responsibilities:
Own and continuously improve the core platform
Focus on system efficiency and resilient infrastructure
Integrate well-known SaaS platforms with the big data repository (e.g., the major cloud providers, Datadog, and Snowflake)
Optimize Spark and Airflow processes
Contribute to data design and platform architecture while working closely with other business units and engineering teams
Tackle the challenges of testing and monitoring large-scale data pipelines
Requirements:
7+ years developing and operating large-scale systems with high availability
7+ years of experience with Python or an equivalent language
Experience working with cloud environments (AWS preferred) and big data technologies (Spark, Airflow, S3, Snowflake, EMR)
Self-motivated autodidact and team player with strong communication skills, eager to tackle the challenge of working with data at scale