At our company, we build mobile games enjoyed by millions of players around the world. Our decisions are driven by data, from feature development and live ops to user acquisition and monetization. We're looking for a skilled Data Engineer to help us take our analytics and data infrastructure to the next level.

As a Data Engineer, you'll design and maintain the systems that power insights across the company. You'll work closely with data analysts, UA managers, and product teams to ensure clean, fast, and scalable access to our most important asset: data.

Responsibilities:

Data Pipeline Development
* Design, build, and maintain scalable, robust ETL/ELT pipelines.
* Ingest data from various sources (APIs, databases, flat files, cloud buckets).
* Automate workflows for batch and/or streaming pipelines (e.g., using Airflow, GCP services).

Data Modeling & Infrastructure
* Design and implement efficient data models in BigQuery and Snowflake.
* Organize data for analytics teams in cloud warehouses (BigQuery, Snowflake).
* Implement best practices for partitioning, clustering, and materialized views.
* Manage and optimize data infrastructure (cloud resources, storage, compute).
* Ensure scalability, security, and compliance in data platforms.

Data Quality & Governance
* Monitor data integrity, consistency, and accuracy.
* Implement validation, monitoring, and alerting for pipeline health and data accuracy.
* Maintain documentation and data catalogs.
* Troubleshoot failures or performance bottlenecks.

Collaboration & Enablement
* Work closely with data analysts, managers, and developers.
* Translate business requirements into technical solutions.
* Support self-service analytics and create reusable datasets.
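As a flavor of the data-quality work above, here is a minimal sketch of a row-level validation check in Python; the field names (`user_id`, `revenue_usd`) are illustrative assumptions, not our actual event schema.

```python
# Minimal sketch of batch data-quality validation for pipeline health.
# Field names ("user_id", "revenue_usd") are hypothetical examples.

def validate_event(event: dict) -> list[str]:
    """Return a list of validation errors for one analytics event."""
    errors = []
    if not event.get("user_id"):
        errors.append("missing user_id")
    revenue = event.get("revenue_usd")
    if revenue is not None and revenue < 0:
        errors.append("negative revenue_usd")
    return errors

def validate_batch(events: list[dict]) -> dict:
    """Summarize batch quality: total rows vs. rows with any error."""
    invalid = [e for e in events if validate_event(e)]
    return {"total": len(events), "invalid": len(invalid)}
```

In a real pipeline, a summary like this would feed monitoring and alerting rather than being checked by hand.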
Requirements:
* 2+ years of experience as a Data Engineer or similar role.
* Strong SQL and Python skills for data manipulation and pipeline logic.
* Experience with Airflow for orchestration and Docker/Kubernetes for deployment.
* Hands-on experience with cloud data platforms (GCP, AWS) and warehouses like BigQuery or Snowflake.
* Knowledge of data modeling, optimization, and performance tuning.
* Familiarity with DAX and BI tools like Power BI or Looker.
* Experience with Kafka or Pub/Sub for real-time data ingestion – an advantage.
* Familiarity with dbt for modular, testable SQL transformations – an advantage.
* Knowledge of Docker, Kubernetes, and cloud-native tools in GCP – an advantage.
* Experience with Firebase Analytics and Unity Analytics (data structure-wise) – an advantage.
Our Tech Stack:
* Languages: SQL, Python, DAX
* Orchestration: Airflow, Docker, Kubernetes
* Data Warehouses: BigQuery, Snowflake
* Cloud: GCP, AWS
This position is open to all candidates.