In this role, you will be responsible for building the foundations of our data analytics platform: scalable data pipelines and robust data modeling that support real-time and batch analytics, ML models, and insights serving both business intelligence and product needs.
You will be part of the R&D team, collaborating closely with engineers, analysts, and product managers to deliver a modern data architecture that supports internal dashboards and future-facing operational analytics.
If you enjoy architecting from scratch, turning raw data into powerful insights, and owning the full data lifecycle, this role is for you!
Responsibilities
Take full ownership of the design and implementation of a scalable and efficient BI data infrastructure, ensuring high performance, reliability, and security.
Lead the design and architecture of the data platform from integration to transformation, modeling, storage, and access.
Build and maintain ETL/ELT pipelines, both batch and real-time, to support analytics, reporting, and product integrations.
Establish and enforce best practices for data quality, lineage, observability, and governance to ensure accuracy and consistency.
Integrate modern tools and frameworks such as Airflow, dbt, Databricks, Power BI, and streaming platforms.
Collaborate cross-functionally with product, engineering, and analytics teams to translate business needs into data infrastructure.
Promote a data-driven culture: champion data-driven decision-making across the company by empowering stakeholders with reliable, self-service data access.
Requirements
5+ years of hands-on experience in data engineering and building data products for analytics and business intelligence.
Proven track record of designing and implementing large-scale data platforms or ETL architectures from the ground up.
Strong hands-on experience with ETL tools and data warehouse/lakehouse products (e.g., Airflow, Airbyte, dbt, Databricks).
Experience supporting both batch pipelines and real-time streaming architectures (e.g., Kafka, Spark Streaming).
Proficiency in Python, SQL, and cloud data engineering environments (AWS, Azure, or GCP).
Familiarity with data visualization tools like Power BI, Looker, or similar.
BSc in Computer Science or a related field from a leading university.