Design and Development:
  • Design, develop, and maintain scalable and optimized ELT/ETL data pipelines using the modern data stack (Snowflake, dbt, and orchestration tools like Airflow).
  • Be the primary owner of dbt (data build tool) projects, including developing modular and tested data models, implementing data quality checks, and maintaining comprehensive documentation.
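In practice, ownership of a dbt project centers on modular, tested models. A minimal sketch of what such a model might look like (file path, source, and column names are hypothetical, not taken from this posting):

```sql
-- Hypothetical dbt model: models/staging/stg_orders.sql
-- Materialized incrementally so only new rows are processed on each run.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_status,
    ordered_at
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
-- On incremental runs, only pick up rows newer than what is already loaded.
where ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```

Data quality checks (e.g., `unique` and `not_null` tests on `order_id`) and column descriptions would then live in the model's accompanying YAML schema file.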
Data Warehouse Management (Snowflake):
  • Implement advanced data modeling techniques (e.g., dimensional, Data Vault) within Snowflake to create clean, reliable datasets for analytics and business intelligence.
  • Optimize Snowflake performance and cost efficiency by tuning SQL queries, managing virtual warehouses, and utilizing advanced features like clustering, caching, and Snowpipe.
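The cost and performance levers mentioned above can be sketched in Snowflake SQL; the object names below are illustrative, not from any real deployment:

```sql
-- Hypothetical Snowflake tuning commands; table and warehouse names are made up.

-- Define a clustering key so partition pruning works for the most common filter.
alter table analytics.fct_orders cluster by (order_date);

-- Right-size a virtual warehouse and auto-suspend it to control spend.
alter warehouse transform_wh set
    warehouse_size = 'MEDIUM'
    auto_suspend   = 60      -- seconds of inactivity before suspending
    auto_resume    = true;

-- Inspect how well-clustered the table is on the chosen key.
select system$clustering_information('analytics.fct_orders', '(order_date)');
```

Clustering helps large, frequently filtered tables; aggressive `auto_suspend` plus result caching is typically the first lever for cost efficiency.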
SQL and Problem-Solving:
  • Write complex, performant SQL to transform raw data into high-value business metrics.
  • Proactively identify and resolve complex data-related issues, bottlenecks, and performance degradation across the data ecosystem.
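As one sketch of the kind of transformation involved, raw event data might be rolled up into a business metric such as daily active users (table and column names are hypothetical):

```sql
-- Hypothetical transformation: raw session events -> daily active users.
select
    date_trunc('day', event_ts) as activity_date,
    count(distinct user_id)     as daily_active_users
from raw.events
where event_type = 'session_start'
group by activity_date
order by activity_date;
```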
Collaboration and Communication:
  • Work closely with data analysts, business intelligence developers, and product teams to translate business requirements into technical data solutions.
  • Provide technical guidance, participate in code reviews for dbt models and SQL logic, and champion data engineering best practices.