for data manipulation and processing.
Experience with data warehouse solutions such as Snowflake, BigQuery, and Databricks.
Ability to design and implement efficient data models for data lakes and warehouses.
Familiarity with CI/CD pipelines and automation tools to streamline data engineering workflows.
Deep understanding of data warehousing and cloud architecture principles for building efficient, scalable data systems.
Experience with workflow orchestration using Apache Airflow and/or AWS MWAA (Managed Workflows for Apache Airflow); see the minimal DAG sketch after this list.
Experience with Snowflake's distinctive features, including its multi-cluster architecture and data sharing capabilities.
Expertise in distributed processing frameworks such as Apache Spark or other big data technologies is a plus (see the PySpark sketch below).
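
For illustration only (not part of the role requirements), here is a minimal sketch of the kind of Airflow pipeline referenced above, assuming Airflow 2.x; the dag_id, schedule, and extract/load callables are hypothetical placeholders:

```python
# Illustrative sketch only: dag_id, schedule, and the extract/load callables
# are hypothetical placeholders, assuming Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    return [{"id": 1, "value": 42}]


def load(ti):
    # Placeholder: read the upstream result via XCom and persist it.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run extract before load
```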
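
Likewise, a minimal PySpark sketch of a simple distributed aggregation; the dataset path and column names are hypothetical examples, not requirements from this posting:

```python
# Illustrative sketch only: the file path, column names, and aggregation
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_aggregation").getOrCreate()

# Read a (hypothetical) Parquet dataset from a data lake path.
events = spark.read.parquet("s3a://example-bucket/events/")

# Aggregate event counts per day as a simple distributed transformation.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date")
    .agg(F.count("*").alias("event_count"))
    .orderBy("event_date")
)

daily_counts.show()
spark.stop()
```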