Data Lead

TS, IN, India

Job Description

Data Infrastructure & Pipeline Development:

• Design, develop, and optimize scalable, efficient, and reliable data pipelines for large-scale data processing and transformation.

• Manage and maintain data architecture, ensuring high availability and performance using tools like Snowflake, Dataproc, BigQuery, and other cloud technologies.

• Lead the integration of data sources from multiple systems, ensuring seamless data flow across various platforms.

• Build and optimize data pipelines using BigQuery, Snowflake, DBT Cloud, and Airflow.

• Expertise in data modelling to design and build data warehouses, data marts, and data lakes.

• Manage version control and workflows with GitHub.

Performance & Optimization:

• Perform tuning and optimization of queries and data pipelines to ensure high-performance data systems.

• Conduct regular performance reviews and recommend improvements or optimizations for system reliability, speed, and cost-efficiency.

DBT (Data Build Tool) Implementation:

• Implement and maintain DBT models for data transformation workflows.

• Collaborate with data analysts and data scientists to ensure high-quality, well-documented datasets for downstream analysis.

• Ensure the use of best practices for DBT testing, version control, and deployment.

Snowflake Management:

• Architect and optimize Snowflake data warehouse environments.

• Oversee and manage Snowflake data ingestion, transformation, and storage strategies.

• Collaborate with cross-functional teams to ensure Snowflake is being utilized effectively and efficiently.

Leadership & Mentorship:

• Lead and mentor a team of data engineers, ensuring that best practices are followed in the development and deployment of data pipelines.

• Conduct code reviews, provide feedback, and ensure the implementation of high-quality data solutions.

Preferred Skills:

• 10+ years of experience in Data Engineering with a strong focus on data warehousing, ETL pipelines, and big data technologies.

• At least 3-5 years of hands-on experience with Snowflake or BigQuery, including setup, configuration, optimization, and maintenance.

• Proficiency in SQL for query optimization and performance tuning.

• In-depth experience with Dataproc for running large-scale data processing workloads (e.g., Spark, Hadoop).

• Expertise with DBT or any other ELT tool for data transformation and model building.

Technical Skills:

• Strong experience with cloud platforms such as AWS, GCP, or Azure, with a focus on data engineering tools and services.

• Proficient in programming/scripting languages such as Python, Java, or Scala for data processing.

• Experience with CI/CD pipelines and version control (Git, Jenkins, etc.).

• Knowledge of distributed computing frameworks (e.g., Spark, Hadoop) and related data processing concepts.

Data Architecture & Design:

• Experience building and maintaining data warehouses and data lakes.

• Strong understanding of data modeling concepts, data quality, and governance.

• Familiarity with Kafka, Airflow, or similar tools for orchestrating data workflows.

Job Types: Full-time, Permanent, Fresher

Pay: ₹2,540,454.62 - ₹2,764,456.72 per year

Work Location: In person



Job Detail

  • Job Id
    JD4468411
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    TS, IN, India
  • Education
    Not mentioned
  • Experience
    Year