Staff Data Engineer

Hyderabad, Telangana, India

Job Description

Location:

Hyderabad, Telangana

We are looking to hire an exceptional Staff Data Engineer. In this role, you will leverage your expertise in ETL development, data pipeline optimization, and data warehousing to ensure seamless data accessibility for business intelligence and analytics teams. You will work closely with business and technical stakeholders to design, build, and maintain scalable data pipelines and workflows, enabling data-driven decision-making across the organization. Additionally, you will be responsible for optimizing data warehouse performance, automating data refresh schedules, and ensuring data integrity and reliability.

Responsibilities:



  • Design, develop, and maintain scalable ETL pipelines to support data integration and analytics needs.
  • Build, optimize, and manage data pipelines and workflows to ensure efficient data movement and transformation.
  • Monitor and troubleshoot data pipeline performance, ensuring data reliability and accuracy.
  • Maintain and optimize data warehouse architecture, ensuring efficient storage and data retrieval.
  • Work with large, complex datasets to support analytical and business intelligence needs.
  • Define, execute, and optimize SQL queries for data transformation and extraction.
  • Collaborate with Business Intelligence, Data Analytics, and Engineering teams to ensure data accessibility and performance.
  • Automate data ingestion, processing, and refresh schedules to maintain up-to-date datasets.
  • Implement data governance, security, and compliance best practices.
  • Continuously evaluate and adopt new technologies to improve data infrastructure.

Requirements:



  • 5+ years of experience in ETL development, data pipeline engineering, or data warehouse management.
  • Strong proficiency in SQL (PostgreSQL, MySQL, or similar) for data manipulation and optimization.
  • Experience with data pipeline tools (e.g., Apache Airflow, AWS Glue, dbt, or similar).
  • Hands-on experience with cloud-based data platforms (e.g., AWS Redshift, Snowflake, BigQuery, or Azure Synapse).
  • Knowledge of data modeling, data warehousing concepts, and performance tuning.
  • Experience working with structured and semi-structured data formats (JSON, Parquet, Avro, etc.).
  • Proficiency in Python or another scripting language for data automation and transformation.
  • Strong problem-solving and troubleshooting skills.
  • Excellent communication skills and the ability to work with both technical and non-technical stakeholders.
  • Experience with Agile development practices and delivering iterative solutions.

Nice to Have:



  • Experience with orchestration tools like Apache Airflow or Prefect.
  • Familiarity with real-time data streaming (Kafka, Kinesis, or similar).
  • Knowledge of NoSQL databases (MongoDB, DynamoDB, etc.).
  • Experience with CI/CD pipelines for data engineering workflows.
  • Certifications in AWS, Azure, GCP, or relevant data engineering technologies.

Job Types: Full-time, Permanent

Pay: From ₹703,603.23 per year

Schedule:

Day shift
Work Location: In person

Job Detail

  • Job Id: JD3721786
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Contract
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Hyderabad, Telangana, India
  • Education: Not mentioned
  • Experience: 5+ years