Minimum of 7 years of professional experience in software development, with a strong focus on Python for data engineering and ETL (Extract, Transform, Load) processes.
Python:
Expert-level proficiency in Python, including writing clean, well-documented, and production-ready code.
DAGs & Orchestration:
Extensive hands-on experience (at least 3 years) designing, implementing, and managing data pipelines using DAG-based orchestration platforms such as Apache Airflow. A strong understanding of core Airflow concepts (operators, sensors, hooks, XComs) is essential.
Database Skills:
Solid experience with SQL and relational databases (e.g., PostgreSQL, MySQL). Experience with NoSQL databases and data warehouses (e.g., Snowflake, BigQuery) is a plus.
Cloud Platforms:
Proven experience working with at least one major cloud provider (AWS, GCP, or Azure), including familiarity with its data-related services (e.g., S3, Cloud Storage, EMR, Dataproc).
Data Formats:
Experience with various data formats (e.g., Parquet, Avro, JSON) and data transformation techniques.
Version Control:
Strong knowledge of Git and collaborative development workflows.
Problem-Solving:
Excellent analytical and problem-solving skills with meticulous attention to detail.
Preferred Qualifications
Experience with streaming data technologies (e.g., Kafka, Spark Streaming, Flink).
Knowledge of containerization technologies (Docker, Kubernetes).
Experience with CI/CD pipelines for data engineering workflows.
Familiarity with data governance and security best practices.
Job Types: Contractual / Temporary, Freelance
Pay: ₹12,886.65 - ₹64,935.94 per month
Benefits:
Work from home
Work Location: Remote