Senior Associate GCP Data Engineer

Bengaluru, KA, India

Job Description




JR-153624
Hybrid
Bengaluru
Information Technology
Full time

Who are we




Equinix is the world's digital infrastructure company, operating over 260 data centers across the globe. Digital leaders harness Equinix's trusted platform to bring together and interconnect foundational infrastructure at software speed. Equinix enables organizations to access all the right places, partners and possibilities to scale with agility, speed the launch of digital services, deliver world-class experiences and multiply their value, while supporting their sustainability goals.




Our culture is based on collaboration and the growth and development of our teams. We hire hardworking people who thrive on solving challenging problems and give them opportunities to hone new skills and try new approaches, as we grow our product portfolio with new software and network architecture solutions. We embrace diversity in thought and contribution and are committed to providing an equitable work environment that is foundational to our core values as a company and is vital to our success.



Job Summary





We are seeking a skilled GCP Data Engineer to join the Data Team at Equinix as a key member, responsible for end-to-end development of Data Engineering use cases and the Equinix Data Lake platform and tools. You will design, build, and maintain scalable data infrastructure and analytics solutions on Google Cloud Platform. The ideal candidate will have strong expertise in cloud-native data technologies and a passion for building robust, efficient data pipelines that drive business insights.





Responsibilities




  • Design, develop, and maintain scalable ETL/ELT pipelines using Cloud Dataflow, Cloud Composer (Apache Airflow), Dataform/dbt, and Cloud Functions
  • Build real-time streaming data pipelines using Cloud Pub/Sub, Kafka, and Dataflow
  • Implement automated data quality checks and monitoring across all data workflows
  • Optimize pipeline performance and cost efficiency through proper resource allocation and scheduling
  • Architect and implement data lake and data warehouse solutions using Dataproc, BigQuery, Cloud Storage, and Cloud SQL
  • Design optimal data models, partitioning strategies, and clustering for analytical workloads
  • Manage data lifecycle policies and implement automated archival and retention strategies
  • Ensure data security, encryption, and access control across all storage layers
  • Build and optimize BigQuery datasets for analytics and reporting use cases
  • Create and maintain dimensional models and fact tables for business intelligence
  • Implement data marts and aggregation layers for improved query performance
  • Support self-service analytics through proper data cataloging and documentation
  • Apply working knowledge of Dataplex and Analytics Hub
  • Integrate data from various sources, including databases, APIs, SaaS applications, and file systems
  • Implement change data capture (CDC) solutions for real-time data synchronization
  • Work with third-party data providers and external data feeds
  • Implement comprehensive monitoring and alerting using Cloud Monitoring and Cloud Logging
  • Troubleshoot data pipeline issues and implement robust error handling mechanisms
  • Maintain data lineage documentation and impact analysis capabilities
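To give a flavor of the "automated data quality checks" responsibility above, here is a minimal, library-free Python sketch of a row-level validation step such as might run before loading data to BigQuery. All rule names and fields are hypothetical illustrations, not an actual Equinix schema; in practice this logic would live in a Dataflow/Beam transform or a Dataform assertion.

```python
# Hypothetical row-level data-quality check, illustrative only.
# Rules map a rule name to a predicate that must hold for each row.

def check_row(row, rules):
    """Return the names of all rules the row violates."""
    return [name for name, predicate in rules.items() if not predicate(row)]

# Example rules: a required field must be present, a metric must be in range.
rules = {
    "site_id_present": lambda r: bool(r.get("site_id")),
    "power_kw_in_range": lambda r: 0 <= r.get("power_kw", -1) <= 10_000,
}

rows = [
    {"site_id": "SV5", "power_kw": 420.0},   # clean row
    {"site_id": "", "power_kw": 99_999.0},   # fails both rules
]

# Collect failures keyed by row index; downstream code could route these
# to a dead-letter table and raise an alert via Cloud Monitoring.
failures = {i: v for i, r in enumerate(rows) if (v := check_row(r, rules))}
print(failures)  # {1: ['site_id_present', 'power_kw_in_range']}
```

The same pattern scales to a `ParDo` in Beam: clean rows go to the main output, failing rows to a side output for quarantine and alerting.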




Qualifications



Technical Skills


  • GCP Services: 2+ years hands-on experience with BigQuery, Cloud Dataflow, Cloud Composer, Cloud Storage, Cloud Pub/Sub, Dataform/dbt, and Cloud Functions
  • Programming: Proficiency in Python/Java and SQL; experience with Spark/Beam development
  • Data Technologies: Strong understanding of Apache Beam, Apache Airflow, and distributed computing concepts
  • Database Systems: Experience with both RDBMS (Cloud SQL, PostgreSQL, MySQL) and NoSQL (Bigtable, Firestore) databases
  • Infrastructure as Code: Experience with Terraform, Cloud Deployment Manager, or similar tools



Professional Experience


  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 2+ years of experience in data engineering or related roles
  • 2+ years of specific experience with Google Cloud Platform
  • Experience with version control systems (Git) and CI/CD pipelines
  • Knowledge of data modelling techniques and dimensional modelling




Preferred Qualifications


  • Google Cloud Professional Data Engineer certification
  • Experience with containerization (Docker, Kubernetes/GKE)
  • Experience with data governance and compliance frameworks (GDPR, HIPAA, SOX)
  • Familiarity with business intelligence tools (Looker, Tableau, Power BI)
  • Experience with streaming technologies beyond GCP (Kafka, Spark Streaming)

Equinix is committed to ensuring that our employment process is open to all individuals, including those with a disability. If you are a qualified candidate and need assistance or an accommodation, please let us know by completing the form.


Equinix is an Equal Employment Opportunity and, in the U.S., an Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to unlawful consideration of race, color, religion, creed, national or ethnic origin, ancestry, place of birth, citizenship, sex, pregnancy / childbirth or related medical conditions, sexual orientation, gender identity or expression, marital or domestic partnership status, age, veteran or military status, physical or mental disability, medical condition, genetic information, political / organizational affiliation, status as a victim or family member of a victim of crime or abuse, or any other status protected by applicable law.



Job Detail

  • Job Id
    JD4011576
  • Total Positions
    1
  • Job Type
    Full Time
  • Employment Status
    Permanent
  • Job Location
    KA, IN, India