Cloud Data Engineer

Bangalore, Karnataka, India

Job Description


The people here at Apple don't just build products - they build the kind of wonder that's revolutionised entire industries. It's the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here. Are you passionate about handling large and complex data problems, do you want to make an impact, and do you have the desire to work on groundbreaking big data technologies? Then we are looking for you. At Apple, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Apple's Global Business Intelligence team is looking for a passionate, meticulous, technically savvy, energetic engineer who likes to think creatively. Apple's Enterprise Data Warehouse team deals with petabytes of data, catering to a wide variety of real-time, near-real-time, and batch analytical solutions. These solutions are an integral part of business functions like Retail, Sales, Operations, Finance, AppleCare, Marketing, and Internet Services, enabling business drivers to make critical decisions. We use a diverse technology stack such as Snowflake, Spark, Flink, Trino, Kafka, Iceberg, Cassandra, and beyond. Designing, developing, and scaling these big data solutions is a core part of our daily job.

Description

We are seeking an experienced Senior Data Engineer with a strong background in designing and building scalable data architectures. You will play a key role in creating and optimising our data pipelines, improving data flow across our organisation, and working closely with cross-functional teams to ensure data accessibility and quality. This role requires deep knowledge of the big data ecosystem and data lake concepts, as well as hands-on expertise in modern big data technologies like advanced SQL, Spark, Flink, Trino, Iceberg, and Snowflake.

Data Pipeline Development:
  • Design, build, and maintain scalable ELT processes using Spark, Flink, Snowflake, and other big data frameworks.
  • Implement robust, high-performance data pipelines in cloud environments.
  • Deep, hands-on knowledge of at least one programming language such as Python, Java, or Scala.
  • Expertise in advanced SQL and knowledge of BI/Analytics platforms.

Datalake & Datawarehouse Architecture:
  • Develop and maintain efficient data lake solutions.
  • Ensure data lake reliability, consistency, and cost-effectiveness.
  • Develop data models and schemas optimised for performance and scalability.
  • Experience with modern data warehouses like Iceberg, Snowflake, etc.

Orchestration & CI/CD:
  • Comfortable with basic DevOps principles and tools for CI/CD (Jenkins, GitLab CI, or GitHub Actions).
  • Familiar with containerisation and orchestration tools (Docker, Kubernetes).
  • Familiarity with Infrastructure as Code (Terraform, CloudFormation) is a plus.

Performance Tuning & Optimisation:
  • Identify bottlenecks, optimise processes, and improve overall system performance.
  • Monitor job performance, troubleshoot issues, and refine long-term solutions for system efficiency.

Collaboration & Leadership:
  • Work closely with data scientists, analysts, and stakeholders to understand data needs and deliver solutions.
  • Mentor and guide junior data engineers on best practices and cutting-edge technologies.

Minimum Qualifications

  • At least 5 years of hands-on experience in developing and building data pipelines on cloud and hybrid infrastructure for analytical needs
  • Experience working with cloud-based data warehouse solutions such as Snowflake or SingleStore, along with expertise in SQL and advanced SQL
  • Experience in designing and building dimensional data models to improve the accessibility, efficiency, and quality of data
  • Bachelor's degree or equivalent in data engineering, computer science, or a similar field
Preferred Qualifications
  • High expertise in modern cloud warehouses and data lakes, with implementation experience on any of the cloud platforms (preferably AWS)
  • Expertise working with data at scale (petabytes) with a big data tech stack and advanced programming languages, e.g. Python, Java, or Scala
  • Database development experience with relational or MPP/distributed systems such as Snowflake and SingleStore
  • Hands-on experience with distributed computing in large-scale data environments
  • Excellent problem solving and critical thinking, with the ability to evaluate and apply new technologies in a short time
  • Experience working with global collaborators, with the ability to influence decision making

Apple

Job Detail

  • Job Id
    JD3664190
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Bangalore, Karnataka, India
  • Education
    Not mentioned
  • Experience
    Year