The people here at Apple don't just build products - they build the kind of wonder that's revolutionised entire industries. It's the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here.

Are you passionate about solving large and complex data problems, eager to make an impact, and keen to work on groundbreaking big data technologies? Then we are looking for you. At Apple, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Would you like to work in a fast-paced environment where your technical abilities are challenged on a day-to-day basis? If so, Apple's Global Business Intelligence team is looking for a passionate, meticulous, technically savvy, energetic engineer who likes to think creatively.

Apple's Enterprise Data Warehouse team handles petabytes of data, catering to a wide variety of real-time, near-real-time, and batch analytical solutions. These solutions are an integral part of business functions such as Retail, Sales, Operations, Finance, AppleCare, Marketing, and Internet Services, enabling business leaders to make critical decisions. We use a diverse technology stack including Snowflake, Spark, Flink, Trino, Kafka, Iceberg, Cassandra, and beyond. Designing, developing, and scaling these big data solutions is a core part of our daily work.

Description

We are seeking an experienced Senior Data Engineer with a strong background in designing and building scalable data architectures. You will play a key role in creating and optimising our data pipelines, improving data flow across our organisation, and working closely with cross-functional teams to ensure data accessibility and quality.
This role requires deep knowledge of the big data ecosystem and data lake concepts, as well as hands-on expertise in modern big data technologies such as advanced SQL, Spark, Flink, Trino, Iceberg, and Snowflake.

Data Pipeline Development:
* Design, build, and maintain scalable ELT processes using Spark, Flink, Snowflake, and other big data frameworks.
* Implement robust, high-performance data pipelines in cloud environments.
* Deep, hands-on knowledge of at least one programming language such as Python, Java, or Scala.
* Advanced SQL skills and knowledge of BI/Analytics platforms.

Data Lake & Data Warehouse Architecture:
* Develop and maintain efficient data lake solutions.
* Ensure data lake reliability, consistency, and cost-effectiveness.
* Develop data models and schemas optimised for performance and scalability.
* Experience with modern data warehouse technologies such as Iceberg and Snowflake.

Orchestration & CI/CD:
* Comfortable with basic DevOps principles and tools for CI/CD (Jenkins, GitLab CI, or GitHub Actions).
* Familiar with containerisation and orchestration tools (Docker, Kubernetes).
* Familiarity with Infrastructure as Code (Terraform, CloudFormation) is a plus.

Performance Tuning & Optimisation:
* Identify bottlenecks, optimise processes, and improve overall system performance.
* Monitor job performance, troubleshoot issues, and refine long-term solutions for system efficiency.

Collaboration & Leadership:
* Work closely with data scientists, analysts, and stakeholders to understand data needs and deliver solutions.
* Mentor and guide junior data engineers on best practices and cutting-edge technologies.

Minimum Qualifications