Responsibilities and day-to-day life:
You will work on the core data pipeline that moves data from our customers to our warehouses via APIs. Responsibilities include extracting data via APIs from multiple cloud platforms, integrating it, and mapping it onto our platform. To perform the data extraction, you should be well-versed in deploying ETL/ELT connectors. You will also create and maintain data flows and ensure the data meets all business needs.
- Build data pipelines, structured data extraction and integration, and data warehousing solutions while leading, guiding, and mentoring a team of engineers.
- Translate business requirements into technical specifications for data streams, data integrations, data transformations, and data warehousing.
- Administer data orchestration using modern tools and concepts such as ETL and ELT.
- Build tools, services, and automation to extend the platform's capabilities.

Must have:
- Good knowledge of middleware tools for API transactions.
- Proficiency with open API platforms.
- Experience building RESTful APIs.
- Expertise in working with cloud-based APIs.
- Up-to-date, comprehensive knowledge of ETL tools and of BigQuery, Snowflake, Redshift, or other cloud-based data warehouses.

Qualifications required:
- Bachelor's degree in a relevant field.
- Experience with data extraction technologies using APIs.
- 2-3 years of experience in data engineering.
- Proven track record of success.
- Experience in cybersecurity is preferred.