What are my responsibilities? As a Data Integration Specialist, you are required to:
Develop prototypes and proofs of concept using multiple data sources, AI/ML methods and big-data technologies
Implement and deploy large structured and unstructured databases on scalable cloud infrastructure
Port data across data sources and database systems to improve performance and scalability, or to facilitate analytics.
Develop technical solutions that combine disparate information into meaningful business insights, using big-data architectures (supporting role alongside the Data Architect)
Stay curious and enthusiastic about using related technologies to solve problems, and help others see their benefit in the business domain
Qualification: Bachelor's or Master's degree in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable. Experience level: at least 5-7 years of hands-on database experience, with working experience in Data Engineering, ETL, Data Warehousing and Data Analytics in at least 2 projects. Desired Knowledge & Experience:
Expert knowledge and multi-year (4+) experience in data transformation, databases, data governance and data quality management.
Experience with conventional database technologies (RDBMS, object-oriented)
Experience in using query languages such as SQL
Hands-on expertise in modern big data technologies such as SAS, Databricks, Spark and distributed databases (e.g. Teradata).
Experience with NoSQL databases such as MongoDB, Cassandra and HBase is desirable.
Knowledge of cloud technologies and frameworks in Microsoft Azure and Amazon AWS environments is desirable
Well-versed in relational database design, with experience processing and managing large data sets (multi-TB scale) (e.g. T-SQL, Microsoft SQL Server, Oracle).
Exceptional knowledge of data warehousing solutions (e.g. Teradata, Snowflake)
Exceptional knowledge of data warehousing concepts (data modelling, star schema, snowflake schema)
Strong knowledge of Extract, Transform & Load (ETL) concepts (any tool, e.g. SAS DI Studio, Informatica, etc.)
Good know-how and experience with DataOps (data orchestration, workflow management and monitoring systems), with the ability to suggest improvements and inputs for CI/CD.
Knowledge of any reporting/BI tool (e.g. Qlik, Tableau, Power BI, SAS VA, etc.)
Knowledge of Azure cloud-based data storage and services (Azure Blob Storage, Azure Databricks, Azure Data Factory, etc.) is preferred.
Knowledge of SAS code and SAS-based tools will be a plus.
Sound knowledge of programming languages such as Python and R will be an added advantage
Strong inclination towards working on big-data projects
Strong written and verbal communication skills to work with partners around the globe
Required Soft skills & Other Capabilities:
Great attention to detail and good analytical abilities.
Good planning and organizational skills
Collaborative approach to sharing ideas and finding solutions
Ability to work both independently and in a global team environment.
Documentation and Presentation skills
Organization: Siemens Healthineers Company: Siemens Healthineers India LLP Experience Level: Experienced Professional Full / Part time: Full-time