Snowflake Developer


Job Description


The role involves designing, implementing, and testing cloud computing solutions using Snowflake technology; creating, monitoring, and optimizing ETL/ELT processes; and migrating solutions from on-premises environments to public cloud platforms.

Technical Skills:

  • Knowledge of SQL language and cloud-based technologies
  • Data warehousing concepts, data modeling, metadata management
  • Data lakes, multi-dimensional models, data dictionaries
  • Migration to the Snowflake platform on AWS or Azure
  • Performance tuning and setting up resource monitors
  • Snowflake modeling - roles, databases, schemas (see the sketch after this list)
  • SQL performance measuring, query tuning, and database tuning
  • Experience with cloud-based ETL tools
  • Integration with third-party tools
  • Ability to build analytical solutions and models
  • Coding in languages like Python, Java, JavaScript
  • Root cause analysis of data models, with proposed solutions
  • Hadoop, Spark, and other warehousing tools
  • Managing sets of XML, JSON, and CSV from disparate sources
  • SQL-based databases such as Oracle, SQL Server, Teradata, etc.
  • Snowflake warehousing, architecture, processing, administration
  • Data ingestion into Snowflake
  • Enterprise-level technical exposure to Snowflake applications
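
The resource-monitor and role/database/schema items above can be illustrated with a short Snowflake SQL sketch. All object names below (analytics_rm, analytics_wh, sales_db, analyst_role) and the quota values are hypothetical assumptions, not details specified by this vacancy.

    -- Hypothetical monthly credit quota with notify/suspend triggers
    CREATE RESOURCE MONITOR analytics_rm
      WITH CREDIT_QUOTA = 100
      FREQUENCY = MONTHLY
      START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    -- Warehouse governed by the monitor, suspending after 60 idle seconds
    CREATE WAREHOUSE analytics_wh
      WITH WAREHOUSE_SIZE = 'SMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE
      RESOURCE_MONITOR = analytics_rm;

    -- Basic role, database, and schema modeling with usage grants
    CREATE ROLE analyst_role;
    CREATE DATABASE sales_db;
    CREATE SCHEMA sales_db.reporting;
    GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA sales_db.reporting TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.reporting TO ROLE analyst_role;
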
Soft Skills:
  • Project management
  • Problem-solving
  • Innovation and best coding practices
  • Interpersonal, presentation, and communication skills
  • Critical and out-of-the-box thinking
  • Analytical, quantitative, problem-solving, and organizational skills
  • Testing and test case preparation abilities
Responsibilities
  • Create, test, and implement enterprise-level apps with Snowflake
  • Design and implement features for identity and access management
  • Create authorization frameworks for better access control
  • Solve performance and scalability issues in the system
  • Transaction management with distributed data processing algorithms
  • Take ownership of deliverables from start to finish
  • Build, monitor, and optimize ETL and ELT processes with data models (see the ingestion sketch after this list)
  • Migrate solutions from on-premises setup to cloud-based platforms
  • Understand and implement the latest delivery approaches based on data architecture
  • Maintain project documentation and tracking based on user requirements
  • Perform data integration with third-party tools including architecting, designing, coding, and testing phases
  • Manage documentation of data models, architecture, and maintenance processes
  • Continually review and audit data models for enhancement
  • Maintain reliable data pipelines built with ETL tools
  • Coordination with BI experts and analysts for customized data models and integration
  • Code updates, new code development, and reverse engineering
  • Performance tuning, user acceptance training, application support
  • Maintain confidentiality of data
  • Risk assessment, management, and mitigation plans
  • Regular engagement with teams for status reporting and routine activities
  • Migration activities from one database to another or on-premises to cloud
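
As a rough sketch of the data-ingestion and ETL/ELT responsibilities above, the following Snowflake SQL loads staged CSV files into a table. The bucket URL, credential placeholders, and table definition are illustrative assumptions only.

    -- Hypothetical external stage over an S3 bucket holding CSV extracts
    CREATE STAGE sales_db.reporting.orders_stage
      URL = 's3://example-bucket/orders/'
      CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

    CREATE FILE FORMAT sales_db.reporting.csv_format
      TYPE = 'CSV'
      SKIP_HEADER = 1
      FIELD_OPTIONALLY_ENCLOSED_BY = '"';

    -- Illustrative target table for the staged order extracts
    CREATE TABLE IF NOT EXISTS sales_db.reporting.orders (
      order_id     NUMBER,
      customer_id  NUMBER,
      order_date   DATE,
      amount       NUMBER(12,2)
    );

    -- Bulk-load staged files; skip files that fail parsing
    COPY INTO sales_db.reporting.orders
      FROM @sales_db.reporting.orders_stage
      FILE_FORMAT = (FORMAT_NAME = 'sales_db.reporting.csv_format')
      ON_ERROR = 'SKIP_FILE';
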
Skills

Must have

7+ years of relevant experience
Data Lakes, Data Warehousing, Snowflake, ETL, Teradata

Nice to have

Analytical Skills, Java, XML, Hadoop, ETL, JSON, Python

Languages

English: C1 Advanced

Seniority

Regular

Relocation package

If needed, we can help you with the relocation process.

Vacancy Specialization

Data Modeling

Ref Number

VR-98624

Luxoft


Job Detail

  • Job Id
    JD3107369
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    India
  • Education
    Not mentioned
  • Experience
    Year