As a key member of the DTS team, you will work closely with a leading global hedge fund on data engagements, partnering with the data strategy and sourcing team on data requirements to design data pipelines and delivery structures.
Desired Skills and Experience
Essential skills
A bachelor's degree in computer science, engineering, mathematics, or statistics.
4-6 years of experience in a Data Engineering role, with a proven track record of delivering insightful, value-adding dashboards.
Strong experience with Snowflake for data warehousing and analytics.
Proficiency in Python for scripting and automation.
Hands-on experience with AWS S3 for data storage and retrieval.
Solid understanding of SQL and data modeling concepts.
Familiarity with Unix-based command-line environments.
Experience in building and maintaining QA frameworks for data validation.
Ability to prioritize multiple projects simultaneously, solve problems, and think outside the box.
Key Responsibilities
Data Loading Operations: Operate proprietary tools for data ingestion using command-line interfaces and Python scripting in a Unix-based environment.
Custom SQL Development: Write and optimize bespoke SQL scripts to query and manipulate transactional data within the Snowflake data warehouse.
Quality Assurance (QA): Design and implement QA scripts and validation frameworks to ensure data integrity, accuracy, and completeness, including handling edge cases (an illustrative sketch of this kind of check follows this list).
Data Cleansing & Transformation: Apply engineering principles and business logic to cleanse and transform raw datasets into structured formats for internal ELT pipelines.
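To give a flavor of the day-to-day work, below is a minimal, illustrative Python sketch of the kind of QA validation script this role involves: it runs a few SQL checks against a Snowflake warehouse and exits non-zero on failure so a Unix scheduler can flag it. The connection parameters, the ORDERS table, and the specific checks are hypothetical placeholders, not details of the actual engagement.

    import os
    import snowflake.connector  # requires the snowflake-connector-python package

    # Placeholder credentials, read from the environment.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",   # hypothetical warehouse
        database="RAW",             # hypothetical database
        schema="TRANSACTIONS",      # hypothetical schema
    )

    # Each check pairs a description with a SQL statement that returns a single
    # count of offending rows; a non-zero count means the check failed.
    CHECKS = [
        ("null order IDs",
         "SELECT COUNT(*) FROM ORDERS WHERE ORDER_ID IS NULL"),
        ("duplicate order IDs",
         "SELECT COUNT(*) FROM (SELECT ORDER_ID FROM ORDERS "
         "GROUP BY ORDER_ID HAVING COUNT(*) > 1)"),
        ("negative transaction amounts",
         "SELECT COUNT(*) FROM ORDERS WHERE AMOUNT < 0"),
    ]

    failures = 0
    cur = conn.cursor()
    try:
        for name, sql in CHECKS:
            cur.execute(sql)
            bad_rows = cur.fetchone()[0]
            if bad_rows:
                failures += 1
                print(f"FAIL: {bad_rows} rows with {name}")
            else:
                print(f"PASS: {name}")
    finally:
        cur.close()
        conn.close()

    # A non-zero exit code lets cron or another Unix scheduler surface the failure.
    raise SystemExit(1 if failures else 0)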
Key Technologies
Snowflake, AWS (specifically S3), Python
Unix-based command line, exposure to QA
Behavioral Competencies
Good communication (verbal and written)
Experience in managing client stakeholders