with strong software engineering principles and hands-on experience in building scalable data workflows. The ideal candidate has experience working with Polars or similar data frame libraries and exposure to cloud platforms like GCP.
Key Responsibilities:
Design, develop, and optimize data workflows using Python and Polars
Write clean, scalable, and testable code
Implement unit and end-to-end tests
Set up and maintain Docker-based development environments
Collaborate with cross-functional teams to deploy solutions on Google Cloud Platform
Leverage foundational machine learning knowledge in project delivery
Occasionally contribute to projects using Pandas or other similar libraries
Requirements:
Familiarity with testing tools and frameworks
Experience using Docker for local and CI environments
Exposure to data engineering workflows and pipelines
Experience working with GCP services
Basic understanding of machine learning workflows
Comfortable contributing to Ruby codebases when needed
If you're a proactive engineer who enjoys building data-driven systems in a remote-first environment, we'd love to hear from you.
Job Type: Full-time
Pay: From ₹1,200,000.00 per year
Application Question(s):
How many years of work experience do you have with Python (Programming Language)?
Can you join immediately?
What is your notice period (in days)?
What's your current CTC?
Work Location: Remote