At icogz Technologies Pvt Ltd, we are pioneers in harnessing the power of generative AI and proprietary algorithms to transform the complexity of Big Data into clear, actionable insights. Our unique approach recognizes the challenges posed by fragmented data silos, which stem from the diverse perspectives of various stakeholders within a company.
By utilizing our advanced proprietary algorithms and streamlined data processing flows, we effectively integrate and analyze vast, disparate datasets. This process delivers a unified, cohesive view of your business operations, enabling real-time analysis and insightful decision-making.
Our solutions at icogz Technologies empower businesses to swiftly adapt and capitalize on strategic opportunities. By turning big data into a navigable landscape of insights, we help enhance operational agility, improve performance, and drive profitability.
Role Overview:
We're seeking a seasoned Senior Python Developer to architect and deliver high-performance, scalable backend solutions. You'll develop robust APIs, optimize data processing workflows, and ensure code quality while adhering to secure, efficient, and SaaS-ready practices. Your expertise in Django, FastAPI, PostgreSQL, document databases, and big data will drive our platform's evolution.
Key Responsibilities:
Design, develop, and maintain RESTful APIs using Django/DRF and FastAPI for big data analytics and LLM-based applications.
Architect, develop, and maintain multi-tenant SaaS solutions with security-first coding practices (e.g., OWASP and ASVS standards).
Optimize PostgreSQL performance for large datasets, including query tuning, indexing, and partitioning.
Manage and optimize document database operations: design, implement, and maintain document stores (e.g., MongoDB, Couchbase) to ensure efficient data storage, retrieval, and scalability. Optimize queries and indexes, and ensure high-availability and disaster-recovery strategies are in place.
Process and analyze big data efficiently using Pandas, PySpark, or similar libraries.
Integrate third-party APIs (payment gateways, AI/ML services, marketplaces) with fault tolerance and monitoring.
Implement caching strategies (Redis, Memcached) to reduce latency and improve scalability.
Write async-friendly code and leverage concurrency for performance-critical tasks.
Develop dynamic, reusable code components to support rapid iteration in marketplace or SaaS environments.
Ensure compliance with secure coding practices (authentication, encryption, vulnerability mitigation).
Collaborate with data scientists/ML engineers to deploy and optimize AI/ML models in production.
Stay updated with AI-driven tools (e.g., GitHub Copilot) to enhance development efficiency.
Requirements:
5+ years of Python development experience, with expert-level proficiency in Django and FastAPI.
Strong understanding of REST API design, authentication (OAuth2, JWT), and versioning.
Hands-on experience with PostgreSQL, including query optimization for large datasets.
Expertise in document database systems (e.g., MongoDB, Couchbase), including schema design, query optimization, and storage management.
Proficient in Pandas, NumPy, or similar libraries for data transformation and analytics.
Familiarity with asynchronous and task-queue frameworks (e.g., asyncio, Celery) and event-driven architectures.
Experience building multi-tenant SaaS systems (isolated databases, tenant-aware middleware).
Knowledge of big data tools (Spark, Dask, PyArrow) and efficient file processing (Parquet, CSV).
Expertise in caching mechanisms (e.g., Redis, Memcached).
Proven track record of integrating third-party APIs (social media platforms, payment systems, cloud services).
Adaptability to AI tools.
Strong problem-solving skills and a commitment to writing clean, maintainable code.
Nice-to-Have:
Experience with Django Channels for WebSocket/real-time applications.
Familiarity with cloud platforms (AWS, Azure) and containerization (Docker, Kubernetes).
Exposure to marketplace ecosystems (e.g., pricing engines, vendor APIs).
Contributions to open-source projects or tech community involvement.
Experience with monitoring tools such as Grafana and Prometheus.
Awareness of AI/ML concepts (model deployment, data pipelines).
Exposure to analytical databases such as ClickHouse and DuckDB.
Why Join Us:
Opportunities to work on AI/ML-driven projects.
Continuous learning.
Collaborative culture with a focus on innovation.
Job Types: Full-time, Permanent
Location Type: In-person
Schedule: Day shift, fixed shift