
50+ Python Jobs in India

Apply to 50+ Python Jobs on CutShort.io. Find your next job, effortlessly. Browse Python Jobs and apply today!

ZeMoSo Technologies

Agency job
via Devseekerz by Sakthi Ganesh
Remote only
4 - 12 yrs
₹22L - ₹36L / yr
Python
Data Analytics
Data Science
Machine Learning (ML)

● Hands-on development experience as a Data Analyst and/or ML Engineer.

● Coding experience in Python.

● Good experience with ML models and ML algorithms.

● Experience with statistical modelling of large data sets.

● Immediate joiners, or candidates with a notice period of at most 30 days.

● Candidates based in Bangalore, Pune, Hyderabad, or Mumbai will be preferred.


What you will do:

● Play the role of Data Analyst / ML Engineer

● Collection, cleanup, exploration and visualization of data

● Perform statistical analysis on data and build ML models

● Implement ML models using some of the popular ML algorithms

● Use Excel to perform analytics on large amounts of data

● Understand, model and build to bring actionable business intelligence out of data that is available in different formats

● Work with data engineers to design, build, test and monitor data pipelines for ongoing business operations
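The cleanup-and-statistics work described above can be sketched in a few lines of plain Python; this is an illustrative toy only (the `revenue` field and the z-score cutoff are invented for the example):

```python
import statistics

def summarize_metric(records, field, max_z=2.0):
    """Clean a list of record dicts, drop outliers beyond max_z
    standard deviations, and summarize the remaining values."""
    # Cleanup: keep only records where the field holds a number
    values = [r[field] for r in records
              if isinstance(r.get(field), (int, float))]
    mean = statistics.mean(values)
    stdev = statistics.stdev(values) if len(values) > 1 else 0.0
    # Exploration: filter outliers by z-score before summarizing
    kept = [v for v in values
            if stdev == 0 or abs(v - mean) / stdev <= max_z]
    return {
        "n": len(kept),
        "mean": statistics.mean(kept),
        "median": statistics.median(kept),
        "stdev": statistics.stdev(kept) if len(kept) > 1 else 0.0,
    }

records = [{"revenue": v} for v in [10, 12, 11, 13, 12, 500]] + [{"revenue": None}]
print(summarize_metric(records, "revenue"))
```

In practice this kind of exploration would usually lean on pandas or NumPy; the stdlib version just shows the shape of the work.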

 

Basic Qualifications:

● Experience: 4+ years.

● Hands-on development experience playing the role of Data Analyst and/or ML Engineer.

● Experience working with Excel for data analytics

● Experience with statistical modelling of large data sets

● Experience with ML models and ML algorithms

● Coding experience in Python

 

Nice to have Qualifications:

● Experience with a wide variety of tools used in ML

● Experience with Deep learning

 

Benefits:

● Competitive salary.

● Hybrid work model.

● Learning and gaining experience rapidly.

● Reimbursement for basic working set up at home.

● Insurance (including a top up insurance for COVID).

IT Solutions


Agency job
via HR Central by Melrose Savia Pinto
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹18L / yr
SQL
Python
Customer Relationship Management (CRM)
Microsoft Dynamics CRM
Salesforce

The CRM team is responsible for communications across email, mobile push and web push channels. We focus on our existing customers and manage our interactions and touchpoints to ensure that we optimise revenue generation, drive traffic to the website and app, and extend the active customer lifecycle. We also work closely with the Marketing and Product teams to ensure that any initiatives are integrated with CRM activities.


Our setup is highly data driven and requires the understanding and skill set to work with large datasets, employing data science techniques to create personalised content at a 1:1 level. The candidate for this role will have to demonstrate a strong background working in this environment, and have a proven track record of striving to find technical solutions for the many projects and situations that the business encounters.


Overview of role :


- Setting up automation pipelines in Python and SQL to flow data in and out of CRM platform for reporting, personalisation and use in data warehousing (Redshift)


- Writing, managing, and troubleshooting template logic written in Freemarker.


- Building proprietary algorithms for use in CRM campaigns, targeted at improving all areas of customer lifecycle.


- Working with big datasets to segment audiences on a large scale.


- Driving innovation by planning and implementing a range of A/B tests.


- Acting as a technical touchpoint for developer and product teams to push projects over the line.


- Integrating product initiatives into CRM, and performing user acceptance testing (UAT)


- Interacting with multiple departments, and presenting to our executive team to help them understand CRM activities and plan new initiatives.


- Working with third party suppliers to optimise and improve their offering.


- Creating alert systems and troubleshooting tools to check on the health of automated jobs running in Jenkins and the CRM platform.


- Setting up automated reporting in Amazon Quicksight.


- Assisting other teams with any technical advice/information they may require.


- When necessary, working in JavaScript to set up Marketing and CRM tags in Adobe Launch.


- Training team members and working with them to make processes more efficient.


- Working with REST APIs to integrate CRM System with a range of technologies from third party vendors to in-house services.


- Contributing to discussions on future strategy, interpretation of test results, and helping resolve any major CRM issues
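For a flavour of the audience-segmentation work listed above, here is a deliberately simplified sketch in plain Python (the segment names, thresholds, and field names are hypothetical; at scale this would be done in SQL against the warehouse):

```python
def segment_customers(customers):
    """Assign each customer to a CRM segment based on simple
    recency/spend rules (thresholds are illustrative only)."""
    segments = {"vip": [], "active": [], "lapsed": []}
    for c in customers:
        if c["lifetime_spend"] >= 1000:
            segments["vip"].append(c["id"])
        elif c["days_since_last_order"] <= 90:
            segments["active"].append(c["id"])
        else:
            segments["lapsed"].append(c["id"])
    return segments

customers = [
    {"id": 1, "lifetime_spend": 2500, "days_since_last_order": 10},
    {"id": 2, "lifetime_spend": 120,  "days_since_last_order": 30},
    {"id": 3, "lifetime_spend": 80,   "days_since_last_order": 400},
]
print(segment_customers(customers))
```

Each segment then maps onto its own email/push journey in the CRM platform.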


Key skills required :


- Strong background in SQL


- Experience with a programming language (preferably Python or Freemarker)


- Understanding of REST APIs and how to utilise them


- Technical-savvy - you cast a creative eye on all activities of the team and business and suggest new ideas and improvements


- Comfortable presenting and interacting with all levels of the business and able to communicate technical information in a clear and concise manner.


- Ability to work under pressure and meet tight deadlines.


- Strong attention to detail


- Experience working with large datasets, and able to spot and pick up on important trends


- Understanding of key CRM metrics on performance and deliverability

Peenak Business solutions
Gaurav Kaushik
Posted by Gaurav Kaushik
Bengaluru (Bangalore)
4 - 6 yrs
₹25L - ₹32L / yr
Python
NodeJS (Node.js)
Go Programming (Golang)
SQL
NOSQL Databases

Exp: 4-6 years

Position: Backend Engineer

Job Location: Bangalore (office near Cubbon Park, opposite JW Marriott)

Work Mode: 5 days work from office


Requirements:

● Engineering graduate with 3-5 years of experience in software product development.

● Proficient in Python, Node.js, Go

● Good knowledge of SQL and NoSQL

● Strong Experience in designing and building APIs

● Experience with working on scalable interactive web applications

● A clear understanding of software design constructs and their implementation

● Understanding of the threading limitations of Python and multi-process architecture

● Experience implementing Unit and Integration testing

● Exposure to the Finance domain is preferred

● Strong written and oral communication skills
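On the threading-limitations point in the requirements above: CPython's global interpreter lock (GIL) means only one thread executes Python bytecode at a time, so threads help I/O-bound work but not CPU-bound work, which is why multi-process architectures are used for parallelism. A small illustrative sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def cpu_bound(n):
    """A CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

# Threads give correct results and are fine for I/O-bound work, but
# for CPU-bound tasks like this the GIL serializes execution; a
# ProcessPoolExecutor (multi-process) would run them in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(cpu_bound, [10, 100, 1000]))

print(results)
```

Swapping `ThreadPoolExecutor` for `concurrent.futures.ProcessPoolExecutor` is the usual route to true CPU parallelism across processes.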

Deqode

Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Hyderabad, Indore, Jaipur, Kolkata
4 - 5 yrs
₹2L - ₹18L / yr
Python
PySpark

We are looking for a skilled and passionate Data Engineer with a strong foundation in Python programming and hands-on experience working with APIs, AWS cloud, and modern development practices. The ideal candidate will have a keen interest in building scalable backend systems and in working with big data tools like PySpark.

Key Responsibilities:

  • Write clean, scalable, and efficient Python code.
  • Work with Python frameworks such as PySpark for data processing.
  • Design, develop, update, and maintain APIs (RESTful).
  • Deploy and manage code using GitHub CI/CD pipelines.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Work on AWS cloud services for application deployment and infrastructure.
  • Design basic database schemas and interact with MySQL or DynamoDB.
  • Debug and troubleshoot application issues and performance bottlenecks.
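Much of the API-plus-AWS work above boils down to small handler functions. Below is a minimal Lambda-style sketch (the `user_id` field is hypothetical; the `handler(event, context)` signature and the API Gateway-shaped response follow standard AWS conventions):

```python
import json

def handler(event, context):
    """Minimal REST-style handler: parse the request body,
    validate it, and return an API Gateway-shaped response."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    if "user_id" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "user_id required"})}

    # In a real service this step would read/write MySQL or DynamoDB;
    # here we just echo back an acknowledgement.
    return {
        "statusCode": 200,
        "body": json.dumps({"ok": True, "user_id": body["user_id"]}),
    }

print(handler({"body": '{"user_id": 42}'}, None))
```

The same function shape deploys unchanged behind API Gateway, which is what makes unit-testing these handlers straightforward.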

Required Skills & Qualifications:

  • 4+ years of hands-on experience with Python development.
  • Proficient in Python basics with a strong problem-solving approach.
  • Experience with AWS Cloud services (EC2, Lambda, S3, etc.).
  • Good understanding of API development and integration.
  • Knowledge of GitHub and CI/CD workflows.
  • Experience in working with PySpark or similar big data frameworks.
  • Basic knowledge of MySQL or DynamoDB.
  • Excellent communication skills and a team-oriented mindset.

Nice to Have:

  • Experience in containerization (Docker/Kubernetes).
  • Familiarity with Agile/Scrum methodologies.


Premier global software products and services firm


Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
7 - 14 yrs
₹15L - ₹25L / yr
Process automation
uipath
Power Automate
BOT development
BOT design

As an RPA (Robotic Process Automation) Lead, you will drive the strategic implementation of automation solutions, lead a team in designing and deploying robotic workflows, and collaborate with stakeholders to optimize business processes, ensuring efficiency and innovation.


We are looking for you!

You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices. The ideal candidate is creative, proactive, a go-getter, and motivated to look for ways to add value.


You are self-motivated with a strong work ethic, a positive attitude and demeanor, and enthusiasm for new challenges; you can multitask and prioritize (good time management) and are willing to learn new technologies and methodologies, staying adaptable and flexible when new products are assigned. You can work independently with little or no supervision. You are process-oriented, take a methodical, quality-first approach, and have ideally worked in result-oriented teams.


What you’ll do

  • Work in customer-facing roles, understanding business requirements and conducting process assessments.
  • Conduct architectural evaluation, design, and analysis of automation deployments.
  • Bring hands-on experience in bot design, bot development, testing, and debugging.
  • Prepare and review technical documentation (Solution Design Document).
  • Drive best-practice design: identify reusable components, queues, and configurable parameters.
  • Draw on experience with customer interaction and the software development lifecycle, as well as Agile project management methodology.
  • Research, recommend, and implement new processes and technologies to improve the quality of services provided.
  • Partner with the pre-sales team to estimate efforts and craft solutions.


What you will Bring 

  • Bachelor's degree in Computer Science, or any related field.
  • 8 to 12 years of experience with hands-on experience in RPA development and deployment.
  • Certifications with RPA platforms preferably on (UiPath or Power Automate).
  • Hands-on experience working in development or support projects.
  • Experience working in Agile SCRUM environment.
  • Strong communication, organizational, analytical and problem-solving skills.
  • Ability to succeed in a collaborative and fast paced environment.
  • Ensure the delivery of high-quality solutions meeting clients' expectations.
  • Ability to lead teams of developers and junior developers.
  • Programming languages: knowledge of at least one of C#, Visual Basic, Python, .NET, or Java.


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.
Texple Technologies

Prajakta Mhadgut
Posted by Prajakta Mhadgut
Mumbai
7 - 10 yrs
₹10L - ₹20L / yr
MERN Stack
AWS
Python

We are looking for a highly experienced and visionary Tech Lead / Solution Architect with deep expertise in the MERN stack and AWS to join our organization. In this role, you will be responsible for providing technical leadership across multiple projects, guiding architecture decisions, and ensuring scalable, maintainable, and high-quality solutions. You will work closely with cross-functional teams to define technical strategies, mentor developers, and drive the successful execution of complex projects. Your leadership, architectural insight, and hands-on development skills will be key to the team’s success and the organization's technological growth.


Responsibilities:

  • You will be responsible for all the technical decisions related to the project.
  • Lead and mentor a team of developers, providing technical guidance and expertise.
  • Collaborate with product managers, business analysts, and other stakeholders.
  • Architect and design technical solutions that align with project goals and industry best practices.
  • Develop and maintain scalable, reliable, and efficient software applications.
  • Conduct code reviews, ensure code quality, and enforce coding standards.
  • Identify technical risks and challenges, and propose solutions to mitigate them.
  • Stay updated with emerging technologies and trends in software development.
  • Collaborate with cross-functional teams to ensure seamless integration of software components.

Requirements:

  • Bachelor's degree / Graduate
  • Proven experience (7-10 years) as a Technical Lead or in a similar software development role (start-up experience preferred)
  • Strong technical skills in the MERN stack and Python, and in databases such as Postgres and MySQL.
  • Knowledge of cloud technologies (e.g., AWS, Azure, Google Cloud Platform) and microservices architecture.
  • Excellent leadership, communication, and interpersonal skills.
  • Ability to prioritize tasks, manage multiple projects, and work in a fast-paced environment.

Benefits:

  • Competitive salary and benefits package
  • Opportunities for professional growth and development
  • Collaborative and innovative work environment
  • Certifications on us


Joining : Immediate

Location : Malad (West) - Work From Office


This opportunity is strictly Work From Office.

Apply for this job only if your current location is Mumbai.

NeoGenCode Technologies Pvt Ltd
Gurugram
2 - 5 yrs
₹2L - ₹5L / yr
Software Testing (QA)
Vulnerability Testing
Penetration testing
Automated testing
Python

Job Title: QA Tester – Security & Vulnerability Testing

Experience: 3+ Years

Location: Gurugram (6 Days WFO)


Job Summary :

We’re seeking a QA Tester with strong experience in Vulnerability and Security Testing.

The ideal candidate will perform manual and automated penetration testing, identify security flaws, and work closely with development teams to ensure secure, compliant applications.


Key Responsibilities :

  • Perform vulnerability assessments on web, mobile, and cloud apps.
  • Conduct tests for OWASP Top 10 issues (e.g., SQLi, XSS, CSRF, SSRF).
  • Use tools like Burp Suite, OWASP ZAP, Metasploit, Kali Linux, Nessus, etc.
  • Automate security testing and integrate with CI/CD (Jenkins, GitHub, GitLab).
  • Test and secure APIs, including auth mechanisms (OAuth, JWT, SAML).
  • Ensure compliance with ISO 27001, GDPR, HIPAA, PCI-DSS.
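As a tiny illustration of the reflected-payload idea behind XSS testing (scanners like Burp Suite and OWASP ZAP do this far more thoroughly; the payload list and sample pages here are toys):

```python
import html

PAYLOADS = ["<script>alert(1)</script>", '"><img src=x onerror=alert(1)>']

def find_reflected(response_body, payloads=PAYLOADS):
    """Return payloads that appear unescaped in an HTTP response
    body - a naive signal of possible reflected XSS."""
    hits = []
    for p in payloads:
        # Escaped output (e.g. &lt;script&gt;) is the safe case.
        if p in response_body and html.escape(p) not in response_body:
            hits.append(p)
    return hits

safe_page = "<p>You searched for: &lt;script&gt;alert(1)&lt;/script&gt;</p>"
vulnerable_page = "<p>You searched for: <script>alert(1)</script></p>"
print(find_reflected(safe_page), find_reflected(vulnerable_page))
```

A real test harness would inject each payload via a request parameter first and then run this kind of check on the response.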

Requirements :

  • 3+ Years in QA with a focus on Security/Vulnerability Testing.
  • Experience in manual & automated security testing.
  • Knowledge of scripting (Python, Bash, JS).
  • Familiarity with cloud platforms (AWS, Azure, GCP).
  • Bonus: Certifications like CEH, OSCP, Security+, etc.
SaaS Spend Management Platform


Agency job
via Recruiting Bond by Pavan Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 3 yrs
₹4L - ₹7L / yr
Python
React.js
SQL
Fullstack Developer
Large Language Models (LLM)

Requirement:

● Role: Fullstack Developer

● Location: Noida (Hybrid)

● Experience: 1-3 years

● Type: Full-Time


Role Description : We’re seeking a Fullstack Developer to join our fast-moving team at Velto. You’ll be responsible for building robust backend services and user-facing features using a modern tech stack. In this role, you’ll also get hands-on exposure to applied AI, contributing to the development of LLM-powered workflows, agentic systems, and custom fine-tuning pipelines.


Responsibilities:

● Develop and maintain backend services using Python and FastAPI

● Build interactive frontend components using React

● Work with SQL databases, design schema, and integrate data models with Python

● Integrate and build features on top of LLMs and agent frameworks (e.g., LangChain, OpenAI, HuggingFace)

● Contribute to AI fine-tuning pipelines, retrieval-augmented generation (RAG) setups, and contract intelligence workflows

● Proficiency with unit-testing libraries like Jest, React Testing Library, and pytest

● Collaborate in agile sprints to deliver high-quality, testable, and scalable code

● Ensure end-to-end performance, security, and reliability of the stack
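The retrieval step of a RAG setup can be pictured with the bag-of-words toy below; real pipelines use embedding models and vector stores (e.g., via LangChain), so this is purely illustrative, and the sample documents are invented:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query; in a RAG
    pipeline these would be passed to the LLM as context."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in documents]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [d for _, d in scored[:k]]

docs = [
    "the contract renewal date is 1 March",
    "invoices are payable within 30 days",
    "the office is closed on public holidays",
]
print(retrieve("when is the contract renewal", docs))
```

Swapping the Counter vectors for dense embeddings and the list scan for a vector-database query gives the production version of the same idea.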


Required Skills:

● Proficient in Python and experienced with web frameworks like FastAPI

● Strong grasp of JavaScript and React for frontend development

● Solid understanding of SQL and relational database integration with Python

● Exposure to LLMs, vector databases, and AI-based applications (projects, internships, or coursework count)

● Familiar with Git, REST APIs, and modern software development practices

● Bachelor’s degree in Computer Science or an equivalent field


Nice to Have:

● Experience working with LangChain, RAG pipelines, or building agentic workflows

● Familiarity with containerization (Docker), basic DevOps, or cloud deployment

● Prior project or internship involving AI/ML, NLP, or SaaS products

Why Join Us?

● Work on real-world applications of AI in enterprise SaaS

● Fast-paced, early-stage startup culture with direct ownership

● Learn by doing—no layers, no red tape

● Hybrid work setup and merit-based growth



ChicMic Studios
Akanksha Mittal
Posted by Akanksha Mittal
Mohali
2 - 3 yrs
₹4L - ₹9L / yr
Django
Python
Flask
Amazon Web Services (AWS)
PostgreSQL

Job Description:

We are looking for a highly skilled and experienced Python Developer to join our dynamic team. The ideal candidate will have a robust background in developing web applications using Django and Flask, with experience in deploying and managing applications on AWS.

Proficiency in Django Rest Framework (DRF) and a solid understanding of machine learning concepts and their practical applications are essential.


Key Responsibilities:

● Develop and maintain web applications using Django and Flask frameworks.

● Design and implement RESTful APIs using Django Rest Framework (DRF).

● Deploy, manage, and optimize applications on AWS.

● Develop and maintain APIs for AI/ML models and integrate them into existing systems.

● Create and deploy scalable AI and ML models using Python.

● Ensure the scalability, performance, and reliability of applications.

● Write clean, maintainable, and efficient code following best practices.

● Perform code reviews and provide constructive feedback to peers.

● Troubleshoot and debug applications, identifying and fixing issues in a timely manner.

● Stay up-to-date with the latest industry trends and technologies to ensure our applications remain current and competitive.


Required Skills and Qualifications:

● Bachelor’s degree in Computer Science, Engineering, or a related field.

● 3+ years of professional experience as a Python Developer.

● Proficient in Python with a strong understanding of its ecosystem.

● Extensive experience with Django and Flask frameworks.

● Hands-on experience with AWS services, including but not limited to EC2, S3, RDS, Lambda, and CloudFormation.

● Strong knowledge of Django Rest Framework (DRF) for building APIs.

● Experience with machine learning libraries and frameworks, such as scikit-learn, TensorFlow, or PyTorch.

● Solid understanding of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).

● Familiarity with front-end technologies (e.g., JavaScript, HTML, CSS) is a plus.

● Excellent problem-solving skills and the ability to work independently and as part of a team.

● Strong communication skills and the ability to articulate complex technical concepts to non-technical stakeholders.

Client located in Bangalore

Agency job
Remote only
4 - 12 yrs
₹30L - ₹60L / yr
Large Language Models (LLM)
Deep Learning
Machine Learning (ML)
Python
Healthcare

Experience:

  • Junior Level: 4+ years
  • Senior Level: 8+ years

Work Mode: Remote

About the Role:

We are seeking a highly skilled and motivated Data Scientist with deep expertise in Machine Learning (ML), Deep Learning, and Large Language Models (LLMs) to join our forward-thinking AI & Data Science team. This is a unique opportunity to contribute to real-world impact in the healthcare industry, transforming the way patients and providers interact with health data through Generative AI and NLP-driven solutions.

Key Responsibilities:

  • LLM Development & Fine-Tuning: Fine-tune and customize LLMs (e.g., GPT, LLaMA2, Mistral) for use cases such as text classification, NER, summarization, Q&A, and sentiment analysis. Experience with other transformer-based models (e.g., BERT) is a plus.
  • Data Engineering & Pipeline Design: Collaborate with data engineering teams to build scalable, high-quality data pipelines for training/fine-tuning LLMs on structured and unstructured healthcare datasets.
  • Experimentation & Evaluation: Design rigorous model evaluation and testing frameworks (e.g., with tools like TruLens) to assess performance and optimize model outcomes.
  • Deployment & MLOps Integration: Work closely with MLOps teams to ensure seamless integration of models into production environments on cloud platforms (AWS, Azure, GCP).
  • Predictive Modeling in Healthcare: Apply ML/LLM techniques to build predictive models for use cases in oncology (e.g., survival analysis, risk prediction, RWE generation).
  • Cross-functional Collaboration: Engage with domain experts, product managers, and clinical teams to translate healthcare challenges into actionable AI solutions.
  • Mentorship & Knowledge Sharing: Mentor junior team members and contribute to the growth of the team’s technical expertise.

Qualifications:

  • Master’s or Doctoral degree in Computer Science, Data Science, Artificial Intelligence, or related field.
  • 5+ years of hands-on experience in machine learning and deep learning, with at least 12 months of direct work on LLMs.
  • Strong coding skills in Python, with experience in libraries like HuggingFace Transformers, spaCy, NLTK, TensorFlow, or PyTorch.
  • Experience with prompt engineering, RAG pipelines, and evaluation techniques in real-world NLP deployments.
  • Hands-on experience in deploying models on cloud platforms (AWS, Azure, or GCP).
  • Familiarity with the healthcare domain and working on Real World Evidence (RWE) datasets is highly desirable.

Preferred Skills:

  • Strong understanding of healthcare data regulations (HIPAA, PHI handling, etc.)
  • Prior experience in speech and text-based AI applications
  • Excellent communication and stakeholder engagement skills
  • A passion for impactful innovation in the healthcare space


PGAGI
Javeriya Shaik
Posted by Javeriya Shaik
Remote only
0 - 1 yrs
₹1 - ₹3 / mo
Python
Java
TensorFlow
Keras
PyTorch

Job Title: AI Architecture Intern

Company: PGAGI Consultancy Pvt. Ltd.

Location: Remote

Employment Type: Internship


Position Overview

We're at the forefront of creating advanced AI systems, from fully autonomous agents that provide intelligent customer interaction to data analysis tools that offer insightful business solutions. We are seeking enthusiastic interns who are passionate about AI and ready to tackle real-world problems using the latest technologies.


Duration: 6 months


Key Responsibilities:

  • AI System Architecture Design: Collaborate with the technical team to design robust, scalable, and high-performance AI system architectures aligned with client requirements.
  • Client-Focused Solutions: Analyze and interpret client needs to ensure architectural solutions meet expectations while introducing innovation and efficiency.
  • Methodology Development: Assist in the formulation and implementation of best practices, methodologies, and frameworks for sustainable AI system development.
  • Technology Stack Selection: Support the evaluation and selection of appropriate tools, technologies, and frameworks tailored to project objectives and future scalability.
  • Team Collaboration & Learning: Work alongside experienced AI professionals, contributing to projects while enhancing your knowledge through hands-on involvement.


Requirements:

  • Strong understanding of AI concepts, machine learning algorithms, and data structures.
  • Familiarity with AI development frameworks (e.g., TensorFlow, PyTorch, Keras).
  • Proficiency in programming languages such as Python, Java, or C++.
  • Demonstrated interest in system architecture, design thinking, and scalable solutions.
  • Up-to-date knowledge of AI trends, tools, and technologies.
  • Ability to work independently and collaboratively in a remote team environment.


Perks:

- Hands-on experience with real AI projects.

- Mentoring from industry experts.

- A collaborative, innovative and flexible work environment

Compensation:

- Joining Bonus: A one-time bonus of INR 2,500 will be awarded upon joining.

- Stipend: The base stipend is INR 8,000 and can increase up to INR 20,000 depending on performance metrics.


After completion of the internship period, there is a chance to get a full-time opportunity as an AI/ML engineer (Up to 12 LPA).


Preferred Experience:

  • Prior experience in roles such as AI Solution Architect, ML Architect, Data Science Architect, or AI/ML intern.
  • Exposure to AI-driven startups or fast-paced technology environments.
  • Proven ability to operate in dynamic roles requiring agility, adaptability, and initiative.
Tecblic Private Limited
Ahmedabad
2 - 3 yrs
₹3L - ₹4.5L / yr
Python
Django
Flask
FastAPI
PostgreSQL

Job Profile : Python Developer

Job Location : Ahmedabad, Gujarat - On site

Job Type : Full time

Experience - 1-3 Years

 

Key Responsibilities:

Design, develop, and maintain Python-based applications and services.

Collaborate with cross-functional teams to define, design, and ship new features.

Write clean, maintainable, and efficient code following best practices.

Optimize applications for maximum speed and scalability.

Troubleshoot, debug, and upgrade existing systems.

Integrate user-facing elements with server-side logic.

Implement security and data protection measures.

Work with databases (SQL/NoSQL) and integrate data storage solutions.

Participate in code reviews to ensure code quality and share knowledge with the team.

Stay up-to-date with emerging technologies and industry trends.


Requirements:

1-3 years of professional experience in Python development.

Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.

Experience with RESTful APIs and web services.

Proficiency in working with databases (e.g., PostgreSQL, MySQL, MongoDB).

Familiarity with front-end technologies (e.g., HTML, CSS, JavaScript) is a plus.

Experience with version control systems (e.g., Git).

Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus.

Understanding of containerization tools like Docker and orchestration tools like Kubernetes is good to have

Strong problem-solving skills and attention to detail.

Excellent communication and teamwork skills.


Good to Have:

Experience with data analysis and visualization libraries (e.g., Pandas, NumPy, Matplotlib).

Knowledge of asynchronous programming and event-driven architecture.

Familiarity with CI/CD pipelines and DevOps practices.

Experience with microservices architecture.

Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch) is a plus.

Hands-on experience with RAG and LLM model integration would be a plus.

Tech Prescient

Ashwini Damle
Posted by Ashwini Damle
Remote, Pune
7 - 9 yrs
₹15L - ₹25L / yr
Python
Django
Flask
FastAPI
Amazon Web Services (AWS)

Job Description:

We are looking for a Python Lead who has the following experience and expertise -

  • Proficiency in developing RESTful APIs using the Flask, Django, or FastAPI frameworks
  • Hands-on experience of using ORMs for database query mapping
  • Unit test cases for code coverage and API testing
  • Using Postman for validating the APIs
  • Experience with the Git process and the rest of code management, including knowledge of ticket-management systems like JIRA
  • Have at least 2 years of experience in any cloud platform
  • Hands-on leadership experience
  • Experience of direct communication with the stakeholders

Skills and Experience:

  • Good academics
  • Strong teamwork and communications
  • Advanced troubleshooting skills
  • Ready and immediately available candidates will be preferred.


Client based at Bangalore location.


Agency job
Remote only
8 - 12 yrs
₹24L - ₹30L / yr
Real World evidence
RWE Analyst
Healthcare
Large Language Models (LLM)
SQL

Real-World Evidence (RWE) Analyst

Summary:

As an experienced Real-World Evidence (RWE) Analyst, you will leverage our cutting-edge healthcare data platform (accessing over 60 million lives in Asia, with ambitious growth plans across Africa and the Middle East) to deliver impactful clinical insights to our pharmaceutical clients. You will be involved in the full project lifecycle, from designing analyses to execution and delivery, within our agile data science team. This is an exciting opportunity to contribute significantly to a growing early-stage company focused on improving precision medicine and optimizing patient care for diverse populations.

Responsibilities:

·      Contribute to the design and execution of retrospective and prospective real-world research, including epidemiological and patient outcomes studies.

·      Actively participate in problem-solving discussions by clearly defining issues and proposing effective solutions.

·      Manage the day-to-day progress of assigned workstreams, ensuring seamless collaboration with the data engineering team on analytical requests.

·      Provide timely and clear updates on project status to management and leadership.

·      Conduct in-depth quantitative and qualitative analyses, driven by project objectives and your intellectual curiosity.

·      Ensure the quality and accuracy of analytical outputs, and contextualize findings by reviewing relevant published research.

·      Synthesize complex findings into clear and compelling presentations and written reports (e.g., slides, documents).

·      Contribute to the development of standards and best practices for future RWE analyses.
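As a toy example of the kind of patient-outcomes quantity such studies report, the snippet below computes an incidence rate over invented follow-up records (real analyses would apply proper epidemiological methods to the platform's data):

```python
def incidence_rate(patients, per=1000):
    """Incidence rate = new cases / total person-years at risk,
    scaled per `per` person-years."""
    cases = sum(1 for p in patients if p["event"])
    person_years = sum(p["followup_years"] for p in patients)
    return cases / person_years * per

cohort = [
    {"id": "A", "followup_years": 2.0, "event": True},
    {"id": "B", "followup_years": 5.0, "event": False},
    {"id": "C", "followup_years": 3.0, "event": True},
    {"id": "D", "followup_years": 10.0, "event": False},
]
# 2 events over 20 person-years -> 100 per 1,000 person-years
print(incidence_rate(cohort))
```

In practice the cohort would come from SQL queries over the healthcare data platform, with censoring and confounding handled explicitly.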

Requirements:

·      Undergraduate or post-graduate degree (MS or PhD preferred) in a quantitative analytical discipline such as Epidemiology, (Bio)statistics, Data Science, Engineering, Econometrics, or Operations Research.

·      8+ years of relevant work experience demonstrating:

o  Strong analytical and problem-solving capabilities.

o  Experience conducting research relevant to the pharmaceutical/biotech industry.

·      Proficiency in technical skills including SQL and at least one programming language (R, Python, or similar).

·      Solid understanding of the healthcare/medical and pharmaceutical industries.

·      Proven experience in managing workstream or project management activities.

·      Excellent written and verbal communication, and strong interpersonal skills with the ability to build collaborative partnerships.

·      Exceptional attention to detail.

·      Proficiency in Microsoft Office Suite (Excel, PowerPoint, Word).

Other Desirable Skills:

·      Demonstrated dedication to teamwork and the ability to collaborate effectively across different functions.

·      A strong desire to contribute to the growth and development of the RWE analytics function.

·      A proactive and innovative mindset with an entrepreneurial spirit, eager to take on a key role in a dynamic, growing company.

Read more
OneLab Ventures

OneLab Ventures

Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 3 yrs
₹5L - ₹6L / yr
skill iconPython
FastAPI
skill iconFlask
skill iconDjango
PyTorch
+3 more

AccioJob is conducting an offline hiring drive with OneLab Ventures for the position of:


  • AI/ML Engineer / Intern - Python, FastAPI, Flask/Django, PyTorch, TensorFlow, Scikit-learn, Gen AI tools


Apply Now: https://links.acciojob.com/4cLL1Uw


Eligibility:

  • Degree: BTech / BSc / BCA / MCA / MTech / MSc / BCS / MCS
  • Graduation Year:
  • For Interns - 2024 and 2025
  • For experienced - 2024 and before
  • Branch: All Branches
  • Location: Pune (work from office)


Salary:

  • For interns - 25K for 6 months and 5-6 LPA PPO
  • For experienced - Hike on the current CTC


Evaluation Process:

  • Assessment at AccioJob Pune Skill Centre.
  • Company side process: 2 rounds of tech interviews (Virtual + F2F) + 1 HR round


Apply Now: https://links.acciojob.com/4cLL1Uw


Important: Please bring your laptop & earphones for the test.

Read more
OneLab Ventures

OneLab Ventures

Agency job
via AccioJob by AccioJobHiring Board
Pune
0 - 3 yrs
₹5L - ₹6L / yr
skill iconPython
skill iconDjango
FastAPI
skill iconFlask
skill iconHTML/CSS
+3 more

AccioJob is conducting an offline hiring drive with OneLab Ventures for the position of:


  • Python Full Stack Engineer / Intern - Python, FastAPI, Flask/Django, HTML, CSS, JavaScript, and frameworks like React.js or Node.js


Apply Now: https://links.acciojob.com/4cBQiO7


Eligibility:

  • Degree: BTech / BSc / BCA / MCA / MTech / MSc / BCS / MCS
  • Graduation Year:
  • For Interns - 2024 and 2025
  • For experienced - 2024 and before
  • Branch: All Branches
  • Location: Pune (work from office)


Salary:

  • For interns - 25K for 6 months and 5-6 LPA PPO
  • For experienced - Hike on the current CTC


Evaluation Process:

  • Assessment at AccioJob Pune Skill Centre.
  • Company side process: 2 rounds of tech interviews (Virtual + F2F) + 1 HR round


Apply Now: https://links.acciojob.com/4cBQiO7


Important: Please bring your laptop & earphones for the test.

Read more
CD Edverse

at CD Edverse

2 candid answers
Ashish Yadav
Posted by Ashish Yadav
Remote only
1 - 5 yrs
₹10L - ₹15L / yr
skill iconPython
LangChain
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
AI Agents

Join CD Edverse, an innovative EdTech app, as AI Specialist! Develop a deep research tool to generate comprehensive courses and enhance AI mentors. Must have strong Python, NLP, and API integration skills. Be part of transforming education! Apply now.

Read more
DeepVidya AI Private Limited (OpenCV University)
Bengaluru (Bangalore)
2 - 5 yrs
₹5L - ₹10L / yr
skill iconPython
MySQL
skill iconAmazon Web Services (AWS)
Amazon EC2
Amazon S3
+6 more

About the job


Location: Bangalore, India

Job Type: Full-Time | On-Site


Job Description

We are looking for a highly skilled and motivated Python Backend Developer to join our growing team in Bangalore. The ideal candidate will have a strong background in backend development with Python, deep expertise in relational databases like MySQL, and hands-on experience with AWS cloud infrastructure.


Key Responsibilities

  • Design, develop, and maintain scalable backend systems using Python.
  • Architect and optimize relational databases (MySQL), including complex queries and indexing.
  • Manage and deploy applications on AWS cloud services (EC2, S3, RDS, DynamoDB, API Gateway, Lambda).
  • Automate cloud infrastructure using CloudFormation or Terraform.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Mentor junior developers and contribute to a culture of technical excellence.
  • Proactively identify issues and provide solutions to challenging backend problems.


Mandatory Requirements

  • Minimum 3 years of professional experience in Python backend development.
  • Expert-level knowledge in MySQL database creation, optimization, and query writing.
  • Strong experience with AWS services, particularly EC2, S3, RDS, DynamoDB, API Gateway, and Lambda.
  • Hands-on experience with infrastructure as code using CloudFormation or Terraform.
  • Proven problem-solving skills and the ability to work independently.
  • Demonstrated leadership abilities and team collaboration skills.
  • Excellent verbal and written communication.
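
To illustrate the indexing and query-optimization skills listed above, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for MySQL; the plan output differs between engines, but the before/after effect of an index is the same. The table and column names are invented:

```python
import sqlite3

# Hypothetical table and data: sqlite3 stands in for MySQL here; the idea
# (an index turns a full scan into a keyed lookup) carries over directly.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the engine's query plan as a single string."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan, e.g. "SCAN orders"
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # keyed lookup via idx_orders_customer

print(before)
print(after)
```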
Read more
Nirmitee.io

at Nirmitee.io

4 recruiters
Gitashri K
Posted by Gitashri K
Pune, Mumbai
5 - 11 yrs
₹5L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Microservices
skill iconPython
skill iconAngular (2+)

Should have strong hands-on experience of 8-10 yrs in Java development.

Should have strong knowledge of Java 11+, Spring, Spring Boot, Hibernate, and REST web services.

Strong knowledge of J2EE design patterns and microservices design patterns.

Should have strong hands-on knowledge of SQL / Postgres DB. Exposure to NoSQL DBs is good to have.

Should have strong knowledge of AWS services (Lambda, EC2, RDS, API Gateway, S3, CloudFront, Airflow).

Good to have Python and PySpark as secondary skills.

Should have good knowledge of CI/CD pipelines.

Should be strong in writing unit test cases and debugging Sonar issues.

Should be able to lead/guide a team of junior developers.

Should be able to collaborate with BAs and solution architects to create HLD and LLD documents.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
Best in industry
skill iconPython
SQL
Databases
Data engineering
skill iconAmazon Web Services (AWS)

Job Description: Data Engineer 

Position Overview:

Role Overview

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.

· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).

· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.

· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.

· Ensure data quality and consistency by implementing validation and governance practices.

· Work on data security best practices in compliance with organizational policies and regulations.

· Automate repetitive data engineering tasks using Python scripts and frameworks.

· Leverage CI/CD pipelines for deployment of data workflows on AWS.
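
A toy end-to-end sketch of the extract-transform-load flow described above, using only the standard library. The CSV source, schema, and cleansing rule are invented; in the AWS setup described, the extract and load steps would go through services like S3, Glue, or Redshift rather than an in-memory SQLite database.

```python
import csv
import io
import sqlite3

RAW_CSV = """event_id,user_id,amount
1,alice,10.50
2,bob,
3,alice,7.25
"""

def extract(text):
    """Extract: parse the raw source into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing amount and cast types."""
    return [
        (int(r["event_id"]), r["user_id"], float(r["amount"]))
        for r in rows
        if r["amount"]
    ]

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (event_id INTEGER, user_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
print(total)  # 17.75
```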

 

Required Skills and Qualifications

· Professional Experience: 5+ years of experience in data engineering or a related field.

· Programming: Strong proficiency in Python, with experience in libraries like pandas, pySpark, or boto3.

· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:

· AWS Glue for ETL/ELT.

· S3 for storage.

· Redshift or Athena for data warehousing and querying.

· Lambda for serverless compute.

· Kinesis or SNS/SQS for data streaming.

· IAM Roles for security.

· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.

· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.

· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.

· Version Control: Proficient with Git-based workflows.

· Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

· Knowledge of data modeling and data warehouse design principles.

· Experience with data visualization tools (e.g., Tableau, Power BI).

· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).

· Exposure to other programming languages like Scala or Java.

 

Education

· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

 

Why Join Us?

· Opportunity to work on cutting-edge AWS technologies.

· Collaborative and innovative work environment.

 

 

Read more
The Alter Office

at The Alter Office

2 candid answers
Harsha Ravindran
Posted by Harsha Ravindran
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹18L / yr
skill iconNodeJS (Node.js)
MySQL
NOSQL Databases
skill iconMongoDB
Google Cloud Platform (GCP)
+14 more

Role: Senior Software Engineer - Backend

Location: In-Office, Bangalore, Karnataka, India

 

Job Summary:

We are seeking a highly skilled and experienced Senior Backend Engineer with a minimum of 3 years of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that power our applications. You will work closely with cross-functional teams to ensure seamless integration between frontend and backend components, leveraging your expertise to architect scalable, secure, and high-performance solutions. As a senior team member, you will mentor junior developers and lead technical initiatives to drive innovation and excellence.

 

Annual Compensation: 12-18 LPA


Responsibilities:

  • Lead the design, development, and maintenance of scalable and efficient backend systems and APIs.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases and NoSQL databases.
  • Mentor and guide junior developers, fostering a culture of knowledge sharing and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and provide technical leadership in architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.

Requirements:

  • Minimum of 3 years of proven experience as a Backend Engineer, with a strong portfolio of product-building projects.
  • Strong proficiency in backend development using Java, Python, and JavaScript, with experience in building scalable and high-performance applications.
  • Experience with popular backend frameworks and libraries for Java (e.g., Spring Boot) and Python (e.g., Django, Flask).
  • Strong expertise in SQL and NoSQL databases (e.g., MySQL, MongoDB) with a focus on data modeling and scalability.
  • Practical experience with caching mechanisms (e.g., Redis) to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
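
As a sketch of the caching mechanism mentioned in the requirements, here is the cache-aside pattern with a TTL that Redis typically serves in production, implemented in-memory so the example is self-contained; the key names and TTL are invented.

```python
import time

# In-memory stand-in for Redis-style caching. The clock is injectable so the
# example is deterministic.
class TTLCache:
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock
        self._store = {}  # key -> (expires_at, value)

    def get_or_set(self, key, compute):
        now = self.clock()
        entry = self._store.get(key)
        if entry and entry[0] > now:
            return entry[1]                      # hit: still fresh
        value = compute()                        # miss or expired: recompute
        self._store[key] = (now + self.ttl, value)
        return value

# Usage with a fake clock: repeated reads cost one computation until the TTL lapses.
now = [0.0]
cache = TTLCache(ttl_seconds=60, clock=lambda: now[0])
calls = []

def fetch_profile():
    calls.append(1)               # count real "backend" calls
    return "profile-data"

assert cache.get_or_set("user:1", fetch_profile) == "profile-data"  # computes
assert cache.get_or_set("user:1", fetch_profile) == "profile-data"  # cached
now[0] = 61.0
cache.get_or_set("user:1", fetch_profile)                           # expired
print(len(calls))  # 2
```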


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹8L - ₹13L / yr
skill iconPython
RESTful APIs
SQL
JIRA

Requirements:

  • Must have proficiency in Python
  • At least 3+ years of professional experience in software application development.
  • Good understanding of REST APIs and a solid experience in testing APIs.
  • Should have built APIs at some point and have practical knowledge of working with them
  • Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
  • Ability to develop applications for test automation
  • Should have worked in a distributed micro-service environment
  • Hands-on experience with Python packages for testing (preferably pytest).
  • Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
  • Proficiency in Git
  • Strong in writing SQL queries
  • Experience with tools like Jira, Asana, or a similar bug-tracking tool; Confluence (wiki); Jenkins (CI)
  • Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
  • Proven track record of ability to handle time-critical projects
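
A minimal sketch of the API-testing style the requirements describe, using unittest.mock in place of a live service so the example runs anywhere; in practice this would be a pytest test with the session supplied by a fixture, and the endpoint path and payload here are invented.

```python
from unittest.mock import Mock

def get_user(session, user_id):
    """Tiny client under test: call the API and validate the response."""
    resp = session.get(f"/users/{user_id}")
    if resp.status_code != 200:
        raise RuntimeError(f"unexpected status {resp.status_code}")
    return resp.json()

def test_get_user_happy_path():
    # The mocked session stands in for a pytest fixture providing a real
    # HTTP client; the payload is invented.
    session = Mock()
    session.get.return_value = Mock(
        status_code=200, json=lambda: {"id": 7, "name": "asha"}
    )
    assert get_user(session, 7) == {"id": 7, "name": "asha"}
    session.get.assert_called_once_with("/users/7")

test_get_user_happy_path()
print("ok")
```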


Good to have:

  • Good understanding of CI/CD
  • Knowledge of queues, especially Kafka
  • Ability to independently manage test environment deployments and handle issues around them
  • Performed load testing of API endpoints
  • Should have built an API test automation framework from scratch and maintained it
  • Knowledge of cloud platforms like AWS, Azure
  • Knowledge of different browsers and cross-platform operating systems
  • Knowledge of JavaScript
  • Web Programming, Docker & 3-Tier Architecture Knowledge is preferred.
  • Should have knowledge of API creation; coding experience would be an add-on.
  • 5+ years of experience in test automation using tools like TestNG, Selenium WebDriver (Grid, parallel, SauceLabs), Mocha/Chai for front-end and backend test automation
  • Bachelor's degree in Computer Science / IT / Computer Applications


Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 7 yrs
₹4L - ₹12L / yr
skill iconPython
skill iconReact.js
skill iconHTML/CSS
skill iconPostgreSQL
Artificial Intelligence (AI)
+3 more

Job Title: Full Stack Engineer

Location: Delhi-NCR

Type: Full-Time

Responsibilities

Frontend:

  • Develop responsive, intuitive interfaces using HTML, CSS (SASS), React, and Vanilla JS.
  • Implement real-time features using sockets for dynamic, interactive user experiences.
  • Collaborate with designers to ensure consistent UI/UX patterns and deliver visually compelling products.

Backend:

  • Design, implement, and maintain APIs using Python (FastAPI).
  • Integrate AI-driven features to enhance user experience and streamline processes.
  • Ensure the code adheres to best practices in performance, scalability, and security.
  • Troubleshoot and resolve production issues, minimizing downtime and improving reliability.

Database & Data Management:

  • Work with PostgreSQL for relational data, ensuring optimal queries and indexing.
  • Utilize ClickHouse or MongoDB where appropriate to handle specific data workloads and analytics needs.
  • Contribute to building dashboards and tools for analytics and reporting.
  • Leverage AI/ML concepts to derive insights from data and improve system performance.

General:

  • Use Git for version control; conduct code reviews, ensure clean commit history, and maintain robust documentation.
  • Collaborate with cross-functional teams to deliver features that align with business goals.
  • Stay updated with industry trends, particularly in AI and emerging frameworks, and apply them to enhance our platform.
  • Mentor junior engineers and contribute to continuous improvement in team processes and code quality.




Read more
MindInventory

at MindInventory

1 video
4 recruiters
Khushi Bhatt
Posted by Khushi Bhatt
Ahmedabad
3 - 5 yrs
₹4L - ₹12L / yr
Data engineering
ETL
Google Cloud Platform (GCP)
Apache Airflow
Snowflake schema
+3 more
  • Required Minimum 3 years of Experience as a Data Engineer
  • Database Knowledge: Experience with time-series and graph databases is a must, along with SQL databases (PostgreSQL, MySQL) or NoSQL databases like Firestore and MongoDB.
  • Data Pipelines: Understanding of data pipeline processes such as ETL, ELT, and streaming pipelines, with tools like AWS Glue, Google Dataflow, Apache Airflow, and Apache NiFi.
  • Data Modeling: Knowledge of Snowflake Schema, Fact & Dimension Tables.
  • Data Warehousing Tools: Experience with Google BigQuery, Snowflake, Databricks
  • Performance Optimization: Indexing, partitioning, caching, query optimization techniques.
  • Python or SQL Scripting: Ability to write scripts for data processing and automation
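
To illustrate the fact-and-dimension modeling mentioned above, here is a toy star schema; sqlite3 stands in for BigQuery or Snowflake purely to keep the sketch runnable, and all table names and rows are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product (product_id),
    qty INTEGER
);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO fact_sales VALUES (10, 1, 3), (11, 2, 5), (12, 1, 4);
""")

# The typical analytical query shape: aggregate the fact table, grouped by a
# dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.qty) AS total_qty
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 7), ('games', 5)]
```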


Read more
Sim Gems Group

at Sim Gems Group

4 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 8 yrs
Upto ₹35L / yr (varies)
Odoo (OpenERP)
skill iconPython
skill iconJavascript
SQL

This is a full-time role. Odoo development experience is a must:


  • Hands-on coding (a must) in Python, PostgreSQL, and JavaScript;
  • Hands-on proficiency in writing complex SQL queries and query optimisation.
  • Experience with Odoo framework, Odoo deployment, Kubernetes or docker.
  • Frontend with Owl.js or any JavaScript Framework;
  • Strong foundation in programming.
  • API integration and data exchange.
  • At least 4 years of Odoo development.
  • Team leadership experience.
Read more
Deltek
Puja Rana
Posted by Puja Rana
Remote only
4 - 6 yrs
₹12L - ₹18L / yr
skill iconPython

Position Responsibilities :

  • Work with product managers to understand the business workflows/requirements, identify gaps, and propose relevant technical solutions
  • Design, implement, and tune product changes that work within the time tracking/project management environment
  • Be understanding and sensitive to customer requirements to be able to offer alternative solutions
  • Keep in pace with the product releases
  • Work within Deltek-Replicon's software development process, expectations and quality initiatives
  • Work to accurately evaluate risk and estimate software development tasks
  • Strive to continually improve technical and developmental skills

Qualifications :

  • Bachelor of Computer Science, Computer Engineering, or related field.
  • 4+ years of software development experience (Core: Python v2.7 or higher).
  • Strong Data structures, algorithm design, problem-solving, and Quantitative analysis skills.
  • Knowledge of how to use microservices and APIs in code.
  • TDD unit test framework knowledge (preferably Python).
  • Strong and well-versed with Git basic and advanced concepts and their respective commands and should be able to handle merge conflicts.
  • Must have basic knowledge of web development technologies and should have worked on any web development framework.
  • SQL queries working knowledge.
  • Basic operating knowledge in some kind of project management tool like Jira.
  • Good to have:- Knowledge of EmberJs, C#, and .Net framework.

Read more
NA

NA

Agency job
via Method Hub by Sampreetha Pai
anywhere in India
4 - 5 yrs
₹18L - ₹22L / yr
SQL Azure
Apache Spark
DevOps
PySpark
skill iconPython
+1 more

Azure DE

Primary Responsibilities -

  • Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
  • Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure; create data models for analytics purposes
  • Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations
  • Use Azure Data Factory and Databricks to assemble large, complex data sets
  • Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data
  • Ensure data security and compliance
  • Collaborate with data engineers, and other stakeholders to understand requirements and translate them into scalable and reliable data platform architectures

Required skills:

  • Blend of technical expertise, analytical problem-solving, and collaboration with cross-functional teams
  • Azure DevOps
  • Apache Spark, Python
  • SQL proficiency
  • Azure Databricks knowledge
  • Big data technologies


The data engineers should be well versed in coding, Spark Core, and data ingestion using Azure, and should have good communication skills. They should also have core Azure DE skills and coding skills (PySpark, Python, and SQL).

Out of the 7 open positions, 5 require Azure Data Engineers with a minimum of 5 years of relevant data engineering experience.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Pune, Ahmedabad
4 - 9 yrs
₹10L - ₹35L / yr
skill iconPython
pytest
skill iconAmazon Web Services (AWS)
Test Automation (QA)
SQL

At least 5 years of experience in testing and developing automation tests.

A minimum of 3 years of experience writing tests in Python, with a preference for experience in designing automation frameworks.

Experience in developing automation for big data testing, including data ingestion, data processing, and data migration, is highly desirable.

Familiarity with Playwright or other browser application testing frameworks is a significant advantage.

Proficiency in object-oriented programming and principles is required.

Extensive knowledge of AWS services is essential.

Strong expertise in REST API testing and SQL is required.

A solid understanding of testing and development life cycle methodologies is necessary.

Knowledge of the financial industry and trading systems is a plus

Read more
Kreditventure

Kreditventure

Agency job
via Pluginlive by Harsha Saggi
Mumbai
7 - 9 yrs
₹20L - ₹25L / yr
Fullstack Developer
skill iconJava
skill iconPython
MERN Stack
SaaS
+4 more

Company: Kredit Venture

About the company:

KreditVenture is seeking a Technical Product Manager to lead the development, strategy, and execution of our SaaS applications built on top of loan origination systems and lending platforms. This role requires a strong technical background, a product ownership mindset, and the ability to drive execution through both in-house teams and outsourced vendors. The ideal candidate will play a key role in aligning business goals with technical implementation, ensuring a scalable, secure, and user-centric platform.

Job Description

Job Title: Senior Manager / AVP / DVP – Technical Product Manager


Location: Mumbai (Ghatkopar West)


Compensation: Upto 25 LPA


Experience: 7-8 years (Designation will be based on experience)


Qualification: 

- Bachelor’s degree in Computer Science, Engineering, or a related field.

- An MBA is a plus.


 Roles and Responsibilities


Technology Leadership:


  • Lead SaaS Platform Development – Strong expertise in full-stack development (Java, Python, MERN stack) and cloud-based architectures.
  • API & Workflow Design – Drive microservices-based REST API development and implement business process automation.
  • Third-Party Integrations – Enable seamless API integrations with external service providers.
  • Code Quality & Best Practices – Ensure code quality, security, and performance optimization through structured audits.


Vendor & Delivery Management:


  • Outsourced Vendor Oversight – Manage and collaborate with external development partners, ensuring high-quality and timely delivery.
  • Delivery Governance – Define SLAs, monitor vendor performance, and proactively escalate risks.
  • Quality Assurance – Ensure vendor deliverables align with product standards and integrate smoothly with internal development.


Collaboration & Stakeholder Engagement:


  • Customer Insights & Feedback – Conduct user research and feedback sessions to enhance platform capabilities.
  • Product Demos & GTM Support – Showcase platform features to potential clients and support sales & business development initiatives.


Platform Development & Compliance:


  • Component Libraries & Workflow Automation – Develop reusable UI components and enable no-code/low-code business workflows.
  • Security & Compliance – Ensure adherence to data protection, authentication, and regulatory standards (e.g., GDPR, PCI-DSS).
  • Performance Monitoring & Analytics – Define KPIs and drive continuous performance optimization.
Read more
Agivant Technologies

Agivant Technologies

Agency job
via Vidpro Consultancy Services by ashik thahir
Remote only
5 - 10 yrs
₹18L - ₹25L / yr
skill iconPython
SQL
Airflow
Snowflake
skill iconElastic Search
+3 more

Experience: 5-8 Years

Work Mode: Remote

Job Type: Full-time

Mandatory Skills: Python, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elasticsearch, and AWS.


Role Overview:

We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.


Responsibilities:

  • Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
  • Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
  • Build and optimize data pipelines using a variety of technologies, including Elastic Search, AWS S3, Snowflake, and NFS.
  • Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
  • Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
  • Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
  • Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum and emerging technologies in data engineering.
  • Contribute to the development and enhancement of our data warehouse architecture
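
A compact sketch of the ELT pattern and data-quality checks described above, with sqlite3 standing in for Snowflake: raw rows are loaded first, then transformed with SQL inside the "warehouse". The table names, sample rows, and validity rule are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("A1", "10.0"), ("A2", "oops"), ("A3", "5.5")],
)

# Transform inside the "warehouse": cast what is castable, drop the rest.
conn.executescript("""
CREATE TABLE orders AS
    SELECT order_id, CAST(amount AS REAL) AS amount
    FROM raw_orders
    WHERE amount GLOB '[0-9]*';
""")

# A trivial data-quality check after the transform.
good = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
assert good[0] == 2, "data-quality check failed: unexpected row count"
print(good)  # (2, 15.5)
```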

Required Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
  • At least 3 years of experience with Snowflake data warehousing technologies.
  • At least 3 years of experience creating and maintaining Airflow ETL pipelines.
  • Minimum of 3 years of professional experience with Python for data manipulation and automation.
  • Working experience with Elasticsearch and its application in data pipelines.
  • Proficiency in SQL and experience with data modelling techniques.
  • Strong understanding of cloud-based data storage solutions such as AWS S3.
  • Experience working with NFS and other file storage systems.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
3 - 6 yrs
₹8L - ₹13L / yr
skill iconAmazon Web Services (AWS)
Terraform
Ansible
skill iconDocker
Apache Kafka
+6 more

Must be:

  • Based in Mumbai
  • Comfortable with Work from Office
  • Available to join immediately


Responsibilities:

  • Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
  • Work with Kubernetes, Docker, Lambdas to build reliable, scalable infrastructure.
  • Build tools and automation using Python, Go, or relevant scripting languages.
  • Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
  • Optimize for performance and low-latency in real-time systems using Kafka, gRPC, RTP.
  • Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
  • Perform load testing using Gatling or JMeter, ensuring fault tolerance and high availability.
  • Collaborate with dev teams and participate in on-call rotations.


Requirements:

  • B.E./B.Tech in CS, Engineering or equivalent experience.
  • 3+ years in production infra and cloud-based systems.
  • Strong background in Linux (RHEL/CentOS) and shell scripting.
  • Experience managing hybrid infrastructure (cloud + on-prem).
  • Strong testing practices and code quality focus.
  • Experience leading teams is a plus.
Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
8 - 10 yrs
Best in industry
Engineering Management
skill iconJavascript
TypeScript
skill iconAngularJS (1.x)
skill iconReact.js
+7 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.


Brief Description:

We are looking for an Engineering Manager who combines technical depth with leadership strength. This role involves leading one or more product engineering pods, driving architecture decisions, ensuring delivery excellence, and working closely with stakeholders to build scalable web and mobile technology solutions. As a key part of our leadership team, you’ll play a pivotal role in mentoring engineers, improving processes, and fostering a culture of ownership, innovation, and continuous learning.


Roles and Responsibilities:

● Team Management: Lead, coach, and grow a team of 15-20 software engineers, tech leads, and QA engineers

● Technical Leadership: Guide the team in building scalable, high-performance web and mobile applications using modern frameworks and technologies

● Architecture Ownership: Architect robust, secure, and maintainable technology solutions aligned with product goals

● Project Execution: Ensure timely and high-quality delivery of projects by driving engineering best practices, agile processes, and cross-functional collaboration

● Stakeholder Collaboration: Act as a bridge between business stakeholders, product managers, and engineering teams to translate requirements into technology plans

● Culture & Growth: Help build and nurture a culture of technical excellence, accountability, and continuous improvement

● Hiring & Onboarding: Contribute to recruitment efforts, onboarding, and learning & development of team members


Requirements:

● 8+ years of software development experience, with 2+ years in a technical leadership or engineering manager role

● Proven experience in architecting and building web and mobile applications at scale

● Hands-on knowledge of technologies such as JavaScript/TypeScript, Angular, React, Node.js, .NET, Java, Python, or similar stacks

● Solid understanding of cloud platforms (AWS/Azure/GCP) and DevOps practices

● Strong interpersonal skills with a proven ability to manage stakeholders and lead diverse teams

● Excellent problem-solving, communication, and organizational skills

● Nice to haves:

○ Prior experience in working with startups or product-based companies

○ Experience mentoring tech leads and helping shape engineering culture

○ Exposure to AI/ML, data engineering, or platform thinking


Why Join Us?:

● Opportunity to work on a cutting-edge healthcare product

● A collaborative and learning-driven environment

● Exposure to AI and software engineering innovations

● Excellent work ethics and culture


If you're passionate about technology and want to work on impactful projects, we'd love to hear from you!

SparrowHost
Anant kumar
Posted by Anant kumar
Dhanbad
0 - 1 yrs
₹1.5L - ₹2L / yr
React.js
NodeJS (Node.js)
Laravel
Python

We're seeking a talented and motivated Full Stack Developer who is experienced in Node.js, React.js, and Laravel (PHP) to work on building high-performance web applications, APIs, SaaS platforms, and hosting tools. This role is perfect for someone who thrives in a dynamic environment and wants to contribute to modern, scalable platforms used by thousands of users.


Responsibilities

Develop and maintain full-stack applications using Node.js (backend), React.js (frontend), and Laravel (backend).


Design, implement, and document APIs and microservices for internal and customer-facing platforms.


Integrate third-party APIs, payment gateways, and other services as needed.


Collaborate with UI/UX designers to ensure a high-quality user experience.


Optimize applications for maximum speed and scalability.


Troubleshoot, debug, and upgrade existing software.


Participate in code reviews, sprint planning, and team meetings.


Pluginlive

at Pluginlive

1 recruiter
Harsha Saggi
Posted by Harsha Saggi
Chennai
2 - 4 yrs
₹15L - ₹20L / yr
Data engineering
Python
SQL
Google Cloud Platform (GCP)
Amazon Web Services (AWS)

What you’ll do

  • Design, build, and maintain robust ETL/ELT pipelines for product and analytics data
  • Work closely with business, product, analytics, and ML teams to define data needs
  • Ensure high data quality, lineage, versioning, and observability
  • Optimize performance of batch and streaming jobs
  • Automate and scale ingestion, transformation, and monitoring workflows
  • Document data models and key business metrics in a self-serve way
  • Use AI tools to accelerate development, troubleshooting, and documentation
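
The ETL/ELT work above can be sketched, framework-agnostically, as three small stages with a data-quality gate; in practice each stage would be wrapped in an Airflow or Prefect task. All function and field names here are illustrative, not from any specific codebase:

```python
# Illustrative stand-ins: extract() would read from a source system,
# load() would write to a warehouse table.

def extract(rows):
    """Pretend source: returns raw event dicts."""
    return list(rows)

def transform(raw):
    """Normalize fields and drop malformed records (the data-quality gate)."""
    clean = []
    for r in raw:
        if "user_id" not in r or r.get("amount") is None:
            continue  # a real pipeline would quarantine these, not drop them
        clean.append({"user_id": str(r["user_id"]), "amount": float(r["amount"])})
    return clean

def load(records, sink):
    """Append records to a sink; returns the number of rows loaded."""
    sink.extend(records)
    return len(records)

sink = []
raw = [{"user_id": 1, "amount": "9.5"}, {"amount": 3}, {"user_id": 2, "amount": 4}]
loaded = load(transform(extract(raw)), sink)
```

The same chaining carries over directly when each function becomes an orchestrated task.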


Must-Haves:

  • 2–4 years of experience as a data engineer (product or analytics-focused preferred)
  • Solid hands-on experience with Python and SQL
  • Experience with data pipeline orchestration tools like Airflow or Prefect
  • Understanding of data modeling, warehousing concepts, and performance optimization
  • Familiarity with cloud platforms (GCP, AWS, or Azure)
  • Bachelor's in Computer Science, Data Engineering, or a related field
  • Strong problem-solving mindset and AI-native tooling comfort (Copilot, GPTs)


NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore), Pune, Chennai
3 - 6 yrs
₹2L - ₹12L / yr
Test Automation (QA)
Automation
Software Testing (QA)
Generative AI
Selenium
+7 more

Job Title : Automation Quality Engineer (Gen AI)

Experience : 3 to 5+ Years

Location : Bangalore / Chennai / Pune


Role Overview :

We’re hiring a Quality Engineer to lead QA efforts for AI models, applications, and infrastructure.

You'll collaborate with cross-functional teams to design test strategies, implement automation, ensure model accuracy, and maintain high product quality.


Key Responsibilities :

  • Develop and maintain test strategies for AI models, APIs, and user interfaces.
  • Build automation frameworks and integrate into CI/CD pipelines.
  • Validate model accuracy, robustness, and monitor model drift.
  • Perform regression, performance, load, and security testing.
  • Log and track issues; collaborate with developers to resolve them.
  • Ensure compliance with data privacy and ethical AI standards.
  • Document QA processes and testing outcomes.
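
As a rough illustration of the model-drift responsibility, a minimal check compares accuracy on a recent window against a baseline and flags a drop beyond a tolerance; the threshold and names are hypothetical:

```python
def accuracy(predictions, labels):
    """Fraction of predictions that match the labels."""
    assert len(predictions) == len(labels)
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

def drift_alert(baseline_acc, recent_acc, tolerance=0.05):
    """True when recent accuracy fell more than `tolerance` below baseline."""
    return (baseline_acc - recent_acc) > tolerance

base = accuracy([1, 0, 1, 1], [1, 0, 1, 1])    # perfect on the baseline set
recent = accuracy([1, 0, 0, 0], [1, 0, 1, 1])  # degraded on recent traffic
```

A production check would run on scheduled windows and feed an alerting pipeline rather than return a boolean.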

Mandatory Skills :

  • Test Automation : Selenium, Playwright, or DeepEval
  • Programming/Scripting : Python, JavaScript
  • API Testing : Postman, REST Assured
  • Cloud & DevOps : Azure, Azure Kubernetes, CI/CD pipelines
  • Performance Testing : JMeter
  • Bug Tracking : Azure DevOps
  • Methodologies : Agile delivery experience
  • Soft Skills : Strong communication and problem-solving abilities
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
2 - 4 yrs
Best in industry
AWS Lambda
databricks
Database migration
Apache Kafka
Apache Spark
+3 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.

Brief Description:

We are looking for a talented Data Engineer to join our team. In this role, you will design, implement, and manage data pipelines, ensuring the accessibility and reliability of data for critical business processes. This is an exciting opportunity to work on scalable solutions that power data-driven decisions

Skillset:

Here is a list of some of the technologies you will work with (the list below is not set in stone)

Data Pipeline Orchestration and Execution:

● AWS Glue

● AWS Step Functions

● Databricks

Change Data Capture:

● Amazon Database Migration Service

● Amazon Managed Streaming for Apache Kafka with Debezium Plugin

Batch:

● AWS Step Functions (and Glue Jobs)

● Asynchronous queueing of batch job commands with RabbitMQ to various “ETL Jobs”

● Cron and supervisord processing on dedicated job server(s): Python & PHP

Streaming:

● Real-time processing via AWS MSK (Kafka), Apache Hudi, & Apache Flink

● Near real-time processing via worker (listeners) spread over AWS Lambda, custom server (daemons) written in Python and PHP Symfony

● Languages: Python & PySpark, Unix Shell, PHP Symfony (with Doctrine ORM)

● Monitoring & Reliability: Datadog & Cloudwatch
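
As a hedged sketch of the change-data-capture flow, the handler below applies Debezium-style events (op = c/u/d with before/after row images) to a local materialized view. The event shape mirrors Debezium's envelope; the handler itself is illustrative, not production code:

```python
def apply_change(view, event):
    """Apply one CDC event to `view`, a dict keyed by primary key."""
    op, after, before = event["op"], event.get("after"), event.get("before")
    if op in ("c", "u"):      # create or update: upsert the new row image
        view[after["id"]] = after
    elif op == "d":           # delete: remove by the old row image's key
        view.pop(before["id"], None)
    return view

view = {}
events = [
    {"op": "c", "after": {"id": 1, "status": "new"}},
    {"op": "u", "before": {"id": 1, "status": "new"}, "after": {"id": 1, "status": "paid"}},
    {"op": "d", "before": {"id": 1, "status": "paid"}},
]
for e in events:
    apply_change(view, e)
```

In the stack above, the events would arrive from MSK (Kafka) via the Debezium plugin and the "view" would be a Hudi table or warehouse target.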

Things you will do:

● Build dashboards using Datadog and Cloudwatch to ensure system health and user support

● Build schema registries that enable data governance

● Partner with end-users to resolve service disruptions and evangelize our data product offerings

● Vigilantly oversee data quality and alert upstream data producers of issues

● Support and contribute to the data platform architecture strategy, roadmap, and implementation plans to support the company’s data-driven initiatives and business objectives

● Work with Business Intelligence (BI) consumers to deliver enterprise-wide fact and dimension data product tables to enable data-driven decision-making across the organization.

● Other duties as assigned

Incubyte

at Incubyte

4 recruiters
Sarika Shitole
Posted by Sarika Shitole
Remote only
3 - 8 yrs
Best in industry
Python
React.js

About Us

We are a company where the ‘HOW’ of building software is just as important as the ‘WHAT.’ We partner with large organizations to modernize legacy codebases and collaborate with startups to launch MVPs, scale, or act as extensions of their teams. Guided by Software Craftsmanship values and eXtreme Programming Practices, we deliver high-quality, reliable software solutions tailored to our clients' needs.


We strive to: 

  • Bring our clients' dreams to life by being their trusted engineering partners, crafting innovative software solutions.
  • Challenge offshore development stereotypes by delivering exceptional quality, and proving the value of craftsmanship.
  • Empower clients to deliver value quickly and frequently to their end users.
  • Ensure long-term success for our clients by building reliable, sustainable, and impactful solutions.
  • Raise the bar of software craft by setting a new standard for the community.

Job Description

This is a remote position.

Our Core Values


  • Quality with Pragmatism: We aim for excellence with a focus on practical solutions.  
  • Extreme Ownership: We own our work and its outcomes fully.  
  • Proactive Collaboration: Teamwork elevates us all.  
  • Pursuit of Mastery: Continuous growth drives us.  
  • Effective Feedback: Honest, constructive feedback fosters improvement.  
  • Client Success: Our clients’ success is our success. 


Experience Level


This role is ideal for engineers with 3+ years of hands-on software development experience, particularly in Python and ReactJS at scale. 


Role Overview

If you’re a Software Craftsperson who takes pride in clean, test-driven code and believes in Extreme Programming principles, we’d love to meet you. At Incubyte, we’re a DevOps organization where developers own the entire release cycle, meaning you’ll get hands-on experience across programming, cloud infrastructure, client communication, and everything in between. Ready to level up your craft and join a team that’s as quality-obsessed as you are? Read on!   


What You'll Do

  • Write Tests First: Start by writing tests to ensure code quality 
  • Clean Code: Produce self-explanatory, clean code with predictable results 
  • Frequent Releases: Make frequent, small releases 
  • Pair Programming: Work in pairs for better results 
  • Peer Reviews: Conduct peer code reviews for continuous improvement 
  • Product Team: Collaborate in a product team to build and rapidly roll out new features and fixes 
  • Full Stack Ownership: Handle everything from the front end to the back end, including infrastructure and DevOps pipelines 
  • Never Stop Learning: Commit to continuous learning and improvement  
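
The "write tests first" practice can be illustrated with a toy example: the assertions are written first, and the implementation is the simplest code that makes them pass. The function is made up for illustration, not from any real codebase:

```python
# Tests-first workflow: the two asserts at the bottom were (notionally)
# written before slugify() existed and drove its implementation.

def slugify(title):
    """Lowercase, trim, and join words with hyphens."""
    return "-".join(title.strip().lower().split())

# The tests that drove the implementation:
assert slugify("Hello World") == "hello-world"
assert slugify("  Extreme   Programming  ") == "extreme-programming"
```

In pairing sessions, one person typically writes the next failing assertion and the other makes it pass.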





Requirements

What We're Looking For

  • Proficiency in some or all of the following: ReactJS, JavaScript, Object-Oriented Programming in JS
  • 3+ years of Object-Oriented Programming with Python or equivalent
  • 3+ years of experience working with relational (SQL) databases
  • 3+ years of experience using Git to contribute code as part of a team of Software Craftspeople




Benefits

What We Offer

  • Dedicated Learning & Development Budget: Fuel your growth with a budget dedicated solely to learning.
  • Conference Talks Sponsorship: Amplify your voice! If you’re speaking at a conference, we’ll fully sponsor and support your talk.
  • Cutting-Edge Projects: Work on exciting projects with the latest AI technologies
  • Employee-Friendly Leave Policy: Recharge with ample leave options designed for a healthy work-life balance.
  • Comprehensive Medical & Term Insurance: Full coverage for you and your family’s peace of mind.
  • And More: Extra perks to support your well-being and professional growth.

Work Environment 

  • Remote-First Culture: At Incubyte, we thrive on a culture of structured flexibility — while you have control over where and how you work, everyone commits to a consistent rhythm that supports their team during core working hours for smooth collaboration and timely project delivery. By striking the perfect balance between freedom and responsibility, we enable ourselves to deliver high-quality standards our customers recognize us by. With asynchronous tools and push for active participation, we foster a vibrant, hands-on environment where each team member’s engagement and contributions drive impactful results.
  • Work-In-Person: Twice a year, we come together for two-week sprints to collaborate in person, foster stronger team bonds, and align on goals. Additionally, we host an annual retreat to recharge and connect as a team. All travel expenses are covered.
  • Proactive Collaboration: Collaboration is central to our work. Through daily pair programming sessions, we focus on mentorship, continuous learning, and shared problem-solving. This hands-on approach keeps us innovative and aligned as a team.


Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Bengaluru (Bangalore)
5 - 8 yrs
₹12L - ₹22L / yr
Python
Django
Amazon Web Services (AWS)
Flask
Windows Azure

About the Role:


  • We are looking for a highly skilled and experienced Senior Python Developer to join our dynamic team based in Manyata Tech Park, Bangalore. The ideal candidate will have a strong background in Python development, object-oriented programming, and cloud-based application development. You will be responsible for designing, developing, and maintaining scalable backend systems using modern frameworks and tools.
  • This role is hybrid, with a strong emphasis on working from the office to collaborate effectively with cross-functional teams.


Key Responsibilities:

  • Design, develop, test, and maintain backend services using Python.
  • Develop RESTful APIs and ensure their performance, responsiveness, and scalability.
  • Work with popular Python frameworks such as Django or Flask for rapid development.
  • Integrate and work with cloud platforms (AWS, Azure, GCP or similar).
  • Collaborate with front-end developers and other team members to establish objectives and design cohesive code.
  • Apply object-oriented programming principles to solve real-world problems efficiently.
  • Implement and support event-driven architectures where applicable.
  • Identify bottlenecks and bugs, and devise solutions to mitigate and address these issues.
  • Write clean, maintainable, and reusable code with proper documentation.
  • Contribute to system architecture and code review processes.
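
A minimal sketch of the event-driven, asynchronous processing mentioned above, using asyncio's queue; names are illustrative, and a real system would sit behind a broker such as SQS or Kafka:

```python
import asyncio

async def worker(queue, results):
    """Consume events until a None sentinel arrives."""
    while True:
        event = await queue.get()
        if event is None:            # sentinel: shut the worker down
            queue.task_done()
            break
        results.append(event.upper())  # stand-in for real event handling
        queue.task_done()

async def main():
    queue, results = asyncio.Queue(), []
    task = asyncio.create_task(worker(queue, results))
    for e in ["created", "paid", "shipped"]:
        await queue.put(e)
    await queue.put(None)
    await queue.join()   # wait until every queued item is processed
    await task
    return results

results = asyncio.run(main())
```

The same shape scales to multiple workers by spawning several tasks against one queue.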


Required Skills and Qualifications:


  • Minimum of 5 years of hands-on experience in Python development.
  • Strong understanding of Object-Oriented Programming (OOP) and Data Structures.
  • Proficiency in building and consuming REST APIs.
  • Experience working with at least one cloud platform such as AWS, Azure, or Google Cloud Platform.
  • Hands-on experience with Python frameworks like Django, Flask, or similar.
  • Familiarity with event-driven programming and asynchronous processing.
  • Excellent problem-solving, debugging, and troubleshooting skills.
  • Strong communication and collaboration abilities to work effectively in a team environment.


Kanjurmarg, Mumbai
1 - 2 yrs
₹3L - ₹4L / yr
Embedded C
Raspberry Pi
Python
UART
3D modeling
+5 more

Roles and Responsibilities:

* Strong experience with programming microcontrollers like Arduino, ESP32, and ESP8266.

* Experience with Embedded C/C++.

* Experience with Raspberry Pi, Python, and OpenCV.

* Experience with low-power devices is preferred.

* Knowledge of communication protocols (UART, I2C, etc.).

* Experience with Wi-Fi, LoRa, GSM, M2M, SIMCom, and Quectel modules.

* Experience with 3D modeling (preferred).

* Experience with 3D printers (preferred).

* Experience with hardware design and knowledge of basic electronics.

* Software development experience is preferred.

Detailed job role (day-to-day tasks) of the IoT developer:


* Design hardware that meets the needs of the application.

* Support for current hardware, testing, and bug-fixing.

* Create, maintain, and document microcontroller code.

* Prototyping, testing, and soldering.

* Making 3D/CAD models for PCBs.
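
As a hedged illustration of the framing used on serial links such as UART, the sketch below builds and parses a hypothetical [start byte, length, payload, XOR checksum] packet; real modules define their own frame formats, so this is illustration only:

```python
START = 0xAA  # hypothetical start-of-frame marker

def build_frame(payload):
    """Frame a payload (list of ints) with length and XOR checksum."""
    chk = 0
    for b in payload:
        chk ^= b
    return bytes([START, len(payload)]) + bytes(payload) + bytes([chk])

def parse_frame(frame):
    """Return the payload bytes, or None if framing/checksum is invalid."""
    if len(frame) < 3 or frame[0] != START:
        return None
    length = frame[1]
    if len(frame) < length + 3:
        return None                      # truncated frame
    payload, chk = frame[2:2 + length], frame[2 + length]
    expected = 0
    for b in payload:
        expected ^= b
    return payload if chk == expected else None
```

On a microcontroller the same logic would run byte-by-byte off the UART receive buffer.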

Daten  Wissen Pvt Ltd

at Daten Wissen Pvt Ltd

1 recruiter
Ashwini poojari
Posted by Ashwini poojari
Mumbai
1.5 - 2.5 yrs
₹3L - ₹7L / yr
Computer Vision
Image Processing
Deep Learning
C++
Python
+1 more

Artificial Intelligence Researcher


Job description 


This is a full-time on-site role for an Artificial Intelligence Researcher at Daten & Wissen in Mumbai. The researcher will be responsible for conducting cutting-edge research in areas such as Computer Vision, Natural Language Processing, Deep Learning, and Time Series Predictions. The role involves collaborating with industry partners, developing AI solutions, and contributing to the advancement of AI technologies.


Key Responsibilities:

  • Design, develop, and implement computer vision algorithms for object detection, tracking, recognition, segmentation, and activity analysis.
  • Train and fine-tune deep learning models (CNNs, RNNs, Transformers, etc.) for various video and image-based tasks.
  • Work with large-scale datasets and annotated video data to enhance model accuracy and robustness.
  • Optimize and deploy models to run efficiently on edge devices, cloud environments, and GPUs.
  • Collaborate with cross-functional teams including data scientists, backend engineers, and UI/UX designers.
  • Continuously explore new research, tools, and technologies to enhance our product capabilities.
  • Perform model evaluation, testing, and benchmarking for accuracy, speed, and reliability.
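
One standard primitive behind the detection evaluation above is intersection-over-union (IoU) between axis-aligned boxes given as (x1, y1, x2, y2). This is the usual definition, shown in plain Python for illustration:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)   # zero if boxes don't overlap
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

Detector benchmarks typically count a detection as correct when IoU with a ground-truth box exceeds a threshold such as 0.5.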


Required Skills:

  • Proficiency in Python and C++.
  • Experience with object detection models like YOLO, SSD, Faster R-CNN.
  • Strong understanding of classical computer vision techniques (OpenCV, image processing, etc.).
  • Expertise in Machine Learning, Pattern Recognition, and Statistics.
  • Experience with frameworks like TensorFlow, PyTorch, MXNet.
  • Strong understanding of Deep Learning and Video Analytics.
  • Experience with CUDA, Docker, Nvidia NGC Containers, and cloud platforms (AWS, Azure, GCP).
  • Familiar with Kubernetes, Kafka, and model optimization for Nvidia hardware (e.g., TensorRT).


Qualifications

  • 2+ years of hands-on experience in computer vision and deep learning.
  • Computer Science and Data Science skills
  • Expertise in Pattern Recognition
  • Strong background in Research and Statistics
  • Proficiency in Machine Learning algorithms
  • Experience with AI frameworks such as TensorFlow or PyTorch
  • Excellent problem-solving and analytical skills

Location: Mumbai (Bhayandar)



NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
6 - 12 yrs
₹5L - ₹24L / yr
Python
Django
Amazon Web Services (AWS)
Redis
PostgreSQL
+2 more

Job Title : Python Django Developer

Experience : 6+ Years

Location : Gurgaon (Work from Office)

Job Type : Full-time

Working Days: Monday to Friday (5 Days)

Timings: 9:30 AM – 6:30 PM


Job Summary :

We are seeking a highly skilled and experienced Python Django Developer to join our dynamic team in Gurgaon.

The ideal candidate will have a strong background in backend development, Django frameworks, and RESTful API integration.

You will be responsible for building and maintaining scalable web applications and collaborating with cross-functional teams.


Key Responsibilities :

  • Develop, test, and maintain robust, scalable, high-performance web applications using Django and Python.
  • Design and implement RESTful APIs and integrate third-party APIs and services.
  • Write reusable, testable, and efficient code following best practices and coding standards.
  • Work with frontend developers to integrate user-facing elements using server-side logic.
  • Optimize applications for maximum speed and scalability.
  • Perform code reviews, troubleshoot issues, and ensure application performance and security.
  • Collaborate with product managers, designers, and other developers to deliver high-quality products.
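
For the third-party API integration responsibility, a small generic retry decorator with exponential backoff is a common pattern. This is plain Python, not a Django-specific API, and the attempt counts and delays are example values:

```python
import time

def retry(attempts=3, base_delay=0.01):
    """Retry a function on exception, doubling the delay each attempt."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if i == attempts - 1:
                        raise                       # out of attempts: re-raise
                    time.sleep(base_delay * (2 ** i))  # exponential backoff
        return wrapper
    return decorator

calls = {"n": 0}

@retry(attempts=3)
def flaky_fetch():
    """Hypothetical upstream call that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return {"status": "ok"}

result = flaky_fetch()
```

In production one would narrow the caught exception types and add jitter to the backoff.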

Required Skills & Qualifications :

  • 6+ Years of hands-on experience with Python and Django framework.
  • Strong experience with RESTful APIs, Django ORM, and PostgreSQL/MySQL.
  • Proficiency in version control systems like Git.
  • Familiarity with frontend technologies such as HTML, CSS, JavaScript, and AJAX.
  • Experience with Docker, Celery, Redis, and cloud platforms (AWS/Azure) is a plus.
  • Solid understanding of software development principles and design patterns.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration abilities.

Good to Have :

  • Experience with Django REST Framework (DRF).
  • Knowledge of unit testing and CI/CD pipelines.
  • Exposure to Agile development methodologies.
  • Familiarity with containerization and deployment tools (Docker, Kubernetes).
RaptorX.ai

at RaptorX.ai

4 candid answers
Parminder Kaur
Posted by Parminder Kaur
Hyderabad
2 - 4 yrs
₹5L - ₹10L / yr
Python
TensorFlow
PyTorch
Docker
Kubernetes

Job Description:

As a Machine Learning Engineer, you will:

  • Operationalize AI models for production, ensuring they are scalable, robust, and efficient.
  • Work closely with data scientists to optimize machine learning model performance.
  • Utilize Docker and Kubernetes for the deployment and management of AI models in a production environment.
  • Collaborate with cross-functional teams to integrate AI models into products and services.

Responsibilities:

  • Develop and deploy scalable machine learning models into production environments.
  • Optimize models for performance and scalability.
  • Implement continuous integration and deployment (CI/CD) pipelines for machine learning projects.
  • Monitor and maintain model performance in production.
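
A toy sketch of operationalizing models: an in-memory registry that versions model callables and serves the latest for prediction, standing in for the real model-registry and Kubernetes machinery a production deployment would use. All names are illustrative:

```python
class ModelRegistry:
    """Tiny stand-in for a versioned model store."""

    def __init__(self):
        self._versions = []

    def register(self, model):
        """Store a model callable; returns its 1-based version number."""
        self._versions.append(model)
        return len(self._versions)

    def predict(self, x, version=None):
        """Run the requested (default: latest) model version on input x."""
        model = self._versions[(version or len(self._versions)) - 1]
        return model(x)

registry = ModelRegistry()
registry.register(lambda x: x * 2)           # v1: a stand-in "model"
v2 = registry.register(lambda x: x * 2 + 1)  # v2: a retrained variant
```

Keeping old versions addressable is what makes rollback and A/B comparison cheap after a deployment.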

Key Performance Indicators (KPI) For Role:

  • Success in deploying scalable and efficient AI models into production.
  • Improvement in model performance and scalability post-deployment.
  • Efficiency in model deployment and maintenance processes.
  • Positive feedback from team members and stakeholders on AI model integration and performance.
  • Adherence to best practices in machine learning engineering and deployment.

Prior Experience Required:

  • 2-4 years of experience in machine learning or data science, with a focus on deploying machine learning models into production.
  • Proficient in Python and familiar with data science libraries and frameworks (e.g., TensorFlow, PyTorch).
  • Experience with Docker and Kubernetes for containerization and orchestration of machine learning models.
  • Demonstrated ability to optimize machine learning models for performance and scalability.
  • Familiarity with machine learning lifecycle management tools and practices.
  • Experience in developing and maintaining scalable and robust AI systems.
  • Knowledge of best practices in AI model testing, versioning, and deployment.
  • Strong understanding of data preprocessing, feature engineering, and model evaluation metrics.

Employer:

RaptorX.ai

Location:

Hyderabad

Collaboration:

The role requires collaboration with data engineers, software developers, and product managers to ensure the seamless integration of AI models into products and services.

Salary:

Competitive, based on experience.

Education:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Language Skills:

  • Strong command of Business English, both verbal and written, is required.

Other Skills Required:

  • Strong analytical and problem-solving skills.
  • Proficiency in code versioning tools, such as Git.
  • Ability to work in a fast-paced and evolving environment.
  • Excellent teamwork and communication skills.
  • Familiarity with agile development methodologies.
  • Understanding of cloud computing services (AWS, Azure, GCP) and their use in deploying machine learning models is a plus.

Other Requirements:

  • Proven track record of successfully deploying machine learning models into production.
  • Ability to manage multiple projects simultaneously and meet deadlines.
  • A portfolio showcasing successful AI/ML projects.


Founders and Leadership

RaptorX is led by seasoned founders with deep expertise in security, AI, and enterprise solutions. Our leadership team has held senior positions at global tech giants like Microsoft, Palo Alto Networks, Akamai, and Zscaler, solving critical problems at scale.

We bring not just technical excellence, but also a relentless passion for innovation and impact.


The Market Opportunity

Fraud costs the global economy trillions of dollars annually, and traditional fraud detection methods simply can't keep up. The demand for intelligent, adaptive solutions like RaptorX is massive and growing exponentially across industries like:

  • Fintech and Banking
  • E-commerce
  • Payments

This is your chance to work on a product that addresses a multi-billion-dollar market with huge growth potential.

The Tech Space at RaptorX

We are solving large-scale, real-world problems using modern technologies, offering specialized growth paths for every tech role.

Why You Should Join Us

  1. Opportunity to Grow: As an early-stage startup, every contribution you make will have a direct impact on the company’s growth and success. You’ll wear multiple hats, learn fast, and grow exponentially.
  2. Innovate Every Day: Solve complex, unsolved problems using the latest in AI, Graph Databases, and advanced analytics.
  3. Collaborate with the Best: Work alongside some of the brightest minds in the industry. Learn from leaders who have built and scaled successful products globally.
  4. Make an Impact: Help businesses reduce losses, secure customers, and prevent fraud globally. Your work will create a tangible difference.


OIP Insurtech

at OIP Insurtech

2 candid answers
Katarina Vasic
Posted by Katarina Vasic
Hyderabad
2 - 6 yrs
₹20L - ₹30L / yr
Python
Research and development
Natural Language Processing (NLP)
Data extraction

We are looking for a forward-thinking AI Engineer who thrives on pioneering advancements in AI technology and its practical applications within the insurance sector. This role is ideal for someone deeply engaged with AI research, enjoys exploring new technologies, and is adept at rapidly transitioning innovative concepts into real-world applications. Join our team to lead the integration of next-gen AI solutions that streamline and transform insurance processes.


Who You Are

An ideal candidate is someone deeply embedded in the AI research community, comfortable navigating uncharted tech territories, and skilled in translating complex theoretical ideas into testable and demonstrable prototypes. You are driven by curiosity and the challenge of solving complex problems that might one day reshape the insurance industry.



What We’re Looking For


● Education: Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, Data Science, or a related field. A PhD in AI or a related discipline is highly desirable.


● Experience:


○ Proven experience in AI research and implementation, with a deep understanding of both theoretical and practical aspects of AI.


○ Strong proficiency in machine learning (ML), data science, and deep learning techniques.


○ Hands-on experience with Python and ML libraries such as TensorFlow, PyTorch, Scikit-learn, etc.


○ Experience with data preprocessing, feature engineering, and data visualization.


○ Familiarity with cloud platforms such as AWS, Azure, or Google Cloud for AI and ML deployment.


○ Strong analytical and problem-solving skills.


○ Ability to translate AI research into practical applications and solutions.


○ Knowledge of AI model evaluation techniques and performance optimization.


○ Strong communication skills for presenting research and technical details to non-technical stakeholders.


○ Ability to work independently and in team environments.


Preferred Qualifications


● Experience working with natural language processing (NLP), computer vision, or reinforcement learning.


What You’ll Be Doing

  • Stay at the forefront of AI research to continually bring the latest innovations into our development pipeline. Your role will focus on experimenting with AI Agents and agentic AI, exploring how these technologies can enhance insurance workflows.
  • Rapidly prototype and validate your ideas, demonstrating feasibility and potential impact. Work with large, complex datasets to develop models that solve real business challenges. 
  • While the core of your role is research-oriented, collaboration with production teams is essential. You will provide them with tested, validated concepts and support the initial stages of production integration.
  • Partner with software developers, product managers, and other data scientists to ensure AI solutions are aligned with business needs
  • Leverage state-of-the-art AI tools, frameworks, and libraries to accelerate AI development.
  • Document AI research outcomes, development processes, and performance metrics. Present findings to stakeholders in an easily understandable manner.
OIP Insurtech

at OIP Insurtech

2 candid answers
Katarina Vasic
Posted by Katarina Vasic
Remote, Hyderabad
4 - 10 yrs
₹30L - ₹50L / yr
Python
Data extraction
Natural Language Processing (NLP)
TensorFlow
Large Language Models (LLM)
+1 more

What We’re Looking For


Proven experience as a Machine Learning Engineer, Data Scientist, or similar role


Expertise in applying machine learning algorithms, deep learning, and data mining techniques in an enterprise environment


Strong proficiency in Python (or other languages) and familiarity with libraries such as Scikit-learn, TensorFlow, PyTorch, or similar.


Experience working with natural language processing (NLP) or computer vision is highly desirable.


Understanding of and experience with MLOps, including model development, deployment, monitoring, and maintenance.


Experience with cloud platforms (like AWS, Google Cloud, or Azure) and knowledge of deploying machine learning models at scale.


Familiarity with data architecture, data engineering, and data pipeline tools.


Familiarity with containerization technologies such as Docker, and orchestration systems like Kubernetes.


Knowledge of the insurance sector is beneficial but not required.


Bachelor's/Master's degree in Computer Science, Data Science, Mathematics, or a related field.


What You’ll Be Doing

Algorithm Development:

Design and implement advanced machine learning algorithms tailored for our datasets.


Model Creation:

Build, train, and refine machine learning models for business integration.


Collaboration:

Partner with product managers, developers, and data scientists to align machine learning solutions with business goals.


Industry Innovation:

Stay updated with Insurtech trends and ensure our solutions remain at the forefront.


Validation:

Test algorithms for accuracy and efficiency, collaborating with the QA team.


Documentation:

Maintain clear records of algorithms and models for team reference.


Professional Growth:

Engage in continuous learning and mentor junior team members.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
4 - 8 yrs
₹20L - ₹35L / yr
skill iconPython
AWS
Amazon EC2
skill iconPostgreSQL
Service company preferred

Mandatory (Experience 1) - Must have a minimum 4+ years of experience in backend software development.

Mandatory (Experience 2) - Must have 4+ years of experience in backend development using Python (highly preferred), Java, or Node.js.

Mandatory (Experience 3) - Must have experience with cloud platforms like AWS (highly preferred), GCP, or Azure.

Mandatory (Experience 4) - Must have experience with any database - MySQL / PostgreSQL / Oracle / SQL Server / DB2 / MongoDB / Ne

Read more
Palcode.ai

at Palcode.ai

2 candid answers
Team Palcode
Posted by Team Palcode
Remote only
2 - 3 yrs
₹6L - ₹9L / yr
skill iconPython
FastAPI
skill iconAmazon Web Services (AWS)
skill iconPostgreSQL
Generative AI

At Palcode.ai, we're on a mission to fix the massive inefficiencies in pre-construction. Think about it - in a $10 trillion industry, estimators still spend weeks analyzing bids, project managers struggle with scattered data, and costly mistakes slip through complex contracts. We're fixing this with purpose-built AI agents that work. Our platform cuts pre-construction workflows from weeks to hours. It's not just about AI – it's about bringing real, measurable impact to an industry ready for change. We are backed by names like AWS for Startups, Upekkha Accelerator, and Microsoft for Startups.


Why Palcode.ai

  • Tackle Complex Problems: Build AI that reads between the lines of construction bids, spots hidden risks in contracts, and makes sense of fragmented project data
  • High-Impact Code: Your code won't sit in a backlog – it goes straight to estimators and project managers who need it yesterday
  • Tech Challenges That Matter: Design systems that process thousands of construction documents, handle real-time pricing data, and make intelligent decisions
  • Build & Own: Shape our entire tech stack, from data processing pipelines to AI model deployment
  • Quick Impact: Small team, huge responsibility. Your solutions directly impact project decisions worth millions
  • Learn & Grow: Master the intersection of AI, cloud architecture, and construction tech while working with founders who've built and scaled construction software

Your Role:

  • Design and build our core AI services and APIs using Python
  • Create reliable, scalable backend systems that handle complex data
  • Work on our web frontend using ReactJs
  • Knowledge of Redux, ReactJs, HTML, and CSS is a must
  • Help set up cloud infrastructure and deployment pipelines
  • Collaborate with our AI team to integrate machine learning models
  • Write clean, tested, production-ready code

You'll fit right in if:

  • You have 2 years of hands-on Python development experience
  • You have 2 years of hands-on ReactJs development experience
  • You're comfortable with full-stack development and cloud services
  • You write clean, maintainable code and follow good engineering practices
  • You're curious about AI/ML and eager to learn new technologies
  • You enjoy fast-paced startup environments and take ownership of your work

How we will set you up for success


  • You will work closely with the Founding team to understand what we are building.
  • You will be given comprehensive training on the tech stack, with the option of virtual training as well.
  • You will be involved in a monthly one-on-one with the founders to discuss feedback
  • A unique opportunity to learn from the best - we are Gold partners of AWS, Razorpay, and Microsoft Startup programs, having access to rich talent to discuss and brainstorm ideas.
  • You’ll have a lot of creative freedom to execute new ideas. As long as you can convince us, and you’re confident in your skills, we’re here to back you in your execution.


Location: Bangalore

Compensation: Competitive salary + Meaningful equity

If you get excited about solving hard problems that have real-world impact, we should talk.

All the best!


Read more
CLOUDSUFI

at CLOUDSUFI

3 recruiters
Ayushi Dwivedi
Posted by Ayushi Dwivedi
Remote only
6 - 13 yrs
₹35L - ₹45L / yr
Google Cloud Platform (GCP)
skill iconMachine Learning (ML)
Generative AI
skill iconPython
MLOps
+1 more

AI Architect


Location and Work Requirements

-      Position is based in KSA or UAE

-      Must be eligible to work abroad without restrictions

-      Regular travel within the region required


Key Responsibilities

-      7+ years of experience in the Data & Analytics domain, including at least 2 years as an AI Architect

-      Drive technical solution design engagements and implementations

-      Support customer implementations across various deployment modes (Public SaaS, Single-Tenant SaaS, and Self-Managed Kubernetes)

-      Provide advanced technical support, including deployment troubleshooting and coordinating with customer AI Architect and product development teams when needed

-      Guide customers in implementing generative AI solutions, including LLM integration, vector database management, and prompt engineering

-      Coordinate and oversee platform installations and configuration work

-      Assist customers with platform integration, including API implementation and custom model deployment

-      Establish and promote best practices for AI governance and MLOps

-      Proactively identify and address potential technical challenges before they impact customer success


Required Technical Skills

-      Strong programming skills in Python with experience in data processing libraries (Pandas, NumPy)

-      Proficiency in SQL and experience with various database technologies including MongoDB

-      Container technologies: Docker (build, modify, deploy) and Kubernetes (kubectl, helm)

-      Version control systems (Git) and CI/CD practices

-      Strong networking fundamentals (TCP/IP, SSH, SSL/TLS)

-      Shell scripting (Linux/Unix environments)

-      Experience working in on-prem, air-gapped environments

-      Experience with cloud platforms (AWS, Azure, GCP)


Required AI/ML Skills

-      Deep expertise in both predictive machine learning and generative AI technologies

-      Proven experience implementing and operationalizing large language models (LLMs)

-      Strong knowledge of vector databases, embedding technologies, and similarity search concepts

-      Advanced understanding of prompt engineering, LLM evaluation, and AI governance methods

-      Practical experience with machine learning deployment and production operations

-      Understanding of AI safety considerations and risk mitigation strategies
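
The similarity-search concepts listed above can be illustrated with a minimal sketch: ranking stored embeddings against a query vector by cosine similarity. This is a toy example, not the platform's actual stack - the document IDs, the 3-dimensional vectors, and the `top_k` helper are invented for illustration; a production system would use a real vector database and model-generated embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, index, k=2):
    """Rank stored embeddings by similarity to the query vector."""
    scored = [(doc_id, cosine_similarity(query, vec))
              for doc_id, vec in index.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Toy "vector database": document id -> embedding
index = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], index, k=2))
```

The same ranking logic underlies vector-database queries at scale; dedicated engines add approximate-nearest-neighbour indexes so the search stays fast across millions of embeddings.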



Required Qualities

-      Excellent English communication skills, with the ability to explain complex technical concepts; Arabic is advantageous.

-      Strong consultative approach to understanding and solving business problems

-      Proven ability to build trust through proactive customer engagement

-      Strong problem-solving abilities and attention to detail

-      Ability to work independently and as part of a distributed team

-      Willingness to travel within the Middle East & Africa region as needed 

Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
6 - 9 yrs
Best in industry
skill iconPython
skill iconJava
skill iconJavascript
Locust
Gatling
+14 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for a Lead SDET with 6-9 years of experience to play a pivotal role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest tools and solutions. This is an exciting opportunity to work on cutting-edge performance testing strategies and drive impactful initiatives across the organization.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Review application architecture and suggest improvements to enhance scalability
  • Leverage AI at appropriate layers to improve efficiency and drive positive business outcomes
  • Drive performance testing initiatives across the organization and ensure seamless execution
  • Automate the capturing of performance metrics and generate performance trend reports
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Collaborate with developers and architects to enhance frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Ensure high availability and reliability of applications and services


Requirements:

  • 6-9 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc.
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimizing frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively
  • Experience in increasing application/service availability from 99.9% (three 9s) to 99.99% or higher (four/five 9s)
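
The metric-analysis work described above typically begins with percentile summaries of response times. Here is a minimal, self-contained sketch using only the Python standard library - the simulated latency distribution and the p50/p95/p99 labels are assumptions for the example, not HighLevel's actual tooling.

```python
import random
import statistics

def latency_percentiles(samples_ms):
    """Summarize response-time samples into the percentiles most
    performance reports track (p50, p95, p99)."""
    # quantiles(n=100) returns the 99 cut points between percentiles
    cuts = statistics.quantiles(samples_ms, n=100)
    return {"p50": cuts[49], "p95": cuts[94], "p99": cuts[98]}

# Simulated samples: most requests fast, with a slow 5% tail
random.seed(7)
samples = ([random.gauss(120, 15) for _ in range(950)]
           + [random.gauss(600, 80) for _ in range(50)])
report = latency_percentiles(samples)
print({k: round(v, 1) for k, v in report.items()})
```

A healthy p50 alongside a p99 several times larger - as in this simulation - is the classic signature of a slow tail, which is usually where bottleneck analysis starts.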


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 8 yrs
₹3L - ₹27L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
snowflake
SQL

Key Responsibilities:

  • Design, develop, and maintain data pipelines on AWS.
  • Work with large-scale data processing using SQL, Python or PySpark.
  • Implement and optimize ETL processes for structured and unstructured data.
  • Develop and manage data models in Snowflake.
  • Ensure data security, integrity, and compliance on AWS cloud infrastructure.
  • Collaborate with cross-functional teams to support data-driven decision-making.

Required Skills:

  • Strong hands-on experience with AWS services 
  • Proficiency in SQL, Python, or PySpark for data processing and transformation.
  • Experience working with Snowflake for data warehousing.
  • Strong understanding of data modeling, data governance, and performance tuning.
  • Knowledge of CI/CD pipelines for data workflows is a plus.
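
To make the ETL responsibilities above concrete, here is a minimal sketch of a "transform" step in plain Python. This is an illustration only: pipelines at the scale this role describes would typically run on PySpark or Snowflake SQL, and the column names here are invented for the example.

```python
import csv
import io

def transform(rows):
    """Toy ETL transform: normalize values and drop bad records."""
    cleaned = []
    for row in rows:
        if not row["amount"].strip():       # drop rows missing an amount
            continue
        cleaned.append({
            "customer": row["customer"].strip().title(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

# Simulated extract: a small CSV with messy whitespace and a bad row
raw = io.StringIO("customer,amount\n alice ,10.25\nbob,\n CAROL ,3.5\n")
rows = list(csv.DictReader(raw))
print(transform(rows))
```

The same shape - validate, normalize, then hand off to a load step - carries over directly to PySpark DataFrames, where each of these operations becomes a column expression or filter.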


Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
4 - 7 yrs
Best in industry
skill iconPython
skill iconJava
Locust
Gatling
K6
+10 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website: https://www.gohighlevel.com/

YouTube Channel: https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post: https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for an SDET III with 4-7 years of experience to play a crucial role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest tools and solutions. This is a great opportunity to work on cutting-edge performance testing strategies and contribute to the success of our products.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Develop test strategies based on customer behavior to ensure high-performing applications
  • Automate the capturing of performance metrics and generate performance trend reports
  • Collaborate with developers and architects to optimize frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Ensure high availability and reliability of applications and services


Requirements:

  • 4-7 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc.
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimizing frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Read more
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs