50+ Remote SQL Jobs in India
Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!


Job Description:
As a Tally Developer, your main responsibility will be to develop custom solutions in Tally using TDL as per the customer requirements. You will work closely with clients, business analysts, Senior developers, and other stakeholders to understand their requirements and translate them into effective Tally-based solutions.
Responsibilities:
Collaborate with business analysts and senior developers/project managers to gather and analyze client requirements.
Design, develop, and customize Tally-based software solutions to meet the specific requirements of clients.
Write efficient and well-documented code in Tally Definition Language (TDL) to extend the functionality of Tally software.
Follow the Software Development Life Cycle including requirements gathering, design, coding, testing, and deployment.
Troubleshoot and debug issues related to Tally customization, data import/export, and software integrations.
Provide technical support and assistance to clients and end-users in utilizing and troubleshooting Tally-based software solutions.
Stay updated with the latest features and updates in Tally software to leverage new functionalities in solution development.
Adhere to coding standards, documentation practices, and quality assurance processes.
Requirements:
Any Degree. Relevant work experience may be considered in place of a degree.
Experience in Tally development and customization for projects using Tally Definition Language (TDL).
Hands-on experience in Tally and implementation of its features.
Familiarity with database systems, data structures, and SQL for efficient data management and retrieval.
Strong problem-solving skills and attention to detail.
Good communication and teamwork abilities.
Continuous learning mindset to keep up with advancements in Tally software and related technologies.
Key Skills Required:
TDL (Tally Definition Language), Tally, Excel, XML/JSON.
Good-to-have basic skills:
Databases such as MS SQL, MySQL
API Integration.
Work experience: minimum 2 years, maximum 7 years.
Interested candidates may WhatsApp their CV to TRIPLE NINE ZERO NINE THREE DOUBLE ONE DOUBLE FOUR.
Please answer the questions below:
Do you have knowledge of Tally Definition Language?
How much experience do you have with TDL?
The Consultant / Senior Consultant – Adobe Campaign is a technical role that involves providing consulting advice and support to clients implementing the Adobe Campaign solution, along with any technical advisory required afterwards. This is a client-facing role: the consultant liaises with the client, understands their technical and business requirements, and then implements Adobe Campaign so that the client gets the most value out of the solution. The consultant's main objective is to drive successful delivery and maintain a high level of customer satisfaction.
What you need to succeed
• Expertise and experience in SQL (Oracle / SQL Server / PostgreSQL)
• Programming experience (JavaScript / Java / VB / C# / PHP)
• Knowledge on Web Technologies like HTML, CSS would be a plus
• Good communication skills to ensure effective customer interactions, communications, and documentation
• Self-starter - Organized and highly motivated
• Fast learner, ability to learn new technologies/languages
• Knowledge of HTML DOM manipulation and page load events a plus
• Project Management skills a plus
• Ability to develop creative solutions to problems
• Able to multi-task in a dynamic environment
• Able to work independently with minimal supervision
• Experience leading team members will be a plus
Adobe is an equal opportunity/affirmative action employer. We welcome and encourage diversity in the workplace.



Title - Principal Software Engineer
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
Principal Software Engineer
Position Responsibilities :
- Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
- Develop scalable, performant APIs for Deltek products
- Accountability for the successful implementation of the requirements by the team.
- Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
- Undertake analysis, design, coding and testing activities of complex modules
- Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
- Participate in code reviews and provide mentorship to junior developers.
- Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React, and suggest optimisations based on them
- Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
- Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
- Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.
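For context on the API work described in the responsibilities above, here is a minimal sketch of a read endpoint in Python. FastAPI is used purely as an example framework (the posting names Python, AWS, and React, not a specific API framework), and the Project model and in-memory data are hypothetical placeholders.

# Minimal sketch of a read endpoint, using FastAPI as an example framework
# (the posting names Python and APIs generally, not FastAPI itself).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Project(BaseModel):
    id: int
    name: str

# Hypothetical in-memory store standing in for a PostgreSQL-backed repository.
PROJECTS = {1: Project(id=1, name="ERP migration")}

@app.get("/projects/{project_id}", response_model=Project)
def get_project(project_id: int) -> Project:
    project = PROJECTS.get(project_id)
    if project is None:
        raise HTTPException(status_code=404, detail="Project not found")
    return project

In a production service the in-memory dictionary would be replaced by a database layer, and authentication (e.g. OAuth2, as mentioned above) would protect the route.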
Qualifications :
- A college degree in Computer Science, Software Engineering, Information Science or a related field is required
- Minimum 8-10 years of experience and sound programming skills in Python, the .NET platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (e.g., PostgreSQL)
- Experience in backend development and Apache Airflow (or equivalent framework).
- Experience building APIs and optimizing SQL queries with performance considerations.
- Experience with Agile Development
- Experience in writing and maintaining unit tests and using testing frameworks is desirable
- Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
- Strong desire to continually improve knowledge and skills through personal development activities, and to apply them to continuous software improvement.
- The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
- Strong problem-solving and debugging skills.
- Ability to work in an Agile environment and collaborate with cross-functional teams.
- Familiarity with version control systems like Git.
- Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.


Title - Sr Software Engineer
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
External Job Title :
Sr Software Engineer
Position Responsibilities :
- Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
- Develop scalable, performant APIs for Deltek products
- Accountability for the successful implementation of the requirements by the team.
- Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
- Undertake analysis, design, coding and testing activities of complex modules
- Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
- Participate in code reviews and provide mentorship to junior developers.
- Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React.
- Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
- Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
- Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.
Qualifications :
- A college degree in Computer Science, Software Engineering, Information Science or a related field is required
- Minimum 4-6 years of experience and sound programming skills in Python, the .NET platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (e.g., PostgreSQL)
- Experience in backend development and Apache Airflow (or equivalent framework).
- Experience building APIs and optimizing SQL queries with performance considerations.
- Experience with Agile Development
- Experience in writing and maintaining unit tests and using testing frameworks is desirable
- Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
- Strong desire to continually improve knowledge and skills through personal development activities, and to apply them to continuous software improvement.
- The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
- Strong problem-solving and debugging skills.
- Ability to work in an Agile environment and collaborate with cross-functional teams.
- Familiarity with version control systems like Git.
- Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.
Proficient in Looker Actions, Looker dashboarding, Looker data entry, LookML, SQL queries, BigQuery, Looker Studio, and GCP.
Remote Working
2 pm to 12 am IST or
10:30 AM to 7:30 PM IST
Sunday to Thursday
Responsibilities:
● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.
● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.
● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.
● Write complex SQL queries when necessary to extract and manipulate data from underlying databases, and optimize those queries for performance.
● Connect Looker to various data sources, including databases, data warehouses, and external APIs.
● Identify and address bottlenecks that affect report and dashboard loading times, and optimize Looker performance by tuning queries, caching strategies, and exploring indexing options.
● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.
● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems.
● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.
● Provide training and support to business users, helping them navigate and use Looker effectively.
● Diagnose and resolve technical issues related to Looker, data models, and reports.
Skills Required:
● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.
● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)
● Knowledge of data modeling best practices
● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.
● Previous experience in Finops engagements is a plus
● Proficiency in ETL processes for data transformation and preparation.
● Ability to create effective data visualizations and reports using Looker’s dashboard tools.
● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.
● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery ), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).
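As an illustration of the SQL/BigQuery billing analysis this role involves, below is a minimal Python sketch that queries a GCP billing export table using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical placeholders, not a specific customer environment.

# Minimal sketch: querying GCP billing export data in BigQuery from Python.
# The project, dataset, and table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT service.description AS service,
           SUM(cost) AS total_cost
    FROM `my-project.billing.gcp_billing_export_v1`
    WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
    GROUP BY service
    ORDER BY total_cost DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(f"{row.service}: {row.total_cost:.2f}")

The same query could back a LookML measure or a Looker Studio report once the aggregate logic is validated.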
Job Summary:
SiGa Systems is looking for a skilled and motivated Software Developer with expertise in NetSuite API and ODBC integrations. The ideal candidate will design, develop, and maintain robust data integration solutions to seamlessly move data between NetSuite and external database systems. This role demands a deep understanding of NetSuite’s data model, SuiteTalk APIs, ODBC connectivity, and strong programming skills for data manipulation and integration.
Key Responsibilities:
1. NetSuite API Development
- Design and implement custom integrations using NetSuite SuiteTalk REST and SOAP APIs.
- Develop efficient, scalable scripts using SuiteScript 1.0 and 2.x.
- Build and maintain Suitelets, Scheduled Scripts, User Event Scripts, and other custom NetSuite components.
- Troubleshoot and resolve issues related to NetSuite API connections and data workflows.
2. ODBC Data Integration
- Set up and manage ODBC connections for accessing NetSuite data.
- Write complex SQL queries and stored procedures for ETL (Extract, Transform, Load) processes (see the sketch at the end of this posting).
- Design and execute data synchronization workflows between NetSuite and external databases (e.g., SQL Server, MySQL, PostgreSQL).
- Ensure optimal performance and data accuracy across systems.
3. Data Modeling & Database Management
- Analyze NetSuite data models and design efficient schemas for target systems.
- Perform data mapping, transformation, and migration tasks.
- Ensure data consistency and integrity throughout integration pipelines.
- Monitor database performance and maintain system reliability.
4. Software Development & Documentation
- Write clean, maintainable, and well-documented code.
- Participate in code reviews and contribute to coding best practices.
- Maintain technical documentation, including API specs, integration flows, and data mapping docs.
- Use version control systems (e.g., Git) for collaboration and code management.
5. Collaboration & Communication
- Work closely with business analysts, project managers, and cross-functional teams to understand integration requirements.
- Provide technical guidance and regular progress updates to stakeholders.
- Participate actively in Agile development processes and contribute to sprint planning and retrospectives.
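A minimal sketch of the ODBC-based extraction described under "ODBC Data Integration" above, assuming the pyodbc package, the NetSuite ODBC driver, and a configured DSN named "NetSuite"; the table and column names are illustrative rather than the exact NetSuite schema.

# Minimal sketch of pulling NetSuite data over ODBC for an ETL step.
# Assumes a configured ODBC DSN named "NetSuite"; names are illustrative.
import pyodbc

conn = pyodbc.connect("DSN=NetSuite;UID=integration_user;PWD=secret")
cursor = conn.cursor()

cursor.execute(
    """
    SELECT transaction_id, tran_date, amount
    FROM transactions
    WHERE tran_date >= ?
    """,
    ("2024-01-01",),
)

rows = cursor.fetchall()
# ...transform and load the rows into the target database
# (SQL Server, MySQL, PostgreSQL) as part of the synchronization workflow.
conn.close()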
Job Description:
We are looking for an experienced PL/SQL Developer to join our team. The ideal candidate should have a strong background in database development and optimization, with hands-on experience in writing complex PL/SQL code and working with large-scale databases.
Key Responsibilities:
- Design, develop, and optimize PL/SQL procedures, functions, packages, and triggers.
- Analyze business requirements and translate them into technical specifications.
- Write efficient SQL queries and improve existing code for performance.
- Perform data analysis and troubleshooting for production issues.
- Collaborate with application developers and business analysts to integrate database logic into applications.
- Ensure database security, integrity, and backup procedures are followed.
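To illustrate how the PL/SQL procedures described above are typically exercised from application code, here is a minimal Python sketch using the python-oracledb driver; the procedure name refresh_customer_balances and the connection details are hypothetical placeholders.

# Minimal sketch: invoking a (hypothetical) PL/SQL procedure from Python
# using the python-oracledb driver.
import datetime
import oracledb

conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    # callproc passes the bind values positionally to the stored procedure.
    cur.callproc("refresh_customer_balances", [datetime.date(2024, 1, 31)])
conn.commit()
conn.close()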
Required Skills:
- Strong experience in Oracle PL/SQL development.
- Expertise in writing complex SQL queries, stored procedures, and performance tuning.
- Good understanding of database design and data modeling.
- Experience with version control and deployment tools.
- Familiarity with ETL processes and tools is a plus.
- Strong problem-solving and analytical skills.
- Good communication and collaboration skills.
Preferred Qualifications:
- Experience working in Agile/Scrum environments.
- Exposure to cloud databases or migration projects is an advantage.
Job Title- Senior Full Stack Web Developer
Job location- Bangalore/Hybrid
Availability- Immediate Joiners
Experience Range- 5-8yrs
Desired skills - Java, AWS, SQL/NoSQL, JavaScript, Node.js (good to have)
We are looking for a Senior Full Stack Web Developer (Java) with 8-10 years of experience.
- Working on different aspects of the core product and associated tools, (server-side or user-interfaces depending on the team you'll join)
- Expertise as a full stack software engineer on large-scale, complex software systems, with 8+ years of experience in technologies such as Java, relational and non-relational databases, Node.js, and AWS Cloud
- Assisting with in-life maintenance, testing, debugging and documentation of deployed services
- Coding & designing new features
- Creating the supporting functional and technical specifications
- Deep understanding of system architecture and distributed systems
- Stay updated with the latest services, tools, and trends, and implement innovative solutions that contribute to the company's growth
Job Title- Senior Java Developer
Exp Range- 8-10 yrs
Location- Bangalore/ Hybrid
Desired skill- Java 8, Microservices (Must), AWS, Kafka, Kubernetes
What you will bring:
● Strong core Java, concurrency and server-side experience
● 8 + Years of experience with hands-on coding.
● Strong Java 8 and Microservices (must).
● Should have a good understanding of AWS/GCP
● Kafka, AWS stack/Kubernetes
● An understanding of Object Oriented Design and standard design patterns.
● Experience of multi-threaded, 3-tier architectures/Distributed architectures, web services and caching.
● A familiarity with SQL databases
● Ability and willingness to work in a global, fast-paced environment.
● Flexible with the ability to adapt working style to meet objectives.
● Excellent communication and analytical skills
● Ability to effectively communicate with team members
● Experience in the following technologies would be beneficial but not essential: Spring Boot, AWS, Kubernetes, Terraform, Redis
Real-World Evidence (RWE) Analyst
Summary:
As an experienced Real-World Evidence (RWE) Analyst, you will leverage our cutting-edge healthcare data platform (accessing over 60 million lives in Asia, with ambitious growth plans across Africa and the Middle East) to deliver impactful clinical insights to our pharmaceutical clients. You will be involved in the full project lifecycle, from designing analyses to execution and delivery, within our agile data science team. This is an exciting opportunity to contribute significantly to a growing early-stage company focused on improving precision medicine and optimizing patient care for diverse populations.
Responsibilities:
· Contribute to the design and execution of retrospective and prospective real-world research, including epidemiological and patient outcomes studies.
· Actively participate in problem-solving discussions by clearly defining issues and proposing effective solutions.
· Manage the day-to-day progress of assigned workstreams, ensuring seamless collaboration with the data engineering team on analytical requests.
· Provide timely and clear updates on project status to management and leadership.
· Conduct in-depth quantitative and qualitative analyses, driven by project objectives and your intellectual curiosity.
· Ensure the quality and accuracy of analytical outputs, and contextualize findings by reviewing relevant published research.
· Synthesize complex findings into clear and compelling presentations and written reports (e.g., slides, documents).
· Contribute to the development of standards and best practices for future RWE analyses.
Requirements:
· Undergraduate or post-graduate degree (MS or PhD preferred) in a quantitative analytical discipline such as Epidemiology, (Bio)statistics, Data Science, Engineering, Econometrics, or Operations Research.
· 8+ years of relevant work experience demonstrating:
o Strong analytical and problem-solving capabilities.
o Experience conducting research relevant to the pharmaceutical/biotech industry.
· Proficiency in technical skills including SQL and at least one programming language (R, Python, or similar).
· Solid understanding of the healthcare/medical and pharmaceutical industries.
· Proven experience in managing workstream or project management activities.
· Excellent written and verbal communication, and strong interpersonal skills with the ability to build collaborative partnerships.
· Exceptional attention to detail.
· Proficiency in Microsoft Office Suite (Excel, PowerPoint, Word).
Other Desirable Skills:
· Demonstrated dedication to teamwork and the ability to collaborate effectively across different functions.
· A strong desire to contribute to the growth and development of the RWE analytics function.
· A proactive and innovative mindset with an entrepreneurial spirit, eager to take on a key role in a dynamic, growing company.
Key Responsibilities:
- Design, develop, and optimize data pipelines using Snowflake.
- Develop complex SQL queries and stored procedures for data transformation and analysis.
- Implement and maintain enterprise data warehouse solutions.
- Ensure data quality, integrity, and consistency across multiple data sources.
- Collaborate with business and analytics teams to understand data requirements.
- Monitor and improve performance of data loads and transformations.
- Implement security and data governance best practices within the Snowflake environment.
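As a small illustration of the transformation work described above, the sketch below runs a SQL transformation in Snowflake from Python using snowflake-connector-python; the account, credentials, warehouse, and table names are all placeholders.

# Minimal sketch: running a transformation query in Snowflake from Python.
# All connection details and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="etl_user",
    password="secret",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        INSERT INTO ANALYTICS.MARTS.DAILY_SALES (sale_date, total_amount)
        SELECT order_date, SUM(amount)
        FROM ANALYTICS.STAGING.ORDERS
        GROUP BY order_date
        """
    )
finally:
    conn.close()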
Required Skills & Qualifications:
- 7+ years of experience in Data Engineering or related field.
- Strong hands-on experience with Snowflake.
- Expert-level proficiency in SQL and query optimization.
- Deep understanding of data warehouse concepts, data modeling, and ETL/ELT processes.
- Experience with cloud platforms (e.g., AWS, Azure, or GCP).
- Familiarity with data integration tools (e.g., Informatica, Matillion, or DBT) is a plus.
- Strong problem-solving and analytical skills.
Preferred Qualifications:
- Experience with Python or other scripting languages.
- Knowledge of data governance and security frameworks.
- Experience working in Agile/Scrum environments.

Experience: 5-8 Years
Work Mode: Remote
Job Type: Full-time
Mandatory Skills: Python, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elastic Search, and AWS.
Role Overview:
We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.
Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
- Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using a variety of technologies, including Elastic Search, AWS S3, Snowflake, and NFS.
- Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
- Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
- Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
- Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum and emerging technologies in data engineering.
- Contribute to the development and enhancement of our data warehouse architecture
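To give a flavour of the pipeline work described above, here is a minimal Apache Airflow DAG skeleton for a daily ELT run from S3 into Snowflake; the DAG id, task bodies, and schedule are illustrative only, not a production pipeline.

# Minimal sketch of a daily ELT DAG skeleton in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_s3(**context):
    """Pull the day's raw files from S3 (e.g. via boto3) into a staging area."""


def load_into_snowflake(**context):
    """COPY the staged files into Snowflake and run transformation SQL."""


with DAG(
    dag_id="daily_orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_s3", python_callable=extract_from_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)

    extract >> load

In practice each task would use Airflow connections and hooks rather than hard-coded credentials, and data quality checks would run as additional downstream tasks.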
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
- At least 3+ years of experience with Snowflake data warehousing technologies.
- At least 3+ years of experience creating and maintaining Airflow ETL pipelines.
- Minimum 3+ years of professional-level experience with Python for data manipulation and automation.
- Working experience with Elastic Search and its application in data pipelines.
- Proficiency in SQL and experience with data modelling techniques.
- Strong understanding of cloud-based data storage solutions such as AWS S3.
- Experience working with NFS and other file storage systems.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Requirements:
5+ years of professional experience as a React.js Developer
Strong proficiency in React.js and Node.js
Solid understanding of JavaScript (ES6+), HTML5, CSS3, and TypeScript.
Proficient with version control systems like Git.
Experience with modern build tools and package managers (e.g., Webpack, Babel, NPM/Yarn).
Familiarity with API integration (REST, GraphQL).
Strong understanding of web performance optimization techniques.
Ability to work in an Agile environment, collaborate with cross-functional teams, and deliver high-quality code under tight deadlines.
About Sun King
Sun King is the world’s leading off-grid solar energy company, delivering energy access to 1.8 billion people without reliable grid connections through innovative product design, fintech solutions, and field operations.
Key highlights:
- Connected over 20 million homes to solar power across Africa and Asia, adding 200,000 homes monthly.
- Affordable ‘pay-as-you-go’ financing model; after 1-2 years, customers own their solar equipment.
- Saved customers over $4 billion to date.
- Collect 650,000 daily payments via 28,000 field agents using mobile money systems.
- Products range from home lighting to high-energy appliances, with expansion into clean cooking, electric mobility, and entertainment.
With 2,800 staff across 12 countries, our team includes experts in various fields, all passionate about serving off-grid communities.
Diversity Commitment:
44% of our workforce are women, reflecting our commitment to gender diversity.
About the role:
The Backend Developer works remotely as part of the technology team to help Sun King’s EasyBuy business unit design and develop software to improve its field team operations.
What you will be expected to do
- Design and develop applications/systems based on wireframes and product requirements documents.
- Design and develop logical and physical data models to meet application requirements.
- Identify and resolve bottlenecks and bugs based on operational requirements.
- Perform unit tests on code to ensure robustness, including edge cases, usability, and general reliability.
- Write reusable and easily maintainable code following the principles of DRY (Don’t Repeat Yourself).
- Integrate existing tools and business systems, both in-house and external services, such as ticketing software and communication tools.
- Collaborate with team members and product managers to understand project requirements and contribute to the overall system design.
You might be a strong candidate if you have/are
- Have 1-2 years of backend development experience, with strong problem-solving abilities and proficiency in data structures and algorithms.
- Have a profound grasp of object-oriented programming (OOP) standards and expertise in Core Java.
- Have knowledge of SQL, MySQL, or similar database management.
- Have experience integrating web services (SOAP, REST) and data formats (JSON, XML).
- Have familiarity with RESTful APIs for linking Android applications to backend services.
- Have experience with version control systems like Git (preferred, but not mandatory).
- Additional knowledge of web technologies like HTML, CSS, and JavaScript, and of frameworks like Spring or Hibernate, would be advantageous.
What we offer (in addition to compensation and statutory benefits):
- A platform for professional growth in a rapidly expanding, high-impact sector.
- Immerse in a collaborative culture, energized by employees of Sun King who are collectively motivated by fostering a transformative, sustainable venture.
- A genuinely global environment: Engage and learn alongside a diverse group from varied geographies and backgrounds.
- Tailored learning pathways through the Sun King Center for Leadership to elevate your leadership and managerial capabilities.
Overview
As an engineer in the Service Operations division, you will be responsible for the day-to-day management of the systems and services that power client products. Working with your team, you will ensure daily tasks and activities are successfully completed and where necessary, use standard operating procedures and knowledge to resolve any faults/errors encountered.
Job Description
Key Tasks and Responsibilities:
Ensure daily tasks and activities have completed successfully; where this is not the case, undertake recovery and remediation steps.
Undertake patching and upgrade activities in support of ParentPay compliance programs: PCI DSS, ISO 27001, and Cyber Essentials+.
Action requests from the ServiceNow work queue that have been allocated to your relevant resolver group. These include incidents, problems, changes and service requests.
Investigate alerts and events detected from the monitoring systems that indicate a change in component health.
Create and maintain support documentation in the form of departmental wiki and ServiceNow knowledge articles that allow for continual improvement of fault detection and recovery times.
Work with colleagues to identify and champion the automation of all manual interventions undertaken within the team.
Attend and complete all mandatory training courses.
Engage and own the transition of new services into Service Operations.
Participate in the out-of-hours on-call support rota.
Qualifications and Experience:
Experience working in an IT service delivery or support function OR
MBA or Degree in Information Technology or Information Security.
Experience working with Microsoft technologies.
Excellent communication skills developed working in a service centric organisation.
Ability to interpret fault descriptions provided by customers or internal escalations and translate these into resolutions.
Ability to manage and prioritise own workload.
Experience working within Education Technology would be an advantage.
Technical knowledge:
Advanced automation scripting using Terraform and Powershell.
Knowledge of Bicep and Ansible is advantageous.
Advanced Microsoft Active Directory configuration and support.
Microsoft Azure and AWS cloud hosting platform administration.
Advanced Microsoft SQL Server experience.
Windows Server and desktop management and configuration.
Microsoft IIS web services administration and configuration.
Advanced management of data and SQL backup solutions.
Advanced scripting and automation capabilities.
Advanced knowledge of Azure analytics and KQL.
Skills & Requirements
IT Service Delivery, Information Technology, Information Security, Microsoft Technologies, Communication Skills, Fault Interpretation, Workload Prioritization, Automation Scripting, Terraform, PowerShell, Microsoft Active Directory, Microsoft Azure, AWS, Microsoft SQL Server, Windows Server, Windows Desktop Configuration, Microsoft IIS, Data Backup Management, SQL Backup Solutions, Scripting, Azure Analytics, KQL.
Job Summary:
We are seeking an experienced MS-SQL Server Database Developer with 5+ years of hands-on experience in creating and managing database objects such as stored procedures, functions, triggers, and SSRS reports. The ideal candidate will be responsible for ensuring optimal performance, maintenance, and development of MS-SQL databases, including backup and restoration processes.
Key Responsibilities:
- Design, develop, and maintain stored procedures, functions, and triggers to support application functionality.
- Create and optimize complex queries for efficient data retrieval and manipulation.
- Develop reports using SQL Server Reporting Services (SSRS) or other reporting tools.
- Perform database backup, restoration, and recovery as per defined processes.
- Troubleshoot database-related issues and provide solutions for improving performance.
- Collaborate with development and QA teams to ensure database solutions meet business requirements.
- Ensure the security and integrity of data across all database environments.
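As an illustration of how the stored procedures described above are typically consumed by application code, the sketch below calls a SQL Server procedure from Python via pyodbc; the procedure name, parameters, connection string, and result columns are hypothetical.

# Minimal sketch: calling a (hypothetical) SQL Server stored procedure via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=db-host;DATABASE=Sales;UID=report_user;PWD=secret"
)
cursor = conn.cursor()
cursor.execute("EXEC dbo.usp_GetMonthlySales @Year = ?, @Month = ?", (2024, 6))
for row in cursor.fetchall():
    # pyodbc rows support attribute access by column name.
    print(row.Region, row.TotalSales)
conn.close()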
Required Skills & Qualifications:
- 5+ years of experience working with MS-SQL Server (2016 or later).
- Strong expertise in writing complex T-SQL queries, stored procedures, functions, and triggers.
- Experience with report generation using SSRS or other reporting tools.
- Hands-on experience in backup, restoration, and database recovery processes.
- Familiarity with performance tuning and optimization techniques.
- Ability to work in a fast-paced environment and manage multiple tasks efficiently.
- Strong problem-solving and troubleshooting skills.
Role: Data Engineer
Industry Type: IT Services & Consulting
Department: Engineering - Software & QA
Employment Type: Full Time, 6M - 1Yr
Role Category: Software Development
Education
UG: BCA in Any Specialization, B.Tech/B.E. in Any Specialization
PG: M.Tech in Any Specialization

Overview
Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.
Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, BigQuery development working on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities:
Development and maintenance of data pipelines and automation scripts with Python.
Creation of data queries and optimization of database processes with SQL.
Use of bash scripts for system administration, automation and deployment processes.
Database and cloud technologies.
Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake).
Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
Composer (Airflow): Orchestration of data pipelines for ETL processes.
Cloud Functions: Development of serverless functions for data processing and automation.
Cloud Scheduler: Planning and automation of recurring cloud jobs.
Cloud Secret Manager: Secure storage and management of sensitive access data and API keys.
BigQuery: Processing, analyzing and querying large amounts of data in the cloud.
Cloud Storage: Storage and management of structured and unstructured data.
Cloud monitoring: monitoring the performance and stability of cloud-based applications.
Data visualization and reporting.
Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI.
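As a small example of the serverless data-processing piece mentioned above (Cloud Functions together with BigQuery), here is a minimal HTTP-triggered function in the functions-framework style; the project, dataset, and table names are placeholders.

# Minimal sketch: an HTTP-triggered Cloud Function that refreshes a BigQuery
# aggregate table (e.g. invoked on a schedule by Cloud Scheduler).
import functions_framework
from google.cloud import bigquery


@functions_framework.http
def refresh_daily_aggregates(request):
    """Recompute a daily aggregate table in BigQuery; names are placeholders."""
    client = bigquery.Client()
    client.query(
        """
        CREATE OR REPLACE TABLE `my-project.reporting.daily_orders` AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM `my-project.raw.orders`
        GROUP BY order_date
        """
    ).result()
    return "ok", 200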
Requirements:
Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.
Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.
Experience of a minimum of 2 years with at least one hyperscaler (ideally GCP), combined with cloud storage technologies, cloud monitoring, and cloud secret management.
Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Skills & Requirements
SQL, BigQuery, GCP, Python, MongoDB, Exasol, Snowflake, Bash scripting, Airflow, Cloud Functions, Cloud Scheduler, Cloud Secret Manager, Cloud Storage, Cloud Monitoring, ETL, Data Pipelines, Power BI, Database Optimization, Cloud-Based BI Solutions, Data Processing, Data Automation, Agile Methodologies, Cross-Functional Collaboration.

Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, BigQuery development working on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
- Database and cloud technologies
- Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
- Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
- Composer (Airflow): Orchestration of data pipelines for ETL processes
- Cloud Functions: Development of serverless functions for data processing and automation
- Cloud Scheduler: Planning and automation of recurring cloud jobs
- Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
- BigQuery: Processing, analyzing and querying large amounts of data in the cloud
- Cloud Storage: Storage and management of structured and unstructured data
- Cloud monitoring: monitoring the performance and stability of cloud-based applications
- Data visualization and reporting
- Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.
- Experience of a minimum of 2 years with at least one hyperscaler (ideally GCP), combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Overview
adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.
Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.
Job Description
We are seeking a skilled Cloud Data Engineer with experience in cloud data platforms like AWS or Azure, and especially Snowflake and dbt, to join our dynamic team. As a consultant, you will be responsible for developing new data platforms and creating the data processes. You will collaborate with cross-functional teams to design, develop, and deploy high-quality solutions.
Responsibilities:
Customer consulting: You develop data-driven products in the Snowflake Cloud and connect data & analytics with specialist departments. You develop ELT processes using dbt (data build tool)
Specifying requirements: You develop concrete requirements for future-proof cloud data architectures.
Develop data routes: You design scalable and powerful data management processes.
Analyze data: You derive sound findings from data sets and present them in an understandable way.
Requirements:
Requirements management and project experience: You successfully implement cloud-based data & analytics projects.
Data architectures: You are proficient in DWH/data lake concepts and modeling with Data Vault 2.0.
Cloud expertise: You have extensive knowledge of Snowflake, dbt and other cloud technologies (e.g. MS Azure, AWS, GCP).
SQL know-how: You have a sound and solid knowledge of SQL.
Data management: You are familiar with topics such as master data management and data quality.
Bachelor's degree in computer science, or a related field.
Strong communication and collaboration abilities to work effectively in a team environment.
Skills & Requirements
Cloud Data Engineering, AWS, Azure, Snowflake, dbt, ELT processes, Data-driven consulting, Cloud data architectures, Scalable data management, Data analysis, Requirements management, Data warehousing, Data lake, Data Vault 2.0, SQL, Master data management, Data quality, GCP, Strong communication, Collaboration.
Overview
Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.
Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.
Job Description
The current application landscape features multiple Java web services running on JEE application servers, primarily hosted on AWS, and integrated with various systems such as SAP, other services, and external partners. DPS is committed to delivering the best digital work experience for the customer's employees and customers alike.
Responsibilities:
Independent front- and backend implementation of business functionalities, defined as user stories by the customer, considering the cost-value ratio and maintenance effort for the customer
Implementation of user stories and incidents, including concept, implementation (including automated unit tests), and communication with the customer within the agile development process.
Database activities such as creation or modification of database schema as well as implementation of database access, queries, and data modification
Interface realization based on standard principles like REST or SOAP
Implementation of given Identity and Access Management Patterns for securing the application
Analysis and resolution of issues (3rd Level Support).
Documentation of the implementation.
Consultancy in technical and business topics within the applications.
Usage of selected tools for implementation, testing, rollout, and support.
Participation in regular meetings with the client to track the status of assigned tasks.
Requirements:
Experience with JEE technologies such as JPA, EJB, CDI, JAAS, and SAML.
Experience in technology related to JEE, like Maven.
Proficiency in HTML5, CSS, Angular, and Bootstrap.
Strong knowledge of SQL.
Experience with web services (SOAP, REST, JSON).
Skills & Requirements
JEE, JPA, EJB, CDI, JAAS, SAML, Maven, HTML5, CSS, Angular, Bootstrap, SQL, SOAP, REST, JSON, Database schema design, Unit testing, Agile development, Identity and Access Management (IAM), Troubleshooting, Documentation, Third-level support.
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.


Position: Full Stack Developer (PHP CodeIgniter)
Company : Mayura Consultancy Services
Experience: 3 to 4 years
Location: Bangalore
Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)
Work Location: Work From Home(WFH)
Website : https://www.mayuraconsultancy.com/
Requirements :
- Prior experience in Full Stack Development using PHP CodeIgniter
Perks of Working with MCS :
- Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
- Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
- Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
- Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
- Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.
Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.
5 years of experience as a Power BI developer
Design, develop, and maintain interactive and visually appealing Power BI dashboards and reports.
Translate business requirements into technical specifications for BI solutions.
Implement advanced Power BI features such as calculated measures, KPIs, and custom visuals.
Strong proficiency in Power BI, including Power Query, DAX, and custom visuals.
Experience with data modeling, ETL processes, and creating relationships in datasets.
Knowledge of SQL for querying and manipulating data.
Familiarity with connecting Power BI to cloud services such as Azure.
Understanding of data warehousing concepts and BI architecture.

The Sr AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You will collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and deliver patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
Strong experience as an AWS/Azure/GCP Data Engineer and must have
AWS/Azure/GCP Databricks experience. Expert proficiency in Spark Scala, Python, and spark
Must have data migration experience from on-prem to cloud
Hands-on experience in Kinesis to process & analyze Stream Data, Event/IoT Hubs, and Cosmos
In depth understanding of Azure/AWS/GCP cloud and Data lake and Analytics
solutions on Azure. Expert level hands-on development Design and Develop applications on Databricks. Extensive hands-on experience implementing data migration and data processing
using AWS/Azure/GCP services
In depth understanding of Spark Architecture including Spark Streaming, Spark Core, Spark SQL, Data Frames, RDD caching, Spark MLib
Hands-on experience with the Technology stack available in the industry for data
management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, Map Reduce, Hadoop, Hive, Hbase, Cassandra, Spark, Flume, Hive, Impala, etc
Hands-on knowledge of data frameworks, data lakes and open-source projects such
asApache Spark, MLflow, and Delta Lake
Good working knowledge of code versioning tools [such as Git, Bitbucket or SVN]
Hands-on experience in using Spark SQL with various data sources like JSON, Parquet and Key Value Pair
Experience preparing data for Data Science and Machine Learning with exposure to- model selection, model lifecycle, hyperparameter tuning, model serving, deep
learning, etc
Demonstrated experience preparing data, automating and building data pipelines for
AI Use Cases (text, voice, image, IoT data etc. ). Good to have programming language experience with. NET or Spark/Scala
Experience in creating tables, partitioning, bucketing, loading and aggregating data
using Spark Scala, Spark SQL/PySpark
Knowledge of AWS/Azure/GCP DevOps processes like CI/CD as well as Agile tools
and processes including Git, Jenkins, Jira, and Confluence
Working experience with Visual Studio, PowerShell Scripting, and ARM templates. Able to build ingestion to ADLS and enable BI layer for Analytics
Strong understanding of Data Modeling and defining conceptual logical and physical
data models. Big Data/analytics/information analysis/database management in the cloud
IoT/event-driven/microservices in the cloud- Experience with private and public cloud
architectures, pros/cons, and migration considerations. Ability to remain up to date with industry standards and technological advancements
that will enhance data quality and reliability to advance strategic initiatives
Working knowledge of RESTful APIs, OAuth2 authorization framework and security
best practices for API Gateways
Guide customers in transforming big data projects, including development and
deployment of big data and AI applications
Guide customers on Data engineering best practices, provide proof of concept, architect solutions and collaborate when needed
2+ years of hands-on experience designing and implementing multi-tenant solutions
using AWS/Azure/GCP Databricks for data governance, data pipelines for near real-
time data warehouse, and machine learning solutions. Over all 5+ years' experience in a software development, data engineering, or data
analytics field using Python, PySpark, Scala, Spark, Java, or equivalent technologies. hands-on expertise in Apache SparkTM (Scala or Python)
3+ years of experience working in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions. Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or similar area of study or equivalent work experience
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Basic experience with or knowledge of agile methodologies
AWS Certified: Solutions Architect Professional
Databricks Certified Associate Developer for Apache Spark
Microsoft Certified: Azure Data Engineer Associate
GCP Certified: Google Cloud Certified Professional
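To make the Spark SQL and partitioning expectations above concrete, here is a minimal, hypothetical PySpark sketch; the paths, columns, and table name are placeholders and not details of any specific project:

    # Minimal PySpark sketch (placeholder paths and columns): read JSON,
    # aggregate with Spark SQL, and write partitioned Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("ingest-and-aggregate").getOrCreate()

    # Ingest semi-structured JSON events (path is a placeholder).
    events = spark.read.json("s3a://example-bucket/raw/events/")
    events.createOrReplaceTempView("events")

    # Aggregate with Spark SQL; column names are illustrative only.
    daily = spark.sql("""
        SELECT event_date, event_type, COUNT(*) AS event_count
        FROM events
        GROUP BY event_date, event_type
    """)

    # Write a partitioned Parquet table for downstream analytics.
    daily.write.mode("overwrite").partitionBy("event_date") \
        .parquet("s3a://example-bucket/curated/daily_events/")

    spark.stop()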
Job Title: Data Engineer (Python, AWS, ETL)
Experience: 6+ years
Location: PAN India (Remote / Work From Home)
Employment Type: Full-time
Preferred domain: Real Estate
Key Responsibilities:
Develop and optimize ETL workflows using Python, Pandas, and PySpark (a minimal sketch follows this list).
Design and implement SQL queries for data extraction, transformation, and optimization.
Work with JSON and REST APIs for data integration and automation.
Manage and optimize Amazon S3 storage, including partitioning and lifecycle policies.
Utilize AWS Athena for SQL-based querying, performance tuning, and cost optimization.
Develop and maintain AWS Lambda functions for serverless processing.
Manage databases using Amazon RDS and Amazon DynamoDB, ensuring performance and scalability.
Orchestrate workflows with AWS Step Functions for efficient automation.
Implement Infrastructure as Code (IaC) using AWS CloudFormation for resource provisioning.
Set up AWS Data Pipelines for CI/CD deployment of data workflows.
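As a rough illustration of the ETL responsibilities above, a condensed Python/pandas/boto3 sketch; the bucket, database, file names, and columns are placeholders rather than actual project resources:

    # Illustrative ETL step: transform a CSV with pandas, stage it in S3,
    # then query it with Athena via boto3. All resource names are placeholders.
    import boto3
    import pandas as pd

    df = pd.read_csv("daily_listings.csv")                  # extract
    df["price_per_sqft"] = df["price"] / df["area_sqft"]    # transform
    df.to_parquet("daily_listings.parquet", index=False)

    s3 = boto3.client("s3")
    s3.upload_file("daily_listings.parquet", "example-data-lake",
                   "listings/dt=2024-01-01/daily_listings.parquet")

    athena = boto3.client("athena")
    athena.start_query_execution(
        QueryString="SELECT city, AVG(price_per_sqft) FROM listings GROUP BY city",
        QueryExecutionContext={"Database": "example_db"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )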
Required Skills:
Programming & Scripting: Python (ETL, Automation, REST API Integration).
Databases: SQL (Athena / RDS), Query Optimization, Schema Design.
Big Data & Processing: Pandas, PySpark (Data Transformation, Aggregation).
Cloud & Storage: AWS (S3, Athena, RDS, DynamoDB, Step Functions, CloudFormation, Lambda, Data Pipelines).
Good to Have Skills:
Experience with Azure services such as Table Storage, AI Search, Cognitive Services, Functions, Service Bus, and Storage.
Qualifications:
Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.
6+ years of experience in data engineering, ETL, and cloud-based data processing.

Job Title: Senior .NET Developer (Remote)
Experience: 7+ Years
Location: India (Remote)
Apply:jobs[at]cookieanalytix[dot]net
Company Overview: We are a dynamic and rapidly growing tech company looking for a talented Senior .NET Developer to join our team. As a Senior .NET Developer, you will work with a team of skilled professionals to build high-quality, scalable applications and contribute to the development of innovative software solutions.
Key Responsibilities:
* Design, develop, and maintain high-performance applications using .NET technologies.
* Collaborate with cross-functional teams to define, design, and ship new features.
* Write clean, maintainable, and efficient code.
* Optimize and troubleshoot SQL queries for database performance.
* Work with large, complex data sets and ensure the integrity of the database.
* Participate in code reviews and contribute to best practices for software development.
* Continuously improve development processes and implement automation where possible.
* Ensure all applications meet quality standards and provide excellent user experiences.
Required Skills and Experience:
* 7+ years of experience as a .NET Developer with a proven track record of designing and developing scalable applications.
* Strong proficiency in ASP.NET, C#, Web API, React/Angular, Entity Framework.
* Solid experience with SQL Server, including database design, writing optimized queries, and performance tuning.
* Excellent problem-solving skills and the ability to debug and optimize complex systems.
* Strong communication skills with the ability to work independently and as part of a team in a remote setting.
Why Join Us?
* Work from the comfort of your own home in a fully remote environment.
* Opportunity to work on innovative and challenging projects.
* Collaborative and inclusive work culture.
- Develop & Maintain Dashboards: Create interactive and visually compelling dashboards and reports in Tableau, ensuring they meet business requirements and provide actionable insights.
- Data Analysis & Visualization: Design data models and build visualizations to summarize large sets of data, ensuring accuracy, consistency, and clarity in the reports.
- SQL Querying: Write complex SQL queries to extract and transform data from different data sources (databases, APIs, etc.), ensuring optimal performance (see the sketch after this list).
- Data Cleansing: Clean, validate, and prepare data for analysis, ensuring data integrity and consistency.
- Collaboration: Work closely with cross-functional teams, including business analysts, data engineers, and stakeholders, to gather requirements and deliver customized reporting solutions.
- Troubleshooting & Support: Provide technical support and troubleshooting for Tableau reports, dashboards, and data integration issues.
- Performance Optimization: Optimize Tableau workbooks, dashboards, and queries for better performance and scalability.
- Best Practices: Ensure Tableau development follows best practices for data visualization, performance optimization, and user experience.
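For illustration only, a minimal Python sketch of the SQL-querying side of this role: a window-function query pulled into pandas, e.g. to prepare a Tableau extract. The connection string, table, and columns are hypothetical:

    # Placeholder connection and schema; shows a window-function query feeding
    # a flat extract that Tableau can consume.
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")

    query = """
        SELECT region,
               order_month,
               revenue,
               SUM(revenue) OVER (PARTITION BY region ORDER BY order_month) AS running_revenue
        FROM monthly_sales
    """
    df = pd.read_sql(query, engine)
    df.to_csv("monthly_sales_extract.csv", index=False)  # hand off to Tableau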


We are a dynamic and innovative technology company dedicated to delivering cutting-edge solutions that empower businesses and individuals. As we continue to grow and expand our offerings, we are seeking a coding fanatic, who is interested in working on and learning new technologies.
Position - Backend developer
Job type - Freelance or on contract
Location - Remote
Roles and Responsibilities:
- Plan, create, and test REST APIs for back-end services such as authentication and authorization (a minimal sketch follows this list).
- Deploy and maintain backend systems on the cloud.
- Research and develop solutions for real life business problems.
- Creating and maintaining our apps' essential business logic, providing correct data processing and flawless user experiences.
- Database design, implementation, and management, including schema design, query optimisation and data integrity.
- Participating in code reviews, providing constructive input, and ensuring that code quality standards are met.
- Keep abreast of industry developments and best practices to bring new solutions to our initiatives.
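A bare-bones sketch, in the Flask spirit of this stack, of a REST endpoint with a simple token check; the route and the in-memory token store are illustrative assumptions, not part of any existing codebase:

    # Minimal Flask REST endpoint with a token check (illustrative only).
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    API_TOKENS = {"demo-token"}   # placeholder; use a real credential store in practice

    @app.route("/api/v1/profile", methods=["GET"])
    def profile():
        token = request.headers.get("Authorization", "").removeprefix("Bearer ").strip()
        if token not in API_TOKENS:          # authentication/authorization check
            return jsonify({"error": "unauthorized"}), 401
        return jsonify({"user": "demo", "role": "reader"})

    if __name__ == "__main__":
        app.run(debug=True)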
Required skills and experience:
Must-have skills:
- Bachelor’s degree in computer programming, computer science, or a related field.
- 3 + years of experience in backend development.
- Proficient in Python, MongoDB, PostgreSQL/SQL, Django, and Flask
- Knowledge of Nginx.
- C/C++ with Cython for creating Python modules
- Knowledge of Redis
- Familiarity with AI provider APIs and prompt engineering
- Experience working with Linux-based instances in the cloud.
- Strong problem solving and verbal and written communication skills.
- Ability to work independently or with a group.
Good-to-have skills:
- Experience in Node.js and Java
- AWS and Google Cloud knowledge.
🚨 Hiring Alert 🚨
We are hiring a Java Backend Intern for 2 months!
Skills Required:
1. Good understanding of Java 17, Spring, and any SQL database
2. Good understanding of designing low-level code from scratch
3. Experience in building database schemas and code architecture
4. Familiarity with design patterns and a willingness to write clean, readable, and well-documented code.
5. Familiarity with tools like Postman, STS, or IntelliJ
6. Understanding of REST APIs and their role in application development.
7. Good DSA and problem solving skills
Roles & Responsibilities:
1. Assist in developing and maintaining web applications.
2. Learn to utilize open source tools for integration
3. Collaborate with team members to design and implement new features.
4. Contribute to optimizing application performance and resolving bugs.
5. Stay curious and keep learning new technologies relevant to Spring Boot and Spring Reactive
6. Exposure to version control systems like Git.
7. Passion for learning and contributing to real-world projects.
Preferred Qualifications:
1. Minimum experience of 0-2 years.
2. Background in computer science/IT or a relevant field.
What You’ll Gain:
1. Stipend: 8k-10k/month, subject to your performance
2. Hands-on experience in building production-grade applications.

Roles and Responsibilities:
- 6+ years of IT experience with 3+ years in Camunda development; Camunda certification required.
- Expertise in designing, developing, and implementing Camunda components like Job Workers and Process Models, following best practices.
- Proficient in integrations with external systems using REST APIs, connectors, web services, and experience in building REST services with Spring Boot or .NET.
- Hands-on experience integrating Camunda with front ends, streaming products, Postgres, SMTP, SAP, and RPA systems; strong SQL query and function-writing skills.
- Experienced in deploying solutions via Bitbucket and GIT, maintaining documentation, and participating in code reviews to ensure quality and compliance.
- Skilled in tracking and resolving CRs/Defects through JIRA and providing technical support for UAT/PROD environments.

Qualifications
- Bachelor's Degree or equivalent experience is required
- A minimum of 8-10 years of experience working in an IT environment
- At least 6-8 years of experience as a developer.
- Experience leading complex projects/tasks
- Strong hands-on experience with a variety of code, interface, API, and database concepts
- Experience in full project life cycle development for systems and applications
- Experience integrating and customizing business applications
- Excellent oral and written communication skills, with the ability to collaborate effectively with team members at all levels, from junior developers to senior managers and business customers
- A strong desire to learn the latest technologies and the ability to acquire knowledge quickly
- Exceptional problem-solving and debugging skills
Technical Knowledge & Skills
- Proficiency in programming languages such as JavaScript, .NET, C#, Java, PHP, PowerShell, and Regex
- Experience with ERP, HRIS, CRM, and Case Management systems
- Experience with ServiceNow (SNOW) glide extensions and JavaScript
- Experience with Microsoft SharePoint
- Proficiency in web services, Rest APIs, string manipulation, SQL, and creating stored procedures using XML/JSON and SOAP/REST protocols
- Experience with integrations using tools such as Power Automate, UiPath, MuleSoft, and IBM App Connect
- Possess knowledge of DBA tasks, including deployments, maintenance planning, string manipulation, and authoring SQL statements
- Experience using Source Code and Version Control Systems like Git
- Experience in cloud-based deployment of applications
- Windows OS

About the Role:
This role provides you the opportunity to truly accelerate your engineering career by giving you a front-line seat on a Rocketship product. Our fintech product has grown by 10x over the last year and we are expecting that growth to continue for the foreseeable future. The leadership team is committed to making massive investments in its technology and people. Given the growth of the product, individuals will have opportunities to move into leadership roles.
Roles & Responsibilities:
● Perform full stack development activities using MERN Stack & GraphQL which is a MUST.
● Ability to translate UX Designs into functional web apps using React JS
● Technical architecture design along with system architect and product manager
● Writing effective business logic (using Rest API or GraphQL API)
● Algorithm design for system modules
● Database design for scalable and secure systems using NoSQL (MongoDB) or RDBMS (MySQL or PostgreSQL)
● Experience in other JS frameworks like Next.JS, and React Native/Expo would be a plus.
● POC development with other engineers.
● Experience and Exposure to working with different projects and business models will be an advantage.
● Effort estimation with the Product Manager and Engineering Head
● Test software to ensure responsiveness and efficiency
● Writing Unit testing for robust systems.
● Hands-on experience with AWS services (EC2, SQS, SES, Lambda).
● Hands-on experience in developing Rest/GraphQL API using Node.js with Typescript.
● Hands-on experience in MERN stack (MySQL or PostgreSQL would be a plus)
● Strong Knowledge of algorithms and data structures
● Experience working with US Clients is a Must
● Fundamentals of Docker/Containerized application development would be a plus.
● Experience in SaaS product development would be a plus
● Technical documentation
Desired Skills
● Technical Skills: MERN Stack, NoSQL (MongoDB) or RDBMS (MySQL or PostgreSQL),
JS frameworks, AWS services (EC2, SQS, SES, Lambda), Typescript, Rest API/GraphQL.
● Soft Skills: Excellent Communication, Problem-solving, Critical Thinking, Teamwork & Individual Contributor, Proactive

Sr. .NET Developer
#REMOTE# India
What We’re Looking For:
7+ years of experience in .NET and SQL
Strong problem-solving skills and attention to detail.
Ability to work collaboratively in a team and remotely.
Why Join Us?
Flexible work-from-home options.
Competitive salary based on experience and skills.
Opportunity to work on innovative projects with the latest technologies.
A culture that values growth, learning, and collaboration.

Performance Test Engineer
Required Experience- 3+ Years
Summary:
We are seeking a highly motivated and skilled Performance Test Engineer to join our dynamic team. In this role, you will be responsible for ensuring the performance, scalability, and stability of our critical applications and systems. You will play a key role in designing, executing, and analyzing performance tests to identify and resolve bottlenecks and optimize system performance.
Responsibilities:
· Test Strategy & Design:
o Develop and execute comprehensive performance test strategies aligned with non-functional requirements and specifications.
o Design, develop, and maintain performance test scripts and scenarios using appropriate testing tools (a minimal Python example appears at the end of this job description).
o Collaborate with development teams to understand system architecture and identify key performance indicators (KPIs).
· Test Execution & Analysis:
o Execute performance tests, monitor system behavior under load, and collect performance data.
o Analyze test results, identify performance bottlenecks and root causes of performance issues.
o Generate comprehensive performance reports with clear and concise findings and recommendations.
· Performance Tuning & Optimization:
o Work with development teams to implement performance improvements and optimizations.
o Conduct performance tuning activities, such as database tuning, code optimization, and infrastructure optimization.
· Tooling & Automation:
o Integrate performance tests into the CI/CD pipeline to ensure continuous performance monitoring and regression testing.
o Develop and maintain automated performance testing frameworks.
· Monitoring & Troubleshooting:
o Monitor system performance using tools such as APM (Dynatrace, etc.), JVM monitoring, and log analysis (Splunk, etc.).
o Troubleshoot performance issues and provide timely resolutions.
· Database & System Monitoring:
o Monitor database performance and identify potential issues.
o Monitor system resources (e.g., CPU, memory, disk I/O) for virtual machines, cloud environments, containers (OpenShift), databases, and operating systems (Linux/Unix).
· Data Manipulation & Processing:
o Perform data transformations, substitutions, copying, manipulation, and message listening.
o Develop and implement data generation and loading strategies for performance testing.
· Technical Skills:
o Strong understanding of performance testing methodologies and best practices.
o Experience with performance testing tools (e.g., JMeter, LoadRunner, Gatling).
o Experience with scripting languages (e.g., Java, Python, Groovy).
o Experience with SQL and database concepts.
o Experience with CI/CD pipelines (e.g., Jenkins, Azure DevOps).
o Experience with monitoring tools (e.g., APM, JVM monitoring, log analysis).
o Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
· Soft Skills:
o Strong analytical and problem-solving skills.
o Excellent communication and collaboration skills.
o Ability to work independently and as part of a team.
o Strong attention to detail and accuracy.
o Ability to learn new technologies quickly.
Education & Experience:
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· 3+ years of experience in performance testing.
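By way of illustration of the scripting skills above (and not a substitute for JMeter, LoadRunner, or Gatling), a minimal Python load-test sketch against a placeholder endpoint that reports latency percentiles:

    # Fire N concurrent GET requests and report p50/p95/max latency.
    # The URL and request counts are placeholders.
    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    URL = "https://example.internal/api/health"
    REQUESTS = 200
    CONCURRENCY = 20

    def timed_get(_):
        start = time.perf_counter()
        requests.get(URL, timeout=10)
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_get, range(REQUESTS)))

    print(f"p50={statistics.median(latencies):.3f}s "
          f"p95={latencies[int(0.95 * len(latencies)) - 1]:.3f}s "
          f"max={latencies[-1]:.3f}s")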
Performance Engineering Lead
Required Experience: 6+ Years
Summary:
We are seeking a highly experienced and results-oriented Performance Engineering Lead to join our team. In this leadership role, you will be responsible for leading and mentoring a team of performance engineers, defining and implementing performance testing strategies, and ensuring the delivery of high-quality and high-performing applications.
Responsibilities:
· Team Leadership & Mentorship:
o Lead, mentor, and guide a team of performance engineers.
o Provide technical guidance, support, and training to team members.
o Foster a collaborative and high-performing team environment.
o Conduct performance reviews and provide constructive feedback.
· Performance Test Strategy & Planning:
o Define and implement comprehensive performance testing strategies aligned with business and technical objectives.
o Collaborate with stakeholders (e.g., product managers, development teams, architects) to understand system requirements and identify performance risks.
o Develop and maintain performance test plans, including objectives, scope, approach, and resource allocation.
· Test Design & Execution:
o Oversee the design and execution of performance tests, including load tests, stress tests, and endurance tests.
o Analyze test results, identify performance bottlenecks and root causes of performance issues.
o Generate comprehensive performance reports with clear and concise findings and recommendations.
· Performance Tuning & Optimization:
o Work with development teams to implement performance improvements and optimizations.
o Conduct performance tuning activities, such as database tuning, code optimization, and infrastructure optimization.
· Tooling & Automation:
o Establish and maintain performance testing best practices and standards.
o Integrate performance tests into the CI/CD pipeline to ensure continuous performance monitoring and regression testing.
o Evaluate and select appropriate performance testing tools and technologies.
· Stakeholder Management:
o Communicate effectively with stakeholders on performance testing progress, issues, and risks.
o Present performance test results and recommendations to key stakeholders.
o Build and maintain strong relationships with stakeholders across the organization.
· Technical Expertise:
o Deep understanding of performance testing methodologies and best practices.
o Strong experience with performance testing tools (e.g., JMeter, LoadRunner, Gatling).
o Experience with scripting languages (e.g., Java, Python, Groovy).
o Experience with SQL and database concepts.
o Experience with CI/CD pipelines (e.g., Jenkins, Azure DevOps).
o Experience with monitoring tools (e.g., APM, JVM monitoring, log analysis).
o Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
· Soft Skills:
o Strong leadership, communication, and interpersonal skills.
o Excellent analytical and problem-solving skills.
o Ability to effectively manage time, prioritize tasks, and meet deadlines.
o Strong decision-making and judgment skills.
o Ability to work effectively under pressure and adapt to changing priorities.
Education & Experience:
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· 5+ years of experience in performance testing, with at least 2 years of experience in a leadership role.
Performance Test Lead
Required experience: 6+ Years
Summary:
We are seeking a highly experienced and results-oriented Performance Test Lead to join our team. In this leadership role, you will be responsible for leading and mentoring a team of performance engineers, defining and implementing performance testing strategies, and ensuring the delivery of high-quality and high-performing applications.
Responsibilities:
· Team Leadership & Mentorship:
o Lead, mentor, and guide a team of performance engineers.
o Provide technical guidance, support, and training to team members.
o Foster a collaborative and high-performing team environment.
o Conduct performance reviews and provide constructive feedback.
· Performance Test Strategy & Planning:
o Define and implement comprehensive performance testing strategies aligned with business and technical objectives.
o Collaborate with stakeholders (e.g., product managers, development teams, architects) to understand system requirements and identify performance risks.
o Develop and maintain performance test plans, including objectives, scope, approach, and resource allocation.
· Test Design & Execution:
o Oversee the design and execution of performance tests, including load tests, stress tests, and endurance tests.
o Analyze test results, identify performance bottlenecks and root causes of performance issues.
o Generate comprehensive performance reports with clear and concise findings and recommendations.
· Performance Tuning & Optimization:
o Work with development teams to implement performance improvements and optimizations.
o Conduct performance tuning activities, such as database tuning, code optimization, and infrastructure optimization.
· Tooling & Automation:
o Establish and maintain performance testing best practices and standards.
o Integrate performance tests into the CI/CD pipeline to ensure continuous performance monitoring and regression testing.
o Evaluate and select appropriate performance testing tools and technologies.
· Stakeholder Management:
o Communicate effectively with stakeholders on performance testing progress, issues, and risks.
o Present performance test results and recommendations to key stakeholders.
o Build and maintain strong relationships with stakeholders across the organization.
· Technical Expertise:
o Deep understanding of performance testing methodologies and best practices.
o Strong experience with performance testing tools (e.g., JMeter, LoadRunner, Gatling).
o Experience with scripting languages (e.g., Java, Python, Groovy).
o Experience with SQL and database concepts.
o Experience with CI/CD pipelines (e.g., Jenkins, Azure DevOps).
o Experience with monitoring tools (e.g., APM, JVM monitoring, log analysis).
o Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
· Soft Skills:
o Strong leadership, communication, and interpersonal skills.
o Excellent analytical and problem-solving skills.
o Ability to effectively manage time, prioritize tasks, and meet deadlines.
o Strong decision-making and judgment skills.
o Ability to work effectively under pressure and adapt to changing priorities.
Education & Experience:
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· 5+ years of experience in performance testing, with at least 2 years of experience in a leadership role.
Performance Engineering Team
Required Experience: 3+ Years
Summary:
We are seeking a highly skilled and motivated team of Performance Engineers to join our dynamic organization. This team will be responsible for ensuring the performance, scalability, and reliability of our critical applications and systems. You will work closely with development teams, architects, and other stakeholders to design, execute, and analyze performance tests, identify and resolve performance bottlenecks, and optimize system performance.
Responsibilities:
· Performance Testing & Analysis:
o Design, develop, and execute comprehensive performance tests, including load tests, stress tests, endurance tests, and soak tests.
o Analyze test results, identify performance bottlenecks and root causes of performance issues.
o Generate comprehensive performance reports with clear and concise findings and recommendations.
o Work with development teams to implement performance improvements and optimizations.
· Test Automation & Integration:
o Develop and maintain automated performance testing frameworks and scripts.
o Integrate performance tests into the CI/CD pipeline to ensure continuous performance monitoring and regression testing.
· Monitoring & Troubleshooting:
o Monitor system performance using tools such as APM (Dynatrace, etc.), JVM monitoring, and log analysis (Splunk, etc.).
o Troubleshoot performance issues and provide timely resolutions.
· Database & System Monitoring:
o Monitor database performance and identify potential issues.
o Monitor system resources (e.g., CPU, memory, disk I/O) for virtual machines, cloud environments, containers (OpenShift), databases, and operating systems (Linux/Unix).
· Data Manipulation & Processing:
o Perform data transformations, substitutions, copying, manipulation, and message listening.
o Develop and implement data generation and loading strategies for performance testing.
· Technical Skills:
o Strong understanding of performance testing methodologies and best practices.
o Experience with performance testing tools (e.g., JMeter, LoadRunner, Gatling).
o Experience with scripting languages (e.g., Java, Python, Groovy).
o Experience with SQL and database concepts.
o Experience with CI/CD pipelines (e.g., Jenkins, Azure DevOps).
o Experience with monitoring tools (e.g., APM, JVM monitoring, log analysis).
o Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
· Soft Skills:
o Strong analytical and problem-solving skills.
o Excellent communication and collaboration skills.
o Ability to work independently and as part of a team.
o Strong attention to detail and accuracy.
o Ability to learn new technologies quickly.
Education & Experience:
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· 2+ years of experience in performance testing.

Description
Come Join Us
Experience.com - We make every experience matter more
Position: Senior GCP Data Engineer
Job Location: Chennai (Base Location) / Remote
Employment Type: Full Time
Summary of Position
A Senior Data Engineer is a professional who specializes in preparing big data infrastructure for analytical or operational uses. He/She is responsible for developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They collaborate with data scientists and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organisation.
Responsibilities:
- Collaborate with cross-functional teams to define, prioritize, and execute data engineering initiatives aligned with business objectives.
- Design and implement scalable, reliable, and secure data solutions by industry best practices and compliance requirements.
- Drive the adoption of cloud-native technologies and architectural patterns to optimize the performance, cost, and reliability of data pipelines and analytics solutions.
- Mentor and lead a team of Data Engineers.
- Demonstrate a drive to learn and master new technologies and techniques.
- Apply strong problem-solving skills with an emphasis on building data-driven or AI-enhanced products.
- Coordinate with ML/AI and engineering teams to understand data requirements.
Experience & Skills:
- 8+ years of strong experience in ETL and ELT of data from various sources into data warehouses
- 8+ years of experience in Python, Pandas, NumPy, and SciPy.
- 5+ years of experience in GCP
- 5+ years of experience in BigQuery, PySpark, and Pub/Sub (see the sketch after this list)
- 5+ years of experience working with and creating data architectures.
- Certified in Google Cloud Professional Data Engineer.
- Advanced proficiency in Google Cloud services such as Dataflow, Dataproc, Dataprep, Data Studio, and Cloud Composer.
- Proficient in writing complex Spark (PySpark) User Defined Functions (UDFs), Spark SQL, and HiveQL.
- Good understanding of Elasticsearch.
- Experience in assessing and ensuring data quality, data testing, and addressing data quality issues.
- Excellent understanding of Spark architecture and underlying frameworks including storage management.
- Solid background in database design and development, database administration, and software engineering across full life cycles.
- Experience with NoSQL data stores like MongoDB, DocumentDB, and DynamoDB.
- Knowledge of data governance principles and practices, including data lineage, metadata management, and access control mechanisms.
- Experience in implementing and optimizing data security controls, encryption, and compliance measures in GCP environments.
- Ability to troubleshoot complex issues, perform root cause analysis, and implement effective solutions in a timely manner.
- Proficiency in data visualization tools such as Tableau, Looker, or Data Studio to create insightful dashboards and reports for business users.
- Strong communication and interpersonal skills to effectively collaborate with technical and non-technical stakeholders, articulate complex concepts, and drive consensus.
- Experience with agile methodologies and project management tools like Jira or Asana for sprint planning, backlog grooming, and task tracking.
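As a small illustration of the BigQuery skills listed above, a hypothetical query run through the official Python client; the project, dataset, and table names are placeholders:

    # Run a simple aggregation in BigQuery and print the result rows.
    # "example-project" and the table path are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    sql = """
        SELECT account_id, COUNT(*) AS review_count
        FROM `example-project.analytics.reviews`
        GROUP BY account_id
        ORDER BY review_count DESC
        LIMIT 10
    """
    for row in client.query(sql).result():
        print(row.account_id, row.review_count)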
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress.
Principal Help Desk Engineer
Position Responsibilities :
As a Principal Help Desk Engineer in Maconomy Engineering, you will become a key member of our Engineering Help Desk team, with the primary purpose of ensuring we find successful resolutions to the customer issues that make their way to the team.
Deltek Maconomy is a project-based enterprise resource planning (ERP) solution which is purpose-built specifically for the distinct needs of professional services firms.
We're looking for a proactive and logical Application Helpdesk Engineer to join our team based out of India. As a Principal Help Desk Engineer you will have a deep knowledge and understanding of the use of ERP software and use your technical experience & skills to troubleshoot our Maconomy solution. You will work closely with the Application Developers to gain specialized knowledge of the workings of the software, in order to independently resolve complex cases that are unable to be resolved by the Support Services team. As part of the role you will be working closely with the Support Services, Product Managers, Cloud Solutions and Engineering teams to ensure that the best and most timely solution to the cases are provided to our Customers.
Key Responsibilities
To be successful in this role, you will be expected to perform the following functions:
- Growing the relationship between Support Services, Cloud Solutions, Engineering and other key department areas
- Leading by example by adhering to the correct processes and procedures and ensuring that the knowledge you have is recorded and passed onto the team in an effective manner
- Conduct deep investigations into the most complex cases and develop workarounds to assist clients who are unable to use the application features
- Focus on the high priority cases for clients to ensure strong and timely resolutions for key issues
- Work closely with the developers to trace down the root cause of defects and work to ensure long term solutions are developed
- Produce and maintain database fix scripts for issues affecting multiple clients
- Use TFS for defect management, RNT for customer issue management and Microsoft Teams for cross business collaboration
- Escalate issues requiring development assistance to the Development team
- Create defects using TFS and link information to Support Services cases
- Support communication between Developers and Support Services, following up on requests for further information about specific escalated cases or defects
- Review cases submitted to Help Desk by Support Services to determine if existing defects exist and create new defects if they do not exist
- Generate ad hoc reports regarding cases assigned to the Help Desk team, those with development, those being worked on, etc.
- Track the status of bug fix defects and follow up with Engineering as necessary.
Qualifications :
We are looking for people who have the following experience:
- Minimum Bachelor's Degree level in Software Development
- 5+ years working as an Application Support Engineer
- 3+ years of Software programming experience using a variety of coding languages
- Experience in coaching and guiding others in a team
- Excellent knowledge and skills with relational database management (RDBMS) systems including Oracle and SQL Server
- Have worked with ERP Software and have a thorough knowledge of the purpose and uses of this type of system
- An understanding of the types of challenges that our customers may face with using ERP software and the effects this will have on them
- Strong analytical skills
- Customer service oriented
- Experience with developing reports using SQL
- The ability to adapt quickly to new technical environments
- The ability to work under tight deadlines and work effectively in an environment with multiple competing priorities
- Strong communication skills including the ability to write clearly and concisely and to present information in a way which facilitates interpretation
- Excellent proficiency in written and spoken English
Required Skills and Experience:
Proficient in Java (Java 8 and above), with a strong understanding of object-oriented programming.
Knowledge in the trading domain, including familiarity with trading systems and protocols.
Strong skills in SQL and PL/SQL for database management and query optimization.
Hands-on experience with Linux and Windows operating systems for application deployment and maintenance.
Proficiency in scripting languages (e.g., Bash, PowerShell, or similar).
Knowledge of Python programming for auxiliary development and analytics tasks.
Familiarity with multithreading, concurrency, and low-latency application development.
Experience with CI/CD pipelines, version control systems (e.g., Git), and deployment workflows.
We are currently seeking a skilled and innovative QA Automation Lead to join our dynamic team. As a QA Lead, you will be responsible for automation test planning, product test strategy, and creating automation test scripts to verify and validate the quality of the product.
Join DataCaliper and step into the vanguard of technological advancement, where your proficiency will shape the landscape of data management and drive businesses toward unparalleled success.
Please find our job description below; if interested, apply or reply with your profile to connect and discuss.
Company: Data caliper
Work location: Coimbatore (Remote)
Experience: 10+ Years
Joining time: Immediate – 4 weeks
Required skills:
- Good experience with Selenium, Cucumber, or other automation tools.
- Good experience in Selenium-based automation of web, mobile, and desktop applications (see the sketch at the end of this list).
- Very good written and oral business communication and presentation skills required
- Basic SQL knowledge is required
- Experience in one test management and one defect management tool is required
- Should be aware of STLC and testing processes
- Good attitude and communication skills are required
- Willing to learn and stretch during the ramp-up period; should become hands-on very quickly.
- Hands-on experience in Agile projects
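A bare-bones Selenium (Python) sketch of the kind of web UI automation described above; the URL, locators, and expected page title are placeholders for a real application under test:

    # Log in through a placeholder form and assert the landing page title.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.internal/login")
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "login-button").click()
        assert "Dashboard" in driver.title, "login did not land on the dashboard"
    finally:
        driver.quit()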
Thank you


We are currently seeking skilled and motivated Senior Java Developers to join our dynamic and innovative development team. As a Senior Java Developer, you will be responsible for designing, developing, and maintaining high-performance, scalable Java applications.
Join DataCaliper and step into the vanguard of technological advancement, where your proficiency will shape the landscape of data management and drive businesses toward unparalleled success.
Please find our job description below; if interested, apply or reply with your profile to connect and discuss.
Company: Data caliper
Work location: Coimbatore
Experience: 3+ years
Joining time: Immediate – 4 weeks
Required skills:
-Good experience in Java/J2EE programming frameworks like Spring (Spring MVC, Spring Security, Spring JPA, Spring Boot, Spring Batch, Spring AOP).
-Deep knowledge in developing enterprise web applications using Java Spring
-Good experience in REST webservices.
-Understanding of DevOps processes like CI/CD
-Exposure to Maven, Jenkins, Git, data formats JSON/XML, Quartz, Log4j, Logback
-Good experience in database technologies / SQL / PLSQL or any database experience
-The candidate should have excellent communication skills with an ability to interact with non-technical stakeholders as well.
Thank you

Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge in Spark architecture and distributed computing and Spark streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning.
- Good understanding in object-oriented concepts and hands on experience on Scala with excellent programming logic and technique.
- Good grasp of functional programming and OOP concepts in Scala
- Good experience in SQL – should be able to write complex queries.
- Managing the team of Associates and Senior Associates and ensuring the utilization is maintained across the project.
- Able to mentor new members for onboarding to the project.
- Understand the client requirement and able to design, develop from scratch and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
- Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Leading the client calls to flag off any delays, blockers, escalations and collate all the requirements.
- Managing project timing, client expectations and meeting deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on regular basis.
- Understand business requirement and analyze different approaches and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn
External Skills And Expertise
Must have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- AWS preferable/Any cloud
- Kafka /Kinesis/Any streaming services
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices

Senior Data Analyst
Experience: 8+ Years
Work Mode: Remote Full Time
Responsibilities:
• Analyze large datasets to uncover trends, patterns, and insights to support business goals.
• Design, develop, and manage interactive dashboards and reports using Power BI.
• Utilize DAX and SQL for advanced data querying and data modeling.
• Create and manage complex SQL queries for data extraction, transformation, and loading processes.
• Collaborate with cross-functional teams to understand data requirements and translate them into actionable solutions.
• Maintain data accuracy and integrity across projects, ensuring reliable data-driven insights.
• Present findings to stakeholders, translating complex data insights into simple, actionable business recommendations.
Skills:
Power BI, DAX (Data Analysis Expressions), SQL, Data Modeling, Python
Preferred Skills:
• Machine Learning: Exposure to machine learning models and their integration within analytical solutions.
• Microsoft Fabric: Familiarity with Microsoft Fabric for enhanced data integration and management.


We are seeking a Senior or Staff Software Engineer (Node.js, Azure, and React) to join our team and lead new software development initiatives.
Responsibilities:
- Contribute hands-on to coding, code reviews, architecture, and design efforts, setting a solid example for the team.
- Will be a tech lead and manage a small engineering team, fostering a collaborative and productive work environment.
- Work closely with cross-functional teams, including Product Management and Data Engineering, to build empathetic and user-centric products.
- Drive the development of robust and scalable web experiences, leveraging modern technologies and best practices.
- Provide technical guidance and mentorship to team members, promoting continuous learning and growth.
- Collaborate with stakeholders to define and prioritize engineering initiatives aligned with business goals.
- Ensure high code quality, maintainability, and performance through the implementation of best practices and coding standards.
- Foster a culture of innovation, encouraging the team to explore new technologies and approaches to problem-solving.
Requirements:
- Bachelor’s degree in computer science, Software Engineering, or a related field (advanced degree preferred).
- 7+ years of professional experience in software engineering, with at least 1 year in a technical leadership role.
- Strong experience in product engineering, with a focus on building empathetic and user-centric products.
- Extensive experience in web development, particularly with technologies such as Node.js and React.
- Familiarity with cloud infrastructure, specifically Azure, and containerization technologies like Docker.
- Solid understanding of software development best practices, design patterns, and coding standards.
- Excellent problem-solving and analytical skills, with the ability to make data-driven decisions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Experience with Agile development methodologies (e.g., Scrum, Kanban).
Preferred Qualifications:
- Experience with web scraping techniques and tools.
- Knowledge of SQL query optimization and performance tuning.
- Familiarity with automated testing, continuous integration, and continuous deployment (CI/CD) practices.
- Experience with DevOps practices and tools (e.g., Jenkins, Ansible, Terraform).
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/es8UJ?source=CareerSite
Explore our Career Page for more such jobs : careers.infraveo.com


We are seeking a Junior Software Engineer (AWS, Azure, Google Cloud, Spring, Node.js, Django) to join our dynamic team. As a Junior Software Engineer, you will have a passion for technology, a solid understanding of software development principles, and a desire to learn and grow in a collaborative environment. You will work closely with senior engineers to develop, test, and maintain software solutions that meet the needs of our clients and internal stakeholders.
Responsibilities:
- Software Development: Write clean, efficient, and well-documented code for various software applications.
- Testing & Debugging: Assist in testing and debugging software to ensure functionality, performance, and security.
- Learning & Development: Continuously improve your technical skills by learning new programming languages, tools, and AI methodologies.
- Documentation: Assist in the documentation of software designs, technical specifications, and user manuals.
- Problem-Solving: Identify and troubleshoot software defects and performance issues.
- Customer Communication: Interact with customers to gather requirements, provide technical support, and ensure their needs are met throughout the software development lifecycle. Maintain a professional and customer-focused attitude in all communications.
Requirements:
- Education: Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Programming Languages: Proficiency in at least one programming language such as Java, Python, TypeScript or JavaScript.
- Familiarity with: Git version control system, Scrum software development methodology, and basic understanding of databases and SQL.
- Problem-Solving Skills: Strong analytical and problem-solving skills with a keen attention to detail.
- Communication: Good verbal and written communication skills with the ability to work effectively in a team environment and interact with customers.
- Adaptability: Ability to learn new technologies and adapt to changing project requirements.
- Internship/Project Experience: Previous internship experience or project work related to software development is a plus.
Preferred Skills:
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with back-end frameworks (e.g., Spring, Node.js, Django).
- Knowledge of DevOps practices and tools.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/F57mD?source=CareerSite
Explore our Career Page for more such jobs : careers.infraveo.com


We are seeking a Data Engineer ( Snowflake, Bigquery, Redshift) to join our team. In this role, you will be responsible for the development and maintenance of fault-tolerant pipelines, including multiple database systems.
Responsibilities:
- Collaborate with engineering teams to create REST API-based pipelines for large-scale MarTech systems, optimizing for performance and reliability.
- Develop comprehensive data quality testing procedures to ensure the integrity and accuracy of data across all pipelines.
- Build scalable dbt models and configuration files, leveraging best practices for efficient data transformation and analysis.
- Partner with lead data engineers in designing scalable data models.
- Conduct thorough debugging and root cause analysis for complex data pipeline issues, implementing effective solutions and optimizations.
- Follow and adhere to group's standards such as SLAs, code styles, and deployment processes.
- Anticipate breaking changes and implement backwards-compatibility strategies for API schema changes.
- Assist the team in monitoring pipeline health via observability tools and metrics.
- Participate in refactoring efforts as platform application needs evolve over time.
Requirements:
- Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of professional experience with a cloud database such as Snowflake, Bigquery, Redshift.
- 1+ years of professional experience with dbt (cloud or core).
- Exposure to various data processing technologies such as OLAP and OLTP and their applications in real-world scenarios.
- Exposure to work cross-functionally with other teams such as Product, Customer Success, Platform Engineering.
- Familiarity with orchestration tools such as Dagster/Airflow.
- Familiarity with ETL/ELT tools such as dltHub/Meltano/Airbyte/Fivetran and DBT.
- High intermediate to advanced SQL skills (comfort with CTEs, window functions).
- Proficiency with Python and related libraries (e.g., pandas, sqlalchemy, psycopg2) for data manipulation, analysis, and automation (see the sketch after this list).
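A hypothetical sketch of one such pipeline step: pull records from a REST endpoint, flatten them with pandas, and load them into a staging table via SQLAlchemy. The endpoint, response shape, connection string, and table name are placeholders:

    # REST -> pandas -> warehouse staging table (all names are placeholders).
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    resp = requests.get("https://api.example.com/v1/contacts", timeout=30)
    resp.raise_for_status()
    df = pd.json_normalize(resp.json()["results"])   # flatten nested JSON records

    engine = create_engine("postgresql+psycopg2://user:pass@warehouse-host:5439/analytics")
    df.to_sql("stg_contacts", engine, schema="staging", if_exists="append", index=False)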
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link:https://zrec.in/e9578?source=CareerSite
Explore our Career Page for more such jobs : careers.infraveo.com


We are seeking a Senior Software Engineer (.NET, HTML5, and CSS) to join our team.
Responsibilities:
- Full stack development of web applications including projects ranging between data tiers, server-side, APIs, and front-end.
- Solve moderate to complex problems with minimal guidance and support.
- Help guide the progress of projects and tickets through the use of TechDev’s project and task management systems.
- Participate in release planning, support the success of released projects.
- Propose architectural directions when involved in planning projects.
- Ensure documentation and communication needs for projects are satisfied.
- Provide research, prototyping, and product/library exploration as requested – helping the TechDev team choose the best fits for technology.
- Production of automated testing as needed (including unit tests and end-to-end testing).
- Monitor the quality and security of projects with the use of static code analysis tools such as SonarQube.
- Respond to, troubleshoot, and resolve defects and outages in WLT software. This includes being able to respond to emergencies quickly if needed.
- Mentor and provide guidance to other Developers, perform constructive code reviews.
- Learn continuously and stay up-to-date with trends, technologies, and direction in the technology industry, and help surface recommendations for TechDev, its processes, and its projects.
- Understand and display WLT’s values.
- Other duties as assigned.
Requirements:
- Ability to produce responsive and mobile-first front-ends using modern best practices and frameworks.
- Proficiency in technologies including but not limited to: .NET, HTML5, CSS, JavaScript, Angular, Svelte, SQL, and non-relational DBs.
- Ability to be pragmatic in decision-making
- Comfort with implementation and management of packages and libraries to enhance software products (eg. Tailwind, PrimeNG, and others).
- Ability to juggle multiple priorities and respond dynamically as priorities change.
- Demonstrate a passion for learning new technologies and staying current.
- Strong time management capability, ability to estimate project scopes accurately, and adhere to timelines.
- Understands the “Big Picture” and has an entrepreneurial way of thinking.
- Detailed knowledge of various browser capabilities, technologies, and good web design practices.
- Comfortable both architecting and implementing solutions through a team.
- Understanding of the fundamentals behind a scalable application.
- Familiar with various design and architectural patterns.
- Fluent with modern DevOps patterns.
- Strong communication and collaboration skills.
- Ability to uphold WLT values.
Experience:
- 10+ years of hands-on experience building dynamic web applications using .NET C#, JavaScript, CSS, and Web APIs
- Experience with JavaScript front end frameworks such as Angular or Svelte
- Strong mentoring and interpersonal skills are required
- Experience with working on an agile development team
- Good understanding of databases, tools and techniques used for object to relational mapping, experience in performance tuning. Experience in technologies such as Microsoft SQL Server, SQL Azure, Entity Framework, other ORMs, and non-relational data stores.
- Experience integrating off-the-shelf solutions, understand build vs. buy decisions
- Experience with Azure DevOps, GIT, and Visual Studio for task and source code management – including CI and GIT branching strategies
- Experience with Microsoft Azure or similar cloud platforms
- Proficient in object-oriented design and development. Knowledge of common architectural patterns, SOLID principles, OWASP Top Ten, and industry-accepted best practices.
- Experience with education technology and Learning Management Systems (LMSs) a plus
Education or Certification:
- Bachelor’s degree in computer science, software development or equivalent experience.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/1lunY?source=CareerSite
Explore our Career Page for more such jobs : careers.infraveo.com




We are seeking an Application Developer/Software Engineer with strong technical experience in all phases of the software development life cycle (SDLC) with a demonstrated technical expertise in one or more areas of state-of-the-art software development technology.
Responsibilities:
- Provides activities related to enterprise full life-cycle software development projects.
- Develops detailed functional and technical requirements for client-server and web software applications.
- Conduct detailed analyses and module-level specification development of software requirements.
- Define and implement high-performance and highly scalable product/application architectures and lead operational, tactical, and strategic integration activities.
- Perform complex programming and analysis for web and mobile applications and ETL processing; define requirements; write program specifications; design, code, test, and debug programming assignments; document programs.
- Supervise the efforts of other developers in major system development projects; determine and analyze functional requirements; determine proposed solutions information processing requirements; and optimize system performance.
- Work tasks could include total custom development, customization as needed for COTS, report development, data conversion, and support of legacy applications.
Requirements:
- Can code at an intermediate or expert level in applications such as C#, ASP.NET, .NET Core, SQL, Python, Java, React, TypeScript, CSS/JavaScript, Git, Azure, Knockout, MarkLogic, ORACLE, etc.
- 4+ years’ experience or specific educational background sufficient to demonstrate competency with Microsoft technology, including ASP.
- Experience with Artificial Intelligence (AI)/Machine Learning (ML), SharePoint.
- knowledge of HTML, XHTML, XML, XSLT, .NET Framework, Visual Studio, JavaScript.
- 4+ years with Cloud technologies such as Azure / AWS / Google Cloud.
- Proficient with appropriate programming languages, particularly ASP.NET and modern web frameworks like React.
- Comfortable with Object Oriented Programming and Software Patterns.
- Excellent interpersonal skills.
- High motivation and ability to work with teams to meet project objectives.
- Ability to work on multiple projects simultaneously.
- Ability to meet project deadlines and goals without management supervision.
- Awareness of database design concepts and proficiency in a general cloud environment.
Educational Requirements:
- BS in a field related to computer science or information systems, or advanced degree, or additional specific training and/or certification in 4th generation computing language.
- Must be able to define and implement high-performance and highly scalable product/application architectures, and able to lead integration activities for operational, tactical, and strategic systems.
- Able to develop detailed functional and technical requirements for client-server and web software applications and conduct detailed analyses and module-level specification development of software requirements.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link : https://zrec.in/RlUkC?source=CareerSite
Explore our Career Page for more such jobs : careers.infraveo.com
You will be working hands-on on a complex and compound product that has the potential to be used by millions of sales and marketing people around the world. You will contribute to delivering an excellent product platform that:
- enables quick iteration
- supports product customization
- and handles scale
What do we expect you to have?
- 2+ years of experience in backend engineering
- An intent to learn and an urge to build a product by learning different technologies
- Interest in writing complex, scalable, and maintainable backend applications
- Tech stack requirements:
Must haves
- Experience in building application server in Java (Spring / Spring boot) / NodeJS / Golang / Python
- Experience in using SQL databases and designing schemas based on application needs (see the sketch after this list)
- Experience with container services and runtimes (docker / docker-compose / k8s)
- Experience with cloud paas (AWS / GCP / Azure cloud)
- Experience and familiarity with microservices’ concepts
- Experience with bash scripting
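As a small, hypothetical illustration of schema design in Python (one of the stacks listed above), a SQLAlchemy declarative sketch; the tables and columns are invented for the example:

    # Two related tables expressed as SQLAlchemy models, created in an
    # in-memory SQLite database purely for the sketch.
    from sqlalchemy import (Column, DateTime, ForeignKey, Integer, String,
                            create_engine, func)
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    class Account(Base):
        __tablename__ = "accounts"
        id = Column(Integer, primary_key=True)
        name = Column(String(255), nullable=False, unique=True)
        created_at = Column(DateTime, server_default=func.now())
        users = relationship("User", back_populates="account")

    class User(Base):
        __tablename__ = "users"
        id = Column(Integer, primary_key=True)
        account_id = Column(Integer, ForeignKey("accounts.id"), nullable=False, index=True)
        email = Column(String(255), nullable=False, unique=True)
        account = relationship("Account", back_populates="users")

    engine = create_engine("sqlite:///:memory:")
    Base.metadata.create_all(engine)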
Good to have (Preferred)
- Preferred experience with org wide message queue (rabbitmq / aws sqs)
- Preferred experience with task orchestration services (apache airflow / aws step function)
- Preferred experience with infra as code (or system configuration) tools (terraform / chef / ansible)
- Preferred experience with build essential tools (make / makefile)
- Preferred experience with monitoring and tracing systems for performance / system / application monitoring (grafana + loki + prometheus / aws cloudwatch)
What will you learn?
- Building a highly available, complex, compound, performant microservices platform that acts as an API layer
- Industry-standard state-of-the-art tools + methodologies + frameworks + infra for building a product.
- Fable is not a trivial CRUD app. It requires a lot of consideration and care for building the API layer as the product is highly customizable per user.
- How different functions (sales, marketing, product, engineering) in a high-velocity product company work in synergy to deliver an iterative product in real life.
Who would you be working with?
- You would be directly working with the co-founder & CTO who has built multiple companies before and has built large teams in large-scale companies like ThoughtSpot, Unacademy, etc.
Position details
- Fully remote.
- 5 days/week (all public and government holidays will be non-working days).
- No specific work hours (we will sync over zoom over the course of the day).