4+ Distributed computing Jobs in India

Position: Software Engineer (Java Backend Engineer)
Experience: 4+ Years
Location: Bangalore, India (Hybrid)
Mandatory Skills: Java 8+ (advanced features), Spring Boot, Apache Spark (Spark Streaming), SQL & Cosmos DB, Git, Maven, CI/CD (Jenkins, GitHub), Azure Cloud, Agile Scrum.
About the Role :
We are seeking a highly skilled Backend Engineer with expertise in Java, Spark, and microservices architecture to join our dynamic team. The ideal candidate will have a strong background in object-oriented programming, experience with Spark Streaming, and a deep understanding of distributed systems and cloud technologies.
Key Responsibilities :
- Design, develop, and maintain highly scalable microservices and optimized RESTful APIs using Spring Boot and Java 8+.
- Implement and optimize Spark Streaming applications for real-time data processing.
- Utilize advanced Java 8 features, including:
  - Functional interfaces and lambda expressions
  - Streams and parallel streams
  - CompletableFuture and Concurrency API improvements
  - Enhanced Collections APIs
- Work with relational (SQL) and NoSQL (Cosmos DB) databases, ensuring efficient data modeling and retrieval.
- Develop and manage CI/CD pipelines using Jenkins, GitHub, and related automation tools.
- Collaborate with cross-functional teams, including Product, Business, and Automation, to deliver end-to-end product features.
- Ensure adherence to Agile Scrum practices and participate in code reviews to maintain high-quality standards.
- Deploy and manage applications in Azure Cloud environments.
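The Java 8 features listed in the responsibilities above can be illustrated with a short sketch. The class and data below are purely illustrative, not part of any product codebase:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Toy sketch of the Java 8 features named above.
public class Java8FeaturesDemo {

    // Functional interface + lambda: a reusable, named predicate.
    static final Predicate<Integer> IS_EVEN = n -> n % 2 == 0;

    // Streams API: filter, map, and collect in one declarative pipeline.
    static List<Integer> squaresOfEvens(List<Integer> nums) {
        return nums.stream()
                   .filter(IS_EVEN)
                   .map(n -> n * n)
                   .collect(Collectors.toList());
    }

    // Parallel stream: the same pipeline, spread across worker threads.
    static long countEvens(List<Integer> nums) {
        return nums.parallelStream().filter(IS_EVEN).count();
    }

    // CompletableFuture: run a computation asynchronously without blocking.
    static CompletableFuture<Integer> asyncSum(List<Integer> nums) {
        return CompletableFuture.supplyAsync(
                () -> nums.stream().mapToInt(Integer::intValue).sum());
    }

    public static void main(String[] args) throws Exception {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5, 6);
        System.out.println(squaresOfEvens(nums)); // [4, 16, 36]
        System.out.println(countEvens(nums));     // 3
        System.out.println(asyncSum(nums).get()); // 21
    }
}
```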
Minimum Qualifications:
- BS/MS in Computer Science or a related field.
- 4+ Years of experience developing backend applications with Spring Boot and Java 8+.
- 3+ Years of hands-on experience with Git for version control.
- Strong understanding of software design patterns and distributed computing principles.
- Experience with Maven for building and deploying artifacts.
- Proven ability to work in Agile Scrum environments with a collaborative team mindset.
- Prior experience with Azure Cloud Technologies.


- Drive the architecture and design of large-scale, multi-tiered, distributed software applications, tools, systems, and services using object-oriented design, distributed programming, and Java/Node/Python/.NET.
- Hands-on experience delivering high-quality software products.
- Gather business and functional requirements from external and/or internal customers and end-users, and translate requirements into technical specifications to build robust, scalable, supportable solutions that work well across a range of complex systems.
- Serve as technical lead throughout the full development lifecycle, end-to-end, from scoping, planning, conception, design, implementation and testing, to documentation, delivery and maintenance.
- Provide design reviews for other engineers, including feedback on architecture and design issues, as well as integration, performance and scalability.
- Manage resources on multiple technical projects and ensure schedules, milestones, and priorities are compatible with technology and business goals.




As a Lead Solutions Architect at Aganitha, you will:
* Engage and co-innovate with customers in BioPharma R&D
* Design and oversee implementation of solutions for BioPharma R&D
* Manage Engineering teams using Agile methodologies
* Enhance reuse with platforms, frameworks and libraries
Applicants must have demonstrated expertise in the following areas:
1. App dev with modern tech stacks of Python, ReactJS, and fit for purpose database technologies
2. Big data engineering with distributed computing frameworks
3. Data modeling in scientific domains, preferably in one or more of: Genomics, Proteomics, Antibody engineering, Biological/Chemical synthesis and formulation, Clinical trials management
4. Cloud and DevOps automation
5. Machine learning and AI (Deep learning)

We are looking for a Data Engineer who will be responsible for collecting, storing, processing, and analyzing large volumes of data coming from different sources.
Responsibilities
- Working with Big Data tools and frameworks to provide requested capabilities
- Identifying development needs in order to improve and streamline operations
- Developing and managing BI solutions
- Implementing ETL processes and data warehousing
- Monitoring performance and managing infrastructure
Skills
- Proficient understanding of distributed computing principles
- Proficiency with Hadoop and Spark
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming
- Good knowledge of data querying tools such as SQL and Hive
- Knowledge of various ETL techniques and frameworks
- Experience with Python/Java/Scala (at least one)
- Experience with cloud services such as AWS or GCP
- Experience with NoSQL databases such as DynamoDB or MongoDB is an advantage
- Excellent written and verbal communication skills
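The stream-processing skill above centers on one idea: events arrive in small batches and per-key state is updated incrementally, which is the model behind Spark Streaming's micro-batches and Kafka consumer polls. A minimal framework-free sketch of that idea, with illustrative names (`MicroBatchCounter` is not from any library):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy micro-batch aggregator: each call to processBatch plays the role of
// one Spark Streaming micro-batch (or one poll from a Kafka topic), and the
// counts map is the running per-key state.
public class MicroBatchCounter {
    private final Map<String, Long> counts = new HashMap<>();

    // Fold one micro-batch of event keys into the running counts.
    public void processBatch(List<String> batch) {
        for (String key : batch) {
            counts.merge(key, 1L, Long::sum);
        }
    }

    // Current running count for a key (0 if never seen).
    public long countFor(String key) {
        return counts.getOrDefault(key, 0L);
    }

    public static void main(String[] args) {
        MicroBatchCounter counter = new MicroBatchCounter();
        counter.processBatch(Arrays.asList("click", "view", "click"));
        counter.processBatch(Arrays.asList("view", "click"));
        System.out.println(counter.countFor("click")); // 3
        System.out.println(counter.countFor("view"));  // 2
    }
}
```

In a real pipeline the batches would come from a Kafka consumer or a Spark DStream, and the state would live in a checkpointed store rather than an in-memory map.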