Google Cloud Platform (GCP) Jobs in Chennai

50+ Google Cloud Platform (GCP) Jobs in Chennai | Google Cloud Platform (GCP) Job openings in Chennai

Apply to 50+ Google Cloud Platform (GCP) Jobs in Chennai on CutShort.io. Explore the latest Google Cloud Platform (GCP) Job opportunities across top companies like Google, Amazon & Adobe.

Fractal Analytics

at Fractal Analytics

5 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Hyderabad, Gurugram, Noida, Pune, Mumbai, Chennai, Coimbatore
4yrs+
Best in industry
Generative AI
Machine Learning (ML)
LLMOps
Large Language Models (LLM) tuning
Open-source LLMs
+15 more

Role description:

You will build curated, enterprise-grade solutions for deploying GenAI applications at production scale for clients. The role requires a solid understanding of, and hands-on development and engineering skills for, the full GenAI application lifecycle: data ingestion, choosing the right-fit LLM, simple and advanced RAG, guardrails, prompt engineering for optimisation, traceability, security, LLM evaluation, observability, and deployment at scale on cloud or on-premise. As this space evolves rapidly, candidates must also demonstrate knowledge of agentic AI frameworks. A strong ML background combined with engineering skills is highly preferred for this LLMOps role.


Required skills:

  • 4-8 years of experience working on ML projects, including business requirement gathering, model development, training, deployment at scale, and monitoring model performance for production use cases
  • Strong knowledge of Python, NLP, data engineering, LangChain, Langtrace, Langfuse, RAGAS, AgentOps (optional)
  • Should have worked with both proprietary and open-source large language models
  • Experience with LLM fine-tuning and creating distilled models from hosted LLMs
  • Experience building data pipelines for model training
  • Experience with model performance tuning, RAG, guardrails, prompt engineering, evaluation, and observability
  • Experience deploying GenAI applications on cloud and on-premise at production scale
  • Experience creating CI/CD pipelines
  • Working knowledge of Kubernetes
  • Experience with at least one cloud (AWS / GCP / Azure) for deploying AI services
  • Experience creating workable prototypes using agentic AI frameworks such as CrewAI, TaskWeaver, AutoGen
  • Experience with lightweight UI development using Streamlit or Chainlit (optional)
  • Desired: experience with open-source tools for ML development, deployment, observability, and integration
  • A background in DevOps and MLOps is a plus
  • Experience with collaborative code-versioning tools such as GitHub/GitLab
  • Team player with good communication and presentation skills
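For context on the RAG work called out above: the core of any RAG pipeline is a retrieval step that ranks stored chunks against the user's query before the LLM is prompted. The sketch below is a deliberately minimal stand-in — it uses a toy bag-of-words "embedding" and cosine similarity, where a production system would use a real embedding model and a vector store; all document strings here are illustrative:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding': token counts (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Kubernetes deploys containers at scale",
    "RAG augments an LLM prompt with retrieved context",
    "Guardrails validate LLM inputs and outputs",
]
top = retrieve("how does RAG add context to an LLM prompt", docs, k=1)
# The retrieved chunk would then be concatenated into the LLM prompt.
```

Advanced RAG, guardrails, and evaluation all build on this loop: re-ranking the retrieved set, validating inputs/outputs, and scoring answers against the retrieved context.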
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Chennai, Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹26L / yr
Google Cloud Platform (GCP)
bigquery
Data modeling
Snowflake schema
OLTP
+1 more

1. GCP – GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, BQ optimization, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP Cloud – building pipelines (Python/Java) plus scripting, best practices, and common challenges

3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP

4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs NoSQL; types of NoSQL DBs (at least 2 databases)

5. Data warehouse concepts – beginner to intermediate level

6. Data modeling, GCP databases, DBSchema (or similar)

7. Hands-on data modelling for OLTP and OLAP systems

8. In-depth knowledge of conceptual, logical, and physical data modelling

9. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same

10. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction

11. Working experience with at least one data modelling tool, preferably DBSchema or Erwin

12. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery

13. Functional knowledge of the mutual fund industry is a plus

Should be willing to work from Chennai; office presence is mandatory.


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Design and implement dimension and fact tables

● Identify and implement data transformation/cleansing requirements

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse

● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions

● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform data for reporting & analytics.

● Define and document the use of BI through user experiences/use cases and prototypes; test and deploy BI solutions.

● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.
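The dimension/fact modeling described above is the classic star-schema pattern: descriptive attributes live in dimension tables, measures in a fact table keyed to them. A tiny illustrative sketch using Python's built-in sqlite3 (all table and column names are invented for the example; a real warehouse would be BigQuery or Teradata):

```python
import sqlite3

# In-memory star schema: one dimension (product) and one fact (sales) table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        quantity   INTEGER,
        amount     REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)", [
    (1, "Widget", "Hardware"),
    (2, "Gadget", "Hardware"),
    (3, "License", "Software"),
])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", [
    (1, 1, 10, 100.0),
    (2, 2, 5, 250.0),
    (3, 3, 1, 999.0),
])

# A typical warehouse rollup: revenue grouped by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
# rows -> [('Hardware', 350.0), ('Software', 999.0)]
```

The same join-and-aggregate shape is what BI dashboards issue against the warehouse; partitioning and indexing choices (item 9 above) determine how well it scales.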

Pluginlive

at Pluginlive

1 recruiter
Harsha Saggi
Posted by Harsha Saggi
Chennai
2 - 4 yrs
₹15L - ₹20L / yr
Data engineering
Python
SQL
Google Cloud Platform (GCP)
Amazon Web Services (AWS)

What you’ll do

  • Design, build, and maintain robust ETL/ELT pipelines for product and analytics data
  • Work closely with business, product, analytics, and ML teams to define data needs
  • Ensure high data quality, lineage, versioning, and observability
  • Optimize performance of batch and streaming jobs
  • Automate and scale ingestion, transformation, and monitoring workflows
  • Document data models and key business metrics in a self-serve way
  • Use AI tools to accelerate development, troubleshooting, and documentation
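The ETL/ELT pipelines described above decompose into extract, transform, and load stages, each independently testable. A minimal pure-Python sketch of that decomposition (in practice the orchestration would live in Airflow or Prefect; the record shapes and the dead-letter handling here are illustrative assumptions):

```python
def extract() -> list[dict]:
    """Pull raw records (stand-in for an API call or database read)."""
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "oops"},   # malformed record
        {"user": "a", "amount": "4.5"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and drop records that fail validation."""
    clean = []
    for r in rows:
        try:
            clean.append({"user": r["user"], "amount": float(r["amount"])})
        except ValueError:
            continue  # in production: route to a dead-letter queue and alert
    return clean

def load(rows: list[dict], warehouse: dict) -> None:
    """Aggregate into the target 'warehouse' table keyed by user."""
    for r in rows:
        warehouse[r["user"]] = warehouse.get(r["user"], 0.0) + r["amount"]

warehouse: dict = {}
load(transform(extract()), warehouse)
# warehouse -> {"a": 15.0}; the bad record was dropped during transform
```

Keeping each stage a pure function is what makes the data-quality and observability requirements above tractable: you can assert on the output of every stage.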


Must-Haves:

  • 2–4 years of experience as a data engineer (product or analytics-focused preferred)
  • Solid hands-on experience with Python and SQL
  • Experience with data pipeline orchestration tools like Airflow or Prefect
  • Understanding of data modeling, warehousing concepts, and performance optimization
  • Familiarity with cloud platforms (GCP, AWS, or Azure)
  • Bachelor's in Computer Science, Data Engineering, or a related field
  • Strong problem-solving mindset and comfort with AI-native tooling (Copilot, GPTs)


NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Kolkata
8 - 15 yrs
₹25L - ₹45L / yr
Java
Spring Boot
Microservices
Leadership
Team leadership
+11 more

Job Title : Lead Java Developer (Backend)

Experience Required : 8 to 15 Years

Open Positions : 5

Location : Any major metro city (Bengaluru, Pune, Chennai, Kolkata, Hyderabad)

Work Mode : Open to Remote / Hybrid / Onsite

Notice Period : Immediate Joiner/30 Days or Less


About the Role :

  • We are looking for experienced Lead Java Developers who bring not only strong backend development skills but also a product-oriented mindset and leadership capability.
  • This is an opportunity to be part of high-impact digital transformation initiatives that go beyond writing code—you’ll help shape future-ready platforms and drive meaningful change.
  • This role is embedded within a forward-thinking digital engineering team that thrives on co-innovation, lean delivery, and end-to-end ownership of platforms and products.


Key Responsibilities :

  • Design, develop, and implement scalable backend systems using Java and Spring Boot.
  • Collaborate with product managers, designers, and engineers to build intuitive and reliable digital products.
  • Advocate and implement engineering best practices: SOLID principles, OOP, clean code, CI/CD, TDD/BDD.
  • Lead Agile-based development cycles with a focus on speed, quality, and customer outcomes.
  • Guide and mentor team members, fostering technical excellence and ownership.
  • Utilize cloud platforms and DevOps tools to ensure performance and reliability of applications.

What We’re Looking For :

  • Proven experience in Java backend development (Spring Boot, Microservices).
  • 8+ Years of hands-on engineering experience with at least 2+ years in a Lead role.
  • Familiarity with cloud platforms such as AWS, Azure, or GCP.
  • Good understanding of containerization and orchestration tools like Docker and Kubernetes.
  • Exposure to DevOps and Infrastructure as Code practices.
  • Strong problem-solving skills and the ability to design solutions from first principles.
  • Prior experience in product-based or startup environments is a big plus.

Ideal Candidate Profile :

  • A tech enthusiast with a passion for clean code and scalable architecture.
  • Someone who thrives in collaborative, transparent, and feedback-driven environments.
  • A leader who takes ownership beyond individual deliverables to drive overall team and project success.

Interview Process

  1. Initial Technical Screening (via platform partner)
  2. Technical Interview with Engineering Team
  3. Client-facing Final Round

Additional Info :

  • Targeting profiles from product/startup backgrounds.
  • Strong preference for candidates with under 1 month of notice period.
  • Interviews will be fast-tracked for qualified profiles.
Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Chennai, Bhopal, Jaipur
10 - 15 yrs
₹30L - ₹40L / yr
Spark
Google Cloud Platform (GCP)
Python
Apache Airflow
PySpark
+1 more

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.


  • Shift: 2 PM – 11 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or those with a notice period of up to 30 days


Key Responsibilities:

  • Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
  • Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers.
  • Ensure data integrity, consistency, and availability across all systems.
  • Collaborate with data engineers, analysts, and stakeholders to optimize performance.
  • Document standards and best practices for data engineering workflows.

Required Experience:


  • 7-8 years of experience in data engineering, architecture, and pipeline development.
  • Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
  • Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
  • Understanding of Data Lake table formats (Delta, Iceberg, etc.).
  • Proficiency in Python for scripting and automation.
  • Strong problem-solving skills and collaborative mindset.
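The Raw → Silver → Gold layering in the responsibilities above is the medallion pattern: Raw/Bronze keeps data as ingested, Silver deduplicates and normalizes, Gold serves business aggregates. In this stack the promotion steps would be PySpark jobs scheduled by Airflow; the plain-Python stand-in below just shows the shape of the two transformations (field names are invented for illustration):

```python
raw = [  # Bronze/Raw layer: ingested as-is, duplicates and nulls included
    {"id": 1, "city": " Chennai ", "temp_c": "31"},
    {"id": 1, "city": " Chennai ", "temp_c": "31"},   # duplicate record
    {"id": 2, "city": "Pune", "temp_c": None},        # null measure
    {"id": 3, "city": "Pune", "temp_c": "27"},
]

def to_silver(rows: list[dict]) -> list[dict]:
    """Silver: deduplicate on id, drop nulls, normalize types and whitespace."""
    seen, out = set(), []
    for r in rows:
        if r["id"] in seen or r["temp_c"] is None:
            continue
        seen.add(r["id"])
        out.append({"id": r["id"], "city": r["city"].strip(),
                    "temp_c": float(r["temp_c"])})
    return out

def to_gold(rows: list[dict]) -> dict:
    """Gold: business-level aggregate, e.g. mean temperature per city."""
    sums: dict = {}
    for r in rows:
        total, n = sums.get(r["city"], (0.0, 0))
        sums[r["city"]] = (total + r["temp_c"], n + 1)
    return {city: total / n for city, (total, n) in sums.items()}

gold = to_gold(to_silver(raw))
# gold -> {"Chennai": 31.0, "Pune": 27.0}
```

Table formats like Delta or Iceberg (mentioned above) make each layer's tables transactional, so the Silver and Gold promotions can be re-run idempotently.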


⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

Moative

at Moative

3 candid answers
Eman Khan
Posted by Eman Khan
Chennai
1 - 3 yrs
₹9L - ₹20L / yr
Generative AI
AI Agents
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)
tuning
+18 more

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Work you’ll do

As an AI Engineer at Moative, you will be at the forefront of applying cutting-edge AI to solve real-world problems. You will be instrumental in designing and developing intelligent software solutions, leveraging the power of foundation models to automate and optimize critical workflows. Collaborating closely with domain experts, data scientists, and ML engineers, you will integrate advanced ML and AI technologies into both existing and new systems. This role offers a unique opportunity to explore innovative ideas, experiment with the latest foundation models, and build impactful products that directly enhance the lives of citizens by transforming how government services are delivered. You'll be working on challenging and impactful projects that move the needle on traditionally difficult-to-automate processes.


Responsibilities

  • Utilize and adapt foundation models, particularly in vision and data extraction, as the core building blocks for developing impactful products aimed at improving government service delivery. This includes prompt engineering, fine-tuning, and evaluating model performance
  • Architect, build, and deploy intelligent AI agent-driven workflows that automate and optimize key processes within government service delivery. This encompasses the full lifecycle from conceptualization and design to implementation and monitoring
  • Contribute directly to enhancing our model evaluation and monitoring methodologies to ensure robust and reliable system performance. Proactively identify areas for improvement and implement solutions to optimize model accuracy and efficiency
  • Continuously learn and adapt to the rapidly evolving landscape of AI and foundation models, exploring new techniques and technologies to enhance our capabilities and solutions


Who you are

You are a passionate and results-oriented engineer who is driven by the potential of AI/ML to revolutionize processes, enhance products, and ultimately improve user experiences. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration. 


You are adept at working with real-world, often imperfect data, and have a proven ability to develop, refine, and deploy AI/ML models into production in a cost-effective and scalable manner. You are excited by the prospect of directly impacting government services and making a positive difference in the lives of citizens.


Skills & Requirements

  • 3+ years of experience in programming languages such as Python or Scala
  • Proficient knowledge of cloud platforms (e.g., AWS, Azure, GCP) and containerization, DevOps (Docker, Kubernetes)
  • Tuning and deploying foundation models, particularly for vision tasks and data extraction
  • Excellent analytical and problem-solving skills with the ability to break down complex challenges into actionable steps
  • Strong written and verbal communication skills, with the ability to effectively articulate technical concepts to both technical and non-technical audiences


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless.


Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps on purpose, unless we constantly question it. We are deliberate about committing to rituals that take time away from the actual work. We truly believe that a meeting that could be an email, should be an email and you don’t need a person with the highest title to say that out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work is of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.  


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers. 


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Wekan Enterprise Solutions

at Wekan Enterprise Solutions

2 candid answers
Deepak  N
Posted by Deepak N
Bengaluru (Bangalore), Chennai
12 - 22 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Microservices
Javascript
TypeScript
+3 more

Architect


Experience - 12+ yrs


About Wekan Enterprise Solutions


Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT, and Cloud environments, we have an extensive track record of helping Fortune 500 companies modernize their most critical legacy and on-premise applications, migrating them to the cloud and leveraging the most cutting-edge technologies.

 

Job Description

We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced, challenging environments: leading technical teams, designing system architecture, and reviewing peer code, while constantly upskilling, learning new technologies, and expanding their domain knowledge into new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?

You will be working on complex data migrations, modernizing legacy applications, and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams, and MongoDB Solutions Architects.

Location - Chennai or Bangalore


●     Relevant experience of 12+ years building high-performance applications with at least 3+ years as an architect.

●     Good problem-solving skills

●     Strong mentoring capabilities

●     Good understanding of software development life cycle

●     Strong experience in system design and architecture

●     Strong focus on quality of work delivered

●     Excellent verbal and written communication skills

 

Required Technical Skills

 

● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.

● Strong experience with appropriate framework(s).

● Well-versed in monolithic and microservices architecture.

● Hands-on experience with data modeling on MongoDB and any other Relational or NoSQL databases

● Experience working with 3rd party integrations ranging from authentication, cloud services, etc.

● Hands-on experience with Kafka or RabbitMQ.

● Hands-on experience with CI/CD pipelines and at least one cloud provider: AWS / GCP / Azure

● Strong experience writing and maintaining clear documentation

  

Good to have skills:

 

●     Experience working with frontend technologies - React.js, Vue.js, or Angular.

●     Extensive experience consulting with customers directly for defining architecture or system design.

●     Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies

Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Jaipur, Bhopal, Gurugram
5 - 11 yrs
₹30L - ₹40L / yr
Scala
Microservices
CI/CD
DevOps
Amazon Web Services (AWS)
+2 more

Dear,


We are excited to inform you about an exclusive opportunity at Xebia for a Senior Backend Engineer role.


📌 Job Details:

  • Role: Senior Backend Engineer
  •  Shift: 1 PM – 10 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or up to 30 days


🔹 Job Responsibilities:


✅ Design and develop scalable, reliable, and maintainable backend solutions

✅ Work on event-driven microservices architecture

✅ Implement REST APIs and optimize backend performance

✅ Collaborate with cross-functional teams to drive innovation

✅ Mentor junior and mid-level engineers


🔹 Required Skills:


✔ Backend Development: Scala (preferred), Java, Kotlin

✔ Cloud: AWS or GCP

✔ Databases: MySQL, NoSQL (Cassandra)

✔ DevOps & CI/CD: Jenkins, Terraform, Infrastructure as Code

✔ Messaging & Caching: Kafka, RabbitMQ, Elasticsearch

✔ Agile Methodologies: Scrum, Kanban


⚠ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response! Also, feel free to refer anyone in your network who might be a good fit.


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad
5 - 10 yrs
₹5L - ₹10L / yr
Google Cloud Platform (GCP)
Teradata
ETL
Data Warehousing

Overview:

We are seeking a talented and experienced GCP Data Engineer with strong expertise in Teradata, ETL, and Data Warehousing to join our team. As a key member of our Data Engineering team, you will play a critical role in developing and maintaining data pipelines, optimizing ETL processes, and managing large-scale data warehouses on the Google Cloud Platform (GCP).

Responsibilities:

  • Design, implement, and maintain scalable ETL pipelines on GCP (Google Cloud Platform).
  • Develop and manage data warehouse solutions using Teradata and cloud-based technologies (BigQuery, Cloud Storage, etc.).
  • Build and optimize high-performance data pipelines for real-time and batch data processing.
  • Integrate, transform, and load large datasets into GCP-based data lakes and data warehouses.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Write efficient, clean, and reusable code for ETL processes and data workflows.
  • Ensure data quality, consistency, and integrity across all pipelines and storage solutions.
  • Implement data governance practices and ensure security and compliance of data processes.
  • Monitor and troubleshoot data pipeline performance and resolve issues proactively.
  • Participate in the design and implementation of scalable data architectures using GCP services like BigQuery, Cloud Dataflow, and Cloud Pub/Sub.
  • Optimize and automate data workflows for continuous improvement.
  • Maintain up-to-date documentation of data pipeline architectures and processes.
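The real-time versus batch distinction in the responsibilities above usually comes down to windowing: a streaming engine such as Dataflow assigns each event to a time window and aggregates per window. A plain-Python stand-in for a tumbling (fixed) window aggregation — the timestamps, values, and 10-second window size are all illustrative:

```python
def fixed_windows(events: list[tuple[float, int]], window_s: float) -> dict:
    """Assign (timestamp, value) events to fixed windows and sum per window,
    mimicking a streaming engine's tumbling-window aggregation."""
    out: dict = {}
    for ts, value in events:
        window_start = int(ts // window_s) * window_s  # window the event falls in
        out[window_start] = out.get(window_start, 0) + value
    return out

# Events as (epoch-seconds, value) pairs, e.g. messages read off Pub/Sub.
events = [(0.5, 1), (1.2, 2), (9.9, 3), (10.0, 4), (19.9, 5)]
agg = fixed_windows(events, window_s=10.0)
# agg -> {0.0: 6, 10.0: 9}
```

In a batch pipeline the same aggregation runs once over a bounded dataset; in streaming it is emitted incrementally as windows close, which is the trade-off this role weighs per use case.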

Requirements:

Technical Skills:

  • Google Cloud Platform (GCP): Extensive experience with BigQuery, Cloud Storage, Cloud Dataflow, and Cloud Composer.
  • ETL Tools: Expertise in building ETL pipelines using tools such as Apache NiFi, Apache Beam, or custom Python-based scripts.
  • Data Warehousing: Strong experience working with Teradata for data warehousing, including data modeling, schema design, and performance tuning.
  • SQL: Advanced proficiency in SQL and relational databases, particularly in the context of Teradata and GCP environments.
  • Programming: Proficient in Python, Java, or Scala for building and automating data processes.
  • Data Architecture: Knowledge of best practices in designing scalable data architectures for both structured and unstructured data.

Experience:

  • Proven experience as a Data Engineer, with a focus on building and managing ETL pipelines and data warehouse solutions.
  • Hands-on experience in data modeling and working with complex, high-volume data in a cloud-based environment.
  • Experience with data migration from on-premises to cloud environments (Teradata to GCP).
  • Familiarity with Data Lake concepts and technologies.
  • Experience with version control systems like Git and working in Agile environments.
  • Knowledge of CI/CD and automation processes in data engineering.

Soft Skills:

  • Strong problem-solving and troubleshooting skills.
  • Excellent communication skills, both verbal and written, for interacting with technical and non-technical teams.
  • Ability to work collaboratively in a fast-paced, cross-functional team environment.
  • Strong attention to detail and ability to prioritize tasks.

Preferred Qualifications:

  • Experience with other GCP tools such as Dataproc, Bigtable, Cloud Functions.
  • Knowledge of Terraform or similar infrastructure-as-code tools for managing cloud resources.
  • Familiarity with data governance frameworks and data privacy regulations.
  • Certifications in Google Cloud or Teradata are a plus.

Benefits:

  • Competitive salary and performance-based bonuses.
  • Health, dental, and vision insurance.
  • 401(k) with company matching.
  • Paid time off and flexible work schedules.
  • Opportunities for professional growth and development.


Xerago

at Xerago

6 recruiters
Sridhar S
Posted by Sridhar S
Chennai, Mumbai
8 - 12 yrs
₹15L - ₹30L / yr
Solution architecture
Migration
Implementation
unica campaign
Google Cloud Platform (GCP)
+1 more

Job Summary:

We are seeking a highly experienced Unica Architect with 8+ years of expertise in designing, implementing, and optimizing Unica solutions. The ideal candidate will play a key role in architecting scalable and high-performing Unica-based marketing automation platforms, integrating Unica with enterprise systems, and driving best practices in campaign management, data strategy, and customer engagement.

Key Responsibilities:

Solution Architecture & Design:

·      Design and develop end-to-end Unica Campaign, Unica Interact, Unica Journey, and Unica Plan solutions.

·      Define best practices for campaign orchestration, segmentation, and personalization using Unica.

·      Architect scalable Unica environments ensuring high performance, reliability, and security.

·      Collaborate with business stakeholders, marketing teams, and IT teams to understand requirements and translate them into Unica-based solutions.


Integration & Data Strategy:

·      Design seamless integration between Unica and CRM, data warehouses, digital platforms, and real-time decision systems.

·      Optimize ETL processes, audience segmentation strategies, and data flows for effective campaign execution.

·      Ensure proper data governance, compliance (GDPR, CCPA), and security policies within Unica solutions.


Unica Administration & Performance Optimization:

·      Lead the installation, configuration, and maintenance of Unica applications.

·      Optimize system performance, troubleshoot issues, and ensure high availability.

·      Manage application upgrades, patches, and environment migrations for Unica platforms.

·      Maintain the overall Unica system and perform regular housekeeping in an automated way.

·      Ensure all the integrated surrounding systems are functioning as expected.

·      Collaborate with relevant stakeholders on the client side to maintain the environment's stability.

·      Configure notifications so that relevant stakeholders are made aware of any system or functional failure.

Campaign Strategy & Execution Support:

·      Guide campaign teams in setting up and executing complex multi-channel marketing campaigns.

·      Define audience segmentation models, personalization strategies, and A/B testing frameworks.

·      Ensure seamless customer journey automation and omnichannel engagement.

Leadership & Stakeholder Management:

·      Act as the subject matter expert (SME) and advisor for Unica implementations.

·      Mentor and train junior Unica developers and campaign specialists.

·      Engage with business leaders, marketing teams, and IT teams to drive Unica adoption and roadmap.

Required Skills & Experience:

·      8+ years of experience in HCL Unica Platform, Campaign, Plan, Interact, Journey, Deliver, Link, and related modules.

·      Deep understanding of campaign management, audience segmentation, and personalization.

·      Expertise in SQL, database design, and ETL processes for Unica.

·      Hands-on experience in Unica upgrades, migrations, and performance tuning.

·      Strong knowledge of REST APIs, SOAP APIs, and integrations with third-party tools (CRM, CDP, Web Analytics).

·      Experience in cloud deployments (AWS, Azure, GCP) for Unica.

·      Experience with data integration tools and techniques (e.g., ETL, APIs, data lakes).

·      Familiarity with web technologies (HTML, CSS, JavaScript) and CMS platforms.

·      Working knowledge of programming languages such as Java, PHP, and Python. Experience in developing, debugging, and maintaining code in these languages is preferred.

·      Experience with scripting languages like Bash, Shell, or PowerShell for automation and system management tasks.

·      Strong understanding of SQL (Structured Query Language) and experience in working with relational databases such as MySQL, PostgreSQL, or similar, and experience in working with stored procedures.

·      Proficiency in Linux operating systems, networking concepts, and server administration tasks.

·      Excellent problem-solving and analytical skills.

·      Strong communication and interpersonal skills.

·      Ability to work independently and as part of a team.

·      Detail-oriented with strong organizational skills.

 

Read more
TOP MNC

TOP MNC

Agency job
via TCDC by Sheik Noor
Bengaluru (Bangalore), Mangalore, Chennai, Coimbatore, Pune, Mumbai, Kolkata
6 - 10 yrs
₹10L - ₹21L / yr
skill iconJava
Google Cloud Platform (GCP)

Java Developer with GCP

Skills: Java, Spring Boot, GCP, Cloud Storage, BigQuery, RESTful APIs

Experience: SA (6-10 years)

Location: Bangalore, Mangalore, Chennai, Coimbatore, Pune, Mumbai, Kolkata

Notice Period: Immediate to 60 days


Kindly share your updated resume via WA - 91five000260seven

Read more
Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Gurugram, Bhopal, Jaipur
5 - 15 yrs
₹20L - ₹35L / yr
Spark
ETL
Data Transformation Tool (DBT)
skill iconPython
Apache Airflow
+2 more

We are seeking a highly skilled and experienced Offshore Data Engineer. The role involves designing, implementing, and testing data pipelines and products.


Qualifications & Experience:


Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.


5+ years of experience in data engineering, with expertise in data architecture and pipeline development.


☁️ Proven experience with GCP, BigQuery, Databricks, Airflow, Spark, DBT, and related GCP services.


Hands-on experience with ETL processes, SQL, PostgreSQL, MySQL, MongoDB, Cassandra.


Strong proficiency in Python and data modelling.


Experience in testing and validation of data pipelines.


Preferred: Experience with eCommerce systems, data visualization tools (Tableau, Looker), and cloud certifications.


If you meet the above criteria and are interested, please share your updated CV along with the following details:


Total Experience:


Current CTC:


Expected CTC:


Current Location:


Preferred Location:


Notice Period / Last Working Day (if serving notice):


⚠️ Kindly share your details only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!

Read more
Amwhiz
Boomika S
Posted by Boomika S
Chennai
3 - 6 yrs
₹4L - ₹9L / yr
Fullstack Developer
skill iconNodeJS (Node.js)
TypeScript
NestJS
Angular(10+)
+18 more

Full Stack Developer (3+ Years Experience)

Location: Chennai - Work From Office

Job Type: Full-time


About the Role:


We are looking for a highly skilled Full Stack Developer with 3+ years of experience in designing, developing, and maintaining scalable web applications. The ideal candidate should have expertise in Node.js, TypeScript, NestJS, Angular, and databases like MongoDB, DynamoDB, and RDBMS. You will be working on building robust REST APIs, implementing various authentication methods (Cookies, JWT, OAuth, etc.), and deploying applications in AWS & GCP cloud environments using serverless technologies.


Key Responsibilities:


● Develop and maintain scalable backend services using Node.js, TypeScript, and NestJS.

● Design and implement frontend applications using Angular.

● Build and optimize RESTful APIs for high-performance web applications.

● Work with databases (MongoDB, DynamoDB, and RDBMS) to store and retrieve application data efficiently.

● Implement authentication and authorization mechanisms such as JWT, cookies, and OAuth.

● Deploy and manage applications on AWS (Lambda, API Gateway, S3, DynamoDB, IAM, Cognito, etc.) and GCP Cloud Functions.

● Ensure code quality with unit testing, integration testing, and CI/CD pipelines.

● Work on serverless architectures to optimize performance and scalability.

● Collaborate with cross-functional teams to define and develop new features.

● Troubleshoot and debug application issues in both development and production environments.


Required Skills & Qualifications:


● 3+ years of experience as a Full Stack Developer.

● Strong knowledge of Node.js, TypeScript, and NestJS.

● Experience with Angular (Angular 10+ preferred).

● Hands-on experience with MongoDB, DynamoDB, and RDBMS (e.g., PostgreSQL, MySQL).

● Expertise in REST API development and API security best practices.

● Experience with authentication methods (JWT, OAuth, Session Cookies).

● Proficiency in AWS (Lambda, API Gateway, S3, DynamoDB, IAM, Cognito, etc.) and/or GCP Cloud Functions.

● Familiarity with serverless development and microservices architecture.

● Strong knowledge of CI/CD pipelines and automated deployment strategies.

● Understanding of software development best practices, version control (Git), and

Agile methodologies.
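
To make the JWT requirement above concrete, here is a minimal HS256 sign/verify sketch using only hashing and base64url encoding — shown in Python for brevity, whereas a Node/NestJS stack would typically use a library such as `jsonwebtoken` or `@nestjs/jwt`. This is a teaching sketch, not a production implementation (it skips `exp`/`aud` claim checks):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses base64url without '=' padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: reject the token
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-1"}, b"dev-secret")
```

Note the constant-time comparison (`hmac.compare_digest`) — a detail API-security best practices call for when validating signatures.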


Nice to Have:


● Experience with Kubernetes and containerized applications (Docker).

● Understanding of WebSockets and real-time applications.


Benefits:


● Competitive salary based on experience.

● Flexible working hours.

● Learning and development opportunities.

● Collaborative and growth-oriented team environment

Read more
Chennai
5 - 7 yrs
₹15L - ₹25L / yr
Apache Kafka
Google Cloud Platform (GCP)
BCP
DevOps

Job description

 Location: Chennai, India

 Experience: 5+ Years

 Certification: Kafka Certified (Mandatory); Additional Certifications are a Plus


Job Overview:

We are seeking an experienced DevOps Engineer specializing in GCP Cloud Infrastructure Management and Kafka Administration. The ideal candidate should have 5+ years of experience in cloud technologies, Kubernetes, and Kafka, with a mandatory Kafka certification.


Key Responsibilities:

Cloud Infrastructure Management:

· Manage and update Kubernetes (K8s) on GKE.

· Monitor and optimize K8s resources, including pods, storage, memory, and costs.

· Oversee the general monitoring and maintenance of environments using:

o OpenSearch / Kibana

o KafkaUI

o BGP

o Grafana / Prometheus


Kafka Administration:

· Manage Kafka brokers and ACLs.

· Hands-on experience in Kafka administration (preferably Confluent Kafka).

· Independently debug, optimize, and implement Kafka solutions based on developer and business needs.


Other Responsibilities:

· Perform ad-hoc investigations to troubleshoot and enhance infrastructure.

· Manage PostgreSQL databases efficiently.

· Administer Jenkins pipelines, supporting CI/CD implementation and maintenance.


Required Skills & Qualifications:

· Kafka Certified Engineer (Mandatory).

· 5+ years of experience in GCP DevOps, Cloud Infrastructure, and Kafka Administration.

· Strong expertise in Kubernetes (K8s), Google Kubernetes Engine (GKE), and cloud environments.

· Hands-on experience with monitoring tools like Grafana, Prometheus, OpenSearch, and Kibana.

· Experience managing PostgreSQL databases.

· Proficiency in Jenkins pipeline administration.

· Ability to work independently and collaborate with developers and business stakeholders.

If you are passionate about DevOps, Cloud Infrastructure, and Kafka, and meet the above qualifications, we encourage you to apply!


Read more
top MNC

top MNC

Agency job
via Vy Systems by thirega thanasekaran
Hyderabad, Chennai
10 - 15 yrs
₹8L - ₹20L / yr
Data engineering
ETL
Datawarehousing
CI/CD
skill iconJenkins
+3 more

Key Responsibilities:

  • Lead Data Engineering Team: Provide leadership and mentorship to junior data engineers and ensure best practices in data architecture and pipeline design.
  • Data Pipeline Development: Design, implement, and maintain end-to-end ETL (Extract, Transform, Load) processes to support analytics, reporting, and data science activities.
  • Cloud Architecture (GCP): Architect and optimize data infrastructure on Google Cloud Platform (GCP), ensuring scalability, reliability, and performance of data systems.
  • CI/CD Pipelines: Implement and maintain CI/CD pipelines using Jenkins and other tools to ensure the seamless deployment and automation of data workflows.
  • Data Warehousing: Design and implement data warehousing solutions, ensuring optimal performance and efficient data storage using technologies like Teradata, Oracle, and SQL Server.
  • Workflow Orchestration: Use Apache Airflow to orchestrate complex data workflows and scheduling of data pipeline jobs.
  • Automation with Terraform: Implement Infrastructure as Code (IaC) using Terraform to provision and manage cloud resources.
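
The ETL responsibility above can be sketched in miniature. This is an illustrative toy, with sqlite3 standing in for a warehouse such as Teradata or BigQuery, and the table and field names invented for the example; in production the same extract-transform-load shape would run as an Airflow-orchestrated job:

```python
import sqlite3

def run_etl(rows, conn):
    """Tiny extract-transform-load step: normalise raw records and
    load them into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount_usd REAL)"
    )
    cleaned = [
        (int(r["id"]), round(float(r["amount"]), 2))  # transform: cast + round
        for r in rows
        if r.get("amount") is not None                # transform: drop bad rows
    ]
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
n = run_etl(
    [{"id": "1", "amount": "19.999"}, {"id": "2", "amount": None}],
    conn,
)
```

`INSERT OR REPLACE` keyed on the primary key makes the load step idempotent, so a rerun of the pipeline does not duplicate rows — a property schedulers like Airflow rely on when retrying tasks.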

Share CV to:




Thirega@ vysystems dot com - WhatsApp - 91Five0033Five2Three

Read more
Amwhiz
Aruljothi Kuppusamy
Posted by Aruljothi Kuppusamy
Chennai
2 - 4 yrs
₹4L - ₹8L / yr
skill iconNodeJS (Node.js)
skill iconMongoDB
Mongoose
NestJS
MySQL
+17 more

About the Role:

We are seeking a skilled and driven Backend Developer to join our team. The ideal candidate will have experience in database design (RDBMS and NoSQL), REST API and GraphQL development, cloud services, and programming with a strong foundation in Node.js and TypeScript. You’ll be responsible for designing and implementing scalable backend solutions, ensuring high performance, security, and reliability.


If you’re passionate about backend development, learning new technologies, and building modern applications, this is the role for you.


Key Responsibilities:

Backend Development:


Develop and maintain robust, scalable backend services using Node.js with TypeScript (strict mode).

Build APIs with REST and GraphQL, ensuring high security and performance.

Implement various authentication mechanisms such as OAuth2.0, SAML, JWT, MFA, and optionally, passkeys.

Database Design:


Design and optimize schemas for both relational (PostgreSQL, YSQL) and NoSQL (DynamoDB, MongoDB) databases.

Develop efficient data models for large-scale applications.

Cloud Services & Serverless Architecture:


Work extensively with AWS Cloud services, and optionally Azure and GCP.

Design and implement serverless architectures and event-driven systems using frameworks like AWS Lambda or equivalent on Azure/GCP.

Configure and manage webhooks for event notifications and integrations.

Programming Principles:


Apply design patterns, SOLID principles, and functional programming practices.

Showcase eagerness to learn new programming languages and paradigms.

DevOps & Deployment:


Utilize Docker and Kubernetes (K8s) for containerization and orchestration.

Collaborate with DevOps teams for CI/CD pipelines and scalable deployments.

Tools & Utilities:


Use Postman, Swagger, and cURL for API testing and documentation.

Demonstrate strong knowledge of Unix commands for troubleshooting and development.

Version Control:


Work collaboratively using Git for versioning and code management.


Key Skills & Qualifications:

Must-Have:

Proficiency in Node.js with TypeScript (strict mode).

Experience with the NestJS framework.

Expertise in RDBMS and NoSQL database design and optimization.

Hands-on experience with REST API and GraphQL development.

Familiarity with authentication protocols such as OAuth2.0, SAML, JWT, and MFA.

Strong understanding of AWS Cloud Services and Serverless Architecture.


Nice-to-Have:

Exposure to Azure and GCP serverless frameworks.

Knowledge of webhooks for event handling.

Experience with passkeys as an authentication option.


Soft Skills:

Problem-solving mindset with a passion for tackling complex challenges.

Ability to learn and adapt to new tools, frameworks, and programming languages.

Collaborative attitude and strong communication skills.


What We Offer:

Competitive compensation and benefits package.

Opportunity to work with modern technologies in a fast-paced environment.

A culture of learning, growth, and collaboration.

Exposure to large-scale systems and exciting technical challenges.

Read more
Koolioai
Swarna M
Posted by Swarna M
Remote, Chennai
5 - 7 yrs
₹20L - ₹30L / yr
skill iconPython
skill iconReact.js
skill iconFlask
Google Cloud Platform (GCP)

About koolio.ai

Website: www.koolio.ai

koolio Inc. is a cutting-edge Silicon Valley startup dedicated to transforming how stories are told through audio. Our mission is to democratize audio content creation by empowering individuals and businesses to effortlessly produce high-quality, professional-grade content. Leveraging AI and intuitive web-based tools, koolio.ai enables creators to craft, edit, and distribute audio content—from storytelling to educational materials, brand marketing, and beyond—easily. We are passionate about helping people and organizations share their voices, fostering creativity, collaboration, and engaging storytelling for a wide range of use cases.

About the Full-Time Position

We are seeking experienced Full Stack Developers to join our innovative team on a full-time, hybrid basis. As part of koolio.ai, you will work on a next-gen AI-powered platform, shaping the future of audio content creation. You’ll collaborate with cross-functional teams to deliver scalable, high-performance web applications, handling client- and server-side development. This role offers a unique opportunity to contribute to a rapidly growing platform with a global reach and thrive in a fast-moving, self-learning startup environment where adaptability and innovation are key.

Key Responsibilities:

  • Collaborate with teams to implement new features, improve current systems, and troubleshoot issues as we scale
  • Design and build efficient, secure, and modular client-side and server-side architecture
  • Develop high-performance web applications with reusable and maintainable code
  • Work with audio/video processing libraries for JavaScript to enhance multimedia content creation
  • Integrate RESTful APIs with Google Cloud Services to build robust cloud-based applications
  • Develop and optimize Cloud Functions to meet specific project requirements and enhance overall platform performance

Requirements and Skills:

  • Education: Degree in Computer Science or a related field
  • Work Experience: Minimum of 6 years of proven experience as a Full Stack Developer or a similar role, with demonstrable expertise in building web applications at scale
  • Technical Skills:
  • Proficiency in front-end languages such as HTML, CSS, JavaScript, jQuery, and ReactJS
  • Strong experience with server-side technologies, particularly REST APIs, Python, Google Cloud Functions, and Google Cloud services
  • Familiarity with NoSQL and PostgreSQL databases
  • Experience working with audio/video processing libraries is a strong plus
  • Soft Skills:
  • Strong problem-solving skills and the ability to think critically about issues and solutions
  • Excellent collaboration and communication skills, with the ability to work effectively in a remote, diverse, and distributed team environment
  • Proactive, self-motivated, and able to work independently, balancing multiple tasks with minimal supervision
  • Keen attention to detail and a passion for delivering high-quality, scalable solutions
  • Other Skills: Familiarity with GitHub, CI/CD pipelines, and best practices in version control and continuous deployment

Compensation and Benefits:

  • Total Yearly Compensation: ₹25 LPA based on skills and experience
  • Health Insurance: Comprehensive health coverage provided by the company
  • ESOPs: An opportunity for wealth creation and to grow alongside a fantastic team

Why Join Us?

  • Be a part of a passionate and visionary team at the forefront of audio content creation
  • Work on an exciting, evolving product that is reshaping the way audio content is created and consumed
  • Thrive in a fast-moving, self-learning startup environment that values innovation, adaptability, and continuous improvement
  • Enjoy the flexibility of a full-time hybrid position with opportunities to grow professionally and expand your skills
  • Collaborate with talented professionals from around the world, contributing to a product that has a real-world impact


Read more
Smartan.ai

at Smartan.ai

2 candid answers
Aadharsh M
Posted by Aadharsh M
Chennai
4 - 8 yrs
₹5L - ₹15L / yr
skill iconPython
NumPy
TensorFlow
PyTorch
Google Cloud Platform (GCP)
+4 more

Role Overview:

We are seeking a highly skilled and motivated Data Scientist to join our growing team. The ideal candidate will be responsible for developing and deploying machine learning models from scratch to production level, focusing on building robust data-driven products. You will work closely with software engineers, product managers, and other stakeholders to ensure our AI-driven solutions meet the needs of our users and align with the company's strategic goals.


Key Responsibilities:

  • Develop, implement, and optimize machine learning models and algorithms to support product development.
  • Work on the end-to-end lifecycle of data science projects, including data collection, preprocessing, model training, evaluation, and deployment.
  • Collaborate with cross-functional teams to define data requirements and product taxonomy.
  • Design and build scalable data pipelines and systems to support real-time data processing and analysis.
  • Ensure the accuracy and quality of data used for modeling and analytics.
  • Monitor and evaluate the performance of deployed models, making necessary adjustments to maintain optimal results.
  • Implement best practices for data governance, privacy, and security.
  • Document processes, methodologies, and technical solutions to maintain transparency and reproducibility.


Qualifications:

  • Bachelor's or Master's degree in Data Science, Computer Science, Engineering, or a related field.
  • 5+ years of experience in data science, machine learning, or a related field, with a track record of developing and deploying products from scratch to production.
  • Strong programming skills in Python and experience with data analysis and machine learning libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch).
  • Experience with cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker).
  • Proficiency in building and optimizing data pipelines, ETL processes, and data storage solutions.
  • Hands-on experience with data visualization tools and techniques.
  • Strong understanding of statistics, data analysis, and machine learning concepts.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work collaboratively in a fast-paced, dynamic environment.


Preferred Qualifications:

  • Knowledge of microservices architecture and RESTful APIs.
  • Familiarity with Agile development methodologies.
  • Experience in building taxonomy for data products.
  • Strong communication skills and the ability to explain complex technical concepts to non-technical stakeholders.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Chennai
4 - 10 yrs
Best in industry
skill iconKubernetes
skill iconDocker
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+1 more

Wissen Technology is hiring for Devops engineer


Required:


-4 to 10 years of relevant experience in Devops

-Must have hands on experience on AWS, Kubernetes, CI/CD pipeline

-Good to have exposure on Github or Gitlab

-Open to work from Chennai

-Work mode will be Hybrid


Company profile:


Company Name : Wissen Technology

Group of companies in India : Wissen Technology & Wissen Infotech

Work Location - Chennai

Website : www.wissen.com

Wissen Thought leadership : https://lnkd.in/gvH6VBaU

LinkedIn: https://lnkd.in/gnK-vXjF

Read more
Chennai, Coimbatore
6 - 10 yrs
₹10L - ₹25L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
skill iconAmazon Web Services (AWS)
+2 more

Role & responsibilities

  • Senior Java developer with 6 to 10 years of experience, having worked on Java, Spring Boot, Hibernate, Microservices, Redis, and AWS S3
  • Contribute to all stages of the software development lifecycle
  • Design, implement, and maintain Java-based applications that can be high-volume and low-latency
  • Analyze user requirements to define business objectives
  • Envisioning system features and functionality
  • Define application objectives and functionality
  • Ensure application designs conform to business goals
  • Develop and test software
  • Should have good experience in Code Review
  • Expecting to be 100% hands-on while working with the clients directly
  • Performing requirement analysis
  • Developing high-quality and detailed designs
  • Conducting unit testing using automated unit test frameworks
  • Identifying risk and conducting mitigation action planning
  • Reviewing the work of other developers and providing feedback
  • Using coding standards and best practices to ensure quality
  • Communicating with customers to resolve issues
  • Good Communication Skills 


Read more
codersbrain

at codersbrain

1 recruiter
Tanuj Uppal
Posted by Tanuj Uppal
Hyderabad, Pune, Noida, Bengaluru (Bangalore), Chennai
4 - 10 yrs
Best in industry
skill iconGo Programming (Golang)
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure

Golang Developer

Location: Chennai/ Hyderabad/Pune/Noida/Bangalore

Experience: 4+ years

Notice Period: Immediate/ 15 days

Job Description:

  • Must have at least 3 years of experience working with Golang.
  • Strong Cloud experience is required for day-to-day work.
  • Experience with the Go programming language is necessary.
  • Good communication skills are a plus.
  • Skills: AWS, GCP, Azure, Golang
Read more
HappyFox

at HappyFox

1 video
6 products
Lindsey A
Posted by Lindsey A
Chennai, Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹15L / yr
DevOps
skill iconKubernetes
skill iconDocker
skill iconAmazon Web Services (AWS)
Windows Azure
+12 more

About us:

HappyFox is a software-as-a-service (SaaS) support platform. We offer an enterprise-grade help desk ticketing system and intuitively designed live chat software.

 

We serve over 12,000 companies in 70+ countries. HappyFox is used by companies that span across education, media, e-commerce, retail, information technology, manufacturing, non-profit, government and many other verticals that have an internal or external support function.

 

To know more, Visit! - https://www.happyfox.com/

 

Responsibilities:

  • Build and scale production infrastructure in AWS for the HappyFox platform and its products.
  • Research and build/implement systems, services, and tooling to improve the uptime, reliability, and maintainability of our backend infrastructure, and to meet our internal SLOs and customer-facing SLAs.
  • Proficient in managing/patching servers with Unix-based operating systems like Ubuntu Linux.
  • Proficient in writing automation scripts or building infrastructure tools using Python/Ruby/Bash/Golang
  • Implement consistent observability, deployment and IaC setups
  • Patch production systems to fix security/performance issues
  • Actively respond to escalations/incidents in the production environment from customers or the support team
  • Mentor other Infrastructure engineers, review their work and continuously ship improvements to production infrastructure.
  • Build and manage development infrastructure, and CI/CD pipelines for our teams to ship & test code faster.
  • Participate in infrastructure security audits

 

Requirements:

  • At least 5 years of experience in handling/building Production environments in AWS.
  • At least 2 years of programming experience in building API/backend services for customer-facing applications in production.
  • Demonstrable knowledge of TCP/IP, HTTP and DNS fundamentals.
  • Experience in deploying and managing production Python/NodeJS/Golang applications to AWS EC2, ECS or EKS.
  • Proficient in containerised environments such as Docker, Docker Compose, Kubernetes
  • Proficient in managing/patching servers with Unix-based operating systems like Ubuntu Linux.
  • Proficient in writing automation scripts using any scripting language such as Python, Ruby, Bash etc.,
  • Experience in setting up and managing test/staging environments, and CI/CD pipelines.
  • Experience in IaC tools such as Terraform or AWS CDK
  • Passion for making systems reliable, maintainable, scalable and secure.
  • Excellent verbal and written communication skills to address, escalate and express technical ideas clearly
  • Bonus points – if you have experience with Nginx, Postgres, Redis, and Mongo systems in production.
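
As a small example of the automation-scripting skill listed above, a retry-with-exponential-backoff wrapper is a common building block in infrastructure scripts that poll flaky services during deploys or incident response. This is a generic sketch; the `flaky_health_check` below is a stand-in for a real probe:

```python
import functools
import time

def retry(attempts: int = 3, base_delay: float = 0.01):
    """Retry a flaky operation with exponential backoff."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise  # out of attempts: surface the error
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

calls = {"n": 0}

@retry(attempts=3)
def flaky_health_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("service not ready")
    return "ok"
```

Backoff plus a bounded attempt count keeps automation from hammering a recovering service, which matters when the script runs inside an incident-response loop.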

 

Read more
HappyFox

at HappyFox

1 video
6 products
Lindsey A
Posted by Lindsey A
Chennai, Bengaluru (Bangalore)
7 - 15 yrs
₹15L - ₹20L / yr
DevOps
skill iconKubernetes
skill iconDocker
skill iconAmazon Web Services (AWS)
Windows Azure
+9 more

About us:

HappyFox is a software-as-a-service (SaaS) support platform. We offer an enterprise-grade help desk ticketing system and intuitively designed live chat software.

 

We serve over 12,000 companies in 70+ countries. HappyFox is used by companies that span across education, media, e-commerce, retail, information technology, manufacturing, non-profit, government and many other verticals that have an internal or external support function.

 

To know more, Visit! - https://www.happyfox.com/

 

Responsibilities

  • Build and scale production infrastructure in AWS for the HappyFox platform and its products.
  • Research and build/implement systems, services, and tooling to improve the uptime, reliability, and maintainability of our backend infrastructure, and to meet our internal SLOs and customer-facing SLAs.
  • Implement consistent observability, deployment and IaC setups
  • Lead incident management and actively respond to escalations/incidents in the production environment from customers and the support team.
  • Hire/Mentor other Infrastructure engineers and review their work to continuously ship improvements to production infrastructure and its tooling.
  • Build and manage development infrastructure, and CI/CD pipelines for our teams to ship & test code faster.
  • Lead infrastructure security audits

 

Requirements

  • At least 7 years of experience in handling/building Production environments in AWS.
  • At least 3 years of programming experience in building API/backend services for customer-facing applications in production.
  • Proficient in managing/patching servers with Unix-based operating systems like Ubuntu Linux.
  • Proficient in writing automation scripts or building infrastructure tools using Python/Ruby/Bash/Golang
  • Experience in deploying and managing production Python/NodeJS/Golang applications to AWS EC2, ECS or EKS.
  • Experience in security hardening of infrastructure, systems and services.
  • Proficient in containerised environments such as Docker, Docker Compose, Kubernetes
  • Experience in setting up and managing test/staging environments, and CI/CD pipelines.
  • Experience in IaC tools such as Terraform or AWS CDK
  • Exposure/Experience in setting up or managing Cloudflare, Qualys and other related tools
  • Passion for making systems reliable, maintainable, scalable and secure.
  • Excellent verbal and written communication skills to address, escalate and express technical ideas clearly
  • Bonus points – Hands-on experience with Nginx, Postgres, Postfix, Redis or Mongo systems.

 

 

Read more
AuditCue
Anand Srinivasan
Posted by Anand Srinivasan
Chennai
3 - 6 yrs
Best in industry
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+1 more

Bachelor's degree in Computer Science or a related field, or equivalent work experience

Strong understanding of cloud infrastructure and services, such as AWS, Azure, or Google Cloud Platform

Experience with infrastructure as code tools such as Terraform or CloudFormation

Proficiency in scripting languages such as Python, Bash, or PowerShell

Familiarity with DevOps methodologies and tools such as Git, Jenkins, or Ansible

Strong problem-solving and analytical skills

Excellent communication and collaboration skills

Ability to work independently and as part of a team

Willingness to learn new technologies and tools as required

Read more
German Based IT Start-up

German Based IT Start-up

Agency job
via People First Consultants by Naveed Mohd
Chennai, Bengaluru (Bangalore)
6 - 13 yrs
Best in industry
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconMongoDB
+5 more

6 - 12 years of professional experience with any of the below stacks:  

∙MERN stack: JavaScript - MongoDB - Express - ReactJS - Node.js

∙MEAN stack: JavaScript - MongoDB - Express - AngularJS - Node.js


Requirements:


∙Professional experience with JavaScript and associated web technologies (CSS, semantic HTML). 

∙Proficiency in the English language, both written and verbal, sufficient for success in a remote and largely asynchronous work environment. 

∙Demonstrated capacity to clearly and concisely communicate about complex technical, architectural, and/or organizational problems and propose thorough iterative solutions. 

∙Experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent these problems. 

∙Comfort working in a highly agile software development process. 

∙Positive and solution-oriented mindset. 

∙Experience owning a project from concept to production, including proposal, discussion, and execution. 

∙Strong sense of ownership with the eagerness to design and deliver significant and impactful technology solutions. 

∙Demonstrated ability to work closely with other parts of the organization.

Read more
US Based Product MNC

US Based Product MNC

Agency job
via People First Consultants by Naveed Mohd
Chennai
7 - 13 yrs
Best in industry
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+10 more

DESIRED SKILLS AND EXPERIENCE

 Strong analytical and problem-solving skills

 Ability to work independently, learn quickly and be proactive

 3-5 years overall and at least 1-2 years of hands-on experience in designing and managing DevOps Cloud infrastructure

 Experience must include a combination of:

o Experience working with configuration management tools – Ansible, Chef, Puppet, SaltStack (expertise in at least one tool is a must)

o Ability to write and maintain code in at least one scripting language (Python preferred)

o Practical knowledge of shell scripting

o Cloud knowledge – AWS, VMware vSphere

o Good understanding and familiarity with Linux

o Networking knowledge – Firewalls, VPNs, Load Balancers

o Web/Application servers, Nginx, JVM environments

o Virtualization and containers - Xen, KVM, Qemu, Docker, Kubernetes, etc.

o Familiarity with logging systems - Logstash, Elasticsearch, Kibana

o Git, Jenkins, Jira

Read more
Smart Glass manufacturing firm

Smart Glass manufacturing firm

Agency job
via People First Consultants by Aishwarya KA
Chennai
7 - 15 yrs
Best in industry
skill iconGo Programming (Golang)
skill iconRuby on Rails (ROR)
skill iconRuby
skill iconPython
skill iconJava
+3 more

Hiring for the below position with one of our premium client


Role: Senior DevOps Engineer

Exp:7+ years

Location: Chennai

Key skills: DevOps, Cloud, Python scripting


Description:

 Strong analytical and problem-solving skills 

Ability to work independently, learn quickly and be proactive

 7-9 years overall and at least 3-4 years of hands-on experience in designing and managing DevOps Cloud infrastructure

 Experience must include a combination of:

o Experience working with configuration management tools – Ansible, Chef, Puppet, SaltStack (expertise in at least one tool is a must)

o Ability to write and maintain code in at least one scripting language (Python preferred)

o Practical knowledge of shell scripting

o Cloud knowledge – AWS, VMware vSphere

o Good understanding and familiarity with Linux

o Networking knowledge – Firewalls, VPNs, Load Balancers

o Web/Application servers, Nginx, JVM environments

o Virtualization and containers - Xen, KVM, Qemu, Docker, Kubernetes, etc.

o Familiarity with logging systems - Logstash, Elasticsearch, Kibana

o Git, Jenkins, Jira


If interested kindly apply!

Read more
OJCommerce

at OJCommerce

3 recruiters
Thennarasi S
Posted by Thennarasi S
Chennai, Tamil Nadu, Pondicherry
2 - 7 yrs
₹2L - ₹15L / yr
skill icon.NET
ASP.NET
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconJavascript
+16 more

About OJ Commerce: 


OJ Commerce is a fast-growing, profitable online retailer based in Florida, USA, with a full-fledged India office in Chennai, driven by a sophisticated, data-driven system that runs operations with virtually no human intervention. We strive to be a best-in-class e-commerce company delivering exceptional value to customers by leveraging technology, innovation and brand partnerships to provide a seamless and enjoyable shopping experience, with high-quality products at the best prices.

 

Responsibilities:

 

Work with business stakeholders to understand requirements, then prototype, build and deploy solutions.

Create, read, update and delete the backend code you own, keeping maintenance, performance and security in mind.

Keep abreast of the latest technologies and their ecosystems, and adopt the ones that aid safe product delivery at speed.

Automate the boring and mundane stuff, for you prefer being productive to being busy.

We are flat. Be responsible for the professional growth of yourself and the team.

  • Tune application for performance.
  • Take initiatives and manage change to work towards business goals at speed without compromising safety.
  • Coach full-stack developers on backend skills.
  • Provide problem-resolution support specific to application issues: identify and resolve problems in application software, determine symptoms and ensure accurate problem definition.

Develop functional, architectural and other documentation as required for productive functioning of teams.

  • Be the brand ambassador for OJ Commerce by speaking at meetups, conferences, etc.
  • We are fluid. Be ready for changing dynamics in responsibilities from time to time. Exciting, isn't it?
  • Take the lead in digital transformation of legacy applications.

 

 

What you need to shine?

  • You have prior experience in modernising legacy applications.
  • You are a passionate hands-on developer with deep experience in building enterprise-grade software in Microsoft ASP.NET Core, ASP.NET MVC, Web API, SOA, micro-services and RESTful services, with knowledge of the SQL Server database.
  • You have the ability to see and work on both the big picture (application architecture) and the devilish details (complex code).
  • Strong experience in developing web applications using C#, VB.NET, .NET, LINQ, .NET Framework 4.0, MVC 3/4/5, ASP.NET Web API, .NET Core, etc.
  • You are cloud savvy, preferably on Google Cloud.
  • You have rich experience in Object-Oriented Programming (OOP) with good knowledge of practical design patterns and their application.

Hands-on experience in building SOA or Micro-services preferably on .NET Core.

Proven Architectural skills with high standards in Code quality

Knowledge of ReactJS/Typescript would be added advantage.

Practical experience in Agile development methodologies of using CI/CD.

Extreme Programming (TDD) experience is something we actively seek.

 

 

What we Offer

  • Greenfield opportunity to transform legacy backend applications to latest technology stack.
  • Fast-paced start-up environment: this is not for the faint-hearted; you need grit and passion as much as you need the core skills.
  • Work in an interdisciplinary team where learning from one another and developing solutions cross-functionally is a key part of our culture.
  • Golden opportunity to make history by making big business impact.
  • Competitive salary to take good care of self and family.
  • Insurance Benefits: Medical and Accident cover.
  • Flexible Working Hours

 

 

 

Read more
OJCommerce

at OJCommerce

3 recruiters
Thennarasi S
Posted by Thennarasi S
Chennai
3 - 10 yrs
₹1L - ₹15L / yr
skill icon.NET
ASP.NET
skill iconC#
ASP.NET MVC
Web API
+13 more

About OJ Commerce: 


OJ Commerce is a fast-growing, profitable online retailer based in Florida, USA, with a full-fledged India office in Chennai, driven by a sophisticated, data-driven system that runs operations with virtually no human intervention. We strive to be a best-in-class e-commerce company delivering exceptional value to customers by leveraging technology, innovation and brand partnerships to provide a seamless and enjoyable shopping experience, with high-quality products at the best prices.

 

Responsibilities:

 

Work with business stakeholders to understand requirements, then prototype, build and deploy solutions.

Create, read, update and delete the backend code you own, keeping maintenance, performance and security in mind.

Keep abreast of the latest technologies and their ecosystems, and adopt the ones that aid safe product delivery at speed.

Automate the boring and mundane stuff, for you prefer being productive to being busy.

We are flat. Be responsible for the professional growth of yourself and the team.

  • Tune application for performance.
  • Take initiatives and manage change to work towards business goals at speed without compromising safety.
  • Coach full-stack developers on backend skills.
  • Provide problem-resolution support specific to application issues: identify and resolve problems in application software, determine symptoms and ensure accurate problem definition.

Develop functional, architectural and other documentation as required for productive functioning of teams.

  • Be the brand ambassador for OJ Commerce by speaking at meetups, conferences, etc.
  • We are fluid. Be ready for changing dynamics in responsibilities from time to time. Exciting, isn't it?
  • Take the lead in digital transformation of legacy applications.

 

 

What you need to shine?

  • You have prior experience in modernising legacy applications.
  • You are a passionate hands-on developer with deep experience in building enterprise-grade software in Microsoft ASP.NET Core, ASP.NET MVC, Web API, SOA, micro-services and RESTful services, with knowledge of the SQL Server database.
  • You have the ability to see and work on both the big picture (application architecture) and the devilish details (complex code).
  • Strong experience in developing web applications using C#, VB.NET, .NET, LINQ, .NET Framework 4.0, MVC 3/4/5, ASP.NET Web API, .NET Core, etc.
  • You are cloud savvy, preferably on Google Cloud.
  • You have rich experience in Object-Oriented Programming (OOP) with good knowledge of practical design patterns and their application.

Hands-on experience in building SOA or Micro-services preferably on .NET Core.

Proven Architectural skills with high standards in Code quality

Knowledge of ReactJS/Typescript would be added advantage.

Practical experience in Agile development methodologies of using CI/CD.

Extreme Programming (TDD) experience is something we actively seek.

 

 

What we Offer

  • Greenfield opportunity to transform legacy backend applications to latest technology stack.
  • Fast-paced start-up environment: this is not for the faint-hearted; you need grit and passion as much as you need the core skills.
  • Work in an interdisciplinary team where learning from one another and developing solutions cross-functionally is a key part of our culture.
  • Golden opportunity to make history by making big business impact.
  • Competitive salary to take good care of self and family.
  • Insurance Benefits: Medical and Accident cover.
  • Flexible Working Hours
Read more
Cubera Tech India Pvt Ltd
Bengaluru (Bangalore), Chennai
5 - 8 yrs
Best in industry
Data engineering
Big Data
skill iconJava
skill iconPython
Hibernate (Java)
+10 more

Data Engineer- Senior

Cubera is a data company revolutionizing big data analytics and Adtech through data share value principles wherein the users entrust their data to us. We refine the art of understanding, processing, extracting, and evaluating the data that is entrusted to us. We are a gateway for brands to increase their lead efficiency as the world moves towards web3.

What are you going to do?

Design & Develop high performance and scalable solutions that meet the needs of our customers.

Closely work with the Product Management, Architects and cross functional teams.

Build and deploy large-scale systems in Java/Python.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Create data tools for analytics and data scientist team members that assist them in building and optimizing their algorithms.

Follow best practices that can be adopted in Bigdata stack.

Use your engineering experience and technical skills to drive the features and mentor the engineers.

What are we looking for ( Competencies) :

Bachelor’s degree in computer science, computer engineering, or related technical discipline.

Overall 5 to 8 years of programming experience in Java and Python, including object-oriented design.

Data handling frameworks: should have a working knowledge of one or more data-handling frameworks like Hive, Spark, Storm, Flink, Beam, Airflow, NiFi, etc.

Data Infrastructure: Should have experience in building, deploying and maintaining applications on popular cloud infrastructure like AWS, GCP etc.

Data Store: Must have expertise in one of general-purpose No-SQL data stores like Elasticsearch, MongoDB, Redis, RedShift, etc.

Strong sense of ownership, focus on quality, responsiveness, efficiency, and innovation.

Ability to work with distributed teams in a collaborative and productive manner.

Benefits:

Competitive Salary Packages and benefits.

Collaborative, lively and an upbeat work environment with young professionals.

Job Category: Development

Job Type: Full Time

Job Location: Bangalore

 

Read more
Tredence
Rohit S
Posted by Rohit S
Chennai, Pune, Bengaluru (Bangalore), Gurugram
11 - 16 yrs
₹20L - ₹32L / yr
Data Warehouse (DWH)
Google Cloud Platform (GCP)
skill iconAmazon Web Services (AWS)
Data engineering
Data migration
+1 more
• Engages with Leadership of Tredence’s clients to identify critical business problems, define the need for data engineering solutions and build strategy and roadmap
• S/he possesses a wide exposure to complete lifecycle of data starting from creation to consumption
• S/he has in the past built repeatable tools / data-models to solve specific business problems
• S/he should have hands-on experience of having worked on projects (either as a consultant or within a company) that needed them to
o Provide consultation to senior client personnel
o Implement and enhance data warehouses or data lakes.
o Worked with business teams or was a part of the team that implemented process re-engineering driven by data analytics/insights
• Should have deep appreciation of how data can be used in decision-making
• Should have perspective on newer ways of solving business problems. E.g. external data, innovative techniques, newer technology
• S/he must have a solution-creation mindset.
Ability to design and enhance scalable data platforms to address the business need
• Working experience on data engineering tool for one or more cloud platforms -Snowflake, AWS/Azure/GCP
• Engage with technology teams from Tredence and Clients to create last mile connectivity of the solutions
o Should have experience of working with technology teams
• Demonstrated ability in thought leadership – Articles/White Papers/Interviews
Mandatory Skills: Program Management, Data Warehouse, Data Lake, Analytics, Cloud Platform
Read more
One of our Premium Client

One of our Premium Client

Agency job
Chennai
9 - 16 yrs
₹16L - ₹28L / yr
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
Microsoft Windows Azure
skill iconPython
skill iconC
+5 more
BASIC QUALIFICATIONS 
 
  • Bachelor’s degree, preferably in Engineering, or equivalent professional or military experience, with 10-15 years of experience.
  • 5+ years of large-scale software development or application engineering with recent coding experience in two or more modern programming languages such as Java, JavaScript, C/C++, C#, Swift, Node.js, Python, Go, or Ruby
  • Experience with Continuous Integration and Continuous Delivery (CI/CD)
  • Helping customers architect scalable, highly available application solutions that leverage at least 2 cloud environments out of AWS, GCP, Azure.
  • Architecting and developing customer applications to be cloud developed or re-engineered or optimized
  • Working as a technical leader alongside customer business and development teams, with support to the infrastructure team
  • Providing deep software development knowledge with respect to cloud architecture, design patterns and programming
  • Advising and implementing Cloud (AWS/GCP/Azure) best practices
  • Working as both an application architect as well as development specialist in Cloud native Apps architecture, development to deployment phases.
  • Implementing DevOps practices such as infrastructure as code, continuous integration and automated deployment 
Read more
Agiletech Info Solutions pvt ltd
Chennai
5 - 8 yrs
₹5L - ₹15L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+1 more

DevOps Engineer

Job Description:

 

The position requires a broad set of technical and interpersonal skills spanning deployment technologies, monitoring and scripting, from networking to infrastructure. Well versed in troubleshooting production issues and able to drive them through to root-cause analysis (RCA).

 

Skills:

 

  • Manage VMs across multiple datacenters and AWS to support dev/test and production workloads.
  • Strong hands-on experience with Ansible is preferred
  • Strong knowledge and hands-on experience in Kubernetes Architecture and administration.
  • Should have core knowledge in Linux and System operations.
  • Proactively and reactively resolve incidents as escalated from monitoring solutions and end users.
  • Conduct and automate audits for network and systems infrastructure.
  • Do software deployments, per documented processes, with no impact to customers.
  • Follow existing devops processes while having flexibility to create and tweak processes to gain efficiency.
  • Troubleshoot connectivity problems across network, systems or applications.
  • Follow security guidelines, both policy and technical to protect our customers.
  • Ability to automate recurring tasks to increase velocity and quality.
  • Should have worked on any one of the Database (Postgres/Mongo/Cockroach/Cassandra)
  • Should have knowledge and hands-on experience in managing ELK clusters.
  • Scripting knowledge in Shell/Python is an added advantage.
  • Hands-on experience with K8s-based microservice architecture is an added advantage.
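The "automate recurring tasks" bullet above can be sketched minimally in Python: given per-host disk-usage percentages (as a monitoring agent might report them), flag hosts that breach a threshold. The hostnames, report shape and threshold are invented for this sketch.

```python
# Hedged sketch of automating a recurring ops check; all data here is made up.
def hosts_over_threshold(usage: dict[str, int], threshold: int = 85) -> list[str]:
    """Return hostnames whose disk usage meets or exceeds `threshold` percent."""
    return sorted(h for h, pct in usage.items() if pct >= threshold)

report = {"db-01": 91, "web-01": 42, "web-02": 88, "cache-01": 73}
print(hosts_over_threshold(report))  # → ['db-01', 'web-02']
```

A real automation would collect `report` from the monitoring stack and raise an alert; the decision logic is the part shown here.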
Read more
CoreStack

at CoreStack

2 recruiters
Dhivya R
Posted by Dhivya R
Chennai
2 - 6 yrs
₹3L - ₹8L / yr
skill iconAmazon Web Services (AWS)
Microsoft Windows Azure
Google Cloud Platform (GCP)
Cloud Computing

Roles & Responsibilities

  • Part of a Cloud Governance product team responsible for installing, configuring, automating and monitoring various Cloud Services (IaaS, PaaS, and SaaS)
  • Be at the forefront of Cloud technology, assisting a global list of customers that consume multiple cloud environments.
  • Ensure availability of internal & customers' hosts and services through monitoring, analysing metric trends and investigating alerts.
  • Explore and implement a broad spectrum of open source technologies. Help the team/customer to resolve technical issues.
  • Extremely customer focused, flexible to be available on-call for solving critical problems.
  • Contribute towards the process improvement involving the Product deployments, Cloud Governance & Customer Success.
Skills Required

    • Minimum 3+ Years of experience with a B.E/B.Tech
    • Experience in managing Azure IaaS, PaaS services for customer production environments
    • Well versed in DevOps technologies, automation, infrastructure orchestration, configuration management and CI/CD
    • Experience in Linux and Windows Administration, server hardening and security compliance
    • Web and Application Server technologies (e.g. Apache, Nginx, IIS)
    • Good command in at least one scripting language (e.g. Bash, PowerShell, Ruby, Python)
    • Networking protocols such as HTTP, DNS and TCP/IP
    • Experience in managing version control platforms (e.g. Git, SVN)
Read more
CoreStack

at CoreStack

2 recruiters
Maria Godslin
Posted by Maria Godslin
Chennai
3 - 6 yrs
₹3L - ₹7L / yr
API
SoapUI
Selenium
skill iconAmazon Web Services (AWS)
Windows Azure
+2 more

CoreStack, an AI-powered multi-cloud governance solution, empowers enterprises to unleash the power of the cloud on their terms by helping them rapidly achieve continuous and autonomous cloud governance at scale. CoreStack enables enterprises to realize outcomes across FinOps, SecOps and CloudOps such as a 40% decrease in cloud costs and a 50% increase in operational efficiencies by governing operations, security, cost, access, and resources. CoreStack also assures 100% compliance with standards such as ISO, FedRAMP, NIST, HIPAA, PCI-DSS, AWS CIS & Well-Architected Framework (WAF). CoreStack works with many large global customers across multiple industries including Financial Services, Healthcare, Retail, Education, Telecommunications, Technology and Government. 

The company is backed by industry-leading venture investors. CoreStack is a recent recipient of the 2021 Gold Stevie American Business Awards in the Cloud Infrastructure category and the 2021 Gold Globee Winner of the Most Innovative Company of the Year in IT Cloud/SaaS. In addition, CoreStack won the 2021 Best New Products American Business Award in Cloud Governance as well as Golden Bridge Awards for Cloud Computing/SaaS Innovation and Cloud Security Innovation. CoreStack was recognized as IDC Innovator in Cloud Management Solutions and in the Gartner Magic Quadrant for Cloud Management Platforms in 2020. The Company is a three-time TiE50 Winner and an Emerge 50 League-10 NASSCOM award recipient in Enterprise Software. CoreStack is a Google Cloud Build Partner, Microsoft Azure Gold & Co-Sell Partner, and Amazon AWS Advanced Technology Competency Partner.

 

Responsibilities:

  • Part of a product team responsible for the Quality of the product used by Global customers
  • Drive the automation efforts to reduce the time taken to identify issues
  • Contribute towards the process improvement involving the entire Product development life cycle.
  • Should be able to capture the user behavior and identify the gaps in UX flows. 
  • Responsible for the performance and usability of the Product.

Requirements

  • Minimum 4+ Years of experience in testing domain.
  • Should be well versed with Software Testing Methodologies and Techniques
  • Should have experience in AWS or Azure or Google Cloud
  • Should have experience in manual testing and identifying critical scenarios.
  • Should have experience in database testing
  • Should have experience in test management tools.
  • Should have experience in performance testing, load testing and basic knowledge of security testing.
  • API testing using SoapUI is an added advantage.
  • Selenium with Java is an added advantage.
  • CI/CD using Jenkins is an added advantage.
  • Prior work experience in the cloud domain and with cloud-based products is a plus

CoreStack Offers

  • Competitive salary
  • Competitive benefit package with appreciable equity
  • Exciting, fast-paced and entrepreneurial culture
  • Health insurance and other company benefits
Read more
dfcs Technologies

dfcs Technologies

Agency job
via dfcs Technologies by SheikDawood Ali
Remote, Chennai, anywhere India
1 - 5 yrs
₹10L - ₹15L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+3 more
  • Hands-on experience in AWS provisioning of services like EC2, S3, EBS, AMI, VPC, ELB, RDS, Auto Scaling groups and CloudFormation.
  • Good experience with the build and release process, extensively involved in CI/CD using Jenkins.
  • Experienced with configuration management tools like Ansible.
  • Designing, implementing and supporting fully automated Jenkins CI/CD
  • Extensively worked on Jenkins for continuous Integration and for end to end Automation for all Builds and Deployments.
  • Proficient with Docker based container deployments to create shelf environments for dev teams and containerization of environment delivery for releases.
  • Experience working on Docker hub, creating Docker images and handling multiple images primarily for middleware installations and domain configuration.
  • Good knowledge in version control system in Git and GitHub.
  • Good experience in build tools
  • Implemented CI/CD pipelines using Jenkins, Ansible, Docker, Kubernetes, YAML and manifests
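One small step of the CI/CD flow described above — deriving a Docker image tag from the branch name and commit SHA during a Jenkins build — can be sketched as below. The tag convention is an assumption for illustration, not any company's actual scheme.

```python
import re

# Illustrative only: build 'branch-shortsha' image tags for a CI pipeline,
# sanitising characters Docker tags disallow. The convention is invented.
def image_tag(branch: str, sha: str) -> str:
    """Return a Docker-safe tag combining a cleaned branch name and short SHA."""
    safe_branch = re.sub(r"[^a-zA-Z0-9_.-]+", "-", branch).strip("-").lower()
    return f"{safe_branch}-{sha[:7]}"

print(image_tag("feature/login-page", "9fceb02d0ae598e95dc970b74767f19372d61af8"))
# → feature-login-page-9fceb02
```

In a pipeline, a step like this would feed `docker build -t <tag>` and `docker push`; only the naming logic is shown.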
Read more
CoreStack

at CoreStack

2 recruiters
Maria Godslin
Posted by Maria Godslin
Chennai
3 - 5 yrs
Best in industry
Google Cloud Platform (GCP)
skill iconAmazon Web Services (AWS)
Microsoft Windows Azure
IaaS
Platform as a Service (PaaS)
+2 more

CoreStack, an AI-powered multi-cloud governance solution, empowers enterprises to rapidly achieve Continuous and Autonomous Cloud Governance at Scale. CoreStack enables enterprises to realize outcomes such as 40% decrease in cloud costs and 50% increase in operational efficiencies by governing operations, security, cost, access, and resources. CoreStack also assures 100% compliance with standards such as ISO, FedRAMP, NIST, HIPAA, PCI-DSS, AWS CIS & Well Architected Framework (WAF). We work with many large global customers across multiple industries including Financial Services, Healthcare, Retail, Education, Telecommunications, Technology and Government.

 

Responsibilities:

 

  • Part of a Cloud Governance product team responsible for installing, configuring, automating and monitoring various Cloud Services (IaaS, PaaS, and SaaS)
  • Be at the forefront of Cloud technology, assisting a global list of customers that consume multiple cloud environments.
  • Ensure availability of internal & customers' hosts and services through monitoring, analysing metric trends and investigating alerts.
  • Explore and implement a broad spectrum of open source technologies. Help the team/customer to resolve technical issues.
  • Extremely customer focused, flexible to be available on-call for solving critical problems.
  • Contribute towards the process improvement involving the Product deployments, Cloud Governance & Customer Success.

 

Skills Required

  • Minimum 3+ Years of experience with a B.E/B.Tech
  • Experience in managing Azure IaaS, PaaS services for customer production environments
  • Well versed in DevOps technologies, automation, infrastructure orchestration, configuration management and CI/CD
  • Experience in Linux and Windows Administration, server hardening and security compliance
  • Web and Application Server technologies (e.g. Apache, Nginx, IIS)
  • Good command in at least one scripting language (e.g. Bash, PowerShell, Ruby, Python)
  • Networking protocols such as HTTP, DNS and TCP/IP
  • Experience in managing version control platforms (e.g. Git, SVN)
Read more
Quess Corp Limited

at Quess Corp Limited

6 recruiters
Anjali Singh
Posted by Anjali Singh
Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Bengaluru (Bangalore), Chennai
5 - 8 yrs
₹1L - ₹15L / yr
Google Cloud Platform (GCP)
skill iconPython
Big Data
Data processing
Data Visualization

A GCP Data Analyst profile must have the below skill sets:

 

Read more
Top Management Consulting Company

Top Management Consulting Company

Agency job
via People First Consultants by Naveed Mohd
Gurugram, Bengaluru (Bangalore), Chennai
2 - 9 yrs
₹9L - ₹27L / yr
DevOps
Microsoft Windows Azure
gitlab
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
+15 more
Greetings!!

We are looking out for a technically driven  "ML OPS Engineer" for one of our premium client

COMPANY DESCRIPTION:
This Company is a global management consulting firm. We are the trusted advisor to the world's leading businesses, governments, and institutions. We work with leading organizations across the private, public and social sectors. Our scale, scope, and knowledge allow us to address


Key Skills
• Excellent hands-on expert knowledge of cloud platform infrastructure and administration
(Azure/AWS/GCP) with strong knowledge of cloud services integration, and cloud security
• Expertise setting up CI/CD processes, building and maintaining secure DevOps pipelines with at
least 2 major DevOps stacks (e.g., Azure DevOps, Gitlab, Argo)
• Experience with modern development methods and tooling: Containers (e.g., docker) and
container orchestration (K8s), CI/CD tools (e.g., Circle CI, Jenkins, GitHub actions, Azure
DevOps), version control (Git, GitHub, GitLab), orchestration/DAGs tools (e.g., Argo, Airflow,
Kubeflow)
• Hands-on coding skills in Python 3 (e.g., APIs), including automated testing frameworks and
libraries (e.g., pytest), Infrastructure as Code (e.g., Terraform) and Kubernetes artifacts
(e.g., deployments, operators, Helm charts)
• Experience setting up at least one contemporary MLOps tooling (e.g., experiment tracking,
model governance, packaging, deployment, feature store)
• Practical knowledge delivering and maintaining production software such as APIs and cloud
infrastructure
• Knowledge of SQL (intermediate level or more preferred) and familiarity working with at least
one common RDBMS (MySQL, Postgres, SQL Server, Oracle)
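The automated-testing skill named above (pytest-style assertions) can be shown in miniature; the function under test is invented purely for this sketch.

```python
# Minimal pytest-style sketch; `moving_average` is a made-up function under test.
def moving_average(xs: list[float], window: int) -> list[float]:
    """Trailing moving average; returns [] if the window exceeds the data."""
    if window <= 0 or window > len(xs):
        return []
    return [sum(xs[i - window + 1 : i + 1]) / window for i in range(window - 1, len(xs))]

def test_moving_average():
    # pytest discovers test_* functions and runs these bare assertions.
    assert moving_average([1, 2, 3, 4], 2) == [1.5, 2.5, 3.5]
    assert moving_average([1, 2], 5) == []

test_moving_average()
print("ok")
```

Under pytest the `test_moving_average` function would be collected automatically; it is called directly here so the sketch is self-contained.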
Read more
Leading Indian NBFC

Leading Indian NBFC

Agency job
Chennai
5 - 12 yrs
₹8L - ₹18L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Design patterns
+7 more
Java Tech Lead AVP 
 
  • Immediate joiners with 5 to 10 years of experience. 
  • Should have team leading experience.
  • Should be keen to work as a Developer.
  • Java, Spring boot and Design patterns are key areas where they should be excellent.
  • Good communication skills are a must.
  • Should be willing to work on alternate Saturdays (10 AM to 4:30 PM).
  • They will have to relocate to Chennai.
  • Strong SQL skills, Postgres SQL database knowledge.
  • Cloud Experience in deployment (CI/CD)
  • Unit Test case
  • Angular – good to have 
 
 
Read more
Leading Payment Solution Company

Leading Payment Solution Company

Agency job
via People First Consultants by Aishwarya KA
Chennai, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai
9 - 16 yrs
Best in industry
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Microsoft Windows Azure
+9 more

About Company:

The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.

  • Role Overview
    • Senior Engineer with a strong background and experience in cloud-related technologies and architectures. Can design target cloud architectures to transform existing architectures together with the in-house team. Can actively configure and build cloud architectures hands-on and guide others.
  • Key Knowledge
    • 3-5+ years of experience in AWS/GCP or Azure technologies
    • Is likely certified on one or more of the major cloud platforms
    • Strong experience from hands-on work with technologies such as Terraform, K8S, Docker and orchestration of containers.
    • Ability to guide and lead internal agile teams on cloud technology
    • Background from the financial services industry or similar critical operational experience
Read more
Leading Payment Solution Company

Leading Payment Solution Company

Agency job
Remote, Bengaluru (Bangalore), Chennai, Pune, Hyderabad, Mumbai
3 - 10 yrs
₹8L - ₹28L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+3 more

Experience: 3+ years of experience in Cloud Architecture

About Company:

The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.



Cloud Architect / Lead

  • Role Overview
    • Senior Engineer with a strong background and experience in cloud-related technologies and architectures. Can design target cloud architectures to transform existing architectures together with the in-house team. Can actively configure and build cloud architectures hands-on and guide others.
  • Key Knowledge
    • 3-5+ years of experience in AWS/GCP or Azure technologies
    • Is likely certified on one or more of the major cloud platforms
    • Strong experience from hands-on work with technologies such as Terraform, K8S, Docker and orchestration of containers.
    • Ability to guide and lead internal agile teams on cloud technology
    • Background from the financial services industry or similar critical operational experience
 
Read more
Intuitive Technology Partners
Aakriti Gupta
Posted by Aakriti Gupta
Remote, Ahmedabad, Pune, Gurugram, Chennai, Bengaluru (Bangalore), india
6 - 12 yrs
Best in industry
DevOps
skill iconKubernetes
skill iconDocker
Terraform
Linux/Unix
+10 more

Intuitive is the fastest-growing top-tier Cloud Solutions and Services company supporting Global Enterprise Customers across the Americas, Europe and the Middle East.

Intuitive is looking for highly talented hands-on Cloud Infrastructure Architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive’s global world class technology teams, working with some of the best and brightest engineers while also developing your skills and furthering your career working with some of the largest customers.

Job Description :

  • Extensive experience with K8s (EKS/GKE) and K8s ecosystem tooling, e.g., Prometheus, ArgoCD, Grafana, Istio, etc.
  • Extensive AWS/GCP Core Infrastructure skills
  • Infrastructure/ IAC Automation, Integration - Terraform
  • Kubernetes resources engineering and management
  • Experience with DevOps tools, CICD pipelines and release management
  • Good at creating documentation(runbooks, design documents, implementation plans )

Linux Experience :

  1. Namespace
  2. Virtualization
  3. Containers

 

Networking Experience

  1. Virtual networking
  2. Overlay networks
  3. Vxlans, GRE

 

Kubernetes Experience :

Should have experience in bringing up a Kubernetes cluster manually, without using the kubeadm tool.

 

Observability                              

Experience in observability is a plus

 

Cloud automation :

Familiarity with cloud platforms, exclusively AWS, and DevOps tools like Jenkins, Terraform, etc.

 

Read more
MNC Company - Product Based

MNC Company - Product Based

Agency job
via Bharat Headhunters by Ranjini C. N
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 9 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
skill iconPython
Google Cloud Platform (GCP)
+2 more

Job Responsibilities

  • Design, build & test ETL processes using Python & SQL for the corporate data warehouse
  • Inform, influence, support, and execute our product decisions
  • Maintain advertising data integrity by working closely with R&D to organize and store data in a format that provides accurate data and allows the business to quickly identify issues.
  • Evaluate and prototype new technologies in the area of data processing
  • Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
  • High energy level, strong team player and good work ethic
  • Data analysis, understanding of business requirements and translation into logical pipelines & processes
  • Identification, analysis & resolution of production & development bugs
  • Support the release process including completing & reviewing documentation
  • Configure data mappings & transformations to orchestrate data integration & validation
  • Provide subject matter expertise
  • Document solutions, tools & processes
  • Create & support test plans with hands-on testing
  • Peer reviews of work developed by other data engineers within the team
  • Establish good working relationships & communication channels with relevant departments
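The first responsibility above, building and testing ETL processes in Python and SQL, can be sketched with a minimal, self-contained example. This uses the stdlib `sqlite3` module as a stand-in warehouse; the table and column names are invented for illustration, not taken from any real schema.

```python
import sqlite3

# Minimal ETL sketch: extract raw ad-impression rows, transform them into
# per-campaign aggregates, and load the result into a summary table.

def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: pull the raw rows
    rows = cur.execute("SELECT campaign, clicks, cost FROM raw_ads").fetchall()

    # Transform: aggregate clicks and cost per campaign
    summary = {}
    for campaign, clicks, cost in rows:
        agg = summary.setdefault(campaign, {"clicks": 0, "cost": 0.0})
        agg["clicks"] += clicks
        agg["cost"] += cost

    # Load: write cost-per-click into the target table, guarding against
    # divide-by-zero for campaigns with no clicks
    cur.execute(
        "CREATE TABLE IF NOT EXISTS ad_summary "
        "(campaign TEXT PRIMARY KEY, clicks INT, cpc REAL)"
    )
    for campaign, agg in summary.items():
        cpc = agg["cost"] / agg["clicks"] if agg["clicks"] else 0.0
        cur.execute("INSERT OR REPLACE INTO ad_summary VALUES (?, ?, ?)",
                    (campaign, agg["clicks"], cpc))
    conn.commit()
    return len(summary)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_ads (campaign TEXT, clicks INT, cost REAL)")
conn.executemany("INSERT INTO raw_ads VALUES (?, ?, ?)",
                 [("spring", 10, 25.0), ("spring", 10, 15.0), ("winter", 0, 5.0)])
print(run_etl(conn))  # number of campaigns summarized -> 2
```

A production pipeline would run this kind of step under an orchestrator (e.g. Airflow) with logging and retries, but the extract-transform-load shape is the same.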

 

Skills and Qualifications we look for

  • University degree 2.1 or higher (or equivalent) in a relevant subject. A Master’s degree in any data subject is a strong advantage.
  • 4-6 years of experience in data engineering.
  • Strong coding ability and software development experience in Python.
  • Strong hands-on experience with SQL and data processing.
  • Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc)
  • Good working experience with at least one ETL orchestration tool (Airflow preferred).
  • Strong analytical and problem-solving skills.
  • Good-to-have skills: Apache PySpark, CircleCI, Terraform
  • Motivated, self-directed, able to work with ambiguity and interested in emerging technologies, agile and collaborative processes.
  • Understanding & experience of agile / scrum delivery methodology

 

Client Of People First Consultants


Agency job
Chennai
3 - 8 yrs
₹2L - ₹9L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Windows Azure
+1 more

The candidates should have:

·  Strong knowledge of Windows and Linux operating systems

·  Experience working with version control systems like Git

·  Hands-on experience with tools such as Docker, SonarQube, Ansible, Kubernetes, and the ELK stack

·  Basic understanding of SQL commands

·  Experience working with Azure DevOps

prevaj consultants pvt ltd
Posted by Nilofer Jamal
Chennai
3 - 12 yrs
₹1L - ₹15L / yr
React.js
Node.js
JavaScript
Fullstack Developer
Software Testing (QA)
+8 more
1. Must have experience in React and Node.js.
2. Experience with CI systems and continuous integration environments.
3. Experience with any of the cloud platforms, such as GCP or AWS.
4. Experience with end-to-end testing frameworks like Puppeteer, Cypress, Protractor, or other Selenium-based frameworks.
5. Experience with JavaScript unit testing frameworks like Jest or Mocha.
6. Experience in TypeScript.
prevaj consultants pvt ltd
Posted by Nilofer Jamal
Chennai
5 - 15 yrs
₹2L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more

  • 5+ years of experience building real-time and distributed system architecture, from whiteboard to production
  • Strong programming skills in Python, Scala, and SQL.
  • Versatility. Experience across the entire spectrum of data engineering, including:
  • Data stores (e.g., AWS RDS, AWS Athena, AWS Aurora, AWS Redshift)
  • Data pipeline and workflow orchestration tools (e.g., Azkaban, Airflow)
  • Data processing technologies (e.g., Spark, Pentaho)
  • Deployment and monitoring large database clusters in public cloud platforms (e.g., Docker, Terraform, Datadog)
  • Creating ETL or ELT pipelines that transform and process petabytes of structured and unstructured data in real-time
  • Industry experience building and productionizing innovative end-to-end Machine Learning systems is a plus.
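The real-time pipeline requirement above can be illustrated with a toy, stdlib-only sketch. Generators process records one at a time rather than materializing batches, which is the basic shape of a streaming transform stage; the event fields and threshold here are invented for illustration.

```python
from typing import Iterable, Iterator

# Toy streaming pipeline: parse -> filter -> running aggregate,
# with each stage consuming records incrementally.

def parse(lines: Iterable[str]) -> Iterator[dict]:
    """Turn raw 'user,amount' lines into event dicts."""
    for line in lines:
        user, amount = line.split(",")
        yield {"user": user, "amount": float(amount)}

def filter_large(events: Iterable[dict], threshold: float) -> Iterator[dict]:
    """Pass through only events at or above the threshold."""
    for event in events:
        if event["amount"] >= threshold:
            yield event

def running_total(events: Iterable[dict]) -> Iterator[float]:
    """Emit the cumulative sum after each surviving event."""
    total = 0.0
    for event in events:
        total += event["amount"]
        yield total

stream = ["alice,10", "bob,3", "alice,7"]
totals = list(running_total(filter_large(parse(stream), threshold=5.0)))
print(totals)  # [10.0, 17.0]
```

Engines like Spark Structured Streaming apply the same stage-by-stage model, but distributed, fault-tolerant, and at petabyte scale.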
Searce Inc

Posted by Yashodatta Deshapnde
Pune, Noida, Bengaluru (Bangalore), Mumbai, Chennai
3 - 10 yrs
₹5L - ₹20L / yr
DevOps
Kubernetes
Google Cloud Platform (GCP)
Terraform
Jenkins
+2 more
Role & Responsibilities :
• At least 4 years of hands-on experience with cloud infrastructure on GCP
• Hands-on experience with Kubernetes is a mandate
• Exposure to configuration management and orchestration tools at scale (e.g. Terraform, Ansible, Packer)
• Knowledge of and hands-on experience with DevOps tools (e.g. Jenkins, Groovy, and Gradle)
• Knowledge of and hands-on experience with CI/CD platforms (e.g. GitLab, CircleCI, and Spinnaker)
• Familiarity with monitoring and alerting tools (e.g. CloudWatch, ELK stack, Prometheus)
• Proven ability to work independently or as an integral member of a team

Preferable Skills:
• Familiarity with standard IT security practices such as encryption, credentials, and key management
• Proven experience with various coding languages (Java, Python) to support DevOps operations and cloud transformation
• Familiarity with and knowledge of web standards (e.g. REST APIs, web security mechanisms)
• Hands-on experience with GCP
• Experience in performance tuning, service outage management, and troubleshooting

Attributes:
• Good verbal and written communication skills
• Exceptional leadership, time management, and organizational skills; ability to operate independently and make decisions with little direct supervision
enterprise-grade, streaming integration with intelligence pl


Agency job
via Jobdost by Mamatha A
Chennai
5 - 15 yrs
₹15L - ₹30L / yr
Java
C++
Data Structures
SQL
Amazon RDS
+15 more

Striim (pronounced “stream” with two i’s for integration and intelligence) was founded in 2012 with a simple goal of helping companies make data useful the instant it’s born.

Striim’s enterprise-grade, streaming integration with intelligence platform makes it easy to build continuous, streaming data pipelines – including change data capture (CDC) – to power real-time cloud integration, log correlation, edge processing, and streaming analytics.
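The core idea of change data capture described above, replaying a stream of insert/update/delete events against a target, can be shown in a small sketch. The event format here is invented for illustration; real CDC tools such as Striim emit much richer metadata (log positions, timestamps, schemas).

```python
# Toy CDC apply logic: change events from a source database log are
# replayed in order against a dictionary standing in for the replica.

def apply_changes(replica: dict, events: list) -> dict:
    for event in events:
        op, key, value = event["op"], event["key"], event.get("value")
        if op in ("insert", "update"):
            replica[key] = value        # upsert the new row image
        elif op == "delete":
            replica.pop(key, None)      # tolerate already-missing keys
    return replica

events = [
    {"op": "insert", "key": 1, "value": "alice"},
    {"op": "insert", "key": 2, "value": "bob"},
    {"op": "update", "key": 1, "value": "alicia"},
    {"op": "delete", "key": 2},
]
print(apply_changes({}, events))  # {1: 'alicia'}
```

Ordering matters: applying the same events out of order would leave the replica inconsistent, which is why CDC pipelines track log positions per source.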

Strong Core Java / C++ experience

·       Excellent understanding of logical and object-oriented design patterns, algorithms, and data structures.

·       Sound knowledge of application access methods, including authentication mechanisms, API quota limits, and different endpoint types (REST, Java, etc.)

·       Strong experience with databases: not just SQL programming, but knowledge of DB internals

·       Sound knowledge of cloud databases available as a service (RDS, Cloud SQL, Google BigQuery, Snowflake) is a plus

·       Experience working in any cloud environment with a microservices-based architecture, using GCP, Kubernetes, Docker, CircleCI, Azure, or similar technologies

·       Experience in application verticals such as ERP, CRM, or Sales, with applications such as Salesforce, Workday, SAP (not mandatory; added advantage)

·       Experience in building distributed systems (not mandatory; added advantage)

·       Expertise in data warehousing (not mandatory; added advantage)

·       Experience in developing & delivering a product as SaaS (not mandatory; added advantage)

Mobile Programming LLC

Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines, and software programs to solve analytics and data mining problems
Interact with data scientists, product managers, and engineers to understand business problems and technical requirements, and deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve the data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering, or related disciplines
Strong fundamentals: data structures, algorithms, databases
5+ years of software industry experience, with 2+ years in analytics, data mining, and/or data warehousing
Fluency with Python
Experience developing web services using REST approaches
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail-oriented team player with excellent communication skills and the ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engine, like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
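The REST web-services requirement above can be illustrated with a minimal, stdlib-only sketch. The `/health` route and JSON payload are invented for illustration; a real service would use a framework such as Flask or FastAPI with proper routing and error handling.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal REST-style service: a single GET endpoint returning JSON.

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.loads(resp.read())
print(payload)  # {'status': 'ok'}

server.shutdown()
```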