
3+ Glue semantics Jobs in India

Apply to 3+ Glue semantics Jobs on CutShort.io. Find your next job effortlessly. Browse Glue semantics Jobs and apply today!

NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Gurugram
7 - 15 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark

Job Title: Tech Lead - Data Engineering (AWS, 7+ Years)

Location: Gurugram

Employment Type: Full-Time


Job Summary:

Seeking a Tech Lead - Data Engineering with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.


Key Responsibilities:

  • Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.); a minimal Glue job sketch follows this list.
  • Maintain data lakes & warehouses for analytics.
  • Ensure data integrity through quality checks.
  • Collaborate with data scientists & engineers to deliver solutions.
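
A minimal sketch of such a pipeline, assuming a hypothetical AWS Glue job that reads raw CSV from S3, cleans it, and writes Parquet back to S3. The job name, bucket paths, and column names are illustrative placeholders, not details from the posting:

    # Minimal AWS Glue job sketch (PySpark). All names below are
    # hypothetical placeholders.
    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw CSV from the (hypothetical) landing bucket.
    raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/")

    # Basic cleanup: drop duplicates and rows missing the key column.
    clean = raw.dropDuplicates(["order_id"]).na.drop(subset=["order_id"])

    # Write curated Parquet back to S3, partitioned by date; Redshift can
    # then load it via COPY or query it in place via Redshift Spectrum.
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/"
    )

    job.commit()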

Qualifications:

  • 7+ years in Data Engineering.
  • Expertise in AWS services, SQL, Python, Spark, and Kafka.
  • Experience with CI/CD and DevOps practices.
  • Strong problem-solving skills.

Preferred Skills:

  • Experience with Snowflake and Databricks.
  • Knowledge of BI tools (Tableau, Power BI).
  • Healthcare/Insurance domain experience is a plus.


Hyderabad
5 - 15 yrs
₹4L - ₹14L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
Big Data Engineer:

  • Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight.
  • Experience in developing Lambda functions with AWS Lambda.
  • Expertise with Spark/PySpark; the candidate should be hands-on with PySpark code and able to do transformations with Spark (a minimal sketch follows this list).
  • Should be able to code in Python and Scala.
  • Snowflake experience will be a plus.
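
As a rough illustration of the hands-on PySpark work this role calls for, here is a minimal transformation sketch; the input path, column names, and output location are hypothetical assumptions:

    # Minimal PySpark transformation sketch: filter, derive a column,
    # aggregate. All names below are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("transform-example").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical path

    daily_revenue = (
        orders
        .filter(F.col("status") == "COMPLETED")  # keep completed orders only
        .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
        .groupBy("order_date")
        .agg(F.sum("revenue").alias("total_revenue"))
    )

    # Written as Parquet so Athena can query it and QuickSight can chart it.
    daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/daily_revenue/")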
Urgent Openings with one of our clients

Hyderabad
3 - 7 yrs
₹3L - ₹10L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark


Experience: 3 to 7 years
Number of Positions: 20

Job Location: Hyderabad

Notice: 30 days

 

1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight.

2. Experience in developing Lambda functions with AWS Lambda (a minimal handler sketch follows this listing).

3. Expertise with Spark/PySpark; the candidate should be hands-on with PySpark code and able to do transformations with Spark.

4. Should be able to code in Python and Scala.

5. Snowflake experience will be a plus.

 

Hadoop and Hive are good to have; a working understanding of them is enough.
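
For point 2 above, a minimal AWS Lambda handler sketch in Python. The S3 put trigger, the Glue job name, and the argument names are assumptions made for illustration, not requirements from the posting:

    # Minimal Lambda handler sketch: on an S3 put event, start a
    # (hypothetical) Glue job to process the new object.
    import json

    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        # S3 put events carry the bucket and object key in this structure.
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"New object: s3://{bucket}/{key}")

            # Kick off a downstream Glue job (job name is a placeholder).
            glue.start_job_run(
                JobName="example-etl-job",
                Arguments={"--input_path": f"s3://{bucket}/{key}"},
            )

        return {"statusCode": 200, "body": json.dumps("ok")}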
