ETL Jobs in Ahmedabad


Apply to 7+ ETL Jobs in Ahmedabad on CutShort.io. Explore the latest ETL Job opportunities across top companies like Google, Amazon & Adobe.

Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
7 - 14 yrs
₹20L - ₹30L / yr
Solution architecture
skill iconData Analytics
Data architecture
Data Warehouse (DWH)
Enterprise Data Warehouse (EDW)
+21 more

As a Solution Architect, you will collaborate with our sales, presales and COE teams to provide technical expertise and support throughout the new business acquisition process. You will play a crucial role in understanding customer requirements, presenting our solutions, and demonstrating the value of our products.


You thrive in high-pressure environments, maintaining a positive outlook and understanding that career growth is a journey that requires making strategic choices. You possess strong written and verbal communication skills, enabling you to convey complex technical concepts clearly and effectively. You are a customer-focused, self-motivated team player who works well under pressure with a positive attitude. You must have experience in managing RFPs/RFIs, delivering client demos and presentations, and converting opportunities into winning bids. You bring a strong work ethic, enthusiasm for new challenges, and good time-management skills that let you multi-task and prioritize effectively. You can work independently with little or no supervision, and you are process-oriented, methodical, and committed to a quality-first approach.


The key performance indicator for this role will be your ability to convert clients' business challenges and priorities into winning proposals and bids through excellence in technical solutions.


What you’ll do

  • Architecture & Design: Develop high-level architecture designs for scalable, secure, and robust solutions.
  • Technology Evaluation: Select appropriate technologies, frameworks, and platforms for business needs.
  • Cloud & Infrastructure: Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
  • Integration: Ensure seamless integration between various enterprise applications, APIs, and third-party services.
  • Design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms such as Microsoft Fabric.
  • Translate business needs into technical solutions by designing secure, scalable, and performant data architectures on cloud platforms.
  • Select and recommend appropriate data services (e.g., Fabric, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Power BI) to meet specific data storage, processing, and analytics needs.
  • Develop and recommend data models that optimize data access and querying. Design and implement data pipelines for efficient data extraction, transformation, and loading (ETL/ELT) processes.
  • Understand conceptual, logical, and physical data modelling.
  • Choose and implement appropriate data storage, processing, and analytics services based on specific data needs (e.g., data lakes, data warehouses, data pipelines).
  • Understand and recommend data governance practices, including data lineage tracking, access control, and data quality monitoring.
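The ETL/ELT pipeline design work described above can be illustrated with a minimal sketch. The example below is hypothetical (table names, currency rates, and schema are illustrative, not from the posting) and uses SQLite in place of a cloud warehouse to keep it self-contained:

```python
import sqlite3

# Hypothetical minimal ETL sketch: extract rows from a staging table,
# transform them, and load them into a warehouse-style fact table.
def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: read raw order rows from the (hypothetical) staging table.
    rows = cur.execute("SELECT id, amount, currency FROM staging_orders").fetchall()
    # Transform: normalize all amounts to USD (rates are assumed).
    rates = {"USD": 1.0, "EUR": 1.1}
    cleaned = [(rid, round(amount * rates.get(code, 1.0), 2))
               for rid, amount, code in rows]
    # Load: upsert into the reporting table.
    cur.executemany(
        "INSERT OR REPLACE INTO fact_orders (id, amount_usd) VALUES (?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)
```

In a real Azure Data Factory or Fabric pipeline the same extract-transform-load shape would be expressed as pipeline activities rather than a single function.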



What you will Bring 

  • 10+ years of experience in data analytics and AI technologies from consulting, implementation, and design perspectives
  • Certifications in data engineering, analytics, cloud, or AI will be a distinct advantage
  • A Bachelor's in engineering/technology or an MCA from a reputed college is a must
  • Prior experience working as a solution architect during the presales cycle will be an advantage


Soft Skills

  • Communication Skills
  • Presentation Skills
  • Flexible and Hard-working


Technical Skills

  • Knowledge of Presales Processes
  • Basic understanding of business analytics and AI
  • High IQ and EQ


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.
Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
5 - 10 yrs
₹10L - ₹20L / yr
Data engineering
Data modeling
Database Design
Data Warehouse (DWH)
Datawarehousing
+9 more

Job Summary: 

As a Data Engineering Lead, your role will involve designing, developing, and implementing interactive dashboards and reports using data engineering tools. You will work closely with stakeholders to gather requirements and translate them into effective data visualizations that provide valuable insights. Additionally, you will be responsible for extracting, transforming, and loading data from multiple sources into Power BI, ensuring its accuracy and integrity. Your expertise in Power BI and data analytics will contribute to informed decision-making and support the organization in driving data-centric strategies and initiatives.


We are looking for you!

As an ideal candidate for the Data Engineering Lead position, you embody the qualities of a team player with a relentless get-it-done attitude. Your intellectual curiosity and customer focus drive you to continuously seek new ways to add value to your job accomplishments.


You thrive under pressure, maintaining a positive attitude and understanding that your career is a journey. You are willing to make the right choices to support your growth. In addition to your excellent communication skills, both written and verbal, you have a proven ability to create visually compelling designs using tools like Power BI and Tableau that effectively communicate our core values. 


You build high-performing, scalable, enterprise-grade applications and teams. Your creativity and proactive nature enable you to think differently, find innovative solutions, deliver high-quality outputs, and ensure customers remain referenceable. With over eight years of experience in data engineering, you possess a strong sense of self-motivation and take ownership of your responsibilities. You prefer to work independently with little to no supervision. 


You are process-oriented, adopt a methodical approach, and demonstrate a quality-first mindset. You have led mid to large-size teams and accounts, consistently using constructive feedback mechanisms to improve productivity, accountability, and performance within the team. Your track record showcases your results-driven approach, as you have consistently delivered successful projects with customer case studies published on public platforms. Overall, you possess a unique combination of skills, qualities, and experiences that make you an ideal fit to lead our data engineering team(s).


You value inclusivity and want to join a culture that empowers you to show up as your authentic self. 


You know that success hinges on commitment, our differences make us stronger, and the finish line is always sweeter when the whole team crosses together. In your role, you should be driving the team using data, data, and more data. You will manage multiple teams, oversee agile stories and their statuses, handle escalations and mitigations, plan ahead, identify hiring needs, collaborate with recruitment teams for hiring, enable sales with pre-sales teams, and work closely with development managers/leads for solutioning and delivery statuses, as well as architects for technology research and solutions.


What You Will Do: 

  • Analyze Business Requirements.
  • Analyze the data model, perform gap analysis against business requirements, and design and model the Power BI schema.
  • Transform data in Power BI, SQL, or an ETL tool.
  • Create DAX formulas, reports, and dashboards.
  • Write SQL queries and stored procedures.
  • Design effective Power BI solutions based on business requirements.
  • Manage a team of Power BI developers and guide their work.
  • Integrate data from various sources into Power BI for analysis.
  • Optimize performance of reports and dashboards for smooth usage.
  • Collaborate with stakeholders to align Power BI projects with goals.
  • Knowledge of data warehousing is a must; data engineering is a plus
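The DAX and SQL work listed above typically amounts to aggregating a fact table against dimension tables in a star schema. The sketch below is illustrative only (table and column names are hypothetical) and uses SQLite to show the kind of query a Power BI measure resolves to:

```python
import sqlite3

# Illustrative star schema: one fact table joined to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region_name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, region_id INTEGER, revenue REAL);
    INSERT INTO dim_region VALUES (1, 'West'), (2, 'East');
    INSERT INTO fact_sales VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

def revenue_by_region(conn: sqlite3.Connection) -> dict:
    # Equivalent in spirit to a DAX measure like SUM(fact_sales[revenue])
    # sliced by dim_region[region_name] on a report visual.
    return dict(conn.execute("""
        SELECT r.region_name, SUM(f.revenue)
        FROM fact_sales f JOIN dim_region r USING (region_id)
        GROUP BY r.region_name
    """).fetchall())
```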


What we need?

  • B.Tech in Computer Science or equivalent
  • 5+ years of relevant experience


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


MindInventory

Posted by Khushi Bhatt
Ahmedabad
3 - 5 yrs
₹4L - ₹12L / yr
Data engineering
ETL
Google Cloud Platform (GCP)
Apache Airflow
Snowflake schema
+3 more
  • Required Minimum 3 years of Experience as a Data Engineer
  • Database Knowledge: Experience with time-series and graph databases is a must, along with SQL databases (PostgreSQL, MySQL) or NoSQL databases (Firestore, MongoDB).
  • Data Pipelines: Understanding of data pipeline processes such as ETL, ELT, and streaming pipelines, with tools like AWS Glue, Google Dataflow, Apache Airflow, and Apache NiFi.
  • Data Modeling: Knowledge of snowflake schemas and fact & dimension tables.
  • Data Warehousing Tools: Experience with Google BigQuery, Snowflake, Databricks
  • Performance Optimization: Indexing, partitioning, caching, query optimization techniques.
  • Python or SQL Scripting: Ability to write scripts for data processing and automation.
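The data processing and automation scripting mentioned above often boils down to glue code that parses raw input, validates it, and drops malformed records. A minimal sketch, with hypothetical field names:

```python
import csv
import io

# Hypothetical automation script: parse raw CSV, skip malformed rows,
# and cast fields -- typical pipeline glue code.
def clean_records(raw_csv: str) -> list:
    out = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            # Cast and validate; rows with missing or non-numeric
            # values are dropped rather than poisoning the pipeline.
            out.append({"sensor": row["sensor"], "value": float(row["value"])})
        except (KeyError, ValueError):
            continue
    return out
```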


Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
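The event-based Lambda work above usually starts from an S3 object-created event. A hedged sketch of such a handler — the bucket/key extraction follows the documented S3 event shape, while the downstream trigger (e.g., starting a Glue job or Step Function) is left as a placeholder comment:

```python
import urllib.parse

# Sketch of an event-driven AWS Lambda handler for S3 PUT events.
def lambda_handler(event, context):
    keys = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys arrive URL-encoded (spaces as '+', '/' as '%2F').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        keys.append(f"{bucket}/{key}")
        # In a real job, this is where the handler would start a Glue job
        # or a Step Functions execution for the new object.
    return {"status": "ok", "objects": keys}
```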
Service based company

Agency job
via Vmultiply solutions by Mounica Buddharaju
Ahmedabad, Rajkot
2 - 4 yrs
₹3L - ₹6L / yr
skill iconPython
skill iconAmazon Web Services (AWS)
SQL
ETL


Qualifications:

  • Minimum 2 years of .NET development experience (ASP.Net 3.5 or greater and C# 4 or greater).
  • Good knowledge of MVC, Entity Framework, and Web API/WCF.
  • ASP.NET Core knowledge is preferred.
  • Creating APIs / Using third-party APIs
  • Working knowledge of Angular is preferred.
  • Knowledge of Stored Procedures and experience with a relational database (MSSQL 2012 or higher).
  • Solid understanding of object-oriented development principles
  • Working knowledge of web, HTML, CSS, JavaScript, and the Bootstrap framework
  • Strong understanding of object-oriented programming
  • Ability to create reusable C# libraries
  • Must be able to write clean comments and readable C# code, and have the ability to self-learn.
  • Working knowledge of GIT

Qualities required:

Over and above the tech skills, we prefer:

  • Good communication and time management skills.
  • A good team player with the ability to contribute on an individual basis.

  • We provide the best learning and growth environment for candidates.

Skills:


  • .NET Core
  • .NET Framework
  • ASP.NET Core
  • ASP.NET MVC
  • ASP.NET Web API
  • C#
  • HTML


Product and Service based company

Agency job
via Jobdost by Sathish Kumar
Hyderabad, Ahmedabad
4 - 8 yrs
₹15L - ₹30L / yr
skill iconAmazon Web Services (AWS)
Apache
Snowflake schema
skill iconPython
Spark
+13 more

Job Description

 

Mandatory Requirements 

  • Experience in AWS Glue

  • Experience in Apache Parquet 

  • Proficient in AWS S3 and data lake 

  • Knowledge of Snowflake

  • Understanding of file-based ingestion best practices.

  • Scripting languages: Python and PySpark

CORE RESPONSIBILITIES

  • Create and manage cloud resources in AWS 

  • Ingest data from different sources that expose data via different technologies, such as RDBMSs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing with the help of Big Data technologies 

  • Process and transform data using technologies such as Spark and cloud services. You will need to understand your part of the business logic and implement it using the language supported by the base data platform 

  • Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations 

  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.

  • Define process improvement opportunities to optimize data collection, insights and displays.

  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 

  • Identify and interpret trends and patterns from complex data sets 

  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 

  • Key participant in regular Scrum ceremonies with the agile teams  

  • Proficient at developing queries, writing reports and presenting findings 

  • Mentor junior members and bring best industry practices.
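The automated data quality checks listed in the responsibilities above can be sketched as a simple validation gate. The rules and field names below are illustrative assumptions, not from the posting:

```python
# Sketch of an automated data-quality gate: validate each row against
# simple rules and report failures instead of loading bad data.
def quality_check(rows: list) -> dict:
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1 (illustrative): amounts must be present and non-negative.
        if row.get("amount") is None or row["amount"] < 0:
            failures.append((i, "amount must be non-negative"))
        # Rule 2 (illustrative): ids must be unique across the batch.
        if row.get("id") in seen_ids:
            failures.append((i, "duplicate id"))
        seen_ids.add(row.get("id"))
    return {"passed": not failures, "failures": failures}
```

A real pipeline would run a gate like this between ingestion and load, and route failing batches to a quarantine location for inspection.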

 

QUALIFICATIONS

  • 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales) 

  • Strong background in math, statistics, computer science, data science or related discipline

  • Advanced knowledge of at least one language: Java, Scala, Python, or C# 

  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  

  • Proficient with:

  • Data mining/programming tools (e.g., SAS, SQL, R, Python)

  • Database technologies (e.g., PostgreSQL, Redshift, Snowflake, and Greenplum)

  • Data visualization (e.g., Tableau, Looker, MicroStrategy)

  • Comfortable learning about and deploying new technologies and tools. 

  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 

  • Good written and oral communication skills and ability to present results to non-technical audiences 

  • Knowledge of business intelligence and analytical tools, technologies and techniques.

Familiarity and experience in the following is a plus: 

  • AWS certification

  • Spark Streaming 

  • Kafka Streaming / Kafka Connect 

  • ELK Stack 

  • Cassandra / MongoDB 

  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools

Discite Analytics Private Limited
Uma Sravya B
Posted by Uma Sravya B
Ahmedabad
4 - 7 yrs
₹12L - ₹20L / yr
Hadoop
Big Data
Data engineering
Spark
Apache Beam
+13 more
Responsibilities:
1. Communicate with the clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to strive for greater functionality.

Skills required: (experience with at least most of these)
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and Data Warehousing.
4. Experience and firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra etc.
5. Experience with cloud platforms like AWS, GCP and Azure.
6. Experience with workflow management using tools like Apache Airflow.