Datawarehousing Jobs in Pune


Apply to 8+ Datawarehousing jobs in Pune on CutShort.io. Explore the latest Datawarehousing job opportunities across top companies like Google, Amazon & Adobe.

Adesso

Agency job
via HashRoot by Maheswari M
Kochi (Cochin), Chennai, Pune
3 - 6 yrs
₹4L - ₹24L / yr
Data engineering
Amazon Web Services (AWS)
Windows Azure
Snowflake
dbt (data build tool)
+3 more

We are seeking a skilled Cloud Data Engineer with experience in cloud data platforms such as AWS or Azure, and especially Snowflake and dbt, to join our dynamic team. As a consultant, you will be responsible for developing new data platforms and creating the data processes that run on them. You will collaborate with cross-functional teams to design, develop, and deploy high-quality solutions.

Responsibilities:

Customer consulting: You develop data-driven products on the Snowflake cloud and connect data & analytics with specialist departments. You develop ELT processes using dbt (data build tool); a minimal sketch follows the skills list below.

Specifying requirements: You develop concrete requirements for future-proof cloud data architectures.

Developing data pipelines: You design scalable, high-performance data management processes.

Analyzing data: You derive sound findings from data sets and present them clearly.

Requirements:

Requirements management and project experience: You successfully implement cloud-based data & analytics projects.

Data architectures: You are proficient in DWH/data lake concepts and modeling with Data Vault 2.0.

Cloud expertise: You have extensive knowledge of Snowflake, dbt and other cloud technologies (e.g. MS Azure, AWS, GCP).

SQL know-how: You have solid knowledge of SQL.

Data management: You are familiar with topics such as master data management and data quality.

Bachelor's degree in computer science or a related field.

Strong communication and collaboration abilities to work effectively in a team environment.

 

Skills & Requirements

Cloud Data Engineering, AWS, Azure, Snowflake, dbt, ELT processes, Data-driven consulting, Cloud data architectures, Scalable data management, Data analysis, Requirements management, Data warehousing, Data lake, Data Vault 2.0, SQL, Master data management, Data quality, GCP, Strong communication, Collaboration.
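
For orientation, dbt materializes SQL transformations inside the warehouse (ELT rather than ETL). The sketch below, referenced in the responsibilities above, shows that same pattern by hand via Snowflake's Python connector. It is an illustration only: every credential, table, and column name is a placeholder, not something from this posting.

```python
# Minimal ELT sketch on Snowflake: transform raw, already-loaded data
# inside the warehouse, which is the pattern dbt automates as SQL
# "models". All connection details and names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

STG_ORDERS_SQL = """
CREATE OR REPLACE TABLE stg_orders AS
SELECT
    order_id,
    customer_id,
    TRY_TO_DATE(order_date)   AS order_date,
    NULLIF(TRIM(status), '')  AS status
FROM raw.orders
WHERE order_id IS NOT NULL
"""

with conn.cursor() as cur:
    cur.execute(STG_ORDERS_SQL)  # the transform runs in the warehouse
conn.close()
```

In a real dbt project the SELECT would live in a models/*.sql file, with dbt handling materialization, dependency ordering, and tests.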

Client based in Pune.

Agency job
Pune
5 - 9 yrs
₹18L - ₹30L / yr
Data Engineer
Python
Datawarehousing
Snowflake schema
Data modeling
+7 more

Skills & Experience:

❖ At least 5 years of experience as a Data Engineer

❖ Hands-on, in-depth experience with star / snowflake schema design, data modeling, data pipelining, and MLOps (a toy modeling sketch follows this list)

❖ Experience in Data Warehouse technologies (e.g., Snowflake, AWS Redshift)

❖ Experience in AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.)

❖ Proficient in SQL

❖ At least one major programming language (Python / Java)

❖ Experience with Data Analysis Tools such as Looker or Tableau

❖ Experience with pandas, NumPy, scikit-learn, and Jupyter notebooks preferred

❖ Familiarity with Git, GitHub, and JIRA.

❖ Ability to locate & resolve data quality issues

❖ Ability to demonstrate end-to-end data platform support experience

Other Skills:

❖ Individual contributor

❖ Hands-on with programming

❖ Strong analytical and problem-solving skills with meticulous attention to detail

❖ A positive mindset and can-do attitude

❖ To be a great team player

❖ To have an eye for detail

❖ Looks for opportunities to simplify and automate tasks and to build reusable components

❖ Ability to judge suitability of new technologies for solving business problems

❖ Build strong relationships with analysts, business, and engineering stakeholders

❖ Task Prioritization

❖ Familiar with agile methodologies.

❖ Fintech or Financial services industry experience

❖ Eagerness to learn about the Private Equity / Venture Capital ecosystem and the associated secondary market
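
As referenced in the schema-design bullet above, here is a toy illustration (not part of the posting) of what star-schema modeling means in practice: splitting a flat extract into a dimension table with a surrogate key and a fact table of measures. All column names and values are invented.

```python
# Toy star-schema split with pandas: one dimension, one fact.
import pandas as pd

flat = pd.DataFrame({
    "order_id":      [1, 2, 3],
    "customer_name": ["Acme", "Globex", "Acme"],
    "customer_city": ["Pune", "Mumbai", "Pune"],
    "amount":        [120.0, 80.5, 42.0],
})

# Dimension: one row per distinct customer, keyed by a surrogate key.
dim_customer = (
    flat[["customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus a foreign key into the dimension.
fact_orders = flat.merge(
    dim_customer, on=["customer_name", "customer_city"]
)[["order_id", "customer_key", "amount"]]

print(dim_customer)
print(fact_orders)
```

A snowflake schema would further normalize the dimension, for example into a separate city table.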

Responsibilities:

o Design, develop and maintain a data platform that is accurate, secure, available, and fast.

o Engineer efficient, adaptable, and scalable data pipelines to process data.

o Integrate and maintain a variety of data sources: different databases, APIs, SaaS applications, files, logs, events, etc.

o Create standardized datasets to service a wide variety of use cases.

o Develop subject-matter expertise in tables, systems, and processes.

o Partner with product and engineering to ensure product changes integrate well with the data platform.

o Partner with diverse stakeholder teams, understand their challenges, and empower them with data solutions to meet their goals.

o Perform data quality checks on data sources, and automate and maintain a quality-control capability (sketched below).
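
A hypothetical sketch of the automated quality-control capability that last responsibility mentions: codify a few invariants and run them before a dataset is published. Column names and the sample data are invented.

```python
# Hypothetical data-quality gate: assert invariants on a dataset
# before publishing it. All column names here are invented.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list:
    """Return human-readable failures; an empty list means clean."""
    failures = []
    if df["order_id"].isnull().any():
        failures.append("order_id contains NULLs")
    if df["order_id"].duplicated().any():
        failures.append("order_id is not unique")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
for failure in run_quality_checks(df):
    print("DQ FAIL:", failure)
```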

Clients located in Bangalore, Chennai & Pune.

Agency job
Bengaluru (Bangalore), Pune, Chennai
3 - 8 yrs
₹8L - ₹16L / yr
ETL
Python
Shell Scripting
Data modeling
Datawarehousing

Role: Ab Initio Developer

Experience: 2.5 years (mandatory minimum) to 8 years

Skills: Ab Initio Development

Location: Chennai/Bangalore/Pune

Only immediate joiners or those with up to 15 days' notice.

Candidates must be available for an in-person interview.

It's a long-term contract role with IBM; Arnold is the payrolling company.

JOB DESCRIPTION:

We are seeking a skilled Ab Initio Developer to join our dynamic team and contribute to the development and maintenance of critical data integration solutions. As an Ab Initio Developer, you will design, develop, and implement robust and efficient data pipelines using Ab Initio's ETL capabilities (a generic ETL sketch in plain Python follows the skills lists below).


Key Responsibilities:

·      Design, develop, and implement complex data integration solutions using Ab Initio's graphical interface and command-line tools.

·      Analyze complex data requirements and translate them into effective Ab Initio designs.

·      Develop and maintain efficient data pipelines, including data extraction, transformation, and loading processes.

·      Troubleshoot and resolve technical issues related to Ab Initio jobs and data flows.

·      Optimize performance and scalability of Ab Initio jobs.

·      Collaborate with business analysts, data analysts, and other team members to understand data requirements and deliver solutions that meet business needs.

·      Stay up-to-date with the latest Ab Initio technologies and industry best practices.

Required Skills and Experience:

·      2.5 to 8 years of hands-on experience in Ab Initio development.

·      Strong understanding of Ab Initio components, including Designer, Conductor, and Monitor.

·      Proficiency in Ab Initio's graphical interface and command-line tools.

·      Experience in data modeling, data warehousing, and ETL concepts.

·      Strong SQL skills and experience with relational databases.

·      Excellent problem-solving and analytical skills.

·      Ability to work independently and as part of a team.

·      Strong communication and documentation skills.

Preferred Skills:

·      Experience with cloud-based data integration platforms.

·      Knowledge of data quality and data governance concepts.

·      Experience with scripting languages (e.g., Python, Shell scripting).

·      Certification in Ab Initio or related technologies.
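
Ab Initio graphs are built in its proprietary graphical environment, so they cannot be reproduced in text. As noted in the description above, the sketch below only illustrates the generic extract-transform-load pattern the role centers on, in plain Python, with file and field names invented for illustration.

```python
# Generic extract-transform-load flow in plain Python. An Ab Initio
# graph would express the same stages as connected components.
# File and field names are invented for illustration.
import csv

def extract(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Drop records missing a key and normalize a numeric field,
    # roughly what filter and reformat components would do.
    return [
        {**row, "amount": f"{float(row['amount']):.2f}"}
        for row in rows
        if row.get("customer_id")
    ]

def load(rows, path):
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

load(transform(extract("orders_in.csv")), "orders_out.csv")
```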

Aequor Technologies

Posted by Ranjana Guru
Remote only
8 - 15 yrs
₹1L - ₹20L / yr
Data Analytics
Datawarehousing
Data architecture
SAP HANA

Required Skills:

  • Proven work experience as an Enterprise / Data / Analytics Architect on a data platform in HANA XSA, XS, Data Intelligence, and SDI
  • Able to work on new and existing architecture decisions in HANA XSA, XS, Data Intelligence, and SDI
  • Well versed in data architecture principles, software / web application design, API design, UI / UX capabilities, and XSA / Cloud Foundry architecture
  • In-depth understanding of database structure (HANA in-memory) principles (a query sketch follows this list)
  • In-depth understanding of ETL solutions and data integration strategy
  • Excellent knowledge of software and application design, APIs, XSA, and microservices concepts
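
For orientation, the sketch referenced in the list above: a minimal query against HANA's in-memory column store from Python using SAP's hdbcli driver. The host, port, credentials, and sales table are placeholders, not details from this posting.

```python
# Minimal HANA query sketch using SAP's hdbcli DB-API driver.
# Every connection detail and name below is a placeholder.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical host
    port=30015,
    user="YOUR_USER",
    password="YOUR_PASSWORD",
)
cur = conn.cursor()
# Aggregations like this are what HANA's in-memory column store
# is optimized for.
cur.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")
for region, total in cur.fetchall():
    print(region, total)
cur.close()
conn.close()
```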

 

Roles & Responsibilities:

  • Advises on and ensures compliance with the defined data architecture principles.
  • Identifies new technology updates and development tools, including new releases/upgrades/patches, as required.
  • Analyzes technical risks and advises on risk mitigation strategy.
  • Advises on and ensures compliance with existing and newly developed data and reporting standards, including naming conventions.

 

The time window is ideally AEST (8 am till 5 pm), which, with AEST running 4.5 hours ahead of IST, means starting at 3:30 am IST. We understand this can be very early for an SME supporting from India, so we can consider candidates who can support from at least 7 am IST (earlier is possible).

InnovAccer

Posted by Jyoti Kaushik
Noida, Bengaluru (Bangalore), Pune, Hyderabad
4 - 7 yrs
₹4L - ₹16L / yr
ETL
SQL
Data Warehouse (DWH)
Informatica
Datawarehousing
+2 more

We are looking for a Senior Data Engineer to join the Customer Innovation team, who will be responsible for acquiring, transforming, and integrating customer data onto our Data Activation Platform from customers’ clinical, claims, and other data sources. You will work closely with customers to build data and analytics solutions to support their business needs, and be the engine that powers the partnership that we build with them by delivering high-fidelity data assets.

In this role, you will work closely with our Product Managers, Data Scientists, and Software Engineers to build the solution architecture that supports customer objectives. You'll work with some of the brightest minds in the industry and one of the richest healthcare data sets in the world, use cutting-edge technology, and see your efforts affect products and people on a regular basis. The ideal candidate is someone who:

  • Has healthcare experience and is passionate about helping heal people,
  • Loves working with data,
  • Has an obsessive focus on data quality,
  • Is comfortable with ambiguity and making decisions based on available data and reasonable assumptions,
  • Has strong data interrogation and analysis skills,
  • Defaults to written communication and delivers clean documentation, and,
  • Enjoys working with customers and problem solving for them.

A day in the life at Innovaccer:

  • Define the end-to-end solution architecture for projects by mapping customers’ business and technical requirements against the suite of Innovaccer products and solutions.
  • Measure and communicate impact to our customers.
  • Enable customers to activate data themselves using SQL, BI tools, or APIs so they can answer their questions at speed.

What You Need:

  • 4+ years of experience in a Data Engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
  • 4+ years of experience working with relational databases like Snowflake, Redshift, or Postgres.
  • Intermediate to advanced SQL programming skills (one illustrative query follows this list).
  • Data analytics and visualization skills (using tools like Power BI).
  • The ability to engage with both the business and technical teams of a client - to document and explain technical problems or concepts in a clear and concise way.
  • Ability to work in a fast-paced and agile environment.
  • Easily adapt and learn new things whether it’s a new library, framework, process, or visual design concept.
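
One illustrative example (referenced in the SQL bullet above) of intermediate-to-advanced analytical SQL: a window function ranking each customer's orders by recency, run here against Postgres via psycopg2. The connection string, table, and column names are placeholders.

```python
# Illustrative analytical SQL: rank each customer's orders by recency
# with a window function. Connection details and names are placeholders.
import psycopg2

RECENT_ORDERS_SQL = """
SELECT customer_id,
       order_id,
       order_date,
       ROW_NUMBER() OVER (
           PARTITION BY customer_id
           ORDER BY order_date DESC
       ) AS recency_rank
FROM orders
"""

conn = psycopg2.connect(
    "dbname=analytics user=YOUR_USER password=YOUR_PASSWORD"
)
with conn, conn.cursor() as cur:
    cur.execute(RECENT_ORDERS_SQL)
    for row in cur.fetchmany(10):
        print(row)
conn.close()
```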

What we offer:

  • Industry certifications: We want you to be a subject matter expert in what you do. So, whether it’s our product or our domain, we’ll help you dive in and get certified.
  • Quarterly rewards and recognition programs: We foster learning and encourage people to take risks. We recognize and reward your hard work.
  • Health benefits: We cover health insurance for you and your loved ones.
  • Sabbatical policy: We encourage people to take time off and rejuvenate, learn new skills, and pursue their interests so they can generate new ideas with Innovaccer.
  • Pet-friendly office and open floor plan: No boring cubicles.
Numerator

Posted by Ketaki Kambale
Remote, Pune
3 - 9 yrs
₹5L - ₹20L / yr
Data Warehouse (DWH)
Informatica
ETL
Python
SQL
+1 more

We’re hiring a talented Data Engineer and big data enthusiast to work on our platform and help ensure that our data quality is flawless. As a company, we take in millions of new data points every day. You will work with a passionate team of engineers to solve challenging problems and ensure that we can deliver the best data to our customers, on time. You will use the latest cloud data warehouse technology to build robust and reliable data pipelines.

Duties/Responsibilities Include:

  • Develop expertise in the different upstream data stores and systems across Numerator.
  • Design, develop, and maintain data integration pipelines for Numerator's growing data sets and product offerings.
  • Build testing and QA plans for data pipelines.
  • Build data validation testing frameworks to ensure high data quality and integrity.
  • Write and maintain documentation on data pipelines and schemas.
 

Requirements:

  • BS or MS in Computer Science or a related field of study
  • 3+ years of experience in the data warehouse space
  • Expert in SQL, including advanced analytical queries
  • Proficiency in Python (data structures, algorithms, object-oriented programming, using APIs)
  • Experience working with a cloud data warehouse (Redshift, Snowflake, Vertica)
  • Experience with a data pipeline scheduling framework (Airflow)
  • Experience with schema design and data modeling

Exceptional candidates will have:

  • Amazon Web Services (EC2, DMS, RDS) experience
  • Terraform and/or Ansible (or similar) for infrastructure deployment
  • Airflow: experience building and monitoring DAGs, developing custom operators, and using script templating solutions (a minimal DAG sketch follows this list)
  • Experience supporting production systems in an on-call environment
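
As referenced in the Airflow bullet above, a minimal DAG sketch (assuming Airflow 2.4+, and not Numerator's actual pipeline): two stub tasks wired extract >> load on a daily schedule. The dag_id and task bodies are invented.

```python
# Minimal Airflow DAG: two stub tasks on a daily schedule.
# Assumes Airflow 2.4+; dag_id and task bodies are invented.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull new rows from the upstream store")

def load():
    print("write cleaned rows to the warehouse")

with DAG(
    dag_id="daily_ingest_sketch",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```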
Pingahla

Posted by Ashwini Dhaipule
Pune
6 - 10 yrs
₹6L - ₹15L / yr
Software Testing (QA)
Shell Scripting
Data management
ETL QA
ETL
+3 more

ABOUT US:
Pingahla was founded by a group of people passionate about making the world a better place by harnessing the power of data. We are a data management firm with offices in New York and India. Our mission is to help transform the way companies operate and think about their business. We make it easier to adopt and stay ahead of the curve in the ever-changing digital landscape. One of our core beliefs is excellence in everything we do!

JOB DESCRIPTION:
Pingahla is recruiting an ETL & BI Test Manager who can build and lead a team and establish infrastructure, processes, and best practices for our Quality Assurance vertical. Candidates are expected to have at least 5 years of experience in ETL testing and in testing on data management projects. As a growing company, we can offer very good career opportunities and very attractive remuneration.

JOB ROLE & RESPONSIBILITIES:
• Plan and manage testing activities;
• Own defect management and weekly & monthly test report generation;
• Work as a Test Manager to design the test strategy and approach for the DW & BI (ETL & BI) solution;
• Provide leadership and direction to the team on quality standards and testing best practices;
• Ensure that project deliverables are produced, including but not limited to quality assurance plans, test plans, testing priorities, status reports, user documentation, and online help; manage and motivate teams to accomplish significant deliverables within tight deadlines;
• Own test data management; review and approve all test cases prior to execution;
• Coordinate and review offshore work efforts for projects and maintenance activities.

REQUIRED SKILLSET:
• Experience in quality assurance management, program management, and DW (ETL & BI) management
• Minimum 5 years in ETL testing, with at least 2 years in a team lead role
• Technical abilities complemented by sound communication skills, user interaction abilities, requirement gathering and analysis, and skills in data migration and conversion strategies
• Proficient in test definition, capable of developing test plans and test cases from technical specifications
• Able to single-handedly own complete delivery from the testing side
• Experience working with remote teams across multiple time zones
• Strong knowledge of QA processes and methodologies
• Strong Unix and Perl scripting skills
• Expertise in ETL testing and hands-on experience with ETL tools like Informatica or DataStage (a toy reconciliation check follows this list); PL/SQL is a plus
• Excellent problem-solving, analytical, and technical troubleshooting skills
• Familiar with data management projects
• Eager to learn, adopt, and apply rapidly changing new technologies and methodologies
• Efficient and effective at approaching and escalating quality issues when appropriate
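
The toy reconciliation check referenced in the ETL-testing bullet above: compare row counts and a simple column checksum between source and target after a load. This is an invented illustration; the cursors can come from any DB-API driver, and the table and column names are placeholders.

```python
# Toy ETL reconciliation check: compare row count and a column
# checksum between source and target. Works with any DB-API cursors;
# table and column names are invented.

def reconcile(src_cur, tgt_cur, table, amount_col):
    probe = f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
    src_cur.execute(probe)
    src_count, src_sum = src_cur.fetchone()
    tgt_cur.execute(probe)
    tgt_count, tgt_sum = tgt_cur.fetchone()
    assert src_count == tgt_count, f"{table}: row counts differ"
    assert src_sum == tgt_sum, f"{table}: {amount_col} checksums differ"
    print(f"{table}: {src_count} rows reconciled")

# Usage, with any two open DB-API cursors:
#   reconcile(source_db.cursor(), warehouse_db.cursor(), "orders", "amount")
```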

DataMetica

Posted by Sayali Kachi
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud and knowledge of different on-premise and cloud data implementations in the field of big data and analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience: 4-10 years

Location: Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development (a toy validation sketch follows this list)
  • Strong data warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience on enterprise data warehouse projects
  • Good to have: experience with Python and shell scripting
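
The toy validation sketch referenced in the list above: a post-migration row-count check on BigQuery. This is only an invented illustration of the kind of check Datametica's Pelican product (described further down) automates, not that product itself; all project and table names are placeholders.

```python
# Hypothetical post-migration check: does the migrated BigQuery table
# hold as many rows as the legacy source did? All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")

EXPECTED_ROWS = 1_000_000  # count captured from the legacy warehouse

query = "SELECT COUNT(*) AS n FROM `your-gcp-project.analytics.orders`"
row = list(client.query(query).result())[0]

if row.n == EXPECTED_ROWS:
    print("orders: migrated row count matches the legacy source")
else:
    print(f"orders: expected {EXPECTED_ROWS} rows, found {row.n}")
```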

Opportunities -

  • Selected candidates will be provided training on one or more of the following: Google Cloud, AWS, DevOps tools, and big data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • The chance to be part of enterprise-grade implementations of cloud and big data systems
  • An active role in setting up a modern data platform based on cloud and big data
  • Membership of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETLs like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years have been the key factors in our success.

 

 

Benefits we Provide!

Working with highly technical, passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com
