GCP ETL Data Stage
3 days ago
Job description
Some careers shine brighter than others.
If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.
In this role, you will:
- Coordinate with stakeholders to ensure timely deliverables.
- Provide solution architecture support to projects where required, ensuring that the defined solution meets business needs and is aligned to the functional and target architecture, with any deviations approved.
- Analyze and propose plans to decommission (demise) legacy systems.
- Lead a team of data engineers and assume responsibilities as Technical Lead for the assigned projects.
- Ensure full ownership and efficient management of the GDT IT services and products.
- Ensure that any new technology products are taken through the technology design governance process.
- Mentor and coach less experienced members of staff and promote an understanding of the value of architecture and of the use of technologies and standards in their domain across IT.
- Monitor team progress periodically.
- Deliver optimal solutions that meet client requirements.
- Provide inputs for estimation and monitoring, and coordinate team-related activities.
- Participate in designing, developing, unit testing and performance testing the application.
Requirements
To be successful in this role, you should meet the following requirements:
- Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage.
- Hands-on experience with DataStage client tools such as DataStage Designer, DataStage Director and DataStage Administrator.
- Strong understanding of the principles of data warehousing using fact tables, dimension tables and star/snowflake schema modeling (a minimal illustrative sketch follows this list).
- Extensive experience with dimensional modeling, data migration, data cleansing and ETL processes for data warehouses.
- Experience developing parallel jobs using processing stages such as Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter.
- Experience using Enterprise Edition/parallel stages such as Data Set, Change Data Capture and Row Generator in ETL development.
- Familiarity with highly scalable parallel processing infrastructure using parallel jobs and multi-node configuration files.
- Experience scheduling sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools such as Control-M.
- Experience troubleshooting jobs and addressing production issues such as data issues, performance tuning and enhancements.
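For context on the star/snowflake schema requirement above, here is a minimal, purely illustrative sketch (not part of the posting) of the pattern it refers to: dimension rows carry surrogate keys and fact rows reference them. It uses only Python and sqlite3 so it runs standalone; the table and column names (dim_customer, fact_sales, etc.) are hypothetical, and a real implementation would live in DataStage parallel jobs or a warehouse platform rather than in application code.

```python
# Toy star-schema load: one dimension, one fact, in-memory SQLite.
# Names are hypothetical and chosen only for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, identified by a surrogate key.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_sk   INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id   TEXT UNIQUE,
        customer_name TEXT
    )
""")

# Fact table: measures keyed by the dimension's surrogate key.
cur.execute("""
    CREATE TABLE fact_sales (
        customer_sk INTEGER REFERENCES dim_customer(customer_sk),
        sale_date   TEXT,
        amount      REAL
    )
""")

# Extract: rows as they might arrive from a staging area.
source_rows = [
    {"customer_id": "C001", "customer_name": "Acme Ltd", "sale_date": "2025-01-05", "amount": 1200.0},
    {"customer_id": "C002", "customer_name": "Globex",   "sale_date": "2025-01-06", "amount": 830.5},
    {"customer_id": "C001", "customer_name": "Acme Ltd", "sale_date": "2025-01-07", "amount": 410.0},
]

for row in source_rows:
    # Transform/lookup: upsert the dimension row, then fetch its surrogate key.
    cur.execute(
        "INSERT OR IGNORE INTO dim_customer (customer_id, customer_name) VALUES (?, ?)",
        (row["customer_id"], row["customer_name"]),
    )
    cur.execute(
        "SELECT customer_sk FROM dim_customer WHERE customer_id = ?",
        (row["customer_id"],),
    )
    (customer_sk,) = cur.fetchone()

    # Load: write the fact row keyed by the surrogate key.
    cur.execute(
        "INSERT INTO fact_sales (customer_sk, sale_date, amount) VALUES (?, ?, ?)",
        (customer_sk, row["sale_date"], row["amount"]),
    )

conn.commit()
print(cur.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0])  # -> 3
```

In DataStage terms, the dimension upsert and key fetch correspond roughly to a Lookup (or Change Capture/Change Apply) step before the fact load.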
You'll achieve more when you join HSBC.
HSBC is committed to building a culture where all employees are valued and respected and where their opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSDI
-
Data Stage Expert
3 days ago
Hyderabad, Telangana, India Anblicks Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Data Engineer – Data Stage Expert (Offshore - Chennai)
We are seeking a skilled and experienced Data Engineer with strong expertise in IBM DataStage and/or Talend to join our team in Hyderabad. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes to support data warehousing and analytics initiatives. You'll work...
-
Onix is Hiring GCP ETL Data Engineers
3 days ago
Hyderabad, Telangana, India Datametica Full time ₹ 8,00,000 - ₹ 24,00,000 per year
About the Role
We are looking for a highly skilled GCP Data Engineer to design, build, and optimize scalable data pipelines and solutions on Google Cloud. The ideal candidate will have strong expertise in ETL/ELT development, data modeling, and cloud-native services while collaborating with data analysts, scientists, and business teams.
Key...
-
GCP Data Engineer
1 week ago
Hyderabad, Telangana, India Zennial Pro Private Limited Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Job Title: GCP Data Engineer
Location: Hyderabad / Pune
Experience: 4+ Years
Key Responsibilities
Design, build, and optimize scalable data pipelines on Google Cloud Platform (GCP). Work on BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Functions for data ingestion, transformation, and analytics. Develop ETL/ELT processes ensuring data quality,...
-
GCP Data Engineer
2 weeks ago
Hyderabad, Telangana, India Egen (Formerly SpringML) Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working...
-
Lead GCP Data Engineer
2 weeks ago
Hyderabad, Telangana, India Incedo Inc. Full time ₹ 20,00,000 - ₹ 25,00,000 per year
Role: Lead GCP Data Engineer
Location: Hyderabad/Chennai
Experience: 6+ Years
Job details:
Key Responsibilities
Lead the end-to-end design and implementation of data pipelines, ETL/ELT processes, and data lake/warehouse solutions on GCP. Architect scalable data platforms leveraging BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Composer, and Cloud...
-
GCP Data Architect
1 week ago
Hyderabad, Telangana, India Tata Consultancy Services Full time
Job Title: GCP Data Architect
Experience: 7 to 12 years
Location: Pan India
Virtual Drive: 10am to 4pm
Date: 11th Oct 2025
Greetings from TCS!!!
Job Description:
Design and Implement Data Architectures: Architect and build scalable, end-to-end data solutions on GCP, encompassing data ingestion, transformation, storage, and consumption.
Develop Data Pipelines:...
-
PySpark & ETL Data Engineer
3 days ago
Hyderabad, Telangana, India CirrusLabs Full time
We are CirrusLabs. Our vision is to become the world's most sought-after niche digital transformation company that helps customers realize value through innovation. Our mission is to co-create success with our customers, partners and community. Our goal is to enable employees to dream, grow and make things happen. We are committed to excellence. We are a...
-
GCP Data Engineer
5 days ago
Hyderabad, Telangana, India Zetamicron Full time ₹ 8,00,000 - ₹ 15,00,000 per year
Job Title: GCP Data Engineer (4+ Years Experience)
Location: Hyderabad / Pune
Open Positions: 3
Compensation: Up to ₹12 LPA (based on experience & skills)
Must Read Before Applying
We are looking for experienced professionals with a minimum of 4+ years in GCP Data Engineering. Only candidates who meet the experience requirement and possess the mandatory skills listed...
-
GCP Data Engineer
1 week ago
Hyderabad, Telangana, India Egen Full time
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various...
-
GCP Data Engineer
3 days ago
Hyderabad, Telangana, India Qode Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Data Engineer
Location: Chennai, India or Hyderabad, India
Workplace Type: Hybrid
Required Skills & Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field. 5-7 years of experience in data engineering. Strong proficiency in Python programming. Extensive experience with GCP services, including Dataflow, Dataproc, BigQuery, Cloud...