Data Engineer

3 weeks ago


Pune, India · HSBC · Full time

Job Description

Some careers shine brighter than others.

If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

- The GBI Transformation is a large and complex data integration programme spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements, from ExCo down, drawing on over 80 data sources in multiple time zones across Middle Office, Post-Trade, and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics that allow senior management to measure the return on their BAU and CTB investment dollars.
- We are looking for a GCP developer who can design, develop, test, and deploy ingest pipelines connected to a variety of on-prem and cloud data sources, both data stores and files. We will mainly use GCP technologies such as Cloud Storage, BigQuery, and Data Fusion.
- You will also work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.

In this role, you will:

- Onboard new data sources: negotiate, agree, define, and document effective IT data contracts with source data providers (security, formats, schemas, extraction and load schedules, SLAs, data validation rules, error scenarios, retry mechanisms, etc.)
- Design, build, test, and deploy performant and effective cloud data ingest pipelines (GCP Data Fusion, Spark, etc.) via API/SFTP/etc. into the GCP warehouse.
- Develop, test and deploy GCP Data Fusion custom plugins.
- Build automated tests to validate ETL pipelines.
- Handle incremental and full data loading strategies for structured and semi-structured data with medium-to-high volume, velocity, variety.
- Perform data enrichment, standardisation, cleansing, and aggregation in Data Fusion, ensuring data integrity, consistency, and compliance with business and wider organisational standards, data governance, and data sovereignty requirements.
- Develop procedures and scripts for data migration, back-population, and feed-to-warehouse initialization.
- Carry out required DataOps activities, ensuring pipeline health, performance, and on-time data delivery to consumers.
- Protect solution data with masking and lineage capabilities as needed.
- Review, refine, interpret, and implement business and technical requirements.
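To make the data-contract and validation responsibilities above concrete, here is a minimal, hypothetical sketch of a per-source data contract and the kind of row-level validation an ingest pipeline might apply before loading. All field names, rules, and the contract layout are invented for illustration; a real contract would be agreed with the source provider and enforced inside the pipeline tooling.

```python
from datetime import datetime

# Hypothetical data contract for one source feed: expected schema plus
# operational terms (SLA, retries). Illustrative only.
TRADE_FEED_CONTRACT = {
    "source": "example_middle_office_feed",
    "format": "csv",
    "schema": {
        "trade_id": str,
        "notional": float,
        "trade_date": "YYYY-MM-DD",   # date fields validated by format
    },
    "sla_hours": 4,      # data must arrive within 4h of cut-off
    "retry_limit": 3,    # retries before the load is flagged as failed
}

def validate_row(row: dict, contract: dict) -> list[str]:
    """Return a list of contract violations for a single record."""
    errors = []
    for field, expected in contract["schema"].items():
        if field not in row:
            errors.append(f"missing field: {field}")
        elif expected == "YYYY-MM-DD":
            try:
                datetime.strptime(row[field], "%Y-%m-%d")
            except ValueError:
                errors.append(f"bad date in {field}: {row[field]!r}")
        elif not isinstance(row[field], expected):
            errors.append(f"wrong type for {field}: {type(row[field]).__name__}")
    return errors
```

A clean record yields an empty error list; rows with violations are routed to an error scenario (quarantine, retry, or rejection) as the contract specifies.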

Requirements

To be successful in this role, you should meet the following requirements:

- Tech stack:
  - GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager
  - Git, Ansible Tower / Ansible scripts, Jenkins
  - Java, Python, Terraform, Cloud Composer/Airflow

Must Have

- Proven (3+ years) hands-on experience in designing, testing, and implementing data ingestion pipelines in GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV-, JSON-, and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc.

- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts.

- Good Java knowledge and experience in Java development.

- Proficiency with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, ideally for cloud-based data solutions.

- Experience working in an Agile environment and toolset.

- Strong problem-solving and analytical skills

- Enthusiastic willingness to learn and to develop technical and soft skills rapidly and independently as needs require.

- Strong organisational and multi-tasking skills.

- Good team player who embraces teamwork and mutual support.

Nice to Have

- Hands on experience in Cloud Composer/Airflow, Cloud Run, Pub/Sub
- Hands on development in Python, Terraform
- Java (2+ years) experience in development, testing and deployment (ideally custom plugins for Data Fusion)
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and on concurrency/data integrity (ideally in the BigQuery dialect)
- Data transformation/ETL/ELT pipeline development, testing, and implementation, ideally in BigQuery
- Experience working in a DataOps model.
- Experience in Data Vault modelling and usage.
- Proficiency in Git usage for version control and collaboration.
- Proficiency in designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools such as Ansible/Jenkins for cloud-based applications (ideally on GCP)
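As an illustration of the incremental-loading and data-integrity points above, here is a minimal, hypothetical Python sketch of a watermark-based incremental load with upsert semantics, mimicking on a small scale what a BigQuery MERGE or a Data Fusion pipeline would do. The key and column names are invented for illustration.

```python
def incremental_load(target: dict, source_rows: list[dict], watermark: str) -> str:
    """Upsert source rows newer than the watermark into the target table
    (modelled as a dict keyed by 'id') and return the new high watermark.
    Replaying the same batch is idempotent: upserts overwrite by key."""
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:        # only rows past the watermark
            target[row["id"]] = row              # insert or overwrite (upsert)
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark
```

A full load is the degenerate case with an empty watermark, while subsequent runs pass the last returned watermark so only changed rows are processed; keying the upsert on a unique identifier is what preserves data integrity under reruns and concurrent retries.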

You'll achieve more when you join HSBC.
www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by - HSBC Software Development India

