Data Engineer
3 weeks ago
Job Description
Join us as a Data Engineer - PySpark, SQL at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand business requirements and deliver quality solutions.
To be successful as a Data Engineer - PySpark, SQL, you should have experience with:
- Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs and Spark SQL.
- Hands-on experience with PySpark performance optimisation techniques (a brief illustrative sketch follows this list).
- Hands-on experience in developing, testing and maintaining applications on AWS Cloud.
- Strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena).
- Ability to design and implement scalable and efficient data transformation/storage solutions using open table formats such as Delta Lake, Iceberg and Hudi.
- Experience using dbt (Data Build Tool) with Snowflake/Athena/Glue for ELT pipeline development.
- Experience writing advanced SQL and PL/SQL programs.
- Hands-on experience building reusable components using Snowflake and AWS tools and technologies.
- Experience of at least two major project implementations.
- Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
- Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage (see the orchestration sketch after this list).
- Knowledge of the Ab Initio ETL tool is a plus.
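As a minimal sketch of the PySpark skills listed above, the snippet below joins a large fact dataset to a small dimension dataset using a broadcast join (a common performance optimisation that avoids shuffling the large side), expresses an aggregation through Spark SQL, and writes partitioned output. The bucket paths, column names and table layout are hypothetical and are not taken from this posting.

```python
# Illustrative PySpark sketch; all paths and column names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("txn-enrichment-example").getOrCreate()

# Read a large fact dataset and a small dimension dataset (placeholder S3 paths).
transactions = spark.read.parquet("s3://example-bucket/raw/transactions/")
branches = spark.read.parquet("s3://example-bucket/raw/branches/")

# Performance optimisation: broadcast the small dimension table so the join
# does not shuffle the large fact table across the cluster.
enriched = (
    transactions
    .join(broadcast(branches), on="branch_id", how="left")
    .withColumn("txn_date", F.to_date("txn_timestamp"))
)

# The same aggregation expressed through Spark SQL on a temporary view.
enriched.createOrReplaceTempView("enriched_txns")
daily_totals = spark.sql("""
    SELECT txn_date, branch_name, SUM(amount) AS total_amount
    FROM enriched_txns
    GROUP BY txn_date, branch_name
""")

# Write partitioned output; switching the writer to format("delta") (or an
# Iceberg/Hudi writer) would target one of the open table formats named above,
# assuming the relevant libraries are available on the cluster.
daily_totals.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)
```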
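For the orchestration skill, here is a minimal Apache Airflow sketch: a DAG with three placeholder tasks wired into a linear extract-transform-load dependency. The DAG id, schedule and task bodies are illustrative assumptions (real tasks might trigger Glue jobs, Athena queries or dbt runs), and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Illustrative Airflow DAG sketch; DAG id, schedule and task logic are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: in a real pipeline this might trigger an AWS Glue job or land files in S3.
    print("extracting source data")


def transform(**context):
    # Placeholder: e.g. run a PySpark or dbt transformation here.
    print("transforming data")


def load(**context):
    # Placeholder: e.g. publish curated tables for Athena or Snowflake consumers.
    print("loading curated data")


with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Simple linear dependency: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```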
Some other highly valued skills include:
- Ability to engage with stakeholders, elicit requirements/user stories and translate them into ETL components.
- Ability to understand the infrastructure setup and provide solutions, either individually or working with teams.
- Good knowledge of data marts and data warehousing concepts.
- Good analytical and interpersonal skills.
- Experience implementing a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy.
You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills.
This role is based out of Pune.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes to ensure that all data is accurate, accessible, and secure.
Accountabilities
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.
Analyst Expectations
- To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Requires in-depth technical knowledge and experience in their assigned area of expertise.
- Thorough understanding of the underlying principles and concepts within the area of expertise.
- They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L, Listen and be authentic; E, Energise and inspire; A, Align across the enterprise; D, Develop others.
- Alternatively, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies / procedure appropriately.
- Take responsibility for embedding new policies/ procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex / sensitive information.
- Act as contact point for stakeholders outside of the immediate function, while building a network of contacts outside team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.