AWS Data Lake Engineer
3 days ago
Are you ready to write your next chapter?
Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day.
What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day and finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one.
We're looking for a Senior AWS Data Lake Engineer to join our Big Data Team to help us unleash the potential of every business.
Are you ready to make your mark? Then you sound like a Worldpayer.
About the team:
We are seeking a talented and experienced Senior AWS Data Lake Engineer to join our dynamic team and design, develop, and maintain scalable data pipelines and AWS Data Lake solutions. The ideal candidate will have extensive experience in handling sensitive data, including Personally Identifiable Information (PII) and Payment Card Industry (PCI) data, using advanced tokenization and masking techniques.
What you will be doing
- Design, develop, and maintain scalable data pipelines using Python and AWS services.
- Implement and manage AWS Data Lake solutions, including ingestion, storage, and cataloging of structured and unstructured data.
- Apply data tokenization and masking techniques to protect sensitive information in compliance with data privacy regulations (e.g., GDPR, HIPAA).
- Collaborate with data engineers, architects, and security teams to ensure secure and efficient data flows.
- Optimize data workflows for performance, scalability, and cost-efficiency.
- Monitor and troubleshoot data pipeline issues and implement robust logging and alerting mechanisms.
- Document technical designs, processes, and best practices.
- Provide support on Databricks and Snowflake.
- Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps.
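To give a flavour of the masking work described in the responsibilities above, here is a minimal Python sketch using Pandas (one of the libraries named in this posting). The column names and masking rules are hypothetical illustrations, not Worldpay's actual schema:

```python
# Illustrative masking of PII columns in a pipeline step.
# Assumes pandas; the column names below are hypothetical.
import pandas as pd

def mask_pii(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the frame with card numbers and emails masked."""
    out = df.copy()
    # Keep only the last 4 digits of a card number (PCI-style masking).
    out["card_number"] = "**** **** **** " + out["card_number"].str[-4:]
    # Redact the local part of an email address.
    out["email"] = out["email"].str.replace(r"^[^@]+", "***", regex=True)
    return out

df = pd.DataFrame({
    "card_number": ["4111111111111111"],
    "email": ["jane.doe@example.com"],
})
masked = mask_pii(df)
print(masked["card_number"].iloc[0])  # **** **** **** 1111
print(masked["email"].iloc[0])        # ***@example.com
```

In a real pipeline a step like this would typically run inside a Glue or PySpark job before data lands in the analytics zone of the lake.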
What you bring:
- 5+ years of experience working as a Python developer/architect.
- Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, PySpark).
- Proven experience with AWS services such as S3, Glue, Lake Formation, Lambda, Athena, and IAM.
- Solid understanding of data lake architecture and best practices.
- Experience with data tokenization, encryption, and anonymization techniques.
- Familiarity with data governance, compliance, and security standards.
- Experience with Snowflake and/or Databricks (Nice to have).
- Experience with CI/CD tools and version control (e.g., Git, CodePipeline).
- Strong problem-solving skills and attention to detail.
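As a companion to the masking sketch, the tokenization experience listed above can be illustrated with deterministic, keyed tokenization (pseudonymization). This is a sketch only: in practice the key would come from a secrets manager or KMS, not be hard-coded, and the token format is a made-up example:

```python
# Sketch of deterministic tokenization for PII using a keyed HMAC.
# The hard-coded key is for illustration only; in production it would
# be fetched from a secrets manager (e.g. AWS Secrets Manager / KMS).
import hashlib
import hmac

SECRET_KEY = b"demo-key-do-not-use-in-production"

def tokenize(value: str) -> str:
    """Map a PII value to a stable, non-reversible token.

    The same input always yields the same token, so joins and
    aggregations still work on tokenized columns.
    """
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:16]

# Deterministic: equal inputs produce equal tokens.
assert tokenize("jane.doe@example.com") == tokenize("jane.doe@example.com")
# Distinct inputs produce distinct tokens.
assert tokenize("jane.doe@example.com") != tokenize("john.doe@example.com")
```

Keyed HMAC tokenization preserves joinability while remaining irreversible without the key, which is what distinguishes it from the display-style masking shown earlier.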
Where you'll own it
You'll own it in our modern Bangalore/Pune/Indore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe.
Worldpay perks - what we'll bring for you
We know it's bigger than just your career. It's your life, and your world. That's why we offer global benefits and programs to support you at every stage. Here's a taste of what you can expect.
- A competitive salary and benefits.
- Time to support charities and give back to your community.
- Parental leave policy.
- Global recognition platform.
- Virgin Pulse access.
- Global employee assistance program.
What makes a Worldpayer
At Worldpay, we take our Values seriously, and we live them every day. Think like a customer, Act like an owner, and Win as a team.
- Curious. Humble. Creative. We ask the right questions, listening and learning to get better every day. We simplify the complex and we're always looking to create a bigger impact for our colleagues and customers.
- Empowered. Accountable. Dynamic. We stay agile, using our initiative, taking calculated risks to progress. Never standing still, never settling, we work at pace to achieve our goals. We champion our ideas and stay flexible to make them happen. We know that every action adds up.
- Determined. Inclusive. Open. Unlocking potential means working as one global community. Our work spans borders, and we stay united by our purpose. We collaborate, always encouraging others to perform at their best, welcoming new perspectives.
Does this sound like you? Then you sound like a Worldpayer.
Apply now to write the next chapter in your career. We can't wait to hear from you.
To find out more about working with us, find us on LinkedIn.
Privacy Statement
Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.
Sourcing Model
Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.