Data AWS Sagemaker
2 weeks ago
Technical Skills & Expertise
SQL: Expert-level proficiency (must have).
AWS: Redshift, S3, ECS, Lambda, Glue, SQS, SNS, CloudWatch, Step Functions, CDK, Athena (must have).
PySpark: Expert (must have).
Python: Strong experience with API integrations, data handling, and automation (must have).
LLM Integration: Experience integrating LLMs via APIs (e.g., OpenAI, Claude, Bedrock) into data workflows or analytics pipelines.
LLMOps: Understanding of prompt design, prompt tuning, RAG patterns, and model evaluation in production.
Data Modeling & Query Tuning: Hands-on experience in designing optimized schemas and writing performant queries.
Big Data Ecosystem: Solid understanding of Hadoop, Hive, MapReduce.
Orchestration Tools: Airflow (open source), MWAA on AWS (intermediate; nice to have).
Data Migration: Experience with AWS Data Migration Service (DMS).
Analytical Skills: Strong in Exploratory Data Analysis (EDA).
ETL Design Patterns: Proficiency with window functions, reusable ETL frameworks, and scalable automation.
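As an illustration of the window-function ETL patterns listed above, here is a minimal sketch of a common dedup step: keeping only the latest record per key with `ROW_NUMBER()`. The table and column names are hypothetical, and SQLite stands in for Redshift/Athena purely so the query is self-contained (the SQL itself is standard window-function syntax).

```python
import sqlite3

# Toy dataset standing in for a staging table (hypothetical schema).
# Requires SQLite 3.25+ for window-function support.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INT, order_ts TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, '2024-01-01', 10.0),
  (1, '2024-02-01', 20.0),
  (2, '2024-01-15', 5.0);
""")

# Keep only the most recent order per customer: a classic
# ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...) dedup pattern.
rows = conn.execute("""
SELECT customer_id, order_ts, amount
FROM (
  SELECT *,
         ROW_NUMBER() OVER (
           PARTITION BY customer_id
           ORDER BY order_ts DESC
         ) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY customer_id
""").fetchall()
print(rows)  # one latest row per customer
```

The same pattern translates directly to Redshift, Athena, or a PySpark `Window` spec.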
Preferred Knowledge
Exposure to Data Lake vs. Data Warehouse architecture.
Experience in real-time data ingestion and streaming frameworks.
Hands-on with data quality, compliance, and governance frameworks.
Familiarity with security best practices in AWS environments.
Experience with data enrichment or summarization using LLMs.
Familiarity with RAG pipelines, vector databases (e.g., OpenSearch, Pinecone, FAISS), and metadata extraction using LLMs.
Key Responsibilities
Design & Build Intelligent Data Solutions: Develop and maintain scalable, LLM-enabled data pipelines using AWS services like Glue, Redshift, Lambda, Step Functions, S3, Athena, and ECS.
Integrate LLM APIs or fine-tuned models (e.g., OpenAI, HuggingFace, Amazon Bedrock, SageMaker JumpStart) into existing AWS-based data workflows.
Enable RAG (Retrieval-Augmented Generation) pipelines using AWS services and LLMs for advanced analytics and knowledge management.
ETL Development: Build efficient, reusable ETL frameworks that can incorporate LLMs for data enrichment, summarization, classification, or metadata generation.
Use orchestration tools such as Airflow/MWAA or Step Functions to manage both data and LLM workflows.
LLM & AI Services in AWS: Work with Amazon Bedrock, SageMaker, or custom containers to deploy, monitor, and scale LLM-based solutions.
Optimize LLM usage for performance and cost within the AWS ecosystem (e.g., caching responses, throttling, model selection).
Data Security & Integrity: Implement security best practices and compliance standards for handling sensitive data within LLM pipelines.
Ensure proper prompt auditing, logging, and governance for LLM interactions.
Monitoring & Troubleshooting: Continuously monitor pipelines (including LLM calls), troubleshoot failures, and optimize performance and cost.
Documentation: Maintain detailed documentation of data architectures, frameworks, and engineering processes.
Architecture & Reviews: Participate in solution architecture, code reviews, and sign-offs to ensure quality and scalability.
Project & Stakeholder Management: Apply Agile/Scrum methodology for project delivery, manage risks, and effectively communicate with both technical and non-technical stakeholders.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
-
AWS Data Engineer
2 days ago
Chennai, Tamil Nadu, India · Tata Consultancy Services · Full time · ₹ 15,00,000 - ₹ 25,00,000 per year
Experience: 5-10 Yrs
Location: Bangalore, Chennai, Hyderabad, Pune, Kochi, Bhubaneswar, Kolkata
Key Skills: AWS Lambda, Python, Boto3, Pyspark, Glue
Must-have Skills: Strong experience in Python to package, deploy, and monitor data science apps; knowledge of Python-based automation; knowledge of Boto3 and related Python packages; working experience in AWS and AWS Lambda; good...
-
AWS Engineer
2 weeks ago
Chennai, Tamil Nadu, India · Gainwell Technologies · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
Job Description: AWS Engineer (C# / Java, Python, GenAI, Cloud Data)
Date: Oct 9, 2025
Location: Bangalore, KA, IN, 560100; Chennai, TN, IN, 600032
Req ID: 32709
Work Mode: Remote India
Summary: As an AWS Engineer (C# / Java, Python, GenAI, Cloud Data) at Gainwell, you can contribute your skills as we harness the power of technology to help our clients...
-
AWS Engineer
1 week ago
Chennai, Tamil Nadu, India · Gainwell Technologies · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
Gainwell Technologies LLC
Summary: As an AWS Engineer (Dot Net/Java/Python, Gen AI) at Gainwell, you are responsible for the end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI pipelines and engineering practices. The role demands deep cloud infrastructure skills (ECS, Lambda, RDS, S3) and automation (Terraform)...
-
AWS Engineer
7 days ago
Chennai, Tamil Nadu, India · Gainwell Technologies · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
Job Description: AWS Engineer (Dot Net/Java/Python, Gen AI)
Date: Sep 19, 2025
Location: Chennai, TN, IN, 600032
Req ID: 32884
Work Mode: Remote India
Summary: As an AWS Engineer (Dot Net/Java/Python, Gen AI) at Gainwell, you will be responsible for end-to-end deployment, configuration, and reliability of AWS-based product demo environments, integrating GenAI...
-
AWS Data
7 days ago
Chennai, Tamil Nadu, India · Virtusa · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and SQS/SNS/CloudWatch/Step Functions/CDK. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, Pyspark, SQL, Databricks, and various AWS...
-
Data Scientist
2 days ago
Chennai, Tamil Nadu, India · 3across · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
Job Title: Data Scientist
Experience: 5+ Years
Location: Remote/ Indore/ Mumbai/ Chennai/ Gurugram
Industry: Must be from a BPO/KPO, Healthcare Org., or Shared Services background
Key Responsibilities: AI/ML Development & Research: Design, develop, and deploy advanced machine learning and deep learning models to solve complex business problems. Implement and optimize Large...
-
AWS Data engineer
7 days ago
Chennai, Tamil Nadu, India · Virtusa · Full time · ₹ 5,00,000 - ₹ 25,00,000 per year
P2-C3-STS
AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and...
-
AWS Data engineer
2 days ago
Chennai, Tamil Nadu, India · Virtusa · Full time · ₹ 15,00,000 - ₹ 25,00,000 per year
AWS Data Engineer
Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3. Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes. Create and manage applications using Python, SQL, Databricks, and various AWS technologies. Automate repetitive tasks and build...
-
AWS Data Architect
2 days ago
Chennai, Tamil Nadu, India · Virtusa · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
AWS Data Architect Role: T0
The ideal professional for this AWS Architect role will:
● Have a passion for design, technology, analysis, collaboration, agility, and planning, along with a drive for continuous improvement and innovation.
● Exhibit expertise in managing high-volume data projects that leverage Cloud Platforms, Data Warehouse reporting, and BI...