AWS Data Engineer
Hi,
Hope you are doing well. Please find the job description below. If this role is of interest to you, kindly share your resume along with your contact details, current CTC, and expected CTC, and send it to -
Role: AWS Data Engineer with Snowflake, Spark, and Medallion Architecture
Location: Banashankari 2nd Stage, Bengaluru (all 5 days onsite)
Duration: Full-time employment (FTE) with NAM Info
About NAM
NAM Info is an IT application and implementation services company with its US headquarters in Cranbury, New Jersey, and its development centers headquartered in Bangalore, India. NAM's distinctive service-line offerings include Professional Services and Managed Services, covering re-engineering and modernization work that largely involves emerging technology.
NAM is also home to a next-generation Data Intelligence Platform that enables enterprises to automate and accelerate their journey from data to insights. Our platform simplifies and unifies data engineering, governance, and analytics, empowering organizations to achieve end-to-end data intelligence at scale. As we expand our product capabilities, we are seeking an AWS Data Engineer to help ensure that the Inferyx platform delivers world-class performance, accuracy, and reliability.
About the Role
As a Data Engineer, you will be responsible for building and maintaining data pipelines and systems to collect, store, and process data for analysis. Key responsibilities include designing infrastructure, building analytical tools, ensuring data quality and security, and collaborating with data scientists and analysts to make data ready for use.
Job Description:
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using AWS services and Snowflake (a minimal illustrative sketch follows this list).
- Build scalable and efficient data architectures to support analytics, reporting, and data science use cases.
- Perform data modeling (conceptual, logical, and physical) to optimize Snowflake warehouse performance.
- Implement best practices for data ingestion, transformation, and storage.
- Collaborate with business analysts, data scientists, and stakeholders to understand data requirements.
- Monitor, optimize, and troubleshoot Snowflake performance, query tuning, and cost optimization.
- Ensure data quality, governance, and security in line with enterprise standards.
- Automate workflows using orchestration tools (e.g., Airflow, AWS Step Functions, or Glue workflows).
- Stay current with emerging cloud data technologies and recommend improvements.
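For illustration only, below is a minimal sketch of the kind of bronze-to-silver (Medallion-style) transformation these responsibilities describe, written in PySpark. The bucket paths, column names, and schema are hypothetical assumptions, not details taken from this posting.

```python
# Minimal, hypothetical bronze -> silver Medallion-style transform in PySpark.
# All paths and columns below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_orders").getOrCreate()

# Read raw (bronze) order events landed in S3 as JSON.
bronze = spark.read.json("s3://example-datalake/bronze/orders/")

# Cleanse and conform into the silver layer: drop duplicates, cast types,
# and keep only records with a valid order identifier.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
)

# Write the conformed data back to S3 as Parquet, partitioned by order date,
# ready to be loaded into Snowflake (for example via COPY INTO from an
# external stage).
(
    silver
    .withColumn("order_date", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-datalake/silver/orders/")
)
```

The silver output could then be loaded into Snowflake from an external stage, as sketched after the required-skills list below.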
Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3–5 years of relevant experience in data engineering with a focus on cloud platforms.
- Strong hands-on expertise in Snowflake (schema design, data sharing, clustering, query optimization); a brief loading sketch follows this list.
- Proficiency in AWS services such as S3, Glue, Redshift, Lambda, Kinesis, Step Functions, and IAM.
- Solid understanding of data modeling techniques (3NF, Star Schema, Snowflake Schema, Data Vault).
- Strong experience with SQL and at least one programming language (Python/Scala/Java).
- Experience with ETL/ELT tools (dbt, Informatica, Talend, or equivalent).
- Knowledge of data security, governance, and compliance best practices.
- Familiarity with CI/CD, version control (Git), and DevOps practices for data engineering.
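As a companion to the transform sketch above, here is a hedged example of how silver-layer Parquet output might be loaded into Snowflake using the snowflake-connector-python package and a COPY INTO statement. The account, credentials, stage, and table names are placeholders, not details from this role.

```python
import snowflake.connector

# Connection parameters are hypothetical placeholders; in practice they would
# come from a secrets manager or environment configuration.
conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="SILVER",
)

try:
    cur = conn.cursor()
    # Load the Parquet files written by the Spark job from a pre-created
    # external S3 stage into a Snowflake table (ELT-style ingestion).
    cur.execute(
        """
        COPY INTO orders_silver
        FROM @silver_orders_stage
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """
    )
    cur.close()
finally:
    conn.close()
```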
Preferred Qualifications:
- Overall experience: 7–8 years, with the most recent 5 years as a Senior Data Engineer.
- Strong hands-on UNIX/shell scripting experience: not just basic commands, but comfort working directly on servers, SSHing into clusters, handling file operations, and executing CLI-based workflows.
- Deep expertise in AWS data services (S3, Glue, Lambda, EC2) integrated with Snowflake or Databricks for data lake development.
- Solid ETL/ELT design background with Python, PySpark, and SQL for transformation and pipeline orchestration.
- Familiarity with version control (GitHub) and automation tools like Jenkins or AWS-native CI/CD pipelines.
- Preferred: exposure to Kafka, Airflow, and data governance practices (a minimal Airflow orchestration sketch follows below).
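To illustrate the orchestration skills listed above, here is a minimal, hypothetical Airflow 2.x DAG that sequences the two steps sketched earlier. The DAG id, schedule, and callables are illustrative assumptions only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_bronze_to_silver():
    # Placeholder: in practice this might start an AWS Glue job or submit a
    # Spark step rather than run the transform in-process.
    print("running bronze -> silver transform")


def load_silver_to_snowflake():
    # Placeholder for the Snowflake COPY INTO / dbt run that loads the
    # silver layer into the warehouse.
    print("loading silver layer into Snowflake")


with DAG(
    dag_id="orders_medallion_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ schedule argument
    catchup=False,
) as dag:
    transform = PythonOperator(
        task_id="bronze_to_silver",
        python_callable=run_bronze_to_silver,
    )
    load = PythonOperator(
        task_id="silver_to_snowflake",
        python_callable=load_silver_to_snowflake,
    )

    transform >> load
```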
What We Offer
- Opportunity to work on a cutting-edge Data Intelligence Platform that is redefining enterprise analytics.
- Dynamic and collaborative work culture at Inferyx's Pune office.
- Competitive salary and performance-based growth opportunities.
- Exposure to the complete data ecosystem—from ingestion to analytics to AI-driven insights.
- A chance to directly influence the evolution of a high-impact enterprise data product.
Why NAM
At NAM, you'll be part of a global, innovation-driven organization that operates at the intersection of Data, Cloud, and Automation.
What makes NAM different:
- Enterprise Data Expertise – Work on AWS-based Data Lake and Lakehouse projects that use the latest frameworks like Glue, Spark, and Iceberg.
- Advanced Cloud DevOps – Manage complex, high-volume production data environments and drive continuous improvement in reliability and automation.
- Cutting-Edge Innovation – Be part of NAM's Inferyx platform development, helping clients accelerate their data-to-insights transformation.
- Collaborative, Global Culture – Partner with cross-functional teams and clients across geographies in a dynamic, learning-focused environment.
- Career Growth – Gain hands-on experience in DataOps, CloudOps, and DevSecOps, with opportunities for AWS and Kubernetes certifications and leadership development.
Manjunath
Staffing Manager
NAM Info Pvt Ltd,
29/2B-01, 1st Floor, K.R. Road,
Banashankari 2nd Stage, Bangalore
LinkedIn: M.S. Manjunath
Website: WWW.NAM-IT.COM
USA | CANADA | INDIA
MBE Certified Company, E-Verify Company