Data Engineer – Python | dbt | Redshift | AWS

1 week ago


New Delhi, India Catalyst Info Labs Full time

About the Company
Catalyst Info Labs Pvt. Ltd. is a growing software solutions company focused on developing scalable, data-driven platforms that empower businesses with actionable insights. We are looking for a skilled Data Engineer to join our team and contribute to the design, development, and optimization of high-performance data pipelines and infrastructure.

Key Responsibilities
  • Design, develop, and maintain data pipelines and ETL/ELT processes to support analytics and business operations.
  • Build and optimize data models using dbt (data build tool) and manage transformations within Amazon Redshift (see the model sketch below this posting).
  • Write and optimize SQL queries for data processing and reporting.
  • Ensure the reliability, performance, and scalability of data infrastructure on AWS.
  • Collaborate with cross-functional teams, including data analysts, scientists, and business stakeholders, to ensure data quality and usability.
  • Implement and manage version control using Git for dbt projects.
  • Support workflow orchestration using tools such as Airflow, Dagster, or Prefect.
  • Maintain documentation for data pipelines, models, and processes.

Required Technical Skills
  • Strong proficiency in Python for data processing, automation, and ETL pipeline development.
  • Hands-on experience with dbt for transformation and data modeling.
  • In-depth knowledge of Amazon Redshift architecture, optimization, and best practices.
  • Expertise in SQL, including complex queries, window functions, and performance tuning (see the example below this posting).
  • Understanding of data warehouse design principles, dimensional modeling, and star/snowflake schemas.
  • Experience working with AWS services such as S3, Lambda, Glue, and Step Functions.
  • Familiarity with IAM policies, data security, and access management within AWS.

Preferred Qualifications
  • Experience with CI/CD implementation for data pipelines.
  • Exposure to data governance and data lineage tools.
  • Experience with Snowflake, BigQuery, or other cloud data warehouses.
  • Knowledge of streaming technologies such as Kafka or Kinesis.
  • Experience with Infrastructure as Code tools like Terraform or CloudFormation.

Experience Levels
  • Junior (1–3 years): Exposure to dbt and Redshift.
  • Mid-level (3–5 years): Proven experience managing production data pipelines.
  • Senior (5+ years): Expertise in data architecture, performance optimization, and mentoring.

Soft Skills
  • Strong analytical and problem-solving skills.
  • Excellent verbal and written communication abilities.
  • Attention to detail with a focus on data quality, testing, and monitoring.
  • Ability to collaborate effectively in a fast-paced, cross-functional environment.

How to Apply
Interested candidates can apply directly on LinkedIn or send their resumes to hr@catalystinfolabs.com with the subject line: “Application – Data Engineer – [Your Name]”
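To make the dbt and Redshift responsibilities above concrete, here is a minimal sketch of the kind of dbt model this role might maintain. All schema, table, and column names (raw.orders, stg_orders, order_id, and so on) are hypothetical illustrations, not taken from this posting.

```sql
-- models/staging/stg_orders.sql
-- Hypothetical dbt staging model; names are illustrative only.
{{ config(materialized='view') }}

with source as (

    -- source() resolves to a raw table declared in a dbt sources .yml file
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_placed_at as timestamp) as ordered_at,
        amount_cents / 100.0               as amount
    from source
    where order_id is not null

)

select * from renamed
```

In practice a model like this would be committed to Git alongside schema tests and documentation in a companion .yml file, which lines up with the version-control, testing, and documentation duties listed above.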

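The SQL requirement specifically calls out window functions and performance tuning. As a rough illustration only (table and column names are again hypothetical), a common Redshift pattern is deduplicating a history table with ROW_NUMBER():

```sql
-- Hypothetical example: keep only the latest row per order_id.
select order_id,
       status,
       updated_at
from (
    select order_id,
           status,
           updated_at,
           row_number() over (
               partition by order_id
               order by updated_at desc
           ) as rn
    from analytics.orders_history
) ranked
where rn = 1;
```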



  • AWS Data Engineer

    3 weeks ago


    New Delhi, India Tata Consultancy Services Full time

    Role - AWS Data Engineer. Required Technical Skill Set - AWS, Snowflake, ETL, Python, PySpark, dbt. Experience Range - 6+ years. Technical/Behavioral Competency: • 5+ years of experience with Snowflake development and integration • 3–7 years of experience with AWS cloud and AWS services such as S3 buckets, Lambda, Glue, API Gateway, SQS queues, RDS, and Redshift • Experience...

  • Data Engineer

    3 weeks ago


    New Delhi, India Veraxion Full time

    We’re looking for a Data Engineer to design, build, and scale modern data platforms on AWS. You’ll work with Python, Spark, DBT, and AWS-native services in an Agile environment to deliver scalable, secure, and high-performance data solutions. What you’ll do: - Develop and optimize ETL/ELT pipelines with Python, DBT, and AWS services (Data Ops Live). -...


  • New Delhi, India People Prime Worldwide Full time

    About Client: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. It is headquartered in Bengaluru, reports gross revenue of ₹222.1 billion, employs a global workforce of 234,054, is listed on the NASDAQ, operates in over 60 countries, and serves clients...