Data Engineer

3 days ago


Ahmedabad, Gujarat, India · Circle K · Full time · ₹12,00,000 – ₹36,00,000 per year

Job Description

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries serving more than 6 million customers each day.

It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams discover, value, and act on insights from data across the globe. Building on our strong data pipeline, this position will play a key role in partnering with our Technical Development stakeholders to enable analytics for long-term success.

About the role

We are looking for a Data Engineer with a collaborative, "can-do" attitude who is committed to making their team successful, and who has experience implementing technical solutions as part of a broader data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. It will help drive Circle K's next phase in the digital journey by transforming data into actionable business outcomes.

Roles and Responsibilities

  • Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals

  • Demonstrate technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options

  • Determine solutions that are best suited to develop a pipeline for a particular data source

  • Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development

  • Develop ELT/ETL pipelines efficiently using Azure cloud services and Snowflake, including testing and operational support (root-cause analysis, monitoring, maintenance)

  • Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery

  • Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders

  • Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)

  • Stay current with and adopt new tools and applications to ensure high quality and efficient solutions

  • Build cross-platform data strategy to aggregate multiple sources and process development datasets

  • Communicate proactively with stakeholders; mentor and guide junior team members through regular knowledge transfer (KT) and reverse-KT sessions, and help them identify production bugs/issues and recommend resolutions
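The extract-transform-load work described in the responsibilities above can be sketched in a few lines. The following is a minimal, illustrative example using only Python's standard library; the feed layout, table names, and column names are hypothetical, and SQLite stands in for a warehouse target such as Snowflake or Synapse.

```python
import csv
import io
import sqlite3

# Extract: in practice this would read from an enterprise source system;
# here a small in-memory CSV stands in for the raw feed (hypothetical columns).
RAW_FEED = """store_id,sale_date,amount
101,2024-01-15,250.00
102,2024-01-15,
101,2024-01-16,310.50
"""

def extract(feed: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(feed)))

# Transform: basic cleansing - drop rows missing an amount, cast types.
def transform(rows: list[dict]) -> list[tuple]:
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # a real pipeline would route rejects to an audit table
        out.append((int(r["store_id"]), r["sale_date"], float(r["amount"])))
    return out

# Load: write into a staging table (SQLite stands in for the warehouse).
def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_sales (store_id INT, sale_date TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM stg_sales").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_FEED)), conn)
print(loaded)  # 2 rows survive cleansing
```

In a production setting the same extract/transform/load split maps onto orchestrated activities in a tool like Azure Data Factory or Databricks, with rejected rows landed in an audit table rather than silently dropped.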

Job Requirements

  • Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; Master's degree preferred

  • 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment

  • 3+ years of experience with setting up and operating data pipelines using Python or SQL

  • 3+ years of advanced SQL Programming: PL/SQL, T-SQL

  • 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization

  • Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads

  • 3+ years of strong and extensive hands-on experience in Azure, preferably data heavy / analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data

  • 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure functions

  • 3+ years of experience defining and enabling data quality standards for auditing and monitoring

  • Strong analytical abilities and intellectual curiosity

  • In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts

  • Understanding of REST and good API design

  • Experience working with Apache Iceberg, Delta tables and distributed computing frameworks

  • Strong collaboration and teamwork skills; excellent written and verbal communication skills

  • Self-starter and motivated with ability to work in a fast-paced development environment

  • Agile experience highly desirable

  • Proficiency with the development environment, including an IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools
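The data-quality, auditing, and monitoring requirements above often boil down to reconciliation checks between source and target. Below is a small illustrative sketch; the `src_orders`/`tgt_orders` tables are hypothetical, and SQLite stands in for the source system and the Snowflake/Synapse target.

```python
import sqlite3

# Hypothetical source and target tables with a deliberately missing row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INT, amount REAL);
    CREATE TABLE tgt_orders (order_id INT, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(conn: sqlite3.Connection, src: str, tgt: str) -> dict:
    """Compare row counts and amount totals between source and target."""
    src_n, src_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {src}"
    ).fetchone()
    tgt_n, tgt_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {tgt}"
    ).fetchone()
    return {
        "rows_match": src_n == tgt_n,
        "sums_match": src_sum == tgt_sum,
        "missing_rows": src_n - tgt_n,
    }

audit = reconcile(conn, "src_orders", "tgt_orders")
print(audit)  # {'rows_match': False, 'sums_match': False, 'missing_rows': 1}
```

A check like this would typically run after each pipeline execution, with mismatches raising an alert rather than just printing a summary.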

Preferred Skills

  • Strong Knowledge of Data Engineering concepts (Data pipelines creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management)

  • Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques

  • Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks

  • Working knowledge of DevOps processes (CI/CD), Git version control, Jenkins, Master Data Management (MDM), and Data Quality tools

  • Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance)

  • Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file storage (Blob Storage), and Python/Unix shell scripting

  • ADF, Databricks, and Azure certifications are a plus
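The QA and unit-testing expectations above can be as simple as asserting the behavior of each transform in isolation. The function and padding rule below are purely illustrative, but the pattern (pure transform plus plain assertions, runnable as-is or under pytest) is the one being asked for.

```python
def normalize_store_code(code: str) -> str:
    """Trim whitespace and zero-pad store codes to 6 digits (illustrative rule)."""
    return code.strip().zfill(6)

# Minimal unit tests: plain asserts that also run unchanged under pytest.
assert normalize_store_code("42") == "000042"
assert normalize_store_code("  1234 ") == "001234"
```

Keeping transforms pure like this makes them testable outside the pipeline runtime, which is what enables fast RCA and safe code/data fixes in production.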

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (Powershell, Bash), Git, Terraform, Power BI, Snowflake

#LI-DS1


