Data Engineer
2 weeks ago
Overview:
We are looking for a highly skilled Python Data Engineer to join our team in an on-premise data engineering environment. The ideal candidate will have experience with ETL tools, data processing technologies, data orchestration, and relational databases. Additionally, you should be proficient in Python scripting for data engineering tasks and have experience working with Spark, PySpark, and other relevant data technologies. While cloud tools are good to have, this position primarily focuses on on-premise data infrastructure. This is an excellent opportunity to work on exciting projects that involve developing scalable data pipelines, real-time data streaming, and optimizing data processing tasks using Python.
Key Responsibilities:
- ETL Development & Optimization: Design, develop, and optimize ETL pipelines using open-source or cloud ETL tools (e.g., Apache NiFi, Talend, Pentaho, Airflow, AWS Glue).
- Python Scripting for Data Engineering: Write Python scripts to automate data extraction, transformation, and loading (ETL) processes. Ensure the code is optimized for performance and scalability (a brief illustrative sketch follows at the end of this listing).
- Big Data Processing: Work with Apache Spark and PySpark to process large datasets in a distributed computing environment. Optimize Spark jobs for performance and resource efficiency.
- Job Orchestration: Use Apache Airflow or other orchestration tools to schedule, monitor, and automate data pipeline workflows.
- Data Streaming: Design and implement real-time data streaming solutions using technologies like Apache Kafka or AWS Kinesis for high-throughput, low-latency data processing.
- File Formats & Table Formats: Work with open-source file and table formats such as Apache Parquet, Apache Avro, or Delta Lake, as well as other structured/unstructured data formats, for efficient data storage and access.
- Database Management: Work with relational databases (e.g., PostgreSQL, MySQL, SQL Server) for data storage, management, and optimization. Understand database concepts such as normalization, indexing, and query optimization.
- SQL Expertise: Write and optimize complex SQL queries for data extraction, transformation, and aggregation across large datasets. Ensure queries are efficient and scalable.
- BI & Data Warehouse Knowledge: Exposure to BI tools and data warehousing concepts is a plus, ensuring data is structured in a way that supports analytics and reporting.
Required Skills & Experience:
- ETL Tools: Experience with open-source ETL tools such as Apache NiFi, Talend, or Pentaho. Cloud-based tools like AWS Glue or Azure Data Factory are good to have.
- Python Scripting: Proficiency in Python for automating data processing tasks, writing data pipelines, and working with libraries such as Pandas, Dask, and PySpark.
- Big Data Technologies: Experience with Apache Spark and PySpark for distributed data processing, along with optimization techniques.
- Data Orchestration: Experience using Apache Airflow or similar tools for scheduling and automating data pipelines.
- Data Streaming: Experience with Apache Kafka or AWS Kinesis for building and managing real-time data pipelines.
- Open-Source File Formats: Knowledge of Apache Parquet, Apache Avro, Delta Lake, or similar open-source table formats for efficient data storage and retrieval.
- Relational Databases: Strong experience with at least one relational database (e.g., PostgreSQL, MySQL, SQL Server) and a solid understanding of database concepts like indexing, normalization, and query optimization.
- SQL Expertise: Strong skills in writing and optimizing complex SQL queries for data extraction, transformation, and aggregation.
Nice to Have:
- BI/Analytics Tools: Familiarity with BI tools like Power BI, Tableau, Looker, or similar reporting and data visualization platforms.
- Data Warehousing: Knowledge of data warehousing principles, schema design (e.g., star/snowflake), and optimization techniques for large datasets.
- Cloud Technologies: Experience with cloud data platforms like Databricks, Snowflake, or Azure Synapse is beneficial, though the role is focused on on-premise environments.
- Containerization: Familiarity with containerization tools like Docker or Kubernetes for deploying data engineering workloads.
Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field (or equivalent work experience).
Additional Qualities:
- Excellent problem-solving and troubleshooting skills.
- Ability to work both independently and in a collaborative environment.
- Strong communication skills, both written and verbal.
- Detail-oriented with a focus on data quality and performance optimization.
- Proactive attitude and the ability to take ownership of projects.
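To illustrate the kind of pipeline work this role describes, here is a minimal PySpark ETL sketch. It is not part of the posting: the file paths, column names, and aggregation logic are hypothetical, and a production job would add configuration, error handling, and tests.

```python
# Minimal PySpark ETL sketch (illustrative only): read raw Parquet data,
# apply a simple transformation, aggregate, and write partitioned output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run_pipeline(input_path: str, output_path: str) -> None:
    spark = (
        SparkSession.builder
        .appName("example_etl_job")  # hypothetical job name
        .getOrCreate()
    )

    # Extract: read raw event data stored as Parquet.
    events = spark.read.parquet(input_path)

    # Transform: keep valid rows and derive an event_date column
    # ("amount" and "event_ts" are assumed columns for illustration).
    cleaned = (
        events
        .filter(F.col("amount") > 0)
        .withColumn("event_date", F.to_date("event_ts"))
    )

    # Aggregate: daily totals per customer (illustrative business logic).
    daily_totals = (
        cleaned
        .groupBy("customer_id", "event_date")
        .agg(F.sum("amount").alias("total_amount"))
    )

    # Load: write results partitioned by date for efficient downstream reads.
    daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(output_path)

    spark.stop()


if __name__ == "__main__":
    run_pipeline("/data/raw/events", "/data/curated/daily_totals")  # hypothetical paths
```

In an Airflow-based setup, a function like this would typically be wrapped in a DAG task (for example, invoked via a PythonOperator or a spark-submit call) so the pipeline can be scheduled and monitored.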
-
Data Engineer
2 days ago
Pune, Maharashtra, India Jash Data Sciences Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Do you love solving real-world data problems with the latest and best techniques? And having fun while solving them in a team? Then come join our high-energy team of passionate data people. Jash Data Sciences is the right place for you. We are a cutting-edge Data Sciences and Data Engineering startup based in Pune, India. We believe in continuous learning and...
-
Senior Data Engineer
3 weeks ago
Pune, India Data Axle Full time
Job Description
About Data Axle: Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for 50 years in the US. Data Axle has set up a strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology...
-
Data Operations Engineer
2 days ago
Pune, Maharashtra, India Mars Data Insights Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Title: Data Operations Engineer & RUN Support
Skills: Data Operations Engineering, data manipulation, Python, Talend, GCP, BigQuery, Dataiku, ITSM/ticketing tools, Helix, Jira, task management, data pipelines, RUN Service, data infrastructure, data quality
Job Location: Pune
Job Type: Full time/Hybrid
Work Experience: 5+ years
We are seeking a highly...
-
Azure Data Engineer
2 weeks ago
Pune, India Fragma Data Systems Full time
Job Description
Technology Skills:
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in...
-
Business Consulting-Data Engineer with Power BI
1 minute ago
Pune, Maharashtra, India NTT DATA Full time
Req ID: 345385
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting-Data Engineer with Power BI to join our team in Pune, Mahārāshtra (IN-MH), India (IN). Job...
-
Data Scientist
1 week ago
Pune, Maharashtra, India Data Dynamics Full time
**Objectives of this role**:
- Collaborate with product design and engineering teams to develop an understanding of needs.
- Research and devise innovative statistical models for data analysis.
- Build innovative models and integrate them into our product.
- Communicate findings to all stakeholders.
- Keep current with technical and industry...
-
Data Scientist
2 days ago
Pune, Maharashtra, India Data Axle Full time
**About Data Axle**:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 45 years in the USA. Data Axle has set up a strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by...
-
Mid Data Science
3 weeks ago
Pune, India NTT DATA Full time
Job Description
Req ID:
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Mid Data Science & AIML, GenAI Lead/Engineer to join our team in Pune, Mahārāshtra (IN-MH), India...
-
Data Architect
3 weeks ago
Pune, India NTT DATA Full time
Job Description:
• Work as a Data Architect/Senior Data Engineer to design and develop cost-effective and reliable data solutions on any cloud platform, such as Azure or AWS.
• Should be able to understand client requirements and convert them into technical solutions leveraging cloud capabilities and modern technologies.
• Should be able to contribute into...