
PySpark Data Engineer
4 days ago
Contract Assistant – Data Engineer Support (Remote, EST Hours)
Start Date: Sept 10, 2025
⏳ Duration: 6 months (extendable)
Pay: $1,000/month
Work Hours: 8:00 AM – 5:30 PM EST
We’re looking for a Contract Assistant to support a PySpark Data Engineer with daily activities. This is a remote contract role (not formal employment).
What You’ll Do:
- Execute creative software and data solutions, including design, development, and technical troubleshooting, by thinking beyond routine approaches to build solutions or break down technical problems.
- Develop secure, high-quality production code and data pipelines, reviewing and debugging processes implemented by others.
- Identify opportunities to eliminate or automate remediation of recurring issues to improve the operational stability of software applications and systems.
- Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs and technical credentials.
- Work with business stakeholders to understand requirements and design appropriate solutions, producing architecture and design artifacts for complex applications.
- Implement robust monitoring and alerting systems to proactively identify and address data ingestion issues, optimizing performance and throughput.
- Implement data quality checks and validation processes to ensure the accuracy and reliability of data (see the sketch after this listing).
- Design and implement scalable data frameworks to manage end-to-end data pipelines for workforce data analytics.
- Share and develop best practices with Platform and Architecture teams to improve the data pipeline framework and modernize the workforce data analytics platform.
- Gather, analyze, and synthesize large, diverse data sets to continuously improve capabilities and user experiences, leveraging data-driven insights.
- Contribute to software engineering communities of practice and events that explore new and emerging technologies, fostering a culture of diversity, opportunity, inclusion, and respect.
What You’ll Need:
- Ability to work alongside me during US EST business hours.
- 5+ years of applied experience in data engineering, including design, application development, testing, and operational stability.
- Advanced proficiency in data processing frameworks and tools, including Parquet, Iceberg, PySpark, Glue, Lambda, Databricks, and AWS data services like EMR, Athena, and Redshift.
- Proficiency in programming languages like Python, Java, or Scala for data processing and application development.
- Proficiency in automation and continuous delivery methods, using CI/CD pipelines with tools like Git/Bitbucket, Jenkins, or Spinnaker for automated deployment and version control.
- Hands-on experience delivering system design, application development, testing, and operational stability, with an advanced understanding of agile methodologies, application resiliency, and security.
- Demonstrated proficiency in software applications and technical processes within technical disciplines like cloud, artificial intelligence, machine learning, and mobile.
- In-depth knowledge of the financial services industry and its IT systems.
- Proficiency in database management and data modeling, working with relational databases like Oracle or SQL Server. Skilled in writing SQL queries for efficient data management and retrieval, using DML for data handling, DDL for schema management, and PL/SQL for procedural extensions in Oracle databases.
- Experience with scheduling tools like Autosys to automate and manage job scheduling for efficient workflow execution.
This role offers hands-on exposure to real-world BA activities, stakeholder engagement, and technology evaluation, making it a good fit for someone looking to strengthen their BA career while working remotely.
Interested? Send me a DM or comment below.
#NowHiring #RemoteJobs #TechTalent #DataEngineer #HiringNow #WorkFromHome #TechOpportunities #JobAlert #PySpark #AWSData #Databricks #Glue #Lambda #EMR #Athena #Redshift #SQL #PLSQL #Python #Scala #Java #Automation #CICD #Jenkins #Bitbucket #DataPipelines #DataQuality #DataArchitecture
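The data quality bullet above is concrete enough to picture in code. A minimal PySpark sketch of the kind of validation step the posting describes; the dataset path, column names, and checks are assumptions for illustration, not part of the role:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical workforce-analytics feed; path and schema are assumptions.
df = spark.read.parquet("s3://example-bucket/workforce/events/")

# Basic checks: non-empty input, no null keys, no duplicate keys.
total_rows = df.count()
null_keys = df.filter(F.col("employee_id").isNull()).count()
duplicate_keys = total_rows - df.dropDuplicates(["employee_id", "event_date"]).count()

checks = {
    "non_empty": total_rows > 0,
    "no_null_keys": null_keys == 0,
    "no_duplicate_keys": duplicate_keys == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this would feed the monitoring/alerting mentioned above
    # rather than simply failing the job.
    raise ValueError(f"Data quality checks failed: {failed}")
```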
-
Data Engineer
2 weeks ago
New Delhi, India | Tata Consultancy Services | Full time
Job Title: Data Engineer - PySpark
Experience: 5 to 8 Years
Location: Pune/Hyderabad
Job Description
Required Skills:
- 5+ years of experience in Big Data and PySpark
Must-Have:
- Good work experience on Big Data platforms like Hadoop, Spark, Scala, Hive, Impala, SQL
Good-to-Have:
- Good Spark, PySpark, Big Data experience
- Spark UI/optimization/debugging techniques (see the sketch after this listing)
...
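The "Spark UI/optimization/debugging" line lends itself to a small example. A minimal PySpark sketch of two common tuning moves, broadcasting a small dimension table and repartitioning before a heavy aggregation; table paths and column names are assumed purely for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("spark-tuning-demo").getOrCreate()

# Hypothetical tables; names and sizes are assumptions.
orders = spark.read.parquet("s3://example-bucket/orders/")        # large fact table
customers = spark.read.parquet("s3://example-bucket/customers/")  # small dimension

# Broadcast the small side so the join avoids a full shuffle, then inspect the
# physical plan (the same plan is visible in the Spark UI SQL tab).
joined = orders.join(F.broadcast(customers), "customer_id")
joined.explain()

# Repartition by the aggregation key before a heavy groupBy to spread skewed work.
daily = (joined.repartition(200, "customer_id")
               .groupBy("customer_id")
               .agg(F.sum("amount").alias("total_amount")))
daily.write.mode("overwrite").parquet("s3://example-bucket/output/daily_totals/")
```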
-
Data Engineer – Databricks
2 days ago
New Delhi, India | Capgemini Engineering | Full time
Data Engineer – Databricks & PySpark
Location: Bangalore
Experience: 6–9 years
Choosing Capgemini means choosing a place where you’ll be empowered to shape your career, supported by a collaborative global community, and inspired to reimagine what’s possible. Join us in helping leading Consumer Products and Retail Services (CPRS) organizations unlock the...
-
Senior Data Engineer – Python
4 days ago
New Delhi, India | Xebia | Full time
We’re Hiring: Senior Data Engineer – Python & PySpark
Location: Bangalore (Hybrid – 3 days in office per week)
We are looking for an experienced Senior Data Engineer with a strong background in Python (with OOP concepts), PySpark, and building test cases (see the sketch after this listing). The ideal candidate must have 6+ years of hands-on experience and be available to join immediately or...
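"Building test cases" for PySpark code usually means unit tests around individual transformations. A minimal pytest sketch, assuming a local Spark session; the transformation, fixture, and column names are hypothetical:

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_full_name(df):
    # Hypothetical transformation under test: concatenate first and last name.
    return df.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))


@pytest.fixture(scope="session")
def spark():
    # Small local session so the suite runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_add_full_name(spark):
    df = spark.createDataFrame([("Ada", "Lovelace")], ["first_name", "last_name"])
    result = add_full_name(df).collect()
    assert result[0]["full_name"] == "Ada Lovelace"
```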
-
Data Engineer
3 weeks ago
Delhi, India | Tata Consultancy Services | Full time
TCS is hiring for PySpark Data Engineer for Bangalore
Role: Data Engineer with Mandatory PySpark Knowledge
Required Technical Skill Set: Python, PySpark, BigQuery
Desired Experience Range: 7 to 10 yrs
Location: Bangalore
Desired Competencies (Technical/Behavioral Competency)
Development, production support, and delivery of Python, BigQuery, SQL, GCP, Airflow based...
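The competencies above mention Airflow-based delivery, which is easiest to picture as a small DAG. A minimal sketch assuming Airflow 2.4 or later; the DAG id, schedule, and task bodies are placeholders, not anything from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull source data (e.g. from GCS or an API) into staging.
    print("extracting source data")


def load_to_bigquery():
    # Placeholder: load the staged data into a BigQuery table.
    print("loading into BigQuery")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    # Run the load only after extraction succeeds.
    extract_task >> load_task
```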
-
PySpark Data Engineer
5 days ago
Delhi, India | EXTRAGIG | Full time
Contract Assistant – Data Engineer Support (Remote, EST Hours)
Start Date: Sept 10, 2025
⏳ Duration: 6 months (extendable)
Pay: $1,000/month
Work Hours: 8:00 AM – 5:30 PM EST
We’re looking for a Contract Assistant to support a PySpark Data Engineer with daily activities. This is a remote contract role (not formal employment).
What You’ll Do: Execute...
-
AWS Data Engineer
2 weeks ago
New Delhi, India | Tata Consultancy Services | Full time
Dear Candidate,
Greetings from Tata Consultancy Services. Job openings at TCS.
Skill - AWS Data Engineer - Redshift, PySpark, Glue
Exp range - 5 yrs - 8 yrs
Location - Chennai
Notice period - 30 days
Please find the Job Description below.
- Good hands-on experience in Python programming and PySpark
- Data Engineering experience using AWS core services (Lambda, Glue, EMR and...
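The AWS bullet pairs naturally with a short example of a PySpark step running as a Glue ETL job. A minimal sketch assuming the standard AWS Glue job environment (where the awsglue library is provided); the S3 paths and column names are hypothetical:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue ETL job boilerplate: resolve job arguments and set up contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext.getOrCreate()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical source and target locations; bucket names are assumptions.
df = spark.read.json("s3://example-bucket/raw/events/")
cleaned = (df.dropDuplicates(["event_id"])
             .withColumn("event_date", F.to_date("event_ts")))
(cleaned.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/events/"))

job.commit()
```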