PySpark Data Engineer
6 hours ago
🚀 Contract Assistant – Data Engineer Support (Remote, EST Hours) 🚀
📅 Start Date: Sept 10, 2025
⏳ Duration: 6 months (extendable)
💰 Pay: $1,000/month
🕗 Work Hours: 8:00 AM – 5:30 PM EST

We're looking for a Contract Assistant to support a PySpark Data Engineer with daily activities. This is a remote contract role (not formal employment).

What You'll Do:
Execute creative software and data solutions, including design, development, and technical troubleshooting, thinking beyond routine approaches to build solutions or break down technical problems.
Develop secure, high-quality production code and data pipelines, reviewing and debugging processes implemented by others.
Identify opportunities to eliminate or automate remediation of recurring issues to improve the operational stability of software applications and systems.
Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs and technical credentials.
Work with business stakeholders to understand requirements and design appropriate solutions, producing architecture and design artifacts for complex applications.
Implement robust monitoring and alerting systems to proactively identify and address data ingestion issues, optimizing performance and throughput.
Implement data quality checks and validation processes to ensure the accuracy and reliability of data.
Design and implement scalable data frameworks to manage end-to-end data pipelines for workforce data analytics.
Share and develop best practices with Platform and Architecture teams to improve the data pipeline framework and modernize the workforce data analytics platform.
Gather, analyze, and synthesize large, diverse data sets to continuously improve capabilities and user experiences, leveraging data-driven insights.
Contribute to software engineering communities of practice and events that explore new and emerging technologies, fostering a culture of diversity, opportunity, inclusion, and respect.

What You'll Need:
Ability to work alongside me during US EST business hours.
5+ years of applied experience in data engineering, including design, application development, testing, and operational stability.
Advanced proficiency in data processing frameworks and tools, including Parquet, Iceberg, PySpark, Glue, Lambda, Databricks, and AWS data services such as EMR, Athena, and Redshift.
Proficiency in programming languages such as Python, Java, or Scala for data processing and application development.
Proficiency in automation and continuous delivery methods, using CI/CD pipelines with tools like Git/Bitbucket, Jenkins, or Spinnaker for automated deployment and version control.
Hands-on experience delivering system design, application development, testing, and operational stability, with an advanced understanding of agile methodologies, application resiliency, and security.
Demonstrated proficiency in software applications and technical processes within technical disciplines such as cloud, artificial intelligence, machine learning, and mobile.
In-depth knowledge of the financial services industry and its IT systems.
Proficiency in database management and data modeling, working with relational databases such as Oracle or SQL Server.
Skill in writing SQL queries for efficient data management and retrieval, using DML for data handling, DDL for schema management, and PL/SQL for procedural extensions in Oracle databases.
Experience with scheduling tools like Autosys to automate and manage job scheduling for efficient workflow execution.

📌 This role offers hands-on exposure to real-world BA activities, stakeholder engagement, and technology evaluation – perfect for someone looking to strengthen their BA career while working remotely.
📧 Interested? Send me a DM or comment below.

#NowHiring #RemoteJobs #TechTalent #DataEngineer #HiringNow #WorkFromHome #TechOpportunities #JobAlert #PySpark #AWSData #Databricks #Glue #Lambda #EMR #Athena #Redshift #SQL #PLSQL #Python #Scala #Java #Automation #CICD #Jenkins #Bitbucket #DataPipelines #DataQuality #DataArchitecture
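The pipeline and data-quality duties listed above are easiest to picture with a small example. Below is a minimal PySpark sketch of the kind of validation step the posting describes, not the employer's actual code; the S3 path, column names, and thresholds are hypothetical placeholders.

```python
# Minimal sketch of a PySpark data quality check of the kind described above.
# The input path, column names, and checks are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

df = spark.read.parquet("s3://example-bucket/workforce/events/")  # hypothetical path
total = df.count()

# Check 1: required columns must not contain nulls.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in ("employee_id", "event_date")]
).first().asDict()

# Check 2: the primary key should be unique.
duplicates = total - df.dropDuplicates(["employee_id", "event_date"]).count()

failures = [f"nulls in {c}: {n}" for c, n in null_counts.items() if n]
if duplicates:
    failures.append(f"duplicate keys: {duplicates}")

if failures:
    # Raise so the orchestrator (e.g. Autosys or a CI/CD-driven scheduler) marks the run failed.
    raise ValueError("Data quality checks failed: " + "; ".join(failures))

print(f"All checks passed on {total} rows")
```

In practice a job like this would sit inside the scheduled pipeline the posting mentions, so that a failed check halts downstream loads rather than silently propagating bad data.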
-
PySpark Data Engineer
6 days ago
Bangalore, India · Tata Consultancy Services · Full time
Role: Data Engineer with Mandatory PySpark Knowledge
Required Technical Skill Set: Python, PySpark, BigQuery
Desired Experience Range: 5+ yrs
Desired Experience: 0-60 Days
Location: Bangalore, Pune, Chennai, Gurugram
Desired Competencies (Technical/Behavioral Competency): Development, Production Support, and delivery of Python, BigQuery, SQL, GCP, Airflow...
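Since the posting pairs PySpark with BigQuery and GCP, here is a rough sketch of how such a job might read and write BigQuery tables, assuming the open-source spark-bigquery connector is available on the cluster (as it is on Dataproc); the project, dataset, table, and bucket names are made up.

```python
# Rough sketch of reading from and writing back to BigQuery with PySpark.
# Assumes the spark-bigquery connector is on the classpath; all names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-pyspark-sketch").getOrCreate()

# Read a source table through the connector.
events = (
    spark.read.format("bigquery")
    .option("table", "my-project.analytics.raw_events")  # hypothetical table
    .load()
)

# A simple aggregation, standing in for the real transformation logic.
daily = events.groupBy(F.to_date("event_ts").alias("event_date")).count()

# Write results back; the connector stages data through a GCS bucket.
(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_counts")  # hypothetical table
    .option("temporaryGcsBucket", "my-staging-bucket")     # hypothetical bucket
    .mode("overwrite")
    .save()
)
```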
-
Senior Data Engineer
4 days ago
Bangalore, India · beBeePyspark · Full time
PySpark Developer Role
We are seeking a skilled PySpark developer to join our team. In this role, you will be working on Big Data platforms like Hadoop and Spark. Good work experience on Big Data platforms is essential. A good understanding of data processing frameworks such as PySpark and Scala is required. You should have good Spark, PySpark, Big Data...
-
AWS Data Engineer
6 days ago
Bangalore, India · Atyeti Inc · Full time
Technical Skills
Must Have Skills:
Proficient with Python, PySpark and Airflow
Strong understanding of Object-Oriented Programming and Functional Programming paradigms
Must have experience working with Spark and its architecture
Knowledge of Software Engineering best practices
Advanced SQL knowledge (preferably Oracle)
Experience in processing large amounts of...
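To make the Python/PySpark/Airflow combination above concrete, here is a minimal Airflow DAG sketch that submits a PySpark application. It assumes the apache-spark provider package is installed and a Spark connection is configured; the DAG id, script path, and connection id are hypothetical.

```python
# Minimal Airflow DAG sketch: one task that spark-submits a PySpark script.
# Assumes the apache-airflow-providers-apache-spark package and a "spark_default"
# connection; all names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_pyspark_transform",   # hypothetical
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                  # "schedule_interval" on older Airflow 2.x
    catchup=False,
) as dag:
    run_transform = SparkSubmitOperator(
        task_id="run_transform",
        application="/opt/airflow/jobs/transform.py",  # hypothetical PySpark script
        conn_id="spark_default",                       # assumed Spark connection
        conf={"spark.sql.shuffle.partitions": "200"},
    )
```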
-
AWS Data Engineer
4 days ago
Bangalore, India · Atyeti Inc · Full time
Technical Skills
Must Have Skills:
Proficient with Python, PySpark and Airflow
Strong understanding of Object-Oriented Programming and Functional Programming paradigms
Must have experience working with Spark and its architecture
Knowledge of Software Engineering best practices
Advanced SQL knowledge (preferably Oracle)
Experience in processing large amounts...
-
Data Engineer
2 weeks ago
Bangalore, India · Tata Consultancy Services · Full time
Dear Candidate, greetings from Tata Consultancy Services. Job openings at TCS:
Skill: Data Engineer (PySpark) with AI
Exp range: 6 yrs to 10 yrs
Location: Bangalore
Notice period: 30 days
Please find the job description below.
1) Strong design and data solutioning skills
2) PySpark hands-on experience with complex transformations and large dataset handling...
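The posting's emphasis on complex transformations over large datasets is the sort of work sketched below: a broadcast join against a small dimension table plus a window function to keep the latest record per key. The table paths and column names are hypothetical, not from the posting.

```python
# Sketch of a "complex transformation" in PySpark: broadcast join + window function.
# All paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transforms-sketch").getOrCreate()

txns = spark.read.parquet("/data/transactions")   # large fact table (hypothetical)
accounts = spark.read.parquet("/data/accounts")   # small dimension table (hypothetical)

# Broadcast the small side to avoid shuffling the large table.
enriched = txns.join(F.broadcast(accounts), on="account_id", how="left")

# Keep only the most recent transaction per account.
w = Window.partitionBy("account_id").orderBy(F.col("txn_ts").desc())
latest = (
    enriched.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

latest.write.mode("overwrite").partitionBy("txn_date").parquet("/data/latest_txns")
```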
-
Python PySpark Data Engineer
2 weeks ago
Bangalore, Karnataka, India · DXC Technology · Full time
Python PySpark Data Engineer
Job Location: Hyderabad, Bangalore, Chennai, Kolkata, Noida, Gurgaon, Pune, Indore, Mumbai
Strong Python Skills - Proficient in Python for data manipulation, automation, and building reusable components
Data Pipeline Development - Experience designing and maintaining ETL data pipelines using tools like Airflow or custom scripts
Database...
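As a loose illustration of the "reusable components" requirement above, the sketch below chains small, single-purpose transform functions with DataFrame.transform (available in PySpark 3.0+); the input file and column names are hypothetical.

```python
# Sketch of reusable, composable PySpark transforms chained with DataFrame.transform.
# Input path and columns are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("reusable-components-sketch").getOrCreate()

def standardize_columns(df: DataFrame) -> DataFrame:
    """Lower-case and snake-case column names."""
    renamed = df
    for c in df.columns:
        renamed = renamed.withColumnRenamed(c, c.strip().lower().replace(" ", "_"))
    return renamed

def add_ingest_metadata(df: DataFrame) -> DataFrame:
    """Stamp each row with the ingestion timestamp."""
    return df.withColumn("ingested_at", F.current_timestamp())

orders = spark.read.option("header", True).csv("/data/orders.csv")  # hypothetical input
curated = orders.transform(standardize_columns).transform(add_ingest_metadata)
curated.show(5)
```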
-
PySpark Developer
3 weeks ago
Bangalore, India · Tata Consultancy Services · Full time
The ideal candidate will be responsible for developing high-quality applications and for designing and implementing testable and scalable code.
Functional Skills: Experience in the Credit Risk/Regulatory Risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good to Have Skills: Exposure to...
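Given the Spark/Hive/PySpark skill set listed above, a typical task might look like the hedged sketch below: querying a Hive table with Spark SQL and writing an aggregate back to the warehouse. It assumes a cluster configured with a Hive metastore; the database, table, and column names are invented.

```python
# Minimal sketch of PySpark working against Hive tables.
# Assumes a Hive metastore is configured; database/table/column names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Query an existing Hive table with Spark SQL.
exposures = spark.sql("""
    SELECT counterparty_id, SUM(exposure_amt) AS total_exposure
    FROM risk_db.credit_exposures
    WHERE as_of_date = '2025-01-31'
    GROUP BY counterparty_id
""")

# Persist the aggregate back to the warehouse as a managed Hive table.
exposures.write.mode("overwrite").saveAsTable("risk_db.counterparty_exposure_agg")
```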
-
Data Engineer
1 week ago
Bangalore, India · Atyeti Inc · Full time
Background: This position will be responsible for the design, build, and maintenance of data pipelines running on Airflow and Spark on the AWS Cloud platform at the Bank.
Roles and Responsibilities:
Build and maintain all facets of data pipelines for the Data Engineering team.
Build the pipelines required for optimal extraction, transformation, and loading of data from a wide...
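The extraction, transformation, and loading work described above might, in rough outline, look like the sketch below: raw CSV landed in S3 is cleaned and rewritten as partitioned Parquet. Bucket paths and columns are hypothetical, and the s3a:// scheme assumes the usual Hadoop S3 connector is configured on the cluster.

```python
# Rough sketch of one ETL step on AWS: raw CSV in S3 -> cleaned, partitioned Parquet.
# Paths and columns are hypothetical; assumes the Hadoop S3A connector is configured.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3a://example-raw-zone/payments/2025-01-31/")  # hypothetical landing path
)

clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .withColumn("load_date", F.to_date(F.lit("2025-01-31")))
    .dropna(subset=["payment_id"])
)

(
    clean.write.mode("overwrite")
    .partitionBy("load_date")
    .parquet("s3a://example-curated-zone/payments/")     # hypothetical curated path
)
```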
-
Data Engineer
4 weeks ago
Bangalore, India · Tata Consultancy Services · Full time
Dear Candidate, greetings from Tata Consultancy Services. Job openings at TCS:
Skill: Data Engineer (PySpark) with AI
Exp range: 6 yrs to 10 yrs
Location: Bangalore
Notice period: 30 days
Please find the job description below.
1) Strong design and data solutioning skills
2) PySpark hands-on experience with complex transformations and large dataset handling...
-
Python PySpark Data Engineer
4 weeks ago
Bangalore, Karnataka, India · DXC Technology · Full time
Python PySpark Data Engineer
Job Location: Hyderabad, Bangalore, Chennai, Kolkata, Noida, Gurgaon, Pune, Indore, Mumbai
We are seeking a skilled Lead Data Engineer with strong programming and SQL skills to join our team. The ideal candidate will have hands-on experience with Python and PySpark, Data Analytics services, and a basic understanding of general AWS services...