
Data and Analytics Tester (SQL, Python, PySpark)
Do you want to be part of an inclusive team that works to develop innovative therapies for patients? Every day, we are driven to develop and deliver innovative and effective new medicines to patients and physicians. If you want to be part of this exciting work, you belong at Astellas.
Astellas Pharma Inc. is a pharmaceutical company conducting business in more than 70 countries around the world. We are committed to turning innovative science into medical solutions that bring value and hope to patients and their families. Keeping our focus on addressing unmet medical needs and conducting our business with ethics and integrity enables us to improve the health of people throughout the world. For more information on Astellas, please visit our website.
We're looking for skilled Testers specializing in SQL, Python, and PySpark to join our FoundationX team. As part of this team, your role will involve maintaining operational, scalable data-driven systems that deliver business value.
*Purpose And Scope:*
As a Tester, you will play a crucial role in ensuring the quality and reliability of our pharmaceutical data systems. Your expertise in testing methodologies, data validation, and automation will be essential in maintaining high standards for our data solutions.
This position is based in Bengaluru and will require some on-site work.
*Essential Skills & Knowledge:*
SQL Proficiency:
- Strong skills in SQL for data querying, validation, and analysis.
- Experience with complex joins, subqueries, and performance optimization.
- Knowledge of database management systems (e.g., SQL Server, Oracle, MySQL).
Python And PySpark:
- Proficiency in Python for test automation and data manipulation.
- Familiarity with PySpark for big data testing.
Testing Methodologies:
- Knowledge of testing frameworks (e.g., pytest, unittest).
- Experience with test case design, execution, and defect management.
Data Validation:
- Ability to validate data transformations, aggregations, and business rules (see the sketch at the end of this skills list).
- Understanding of data lineage and impact analysis.
Business Intelligence Tools:
- Proficiency in designing and maintaining QLIK/Tableau applications.
- Strong expertise in creating interactive reports and dashboards using Power BI.
- Knowledge of DAX and Power Automate (MS Flow).
Data Modelling And Integration:
- Ability to design and implement logical and physical data models.
- Familiarity with data warehousing concepts and ETL processes.
Data Governance And Quality:
- Understanding of data governance principles and metadata management.
- Experience ensuring data quality and consistency.
Testing Techniques:
- Familiarity with SQL, Python, or R for data validation and testing.
- Manual and automated testing (e.g., Selenium, JUnit).
Test Management Tools:
- Experience with test management software (e.g., qTest, Zephyr, ALM).
Data Analysis:
- Knowledge of statistical analysis and data visualization tools (e.g., Tableau, Power BI).
Pharmaceutical Domain:
- Understanding of pharmaceutical data (clinical trials, drug development, patient records).
Attention To Detail:
- Meticulous review of requirements and data quality.
Collaboration And Communication:
- Ability to work with end users and technical team members. Self-starter who collaborates effectively.
Experience And Agile Mindset:
- Experience with data warehouse (DWH) and BI systems.
- Analytical thinking and adherence to DevOps principles.
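To illustrate how these skills typically come together, the minimal sketch below shows a pytest-driven PySpark data-validation check. It is an illustrative assumption only: the table paths, column names, and business rules are hypothetical and are not drawn from Astellas' systems.

```python
# Minimal sketch of a pytest + PySpark data-validation check.
# All paths, table/column names, and rules are hypothetical examples.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    # Local Spark session for test runs; a real suite would target the cluster.
    session = SparkSession.builder.master("local[2]").appName("dwh-tests").getOrCreate()
    yield session
    session.stop()


def test_no_duplicate_patient_ids(spark):
    # Hypothetical source: a curated patient dimension loaded by the ETL.
    df = spark.read.parquet("/data/curated/dim_patient")  # illustrative path
    total = df.count()
    distinct = df.select("patient_id").distinct().count()
    assert total == distinct, f"{total - distinct} duplicate patient_id values found"


def test_trial_phase_values_are_valid(spark):
    # Hypothetical business rule: trial phase must be one of the allowed codes.
    df = spark.read.parquet("/data/curated/fact_clinical_trial")
    allowed = ["I", "II", "III", "IV"]
    invalid = df.filter(~F.col("trial_phase").isin(allowed)).count()
    assert invalid == 0, f"{invalid} rows have an unexpected trial_phase"
```

In practice such checks would sit alongside the manual test cases and run in the team's automation pipeline.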
*Preferred Skills And Knowledge:*
- Experience working in the Pharma industry. Understanding of pharmaceutical data (clinical trials, drug development, patient records) is advantageous.
- Certifications in BI tools or testing methodologies.
- Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.
*Responsibilities:*
Development Ownership:
- Take ownership of testing key development projects related to the Data Warehouse and other MI systems.
- Collaborate with senior team members in a multi-skilled project team.
- Contribute to efficient administration of multi-server environments.
Test Strategy And Planning:
- Collaborate with stakeholders, data scientists, analysts, and developers to understand project requirements and data pipelines.
- Develop comprehensive end-to-end test strategies and plans for data validation, transformation, and analytics.
Data Validation And Quality Assurance:
- Execute manual and automated tests on data pipelines, ETL processes, and analytical models.
- Verify data accuracy, completeness, and consistency.
- Identify anomalies and data quality issues.
- Ensure compliance with industry standards (e.g., GxP, HIPAA).
Regression Testing:
- Validate changes and enhancements to data pipelines and analytics tools.
- Verify data accuracy, consistency, and completeness in QLIK, Power BI, and Tableau reports (see the reconciliation sketch at the end of this section).
- Monitor performance metrics and identify deviations.
Test Case Design And Execution:
- Create detailed test cases based on business requirements and technical specifications.
- Execute test cases, record results, and report defects.
- Collaborate with development teams to resolve issues promptly.
Documentation:
- Maintain comprehensive documentation of test scenarios, test data, and results.
- Document test procedures and best practices.
Data Security And Privacy:
- Ensure data security and compliance with privacy regulations (e.g., GDPR).
- Validate access controls and encryption mechanisms.
Collaboration And Communication:
- Work closely with cross-functional teams, including data engineers, data scientists, and business stakeholders.
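As a hedged illustration of the regression and reconciliation work described above (not Astellas' actual pipeline; the paths, columns, and tolerance are hypothetical), a PySpark check that compares an ETL source layer with the extract behind a QLIK/Power BI/Tableau report might look like this:

```python
# Minimal sketch of a regression/reconciliation check between an ETL source
# layer and the extract that feeds a reporting dashboard.
# Paths, table/column names, and the tolerance are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local[2]").appName("recon-check").getOrCreate()

source = spark.read.parquet("/data/curated/fact_sales")       # upstream layer
report = spark.read.parquet("/data/reporting/sales_extract")  # layer behind the dashboard

# Compare row counts and a key aggregate per reporting period.
src_agg = source.groupBy("period").agg(
    F.count("*").alias("src_rows"), F.sum("net_amount").alias("src_amount"))
rpt_agg = report.groupBy("period").agg(
    F.count("*").alias("rpt_rows"), F.sum("net_amount").alias("rpt_amount"))

diff = (src_agg.join(rpt_agg, "period", "full_outer")
        .withColumn("row_delta",
                    F.coalesce(F.col("src_rows"), F.lit(0))
                    - F.coalesce(F.col("rpt_rows"), F.lit(0)))
        .withColumn("amount_delta",
                    F.coalesce(F.col("src_amount"), F.lit(0))
                    - F.coalesce(F.col("rpt_amount"), F.lit(0)))
        .filter((F.col("row_delta") != 0) | (F.abs(F.col("amount_delta")) > 0.01)))

mismatches = diff.count()
assert mismatches == 0, f"{mismatches} reporting periods do not reconcile"
spark.stop()
```

A check of this kind would typically run as part of the regression pack after each pipeline or report change, with mismatching periods logged as defects.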
*Experience:*
- 3+ years of experience as a Tester, Developer, or Data Analyst within the pharmaceutical industry or a similarly regulated environment.
- Analytical mindset and logical thinking
- Familiarity with Business Intelligence and Data Warehousing concepts
- Web integration skills (Qlik Sense).
- Advanced SQL knowledge.
- Understanding of stored procedures, triggers, and tuning.
- Experience with other BI tools (e.g., Tableau) is a plus.
*Education:*
- Bachelor's degree in computer science, information technology, or a related field (or equivalent experience).
*Qualifications:*
- Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service, and enabling strategic insights and decision-making. Experience in other complex and highly regulated industries will also be considered.
- Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modelling tools.
- Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
*Working Environment:*
At Astellas we recognize the importance of work/life balance, and we are proud to offer a hybrid working solution allowing time to connect with colleagues at the office with the flexibility to also work from home. We believe this will optimize the most productive work environment for all employees to succeed and deliver. Hybrid work from certain locations may be permitted in accordance with Astellas' Responsible Flexibility Guidelines.
Category: FoundationX
Astellas is committed to equality of opportunity in all aspects of employment.
EOE including Disability/Protected Veterans