Quality Analyst in Bangalore at Trellance LLC
Job Description
Job Title: Quality Analyst
Department: CUDX
Reporting to: Manager
Company Detail
CU Rise Analytics Pvt. Ltd. is a wholly owned subsidiary of Trellance, Inc. and serves as its offshore development centre, working collaboratively to offer comprehensive data analytics, technology and talent solutions that allow our credit union clients to provide high-quality service to their members and remain competitive. Our core expertise lies in data science and technology, encompassing data analytics, predictive modelling, business intelligence, and technology services.
Trellance Cooperative Holdings, Inc. is a credit union cooperative and leading technology partner for credit unions. Its companies – CU Rise Analytics, ProBridge, Optiri and CUDX – provide innovative technology solutions that help credit unions increase efficiency, manage risk and improve the member experience. Trellance’s mission is to ensure credit unions have the tools and resources they need to grow, enhance member value and remain competitive in a rapidly evolving financial landscape.
Learn more at Trellance.com
Job Summary:
The Mid-Level QA Engineer ensures the quality, reliability, and correctness of our contributor-facing and admin portals, REST APIs, and backend services. You will own functional testing across frontend workflows, API contracts, and backend data assertions — ensuring the platform behaves correctly for credit union contributors and internal staff alike. Your work spans UI validation, API testing, file ingestion verification, authentication boundary testing, and database-level assertions against Snowflake. As the platform evolves, you will grow into data pipeline and ETL testing, building full-stack quality coverage across a cloud-native Azure data platform handling sensitive financial data.
This role is ideal for someone with a strong foundation in application-level functional testing who is eager to grow their skills across the full data stack — from UI and API validation through to pipeline and transformation testing.
Role & Responsibilities:
- Conduct UI/platform testing to validate workflows, data‑driven screens, user interactions, and correct rendering of data returned from backend APIs.
- Test backend platform services, event flows, access controls, and error handling behaviours.
- Validate authentication and authorisation boundaries — including role‑based access controls, session behaviour, and expected access restrictions across user types and portals.
- Test file‑based ingestion paths including SFTP uploads — validating file acceptance, rejection handling, schema conformance, and correct downstream processing for both well‑formed and malformed inputs.
- Perform API testing for schema correctness, parameter validation, filtering logic, pagination, and expected outputs.
- Execute automated regression suites and validation scripts, and contribute to extending automation coverage.
- Log defects with clarity, analyse root causes, and collaborate with engineering to drive timely resolution.
- Prepare and maintain test environments, seed representative test data, and support environment promotion readiness.
- Maintain clear test documentation including cases, scenarios, results, and repeatable validation procedures.
- Participate in requirement reviews, refining acceptance criteria and identifying data or testing gaps early in the workflow.
- Assist in validating release readiness through smoke testing, sanity checks, and verification of critical platform features.
- Write SQL queries to set up test data, perform backend assertions, and verify application behaviour against expected results.
- Assist in source‑to‑target reconciliation and data completeness checks — documenting discrepancies, unexpected schema changes, and environment‑specific anomalies for engineering review.
- Grow into pipeline testing — support validation of data transformation outputs and pipeline flow correctness as you build ETL testing skills.
- Support validation of data model correctness, including key constraints, relationship integrity, and query output accuracy.
- Learn to query Snowflake or similar cloud data platforms to validate transformation results and confirm data quality.
Skills & Qualifications:
Must-Have:
- 3–5 years of experience in application‑level functional testing, including frontend workflow validation, API testing, and backend/database testing.
- Hands‑on experience with UI test automation frameworks (e.g. Selenium, Playwright, or Cypress) and API testing tools (e.g. Postman or RestAssured).
- Demonstrated ability to validate end‑to‑end user workflows in web applications, covering form validation, navigation flows, state transitions, and correct rendering of data returned from backend APIs.
- Experience performing API testing (schema validation, parameters, filtering, pagination, response correctness).
- Ability to test backend services, event flows, and error-handling scenarios.
- Experience building or maintaining automated test suites using frameworks such as JUnit, TestNG, pytest, or Cucumber.
- Working knowledge of SQL for data assertions, test data preparation, and result verification.
- Awareness of data sensitivity and PII handling practices — including the importance of using synthetic or anonymised test data and the risks of using production data in non‑production environments.
- Strong analytical and problem‑solving skills with experience performing root cause analysis.
- Ability to work effectively in agile environments with sprints, story refinement, and release cycles.
- Strong communication and documentation skills for reporting defects and test outcomes.
Preferred:
- Familiarity with Azure services relevant to testing — including App Service log streaming, Event Grid / Storage Queue message inspection, Blob Storage file validation, and Azure Monitor / Log Analytics for tracing issues across environments.
- Basic familiarity with containerised deployments — ability to read Docker / container logs and correlate application behaviour with container startup and runtime output.
- Familiarity with Java‑based backend services (Spring Boot) as a context for API test design and log interpretation.
- Exposure to ETL or data pipeline concepts and basic pipeline validation techniques.
- Awareness of cloud data platforms such as Snowflake and the ability to run basic SQL queries against them.
- Familiarity with transformation tools such as dbt or similar; prior hands‑on use is a strong plus.
- Basic understanding of data warehouse concepts (e.g. fact and dimension tables, data freshness) without requiring expertise in dimensional modelling.
- Familiarity with metadata, data lineage, or schema evolution concepts in a data platform context.
- Experience with performance or load testing tools (e.g. JMeter, k6, Locust).
- Awareness of data governance practices, quality gates, or data observability tools.
- Exposure to CI/CD pipeline integration for automated test execution (e.g. Azure Pipelines, GitHub Actions, Jenkins).