Compensation: The expected salary range for this role is between $80,000 and $110,000, depending on experience and qualifications.
Reason for Opening: Net New position
AI is used to screen, assess, or select applicants for this role.
InnoSoft Canada
InnoSoft is hiring a QA Lead who will be deep in the work — writing automation, hardening test infrastructure, fixing what's broken in our quality practices, and building the technical muscle of the QA team while doing it. This is not a role where you draw diagrams of how testing should work. It's a role where you build it, prove it works, and then teach others to do the same.
The QA Lead will carry a meaningful hands-on workload while also owning the systems, tools, and practices that make everyone around them more effective. You will ship automation that the team trusts, fix the gaps in our coverage that keep us up at night, and raise the technical floor across the entire QA function — one real improvement at a time.
About InnoSoft
InnoSoft Canada is a software product company building recreation management solutions under the Jonas Software Group, a subsidiary of Constellation Software Inc. (CSI), an international software provider with revenues exceeding $3.5 billion USD and 20,000+ employees worldwide. Our products — Fusion, FusionGO, Fusion Wave, Fusion Play, and Fusion Club — serve communities by enhancing access to recreation through technology. We believe in purpose-driven work and engineering excellence.
What You'll Do
Write, maintain, and extend automated test suites across the Fusion product suite. You will be the most prolific contributor to our automation codebase, not just the person who reviews it.
Design automation that is reliable, fast, and easy for other QA engineers to extend. Flaky tests, brittle locators, and slow suites are problems you take personally and fix directly.
Own the integration of automated tests into CI/CD pipelines — ensuring that quality gates run on every build, provide clear pass/fail signals, and don't become bottlenecks that the engineering team learns to ignore.
Build and maintain shared test utilities, data factories, and helper libraries that reduce duplication and make it faster for the whole team to write good tests.
Stand up performance and load testing capabilities where they don't exist today. Start with the highest-risk surfaces and build from there.
Conduct a thorough assessment of current test coverage across Fusion, FusionGO, Fusion Wave, Fusion Play, and Fusion Club. Identify the areas where we're exposed — undertested integrations, missing regression coverage, manual-only workflows that should be automated — and start closing those gaps.
Apply risk-based prioritization to decide where to invest effort first. Focus coverage on the paths that carry the most business risk: high-traffic workflows, payment processing, data integrity touchpoints, and areas with a history of escaped defects.
Own defect analysis. When bugs escape to production, trace them back to root cause — was it a gap in test design, a missing test environment, a flawed assumption in requirements? Use what you find to make targeted, practical improvements.
Work directly with developers to improve testability in the codebase. If a component is hard to test, work with the team to refactor it — don't just work around it.
Evaluate our current QA workflows — test planning, execution, defect triage, release sign-off — and improve them based on what you see firsthand, not based on theory. You'll know what's broken because you're doing the work.
Embed QA earlier in the development cycle. Participate in sprint planning, review requirements and acceptance criteria with product and engineering, and surface testability concerns before code is written — not after.
Define and track a small set of meaningful quality metrics: defect escape rate, automation coverage, regression cycle time, and mean time to detect. Use these to identify specific problems and measure whether your fixes are working.
Streamline release sign-off so it's fast, data-driven, and trusted. The goal is a process where everyone — QA, engineering, product — has confidence in what we're shipping without unnecessary ceremony.
Pair with QA engineers regularly. Work alongside them on real tasks — automation development, test design, debugging — and use those sessions to teach technique, not just complete tickets.
Assess each QA engineer's current technical strengths and gaps. Build practical growth plans focused on the skills that will have the most impact: automation fluency, API testing, exploratory testing technique, and debugging methodology.
Introduce better practices by demonstrating them in your own work first. Code reviews, well-structured test plans, clear defect reports — set the standard by doing it, then help others follow.
Make sure QA engineers on the satellite products (Fusion Play, Fusion Club) are not developing skills in isolation. Pull them into automation work, code reviews, and technical discussions with the broader team.
Evaluate and introduce AI-assisted testing tools where they produce practical value — test generation, maintenance reduction, coverage analysis — and train the team to use them effectively.
Serve as the QA voice in sprint reviews, release planning, and incident retrospectives — bringing specific data and concrete observations, not abstract quality concerns.
Coordinate directly with tech leads and senior engineers to align testing timelines with delivery schedules. Push back clearly when timelines put quality at unacceptable risk, and frame the tradeoff in terms the business can act on.
Help product managers understand the quality cost of scope and timeline decisions. Your input should make these conversations sharper, not slower.
What This Role Is Not
Not a management role that delegates all technical work. You carry a significant hands-on workload and lead by doing.
Not a project manager — sprint and delivery management stays with tech leads and their teams.
Not a gatekeeper who blocks releases on instinct. Your decisions are backed by data, specific risk analysis, and clear communication.
What You'll Need
Bachelor's degree in Computer Science, Software Engineering, or a related technical field; an equivalent combination of education and experience will be considered.
7+ years of progressive QA experience, with at least 5 years in a senior hands-on QA engineering or QA lead capacity where you were still writing automation and doing technical work — not just managing it.
Proven experience in multi-product software environments where you personally built or significantly improved test automation infrastructure.
A track record of measurable quality improvements that you can walk through in detail — what you found, what you built, and what changed as a result.
Strong, current hands-on skills in test automation frameworks (e.g., Selenium, Playwright, or equivalent). You should be able to sit down on day one and start writing effective automation in our stack.
Experience building shared test infrastructure — frameworks, utilities, data management, environment configuration — that other engineers rely on daily.
Solid API testing skills (REST) with tools like Postman or equivalent, and the ability to design API test suites that cover integration points thoroughly.
Practical experience integrating automated tests into CI/CD pipelines, including configuring test stages, managing test data, and troubleshooting pipeline failures.
Experience with performance and load testing tools and the ability to stand up baseline performance testing where none exists.
Strong defect analysis skills — you can trace an escaped bug back to its root cause and turn that analysis into a specific improvement in coverage or process.
Proficiency with defect tracking and workflow tools (e.g., Jira or similar).
Experience with AI-assisted testing tools and a practical perspective on where they help and where they don't.
Deep understanding of the software development lifecycle and strong opinions, grounded in experience, about where QA involvement has the highest return.
Extensive experience with Agile methodologies. You know how to integrate QA meaningfully into sprint ceremonies — planning, refinement, retrospectives — without turning QA into a bottleneck.
A natural teacher. You improve the people around you by working alongside them, not by assigning training modules.
Clear, direct communication. You can explain a quality risk to a product manager, walk a developer through a test design decision, and present release readiness to leadership — all without jargon.
Process-minded but allergic to unnecessary overhead. You fix workflows by simplifying them, not by adding layers.
Self-directed. You identify the highest-value problem, build a plan to fix it, and execute — without waiting for permission or detailed direction.
How You Operate
You are a strong automation engineer in your own right. You don't just direct — you build, debug, and ship.
You see broken infrastructure, missing coverage, or a slow pipeline and you fix it. You don't write a proposal about fixing it.
You raise the team's capabilities by working alongside them on real problems, not by lecturing.
Practical Prioritization
You focus on the improvements that will have the most impact right now, not the ones that look best on a roadmap.
You identify patterns and shared problems across products and solve them once, well, rather than separately in each silo.
You translate quality problems into business language. When you raise a risk, people understand it and act on it.
This role sits parallel to product tech leads — it owns the quality dimension across the engineering organization the same way a tech lead owns the delivery dimension for their product. The QA Lead reports to the Director of Engineering and participates in the technical board for cross-product decisions that affect architecture, shared tooling, or release governance.
The key difference in this role is that ownership is exercised primarily through execution. You set the standard by being the strongest practitioner on the team, and you earn influence by delivering results that everyone can see.
What Success Looks Like
Meaningful automation coverage added to the highest-risk, lowest-coverage areas of the product suite — with tests running reliably in CI/CD and trusted by the engineering team.
At least one significant coverage gap or systemic quality problem identified, addressed, and validated with data showing improvement.
QA metrics baseline established and actively used — defect escape rate, automation coverage, and regression cycle time tracked consistently, with at least one concrete process change driven by the data.
Visible improvement in the technical output of the QA team — engineers writing better automation, producing sharper defect analysis, and contributing to shared test infrastructure.
Shift-left practices adopted in at least two product teams — QA participating in sprint planning, reviewing requirements, and defining acceptance criteria before development begins.
A shared QA standards document that all products reference, built from what you've learned by doing the work across the suite.
About Jonas Software
Jonas Software is a leading provider of enterprise management software solutions, serving a wide range of vertical markets including hospitality, healthcare, construction, education, personal care, fitness, leisure, moving and legal services, to name a few. Within these markets, Jonas is comprised of over 65 distinct brands, each a respected leader in its domain.
Jonas’ vision is to be the branded global leader across these verticals and to be recognized by customers and industry stakeholders as the trusted provider of “Software for Life.” We are committed to technology, product innovation, quality, and exceptional customer service.
Jonas Software supports over 60,000 customers in more than 30 countries. We employ over 6,000 skilled professionals, including industry experts and technology specialists. Across our broader network, we support a global workforce of more than 30,000 employees.
Headquartered in Canada, Jonas Software has a global footprint with offices around the world. We’re a 100% owned subsidiary of Constellation Software Inc., based in Toronto, publicly listed on the TSX (CSU.TO), and a member of the S&P/TSX 60 Index.