Posts in Test Management

cGMP – Design and Development Master Validation Plan (10)

July 15th, 2019 Posted by Requirements Management Tool, Test Management, Validation and Verification 0 thoughts on “cGMP – Design and Development Master Validation Plan (10)”

People in the medical device industry are often wary of the term validation, even though they shouldn't be. Validation is the use of objective evidence and testing to confirm that a set of requirements is met for the intended product or service use. As it pertains to device validation, the objective is to match the specifications with user needs and the intended application. Validation is therefore proof that manufacturing a device with a specific process will meet both the device requirements and user needs.

Validation is a building block for verifying the quality of a product, so the product tested during validation must be representative of the final product. According to ISO 13485, companies must keep records of the product used during validation.

Validation MVP

Just as the design control process starts with a plan, validation must take the same approach. The plan is often extensive, covering several areas, which earns it the name Validation Master Plan (VMP) or Master Validation Plan (MVP). It is best to start the validation plan early in the design process. The plan should pinpoint what will satisfy the acceptance criteria, including:

  • Methodologies
  • Performance properties
  • Validation activities

Likewise, the validation plan should be reviewed to catch risks and deficiencies.

“The First Article”

"First article" is the common name for the first set of products, produced either as a serialized batch or as an initial (validation) batch. In some cases, validation reports document the properties of the first articles; there may also be a separate first article report. Forgetting to include labeling and packaging in the validation process is a common oversight.

It is crucial that companies include packaging in their validation plan. Its effect on product performance is significant and difficult to measure: some packaging can build up an electrostatic charge or leach material into a sterile product. Testing the packaging can help prevent such occurrences. Similarly, the validation plan should include labels, since environmental conditions can cause labels to fail, leaving the product bare and unidentified.

Inclusion of clinical trials in product validation is optional and depends on the type of product; nevertheless, some form of clinical evaluation should be performed. Validation should also account for worst-case scenarios, using simulations that mimic the conditions the product will face. Possible simulations can test for the following (a simple test-matrix sketch follows the list):

  • Vibration and shock
  • Temperature
  • Humidity
  • Other conditions related to transportation or storage of the product
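
Such a matrix is often easiest to review as structured data. Below is a minimal Python sketch of what a worst-case simulation matrix might look like; the condition names, profiles, and limits are purely illustrative and are not drawn from any standard or from the plan described above.

# A hypothetical worst-case simulation matrix for a validation plan.
# The profiles and limits below are illustrative placeholders only.
simulation_matrix = [
    {"test": "Vibration and shock", "profile": "random vibration + drop test", "represents": "transportation"},
    {"test": "Temperature",         "profile": "-20 °C to +50 °C cycling",     "represents": "storage"},
    {"test": "Humidity",            "profile": "85% relative humidity, 72 h",  "represents": "storage"},
]

for row in simulation_matrix:
    print(f'{row["test"]}: {row["profile"]} ({row["represents"]})')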

Finally, validation should take the following customers/users into account:

  • Operators
  • Patients
  • Caregivers (nurses/doctors)
  • Other relevant parties

MVP and VMP

Orcanos Master Validation Plan

Related Links

Effective baseline management for ORCANOS | ALM Test Management

June 23rd, 2016 Posted by IEC 62304, Test Management, Tip Of The Week, Validation and Verification 0 thoughts on “Effective baseline management for ORCANOS | ALM Test Management”

ORCANOS | ALM version and baseline management behaves much like source control: it provides powerful and intuitive tools to manage and track versions, keeps a full change history, prevents duplication, and allows reuse through pointers (copy as link).

Overview

Rather than managing versions manually, ORCANOS | ALM has a built-in version management engine that stores multiple version views for a single test case or a group of test cases and handles each version separately, all without duplicating information.

Test Execution

For instance, historical step results can be viewed as they were at the time the test was executed. The ORCANOS | ALM "Execution Set" is used to group tests for execution. Once executed, the execution set stores historical data for each test cycle, and the full change history can be viewed in the "Execution History" tab.
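
As a rough illustration (not the actual ORCANOS data model), the following Python sketch shows how an execution set entry might accumulate results per cycle instead of overwriting them; the class and field names are assumptions made for the example.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CycleResult:
    """Result of one test in one execution cycle (illustrative fields)."""
    cycle: str
    status: str              # e.g. "Passed" / "Failed"
    executed_at: datetime

@dataclass
class ExecutionSetEntry:
    """A test grouped into an execution set, with its accumulated history."""
    test_case_id: str
    history: List[CycleResult] = field(default_factory=list)

    def record(self, cycle: str, status: str) -> None:
        # Each cycle appends to the history rather than overwriting it,
        # so earlier results remain viewable (cf. the "Execution History" tab).
        self.history.append(CycleResult(cycle, status, datetime.now()))

entry = ExecutionSetEntry("TC-101")
entry.record("Cycle 1", "Failed")
entry.record("Cycle 2", "Passed")
print([(r.cycle, r.status) for r in entry.history])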

Version views and baseline

Unlike other test management tools, which commonly duplicate test cases physically when a project version advances, ORCANOS | ALM maintains pointers to the previous version (which acts as a baseline) and tracks changes in the current baseline. Test case information from older versions is kept, and when a test needs to be updated, Orcanos provides a BRANCH option that splits the test case into two instances: one in the previous version and one in the current version. Both instances are linked, so tracking changes is easy.

In summary, BRANCHING test cases creates an instance of the test case in the new version while preserving the original as a baseline. Branched versions can also be managed and searched easily by setting up multiple views to query the data.
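
The pointer-plus-branch idea can be sketched in a few lines of Python. This is only a conceptual model, not the ORCANOS implementation; the names TestCaseInstance and branch are invented for the example.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestCaseInstance:
    """One instance of a test case within a project version (illustrative)."""
    test_id: str
    version: str
    steps: List[str]
    branched_from: Optional["TestCaseInstance"] = None   # pointer back to the baseline

def branch(baseline: TestCaseInstance, new_version: str, new_steps: List[str]) -> TestCaseInstance:
    # The baseline instance is left untouched; the new version gets its own
    # copy of the steps plus a link to the instance it was branched from.
    return TestCaseInstance(baseline.test_id, new_version, list(new_steps), branched_from=baseline)

v1 = TestCaseInstance("TC-7", "1.0", ["login", "verify dashboard"])
v2 = branch(v1, "2.0", ["login", "verify new dashboard widgets"])
assert v2.branched_from is v1 and v1.steps == ["login", "verify dashboard"]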

Preserving Test Case Base Line as Test Runs

ORCANOS | ALM uses both test cases and test runs. Test cases define the set of conditions, actions, expected results, and other criteria used to determine whether a product component works correctly and meets its traced, specified requirements. Test cases can change over time to reflect functional changes in your product requirements, and such changes need to be identified easily and reflected in the correct baseline under test.

Execution runs, on the other hand, are snapshots of test cases generated at a milestone in the testing cycle, such as when the development team delivers a new software release. A test run contains all the information from the related test case, plus the results of a specific instance of the test.

So while the test case may change in the future to reflect new logic, the test run steps that were executed in the past remain in the execution run's history and never change, in line with the strictest regulatory standards for electronic records, such as 21 CFR Part 11. By saving test runs, managers and auditors can look back on the exact steps followed during testing, even years after the tests have been completed.

A single test case can have one or more related execution runs, such as Functional, Progression, Regression, Load, and Stability, to mention a few, depending on the test variants defined by the parameters selected when the runs are executed. For example, if an application supports multiple browsers, such as Chrome, Edge, and Firefox, these variants can be selected when you execute the test, creating a separate test run for each selected variant. The test runs contain identical information, except for the variant value, which indicates the browser used in testing.
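
The snapshot-per-variant behavior can be approximated with a small Python sketch; the TestRun structure below is an assumption for illustration, not the ORCANOS schema, but it shows how one test case yields one immutable run per selected variant.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)
class TestRun:
    """Immutable snapshot of a test case execution (illustrative)."""
    test_case_id: str
    steps: Tuple[str, ...]
    variant: str
    result: str = "Not Run"

def create_runs(test_case_id: str, steps: List[str], variants: List[str]) -> List[TestRun]:
    # One run per selected variant; each run carries the same step snapshot
    # and differs only in the variant value (here, the browser under test).
    return [TestRun(test_case_id, tuple(steps), v) for v in variants]

runs = create_runs("TC-42", ["open login page", "sign in", "check layout"],
                   ["Chrome", "Edge", "Firefox"])
print([r.variant for r in runs])   # ['Chrome', 'Edge', 'Firefox']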

 

Viewing the Full Change History

As noted, test cases change over the course of the development cycle. To view these changes, select a test case and click the Execution History tab to see a detailed change report, including who made each change, when it was made, and what was changed.

Change reports display the content added to and removed from a test case (or any item, for that matter) each time it is saved. ORCANOS | ALM keeps audit logs of changes according to best practices of regulated design change management. These reports identify the specific changes made to a test case over time. Change reports are always turned on and available, providing both historical item information and a detailed audit trail for test cases.
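
A change report of this kind boils down to recording, on every save, who changed which field, when, and what was removed and added. The Python sketch below is a minimal stand-in for such an audit-trail entry; the field names are illustrative, not the ORCANOS log format.

from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class ChangeRecord:
    """One audit-trail entry for an item save (illustrative fields)."""
    item_id: str
    changed_by: str
    changed_at: datetime
    field_name: str
    removed: str
    added: str

def log_save(item_id: str, user: str, field_name: str,
             old_value: str, new_value: str, log: List[ChangeRecord]) -> None:
    # Every save appends a record of what was removed and what was added,
    # so the full change history can be reconstructed later for an audit.
    log.append(ChangeRecord(item_id, user, datetime.now(), field_name, old_value, new_value))

audit_log: List[ChangeRecord] = []
log_save("TC-42", "alice", "expected_result", "status 200", "status 201", audit_log)
print(audit_log[0].changed_by, audit_log[0].removed, "->", audit_log[0].added)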

 


Once you are in the Change Report window, you will see the content that has been added to or removed from the test case. You can also view the electronic signature where changes have been made and signed (in ORCANOS | ALM 2.0 and later), provided you have permission to view the audit log. If there is an attachment, a link to it will be visible.

Reuse Test Cases

If you want to add test cases that share the same basic information, ORCANOS | ALM saves time by carrying the test over to the next baseline automatically. You can then decide whether to keep it as it is, make changes, or remove parts of it in that next baseline version. There is no need to duplicate an existing test case and then edit the copy: simply select the test case from the next baseline version in the product tree module and press the Edit button to edit it.

ORCANOS | ALM gives you the option to link the altered test case with the original test case in a new execution set. You can also specify traceability from those test cases to a requirement, and any change to existing traceability will be managed based on the baseline it is associated with.

For example, you might create a baseline test case in version 1.0 traced to a requirement in 1.0, and then change that test case in version 2.0. ORCANOS | ALM will alert you that there is a suspicious link needing attention, allowing you to easily identify and manage the change.
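
Conceptually, a suspicious link is simply a traceability link whose endpoints have moved to newer revisions since the link was created. The sketch below illustrates that idea in Python; the TraceLink structure and revision numbers are assumptions for the example, not the ORCANOS data model.

from dataclasses import dataclass

@dataclass
class TraceLink:
    """Traceability link between a requirement and a test case (illustrative)."""
    requirement_id: str
    requirement_revision: int
    test_case_id: str
    test_case_revision: int
    suspicious: bool = False

def flag_if_changed(link: TraceLink, current_req_rev: int, current_tc_rev: int) -> TraceLink:
    # If either end of the link has advanced to a newer revision since the
    # link was established, mark it for review.
    link.suspicious = (current_req_rev != link.requirement_revision or
                       current_tc_rev != link.test_case_revision)
    return link

link = TraceLink("REQ-10", 1, "TC-7", 1)
flag_if_changed(link, current_req_rev=1, current_tc_rev=2)   # test case changed in 2.0
print(link.suspicious)   # True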


BRANCHING test cases also copies the information you select from the original test case. Apart from the results, all other information, including attachments, is always included in BRANCHED test cases. The newly branched test case has a new baseline for its execution runs, kept alongside the previous version's results. This lets you measure feature maturity by looking at the test's history across executed versions and seeing it stabilize. The test's history is kept through both version 1.0 and 2.0 in the same audit log, along with file attachments, links, and more from the original test case.

Original Test Case (screenshot)

Test Case after Branch (screenshot)

For more information about duplicating test cases, along with step-by-step instructions, see ORCANOS | ALM online help.

Querying Using Views

The ORCANOS | ALM view designer can be used to easily manage and search through different versions of test cases. In the View dialog box, simply select "test case" from the dropdown list, add criteria to the view by selecting the VERSION field, and use the operands to search the data as you wish.

The end result should look something like this:

(Screenshot of the resulting view)

You can use this field criterion to identify the version for which each test case is valid and make it visible at a glance in the test case list window. In the example below, you can see test cases and their versions, whether they were BRANCHED, and the requirement each test case is traced to.

(Screenshots: Test Case Versioning in ORCANOS)

In this example, the user has also displayed the Version Baseline Test Case Traceability column, which shows which test cases are linked. Clicking the blue bar links directly to the traced item.
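
For readers who think in code, the field/operand filtering that a view performs can be approximated with a few lines of Python. The dictionary fields and operands below are illustrative only and are not the ORCANOS query engine.

from typing import Dict, List

def filter_view(test_cases: List[Dict], field: str, operand: str, value) -> List[Dict]:
    """A minimal stand-in for a view query: field + operand + value."""
    ops = {
        "=":  lambda a, b: a == b,
        "!=": lambda a, b: a != b,
    }
    return [tc for tc in test_cases if ops[operand](tc.get(field), value)]

cases = [
    {"id": "TC-1", "VERSION": "1.0", "branched": False},
    {"id": "TC-2", "VERSION": "2.0", "branched": True},
]
print(filter_view(cases, "VERSION", "=", "2.0"))   # only TC-2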

When the 3.0 update is released, ORCANOS | ALM will advance the project, and all test cases from versions 1.0 and 2.0 will automatically remain available and be managed separately. Any current test case that does not change, or does not need an update, keeps its original version. If a test case does need an update, the user adds it to an execution run built for version 3.0, without duplicating the test case, executes it, and it is then marked as valid for 3.0.

Creating a general custom field allows you to filter by many other attributes and build KPI measurements of your overall productivity. ORCANOS | ALM also supports email notifications about changes based on product version, field-level security, and more.

Again, for more information about creating custom fields, see ORCANOS | ALM online help.

Folders are another option for managing test case versions in ORCANOS | ALM. Check out the online support and learning resources at www.orcanos.com for more information.

Practical Workshop on Quality Assurance and Regulation of Medical Devices

June 4th, 2015 Posted by 510(k), CE Marking, FDA, Risk Management, Software Lifecycle Management, Test Management 0 thoughts on "Practical Workshop on Quality Assurance and Regulation of Medical Devices"


QPack Web – making it work with distributed teams

October 9th, 2014 Posted by ALM 2.0, Collaboration, Distributed development, IEC 62304, Requirements Management, Test Management 0 thoughts on “QPack Web – making it work with distributed teams”

Recently, I have been getting lots of queries along the lines of "How can QPack help us manage distributed teams?"

Well, QPack has all it takes to manage and control a distributed development project.

In this post I will briefly describe the main QPack tools used for distributed development.

Later on, I will share our best practices and some tips on how to start and how to make it work.

So, whether you are a project manager, product owner, software manager, tester, or developer, you can use QPack's collaborative tools and methodology to build productive, distributed development teams.

(Screenshot: QPack dashboard)

Web based system

First, to implement a good collaborative environment, the QPack web interface should be used, at least for the end users.

QPack Web is HTML5 based and works in any browser on any operating system, making it accessible anywhere, anytime.

Integrated ALM system with one repository

QPack is by nature an integrated ALM system. The QPack suite offers all the modules required to manage a project, from market requirements definitions to system and software requirements, detailed design, test plans, test cases, and test execution with defect tracking and task management, so every participant in the process has their own interface while everyone shares one central repository.

Full permissions and personalization

In the QPack Web admin interface it is very easy to set up user profiles, and an admin can decide with just a few clicks who can see what and who can do what.

Instant messaging

A unique, integrated instant messaging feature allows every participant in the process to share ideas, ask questions, and provide information.

What makes QPack instant messaging unique is that every discussion is saved in the audit log of the specific work item's history (such as a customer requirement or a defect). Users can later go back and track the decisions taken. It effectively replaces conversations, emails, and uncontrolled documents.

Queries, Alerts and notifications

By defining queries and alerts, it is very easy to track information and receive notifications in "push" mode, where QPack sends alerts containing activities to perform based on predefined rules.
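
In essence, a push-mode alert is a predefined rule evaluated against each work item, with matching items generating a notification. The Python sketch below shows that pattern in miniature; the rule names and item fields are invented for the example and are not QPack's actual rule engine.

from typing import Callable, Dict, List

Rule = Callable[[Dict], bool]

# Illustrative rules only: each maps an alert name to a predicate over a work item.
rules: Dict[str, Rule] = {
    "High-severity defect opened": lambda item: item.get("type") == "defect" and item.get("severity") == "high",
    "Requirement changed":         lambda item: item.get("type") == "requirement" and item.get("changed", False),
}

def alerts_for(item: Dict) -> List[str]:
    # Evaluate every predefined rule against the item and collect the alerts to push.
    return [name for name, rule in rules.items() if rule(item)]

print(alerts_for({"type": "defect", "severity": "high"}))   # ['High-severity defect opened']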

Dashboards

Whether you are a developer, tester, or manager, you can personalize your dashboard accordingly.

You can set up any type of report.

 

Software Maturity Performance

July 5th, 2013 Posted by KPI, Software Lifecycle Management, Test Management 0 thoughts on “Software Maturity Performance”

IEEE describes measuring the maturity of software through the changes made to it across three dimensions. In this post we wish to present the relation between the tests being performed (regardless of their results) and defect discovery during the same period.

 

Test metrics can be used to measure test coverage prior to software delivery. Functional test coverage, for example, measures the percentage of the software tested at any point during testing.

It is calculated as follows:
Functional Test Coverage = FE / FT
where:
FE is the number of test requirements covered by test cases executed against the software
FT is the total number of test requirements
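
As a quick worked example, the coverage formula above can be computed directly; the numbers used here are illustrative.

def functional_test_coverage(fe: int, ft: int) -> float:
    """Functional Test Coverage = FE / FT, returned as a fraction."""
    if ft == 0:
        raise ValueError("FT (total test requirements) must be greater than zero")
    return fe / ft

# e.g. 45 of 60 test requirements covered by executed test cases -> 75% coverage
print(f"{functional_test_coverage(45, 60):.0%}")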

Software Maturity Metric

The Software Maturity Index (SMI) can be used to determine the readiness for release of a software system. This index is especially useful for assessing release readiness when changes, additions, or deletions are made to existing software systems. It also provides a historical index of the impact of changes. It is calculated as follows:
SMI = (Mt – (Fa + Fc + Fd)) / Mt

where:

SMI is the Software Maturity Index value
Mt is the number of software functions/modules in the current release
Fa is the number of functions/modules added relative to the previous release
Fc is the number of functions/modules changed from the previous release
Fd is the number of functions/modules deleted from the previous release
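
A short worked example of the index, with illustrative module counts:

def software_maturity_index(mt: int, fa: int, fc: int, fd: int) -> float:
    """SMI = (Mt - (Fa + Fc + Fd)) / Mt"""
    if mt == 0:
        raise ValueError("Mt must be greater than zero")
    return (mt - (fa + fc + fd)) / mt

# e.g. 120 modules in the current release: 5 added, 10 changed, 3 deleted
print(round(software_maturity_index(mt=120, fa=5, fc=10, fd=3), 2))   # 0.85 -> approaching release readiness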
Reliability Metrics

Looking at the image below, you will find that some of the above calculations can also be reflected in a simple presentation of the tests executed across all software releases, regardless of module additions, changes, or deletions, alongside the trend of defect discovery.

In the first period of the software lifecycle, the number of tests executed is directly reflected in the number of defects discovered during the same period. In the later periods, as the software matures, even a dramatic increase in testing effort no longer produces the same rate of defect discovery as in the first period.
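
One simple way to express this relation numerically is defects found per test executed in each period: as the software matures, the ratio falls even though testing increases. The Python sketch and numbers below are illustrative only, not data taken from the chart.

def defect_discovery_rate(tests_executed, defects_found):
    # Defects found per test executed, per reporting period (illustrative metric).
    return [d / t if t else 0.0 for t, d in zip(tests_executed, defects_found)]

# Early periods: few tests, many defects. Later periods: heavy testing, few defects.
tests   = [50, 80, 200, 400]
defects = [40, 45,  30,  10]
print([round(r, 2) for r in defect_discovery_rate(tests, defects)])   # falling ratio -> maturing software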

 

QPack Analytics™ report based on OBIEE technology 

(Chart: software maturity – test runs executed vs. defect discovery rate)

 

 

QPack Medical Webinar, October 2012

November 11th, 2012 Posted by IEC 62304, ISO 13485, ISO 14971, Requirements Management, Risk Management, Software Lifecycle Management, Test Management, Validation and Verification 0 thoughts on “QPack Medical Webinar, October 2012”

Can we link test case to hazard for mitigation?

June 28th, 2011 Posted by ISO 14971, Risk Management, Test Management 0 thoughts on “Can we link test case to hazard for mitigation?”

On some occasions, a test case is traced directly to a risk in order to ensure mitigation. I believe this use of traceability for risk mitigation is wrong.

The correct flow is to add a risk control, and then add traceability from the test case to the risk control (which can be a software requirement, a user manual reference, etc.).

The test case is then used to verify that the implemented control actually reduces the risk.

So the test case is used for verification of the risk control and NOT as risk mitigation.
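
The resulting traceability chain, hazard → risk control → verifying test case, can be sketched as a simple data structure. The Python below is illustrative only; the item IDs and descriptions are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class RiskControl:
    """A risk control, e.g. a software requirement or user-manual reference."""
    control_id: str
    description: str
    verifying_test_cases: List[str] = field(default_factory=list)

@dataclass
class Hazard:
    hazard_id: str
    description: str
    controls: List[RiskControl] = field(default_factory=list)

# The test case is traced to the risk control, not directly to the hazard.
hazard = Hazard("HAZ-3", "Hypothetical hazard description")
control = RiskControl("RC-3.1", "Hypothetical control implemented as a software requirement")
control.verifying_test_cases.append("TC-88")     # the test case verifies the control
hazard.controls.append(control)
print(hazard.controls[0].verifying_test_cases)   # ['TC-88']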

Orcanos

Contact

8 Tozeret Ha'aretz Street
Tel Aviv, Israel
+972-3-5372561
info@orcanos.com

Copyright © Orcanos, All rights reserved. | Privacy policy | Terms of use