Posts in Validation and Verification

cGMP – Design and Development Master Validation Plan (10)

July 15th, 2019 Posted by Requirements Management Tool, Test Management, Validation and Verification 0 thoughts on “cGMP – Design and Development Master Validation Plan (10)”

People in the medical device industry are often wary of the term validation, even though they shouldn't be. Validation is the use of objective evidence and experiment to ensure that a set of requirements is met for a product's or service's intended use. As it pertains to device validation, the objective is to match the specifications with user needs and the intended application. Validation is therefore proof that using a specific process to manufacture a device will meet both device requirements and user demands.

Validation is the building block for verifying the quality of a product. As a result, the product tested during validation must represent the final product. According to ISO 13485, companies must keep a record of the product they use during validation.

Validation MVP

In the same way that the design control process starts with a plan, validation must take the same approach. The plan is often extensive, covering several areas, earning it the name Validation Master Plan (VMP) or Master Validation Plan (MVP). It is best to start the validation plan early in the design process. The plan should pinpoint what will satisfy criteria such as:

  • Methodologies
  • Performance properties
  • Validation activities

Likewise, the validation plan should be reviewed to catch risks and deficiencies.
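The plan elements above can be sketched as a simple data structure. This is an illustrative sketch only; the field names are assumptions, not an Orcanos or ISO-mandated schema.

```python
from dataclasses import dataclass, field

@dataclass
class MasterValidationPlan:
    """Illustrative structure for a Master Validation Plan (MVP)."""
    methodologies: list = field(default_factory=list)
    performance_properties: list = field(default_factory=list)
    validation_activities: list = field(default_factory=list)
    reviewed: bool = False  # plan review catches risks and deficiencies

    def ready_for_execution(self) -> bool:
        # All three criteria must be defined and the plan reviewed
        return (bool(self.methodologies)
                and bool(self.performance_properties)
                and bool(self.validation_activities)
                and self.reviewed)

mvp = MasterValidationPlan(
    methodologies=["simulated shipping test"],
    performance_properties=["flow accuracy within 5%"],
    validation_activities=["first-article inspection"],
)
print(mvp.ready_for_execution())  # False until the plan is reviewed
mvp.reviewed = True
print(mvp.ready_for_execution())  # True
```

The point of the sketch is that the review step is a gate, not an afterthought: a plan that lists methodologies, performance properties, and activities is still incomplete until it has been reviewed.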

“The First Article”

The first article is the common name given to the first set of products, either a serialized batch or an initial batch (validation batch). In some cases, validation reports document the properties of the first articles; there may also be a separate first article report. Forgetting to include labeling and packaging in the validation process is a common oversight.

It is crucial that companies include packaging in their validation plan. Its effect on product performance is enormous and difficult to measure. Some packaging can give off an electrostatic charge or cause material to leach into a sterile product; testing the packaging can help prevent such occurrences. Similarly, the validation plan should include labels. Environmental conditions can cause labels to fail, leaving the product bare and unbranded.

Inclusion of clinical trials in product validation is optional and depends on the type of product. Nevertheless, there should be some form of clinical evaluation. Validation should also account for worst-case scenarios, using simulations that mimic the conditions the product will face. Possible simulations can test for the following:

  • Vibration and shock
  • Temperature
  • Humidity
  • Other conditions related to transportation or storage of the product
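The worst-case simulations above can be organized as a simple test matrix: condition the product, then check its measured performance against specification limits. A minimal sketch; the condition values and acceptance limits are invented for illustration, not taken from any standard or device specification.

```python
# Hypothetical worst-case environmental conditions for a shipping simulation.
# All numbers are illustrative only.
WORST_CASE_CONDITIONS = {
    "vibration_grms": 1.15,           # random vibration level
    "shock_g": 50.0,                  # drop/shock pulse
    "temperature_c": (-20.0, 60.0),   # storage temperature extremes
    "humidity_pct_rh": 95.0,          # relative humidity
}

def passes(measured: dict, spec: dict) -> bool:
    """Check measured post-conditioning performance against spec limits."""
    for name, (lo, hi) in spec.items():
        if not (lo <= measured[name] <= hi):
            return False
    return True

# Functional results after conditioning vs. device spec (illustrative numbers)
spec = {"flow_rate_ml_h": (95.0, 105.0), "battery_v": (3.5, 4.2)}
measured = {"flow_rate_ml_h": 98.7, "battery_v": 3.9}
print(passes(measured, spec))  # True
```

A real protocol would trace each condition back to a transport or storage requirement, but the structure, worst-case inputs checked against pre-determined acceptance limits, is the same.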

Finally, validation should take into account the following customers/users:

  • Operators
  • Patients
  • Caregivers (nurses/doctors)
  • Other relevant parties


Orcanos Master Validation Plan

Related Links

cGMP – Medical Equipment Calibration – How it Affects our Success – ISO 13485:2016

June 17th, 2019 Posted by e-GMP, Requirements Management, Validation and Verification 0 thoughts on “cGMP – Medical Equipment Calibration – How it Affects our Success – ISO 13485:2016”

Calibration is an essential procedure for any equipment or device, in order to maintain and improve its accuracy and precision. Calibration is the process in which the equipment under test is compared with a reference standard in order to determine the accuracy of the unit being produced. The calibration of medical equipment is based on the same principle.
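The comparison principle can be shown in a short sketch: readings from the equipment under test are compared against reference standard values at several calibration points, and the error at each point is checked against an acceptance tolerance. The readings and the tolerance below are invented for illustration.

```python
def calibration_errors(unit_readings, reference_values):
    """Error of the unit under test at each calibration point."""
    return [round(u - r, 3) for u, r in zip(unit_readings, reference_values)]

def in_tolerance(errors, tolerance):
    """Pass only if every calibration point is within the tolerance."""
    return all(abs(e) <= tolerance for e in errors)

# Example: an infusion pump flow check against a reference flow meter (ml/h)
reference = [10.0, 50.0, 100.0]
unit      = [10.2, 49.6, 100.9]

errors = calibration_errors(unit, reference)
print(errors)                     # [0.2, -0.4, 0.9]
print(in_tolerance(errors, 1.0))  # True: all points within 1.0 ml/h
```

A real calibration record would also capture the reference standard's identity and its own calibration traceability, but the pass/fail logic reduces to this per-point comparison.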

Medical equipment calibration is essential to the success of a product, and demand for calibration planning systems is increasing, owing to factors such as the rising number of hospitals, increasing environmental regulations, and a growing customer focus on quality and precision. The purpose of this article is to help identify both the current state and the future of calibration in the medical device market.

Medical device calibration has two dimensions: service types and equipment types. The equipment types have a market in the following segments:


  • Infusion pumps
  • Fetal monitors
  • Ventilators
  • Imaging equipment
  • Vital sign monitors
  • Cardiovascular monitors etc.


Meanwhile, the service types have three major markets, namely:

  • In-house Calibration: the company's own professional personnel, mainly staff from the production line, perform the calibration.
  • Third Party Calibration Services: professionals outside the company perform the calibration for a fee.
  • OEM Calibration Services: the original equipment manufacturer performs the calibration; the owner needs to set up plans and notifications ahead of time.


Of all the devices above, imaging equipment generates the largest demand for calibration services among medical device producers. However, cardiovascular monitors are expected to keep growing at the highest rate to match demand.

Customers' increasing focus on quality, and the growing need for control over calibration planning and documentation in a strict compliance environment, are key factors expected to drive the growth of this demand.

The critical factors driving the demand for cardiovascular monitors include:

  1. Customers are focusing on quality.
  2. The need to control calibration planning.
  3. Strict compliance requires documentation.
  4. A rise in product recalls.


Reports from the FDA in the US show that product recalls grew from 763 in 2009 to 3,202 in 2017.

These recalls were attributed to software design failures, component and material issues, and packaging and labeling problems. Frequent product recalls damage a company's reputation, so companies are placing strong emphasis on the calibration of their products both before and after commercialization.

This is considered an important growth driver for this market by medical device manufacturers. In addition, rising demand for third-party and in-house calibration services is another important driver of the need for calibration planning systems such as Orcanos eQMS.

What could affect the implementation of a calibration system?

Some of the crucial factors include:

  • High capital costs
  • The use of modular instrumentation
  • Regional and local companies dominating the market


Medical equipment calibration services are segmented by region: North America, Europe, Asia-Pacific, and the Rest of the World (RoW).

Presently, the European region is the largest market in the world, owing to extensive R&D practices by the industry, a large number of local and regional players, and rapidly growing medical and healthcare infrastructure.

However, the Asia-Pacific region is expected to be the fastest-growing market during the forecast period 2019 – 2025. This growth is driven by rising demand for good-quality services, steadily expanding medical infrastructure, and increasing government regulation.


Orcanos offers these players greater potential by letting them collaborate with the vendor directly over the Orcanos eQMS cloud system to plan and execute the calibration program.

Some of the global service players include Fluke Biomedical, Tektronix, Inc., JPen Medical Ltd., NS Medical Systems, and Biomed Technologies, Inc., amongst others. However, these companies face stiff competition from various players operating at the regional level; hence, collaboration with or acquisition of a cloud system is considered an important strategy for players looking to grow in this market.

Related Links

cGMP – Design and Development Outputs (SwRS-MecRS-HwRS-FwRS) – ISO 13485:2016 (8) Clause 7

June 16th, 2019 Posted by e-GMP, Requirements Management, Validation and Verification 0 thoughts on “cGMP – Design and Development Outputs (SwRS-MecRS-HwRS-FwRS) – ISO 13485:2016 (8) Clause 7”

In the same manner that we have design and development inputs, we also have design and development outputs. The design output is the result of satisfying the criteria of the design input. The output includes risk assessments for the following:

  • Assembly drawings
  • The specification for raw materials and components
  • Design and process
  • Instruction for installation and service
  • Guideline for the assembly process
  • Specification for labeling and packaging
  • Source code and technical files
  • Biocompatibility studies
  • Results of verification activity
  • Validation activities such as sterility testing, reliability testing, shelf-life studies, and shipping studies.


The design and development output is also known as the first realized product. Depending on the type of product, it could be the first of several lines of assemblies or the first batch of products manufactured. The initial set of the first realized product must undergo evaluation checks to ensure that the design output requirements are met. Likewise, there will be serial number checks to ensure consistency in the process.

Orcanos ALM provides all the tools you need for complete coverage of design outputs, from product definition through change control and risk management according to ISO 14971:2012, with full traceability and impact analysis, all in the same tool.



Related Links

Preventing Potential Recall by Testing Right Your Product

June 23rd, 2018 Posted by Recall, Risk Management, Safety, Tip Of The Week, Validation and Verification, Workshops 0 thoughts on “Preventing Potential Recall by Testing Right Your Product”

At QA Geek Week, Orcanos gave a lecture based on selected examples of recalls it researched, presenting preventive-action methods within a validation and verification methodology. These recalls show how paying attention to simple, observable engineering faults can prevent harm to the patient. All are true stories from 2017 – 2018, a period for which a rise in recalls had already been reported: Q1 2018 was the largest recall quarter since 2005. Orcanos R&D continues to lead the market by investigating these events daily and integrating the findings into actions in the Orcanos ALM/QMS system, helping our customers prevent the next recall.

Try for Free NOW!



December 8th, 2017 Posted by ISO 14971, Risk Management, Validation and Verification 0 thoughts on “WHITE PAPER ACHIEVING ISO 26262 COMPLIANCE WITH ORCANOS ALM and QMS”

ISO 26262 is an automotive standard that places requirements on the quality of software, which tools such as ORCANOS ALM and QMS are ideally positioned to enforce. With high adoption in the industry and a strong heritage in safety-critical applications, ORCANOS ALM and QMS have been certified as “fit for purpose” for use by development teams wishing to achieve ISO 26262 compliance. This document describes the parts of the standard that are addressed by using ORCANOS ALM and QMS.

Read More:

Effective baseline management for ORCANOS | ALM Test Management

June 23rd, 2016 Posted by IEC 62304, Test Management, Tip Of The Week, Validation and Verification 0 thoughts on “Effective baseline management for ORCANOS | ALM Test Management”

ORCANOS | ALM version and baseline management behaves like source control: it provides powerful and intuitive tools to manage and track versions, keeps a full change history, prevents duplication, and allows reuse via pointers (copy as link).


Rather than requiring you to manage versions manually, ORCANOS | ALM has a built-in version management engine that stores multiple version views for a single test case, or a group of test cases, and handles each version separately, all without duplicating information.

Test Execution

For instance, historical step results can be viewed as they were at the point the test was executed. The ORCANOS | ALM “Execution Set” is used to group tests for execution. Once executed, the execution set stores historical data throughout each test cycle, and the full change history can be viewed in the “Execution History” tab.

Version views and baseline

Unlike other test management tools, which physically duplicate test cases when the project version advances, ORCANOS | ALM maintains pointers to the previous version (which acts as a baseline) and tracks the changes in the current baseline. Test case information from older versions is kept, and once a test needs to be updated, Orcanos provides a BRANCH option that splits the test case into two instances: one in the previous version and one in the current. Both instances are linked, so tracking changes is easy.

So, in summary, BRANCHING a test case creates an instance of the test case on the new version while preserving the original as a baseline. BRANCHING also makes it easy to manage and search through test case versions by setting up multiple views to query the data.
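The pointer-based branching described above can be sketched in a few lines. This is a toy model of the idea (versions share one test case instance until a change forces a branch into two linked instances), not Orcanos's actual implementation; all names are invented.

```python
class TestCase:
    def __init__(self, name, steps, version, origin=None):
        self.name = name
        self.steps = steps
        self.version = version
        self.origin = origin  # link back to the baseline instance

def branch(case, new_version, new_steps):
    """Split a test case: keep the baseline, create a linked new instance."""
    return TestCase(case.name, new_steps, new_version, origin=case)

# Version 2.0 initially points at the 1.0 test case (no duplication)
baseline = TestCase("Login check", ["open app", "enter credentials"], "1.0")
v2_view = {"Login check": baseline}  # a pointer, acting as the baseline

# A change in 2.0 triggers a branch; the 1.0 instance stays untouched
v2_view["Login check"] = branch(baseline, "2.0",
                                ["open app", "enter credentials", "verify 2FA"])

print(v2_view["Login check"].origin is baseline)  # True: instances stay linked
print(baseline.steps)  # ['open app', 'enter credentials'] - baseline preserved
```

The design choice this illustrates: because 2.0 holds a pointer rather than a copy, unchanged tests cost nothing when the version advances, and a branch only pays the duplication cost for tests that actually changed.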

Preserving Test Case Base Line as Test Runs

ORCANOS | ALM  uses both test cases and test runs. Test cases define the set of conditions, actions, expected results, and other criteria used to determine if a product component works correctly and meets its traced, specified requirements. Test cases can change over time to reflect functional changes in your product requirements. Such changes need to be identified easily and reflected on the correct baseline, which goes under testing.

Execution runs, on the other hand, are snapshots of test cases generated at a milestone in the testing cycle, such as when a new software release is provided by the development team. A test run contains all the information from the related test case, in addition to the results of a specific instance of the test.

So while a test case may change in the future to reflect new user logic, the test run steps that were executed in the past remain in the execution history and never change, in line with the most restrictive regulatory standards for electronic records, such as 21 CFR Part 11. By saving test runs, managers and auditors can look back on the exact steps followed during testing, long after the tests have been completed.
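The snapshot behavior described above (a test run freezes the test case's steps at execution time, so later edits never alter historical runs) can be sketched like this. A toy model, not the actual Orcanos data model; the field names are invented.

```python
import copy
import datetime

class TestRun:
    """Immutable snapshot of a test case taken at execution time."""
    def __init__(self, test_case, results):
        self.executed_at = datetime.datetime.now(datetime.timezone.utc)
        self.steps = copy.deepcopy(test_case["steps"])  # frozen copy
        self.results = results

test_case = {"name": "Alarm test", "steps": ["trigger alarm", "check LED"]}
run = TestRun(test_case, results=["pass", "pass"])

# Later, the test case changes to reflect a new requirement...
test_case["steps"].append("check audible tone")

# ...but the historical run still shows exactly what was executed
print(run.steps)                # ['trigger alarm', 'check LED']
print(len(test_case["steps"]))  # 3
```

The deep copy is the essential detail: the run must not hold a reference to the live test case, or edits would silently rewrite history.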

A single test case can have one or more related execution runs, such as Functional, Progression, Regression, Load, and Stability, to mention a few, depending on the test variants defined by the test parameters selected when the runs are executed. For example, if an application supports multiple browsers, such as Chrome, Edge, and Firefox, these variants can be selected when you execute the test, creating a separate test run for each selected variant. The test runs contain identical information except for the test variant value, which indicates the browser used in testing.


Viewing the Full Change History

As noted, test cases change over the course of the development cycle. To view these changes, select a test case and click the Execution History tab to see a detailed change report, including who made the change, when it was made, and what was changed.

Change reports display the content added to and removed from a test case (or any item, for that matter) each time it is saved. ORCANOS | ALM keeps audit logs of changes according to best practices of regulated design change management. These reports identify the specific changes made to a test case over time. Change reports are ALWAYS turned on and available, both for historical item information logging and for detailed audit trail logging of test cases.



Once you are in the Change Report window, you will see the content that has been added to or removed from the test case. You can also view the electronic signature where changes are made and signed (in ORCANOS | ALM 2.0 and later), provided you have permission to view the audit log. If there is an attachment, a link to it will be visible.

Reuse Test Cases

Should you want to add test cases that share the same basic information, ORCANOS | ALM saves time by carrying the test into the next baseline automatically. You can then decide whether to keep it as is, make changes, or make deletions in that next baseline version. There is no need to duplicate an existing test case and then edit the copy. Simply select the test case from the next baseline version in the product tree module, then press the Edit button to edit the test case.

ORCANOS | ALM gives you the option to link the altered test case with the original test case in a new execution set. You can also specify traceability between those test cases and a requirement, and any change to existing traceability will be managed based on the baseline it is associated with.

For example, you might create a baseline test case in version 1.0, traced to a requirement in 1.0, and then change that test case in version 2.0. ORCANOS | ALM will alert you that there is a suspicious link needing attention, allowing you to easily identify and manage changes.

Test Case Versioning in ORCANOS - IMG5

BRANCHING a test case also copies the information you select from the original test case. Apart from the results, all other information, including attachments, is always included in BRANCHED test cases. The newly BRANCHED test case has a new baseline for execution runs, kept in tandem with the previous version's results. You can then measure feature maturity by looking at the history of the test over executed versions and seeing that it stabilizes. The history of the test is kept through both version 1.0 and 2.0 in the same audit log, along with file attachments, links, and more from the original test case.

Test Case Versioning in ORCANOS - IMG6


Original Test Case


Test Case after Branch


For more information about duplicating test cases, along with step-by-step instructions, see ORCANOS | ALM online help.

Querying Using Views

The ORCANOS | ALM view designer can be used to easily manage and search through different versions of test cases. In the View dialog box, select “test case” from the dropdown list, add the VERSION field as a criterion for the view, and use the operands to search the data as you wish.

The end result should look something like this:


You can use this field criterion to identify the version for which each test case is valid and make it visible at a glance in the test case list window. In the example below, you can see test cases and their versions, and whether they were BRANCHED or not. You can also see the traced requirement of each test case.

Test Case Versioning in ORCANOS - IMG9


Test Case Versioning in ORCANOS - IMG11

In this example, the user has also displayed the column for version baseline test case traceability links, enabling you to see which test cases are linked. Clicking the blue bar links directly to the traced item.

When the 3.0 update is released, ORCANOS | ALM will advance the project, and all test cases from versions 1.0 and 2.0 will automatically be available and managed separately. Any current test case that does not change, or does not need an update, keeps its original version. If a test case does need an update, the user adds it to an execution run built into version 3.0, without duplicating the test case, executes it, and it is marked as valid for 3.0.

Creating a general custom field allows you to filter by many other attributes and create KPI measurements of your overall productivity. ORCANOS | ALM also enables email notifications about changes based on product version, field-level security, and more.

Again, for more information about creating custom fields, see ORCANOS | ALM online help.

Folders are another option for managing test case versions in ORCANOS | ALM. Check out the online support and learning resources for more information.

How to go About Good Practice in Validating Computer Systems in a Regulated Environment

May 30th, 2015 Posted by Company News, Orcanos Cafe, Presentation, Validation and Verification 0 thoughts on “How to go About Good Practice in Validating Computer Systems in a Regulated Environment”


Computer System Validation is the technical discipline used by life sciences companies to ensure that applications provide the information they were intended to. FDA monitoring and regulations evidence the need for strict quality measures, with the Food and Drug Administration (FDA) requiring specific controls and procedures during the Software Development Life Cycle (SDLC/ALM). Regulations such as the FDA's also underscore that checks and procedures must not only be followed but be well documented. These documents must stand up to scrutiny by trained inspectors, especially since the financial penalties for failing an audit can be exorbitant. The implications of not following the relevant protocols in a life science software application can include the loss of life. Applying the appropriate SDLC/ALM protocols, such as documentation, is part of the technical discipline of Computer System Validation. In effect, Computer System Validation involves what many IT people consider testing software.


According to the FDA, process validation is “Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its pre-determined specifications and quality attributes” (1987).

In 2011, the FDA defined process validation as “the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product.”

Guidance of Validation Process

Validation involves all aspects of a process (including buildings, equipment, and computer systems) meeting quality requirements and complying with the applicable rules, regulations, and guidance governing product quality, safety, and traceability.

The 2011 guidance describes three stages in the validation process:

  1. Process design: the commercial process is defined based on knowledge gained through development and scale-up activities.
  2. Process qualification: the process design is evaluated and assessed to determine whether the process is capable of reproducible commercial manufacturing.
  3. Continued process verification: ongoing assurance is gained during routine production that the process remains in a state of control.

Purpose of Validation

To validate is to confirm that a product or service meets the needs of its users. Validation starts at the planning stage and continues through the maintenance and operation phases. It is important to consider all the documentation that comes out of validation, and the entire process, to ensure that your system remains in a validated state over time.


We are already familiar with the model of validation and the particular pattern of documentation it follows. The relationship between these documents is critical in the end: you must not only maintain quality assurance of these documents, as with any cGMP document, but also consider the traceability of validation as you proceed through an initiative.


When we talk about validation, we are referring to the whole validation and verification process. A process without validation and verification wastes considerable time, energy, money, and resources, as unnecessary things may end up being validated. So from the specialist's perspective, validation is also very important.

Consideration of Validation

In my practice I often observe organizations going overboard, treating commercial off-the-shelf applications, which carry very low risk, in the same manner as custom-developed applications with very high risk. Verification without validation is a factor that every organization looking into a validation process should take note of. Who is involved in validation? Looking across validation, we see that it is a process involving many organizations and many individuals, especially quality assurance, which is at the heart of software validation.

Key Role

Among the key roles, the validation manager is really the driver, the overall architect of your software validation initiative, if you will. The business system owner is consistently concerned with the business requirements, and the input of business owners in the validation process is vital. Project managers, in turn, are responsible for the overall implementation and execution of the validation program.

So in summary, the project manager's role, and that of the head of quality assurance, is absolutely critical throughout the process, saving time and money. No validation effort is complete or effective without the input of quality assurance. A technical lead is also needed to ensure that all technical requirements are addressed in the IQ and OQ procedures and that development progresses as intended. The validation manager's role is, I think, one of the most critical in any validation initiative, as this individual is responsible for the overall methodology and the execution of the initiative. The validation manager also works hand in hand with the quality assurance manager and the development (or technical) organization to ensure that the project is on time, within budget, and meets regulatory guidelines.

Importance of Quality Assurance

Your quality assurance staff must have the skill set and background necessary to ensure the effectiveness of your validation initiatives. As you go through the validation process, the workload will vary, which may determine when you bring in consultants or augment your existing staff. At the beginning of the project, the business system owners and the requirements development people are heavily involved, because you are establishing the requirements essential to the validation initiative. Since validation maps the process to the intended use, it is very important that these requirements are established up front; those of you using off-the-shelf software should hold your vendors' feet to the fire in getting the requirements for their software application written down. Keep the intended-use principle in mind during the early stages. In the middle of your validation initiative, you will find more resource load on design or on systems integration; as you integrate systems or deploy off-the-shelf software, this may involve a significant number of resources. Then, toward the tail end, more testing resources need to be brought to bear.

Validation testing

Validation testing is an area where software vendors focus a lot of their attention, specifically on the IQ, OQ, and PQ of validation. However, that is not all there is to software validation; there is a whole process involved. As you look at the resource load across your validation initiative, be sure you have the right number of resources at the right time. Because validation is a dynamic process, maintaining the state of validation is absolutely critical: the system must be validated correctly the first time, and it must maintain a validated state over time. You need to be concerned with both change control and configuration management. There may be operating system changes, network changes, and security changes; a number of changes come up throughout the validation initiative. It is important to make sure these changes are addressed over time and, more importantly, that they are documented and follow procedures.

Validation initiative

Upon establishing a validation initiative, standard operating procedures must also be put in place, including backup and recovery processes for security. Training is another crucial part of a validation initiative. It is also necessary to establish a comprehensive incident management procedure, something often overlooked when organizations validate off-the-shelf software. Every incident must be tracked and monitored, and corrective action processes must be put in place to validate that the system is corrected after each reported incident, with all procedures performed in a controlled manner.


A validated system ensures that incidents are corrected, and corrected in a controlled manner. Looking at what triggers software validation, or revalidation if you will: every software installation, or the integration of new software applications and/or modules, can trigger software validation. Maintenance upgrades, such as an operating system upgrade or changes in your network, can also trigger it, as can additional hardware or software, or new systems-integration requirements. Regulations, as you well know, are constantly changing, and over time different product control requirements can also trigger revalidation. A validation master plan should contain not only the triggers for software validation but also follow-ups to ensure the system is maintained in a validated state. Who are the consultants one can use for software validation, and what is the business case for them? Consultants can play a key role. For one, they bring independent expertise to the table. When software vendors of commercial off-the-shelf products supply IQ, OQ, and PQ scripts, those scripts are designed to work the first time, but they do not necessarily offer the independence that a consultant brings. So first of all, consultants bring independence; more importantly, their expertise in software validation can help accelerate the process and deliver best practices.

Been There Done That, or NOT!

I run into a lot of companies that have never done a validation project: they may never have validated an ERP system or an integrated system. Software consultants can be invaluable here, saving the time and money of having to learn everything yourself and accelerating your organization's learning curve. Consultants can also deliver predefined validation protocols and packaged methodology deliverables that help accelerate your process and, more importantly, ensure there is a quality process at the heart of your validation initiative. Finally, they can augment your existing staff. Recalling the workload discussed above, the validation workload varies over the entire life cycle of the initiative; consultants can come in at different points during that process and help you accelerate your validation initiative. So there is a good case for using experienced, qualified validation consultants, and I recommend considering them if you have a complicated validation initiative or are validating a system for the first time. Consultants can give assistance through the whole process. Why should consultants get involved early? Early involvement lets them understand the requirements and keep the intended-use principle in mind: if you are validating a system according to intended use, all consultants need to understand what the intended use of the system is and how it affects usability. Get the consultants involved early; they can help you optimize your validation process if you do not have standard operating procedures in place, and they can keep you from validating low-risk systems in the same manner as higher-risk systems, something I have seen in some organizations.
It is strongly recommended that you conduct a comprehensive risk assessment prior to validation, to ensure that you are not expending valuable resources validating systems that are very low risk. Look at the strategy for your overall validation, and get consultants to help you with it, so that you have a comprehensive validation assessment around your particular initiative. One of the most overlooked areas is migration. Organizations are either developing custom systems or migrating from system 'A' to system 'B', and they do not consider migration. Migration is absolutely key: with custom systems, migration can account for about 30% of your validation initiative, and with a large migration effort it can be even more. I strongly recommend taking migration into consideration in your validation and being careful not to overlook this area. Records management is also important when looking at validation initiatives: it is crucial to ensure that all records associated with your validation initiative are properly archived and stored for future use. It is also a good idea to conduct audit management on your validated systems over time, and a risk assessment remains a valuable tool throughout.
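The risk-based scoping argued for above can be sketched as a small scoring function. Everything here is illustrative: the factors, weights, and thresholds are assumptions of mine, not a prescribed methodology; a real assessment would follow your own quality procedures.

```python
# Illustrative sketch of risk-based validation scoping, so that low-risk
# systems are not validated with the same rigor as high-risk ones.
# Factors, formula, and thresholds are hypothetical examples.

def risk_score(patient_impact, data_integrity_impact, novelty):
    """Each factor is rated 1 (low) to 3 (high); higher score = higher risk."""
    return patient_impact * data_integrity_impact + novelty

def validation_rigor(score):
    """Map a risk score to a (hypothetical) level of validation effort."""
    if score >= 8:
        return "full validation (IQ/OQ/PQ plus migration testing)"
    if score >= 4:
        return "reduced validation (leveraged vendor testing plus OQ)"
    return "documented risk assessment only"

# A high-impact, fairly novel system lands in the most rigorous tier.
print(validation_rigor(risk_score(3, 3, 2)))
# full validation (IQ/OQ/PQ plus migration testing)
```

The point of the sketch is the tiering, not the numbers: a documented, repeatable rule for deciding how much validation a system warrants is what prevents low-risk systems from consuming high-risk budgets.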


So if you are about to plan a validation project, you can follow these highlighted principles to ensure you have the correct resources to get the project done:

  1. Assess the system going under validation
  2. Analyze its intended use
  3. Put all resources in place on time
  4. Consider all aspects of the system, including migration
  5. Use best-practice knowledge, whether from internal or external resources


Rami Azulay

ALM Master @


Download PDF: How to go About Good Practice in Validating Computer Systems in a Regulated Environment



Elcam Medical Mitigating The Approval Risk – By Placing their documents on Pre-Audit alert system that implemented the complete V&V Medical Device module

July 14th, 2013 Posted by 510(k), Company News, KPI, Software Lifecycle Management, Validation and Verification 0 thoughts on “Elcam Medical Mitigating The Approval Risk – By Placing their documents on Pre-Audit alert system that implemented the complete V&V Medical Device module”

Based on research covering 2,323 California biomedical companies, here are the main threats (ranked) those companies reported:

  • #1 – FDA regulatory / environment
  • #3 – R&D productivity
  • #7 – Intellectual property protections
  • #8 – Ability to demonstrate effectiveness
  • #9 – Product liability
  • #10 – Unprepared workforce

The unspoken rule is that at least 50% of the studies published even in top-tier academic journals – Science, Nature, Cell, PNAS, etc. – can't be repeated. Moreover, where development work stood behind those studies, it was often impossible to recreate the documentation required to utilize their business potential.

According to the same research, the answers given to the question "Why did the company delay the research or development project?" were as follows:

  • 40.2% – Funding not available (Second Round)
  • 27.8% – Regulation (FDA, EPA, SEC)
  • 25.8% – Change in corporate priorities or strategy
  • 4.1% – layoffs
  • 7.2% – Other

Orcanos implemented NPI (New Product Introduction) for Elcam Ltd., a long-established Israeli medical device company. In this project the focus was on starting the project on the correct regulatory path, by taking the initial documents created by the R&D group into a preset system that controls and governs the regulatory path selected by the organization. The overall idea was to define the development path the specific product would follow, and to match it with a system that would control and govern each step in the development lifecycle. To achieve this goal we selected the QPack Medical™ system, which accepted the validation documents and, for each document, created a set of KPIs (Key Performance Indicators) as well as KRIs (Key Regulatory Indicators) that triggered the QPack Medical alert system with Pre-Audit notifications.


Mitigating Audit-Submission Risks



For example:

The insertion of MRD documents during the Idea/Concept stage triggered the following alerts:


  • Missing Market Requirement Document
  • Missing Market Requirement Specifications
  • Market Requirements missing coverage matrix
  • Market Requirements missing traceability to product requirements
  • Market Requirements missing validation procedures
  • Market Requirements maturity based on functional test results
  • Market Requirements missing due date
  • Unapproved market requirements
  • Readiness for PDR review
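Alerts of this kind boil down to completeness checks against a document record. The sketch below is illustrative only – it is not the actual QPack Medical logic, and the field names are my own – but it shows how each failed rule can raise a pre-audit alert:

```python
# Illustrative sketch only -- NOT the actual QPack Medical implementation.
# Each rule pairs an alert name with a predicate that returns True when the
# document record fails the check; field names are hypothetical.
RULES = [
    ("Market Requirements missing coverage matrix",
     lambda d: d.get("coverage_matrix") is None),
    ("Market Requirements missing due date",
     lambda d: d.get("due_date") is None),
    ("Unapproved market requirements",
     lambda d: not d.get("approved", False)),
]

def pre_audit_alerts(doc):
    """Return the alert names for every rule the document fails."""
    return [name for name, failed in RULES if failed(doc)]

mrd = {"coverage_matrix": "MRD-TRC-01", "due_date": None, "approved": False}
print(pre_audit_alerts(mrd))
# ['Market Requirements missing due date', 'Unapproved market requirements']
```

Running the rules at document insertion time, as described above, is what turns a static document repository into a pre-audit alert system.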






QPack Medical Webinar, October 2012

November 11th, 2012 Posted by IEC 62304, ISO 13485, ISO 14971, Requirements Management, Risk Management, Software Lifecycle Management, Test Management, Validation and Verification 0 thoughts on “QPack Medical Webinar, October 2012”

Software Validation and Verification

June 22nd, 2011 Posted by Validation and Verification 0 thoughts on “Software Validation and Verification”

Software Verification

The goal is to provide objective evidence that the software meets all specified requirements.

> building the thing right

Software Validation

The goal is to confirm that the software meets user needs and intended uses.

> building the right thing


