Posts tagged "Branch"

Effective baseline management for ORCANOS | ALM Test Management

June 23rd, 2016 Posted by IEC 62304, Test Management, Tip Of The Week, Validation and Verification

ORCANOS | ALM version and baseline management behaves much like source control: it provides powerful, intuitive tools to manage and track versions, keeps a full change history, prevents duplication, and allows reuse through pointers (copy as link).


Rather than requiring manual version management, ORCANOS | ALM has a built-in version management engine that stores multiple version views for a single test case, or a group of test cases, and handles each version separately, without duplicating information.

Test Execution

For instance, historical step results can be viewed as they were at the time the test was executed. The ORCANOS | ALM "Execution Set" is used for grouping tests for execution. Once executed, the execution set stores historical data throughout each test cycle, and the full change history can be viewed in the "Execution History" tab.

Version views and baseline

Unlike most other test management tools, which physically duplicate test cases when a project version advances, ORCANOS | ALM maintains pointers to the previous version (which acts as a baseline) and tracks changes in the current baseline. Test case information from older versions is kept, and when a test needs to be updated, Orcanos provides a BRANCH option that splits the test case into two instances: one in the previous version and one in the current. Both instances are linked, so tracking changes is easy.

In summary, BRANCHING a test case creates an instance of it in the new version while preserving the original as a baseline. Branched test case versions are also easy to manage and search, by setting up multiple views to query the data.
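The pointer-based branching described above can be illustrated with a small sketch. Note that the class and function names here are illustrative only, not Orcanos APIs:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestCase:
    name: str
    version: str
    steps: list
    # Pointer to the baseline instance this test case was branched from,
    # instead of a physical duplicate of its data.
    branched_from: Optional["TestCase"] = None

def branch(baseline: TestCase, new_version: str) -> TestCase:
    """Split a test case into two linked instances: the original stays
    frozen as the baseline, the new one is editable in the new version."""
    return TestCase(
        name=baseline.name,
        version=new_version,
        steps=list(baseline.steps),   # start from the baseline content
        branched_from=baseline,       # keep the link for change tracking
    )

v1 = TestCase("Login works", "1.0", ["open app", "enter credentials", "submit"])
v2 = branch(v1, "2.0")
v2.steps.append("verify 2FA prompt")  # edit only the 2.0 instance; 1.0 is untouched
```

Because the new instance holds a link back to the baseline rather than a detached copy, changes in 2.0 can always be compared against the frozen 1.0 content.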

Preserving the Test Case Baseline as Test Runs

ORCANOS | ALM  uses both test cases and test runs. Test cases define the set of conditions, actions, expected results, and other criteria used to determine if a product component works correctly and meets its traced, specified requirements. Test cases can change over time to reflect functional changes in your product requirements. Such changes need to be identified easily and reflected on the correct baseline, which goes under testing.

Execution Runs on the other hand, are snapshots of test cases that are generated at a milestone in the testing cycle, such as when a new software release is provided by the development team. A test run contains all information from the related test case, in addition to the results of a specific instance of the test.

So while the test case may change in the future to reflect new user logic modifications, the test run steps that were executed in the past remain in the history of the execution run and never change, as required by the strictest regulatory standards for electronic records, such as 21 CFR Part 11. By saving test runs, managers and auditors can look back at the exact steps followed during testing, years after the tests have been completed.

A single test case can have one or more related execution runs, such as Functional, Progression, Regression, Load, and Stability, to mention a few, depending on the test variants defined by the test parameters selected when test runs are executed. For example, if an application supports multiple browsers, such as Chrome, Edge, and Firefox, these variants can be selected when you execute the test, creating a separate test run for each selected variant. The test runs include identical information, except for the test variant value, which indicates the browser used in testing.
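The snapshot-per-variant behavior described above can be sketched as follows (a simplified illustration, not Orcanos code; the dictionary fields are assumptions for the example):

```python
# A test run is an immutable snapshot of a test case, one per selected variant.
def create_runs(test_case: dict, variants: list) -> list:
    runs = []
    for variant in variants:
        run = dict(test_case)          # snapshot the test case content
        run["variant"] = variant       # only the variant value differs
        run["results"] = None          # filled in during execution
        runs.append(run)
    return runs

case = {"name": "Search returns results",
        "steps": ["open page", "type query", "press enter"]}
runs = create_runs(case, ["Chrome", "Edge", "Firefox"])
# Three runs with identical content, differing only in the variant field.
```

Later edits to the test case leave these run snapshots untouched, which is what preserves the executed steps for audit.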


Viewing the Full Change History

As noted, test cases change over the course of the development cycle. Should you want to view these changes, you may select a test case, and then click the Execution History tab to view a detailed change report, including who made the change, when it was made, and what was changed.

Change reports display details of content added to and removed from a test case (or any item, for that matter) each time it is saved. ORCANOS | ALM keeps audit logs of changes according to best practices of regulated design change management. These reports identify the specific changes made to a test case over time. Change reports are ALWAYS turned on and available, both for historical item information logging and for detailed audit trail logging for test cases.
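Conceptually, an always-on change report appends a who/when/what entry on every save. Here is a minimal sketch of that idea using a text diff (illustrative only; the log structure is an assumption, not the Orcanos schema):

```python
import difflib
from datetime import datetime, timezone

audit_log = []

def save_with_audit(old_text: str, new_text: str, user: str) -> None:
    """Record who changed what, and when, every time an item is saved."""
    diff = list(difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(), lineterm=""))
    audit_log.append({
        "user": user,
        "when": datetime.now(timezone.utc).isoformat(),
        "changes": diff,   # added lines are prefixed "+", removed lines "-"
    })

save_with_audit("Step 1: open app",
                "Step 1: open app\nStep 2: log in",
                "rami")
```

Each saved entry is append-only, so the full history of an item can be replayed in order.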



Once you are in the Change Report window, you will see content that has been added to or removed from the test case. You can also view the electronic signature in cases where changes are made and signed (in ORCANOS | ALM 2.0 and later), provided you have permission to view the audit log. If there is an attachment, a link to it will be visible.

Reuse Test Cases

Should you want to add test cases that share the same basic information, ORCANOS | ALM saves time by taking the test to the next baseline automatically. You can then decide whether to keep it as is, make changes, or just make deletions from that next baseline version. There is no need to duplicate an existing test case and then edit the copy. Simply select the test case from the next baseline version in the product tree module, then press the Edit button to edit the test case.

ORCANOS | ALM gives you the option to link the altered test case with the original test case in a new execution set. You can also specify the traceability between those test cases and a requirement, and any change to existing traceability will be managed based on the baseline it is associated with.

For example, you might create a "Baseline Test Case" on version 1.0 traced to a requirement in 1.0, then make changes to that test case in version 2.0. ORCANOS | ALM will alert you that there is a suspicious link needing attention, allowing you to easily identify and manage changes.

Test Case Versioning in ORCANOS - IMG5

BRANCHING a test case also "copies" information you select from the original test case. Apart from the results, all other information, including attachments, is always included in branched test cases. The newly branched test case now has a new baseline for its Execution Run, kept in tandem with the previous version's results. You can then measure feature maturity by looking at the test's history across executed versions and see whether it is stabilizing. The test's history is kept through both versions 1.0 and 2.0 in the same audit log, along with file attachments, links, and more from the original test case.

Test Case Versioning in ORCANOS - IMG6


Original Test Case


Test Case after Branch


For more information about duplicating test cases, along with step-by-step instructions, see ORCANOS | ALM online help.

Querying Using Views

The ORCANOS | ALM view designer can be used to easily manage and search through different versions of test cases. In the View dialog box, simply select "test case" from the dropdown list, add specific criteria to the view by selecting the VERSION field, and use the operators to search the data as you wish.
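A view with a single field criterion behaves like a filter over the test case list. The sketch below illustrates the idea of filtering on a VERSION field with a chosen operator (an illustration of the concept, not the Orcanos query engine):

```python
import operator

# Map the view operators to comparison functions.
OPERATORS = {"=": operator.eq, "!=": operator.ne, ">=": operator.ge}

def view_query(test_cases: list, field: str, op: str, value: str) -> list:
    """Return the test cases matching a single field criterion,
    like a view filtering on the VERSION field."""
    match = OPERATORS[op]
    return [tc for tc in test_cases if match(tc[field], value)]

cases = [
    {"name": "Login",  "version": "1.0"},
    {"name": "Login",  "version": "2.0"},
    {"name": "Search", "version": "2.0"},
]
v2_cases = view_query(cases, "version", "=", "2.0")  # only the 2.0 test cases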

The end result should look something like this:


You can use this field criteria to identify the test case version for which each test case is valid, and make it visible at a glance in the test case list window. In the example below, you can see test cases and their version, as well as identify whether they were BRANCHED or not. You can also see the traced requirement of that test case.

Test Case Versioning in ORCANOS - IMG9


Test Case Versioning in ORCANOS - IMG11

In this example, the user has also displayed the Version Baseline Test Case Traceability column, enabling you to see which test cases are linked. Clicking on the blue bar links directly to the traced item.

When the 3.0 update is released, ORCANOS | ALM will advance the project, and all test cases from versions 1.0 and 2.0 will remain available and be managed separately. Any current test case that does not change, or does not need an update, keeps its original version. If a test case does need an update, the user adds it to an Execution Run built on version 3.0, without duplicating the test case, executes it, and it is marked as valid for 3.0.
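The version-advancement rule just described (unchanged test cases keep their original version; only changed ones move into the new version) can be sketched like this, using illustrative names rather than Orcanos internals:

```python
def advance_project(test_cases: list, changed_names: set, new_version: str) -> list:
    """Advance the project: unchanged test cases keep their original
    version; changed ones are branched into the new version."""
    advanced = []
    for tc in test_cases:
        if tc["name"] in changed_names:
            # Branch: new instance in the new version, linked to its baseline.
            advanced.append({**tc, "version": new_version,
                             "branched_from": tc["version"]})
        else:
            advanced.append(tc)   # keeps its original version, no duplicate
    return advanced

cases = [{"name": "Login", "version": "2.0"},
         {"name": "Search", "version": "1.0"}]
v3 = advance_project(cases, {"Login"}, "3.0")
# "Login" is branched to 3.0; "Search" stays valid at 1.0.
```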

Creating a general custom field allows you to filter by many other attributes and to build KPI measurements of your overall productivity. ORCANOS | ALM also enables email notifications about changes based on product version, field-level security, and more.

Again, for more information about creating custom fields, see ORCANOS | ALM online help.

Folders are another option for managing test case versions in ORCANOS | ALM. Check out the online support and learning resources for more information.

Tip Of The Week – Why ORCANOS | ALM and GitHub Work So Well Together

April 12th, 2016 Posted by Tip Of The Week

The Branch Concept – Powerful Tool To Control Your Product Lifecycle and Not Just Your Code

April 11th, 2016 Rami Azulay in Tip of the Week, QPack

This week’s tip expounds on the GitHub concept of branching and how it applies to ORCANOS | ALM. While working on a project, you will undoubtedly have a number of different features or ideas in progress at any particular time, some ready to go and others not. Branching allows you to better manage this workflow.

R&D activity and the other parts of the product lifecycle share the same technical work style, but at different magnitudes. When we work on product definition, we conduct experiments similar to those in our R&D activity; however, since product definition is less detailed and more functional, there are fewer daily activities than in R&D. Yet we need the same ability as R&D to differentiate one release's content from the next. GitHub does that extremely well by way of the branch mechanism.

When you create a branch in your project, you are creating an environment in which you are free to try out new ideas. Changes you make on a branch do not affect the master branch, so you are free to experiment and commit changes, safe in the knowledge that your branch changes will not be merged until they are ready to be reviewed by someone you are collaborating with.

What this means is that when you are working on a product release, you first need a place where you can evaluate your content against market needs, and second, against possible implementation by R&D. This, of course, is an advantage of ORCANOS | ALM: it provides benefits that GitHub lacks, a gap that otherwise causes a backlog of items in the pool area.

When a decision needs to be made about the content of your release, the ORCANOS | ALM branch mechanism should be considered. It gives you the option to create separate views (branches) for each release, and you are able to move, split, or remove content from each view as needed. At the end of the day, your view contains only the requirements, designs, tests, defects, and so on, of that specific view.

Open a Pool Request

Pool Requests facilitate discussion about your new idea while keeping design ideas in a pool. Because pool requests are tightly integrated with the underlying ORCANOS | ALM repository, anyone can see suggested changes, and how they would be merged, should they be approved.

You can open a Pool Request at any point during the development process, whether you have a few ideas or just a single-line description of your requirement; these can be shared as screenshots or general ideas. This approach is useful when you are stuck and need help or advice, or when you are ready for someone to review your work. By using the ORCANOS | ALM @discussion system in your Pool Request message, you can ask for feedback from specific people or teams, whether they’re down the hall or ten time zones away.

Discuss and review your change request

Once a Pool Request has been submitted, the person or team reviewing your change request may have questions or comments. Perhaps the requirement template style does not match project guidelines, the change is missing test data, or maybe everything looks great and commendations are in order – Pool Requests are designed to encourage and capture these productive conversations.

In ORCANOS | ALM you can also continue to push to your view/branch in light of discussion and feedback about your changes. If someone comments that you forgot to do something, or if there is a logical bug in the requirement, design, or test, you can fix it in your branch and push up the change. Similarly, GitHub shows your new commits and any additional feedback you may receive in the unified Pool Request view.

The following flowchart demonstrates the basic concept of using the ORCANOS | ALM branch mechanism,  in order to increase control of content delivery and quality.



