COLLABORATIVE_VALIDATION

Prior to any Service publication in the European SWIM Registry, CCS partners organise a joint validation that involves both the CCS Providers and the first CCS Customer.
Test Cases covering several test topics are run using a happy flow of a few flights to check that the services are consistent, compliant with the service description, and meet the acceptance criteria formulated by the first CCS Customer.
Any anomaly raised during test case execution is fed into the CCS bug management process. Each anomaly is assigned a criticality level:
- Critical: a blocking issue that prevents the use of a service functionality
- Major: an issue that prevents the use of a service functionality but for which a workaround has been identified
- Minor: any other anomaly, neither Critical nor Major
Depending on its impact, the issue is assigned to the specific team(s) in charge of its resolution (specification team, software team, dataset team...).
Once fixed, the issue is verified during one of the subsequent validation sessions and closed if the resolution is confirmed by the validation team.
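The criticality rules above can be sketched as a small classification helper. This is a minimal illustration only; the enum and function names are assumptions for the sketch, not part of the actual CCS bug management tooling.

```python
from enum import Enum

class Criticality(Enum):
    CRITICAL = "Critical"  # blocking: prevents use of a functionality, no workaround
    MAJOR = "Major"        # prevents use of a functionality, but a workaround exists
    MINOR = "Minor"        # any other anomaly

def classify(blocks_functionality: bool, workaround_identified: bool) -> Criticality:
    """Assign a criticality level following the rules described above."""
    if blocks_functionality and not workaround_identified:
        return Criticality.CRITICAL
    if blocks_functionality and workaround_identified:
        return Criticality.MAJOR
    return Criticality.MINOR
```

For example, an anomaly that blocks a functionality with no known workaround classifies as Critical, while the same anomaly with an identified workaround classifies as Major.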

In addition to the functional validation, CCS partners also organise performance validations. The objectives of these sessions are:
- to measure the CCS system response times according to several Key Performance Indicators (KPI) and Non-Functional Requirements (NFR) agreed among parties
- to determine whether the KPI target values are reached
The following KPIs are evaluated:
- Maximum Provider response time for receiving the result related to a short process request triggered from the CCS Client
- Maximum Provider response time for receiving the result related to a long process request triggered from the CCS Client
- Maximum Provider response time for receiving the result related to an SFPL external event processing
While the following KPI is monitored for information only:
- Maximum Provider response time for receiving an Acknowledge (or Reject) related to a request triggered from the CCS Client.
It is worth noting that the performance test cases involve only the operations of a subset of CCS services.
End-to-end response times are computed by customer equipment, while the CCS reference platform's internal traversal times are measured on the Provider side.
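The KPI evaluation described above amounts to taking the maximum observed response time per KPI and comparing it with the agreed target. The sketch below illustrates this under assumptions: the sample records, KPI names, and target values are invented for illustration and are not actual CCS figures.

```python
# Illustrative sketch: deriving KPI maxima from timestamped request/result
# pairs captured on the customer side, then checking them against targets.
samples = [
    # (kpi_name, request_sent_s, result_received_s) -- assumed example data
    ("short_process", 0.00, 0.80),
    ("short_process", 1.00, 2.10),
    ("long_process",  0.00, 9.50),
]

# Assumed target values in seconds (illustrative only).
targets_s = {"short_process": 2.0, "long_process": 10.0}

def kpi_maxima(records):
    """Maximum end-to-end response time observed per KPI."""
    maxima = {}
    for kpi, sent, received in records:
        elapsed = received - sent
        maxima[kpi] = max(elapsed, maxima.get(kpi, 0.0))
    return maxima

def evaluate(maxima, targets):
    """Indicate, per KPI, whether the target value is reached."""
    return {kpi: maxima[kpi] <= target for kpi, target in targets.items()}
```

With the sample data above, the maximum for each KPI stays below its assumed target, so both KPIs would be reported as reached.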
The details about executed test cases and related results are provided in the CCS Validation evidence document of this service.