Conformance testing: “We need to be meticulous…”

The first pre-verification of a mission-critical push-to-talk conformance test case has been achieved. Philip Mason talks to MCS-TaaSting project co-ordinator Dr Fidel Liberal about the implications for the sector

The statement announcing the pre-verification mentions that the project involved “nine months of joint effort”. Who else has been involved?

We worked as a consortium consisting of a variety of organisations. On the academic side, that included Texas A&M University in the US, as well as the University of the Basque Country, which is leading the project. The latter is providing expertise on mission-critical protocols, while the former offers hands-on training for first responders.

On the technology side, we have GridGears and Enensys, two companies developing the platform which allows testing to be carried out in the cloud, as well as eMBMS-capable RF equipment. Nemergent Solutions is providing the MCS client used to carry out the test. SONIM – the UE manufacturer – is providing LTE handset-related expertise.

These organisations were joined by TCCA and the Public Safety Technology Alliance (PSTA), whose interest is in the certification process itself. We have also been collaborating with the 3GPP RAN5 working group and TF160. MCS-TaaSting was initially funded by the US National Institute of Standards and Technology [NIST], which we approached as a consortium.

Could you go into detail about the ‘pre-verification’ process – what was the test case?

The test case related to the whole process from switching the terminal on to being ready for making a call. There are something like 45 steps, all of which have to be checked one at a time.

It is called ‘pre-verification’ because the testing took place over IP – using the so-called IP model – rather than LTE. However, the signalling is exactly the same as would be taking place in a real LTE network.

We have also completed an additional nine test cases, and are now working towards the so-called IPCAN mode. By the end of the first quarter of 2021, we’re hoping to have the resulting tester ready for formal approval. This will enable real, binding verification processes.

Why is it an important accomplishment?

It’s important because stakeholders – purchasers, clients – currently have no globally certified means of testing whether what is being supplied matches the standard. This includes all current critical-communications-over-broadband roll-outs, such as the Emergency Services Network in the UK, and FirstNet in the United States.

The foundations for certification have been laid by the 3GPP RAN Working Group 5 (RAN5), which is obviously great. But as of this moment, there’s no actual MCX certification programme in place because there’s no conformance test tool available.

This is in contrast to the broader commercial telco industry where, since 2G, there’s been a global consensus to follow 3GPP specifications, as well as the availability of test tools to check compliance. Whatever the technology is within the commercial context, there’s a set of technical specifications that define, to the bit level, how systems should behave.

The critical communication sector needs to have access to similar testing and conformance standards, in order to start building confidence within our community.

There’s a general feeling currently that any certification programme would be several years away. This is why I think what we’re doing will ultimately prove extremely valuable, enabling certification programmes to be in place much earlier.

What does the pre-verification/verification process involve? Could you provide a simplified description of the testing tool?

In essence, the tool provides a way of codifying the literal description of the test case – equipment A sends a particular message to equipment B. These definitions are set out in comprehensive tables defined by RAN5. However, for them to be useful they need to be transformed into a ‘neutral’ test automation language, called TTCN-3.

This can in turn be compiled into an automation engine, which is then rolled out in the form of running code that generates every single message and checks the format of the responses. To put it crudely, the tool acts as a kind of philosopher’s stone for the testing of communications technology.
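To give a flavour of what that ‘neutral’ form looks like, the following is a minimal, hypothetical TTCN-3 sketch of a single step – send a registration message, check the format of the reply. All module, port, template and test-case names here are invented for illustration; the real RAN5 test suites are far more detailed.

```ttcn3
// Hypothetical sketch only – not a RAN5 test case.
module Sketch_MCS_Registration {

  type component Tester {
    port SIP_PT sipPort;    // abstract port towards the equipment under test
    timer t_guard := 5.0;   // guard timer, in seconds
  }

  testcase TC_registration_response() runs on Tester {
    // 'm_register_request' is a template defining, bit by bit,
    // the exact message to send (definition elided here).
    sipPort.send(m_register_request);
    t_guard.start;
    alt {
      // Response matches the expected template: verdict pass.
      [] sipPort.receive(mw_expected_200_ok) { setverdict(pass); }
      // Any other response: wrongly formatted, verdict fail.
      [] sipPort.receive                     { setverdict(fail); }
      // No response in time: verdict inconclusive.
      [] t_guard.timeout                     { setverdict(inconc); }
    }
  }
}
```

The point of this form is that the templates, not a human reader, decide what counts as a match – which is what removes the room for interpretation discussed below.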

Why are the literal – ‘prose’ – definitions not adequate by themselves?

The central flaw with the natural language definition is that there still may be room for interpretation. This is in regard to the level of detail when coding it into messages, as well as checking the results. The testing needs to be ‘standardised’ in the same fashion as the technology, and the TTCN-3 suite is how this is typically accomplished in the telco world.

If all the testers have is the ‘prose’, there is the opportunity for different behaviours, which is exactly what we don’t want. For example, if easier-to-pass tests existed, it stands to reason that manufacturers would gravitate towards them, just for the sake of ease.

It’s like taxes, which obviously people will avoid paying if there’s a legal way to do it. That’s why the use of TTCN-3 is now so widely accepted as a requirement to make sure that the certification process is correct.

With no uniform standardisation, you risk some vendors simply leveraging their market share to impose slightly different, and ultimately incompatible, versions of the standards. We are running fast to prevent this from happening. This means developing the tools needed for conformance certification programmes, following the well-defined 3GPP procedure.

What are the key differences between the critical communications and commercial markets that make your project necessary? Standardised testing is crucial, so why was the only option to form a consortium and pursue funding?

The simple answer to that is the size of the critical communications market and, again, what’s currently happening on the consumer side.

To address the first point, we are incredibly niche as an industry, which in terms of devices means that we only buy in the order of millions of units worldwide on an annual basis. You then compare that to the global consumer market, which sees more than 1.5 billion units sold every year. Those being the conditions, test equipment manufacturers are clearly going to address the commercial market first because there’s less risk.

This is all happening at the same time as 5G certification, which is where the real money is. Test equipment manufacturers – quite understandably – have lately invested all their resources in that area.

I would refer to the situation as a perfect storm, not dissimilar to the slow development which we’re seeing around ProSe [proximity services]. It’s pretty much the worst timing ever.

What implications does the current lack of conformance testing have for national mission-critical broadband networks which are already being rolled out? What is the real-world impact?

The implication is that, quite simply, there’s no impartial, trusted party who can verify the technology. There are organisations like the Global Certification Forum (GCF) that already have mission-critical work items, but they’re in a dormant state because there’s no testing equipment.

There are also consulting companies in the field, as well as the operators themselves, who will likely carry out manual testing of what the supplier is deploying in the build-out of their network. But again, the current situation around testing means that there could be errors in the definition at some level.

When operators ask suppliers for assurance that they’ll be adhering to the standards, meanwhile, the most they can typically give are statements showing internal commitment, and attendance at the ETSI MCX Plugtests. This enables them to claim that they have already interconnected with another vendor, or tested most of the defined test cases.

Obviously, this is incredibly important in itself, but the certificate issued afterwards is merely one of attendance. The Plugtests are all about interoperability, not conformance certification. This is very important to understand.

The initial pre-verification test case was in relation to MCPTT. Are you focusing on mission-critical data and video as well?

As you say, at this stage, we have only MCPTT. The rest will follow as a waterfall, however, with a timescale of around two years. These are complicated processes and functionalities, and we need to be meticulous.

Ultimately, the work we’re carrying out will provide value for everyone, whether they’re involved on the supply side, or are an operator or a user. Once the TTCN-3 coded test cases are verified, any testing tool manufacturer can run them on their own equipment. This is the only way to go about the process of building safe, effective, critical communications technology.

Editorial contact

Philip Mason
Editor, Critical Communications Today
Tel: +44 (0)20 3874 9216