How will 6G impact testing and validation?


With the development of 6G, how will testing environments have to adapt and change? New Electronics spoke with Jonathan Borrill, Anritsu's Head of Global Market Technology.

6G will be the successor to 5G cellular technology. 6G networks will be able to use higher frequencies than 5G networks, and to provide substantially higher capacity as well as much lower latency.

The development of 6G is expected to support one-microsecond latency communications and to bring significant improvements in areas such as imaging, presence technology and location awareness. Together with artificial intelligence, it is expected to have a major impact on data storage, processing and sharing.

NE talked with Jonathan Borrill of Anritsu about the testing and validation challenges confronting the industry when it comes to 6G, even though the roll-out of the technology is still many years in the future.

How will 6G testing differ from 5G testing?

AI/ML and virtualisation are the aspects of telecoms expected to have the biggest impact on 6G testing. As AI/ML becomes embedded as a ‘native’ part of the design of the RAN functions and air interface, and is required to run in ‘real time’, new test and validation methodologies will be required. This will involve the generation of suitable ‘learning data’ as well as independent ‘validation data’ against which to validate the AI/ML-powered functions.
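As a minimal sketch of that requirement, the Python fragment below carves fully disjoint learning and validation subsets out of a single pool of captured samples. The function name `split_learning_validation` and the data shapes are hypothetical illustrations; in practice, truly independent validation data would come from separate captures, a point returned to later in this interview.

```python
# Hypothetical sketch: carving disjoint learning and validation sets
# out of a pool of captured air-interface samples, so the AI/ML
# function is never validated on data it learned from.
import numpy as np

rng = np.random.default_rng(seed=42)

def split_learning_validation(samples: np.ndarray, validation_fraction: float = 0.2):
    """Return disjoint (learning, validation) subsets of `samples`."""
    indices = rng.permutation(len(samples))
    n_val = int(len(samples) * validation_fraction)
    val_idx, learn_idx = indices[:n_val], indices[n_val:]
    # Disjoint by construction: no index appears in both subsets.
    assert not set(val_idx) & set(learn_idx)
    return samples[learn_idx], samples[val_idx]

# Example: 10,000 captured channel measurements, 4 features each (made up).
pool = rng.normal(size=(10_000, 4))
learning_set, validation_set = split_learning_validation(pool)
```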

So far only the Core Network is defined as a virtual architecture in the 3GPP standards; the next step is expected to be the RAN defined as virtual functions.

Various industry bodies, such as the O-RAN Alliance, have already created additional specifications to help guide the industry in implementing the 5G RAN in a virtual environment, but 6G is expected to define the RAN as virtual functions from the outset, as has already been done for the 5G Core Network.

A fully virtualised RAN network will require virtualised probes and measurement functions that can then be deployed alongside the virtual functions.

Is it just a question of numbers – higher frequencies etc – or is there some qualitative difference that 6G testing will demand?

It is not only a question of frequency; latency is also a key topic and a KPI to measure. Already in 5G we see the ‘industry verticals’ focus on key parameters such as latency and security, rather than only on data rate and throughput.

As 6G expands into new use cases and applications, it is expected that latency will be a key metric to verify, as many use cases are built upon having ultra-low latency and/or ultra-high reliability wireless links.

What challenges will capabilities such as terahertz signal transmission and microsecond latency pose for 6G testing regimes?

New test methodologies are expected for 6G, not only scaling the current test equipment but also changing the way we make measurements. Over-the-air (OTA) test set-ups were introduced in 5G for FR2 (the 24-52 GHz range) and are expected to evolve further for the sub-THz (100-300 GHz) frequency ranges. The move to 5G introduced an approximately 10x increase in frequency (from 1-3 GHz to 20-40 GHz), and 6G is expected to bring another 10x increase (to 100-325 GHz), so OTA test methods will need to evolve accordingly.
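To put that second 10x step in context, the back-of-envelope Python sketch below shows one reason the link budget tightens: free-space path loss grows by roughly 20 dB for every tenfold increase in frequency at a fixed test distance. The spot frequencies (2, 28 and 280 GHz) are illustrative choices, not values taken from any standard.

```python
# Back-of-envelope sketch: how a 10x frequency increase tightens the
# OTA link budget. Free-space path loss (FSPL) grows as 20*log10(f),
# so each 10x step in frequency costs ~20 dB at a fixed distance.
import math

def fspl_db(frequency_hz: float, distance_m: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

for f_ghz in (2.0, 28.0, 280.0):  # illustrative sub-6, FR2 and sub-THz points
    print(f"{f_ghz:6.1f} GHz -> {fspl_db(f_ghz * 1e9, 1.0):5.1f} dB at 1 m")
# 2 GHz -> ~38.5 dB, 28 GHz -> ~61.4 dB, 280 GHz -> ~81.4 dB:
# roughly +20 dB per decade of frequency, before antenna gains.
```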

In the sub-THz bands, new challenges in terms of beamforming, antenna packaging and integration, and test probes, together with a 10x decrease in wavelength, will create new demands on OTA system accuracy and link budgets. The use-case requirement for microsecond latency and synchronisation will bring new testing challenges in terms of generating and measuring time-stamp-referenced data packets for test purposes, and in the time-sync accuracy of the end-to-end test probes. To enable this, specific test solutions using ‘measurement grade’ test packet generation and analysis are needed; commercial-grade solutions (such as smartphone apps and PING measurements) will no longer be sufficient.
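A minimal sketch of what time-stamp-referenced test packets could look like is shown below, in Python. The `TestPacket` structure and the use of the process clock are purely illustrative; as the answer notes, an application-level clock like this cannot deliver the sub-microsecond timestamp accuracy that real 6G latency verification will demand.

```python
# Illustrative sketch of timestamped latency testing: every test packet
# carries a transmit timestamp from a shared reference clock, and the
# receiver builds a one-way latency distribution rather than relying
# on round-trip PING. All names here are hypothetical.
import statistics
import time
from dataclasses import dataclass

@dataclass
class TestPacket:
    sequence: int
    tx_time_ns: int  # timestamp from the shared reference clock

def receive(packet: TestPacket) -> int:
    """One-way latency in ns, assuming tx and rx clocks are synchronised."""
    return time.monotonic_ns() - packet.tx_time_ns

# Collect latencies over many packets, then report distribution metrics.
# For microsecond-level KPIs the timestamping error itself must be far
# below 1 us, which rules out application-level clocks like this one.
latencies_ns = [receive(TestPacket(i, time.monotonic_ns())) for i in range(10_000)]
print(f"mean  : {statistics.mean(latencies_ns):8.0f} ns")
print(f"p99   : {sorted(latencies_ns)[int(0.99 * len(latencies_ns))]:8d} ns")
print(f"jitter: {statistics.pstdev(latencies_ns):8.0f} ns")
```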

Measurement-grade performance is needed to give the required accuracy and confidence level when measuring and verifying that networks meet the required Service Level Agreements (SLAs) and Quality of Service (QoS) levels.

What are the considerations when designing a 6G test set-up?

Virtualised network functions will require virtualised test equipment. So, a key consideration for a 6G test set-up will be to create the right test architecture, with virtualised components to match the physical and virtualised components in the 6G system.

We expect to be testing both the basic performance of the physical components that host the 6G network and the virtualised functions that are dynamically configured and then run on that physical infrastructure. So, a seamless blend of physical and virtual test environments will be required.

What data will developers be looking for in testing? What parameters will be of interest?

AI/ML will require the generation of learning data sets for the verification and test of AI-based functions. AI/ML is sensitive to the learning set used to teach it (learning bias), so the test and verification data sets need to be fully independent of the learning (training) data sets, to prevent the learning bias being reinforced by the test data. This will require the ability to generate fully independent data sets, together with measures to evaluate and confirm the level of independence between them. So, the parameters that describe the quality, scope and information range enclosed within the data sets used for learning and for testing will be key new parameters.
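The sketch below illustrates, under stated assumptions, two simple independence measures of the kind described: a verbatim-overlap check and a per-feature distribution-shift check. Both metrics and the synthetic data are hypothetical examples, not measures defined in any 3GPP or O-RAN specification.

```python
# Hypothetical sketch of two 'independence' measures between learning
# and test data sets: (1) no sample appears verbatim in both, and
# (2) a simple per-feature distribution-distance check.
import numpy as np

def overlap_fraction(learning: np.ndarray, test: np.ndarray) -> float:
    """Fraction of test samples that also appear verbatim in the learning set."""
    learn_keys = {row.tobytes() for row in learning}
    return sum(row.tobytes() in learn_keys for row in test) / len(test)

def mean_shift(learning: np.ndarray, test: np.ndarray) -> np.ndarray:
    """Per-feature mean difference, in units of the learning-set std dev."""
    return np.abs(learning.mean(axis=0) - test.mean(axis=0)) / learning.std(axis=0)

rng = np.random.default_rng(0)
learning_set = rng.normal(size=(8_000, 4))  # synthetic stand-in data
test_set = rng.normal(size=(2_000, 4))
print("verbatim overlap  :", overlap_fraction(learning_set, test_set))  # expect 0.0
print("mean shift (sigma):", mean_shift(learning_set, test_set))
```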

Do analytics devices need any particular sensitivity or accuracy to conduct 6G testing, and what new features or capabilities will they need?

When a function is driven by AI/ML-based algorithms, the evaluation of its outputs (the decisions made by the AI/ML) may require a different approach. Rather than using traditional pass/fail criteria, it is likely that more functions will be evaluated using statistical analysis methods (the likelihood of a certain output, or the distribution of output decisions).
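As one concrete example of such a statistical criterion, the sketch below treats each AI/ML decision as a Bernoulli trial and reports a 95% Wilson confidence interval on the correct-decision rate; the verdict then becomes ‘the interval's lower bound clears the target rate’ rather than a single pass/fail reading. The Wilson interval itself is a standard construction, but the trial counts and target are invented for illustration.

```python
# Sketch of statistical (rather than pass/fail) evaluation: report a
# confidence interval on the rate of correct AI/ML decisions.
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = z * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2)) / denom
    return centre - half, centre + half

# e.g. the device under test made the 'right' scheduling decision
# 9,876 times out of 10,000 stimulus combinations (made-up numbers):
low, high = wilson_interval(9_876, 10_000)
print(f"correct-decision rate: 98.76%  (95% CI: {low:.4f} - {high:.4f})")
# Verdict: does the CI lower bound exceed the required target rate?
```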

Such statistical testing normally takes much longer, due to the need to generate enough output data, covering a wide enough range of combinations, to be statistically significant and representative of the function under test. For the new generation of RF test equipment targeting the sub-THz frequency ranges, the achievable sensitivity and accuracy depend on the performance available from semiconductor devices. Here we can see new generations of transistors, mixers and amplifiers being designed and tested in research labs, and these are expected to provide suitable performance for future test equipment needs.

Can existing 5G solutions be adapted or extended to encompass 6G testing?

It’s still too early to judge this, as the discussions are at an early stage and standardisation has not yet started. Fundamental parameter testing (e.g. RF and optical spectrum analysers) can be extended where necessary, but standards-specific measurements (e.g. protocol test or emulation) will need to be adapted to the new 6G protocols.

So, we need to wait until 6G characteristics such as the Modulation and Coding Schemes (MCS) and Error Correction (EC) schemes have been selected before developing the corresponding emulators and test systems.

In the meantime, there is a lot of activity in the research communities to develop and evaluate these schemes, which in turn drives the need for the fundamental test equipment that enables these evaluations.