September 15, 2021
Research and development (R&D) teams are tasked with finding more efficient ways to move data as they work on the next great technologies, such as 6G and machine learning. One commonality among these efforts is the move of operating frequencies into the millimeter-wave (mmWave) bands.
Using higher frequencies to transfer increasingly large amounts of data can be complicated, especially when you consider bandwidth as a percentage of the carrier frequency. For a fixed fractional bandwidth, a higher carrier frequency yields a larger absolute bandwidth, and data rate is a function of bandwidth. In simplest terms, higher frequency enables greater bandwidth, which in turn enables higher data rates.
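The bandwidth-as-a-percentage-of-carrier point can be made concrete with a short sketch. The carrier frequencies, the 1% fractional bandwidth, and the 20 dB SNR below are illustrative assumptions, not figures from the article; the capacity formula is the standard Shannon limit.

```python
# Illustrative sketch (assumed values): absolute bandwidth for a fixed 1%
# fractional bandwidth at two carrier frequencies, and the corresponding
# Shannon capacity at an assumed 20 dB SNR.
import math

def abs_bandwidth_hz(carrier_hz: float, fractional_bw: float) -> float:
    """Absolute bandwidth for a given fractional bandwidth."""
    return carrier_hz * fractional_bw

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Same 1% fractional bandwidth, two very different carriers:
bw_low = abs_bandwidth_hz(2.4e9, 0.01)   # 24 MHz at 2.4 GHz
bw_mm = abs_bandwidth_hz(100e9, 0.01)    # 1 GHz at 100 GHz

print(f"2.4 GHz carrier: {bw_low / 1e6:.0f} MHz of bandwidth")
print(f"100 GHz carrier: {bw_mm / 1e9:.1f} GHz of bandwidth")
print(f"Capacity ratio at equal SNR: "
      f"{shannon_capacity_bps(bw_mm, 20) / shannon_capacity_bps(bw_low, 20):.1f}x")
```

At equal fractional bandwidth and SNR, capacity scales directly with the carrier frequency, which is the motivation for moving into mmWave bands.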
Another reason for the move to mmWave is that more swaths of contiguous bandwidth are available there. A single carrier can support wider-bandwidth modulation, in contrast to the overcrowded lower frequencies, which cannot accommodate the massive data throughput associated with emerging designs.
As device developments are engineered in the 100+ GHz space, challenges arise. Hardware is certainly one. Another is measurement setup, including calibration of vector network analyzers (VNAs), such as the VectorStar™ family.
What Engineers Need to Know About Calibration
VNAs are the default test equipment for active device testing. These instruments typically offer good generic accuracy for various S-parameter and power measurements, but they also allow the source and receivers to be calibrated to match the test requirements.
A VNA provides accurate results for most measurements without the engineer conducting a user calibration. Many setups, however, require accuracy beyond what interpolation along the frequency and power axes can deliver. Another reason for user calibration is to move the power measurement away from the test port and establish a power reference plane at a desired location.
Engineers use VNAs to measure active devices using familiar power measurements, such as those shown in figure 1. While this is common, especially for low-frequency calibrations, challenges can present themselves when designing at mmWave frequencies. Complete frequency coverage of the instrument, automatic level control trade-offs, calibration time, and voltage standing wave ratio (VSWR)/accuracy across frequency can all arise when performing a broadband calibration up to 110 GHz.
For many users, nothing is more frustrating than test equipment that covers the frequency range of interest but is limited by components and/or equipment for calibrations and operation. Often, engineers must resort to separate low- and high-frequency power sensors to conduct a broadband calibration. Obviously, this is not ideal because it adds cost and time.
Level Control Coverage
Another challenge is the lack of available calibrated automatic level control (ALC) to address power requirements. While an instrument may address the power requirements, the power sensor used for calibration can become non-responsive during the calibration. This generally stems from the sensor's inability to resolve extremely low signal levels, or from a signal bandwidth too large to cover all the desired power levels.
VSWR and Accuracy
A common challenge for many RF systems and components is maintaining good VSWR over a very broad frequency range, especially up to 110 GHz. The main issue is that VSWR deteriorates as the frequency climbs above 90 GHz, even in well-designed RF systems.
Poor VSWR has a direct impact on measurement accuracy. Fortunately, there are ways to improve the accuracy of test systems, including power sensors such as Anritsu's Power Master.
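The link between VSWR and accuracy can be quantified with the standard mismatch-uncertainty formulas. The VSWR pairs below are illustrative assumptions, not measured values from the article; the formulas themselves are the textbook relations between VSWR, reflection coefficient, and mismatch error.

```python
# Hedged sketch (assumed VSWR values): mapping port VSWR to reflection
# coefficient magnitude and to worst-case mismatch uncertainty between a
# source and a load, using the standard formulas:
#   |Gamma| = (VSWR - 1) / (VSWR + 1)
#   mismatch limits (dB) = 20 * log10(1 +/- |Gamma_s| * |Gamma_l|)
import math

def reflection_coefficient(vswr: float) -> float:
    """Magnitude of the reflection coefficient for a given VSWR."""
    return (vswr - 1.0) / (vswr + 1.0)

def mismatch_limits_db(vswr_source: float, vswr_load: float):
    """Upper and lower mismatch-uncertainty bounds in dB."""
    g = reflection_coefficient(vswr_source) * reflection_coefficient(vswr_load)
    return 20 * math.log10(1 + g), 20 * math.log10(1 - g)

# A well-matched pair (VSWR 1.5 : 1.5) vs a degraded pair (2.0 : 2.0),
# the kind of deterioration that can occur above 90 GHz:
print(mismatch_limits_db(1.5, 1.5))  # roughly +0.34 / -0.35 dB
print(mismatch_limits_db(2.0, 2.0))  # roughly +0.92 / -1.02 dB
```

A modest rise in VSWR from 1.5 to 2.0 at both ports roughly triples the worst-case mismatch error, which is why sensor and port match matter so much for broadband accuracy.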
Time-to-market is critical in any industry, but it is especially so as 5G and 6G progress. Because power measurements are required for amplifiers, particularly on the receiving and transmitting sides of a transceiver system, measurements should be fast and accurate. Even assuming engineers can correct the full frequency and ALC ranges, calibration time comes into play. Considerable hours can be spent on calibration over the course of a year, especially for custom calibrations with a multitude of points across a broad frequency range.
As we all know, cost can increase steeply as designs extend in frequency for R&D and/or testing. Engineers are continuously challenged with finding an instrument that meets their performance needs and fits their budget. Many times, a product's price tag is the deciding factor in whether a customer considers it at all.
When conducting calibrations with power sensors, there are options. For example, thermal sensors and receiver-based non-linear transmission line (NLTL) sensors are both available up to 110 GHz. There are some stark differences and trade-offs, however, between the two technologies.
There are a few approaches to thermal sensing. A Wheatstone bridge with a thermistor can be used to sense RF power: the RF energy is applied to the thermistor, a change in temperature is detected, and the resulting change in resistance quantifies the power level.
Another approach is to use the Seebeck effect, in which dissimilar materials are joined at one end and a potential difference is produced when the junction temperature changes. As in the former example, changes in temperature allow the power through the device to be characterized. Figure 2 shows the Wheatstone and Seebeck technologies.
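The thermistor-bridge idea above is often implemented as a self-balancing bridge that uses DC substitution. The following is a minimal sketch of that principle, not Anritsu's implementation; the resistance and voltage values are illustrative assumptions.

```python
# Hedged sketch of DC substitution in a self-balancing thermistor bridge
# (values illustrative): the bridge holds the thermistor at a constant
# resistance R, so any applied RF power displaces an equal amount of DC
# bias power. The substituted RF power is:
#   P_rf = (V1**2 - V2**2) / (4 * R)
# where V1 and V2 are the bridge voltages without and with RF applied.
def substituted_rf_power_w(v1: float, v2: float, r_ohms: float) -> float:
    """RF power inferred from the drop in DC bias power."""
    return (v1 ** 2 - v2 ** 2) / (4.0 * r_ohms)

# Example: 200-ohm thermistor, bridge voltage drops from 4.00 V to 3.90 V
# when RF is applied.
p = substituted_rf_power_w(4.00, 3.90, 200.0)
print(f"{p * 1e3:.3f} mW")  # ~0.988 mW
```

Because the measurement waits on a thermal equilibrium at each point, this scheme is inherently slower than receiver-based sensing, which is the settling-time trade-off discussed next.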
This technology is widely used and offers the best accuracy among power sensors, but there are trade-offs. One is slow settling times, which translate into an overall increase in calibration time. This may not matter for narrow frequency coverage, but over large bandwidths, such as 9 kHz to 110 GHz, the time adds up. Long calibration times can also reduce overall throughput in manufacturing.
Another limitation of this technology is level control coverage. Thermal sensors bottom out at approximately -30 dBm, so they may not adequately address manufacturing or design requirements.
One important distinction is that thermal sensors measure integrated power. This approach is useful in some applications but falls short on designs that require an emphasis on fundamental power.
An NLTL is a pseudo-transmission line in which voltage-dependent varactor diodes take the place of the shunt capacitors of a standard transmission line. When a voltage is applied across the NLTL, a shockwave is produced: the line compresses the fall time of the incident wave, creating a train of very narrow gating pulses that feeds a sampler and can be used to characterize power (figure 3). These pulses contain harmonics with nearly the same power as the fundamental frequency and can be multiplied to provide input power levels across a broad frequency range of 9 kHz to 110 GHz.
NLTL implementation also allows for level control coverage down to -90 dBm and up to +23 dBm. Thermal sensors are limited to -30 dBm to +10 dBm, typically.
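Converting the quoted dBm limits to watts makes the difference in dynamic range concrete. This sketch uses only the figures stated above; the conversion `P = 1 mW * 10^(dBm/10)` is the standard definition of dBm.

```python
# Converting the quoted sensor limits from dBm to watts, and computing
# each technology's dynamic range. Limits are the ones stated in the
# text: NLTL -90 to +23 dBm, thermal (typical) -30 to +10 dBm.
def dbm_to_watts(dbm: float) -> float:
    """Power in watts for a level in dBm (0 dBm = 1 mW)."""
    return 1e-3 * 10 ** (dbm / 10)

nltl_range_db = 23 - (-90)     # 113 dB of dynamic range
thermal_range_db = 10 - (-30)  # 40 dB of dynamic range

print(f"NLTL:    {dbm_to_watts(-90):.1e} W to {dbm_to_watts(23):.3f} W "
      f"({nltl_range_db} dB)")
print(f"Thermal: {dbm_to_watts(-30):.1e} W to {dbm_to_watts(10):.3f} W "
      f"({thermal_range_db} dB)")
```

The NLTL sensor resolves signals down to a picowatt, giving it roughly 73 dB more dynamic range than a typical thermal sensor.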
Calibration time is also reduced with NLTL because it does not rely on physical changes, which can take several seconds per measurement point to settle before the sensor can report a good power characterization.
To learn more about the importance of calibration in mmWave designs, download an Anritsu application note.