Competition and the Semiconductor Triple-Play

  Get all the signal analyzer performance you pay for

After perusing a pair of new application briefs, I was impressed by the improvements in signal analyzers driven by the combination of evolving semiconductor technology and plain old competition. I’ll write a little about the history here, but will also highlight a few of the benefits and encourage you to take advantage of them as much as possible. They’ll be welcome help with the complex signals and stringent standards you deal with every day.

Intel’s Gordon Moore and David House are credited with a 1960s prediction that has come to be known as Moore’s law. With uncanny accuracy, it continues to forecast the accelerating performance of computers, and this means a lot to RF engineers, too. Dr. Moore explicitly considered the prospects for analog semiconductors in his 1965 paper, writing that “Integration will not change linear systems as radically as digital systems.” *

Note the “as radically” qualifier. Here’s my mental model for the relative change rates.

Comparing the rate of change in performance over time of the semiconductor-based elements of signal analyzers. Processors have improved the fastest, though analog circuit performance has improved dramatically as well.

In analyzer architecture and performance, it seems sensible to separate components and systems into those that are mainly digital, those that are mainly analog, and those that depend on both for improved performance.

It’s no surprise that improvements in each of these areas reinforce each other. Indeed, the performance of today’s signal analyzers is possible only because of substantial, coordinated improvement throughout the block diagram.

Digital technology has worked its way through the signal analyzer processing chain, beginning with the analog signal representing the detected power in the IF. Instead of being fed to the Y-axis of the display to drive a storage CRT, the signal was digitized at a low rate to be sent to memory and then on to a screen.

The next step—probably the most consequential for RF engineers—was to sample the downconverted IF signal directly. With enough sampling speed and fidelity, complex I/Q (vector) sampling could represent the complete IF signal, opening the door to vector signal analysis.
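If the idea of complex I/Q sampling seems abstract, here’s a bare-bones numerical sketch of it in Python (my own illustration with made-up sample rates and a crude filter, not any analyzer’s actual implementation): mix the real IF samples against a complex “LO” at the IF center frequency, low-pass filter away the mixing image, and the surviving complex samples carry both the magnitude and the phase of the signal.

```python
import numpy as np

fs = 100e6                    # IF sample rate, Hz (assumed for illustration)
f_if = 21.4e6                 # nominal IF center frequency, Hz (assumed)
f_offset = 50e3               # a small frequency offset we want to recover
t = np.arange(4096) / fs

if_samples = np.cos(2 * np.pi * (f_if + f_offset) * t)  # real-valued IF record

lo = np.exp(-1j * 2 * np.pi * f_if * t)                  # complex LO at the IF center
iq = if_samples * lo                                     # complex mix to baseband

# Crude low-pass filter (moving average) to knock down the 2*f_if mixing image
iq = np.convolve(iq, np.ones(64) / 64, mode="same")

# The I/Q record now represents the signal: its phase rotates at f_offset
phase = np.unwrap(np.angle(iq))
f_est = (phase[-100] - phase[100]) / (2 * np.pi * (t[-100] - t[100]))
print(f"recovered frequency offset: {f_est / 1e3:.1f} kHz")  # ~50 kHz
```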

Sampling technology has worked its way through the processing chain in signal analyzers. It’s now possible to make measurements at millimeter frequencies with direct (baseband) sampling, though limited performance and high cost mean that most RF/microwave measurements will continue to be made by IF sampling and processing.

This is where competition comes in, as the semiconductor triple-play produced an alternative to traditional RF spectrum analyzers: the vector signal analyzer (VSA). Keysight—then part of HP—introduced these analyzers as a way to handle the demands of the time-varying and digitally modulated signals that were critical to rapid wireless growth in the 1990s.

A dozen years later, competitive forces and incredibly fast processing produced RF real-time spectrum analyzers (RTSAs) that calculated the scalar spectrum as fast as IF samples came in. Even the most elusive signals had no place to hide.

VSAs and RTSAs were originally separate types of analyzers, but continuing progress in semiconductors has allowed both to become options for signal-analyzer platforms such as Keysight’s X-Series.

This takes us back to my opening admonition that you should get all you’ve paid for in signal analyzer performance and functionality. Fast processing and digital IF technologies improve effective RF performance through features such as fast sweep, fast ACPR measurements, and noise power subtraction. These capabilities may already be in your signal analyzers if they’re part of the X-Series. If these features are absent, you can add them through license key upgrades.

The upgrade situation is the same with frequency ranges, VSA, RTSA, and related features such as signal capture/playback and advanced triggering (e.g., frequency-mask and time-qualified). The compounding benefits of semiconductor advances yield enhanced performance and functionality to meet wireless challenges, and those two new application briefs may give you some useful suggestions.

* “Cramming more components onto integrated circuits,” Electronics, Volume 38, Number 8, April 19, 1965


The Power and Peril of Intuition

  This one really is rocket science

For engineers, intuition is uniquely powerful, but not infallible. It can come from physical analogies, mathematics, similar previous experiences, pattern recognition, the implications of fundamental natural laws, and many other sources.

I have a lot of respect for intuitive analysis and conclusions, but I’m especially interested in the situations where intuition fails us, and in understanding why it failed. Beyond helping me avoid erroneous conclusions, understanding the specific error often points to a better intuitive approach.

Previously, I wrote about one intuitive assumption and its flaws, and also about a case in which a simple intuitive approach was perfectly accurate. This time, I’d like to discuss another example related to power and energy, albeit one that has very little to do with RF measurements.

Let me set the stage with this diagram of an interplanetary spacecraft, making a close pass to a planet as a way to steal kinetic energy and use that boost to get where it’s going faster.

An interplanetary spacecraft is directed to pass near a planet, using the planet’s gravity field to change the spacecraft trajectory and accelerate it in the desired direction. Equivalent rocket burns at time intervals a and b produce different changes in the spacecraft’s speed and kinetic energy. (Mars photo courtesy of NASA)

One of the most powerful tools in the engineer’s kit is the law of conservation of energy. RF engineers may not use it as often as mechanical engineers or rocket scientists, but it’s an inherent part of many of our calculations. For reasons that escape me now, I was thinking about how we get spacecraft to other planets using a combination of rockets and gravity assist maneuvers, and I encountered a statement that initially played havoc with my concept of the conservation of energy.

Because most rocket engines burn with a constant thrust and have a fixed amount of fuel, I intuitively assumed that it wouldn’t matter when, in the gravity-assist sequence, the spacecraft burned its fuel. Thrust over time produces a fixed delta-V and that should be it… right?

Nobody has repealed the law of conservation of energy, but I was misapplying it. One clue is the simple equation for work or energy, which is force multiplied by distance. When a spacecraft is traveling faster—say finishing its descent into a planet’s gravity well before climbing back out—it will travel farther during the fixed-duration burn. Force multiplied by an increased distance produces an increase in kinetic energy and a higher spacecraft speed.*
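To convince my protesting intuition, I put toy numbers on it. This short Python sketch (my own example with assumed mass and speeds, ignoring the shrinking fuel mass and the planet itself) applies the same delta-V at two different starting speeds and compares the kinetic energy gained:

```python
m = 1000.0   # spacecraft mass, kg (assumed)
dv = 100.0   # delta-V from the fixed-duration burn, m/s (assumed)

def kinetic_energy(v):
    """Kinetic energy in joules for speed v in m/s."""
    return 0.5 * m * v**2

for v_start in (3_000.0, 12_000.0):  # cruising slowly vs. deep in the gravity well
    gain = kinetic_energy(v_start + dv) - kinetic_energy(v_start)
    print(f"burn at {v_start / 1000:.0f} km/s: kinetic energy gain = {gain / 1e6:.0f} MJ")
```

The same 100 m/s of delta-V adds about 305 MJ at 3 km/s but about 1,205 MJ at 12 km/s, which is just force multiplied by the longer distance covered during the burn.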

My intuition protested: “How could this be? The math is unassailable, but the consequences don’t make sense. Where did the extra energy come from?”

One answer that satisfied, at least partially, is that burning at time b rather than time a in the diagram above gives the planet the chance to accelerate the spacecraft’s fuel before it’s burned off. The spacecraft has more kinetic energy at the start of the burn than it would have otherwise.

Another answer is that the law of conservation of energy applies to systems, and I had defined the system too narrowly. The planet, its gravity field, and its own kinetic energy must all be considered.

Fortunately, intuition is as much about opportunity for extra insight as it is about the perils of misunderstanding. Lots of RF innovations have come directly from better, deeper intuitive approaches. In the wireless world, CDMA and this discussion of MIMO illustrate intuition-driven opportunities pretty well. Refining and validating your own intuition can’t help but make you a better engineer.

 

* This effect is named for Hermann Oberth, an early rocket pioneer with astonishing foresight.


Signal Capture and Playback: A DVR for RF/Microwave Engineers

  “OK, Jamie, let’s go to the high-speed”

There are times when understanding an event or phenomenon—or simply finding a problem—demands a view using a different time scale. I’m a fan of the Mythbusters TV series, and I can’t count the number of times when the critical element in understanding the myth was a review of high-speed camera footage. I’m sure the priority was mostly on getting exciting images for good TV, but high-speed footage was often the factor that really explained what was going on.

Another common element of those Mythbusters experiments was their frequent one-shot nature, and high-speed cameras were critical for this as well. Single events were trapped or captured so that they could be examined over and over, from different angles and at different speeds.

Time capture, also called signal capture, came to the general RF analyzer world in the early 1990s with the introduction of vector signal analyzers (VSAs), whose block diagram was a natural fit for the capability. While it was primarily a matter of adding fast memory and a user interface for playback or post-processing, significant innovation went into implementing a practical magnitude trigger and achieving trigger-timing alignment.

The block diagram of a VSA or a signal analyzer with a digital IF section is a good foundation for time capture, and it required the expansion of just two blocks. Capture/playback is especially useful for the time-varying signals that VSAs were designed to handle.

Over the years, I’ve used time capture for many different measurements and think it’s really under-used as a tool for RF/microwave applications in wireless, aerospace/defense, and EMI. It’s an excellent way to leverage the knowledge of the RF engineer, and it’s easy to use: first select the desired frequency and span and then press the record button.

The insight-creating and problem-solving magic comes during playback or post-processing. Captures are gap-free, and playback speed in VSA software can be adjusted over a huge range. Just press the play button and explore wherever your insight leads. You can see the time, frequency, and modulation domains at the same time, with any number of different measurements and trace types. You can easily navigate large capture buffers with numeric and graphical controls, and even mark a specific section to replay or loop continuously so you can see everything that happened.

A simple capture/playback of a transmitter switching on shows a transient amplitude event in the bottom trace. The top two traces use variable persistence to show the signal spectrum and RF envelope as playback proceeds in small steps.

Today’s signals are highly dynamic, the RF spectrum is crowded, and design requirements are stringent. You often need to optimize and troubleshoot in all three domains—time, frequency, and modulation—at once. You have the skill and the knowledge, but you need a total view of the signal or system behavior. In my experience, there’s nothing to match the confidence and insight that follow from seeing everything that happened during a particular time and frequency interval.

I’ll write about some specific measurement examples and techniques in posts to come. In the meantime, feel free to try out time capture on your own signals. The 89600 VSA software includes a free trial mode that works with all of the Keysight X-Series signal analyzers and many other Keysight instruments, too. Just press that red record button and then press play. It’ll make you feel like an RF Mythbuster, too.


RF Engineers to the Rescue—in Space

  Here we come to save the day!

“Space,” said author Douglas Adams, “is big. Really big. You just won’t believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it’s a long way down the road to the chemist, but that’s just peanuts to space.”*

Low Earth orbit (LEO) is a significant chunk of space, but its bigness wasn’t enough to save the working satellite Iridium 33 from a 2009 collision with Russia’s defunct Kosmos 2251 communications satellite. The impact at an altitude of about 500 miles was the first time two satellites had collided at hypervelocity, and the results were not pretty.

Yellow dots represent estimated debris from the 2009 collision of two LEO satellites. This graphic from Wikimedia Commons shows the debris location about 50 minutes after the collision. The structure of the debris field is an excellent example of conservation of momentum.

The danger of collisions such as this is highest in LEO, which spans altitudes of 100 – 1200 miles. The danger is a function of the volume of that space, the large number of objects there, and their widely varying orbits. The intersections of these orbital paths provide countless opportunities for destructive high-velocity collisions.

It’s estimated that the 2009 collision alone produced 1000 pieces of debris four inches or larger in size, and countless smaller fragments. Because objects as small as a pea can disable a satellite, and because larger ones can turn a satellite into another cloud of impactors, the danger to vital resources in LEO is clear.

This chain reaction hazard or “debris cascade” was detailed as far back as 1978 by NASA’s Donald Kessler in a paper that led to the scary term Kessler syndrome.

The concept is scary because there’s no simple way to avoid the problem. What’s worse, our existing tools aren’t fully up to the task of identifying objects and accurately predicting collisions. The earlier 1961-vintage ground-based VHF radar system could track only those objects bigger than a large beach ball, and accuracy was not sufficient to allow the Iridium satellite to move out of danger.

Cue the RF/microwave engineering cavalry: With their skill and the aid of signal analyzers, signal generators, network analyzers, and the rest of the gear we’re so fond of, they have created a new space fence. Operating in the S-band, this large-scale phased-array radar will have a wide field of view and the ability to track hundreds of thousands of objects as small as a marble with the accuracy required to predict collisions.

Alas, predicting collisions is most of what we can do to avoid a Kessler catastrophe. Though the company designing and building the fence mentions “cleaning up the stratosphere,” it’s Mother Nature and the very faint traces of atmosphere in LEO that will do most of the job. Depending on altitude, mass, and cross-section, the cleaning process can take decades or longer.

In the meantime, we’ll have to make the most of our new tools, avoid creating new debris, and perhaps de-orbit a few big potential offenders such as Envisat.

There may be another opportunity for the engineering cavalry to save the day. There are proposals for powerful lasers, aimed with unbelievable precision, to blast one side of orbiting debris, creating a pulse of vapor that will nudge objects toward atmospheric destruction and render them mostly harmless.* I’m looking forward to the RF/microwave designs for tracking and aiming that will make that possible.

 

* From the book The Hitchhiker’s Guide to the Galaxy


Fans of a Touch Interface for Signal Analyzers, Please Extend Your Hands

  Borrowing stuff from phones and tablets that we helped make possible

Gary Glitter asked it first, but people my age probably remember Joan Jett’s version: “Do you wanna touch?” In the last few years, test and measurement companies have decided the answer is yes.

In combination with wireless networks, touchscreens have proven themselves in the realm of smartphones and tablets. I’ll admit that I was initially skeptical about touch on instruments, however, based on my experience with laptops. For most regular laptop use, I couldn’t see the benefit of suspending my heavy arm in front of me to make imprecise selections of text or numbers, while my fingers partially blocked my view.

I guess it’s a matter of choosing the right application, and I think Apple should get a lot of credit for recognizing that, then creating and refining the touch UI. They’ve done this so thoroughly that it’s hard for me to see which gestures and other touch elements are inspired innovation and which are simply taking advantage of innate human wiring.

It seems clear that touchscreens for certain UI tasks are a win, so what about RF instruments?

The hardkey/softkey interface in analyzers has proven to be remarkably durable, but it’s about 35 years old. That’s ancient compared to everything happening behind the front panel. Sure, it was a great step up from analog switches, knobs and buttons as instruments evolved to digital control of measurement parameters in the 1970s.  The hardkey/softkey interface leveraged the display paradigm of terminals and other computer displays, as analyzers gained computing power and signals—and corresponding measurements—became much more complex.

Instruments have continued to borrow UI schemes from the world of computers. Looking back about 15 years, I can still remember the first PC-based vector signal analyzer and the novelty of changing center frequency, span and display scaling by clicking and dragging a window with the mouse. On the other hand, clickable hot spots and adjusting parameters with the scroll wheel felt completely natural from the start. Just point to what you want to change and tweak it.

Touchscreens have been used on some RF signal analyzers in recent years, though the early ones were often the less-sensitive—and very slightly blurry—resistive types, and all the ones I’m familiar with were single-touch. Borrowing liberally from tablets and phones, Keysight has recently introduced a new line of X-Series signal analyzers with multi-touch screens whose diagonal is nearly 70% larger than that of their touch-less predecessors, giving nearly three times the screen area.

The new X-Series signal analyzers include much larger displays and a multi-touch UI. On-screen buttons and hot spots are sized for fingertips, providing direct access to measurement settings and bringing measurements within two touches. Two-finger gestures such as pinching or spreading can change display scaling or parameters such as frequency center and span.

Touchscreens are cumbersome unless the rest of the UI does its part with control features compatible with the size of a finger. In the X-Series the larger display was the first step. Some tasks are still easier to accomplish with hardkeys and a knob, and perhaps that will never change.

User menus have previously been available in signal analyzers, though they weren’t used very often—and perhaps it was because they weren’t easy to set up. Today, the ease of creating and accessing them in the X-Series analyzers with touch may change that. It’s a feature worth exploring if you’re working with the same group of settings over and over again.

I’ve written a lot before about making better measurements. A multi-touch UI, if done right, should be a way to instead make measurements better. Of course, whether they’re better for you is a matter of preference and the needs of your application. If you get the chance, give these new analyzers a try.


Vexing Connections

   RF sins that may not be your fault

Many years ago, I knew an engineer who used to chuckle when he used the term “compromising emanations” to describe unintentional RF output. Today, most of us use the less colorful term “interference” to refer to signals that appear where they should not, or are more powerful than regulations allow. We’re likely more concerned about coexisting in the RF spectrum or violating a standard than we are about revealing something.

Wireless systems seem infinitely capable of generating unintended signals, and one of the more interesting is the rusty bolt effect.

A rusty bolt can form a metal-to-metal oxide connection that is rectifying rather than simply resistive. (Image from Wikimedia Commons)

I recently ran across a discussion of this when looking into the causes and consequences of imperfect connections in RF systems. Though I’ve previously written about connections of various kinds, including coaxial connectors, cables, adapters and waveguide, I’ve focused more on geometry and impedance than metal-to-metal contact.

Dealing with the wrong impedance is one thing, but for some time I’ve wanted to better understand why so many bad electrical contacts tend to be rectifying rather than Ohmic. Not surprisingly, it involves semiconductors. Accidental semiconductors, but semiconductors nonetheless.

Some oxides are conductive and some are insulating, but a number of common metal oxides are semiconductors. Oxidation or other corrosion—say from skin oils—makes it easy to produce a metal-to-semiconductor contact and a resulting nonlinearity.

Voltage/current curves for Ohmic and rectifying contacts. The nonlinear curve of a rectifying contact is essentially that of a diode.

Nonlinear connections are problematic in wireless, primarily because of the RF distortion products they produce. In the simple sinewave case, they create energy at harmonic frequencies, and when multiple signals are present they produce intermodulation distortion. The intermodulation distortion is particularly troublesome because it can appear in-band or in nearby bands, and at many frequencies at once.
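To see where those intermod products land, here’s a quick NumPy sketch (my own illustration; the polynomial coefficients are invented stand-ins for a diode-like contact): two closely spaced tones pass through a weak square-plus-cubic nonlinearity, and third-order products appear just below and above the original pair.

```python
import numpy as np

fs = 1.0e6                        # sample rate, Hz (arbitrary for this example)
t = np.arange(0, 0.01, 1 / fs)    # 10 ms of signal
f1, f2 = 100e3, 110e3             # two closely spaced tones, Hz
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Hypothetical weakly nonlinear contact: y = x + a2*x^2 + a3*x^3
a2, a3 = 0.05, 0.02
y = x + a2 * x**2 + a3 * x**3

# The cubic term puts energy at 2*f1-f2 (90 kHz) and 2*f2-f1 (120 kHz),
# right next to the original tones -- the classic third-order intermod pair.
spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
for f in (2 * f1 - f2, 2 * f2 - f1):
    idx = np.argmin(np.abs(freqs - f))
    print(f"{f / 1e3:.0f} kHz intermod level: {20 * np.log10(spectrum[idx]):.1f} dB (relative)")
```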

Modern multi-channel systems, including base stations and carrier-aggregation schemes, create many simultaneous signals to “exercise” these nonlinearities and create distortion products. The distortion may be described as passive intermodulation (PIM) because it’s generated without powered elements. The rusty bolt example involves high currents through imperfect antenna or metal structure connections, though wireless systems offer many other opportunities for nonlinear mischief.

One of the most maddening characteristics of this phenomenon is its elusive nature. Outdoor antennas are subject to strain from wind and temperature changes as well as weathering from salt air or acid rain. Nonlinearities can appear and disappear, seemingly at random. Even indoor wireless transmitters have to contend with mechanical stresses, changing humidity and temperature, and contamination of all kinds.

In many cases, astute mechanical design and mitigation of oxidation or contamination will help eliminate nonlinear connections. Because Ohmic metal-to-semiconductor connections are essential to their products, semiconductor manufacturers are a good source of information and techniques.

At some point, of course, you need to make spectrum measurements to find intermodulation problems or verify that emissions are within limits. Signal analyzers do the job well, and many measurement applications are available for popular standards to automate setup, perform measurements, and provide pass/fail results. They’re the most efficient way to ensure you avoid sins that you’d rather not be blamed for.

 


All Our RF Sins Exposed

  Trespassing is harder to miss in a densely occupied country

The 802.11ah wireless standard mentioned in my last post is promising, but it highlights a challenge that’s facing many engineers in the wireless space: out-of-band or out-of-channel emissions.

In an article from Electronic Products via Digi-Key’s article library, Jack Shandle writes: “Two significant design issues in the 915-MHz band are: 1) The third, fourth, and fifth harmonics all fall in restricted bands, which imposes some design constraints on output filtering. 2) Although it is unlicensed in North America, Australia and South Korea, the band is more strictly regulated in other parts of the world.”

Of course, the higher allowed transmit power and improved propagation of the 915 MHz band—compared to the 2.4 GHz band—add to the potential for interference. But these days, your harmonic and spurious emissions don’t have to fall in restricted bands to be a concern. Compared to previous eras, the modern wireless spectrum is so crowded that excess emissions are far more likely to cause someone a problem and be noticed. Wireless standards are correspondingly stringent.

For RF engineers, the interference challenges exist in both the frequency and time domains, and this can make problems harder to find and diagnose. The time-domain concerns are not new, affecting any TDMA scheme—including the one used in my 20-plus-year-old marine VHF handheld. Using Keysight vector signal analyzers, I long ago discovered that the little radio walked all over a dozen channels in the first 250 ms after each press of the transmit key. A newer handheld was actually worse in terms of spectrum behavior, but settled down more quickly.

Back then, that behavior was neither noticed nor troublesome, and I don’t suppose anyone would complain even today. However, that quaint FM radio is nothing like the vast number of sophisticated wireless devices that crowd the bands today. Even a single smartphone uses multiple radios and multiple bands, and interference is something that must be discovered and fixed at the earliest stages to reduce cost and risk.

Given the dynamic nature of the signals and their interactions, gaining confidence that you’ve found all the undesirable signals is tough. Using the processing power of today’s signal analyzers is a good first step.

This composite real-time spectrum analysis (RTSA) display shows both calculated density and absolute spectrum peaks. Real-time spans of up to 500 MHz are available, letting you know you’ve seen everything that happened over that span and in that measurement interval.

Though RTSA makes the task easier and the results more certain, RF engineers have been finding small and elusive signals for many years. Peak-hold functions and peak detectors have been available in spectrum analyzers since the early days and they’re effective, if sometimes time-consuming.

Minimizing noise in the measurement is essential for finding small signals, but the traditional approach of reducing RBW can make sweep times unreasonably long. Fast-sweep features and noise subtraction are available in some signal analyzers, leveraging signal processing to expand the speed/performance envelope. Keysight’s noise floor extension is particularly effective with noise-like signals such as digital modulation.
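The tradeoff is easy to put numbers on. This back-of-the-envelope Python sketch (my own; the noise density and proportionality constant are assumed, not any specific analyzer’s specifications) shows the noise floor falling 10 dB per decade of RBW while classic swept-measurement time grows roughly as span divided by RBW squared:

```python
import math

span_hz = 1e9          # 1 GHz span (example value)
danl_1hz_dbm = -150.0  # assumed noise density referred to 1 Hz RBW, dBm/Hz
k = 3.0                # rough swept-analysis proportionality constant (assumed)

for rbw_hz in (1e6, 100e3, 10e3, 1e3):
    noise_floor_dbm = danl_1hz_dbm + 10 * math.log10(rbw_hz)  # noise in that RBW
    sweep_time_s = k * span_hz / rbw_hz**2                    # approximate sweep time
    print(f"RBW {rbw_hz / 1e3:6.0f} kHz: noise floor {noise_floor_dbm:6.1f} dBm, "
          f"sweep time ~{sweep_time_s:8.3f} s")
```

Each factor-of-ten reduction in RBW buys 10 dB of noise floor but costs a factor of one hundred in sweep time, which is exactly why fast-sweep processing and noise subtraction are so welcome.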

Of course, finding harmonic and spurious emissions is only half the battle. A frequency reading may be all you need to deduce their origin, but in many cases you need more information to decisively place blame.

In addition to frequency, the most useful things to know about undesirable signals are their spectral shape and timing. That means isolating the suspects and relating them to the timing of other signals. One traditional approach is a zero-span measurement, centered on the signal of interest. It’s a narrow view of the problem but it may be enough.

Far more powerful tools are available using the memory and processing power of today’s signal analyzers. Frequency-mask triggering is derived from RTSA and can isolate the signal for display or generate a trigger for complete capture and playback of signals. Signal recording is usually done with the 89600 VSA software and can include capture of events that occurred before the trigger.

For even more complex time relationships, the VSA software borrows from oscilloscopes to provide a time-qualified trigger for RF engineers. Command of both the time and frequency domains is the most effective path to interference solutions.

If you don’t have these advanced tools, you can add them to existing signal analyzers with a minimum of fuss. With your RF intuition and good tools, interference has no place to hide.


Turning Back RF Technology to Take a Step Forward

  New activity on familiar terrain

High-profile developments in wireless networking usually involve ever-wider bandwidths and ever-higher operating frequencies. These support the insatiable need for increased wireless capacity, and they parallel mobile developments such as 5G cellular. And if the prognosticators are correct, more users generating more traffic will compete with increasing traffic from non-human users such as the Internet of things (IoT) and machine-to-machine (M2M) communications.

The increasing need for data capacity is undeniable, but the focus on throughput seems a bit narrow to me. I’m probably a typical wireless user—if there is such a thing—and find myself more often dissatisfied with data link availability or reliability than with capacity.

For example, Wi-Fi in my family room is mostly useless when the microwave in the kitchen is on. Sure, I could switch to a 5.8 GHz wireless router, but those signals don’t travel as far, and I would probably relocate the access point if I made the change. Another example: The 1.9 GHz DECT cordless phone in the family room will cover the front yard and the mailbox, but the one in my downstairs office won’t. A phone doesn’t demand much data throughput for voice, but it must provide a reliable connection. Yes, I can carry my mobile phone and forward to it, but I sometimes appreciate the lack of a tether.

I often think about the digital cordless phone I had a dozen years ago, operating on the 900 MHz ISM band with a simple 12-bit PN code for spread spectrum. Its range was hundreds of yards with obstructions and over half a mile in the open.

I’ve been reading a little about the proposed new 802.11ah wireless networking standard in that same 900 MHz band, and thinking about the implications. Two important technical factors are the limited width of the band—902 to 928 MHz—and improved signal propagation compared to the 2.4 and 5.8 GHz bands. In the technical press you’ll frequently see a diagram similar to this one:

Lower frequencies generally propagate better, and the difference can be significant in terms of network coverage in a house or office space. Of course, practical range depends on many other factors as well.

The diagram is certainly oversimplified, in particular neglecting any band-crowding, interference or obstruction issues. Nonetheless, the potential range benefits are obvious. Some claim that real-world distances of more than a kilometer are feasible, and the 900 MHz band may allow higher effective transmit power than 2.4 or 5.8 GHz.
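The free-space numbers show where that advantage comes from. Here’s a quick Friis-style path-loss comparison in Python (my own illustration; it ignores obstructions, antenna gains, and the band-specific power limits):

```python
import math

def free_space_path_loss_db(freq_mhz, dist_km):
    """Free-space path loss in dB for frequency in MHz and distance in km."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.44

dist_km = 0.05  # 50 m, a plausible through-the-house distance (assumed)
for f_mhz in (915, 2440, 5800):
    loss = free_space_path_loss_db(f_mhz, dist_km)
    print(f"{f_mhz:5d} MHz: free-space loss at 50 m = {loss:.1f} dB")
```

All else being equal, 915 MHz starts out roughly 8.5 dB ahead of 2.4 GHz and about 16 dB ahead of 5.8 GHz, before the practical factors mentioned above come into play.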

Throughput, however, is modest compared to other WLAN standards. Data can be sent using a down-scaled version of the 802.11a/g physical layer for data rates ranging from 100 Kb/s to more than 25 Mb/s. Significantly, the standard supports power-saving techniques including predefined active/quiescent periods.

As usual, the standard has many elements, supporting a variety of potential uses, and a few are likely to dominate. Those mentioned most often relate to IoT and M2M. Compared to existing Wi-Fi, 802.11ah should be better optimized for the required combination of data rate, range and power consumption.

Although that presumption seems reasonable, recent history tells us that attractive combinations of spectrum and PHY layer will be bent to unanticipated purposes. I think there are many situations in which users would be happy to trade transfer speed for a high-reliability link with longer range.

From an RF engineering standpoint, improved propagation is a double-edged sword. Current WLAN range relates well to the scale of homes and small businesses, naturally providing a degree of geographic multiplexing and frequency reuse due to lower interference potential. The combination of propagation and transmit power in the narrower 900 MHz band will change tradeoffs and challenge radio designers.

The 802.11ah standard is expected to be ratified sometime in 2016, and RF tools are already available. Keysight’s WLAN measurement applications for RF signal generators and signal analyzers already support the standard, and vector signal analysis is supported with pre-stored settings in the custom OFDM demodulation of the 89600 VSA software Option BHF.

With established alternatives ranging from ZigBee to 802.11ac, some are skeptical about the success of this effort in the relatively neglected 900 MHz ISM band. It’s a fool’s errand to try to predict the future, but it seems to me this band has too much going for it to remain under-occupied.


Today’s R&D Comparison: RF Engineering and Rocket Fuel

  An engineering field where the products you’re designing are constantly trying to kill you

The holiday season is here and it’s been a while since I have wandered off topic. I hope you’ll indulge me a little, and trust my promise that this post will bring some insight about the business we’re in, at least by happy contrast.

It’s a good time of the year to reflect on things we’re thankful for, and in this post I’d like to introduce you to a book about a fascinating field of R&D: developing rocket fuel. Compared to work on rocket fuel, our focus on RF technology has at least two major advantages: longevity and personal safety.

Let’s talk first about safety and the excitement engendered by the lack of it. Robert Goddard is generally credited with the first development and successful launch of a liquid-fueled rocket. Here he is with that rocket, just before its launch in March of 1926.

Robert Goddard stands next to his first liquid-fueled rocket before the successful test flight of March 16, 1926. The combustion chamber is at the top and the fuel tanks are below. The pyramidal structure is the fixed launch frame. (photo by Esther Goddard, from the Great Images in NASA collection)

In my mind, the popular image of Goddard has been primarily that of an experimenter, skewed by the footage we’ve all seen of his launches. In reality, he was also a remarkable theoretician, very early on deriving the fundamental parameters and requirements of atmospheric, lunar, and interplanetary flight.

He also showed good sense in choosing gasoline as his primary rocket fuel, generally with liquid oxygen as the oxidizer. This may seem like a dangerous combination, but it was tame compared to what came just a few years later.

That brings me to the fascinating book about the development of liquid rocket fuels. The author is John D. Clark, a scientist, chemist, science/science-fiction writer, and developer of fuels much more exotic than those Goddard used. The introduction to the book was written by author Isaac Asimov and it describes the danger of these fuels very well:

There are, after all, some chemicals that explode shatteringly, some that flame ravenously, some that corrode hellishly, some that poison sneakily, and some that stink stenchily. As far as I know, though, only liquid rocket fuels have all these delightful properties combined into one delectable whole.

Delectable indeed! And if they don’t get you right away, they’re patient: it’s no surprise that many of these fuels are highly carcinogenic.

The book is titled Ignition! An Informal History of Liquid Rocket Propellants. It was published in 1972 and is long out of print, but a scan is available at the link. Fittingly, the book opens with two pictures of a rocket engine test cell, before and after an event called a “hard start.” Perhaps rocket engineers think the term “massive explosion” is too prejudicial.

For many spacecraft and missiles, the most practical fuels are hypergolic, those that burn instantly on contact, requiring no separate ignition source. Clark describes their benefits and extravagant hazards in the chapter “The Hunting of the Hypergol.” The suit on the technician in this picture and the cautions printed on the tank give some idea of the potential for excitement with these chemicals.

Hydrazine, one part of a hypergolic rocket-fuel pair, is loaded on the Messenger spacecraft. The warnings on the tank note that the fuel is corrosive, flammable, and poisonous. The protective gear on the technician gives some idea of the dangers of this fuel. (NASA image via Wikimedia commons)

Clark is a skilled writer with a delightful sense of humor, and the book is a great read for holiday downtime at home or on the road. However, it is also a little sad to hear that most of the development adventure in this area came to an end many years ago. Clark writes:

This is, in many ways, an auspicious moment for such a book. Liquid propellant research, active during the late ’40s, the ’50s, and the first half of the ’60s, has tapered off to a trickle, and the time seems ripe for a summing up, while the people who did the work are still around to answer questions.

So in addition to being thankful that we’re doing research on things that aren’t constantly trying to kill us, we can also be grateful for a degree of career longevity. RF/microwave engineering has been a highly active field for decades and promises to continue for decades more.

Plus, while we give up a certain degree of excitement, we don’t need to wear a moon suit to prepare for tests, and we don’t need to run them from a concrete bunker.


Analyzer Upgrades: Going Back in Time and Changing Your Mind

  A practical way to revisit decisions about bandwidth and frequency range

No matter how carefully you consider your test equipment choices, your needs will sometimes evolve beyond the capabilities you’ve purchased. You may face a change in standards or technology, or the need to improve manufacturing productivity or margins. Business decisions may take you in a different direction, with some opportunities evaporating and new ones cropping up.

The one thing you can predict with confidence is that your technological future won’t turn out quite the way you expect. Since test equipment is probably a significant part of your capital-asset base, and your crystal ball will always have hazy inclusions, you have to find the best ways to adapt after the fact.

Analyzer software and firmware can certainly help. New and updated measurement applications are often available, tracking standards as they evolve. Firmware and operating-system updates can be performed as needed, though they’re likely more difficult and sometimes more disruptive than just installing an app.

In some cases, however, the new demands may be more fundamental. The most common examples are increasing measurement bandwidth and extended frequency range, both being recurring themes in wireless applications.

Of course, the obvious solution is a new analyzer. You get a chance to polish your crystal ball, make new choices, and hope for the best. Unfortunately, there is not always capital budget for new equipment, and the purchase-and-redeployment process burns time and energy better spent on engineering.

If the analyzer is part of a modular system, it may be practical to change individual modules to get the capability you need, without the expense of complete replacement. Of course, there are still details like capital budget, management of asset numbers and instrument re-calibration.

One approach to upgrading instrument fundamentals is sometimes called a “forklift upgrade,” a term borrowed from major IT upgrades requiring actual forklifts. In the case of test equipment, it’s a tongue-in-cheek reference to the process of lifting up the instrument serial-number plate and sliding a new instrument underneath. For instruments not designed to be upgradable, this term applies pretty well.

Fortunately, the forklift upgrade reflects a prejudice that is out of date for analyzers such as Keysight’s X-Series. Almost any available option can be retrofitted after purchase, even for analyzers purchased years ago.

Fundamental characteristics such as analysis bandwidth, frequency range, and real-time spectrum analysis (RTSA) can be added to Keysight X-Series signal analyzers at any time after purchase. This example shows a 160 MHz real-time display and a frequency-mask trigger (inset) on an instrument upgraded from the base 3.6 GHz frequency range.

Upgradability is part of the analyzer design, implemented in several ways. The internal architecture is highly modular, including the RF/microwave front end, IF digitizing, and DSP. The main CPU, disk/SSD memory, and its external digital interfaces are directly upgradable by the user.

For RF engineers, this is the best substitute for time travel. Hardware upgrades include installation, calibration, and a new warranty, with performance specifications identical to those of a new instrument.

There are organizational and process benefits as well, avoiding the need for new instrument purchase approvals and changes in tracking for asset and serial numbers.

If the decisions of the past have left you in a box, check out the new application brief on analyzer upgrades for a way out. If the box you’re in looks more like a signal generator, Keysight has solutions to that problem too.


About

My name is Ben Zarlingo and I'm an applications specialist for Keysight Technologies.  I've been an electrical engineer working in test & measurement for several decades now, mostly in signal analysis.  For the past 20 years I've been involved primarily in wireless and other RF testing.

RF engineers know that making good measurements is a challenge, and I hope this blog will contribute something to our common efforts to find the best solutions.  I work at the interface between Keysight’s R&D engineers and those who make real-world measurements, so I encounter lots of the issues that RF engineers face. Fortunately I also encounter lots of information, equipment, and measurement techniques that improve accuracy, measurement speed, dynamic range, sensitivity, repeatability, etc.

In this blog I’ll share what I know and learn, and I invite you to do the same in the comments.  Together we’ll find ways to make better RF measurements no matter what “better” means to you.
