
How Does WiFiRanger Make Starlink Better?

In our last blog we provided an overview of how the Starlink network consists of thousands of satellites each coming into view of your antenna for only a few minutes at a time.  It sounds like a marvelous system, and it is pretty good, but that doesn’t mean it is perfect.

Have you considered what happens when one Starlink satellite is moving out of your view and another is coming into it?  Yes, the system must direct your receiver to change the phased array antenna it is using and to use another one instead so it can point to a different satellite.  Does all this happen instantly?  Virtually nothing in the world happens instantly.  Control signals must be sent to your receiver/antenna and it has to change where it “points.”  Everything takes a certain amount of time!

In fact, we can estimate that it takes a couple of tenths of a second for this to happen.  One reason we can make this guess with some confidence is that your Starlink connection exhibits numerous brief interruptions on the order of a couple of tenths of a second in duration.  This information is easily obtained from the Starlink app, as shown in Figure 1.  Notice that these interruptions occur every couple of minutes.
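For the curious, the arithmetic behind those figures is simple enough to jot down. This is a back-of-the-envelope sketch using the rough numbers above (a ~0.2 second gap every couple of minutes), not official Starlink specifications:

```python
# Back-of-the-envelope downtime estimate from the figures in the text:
# a ~0.2 s interruption roughly every 2 minutes of connected time.
outage_seconds = 0.2      # estimated length of each satellite-handoff gap
interval_minutes = 2.0    # estimated time between handoffs

handoffs_per_hour = 60 / interval_minutes
downtime_per_hour = handoffs_per_hour * outage_seconds  # seconds of outage per hour

print(f"{handoffs_per_hour:.0f} handoffs/hour, ~{downtime_per_hour:.0f} s of outage/hour")
# With these estimates: 30 handoffs/hour, ~6 s of outage/hour
```

Six seconds an hour sounds trivial, but as the next paragraphs explain, it depends entirely on what you are doing during those six seconds.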

Being a bit of a geek, your author asked Starlink what caused these interruptions, and I was told that these are caused by the switching of my connection from one satellite to the next.

Figure 1—Brief Starlink outages occur every few minutes

So, who cares if there is an outage of a couple of tenths of a second?  In fact, you may not care if your Starlink usage is primarily for streaming video from major sources such as Netflix or YouTube TV.  Most, but not all, programming on those services is heavily buffered with stored video data, sometimes with as much as 10 minutes of buffer.  That makes you pretty immune to brief outages, doesn’t it?

It sort of does until you choose to watch a live sporting event or even your local news broadcast.  It can also affect you if you participate in lots of Zoom meetings or use your internet connection for video phone calls.  Those can freeze, lose lip sync and more as a result of these minor interruptions.

In addition, Starlink has random outages of several seconds or more most days.  They aren’t very frequent, but they can be very annoying if you are trying to use Starlink to run a business or teach a class. Figure 2 is a list of such interruptions on my Starlink over the past 12 hours.  Most of them are caused by minor obstructions in the dish’s field of view from trees, etc., but a couple were caused by the Starlink network itself.  One additional thing to think about: when your Starlink connection pauses as the result of such interruptions, all the WiFi-enabled devices in your home or RV will stop functioning if their setup relies on the Starlink’s WiFi.

Figure 2–Longer interruptions result from obstructions and network issues

So, what can you do about this?  How can you minimize the effects these Starlink interruptions have on your use of the internet?  The answer is redundancy! If we could “combine” the Starlink data signal with one or more other internet sources, then we could minimize the impact caused by a Starlink signal interruption.

But isn’t the Starlink signal unique?  Not really! Even though Starlink’s satellite constellation is unusual, once a customer’s data signal is decoded by the Starlink receiver, it is no different from signals received through almost any sort of hotspot, phone, etc.  Starlink provides a router to connect to its receiver, but that router is rather basic and has a very limited set of user-adjustable parameters.

So how do we achieve redundancy?  Actually, it’s quite easy with a WiFiRanger router.  Ranger routers now have “native” support for Starlink which means that, if you have a rectangular Gen 2 Starlink you can put the Starlink router into “bypass” mode and connect by Ethernet to your Ranger (as long as it’s operating on firmware version 7.1.0b13 or later).  If you have a Gen 1 round Starlink or the new High Performance in-motion system, you can literally remove the Starlink router entirely and use your Ranger in place of it.  In both cases you will retain the full functionality of the Starlink app.  But even more importantly you will be able to use the Ranger in MultiWAN mode with your Starlink connection being combined with, for example, a cellular hotspot.

That’s how I use my Starlink with my WiFiRanger Aspen in combination with a Verizon hotspot.  I use the Ranger in Load Balance mode and the result is almost total elimination of any buffering caused by Starlink outages.  I could probably get similar results with a Hot Standby configuration, but I’m not confident that all the brief outages would be entirely eliminated.  With Load Balance, there is always a “fallback” connection regardless of whether or not your Starlink is connected.  My hotspot has an unlimited data allowance; if it didn’t have that I’d probably use Hot Standby to minimize its data usage.
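For the technically curious, here is a toy sketch of the difference between the two modes. This is purely illustrative; the link names and selection logic are hypothetical and are not WiFiRanger’s actual implementation:

```python
def pick_link(links, mode, counter=0):
    """Choose which WAN link carries the next connection.

    links:   list of (name, is_up) tuples, in priority order.
    mode:    "hot_standby" uses the highest-priority link that is up;
             "load_balance" rotates new connections across all up links.
    counter: a running connection count used for the rotation.
    """
    up = [name for name, is_up in links if is_up]
    if not up:
        return None  # no internet source available at all
    if mode == "hot_standby":
        return up[0]  # stay on the primary unless it's down
    return up[counter % len(up)]  # spread connections across all live links

# If Starlink drops during a handoff, a new connection still has somewhere to go:
links = [("starlink", False), ("verizon_hotspot", True)]
print(pick_link(links, "hot_standby"))  # falls back to the hotspot
```

The key point the sketch illustrates: in Load Balance mode some connections are always riding on the second link, so a Starlink blip only affects part of your traffic, while Hot Standby keeps the second link idle until Starlink actually fails.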

Furthermore, if you are in a location where Starlink reception is impossible due to tree cover or similar issues, the Ranger’s MultiWAN will continue to provide a secure local network with internet access even though the Load Balance would have only one source.  As a result, your Alexa or similar smart devices won’t have to be reprogrammed since they would still be set to connect to the Ranger’s WiFi which won’t have changed.  In today’s world of interconnected devices, I sure wouldn’t want to have to reprogram all my WiFi-enabled devices just because I couldn’t receive Starlink at a particular campground!

Similarly, if you’re an RVer using the Starlink app’s newly available sleep schedule to reduce your nighttime power consumption, a MultiWAN with your cellular hotspot and Starlink will permit you to keep your internet access available for off-hours updating even when you let your Starlink “snooze” to reduce the drain on your batteries.

The bottom line is that Starlink is amazing, but even the most amazing thing can often be improved upon.  WiFiRanger enables you to make your Starlink even better!


How does Starlink Work?

By now many RVers and lots of non-RVers are familiar with the name Starlink, associated with an internet system created by the SpaceX company. Most folks are also probably aware that it involves satellites that connect you to the internet. But beyond that, I’m not sure how much the general public actually knows about how the Starlink system works. In social media one encounters posts stating that Starlink is the best thing that has happened to the internet as well as posts stating that it is nothing but overblown hype. The purpose of this blog is to try to explain a bit about how Starlink works. Subsequent blogs will explain more about Starlink’s performance as an internet connection.

First of all, Starlink is unlike any other satellite-based communications system, and that includes DirecTV, Dish Network, Sirius-XM, etc. All of those systems utilize satellites placed in what are called geosynchronous orbits, more than 22,000 miles above the Earth’s surface. Satellites in “geo-sync” orbit appear to stay in place over locations on Earth. This doesn’t mean they are stationary, not at all. What it means is that they circle the Earth at the same rate at which the Earth’s surface rotates. Because a satellite orbiting at this rate holds its apparent position, it is pretty easy to aim a receiver/transmitter dish at one. Once you’ve pointed your dish, all you have to do is connect to it and do something like watch TV or connect to the internet using a service such as Hughesnet.

Although Starlink also uses satellites, they have little in common with the typically very large geosync ones. Starlink satellites are rather small and are placed in orbits only a few hundred miles above the Earth’s surface. The physics of orbital mechanics tells us that such “low Earth orbit” (LEO) satellites move much faster than the ground beneath them. To observers on the ground, they appear above one horizon and move across to the opposite horizon in a matter of tens of minutes. That means we can no longer just point our dish at a satellite that stays in place; we need to be able to somehow track its motion. Furthermore, we will need lots of satellites in orbit, because when one satellite goes below the horizon we don’t want to have to wait until it comes back into view after it has gone around the Earth. That would really mess up our internet connection! LOL
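If you’d like to check the math, Kepler’s third law, T = 2π√(a³/μ), gives the orbital period directly from the orbit’s radius. Here is a quick sketch; the 550 km (roughly 340 mile) altitude is a representative Starlink shell, not an official figure:

```python
import math

MU_EARTH = 398_600.4418   # Earth's gravitational parameter, km^3 / s^2
EARTH_RADIUS = 6_371.0    # mean Earth radius, km

def orbital_period_minutes(altitude_km):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / mu), where a is the orbit radius."""
    a = EARTH_RADIUS + altitude_km
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

leo = orbital_period_minutes(550)      # representative Starlink shell altitude
geo = orbital_period_minutes(35_786)   # geosynchronous altitude (~22,236 mi)
print(f"LEO: ~{leo:.0f} min per orbit, GEO: ~{geo / 60:.1f} h per orbit")
```

A LEO satellite circles the Earth in about an hour and a half, which is why any one of them crosses your sky in mere minutes, while a geosynchronous satellite takes almost exactly one day, which is why it appears to hang motionless.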

Starlink currently has thousands of satellites in orbit so that, at any given time, there is at least one in view over most of the Earth’s surface. As more satellites are launched, more and more of the Earth’s surface will have at least one in view at all times.

Figure 1 shows where the satellites were as this blog was being written.

So, if Starlink satellites are visible to us for only a few minutes at a time, how does our dish know how to get its signal to the correct one, and how can it follow it across the sky? Many of you will have seen Starlink dishes and will know that they don’t seem to move. If it doesn’t move, how does it track the satellites as they “fly” across the sky? That’s where some engineering “magic” comes into play. Those “flat” Starlink dishes aren’t just pieces of plastic facing the sky. Under that plastic are layer upon layer of tiny antenna arrays printed like circuit boards and stacked one atop the next. If you were to take apart a Starlink dish (something we definitely do not recommend you do), this “stack” of antenna arrays might all look alike. But they’re not! Each one is slightly shifted from the next so that each antenna array allows the system to look at a slightly different part of the sky. As a satellite moves across the sky, different “layers” of the antenna come into play in sequence. The net result is that the system remains focused on the satellite without the physical receiver moving at all! This is what people mean when they say that the Starlink antenna uses a “phased array.”
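For the technically inclined, the “magic” of steering a beam with no moving parts comes down to arithmetic: each antenna element is fed a signal whose phase is offset from its neighbor’s by 2π·(d/λ)·sin(θ), where d is the element spacing, λ the wavelength, and θ the angle you want to point at. A minimal sketch of that arithmetic (the element count and spacing are made-up illustration values, not Starlink’s actual design):

```python
import math

def steering_phases(n_elements, spacing_over_wavelength, angle_deg):
    """Per-element phase offsets (radians) that tilt a linear array's beam.

    Element i is driven with phase i * 2*pi*(d/lambda)*sin(theta).  Changing
    the angle electronically re-points the beam with no moving parts.
    """
    theta = math.radians(angle_deg)
    step = 2 * math.pi * spacing_over_wavelength * math.sin(theta)
    return [i * step for i in range(n_elements)]

# Pointing straight up needs no phase shifts at all:
print(steering_phases(4, 0.5, 0))   # [0.0, 0.0, 0.0, 0.0]
# Tilting the beam 30 degrees just changes the numbers, not the hardware:
print([round(p, 2) for p in steering_phases(4, 0.5, 30)])
```

Updating a handful of phase values takes microseconds, which is how the dish can “follow” a satellite racing across the sky far faster than any motorized mount could.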

Phased array technology was developed by the US Department of Defense several decades ago as a way of eliminating the rotating radar antennas that were commonplace back then. Those rotating systems were used to track moving aircraft. Each time the antenna rotated around, it would take another “snapshot” of where an aircraft was, enabling you to determine its course. The Aegis radar system on Navy combat ships was one of the first major deployments of large-scale phased array radar: now stationary “panels” could track aircraft better than rotating antennas! The first operational Aegis system was deployed in 1983 on the guided missile cruiser USS Ticonderoga. Figure 2 shows two “panels” of the AN/SPY-1 radar at the heart of Aegis. Although a Starlink dish costs a lot less than the SPY-1 system, the operating principles are the same. Now you can begin to get a sense of why a Starlink dish is as expensive as it is.

Obviously, there’s a lot more to the Starlink system, such as figuring out which satellite is in view at your location and directing your receiver to switch to the next satellite as the one you are connected to starts to go out of sight. With 3,271 Starlink satellites in orbit as of November 2022, that’s a massive computing job in its own right. Now multiply that by the many hundreds of thousands of users, and that is quite a “bookkeeping” task! If you want to get a sense of how complex this is, go to this website and watch how the satellites orbit. Expand the map and find your location. Notice how different satellites come into range of your station. All of this goes on “in the background” whenever you use Starlink to access the internet. It sort of makes pointing a dish at a fixed satellite seem pretty easy!

The next installment of this blog will discuss how Starlink performs as an internet source.


I don’t care about the upcoming 3G shutdown… or should I?

I kept seeing articles explaining that the cellular carriers plan on shutting down their 3G networks later this year and my first inclination was to ignore them because I’m a “high tech” kind of guy who only has 4G hardware. But after I started doing a bit more reading, I began to understand that I could be affected by the shutdown even if all my phones, hotspots, tablets and watches are relatively new 4G/LTE devices.

Sunset is coming to 3G cellular. Image designed by Freepik

It turns out that lots of 4G devices are designed to use the 3G network as they “authenticate” (that is, when they log on). You might ask why this is the case, and the best answer I can come up with is “it was available and easy.” It’s not that doing this was wrong but, rather, that it didn’t consider the fact that eventually the 3G network would be phased out. The techies among us will explain that devices on the LTE network can be defined as “voice-centric” or “data-centric,” and some of those voice-centric devices are in danger of losing their connection to the cellular networks when the 3G network is disabled.

And when will the 3G networks be disabled? Well, for AT&T that’s going to happen on February 22! For Verizon and T-Mobile it will be closer to the end of 2022. So, if you’re an AT&T customer you don’t have a lot of time to act!

So how does anyone find out if their devices are affected by the 3G shutdown? One way is to ask the manufacturers of your devices. WiFiRanger is posting this blog as part of our effort to contact our customers to explain that the Quectel modems used in our routers potentially would be impacted by the 3G shutdown unless we took pro-active steps to prevent it from causing a problem with your Ranger and its modem. Of course, if you don’t have a modem in your WiFiRanger router you can ignore the rest of this blog; the 3G shutdown won’t affect your Ranger.

For those of you who do have Rangers with modems, we have created a pair of software “work-arounds” for the issue, and both of them are online. We’ve implemented them as “hot fixes,” which means that you don’t even have to download a new firmware update. All you must do is get your Ranger online and click on the Cloud Disconnected/Check for Updates link in the upper right corner of the control panel. Click on the link a couple of times until blue bars start to scroll. When they finish scrolling, if the link reads “Update Firmware,” you MUST update your Ranger’s firmware to the latest version, 7.1.0b11, which will upgrade the router and apply the “hot fix” as well. Otherwise, the link will read Check for Updates and your “hot fix” will have already been installed. Then use any one of the SAVE buttons on any page of the Ranger’s control panel and, finally, reboot your Ranger. Now, you’re finished!

The reason we needed two different hotfixes is that we currently have two “classes” of modems in use by customers. Most of you are using modems that allow your Rangers to be updated to the current 7.1.0b11 firmware, whereas some of you are using older modems that restrict your Ranger to the 7.0.8 firmware. Both hotfixes are applied in the same manner. If you refer to the “how to implement a hotfix” document linked to this blog, please note that the screenshots in it only show the 7.1.0b11 version.

We strongly urge you to go through this hot-fix process even if your RV has been laid up for the winter. Otherwise, if you have a Ranger with an AT&T SIM card in its modem next spring you could well find that your modem no longer will connect to the cellular network. Of course, you would still be able to implement our hot fix, but you would have to get your Ranger online with something other than its cellular modem.

However, the fact that we’ve just explained how to update your WiFiRanger to avoid the impact of the 3G shutdown doesn’t mean that you don’t own other devices that also require updating. We encourage you to check with the manufacturers of any device you own that has cellular connectivity to see if updates are required. This includes phones, watches, tablets, your cars and even some smart machines. The issue is far more widespread than most of us would have thought.

(A step-by-step process to apply the 3G Sundown HotFix can be found here.)


I live close to a tower — why do I have such poor cell service??

We continually hear people complain about their cell service being slow, weak or otherwise unacceptable even though they live near a cell tower.  Of course, the simplest response is to question whether the tower they can see is actually used by the cellular carrier they have an account with.  Even if you can get close to the fence around a tower, you can’t always tell from the signage which carrier(s) are using it.  Towers are often owned by third parties, and any particular carrier may or may not be broadcasting from any particular tower.

Assuming that you are correct and the tower near your location has an antenna for the carrier you use, the next question is whether the beam from that antenna is pointed in your direction.  Although we may conceptually think of antennas as radiating in all directions, cell phone towers don’t operate that way.

Figure 1 is a screenshot from an Android app called Network Cell Info Lite.  The app is reporting how my Verizon Pixel 5 is connecting to the network.  For the moment we’re going to ignore the numbers at the top of the picture and will only look at the map. My location is shown as the blue dot and the tower I am connected to is to my south.  

Notice that there are two yellow circles with signal strength bars just to my north.  Those are Verizon-owned towers which the app has in its database.  But notice that I’m not connected to either of them even though they seem closer.  

Why not?  The answer most likely lies in the beam patterns of those antennas compared to the one I am connected to.  The antennas in the yellow circles are positioned to serve the towns of Rockport and Fulton, and their beam patterns are most likely optimized for the urban areas around them.  The tower my phone is connected to is probably optimized to serve State Highway 35, which lies just to the west of our location.  It makes sense that that tower would have an “elongated” footprint roughly parallel with the highway.

I happen to know from my own testing that my location is in a particularly poor spot because we are far enough away from the highway to be at the edge of the serving tower’s beam but too far from the in-town towers to receive their signals.

So now if you think you can guess the tower that serves you by knowing its beam pattern, you are only partly correct.  That’s only the first-tier decision process; the next step comes about when your phone or hotspot determines which of the towers in your area actually provides the best usable signal.  Notice that I didn’t say which tower provides the “strongest” signal.  Lots of people make the mistake of focusing solely on signal strength.  But that’s not always the same as which tower provides the best usable signal. 

To answer that question, we need to look at the “dials” and numbers at the top of the picture.  They depict the cell I am connected to (on the left) and the “neighboring cell” on the right.  That might seem odd; why should we care about the signal from a neighboring cell that we’re not connected to?  That’s because our phone is always “looking” for a better connection, and if it can find one in a neighboring cell it will shift our connection to that cell.  That’s what happens when you are driving along the highway; your phone is continuously checking the neighboring cells for signal strength and quality.  When it finds a better combination of the two, it will switch your connection to that cell.

Let’s now look at the dials at the top of the screenshot.  In the left dial the large number tells us that the signal strength is -104 dBm (decibels relative to one milliwatt).  In “cell phone lingo” that’s called the RSRP.  If you haven’t heard this before, decibels are a logarithmic measurement, and signal strength is measured in negative decibels, so a smaller negative number represents a stronger signal.  In our case the signal strength is -104 dBm, which is not a particularly good signal strength, but it’s what I have at my location.  But the reason that I don’t have much of a problem using that weak signal is embedded in that little “-10 dB” that you can see to the right of the -104.  That -10 dB is what is called the RSRQ, and it’s a measure of what engineers call the “signal-to-noise ratio.”  We’re not going to worry about how the RSRQ is calculated; we’re simply going to accept that an RSRQ of -10 dB is rather good.  Notice that just above the dial it notes that we are connected using Band 13.

Figure 2 is a table published by Quectel, a major manufacturer of cellular modems.  The table provides comparative “ratings” of RSRP and RSRQ values.  My -104 dBm RSRP value is weak, but my RSRQ of -10 dB is rated as excellent (a value equal to or greater than -10).  That makes it possible for my phone to provide excellent performance despite the weak signal.  Would I benefit from boosting the strength of the signal?  Maybe not, because the signal quality is already as good as it can be.  For those of us raised on analog signals, this is one of the oddities of working with digital signals; they only need to be “strong enough” to be quite usable.  Making them stronger doesn’t necessarily improve the situation.
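If you’d like to play with these ratings yourself, here is a small sketch. The cutoffs below are approximate values along the lines of commonly published tables (not an exact copy of Quectel’s figures), but they reproduce the two ratings discussed above:

```python
def rate_rsrp(rsrp_dbm):
    """Rough RSRP (signal strength) rating; thresholds are approximate."""
    if rsrp_dbm >= -80:
        return "excellent"
    if rsrp_dbm >= -90:
        return "good"
    if rsrp_dbm >= -100:
        return "fair"
    return "weak"

def rate_rsrq(rsrq_db):
    """Rough RSRQ (signal quality) rating; thresholds are approximate."""
    if rsrq_db >= -10:
        return "excellent"
    if rsrq_db >= -15:
        return "good"
    if rsrq_db >= -20:
        return "fair"
    return "weak"

# The readings from the connected cell discussed above:
print(rate_rsrp(-104), rate_rsrq(-10))  # weak excellent
```

A “weak but excellent-quality” reading like this is exactly the combination where a usable connection survives despite a low bar on the signal-strength dial.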

So now let’s look at the dial on the top right side of the first figure. It shows the RSRP and RSRQ for our neighboring cell.  We can see that the RSRP (the signal strength) is -112 dBm.  Because decibels are logarithmic, that 8 dB gap means the neighboring cell’s signal is more than six times weaker in raw power.  The neighboring cell is definitely weaker.  In addition, when we look at the RSRQ we see that it is -11 dB, which means that the signal-to-noise ratio is not quite as good as in the cell we are connected to.  In this instance, there is no question that we’ll stay connected to the cell we’re in.  But these numbers can and will vary over time, and every once in a while, even without moving my location, the connection will switch to the other cell.  Notice that in this case the evaluation of the signal was made using Band 2, compared to Band 13 in the first case.
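The decibel-to-power arithmetic behind that comparison is a one-liner: a difference of ΔdB corresponds to a power ratio of 10^(ΔdB/10). A quick sketch:

```python
def db_to_power_ratio(delta_db):
    """How many times stronger one signal is than another, given their dB difference."""
    return 10 ** (delta_db / 10)

# Connected cell at -104 dBm vs. neighbor at -112 dBm: an 8 dB gap,
# meaning the connected cell is roughly 6x stronger in raw power.
print(round(db_to_power_ratio(-104 - (-112)), 1))
```

This is also why the familiar rules of thumb work: 3 dB is about a factor of two, and 10 dB is exactly a factor of ten.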

If all of this seems a bit confusing, I can assure you that this has been an extremely simplified discussion of how your phone selects a tower and a band to use for your phone conversation or internet connection.  And think about the fact that it is constantly re-evaluating that “decision” multiple times per second in order to give you the highest speeds and best voice conversation quality possible.  So the next time someone says to you “how come my cell service is so poor even though I live next to a tower?” you can tell them that “there’s more to it than you might have thought!” 


How is a VPN like a phone booth?

A long, long time ago when people wanted privacy for their phone call, they might choose to use a phone booth. The booth created a “box” around their conversation giving them a reasonable degree of privacy. Today, phone booths are long gone but the need for privacy still endures. Today’s phone booth is a software construct known as a Virtual Private Network (VPN) which acts like a phone booth in that it builds an encrypted connection (think of that as like a long, protective box) around the internet “wires” carrying your conversation or data stream. Anyone trying to eavesdrop on your activity will find themselves up against a strong encryption algorithm which provides nearly unbreakable security.

A VPN works by routing your device’s internet connection through your chosen VPN’s private server rather than your internet service provider (ISP) so that when your data is transmitted to the internet, it comes from the VPN rather than your computer. The VPN acts as an intermediary of sorts as you connect to the internet, thereby hiding your IP address – the string of numbers your ISP assigns your device – and protecting your identity. Furthermore, if your data is somehow intercepted, it will be unreadable until it reaches its final destination.

VPNs can have fixed endpoints on both ends, something which might be used between multiple locations of a large business. In this sort of environment all data traffic between fixed locations is encrypted by the VPN. Individual users don’t realize they’re using a VPN because all traffic between those locations passes through it automatically.

Or a VPN might have your computer as one endpoint with the other being user-selectable depending on where your data stream is headed to or from. A number of major VPN providers such as ExpressVPN and NordVPN provide a selection of VPN endpoints all over the world. If you want to send a secure message to an associate in another city or even in another country, you can select a VPN endpoint near them to minimize the distance your unencrypted message will be exposed on the internet.

When you use a VPN that has an endpoint in something other than your company’s facilities, your data stream will look as if it originated at the endpoint, in that city. If the recipient of your message uses a “locate this IP” service, it will appear that your message originates in the city where the endpoint is located.

WiFiRanger routers all incorporate a built-in VPN which we call SafeSurf™. This is a VPN that originates at your Ranger and ends at the WiFiRanger servers in Idaho. If you ask, “what does that do for me?”, the answer is that it totally encrypts your data at your local campground or any other location where you are concerned about possibly being hacked. By the time your data enters the open internet, it is part of a large data stream going from our server to the internet “backbone,” the size of which makes intercepting a single data stream much more difficult.

So, if VPNs are so wonderful, why doesn’t everyone use one all the time? Well, as is often said about many things in life, “you don’t get something for nothing.” VPNs work exactly as I’ve described, but the encryption they provide means that more data bits have to be used. It’s as if the encrypted connection the VPN creates is built out of a web of data bits. Using extra data to create that encrypted connection means that your data stream might get slowed down, because some of the available capacity gets used for security rather than for your data. If we all had gigabit fiber connecting our homes and offices, maybe this wouldn’t be a concern, but for most of us, our more limited data capabilities mean that we can’t use VPNs indiscriminately. But when you need security, nothing beats a VPN.


Aggregation or Aggravation

Is Carrier Aggregation just more techno-babble aggravation?

A long time ago, in a galaxy far, far away, understanding radio broadcasts was pretty simple: just get the strongest possible signal and soon you’ll be able to listen to radio broadcasts from all over the country! Television added some complexity to this and taught us about things like “ghosts” but, still, increasing signal strength remained the primary objective.

Even when cellular service began some ~30 years ago, it was an analog transmission with “stronger is better” still being a key principle. However, in the mid-90’s cellular service transitioned to digital and this maxim was no longer operative.

With digital signals, “strong enough” became the underlying principle. If a signal is “strong enough,” further increases in strength usually didn’t buy much of an improvement. This is true not just for cellular signals, but also for HDTV, RV park WiFi, satellite TV, etc.

But this doesn’t mean that there’s no way to improve a signal, and that’s where today’s digital technology has brought us some exciting new options. Understanding this new technology isn’t nearly as easy as understanding that stronger signals are better than weaker ones.

Understanding Carrier Aggregation

In a previous blog we talked about how MIMO antenna technology makes it possible for a smart phone or hotspot to create multiple connections to a cell tower. It’s pretty easy to understand that multiple physical connections to a tower using multiple antennas are better than a single one. As we explained in that earlier blog, having more than one “pipe” makes it easier for more “stuff” to flow through them. But even though MIMO can result in significantly improved download speeds, the bandwidth of each of these multiple connections is the same as each of them would have been individually. Therefore, if each MIMO channel is 20 MHz wide, that’s the bandwidth you have no matter how many “pipes” you connect.

Carrier aggregation (CA) is like taking MIMO to the next level. Instead of multiple independent pipes, with carrier aggregation it’s as if all the stuff flowing through those pipes is all part of a much larger pipe. With respect to radio frequency signals, this means that the effective bandwidth of the aggregated signals is the sum of the bandwidth of each of the “pipes!” So, if each channel has a 20 MHz bandwidth, with 5-level carrier aggregation, the effective bandwidth of the resulting connection is 100 MHz.

Think of CA as if, in addition to the parallel pipes created by the MIMO antennas, there are now “virtual pipes” created by your phone or hotspot connecting to the tower on multiple frequencies. Essentially, what you end up with is a matrix of “pipes.” In one direction there are the multiple connections created using the MIMO antennas and in the other direction there are the multiple connections created by using multiple communications channels.

Visualize Carrier Aggregation

Carrier aggregation can be implemented using channels in the same cellular band (intra-band) or in multiple bands (inter-band). Within a band, aggregated channels can either be adjacent (contiguous) or not. Figure 1 provides examples of the various ways in which carrier aggregation can be implemented with channel widths ranging from 5 to 20 MHz:

Figure 1: Carrier Aggregation Models

Speeds of Carrier Aggregation and MIMO

MIMO and Carrier Aggregation (CA) work together to improve upload and download speeds. Both create multiple pathways between a cellular device and a cell tower. It’s a bit easier to see multiple antennas than it is to visualize how CA works, but Figure 2 shows how they work together:

Figure 2: Overview of Speeds with Carrier Aggregation & MIMO (Artiza Networks, 2021)

Although this illustration appears rather complex, it’s actually understandable if we examine it piece by piece. Let’s first look at just the section labeled 2×2 MIMO, shown in Figure 3:

Figure 3: Analyzing Speeds of 20 MHz Channel and 2×2 MIMO (Artiza Networks, 2021)

Without CA, a 2×2 MIMO system with a 20 MHz active channel will provide a theoretical maximum data rate of ~150 Mbps. However, by adding a layer of CA using a second 20 MHz channel, the maximum data rate rises to ~300 Mbps as shown in Figure 4:

Figure 4: Analyzing Speeds of two 20 MHz Channels (CA) and 2×2 MIMO (Artiza Networks, 2021)

As we increase either the complexity of our antenna system (MIMO) or the number of channels employed (CA), we can increase the effective maximum speed of the connection to the cellular tower. As currently defined, up to five 20 MHz CA channels can be combined to create an effective 100 MHz channel, which would provide a theoretical maximum of 750 Mbps. But if 8×8 MIMO were used instead of 2×2, this maximum increases to 3.0 Gbps! That’s the maximum effective speed of a 4G LTE connection, not a 5G one!
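All of the speed figures above follow from one rule of thumb: each 20 MHz LTE channel tops out at roughly 75 Mbps per MIMO stream. A quick sketch (theoretical peaks only; real-world speeds are far lower):

```python
def lte_max_mbps(mimo_streams, ca_channels, mbps_per_stream_per_channel=75):
    """Theoretical peak LTE downlink: streams x aggregated 20 MHz channels x ~75 Mbps."""
    return mimo_streams * ca_channels * mbps_per_stream_per_channel

print(lte_max_mbps(2, 1))  # 2x2 MIMO, one 20 MHz channel   -> 150 Mbps
print(lte_max_mbps(2, 5))  # 2x2 MIMO, five channels (CA)   -> 750 Mbps
print(lte_max_mbps(8, 5))  # 8x8 MIMO, five channels (CA)   -> 3000 Mbps (3.0 Gbps)
```

Plug in any combination of MIMO order and aggregated channels and you can reproduce every number in Figures 2 through 4.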

The Takeaways of Carrier Aggregation

So, what does all this mean to you? What it means is that there’s a lot of growth left in the 4G LTE system before we even consider transitioning to 5G. For example, my new Inseego M2000 hotspot has the capability to utilize 5 layer carrier aggregation with built-in 4×4 MIMO which gives it a maximum theoretical speed of ~1.5 Gbps using LTE! Sure, it could go faster if I could access 5G, but those speeds would be more than adequate for anything I would want to do. Similarly, with my Pixel 5 phone, the other day I measured a download speed >140 Mbps on 4G. The phone is capable of using 5G, but I won’t argue about 4G speeds like that!

If all of this stuff is technically baffling, you certainly aren’t alone. These are pretty complicated technologies being used. Fortunately, you don’t need to fully understand it in order to use it as a consumer!

I had set out to explain that the solution to obtaining better cell service is no longer just buying a bigger cellular amplifier. Amplifiers definitely still have their place in locations with very weak cell service; but now there are additional tools you can use to improve your cellular service, even when the signal is strong enough to not need an amplifier.

High on your list of tools ought to be the advanced modems and antennas built into the best of the new phones and hotspots. Unleashing the power of CA and MIMO does require newer hardware, but “newer” doesn’t always mean “very expensive.” Look for the detailed specs of devices you are interested in purchasing. Quite often the usual advertising materials focus on colors, cameras, and other things that don’t impact cellular performance. But if you dig, you can find out info on “the guts” of the device. What category modem does it use? What MIMO antennas are built in? What 5G service does it support? By becoming a knowledgeable consumer, you can identify high performance devices that don’t cost an arm and a leg! Yes, you can pay a thousand dollars for a high quality phone, but you can pay half of that and get one with virtually equal cellular performance.

References

Artiza Networks (retrieved 01/08/2021). DL Acceleration with CA and MIMO. https://www.artizanetworks.com/resources/tutorials/accelera_tech.html

Posted on

When is 5G Not Really 5G?

If you’re one of those consumers who likes to have the latest technology in your “toys,” the cellular carriers continue to make that more difficult as they periodically redefine the meaning of the term “5G”. As I noted in one of my earlier blogs, 5G isn’t really a single frequency band or a single technology. In reality, it is principally the “evolution” of cellular communications to the next level of technology. It does include some new frequency spectrum, but it also shares much of that with the 4G networks.

Traditional vs Dynamic Cellular Rollouts

In that earlier post, I noted that the term “5G” covers a rather broad distribution of frequencies from millimeter waves at the many tens of gigahertz all the way down to several hundred megahertz signals “reclaimed” from the old UHF TV band. This is an enormous range of frequencies and the characteristics of the “5G” that employs these frequencies varies dramatically over that range. To say it simply, the 5G implemented using Band 71 in the 617-698 MHz range will be rather different in performance from the 5G implemented in the region of 25 GHz, and it will require different technology to broadcast and receive it. 

How Carriers are Quickly Implementing 5G 

You may have noticed that in October Verizon made a big advertising splash about somehow increasing the size of its 5G network and when you can expect to be able to access it. You might have thought that Verizon had made a huge investment in new towers and hardware to bring this new capability to you. No, what Verizon had done was to redefine some of its 4G frequency spectrum to be 5G, something that is called Dynamic Spectrum Sharing (DSS). 

One of the big drawbacks of millimeter wave 5G is that it does require new towers and hardware and its characteristics result in needing many more towers per square mile of coverage area than does LTE 4G operating at <2 GHz. By using DSS, cellular carriers can provide a “version” of 5G using their existing 4G infrastructure. The benefits of DSS to the carriers are shown in the first figure. DSS implementation is faster and much cheaper for the carriers. 

Cellular Resource Allocation – Sourced from Nokia

So how do the carriers manage to combine two different technologies on the same towers? They accomplish this magic by “slicing and dicing” both time and frequency space as shown in the second illustration. It’s not essential for us to totally understand how this is done, but the key concept is that blocks of frequency and time are shared between the 4G and 5G signals. The concept is significant because it means that as 5G phones become increasingly available, they will find 5G signals readily available, because those signals have been “invisibly” integrated with the 4G signals that have been there all along.

Despite Verizon’s big splash about this last month, DSS isn’t something that is limited to Verizon. For the past year T-Mobile has been actively rolling out Band 71 in two phases, a 4G phase and a 5G one. T-Mobile’s acquisition of Sprint also gave it a large “chunk” of frequencies in the 2.5 GHz region, which is ideal for mid-band 5G implementation.

Similarly, AT&T had previously announced an expansion of its 5G coverage to include 28 additional cities using DSS technology, principally focused on the 850 MHz band. 

So, all three major carriers are now employing DSS to speed up 5G implementation and reduce their capital costs. Many phones currently being introduced into the market, such as the iPhone 12 and the Pixel 5, are equipped to receive both mm wave 5G and what we’ll call “DSS 5G”. 

Setting Real-world Expectations for the Near Future 

That should be a big win for the consumer, right? We’ll get 5G sooner than we expected to, right? Well, we will, and we won’t! DSS 5G will be an improvement over 4G, but it will not be the very high-speed technology improvement many of us have been waiting for. Part of the reason for this is simply physics; lower frequency signals can’t carry as much information content as higher frequency signals can. So, the information content of a 600-800 MHz DSS transmission can’t match that of a 25 GHz transmission. Furthermore, sharing the existing lower band structure between 4G and 5G creates no new bandwidth, so the information capacity of the network as a whole doesn’t increase. In fact, there’s a slight capacity decrease, because DSS requires both 4G and 5G signaling in the same band, and that signaling consumes a little of the capacity. So, if one is a bit cynical, they could say that the carriers are actually taking away some bandwidth from users so they can crow about deploying 5G in lots of places! To say it differently, everyone will have a bit less available bandwidth to share because some users will have some form of 5G to use!

Average 5G Download Speeds in US – Sourced from Opensignal

The 5G that consumers are going to see, at least for a while, is what we might call “5G Lite.” The following figure dates from January 2020 and shows the download speeds provided by several implementations of 5G technologies. Note that because the figure pre-dates the Verizon press release about its DSS activity, that information isn’t included. Notice that the bars labeled mm wave 5G provide the super-fast speeds that have been touted for 5G. However, the bars representing lower-band 5G (600 MHz and 850 MHz) provide only a modest improvement over currently available 4G speeds. The 5G implementation at 2.5 GHz is nearly as good as the mm wave examples because it utilizes some “new” spectrum which doesn’t have to be shared with 4G.   

Making Informed Decisions About Upgrading Hardware 

So, what does this all mean to us, the consumers? In my opinion, it means that you shouldn’t throw out your 4G phones and hotspots unless you have another reason to do so. For a number of years, the incremental benefit of switching to 5G-compatible hardware will be modest at best, and that’s assuming that you have 5G DSS service in your part of the country. As more mid-band (<6 GHz) 5G gets built out, overall network speeds will increase, but that’s not going to happen overnight. Millimeter wave 5G will continue to expand in urban areas where population density makes the investment worthwhile. But 4G service will continue to be the backbone of the rural cellular network for quite some time to come, and the routers and modems you purchase today are likely to have many years of service life before they are overtaken by technology advancement.   

 
References: 

5G Networks.net, 5G Dynamic Spectrum Sharing (DSS), 7/24/2020 

Chaim Gartenberg, Verizon announces its nationwide 5G network, The Verge, 10/13/2020 

Linda Hardesty, The 5G of T-Mobile, Verizon and AT&T all rank badly for different reasons, Fierce Wireless, 3/3/2020 

Posted on

How Load Balancing Makes Netflix and Chill Better

These days, it seems everyone is using more and more internet data so they can stream video. But if you’re like many RVers, you’re forced to access the internet much of the time using cellular connections that don’t have the high speeds you’re used to in the city. In rural America, where lots of RVers like to be, cell service speeds typically are in the ~5-15Mbps range. These speeds can be even less if you’re using a low-cost data plan that is “network managed,” which is “cellular speak” for: you are low in the queue with respect to network priority!

To combat this challenge, many RVers rely on multiple cellular connections with more than one carrier. But by themselves, those don’t resolve the problem. If you’re watching a Netflix video, it doesn’t matter that you have two or three cellular connections plus the RV park’s WiFi, unless you can use them all at the same time.

How Load Balancing Provides Multiple “Pipes”

In an earlier post, we provided an introduction as to how Load Balancing can enable you to use multiple cellular connections at the same time. With Load Balancing, it’s as if you have parallel “pipes” carrying data to your router. Even though the download speed of the data going through each of those pipes doesn’t change, the fact that there are more of them means that more data can flow to your router in each time interval. Think of the pipes as if there was water in them. If each pipe can provide 1 gal per minute, then, by having 5 of them, we can now get 5 gal per minute of water.

Visualization of Multiple Water / Data Pipes

So, let’s imagine that you have two internet connections, each providing 5Mbps of data (when they’re not being network managed). Even if you aren’t watching Netflix, you may have several people in the family surfing on laptops or iPads, or playing video games. Each person is capable of consuming several megabits per second in data. Don’t forget that Facebook and YouTube video are still video even if the screen size is small and the resolution is low.

It’s fairly easy to envision that having multiple internet “pipes” makes it easier for multiple people to engage in all these activities. It’s as if one pipe is providing data to mom, a second one to dad and, maybe a third to a teenage gamer.

A conceptual illustration of Load Balancing is shown in the following graphic which shows data calls going to all available internet connections in turn:

Example of Load Balancing Data Paths
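The round-robin idea in the illustration can be sketched in a few lines of Python. The link names and users below are made up for illustration; real routers also weigh link speed and health, not just turn order:

```python
from itertools import cycle

# Minimal round-robin Load Balancing sketch: each new data call is
# handed to the next available internet link in turn.
links = ["Verizon LTE", "AT&T LTE", "Park WiFi"]  # hypothetical links

def assign(requests, links):
    """Pair each request with a link, cycling through the links in order."""
    rotation = cycle(links)
    return [(req, next(rotation)) for req in requests]

for req, link in assign(["mom", "dad", "gamer", "netflix"], links):
    print(f"{req} -> {link}")
```

Because the fourth request wraps back around to the first link, every connection gets used, and no single link carries the whole household.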

The above concept of multiple data paths to multiple end users is fairly easy to understand. However, what’s a lot harder to understand is how Load Balancing helps when you want to watch Netflix. That’s because many of us envision a Netflix data stream as being just that, a continuous flow of data bits that comes down the wire to us. It turns out that’s hardly how Netflix gets video to us at all.

How Netflix Delivers Content

The first thing you need to understand is that giant Content Delivery Networks (CDNs) like Netflix’s are cloud-based, distributed systems that don’t exist in a particular location. So, when you click on a movie to watch, one of the first things the CDN determines is where you are located and what ground-based link is best suited for sending the data to you. If you’re enjoying yourself at the Grand Canyon, there’s no point in sending your movie through the internet from New York, for example.

Once the CDN decides where it’s going to send your movie from, it has to determine the speed of the data link you are using so it can send your video in a resolution your connection can handle. And it has to have available resolutions for both higher and lower speed links just in case your speed changes for better or for worse while you are watching.

Now, most of us have experienced the changes in resolution that Netflix (or other CDNs) can employ when your speed decreases. But what most of us don’t realize is that better CDNs, like Netflix’s, can even remove individual video frames in order to reduce the data flow! Resolution reduction is what happens when the reduced data rate can’t be accommodated by frame elimination and other tricks.

By now, some of you are wondering how all of this helps explain how Load Balancing makes it easier to watch a Netflix video. Well, the answer is that once Netflix has decided how much data you need (based on the data rate your connection can handle), it doesn’t send it as a continuous stream. Rather, it sends it to you by the bucketful, and that’s where our water analogy comes into play.

How Buffering Comes Into Play

You may not realize that the CDNs provide you a buffer of 5-10 minutes of video before you start watching. That’s why there’s always a pause before your video begins, no matter how fast your connection is. Your buffer is filling!

Once your buffer is full and you start watching, all the CDN cares about is keeping the buffer level full. Contrary to how we might envision it working, the CDN keeps your “bucket full” using all the data it can get delivered to you. Every time your buffer gets low, it issues a call for more data. If you think of your buffer as a bathtub, then you can see that having extra pipes will enable it to fill faster. All that matters is that the level in the tub fills faster than the water goes down the drain. If the tub gets empty, the data stream stops and you get the dreaded re-buffering symbol.
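Here’s a toy simulation of that bathtub model. All the numbers (pipe speeds, video bitrate, buffer target) are invented for illustration, but it shows why two slow pipes can sustain a stream that one alone can’t:

```python
# Toy "bathtub" buffer simulation. Each step is one second: the pipes
# refill the buffer while playback drains one second of video.
def simulate(pipes_mbps, video_mbps, buffer_s=0.0, target_s=30.0, steps=60):
    """Return True if playback survives `steps` seconds without rebuffering."""
    # Seconds of video gained per second of downloading, all pipes combined:
    fill_rate = sum(pipes_mbps) / video_mbps
    for _ in range(steps):
        if buffer_s < target_s:        # buffer below target: issue a data call
            buffer_s += fill_rate
        buffer_s -= 1.0                # one second of video played
        if buffer_s < 0:
            return False               # tub ran dry: the dreaded rebuffering!
    return True

print(simulate([4.0], video_mbps=5.0))       # one 4 Mbps pipe: rebuffers
print(simulate([4.0, 4.0], video_mbps=5.0))  # two pipes load-balanced: plays
```

One 4 Mbps pipe can’t keep up with a 5 Mbps stream, but two of them together refill the tub faster than the drain empties it.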

In the following graphic, you can see the buffer filling and then repeated “data calls” to keep it filled. Furthermore, you can see that Netflix is relying on three TCP data streams to provide the data. Hmm, did I just say three data flows? Doesn’t that seem analogous to our multiple pipes?

Bandwidth Graph of Netflix Buffer

Although I admit that the above graphic was obtained from an outside source (Singh, 2018), the following one was taken directly from my WiFiRanger Aspen’s real-time data utilization display. For simplicity, I have only shown one data source, but the general nature of the graph is the same. Because my data streams were much slower than those used in the first example, the initial fill period took longer. However, after the initial fill period, the spiky nature of the refill data calls is very similar to that in the first illustration. Although it took some time for the initial fill period to complete, the video had started playback early in the process. There is no doubt that Netflix realizes people are impatient and wouldn’t want to wait for their video to begin.

Bandwidth Graph of WiFiRanger During Netflix Buffer

The key point to understand is that a video “stream” is NOT a steady stream of data in sequential order.

Clearly, the more “pipes” you can use to contribute to this refilling process, the better your video stream will be.

In summary, by allowing you to use multiple data pipes to fill your “bucket,” Load Balancing makes it easier to maintain high-quality data streams even with a bunch of relatively slow-speed connections.


References:

Netflix – How It (Actually) Works: Insights for Network Managers, by Harneet Singh, Apr 12, 2018

Load Balancing Techniques and Optimizations, by Jason Potter, Posted on April 2, 2019

Posted on

MIMO, SU-MIMO, MU-MIMO, finding NEMO? It’s all buzzword bingo to me!

There are times when even the most “techie” of consumers begins to wonder if there’s any way of making sense out of the barrage of features available in the rapidly evolving world of communications. As soon as you think you understand something, it gets changed or replaced. Everything is a jumble of letters and numbers. First there was CDMA, then 4G/LTE, now 5G. And, of course, there’s 802.11b/g/n and ac! We’ve talked about these in previous blog posts. In this post I’m going to discuss a feature that goes by the acronym MIMO, which stands for Multiple Input/Multiple Output. You may have heard people saying that you “have to get MIMO antennas”; today we’ll talk a bit about what that means!

I’m sure that virtually everyone has, at one time or another, stared at the top of a cell phone tower and wondered why there were so many antennas clustered up there.  Why do they have to have so many antennas side by side?  We usually think about connecting to an antenna tower as if we drew a line from the top of the tower to our device.  But what if we could draw more than one line from a cellular tower to our device?  What if we could draw lines from our device to several of the antennas on the tower?  Could we get more data to flow between the tower and our device?

It’s easy to understand that if we were connecting water hoses from a water source to our RV we could get more water to flow if we connected several hoses in parallel.  Several hoses in parallel would act as if they formed a bigger pipe. 

It’s also easy to understand that if we were running electrical current through wires, we could safely pass more current through several wires than we could any single wire.

With digital radio signals the concept is similar, but the process is a lot more complicated.  If we had several antennas on the tower and several on our device, there’s no way to ensure that the signal from Antenna X on the tower gets to Antenna A on your device.  In fact, what Antenna A is actually going to see is a mixture of the signals from Antennas X, Y, Z, etc.  Likewise, Antenna B on our device is going to see a similar mixture of signals coming from each of the antennas that are broadcasting to you. 

Even if the exact same signal is transmitted by both antennas, what will be received by A and B is going to be a mix of all of that, and that mix will also be supplemented by reflected signals which may even have slight time delays. Quite often, what’s done is to broadcast the same signal using two different polarizations, as shown in Fig 1. Even though both polarizations contain the same information, from a signal processing perspective we can consider them to be two different data streams and use digital signal processing to separate them.

[WARNING—MATH ALERT!  This next section uses a little bit of algebra; if you’ve given up math for retirement, you are free to skip to the next section!]

As a simple example, let’s assume that the tower has two antennas broadcasting to you, and we’ll call them X and Y. We’ll assume that your phone has two antennas, and we’ll call them A and B. Mathematically, the signal seen by antenna A on your phone can be represented as:

Signal A(t) = Ax·X(t) + Ay·Y(t), where Ax and Ay are the strengths of the signals from antennas X and Y as seen by antenna A, all of which are functions of time (t)

Similarly, the signal seen by antenna B on your phone is going to look something like:

Signal B(t) = Bx·X(t) + By·Y(t)

For some of you these equations are going to bring back faint (painful?) memories from algebra because what we have in this example is nothing more than two equations with two unknowns.

Now the good news is that our little algebra course will end here—we’re not going to have to solve those equations ourselves.  But, thanks to modern signal processing techniques, our cellular modems do just that.  In fact, by solving these equations the two pairs of antennas on the tower and on your device can act as if they are two separate data transmission “pipes” so the amount of data you can receive in a given period of time is twice as much.

[This is the end of the MATH ALERT!]
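For the curious, here is a minimal Python sketch of the algebra above. All the numbers are invented for illustration; the point is only that two mixed signals plus known channel gains are enough to separate the two original data streams:

```python
# Toy version of the two-equations/two-unknowns example, at a single
# instant t. The channel gains ax, ay, bx, by are invented numbers; a
# real modem estimates them continuously from known "pilot" signals.
def solve2x2(ax, ay, bx, by, sig_a, sig_b):
    """Recover X(t) and Y(t) from the two mixed signals (Cramer's rule)."""
    det = ax * by - ay * bx                 # nonzero when the paths differ
    x = (sig_a * by - ay * sig_b) / det
    y = (ax * sig_b - sig_a * bx) / det
    return x, y

# Transmit X(t) = 1.0 and Y(t) = -2.0 through made-up channel gains:
ax, ay, bx, by = 0.9, 0.3, 0.2, 0.8
sig_a = ax * 1.0 + ay * -2.0                # mixture arriving at antenna A
sig_b = bx * 1.0 + by * -2.0                # mixture arriving at antenna B
print(solve2x2(ax, ay, bx, by, sig_a, sig_b))  # recovers ~(1.0, -2.0)
```

A real modem does this with more antennas and in the presence of noise, using far more sophisticated estimation, but the core idea is the same: more independent equations (antennas) allow more independent data streams.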

A simple MIMO setup with twin antennas as we’ve described is called a 2×2 MIMO (2 transmit antennas and 2 receive antennas) and such simple systems are now common on smartphones, tablets, hotspots, etc.  In fact, 2×2 MIMO is now being superseded by 4×4 MIMO on some newer devices and nothing prevents systems from having even more than four.

If all of this wasn’t complicated enough, there’s actually a bit of difference between how MIMO operates in urban environments compared with how it works in more rural ones. In a rural environment, multiple antennas on a cell tower essentially transmit the same signal but with coding differences (such as different polarizations) so they can be distinguished from each other. A cell phone with multiple antennas can receive these transmissions and “compare” them. By doing this, the “accuracy” of the received signal is improved, which results in an overall improvement in phone performance. This most basic use of MIMO is called “transmit diversity,” and it can enable phones to achieve fairly high speeds with relatively weak signals.

However, in a more urban environment, where there are more cell towers within range and more surfaces to cause reflections, different data streams can be transmitted from different antennas so that the data speeds achieved can be significantly higher than would be possible with a single data stream. MIMO operating in this manner is said to be using “spatial diversity.” For those of us who grew up in an analog broadcast world, an amazing aspect of spatial diversity MIMO is that it is actually beneficial to have reflections, the very things that used to cause “ghosts” on our old TV pictures. It’s the use of these reflected signals that enables MIMO to differentiate the signals coming from different antennas. Figure 2 illustrates how urban reflections can be used in the MIMO process. The red and purple signals travel on different paths and have different delays as a result. Using signal processing, both data streams are recovered, and the total speed can be twice or more the speed of either stream.

So how does all this affect how a phone performs?

Performance tests have demonstrated that going from 2×2 MIMO to 4×4 MIMO can give you improved wireless signal strength and speed. For example, some tests compared the iPhone XR to the iPhone XS. The two phones have the same wireless modem, but the XR has 2×2 MIMO whereas the XS has 4×4. When both phones were connected to a 4×4 MIMO LTE network, the 4×4 iPhone XS topped out at a download speed of just under 400 Mbps. The 2×2 MIMO iPhone XR topped out at right under 200 Mbps at the same signal strength. That’s a pretty amazing performance improvement without any other differences between the two phones. Figure 3 [reference 1] shows the insides of a Samsung Galaxy S8. The cellular antennas are along the top and near the bottom. It’s amazing how much is stuffed into these devices.

So, the next time you upgrade your phone, ask what type of MIMO it uses.

One additional consideration worth noting about MIMO is that using it reduces the benefit of having a simple cellular amplifier. In fact, using such an amplifier can actually result in a performance decrease, because it will prevent the phone or hotspot from taking advantage of the speed increases that derive from MIMO. When a MIMO-equipped cell phone is combined with a single-channel cellular amplifier, all the embedded MIMO information is lost. Yes, the signal seen by the phone will be stronger, but all the advantages provided by MIMO will be lost. Essentially, the phone will revert to 1×1 MIMO, which is how we define a single-antenna configuration. The “rule of thumb” these days is that if you can obtain a usable data signal without an amplifier, you’ll probably be better off without it! That’s not to say that an amplifier is never beneficial, but in many cases you may be better off just relying on MIMO to achieve maximum speed.

By now you should have a basic understanding of how MIMO can improve your cell phone reception. Next month we’ll talk about how the same concepts can be applied to WiFi communications.

References:

IEEE Spectrum, “Building Smartphone Antennas That Play Nice Together”,  Sampson Hu and David Tanner, 10/23/2018

Posted on

4G, 5G, 5G+!!! Gee, why do I care?

Written by WiFiRanger Ambassador, Joel Weiss “docj”

To the average person, today’s cellular data marketplace is a jumble of technobabble. Carriers continuously boast of the capabilities of their networks while also claiming that even better service is soon to be available. At the same time several companies planning to establish satellite-based internet systems claim that users will be better off with those (when they exist)! If only there was a way to sift through the “Geek speak” to better understand what the situation actually is!

The acronym 4G LTE actually stands for 4th Generation Long Term Evolution and, believe it or not, it is even a registered trademark. It pertains to cellular transmission standards that were first proposed all the way back in 2004. To be called 4G LTE a cellular system has to be capable of providing at least 100 Mbps capability. 4G LTE is in use essentially all over the world and LTE phones can, with some specific exceptions, be used in most countries. 4G LTE replaced the 3G CDMA network used by some US carriers and that network will be shut down in the near future.

Even though people (and advertisers) use the terms 4G and LTE as if they are synonyms, in reality, the term LTE encompasses future evolution beyond 4G.

So if LTE is what we have today, what comes next? I’ve heard people talk about Advanced LTE; is that the same as 5G?

Advanced 4G LTE is an improvement on “regular” 4G LTE, but it doesn’t represent a whole new technology. For properly equipped cell towers and receivers (phones), Advanced 4G, sometimes called LTE+ in ads, can provide increased download speeds, up to ~300 Mbps. To enable this, the cellular network essentially permits a receiver to make multiple simultaneous connections to the network. It’s as if your phone or hotspot had two or more parallel connections to the same cellular tower. In “Geek speak” this is called carrier aggregation!

For carrier aggregation to work, the modem in your phone (the device that actually talks to the cellular network) has to be of an advanced type, and it has to be communicating with a tower that has the proper hardware on it. Suffice it to say that at present, most phones and hotspots won’t yet support this capability, and it is not uniformly available in the US.

To make matters even more confusing, some marketing flacks at AT&T decided to create a non-existent standard that they called “5Ge,” which is nothing more than AT&T’s implementation of 4G LTE+. Irrespective of anything you hear in an ad, 5Ge is NOT 5G.

So, if we don’t yet even have LTE+ why are we worrying about 5G? What would be different about 5G?

The 5G cellular system will be a completely new cellular implementation that will enable users to experience download speeds up to the Gbps range. Although the actual speed obtained by users on any specific tower will probably be less than that, on average most people will see download speed improvements of a factor of at least 10 to 100. In addition, one of the advantages of 5G will be greatly reduced “ping times” (the time it takes for your “click” to reach the computer on the receiving end). That would mean that a cellular connection would have plenty of bandwidth to support multiple video streams and/or to engage in real-time gaming.

5G technology actually will come in three “flavors” and the implementation you encounter will depend significantly on which carrier you subscribe with and where you live. Different carriers have purchased the rights to use different sets of frequencies for their own 5G implementation. Furthermore, 5G implementation will be different in different parts of the country depending on the population density.

The following graphic depicts a portion of the electromagnetic spectrum and how our current and proposed communications networks fit together. The orange oval in the 0.8-2 GHz region is where today’s cellular phones and hotspots operate. The red oval shows the general spectral region called millimeter wave where the highest performance 5G systems will operate.

At the high performance end of the 5G spectrum there will be very high frequency 5G using what some people refer to as “millimeter waves.” The good news is that systems using mm waves will be capable of download speeds in the ~10 Gbps range. These transmissions will use frequencies of around 25 GHz. The bad news is that these waves are easily blocked by the walls of buildings, trees, rain and other obstacles, and there will have to be many small “towers” to serve an area compared with the relatively small number of large towers we have today. Most people expect that this high frequency 5G will mostly be limited to urban and/or suburban environments.

At somewhat lower frequencies, in the 1-6 GHz range, there will be other implementations of 5G. Sprint had made investments in this frequency spectrum and other carriers are expected to use it also. Signals at these slightly lower frequencies will penetrate buildings and other obstacles better than do mm waves, but they won’t have quite as much penetration capability as we are used to with cellular signals today. The download speeds provided by 5G systems operating at these frequencies will be somewhat less than those made possible in the mm wave region.

At the lower end of the spectrum, there will be “low frequency” 5G, and the carrier most aggressively pursuing this approach is T-Mobile, which made a large investment in frequencies around 600 MHz, the so-called Band 71. T-Mobile is already using Band 71 for 4G LTE service, but later in 2020 it is expected to begin 5G operations using the same band. However, existing phones and hotspots that can receive Band 71 will not, in general, be able to receive 5G broadcasts on Band 71.

The physics of low frequency transmissions, however, limits 5G using these frequencies to download speeds of ~100 Mbps. That might not compare with the Gbps speeds of higher frequency approaches, but it is sure a lot faster than the 1-10 Mbps speeds many of us live with today!
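One back-of-the-envelope way to see why channel width matters so much is the Shannon capacity formula, C = B·log2(1 + SNR). The bandwidths and the 20 dB SNR below are illustrative assumptions, not carrier specs; the point is only that throughput scales with channel width, and wide channels are available only up at mm-wave frequencies:

```python
import math

# Shannon capacity sketch: C = B * log2(1 + SNR).
# Bandwidths and SNR are illustrative assumptions, not carrier specs.
def capacity_mbps(bandwidth_mhz, snr_db=20):
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)  # MHz x bits/Hz = Mbps

for label, mhz in [("low-band slice", 20),
                   ("mid-band channel", 100),
                   ("mm-wave channel", 800)]:
    print(f"{label:16s} {mhz:4d} MHz -> ~{capacity_mbps(mhz):,.0f} Mbps")
```

Notice that capacity scales linearly with bandwidth: at the same signal quality, a channel 40 times wider carries 40 times the data, which is why the really big 5G numbers only show up where wide channels exist.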

Wow, that’s a lot of information. When will all this happen?

5G is currently being rolled out by all the major carriers and is available in quite a few major metro areas. Here’s an interactive map of where you can already get 5G: https://www.digitaltrends.com/mobile/5g-availability-map/

It will probably take a number of years before the mix of technologies being offered by different carriers shakes out completely. Since I live in a relatively rural area, I doubt I’m going to see much of anything any time soon! But, no doubt our grandchildren will grow up in a world in which everything is wireless. “Grandpa, what’s that funny dish-like thing on the roof of your RV?”