How far do radio waves travel?
I have read that radio waves just keep traveling, but the signal gets weaker because of how the wave spreads. OK, I get this, but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance, whereas walkie-talkies and WiFi routers have signals that don't travel nearly as far relative to GPS and phones. If GPS modules can pick up waves from space, why can't my laptop reach my WiFi signal from far away? And why can't walkie-talkies transmit and receive at much greater distances? Do all radio waves potentially travel the same distance, or does the distance depend on the power of the signal? What determines the power of the signal, wavelength or frequency?
Tags: propagation, physics
asked Jan 24 at 17:42 by z Eyeland (edited Jan 24 at 18:08 by Kevin Reid AG6YO♦)
"Regardless of distance": no, not at all. Move the satellite 1.5 times further away and you won't be able to do anything useful with the received signal. Also: the data rate of GPS is, rounded, roughly zero; Iridium satellite phones manage a couple of kb/s; your LTE phone can do multiple MB/s.
– Marcus Müller, Jan 25 at 12:12
4 Answers
Radio waves don't stop at some distance; they just get weaker, so you've read correctly. The reason communications stop working at some distance is that the signal becomes too weak to be understood.
Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
> OK, I get this, but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance, whereas walkie-talkies and WiFi routers have signals that don't travel nearly as far relative to GPS and phones.
There are several factors here, including:
- The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.
- (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.
- GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio, as the sketch after this list illustrates. So WiFi is doing a harder task.
- Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
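Here is a minimal, illustrative sketch of the Shannon-Hartley point. The bandwidths and signal-to-noise ratios below are assumed round numbers, not official GPS or WiFi figures; the shape of the result is what matters: a weak, wideband-but-low-SNR channel can still carry GPS's few tens of bits per second, while WiFi's megabits need a much better channel.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity C = B * log2(1 + SNR), in bits per second."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative assumptions only:
gps_like  = shannon_capacity_bps(bandwidth_hz=2e6,  snr_db=-25)  # weak, spread-out signal
wifi_like = shannon_capacity_bps(bandwidth_hz=20e6, snr_db=25)   # strong local signal

print(f"GPS-like channel capacity : {gps_like:,.0f} bit/s")   # kbit/s scale, far above GPS's ~50 bit/s message
print(f"WiFi-like channel capacity: {wifi_like:,.0f} bit/s")  # well over 100 Mbit/s
```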
> Do all radio waves potentially travel the same distance?
There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance as it spreads out. On Earth, with atmosphere, trees, buildings, and so on, different wavelengths/frequencies are reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption tends to increase with frequency.
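As a minimal sketch of that spreading loss, assuming ideal free space and isotropic antennas (the 1 W and 146 MHz figures are assumptions chosen purely for illustration), the Friis relation $P_r = P_t \left(\frac{\lambda}{4\pi d}\right)^2$ shows received power falling with the square of the distance:

```python
import math

C = 299_792_458  # speed of light, m/s

def friis_received_power_w(p_tx_w: float, freq_hz: float, distance_m: float) -> float:
    """Free-space received power between two isotropic antennas (Friis equation)."""
    wavelength_m = C / freq_hz
    return p_tx_w * (wavelength_m / (4 * math.pi * distance_m)) ** 2

# Illustrative assumption: a 1 W transmitter at 146 MHz.
for d_km in (1, 10, 100):
    p_rx_w = friis_received_power_w(1.0, 146e6, d_km * 1000)
    print(f"{d_km:>4} km -> {10 * math.log10(p_rx_w / 1e-3):6.1f} dBm")
# Every 10x increase in distance costs another 20 dB; the wave never
# stops, it just keeps getting weaker.
```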
Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!
> Or does the distance depend on the power of the signal?
If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
> What determines the power of the signal, wavelength or frequency?
Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).
If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
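To put rough numbers on that (a back-of-the-envelope sketch; the 5 W and 146 MHz values are assumed just for illustration), the photon energy is $E = hf$ and a transmitter of power $P$ emits about $P/E$ photons per second:

```python
PLANCK_H = 6.626e-34  # Planck constant, J*s

freq_hz = 146e6   # assumed 2 m band frequency
power_w = 5.0     # assumed handheld transmitter output power

photon_energy_j = PLANCK_H * freq_hz            # about 9.7e-26 J per photon
photons_per_second = power_w / photon_energy_j  # about 5e25 photons per second

print(f"Energy per photon : {photon_energy_j:.2e} J")
print(f"Photons per second: {photons_per_second:.2e}")
# Changing the power changes only the photon rate; each photon's
# energy is fixed by the frequency alone.
```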
(Also note that wavelength and frequency are two ways of measuring the same thing: you can convert one to the other using $\lambda = \frac{c}{f}$, where $\lambda$ is the wavelength, $f$ is the frequency, and $c$ is the speed of light. For example, $f = 146$ MHz gives $\lambda \approx 2.05$ m.)
answered Jan 24 at 18:37 by Kevin Reid AG6YO♦
Great explanations to my questions, thank you. One more thing: how should I visualize an increase or decrease in transmitter power? Naturally I'd think a power increase would speed up the wave, but I'm sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a higher-powered wave as more tightly packed photons that take longer to spread?
– z Eyeland, Jan 24 at 19:19

@zEyeland The wave has greater amplitude (it's "louder", or "taller"). Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle), so it's not that they behave differently but that at any given location there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that boundary will be bigger, but defining it depends on the performance of your receiver, not on any kind of physical distance limit.
– Kevin Reid AG6YO♦, Jan 24 at 19:27

Thanks for the information. So if I increase the amplitude of a radio wave, the wave is stronger/louder/taller, but I still cannot hear it because of the frequency, whereas if a sound wave's amplitude is increased, I will noticeably hear the loudness. Correct?
– z Eyeland, Jan 24 at 21:44

@zEyeland There can be radio waves and sound waves of the same frequency (low-frequency radio waves and high-frequency sound might both be found in the kHz range). You can't hear the radio waves because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
– Kevin Reid AG6YO♦, Jan 24 at 22:07

The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
– Cecil - W5DXP, Jan 26 at 19:27
The curvature of the Earth also makes a difference in how a signal propagates near the ground versus in outer space.
And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver will drown out their response to any incoming radio wave.
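A common way to put a number on that receiver-side noise is the thermal noise floor $kTB$; here is a small sketch under the usual simplifying assumptions (290 K reference temperature, ideal receiver with no added noise figure):

```python
import math

BOLTZMANN_K = 1.380649e-23  # J/K
T_REF = 290.0               # kelvin, conventional reference temperature

def thermal_noise_floor_dbm(bandwidth_hz: float, temperature_k: float = T_REF) -> float:
    """Thermal noise power kTB, expressed in dBm."""
    noise_w = BOLTZMANN_K * temperature_k * bandwidth_hz
    return 10 * math.log10(noise_w / 1e-3)

# Illustrative bandwidths: a narrow CW filter versus a WiFi channel.
print(f"500 Hz bandwidth: {thermal_noise_floor_dbm(500):.1f} dBm")    # about -147 dBm
print(f"20 MHz bandwidth: {thermal_noise_floor_dbm(20e6):.1f} dBm")   # about -101 dBm
# A received signal much weaker than this (plus the receiver's own noise)
# is effectively lost, no matter how far it has travelled.
```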
answered Jan 28 at 16:38 by hotpaw2
Light, like radio, is a form of electromagnetic wave. Electromagnetic waves of that kind travel billions of light-years before being received on Earth by telescopes.
answered Feb 15 at 15:00 by MarqTwine
Signals that don't bounce or refract are also limited to line-of-sight propagation. In other words, a walkie-talkie held in your hand can generally only be heard by other walkie-talkies about 3-5 miles away, because beyond that the two radios are over the horizon from each other (i.e., this big rock called planet Earth is between them).
To overcome this limitation, the height of one or both antennas has to be raised.
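Here is a rough sketch of the horizon arithmetic behind figures like that, using the common 4/3-earth radio-horizon approximation $d \approx 1.41\sqrt{h}$ (distance in miles, height in feet); the antenna heights are assumed purely for illustration:

```python
import math

def radio_horizon_miles(height_ft: float) -> float:
    """Approximate radio horizon for an antenna at height_ft feet,
    using the common 4/3-earth-radius refraction model."""
    return 1.41 * math.sqrt(height_ft)

handheld = radio_horizon_miles(5)    # radio held at roughly 5 ft
mast     = radio_horizon_miles(100)  # antenna on a 100 ft mast

print(f"Handheld to handheld   : about {2 * handheld:.1f} miles")
print(f"Handheld to 100 ft mast: about {handheld + mast:.1f} miles")
# Raising either antenna pushes its horizon out with the square root of its height.
```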
So distance is limited by all of these factors (some of these were described well in other answers):
- attenuation of the signal by absorption (air, trees, walls, rocks, etc.)
- attenuation of the signal by spreading
- signal-to-noise ratio (noise from other signals, natural and otherwise, on the same frequency and on nearby frequencies)
- propagation effects (multipath propagation, reflection, refraction, ducting, and horizon)
Each of these factors can be mathematically modeled with equations, but different effects dominate in different situations, so no one equation will give you a distance for all situations.
answered Feb 18 at 12:20 by user10489
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ifUsing("editor", function () {
return StackExchange.using("schematics", function () {
StackExchange.schematics.init();
});
}, "cicuitlab");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "520"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: false,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fham.stackexchange.com%2fquestions%2f12690%2fhow-far-do-radio-waves-travel%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
4 Answers
4
active
oldest
votes
4 Answers
4
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
Radio waves don't stop at a distance, they just get weaker; you've read this correctly. The reason that communications stop working at some distance is that the signals are too weak to be understood.
Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
Ok i get this but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance versus; Walkie-talkies and WiFi routers whos siganls don't travel nearly as far relative to GPS and phones
There are several factors here, including:
- The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.
- (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.
- GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio. So WiFi is doing a harder task.
- Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
Do all radio waves potentially travel the same distance?
There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance. On Earth, with atmosphere and trees and buildings and so on, different wavelengths/frequencies will be reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption generally tends to increase with frequency.
Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!
Or does the distance depend on the power of signal?
If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
What determines the power of signal, wavelength or frequency?
Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).
If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
(Also note that wavelength and frequency are the same thing just measured reciprocally: you can convert one to the other using $lambda = frac{c}{f}$, where $lambda$ is the wavelength, $f$ is the frequency, and $c$ is the constant speed of light.)
$endgroup$
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
1
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
1
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
add a comment |
$begingroup$
Radio waves don't stop at a distance, they just get weaker; you've read this correctly. The reason that communications stop working at some distance is that the signals are too weak to be understood.
Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
Ok i get this but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance versus; Walkie-talkies and WiFi routers whos siganls don't travel nearly as far relative to GPS and phones
There are several factors here, including:
- The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.
- (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.
- GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio. So WiFi is doing a harder task.
- Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
Do all radio waves potentially travel the same distance?
There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance. On Earth, with atmosphere and trees and buildings and so on, different wavelengths/frequencies will be reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption generally tends to increase with frequency.
Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!
Or does the distance depend on the power of signal?
If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
What determines the power of signal, wavelength or frequency?
Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).
If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
(Also note that wavelength and frequency are the same thing just measured reciprocally: you can convert one to the other using $lambda = frac{c}{f}$, where $lambda$ is the wavelength, $f$ is the frequency, and $c$ is the constant speed of light.)
$endgroup$
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
1
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
1
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
add a comment |
$begingroup$
Radio waves don't stop at a distance, they just get weaker; you've read this correctly. The reason that communications stop working at some distance is that the signals are too weak to be understood.
Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
Ok i get this but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance versus; Walkie-talkies and WiFi routers whos siganls don't travel nearly as far relative to GPS and phones
There are several factors here, including:
- The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.
- (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.
- GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio. So WiFi is doing a harder task.
- Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
Do all radio waves potentially travel the same distance?
There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance. On Earth, with atmosphere and trees and buildings and so on, different wavelengths/frequencies will be reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption generally tends to increase with frequency.
Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!
Or does the distance depend on the power of signal?
If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
What determines the power of signal, wavelength or frequency?
Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).
If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
(Also note that wavelength and frequency are the same thing just measured reciprocally: you can convert one to the other using $lambda = frac{c}{f}$, where $lambda$ is the wavelength, $f$ is the frequency, and $c$ is the constant speed of light.)
$endgroup$
Radio waves don't stop at a distance, they just get weaker; you've read this correctly. The reason that communications stop working at some distance is that the signals are too weak to be understood.
Besides distance (and being absorbed or reflected by objects in the path) causing the signal to be weak in an absolute sense (how much power there is), there is also the question of signal-to-noise ratio. That is, there are other radio waves, from other transmitters, natural sources, and even unintentional noise sources inside the receiver itself, all of which “drown out” the desired signal just like acoustic noise can make it hard to hold a conversation.
Ok i get this but what confuses me is that satellites emit signals from space that our phones and GPS modules pick up regardless of distance versus; Walkie-talkies and WiFi routers whos siganls don't travel nearly as far relative to GPS and phones
There are several factors here, including:
- The GPS system is predictable by the receivers. If you've ever used a dedicated GPS receiver, you may notice that it takes longer to get a location fix the first time it's turned on or if it's been off for a while. This is because it's using the information about where it last was, and what time it is, and the last satellite-orbit information it copied from the transmissions, to make good guesses about what it expects to receive. This allows GPS to work with a very poor signal-to-noise ratio.
- (Almost) nobody else is transmitting on the GPS frequencies, because that's illegal. They're reserved for the purpose. In WiFi, there are lots and lots of devices all using the same few channels; if two transmit at the same time on the same channel (and distance/obstacles don't make one significantly stronger) then neither will get through, for that one packet.
- GPS is sending a lot less information per second than WiFi. The Shannon-Hartley theorem tells us that there is a maximum rate of information transfer across any channel (here, a limited range of radio frequencies) depending on the signal-to-noise ratio. So WiFi is doing a harder task.
- Your phone does not just use GPS to obtain its location; it also detects nearby WiFi devices and cell towers, and constructs a best guess from all of these information sources.
Do all radio waves potentially travel the same distance?
There is no limit on distance. In a vacuum, with nothing else around, a wave simply loses power with distance. On Earth, with atmosphere and trees and buildings and so on, different wavelengths/frequencies will be reflected and absorbed differently. Generally, longer wavelengths (lower frequencies) can be used at greater distances, because absorption generally tends to increase with frequency.
Also, in the "HF" regions of the spectrum, below 30 MHz, signals are actually refracted off the ionosphere allowing them to propagate around the curve of the earth, whereas higher frequencies usually pass through the ionosphere — which is better if you want to talk to satellites!
Or does the distance depend on the power of signal?
If you increase power from the transmitter, then any receiver receives proportionally more power. Therefore, the signal-to-noise ratio improves (unless the power is so high as to cause overload). So more power means a larger usable range.
What determines the power of signal, wavelength or frequency?
Neither; they're completely independent. If you have a transmitter that can produce a power of $x$ watts at a frequency of $y$ MHz, then you can always reduce its power output to some lesser value. This is done routinely for any non-broadcast communication; reducing output power saves battery and lets other users use the same frequency at a distance without "overhearing" each other as much (just the same whether these are 'walkie-talkie' voice communications or several WiFi networks or anything else).
If you get into fundamental physics, you may hear that the energy of a photon is proportional to its frequency, and that radio waves are made of photons. This is all true, but practically useful radio transmissions are made up of many photons. So changing the transmitter power changes the number of photons emitted per second, but each photon still has the same energy.
(Also note that wavelength and frequency are the same thing just measured reciprocally: you can convert one to the other using $lambda = frac{c}{f}$, where $lambda$ is the wavelength, $f$ is the frequency, and $c$ is the constant speed of light.)
answered Jan 24 at 18:37
Kevin Reid AG6YO♦Kevin Reid AG6YO
16.2k33170
16.2k33170
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
1
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
1
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
add a comment |
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
1
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
1
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
$begingroup$
Great explanations to my questions, Thank you. Oh one more thing; how should I visualize increase/decrease of power of a transmitter. Naturally I identify a power increase would speed up the wave but I am sure that is not the case. You mentioned that more power only increases the usable range. Should I visualize a greater powered wave with more tightly packed photons that take look to spread?
$endgroup$
– z Eyeland
Jan 24 at 19:19
1
1
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
@zEyeland The wave has greater amplitude (it's "louder", or "taller") Considered as photons, there are more in the same space, so you could say they're more tightly packed, but they don't interact (superposition principle) so it's not that they behave differently but that at any given location, there will be more for a receiver to collect. So if you draw the spherical boundary of "minimum useful power", that'll be bigger, but defining that boundary depends on the performance of your receiver, not any kind of physical distance limit.
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 19:27
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
$begingroup$
Great thanks for information. So I increase the amp of a radio wave; now the wave is stronger/louder/taller but I still can not hear it because of the frequency. Versus a sound wave amp being increase; I will noticeably hear the loudness. Correct?
$endgroup$
– z Eyeland
Jan 24 at 21:44
1
1
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
@zEyeland There can be radio waves and sound waves of the same frequency (low frequency radio waves and high frequency sound might both be found in the kHz range). You can't hear them because they're waves in the electromagnetic field, not in the pressure of the air. (Comments aren't a good place to ask questions. Let's wrap this up. Chat is also an option.)
$endgroup$
– Kevin Reid AG6YO♦
Jan 24 at 22:07
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
$begingroup$
The FAST radio telescope in China has received radio signals from 5,870,000,000,000 miles away.
$endgroup$
– Cecil - W5DXP
Jan 26 at 19:27
add a comment |
$begingroup$
The curvature of the Earth also makes a difference between how a signal will propagate near the ground versus in outer space.
And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver radio will drown out their reaction to any incoming radio waves.
$endgroup$
add a comment |
$begingroup$
The curvature of the Earth also makes a difference between how a signal will propagate near the ground versus in outer space.
And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver radio will drown out their reaction to any incoming radio waves.
$endgroup$
add a comment |
$begingroup$
The curvature of the Earth also makes a difference between how a signal will propagate near the ground versus in outer space.
And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver radio will drown out their reaction to any incoming radio waves.
$endgroup$
The curvature of the Earth also makes a difference between how a signal will propagate near the ground versus in outer space.
And at a large enough distance, the random quantum behavior of the atoms and electrons in your receiver radio will drown out their reaction to any incoming radio waves.
answered Jan 28 at 16:38
hotpaw2hotpaw2
3,15321733
3,15321733
add a comment |
add a comment |
$begingroup$
Light is a form of radio wave. That type of radio wave travels billions of light years before being received on earth by telescopes.
$endgroup$
add a comment |
$begingroup$
Light is a form of radio wave. That type of radio wave travels billions of light years before being received on earth by telescopes.
$endgroup$
add a comment |
$begingroup$
Light is a form of radio wave. That type of radio wave travels billions of light years before being received on earth by telescopes.
$endgroup$
Light is a form of radio wave. That type of radio wave travels billions of light years before being received on earth by telescopes.
answered Feb 15 at 15:00
MarqTwineMarqTwine
513
513
add a comment |
add a comment |
$begingroup$
Signals that don't bounce or refract are also limited to line of sight propagation. In other words, your walkie talkie held in your hand can only be heard by other walkie talkies about 3-5 miles away in general, because further than that, the two walkie talkies are over the horizon from each other (i.e., this big rock called the planet Earth is between them).
To overcome this limitation, the height of one or both has to be raised.
So distance is limited by all of these factors (some of these were described well in other answers):
- attenuation of the signal by absorption (air, trees, walls, rocks, etc.)
- attenuation of the signal by spreading
- signal to noise ratio (noise from other signals, natural and otherwise, on frequency and on close frequencies)
- propagation effects (multipath propagation, reflection, refraction, ducting, and horizon)
Each of these factors can be mathematically modeled with equations, but different effects dominate in different situations, so no one equation will give you a distance for all situations.
$endgroup$
add a comment |
$begingroup$
Signals that don't bounce or refract are also limited to line of sight propagation. In other words, your walkie talkie held in your hand can only be heard by other walkie talkies about 3-5 miles away in general, because further than that, the two walkie talkies are over the horizon from each other (i.e., this big rock called the planet Earth is between them).
To overcome this limitation, the height of one or both has to be raised.
So distance is limited by all of these factors (some of these were described well in other answers):
- attenuation of the signal by absorption (air, trees, walls, rocks, etc.)
- attenuation of the signal by spreading
- signal to noise ratio (noise from other signals, natural and otherwise, on frequency and on close frequencies)
- propagation effects (multipath propagation, reflection, refraction, ducting, and horizon)
Each of these factors can be mathematically modeled with equations, but different effects dominate in different situations, so no one equation will give you a distance for all situations.
$endgroup$
add a comment |
$begingroup$
Signals that don't bounce or refract are also limited to line of sight propagation. In other words, your walkie talkie held in your hand can only be heard by other walkie talkies about 3-5 miles away in general, because further than that, the two walkie talkies are over the horizon from each other (i.e., this big rock called the planet Earth is between them).
To overcome this limitation, the height of one or both has to be raised.
So distance is limited by all of these factors (some of these were described well in other answers):
- attenuation of the signal by absorption (air, trees, walls, rocks, etc.)
- attenuation of the signal by spreading
- signal to noise ratio (noise from other signals, natural and otherwise, on frequency and on close frequencies)
- propagation effects (multipath propagation, reflection, refraction, ducting, and horizon)
Each of these factors can be mathematically modeled with equations, but different effects dominate in different situations, so no one equation will give you a distance for all situations.
$endgroup$
Signals that don't bounce or refract are also limited to line of sight propagation. In other words, your walkie talkie held in your hand can only be heard by other walkie talkies about 3-5 miles away in general, because further than that, the two walkie talkies are over the horizon from each other (i.e., this big rock called the planet Earth is between them).
To overcome this limitation, the height of one or both has to be raised.
So distance is limited by all of these factors (some of these were described well in other answers):
- attenuation of the signal by absorption (air, trees, walls, rocks, etc.)
- attenuation of the signal by spreading
- signal to noise ratio (noise from other signals, natural and otherwise, on frequency and on close frequencies)
- propagation effects (multipath propagation, reflection, refraction, ducting, and horizon)
Each of these factors can be mathematically modeled with equations, but different effects dominate in different situations, so no one equation will give you a distance for all situations.
answered Feb 18 at 12:20
user10489user10489
57116
57116
add a comment |
add a comment |
Thanks for contributing an answer to Amateur Radio Stack Exchange!
- Please be sure to answer the question. Provide details and share your research!
But avoid …
- Asking for help, clarification, or responding to other answers.
- Making statements based on opinion; back them up with references or personal experience.
Use MathJax to format equations. MathJax reference.
To learn more, see our tips on writing great answers.
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fham.stackexchange.com%2fquestions%2f12690%2fhow-far-do-radio-waves-travel%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
1
$begingroup$
"regardless of distance": no, not at all. Move the satellite 1.5 times further away, and you won't be able to do anything useful with the received signal. And: the data rate of GPS is roughly rounded 0, and of Iridium satellite phones is a couple of kb/s; of your LTE phone it can be multiple MB/s.
$endgroup$
– Marcus Müller
Jan 25 at 12:12