The only problem with this super-duper technology is that gigabit speeds will be practically achievable only with line of sight, about 1 meter / 3 feet from the base station. :rolleyes:

It has a good chance of replacing short-distance cabling. For longer distances, cable will never be obsolete. And for long distances over wireless we will probably see only 802.11n speeds.

They can't beat the laws of physics. :apple:
 
Wi-Gig is sort of the converse of Wi-Max. Wi-Max is about using the newly available frequencies for Wi-Fi over distances similar to cellphone signals.

Wi-Gig is for connecting your DVR, DVD and Set-Top box to your TV without a tangle of cables.
 
I would expect the range of something at 60GHz to be not much more than line-of-sight within the same room. As for the existing 802.11n supporting a "600 Mbps transfer rate," that is pretty much a crock. In theory, the link rate can approach that kind of number, but when actually doing device-to-device data transfers with typical consumer equipment you'll be limited to something around 100Mbps (best case, at close range).

I'd guess that the real data throughput on most home setups when running 802.11n is more likely in the 50 to 70Mbps range (when doing more than the simplest room-to-room transfer).

For everybody who's wondering about it: 600Mbps is the total throughput for wifi-n with a full 4x4 antenna setup working on a 40MHz-wide band. It is possible to get the full 600Mbps if both devices have 4x4 antennas at 40MHz with the shorter guard interval (400ns). Today's laptops aren't equipped with a 4x4 wifi-n setup yet; most likely 3x2, with the longer guard interval and a 20MHz band for stability. We aren't likely to get 4x4 @ 40MHz @ 400ns setups for another year or two.
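If anyone wants to sanity-check where that 600Mbps number comes from, here's a rough back-of-the-envelope sketch of the 802.11n PHY rate (the subcarrier counts and symbol times are the standard 802.11n OFDM numbers; real-world throughput is much lower, as noted above):

import math  # not strictly needed, kept for consistency with the other sketches

# Rough 802.11n PHY rate: streams * data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time
def dot11n_rate_mbps(streams, width_mhz, short_gi):
    data_subcarriers = 108 if width_mhz == 40 else 52   # 802.11n OFDM: 40MHz vs 20MHz channel
    bits_per_subcarrier = 6                              # 64-QAM
    coding_rate = 5.0 / 6.0                              # highest-MCS coding rate
    symbol_time_us = 3.6 if short_gi else 4.0            # 400ns vs 800ns guard interval
    return streams * data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time_us

print(dot11n_rate_mbps(4, 40, True))    # 4x4 @ 40MHz @ 400ns GI -> 600.0
print(dot11n_rate_mbps(2, 20, False))   # a 2-stream radio at 20MHz with the 800ns GI -> 130.0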
 
Wireless N easily handles all my streaming needs, incl. HD video. Not to mention none of this matters for internet connectivity, since no broadband has even come close to catching up with G speeds yet; they are barely surpassing the B threshold. Sounds very niche, especially with the range limitation.

Well said, and I don't see this changing any time soon. Maybe in the future something will come down the pipe that saturates G or even N, and then I can see needing to look at some other technology, but for now even G seems to meet all my needs, and I have found N almost useless, except for moving lots of small files between computers.
 
There's some info on it here:

http://en.wikipedia.org/wiki/Digital_television_transition_in_the_United_States

Basically, the frequencies were supposed to herald an era of nationwide long-range WiFi. From what I've read, many (if not all) of the available frequencies have been bought up by Verizon and AT&T. So I guess that you can assume whenever those frequencies are put to use (if they ever are), the WiFi service won't be free.

The bulk was bought by the US telecoms Verizon and AT&T for their LTE networks ("4G"). Wireless broadband is different from "nationwide WiFi". The 700MHz band has longer range than the higher frequencies commonly used for 3G now. Conceptually, that could mean fewer towers and/or greater coverage. It remains to be seen, though, what kind of coverage those two vendors will roll out.

Those two paid billions for that frequency allocation. That's why it's strange when folks say they are going to drag their feet for a very long time rolling out LTE.


The WiMAX that Clearwire/Sprint are using is up in the 2.5GHz range.

There was a push by Google to get some terms on it that would have precluded "locked devices", but I'm not sure that worked out well in the end. The problem with using the unlicensed, "left over" bands is that they are unlicensed. You're unlikely to get uniformly consistent high speed out of them if numerous folks are using them for disparate purposes.


Not sure why folks are so wirephobic that they can't hook a TV to a receiver. Objects that are primarily stationary should have wires. It's more energy efficient and doesn't pollute the airwaves with even more stuff. This 60+GHz stuff isn't going to penetrate decently thick walls (not that folks make decent walls anymore... most are paper thin these days). If it's primarily useful over short distances, where is the big win over wires?
 
Apple will adopt this technology about 3 years after every other computer maker has already had it on their $500 low end systems. Then they will charge a premium for it.
 
Sounds great for file transfers. Run Cat 6 to each room, then put a Giga router in every room where you work. This will let you move big files without worrying about connecting/disconnecting cables.
 
60GHz

I wonder which part of our brain fries at 60GHz. BTW, it seems like more announcement material for WWDC: "Apple will ship Wi-Gig next ..."
 
Wavelength's got nothing to do with it - there's a good reason why the 60GHz band isn't terribly usable. Namely, oxygen molecules have a pronounced absorption peak in that region, so it's useless past a handful of feet anywhere there's AIR.

Actually, no.

Yes, oxygen absorbs RF. But at the peak wavelength and typical atmospheric oxygen content, the loss is about 10dB/km. Not a significant factor for in-home use, unless you are Bill Gates. ;)

RF Cafe Article on Atmospheric Absorption

The higher the frequency (and shorter the wavelength) the more attenuation there is through solid objects. Very low frequencies (long wavelengths) can actually pass through the earth, and are used to communicate with submarines.

Those systems that we use to communicate with submarines can send a simple message (like "launch") in a few minutes. It's likely one of the main reasons we freaked out so much in the U.S. at the prospect of missiles in Cuba...

Anyway, walls are a significantly greater obstacle at 60GHz than atmospheric absorption is.
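For a back-of-the-envelope feel, here's a quick sketch comparing ordinary free-space spreading loss at 60GHz with that 10dB/km oxygen figure over in-home distances (simple Friis path loss, no walls or antenna gain):

import math

def free_space_path_loss_db(distance_m, freq_hz):
    # Friis free-space path loss: 20*log10(4*pi*d*f / c)
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

for d in (1, 5, 10):
    spreading = free_space_path_loss_db(d, 60e9)
    oxygen = 0.010 * d    # ~10 dB/km oxygen absorption, per the figure quoted above
    print(f"{d:2d} m: spreading loss {spreading:.0f} dB, oxygen loss {oxygen:.2f} dB")

# Even at 10 m, oxygen costs about 0.1 dB against roughly 88 dB of spreading loss;
# the walls mentioned above dwarf any atmospheric effect indoors.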

Unfortunately, I'd expect router makers to abuse the intent of this and tout products that will be impractical in the typical home. Think of this as "high speed Bluetooth," but with even shorter range.
 
I don't get the need for this.

I buy or rent an HD movie from the iTunes Store via the internet on Apple TV and I'm in business in less than 40 seconds. I can also stream an HD movie out of iTunes on a laptop to an Apple TV connected to an HDTV and watch in real time, starting in less than a second, using wireless N.

Why do I need this local speed when cable internet is the bottleneck at 10BASE-T speeds and N is fast enough to take care of the local HD streams?
 
Perhaps an explanation of how frequency and wavelength are inversely proportional in the laws of physics would help explain to the audience why the range is so short?

As I remember, range is influenced by many things, not just frequency and wavelength. I think you have to use the Beer-Lambert Law to calculate range, and it's a bit more complicated since it depends on the material you're dealing with as well.
 
Why do I need this local speed when cable internet is the bottleneck at 10BASE-T speeds and N is fast enough to take care of the local HD streams?

Because the local "HD" streams you are referring to are almost always not full HD today. It's just marketing-speak. It's not "high" def. It's "higher" def. Sure, the pixel count may be there, but it's heavily compressed.

Blu-ray needs up to 54Mbit/sec. That's the theoretical max for wireless G, but as a practical matter, fergidaboutit. You're really lucky to actually get 54Mbit/sec of throughput a couple of rooms away using N. Typically, N performance drops off vs. G once you pass through a couple of walls. Sure, you can get 300Mbit/sec in the same room, but go a couple of rooms away and you are in the 20s, while G will be significantly higher.
 
Jobs: "Blu ray...blu ray...we don't need no stinkin' blu ray"

Obviously you don't get it!

The entire reason he doesn't care about Blu-ray is that they are going toward a digital platform for movies, etc. This wireless tech is exactly the kind of tech they want in order to achieve that! Or is that what you are saying in an odd way?
 
I don't get the need for this.

I buy or rent an HD movie from the iTunes Store via the internet on Apple TV and I'm in business in less than 40 seconds. I can also stream an HD movie out of iTunes on a laptop to an Apple TV connected to an HDTV and watch in real time, starting in less than a second, using wireless N.

Why do I need this local speed when cable internet is the bottleneck at 10BASE-T speeds and N is fast enough to take care of the local HD streams?

I'm sure we don't need it NOW, but it's inevitable that we will need something that fast sometime in the future. Not just for movies, but for anything else that requires tremendous amounts of data. The faster things are, the better! Imagine copying a couple of DVDs' worth of data wirelessly in a matter of seconds; it may become useful one day...
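For a rough sense of scale (a sketch assuming 4.7GB per single-layer DVD and ignoring all protocol overhead):

def transfer_seconds(gigabytes, link_gbps):
    # convert GB to gigabits, then divide by the raw link rate (no overhead)
    return gigabytes * 8 / link_gbps

print(transfer_seconds(2 * 4.7, 7.0))    # two DVDs over a 7 Gb/s Wi-Gig link: ~11 seconds
print(transfer_seconds(2 * 4.7, 0.1))    # same data at ~100 Mb/s real-world 802.11n: ~12.5 minutes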
 
Perhaps an explanation of how frequency and wavelength are inversely proportional in the laws of physics would help explain to the audience why the range is so short?

It's not carrier frequency or wavelength that's the limiting factor; it is data rate, or bandwidth. The range is short because the signal strength at the receiving end of the link needs to be very strong to be able to reliably detect 7 Gb/s.

Another important factor is that oxygen in the atmosphere absorbs (i.e. is heated up by) the 60 GHz carrier frequency. This means that even with a high gain antenna or amplifiers to boost the signal, the range can never be made to be very long (for communication to or from the surface of the Earth). But this is actually helpful in a short range application as it keeps the ambient noise level from other 60 GHz sources very low (unless they are very close). This in turn means that devices can get by with a very low power transmitter for short range applications.
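To put a rough number on "very strong," here's a crude Shannon-limit sketch of the minimum received power needed for a given data rate (the ~2.16GHz channel width and the 6dB receiver noise figure are assumptions for illustration; real radios need extra margin):

import math

def min_rx_power_dbm(bit_rate_bps, bandwidth_hz, noise_figure_db=6.0):
    # Shannon: C = B * log2(1 + SNR)  =>  required SNR = 2**(C/B) - 1
    required_snr = 2 ** (bit_rate_bps / bandwidth_hz) - 1
    noise_floor_dbm = -174 + 10 * math.log10(bandwidth_hz)   # thermal noise (kTB) at room temperature
    return noise_floor_dbm + noise_figure_db + 10 * math.log10(required_snr)

print(min_rx_power_dbm(7e9, 2.16e9))   # ~ -65 dBm for 7 Gb/s in a ~2 GHz-wide channel
print(min_rx_power_dbm(54e6, 20e6))    # ~ -88 dBm for 54 Mb/s in a 20 MHz channel

# The wide channel raises the thermal noise floor by ~20 dB, so the receiver needs a
# ~20 dB stronger signal; at the same transmit power that translates to much less range.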
 
As I remember, range is influenced by many things, not just frequency and wavelength. I think you have to use the Beer-Lambert Law to calculate range, and it's a bit more complicated since it depends on the material you're dealing with as well.

For interference, material lattice properties are a large factor, but my statement of the inversely proportional relationship is a law of physics, period.
 
Higher frequencies have shorter wavelengths. Shorter wavelengths can only penetrate thinner obstacles.

This is not always true; it depends on the material. For example, light, which has a very short wavelength (e.g. 1.5 × 10^-6 meters), can travel through several kilometers of glass in a fiber optic. Alternately, AM radio waves with a wavelength of hundreds of meters will be stopped by a layer of aluminum foil if it's formed into a closed box.
 
There's some info on it here:

http://en.wikipedia.org/wiki/Digital_television_transition_in_the_United_States

Basically, the frequencies were supposed to herald an era of nationwide long-range WiFi. From what I've read, many (if not all) of the available frequencies have been bought up by Verizon and AT&T. So I guess that you can assume whenever those frequencies are put to use (if they ever are), the WiFi service won't be free.

Was anyone assuming that it would be free? It certainly wasn't free for Verizon and AT&T.
 
Funny Coincidence

I just finished reading Wired's article on Wi-Fi / Wi-Gig. Now this pops up.
 
The Physics of WiFi

Let's start with wireless communication in general. In 1831, Michael Faraday discovered the foundation of radio, decades before the first message was ever sent. He discovered that when the magnetic field around a wire changed, a current was induced in the wire for a brief period of time. Essentially, a small amount of energy was being transmitted wirelessly from the magnetic field to the wire. Each time he connected and then disconnected the two terminals of a battery (using a long loop of wire, one end to +, the other end to -), the voltage in the other, unconnected wire rose slightly and then fell to 0 again.

This effect was first used to transmit over a distance between two mountains in Virginia, by Mahlon Loomis in 1864. Mahlon flew two kites, each with a 600ft wire grounded to the earth. When he, using a simple switch, disconnected one kite's wire from its ground, a change in current was detected in the other kite. The distance between them: 18 miles.

Today, we know that radio waves are responsible for this interaction. Radio waves are electromagnetic radiation, just like the light that we see. Radio and light waves are distinguished by their wavelength. Picture a sine wave. The distance from one highest point to the next highest point (or, as physicists say, from crest to crest) is the wavelength, usually denoted by the Greek letter "lambda." Mathematicians and scientists are overly fond of Greek letters, but I'll use L. Wavelength is usually measured in meters.

Another important piece of information about a radio wave is its amplitude, or A. A simply means the height of the wave: the vertical distance from the center line up to a crest (half the crest-to-trough distance). Amplitude is also measured in meters.

Finally, waves have a frequency f. If you stood still and watched radio waves fly past you (considerably fast!), and you counted all the crests that passed in one second, that'd be the frequency. An interesting mathematical relationship for waves is (remember the speed of light?) c = L x f. The speed of light is the wavelength of a wave times its frequency. Later on, we will use this formula to compute the wavelength of 2.4GHz WiFi.

This equation has an interesting consequence. Since the speed of light is constant (as most people think... there are some who question that, and I am one of them; assuming it's constant works for our purposes, no matter how those arguments turn out), the wavelength can be computed from the frequency and vice versa. Frequency is measured in some form of Hertz (i.e. Megahertz, Gigahertz, Kilohertz, etc.).
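Here's that promised calculation, as a quick sketch (just rearranging c = L x f into L = c / f):

c = 3.0e8                      # speed of light in meters per second (approximately)
for f in (2.4e9, 5.0e9, 60e9):
    wavelength_cm = c / f * 100
    print(f"{f / 1e9:g} GHz -> wavelength {wavelength_cm:.1f} cm")
# 2.4 GHz WiFi -> 12.5 cm, 5 GHz -> 6.0 cm, 60 GHz Wi-Gig -> 0.5 cm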

Modern radios use a continuously varying current instead of an on/off system to transmit a signal. This way they can transmit information in a sine wave, which can hold more information more accurately. Sine waves are particularly suited to sound transmission.

To send, say, a voice signal, a "carrier" wave is generated, usually as a sine function. The information to be carried is then entwined with the carrier signal (how depends on whether it's AM, FM or even PM). AM, short for amplitude modulation, means that the amplitude A of the wave changes to carry the information. Far more common is FM, or frequency modulation, in which the wave's frequency (and hence wavelength, remember?) changes. FM is far more reliable than AM for transmitting information. Almost unheard of is PM, Pulse Modulation. PM waves simply mean that the circuit is turned on and off to transmit a signal, much like how the first radios worked. The only large-scale use of PM I can think of is for carrying time information to all the atomically synchronized clocks. While slow in terms of data (how many times a day do you really need to update your clock, anyway?), PM is a powerful transmission method, so much so that a single transmitter covers the entire United States.
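If it helps to see the difference, here's a tiny numeric sketch of AM vs. FM (a toy with made-up frequencies, not how any real radio is built):

import math

carrier_hz = 100.0    # toy carrier frequency
message_hz = 5.0      # toy information signal

def am_sample(t, depth=0.5):
    # AM: the amplitude (envelope) of the carrier follows the message
    envelope = 1.0 + depth * math.sin(2 * math.pi * message_hz * t)
    return envelope * math.sin(2 * math.pi * carrier_hz * t)

def fm_sample(t, deviation_hz=20.0):
    # FM: the instantaneous frequency of the carrier follows the message
    phase = (2 * math.pi * carrier_hz * t
             + (deviation_hz / message_hz) * math.sin(2 * math.pi * message_hz * t))
    return math.sin(phase)

samples = [(am_sample(n / 1000.0), fm_sample(n / 1000.0)) for n in range(1000)]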

The wattage of a transmitter determines how powerful the signal is. WiFi calls for a maximum of 1 Watt of transmit power; an AM radio station might use 50,000 Watts.

Once radio waves are in the air on a particular frequency, a tuner in range must then pick them up and convert the sine wave back into usable information. Tuners work based on another physics principle known as resonance. A common example of resonance is when people break glass by singing alarmingly high notes. The sound waves, vibrating through the air at the natural frequency of glass (a frequency that causes glass to vibrate with giant amplitudes), induce vibrations in the glass, which cause it to break. In a radio, the tuner resonates at the particular frequency it's tuned to, so only those waves will be amplified. The net result is that waves of a set frequency are picked out of the air and amplified far above the strength of the other waves. Thus, tuners 'select' which radio waves we want to listen to.

After the signal is picked out from among all the radio noise in the air, another component called a demodulator subtracts the carrier signal from the radio wave to obtain the original signal.

WiFi works by radio transmission, usually in the unlicensed 2.4 GHz ISM band (ISM = Industrial, Scientific, and Medical). WiFi transmission is essentially FM transmission, in that the frequency is changed to transmit data. For example, 2.4 GHz WiFi uses something called Complementary Code Keying to vary the frequency and send data. That's for another tutorial, though (read: I'll explain it when I understand it).

Radio interference is a natural problem, and simply refers to the cluttered state of the airwaves: everybody is using radio waves! For many purposes this isn't a problem, but WiFi works in the unlicensed band. Therefore, anyone can use it. Thus, the 2.4 GHz frequency is stuffed full of radio waves just waiting to interfere with transmissions.

But wait! There are 11 channels in WiFi. Each channel has a (slightly) different frequency, so by using a different channel we can avoid colliding networks, right?

Alas, this doesn't work very well, for two reasons: one related to FCC regulations and the other a physical principle. As for the first, each of these channels is 5 MHz apart and 22 MHz wide. Therefore, the channels overlap and cut into each other. The only mutually non-overlapping channels are 1, 6, and 11. Unfortunately, microwave ovens are known to wreak havoc on channel 11. And if anyone's using channel 3, they'll interfere with both 1 and 6. Go figure.
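A quick sketch of that overlap, using the standard 2.4 GHz channel plan (channel 1 centered at 2412 MHz, 5 MHz steps, roughly 22 MHz wide):

def channel_edges_mhz(channel, width_mhz=22):
    center = 2412 + 5 * (channel - 1)     # 2.4 GHz band: channel 1 is centered at 2412 MHz
    return center - width_mhz / 2, center + width_mhz / 2

def overlaps(a, b):
    lo1, hi1 = channel_edges_mhz(a)
    lo2, hi2 = channel_edges_mhz(b)
    return lo1 < hi2 and lo2 < hi1

print(overlaps(1, 3))   # True  -- a network on channel 3 bleeds into channel 1...
print(overlaps(3, 6))   # True  -- ...and into channel 6
print(overlaps(1, 6))   # False -- 1, 6 and 11 are the mutually non-overlapping set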

The second reason channels don't work concerns the origins of radio waves of a certain frequency. By generating a carrier wave of frequency f and allowing it to control the current in a wire, we create a magnetic field. But this magnetic field is created by electrons moving in the antenna. The electrons, it turns out, vibrate at a certain frequency. But physics tells us that whenever something vibrates, there are other, harmonic frequencies that vibrate alongside it. Therefore, we can't send radio waves at only a certain frequency, because if we try we get waves at the harmonic frequencies too. Thus, all transmissions interfere with each other to some extent.

A basic principle of electromagnetic waves is that when they overlap, they add. But if a crest is added with a trough, the result is no wave at all. Thus, two waves can cancel each other. To do this, they have to be "offset" by half their wavelength. Even when they don't perfectly cancel, some destructive interference still occurs. ;)
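And a two-line numeric check of that cancellation (a toy sketch: two identical waves, one shifted by half a wavelength):

import math

wavelength = 1.0
xs = [i / 100.0 for i in range(200)]
wave1 = [math.sin(2 * math.pi * x / wavelength) for x in xs]
wave2 = [math.sin(2 * math.pi * (x + wavelength / 2) / wavelength) for x in xs]   # offset by half a wavelength
combined = [a + b for a, b in zip(wave1, wave2)]
print(max(abs(v) for v in combined))   # ~0: every crest meets a trough, so the waves cancel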
 
Equivalent isotropically radiated power

In radio communication systems, Equivalent isotropically radiated power (EIRP) or, alternatively, Effective isotropically radiated power is the amount of power that a theoretical isotropic antenna (which evenly distributes power in all directions) would emit to produce the peak power density observed in the direction of maximum antenna gain. EIRP can take into account the losses in transmission line and connectors and includes the gain of the antenna. The EIRP is often stated in terms of decibels over a reference power emitted by an isotropic radiator with an equivalent signal strength. The EIRP allows comparisons between different emitters regardless of type, size or form. From the EIRP, and with knowledge of a real antenna's gain, it is possible to calculate real power and field strength values.

EIRP = PT - Lc + Ga
where EIRP and PT (power of transmitter) are in dBm, cable losses (Lc) are in dB, and antenna gain (Ga) is expressed in dBi, relative to a (theoretical) isotropic reference antenna.
This example uses dBm, although it is also common to see dBW.
Decibels are a convenient way to express the ratio between two quantities. dBm uses a reference of 1mW and dBW uses a reference of 1W.

and

A transmitter with a 50W output can be expressed as a 17dBW output or 47dBm.

The EIRP is used to estimate the service area of the transmitter, and to co-ordinate transmitters on the same frequency so that their coverage areas do not overlap.
In built-up areas, regulations may restrict the EIRP of a transmitter to prevent exposure of personnel to high-power electromagnetic fields; however, EIRP is normally restricted to minimize interference with services on similar frequencies.
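Plugging the definitions above into code (a sketch; the 100mW / 2dB / 6dBi numbers are just made up for illustration):

import math

def dbm_from_watts(watts):
    return 10 * math.log10(watts * 1000.0)   # dBm is referenced to 1 mW

def eirp_dbm(tx_power_dbm, cable_loss_db, antenna_gain_dbi):
    # EIRP = PT - Lc + Ga, with everything in decibel units
    return tx_power_dbm - cable_loss_db + antenna_gain_dbi

print(dbm_from_watts(50.0))                      # the 50 W example above: ~47 dBm (17 dBW)
print(eirp_dbm(dbm_from_watts(0.1), 2.0, 6.0))   # 100 mW radio, 2 dB cable loss, 6 dBi antenna -> 24 dBm EIRP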
 