When I moved to the temporary housing that Microsoft provided me in Redmond in 2004, the first thing I did was buy a laptop. The most memorable access point around me was an open connection with the name "Bring food and beer to B308". I used that person's Internet for a while, and wanted to bring beer to them. Unfortunately, when I surveyed the area, I could find no "Bxxx" blocks in Timberlawn Apartments, only A's. It was probably a neighboring complex. I want to use this opportunity to thank that person.
I ran a small ISP around the same time that used this behavioral pattern to bring the customer acquisition cost down to near zero. Essentially we sold ADSL connections with Wi-Fi and a second SSID where anybody could connect and sign up for internet access. If too many signed up, we sent personal offers for ADSL service to some of them and wired up their homes too. Fun project, but stressful and not very profitable.
I still find it strange how people use the word “WiFi” to mean internet. For so many young people today, WiFi IS the internet. They have never plugged in an Ethernet cable in their life.
I still get frustrated by WiFi, though, and never use it for my computers unless I have no choice. With so many devices these days, the performance is still subpar. Packet loss on even the best connections causes so many performance degradations.
> I still find it strange how people use the word “WiFi” to mean internet. For so many young people today, WiFi IS the internet.
I don't think that's the case; people don't call mobile internet "WiFi". In their minds "WiFi" probably means "home internet", so it's more like they call their LAN "WiFi", because they have never used a cabled connection.
I have a friend who, no matter how many times I have explained it, calls her Internet connection WiFi. I was really confused at first when she said she pays for Wi-Fi. She's 27.
And they used to say kids were good with computers. I think if you graph computer illiteracy vs. year of birth, people born around the '80s sit at the bottom of a bathtub curve.
The boomers are hopeless with tech because they grew up without it. But Gen Z is also hopeless, because they grew up with opaque appliances like iPhones. You use it how Apple allows you to. The inner workings are hidden behind endless abstractions and shiny (liquid!) UI. Every error message is something like "Oopsie woopsie" with a sad face emoji. When it breaks you toss it in the garbage.
This reads like an "uphill both ways" story, but we used to build our own computers (still do, but I used to too), because that was how you acquired one unless you bought a Dell like a lamer. When your software messed up it segfaulted or coredumped, and you had to figure it out. When your hardware broke, you took it apart. People today use Discord because picking a client and specifying a server and port combo to connect to is too hard. And so on and so forth, you get the point...
And the point is these damn kids are on my lawn, making TikTok videos.
This also applies to many other things, such as cars. But perhaps the timeline is shifted.
But if we go back a bit earlier, it was common for people to know much more about house construction, electrical work, flooring, making furniture, etc. IKEA emulates some of this, but it's really a different thing to live around things you truly understand: you participated in building your house, you know how and why everything is in it, you can fix your car, you can grow produce in your garden, you eat your chickens' eggs and can turn the chickens into baked chicken; the whole process from hatching to hen to plate is managed by you.
People have less and less control over, and understanding of, their lives. It's all "coming from somewhere" and wrapped in abstractions and euphemisms. You no longer buy things, just rent, etc. It really shifts the mentality toward a more child-like thinking, at the mercy of some opaque system. With AI we will get the final blow. No skills, no intellectual muscle; just as people no longer remember driving directions even to places they regularly visit, because GPS gives turn-by-turn instructions and nothing gets memorized. It will be the same, but for everything.
It's why I became an electrician, and specialised in telecoms. I realised in my 20's that I didn't want to spend the rest of my life typing on a keyboard, and telecoms is the right kind of combination of geekery and practical work. Plus I know all the fundamental electrician stuff. I highly recommend this career path if you feel the same way. My last job was in robotics where my combined skillset of being a Linux-focused computer+networking geek and having electrical skills made me pretty unique. Lately my career has taken a bizarre turn where I do AV, networking and electrical work, plus... swimming pool maintenance. Life, uh, finds a way.
Anyways, apart from knowing how a car works on a theoretical level, I have no idea how to fix mine, and getting fucked at the dealership is a part of my life I have begrudgingly accepted.
I had a similar revelation watching a friend's son ask him how to use a cell phone. When I was young, it was always the other way around. And it still is; I keep helping my parents. But now I wonder if I'll end up the IT guy in both generational directions.
I think if you're not "on Wifi" then you're "on data" (or perhaps "on 5G" if it's a social status thing). If one were to connect via Starlink then it would still be correct to say you were on Wifi I think but if you could hook up ethernet directly... I think most of us would say you're now "on satellite"?
I didn't even consider that the newest iPhones can connect to satellites directly.
Yeah, that’s true. I even thought about that when I was typing my comment but wasn’t sure the best way to articulate the difference, but I think you are right with it being about home internet vs cellular.
Although I really think it is just used to mean “non-phone based internet”, rather than just home internet.
I don’t think people realize that Wi-Fi is a brand name for 'IEEE 802.11b Direct Sequence'. WiFi, Wifi, and wifi are not approved by the Wi-Fi Alliance. Despite common belief, the name Wi-Fi is not short for 'Wireless Fidelity'.
It really was revolutionary. Surprisingly the biggest target market for WiFi ended up being phones, which already have a wireless connection to the Internet.
WiFi in 2003 was routinely awful, though: generally unstable, with poor compatibility and lousy range. It's a lot better now, but could still be easier for non-techs.
Also insecure. I remember driving around looking for Wi-Fi to steal internet from, and I routinely found network shares full of sensitive documents. And I only looked for open WiFi and wasn't even trying to hack anything.
If I actually wanted to hack into networks, encrypted WiFi used WEP, which could be cracked in minutes on a typical laptop. Most communication was unencrypted too; pwning entire WiFi networks wasn't even fun considering how easy it was.
In the early 2000s hotels also still routinely charged for WiFi. But someplace like NY, you could usually find an open ssid within range. But, yes, in that period many WiFi transmissions didn’t have a password.
I miss when free wifi was everywhere. When I was traveling Italy my phone seemed to almost never have signal despite paying for roaming data. And I couldn’t find free wifi anywhere. The few places that did have it like mcdonalds wanted me to authenticate with sms which wasn’t working since I had no phone signal.
Used to be that every cafe and business just had an open wifi network. Now if they provide it at all it’s password protected
One of my early IT jobs in the 2000s was at a SME with Wifi: after you connected radio-wise, you had to start the VPN client, because at the time there really wasn't any (effective) encryption of the signal in 802.11 itself.
I think it is more correct to say open wifi made it easier to connect to otherwise already insecure systems. After all, anybody in the house with an Ethernet cable could get those sensitive documents, right? Open wifi just expected users to actually follow the mantra of the time: don’t trust the infrastructure!
>the biggest target market for WiFi ended up being phones
If a particular category is considered, then yes, phones are the biggest chunk. But virtually every device these days comes with WiFi. So WiFi is now the default method of connecting just about anything.
Interestingly, I see an increasing number of young-ish people that simply skip "landline" ISPs (hence WiFi) entirely and only use their phone.
Mostly because around here you can get 100 GB over 5G for less than 10€, plus they mostly don't use computers (a.k.a. laptops) except for a) school (where they have free WiFi+Internet) and b) binge-watching the occasional Netflix (and then they use connection sharing)
Their first move upon setting up a new phone is to disable Bluetooth+WiFi to, uh, "save battery" (their cargo-cult answer, every single time)
That is increasingly true and will only get more mainstream as time goes by. Why pay for home WiFi (Internet) when you have 5G / mobile data? 5G SA will further increase mobile network capacity, and a mobile network will be all you need. MNOs, especially those without cable / fibre-to-the-home infrastructure, are already promoting 5G home solutions as a replacement.
What they didn't mention in the article, and what most Wi-Fi historical narratives miss, is the critical contribution of OFDM modulation waveform technology, an idea that originated in, and was patented by, the radio astronomy research of CSIRO Australia [1],[2].
In the early days of Wi-Fi, the IEEE 802.11 group was still testing spread spectrum and OFDM with 802.11b and 802.11a, respectively. But then it became apparent that the best bandwidth comes from the proper orthogonality of the wireless modulation, aka OFDM [1].
At the time of the OP article back in 2003, the incumbent cellular mobile modulation for 3G was still a spread-spectrum-based CDMA system, but by 4G it was OFDM all the way, and the rest is history. CSIRO became much richer thanks to the patent, and radio-astronomy-derived technology generated some hard cash for a research institute that mainly pursues science.
Just one more example of how investment in fundamental science, without explicit reference to marketability or industry applications, often produces revolutionary technology that does have economic applications.
I still remember the shock when my father told me he had connected his laptop to the internet without a cable. I'd heard of wireless networking but didn't know it was a standard feature in laptops at the time, and that all you needed was to find a wifi access point.
We tested Wifi-7 in our lab due to planned migrations. It's a huge quality mess right now.
Either MLO doesn't work correctly or the drivers of the modems (we tested Intel, Mediatek, Qualcomm, etc.) hit the shitter.
For my private stuff I stick to Wifi-6 and wait until Wifi-8 arrives. Finally having "friendly coordinated handovers" between APs is one of the biggest wins for me.
In case this still isn't common knowledge: you should treat WiFi 7 hardware as the best WiFi 6E solution. I.e., the latest generation of WiFi is the best version of the last generation.
Just like when WiFi 6 came out, OFDMA didn't work well or wasn't even turned on by default. The same thing happened with WiFi 5 MU-MIMO, and now with WiFi 7 MLO. Expect MLO to only work properly with WiFi 8.
And all the latency reductions and reliability upgrades coming with WiFi 8? Expect those to work well in WiFi 9.
The tone of the article sounds so breathlessly over-excited that it's bordering on self-parody.
The cell phone companies will regret their purchase of 3G spectrum! Those fools, they did not realize their 3G cell towers would soon be rendered obsolete, nay, ridiculous, by my mighty wireless router!
It's not consumers buying consumer electronics, no, it's "an authentic grassroots phenomenon."
It seems to me that, for the most part, it was warranted enthusiasm. Wi-Fi has lived up to most of the wildest predictions and probably achieved even greater adoption than anyone could have imagined back in the early days.
> The tone of the article sounds so breathlessly over-excited that it's bordering on self-parody.
Welcome to every Wired article ever, certainly from its inception well into the mid-2000s at least.
Today it's at best amusing, but those were times just 1-1.5 generations ago when that was truly, genuinely generally enjoyed (by techies & youngsters) as neither Tired nor Expired but (Hot)Wired, and as a needed/welcome breath of air in an ocean of seemingly-immutable last-century whiffs & echoes =)
When my parents had a house built in the early 2000s, my father was adamant that Ethernet should be wired to every room. It seemed like a good way to future-proof the building for the 21st century at the time. The year we moved in, tweenage me asked about connecting my Nintendo DS to the internet in order to play Animal Crossing online.
I wonder if we would have done the Ethernet again if he knew that Wi-Fi was going to become so common.
Even with wifi, big houses or tough RF environments need mesh units to get ubiquitous wifi coverage. And there, ethernet wired backhaul is far, far superior to wireless. So maybe your dad was prescient in a different way.
The issue with wiring your house for Ethernet is that 2003-era Cat5 that a random builder or DIYer grabs from Home Depot isn't going to carry nearly as much as the Cat6A cable you would want if you need the cable plant to have a chance of keeping up with network capacity growth. But that needs quality installation.
I had a fairly extensive Ethernet and audio speaker setup in the course of a couple of house renovations. Much of that is trashed from smoke mitigation after a kitchen fire. Will pretty much just use WiFi from here on out.
> Like other open spectrum technologies rising in its wake, Wi-Fi is a way to use the handful of frequencies set aside for unrestricted consumer use. That's true of the old CB radio, too, but unlike the trucker channels Wi-Fi is digital and smart enough to avoid congestion. After 100 years of regulations that assumed serious wireless technologies were fragile and in need of protection by monopolies on exclusive frequencies (making spectrum the most valuable commodity of the information age), Wi-Fi is fully capable of protecting itself.
It’s true that, unlike other wireless transmission technologies, Wi-Fi allows any company to make a product that can transmit or receive on all frequency bands authorized by a country, whereas for mobile networks, for example, each operator acquires exclusive rights to a frequency band.
That shows that open standards work well and enable healthy competition.
> A box the size of a paperback, and costing no more than dinner for two, magically distributes broadband Internet to an area the size of a football field. A card no larger than a matchbook receives it.
An interesting historical document for studying the unit systems used in 2003.
Router/Access point: Matchbox, eg: TP-Link TL-WR802N (57 × 57 × 18 mm) [6] or GL-Inet GL-MT300N (58 x 58 x 25mm) [7]
WiFi USB Client: Fingernail, eg: Asus USB-AX56 adapter (25.5 x 16 x 9mm) [8]
WiFi Card Client: Postage stamp, eg: Intel Dual-Band Wireless Adapter AC-7260NGW M.2 2230 (22mm x 30mm x 2.4mm) [10]
802.11n Wireless N range: theoretically 230 ft (70 m) indoors and 820 ft (250 m) outdoors, which is approximately the width of one FIFA soccer field (105 x 68 m) for indoor range, and about 2 soccer fields end-to-end by 3 side-to-side for outdoor range. Newer protocols do not seem to have extended the range of individual radios, instead relying on repeaters or range-extender devices for additional coverage.
Some of these are slightly older devices dating from around 2020, but the sizes should generally still be approximately the same in 2025.
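The football-field comparison above can be sanity-checked with a few lines of arithmetic (pitch dimensions and ranges are the figures quoted above; the ranges are theoretical best cases):

```python
# Compare the quoted 802.11n ranges against a FIFA pitch (105 m x 68 m).
indoor_range_m = 70    # theoretical indoor range
outdoor_range_m = 250  # theoretical outdoor range
pitch_length_m, pitch_width_m = 105, 68

print(round(indoor_range_m / pitch_width_m, 2))    # pitch widths covered indoors (~1)
print(round(outdoor_range_m / pitch_length_m, 2))  # pitch lengths end-to-end outdoors (~2.4)
print(round(outdoor_range_m / pitch_width_m, 2))   # pitch widths side-to-side outdoors (~3.7)
```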
Router/Access point: Mobile operating systems like WebOS, Android, and iOS have had hotspot capabilities built in since around 2010 (Mobile Hotspot on Palm Pre and Pixi Plus in 2010 [1], Wifi Tethering in Android 2.2 Froyo in 2010 [2], and Personal Hotspot in iOS 4.3 for iPhone 4 and 3GS in 2011 [3]). Today you can configure your phone to become a personal Wifi router with one button press [4], [5].
For a discrete router device, TP-Link TL-WR802N (57 × 57 × 18 mm) [6] and GL-Inet GL-MT300N (58 x 58 x 25mm) [7] are matchbox-sized travel router devices that fit into the palm of your hand, cost less than $30 USD, and consume less than 2.75W (they can be run from a basic 5V, 1A USB connection).
Wifi Client: Wifi capabilities are now built into mobile phone, TV, and many camera chipsets; no separate card is required. If you want a separate WiFi adapter, you can buy a USB adapter that is about the size of a fingernail or 5 small coins stacked together (eg: 5 US pennies, or 5 Euro 1-cent coins); basically take the metal part of a USB-A male connector and extend it a bit. This Asus USB-AX56 adapter [8] is 25.5 x 16 x 9mm and supports WiFi 6 (802.11ax) with theoretical speeds up to 9.6 Gbps (actual real-world speeds appear closer to 800 Mbps). Streams in Wifi 6 can be up to 160 MHz wide via channel bonding of up to 8 adjacent 20 MHz channels, but in practice this is only feasible in locations with low or no interference. Note that wider channels have a reduced effective range and perform poorly at a distance or through obstructions.
Discrete Wifi cards can still be found in some laptop computers (although many Wifi cards are now soldered down); one of the more recent (introduced around 2013 to 2015) and smaller sizes is M.2 2230 (22mm x 30mm x 2.4mm), which is also the form factor for some M.2 PCI-Express SSDs (solid-state storage drives, typically flash devices). These are about the size of a postage stamp, weigh around 2.3 grams, and cost around $20 USD or less.
One example is Intel AX210 [9] which supports Wi-Fi 6E, on 3 bands: 2.4 / 5 / 6GHz in a 2x2 configuration (2 TX transmit and 2 RX receive antennas) at speeds of 2.4 Gbps, and also features Bluetooth 5.3.
A popular previous-generation device is Intel Dual-Band Wireless Adapter AC-7260NGW [10] with good support for GNU/Linux, again with a M.2 2230 (22mm x 30mm x 2.4mm) form-factor approximately the size of a postage stamp, supporting Wifi 5 802.11ac with up to 867 Mbps theoretical bandwidth on dual bands in a 2x2 configuration (2 TX transmit and 2 RX receive radios), and 433 Mbps per stream. Channels can be up to 80MHz wide via channel bonding of 4 adjacent 20 MHz channels.
An alternative, earlier form factor was Half Mini-PCI-Express, for example Intel Dual-Band Wireless Adapter AC-7260HMW [10] at 26.80 x 30 x 2.4 mm.
Side note, it's interesting how common it is for tech-savvy people to wire their homes for ethernet (more common now than 10-15 years ago) and how it is still common, or at least not rare, for people reliant on wi-fi to suffer from video streaming issues. The underlying technology keeps getting better, so maybe the improvements will outpace the growth in congestion at some point -- fingers crossed that makers of apps and household appliances don't eat up all future gains and keep us stuck in the same place.
Is congestion still an issue? Seems to me like after the switch from 2.4ghz to 5ghz, congestion stopped being a problem since wifi hardly leaves your own home. Amusingly, in my apartment sitting on the balcony, shutting the glass door would cause a total loss of connection, while leaving it open resulted in a very strong connection.
The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
5GHz certainly helps, but congestion/co-channel interference can still be an issue in high density environments, especially in a multi-user environment like an apartment complex where nothing is coordinated. The addition of 6GHz will help alleviate this problem too, but a lot of consumer gear seems to default to the widest channels possible.
Also, your glass door probably has Low-E glass which has a metallic coating.
> The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
This is somewhat the case, but it is limited. For example, in 5GHz there are 21x 20MHz channels available. In a highly dense environment, this can support roughly 30 devices per channel well and 50 devices per channel with some degradation.
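As a rough sketch of that math (the 21-channel count and the per-channel device figures are the estimates from this comment, not a standard):

```python
# Back-of-the-envelope 5 GHz capacity per the estimates above.
channels = 21     # non-overlapping 20 MHz channels in the 5 GHz band
well_served = 30  # devices per channel with good performance
degraded = 50     # devices per channel with some degradation

print(channels * well_served)  # 630 devices served well in one dense area
print(channels * degraded)     # 1050 as an upper bound before things fall apart
```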
Limiting the TX power on an AP can help, but it's not a panacea since clients always transmit their control frames at their default power (usually ~15dBm). There have been some improvements to this in .11ax, but depending on the spatial organization of the devices, it can only do so much.
It’s not like everything has moved off 2.4ghz - I’m in a SFH in a relatively low-density suburb and everything in my house that can be wired is, but I still typically see ~50% congestion on the 2.4ghz radios.
Yeah I just had a look and I'm seeing a reasonable amount of interference on 2.4ghz, very little on 5ghz, and then nothing at all on 6ghz.
My guess is people with large houses and only one wifi AP are probably using 2.4 when their devices are out of range for 5. But other people doing this probably doesn't impact you at all since you can just make sure you have good coverage for 5ghz and enjoy uncongested wifi.
> The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
Well, what if you want wifi in your garden? Maybe you own a few acres of property and you want wifi for a wedding? Now you actually have to do a minor bit of planning. Supported wifi versions, max EIRP, range, modulation rate, throughput, XPIC, coverage area, frequency, beamwidth, MU-MIMO, does the frequency require prior coordination with any government entity, etc.
I'm just giving a different use-case on the other end of the spectrum, to be fair. I agree 100% with your analysis however, we're going to mmwave frequency ranges with small but many APs. Massive Multi User MIMO, etc.
Oh, how I wish. I have 3 Firewalla AP7s to get decent coverage through my house. Its lath and plaster walls may as well be lead lined. You could put a CT scanner in my living room and not notice a thing 2 rooms away.
Lath and plaster walls have a distressing tendency to be lined with chicken wire mesh which really damages your chances of any radio signal getting through.
I’ve strung CAT 6e from one end of the house to the other to link 2 of them. The 3rd’s in a place not amenable to cabling without way more effort than I’m up for, but it’s close enough to one of them that they mesh alright.
My first use of Wi-Fi was for "broadband" internet in very early 2000's. It wasn't that fast, but it was pretty cool. The access point was on a mountain top about 7 miles from my condo. My antenna was a parabolic aluminum grid in my attic. I think the permitted bandwidth was about 400 Kbps. The transceivers were Cisco Aironet 802.11b devices.
That was my main Internet uplink for 5 or more years. About half way through I moved to another house and mounted the antenna outside on the roof for more gain, because the distance increased to about 11 miles. Caught some grief from the HOA, but I kept it up.
In 2000 my neighbour built a small network using two Orinoco Gold cards - an ad-hoc[1] network between his laptop (A Sony with a Neomagic chipset, I don't remember the precise model but it was beautiful) and the desktop in his room, and this was
(a) utterly magical
(b) his father was the son of someone very high up in one of the Scottish banks and so this was affordable for him and clearly outside the range of normal people
In 2001 I bought a set of Prism 2 based cards that let me run HostAP (https://hostap.epitest.fi/) and was able to build my own network that didn't rely on ad-hoc mode and so everything was better but the speed at which all of this changed was incredible - we went from infrastructure being out of the reach of normal humans to it being a small reach, and by 2005 we were in the territory of all laptops having it by default. It was an incredible phase shift.
[1] ad hoc was a way for wifi cards to talk to each other without there being an access point, and there was a period where operating systems would show ad-hoc devices as if they were access points, and Windows would remember the last ad-hoc network you'd joined and would advertise that if nothing else was available, and this led to "Free Internet Access" being something that would show up because it was an ad-hoc network someone else advertised and obviously you'd join that and then if you had no internet your laptop would broadcast it and someone else would join it and look the internet was actually genuinely worse in the past please stop assuming everything was better
And the FCC just so happened to approve the spectrum of frequencies that human bodies absorb, turning each Wi-Fi hotspot into a surveillance spotlight, and each handheld device into a unique beacon. With everything we know about the NSA's influence in other government agencies (like NIST), I think it's entirely reasonable to ask, "why 2.4 GHz?" But I've not seen anyone ask that question here. I'd also wonder whether the NRO has satellite capability to measure Wi-Fi signals (and interference from human bodies) from orbit.
2.4GHz was used for microwave ovens and thus the spectrum was reserved for their interference. Or rather, the spectrum was made free for low power uses because Serious Business couldn’t be done on those frequencies due to the microwave ovens.
While that provides a plausible origin story, it doesn't explain why the 2.4 GHz carrier frequency has been sustained for so long. For toy or prototype purposes, sure, the FCC could say "put them next to the microwave ovens." But Wi-Fi is, at this point, a critical national security utility, or even as you put it yourself, "serious business."
I'm not a radio engineer, but it doesn't take that many brain cells to ask: for a handheld/laptop device, why choose a carrier wave frequency absorbed by the body holding it, and by the metallic electronics sitting beside it? Logically, that's one of the most energy-inefficient frequencies one could choose, and a terrible design choice for personal wireless communication technology. I think a good engineer would want to conserve power and not be blocked by the very body holding the device.
However, as the future unfolded, we now have nearly every household with a bright radiant point light casting human-shaped shadows, trivially reconstructed, revealing not only the body's silhouette but its heartbeat and respiration, too.
And with everything we know, with leaks going back decades about the abuses of government power (surveilling their own citizenry; recording, analyzing, and manipulating the population in subtle ways, to the financial benefit of a handful of billionaires and the political benefit of media-savvy pawns), why are these basic technological choices not questioned more?
(mastax, I'm replying to you because you're top reply, but felt it important to continue my original point.)
TL;DR because the FCC regulates available frequency bands, and 900MHz, 2.4GHz, and 5GHz were the ones that were 1) the right combination of high enough to be fast and low enough to be energy efficient and easy to generate, and 2) actually available for use at the time.
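The high-enough/low-enough tradeoff tracks wavelength; as a quick illustration (this is just the textbook free-space formula, nothing band-plan specific):

```python
# Free-space wavelength (lambda = c / f) for the three unlicensed bands above.
# Higher frequencies offer more spectrum (hence speed) but penetrate
# walls and bodies worse than longer wavelengths do.
C = 299_792_458  # speed of light, m/s

for label, freq_hz in [("900 MHz", 900e6), ("2.4 GHz", 2.4e9), ("5 GHz", 5e9)]:
    wavelength_cm = C / freq_hz * 100
    print(f"{label}: ~{wavelength_cm:.1f} cm")  # ~33.3, ~12.5, ~6.0 cm respectively
```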
A curious artifact of the older days: even now, a surprising number of devices will caution about the dangers of connecting to an SSID named linksys, even if it's WPA3 with modern 802.11ax on both the AP and client end.
I remember the jump from 802.11b to g was profound. Speed was no longer a luxury. You could browse while torrenting an MP3 file at the same time, wirelessly! It was the golden era of the Internet :)
> I have a friend who, no matter how many times I have explained it, calls her Internet connection WiFi.
She's not alone - here's an example (from today!) of an HN user using WiFi to mean cellular/mobile data
https://news.ycombinator.com/item?id=45620379
My mother too. She can't understand how she can access the internet anywhere in nature if there's no wifi.
> It really changes the mentality to a more child-like thinking, at the mercy of some opaque system.
Brawndo: it's got what plants crave.
It's why I became an electrician, and specialised in telecoms. I realised in my 20's that I didn't want to spend the rest of my life typing on a keyboard, and telecoms is the right kind of combination of geekery and practical work. Plus I know all the fundamental electrician stuff. I highly recommend this career path if you feel the same way. My last job was in robotics where my combined skillset of being a Linux-focused computer+networking geek and having electrical skills made me pretty unique. Lately my career has taken a bizarre turn where I do AV, networking and electrical work, plus... swimming pool maintenance. Life, uh, finds a way.
Anyways, apart from knowing how a car works on a theoretical level, I have no idea how to fix mine, and getting fucked at the dealership is a part of my life I have begrudgingly accepted.
Thanks for reading my blog.
I had a similar revelation watching a friend’s son ask him how to use a cell phone. When I was young, it was always the other way around. And it still is: I keep helping my parents. But now I wonder if I’ll end up as the IT guy in both generational directions.
I think if you're not "on Wifi" then you're "on data" (or perhaps "on 5G" if it's a social status thing). If one were to connect via Starlink then it would still be correct to say you were on Wifi I think but if you could hook up ethernet directly... I think most of us would say you're now "on satellite"?
I didn't even consider that the newest iPhones can connect to satellites directly.
Yeah, that’s true. I even thought about that when I was typing my comment but wasn’t sure the best way to articulate the difference, but I think you are right with it being about home internet vs cellular.
Although I really think it is just used to mean “non-phone based internet”, rather than just home internet.
It means Internet where your monthly data cap is not consumed.
Please make sure to throw away your Kleenex before stepping onto the Escalator.
I don’t think people realize that Wi-Fi is a brand name for 'IEEE 802.11b Direct Sequence'. "WiFi", "Wifi", and "wifi" are not approved by the Wi-Fi Alliance. Despite common belief, the name Wi-Fi is not short for 'Wireless Fidelity'.
Would an adapter in promiscuous mode technically be Wireless Infidelity?
In my country, "GIGA" means "data transfer quota" and "USB" means "USB storage".
It really was revolutionary. Surprisingly the biggest target market for WiFi ended up being phones, which already have a wireless connection to the Internet.
2003 WiFi was routinely awful, though. Generally unstable, poor compatibility and lousy range. A lot better now, but still could be easier for non-techs.
Also unsecure. I remember driving around looking for Wi-Fi to steal internet from, I routinely found network shares full of sensitive documents. And I only looked for open WiFi and wasn't even trying to hack anything.
If I actually wanted to hack into networks, encrypted WiFi used WEP which could be cracked in minutes on a typical laptop. Most communication was unencrypted too, pwning entire WiFi networks wasn't even fun considering how easy it was.
In the early 2000s hotels also still routinely charged for WiFi. But someplace like NY, you could usually find an open ssid within range. But, yes, in that period many WiFi transmissions didn’t have a password.
Yikes. I forgot you could get free WiFi practically anywhere you could get a signal because everyone's WiFi was open or easy to hack.
I miss when free wifi was everywhere. When I was traveling in Italy my phone seemed to almost never have signal despite my paying for roaming data. And I couldn’t find free wifi anywhere. The few places that did have it, like McDonald's, wanted me to authenticate with SMS, which wasn’t working since I had no phone signal.
Used to be that every cafe and business just had an open wifi network. Now if they provide it at all it’s password protected
Part of the general decrease of trust.
Or just greater awareness of possible bad outcomes if you just leave your WiFi open.
> Also unsecure.
One of my early IT jobs in the 2000s was at a SME with Wifi: after you connected radio-wise, you had to start the VPN client, because at the time there really wasn't any (effective) encryption of the signal in 802.11 itself.
I think it is more correct to say open wifi made it easier to connect to otherwise already insecure systems. After all, anybody in the house with an Ethernet cable could get those sensitive documents, right? Open wifi just expected users to actually follow the mantra of the time: don’t trust the infrastructure!
>the biggest target market for WiFi ended up being phones
If a particular category is considered then yes, phones are the biggest chunk. But virtually every device these days comes with WiFi. So wifi is now the default method of connecting something.
Biggest market agreed. But relative impact on utility of laptops seems enormous.
Certainly before every electronic device on WiFi became ubiquitous.
Interestingly, I see an increasing number of young-ish people that simply skip "landline" ISPs (hence WiFi) entirely and only use their phone.
Mostly because around here you can have 100GB over 5G for less than 10€ + they mostly don't use computers (a.k.a laptops) except for a) school (where they have free WiFi+Internet) and b) binge-watching the occasional Netflix (and then they use connection sharing)
Their first move upon setting up a new phone is to disable Bluetooth+WiFi to, uh, "save battery" (their cargo-cult answer, every single time)
That is increasingly true and will only get more mainstream as time goes by. Why pay for home WiFi (Internet) when you have 5G / mobile data? 5G SA will further increase mobile network capacity, and the mobile network will be all you need. MNOs, especially those without cable / fibre-to-the-home infrastructure, are already promoting 5G home solutions as a replacement.
This has also happened for entire countries.
I had 2005 wifi with 56k dial-up, so even at the wifi’s worst I could sustain my full internet speed.
What they didn't mention in the article, and in most Wi-Fi historical narratives, is the critical contribution of OFDM modulation waveform technology, an idea that originated in and was patented by the radio astronomy research of CSIRO Australia [1], [2].
In the early days of Wi-Fi, the IEEE 802.11 group was still testing spread spectrum and OFDM, with 802.11b and 802.11a respectively. But it soon became apparent that the best bandwidth comes from the proper orthogonality of the wireless modulation, a.k.a. OFDM [1].
At the time of the OP article back in 2003, the incumbent cellular modulation of 3G was still the spread-spectrum-based CDMA system, but by 4G it was OFDM all-in, and the rest is history. CSIRO became much richer thanks to the patent, and radio-astronomy-based technology generated some hard cash for a research institute that mainly pursues science.
[1] Orthogonal frequency-division multiplexing (OFDM):
https://en.wikipedia.org/wiki/Orthogonal_frequency-division_...
[2] How the Aussie government "invented WiFi" and sued its way to $430 million [PDF]:
https://www.vbllaw.com/wp-content/uploads/2020/11/How-The-Au...
Just one more example of how investment in fundamental science, without explicit reference to marketability or industry applications, often produces revolutionary technology that does have economic applications.
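The orthogonality that makes OFDM work can be checked numerically: subcarriers spaced at integer multiples of 1/N cycles per sample have zero inner product over one symbol, so they can be packed tightly without interfering. A minimal pure-Python sketch (using N=64, the FFT size of a 20 MHz 802.11a/g symbol; the function name is my own, not from any spec):

```python
import math

def correlate(k, m, N=64):
    """Normalized inner product of subcarriers k and m, i.e. the sum of
    exp(j*2*pi*(k-m)*n/N) over one OFDM symbol of N samples."""
    re = sum(math.cos(2 * math.pi * (k - m) * n / N) for n in range(N))
    im = sum(math.sin(2 * math.pi * (k - m) * n / N) for n in range(N))
    return complex(re, im) / N

print(abs(correlate(3, 3)))  # 1.0 -- a subcarrier correlates with itself
print(abs(correlate(3, 5)))  # ~0  -- distinct subcarriers are orthogonal
```

Any integer spacing gives the same near-zero result, which is why the receiver can separate all subcarriers with a single FFT.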
I still remember the shock when my father told me he had connected his laptop to the internet without a cable. I'd heard of wireless networking but didn't know it was a standard feature in laptops at the time, and that all you needed was to find a Wi-Fi access point.
We tested Wifi-7 in our lab due to planned migrations. It's a huge quality mess right now.
Either MLO doesn't work correctly or the drivers of the Modems (we tested Intel, Mediatek, Qualcomm, etc.) hit the shitter.
For my private stuff I stick to Wifi-6 and wait until Wifi-8 arrives. Finally having "friendly coordinated handovers" between APs is one of the biggest wins for me.
In case this still isn't common knowledge: you should treat WiFi 7 hardware as the best WiFi 6E solution. I.e., the latest generation of WiFi hardware is the best implementation of the last generation.
Just like when WiFi 6 came out, OFDMA didn't work well or wasn't even turned on by default. The same thing happened with WiFi 5 MU-MIMO, and now with WiFi 7 MLO. Expect MLO to only really work with WiFi 8 hardware.
And all the latency reduction and reliability upgrades promised for WiFi 8? Expect those to work well in WiFi 9.
Wifi 7 without MLO works just fine and quickly though.
The tone of the article sounds so breathlessly over-excited that it's bordering on self-parody.
The cell phone companies will regret their purchase of 3G spectrum! Those fools, they did not realize their 3G cell towers would soon be rendered obsolete, nay, ridiculous, by my mighty wireless router!
It's not consumers buying consumer electronics, no, it's "an authentic grassroots phenomenon."
It seems to me that, for the most part, it was warranted enthusiasm. Wi-Fi has lived up to most of the wildest predictions and probably achieved even greater adoption than anyone could have imagined back in the early days.
> The tone of the article sounds so breathlessly over-excited that it's bordering on self-parody.
Welcome to every Wired article ever, certainly from its inception well into the mid-2000s at least.
Today it's at best amusing, but those were times just 1-1.5 generations ago when that was truly, genuinely generally enjoyed (by techies & youngsters) as neither Tired nor Expired but (Hot)Wired, and as a needed/welcome breath of air in an ocean of seemingly-immutable last-century whiffs & echoes =)
When my parents had a house built in the early 2000s, my father was adamant that Ethernet should be wired to every room. It seemed like a good way to future-proof the building for the 21st century at the time. The year we moved in, tweenage me asked about connecting my Nintendo DS to the internet in order to play Animal Crossing online.
I wonder if we would have done the Ethernet again if he knew that Wi-Fi was going to become so common.
Even with wifi, big houses or tough RF environments need mesh units to get ubiquitous wifi coverage. And there, ethernet wired backhaul is far, far superior to wireless. So maybe your dad was prescient in a different way.
The issue with wiring your house for Ethernet is that 2003-era Cat5 that a random builder or DIYer grabs from Home Depot isn't going to carry nearly as much as the Cat6A cable you would want if you need the cable plant to have a chance of keeping up with network capacity growth. But that needs quality installation.
I had a fairly extensive Ethernet and audio speaker setup in the course of a couple of house renovations. Much of that is trashed from smoke mitigation after a kitchen fire. Will pretty much just use WiFi from here on out.
> I wonder if we would have done the Ethernet again if he knew that Wi-Fi was going to become so common.
Today, if you're wiring up a house, you put ethernet drops everywhere.
PoE is a thing, and it's getting more popular.
Cameras, blinds, MM wave... It's almost to the point where one should be putting a media box in every closet as a mini wiring hookup.
Ethernet in each room remains valuable for other reasons, such as set top box devices, etc.
The issue wasn't whether wifi was going to become so common, it was the guaranteed improvement in reliability and speed of wifi.
Anyone could use ethernet, and still can.
https://archive.is/8tLce
> Like other open spectrum technologies rising in its wake, Wi-Fi is a way to use the handful of frequencies set aside for unrestricted consumer use. That's true of the old CB radio, too, but unlike the trucker channels Wi-Fi is digital and smart enough to avoid congestion. After 100 years of regulations that assumed serious wireless technologies were fragile and in need of protection by monopolies on exclusive frequencies (making spectrum the most valuable commodity of the information age), Wi-Fi is fully capable of protecting itself.
It’s true that, unlike other wireless transmission technologies, Wi-Fi allows any company to make a product that can transmit or receive on all frequency bands authorized by a country, whereas for mobile networks, for example, each operator acquires exclusive rights to a frequency band.
That shows that open standards work well and enable healthy competition.
Remember the Steve Jobs presentation where he put an iBook through a hula hoop to prove there are no cables? Classic
> A box the size of a paperback, and costing no more than dinner for two, magically distributes broadband Internet to an area the size of a football field. A card no larger than a matchbook receives it.
An interesting historical document for studying the unit systems used in 2003.
What might be the 2025 equivalents?
Router/Access point: Matchbox, eg: TP-Link TL-WR802N (57 × 57 ×18 mm) [6] or GL-Inet GL-MT300N (58 x 58 x 25mm) [7]
WiFi USB Client: Fingernail, eg: Asus USB-AX56 adapter (25.5 x 16 x 9mm) [8]
WiFi Card Client: Postage stamp, eg: Intel Dual-Band Wireless Adapter AC-7260NGW M.2 2230 (22mm x 30mm x 2.4mm) [10]
802.11n Wireless N range: theoretically 230 ft (70 m) range indoors and 820 ft (250m) range outdoors, which is approximately 1 FIFA soccer field's (105 x 68 m) width for indoor range, and 2 soccer fields end-to-end by 3 side-to-side for outdoor range. Newer protocols do not seem to have extended the range of individual radios but rather rely on repeaters or range extender devices to provide additional coverage.
https://www.hummingbirdnetworks.com/articles/what-is-the-dis...
https://en.wikipedia.org/wiki/IEEE_802.11#Protocol
https://publications.fifa.com/de/football-stadiums-guideline...
Some of these are slightly older devices dating from around 2020, but the sizes should generally still be approximately the same in 2025.
Router/Access point: Mobile operating systems like WebOS, Android, and iOS have had hotspot capabilities built in since around 2010 (Mobile Hotspot on Palm Pre and Pixi Plus in 2010 [1], Wifi Tethering in Android 2.2 Froyo in 2010 [2], and Personal Hotspot in iOS 4.3 for iPhone 4 and 3GS in 2011 [3]). Today you can configure your phone to become a personal Wifi router with one button press [4], [5].
[1] https://www.cnn.com/2010/TECH/01/07/ces.palm.pre.plus.pixi/i...
https://web.archive.org/web/20101125023042/http://articles.c...
https://archive.is/UD1Dm
[2] https://www.wired.com/2010/05/android-22-froyo-features-usb-...
https://web.archive.org/web/20140707210122/https://www.wired...
https://archive.is/kjH1N
[3] https://www.engadget.com/2011-03-09-ios-4-3-spotlight-person...
https://web.archive.org/web/20251018061108/https://www.engad...
https://archive.is/QBRod
[4] https://support.google.com/android/answer/9059108?hl=en
[5] https://support.apple.com/guide/iphone/share-your-internet-c...
For a discrete router device, TP-Link TL-WR802N (57 × 57 ×18 mm) [6] and GL-Inet GL-MT300N (58 x 58 x 25mm) [7] are matchbox-sized travel router devices that fit into the palm of your hand, cost less than $30 USD, and consume less than 2.75W (they can be run from a basic 5V, 1A USB connection).
[6] https://www.tp-link.com/us/home-networking/wifi-router/tl-wr...
https://web.archive.org/web/20250724184720/https://www.tp-li...
[7] https://www.gl-inet.com/products/gl-mt300n-v2/
https://web.archive.org/web/20250822045920/https://www.gl-in...
Wifi Client: Wifi capabilities are now built into mobile phone, TV, and many camera chipsets; no separate card is required. If you want a separate WiFi adapter, you can buy a USB adapter that is about the size of a fingernail or 5 small coins stacked together (eg: 5 US pennies, or 5 Euro 1-cent coins): basically take the metal part of a USB-A male connector and extend or extrude it a bit. This Asus USB-AX56 adapter [8] is 25.5 x 16 x 9mm and supports WiFi 6 (802.11ax) with theoretical speeds up to 9.6 Gbps (actual real-world speeds appear closer to 800 Mbps). Channels in Wifi 6 can be up to 160MHz wide via channel bonding of up to 8 adjacent 20 MHz channels, but in practice this is only feasible in locations with low or no interference. Note that wider channels have a reduced effective range and perform poorly at a distance or through obstructions.
[8] https://www.asus.com/networking-iot-servers/adapters/all-ser...
https://web.archive.org/web/20251018061524/https://www.asus....
https://www.wi-fi.org/wi-fi-macphy
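The 9.6 Gbps headline number above can be reconstructed from the standard 802.11ax PHY parameters. This is a back-of-the-envelope sketch, not a vendor spec; the constants are the top-MCS values for a 160 MHz channel:

```python
# Rough reconstruction of the Wi-Fi 6 "9.6 Gbps" theoretical maximum.
DATA_SUBCARRIERS_160MHZ = 1960   # data tones in a 160 MHz 802.11ax channel
BITS_PER_SYMBOL = 10             # 1024-QAM carries 10 bits per tone
CODING_RATE = 5 / 6              # highest MCS coding rate
SYMBOL_DURATION_S = 13.6e-6      # 12.8 us symbol + 0.8 us guard interval
SPATIAL_STREAMS = 8              # maximum the standard allows

per_stream_bps = (DATA_SUBCARRIERS_160MHZ * BITS_PER_SYMBOL * CODING_RATE
                  / SYMBOL_DURATION_S)
total_gbps = per_stream_bps * SPATIAL_STREAMS / 1e9
print(f"{per_stream_bps / 1e6:.0f} Mbps per stream, {total_gbps:.1f} Gbps total")
# -> 1201 Mbps per stream, 9.6 Gbps total
```

A 2x2 adapter like the USB-AX56 gets at most two of those eight streams, and real-world throughput is far lower still, which squares with the ~800 Mbps observation above.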
Discrete Wifi cards can still be found in some laptop computers (although many Wifi cards are now soldered down), one of the more recent (introduced around 2013 to 2015) and smaller sizes is M.2 2230 (22mm x 30mm x 2.4mm) which is also the form factor for some M.2 PCI-Express SSDs (solid state storage drives, typically flash devices). These are about the size of a postage stamp, weigh around 2.3 grams, and cost around $20 USD or less.
One example is Intel AX210 [9] which supports Wi-Fi 6E, on 3 bands: 2.4 / 5 / 6GHz in a 2x2 configuration (2 TX transmit and 2 RX receive antennas) at speeds of 2.4 Gbps, and also features Bluetooth 5.3.
[9] https://www.intel.com/content/www/us/en/products/sku/204836/...
A popular previous-generation device is Intel Dual-Band Wireless Adapter AC-7260NGW [10] with good support for GNU/Linux, again with a M.2 2230 (22mm x 30mm x 2.4mm) form-factor approximately the size of a postage stamp, supporting Wifi 5 802.11ac with up to 867 Mbps theoretical bandwidth on dual bands in a 2x2 configuration (2 TX transmit and 2 RX receive radios), and 433 Mbps per stream. Channels can be up to 80MHz wide via channel bonding of 4 adjacent 20 MHz channels.
An alternative, earlier form factor was Half Mini-PCI-Express, for example the Intel Dual-Band Wireless Adapter AC-7260HMW [10] at 26.80 x 30 x 2.4 mm.
[10] https://www.mouser.com/datasheet/2/612/dual-band-wireless-ac...
https://web.archive.org/web/20251018061520/https://www.mouse...
Side note, it's interesting how common it is for tech-savvy people to wire their homes for ethernet (more common now than 10-15 years ago) and how it is still common, or at least not rare, for people reliant on wi-fi to suffer from video streaming issues. The underlying technology keeps getting better, so maybe the improvements will outpace the growth in congestion at some point -- fingers crossed that makers of apps and household appliances don't eat up all future gains and keep us stuck in the same place.
Is congestion still an issue? It seems to me that after the switch from 2.4GHz to 5GHz, congestion stopped being a problem, since 5GHz wifi hardly leaves your own home. Amusingly, sitting on the balcony of my apartment, shutting the glass door would cause a total loss of connection, while leaving it open gave a very strong connection.
The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
5GHz certainly helps, but congestion/co-channel interference can still be an issue in high density environments, especially in a multi-user environment like an apartment complex where nothing is coordinated. The addition of 6GHz will help alleviate this problem too, but a lot of consumer gear seems to default to the widest channels possible.
Also, your glass door probably has Low-E glass which has a metallic coating.
> The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
This is somewhat the case, but it is limited. For example, in 5GHz there are 21x 20MHz channels available. In a highly dense environment, this can support roughly 30x devices per channel well and 50x devices per channel with some degradation.
Limiting the TX power on an AP can help, but it's not a panacea since clients always transmit their control frames at their default power (usually ~15dBm). There have been some improvements to this in .11ax, but depending on the spatial organization of the devices, it can only do so much.
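Taking the rule-of-thumb figures above at face value (they are estimates from this comment, not spec values), the capacity ceiling of a dense 5GHz deployment is simple arithmetic:

```python
import math

# Rule-of-thumb assumptions from the discussion above, not spec values:
CHANNELS_5GHZ = 21      # non-overlapping 20 MHz channels available in 5 GHz
GOOD_PER_CHANNEL = 30   # devices served well per channel
DEGRADED_PER_CHANNEL = 50  # devices per channel with some degradation

def min_channels_needed(devices):
    """Distinct channels (i.e. co-located APs on separate channels)
    needed to serve `devices` clients well, by this rule of thumb."""
    return math.ceil(devices / GOOD_PER_CHANNEL)

print(min_channels_needed(100))               # 4 channels for 100 clients
print(CHANNELS_5GHZ * GOOD_PER_CHANNEL)       # 630 clients served well, max
print(CHANNELS_5GHZ * DEGRADED_PER_CHANNEL)   # 1050 with degradation
```

That ~630-device ceiling per contention domain is why dense venues push toward more, lower-power APs (and now 6GHz) rather than fewer, louder ones.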
It’s not like everything has moved off 2.4ghz - I’m in a SFH in a relatively low-density suburb and everything in my house that can be wired is, but I still typically see ~50% congestion on the 2.4ghz radios.
Yeah I just had a look and I'm seeing a reasonable amount of interference on 2.4ghz, very little on 5ghz, and then nothing at all on 6ghz.
My guess is people with large houses and only one wifi AP are probably using 2.4 when their devices are out of range for 5. But other people doing this probably doesn't impact you at all since you can just make sure you have good coverage for 5ghz and enjoy uncongested wifi.
> The future is probably just having multiple wifi APs wired up and then just running extremely fast but low range wifi.
Well, what if you want wifi in your garden? Maybe you own a few acres of property and you want wifi for a wedding? Now you actually have to do a minor bit of planning. Supported wifi versions, max EIRP, range, modulation rate, throughput, XPIC, coverage area, frequency, beamwidth, MU-MIMO, does the frequency require prior coordination with any government entity, etc.
I'm just giving a different use-case on the other end of the spectrum, to be fair. I agree 100% with your analysis however, we're going to mmwave frequency ranges with small but many APs. Massive Multi User MIMO, etc.
I would love for a single AP to serve 500mbps throughout a whole house.
Though I would certainly not have complained about 50-100 Mbps throughout in 2003; 1 Gbps wired networking was not mainstream then.
My tp-link ax6000 does that just fine.
Oh, how I wish. I have 3 Firewalla AP7s to get decent coverage through my house. Its lath and plaster walls may as well be lead lined. You could put a CT scanner in my living room and not notice a thing 2 rooms away.
Lath and plaster walls have a distressing tendency to be lined with chicken wire mesh which really damages your chances of any radio signal getting through.
Pictures for anyone wondering what it looks like:
https://www.civilengineermag.com/chicken-mesh-for-plaster/
Wired wifi mesh or Ethernet all over would be my prescription. You basically have Faraday cages in every room right now!
I’ve strung CAT 6e from one end of the house to the other to link 2 of them. The 3rd’s in a place not amenable to cabling without way more effort than I’m up for, but it’s close enough to one of them that they mesh alright.
Ethernet/fiber is far better in that scenario. You may even have better luck using existing coax and moca adapters than wifi in those scenarios.
The Serial Port put up a video saying that Apple really helped push Wifi into the mainstream with the AirPort:
* https://www.youtube.com/watch?v=EhBxWHrG7K8
My first use of Wi-Fi was for "broadband" internet in very early 2000's. It wasn't that fast, but it was pretty cool. The access point was on a mountain top about 7 miles from my condo. My antenna was a parabolic aluminum grid in my attic. I think the permitted bandwidth was about 400 Kbps. The transceivers were Cisco Aironet 802.11b devices.
That was my main Internet uplink for 5 or more years. About half way through I moved to another house and mounted the antenna outside on the roof for more gain, because the distance increased to about 11 miles. Caught some grief from the HOA, but I kept it up.
In 2000 my neighbour built a small network using two Orinoco Gold cards - an ad-hoc[1] network between his laptop (A Sony with a Neomagic chipset, I don't remember the precise model but it was beautiful) and the desktop in his room, and this was
(a) utterly magical (b) his father was the son of someone very high up in one of the Scottish banks and so this was affordable for him and clearly outside the range of normal people
In 2001 I bought a set of Prism 2 based cards that let me run HostAP (https://hostap.epitest.fi/) and was able to build my own network that didn't rely on ad-hoc mode and so everything was better but the speed at which all of this changed was incredible - we went from infrastructure being out of the reach of normal humans to it being a small reach, and by 2005 we were in the territory of all laptops having it by default. It was an incredible phase shift.
[1] ad hoc was a way for wifi cards to talk to each other without there being an access point, and there was a period where operating systems would show ac-hoc devices as if they were access points, and Windows would remember the last ad-hoc network you'd joined and would advertise that if nothing else was available, and this led to "Free Internet Access" being something that would show up because it was an ad-hoc network someone else advertised and obviously you'd join that and then if you had no internet your laptop would broadcast it and someone else would join it and look the internet was actually genuinely worse in the past please stop assuming everything was better
And the FCC just so happened to approve the spectrum of frequencies that human bodies absorb, turning each Wi-Fi hotspot into surveillance spotlight, and each handheld device into a unique beacon. With everything we know about NSA's influence in other government agencies (like NIST), I think it's entirely reasonable to ask, "why 2.4 GHz?" But I've not seen anyone ask that question here. I'd also wonder whether NRO has satellite capability to measure Wi-Fi signals (and interference from human bodies) from orbit.
2.4GHz was used for microwave ovens and thus the spectrum was reserved for their interference. Or rather, the spectrum was made free for low power uses because Serious Business couldn’t be done on those frequencies due to the microwave ovens.
While that provides a plausible origin story, it doesn't stand to reason why the 2.4 GHz carrier frequency has been sustained for so long. For toy or prototype purposes, sure, the FCC could say "put them next to the microwave ovens." But Wi-Fi is, at this point, a critical national security utility, or even as you put it yourself, "serious business."
I'm not a radio engineer, but it doesn't take that many brain cells to raise the question: for a handheld/laptop device, why choose a carrier wave frequency absorbed by the body holding it, and by the metallic electronics sitting beside it? Logically, that's one of the most energy-inefficient frequencies one could choose, and a terrible design choice for personal wireless communication technology. I think a good engineer would want to conserve power and not be blocked by the very body holding the device.
However, as the future unfolded, we now have nearly every household with a bright radiant point light casting human-shaped shadows, trivially reconstructed, to detect not only the body's silhouette but its heartbeat and respiration, too.
And with everything we know, with leaks going back decades about the abuses of government power, surveilling their own citizenry, recording, analyzing, and manipulating the population in subtle ways, to the financial benefit of a handful of billionaires and the political benefit of media-savvy pawns, why are these basic technological choices not being questioned more?
(mastax, I'm replying to you because you're top reply, but felt it important to continue my original point.)
Unfortunately in the real world, truth is far less interesting than fantasy
Less conspiratorially, Wired themselves have an article about that: https://www.wired.com/2010/09/wireless-explainer/
TL;DR because the FCC regulates available frequency bands, and 900MHz, 2.4GHz, and 5GHz were the ones that were 1) the right combination of high enough to be fast and low enough to be energy efficient and easy to generate, and 2) actually available for use at the time.
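The "high enough to be fast, low enough to be efficient" trade-off is visible in the free-space path loss formula, FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44. A quick sketch comparing the three unlicensed bands at a fixed distance (open air only; walls and bodies add further, frequency-dependent loss):

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB:
    20*log10(d in km) + 20*log10(f in MHz) + 32.44."""
    return (20 * math.log10(distance_m / 1000)
            + 20 * math.log10(freq_mhz) + 32.44)

# Loss over 30 m of open air for the three unlicensed bands
for f in (900, 2400, 5000):
    print(f"{f} MHz: {fspl_db(30, f):.1f} dB")
# -> 900 MHz: 61.1 dB, 2400 MHz: 69.6 dB, 5000 MHz: 76.0 dB
```

So at the same distance, 2.4GHz loses about 8.5 dB more than 900MHz, and 5GHz about 6.4 dB more again; the higher bands pay for their extra usable bandwidth with shorter range.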
A curious artifact of the older days is even to this day a surprising number of devices will caution about the dangers of connecting to an SSID named linksys, even if its WPA3, modern 802.11ax on both AP and client end, etc.
I remember the jump from 802.11b to g was profound. Speed was no longer a luxury. You could browse while torrenting an MP3 file at the same time, wirelessly! It was the golden era of the Internet :)