Why Is the Internet So Slow?

A recent story on the BBC reported that Virgin Media will beat British Telecom to the marketplace with 100Mbps broadband speeds. Virgin plans to start rolling out its fiber network next year and will complete the process by 2012. Until then, the Brits will just have to make do with a top speed of 50Mbps. Make do? Every time I begin to feel good about my 6Mbps download speed, I run across a story about a service that's more than 8 times faster than what I have. In Japan, 60Mbps service is available today. And it costs about what I pay for far slower service. Why?

In part, it's because the Federal Communications Commission has, for nearly a decade, failed to do what it's supposed to do. Instead of promoting technology, the FCC was in a continuous tizzy about the occasional dirty word or "wardrobe malfunction". But that's not the sole cause of the problem. In part, it's also because the United States is so large.

The CIA World Factbook lists the 5 largest countries in the world by land mass: Russia, Canada, U.S., China, and Brazil. Sorting the list by population produces a different picture: China, India, U.S., Indonesia, and Brazil. But neither of these views makes clear why high-speed Internet and cellular service takes so long to build out here compared to places such as Japan.

The real story is population density, and sorting the list that way produces a far different picture. This time I'll list the top 10 countries, because several at the very top are so small that they can be ignored for the purposes of this story: Macau, Monaco, Hong Kong, Singapore, Gibraltar, Malta, Bermuda, Maldives, Bahrain, and Bangladesh.

Take Hong Kong, for example, third on the list. Hong Kong Broadband Network Limited reaches more than 90% of homes in Hong Kong, and its slowest tier is 25Mbps. Speeds up to 1000Mbps are available. Even a country that we in the West think of as "poor", Bangladesh, has Internet service providers that offer download speeds of just under 1Mbps. You might be wondering where the U.S. is on that list. We're number 175.

Clearly population density isn't the entire story. A poor nation with a high population density won't see high-speed Internet service anytime soon. But a reasonably prosperous nation with a relatively high population density offers ideal conditions for widespread high-speed service.

Fiber is costly to deploy, and if the population density hovers around 2 people per square kilometer, as in Mongolia, there's little incentive to install fiber. On the other hand, Taiwan (600+ people per square kilometer), South Korea (490), Japan (336), Great Britain and Germany (each around 240), and Switzerland (180) have the investment capital and the population density to support a fast build-out.

China is far higher on the list than the United States, too: 76th, with a density of just over 130 people per square kilometer, compared to the U.S. in 175th place with a density of less than 30. It's true that about 80% of us live in metro areas, though, and it's unlikely that someone who lives on Beer Can Alley, about 5 miles north of Wolf Point, Montana, expects high-speed service anytime soon.

You might be wondering about the source of all those wonderful spams that offer to send you millions of dollars in cash. Nigeria is 73rd by population density, with about 141 people per square kilometer. Those who can afford Internet access in Nigeria are largely limited to speeds of 300 to 500Kbps, according to Speedtest.net. That's still 6 to 10 times the speed of modem-based Internet service.

For the Residents of Beer Can Alley*

*I just liked the name of the road. Actually, it appears that nobody lives on Beer Can Alley. Google Maps shows nothing more than farm fields north and south of the road. But somebody might build a house there someday.

If you live in northeast Montana (or just about anywhere else in the country), you can have "high-speed" Internet access if you're willing to use a satellite service. I took a look at HughesNet Satellite, which offers 6 tiers of service.

No Easy Solutions


For those who live in or near cities, where houses are close together, fiber could be built out relatively quickly. In many cases, fiber is already in the neighborhood. All that's needed is the time, money, and effort required to run fiber from a utility pole to and through your home.

As for the guy who might someday build a house on Beer Can Alley: Don't hold your breath.

To see the spreadsheet I used to compare land mass, population, and population density, download this Excel 2007 file. If you have Excel 2003, you will need to download the Microsoft Office Compatibility Pack for Word, Excel, and PowerPoint 2007 File Formats.
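
If you'd rather see the arithmetic than open a spreadsheet, the calculation is nothing more than population divided by land area, then a sort. Here's a minimal Python sketch; the population and area figures are rough 2009-era approximations I've plugged in for illustration, not the exact values from the spreadsheet.

    # Population density is population divided by land area, then a sort.
    # Figures are rough approximations: population in millions,
    # land area in thousands of square kilometers.
    countries = {
        "Japan":          (127.0,   378.0),
        "South Korea":    ( 48.0,   100.0),
        "Germany":        ( 82.0,   357.0),
        "United Kingdom": ( 61.0,   244.0),
        "China":          (1330.0, 9597.0),
        "United States":  ( 307.0, 9827.0),
        "Mongolia":       (  2.7,  1564.0),
    }

    def density(population_millions, area_thousand_km2):
        """People per square kilometer."""
        return (population_millions * 1_000_000) / (area_thousand_km2 * 1_000)

    # Print from most densely populated to least.
    for name, (pop, area) in sorted(countries.items(),
                                    key=lambda item: density(*item[1]),
                                    reverse=True):
        print(f"{name:15} {density(pop, area):7.1f} people per square km")

Run it and Japan, South Korea, and Germany land near the top while the United States and Mongolia fall to the bottom, which is the whole point of the density argument.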

Linux Might Win

In last week's article, "If Linux is the Answer, What's the Question?", I said that Linux won't win the fight for the desktop. I mentioned, among other things, a lack of Linux administrators in the corporate world, fear of the unknown, and a general inability to run Windows applications. During a week of vacation in March, I was able to work for long periods on several days with my primary computer running Linux. I still can't realistically see a time when Linux will be the king of the desktop, but I know for certain that Linux handles most of the tasks needed by a large number of computer users. Maybe it's time to take a look at some of Linux's advantages.

Cost Advantage: Linux is free. Part of the cost of a computer is the operating system, and Windows adds about $200 to the price of the hardware. That's not a lot if you're buying a high-end computer, but it's quite a bit if you're looking for a computer in the sub-$500 range. Eliminating the cost of the operating system would allow the buyer to reduce the price or to obtain better hardware for the same money.

Performance Advantage: Linux can generally do more with less hardware. It will happily use large amounts of memory and high-power video cards, but Linux manages to get along better than Windows does on low-power machines. You'll also probably find that Linux starts and shuts down faster than Windows.

Reliability Advantage: Recently, when Windows lost track of both DVD drives in my computer, I was able to determine that the problem wasn't a hardware failure by booting Linux and allowing it to confirm that the drives were working. (Story below.) The culprit turned out to be the Registry. Linux doesn't store all of its important configuration information in a single gigantic (and easily corrupted) database the way Windows does.

Security Advantage (sometimes): Some Linux distributions take security more seriously than others, but generally speaking Linux and Unix machines have been more secure than Windows. Vista's security is good, although cumbersome, and Windows 7 will probably be better. Still, because Linux is based on Unix, the directory structure was designed from the beginning with security in mind.

Robustness Advantage: I mentioned the Windows Registry, which is one of the reasons performance degrades over time as outdated entries build up. Windows users who routinely add and remove software usually need to reinstall the operating system occasionally. That's rarely necessary with Linux.

Disk Advantage: Because of the way Windows writes to hard drives, files become fragmented. This reduces performance and the only solution is to run a defragmentation program. Linux takes a different approach to writing files and, as a result, fragmentation is far less likely to occur until the disk drive is nearly full.

Sharing Advantage: It's a common misconception that Linux can't write to Windows NTFS volumes. It can. As I wrote this report in a text editor under Linux, I saved the document to "/media/Data/WEBSITES/TechByter.com/_development/TechByter pending", a directory on an NTFS volume on drive D. Linux doesn't use drive letter designations, but it can mount the volume as well as read and write files. I wouldn't need that, of course, if the computer had only Linux installed.
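
To make that concrete, here's a trivial Python sketch. The mount point is the one from the paragraph above; the file name is a made-up example, and your own mount point will depend on how the volume was mounted.

    # From Linux, a mounted NTFS volume is just another directory.
    # The mount point is taken from the article; the file name is hypothetical.
    from pathlib import Path

    ntfs_dir = Path("/media/Data/WEBSITES/TechByter.com/_development/TechByter pending")
    note = ntfs_dir / "linux-advantages.txt"   # hypothetical file name

    note.write_text("Saved from Linux to an NTFS volume.\n", encoding="utf-8")
    print(note.read_text(encoding="utf-8"))

No drive letter appears anywhere; the path alone identifies the volume.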

Ubuntu classifies software into 4 categories: Main (officially supported software), Restricted (supported software that is not available under a free license), Universe (community maintained software that is not supported by Ubuntu), and Multiverse (software that is not free).

One-Stop-Shopping Advantage: Particularly if you're using one of the more consumer-oriented distributions ("distros" in Linux-speak), adding an application is as easy as selecting Add Programs from a menu and picking what you want from a list that's divided into categories (utilities, programming, office, games, and such). Select as few or as many as you want and tell Linux to proceed. The operating system locates all of the applications, downloads them, installs them, and sets them up on the menu. It doesn't get any easier than this.

Updating Advantage: Similar to the previous point, updating is a snap. If you have used the distro's installer, it will track updates to applications as well as updates to the operating system. You'll be notified when updates are available, and obtaining them takes a single click to set the updater to work, plus the password required for administrative tasks.

For every advantage, there is probably a disadvantage, but I covered those in the other article. If you need a computer, but you don't require any Windows-specific or Mac-specific applications, maybe it would be worth your time to consider Linux.

The Mysterious Case of the Missing DVD Drives

Sometimes things aren't where you left them. If I leave a cat on the bed, there's a pretty good chance he'll still be there when I return unless he suspects that he might be able to obtain food or catnip. And even then, he'll probably return. I'm fairly good about remembering where I put my glasses or the car keys. Sometimes I do have to call the cell phone to find it. But DVD drives that are installed in a computer should always be there. One day, they weren't. Physically they were there. I could see them, but Windows couldn't. That made using them somewhat difficult.


The first step was both easy and obvious: Power connectors are notorious for becoming detached, so I pressed the button that opens the tray on each drive. Both opened. They were getting power. The data cables could have become loose, but data cables are firmly attached and, besides, two of them would have to have come unplugged at the same time. Possible, but not worth opening the case for just yet.

I plugged in an external USB DVD drive, but the USB device detector had a problem with it, too.

The Windows Device Manager showed the USB drive and the 2 internal drives as having problems.

This should be an easy fix. The usual procedure is to delete the devices and then either restart the computer or have the Device Manager scan for new devices.

Nothing doing. No matter what I did, the two DVD drives simply weren't present. It could be the result of a hardware problem. Both DVD drives should use the same part of the main board, so the problem could be there.

But I thought I'd try performing a system restore to see if that would resolve the problem. After all, both drives had been working less than 24 hours earlier.

As is usually the case with System Restore, the process acted as if it would work, but ultimately failed. So I still didn't know whether I was looking for a hardware problem or a software problem.

Linux to the Rescue

Because the computer is a dual-boot machine, I could conclusively rule out either hardware or software by booting to Linux. If Linux could see the drives, the problem would be a Windows issue. If Linux couldn't see the drives, the problem would be entirely hardware related.

Linux could see the drives. Both of them. And it could play a DVD movie. So that meant the problem was Windows. And probably the Registry.

The Registry is nearly 180MB on my computer. There's no way I could scroll through it, line by line, and hope to find the problem. So I turned to Google. That's my usual procedure, even if I think it's a Windows problem. Instead of searching Microsoft's Knowledge Base, I start with Google. The Knowledge Base articles are all indexed, but often I'll find better descriptions of the problem and possible resolutions elsewhere.

About halfway down the first page of results, there was a Microsoft Knowledge Base article that looked promising.

http://support.microsoft.com/kb/314060: Microsoft Windows XP
1. Click Start, and then click Run.
2. In the Open box, type regedit, and then click OK.
3. In the navigation pane, locate and then click the following registry subkey:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}
4. In the right pane, click UpperFilters.
Note: You may also see an UpperFilters.bak registry entry. You do not have to remove that entry. Click UpperFilters only. If you do not see the UpperFilters registry entry, you still might have to remove the LowerFilters registry entry. To do this, go to step 7.
5. On the Edit menu, click Delete.
6. When you are prompted to confirm the deletion, click Yes.
7. In the right pane, click LowerFilters.
Note: If you do not see the LowerFilters registry entry, unfortunately this content cannot help you any further. Go to the "Next Steps" section for information about how you can find more solutions or more help on the Microsoft Web site.
8. On the Edit menu, click Delete.
9. When you are prompted to confirm the deletion, click Yes.
10. Exit Registry Editor.
11. Restart the computer.

So the good news is that Microsoft knows about this problem, even if I'd never heard of it. I found the Registry subkey and deleted the two entries. When I restarted the computer, the DVD drives were home again.
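
For the curious, here's a small read-only Python sketch of my own (it is not part of the Knowledge Base article) that only checks whether the UpperFilters and LowerFilters values exist under that subkey. It deletes nothing, so it's a safe way to confirm you're looking at the same problem before you start editing the Registry.

    # Read-only check for the filter values described in KB 314060.
    # This inspects the CD/DVD device class key; it does not delete anything.
    import winreg

    CDROM_CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                       r"\{4D36E965-E325-11CE-BFC1-08002BE10318}")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CDROM_CLASS_KEY) as key:
        for value_name in ("UpperFilters", "LowerFilters"):
            try:
                data, _ = winreg.QueryValueEx(key, value_name)
                print(f"{value_name} is present: {data}")
            except FileNotFoundError:
                print(f"{value_name} is not present")

If both values are reported as not present, the filters almost certainly aren't your problem and the Knowledge Base steps won't help.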

And My Point Is ... ?

My point is that this isn't (as a friend of mine likes to say) "rocket surgery". The first thing to do is remain calm. Ask anyone who's been trained by the Red Cross what the first order of business is and you'll probably be told that you should be part of the solution, not part of the problem.

I could have yanked off the side of the case and started moving wires around and, at best, that would have produced no change. I could have decided that the only solution would be to reinstall Windows; that would have solved the problem, but at the cost of several days of lost productivity.

Think things through. Use the resources available to you. Being able to boot the system to Linux and see the drives told me that I would have been foolish to have cracked open the case. Once I knew the problem was related entirely to Windows, Google became an ally.

But sometimes a Google search can lead you to something that's "almost" like what you're experiencing. There's a great temptation to do something (anything!) to fix the problem. For those of us who aren't terribly patient, this is a dangerous juncture. Before taking action to solve the problem, it's a good idea to make sure that the problem you're solving is the one that has presented itself. If you don't, you're in danger of creating a solution that only makes the problem worse than it was.

Short Circuits

Sun Slides Behind an Oracle

Things haven't been so bright for Sun lately. The company that pioneered graphics-intensive workstations has seen basic desktop computers' power increase to the point that machines costing a small fraction of the price of a Sun workstation can perform virtually the same tasks. IBM considered acquiring Sun, but antitrust concerns got in the way. So now Sun will be acquired by Oracle. I'm still trying to figure out who wins and who loses, so any commentary will have to wait for a later time, but I can describe the acquisition. Well, maybe just one comment: I'm sorry to see Scott McNealy's company gobbled up by Larry Ellison's Oracle. Oracle will acquire Sun for $7.4 billion.

It was 1982 when 4 Silicon Valley wizards created a new class of "inexpensive" workstations that could do what once required mainframe machines or at least minicomputers. But the relentless march of hardware turned today's desktop systems into yesterday's supercomputers.

The name "Sun" was a nod to the "Stanford University Network", founded by McNealy and 2 other Stanford grad students and UC Berkeley software engineer Bill Joy, a Unix guru. Sun workstations were able to create graphics that until then had required much more powerful systems. Sun servers powered much of the dot-com boom of the late 1990s, but the dot-com crash was the end of the company's prominence.

In 1999, Sun acquired the German company StarDivision and its StarOffice suite. Development now continues as an open-source project known as OpenOffice.org. Sun released the source code on condition that it could use improvements developed by the open-source community.

McNealy remained chairman in 2006 when Jonathan Schwartz became Sun's CEO. Sun and IBM discussed a merger for several months, but IBM called off the chase, which gave an opening to Oracle.

Bill Gates is Right!

For years, Bill Gates said that Microsoft couldn't continue its record of quarter-over-quarter growth. But for years (23 years, if you want to be exact) Microsoft recorded gains. Now, for the first time, sales have dropped year-over-year.

Revenue fell 6% in the quarter that ended March 31, and Microsoft failed to meet analysts' estimates. The company was expected to earn 39 cents per share, but the final result was $3.0 billion of net income, or 33 cents a share. Net income was down from $4.4 billion in the same quarter a year earlier.

The personal computer industry has been hit hard by the recession. Business and home purchases are down sharply. Making things worse, a lot of people have decided to wait for Windows 7 to be released. Still, Intel says it has begun to see larger orders from manufacturers and expects sales to increase soon, which would mean gains for Microsoft.

Jaunty Jackalope Hops onto Computers Worldwide

If you're an Ubuntu Linux fan, you already know this. Version 9.04 of Ubuntu Linux was released this week as an update to version 8.10. Canonical released 9.04 on Thursday, April 23, and I'm writing this as the 618MB update is downloading. In all, 10 packages will be removed, 104 will be installed, and 993 will be upgraded. It's all automatic.

The expected download time was shown as just shy of 2 hours, but the actual time will be closer to 5 hours. Part of this is because the Ubuntu FTP servers are clearly swamped on the first day the new version is available. But I also had a 2GB backup in progress, sending files to Carbonite, and simultaneous big uploads and big downloads don't work very well. When I shut down the Carbonite process, the download speed increased to about 80Kbps, which still suggests an overload on Canonical's side. I could easily handle 10 times that speed, which would reduce the download time to the expected 2 hours.

The new release is also available in a version for netbook computers, and the server edition contains a preview of Eucalyptus, an open-source project from the University of California, Santa Barbara. Eucalyptus makes it possible for organizations to build their own in-house compute clouds similar to Amazon's EC2.

Ubuntu 9.04 can work acceptably in both desktop and server roles, but it's far less robust as a server OS than as a desktop OS. If you're concerned about the breakneck speed of Ubuntu releases (8.10 was released just 6 months ago and 9.10 will be out 6 months hence), you could consider 8.04, which comes with long-term support.

Talking About Adobe at the Columbus Computer Society

I'll have the opportunity to show some of the features that I've talked about on TechByter Worldwide. In the interest of allowing everyone to get home well before midnight, I'll have to seriously limit the number of features I can show, but I'll try to use your time wisely.

The public is welcome.