TechByter Worldwide

23 Apr 2021 - Podcast #740 - (22:43)


5G Will Be Fast, But Progress Is Slow

Approximately three quarters of the US population has access to some variant of 5G coverage. Not all services billed as 5G are the same, though, and Russia continues to spread disinformation about 5G aimed at slowing adoption in the United States.


Most major cellular providers have something they call 5G in some areas. Lifewire has a comprehensive report on where 5G is available.

There are advantages and disadvantages to this new technology.

In most cases, 5G systems operate on frequencies that are higher than those used by 2G, 3G, and 4G systems. Some service providers have pseudo-5G systems that operate on 4G frequencies, with slower data rates than true 5G. Because the frequencies are higher, the signals won't travel as far. That means there will be more cell towers, but 5G towers are usually shorter and less obvious. The towers look a lot like light poles and are being combined with street lighting in some areas.

Traditional cell towers are huge compared to 5G towers.

Because there are more cell towers, the individual cells are smaller. 5G cells can handle as many as 10 times the number of devices that a 4G cell can handle. Combined with smaller cell size, there's the possibility of massively better throughput. But because cells are smaller and systems need more cell towers, the development cost is considerably higher. This alone has slowed development.

As 5G systems become more available, more computing power will probably be added to cars and trucks so that they can send and receive information about their position, and possibly communicate with smart traffic control devices that can be adjusted in real time for better traffic flow. That's a possibility, but don't expect to see it anytime soon. The infrastructure needs to be present first, and it isn't. Then enough 5G-enabled vehicles need to be on the road for the system to have a noticeable impact. Think decades, not years.

Internet of Things (IoT) devices will figure into mobile use, too. Security has been lagging with IoT devices, although it has improved in the past few years. Having devices with unreliable security protocols on the network is a danger that must be addressed.

Data rates depend on the frequency as well as the number of users on a given cell. 4G systems were supposed to deliver 100Mbps, but data rates like that are all but impossible in real life. 5G promises a theoretical maximum of 10Gbps, and it's just as unlikely that those speeds will be seen in practice. But maybe half the theoretical speed (5Gbps) or even a quarter (2.5Gbps) is achievable. Even one tenth of that, 1000Mbps, would still be ten times the theoretical maximum of 4G systems. So faster speeds are clearly coming.
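
To put those numbers in perspective, here's a rough back-of-the-envelope calculation, sketched in Python. The 2GB file size and the assumption of a perfectly steady connection are mine, purely for illustration; only the data rates come from the figures above.

    # Rough download times for a hypothetical 2GB file at the data rates
    # discussed above. The file size and the perfectly steady connection
    # are illustrative assumptions.
    FILE_SIZE_MEGABITS = 2 * 1024 * 8  # 2GB expressed in megabits

    rates_mbps = {
        "4G theoretical maximum (100Mbps)": 100,
        "One tenth of 5G theoretical (1000Mbps)": 1_000,
        "One quarter of 5G theoretical (2.5Gbps)": 2_500,
        "Half of 5G theoretical (5Gbps)": 5_000,
        "5G theoretical maximum (10Gbps)": 10_000,
    }

    for label, rate in rates_mbps.items():
        seconds = FILE_SIZE_MEGABITS / rate
        print(f"{label:42s} about {seconds:5.1f} seconds")

Even the most pessimistic of the 5G figures turns a download of more than two and a half minutes into a matter of seconds.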

Some opposition to 5G technology is based on the possibility that the new devices will contribute to climate change or that they will be used by governments and businesses to spy on citizens. There's no question that 5G will increase the number of devices in use, but it won't matter when (hopefully "when" and not "if") the planetary use of renewable energy resources largely eliminates the use of coal, oil, and gas. The surveillance concern isn't exaggerated, but it also seems irrelevant. Nearly every person on the planet already carries a tracking device around all day, every day. We use devices at home that listen to us. We use applications that report information about us to organizations such as Facebook, Microsoft, and Google. All of that calls into question our real concerns about privacy. Do we really care?

The Point Of Disinformation From Abroad

5G technology is a game changer for connectivity, and the technology can give one nation a competitive advantage over another. As a result, it shouldn't be too surprising to find that Russia is pushing hard to develop 5G technology at home while using disinformation campaigns to create and empower resistance movements in the United States.

Unfortunately, some of these campaigns have been quite effective. RT Television has run several so-called special reports on the dangers of 5G. The "R" in RT Television stands for Russia. Russia Today is operated by the state-owned news agency RIA Novosti (РИА Новости, Federal State Unitary Enterprise International News Agency) as part of a public relations effort intended to improve the image of Russia abroad. It has a long history of meddling and fanning the flames of conspiracy theories.

German news magazine Der Spiegel says RT "uses a chaotic mixture of conspiracy theories and crude propaganda." In the United Kingdom, The Observer's Nick Cohen wrote that RT spreads conspiracy theories and is a "prostitution of journalism", and Oliver Kamm at The Times called it a "den of deceivers." In the US, journalists at the Daily Beast and the Washington Post have written that RT continues to promote long-discredited bits of disinformation such as control of the world by "the Illuminati" and the forged "Protocols of the Elders of Zion" that were created before the Russian Revolution in Tsarist Russia.

The respected non-profit think tank RAND characterizes RT as "a firehose of falsehood", and anyone who has studied actions by TASS, the old Soviet news agency (as I have) will immediately recognize the techniques used to spread lies.

Much of the disinformation is based on the assumption that non-scientists will not understand the difference between ionizing and non-ionizing radiation. We are surrounded by radio waves. AM and FM radio, television, cell phones, smart devices, and more all use radio waves. These signals are all non-ionizing. The disinformation campaigns make a big deal out of "radiation" from 5G devices without mentioning all of the other radio signals and without differentiating between non-ionizing and ionizing radiation.

Radiation generated by nuclear power stations and held inside containment structures is ionizing. That's why the containment structures are needed. This kind of radiation can cause burns, cancer, and radiation sickness. Standard radio waves do not.
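
The distinction comes down to simple arithmetic: the energy a radio photon carries is proportional to its frequency, and even the highest 5G frequencies fall absurdly short of what's needed to knock electrons off atoms. Here's a quick sketch in Python; the 28GHz millimeter-wave frequency is an assumption on my part (the exact band varies by carrier), and the 10eV ionization threshold is a rough, typical value.

    # Back-of-the-envelope: can a 5G radio photon ionize an atom?
    # Assumes a 28GHz millimeter-wave signal; the exact band varies by carrier.
    PLANCK_CONSTANT = 6.626e-34   # joule-seconds
    JOULES_PER_EV = 1.602e-19     # one electron volt expressed in joules

    frequency_hz = 28e9           # 28GHz, a representative 5G frequency (assumed)
    photon_energy_ev = PLANCK_CONSTANT * frequency_hz / JOULES_PER_EV

    ionization_threshold_ev = 10  # roughly the minimum needed to ionize most atoms

    print(f"5G photon energy:     {photon_energy_ev:.6f} eV")
    print(f"Ionization threshold: about {ionization_threshold_ev} eV")
    print(f"Shortfall: roughly {ionization_threshold_ev / photon_energy_ev:,.0f} times too weak")

The result is a photon tens of thousands of times too weak to ionize anything, which puts 5G signals at the same harmless end of the spectrum as broadcast radio and visible light, far below ultraviolet and X-rays.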

So those who oppose 5G technology based on what they've seen or heard on RT Television or on websites that base their "research" on propaganda promulgated by RT Television should seriously question their sources.

Short Circuits

DeepL Translate Takes On Google Translate

Encountering a website or a comment written in a language we are unable to read can be frustrating, and Google Translate steps up to provide translations that, while not perfect, may at least be sufficient to determine the general intent of the writer. But there's a competing service from Germany.

Which is better? The primary point to consider with translations is that no automated system can match work done by a professional translator who understands both languages. Idioms and puns are just two of the most difficult challenges because each language has its own vast store of meanings.

During World War Two, some reports say, a disagreement between American and British generals occurred because of a misunderstanding of the word "tabled". Both British and American generals wanted to act on a proposal. The British generals called for the proposal to be "tabled"; the American generals were astonished. Eventually both sides worked out that to the British "tabled" meant "putting the idea on the agenda so it could be discussed," while to the Americans it meant "we don't have time to put this on the agenda, but perhaps we'll bring it up later."

English speakers can't always agree on what a term means, and translation simply compounds the issues and confounds the participants. So consider any translation provided by an automated system to possibly contain fatal flaws.

AT Language Solutions, a Spanish translation firm that also offers its own online machine translation service, puts it this way: "Although the speed at which machine translation can be done does offer a great advantage over human translation in terms of time, the translation is usually very literal and the meaning may sometimes be unclear. This is only to be expected, as the way sentences are formed differs in each language. Translating each word and then putting them together can lead to a sentence that makes no sense. We therefore advise you only to use automatic translation to understand a text, never for publication or professional use."

But we can't always employ human translators who have native-like understandings of both languages, so that leaves the question: Which is better?

Google Translate certainly offers more languages. The German-based DeepL is limited to Chinese, Dutch, English, French, German, Italian, Japanese, Polish, Portuguese, Russian, Spanish, and about 30 other languages. That's still a respectable list and one that will cover virtually all possibilities. If you need to translate to or from Bulgarian, Hindi, Maltese, Persian, Vietnamese, Zulu, or any of about 100 other languages, Google Translate has you covered.


Google also wins when it comes to cost because it can translate whole websites, files, and large chunks of text without charge, while DeepL is limited to 5000 characters until you've signed up for one of the paid services. DeepL allows the user to click any word in the translation, and it will offer alternate suggestions.

DeepL also has an application that can be downloaded and installed. If the application is running, the user can select text in any program and press Ctrl-C twice to copy it directly into the DeepL application for translation.
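
For anyone who wants to automate translations, DeepL also publishes a developer API; it isn't covered in this article, it requires signing up for an API key, and the free tier has its own monthly character limit. Here's a minimal sketch in Python, assuming the requests library and a free-tier key stored in a DEEPL_AUTH_KEY environment variable:

    # Minimal DeepL API example: translate one sentence from English to Russian.
    # Assumes a free-tier API key in the DEEPL_AUTH_KEY environment variable.
    import os
    import requests

    response = requests.post(
        "https://api-free.deepl.com/v2/translate",
        headers={"Authorization": f"DeepL-Auth-Key {os.environ['DEEPL_AUTH_KEY']}"},
        data={"text": "Where is the library?", "target_lang": "RU"},
    )
    response.raise_for_status()

    for item in response.json()["translations"]:
        print(item["detected_source_language"], "->", item["text"])

Google offers a comparable Cloud Translation API, so either service can be scripted, but for occasional use the web pages and the desktop application are simpler.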

I did some very simple testing based on what I still remember from learning Russian in college. The text I provided for translation from English to Russian was basic and straightforward. Both Google and DeepL performed accurate, usable translations:

The primary difference between Google's approach and DeepL's is that DeepL frequently uses the first-person singular pronoun (Я) where Google omits it because "I" (or "Я") is implied by the verb. Either is correct. DeepL also offers, for paid subscribers, the ability to switch between formal and informal usage for languages that make this distinction. When set to "formal", DeepL would replace the familiar word for you (ты) with the formal word for you (вы). By default, DeepL seems to favor an informal tone while Google uses a formal one.

Although AT Language Solutions offers professional translation services, the company's research says that DeepL is generally more accurate: "In the translation from English into Spanish, DeepL scored much higher than other translators such as Microsoft, Google or Facebook. In the other language combinations tested, DeepL translations performed three times better than those carried out by the other translation services."

This conclusion was based on a test in which professional translators selected the best translation from among more than 100 sentences that had been translated by automatic translators.

So for quick and easy translations, Google will generally be adequate for getting the gist of a passage, but DeepL will probably do a better job. For critical work, though, hire a translator.

The Origin Of “Microsoft Gets It Right On The Third Try”

Once there was an aphorism about Microsoft: "They get it right on the third try." Part of that may be traced back to 1992 and Windows 3.1 even though Microsoft hadn't quite gotten it "right" with Windows 3.0 in 1990.

Windows 1.0 was abysmal. Both Steve Jobs at Apple and Bill Gates at Microsoft were working to see who would be able to introduce the first graphical user interface based on what each had seen at the Xerox Palo Alto Research Center. Both companies released such systems: Apple in 1984 and Microsoft in 1985. Apple got there first and with a better system than Microsoft's, but 1990's Windows 3 and especially 1992's Windows 3.1 began to close the gap.

Microsoft released Windows 3.1 in April 1992. A year later, the final version of Windows 3 was released as version 3.11, Windows for Workgroups. Licensing and support for Windows for Workgroups continued for the next 15 years, although mainstream support ended in December 2001.


Windows 3 was the final version of the operating system that ran on top of DOS.

The 3.0 version introduced an improved graphical user interface that offered a three-dimensional appearance by borrowing technology from the OS/2 Presentation Manager. OS/2 was the operating system that Microsoft and IBM were developing jointly.

There were also improvements in memory management, and Windows 3.0 introduced virtual memory -- the ability for the operating system to use a file on the computer's disk drive as if it was RAM. Virtual memory was far slower than true RAM, but it allowed the development of large applications that could have modules swapped in and out of main memory.

Windows 3.11 was also the last operating system from Microsoft to run solely as a 16-bit system, with all of the limitations imposed by 16-bit architecture.

Images are from Wikipedia.org.

One of the most significant enhancements introduced by Windows 3.1 was TrueType, even though the technology had been created by Apple as competition for Adobe's Type 1 fonts used in PostScript documents. This technology provided scalable typefaces that were essential to desktop typesetting applications, which had been available since the mid-1980s. Users had access to Arial, Courier New, and Times New Roman, in regular, bold, italic, and bold-italic versions. Symbol added a group of scalable symbols.

Even more significant for the future was the elimination of real-mode support and a requirement for at least an Intel 80286 processor and a minimum of 1MB of RAM. Windows 3.0 with real-mode support could be counted on to crash several times per day.

The 386 Enhanced mode allowed users to run multiple DOS windows in which they could manipulate menus and other objects in the program with the Windows mouse pointer. Sometimes. This worked if the DOS application supported the use of a mouse. Some DOS applications even gained access to the Windows Clipboard. This magic was accomplished by loading special DOS drivers at boot time to provide the hooks DOS applications needed to be exposed to Windows.

Windows 3.0 had been limited to using 16MB of memory, but Windows 3.1 in Enhanced mode gave the operating system access to far more -- theoretically, the full 4GB address space of a 386 processor -- not that any motherboards could have that much installed RAM or that anybody would be able to afford that much.

Microsoft provided Windows 3.1 on three versions of floppy disks and, for the very first time, on CD-ROM.

And last, Windows 3.1 introduced the Registry, a centralized database intended to store configuration information as well as settings for operating system modules and installed applications. The joke at the time was that there was one person in Redmond who understood what the Registry did.

That has all changed in the past 29 years. Now there is nobody in Redmond (or anywhere else) who understands what the Registry does.*

*OK, so I'm being sarcastic here, but the Registry is still something best left untouched by users.

Spare Parts

Get A Faster PC Without Spending Anything

When you're tired of waiting for a slow PC, you can buy a new computer, or you can speed up the computer you have, even without buying more memory or a faster disk drive.

DISCLAIMER: You can't turn a slow computer into a fast computer, but you might be able to eke a bit more performance out of it. The changes won't supercharge the computer. They won't add memory or processor cores. They won't upgrade the processor.

But you might see a noticeable improvement.

This first change will affect the appearance of Windows and may turn off some of the fancy effects such as animations and fade-ins. These effects require CPU cycles, and turning them off lets the computer use the CPU for computing instead of appearance.

Open the Settings panel and type "performance". Choose Adjust the appearance and performance of Windows to open the Performance Options panel. Then either check Adjust for best performance or choose Custom and turn off all of the features you don't need. These changes will be minimal for most users, but they can improve the performance of a computer with a slow CPU and limited memory.

Next, return to the Settings panel and type "power". Choose Power & sleep settings, then click Additional power settings. This will open the old-style Control Panel Power Options panel, and you'll probably find "Balanced (recommended)" is selected. Leave these settings alone in case you decide later that you want to go back to them. Instead, click Create a power plan and give it a name. For best performance, select the High performance plan as a starting point.

Most of the settings have two options: one for when the computer is running on AC power and one for when it's battery powered. If you set higher performance settings for battery mode, the battery will discharge faster. Higher performance settings will also cause the computer to run hotter.

Expand each of the options and set Maximum Performance when it's available. Then save the settings and use the computer for a while. You should notice small performance improvements. I'm generally willing to trade a slightly hotter running temperature for better performance, but I prefer the visual effects too much to turn them off.
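
If you prefer the command line, the same switch between power plans can be made with the powercfg utility that's built into Windows. Here's a sketch in Python that calls powercfg through subprocess; SCHEME_MIN is the alias Windows uses for the built-in High performance plan (not a custom plan you've created), so check the output of the list command before relying on it.

    # Sketch: list Windows power plans, then activate the built-in
    # High performance plan via the powercfg utility.
    import subprocess

    # Show every power plan and its GUID; the active plan is marked with *.
    subprocess.run(["powercfg", "/list"], check=True)

    # SCHEME_MIN is the alias for the built-in High performance plan.
    subprocess.run(["powercfg", "/setactive", "SCHEME_MIN"], check=True)

Running the same command with SCHEME_BALANCED switches back to the Balanced plan, and a custom plan can be activated by passing its GUID from the list output.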

Office Use Is Down, But Collaboration Space Is Up

Research by San Francisco firm VergeSense shows that office space use in general is low, and was already low before the pandemic. But the pandemic has seen an increase in the use of collaborative spaces.

The report says that about 83% of office space was allocated to individual work in January 2020 and just 17% was dedicated to collaborative work. Before the pandemic, collaborative spaces were 25% more utilized than spaces dedicated to individual work.

Collaborative space is seeing greater use now, and VergeSense says businesses that invest in workplace transformation will create a competitive advantage that will attract employees, get the best use of real estate, and better deliver on what employees need.

Another change seems to be the development of shared office spaces for use by individuals and small companies, organizations that perhaps want to have occasional use of office space without high monthly rentals.

In some cases, small and single-proprietor businesses could be run from the business owner's home much of the time, then from a shared workspace as needed when meeting with clients.


Even small towns are seeing the trend. Bellefontaine, Ohio, besides being where I grew up, has about 13,000 residents. It has a manufacturing and railroad history, but both of those faded long ago and the town was in decline. A large revitalization project has been underway for several years, and Bellefontaine has become a destination for residents of much larger surrounding cities.

Now Build CoWork + Space has renovated one of the historic buildings to provide 8000 square feet of space with desks, dedicated desks, offices, and even an executive suite. Prices range from $39 for a day pass and $99 per month at the low end to $2000 per month for the executive suite, which has space "for a small team with a private conference room and kitchenette access plus all the perks."

Twenty Years Ago: 3Com Was In Trouble

In April 2001, I wrote: "Ask 'What's new' at 3Com and the answer will probably be 'Everything!' The company is now in the midst of its THIRD restructuring in a little LESS than a year. Five years ago, it would have been hard to imagine 3Com being in the position it's in."

This is the company that invented networking for computers, with help from the Xerox Palo Alto Research Center. It was still selling a lot of Ethernet cards that connect computers to networks, but by then the technology was mature and network cards had become commodity items, sold mainly on price. In the early 2000s, 3Com shrank from 12,000 employees to about 2,000. By 2009, 3Com had about 9,000 employees and lots of valuable patents. In November 2009, 3Com was acquired by Hewlett-Packard and was integrated into HP's networking division.

Just A Passing Thought ....

Assuming, of course, that there are no other civilizations that have developed powered flight in any of the planetary systems around billions of other stars.