August 30, 2018

The tortoise and the hare (2/7)

The challenge for journalists covering science and industry is to capture the state of a rapidly changing environment without forgetting they are taking pictures of moving objects. Reducing technology to snapshots in time masks all the action going on behind the scenes. Compared to the jack-rabbiting hare, the tortoise doesn't seem to be moving at all.

This was certainly true of software and hardware in the late 1980s. Gordon Moore devised his famous law in 1965. As with a standard exponential growth curve, the rate of change was barely perceptible at first. But by the 1990s, it was taking off like a rocket. The technology was evolving so fast that the real world appeared utterly unreal.
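To get a feel for just how lopsided that curve is, here is a minimal back-of-the-envelope sketch in Python. It assumes the commonly cited two-year doubling period and takes the Intel 4004's roughly 2,300 transistors in 1971 as a starting point; both figures are my framing of the rule of thumb, not anything out of Moore's paper.

    # Back-of-the-envelope Moore's law projection: start from the Intel 4004
    # (~2,300 transistors, 1971) and assume a strict two-year doubling.
    START_YEAR, START_COUNT = 1971, 2_300
    DOUBLING_PERIOD = 2  # years (an assumption, not Moore's exact formulation)

    def projected_transistors(year):
        """Projected transistor count for a given year under two-year doubling."""
        doublings = (year - START_YEAR) / DOUBLING_PERIOD
        return round(START_COUNT * 2 ** doublings)

    for year in (1975, 1980, 1985, 1990, 1995, 2000):
        print(f"{year}: ~{projected_transistors(year):,} transistors")

    # The early steps crawl along in the thousands; by the 1990s the same
    # doubling rule is adding millions of transistors per generation.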

In his 12 June 1990 Inside Track column, John Dvorak reported that a

two million transistor 586 will show up in 1992, and the 4-6 million 686 in 1996. By the year 2000, the expected technology breakthroughs will allow the company [to put] 100 million transistors on the 786. It will crank out 2 billion instructions per second [2,000 MIPS] at a clock speed of 250 MHz! Sheesh. The company confirms these assertions.

Such predictions were gobsmackingly unbelievable at the time. But the intel Dvorak got from Intel was spot-on. Released in 1999, the Pentium III packed 9,500,000 transistors, ran at a clock speed of 450 MHz, and cranked out 2,054 MIPS. The Pentium 4 topped 100,000,000 transistors in 2004. Multi-core CPUs today contain billions.


Once a slow-moving technology begins to pick up speed, the acceleration can catch bystanders by surprise. Just as a much-heralded technology that begins to slow can quickly get stuck in the mud.

The most surprising thing about the origins of the personal computer is that a lumbering entity like IBM brought the PC to market in so nimble a fashion. So quickly, in fact, that neither IBM nor Microsoft had time to develop its own operating system. Microsoft instead purchased the progenitor of MS-DOS from Seattle Computer Products.

Come the 1990s, a decade on, a new operating system was needed: a 32-bit OS for the 32-bit microprocessors that already dominated the market, one that extended the memory address space from 1 MB to 4 GB. And it should come with a graphical user interface (GUI). OS/2 ticked all those boxes. Obviously history was simply going to repeat itself.
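Those two numbers are nothing more exotic than address-bus arithmetic: 20-bit real-mode addressing versus a flat 32-bit address space. A quick Python sketch, just for the record:

    # Address-space arithmetic behind the 1 MB and 4 GB figures:
    # real-mode DOS addressed 2^20 bytes, a flat 32-bit OS addresses 2^32.
    real_mode = 2 ** 20        # 1,048,576 bytes = 1 MB
    flat_32bit = 2 ** 32       # 4,294,967,296 bytes = 4 GB

    print(f"Real mode:  {real_mode:>13,} bytes")
    print(f"32-bit:     {flat_32bit:>13,} bytes")
    print(f"That is a {flat_32bit // real_mode:,}x larger address space.")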

Except that IBM viewed the accidental standard it had created as exactly that: an accident. A mistake, one it was not eager to repeat. Still, everybody expected another turn on a dime, a reboot of the personal computer industry once again engineered by the hit Hope and Crosby duo of Microsoft and IBM.

Everybody but Microsoft, it turned out.

Work on a GUI operating environment began at Microsoft in 1981. Windows had been in formal development since 1983, when Bill Gates recruited Scott McGregor, a key developer behind Xerox PARC's pioneering windowing system, and put him in charge of the project. That was four years prior to the release of OS/2 1.0.

Contrary to the lore that emerged later, the first 16-bit version of Windows was lauded by the press. In his 20 August 1985 pre-release review, PC Magazine editor Bill Machrone predicted that "Windows will become a powerful tool for power users, and it lays a solid groundwork for future generations of PC applications."

PC Magazine later praised Windows 1.0 in its "Best of 1985" issue. Machrone again described it as "a harbinger of the future," especially in the way it integrated device drivers into the operating system. In a featured 25 February 1986 product review, Jeff Duntemann called Windows "the face DOS will wear in the future."

Indeed, DOS would wear that face until the release of Windows XP in 2001.

Holding Windows back from widespread consumer acceptance at the time was the hardware. To run well, Windows 1.0 required a maxed-out PC AT with an EGA card and monitor, which could almost double the price of an already pricey system.

But like the stubborn tortoise, Microsoft plodded along, releasing versions that took advantage first of the 286's 16-bit protected mode and then of the fully 32-bit 386.

As Steve Ballmer recounted in Triumph of the Nerds, "I was the development manager for Windows 1.0 and we kept slogging and slogging to get things right for [Windows 3.0 in] 1990."

Microsoft was slogging in the right direction. The 15 March 1988 issue of PC Magazine inaugurated the "Pipeline" column that included a bestseller list focusing on business software and operating systems. The first list began as follows:

1. Lotus 1-2-3 (DOS)
2. WordPerfect 4.2 (DOS)
3. Microsoft Windows 2.03

That third entry remains a surprise. Flying well below the radar, Windows had already established a sizable market presence. When Windows 3.0 debuted in May 1990, Microsoft finally had the consumer operating system it'd been looking for and a good idea of how fast that critter could run. It was time to put the pedal to the metal.

Consider a 30 April 1991 column in which Jim Seymour ranted about Microsoft changing the default keyboard shortcuts in Word 5.5 (DOS) to match those of the Windows interface. One change he pointed to: ALT-F-O now launched the File Open dialog box. With the benefit of hindsight, ALT-F-O is an amazing example of forward thinking.

A quarter century later, ALT-F-O retains the same function in practically all Windows applications. In 1991, Microsoft committed itself to a future based on the Windows interface. And followed through.

But back in the 27 December 1988 issue, columnist Ray Duncan wasn't

ready to join either the "DOS is Dead" or the "DOS Forever" school of thought. I think it quite likely that OS/2's successors will eventually eclipse MS-DOS, but I suspect this will take a lot longer than anybody now imagines—perhaps ten years or more.

Actually, it'd only take four more years for Windows 3.x to bury OS/2, selling forty times more copies than OS/2 had thus far in its entire existence.

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (3/7)
The future that wasn't (4/7)
The future that wasn't (5/7)
The future that wasn't (6/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS


August 23, 2018

The streaming chronicles (3/4)

In which I upgrade my cable modem and router.

So I got an Arris Surfboard SB6183 DOCSIS 3.0 cable modem. Xfinity has been bugging me to upgrade for years. I'd procrastinated because a faster Internet connection wouldn't speed up the Pentium M processor in my ancient ThinkPad laptop and its poky b/g Wi-Fi.

Getting into streaming was sufficient motivation to upgrade my home network. Streaming at a good data rate looks great. Not to mention that the way Xfinity stairsteps the "default" service level, I was paying for bandwidth my router and cable modem couldn't deliver.

Yeah, that's dumb. But whatever can be put off until tomorrow I will put off until tomorrow. The old stuff still works. It's always best to upgrade while the old stuff still works. Well, tomorrow has arrived!

The new router is a TP-Link TL-WR841N. It's gotten decent reviews for such an affordable wireless router. Common objections, such as the 10/100-only Ethernet ports and the lack of 802.11ac support, are mostly irrelevant to my current setup. And did I mention that it's affordable? Twenty bucks!


I averaged a dozen or so samples with Fast and the Speedtest app for each hardware setup. Though the ThinkPad reports a 54 Mbps connection, neither it nor the Belkin could handle that data rate. Download speeds increased 20-30 percent with the TP-Link. The DOCSIS 3.0 modem made little difference.

The Fire 7, on the other hand, has dual-band b/g/n. The specs for the DOCSIS 2.0 SB5101 claim a maximum transfer rate of 30 Mbps. The 802.11n protocol on the TP-Link and Fire 7 appears to be delivering every bit of it. Now I'm tempted to experiment with an 802.11n USB Wi-Fi adapter for the ThinkPad.

I'm paying for a maximum 70 Mbps down. The fine print doesn't guarantee it, and I'd have to string an Ethernet cable to set a baseline. Using 802.11n, the DOCSIS 2.0 modem gave me half that. The DOCSIS 3.0 modem delivers two-thirds. Considering the aging infrastructure in this part of town, not bad.

Average download speeds in Mbps (Wi-Fi mode in parentheses):

              Belkin      TP-Link     DOCSIS 3.0
ThinkPad      (g) 8.3     (g) 11.2    (g) 11.3
Fire 7        (g) 11.1    (n) 33.2    (n) 43.2
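For what it's worth, the "half" and "two-thirds" figures fall straight out of the Fire 7 row. A quick Python sketch using the averages from the table above (the 70 Mbps figure is the advertised cap, not something I measured):

    # Fraction of the provisioned 70 Mbps actually measured on the Fire 7
    # over 802.11n, using the averaged figures from the table above.
    PROVISIONED_MBPS = 70.0
    measured = {
        "DOCSIS 2.0 (SB5101) + TP-Link": 33.2,
        "DOCSIS 3.0 (SB6183) + TP-Link": 43.2,
    }
    for setup, mbps in measured.items():
        print(f"{setup}: {mbps:.1f} Mbps = {mbps / PROVISIONED_MBPS:.0%} of provisioned")
    # Roughly half with the old modem, roughly two-thirds with the new one.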

DOCSIS 2.0 upload speeds averaged 5.1 Mbps with both the Belkin and TP-Link. DOCSIS 3.0 kicked it up to 6.2 Mbps, even on the ThinkPad. That extra 20 percent is handy when backing up data to the cloud.

The Roku Express supports 802.11b/g/n but not dual-band. Getting precise numbers is tricky because of the adaptive bit-rate streaming. According to the playback screen (HOME x5, RW, RW, RW, FF, FF), using the TP-Link with the DOCSIS 2.0 modem increased throughput from 1-5 Mbps to 10-15 Mbps.

The DOCSIS 3.0 modem upped network throughput to 15-40 Mbps at the high end. I'm assuming this screen reports the bandwidth provided by a given CDN stream and not the connection with the router, which shouldn't otherwise change. Though this could be a product of the glitches I discussed previously.
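For anyone wondering why the reported rate bounces around so much, this is roughly what adaptive bitrate selection does: the player keeps re-picking the highest rendition that fits under its current throughput estimate. The sketch below is a generic illustration only, not the Roku's or Crunchyroll's actual algorithm, and the rendition ladder and safety factor are made-up numbers.

    # Generic adaptive-bitrate sketch: the client keeps re-picking the highest
    # rendition that fits under its current throughput estimate, so the
    # reported stream bitrate changes whenever the estimate does.
    RENDITIONS_KBPS = [800, 1_500, 3_000, 5_000, 8_000]  # hypothetical ladder

    def pick_rendition(throughput_kbps, safety_factor=0.8):
        """Highest bitrate that fits within a safety margin of measured throughput."""
        budget = throughput_kbps * safety_factor
        fits = [r for r in RENDITIONS_KBPS if r <= budget]
        return max(fits) if fits else min(RENDITIONS_KBPS)

    for measured_kbps in (4_000, 12_000, 35_000, 9_000):
        print(f"{measured_kbps / 1000:.0f} Mbps measured -> "
              f"{pick_rendition(measured_kbps) / 1000:.1f} Mbps stream")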

So far, the upgrade has done what I wanted it to do—eliminate buffering. Granted, it only ever happened with Crunchyroll, probably because Crunchyroll is migrating its CDN to AWS CloudFront. Occasional bouts of buffering since are more likely due to normal WAN hiccups than to LAN flakiness.

Oh, the Roku logging functions are still definitely flaky. The Network > About screen randomly reports signal strength as anything from "Poor" to "Excellent." The "secret" Wi-Fi screen (HOME x5, Up, Down, Up, Down, Up) still lists the signal strength as -80 dBm. Effectively zero. But it works fine.

Related posts

The streaming chronicles (1)
The streaming chronicles (2)
The streaming chronicles (4)
Anime's streaming solution


August 16, 2018

The last shogun

In the textbooks, at least, the 1868 Meiji Restoration ended the rule of the shoguns and reestablished the reign of the emperors. The effect, however, was to create a government where the "separation of powers" simply meant that the powers of the government were all separated.

Oh, those powers were, on paper, vested in the emperor. So had they been during the shogunate. It's just that from the 17th century through the early 19th, the Tokugawa shogun unquestionably controlled the emperor. Now nobody controlled the emperor. And the emperor didn't control anything either.

In a deadly game of king of the hill, the years in Japan between the Meiji Restoration and WWII were punctuated by a series of attempted coups. None succeeded, but all had the effect of pushing the government further to the right in hopes of deflecting the next military revolt, until the army was operating without any practical constraints.

Echoes of the first half of the 16th century, when the slow rot of the Ashikaga shogunate ignited battles amongst the military governors that culminated in the Warring States period.

Lacking the checks and balances of civilian oversight, the Japanese army ended up starting a small war in China that grew out of control, basically Vietnam on a continental scale. When the U.S. cut off oil and scrap metal exports to Japan as a response, the military lashed out without considering its capabilities or the military consequences.

Thanks to the military doctrine of Kantai Kessen, meaning a winner-take-all contest between battleships, the Japanese war effort was doomed from the start. Japanese military leaders couldn't stop believing in Kantai Kessen because it had proved so decisive during the Russo-Japanese War.

But by June of 1942 and the Battle of Midway, the battleship was a white elephant. The aircraft carrier ruled the waves. To be sure, the Japanese navy had indeed crushed the Russian fleet at the Battle of Tsushima in 1905, compelling a shaky Russian government to sue for peace.

This "underdog" victory was hailed around the world (even though it began with a "sneak attack"). The Japanese government was quick to believe its own press, forgetting that the land war going on at the same time was about as decisive as the First World War would be, with the Japanese infantry taking as many casualties as the Russians.

Notwithstanding one of the greatest diplomatic achievements in history, the victorious Japanese came away from the Treaty of Portsmouth (1905) believing that the western powers had robbed them of their due. This combination of victimhood, aggrievement, and overconfidence set the stage for the next forty years of accumulating disasters.

In Japan, ordinary citizens—already living under draconian rationing and sumptuary laws—took the December 1941 attack on Pearl Harbor to be a second Tsushima, signaling an end to the conflict.

By the Battle of Okinawa, nobody in the Japanese government believed they could prevail by force of arms alone. But they believed they could convince the Americans that invading the main islands carried too high a cost, essentially Robert E. Lee's strategy in 1864, a strategy that might have succeeded but for the fall of Atlanta and Sherman's March to the Sea.

The bitter irony is that in this they succeeded. Thus the atomic bomb. But the atomic bomb probably had a greater influence on Stalin, who, thanks to his spies, knew more about it than Truman did. Stalin didn't launch his invasion of Manchuria until after Hiroshima. Once the bomb was dropped, Stalin had to act before Japan surrendered.

One of Stalin's goals was payback for the Russo-Japanese War. The Soviet army reclaimed all of its former territories, plus several islands that had always been part of Japan. From 560,000 to 760,000 Japanese were shipped off to the gulags, where from 10 to 50 percent of them died. This treatment by a former "ally" still rankles in Japan.

There is much talk of "formally" ending the Korean War. The one-week war between the Soviet Union and Japan has never been formally resolved either.

All through the Second World War, Japan and the Soviet Union had a non-aggression pact. Until the bitter end, the Japanese Supreme Council saw the Soviet Union as a "good faith" intermediary while raising arcane and legalistic objections to the Potsdam Declaration. Stalin's abrogation of the non-aggression pact destroyed that illusion.

But a negotiated surrender would not be acceptable to the Allies and certainly not to their citizens. They had been there and done that and suffered the consequences. In July of 1918, Winston Churchill laid out the terms for a lasting armistice with Germany. In the process, he made clear why the "Great War" would not be "the war to end all wars."

Germany must be beaten; Germany must know that she is beaten; Germany must feel that she is beaten. Her defeat must be expressed in terms and facts which will, for all time, deter others from emulating her crime, and will safeguard us against their repetition.

Despite all the treaties signed and reparations extracted at Versailles, between the two world wars, Germany acceded to none of these conditions. But in August of 1945, as John Dower vividly lays out in Embracing Defeat: Japan in the Wake of World War II, Japan very much did.

The atomic bomb was considerably less destructive than Curtis LeMay's ongoing firebombing campaigns. But it forced Stalin's hand, and that forced the Japanese government to finally face reality. And when the emperor finally did face reality, the atomic bomb gave him a transcendent power to which he could surrender Japan's wartime ideology.

This time, history would not repeat itself.

Though in a very real sense, history was repeating itself for the fourth time. In 1185, Minamoto Yoritomo destroyed the Taira clan—the power behind the throne—and established his military government at Kamakura, inaugurating the rule of the shoguns. On and off for the next 700 years, the emperor reigned as little more than a figurehead.

When Tokugawa Ieyasu consolidated power after the Battle of Sekigahara in 1600, the country breathed a sigh of relief and mostly aligned itself with the new regime. Like Ieyasu himself, it was an opportunistic resolution that demanded little in the way of ideological conformity, except to go along to get along, a social compact that worked.

In the mid-1860s, as the Tokugawa regime crumbled around them and the center could no longer hold, this opportunistic ambivalence was expressed in the "Ee ja nai ka" movement, an anarchic yet strangely playful popular uprising that proclaimed, "So what? Why not? Who cares?"


In the late summer of 1945, the population was too exhausted to dance in the streets. But they'd had enough of ideology. Observes John Dower, when General MacArthur arrived in Japan on August 30 of that year,

he easily became a stock figure in the political pageantry of Japan: the new sovereign, the blue-eyed shogun, the paternalistic military dictator, the grandiloquent but excruciatingly sincere Kabuki hero.

Dower wryly concludes, "Indeed, the response of huge numbers of Japanese was that the supreme commander was great, and so was democracy." So it comes as no surprise that they should so readily switch their allegiances to the man who promised them much less torment and a much better future.

Related posts

The grudge and the dream
Kantai Kessen
Hirohito and the Making of Modern Japan


August 09, 2018

The Cassandras of computers (1/7)

Cassandra was cursed by the Greek god Apollo with the power to make true prophecies that nobody believed. A quarter century ago, the Cassandras of the computer world had an additional problem: they didn't always believe the future they were forecasting either.

Columnist John Dvorak was a curmudgeonly contrarian back when he started writing for PC Magazine in the 1980s. During his first decade, he made several notable predictions, reported a story that foreshadowed a tidal wave of technological change, and then missed the very confirmation of what he was writing about.


After a hands-on demo of the Canon RC-701, the first commercially available camera to use a CCD instead of film, Dvorak stated in the 26 January 1988 issue,

It's the future. Not only will we one day take photos on floppies or plug-in RAM (or both), but we will manipulate them on our machines at home. Finally, a use for the home computer: a device to edit snapshots. And note: because the photo is on a cheap disk, there will be no reluctance to take tons of pictures because we'll know we can erase and reuse the disk—something we'll never actually do.

After first musing that it was time to sell Kodak stock, he reconsidered and thought that maybe Kodak had a future selling printers and paper. His initial reaction was the correct one.

At the time, it was also a warning nobody wanted to hear (including Kodak). After he again broached the subject two years later in his 25 December 1990 column, a reader wrote in to scoff, "What a laugh. John Dvorak says that photography as we know [it] is dead." But this time Dvorak had seen the future with 20/20 foresight.

He hit the nail on the head a few other times too, pointing to the rise of Unicode and predicting that LCD screens were "the wave of the future." In his 26 January 1993 column, he railed against pagers and portable phones, saying they reminded him of the Borg from Star Trek. "The desire for instant communication reflects our own insecurities. Dump them."

Dvorak recently revisited the latter subject in a 6 June 2018 column, in which he termed the malady "FOMO," or "the fear of missing out." (Internet pioneer Jaron Lanier recently devoted an entire monograph to the subject.)

Also in 1993, John Dvorak foresaw the importance of Texas Instruments' brand-new DLP projection system and correctly predicted that giant magnetoresistance technology would make 250 GB hard drives "commonplace" a decade hence. Wading into the field of business anthropology, after a junket to Japan in 1990, he observed,

The Japanese are incredible time wasters. While the Japanese may make an efficient assembly line, the Japanese style of doing business is the opposite. It's a miracle anything gets done. Companies compete fiercely with each other. "Japan Inc." is a cooperation between government and business that is the opposite of what we experience in America. In Japan, government helps business succeed because the government knows that business success means wealth and jobs for everyone.

Dvorak naturally caught a lot of politically correct flak for those comments, but I don't see anything there to disagree with. In fact, in one short paragraph, he neatly summed up The Enigma of Japanese Power by Karel van Wolferen.

But back to the subject at hand. In the last issue of 1989, John Dvorak reported in his "Inside Track" column that

According to Silicon Valley rumor mongers, Bill Gates has hired operating system guru and program designer David Cutler to develop what everyone is calling portable OS/2. This will be generic OS/2, but completely written in C. The idea is that once OS/2 becomes a viable and popular operating system, it will still be confronted by the portability issue. Portable OS/2 could be quickly ported to a RISC machine. This may be the secret project that finally makes Microsoft the biggest software company in the world.

He got it two-thirds right. Thanks in large part to David Cutler's work, Microsoft quickly became the biggest software company in the world. In its 2017 ranking of "Software & Programming" companies (pure play or nearly pure play), Forbes still put Microsoft right at the top. 

And Windows NT was initially developed on the MIPS R3000 and later ported to Digital Equipment's 64-bit Alpha, two of the many attempts to run Windows on non-x86 platforms, the most recent being Windows RT (nope) and the Qualcomm Snapdragon using software emulation (maybe).

But Bill Gates had actually hired David Cutler (along with most of Cutler's DEC Prism team) in 1988 to write Windows NT. Since this yet-undisclosed operating system would compete directly with OS/2 and thus IBM (the irascible Cutler loathed both OS/2 and Unix), the information had been leaked using language that didn't alarm the wrong people.

After all, IBM and Microsoft were still best buds. They had brought the PC to market hand-in-hand. OS/2 belonged to both of them, even more so than DOS. That's what everybody believed, and they believed it so completely that early news of the breakup was taken with a grain of salt.

When the Wall Street Journal reported in its 28 January 1991 edition that Microsoft was planning to "scrap OS/2 and refine Windows," John Dvorak and the rest of the computer press labeled this spot-on revelation "dubious" and dismissed the "premature obituary for OS/2" as a "fiasco."

After previously doubting that "IBM's OS/2 would be able to knock Windows and DOS from the top of the hill," in his August 1992 column Dvorak hopped off the fence and wrote that PC users with the necessary hardware would be "nuts not to try OS/2." A year later, with the release of OS/2 2.1, he predicted that

the popularity of OS/2 will increase dramatically when people finally start to grasp the power of true multitasking.

Well, that stood to reason, did it not? If OS/2 wasn't the future of the personal computer, then what was?

Oddly enough, John Dvorak made perhaps his most prescient statement about OS/2 at a time when nobody outside Microsoft and IBM had seen it. The July 1987 issue of PC Magazine introducing IBM's PS/2 line of personal computers also announced the upcoming OS/2 operating system.

The official release was still half a year away. So the cynical Dvorak trusted his gut and opined that "OS/2 might easily become the Lisa of software." Continuing with the analogy, he added, "It remains to be seen whether or not Microsoft has a Macintosh of software waiting in the wings."

Well, Microsoft did. And it wasn't waiting in the wings. Windows 1.0 had debuted in late 1985 and impressed those users with hardware powerful enough to make the most of what was, at the time, cutting-edge technology. But the Cassandras of computers couldn't believe their own lying eyes.

Related posts

The future that wasn't (introduction)
The future that wasn't (2/7)
The future that wasn't (3/7)
The future that wasn't (4/7)
The future that wasn't (5/7)
The future that wasn't (6/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS


August 02, 2018

The streaming chronicles (2/4)

In which I troubleshoot the Roku Express for a problem that so far has turned out not to be one.

I've been using the Roku Express for two months. I still recommend it as the most economical standalone streaming solution, though with some caveats. Namely that it thinks my router is physically located in another zip code.

A Wi-Fi analyzer placed next to the Roku Express never drops below -50 dBm on a clear channel. Full strength. But the Network > About screen lists the signal strength as "Poor" and the "secret" Wi-Fi screen (HOME x5, Up, Down, Up, Down, Up) reports a signal strength of -80 dBm.

("HOME x5" means to first press the HOME button five times. Click on the sidebar graphic for a list of the Roku secret screens.)

At first I thought Wi-Fi Direct, a Bluetooth-like protocol that communicates with the remote control, might be the guilty party.

The Roku broadcasts an SSID called "DIRECT-roku" on the same channel as the access point. With a Wi-Fi signal strength around -50 dBm, the DIRECT-roku SSID often peaked at over -40 dBm, which is like cranking the stereo to eleven and blasting it all over the neighborhood.
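To put those readings in perspective: dBm is a logarithmic scale, so small-looking differences are big differences in power. A quick Python sketch of the standard dBm-to-milliwatt conversion, using the readings quoted in this post:

    # dBm is logarithmic: every +10 dBm is ten times the power.
    def dbm_to_nanowatts(dbm):
        """Convert a dBm reading to nanowatts (1 mW = 1,000,000 nW)."""
        return 10 ** (dbm / 10) * 1_000_000

    for label, dbm in [
        ("Access point, measured at the Roku", -50),
        ("DIRECT-roku peak", -40),
        ("Roku 'secret screen' report", -80),
    ]:
        print(f"{label}: {dbm} dBm = {dbm_to_nanowatts(dbm):.2f} nW")

    # -40 dBm is ten times the power of -50 dBm, while -80 dBm is a
    # thousandth of it, which is why the Roku's own report looks bogus.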

I program my AV gadgets to an IR universal remote (a big reason to prefer the Roku Express over competing models) and I don't need the mirroring functions. The following steps suggested by Brendan Long turned off the DIRECT-roku SSID on my Roku Express.

Go to Settings > System > Advanced system settings > Device connect and disable Device connect. Restart the Roku.

It turns out that "Device connect" is Wi-Fi Direct. Other than de-cluttering the radio spectrum, turning off Wi-Fi Direct didn't help. Neither did reinstalling the software (HOME x5, FF, FF, FF, RW, RW) or the Wi-Fi drivers (HOME x5, Up, Down, Up, Down, Up).

The 8.1.0.4145-51 software update might have improved the adaptive bitrate streaming protocols. It didn't affect Wi-Fi signal strength. But everything is working the way it is supposed to. My solution is to stick a piece of black tape over the "check engine" light and keep driving.

In any case, thirty dollars is simply not that big of a sunk cost. If it stops working, I'll take that as an excuse to play with other streaming toys. The next step is to implement some long-overdue upgrades to my home network and see what kind of a difference that makes.

Related posts

The streaming chronicles (1)
The streaming chronicles (3)
The streaming chronicles (4)
Anime's streaming solution
