September 20, 2018

Frenemies (3/7)

As the 1990s began, big changes in the computer business were just over the horizon.

Intel was rolling out the 80486 microprocessor. With over a million transistors on board and clock speeds that would climb to 100 MHz, the 486 made the Graphical User Interface truly usable. Featuring that GUI front and center would be Windows 3.0 and OS/2 2.0.

Conventional wisdom had already concluded that Microsoft took three tries to get the software right. That meant Windows 3.0 was going to be a Big Deal, and Microsoft treated it as such, with multi-page advertising spreads in the major tech publications.

Nevertheless, the personal computing world was not ready to cast Windows and OS/2 as competing products. Almost two years before, in the 14 March 1988 issue of PC Magazine, Gus Venditto reported on a "recent policy statement" by Bill Gates that "outlined a timeline in which 15 percent of new PCs are running OS/2 in 1989, growing to 50 percent by 1991."

In his 12 April 1988 cover story, "What OS/2 Will Mean to Users," Charles Petzold concluded that "Everybody currently using DOS on an 80286 or 80386-based machine will eventually consider upgrading to OS/2."

Everybody knew what IBM and Microsoft intended to do.

Over a year later, Petzold still predicted that "IBM and Microsoft intend OS/2 to be the dominant PC operating system of the 1990s—and they seem ready to fix any problem that could inhibit this goal." In the 27 February 1990 issue William F. Zachmann stated that "OS/2 is clearly the intended heir to DOS as far as IBM and Microsoft are concerned."

Despite the overwhelming success of Windows 3.0, Zachmann doubled down on this prediction in the 25 September 1990 issue: "Windows 3.0 will light the way to OS/2, not eclipse it. And that's really what Microsoft always wanted."

By 1990, what Microsoft really wanted was to get out of its software development relationship with IBM; it had been hedging its bets for the past two years. It hired David Cutler—designer of the revered VMS operating system—away from Digital Equipment in order to create Windows NT. In the first issue of 1990, John Dvorak reported in PC Magazine that

Everyone is talking about Microsoft Windows 3.0, but not all of the talk is pleasant. It seems that Microsoft's sudden re-emphasis on Windows may result in more grousing by developers who have put their hopes into OS/2. Windows 3.0 now looks like the hot ticket to the future. I'm told that Microsoft employees have gone back to Windows en masse.

Although Gates increasingly had every reason to question IBM's competence in the retail arena, the source of the widening rift was the diverging corporate philosophies of the two companies. Keep in mind Microsoft's mission statement: "A computer on every desk and in every home all running Microsoft software."


Microsoft got started selling software for the 8-bit Altair. Microsoft made a CP/M card for the Apple II and is still a major software developer for the Macintosh. "A computer on every desk" manifestly did not mean "An IBM computer on every desk." Microsoft had much bigger aims than that.

Every computer on every desk in the universe, if possible.

IBM's lurch toward proprietary solutions, starting with the Micro Channel bus, was tossing sand into the gears of this goal. OS/2 cheerleader William Zachmann plainly admitted that Micro Channel was "IBM's standard. And nobody else's. From the very beginning, IBM intended Micro Channel to eliminate competition from vendors of compatible systems."


In the same 12 April 1988 issue that Charles Petzold declared "the OS/2 decade has begun," a more pessimistic Robert Hummel keenly perceived the same existential threat to Microsoft and the huge base of existing DOS software that Bill Gates must have.

If you use IBM software, OS/2 may not live up to the capabilities you want unless you buy your computers from IBM. And now you see the real reason for OS/2.

Nine months later, in his 31 January 1989 column, Charles Petzold mused that "The conspiracy-minded among us have suggested that OS/2 Extended Edition is the first step in making OS/2 an IBM proprietary operating system."

Gates was not about to get hemmed in by IBM's possessive hold on the platform and its parochial approach to software development. Despite the huge industry it had created, IBM demonstrated no interest or ability in driving the business forward at the retail level and maximizing the consumer base.

Simply consider that a personal computer user who resolved to plunk down 340 dollars for OS/2 Standard Edition 1.1 still had to figure out how to buy it. The OS/2 ads that appeared in PC Magazine directed the reader to "your local authorized IBM dealer." Whoever that was.

By contrast, Microsoft made sure that Windows 3.0 came bundled with most new computers. And if you wanted to buy a copy, simply flip through the pages of PC Magazine to an ad from, say, mail-order powerhouse PC Connection, pull out your credit card, and it was yours for 99 bucks.

Even William Zachmann had to admit that "IBM was never really aggressive on pricing. IBM was never aggressive when it came to innovation either." No surprise that Microsoft should resolve to reassert control over the personal computer operating system.

As it had since its founding, IBM envisioned itself as a highly profitable purveyor of proprietary computing systems. In other words, what Apple would become two decades hence. Not a bad corporate goal to have—for IBM. But Bill Gates wasn't about to sacrifice a 90-plus percent market share in order to bolster IBM's our-way-or-the-highway strategy.

Not when Microsoft was busily building an interstate of its own. In the 25 April 1989 issue of PC Magazine, John Dvorak mused that

If IBM had not become preoccupied with its Micro Channel patents and closed architecture, I think it would have sold twice as many PS/2s. More important to IBM, Big Blue would be in the driver's seat, controlling the destiny of the market. Now it's just a target for bypass.

The time had come for Microsoft to take the bypass and lead the personal computing world in a direction of its own devising.

Related posts

The future that wasn't
The Cassandras of computers (1/7)
The tortoise and the hare (2/7)
The accidental standard
The grandfathers of DOS


September 13, 2018

The streaming chronicles (4/4)

In which I have nice things to say about Tubi.

Tubi is available on-line and via streaming services such as Roku. It's an ad-supported channel with a straightforward interface and a decent selection of anime and Japanese movies.

The catalog doesn't say whether an anime title is subbed or dubbed. If the voice actors in the cast list are Japanese, then it's subtitled. In some cases, closed captioning has to be enabled to get the subtitles.

For once, the ad audio is quieter than the main programming. And Tubi deserves a round of applause for its ad engine. The ad transitions are smooth and not at all annoying (though the ads can get repetitive).

If Tubi has the title and you don't have an ad-free subscription at Crunchyroll, I'd recommend Tubi.

Once in a blue moon, the closed captioning stops displaying. Tapping auto-rewind (the 10 second skip-back) always fixes it. I have no complaints about the image quality. Tubi never buffers that I can tell.

Tubi is free. It works. It's got a ton of content besides anime. It's an app well worth installing.

I've also installed the NASA TV Roku app. The content is the same as its cable/satellite channel, with additional libraries of archived material. I currently use the TuneIn app to get J1 Radio.

Related posts

The streaming chronicles (1)
The streaming chronicles (2)
The streaming chronicles (3)
Anime's streaming solution


September 06, 2018

New old titles at CR

The Crunchyroll streaming library already exceeds a staggering twelve-hundred titles (adding up to tens of thousands of episodes) and over a hundred live-action series. They recently scooped up the licenses for a bunch of full-length movies and glittering golden oldies.

Sherlock Hound features some of Hayao Miyazaki's earliest work. As you might surmise from the title, in this version, Sherlock Holmes is a dog. And so is everybody else. Lots of fun. I reviewed the series here.

In Magic User's Club (watch the OVA first), we learn that sitting on a broom (sans a pillow) hurts your butt, and the best way to deal with an alien spacecraft is to turn it into a giant cherry tree. (The first scene of the OVA has no sound because there is no sound in space.)

Patema Inverted and King of Thorn explore the unreliability of human perspective. I reviewed the former here.

Patema Inverted literally asks which way is up. King of Thorn wonders if you really know what time it is. Both require mighty suspensions of disbelief to get past the premises. But there's tons of material for anybody who enjoys musing about philosophical what-ifs.

In terms of narrative structure, King of Thorn reminded me of the "No Reason" episode of House.

Crunchyroll doesn't yet have the 3DCG Appleseed movies but it does have the 3DCG Vexille, a pastiche of every post-apocalyptic, mecha, and military anime series ever made. Watch it as a work of social commentary rather than for its dubious cinematic merits. I reviewed it here.

Voices of a Distant Star is Makoto Shinkai's brilliant debut film (and the best version of Ender's Game that isn't Ender's Game). I reviewed it here. I didn't much care for 5 Centimeters per Second, but it is the most beautiful teen soap opera ever made.

Welcome to the Space Show takes a gang of kids from rural Japan on an Art Deco roller coaster ride through a fractious galactic empire ruled by a reality TV show host. As the title suggests, it's a dazzling and hilarious trek through the stars.

Night on the Galactic Railroad is based on the fantasy novel by Kenji Miyazawa, an agronomist and social activist who died in 1933 at the age of thirty-seven. Little known for his poetry and fiction in his lifetime, he is now considered one of Japan's great literary figures.

Night on the Galactic Railroad inspired Leiji Matsumoto's anime classic Galaxy Express 999. This morally complex work of science fiction won the Shogakukan Manga Award in 1978 and the Animage Anime Grand Prix prize in 1981.


Video links

5 Centimeters per Second
King of Thorn
Galaxy Express 999 (Tubi)
Magic User's Club
Magic User's Club OVA (YouTube)
Night on the Galactic Railroad
Patema Inverted
Sherlock Hound (YouTube)
Voices of a Distant Star
Vexille
Welcome to the Space Show


August 30, 2018

The tortoise and the hare (2/7)

The challenge for journalists covering science and industry is to capture the state of a rapidly changing environment without forgetting they are taking pictures of moving objects. Reducing technology to snapshots in time masks the movement going on behind the scenes. Compared to the jack-rabbiting hare, the tortoise isn't moving at all.

This was certainly true of software and hardware in the late 1980s. Gordon Moore devised his famous law in 1965. As with a standard exponential growth curve, the rate of change was barely perceptible at first. But by the 1990s, it was taking off like a rocket. The technology was evolving so fast that the real world appeared utterly unreal.

In his 12 June 1990 Inside Track column, John Dvorak reported that a

two million transistor 586 will show up in 1992, and the 4-6 million 686 in 1996. By the year 2000, the expected technology breakthroughs will allow the company 100 million transistor on the 786. It will crank out 2 billion instructions per second [2,000 MIPS] at a clock speed of 250 MHz! Sheesh. The company confirms these assertions.

Such predictions were gobsmackingly unbelievable at the time. But the intel Dvorak got from Intel was spot-on. Released in 1999, the Pentium III had 9,500,000 transistors on board, a clock speed of 450 MHz, and cranked out 2,054 MIPS. The Pentium 4 topped 100,000,000 transistors in 2004. Multi-core CPUs today contain billions.
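Dvorak's figures track a simple doubling model. As a rough illustration (the two-year doubling period and the 486's 1.2-million-transistor baseline are my assumptions, not figures from the column), a few lines of Python project transistor counts forward:

```python
# Rough Moore's-law projection: transistor count doubling at a fixed
# rate, starting from the 486's ~1.2 million transistors in 1989.
# Baseline and doubling period are illustrative assumptions.

def projected_transistors(year, base_year=1989, base_count=1.2e6,
                          doubling_years=2.0):
    """Project a transistor count assuming a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1992, 1996, 2000):
    print(f"{year}: ~{projected_transistors(year) / 1e6:.0f} million")
```

The projections land in the same ballpark as the numbers Dvorak reported: a few million transistors in 1992, tens of millions by the late 1990s.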


Once a slow-moving technology begins to pick up speed, the acceleration can catch bystanders by surprise. Just as when a much heralded technology begins to slow, it can quickly get stuck in the mud.

The most surprising thing about the origins of the personal computer is that a lumbering entity like IBM brought the PC to market in so nimble a fashion. So quickly, in fact, that neither IBM nor Microsoft had time to develop its own operating system. Microsoft instead purchased the progenitor of MS-DOS from Seattle Computer Products.

Come the 1990s, a decade on, a new operating system was needed: a 32-bit OS for the 32-bit microprocessors that already dominated the market, one that extended the memory address space from 1 MB to 4 GB. And it should come with a graphical user interface (GUI). OS/2 ticked all the boxes. Obviously history was simply going to repeat itself.
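That address-space jump is bigger than the round numbers suggest: a 20-bit real-mode address space tops out at 2^20 bytes (1 MB), while a flat 32-bit space reaches 2^32 bytes (4 GB), a 4,096-fold increase. A quick check in Python:

```python
# Address space sizes: real-mode DOS (20 address lines on the 8086)
# vs. a flat 32-bit address space.
dos_limit = 2 ** 20     # 1,048,576 bytes = 1 MB
flat_32bit = 2 ** 32    # 4,294,967,296 bytes = 4 GB

print(dos_limit // 2 ** 20, "MB")           # prints "1 MB"
print(flat_32bit // 2 ** 30, "GB")          # prints "4 GB"
print(flat_32bit // dos_limit, "x larger")  # prints "4096 x larger"
```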

Except that IBM viewed the accidental standard it had created as exactly that: an accident. A mistake, one it was not eager to repeat. Still, everybody expected another turn on a dime, a reboot of the personal computer industry once again engineered by the hit Hope and Crosby duo of Microsoft and IBM.

Everybody but Microsoft, it turned out.

Windows had been in development since 1985, two years prior to the release of OS/2 1.0. At first, it was little more than a 16-bit shell that compared poorly with the Macintosh GUI. But like the stubborn tortoise, Microsoft plodded along, releasing versions that first took advantage of the 16-bit 286's protected mode and then the fully 32-bit 386.

As Steve Ballmer recounted in Triumph of the Nerds, "I was the development manager for Windows 1.0 and we kept slogging and slogging to get things right for [Windows 3.0 in] 1990."

Microsoft was slogging in the right direction. The 15 March 1988 issue of PC Magazine inaugurated the "Pipeline" column that included a bestseller list focusing on business software and operating systems. The first list began as follows:

1. Lotus 1-2-3 (DOS)
2. WordPerfect 4.2 (DOS)
3. Microsoft Windows 2.03

That third entry remains a surprise today. Flying well below the radar, Windows had already established a sizable market presence. When Windows 3.0 debuted in May 1990, Microsoft finally had the consumer operating system it'd been looking for and a good idea of how fast that critter could run. It was time to press the pedal to the metal.

Consider a 30 April 1991 column in which Jim Seymour ranted about Microsoft changing the default keyboard shortcuts in Word 5.5 (DOS) to match those of the Windows interface. One change he pointed to: ALT-F-O now launched the File Open dialog box. With the benefit of hindsight, ALT-F-O is an amazing example of forward thinking.

A quarter century later, ALT-F-O retains the same function in practically all Windows applications. In 1991, Microsoft committed itself to a future based on Windows. And followed through.

But back in the 27 December 1988 issue, columnist Ray Duncan wasn't

ready to join either the "DOS is Dead" or the "DOS Forever" school of thought. I think it quite likely that OS/2's successors will eventually eclipse MS-DOS, but I suspect this will take a lot longer than anybody now imagines—perhaps ten years or more.

Actually, it'd only take four more years for Windows 3.x to bury OS/2, selling forty times more copies than OS/2 had thus far in its entire existence.

Related posts

The future that wasn't
The Cassandras of computers (1/7)
Frenemies (3/7)
The accidental standard
The grandfathers of DOS


August 23, 2018

The streaming chronicles (3/4)

In which I upgrade my cable modem and router.

So I got an Arris Surfboard SB6183 DOCSIS 3.0 cable modem. Xfinity has bugged me about it for years. I'd procrastinated because a faster Internet connection wouldn't speed up the Pentium M processor in my ancient ThinkPad laptop and its poky b/g Wi-Fi.

Getting into streaming was sufficient motivation to upgrade my home network. Streaming at a good data rate looks great. Not to mention that the way Xfinity stairsteps the "default" service level, I was paying for bandwidth my router and cable modem couldn't deliver.

Yeah, that's dumb. But whatever can be put off until tomorrow I will put off until tomorrow. The old stuff still works. It's always best to upgrade while the old stuff still works. Well, tomorrow has arrived!

The new router is a TP-Link TL-WR841N. It's gotten decent reviews for a very affordable class of wireless router. Common objections, like it only having 10/100 Ethernet and not supporting 802.11ac, are mostly irrelevant to my current setup. And did I mention that it's affordable? Twenty bucks!


I averaged a dozen or so samples with Fast and the Speedtest app for each hardware setup. Though the ThinkPad reports a 54 Mbps connection, neither it nor the Belkin could handle that data rate. Download speeds increased 20-30 percent with the TP-Link. The DOCSIS 3.0 modem made little difference.

The Fire 7, on the other hand, has dual-band b/g/n. The specs for the DOCSIS 2.0 SB5101 claim a maximum transfer rate of 30 Mbps. The 802.11n protocol on the TP-Link and Fire 7 appears to be delivering every bit of it. Now I'm tempted to experiment with an 802.11n USB Wi-Fi adapter for the ThinkPad.

I'm paying for a maximum 70 Mbps down. The fine print doesn't guarantee it and I'd have to string an Ethernet cable to set a baseline. Using 802.11n, the DOCSIS 2.0 modem gave me half that. The DOCSIS 3.0 modem delivers two-thirds. Considering the ageing infrastructure in this part of town, not bad.

Download speeds in Mbps (Wi-Fi protocol in parentheses):

          Belkin      TP-Link     DOCSIS 3.0
ThinkPad  (g) 8.3     (g) 11.2    (g) 11.3
Fire 7    (g) 11.1    (n) 33.2    (n) 43.2

DOCSIS 2.0 upload speeds averaged 5.1 Mbps with both the Belkin and TP-Link. DOCSIS 3.0 kicked it up to 6.2 Mbps, even on the ThinkPad. That extra 20 percent is handy when backing up data to the cloud.
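The "half" and "two-thirds" fractions can be reproduced from the measured numbers. A quick sketch (the 70 Mbps plan ceiling and the Fire 7 download speeds come from the measurements above; the percentage arithmetic is mine):

```python
# Recompute the fraction of the advertised plan each setup delivered,
# using the measured Fire 7 download speeds (Mbps).
plan_mbps = 70  # advertised maximum downstream

fire7 = {"Belkin": 11.1, "TP-Link": 33.2, "DOCSIS 3.0": 43.2}

for setup, mbps in fire7.items():
    print(f"{setup}: {mbps} Mbps = {mbps / plan_mbps:.0%} of plan")
```

The TP-Link with the old modem works out to roughly half the plan rate, and the DOCSIS 3.0 modem to roughly two-thirds, matching the text.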

The Roku Express supports 802.11b/g/n but not dual-band. Getting precise numbers is tricky because of the adaptive bit-rate streaming. According to the playback screen (HOME x5, RW, RW, RW, FF, FF), using the TP-Link with the DOCSIS 2.0 modem increased throughput from 1-5 Mbps to 10-15 Mbps.

The DOCSIS 3.0 modem upped network throughput to 15-40 Mbps at the high end. I'm assuming this screen reports the bandwidth provided by a given CDN stream and not the connection with the router, which shouldn't otherwise change. Though this could be a product of the glitches I discussed previously.

So far, the upgrade has done what I wanted it to do—eliminate buffering. Granted, it only ever happened with Crunchyroll, probably because Crunchyroll is migrating its CDN to AWS CloudFront. Occasional bouts of buffering since are more likely due to normal WAN hiccups than to LAN flakiness.

Oh, the Roku logging functions are still definitely flaky. The Network > About screen randomly reports signal strength as anything from "Poor" to "Excellent." The "secret" Wi-Fi screen (HOME x5, Up, Down, Up, Down, Up) still lists the signal strength as -80 dBm. Effectively zero. But it works fine.

Related posts

The streaming chronicles (1)
The streaming chronicles (2)
The streaming chronicles (4)
Anime's streaming solution


August 16, 2018

The last shogun

In the textbooks, at least, the 1868 Meiji Restoration ended the rule of the shoguns and reestablished the reign of the emperors. The effect, however, was to create a government where the "separation of powers" simply meant that the powers of the government were all separated.

Oh, those powers were, on paper, vested in the emperor. So had they been during the shogunate. It's just that from the 17th century through the mid-19th, the Tokugawa shoguns unquestionably controlled the emperor. Now nobody controlled the emperor. And the emperor didn't control anything either.

In a deadly game of king of the hill, the years in Japan between the Meiji Restoration and WWII were punctuated by a series of attempted coups. None succeeded, but all had the effect of pushing the government further to the right in hopes of deflecting the next military revolt, until the army was operating without any practical constraints.

Echoes of the first half of the 16th century, when the slow rot of the Ashikaga shogunate ignited battles amongst the military governors that culminated in the Warring States period.

Lacking the checks and balances of civilian oversight, the Japanese army ended up starting a small war in China that grew out of control, basically Vietnam on a continental scale. When the U.S. cut off oil and scrap metal exports to Japan as a response, the military lashed out without considering its capabilities or the military consequences.

Thanks to the military doctrine of Kantai Kessen, the "decisive battle" envisioned as a winner-take-all contest between battleships, the Japanese war effort was doomed from the start. Japanese military leaders couldn't stop believing in Kantai Kessen because it had proved so decisive during the Russo-Japanese War.

But by June of 1942 and the decisive Battle of Midway, the battleship was a white elephant. The aircraft carrier ruled the waves. To be sure, the Japanese navy had indeed crushed the Russian fleet at the Battle of Tsushima in 1905, compelling a shaky Russian government to sue for peace.

This "underdog" victory was hailed around the world (even though it began with a "sneak attack"). The Japanese government was quick to believe its own press, forgetting that the land war going on at the same time was about as decisive as the First World War would be, with the Japanese infantry taking as many casualties as the Russians.

Notwithstanding one of the greatest diplomatic achievements in history, the victorious Japanese came away from the Treaty of Portsmouth (1905) believing that the western powers had robbed them of their due. This combination of victimhood, aggrievement, and overconfidence set the stage for the next forty years of accumulating disasters.

In Japan, ordinary citizens—already living under draconian rationing and sumptuary laws—took the December 1941 attack on Pearl Harbor to be a second Tsushima, signaling an end to the conflict.

By the Battle of Okinawa, nobody in the Japanese government believed they could prevail by force of arms alone. But they could convince the Americans that invading the main islands carried too high a cost, essentially Robert E. Lee's strategy in 1864, which might have succeeded except for the fall of Atlanta and Sherman's March to the Sea.

The bitter irony is that in this they succeeded. Thus the atomic bomb. But the atomic bomb probably had a greater influence on Stalin, who, thanks to his spies, knew more about it than Truman. Stalin didn't launch his invasion of Manchuria until after Nagasaki. Once the bomb was dropped, Stalin had to act before Japan surrendered.

One of Stalin's goals was payback for the Russo-Japanese War. The Soviet army reclaimed all of its former territories, plus several islands that had always been part of Japan. From 560,000 to 760,000 Japanese were shipped off to the gulags, where from 10 to 50 percent of them died. This treatment by a former "ally" still rankles in Japan.

There is much talk of "formally" ending the Korean War. The one-week war between the Soviet Union and Japan has never been formally resolved either.

All through the Second World War, Japan and the Soviet Union had a non-aggression pact. Until the bitter end, the Japanese Supreme Council saw the Soviet Union as a "good faith" intermediary while raising arcane and legalistic objections to the Potsdam Declaration. Stalin's abrogation of the non-aggression pact destroyed that illusion.

But a negotiated surrender would not be acceptable to the Allies and certainly not to their citizens. They had been there and done that and suffered the consequences. In July of 1918, Winston Churchill laid out the terms for a lasting armistice with Germany. In the process, he made clear why the "Great War" would not be "the war to end all wars."

Germany must be beaten; Germany must know that she is beaten; Germany must feel that she is beaten. Her defeat must be expressed in terms and facts which will, for all time, deter others from emulating her crime, and will safeguard us against their repetition.

Despite all the treaties signed and reparations extracted at Versailles, between the two world wars, Germany acceded to none of these conditions. But in August of 1945, as John Dower vividly lays out in Embracing Defeat: Japan in the Wake of World War II, Japan very much did.

The atomic bomb was considerably less destructive than Curtis LeMay's ongoing firebombing campaigns. But it forced Stalin's hand and that forced the Japanese government to finally face reality. And when the emperor finally did face reality, the atomic bomb gave him a transcendent power to which he could surrender Japan's wartime ideology.

This time, history would not repeat itself.

Though in a very real sense, history was repeating itself for the fourth time. In 1185, Minamoto Yoritomo destroyed the Taira clan—the power behind the throne—and moved the capital of Japan to Kamakura, inaugurating the rule of the shoguns. On and off for the next 700 years, the emperor reigned as little more than a figurehead.

When Tokugawa Ieyasu consolidated power after the Battle of Sekigahara in 1600, the country breathed a sigh of relief and mostly aligned itself with the new regime. Like Ieyasu himself, it was an opportunistic resolution that demanded little in the way of ideological conformity, except to go along to get along, a social compact that worked.

In the mid-1860s, as the Tokugawa regime crumbled around them and the center could no longer hold, this opportunistic ambivalence was expressed in the "Ee ja nai ka" movement, an anarchic yet strangely playful popular uprising that proclaimed, "So what? Why not? Who cares?"


In the late summer of 1945, the population was too exhausted to dance in the streets. But they'd had enough of ideology. When General MacArthur arrived in Japan on August 30, he was greeted as the last of the Japanese shoguns. The Japanese people accordingly switched their allegiances to the man who promised them less torment and a better future.

Related posts

The grudge and the dream
Kantai Kessen
Hirohito and the Making of Modern Japan


August 09, 2018

The Cassandras of computers (1/7)

Cassandra was cursed by the Greek god Apollo with the power to make true prophecies that nobody believed. A quarter century ago, the Cassandras of the computer world had an additional problem: they didn't always believe the future they were forecasting either.

Columnist John Dvorak was a curmudgeonly contrarian back when he started writing for PC Magazine in the 1980s. He's still on the job thirty years later. During his first decade, he made several notable predictions, reported a story that foreshadowed a tidal wave of technological change, and then missed the very confirmation of what he was writing about.


After a hands-on demo of the Canon RC-701, the first commercially-available camera to use a CCD instead of film, Dvorak stated in the 26 January 1988 issue,

It's the future. Not only will we one day take photos on floppies or plug-in RAM (or both), but we will manipulate them on our machines at home. Finally, a use for the home computer: a device to edit snapshots. And note: because the photo is on a cheap disk, there will be no reluctance to take tons of pictures because we'll know we can erase and reuse the disk—something we'll never actually do.

After first musing that it was time to sell Kodak stock, he reconsidered and thought that maybe Kodak had a future selling printers and paper. His initial reaction was the correct one.

At the time, it was also a warning nobody wanted to hear (including Kodak). After he again broached the subject two years later in his 25 December 1990 column, a reader wrote in to scoff, "What a laugh. John Dvorak says that photography as we know is dead." But this time Dvorak had seen the future with 20/20 foresight.

He hit the nail on the head a few other times too, pointing to the rise of Unicode and predicting that LCD screens were "the wave of the future." In his 26 January 1993 column, he railed against pagers and portable phones, saying they reminded him of the Borg from Star Trek. "The desire for instant communication reflects our own insecurities. Dump them."

Dvorak recently revisited the latter subject in a 6 June 2018 column, in which he termed the malady "FOMO," or "the fear of missing out." (Internet pioneer Jaron Lanier recently devoted an entire monograph to the subject.)

Also in 1993, John Dvorak foresaw the importance of Texas Instruments' brand-new DLP projection system and correctly predicted that giant magnetoresistance technology would make 250 GB hard drives "commonplace" a decade hence. Wading into the field of business anthropology, after a junket to Japan in 1990, he observed,

The Japanese are incredible time wasters. While the Japanese may make an efficient assembly line, the Japanese style of doing business is the opposite. It's a miracle anything gets done. Companies compete fiercely with each other. "Japan Inc." is a cooperation between government and business that is the opposite of what we experience in America. In Japan, government helps business succeed because the government knows that business success means wealth and jobs for everyone.

Dvorak naturally caught a lot of politically correct flak for those comments, but I don't see anything there to disagree with. In fact, in one short paragraph, he neatly summed up The Enigma of Japanese Power by Karel van Wolferen.

But back to the subject at hand. In the last issue of 1989, John Dvorak reported in his "Inside Track" column that

According to Silicon Valley rumor mongers, Bill Gates has hired operating system guru and program designer David Cutler to develop what everyone is calling portable OS/2. This will be generic OS/2, but completely written in C. The idea is that once OS/2 becomes a viable and popular operating system, it will still be confronted by the portability issue. Portable OS/2 could be quickly ported to a RISC machine. This may be the secret project that finally makes Microsoft the biggest software company in the world.

He got it two-thirds right. David Cutler was designing an operating system that would prove wildly successful. He would port Windows NT to Digital Equipment's 64-bit Alpha, the first of many attempts to run Windows on non-x86 platforms, most recently Windows RT (nope) and the Qualcomm Snapdragon using software emulation (maybe).

But Bill Gates had actually hired David Cutler (along with most of Cutler's DEC Prism team) in 1988 to write Windows NT. Since this yet-undisclosed operating system would compete directly with OS/2 and thus IBM (the irascible Cutler loathed both OS/2 and Unix), the information had been leaked using language that didn't alarm the wrong people.

After all, IBM and Microsoft were still best buds. They invented the PC hand-in-hand. OS/2 belonged to both of them, even more so than DOS. That's what everybody believed, and they believed it so completely that early news of the breakup was taken with a grain of salt.

When the Wall Street Journal reported in its 28 January 1991 edition that Microsoft was planning to "scrap OS/2 and refine Windows," John Dvorak and the rest of the computer press labeled this spot-on revelation "dubious" and dismissed the "premature obituary for OS/2" as a "fiasco."

After previously doubting that "IBM's OS/2 would be able to knock Windows and DOS from the top of the hill," in his August 1992 column Dvorak hopped off the fence and wrote that PC users with the necessary hardware would be "nuts not to try OS/2." A year later, with the release of OS/2 2.1, he predicted that

the popularity of OS/2 will increase dramatically when people finally start to grasp the power of true multitasking.

Well, that stood to reason, did it not? If OS/2 wasn't the future of the personal computer, then what was? Because it wasn't Unix and it wasn't Mac OS and Windows NT was designed for high-powered workstations and servers. The answer was obvious even then, but the Cassandras of computers couldn't believe their own lying eyes.

Related posts

The future that wasn't
The tortoise and the hare (2/7)
Frenemies (3/7)
The accidental standard
The grandfathers of DOS


August 02, 2018

The streaming chronicles (2/4)

In which I troubleshoot the Roku Express for a problem that so far has turned out not to be one.

I've been using the Roku Express for two months. I still recommend it as the most economical standalone streaming solution, though with some caveats. Namely that it thinks my router is physically located in another zip code.

A Wi-Fi analyzer placed next to the Roku Express never drops below -50 dBm on a clear channel. Full strength. But the Network > About screen lists the signal strength as "Poor" and the "secret" Wi-Fi screen (HOME x5, Up, Down, Up, Down, Up) reports a signal strength of -80 dBm.

("HOME x5" means to first press the HOME button five times. Click on the sidebar graphic for a list of the Roku secret screens.)
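The gap between those two readings is bigger than the raw numbers suggest, because dBm is a logarithmic scale: every 10 dBm is a factor of ten in power. A quick back-of-the-envelope check in Python (the two readings are the ones reported above; the dBm-to-milliwatt conversion is the standard formula):

```python
def dbm_to_mw(dbm):
    """Convert a dBm reading to milliwatts (dBm = 10 * log10(mW))."""
    return 10 ** (dbm / 10)

analyzer = dbm_to_mw(-50)  # what the Wi-Fi analyzer next to the Roku sees
roku = dbm_to_mw(-80)      # what the Roku's "secret" Wi-Fi screen reports

# The Roku is reporting a signal a thousand times weaker than
# what is actually arriving at its antenna.
print(f"Analyzer: {analyzer:.0e} mW, Roku: {roku:.0e} mW")
print(f"Ratio: {analyzer / roku:.0f}x")
```

In other words, the Roku isn't claiming the signal is a little weak; it's claiming it is three orders of magnitude weaker than measured, which is why "another zip code" isn't much of an exaggeration.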

At first I thought Wi-Fi Direct, a Bluetooth-like protocol that communicates with the remote control, might be the guilty party.

The Roku broadcasts an SSID called "DIRECT-roku" on the same channel as the access point. With a Wi-Fi signal strength around -50 dBm, the DIRECT-roku SSID often peaked at over -40 dBm, which is like cranking the stereo to eleven and blasting it all over the neighborhood.

I program my AV gadgets to an IR universal remote (a big reason to prefer the Roku Express over competing models) and I don't need the mirroring functions. The following steps suggested by Brendan Long turned off the DIRECT-roku SSID on my Roku Express.

Go to Settings > System > Advanced system settings > Device connect and disable Device connect. Restart the Roku.

It turns out that "Device connect" is Wi-Fi Direct. Other than de-cluttering the radio spectrum, turning off Wi-Fi Direct didn't help. Neither did reinstalling the software (HOME x5, FF, FF, FF, RW, RW) or the Wi-Fi drivers (HOME x5, Up, Down, Up, Down, Up).

The 8.1.0.4145-51 software update might have improved the adaptive bitrate streaming protocols. It didn't affect Wi-Fi signal strength. But everything is working the way it is supposed to. My solution is to stick a piece of black tape over the "check engine" light and keep driving.

In any case, thirty dollars is simply not that big of a sunk cost. If it stops working, I'll take that as an excuse to play with other streaming toys. The next step is to implement some long-overdue upgrades to my home network and see what kind of a difference that makes.

Related posts

The streaming chronicles (1)
The streaming chronicles (3)
The streaming chronicles (4)
Anime's streaming solution


July 26, 2018

Kazuya Kosaka

One of the rewards of listening to the Gold(en oldies) channel on J1 Radio is hearing covers of songs that catch you totally by surprise. Flash back to the late 1950s and early 1960s when Japan's first television stations were going on the air. They licensed Hollywood productions to fill in their program schedules. And still do.

(I've had trouble of late getting the J1 Radio app to work on my Roku. J1 Radio is also available on the free version of the TuneIn app.)

Westerns were a staple of American television at the time, and so the genre naturally became a staple of Japanese television. Rawhide was a big hit. During a February 1962 publicity tour, Clint Eastwood, Paul Brinegar, and Eric Fleming met the Japanese press at the Palace Hotel in Tokyo.


It was only a matter of time before Japanese musicians began performing Western music and rockabilly. Kazuya Kosaka & The Wagon Masters not only covered the hits but reinterpreted them as well. I can't find Kosaka's cover of the Rawhide theme song (in Japanese) on YouTube. Here is his version of "Jailhouse Rock."


Kazuya Kosaka is probably better remembered today in Japan for his long career in movies and television.


July 19, 2018

The future that wasn't

As the old Danish proverb (attributed to everyone from Niels Bohr to Yogi Berra) observes, "It's difficult to make predictions, especially about the future."

Forty years ago, a world-changing industry distilled out of the ether of human ingenuity. At the end of the 1970s, a Darwinistic fight for the survival of the technologically fittest seemed poised to crown CP/M and the Apple II as the king and queen of the microcomputer beasts.

And then a big asteroid called the Personal Computer slammed into Silicon Valley.

Unlike at the end of the Cretaceous, when the smoke cleared, one very big dinosaur was still left standing. But IBM-Rex soon discovered that the underbrush was crawling with equally persistent critters, competing like crazy and nipping at its heels.

Fueling this frenzy was the knowledge that the meteor showers hadn't ended. Another big one was on the way. There was going to be a Next Big Thing. It was in the cards from the start. The rapid evolution of the CPU had obsoleted the 16-bit Intel 8088 only four years after the debut of the IBM PC.

In its haste to get a product to market, IBM used off-the-shelf parts and an operating system from Microsoft (which Microsoft hurried out and bought from Seattle Computer Products). Within a year, Compaq had reverse-engineered the IBM BIOS to produce a 100-percent IBM PC compatible computer.

With this accidental standard in place, it was off to the races.

Beginning with the Intel 8080 in 1974, personal computing has undergone a major technological consolidation at the beginning of each decade. The 1980s saw the emergence and dominance of DOS, culminating with Apple's famous 1984 commercial that (mistakenly) targeted IBM as "Big Brother."

Now the billion-dollar behemoths thrashed about trying to figure out what the Next Big Thing would be. They figured it out soon enough. The past was prelude, and a mutated amalgam of IBM and Microsoft was going to produce a 32-bit multitasking operating system that would soon rule the world.

Except OS/2 didn't. In the words of tech writer William Shakespeare, "It strutted and fretted its hour upon the stage. And then was heard no more."

Microsoft had toyed with Xenix (which it licensed from AT&T and eventually sold to SCO) and delved deeply into OS/2 development with IBM. In the end, Bill Gates chose to stick with Windows and maintained out-of-the-box backwards compatibility with MS-DOS for the next thirty years.

At the time, the consensus of opinion pointed to anything but that outcome. Right up until nobody could imagine any other result. Unfolding between 1988 and 1992, what makes this high-tech drama so fascinating is that the writers of the tale didn't know how it would end.

But now, a quarter-century later, we do.

Our time machine, thanks to Google Books, is PC Magazine. Over the next several months, I'll be hopping into that digital Tardis and zooming back to the recent past, following the story as its editors and commentators debated how the future—meaning the present day—was going to unfold.

Related posts

The Cassandras of computers (1/7)
The tortoise and the hare (2/7)
Frenemies (3/7)
The accidental standard
The grandfathers of DOS


July 12, 2018

Hyouka

Clint Eastwood defined the essence of the role in Sergio Leone's spaghetti westerns. A lone rider with no ties and no dependencies and no interest in the human condition, the "Man with No Name" is an unapologetic misanthrope who, despite himself, ends up doing right by his fellow man.


A Fistful of Dollars and For a Few Dollars More were based on characters created by Akira Kurosawa and Toshiro Mifune for the equally iconic chambara films Yojimbo and Sanjuro.

Manga and anime embraced the trope, often adding a sidekick (a gregarious Watson to his taciturn Sherlock) and spirited girl with a cause or quest of her own. The relationship between the "wandering swordsman" Himura Kenshin and Kaoru Kamiya in Rurouni Kenshin is a case in point.

Such pairings became a staple of the romantic dramedy, perhaps no better exemplified than in Clannad. When we first meet him, Tomoya (Yuichi Nakamura) is a senior in high school. Cynical and aloof (not without his reasons), he proudly wears the label of "class delinquent."

The first day of school (one of those halcyon days in early April), he runs into Nagisa and his whole life changes. Not because he falls for her (that takes two dozen episodes) but because she presents him with a problem to solve. Solving the problem is what brings them together.

Hyouka follows a similar formula with equally outstanding results. That includes again casting Yuichi Nakamura in the lead and again pairing him with Daisuke Sakaguchi, who played his sidekick in Clannad.

Unlike Tomoya, Hotaro Oreki has no "troubled past." His goal is to get through high school with the least possible social involvement, expending as little energy as possible. That goal is frustrated when his older sister insists that he join the soon-to-be defunct "Classic Literature Club."

He shows up for the first club meeting to find one other person there, Eru (Elle) Chitanda, scion of one of the wealthiest families in town. The story, though, avoids the "poor little rich girl" meme and instead begins with a series of one-off Encyclopedia Brown-type mysteries.

As it turns out, Hotaro is really good at solving puzzles. This realization prompts Eru to present him with an unresolved family scandal. Along with Satoshi (his childhood friend) and Mayaka (the student librarian), they tackle the curious fate of Eru's uncle.

Her uncle helmed the Classic Literature Club forty years before, until he was expelled from school under questionable circumstances. Hotaro ends up expending a whole lot of energy figuring out why.

Hyouka is the title of the literary anthology the club publishes every year. It becomes the most revealing clue of all. "A dumb joke," Hotaro mutters when he figures it out, and exactly what a wronged teenager would come up with.

The author of the series, Honobu Yonezawa, includes an additional twist in the opening and closing credits with his punning alternate titles to the stories, such as "The Niece of Time." I got that one. I had to google "Why Didn't They Ask Eba [Evans]?" to get the Agatha Christie reference.

The ED for the second cour is a delightful tribute to the "cozy" genre that could constitute an episode all on its own.


The ED for the first cour, on the other hand, is simply surreal.


Some episodes are straightforward head-scratchers, even so basic a matter as why a teacher messed up his lesson plan (which begins with a debate of why some people have shorter tempers than others, which leads to discussion of the seven deadly sins, which leads to Eru's version of "greed is good").

And then the film club sets out to make a murder mystery video for their class project. In the middle of the shoot, the girl writing the script has to leave. So the film club turns to Classic Literature Club to figure out how she intended to finish it, which means solving the mystery.

No sooner has he done that than Hotaro finds himself wrestling with issues of artistic integrity and authorial intent. These themes also arise in a surprisingly complex arc in the second cour that begins with a mostly harmless prank and concludes with a meditation about creativity and talent.

These slice-of-life whodunits usually involve no crime at all. The real mystery is human nature, and why Eru can so easily knock the otherwise cool Hotaro off his stride. Sensing that "the game is afoot," she is certain to lean in and exclaim, "Ki ni narimasu!" (I'm curious!) And will not relent.


Alas, he cannot resist.

Hyouka gives us Kyoto Animation at its finest, and more stellar work from the talented and productive Yasuhiro Takemoto. His previous directorial projects include Amagi Brilliant Park, Full Metal Panic? Fumoffu, Miss Kobayashi's Dragon Maid, and The Melancholy Of Haruhi Suzumiya.

Honobu Yonezawa wrote five novels and half a dozen short stories in the "Classic Literature Club" series, which have been adapted into 11 manga volumes, 22 anime episodes (plus an OVA), and a 2017 live-action film.

You can watch Hyouka on Crunchyroll.


July 05, 2018

And then there was one

PBS affiliate in Utah, that is. For the last half-century or so, Utah's two biggest universities have hosted two independent PBS stations: KUED 7 (University of Utah) and KBYU 11 (Brigham Young University). For most of that time, KBYU played second-string to KUED, carrying the same programming a month after KUED.

While it was nice to have a "backup" channel if you missed a show the first time around, KBYU couldn't help diluting KUED's audience and ratings, and dividing loyalties, especially during membership drives.

KUED's launch of the Create subchannel (7.4) eliminated any problem of catching reruns of the DIY shows. And then last year, both stations arrived at a win-win resolution that was a huge win for KUED. On July 2, KBYU dropped its PBS affiliation and shifted its satellite channel, BYUtv, over to the primary OTA broadcast channel.

BYU Broadcasting announced plans to consolidate its television operations, BYUtv, KBYU Channel Eleven and BYUtv International, into one nationwide television network. Similarly, BYU Broadcasting said it plans to consolidate its radio operations, BYUradio (on SiriusXM Satellite Radio) and KBYU-FM/Classical 89, into a single radio network.

But listeners to Utah's last classical radio station proved to be a scrappy bunch. They weren't going down without a fight. And they won. Earlier this year, BYU Broadcasting purchased KUMT-FM (107.9) to host BYUradio,

preserving [the only] over-the-air classical music station in Utah. Classical 89 will continue to operate on its current frequency at 89.1 and 89.5 (Southern Utah County) on the FM dial.

So make that a win-win-win.


June 28, 2018

The publishing industry in Japan

In the course of my Internet research about publishing costs in Japan, I collected three white papers and a Robert Whiting interview (all were posted for download on non-gated websites). Also recommended is mangaka Shuho Sato's tell-all retrospective about his own profession.

"An Introduction to Publishing in Japan" by the Japan Book Publishers Association: JBPA.pdf

"The Field of Japanese Publishing" by Brian Moeran: BrianMoeran.pdf

"The Japanese Way! Relationships between Authors and Publishers in the Context of Developing Works into Diverse Forms" by Tetsuro Daiki: TetsuroDaiki.pdf

"You've Gotta Have Wa If You Want to Get Published" by Robert Whiting: RobertWhiting.pdf

Manga Poverty by Shuho Sato (translated by Dan Luffey): Kindle ebook

The following are a live-action drama and three anime. Antiquarian Bookshop may be the coziest cozy mystery series ever. In the process, you'll learn a good deal about the used book trade in Japan. Shirobako is an "inside baseball" account about how an anime series is made.

Monthly Girls' Nozaki-kun and Bakuman feature teenagers who want to be professional mangaka when they grow up, a subject that constitutes its own genre. Bakuman in particular pays close attention to the technical details of the profession. It debuted on NHK Educational TV.

The Sakuga blog provides a good explanation of the "production committee."

Related posts

The proof is in the printing
The actual value of the written word


June 21, 2018

The proof is in the printing

A while back on the ANN website, Justin Sevakis asked, "Why Does Manga [printed in the U.S.] Turn Yellow?" That question raises the obvious antithesis: Why do Japanese tankoubon (manga published in perfect bound format) and paperbacks age so well?

A "light novel" (novella) I purchased back in 1989 for 360 yen ($3.25) has grayed and faded a bit but the paper remains pliable and the spine hasn't lost a bit of flexibility. Manga and paperbacks I ordered from Japan over a decade ago remain in near mint condition.

Despite a consignment system and resale price maintenance laws, paperbacks in Japan often cost much less than mass market paperbacks in the U.S. The Chihayafuru tankoubon I recently purchased are 429 yen each. Less than four dollars at the current exchange rate.

A 350-page short story collection by Fuyumi Ono is priced at 637 yen. That's about $5.75. The paper, full-color dust cover, and binding are comparable to the higher-grade "trade paperback" category. So what accounts for these differences in quality and cost? Shouldn't English-language publishers be able to leverage enormous economies of scale?

To start with, Japanese publishers don't dole out advances. Instead, they pay up-front at the time of the print run. Japanese publishers were essentially printing-on-demand before POD became a thing (though short print runs also mean that books can go out of print pretty fast).

According to Tetsuro Daiki, general manager of legal and licensing at Shogakukan (a major publisher), "The full sum [of royalties] is paid one month after the release of a book." And all those royalties go straight to the writer.

Publishing contracts in Japan are so standard that agents are rarely used (except when licensing foreign translations). This is in large part because the writer retains subsidiary rights by default. In the land of the doujinshi, Japanese publishers know that if you love something, you set it (sort of) free.

To be sure, when negotiating subsidiary rights, the publisher typically steps in as the agent, often with a seat on the "production committee." Again, as Tetsuro Daiki explains, "the authors as well as Shogakukan stand side by side in the contract negotiations." He believes, of course, this is for the best.

If authors try to keep all the [rights] to themselves and regard publishers as enemies, they [have] to confront all the odds single-handedly, leading to negligence of their essential creative activities. It is better if the authors devote themselves to writing, painting and creating new works, leaving business to publishers. This is the choice of the majority of authors in Japan.

The upshot is that publishers like Shogakukan can make available to their authors media formats (including manga, anime, periodicals, video games, television and theatrical adaptations, and even radio dramas on CD) rarely if ever offered to mid-listers in the English-speaking market.

For example, the Bakuman manga series (Shueisha) by Tsugumi Ohba and Takeshi Obata has been adapted to an anime series (NHK-Educational), video game (Bandai), novel (Shueisha), and a live-action film (Toho). The extensive cross-ownership inherent in the production committee system results in extensive cross-promotion and pooled risks.

Which is all well and good. But as bestselling manga artist Shuho Sato explains in Manga Poverty, his autobiographical exposé of publishing industry finances in Japan, the "average" mangaka can still spend years in the red and never earn enough to cover his out-of-pocket expenses.

The market for print magazines in Japan has contracted sharply over the past decade. Publishers regularly lose money on first serialization rights. Reading the writing on the wall, when Shuho Sato renegotiated with Shogakukan, he transferred the secondary rights to his own company.

Shuho Sato's story ends with him adopting a hybrid approach. Shogakukan prints and sells the paper product while he publishes electronically through his website and shares that platform with other mangaka. After all, he asks,

If you truly believe that [authors] should feel indebted to publishers for making [their books] sell, then doesn't it also make it the publisher's fault if they don't sell?

One of Sato's more interesting revelations is how much it costs to produce a perfect bound book in volume. He secured from an industry source a quote of 150 yen per copy on a print run of 50,000 units that included a 10 percent royalty based on a list price of 500 yen. (Remember that Japanese publishers pay out royalties at the time of the print run.)

Subtract the royalty payment and the unit cost falls under a dollar. This again raises questions about the costs of manufacturing perfect bound books on this side of the Pacific and what exactly all the "overhead" is paying for.
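Sato's quote is easy to sanity-check. Here is the arithmetic as a minimal Python sketch (the 150-yen quote, 10 percent royalty, and 500-yen list price are from Sato; the round-number exchange rate of 108 yen to the dollar is my own assumption):

```python
list_price = 500     # yen, list price per copy (from Sato's quote)
royalty_rate = 0.10  # 10 percent royalty, paid at the time of the print run
quoted_cost = 150    # yen per copy, quoted for a 50,000-unit print run

royalty = list_price * royalty_rate  # the author's cut per copy
unit_cost = quoted_cost - royalty    # what's left for actual manufacturing

yen_per_dollar = 108  # assumed exchange rate, not from Sato
print(f"Royalty per copy:   {royalty:.0f} yen")
print(f"Manufacturing cost: {unit_cost:.0f} yen "
      f"(about ${unit_cost / yen_per_dollar:.2f})")
```

Fifty yen of the quote goes to the author, leaving roughly 100 yen, or about 93 cents, to print, bind, and cover a perfect bound book.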

A safe prediction is that hybrid or self-publishing will become the predominant economic model for mid-list writers and artists capable of producing all their own IP by the sweat of their own brows. The future of "traditional" publishing may well be a return to its roots primarily as printers.

Related posts

The publishing industry in Japan
The actual value of the written word
