September 14, 2024

Spy x Sony CV-2000

A scene in episode five of Spy x Family makes use of a reel-to-reel video tape recorder sitting under the television in the living room. At first, I was sure it was an anachronism. But a little research revealed that the technology existed in the 1960s when the series takes place.

Sony introduced the CV-2000 video tape recorder (VTR) in 1965 as part of its home electronics line. At the time, the CV-2000 retailed for $695. Adjusted for inflation, that'd be about $7,000 today. I'm sure Loid put it on his expense account (which his handler complains about).
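
If you want to check the math, here's a quick back-of-the-envelope sketch. The CPI figures are rough annual averages I'm assuming for illustration, not official numbers.

# Rough inflation check on the CV-2000's 1965 sticker price.
# The CPI values below are approximate annual averages (assumptions).
CPI_1965 = 31.5
CPI_2024 = 314.0

price_1965 = 695.00
price_today = price_1965 * (CPI_2024 / CPI_1965)

print(f"${price_1965:,.0f} in 1965 is roughly ${price_today:,.0f} today")
# prints: $695 in 1965 is roughly $6,928 today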

In Japan, the initialism VTR is still used to refer to prerecorded video content on broadcast television (even though it's all digital by now).


January 24, 2024

Reframing the mainframe plot

I've ranted about this before, but the mainframe-as-antagonist (commanding an army of dumb terminal minions) was a well-worn meme sixty years ago when Captain Kirk was outwitting IBM System/360 lookalikes on a regular basis. It's so overdone by now you can't stick a fork in it. It's mush.

And yet Hollywood keeps serving it up. Because we keep chomping it down.

Conquering the galaxy since the 1950s.

Even the ending of Edge of Tomorrow (without a computer network in sight) is straight out of The Phantom Menace. And straight out of Oblivion, the previous Tom Cruise SF post-apocalyptic, blow-up-the-alien-mainframe actioner.

Making it an organic mainframe is a slight improvement but just as dumb. The whole "hive mind" thing needs to go too.

Speaking of organic mainframes, Star Wars fell back on the Evil Emperor trope, a linchpin apparently holding the whole universe together by his lonesome. How is never explained, but all the good guys have to do is knock out this one bad guy and peace and prosperity is restored to the galaxy.

Well, after they deal with the truly killer mainframe that is the Death Star. The whole Star Wars franchise ended up being about destroying Death Stars, each one a more ludicrous violation of the laws of physics than the last. Again, what long term problems this solves is never made clear.

Does the mail start arriving on time now? Does the tax code suddenly become more comprehensible? And what happens to the unemployment rate when all those Death Star jobs get instantaneously terminated? Imagine the size of the catering contract for just one of those behemoths.

Of course, destroying a single machine in a single place and winning the war everywhere makes for easy denouements. But if the Earth is ever attacked by malevolent aliens who know how to implement autonomous distributed network technology, we are so screwed.

That aside, though, what do the aliens hope to accomplish by attacking Earth? Or attacking any inhabited planet? (Besides giving the director an excuse to restage the Battle of Britain or the Invasion of Normandy.)

If they wanted to wipe out the humans along with the infrastructure—the whole objective of the Independence Day aliens—there'd be no need to get anywhere near the planet's surface, as Heinlein pointed out back in 1966 with The Moon is a Harsh Mistress. An asteroid makes for a handy ICBM.

There are lots of big asteroids out there.

Another reason is that they want our water. But there's plenty of water elsewhere that isn't sitting at the bottom of a deep gravity well. Europa, for starters.

Then there's the "To Serve Man" plot device. But Homo sapiens is a lousy food and energy source (The Matrix is dumber than dirt in this regard). That's why so few people get eaten by sharks (surprisingly few!).

Besides, a blown-up country is a huge resource sink. Hence the Marshall Plan. By 1950, SCAP was already regretting Article 9 in the 1947 Japanese Constitution (forbidding war) and was revving up Japanese industry to support the Korean War, which was just what the economy needed.

In The Phantom Menace, Lucas tosses the politics of trade into the picture, but without explaining what is being traded, why, or how. The result is a blur of handwaving when it comes to the story because there are no underlying rational reasons for anything that happens.

The economic model of the Star Wars universe makes no more sense than the socialist utopianism of Star Trek, which finally gave us the robber baron Ferengi to make things interesting.

Still, Lucas was onto something. The unequal treaties imposed on Japan and China by the U.S. and European powers in the mid-19th century led to the Boxer Rebellion in China and propelled Japan into a regional arms race in order to even the scales. Lots of dramatic conflict there.

The thing is, China and Japan had stuff the foreign powers wanted, stuff as trivial (to our modern eyes) as tea. But like spice in Dune, there were underlying economic causes behind the conflicts. And at the time, a bad trade deal was a better deal for both sides than smash and grab.

And so we're back to the Lebensraum ("living space") ideology promulgated by Germany in the 1930s. (The Nazi bad guy connection certainly doesn't hurt.) The Japanese equivalent was used to justify the annexation of Korea and Manchuria.

Both Germany and Japan were doing rather well at expanding their territories (employing their own "unequal treaty" tactics) before they started actually invading their neighbors, after which everything went downhill fast.

So we'll assume our invading aliens are smart enough not to turn the whole thing into a scorched-earth shooting war. The problem is how to make that interesting.

A good place to start is Ryomaden, which describes the opening of Japan in the mid-19th century, the shock to the system, the unequal treaties, the escalating civil strife, finally resolved by a quickly-concluded civil war that launched Japan on a burning quest to surpass the west.

If gunboat melodrama is what you want, (bad) diplomacy seems pretty good at supplying the necessary Sturm und Drang motivations. Kudos to Guardians of the Galaxy on this score.

The problem is the time frame required by real politics. Summing up two decades of geopolitics in two hours would be tough. I suppose it really is simpler to just have Tom Cruise blow up the mainframe.


December 02, 2023

The last picture tube show

When I was a kid, a television was a hulking appliance that ran on a forest of vacuum tubes and produced as much heat as a wood stove. Even after the transition to transistors, the cathode ray tube (CRT) lingered behind as a living dinosaur. Like the internal combustion engine, the miracle of this Rube Goldberg contraption is that it works at all.

The CRT was the last true vacuum tube—a filament, cathode, grid and anode sealed inside of glass and depleted of air—left in consumer electronics. For decades after transistors took over, a television set had two vacuum tubes: the CRT and the high-voltage rectifier that charged the anode. The latter was long ago replaced by silicon devices.


We now live in a solid state world. HD flat panel displays are par for the course and Moore's law rules the roost. But while there will always be a need for speed at the high end, Intel's budget N100 is fast enough at the low end. We've reached a performance plateau where the only thing holding back a Windows upgrade is the UEFI requirement.

Going forward, the ability to squeeze the guts of not just computers but most ordinary electronic devices onto inexpensive SoCs will have transformative effects on the costs and capabilities of consumer electronics.

Consider how a twenty-dollar Roku runs off a 64-bit ARM CPU, and how you can get an AM/FM/clock radio on a five-dollar SoC (a lot less if purchased in quantity). Even more amazing (to me, at least) is that all of the key electronics in an old-school CRT television can be handled by a single chip. Yes, somewhere in China, dinosaurs still roam the Earth.

The vacuum tube is dead! Long live the CRT!

Related posts

Complex simplicity
The old brand new
HDTV on the cheap
The drama of the PCB


September 09, 2023

The Dial Comes to Town

The best technical support video ever. I'm so ancient that I grew up with a dial telephone. And those massive telephone books. This was an era when AT&T owned the entire system from end to end, including the telephone. Touch-Tone (a registered trademark) debuted in November 1963.

The AT&T monopoly (also known as "Ma Bell," after Bell Telephone founded by Alexander Graham Bell) was broken up in 1984 into seven regional "Baby Bells." I was in college at the time, and one of the first manifestations of the break-up was the proliferation of cheap Touch-Tone phones.

Those wall-sized racks of electromechanical switches make the geek in me smile. Today, the equipment that filled entire buildings would fit into a small closet. But this was the cutting edge of computing in 1940. And why the invention of the transistor at Bell Labs in 1947 changed everything.


Not only has the dial telephone gone the way of the dinosaurs, but the landline (also known affectionately as "POTS" or "plain old telephone service") is fast on their heels. Today, only two percent of households in the United States rely solely on a hardwired connection to place a phone call.

The question going forward is how fast fiber will replace the now "traditional" coaxial cable connection. And when and if wireless will replace everything else.


July 13, 2022

Speeding up the Slimline

My main machine is a bargain basement HP Slimline 290 with 4 GB RAM and a Celeron G4900 CPU. Less than $300 at Walmart. Thanks to the UEFI BIOS, GPT, and a dual-core CPU, it actually qualifies to run Windows 11! But I'll put that off until Windows 10 reaches end-of-life in another three years.

Doubling the RAM to 8 GB and adding a 500 GB SSD costs less than $100. Budget-wise, it's a no-brainer. Except I mostly run Chrome, Word, Notepad++, and JWPce (a Japanese text editor). With the unnecessary screen effects and background apps turned off, Windows 10 is surprisingly snappy.

(The first half of this video explains how to lighten the load with the standard Windows settings. The second half using the recommended utility didn't make as big a difference, perhaps because I'd already disabled many of the startup processes.)

The computer only bogs down noticeably when editing high-resolution cover art in Paint Shop Pro 2019. I don't do that very often, so I wasn't in a hurry.

But then the system battery on the motherboard died, which resulted in a Groundhog Day moment. One day when booting up, I glanced at the login screen and said to myself, "Huh. I thought today was Friday." A minute later, the OS pinged the timeserver and it was indeed Friday.
With the battery dead, the clock no longer keeps time while the machine is shut down, so it boots up showing the date of the last shutdown. The documentation gives the battery a lifespan of three years, so it failed right on schedule. Upgrading the RAM and installing an NVMe M.2 SSD is no more difficult than replacing the battery. This was as good an excuse as any.

YouTube comes in handy for jobs like this. HP has a how-to guide and David Noble did a how-to on the same model as mine. The one major obstacle: the drive cage has to be removed to access the battery, RAM, and M.2 slots. The only unexpected variable here turned out to be the three Torx T15 screws.
I could have wrestled them off with a pair of pliers, but at times like this, the Tim "The Toolman" Taylor gene kicks in. Home Depot has a Klein 4-in-1 Torx screwdriver for ten bucks. Who knows, it may come in handy again one day.

As it turns out, Torx screws actually are easier to work with than Phillips ("plus") screwheads. Though I'd recommend adding a little piece of tape to the Klein driver to keep the bit from falling out of the holder.

Not counting dropping stuff (I forgot about the WiFi keyboard dongle and it popped off too), the whole job took less than 20 minutes. Honestly, the hardest part was replacing the battery. The RAM snapped right in as did the SSD. I did need this M.2 screw kit as one doesn't come pre-installed.
I put everything back together, crossed my fingers, and booted to BIOS. The BIOS reported 4 GB of RAM in both banks for a total of 8 GB and the 500 GB Samsung 980 SSD in the PCIe M.2 slot. Much easier than I expected.

Related posts

From XP to X
The state of the solid state
New and improved benchmarks


July 06, 2022

The state of the solid state

Since the dawn of the PC era, the easiest way to give a consumer PC a boost (short of swapping in a clock-doubled CPU, as I did with my old Windows 95 machine) has been to add more RAM and a hard disk drive (HDD).

Though the technological world has completely changed in the past 40 years, that is still the case, except that the HDD upgrade is now a solid-state drive (SSD).

The first IBM PC shipped with 64K of RAM and two full-height floppy disk drives, which ran at the blazing speed of "slow as mud." When buffering a print job, the floppy drives in my Kaypro II sounded like a washing machine in spin cycle.
The IBM XT released in 1983 was equipped with 128K of RAM and a 10 MB HDD. The 4.77 MHz 8088 CPU was the same, but the hard drive made a big difference. You could upgrade to 640KB of RAM but there weren't many options if you wanted a bigger HDD.

Unless you were willing to spend several boatloads of money. The 18 September 1984 issue of PC Magazine featured a 350 MB external hard disk system that could be yours for a mere $14,900, or roughly $43/MB.

That was actually a good deal. Two years earlier, Corona advertised a 10 MB HDD for $2495. By the end of the 1980s, the typical consumer HDD was 30 MB and prices had fallen to $10/MB, a respectable though linear decline in costs.

But in the decade that followed, something astounding happened. The capacity of the typical consumer hard disk drive rose to 20 GB while the cost fell a full three orders of magnitude to $.01/MB. That's a factor of 1000.

And then it happened again! A decade after that, 1 TB consumer hard drives were commonplace at $.0001/MB, two more orders of magnitude.
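
Put another way, you can take the price points quoted above, convert them to dollars per megabyte, and count the orders of magnitude between them. A quick sketch (all figures approximate, straight from the paragraphs above):

import math

# Dollars per megabyte at the price points quoted above (all approximate)
cost_per_mb = {
    1984: 14900 / 350,   # 350 MB system for $14,900
    1990: 10.0,          # ~30 MB consumer drives at roughly $10/MB
    2000: 0.01,          # ~20 GB drives at roughly $0.01/MB
    2010: 0.0001,        # ~1 TB drives at roughly $0.0001/MB
}

years = sorted(cost_per_mb)
for prev, curr in zip(years, years[1:]):
    drop = math.log10(cost_per_mb[prev] / cost_per_mb[curr])
    print(f"{prev} -> {curr}: {drop:.1f} orders of magnitude cheaper")
# prints roughly 0.6, then 3.0, then 2.0 orders of magnitude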

What happened was the discovery of giant magnetoresistance (GMR) in 1988 by Albert Fert and Peter Grünberg, for which they won the 2007 Nobel Prize in Physics. Their work led to the application of what became known as spintronics to HDD technology.

IBM began manufacturing MR HDD read/write heads in 1990 and GMR read/write heads in 1997.

As with CPU clock speeds, the "spinning rust" of the HDD is reaching its practical limits as a low-cost consumer technology. Over the last ten years, HDD prices have fallen only half an order of magnitude, stabilizing at about $.04/GB. About the cost of assembly.
I imagine that without the GMR revolution, the SSD would have evolved much faster. Like the internal combustion engine and the cathode ray tube, the amazing thing about the HDD is that it works at all, let alone as well as it does.

Even so, next-generation heat-assisted magnetic recording (HAMR) and microwave-assisted magnetic recording (MAMR) hard drive technologies are being rolled out, guaranteeing the HDD will live on in data centers and the cloud.

As Jeffrey Burt puts it, the hard drive is the Mark Twain of technology. "Reports of its death are greatly exaggerated."

A comparable SSD costs around $.08/GB, about twice that of a HDD but still dirt cheap. So while the SSD is standard in portable devices, slapping a 500 GB HDD into a low-end PC like mine is still an easy way to increase the profit margin.
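
To put the profit-margin point in dollar terms, here's a minimal sketch using the rough per-gigabyte figures above; the absolute difference on a 500 GB boot drive is only about twenty dollars.

# At the per-gigabyte figures quoted above, the bill-of-materials gap on a
# 500 GB boot drive is small in absolute terms (both prices are rough).
HDD_PER_GB = 0.04
SSD_PER_GB = 0.08
capacity_gb = 500

hdd_cost = capacity_gb * HDD_PER_GB   # about $20
ssd_cost = capacity_gb * SSD_PER_GB   # about $40

print(f"500 GB HDD: ~${hdd_cost:.0f}, 500 GB SSD: ~${ssd_cost:.0f}, "
      f"difference: ~${ssd_cost - hdd_cost:.0f}")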

Then again, I recall the day I walked into Walmart and all the tube TVs were gone. Microsoft reportedly wants to hurry the process along and is pushing manufacturers to install an SSD as the boot drive in all PCs starting in 2023.

The day soon will come when, aside from the fan, the only moving parts left in the humble PC will be the DVD or Blu-Ray drive, until they too are relegated to the same niche as turntables and vacuum tube electronics.

Related posts

From XP to X
Speeding up the Slimline
New and improved benchmarks
Back to the digital future
The last picture tube show


October 10, 2020

Back to the AT future

I don't get nostalgic about high school or college. For me, it's the decade from 1985 to 1995, the heyday of the personal computer. By 1985, the clone wars were over (and IBM lost). By 1995, the GUI wars were over (and IBM lost again).

During those ten years when the PC came of age, the computer was truly personal. You simply couldn't live a life online at a few thousand baud. Even in 1995, soon-to-be online colossus AOL only had around three million active users.

Ah, an era now gone for good. For a stroll down memory lane, it's enough for me to browse through old issues of PC Magazine. But then there are those dedicated Dr. Frankensteins devoted to bringing hardware long thought dead back to life.

As his YouTube channel title suggests, 8-Bit Guy focuses more on the Stone Age. Clint Basinger takes us up to the Medieval Period. In the two videos below, he unboxes and upgrades an IBM AT from 1988, still sealed in its original packaging.



As Crocodile Dundee would put it (there's another 1980s reference for you), "Now that's a PC."

Clint Basinger paid $500 for the AT on eBay. The IBM AT cost five times that in 1988, ten times as much when adjusted for inflation. By contrast, a thirty dollar Roku today has orders of magnitude more memory and computing power.
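
For the curious, the arithmetic works out like this (the CPI values are ballpark assumptions on my part, not official figures):

# Rough check on the AT's price tag, using ballpark CPI values (assumptions).
CPI_1988 = 118.3
CPI_2020 = 258.8

ebay_price = 500                                   # what Basinger paid in 2020
at_price_1988 = 5 * ebay_price                     # "five times that in 1988"
at_price_today = at_price_1988 * (CPI_2020 / CPI_1988)

print(f"${at_price_1988:,} in 1988 is roughly ${at_price_today:,.0f} today, "
      f"about {at_price_today / ebay_price:.0f} times the eBay price")
# prints: $2,500 in 1988 is roughly $5,469 today, about 11 times the eBay price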

In the following episode, Basinger plays Indiana Jones exploring a warehouse crammed full of computer equipment dating back to the 1970s.


Call it the excavation of a PC Pompeii, a look back at a once thriving past now relegated to landfills, museums, and our memories.

Related posts

The future that wasn't
The accidental standard
MS-DOS at 30
The grandfathers of DOS


July 18, 2019

From XP to X (benchmarks)

I recently (literally) stuck my ThinkPad T42 laptop on the shelf and upgraded to a low-end HP 290-p0043w desktop PC. I continue my review with two pleasant unboxing surprises.

Some chassis guides I reviewed prior to purchase suggested that the HP Slimline 290-p0043w had an external power brick. It came with an internal power supply. HP's own product specs list six USB ports. It has eight. I suspect that some of the spec sheets for the Slimline weren't updated from the nearly identical Celeron G3930 model.

The HP 290-p0043w sports a Celeron G4900 under the hood, the Toyota Corolla of CPUs. It does what it has to do as long as you don't ask it to tow a boat.

First off, I went through Add/Remove Programs and got rid of everything I didn't want and didn't need, including the McAfee trial version software. As I said, a Toyota Corolla runs fine as long as you're not trying to tow a boat, and one such boat is a heavy-duty antivirus program. I rely on Windows Defender and uBlock and scan all downloads with Jotti.

Late model Celeron processors approach earlier Core i3 benchmarks (newer i3s match older i5s). The technological improvements are reflected in the benchmarks. With one dramatic exception, there's about a fifteen fold improvement in performance at the hardware level, and that's comparing what was a mid-range business laptop with a very basic system.

Prime95 is a freeware app that searches for Mersenne prime numbers. It includes a benchmark function based on running batches of Fast Fourier Transforms. It runs in Windows XP, making possible an apples-to-apples comparison. As you can see from the following samples, Prime95 has the Celeron G4900 running around 15 times faster than the Pentium M.
Intel Pentium M @ 1.70 GHz 1 core
Timings for 2048K FFT length 179.35 ms @ 5.58 iter/sec.
Timings for 4096K FFT length 376.32 ms @ 2.66 iter/sec.
Timings for 8192K FFT length 708.85 ms @ 1.41 iter/sec.

Intel Celeron G4900 @ 3.10 GHz 2 cores
Timings for 2048K FFT length 12.00 ms @ 83.33 iter/sec.
Timings for 4096K FFT length 23.67 ms @ 42.24 iter/sec.
Timings for 8192K FFT length 51.66 ms @ 19.36 iter/sec.
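
To put a number on that "around 15 times faster" claim, here's a quick sketch that recomputes the iterations per second and the speedup from the timings above:

# FFT length (in K) -> timing in milliseconds, taken from the output above
pentium_m = {2048: 179.35, 4096: 376.32, 8192: 708.85}
celeron   = {2048: 12.00,  4096: 23.67,  8192: 51.66}

for fft_k in sorted(pentium_m):
    old_ms, new_ms = pentium_m[fft_k], celeron[fft_k]
    print(f"{fft_k}K FFT: {1000 / old_ms:.2f} -> {1000 / new_ms:.2f} iter/sec, "
          f"speedup {old_ms / new_ms:.1f}x")
# prints speedups of roughly 14.9x, 15.9x, and 13.7x
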
The Pentium M has a Passmark CPU benchmark of 414, versus 3262 for the Celeron G4900. DDR4 RAM and the PCI Express bus run about twenty times faster. But perhaps the most dramatic changes are in the GPU.

The ATI Mobility Radeon 7500 in the ThinkPad T42 has a G3D benchmark of 4. That's four. The onboard Intel UHD Graphics 610 has a G3D benchmark of 784, a 200 fold improvement in performance for a low-end integrated GPU. This revolution in GPU design is why a $30 Roku Express can output 1080p HD video. For ten dollars more, the Roku Premiere handles 4K video.

Wi-Fi had only reached the 802.11g standard when my old ThinkPad shipped, giving me a maximum download speed of 17 Mb/s. The 802.11n Wi-Fi in my Fire tablet tops out at 44 Mb/s. The HP Slimline delivers twice that. Unfortunately, upload speeds improved only 10 to 20 percent, but that's on Comcast. At least I'm getting the download speeds I'm paying for.

Someday I'll get around to doubling the RAM and installing an SSD (both for less than $100).

The mouse that ships with the HP is pretty good. The keyboard is meh. It's a full-sized keyboard in a workspace built for a laptop so it doesn't really fit. I replaced it with a Logitech K360. The K360 combines the number pad and cursor keys, saving four inches in width. It's wireless, eliminating a set of cables. It has a unifying receiver so I could add a mouse later.

I use SharpKeys to reassign the Caps Lock key to Ctrl and Scroll Lock to Caps Lock, and AutoHotkey to map a bunch of keyboard macros. It's been fairly easy to approximate the look and feel of XP without using one of those Start Menu apps. In fact, having gotten rid of the live tiles and populated the Taskbar with my shortcuts, I've grown to like the Windows 10 UI.

In any case, OneDrive integration makes the upgrade very much worth it. OneDrive installs with 5 GB of free storage, which is more than enough to back up my critical files without having to think about it.

Related posts

From XP to X (hardware)
From XP to X (software)


January 31, 2019

Apps are where it's at (7/7)

After repeated failed attempts, an industry giant debuted a completely revamped operating system. The tech press was impressed across the board. With true multitasking, higher screen resolutions, expanded storage, and an innovative graphical user interface, this new OS, wrote Brad Molen, was "precisely what we wanted to see in the first place."

The new OS had only one—fatal—weakness: a lack of native apps. Concluded Molen,

It is going to take a fair amount of time for developers to push out enough earth-shaking apps to persuade the typical user that has already heavily invested in their ecosystem of choice. An OS is only as strong as its ecosystem, it's been an ongoing struggle to sell the platform to developers and attract popular titles.

A succinct summary of the challenges IBM faced trying to sell OS/2 in a world dominated by DOS and Microsoft Windows. Except the above excerpts are from a 2012 article in Engadget. Brad Molen is writing about the debut of Windows Phone 8. The struggle wasn't IBM's but Microsoft's. Microsoft came late to the market with a technologically competitive product but failed.


In the words of the philosopher Friedrich Hegel, "The only thing we learn from history is that we learn nothing from history."

As Jim Seymour explained in his 8 December 1992 PC Magazine column, "No one buys a PC to run an operating system. Good applications sell PC hardware; good applications sell operating systems. Every single dazzling PC program I can think of right now is a Windows app."

The "accidental" origins of the personal computer caused much of the confusion about the primacy of the hardware or the operating system or the software that runs on it. The hardware specs of the first IBM PC were pedestrian even by the standards of the time. In his 15 March and 26 April PC Magazine columns, John Dvorak recalled that

to move from a red-hot CP/M machine to an early IBM PC was a step backward. None of the purveyors of 8-bit microcomputers saw the utility in the IBM PC. Theirs was a robust and mature industry and the 16-bit upstarts had nothing to offer. The IBM, though, had more potential. None of the potential was apparent to the CPM-ers.

That was true of my father's Epson QX-10 CP/M machine. Dvorak correctly pinpoints the potential of the 16-bit PC and the standard-setting status of IBM as contributing to its success. But he drew the wrong lesson for the future, that "all we need is a platform that the core influencers all agree on and we're off to the races."

William Zachmann conceded in the 10 September 1991 issue of PC Magazine that "nobody is going to buy new hardware systems or new operating systems if there is no software for them." But then he jumped to the same wrong conclusion.

It's the fundamental capabilities of a new platform—not applications—that determine its success or failure. If the platform has "the right stuff" it will succeed even if applications vendors are initially slow to develop for it.

The widely cited proof of this thesis is the Macintosh. Dvorak argued in his 29 May 1990 column that just as the IBM PC "had virtually no software when it arrived on the scene, the Macintosh also arrived with nothing but a word processor, an operating system, and a paint program." He thus concluded that "initial massive software support" was not important.

In fact, the Macintosh proved the opposite. It debuted with no development tools native to the platform. After a year on the market, it had a quarter of the applications that the primitive IBM PC had a year after its release. Apple only survived because the other Steve—Wozniak—returned to reboot the Apple II line that was keeping the company in the black.

Rather, the lesson is that if you are going to establish a new standard in a world not looking for one, you need a lot of patience and a positive cash flow. For his NeXT project, Steve Jobs had more of the former and less of the latter. The NeXT line of computers failed to garner any market share, and the NeXT OS survived only by being acquired by Apple.

Without an extensive library of software, NeXT never extended its market beyond a handful of vertical applications. Steve Jobs learned his lesson. Soon after returning to Apple, he buried the hatchet with Bill Gates. Along with a $150 million investment, Gates promised ongoing development of the hugely popular Microsoft Office suite for the Mac.

In his 29 October 1991 PC Magazine column, Michael Miller provided a better rule of thumb:

In order to be successful, a new operating system has to be both necessary and sufficient: necessary in the sense that it must give computer users a compelling reason to switch; and sufficient in that it must have enough functionality to do all of the things that a computer user would want to do.

In other words, the 8-bit computers of the late 1970s were better than nothing, viewed favorably only in terms of the very low expectations of personal computer users at the time. But by the late 1980s, 16-bit DOS applications like Lotus 1-2-3 and WordPerfect had feature sets more complete than most consumers, even today, will ever use.

Keeping the market alive meant selling consumers software solutions they could only imagine they needed, and often, as in the case of the graphical user interface, were sure they didn't.

To coax that customer to make that leap, Bill Gates resolved to maintain backwards compatibility at the cost of sleekness and simplicity. That ruled out the elegant "clean break" that Steve Jobs championed, also making it a more technologically challenging task. As Charles Petzold explained in the 12 September 1989 issue of PC Magazine,

A GUI that can potentially support every graphics video display and every graphics printer ever made for the PC is going to be more complex than one that needs to support only one video display and two printers, as was the case with the original Mac. I guarantee you, if Apple had put complete Apple II compatibility into the Mac, it would have lost a lot of its simplicity.

Microsoft's clunkier but open architecture solution pushed Windows forward on all fronts while generating the necessary cash flow from its legacy operating systems and applications. IBM tried to duplicate this model by building DOS and Windows support into OS/2. But if DOS and Windows were already good enough, why spend more to switch?

In his 31 October 1989 column about the downsides of RISC architecture (which also was supposedly going to conquer the PC world), OS/2 stalwart William Zachmann inadvertently explained why OS/2 wouldn't succeed either.

For users, the costs of moving from the Intel x86 family to an incompatible RISC-based microprocessor architecture, which would require new versions of every bit of software, are very steep. Users aren't going to make the move without a compelling reason, which RISC alone doesn't provide.

Bingo. OS/2 didn't provide a compelling reason to make the move. The default position for anyone not seeking an IBM-branded solution was to keep using DOS and Windows while waiting for Microsoft to slowly evolve its product line. Which is exactly what the rest of the computing world did.

Jim Seymour observed in his 11 June 1991 column (and the same could be said about WordPerfect's OS/2 efforts), "Lotus spent a fortune developing 1-2-3/G for OS/2. It was released—and almost disappeared. No one was using OS/2 so no one cared about apps for it. You've gotta have DOS and Windows versions of your programs."

Perhaps nothing drove the point home more decisively than an article by Christopher Barr in the 12 May 1992 PC Magazine. Titled "Waiting for Godot," it summarized a report from the Software Publishing Association, according to which "OS/2 applications accounted for .03 percent of the market" in 1991. Not 3 percent. That's 3/100 of 1 percent.

The release of OS/2 2.1 (with Windows 3.1 compatibility) in the summer of 1993 finally propelled it onto the bestseller chart, debuting at number five in the 14 September 1993 issue. IBM had additionally come to its marketing senses and sold OS/2 at retail and via mail order in the same price range as Windows.

With IBM claiming to be moving 300,000 copies per month, Bill Machrone noted that "if OS/2 were anything other than an operating system, it would be a runaway bestseller." By contrast, in 1993 alone, MS-DOS 6 shipped a combined 5 million upgrades and 10 million OEM installs.

Two weeks later, OS/2 2.1 rose to number four, except those were upgrades from older versions. At number one was Windows 3.1, and those were new installations (everybody with Windows 3.0 had already upgraded). The 12 October 1993 chart showed Windows 3.1 slipping a notch to second place. OS/2 2.1 fell completely off the chart.

In one of those signs of the times, in the July 1993 issue, Charles Petzold switched the focus of his Environments column from OS/2 to Windows NT, calling Microsoft's new preemptive multitasking 32-bit operating system "what OS/2 should have been in the first place."

Over the next five years, new releases of OS/2—such as OS/2 for Windows (ironically) and OS/2 Warp—propelled it onto the bestseller chart for several weeks at a time until it vanished once again, while DOS and Windows upgrades and Microsoft Office applications dominated the chart issue after issue.

To see where things were headed, consider the bestseller chart following the release of Windows 95 in August of 1995.

Microsoft released Windows NT shortly after OS/2 2.1, though not as a consumer product. NT was a high-end workstation and server OS with far more functionality than OS/2. NT ran Win32 apps natively, while Windows 3.1 could run a subset of them through the Win32s extension. OS/2 2.x ran Windows code licensed from Microsoft, a license that expired at the end of 1993.

As a result, OS/2 for Windows ran a separate copy of Windows in a virtual DOS machine, a cumbersome solution. In the run-up to Windows 95, IBM at first promoted OS/2 Warp, but ended up licensing Windows 95 on the same terms as Compaq. "IBM officials conceded that OS/2 would not have been a viable operating system to keep them in the PC business."

And so Michael Miller had correctly concluded in his 28 September 1993 column that

the desktop operating system standard for the next 12 to 18 months will be DOS and Windows 3.1. That's because I've become convinced that we've all understated the importance of compatibility. It's been clear for a long time that for an environment to work, we need great applications that work under it.

With so many applications available on the Windows platform, Microsoft eventually became a victim of its own success. Old luddites like me could put off upgrading their computers because what they already had was "good enough."

With an 85 percent market share, Microsoft still owns the desktop. But to gain a foothold in the portable environment dominated by iOS and Android, Microsoft has to make Windows software platform-agnostic. Instead of "Windows Everywhere," Microsoft is moving its software to the cloud and providing "Microsoft Services Everywhere."

And once again, against the odds, Microsoft appears to be succeeding. Abandoning Internet Explorer and adopting a Chromium-based browser is one more step along that path. Because no matter how they are delivered to whatever screen the user is using, the apps are where it's at.

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (2/7)
The future that wasn't (3/7)
The future that wasn't (4/7)
The future that wasn't (5/7)
The future that wasn't (6/7)

The accidental standard
The grandfathers of DOS


January 17, 2019

The old brand new

AT&T CEO John Donovan recently announced that, going forward, DirecTV would be transitioning from satellite transmission to streaming technology for content distribution. In other words, depending less on outer space and more on wires hanging from telephone poles.

To be sure, in large swaths of the United States and the world, there are still no viable alternatives to satellite content delivery. But like a medieval circle of fate, technology is always circling around to where it began. The old becomes brand new again.

In terms of the large-scale infrastructure, the communications satellite was a simpler solution than the microwave relay stations that once dotted the land. In turn, those relay stations were a vast improvement over the copper wire telephone circuits they replaced.

Fiber optic cable wiped out the microwave towers and may soon do in the communications satellites.

Like the transistor, vacuum tube electronics, and the internal combustion engine, the amazing thing about television satellite service is that it works at all, let alone that it can be mass-produced as a consumer good.

A communications satellite orbits 22,236 miles above the equator, a tenth of the way to the Moon. And yet it beams a signal down to the Earth's surface that can be scooped up with an eighteen-inch dish on your roof and decompressed into 500 channels.

When I first got Dish, I was impressed at how "clean" the picture was. Completely static free. These days, it's ho-hum compared to free over-the-air HDTV.

OTA HDTV breathed new life into the old UHF broadcast spectrum. 5G networks promise to steal that precious "last mile" connection to the home away from fiber and cable.

Google's foray into the home Internet business ran into the buzz saw of regulatory capture, which lets cable cartels box out the competition. So Microsoft is going wireless instead, much as the smartphone leapfrogged the landline in the developing world.

The Microsoft Airband Initiative launched in July 2017 with the goal of working with partners to make broadband available to 2 million Americans in rural communities who lack access today and to help catalyze an ecosystem to connect millions more.

Radio really is all the rage these days. Smartphones are just smart radios operating at UHF frequencies. That microwave relay technology that got passed over by the telecommunications satellite and then buried by fiber optics? It didn't go away. It mutated.

Back in 2016, Ars Technica reported that some of those old microwave towers are being repurposed to augment fiber optic networks. Because it's cheaper than laying brand new fiber and because radio signals move through the air faster than light through fiber.
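
The physics is simple enough to sketch. Light in glass travels at roughly two-thirds of its vacuum speed, so over a long enough haul the microwave path wins on propagation delay. The distance and refractive index below are illustrative values, not measurements from any particular network:

# One-way propagation delay over a 1,000 km path: microwave relay vs. fiber.
# The distance and the fiber refractive index (~1.47) are illustrative values.
C = 299_792.458      # speed of light in vacuum, km/s
FIBER_INDEX = 1.47   # light in glass travels at roughly c / 1.47

distance_km = 1000
microwave_ms = distance_km / C * 1000            # air is close to vacuum speed
fiber_ms = distance_km * FIBER_INDEX / C * 1000

print(f"microwave: {microwave_ms:.1f} ms, fiber: {fiber_ms:.1f} ms")
# prints: microwave: 3.3 ms, fiber: 4.9 ms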

And let's not too hastily write off satellites either. Elon Musk plans to tackle the latency problem of satellite-based Internet service with a swarm of satellites in low Earth orbit (such that at end-of-life they'll simply burn up in the atmosphere).
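
A similar sketch shows why altitude is the whole latency problem: the round trip through a geostationary bird adds nearly half a second of pure propagation delay before any routing even happens, while a low-orbit constellation adds only a few milliseconds. The LEO altitude below is my assumption, in the Starlink ballpark:

# Best-case round-trip propagation delay through a relay satellite:
# up, down, and back again (four legs), ignoring routing and processing.
C_KM_S = 299_792.458
GEO_ALTITUDE_KM = 35_786   # geostationary orbit, the 22,236 miles mentioned earlier
LEO_ALTITUDE_KM = 550      # assumed low-Earth-orbit altitude, Starlink ballpark

def round_trip_ms(altitude_km: float) -> float:
    return 4 * altitude_km / C_KM_S * 1000

print(f"GEO: ~{round_trip_ms(GEO_ALTITUDE_KM):.0f} ms, "
      f"LEO: ~{round_trip_ms(LEO_ALTITUDE_KM):.1f} ms")
# prints: GEO: ~477 ms, LEO: ~7.3 ms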

Every time you turn around, another moribund technology is "not dead yet." The solid-state disc drive should have sent old-fashioned "spinning rust" into retirement. Except every time it's knocked to the canvas, the hard disk drive staggers back to its feet like Rocky Balboa.


For example, Seagate has successfully prototyped a 16TB HDD using HAMR (Heat Assisted Magnetic Recording). The heat comes from a laser diode attached to the read/write head. Western Digital answered that challenge with a 16TB MAMR HDD (Microwave Assisted Magnetic Recording).

In the steampunk space opera future I like to imagine, the only way to build a faster-than-light starship engine will be with old-fashioned vacuum tubes and analog circuitry. And thus technology from the 1930s will end up being the most modern thing ever.


December 20, 2018

The true believer (6/7)

In Silicon Valley, an "evangelist" shares many characteristics with his religious counterpart, preaching the good word of the new technological doctrine while demonstrating an unflagging faith in the utopia sure to come if only all within earshot would convert to the cause.

PC Magazine columnist William Zachmann was one such evangelist, zealously devoted to the gospel of OS/2 as the one true software sect in the church of the x86 PC.

During the 1980s he was hardly a lone voice crying in the wilderness. Microsoft and IBM were working hard to make OS/2 the 32-bit multitasking operating system that would replace MS-DOS. In the 15 March 1988 issue of PC Magazine, Gus Venditto reported that while Microsoft CEO Bill Gates did not expect DOS

to give way to OS/2 for many years more, he outlined a timeline in which 15 percent of new PCs were running OS/2 in 1989, growing to 50 percent by 1991.

One can hardly fault Zachmann for agreeing with Bill Gates.

A month later, launching a new column dedicated to programming for OS/2, Charles Petzold predicted that "everybody currently using DOS on an 80286 or 80386-based machine will eventually consider upgrading to OS/2" for the simple reason that "Microsoft expects OS/2 to establish the foundations of PC operating systems for the next decade."

But a year later, convictions began to waver. In the 28 March 1989 issue, Gus Venditto counted up "2 million copies of Windows sold to date." In his 11 April 1989 column, Jim Seymour concluded that, a year after the introduction of the PS/2 and OS/2,

DOS is livelier than ever. And the original PC bus and especially the PC AT bus are robust and dominant in the market. [As for OS/2], it is mired in high costs and little value for the PC user—a fatal pairing.

But William Zachmann stuck to his guns, claiming in his 28 November 1989 column that the upcoming release of Windows 3.0 would "make the transition from Windows to OS/2 easier. The future lies with OS/2. And it is just around the bend." Then in the 16 January 1990 issue he predicted that

OS/2 will take off. By the end of 1990, many more users will be running OS/2 than most pundits predict. The speed of its acceptance is just as surely underestimated today as it was overestimated in 1987. Windows is strictly a transition product. Whatever Windows does, OS/2 will do better.

Released in May 1990, Windows 3.0 immediately raced to the top of the bestseller charts. In his August 1990 preview of the OS, Gus Venditto concluded that "A funny thing happened on the road to OS/2. Microsoft Windows has turned into the dazzling multitasking operating system that OS/2 is still struggling to become."

Countered Zachmann in his 25 September 1990 column, "Windows 3.0 will light the way to OS/2, not eclipse it. And that's really what Microsoft always wanted."

What Microsoft wanted at that point was to dump the whole OS/2 mess back in IBM's lap. The pair of September 1990 press releases hinting at but not directly announcing the breakup between the two companies had turned into a Rorschach test.

In the last issue of the year, Zachmann had to admit that the "smashing success of Windows 3.0 rolled mercilessly over my prediction [that OS/2 would take off in 1990]." And yet he remained convinced that "The renewed cooperation between IBM and Microsoft announced in September should help pave the way for OS/2."

In the same 15 January 1991 issue in which Ray Duncan methodically explained why IBM and Microsoft were not getting back together, Zachmann again miraculously managed to find the silver lining.

A significant number of desktops running Windows 3.0 will switch to OS/2 once 2.0 is first released. Windows and OS/2 will be made truly complementary to one another by both Microsoft and IBM. Windows will not compete with OS/2 but become an option on top of OS/2.

Even John Dvorak spent the first several months of the year dismissing Wall Street Journal reports that Microsoft would indeed drop OS/2 development.

But by May, Zachmann was also reporting on Microsoft's System Strategy Seminar and its undeniable message: "Forget about OS/2 2.0 and stick with Windows." In June, Dvorak concluded that the feud between IBM and Microsoft was real. And finally, in his 29 October 1991 column, William Zachmann conceded the now obvious.

Microsoft and IBM aren't merely "divorced"—they are at war. What started as a difference of opinion escalated through growing levels of mutual mistrust and suspicion into what amounted, by the middle of the year, to overt hostilities. Microsoft has all but completely disavowed OS/2.

Somehow this was all the more evidence, as far as Zachmann was concerned, that IBM was destined to triumph. He warned in his 12 November 1991 column that

Microsoft made a big mistake going to war with IBM over Windows and OS/2. Microsoft could lose this war—and lose big. If OS/2 2.0 delivers as promised, Microsoft will be in tough shape trying to use the mere promise of Windows NT to hold the line with Windows 3.0. Momentum will shift dramatically away from Windows and toward OS/2.

In the meantime, the editorial board of PC Magazine was making its biases clear. The Letters editor in particular seemed to enjoy poking Zachmann in the ribs, and for the past four years had run letters every few months like the following from computer consultant Jim Barrett:

I would be on welfare if I were making my living on PM. While OS/2 and Presentation Manager articles abound in the industry journals, I wonder if anybody actually reads them! Never have I seen so much written about so little for so few.

The editorial content of the magazine approached the issue with more tact but came to similar conclusions.

In their 30 June 1992 analysis, Bill Bettini, Joe Salemi, and Don Willmott concluded that "OS/2 isn't a better Windows than Windows, but it could be called a potentially safer Windows than Windows." And OS/2 was a "better DOS than DOS" only because of the caching software, an advantage that disappeared when using updated Windows 3.1 drivers.

Later that year, in a head-to-head face-off between OS/2 and Windows 3.1 published in the 10 November 1992 issue, PC Magazine gave its Editor's Choice to Microsoft Windows 3.1 "for its ease of use, solid performance, and rich selection of high-quality applications in every software category."

But William Zachmann was not about to abandon his 30 June 1992 prediction that OS/2 was on its way to being the software hit of 1992. "OS/2 2.0 is even better than I'd expected. Windows 3.1 is much worse. The result will be a more rapid 'paradigm shift' from Windows toward OS/2 than I'd dared to expect."

Five months later, he was only more confident that the "shift from Windows to OS/2 that has already begun on a small scale will gather momentum." And in his 22 December 1992 column, his last for PC Magazine, Zachmann predicted "Growing success for OS/2 and the Macintosh and competitive losses for Microsoft and Windows over the next two years."

A true believer to the end.

In fact, OS/2 never gained more than a fraction of the market share of even the Macintosh. And Apple's share of the market declined over the next four years. Its financials were not reversed until Steve Jobs returned to Apple in 1997, accompanied by a $150 million investment by Microsoft in non-voting Apple stock and a pledge to support Office on the Macintosh.


But as a parting gift, in that same December issue, PC Magazine gave OS/2 2.0 its Award for Excellence. And perhaps for a short period of time at the end of 1992 OS/2 was the better operating system.

But OS/2 had few software solutions for the average consumer that didn't run better and cheaper under DOS and Windows. And with DOS and Windows pre-installed on practically every PC sold, the average consumer had no practical reason to buy OS/2. And so practically none of them did.

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (2/7)
The future that wasn't (3/7)
The future that wasn't (4/7)
The future that wasn't (5/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS


November 15, 2018

A parting of the ways (5/7)

In a 12 April 1988 PC Magazine article, "OS/2: A New Beginning for PC Applications," Charles Petzold restated what had become by then the party line amongst the personal computer prognosticators: "Microsoft expects OS/2 to establish the foundations of PC operating systems for the next decade."


Robert Hummel begged to differ. Only a few pages later in his PC Tutor column, he made one of the most spot-on predictions to grace the magazine.

Years from now, when programmers sit around and wax nostalgic, someone is sure to ask, "Remember OS/2?" Everyone will chuckle. Despite the hype and fanfare, I believe OS/2 is going to be short-lived. Rather than getting an improved DOS, we've gotten a new, completely incompatible operating system.

Or as reader Patrick Anderson stated in the 11 October 1988 Letters section,

All the gushing over OS/2 is amazing. It shows how far out of touch the gurus in Redmond and the magazine editors in New York are with real PC users.

But along with Robert Hummel, Ray Duncan was keeping in touch. He'd previously predicted that it'd take ten years for "OS/2's successors to eclipse MS-DOS." But in two October 1990 issues, and then in the 15 January 1991 issue, he drastically collapsed that time frame. Writing in his 16 October 1990 Power Programming column, Duncan observed that

Somewhere along the tortuous path from the original implementation of OS/2, things went badly awry. A system designed to provide users and programmers with a painless migration path from DOS was transformed into a system designed to sell hardware and compete with Unix.

Two weeks later, Duncan counted up an installed base of 45 million DOS users, and short of an outright catastrophe, predicted 100 million DOS users by 1995. Microsoft, he advised,

should reconcile itself to the marketplace's resistance to the size and complexity of OS/2, and commit itself wholeheartedly to making DOS everything that it can be—regardless of the impact this might have on Microsoft's Joint Development Agreement with IBM or on OS/2 sales.

In fact, he was handing out advice that had already been taken. Microsoft had indeed fully committed itself to "integrating Windows into DOS," and would soon abandon OS/2 in favor of the massive installed base of DOS and Windows applications.

The momentous event—the dissolution of the Joint Development Agreement between IBM and Microsoft—happened that year. As with the hiring of David Cutler in 1988 to design Windows NT, it took a while for the news to leak out, and then everyone was so committed to the established storyline that it took even more time for the news to sink in.


In the meantime, DOS powerhouses like WordPerfect and Lotus invested heavily in OS/2. They were caught flatfooted when Windows took off like a rocket and never recovered. IBM acquired Lotus and it slowly faded away. In the worst deal of the decade, Novell bought WordPerfect and then sold it a few years later to Corel for pennies on the dollar.

Rumors of the "great divorce" between IBM and Microsoft had circulated the previous year, finally prompting coordinated press releases from the two companies in September 1990. The statements "reaffirmed their relationship" and extended the licensing arrangements for DOS, Windows, and OS/2.

"Semantic content: zero" was how Ray Duncan summed up the substance of these press releases. Authoring two separate articles in the 15 January 1991 issue, he again cut to the heart of the matter:

Although IBM and Microsoft agreed to cross-license everything, they committed themselves to nothing in the way of marketing the cross-licensed products. I suspect that Microsoft took a hard look at the startling success of Windows 3.0, compared it with the dismal penetration of the desktop market by OS/2 after three years (less than 2 percent by the most optimistic estimates), and decided to cut its losses.

Two weeks later, John Dvorak observed that "there has been much chitchat about a falling-out between IBM and Microsoft with denials all around, and more and more evidence indicates that the two are going in opposite directions." But then in the 30 April 1991 issue, Dvorak hedged his bets once again to pooh-pooh a report from January of that year.

The biggest fiasco in the industry was the obituary written in Wall Street Journal recently when OS/2 was pronounced dead. Microsoft was supposedly going to drop the product and concentrate on Windows. After all the facts were straightened out it seemed that nothing changed except there was even more talk of a portable OS/2.

Well, the Wall Street Journal got it exactly right. Microsoft handed OS/2 development back to IBM and concentrated its efforts on Windows and the Win32 API. This guaranteed that compliant programs written for DOS-based Windows would also run on NT, thus staving off the drought of applications that had plagued OS/2 from the start.


With that (mostly) "painless migration path from DOS" now in place, the fate of OS/2 was sealed. Five years later, in the PBS documentary Triumph of the Nerds, Steve Ballmer recalled the moment when everything went sideways.

We were in a major negotiation in early 1990, right before the Windows launch. We wanted to have IBM on stage with us to launch Windows 3.0 but they wouldn't do the kind of deal that would allow us to profit. It would allow them essentially to take over Windows from us, and we walked away from the deal.

After a decade of tumultuous growth, the weirdest marriage in American corporate history was over. And yet the true believers still couldn't believe that digital Mom and Dad were really getting divorced.

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (2/7)
The future that wasn't (3/7)
The future that wasn't (4/7)
The future that wasn't (6/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS


November 08, 2018

Bakuman (the future)

Another way to watch Bakuman is as a historical document. It is decidedly old school. Pen and ink. Fax machines and copiers. Fat reams of paper stuffed into manila envelopes. It is also the end of an era.

The editors in Bakuman do pay a lot of attention to their spreadsheets. Akito writes on a laptop. But then everything gets printed out on paper. And faxed. Final proofs are hand-delivered.


In one of the more poignant scenes in the series, Moritaka is walking home from a school reunion where everybody was talking up their holiday plans. He glances at his calloused, ink-stained hands and realizes that, aside from gall bladder surgery, he's never taken a day off.

"No regrets," he tells Akito, and Akito agrees. When they got married, he and Kaya barely managed to squeeze in a honeymoon.

That could be changing. There is plenty of talk about the aging of Japan's population. Over the past quarter century, circulation at the major manga magazines dropped by two-thirds as the baby boom echo aged out of the target demographic and into middle age.

But at the same time, manga and anime have gone international and gone online, with Crunchyroll and Netflix leading the way. Justin Sevakis points out that "there has never been more money flowing from international fans to anime productions in the history of the art form."

Even light novels are getting in on the act in a big way, something I would not have predicted just a decade ago.

At Yen Press, a joint venture between Kadokawa and the New York-based Hachette Book Group, Kurt Hassler launched light novel imprint Yen On in 2014, introducing Reki Kawahara's Sword Art Online with the modest goal of publishing 12 books annually. That figure doubled the following year, and now Hassler says that Yen On will release 110 light novels through the rest of this year, representing growth of nearly 1000 percent in four years.

How popular culture is being created is also changing. As depicted in Shirobako, out of sheer necessity, technology has transformed the animation industry. 3DCG animation is only a small part of the revolution.

Even if an artist works initially on paper, everything gets scanned and imported into the animation software where the cleanup, coloring, and actual animation takes place. "Dailies" are generated and tweaked on the fly.

This process allows animation studios (in Japan and Hollywood) to subcontract with companies in South Korea, China, and Vietnam. Work product can be uploaded to and downloaded from the cloud in real time.

When it comes to creating backgrounds, directors like Makoto Shinkai have become masters of Photoshop (The Garden of Words may be his most staggeringly gorgeous work). This approach is disparaged by purists of the hand-drawn school. I don't care as long as it works.

The first time I saw the opening credit roll for Inari Kon Kon in HD, I was gobsmacked. Sure, it's a Photoshop, but it's breathtakingly beautiful.


When it comes to manga, the silly Eromanga Sensei offers a serious look at the future. Masamune naturally writes on a laptop. Sagiri (the artist) works entirely in the digital domain, using a Cintiq 13HD Wacom tablet (according to people who pay close attention to such things).

When she's finished with an illustration, she simply shoots an email off to her editor with a multi-layer PDF attached.

To be sure, a computer won't be drawing Moritaka's manga for him anytime soon. But the cost and time savings could prove considerable.

To start with, the ink is gone, along with the most physically onerous and time-consuming chores, such as whiting out mistakes (using, yes, Wite-Out) and often redrawing whole pages, manually layering in background textures, and sizing screentone overlays with an X-Acto knife.

I grew up at the end of the typewriter era, when "high-tech" was an IBM Selectric. But after using a primitive word processor on my brother's Apple IIe, there was no going back.

There are productivity gains to be made on both the production and publishing sides. The iconoclastic Shuho Sato adopted the increasingly popular "hybrid" model, his "traditional" publisher dealing with the paper product while he maintains a platform for distributing manga electronically.

We are quickly approaching the day when all commercial art is digital from start to finish. Using platforms like Amazon KDP, you can publish digitally and on paper (print-on-demand) for "free." And then with a push of a button, your book will appear in every Amazon store in the world.


"Free," however, doesn't factor in the costs in time and resources incurred by the writer, which can range from very little to a whole lot. Formatting a professional-looking ebook is a much more straightforward process than formatting a professional-looking print-ready manuscript.

And the eternal challenge still remains of reaching the reader. So perhaps the future of publishers will not be to physically publish but to publicize.

Related posts

Bakuman (the context)
Bakuman (the review)
Bakuman (the anime)
Manga economics
The teen manga artist
The manga development cycle

Labels: , , , , , , , , ,

October 18, 2018

The problem child (4/7)

If there was a single piece of technology that marked where things started to go wrong between IBM and Microsoft, the final straw that convinced Bill Gates, "It's not you, it's—no, it is you," it was DOS 4.0. DOS 4.0 was released by IBM in 1988. Microsoft walked away from the relationship two years later.

DOS 4.0 was at first greeted with great acclaim. Paul Somerson stated in the 27 September 1988 PC Magazine, "DOS 4.0 answers just about every major complaint about prior versions." But the glow faded fast. DOS 4.0 was soon causing more problems than it solved.

Asked John Dvorak in his 15 November 1988 Inside Track column,

Can't IBM do anything right? [DOS 4] is their baby, and it has so many bugs that we're told that we can expect to see 4.1 sooner than expected. I'd wait for 4.3 the way they are going.

In the 17 January 1989 issue, Ray Duncan rose to the defense of the OS, arguing that it was a victim of inflated expectations.

When IBM's DOS 4 first appeared, analysts and pundits hailed it as a major evolutionary step. A few weeks later, the same analysts and pundits came to their senses and there was a severe backlash. DOS 4 was subjected to a torrent of abuse for a handful of bugs no worse than those that accompanied the release of DOS 2 or 3.

But a little over a year later, Duncan took another opportunity to analyze the role of DOS 4.0 in what he now saw as the systematic undoing of IBM. In his 16 October 1990 Power Programming column, he succinctly summarized the beginning of IBM's declining influence in the personal computer arena.

DOS 4 will probably merit a footnote in the history books as one of personal computing's major operating-system fiascos. The changes that appeared in PC-DOS 4 were entirely implemented by IBM, leaving Microsoft in the uncomfortable position of having to reverse-engineer the system in order to come up with a "generic" MS-DOS 4 that could be licensed to other OEMs. Users stayed away from DOS 4 in droves. As I am writing this, DOS 3.3 is still outselling DOS 4 by a significant margin.

As a result, Duncan reported later in the same issue,

By 1990, Microsoft had awakened from its preoccupation with OS/2, realized that DOS was still a cash cow, wrested control of DOS's development back from Boca Raton, and deployed famous software guru Gordon Letwin to recoup the damage done to DOS's reputation by DOS 4.

Microsoft and IBM parted ways that year. Once there was no longer any need to pretend they liked each other in public, the gloves came off. PC Magazine editor-in-chief Bill Machrone reported in the 28 May 1991 issue that Microsoft's message at the System Strategy Seminar earlier that year was hard to miss:

Forget DOS. Forget OS/2. Forget LAN Manager. Forget NetWare. Forget Unix. But don't forget Windows. That's what Microsoft will ask you to do in the coming months and years.

The pieces puzzled over by the prognosticators for years now fell into place.

The next OS/2 from Microsoft will bear little resemblance to the versions we've seen to date. OS/2 3.0 will feature a "New Technology" (NT) kernel, full Windows support, portability, distributed processing, and POSIX support. The development team is headed by Dave Cutler, the guy formerly responsible for VMS at DEC.

OS/2 optimist William Zachmann heard the same message—"Forget about OS/2 2.0 and stick with Windows"—and fretted that "this Windows-centric outlook has led many to assert—incorrectly—that OS/2 is dead."

Well, those "many" were asserting correctly. As far as the PC consumer was concerned, OS/2 was dead. The death certificate was delivered a year later (though the undead OS/2 would wander the Earth for another decade).

Two weeks later, John Dvorak observed in his 11 June 1991 Inside Track column that the feud between IBM and Microsoft "over the direction of the industry, over Microsoft's emphasis of Windows, and over DOS 4"

finally erupted in a Forbes article where the two companies clearly stated that they were in disagreement about direction. Gates said that he and Microsoft were largely responsible for the success of the IBM PC and today's hot computer market. He made it clear that IBM's contribution to the leadership was a mirage.

In his 15 October 1991 column, Dvorak pointed to the "infamous Bill Gates memo" first leaked by the San Jose Mercury News. "The memo criticizes IBM for providing Microsoft with lousy code and intimates that Microsoft is better off without IBM."

IBM had hamstrung the DOS operating system long before DOS 4. In a truly ironic juxtaposition, the same 28 April 1992 issue that previewed Windows 3.1 in a cover story (and mentioned OS/2 2.0 in a much smaller font) also ran a letter from one Tim Paterson of Redmond, Washington.

Titled "IBM was the problem, not MS-DOS," Paterson objected to the "common misconception" that the designers of MS-DOS had "divided the first megabyte of memory into 640K RAM for the operating system and application and 384K reserved for hardware," creating the long-derided "barrier" that restricted memory access in MS-DOS to 640K RAM.

And who was this Tim Paterson to make such a claim?

As the sole designer of MS-DOS, Version 1.0, I did no such thing. The first computer that ran DOS could have a full megabyte of memory for DOS and applications. The system ROM occupied only the last 2K of the address space, but even that could be switched out after boot-up.

When Microsoft purchased DOS from Seattle Computer Products, Tim Paterson came along as part of the deal.

Those of us who lived with the 64K address space of the 8080/Z80 had learned our lesson. IBM, unfortunately, did not. They alone decided to build memory-mapped hardware right into the middle of our precious address space. I am mystified why anyone would consider this poor hardware design to be an aspect of DOS. DOS uses as much contiguous memory as can be made available.
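
To make Paterson's point concrete, here is a rough sketch (mine, not drawn from his letter) of the conventional IBM PC real-mode memory map. DOS would happily use whatever contiguous RAM sat below the hardware regions IBM parked at the top of the one-megabyte address space; the exact adapter ranges varied by machine and installed cards.

# Conventional IBM PC real-mode memory layout (approximate, for illustration).
regions = [
    (0x00000, 0x9FFFF, "conventional RAM (640K) - DOS and applications"),
    (0xA0000, 0xBFFFF, "video memory (memory-mapped display adapters)"),
    (0xC0000, 0xEFFFF, "adapter ROM/RAM (video BIOS, disk controllers, etc.)"),
    (0xF0000, 0xFFFFF, "system BIOS ROM"),
]

# Print the map; the 640K "barrier" is simply where the reserved regions begin.
for start, end, label in regions:
    size_kb = (end - start + 1) // 1024
    print(f"{start:05X}-{end:05X}  {size_kb:4d}K  {label}")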

Four years later, in the documentary Triumph of the Nerds, Steve Ballmer ruefully admitted that

Windows today is probably four years behind, three years behind where it would have been had we not danced with IBM for so long. Because the amount of split energy, split works, split IQ in the company cost our end customer real innovation in our product line.

Microsoft's successor to DOS 4.0 was a case in point. First reviewed in the July 1991 issue, DOS 5.0 would go on to win a PC Magazine Product of the Year award.

Between 1990 and 1995, Windows 3.0 was followed by DOS 5.0, Windows 3.1, DOS 6.0, Windows for Workgroups 3.11, Windows NT, and Windows 95. Microsoft was on a roll. IBM became an afterthought as a supplier of PC operating systems.

Though IBM did engineer some of the best laptops ever, before selling its entire PC division to Lenovo.

Paul Allen (1953-2018)
Co-founder of Microsoft
Requiescat in pace

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (2/7)
The future that wasn't (3/7)
The future that wasn't (5/7)
The future that wasn't (6/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS

Labels: , , , , ,

September 20, 2018

Frenemies (3/7)

As the 1990s began, big changes in the computer business were just over the horizon.

Intel was rolling out the 80486 microprocessor. With over a million transistors on board and clock speeds that would climb to 100 MHz, the 486 made the Graphical User Interface truly usable. Featuring that GUI would be Windows 3.0 and OS/2 2.0.

Conventional wisdom had already concluded that it took Microsoft three tries to get the software right. That meant Windows 3.0 was going to be a Big Deal, and Microsoft treated it as such, with advertising spreads in all the major tech publications.

The 26 June 1990 issue of PC Magazine included a massive 70-page insert. Titled "The Next Generation of Windows" and featuring gushing quotes from the major players in the industry, it was only distinguishable from the rest of the publication by the "Special Advertising Section" header. (The insert was cheekily preceded by a rare full-page ad for the Apple Macintosh.)

Nevertheless, the personal computing world was not ready to cast Windows and OS/2 as competing products. Almost two years before, in the 14 March 1988 issue of PC Magazine, Gus Venditto reported on a "recent policy statement" by Bill Gates that "outlined a timeline in which 15 percent of new PCs are running OS/2 in 1989, growing to 50 percent by 1991."

In his 12 April 1988 cover story, "What OS/2 Will Mean to Users," Charles Petzold concluded that "Everybody currently using DOS on an 80286 or 80386-based machine will eventually consider upgrading to OS/2."

Everybody knew what IBM and Microsoft intended to do.

Over a year later, Petzold still predicted that "IBM and Microsoft intend OS/2 to be the dominant PC operating system of the 1990s—and they seem ready to fix any problem that could inhibit this goal." In the 27 February 1990 issue, William F. Zachmann stated that "OS/2 is clearly the intended heir to DOS as far as IBM and Microsoft are concerned."

Despite the overwhelming success of Windows 3.0, Zachmann doubled down on this prediction in the 25 September 1990 issue: "Windows 3.0 will light the way to OS/2, not eclipse it. And that's really what Microsoft always wanted."

By 1990, what Microsoft really wanted was out of its software development relationship with IBM, and it had been hedging its bets for the past two years. It hired David Cutler—designer of the revered VMS operating system—away from Digital Equipment in order to create Windows NT. In the first issue of 1990, John Dvorak reported in PC Magazine that

Everyone is talking about Microsoft Windows 3.0, but not all of the talk is pleasant. It seems that Microsoft's sudden re-emphasis on Windows may result in more grousing by developers who have put their hopes into OS/2. Windows 3.0 now looks like the hot ticket to the future. I'm told that Microsoft employees have gone back to Windows en masse.

Although Gates increasingly had every reason to question IBM's competence in the retail arena, the source of the widening rift was the diverging corporate philosophies of the two companies. Keep in mind Microsoft's mission statement: "A computer on every desk and in every home all running Microsoft software."


Microsoft got started selling software for the 8-bit Altair. Microsoft made a CP/M card for the Apple II and is still a major software developer for the Macintosh. "A computer on every desk" manifestly did not mean "An IBM computer on every desk." Microsoft had much bigger aims than that.

Every computer on every desk in the universe, if possible.

IBM's lurch toward proprietary solutions, starting with the Micro Channel bus, was tossing sand into the gears of this goal. OS/2 cheerleader William Zachmann plainly admitted that Micro Channel was "IBM's standard. And nobody else's. From the very beginning, IBM intended Micro Channel to eliminate competition from vendors of compatible systems."


In the same 12 April 1988 issue in which Charles Petzold declared that "the OS/2 decade has begun," a more pessimistic Robert Hummel keenly perceived the same existential threat to Microsoft and to the huge base of existing DOS software that Bill Gates must have seen.

If you use IBM software, OS/2 may not live up to the capabilities you want unless you buy your computers from IBM. And now you see the real reason for OS/2.

Nine months later, in his 31 January 1989 column, Charles Petzold mused that "The conspiracy-minded among us have suggested that OS/2 Extended Edition is the first step in making OS/2 an IBM proprietary operating system."

Gates was not about to get hemmed in by IBM's possessive hold on the platform and its parochial approach to software development. Despite the huge industry it had created, IBM demonstrated neither the interest nor the ability to drive the business forward at the retail level and maximize the consumer base.

Simply consider that a personal computer user who resolved to plunk down 340 dollars for OS/2 Standard Edition 1.1 still had to figure out how to buy it. The OS/2 ads that appeared in PC Magazine directed the reader to "your local authorized IBM dealer." Whoever that was.

By contrast, Microsoft made sure that Windows 3.0 came bundled with most new computers. And if you wanted to buy a copy, simply flip through the pages of PC Magazine to an ad from, say, mail-order powerhouse PC Connection, pull out your credit card, and it was yours for 99 bucks.

Even William Zachmann had to admit that "IBM was never really aggressive on pricing. IBM was never aggressive when it came to innovation either." No surprise that Microsoft should resolve to reassert control over the personal computer operating system.

As it had since its founding, IBM envisioned itself as a highly profitable purveyor of proprietary computing systems. In other words, what Apple would become two decades hence. Not a bad corporate goal to have—for IBM. But Bill Gates wasn't about to sacrifice a 90-plus percent market share in order to bolster IBM's our-way-or-the-highway strategy.

Not when Microsoft was busily building an interstate of its own. In the 25 April 1989 issue of PC Magazine, John Dvorak mused that

If IBM had not become preoccupied with its Micro Channel patents and closed architecture, I think it would have sold twice as many PS/2s. More important to IBM, Big Blue would be in the driver's seat, controlling the destiny of the market. Now it's just a target for bypass.

The time had come for Microsoft to take the bypass and lead the personal computing world in a direction of its own devising.

Related posts

The future that wasn't (introduction)
The future that wasn't (1/7)
The future that wasn't (2/7)
The future that wasn't (4/7)
The future that wasn't (5/7)
The future that wasn't (6/7)
The future that wasn't (7/7)

The accidental standard
The grandfathers of DOS

Labels: , , , ,