September 22, 2016

Carnivorous vegetarians


The Japanese may be the most pragmatic people on the planet. Going all-in on half-of-the-world domination and then losing everything knocked the stuffing out of that sort of zealotry. And unlike the Germans, they decided not to dwell on it.

Well, except when silly westerners try to ratchet up their own virtue signaling by apologizing for beating them.

And unlike the Germans, the ultra-nationalists and their rhetoric aren't banned. Some even get elected to high office. Then there's that whole Yasukuni Shrine business, which prime ministers pretend to be "sensitive" about.

Until the cameras are turned off, that is.

All the paeans to pacifism are pragmatic as well. In a neighborhood full of angry bulls, it's a good idea not to run around waving a red flag. But at home, disturb the social order and the kid gloves come off. Japan has the death penalty and uses it.

And they don't pay much real attention to foreigners who complain about such things. Frankly, I think the Japanese government sticks to that whole whale hunting thing (it's for "research," don't you know) because foreigners complain about it.

It's a passive-aggressive way of asserting Japan's sovereignty and national prerogatives.

Japan's eating habits are doing a lot worse to the unagi, but when's the last time you heard anybody campaigning to "Save the eels"?


As Homer Simpson would put it: "Mmmm . . . eels."

Which brings us to the subject of another bunch of virtue-signaling westerners that amuse the Japanese when they're not bemusing them: vegetarians. Long story short: the best way to be a vegetarian in Japan is to not ask about the ingredients.

Eryk points out in his This Japanese Life blog that the

long life expectancy of Japanese people isn't from a vegetarian diet, because none of them are vegetarians. Okinawans are usually singled out—longest life expectancy in the world—but Okinawans actually eat taco rice and chicken.

The same goes for cancer rates. Japan's cancer rates aren't low because they avoid meat. Japan's diet is heavy on meat and soy—tofu, in particular—and soy can lower the risk of certain cancers. But tofu in Japan is usually served alongside meat, not in place of it.

Far from utopian, Japan is one of the least vegetarian-friendly places on Earth.

Vegan visitors in particular are warned that it is almost impossible to strictly adhere to a vegan diet in Japan. Even in vegetable dishes, the dashi (broth) that is a ubiquitous component of Japanese cuisine almost certainly contains pork or fish.

Courtesy Nami.

Laments Anne Lauenroth at GaijinPot, dashi is commonly made from bonito (related to tuna), and it is everywhere,

from sauces, salad dressings and miso soup to udon and soba noodles being boiled in it. Better restaurants pride themselves on making their own dashi, and they will be inclined to cook even their vegetables in this special broth instead of lovely, ordinary water.

But as far as Japanese cooks are concerned, dashi doesn't count as "meat," regardless of what it's made from. If you can't see the meat, there isn't any meat. Warns a site called the Vegetarian Resource Group,

It may be difficult to explain to Japanese people what you cannot have, because the concept of vegetarianism is not widely understood. For example, if you say you are vegetarian, they may offer you beef or chicken soup without meat itself.

Agrees Peter Payne,

One special challenge is being a vegetarian in Japan, since the country generally doesn't understand the lifestyle. One restaurant even advertised "vegetarian" bacon-wrapped asparagus, as if the presence of a vegetable was enough to make it vegetarian.

He advises sticking to shoujin ryori, the food traditionally eaten by Buddhist priests. Which could be tough for the typical tourist to arrange alone. So the Inside Japan Tours website "will advise all your accommodation of your dietary needs in advance."

Why? Because it is

decidedly more difficult to be a full vegetarian or vegan due to the ubiquity of fish in the Japanese diet. In fact, it is so rare in Japan that you will find many restaurants that do not offer any vegetarian dishes at all.

Protecting tourists from "vegetarian" dishes that aren't really vegetarian is a great example of what Tyler Cowen calls "Markets in Everything."

Granted, I find actual "travel" utterly unappealing as a hobby, let alone a necessity. (Fun to watch on television, though.) But this strikes me as an odd tourism mentality. It's a kind of reverse cultural appropriation: "Don't do as the Romans do."

Then why go to Rome in the first place?

When it comes to joining the culinary globetrotting set, I think Phil Rosenthal has the right idea in I'll Have What Phil's Having. He travels the world and eats whatever he is served with great elan and with barely a care about where it came from.

After all, all those other people are eating it and they didn't fall down dead. Yet.

Related posts

Eat, drink, and be merry
Hungry for entertainment
Kitchen Car


September 15, 2016

The cover


The cover of a magazine for baby boomer geeks and nerds can change the world—when the right person sees it.


The personal computer, posits Robert Cringely, was the product of "people who find creativity so effortless that invention becomes like breathing or who have something to prove to the world."

They are the people who are left unchallenged by the simple routine of making a living and surviving in the world and are capable, instead, of first imagining and then making a living from whole new worlds they've created in the computer.

Even when the computer in question exists only on the cover of a magazine. Because of deadlines, the actual Altair computer gracing the cover of that famous issue of Popular Electronics was a mockup, not a working model; no working production unit was available to demo when the photograph was taken.

That didn't matter. For Bill Gates, "enlightenment came in a flash."

Walking across Harvard Yard while Paul Allen waved in his face the January 1975 issue of Popular Electronics announcing the Altair 8800 microcomputer from MITS, they both saw instantly that there would really be a personal computer industry and that the industry would need programming languages. Although there were no microcomputer software companies yet, 19-year-old Bill's first concern was that they were already too late. "We realized that the revolution might happen without us," Gates said. "After we saw that article, there was no question of where our life would focus."

The difference this single-minded focus made on the future is apparent in the interviews with Bill Gates and Gary Kildall in the first (Feb/Mar) and third (Jun/Jul) issues of PC Magazine. (The first three issues are bound together into a single volume; you can find Kildall's by searching on his name.)

Gates comes across as hyper-aware of the emerging digital zeitgeist, the needs of his client (IBM) and the geek culture that spawned the then-nascent PC industry. But he is also thinking past all of them to all the ordinary consumers out there who just wanted a tool, an appliance. They were the future.

"A computer," Gates boldly promised in Microsoft's mission statement, "on every desk and in every home all running Microsoft software."


Kildall, by contrast, is very much the tenured professor he was before founding Digital Research. He's not quite sure what the rush is all about (a big reason the hard-pressed Boca Raton IBM team quickly turned to Microsoft to supply an operating system for the IBM PC).

Kildall gets animated about the then-arcane subject (a decade premature) of "concurrency" (multitasking) and proudly points to the assembler and debugger that ship with CP/M and CP/M-86. "So you can just pick up CP/M-86 and start developing your own high-performance applications."

Well, um, no. The vast majority of us can't, and neither could most of the geeks and nerds excited about the new, affordable "personal computer."

Okay, I used Kildall's debugger to hack the screen display and printer buffer in the CP/M version of WordStar so it'd run correctly on my Kaypro II. That was pretty much the beginning and the end of my life as a developer of "high-performance applications" using machine code.

In his interview, Gates instead enthuses about BASIC. BASIC is literally about as basic as a programming language gets. BASIC compilers were even a thing for a while, because ordinary computer enthusiasts (like me) could understand BASIC well enough to write working code.

Microsoft BASIC was initially the only reason to buy an Altair or an IBM PC. Microsoft Corporation was created to sell BASIC for the Altair, and the IBM PC shipped with Microsoft BASIC in ROM. The importance of BASIC (and a smattering of assembly language) is reflected in the early issues of PC Magazine.

"The Microsoft Touch" in the September 1983 issue of PC Magazine nicely ties BASIC to the beginnings of Microsoft.

But even in the premier issue, the emphasis was on the up-and-coming commercial apps—in particular, the spreadsheet and word processor—not programming languages. The VisiCalc spreadsheet made the Apple II the first "office PC," and Lotus 1-2-3 would do the same for the IBM PC.

Though Kildall was right for a time. Because of the enormous cost of memory and the constraints on bus and CPU speeds, DOS programs like WordPerfect (up to version 5) were written in assembly language. But it took thousands of employees to develop and market WordPerfect 5.

So Gates was being amazingly prophetic when he predicted in 1982 that in the future,

We'll be able to write big fat programs. We can let them run somewhat inefficiently because there will be so much horsepower that just sits there. The real focus won't be who can cram it down, or who can do it in the machine language. It will be who can define the right end-user interface and properly integrate the main packages.

But I don't think Gates could have imagined then just how much of the technological world 30 years hence would run on high-level interpreted code, or that hardly anybody would notice or complain because the hardware had gotten so fast and so inexpensive. (Well, I notice on my old Windows XP laptop.)

In 2015, Apple produced a watch with orders of magnitude more memory and a CPU a hundred times faster that cost a tenth as much as the original IBM PC. Though, frankly, a creaky old IBM XT running Lotus 1-2-3 and WordPerfect 4.2 would be a lot more useful.

Productivity. That's why the PC changed everything.

Related posts

The accidental standard
MS-DOS at 30
Digital_Man/Digital_World
The grandfathers of DOS


September 08, 2016

The grandfathers of DOS


Courtesy PC Magazine, June/July 1982.
Ken Olsen and Digital Equipment Corporation disrupted the IBM mainframe cartel with the minicomputer, that became the micro, that became the personal computer. Except DEC failed to make the critical transition itself and was acquired by a PC maker that also challenged IBM at the height of its powers: Compaq.

One of the tech pioneers who navigated the rocky transitional period was Gary Kildall (1942–1994). Kildall's CP/M operating system played a key role in shifting the software paradigm from the mainframe and minicomputer to the personal computer.

Kildall came a generation after Ken Olsen, half a generation before Gates, Wozniak, and Jobs. Olsen served in WWII. Kildall was a graduate student at the University of Washington when he was drafted into the Vietnam War. He would spend his enlistment teaching computer science at the Naval Postgraduate School.

He later became a tenured professor at NPS while consulting in Silicon Valley. In the early 1970s, he started work on CP/M, an 8-bit operating system designed to power the new microcomputers that ordinary people could afford.

As with the Altair, the 8080-based PC kit that Ed Roberts was building in Albuquerque, word of mouth ignited a tidal wave of interest and curiosity in the burgeoning "home brew" computer community. Kildall retired from teaching and together with his wife started Digital Research to develop and market CP/M.

By 1978, the company (headquartered in their house in Pacific Grove, California) had achieved sales of $100,000 a month.

Along with CP/M, two more Kildall innovations made the PC possible. On the technical side, the BIOS chip created a hardware "abstraction" layer that allowed an operating system to work "out of the box" with various hardware configurations without being hand-tuned for the particularities of each one.

On the business side, with the BIOS chip in hand, Digital Research could divorce the OS from dependency on a single hardware platform or manufacturer and sell CP/M to all comers, a marketing model that Microsoft would follow with great success.
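The idea is easier to see in code than in prose. Here's a minimal sketch in modern Python rather than 8080 assembly (all class and method names are mine, purely illustrative): the OS calls a small, fixed set of I/O primitives, and each machine supplies its own implementations of those primitives.

```python
# A minimal sketch of the BIOS idea. Illustrative names only,
# not Kildall's actual design.

class BIOS:
    """Hardware-specific layer: one implementation per machine."""
    def console_out(self, ch: str) -> None:
        raise NotImplementedError
    def read_sector(self, track: int, sector: int) -> bytes:
        raise NotImplementedError

class KayproBIOS(BIOS):
    """Stand-in for one vendor's hand-tuned hardware code."""
    def console_out(self, ch: str) -> None:
        print(ch, end="")            # pretend this pokes the video hardware
    def read_sector(self, track: int, sector: int) -> bytes:
        return b"\x00" * 128         # pretend this talks to the floppy controller

class OperatingSystem:
    """Hardware-independent layer: written once, sold to all comers."""
    def __init__(self, bios: BIOS):
        self.bios = bios             # the only route to the hardware
    def type_string(self, s: str) -> None:
        for ch in s:
            self.bios.console_out(ch)

cpm = OperatingSystem(KayproBIOS())  # swap in a different BIOS, same OS
cpm.type_string("A>")
```

Port the thin BIOS layer and everything above it runs unchanged, which is why one shrink-wrapped CP/M could be sold for every 8080 and Z-80 machine on the market.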

The Apple I had debuted in 1976, built on "that horrible MOS Technology 6502 processor," as Kildall described it. But CP/M remained the dominant general-purpose microcomputer OS, running on the 8-bit Intel 8080 and Zilog Z-80. For a time, Microsoft sold the Z-80 SoftCard, enabling CP/M to run on an Apple II.

The SoftCard was Microsoft's number one revenue source in 1980, making Microsoft a major CP/M vendor. And was probably the reason IBM thought Microsoft was also an OS developer.

During the late 1970s, Kildall got distracted customizing the PL/I compiler for Intel CPUs. Development of CP/M languished for almost two years.

Apple released the Apple III in 1980. It was plagued by reliability problems, a lack of software, and like the later Lisa, carried a "sky-high" price. On sabbatical at the time, Steve Wozniak returned to Apple in order to supervise production of the highly successful Apple IIe. Apple regained its footing in 1983.

But in 1981, the microcomputer industry was without a technological leader. In August of that year, IBM changed everything with its 16-bit Intel 8088-based PC.

In Triumph of the Nerds, Jack Sams recounts how his IBM team, in desperate need of an operating system for the IBM PC, approached Digital Research (on the recommendation of Bill Gates) but couldn't get anybody to sign the strict non-disclosure agreement or agree to their tight production schedule (accounts differ).

The second time IBM raised the issue with Microsoft, Bill Gates signed in a heartbeat. Gates didn't care about the licensing terms as long as it was non-exclusive and Microsoft could sell MS-DOS to other hardware manufacturers. IBM agreed and Microsoft changed the world.

Except Microsoft didn't have an OS in development. So it licensed 86-DOS from Seattle Computer Products for a song and hired the guy who designed it, Tim Paterson.

Kildall later protested that Tim Paterson hadn't reverse-engineered CP/M but had copied his source code. He never pursued this claim. (When Compaq later reverse-engineered the IBM BIOS, it documented every step with legal precision and was never sued by the litigious IBM.)

Paterson, employee number 80 at Microsoft, remembers his historic role with something of a philosophical shrug.

It's been pooh-poohed as Seattle Computer being suckers or something for taking the deal because it made Microsoft so much money. I don't know how many people would have said the guy who provides the operating system to IBM is going to make it rich. I have the impression Bill Gates and Paul Allen felt it was a gamble, not that they were sitting on a gold mine and knew it.

Tim Paterson.
Paterson's Microsoft stock options did allow him to retire at age 42. And when Seattle Computer Products found out who Microsoft's client really was and went to court about it, that "song" became a million dollars.

In any case, Kildall's 16-bit version of CP/M for the PC didn't come out until April 1982, and then initially at six times the price of MS-DOS. Alas, it wouldn't be competitive at any price. The computing world finally had a software and a hardware standard and was sticking with it.

These latter details don't make it into Kildall's memoir, which concludes at the end of the 1970s. Or at least the version we have. Kildall never published the manuscript. The first half was recently made available by his estate as a free PDF download.

In the memoir, titled Computer Connections: People, Places, and Events in the Evolution of the Personal Computer Industry, Kildall writes in a readable style, not overburdened with technical jargon (although there is plenty of that). It's a compelling personal account about the roots of the PC operating system.

The theme of his recollections might be: "You kids don't know how tough we old-timers had it!" He says this with a wink and a nod, but he's absolutely right. Compared to the hoops programmers once had to jump through, the floppy disk drive and the command line were absolutely amazing steps forward in usefulness.

Kildall's account ends at the end of the 1970s, before the stormy advent of the IBM PC (read the rest of the story here). But he does mention two other times he and Bill Gates crossed paths. The first sounds like a script straight out of Hollywood.

When Kildall was at the University of Washington, two high school students hacked into C-Cubed, a time-sharing computer company run by the director of the UW Computer Center. The kids were none other than Bill Gates and Paul Allen, the future founders of Microsoft. And what happened to them? C-Cubed hired them.

A decade later, a fledgling Microsoft was based in Albuquerque, creating software tools for the Altair. In 1977, Gates came to Pacific Grove to discuss the future of the business with Kildall, who was then running the world's most successful microcomputer software company.

Kildall remembers them getting along like oil and water, "his manner too abrasive and deterministic, although he mostly carried a smile through a discussion of any sort." Kildall had no desire to "compete with his customers" and turn Digital Research into a one-stop shop that sold both tools and applications.

Exactly what Gates was planning to do. Recalled Kildall,

We talked of merging our companies in the Pacific Grove area. Our conversations were friendly, but for some reason, I have always felt uneasy around Bill. I always kept one hand on my wallet, and the other on my program listings. It was no different that day.

So Microsoft ended up back in Seattle, where Gates and Allen grew up.

Kildall didn't think highly of Gates as a computer scientist. But in all fairness, I'll point to this landmark interview by Dave Bunnell in the debut issue of PC Magazine. As early as 1982, a young Bill Gates demonstrated a remarkably insightful grasp of where the personal computer industry was headed.

Gary Kildall may not have liked the man he ended up passing the baton to, but there's no denying that Bill Gates grabbed it and ran like a bat out of hell.

Related posts

Computer Connections
The accidental standard
MS-DOS at 30
Digital Man/Digital World
The cover


September 01, 2016

Digital Man/Digital World


I'm a big fan of documentaries about the early years of the personal computer industry. Digital Man/Digital World provides a much-needed look at the often overlooked DEC (it can be watched online at the above link).

Before Intel, before Microsoft, before Apple, before the IBM PC and Compaq Computer (the company that later acquired DEC), there was Digital Equipment Corporation. The new reality that a computer could be "small enough to be stolen" (based on an actual incident) began not in Silicon Valley but in Maynard, Massachusetts.

Digital Equipment Corporation (DEC) was founded by Ken Olsen in 1957.

Ken Olsen had served in the Navy during WWII as a radar technician. After the war, he earned a degree in electrical engineering at MIT, where he worked on the Whirlwind project. The Whirlwind computers powered the prototypes of the Distant Early Warning (DEW) Line during the early years of the Cold War.

Olsen had acted as a liaison with IBM during the Whirlwind project (IBM built the computers that MIT designed), and recalled that IBM was "like going to a communist state." He brought that attitude with him when he founded DEC.

Although a conservative Christian who wore a tie, banned alcohol from company parties, and always ate dinner with his family (before going back to work), Olsen was one of the first "hippie" CEOs. He championed the flat corporate hierarchy, the employee-friendly workplace, and the customer-oriented business.

He drove a sedan and vacationed in the Canadian wilderness. Abandoning the organizational chart, he managed by walking around, and was on a first-name basis with his employees. He actively recruited women and minorities (this was in the 1960s and 1970s) and didn't lay off a single person until 1988.

In committee meetings, managers were expected to own their ideas, defend them, and fight things out: "You could get in somebody's face as long as you didn't stab them in the back."

DEC was the first high-tech company funded by venture capital (American Research and Development Corporation) and produced a crop of multi-millionaires when it went public in 1966. After Ken Olsen retired (involuntarily), he gave away most of his accumulated wealth.

DEC's truly disruptive innovation was the minicomputer. Instead of IBM's room-sized mainframes, the PDP-1 was a filing-cabinet-sized time-sharing computer that cost less than $120,000, a bargain back in the early 1960s. The later PDP-8, introduced in 1965, shaved more than $100,000 off that price.

The 16-bit PDP-11 was the first computer to tie internal communications together on a shared bus, a feature later adopted by the Altair, the Apple II and the IBM PC. The 32-bit, network-ready VAX debuted in 1977 and became its most popular minicomputer, a mainstay of university engineering labs.

The PDP-11 control console (top) looks like a "computer."
Built for computer geeks, the Altair front panel resembled
the PDP-11 console.

By the early 1980s, Digital had become the second-largest computer company in the world. In one of the great ironies that typify the last half-century of the tech industry, while disrupting the staid mainframe business and making computers truly affordable, DEC sowed the seeds of its own downfall.

This trend in affordability accelerated in the 1970s with the advent of a slew of inexpensive 8-bit CPUs that powered the Altair, the Apple II, and the Commodore, with the Intel and Zilog varieties running the soon ubiquitous CP/M operating system.

IBM responded to the PC threat with the 16-bit IBM PC, engineered in its freewheeling Boca Raton division using low-cost OEM components and a second-hand operating system from an upstart software company called Microsoft. It produced a smash hit product that eventually sowed the seeds of its own downfall too.

By comparison, the IBM PC looks like an appliance.

DEC went in the opposite direction, sinking resources into the VAX 9000 supercomputer, high-end multiprocessor microcomputers, and the Alpha 64-bit RISC processor. This was a full decade before 64-bit computing would arrive in an affordable PC package. Everybody loved the Alpha but nobody knew what to do with it.

Powered by Intel's inexpensive x86 chips, the PC was growing so fast that "good enough" quickly became more than enough to do the job. Before long, personal computers were easily matching the power of DEC's previous minicomputers. The PC had turned into a minicomputer.

In a complete turnaround (just as Clayton Christensen would have predicted), DEC found itself defending the high end of the market and getting disrupted from below.

DEC's premium hardware failed to find a market, resulting in a $2.8 billion loss in 1992. That year, Olsen was ousted as CEO. Compaq acquired Digital in June 1998, only to merge with HP four years later. DEC all but disappeared amidst the corporate reorganization rubble.

Though largely forgotten, its influence lives on.

When Paul Allen and Bill Gates developed a BASIC interpreter for the brand-new Altair in 1975, they didn't have an Altair computer. Instead, they used an Intel 8080 emulator Allen had written for the DEC PDP-10 in Harvard's Aiken Lab. Amazingly, the program ran the first time it was installed on an actual Altair.
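For readers who've never thought about what an emulator actually does: it's a loop that reads the target CPU's instructions out of (simulated) memory and mimics their effects in software. Here's a toy sketch in Python rather than PDP-10 assembly, using a made-up three-opcode instruction set; nothing here is Allen's actual code, just the general shape of the technique.

```python
# A toy fetch-decode-execute loop: the essence of an instruction-set
# emulator. The three-opcode "CPU" below is invented for illustration.

memory = bytearray(256)
memory[0:5] = bytes([0x01, 42, 0x02, 7, 0xFF])  # LOAD 42; ADD 7; HALT

acc, pc, running = 0, 0, True
while running:
    opcode = memory[pc]
    if opcode == 0x01:          # LOAD immediate
        acc = memory[pc + 1]
        pc += 2
    elif opcode == 0x02:        # ADD immediate, 8-bit wraparound
        acc = (acc + memory[pc + 1]) & 0xFF
        pc += 2
    elif opcode == 0xFF:        # HALT
        running = False
    else:
        raise ValueError(f"illegal opcode {opcode:#x} at address {pc}")

print(acc)  # 49
```

Do that faithfully for every one of the 8080's opcodes and you can develop, run, and debug Altair software on a machine you actually have.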

BASIC was Microsoft's founding product, its first best-selling product, and would later be adopted by both IBM and Apple. And it was born on a DEC minicomputer.

Then in 1988, Microsoft hired Dave Cutler, architect of the VMS operating system for the DEC VAX, to design a preemptive multitasking OS. Since the release of Windows XP, all desktop and server versions of the Microsoft OS (plus Windows Phone since version 8) have been built on Dave Cutler's NT kernel.

Our modern, technological world was in no small part created by Ken Olsen and Digital Equipment Corporation.

Related posts

The accidental standard
MS-DOS at 30
The grandfathers of DOS
Something Ventured


August 25, 2016

What's in a name?


In Japan, you can't name your kid anything. The Ministry of Justice has the final say (as in France, the cops must have a linguistics division). Currently, only the 843 "name kanji" (kanji rarely used for anything but names) and 2,136 "common-use kanji" are permitted in first names.

But thanks to on'yomi, kun'yomi, na'nori, and ateji, parents can get very creative about how a kanji is pronounced. And the bias of late, grouch the old-timers, has been toward the unpronounceable.

In Chinese, each character is read as a single syllable, and almost always has only one reading. Kanji were imported to Japan from China and adopted into a language that has nothing phonemically or grammatically in common with Chinese.

As a result, the original Chinese pronunciations had to be heavily modified to fit the Japanese language, resulting in on'yomi ("Chinese" reading). Because there's such a poor overlap between the two phonemic systems, there are often multiple on'yomi for each kanji.

At the same time, kanji were retrofitted to represent existing Japanese words (kun'yomi). As a result, a single kanji can have several different readings, including na'nori, readings that evolved specifically for use in names.

In Chinese, foreign (untranslated) words are "spelled" phonetically using characters chosen for their pronunciation; the Japanese equivalent is ateji. The inverse form of ateji assigns (often foreign) pronunciations to a kanji based on its meaning.

There are a number of websites that sound out Western names using Chinese characters. You can do this in Japanese too, but Japanese has a purely phonetic "alphabet" (a syllabary) made specifically for foreign words and names called katakana.

Nevertheless, as I illustrate here, (reverse) ateji is too much linguistic fun for writers to ignore.

Let's say you wanted to name your kid "Star Child." Sounds very hipster in English, but in Japanese it produces pretty ordinary pronunciations (with one exception). The suffix 子 ("child") is common in Japanese names for girls (sort of like all the girl names that end with /ly/).

Girl names

星子 Shouko
星子 Seiko
星子 Tiara
星子 Hoshiko

/Shou/ and /Sei/ are on'yomi. /Hoshi/ is kun'yomi. "Tiara" is, of course, (reverse) ateji.

Boy names

星郎 Hoshirou
星男 Hoshio

/Rou/ is on'yomi and /o/ is kun'yomi. The suffix 郎 (used similarly to 子 for girls) means "son" and 男 means "man."
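If it helps to think of the reading system as a data structure, it's a one-to-many mapping, with reverse ateji operating on the whole written name rather than on any single kanji. A rough Python sketch of the examples above (romanizations as given; illustrative only):

```python
# The reading system as a data structure: one kanji, many readings.

kanji_readings = {
    "星": {
        "on'yomi":  ["shou", "sei"],   # readings derived from Chinese
        "kun'yomi": ["hoshi"],         # the native Japanese word for "star"
    },
}

# Reverse ateji assigns a pronunciation to the written name as a whole,
# based on its meaning, rather than to an individual kanji.
name_readings = {"星子": ["Shouko", "Seiko", "Hoshiko", "Tiara"]}

for name, possibilities in name_readings.items():
    print(name, "->", ", ".join(possibilities))
```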

Because the most common "spelling" of Seiko is 聖子 ("holy child"), which also just happens to be the name of the hugely famous singer Seiko Matsuda, you would have to explain to a person you just met that your name is instead spelled with the kanji for "star."

And, yes, sans a business card, the Japanese provide these sorts of explanations all the time when introducing themselves, and/or write the kanji in the air or on the palm of the hand.


August 18, 2016

Canon MG2520


I rarely need a printer or scanner these days, but when I do, I really do. And it's hard to fret about a 28 dollar investment in a Canon MG2520 when I'd just spent almost that much at the FedEx copy center printing out a bunch of stuff that I suddenly needed yesterday.

I ordered it from Walmart online and picked it up a week later. The out-of-box instructions were actually readable (or lookable, as they contained little text) and fairly useful.

The telescoping paper tray slides neatly out of the way. But I wouldn't trust it with more than a dozen sheets. My old HP could hold at least a quarter of a ream. Then again, I don't plan on printing more than a few dozen sheets a year.

The power brick is cleverly built into the chassis. It looks like it's snapped in during the assembly process. The power cord feeds out flush with the back of the case rather than jutting straight out. That means no extraneous dongles and dangling cables to deal with.

This is an ingenious design that I wish more electronics manufacturers would adopt. It makes it possible to source the power supply from an OEM without turning it into the annoying encumbrance that is the power brick (the bane of the consumer gadget market).

Otherwise, my only gripe is that, instead of mounted flush like the power cord, the USB cable pokes straight out the back at the widest point. It's impossible to push the printer against the wall without unplugging it.

The USB port (upper right) should be oriented 90 degrees down.

The verdict: the printer prints and the scanner scans. Good enough.


August 11, 2016

Out with the old


The Car Talk guys argued that, in most cases, repairing an old car is cheaper than buying a new one. The reasons for buying a new(er) car come down to improved safety features and reliability, along with the utilitarian demands placed on the vehicle (how many child seats will fit in it).

Otherwise, comparing the amortized cost (or monthly payments) of old against new makes clear which way the economic scales are tipping.
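The arithmetic fits on a napkin. Here's a sketch in Python with made-up numbers; plug in your own repair estimate and loan payment.

```python
# Back-of-the-envelope Car Talk math. All figures are hypothetical.

repair_bill = 1200                  # one-time fix for the old car
months_of_life = 12                 # how long you expect the fix to last
old_car_per_month = repair_bill / months_of_life

new_car_per_month = 350             # a typical loan payment

print(f"old car: ${old_car_per_month:.0f}/month")  # old car: $100/month
print(f"new car: ${new_car_per_month}/month")      # new car: $350/month
# Repairing wins on the raw numbers; safety, reliability, and child
# seats have to be worth the $250/month difference to justify buying new.
```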

When it comes to modern consumer electronics, there's rarely anything that can be repaired. Then the question is whether to buy an extended warranty. The answer is usually no. If the gadget doesn't break within the manufacturer's warranty, odds are it won't break within the extended warranty.

My HP 895cxi inkjet printer had been a workhorse for almost twenty years. Until it simply decided to not work, flashing an "ink cartridge" error I'd never seen before, even when an ink cartridge ran out. The usual cleaning remedies (plus a few more) didn't help.

It's possible that the almost new (OEM) cartridge dried out from long lack of use and a new one would work. Except it'd cost more to replace the cartridge than to buy a new printer.

Granted, in computer years we're talking about an antique, but HP 51645A cartridges are still being made and sold. HP lists the black cartridge at almost fifty bucks. A remanufactured cartridge goes for a more reasonable $13. But the last remanufactured cartridge I tried was broken out of the box.

Add in the color cartridge and the total comes to $30. I don't even know that the cartridge is the problem. The problem is, these days, a printer, scanner, and a CD-ROM drive are the kind of peripherals I can do without—until I absolutely need them.

Meanwhile, a brand new all-in-one Canon MG2520 sells for $28 at Walmart. Cartridges included. It'd replace my equally ancient (and excruciatingly slow) CanoScan scanner at the same time.

As we all know, inkjet printers operate on the razor blade economics model: "Give 'em the razor, sell 'em the blades." The tiny Canon cartridges make that strategy clear. But I don't plan on printing out any novels (I did literally print out a couple of novels on that HP).

What using a Centronics printer cable was like.

Tossing the old HP was a blast from the past. Ah, the good old Centronics parallel printer interface. Bulky, heavy, unwieldy—makes me think of a 19th century transatlantic telegraph cable. Surprisingly, they're still available at reasonable prices. The Windows XP of the cable world, I suppose.


August 04, 2016

The rebirth of Japan's mass media


Mitsuki Takahata (bottom right) plays
Shizuko Ohashi in the NHK series.
As I noted last week, Homer Sarasohn was the first quality control guru to visit Japan, invited by General Douglas MacArthur to rebuild Japan's electronics industry. Why was that a priority for the Supreme Commander for the Allied Powers?

Because MacArthur believed in the power of the mass media to spread the good word of freedom and democracy. His good word. It wasn't simply a political pose. MacArthur was Ronald Reagan with ten times the ego and a papal sense of infallibility.

In other words, the perfect personality for a Japanese shogun (with access to a radio studio).

In fact, the first few years of the Occupation saw a spate of surprisingly liberal reforms (that drove Shigeru Yoshida up a wall). Leftists, labor organizers, and even communists were let out of jail and the press was unleashed.

In Embracing Defeat, John Dower documents how enthusiastically the Japanese embraced these freedoms. Soon SCAP was censoring as many articles and broadcasts as it was approving. A free press, you see, wasn't free to criticize SCAP.

But the fire had been lit. It's telling that the moral backlash that "brought about the collapse of the comic book industry in the 1950s" was shrugged off almost as soon as it arrived in Japan (though, to be sure, it never entirely went away).

The current NHK Asadora, Toto Nee-chan, is a fictionalized biography of Shizuko Ohashi (1920–2013), who in 1948 co-founded 「暮しの手帖」 ("Notebook for Living"), a women's magazine still in print.

This retrospective at the magazine's website is in Japanese, but the illustrations largely speak for themselves.

This was an era when movie makers as well were yanking themselves up by their bootstraps. Akira Kurosawa turned the devastated landscape of Tokyo into a set in his second post-war film, One Wonderful Sunday, released in 1947.

Related links

When quality came to Japan
How Homer Sarasohn Brought Industrial Quality to Japan


July 28, 2016

When quality came to Japan


Sarasohn (top) and Deming.
W. Edwards Deming is revered as the father of Japan's quality revolution. The revolution began in August 1950 when Deming, then working on the Japanese census, delivered a speech on "Statistical Product Quality Administration."

While Deming would long be a prophet without honor in his own land, the Japanese took his advice to heart, applying it to their assembly lines and rewarding those who met its exacting standards with the "Deming Prize."

Less well known is that Deming was building on the substantial work already done by Homer Sarasohn, who'd been recruited by General MacArthur to rebuild Japan's electronics industry following the war.

When his stay in Japan came to a close, Sarasohn, in turn, recruited Deming.

Robert Cringely endeavors to correct the record in this compelling essay from his PBS column back in 2000: "How Homer Sarasohn Brought Industrial Quality to Japan and Why It Took Japan So Long to Learn."

(And note Sarasohn's quip about Donald Trump sixteen years ago).

Sarasohn's recollections of what he discovered upon inspecting the state of Japanese manufacturing in 1946 certainly come across as wildly incongruous now.

With the exception of the Zero fighter and some aircraft engines, their designs were bad and their manufactured goods were shoddy. Having come from the Rad Lab, I was particularly appalled to see the primitive nature of Japanese naval radar. Their vacuum tubes were bad and the radios were even worse, since each was hand-wired by untrained, often unsupervised, workers. They produced goods in mass quantities, ignoring quality.

Despite the Zero's reputation, Japan's war machine produced nothing like the deadly and reliable F6F Hellcat. Grumman designed the fighter to be simple to build and maintain, and manufactured 12,200 Hellcats in two years, continually improving the frame and powerplant.

As a result, the Hellcat racked up a 13:1 kill ratio over the most widely produced Model 52 Zero. The Model 64 Zero might have begun to match the much improved flight characteristics of the Hellcat, but never made it past the prototype stage.

And by then, the successor to the Hellcat, the Bearcat (which also didn't see action in WWII), had leapt far past the Hellcat and the Model 64, setting performance records that would be eclipsed only by jet fighters.

Essentially, Mitsubishi made Zeros the same way an artisan makes a fine watch. As Hayao Miyazaki observes, "Structurally, the Zero was not designed for mass production." Each Zero was a one-off. It was amazing that Mitsubishi managed to build 10,000 of them.

Meanwhile, the U.S. would deploy four air-superiority fighters into the Pacific Theater: the F6F Hellcat, the P-38 Lightning, the F4U Corsair, and by the end of the war, the P-51 Mustang.

Mass production in Japan before the war emphasized the "mass" part of production, betting on the numerical odds to produce a usable number of quality components. The result was vacuum tube yields of 10 percent. Sylvania, by comparison, had pushed yields to 85 percent.
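It's worth pausing on what those yield figures mean on the factory floor. A quick calculation (the 10 and 85 percent figures are from the essay; the target quantity is arbitrary):

```python
# What a 10% vs. 85% yield means in practice.

good_tubes_needed = 1_000

for factory, yield_rate in [("prewar Japan", 0.10), ("Sylvania", 0.85)]:
    must_build = good_tubes_needed / yield_rate
    print(f"{factory}: build {must_build:,.0f} tubes "
          f"to get {good_tubes_needed:,} good ones")

# prewar Japan: build 10,000 tubes to get 1,000 good ones
# Sylvania: build 1,176 tubes to get 1,000 good ones
```

At 10 percent, the factory builds ten tubes to ship one, and pays for the materials, labor, and inspection of the other nine anyway.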

Jonathan Parshall and Anthony Tully point out in Shattered Sword: The Untold Story of the Battle of Midway that Zero pilots had so little faith in their radios that they often removed them to save weight.

The aircraft radios carried on the Zero fighter were of inferior quality and of limited range and power and were difficult to use. As a result, while all carrier Zeros had radios, pilots rarely relied on them.

One of Homer Sarasohn's students was Akio Morita, cofounder of Sony Corporation, whose breakthrough product was the transistor radio.

At first, discrete transistors were treated the same as vacuum tubes. The real breakthrough in quality came with the planar process developed by Fairchild Semiconductor, which employed photolithography to "print" solid state devices onto silicon wafers.

Unlike a discrete transistor, which could be tossed if a single unit didn't meet the right specs, a flaw in a silicon wafer ruined the whole batch. Producing literally perfect wafers became an economic necessity. And that, Sarasohn argues, is what lit the fire.

The problem is, there's nothing proprietary about quality. It took a while, but Detroit caught on, and the Koreans did too, taking over the DRAM business by 1991. And two decades later had grabbed the bulk of the consumer electronics business from Sony and Panasonic.

The job Japan has ahead of it is not only to iterate and improve but to truly create, to somehow (frankly, it might be impossible at this late date) rekindle the white-hot passion for innovation that propelled Japan, Inc. to greatness in those golden postwar years.

Related links

How Homer Sarasohn Brought Industrial Quality to Japan
Twilight of the Zero
The rebirth of Japan's mass media


July 21, 2016

Dramatic conservation


A common charge leveled by the cultural right against popular mass media is that its essentially dissolute nature is corrupting the moral fiber of the nation. There is certainly no lack of kindling to toss onto that fire, but it is hardly true across the board.

Police procedurals like Blue Bloods (Tom Selleck as a Rudy Giuliani-style police commissioner and devout Catholic) and Bones (David Boreanaz as a by-the-book FBI agent who's a reasonably observant Catholic) and Murdoch Mysteries (Yannick Bisson as yet another practicing Catholic) cast conservative characters in a favorable light.

When in doubt, make your cop Catholic.

And, of course, then there's the not-entirely lapsed Dana Scully (Gillian Anderson) on The X-Files. Her conservatism is more of an empirical nature, but in that domain, compared to Mulder, she's definitely conservative.

Ironically, the very nature of these shows means they must necessarily exaggerate the extent and prevalence of criminality, especially in middle-class society. This was just as true of Arthur Conan Doyle and Agatha Christie.

Such generalizations hardly stop at the water's edge. Thanks to Hollywood, the average Japanese assumes the average American to be both more religious and more libertine, and the U.S. more crime-ridden, than in reality. The media messages traveling east across the Pacific present an even narrower slice of the media pie and an even more distorted view of the cultures that produce them.

Japanese police procedurals represent real crime rates about as well as British police procedurals do. There's more cinematic mayhem per week in Tokyo (or London) than in the entire country. (Though Antiquarian Bookshop Biblia refreshingly features hardly any murders in the entire series.)

Making things worse, perception-wise, most of the contemporary live-action Japanese movies that dominate the Hulu and Netflix catalogs reflect what U.S. distributors can license inexpensively in niche genres that have a built-in audience. Nothing close to a representative sample.

On this score, Studio Ghibli is perhaps the best-known (to the non-otaku public) indicator of the tastes of the Japanese public in general (especially titles like Only Yesterday and Whisper of the Heart). Aside from anime feature films, very few "family-friendly" live-action Japanese movies ever make it to the U.S.

As a result, Peter Payne notes the common conclusion that "Judging from all those hentai anime and games the Japanese love, they must be the most perverted people on the planet, leading sex lives that would amaze us all, right?"

Long answer short: nope. Not even close.

Japan is a more conservative country than the U.S. Unlike in the west, the common culture has subsumed most of the historically "religious" practices and values, to the extent that there is no clear bifurcation between the two. It's not the "religious right" influencing modern culture as much as the past influencing the present. And nobody's rebelling much.

One of Faulkner's best-known lines is even more true about Japan: "The past is never dead. It's not even past."

Further complicating things is the gap between honne and tatemae, or between true (inner) intent and the outer display of behavior. This is considered not so much hypocrisy as a reflexive social necessity.

What is easily interpreted as a reflection of pervasive moral laxity in popular media is only tenuously—and often not at all—tied to individual, personal behavior. It's entertainment. Even there, storytelling conventions in manga and anime often "normalize" more conservative behavior than what exists in Japanese society (like the whole "first kiss" business).

Americanizing a hugely popular series like Kimi ni Todoke would only work if set in the 1950s or perhaps Utah County. Though I also think that built-in reticence (without the attendant religious moralizing) is a big part of the appeal among the American audience.

As I've argued before, a thread of conservatism (or rather, conservationism) makes for better stories. And I mean this more in the naturalistic sense: conserving stuff that's existed for a long time for a reason. The Japanese in particular are huge believers in Chesterton's fence: don't go changing things unless you've got a really good reason.

And probably not even then.

Let us say, for the sake of simplicity, a fence or gate [is blocking] a road. The more modern type of reformer goes gaily up to it and says, "I don't see the use of this; let us clear it away." To which the more intelligent type of reformer will do well to answer: "If you don't see the use of it, I certainly won't let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it."

When it comes to narrative fiction, an old gate that can be swung open without a second thought (or a brand new gate that's padlocked just because) makes for poor dramatic conflict. Some resistance, a little rust in the hinges, makes the task a lot more interesting.


July 14, 2016

Aogashima Island


Mining the ocean floor is one of those perennial futuristic things that is perennially ten years in the future.

But this time high concentrations of gold and silver ore were found in the vicinity of Aogashima Island. Gold in them there underwater hills may provide the kind of motivation to get a gold rush going. Reports the Japan Times:

A team of researchers at the University of Tokyo have discovered high-grade gold ore on a seabed off a remote island south of Tokyo. The ore, collected from a hydrothermal deposit at an underwater volcanic crater off Aogashima, contained as much as 275 grams of gold per ton, a figure that is higher than usual for such deposits on land or sea in Japan.

If you think this sounds like the premise for a James Bond flick, well, check out Aogashima Island. It even looks like the lair of a James Bond villain! (Click to embiggen.)


Those blue spots aren't water. They appear to be roofs, Quonset huts and greenhouses, perhaps. Aogashima Island, 222 miles south of Tokyo, is the southernmost inhabited island of the Izu archipelago.


July 07, 2016

The Force Awakens


Star Wars isn't "science fiction." It's medieval fantasy with suits of armor made from extruded ballistic plastic (painted white to make them easier targets, I suppose). Light sabers instead of swords and lasers instead of longbows. (Except the laser bolts move slower than actual arrows.)


Not that there's anything wrong with that. Repurposing old genres makes the topsoil of popular entertainment all the richer. And like McDonald's french fries, when it comes to genre entertainment, the decent low-brow stuff beats the tony high-brow stuff nine times out of ten.

The first Star Wars movie (1977) defined this revised genre. With Irvin Kershner at the helm, The Empire Strikes Back (1980) extended it (it even included a dragon in a cave). Then things went downhill and never recovered. Not even after George Lucas bowed out and laughed all the way to the bank.

Granted, at that stage there was no place to go but up. But so determined was Disney to rekindle some of that now "classic" fairy tale goodness (its specialty, after all), that they made the same movie all over again, only with better CGI and a worse script.

It'd be one thing if they'd made exactly the same movie. But everybody was so familiar with the archetypes that they forgot to fill in the rest. In between each predictable turn of plot, there's supposed to be, you know, a story. And the accompanying material that fashions ongoing character development.

As a result, The Force Awakens ends up a compilation of deus ex machina moments, the characters and their reasons for being there springing into existence out of empty space like subatomic particles.

The original Star Wars has a few of these problems too, though they're not nearly as glaring. For example, Luke demonstrating the skills of an experienced ball turret gunner straight off the literal farm.

In fact, everybody in the Star Wars universe is surprisingly adept at both operating (and sabotaging) complex military hardware they've never seen before. Galaxies long ago and far away must have had the same high school curriculum as Girls und Panzer (in which armored combat is an extracurricular activity).

And the last act of Star Wars is plain silly, suggesting that a couple hundred hours flying VFR in a Piper Cub qualifies a pilot to jump into an F-22 and fly circles around a MiG-29. (See also: Independence Day, but at least the Randy Quaid and Bill Pullman characters had flown military jets before.)

Otherwise, Luke is realistically shown to be the novice that he is, whose best option in a tight situation (again, until the heroic last act) is to run away or hire some muscle. Even after extensive one-on-one training, he is incapable of besting Darth Vader with a light saber in The Empire Strikes Back.

By contrast, the learning curve for any activity, any acquired skill, any knowledge-dependent process in The Force Awakens—from "I've never seen this thing before" to "I can use it as well as a professional"—is about sixty seconds long.

All the more exasperating is that most of these glaring plot holes could have been easily fixed.

1. Finn goes AWOL after ten minutes of doing whatever he was programmed/trained to do since forever.

Well, they certainly don't make stormtroopers like they used to. For such a key character, a bit more substance behind the decision would go a long way to informing us about his character and personality.

Easy fix: Make Finn part of Kylo Ren's detail. Finn is sick and tired of babysitting this whiny kid with anger management issues, has been nicked one too many times during his temper tantrums. Then witnessing Ren's depravity in person punches his ticket to get out of there before he ends up as cannon fodder.

This would also explain why a narcissistic sociopath like Ren would notice who Finn was in the first place, let alone bother to call him a "traitor." Because he knew Finn personally.

2. I read the manual and now I can fly a starship better than an experienced pilot.

"Howling Mad" Murdock on the A-Team could fly anything because he learned how to fly everything. But since the original Star Wars, "The Force" somehow became shorthand for "Hard work, study, and practice is for suckers."

Easy fix: Make Rey (Daisy Ridley) a mechanic when we first meet her. She works for the pawnshop proprietor who owns the ticket on the Millennium Falcon. She's trying to fix it because Han disabled it before hocking it and it won't go FTL, making it worthless. In the meantime, Rey uses it to cart junk around.

One day she spots Finn and BB-8 out in the desert and gives them a ride. When they get back, Han Solo and Chewbacca have shown up to claim their craft. In the middle of arguing about who owns what and who owes whom, the stormtroopers charge in and all hell breaks loose.

3. I didn't even read the manual but just holding a light saber means I can beat a guy with way more experience than me.

It's easy to establish that both Finn and Rey can handle themselves in a fight. But that's not enough. Not after the first three Star Wars movies established the deadly difficulty of light saber fighting.

Easy fix: This was sorta hinted at, but it should be pointed out (by Finn, say) that, sans the Force, Ren can't fight his way out of a brown paper bag. Lazy jerk that he is, he never had to. But now he has to. The question is whether actually applying himself will make him a better man too. Ah, a character arc!

On the other hand, some things are not fixable.

The Death Star was a cool enough concept that, first time out, I could quell the eye rolling. But this predilection to "Do exactly the same thing only bigger" movie after movie is just inane.

And here I thought that Space: 1999 boasted the stupidest SF premise of all time. Supposing that the White Queen in Through the Looking-Glass really could believe six impossible things before breakfast, she couldn't believe this:

Moonbase Alpha is a scientific research colony and watchdog over silos of atomic waste from Earth stored on the Moon's far side. On September 13, 1999, magnetic energy builds to cause an explosive chain-reaction of the waste, blasting the Moon out of Earth orbit and off the plane of the ecliptic, out of the Solar System.

The first Death Star (so sad there's more than one) was the spherical version of the Doomsday Machine from Star Trek. And the Doomsday Machine was huge but not unreasonably sized. But a whole freaking planet on the run? I'd need a space elevator for my suspension of disbelief to go that high.

Also, these super-advanced societies can travel faster than light but can't make a decent circuit breaker. Or make a non-combustible space ship. (Also see Independence Day, but I do give Independence Day credit for setting off a nuke inside the mothership, which would do pretty much as depicted.)

This single-point-of-failure problem extends to the Republic, which hasn't figured out distributed networking either. They need to take lessons from Monty Python on "Not Being Seen."

Speaking of Monty Python, watching Star Wars gets me into a "What have the Romans ever done for us?" frame of mind. The entire argument against the regime du jour is that they're mean. And not very bright, taking a sledgehammer approach (repeatedly) to solving small problems.

These are the kind of people who, lacking a flyswatter, grab a hammer. Now all the windows are broken and the walls are full of holes. With Disney committed to pumping out rebooted Star Wars sequels on a regular basis, turning every conflict into a galactic existential threat will get old fast.

It's already old.

Firefly employed a not-dissimilar premise—big bad bureaucracy against the little guy—with an important difference: our motley crew has a job to do, and overthrowing the Alliance tomorrow isn't anywhere near the top of the list.

Posit instead that the Empire or First Order or whatever rules with a heavy hand but is basically competent. The Republic doesn't want to (and can't) overthrow the whole shebang. It's the Republic of Texas: it'd rather not be part of Mexico anymore (it helped that Santa Anna was not a nice guy or a smart general).

Even if the center could not hold, the result would likely resemble the Warring States period in Japan, which is still producing great story material four centuries later.

The sovereign power wielded by the warlords during the era compares to that of the Italian city-states, with conflicts taking place mostly at the peripheries of their domains, leaving commerce and agriculture largely undisturbed. This, in turn, led to significant economic, cultural and technological growth.

But the lack of central control also produced a veritable queue of claimants to the throne, and great business opportunities for the pirates and mercenaries in (or out of) their employ. The kind of universe in which Han Solo and crew would feel right at home.

Related posts

Attack of the Clones
The Phantom Menace
McKee meets the "Menace"


June 30, 2016

The streaming scythe


What at first appeared to be a full-scale purge of anime at Hulu turned out to be a far less-drastic but systematic cull of low-rated titles. Live-action Japanese television series are pretty much gone altogether (as always, Korean dramas are alive and well).

So what threatened to be a ruthless application of the 80-20 rule ("Twenty percent of inventory accounts for eighty percent of sales") was more the lopping off of the bottom 10 percent. The way CEO Jack Welch once boasted of running General Electric.

I suspect Hulu will be repeating this "Rank-and-Yank" process on a regular basis. In other words, truncate the long tail and concentrate on hits. Or at least the midlisters and up. And let's be clear: including the midlisters and up, Hulu still has a ton of anime.
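In pseudo-catalog terms, the cull is a one-liner of logic: rank titles by rating and lop off the bottom decile. A toy Python sketch, with hypothetical titles and ratings rather than Hulu's actual data or criteria:

```python
# "Rank-and-Yank" in miniature: drop the worst-rated 10 percent.
# Catalog contents are invented for illustration.

catalog = {"Title A": 8.1, "Title B": 6.4, "Title C": 7.7,
           "Title D": 5.2, "Title E": 7.0, "Title F": 4.9,
           "Title G": 8.8, "Title H": 6.9, "Title I": 7.4,
           "Title J": 5.8}

ranked = sorted(catalog, key=catalog.get)   # worst-rated first
cutoff = max(1, len(ranked) // 10)          # bottom 10 percent
culled, kept = ranked[:cutoff], ranked[cutoff:]

print("culled:", culled)                    # culled: ['Title F']
```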

Incidentally, this is why Costco has three times the earnings per employee of Walmart and thus can pay a higher base wage. Costco carries about 4,000 SKUs while Walmart warehouses a staggering 140,000. It costs big bucks to maintain that physical inventory.

When it comes to anime, Hulu wants to be more like Costco. So does Netflix.

Or rather, more like HBO: produce a few shows that capture the cultural zeitgeist and backfill the rest with reruns of standard Hollywood fare. It's about "narrow-casting" to the broadest possible audience. In other words, the subscription model since forever.

Rather than broadcasting a signal to the whole wide world and hoping a few percent of available households tune in, send it instead only to the viewers who already have a vested interest in watching.

As the cable industry has long proved, if you can get subscribers hooked on one or two channels (or even one or two shows) and fiddle with the packages to hide the sunk costs, they'll stick around out of sheer momentum.

In his 2004 treatise on the subject (and 2006 book), Chris Anderson cited Netflix as an example of the long tail in action. Streaming would seem to bolster his argument. Except what Netflix really wants is a heavily curated long tail, one that's not too long.

Justin Fox at Bloomberg confirms that

Today's Netflix and its "brand halo" seem to have a lot more in common with existing TV channels, most obviously HBO, than the back-catalog specialist that it was back in 2006.

I don't think Chris Anderson was wrong about the long tail, simply wrong about it aggregating under one roof, the exception being virtual department store retailers like Amazon and Walmart. But even those behemoths can't stock everything.

The long tail very much exists, except it's been stretched and scattered across all creation. So it takes a bit of dowsing to find the viable concentrations of your particular ore.

One thing remains very true about Anderson's original thesis: going completely digital cuts inventory costs drastically. The marginal costs for adding each additional title or user are close to zero.

Which is why Amazon could build out its existing infrastructure and turn AWS into such a profitable enterprise. And why every new software play must somehow leverage the "cloud." The challenge is what to do with it, how to collect and collate all the content to fill it.

Sure, information wants to be free, but the licensors are still going to charge whatever the market will bear.

And there's no better way to get control over content licensing fees than to produce it in-house. Though as HBO has discovered, getting irrationally exuberant with that approach can lose you your shirt.

Netflix has already been the principal producer on several anime series, and adapted Matayoshi Naoki's award-winning novel Hibana for a series that will be shown in all Netflix markets. Here the advantage goes to streaming over the traditional cable model.

But for those of us not so much interested in the smorgasbord?

The past is prologue and streaming economics hearkens back thirty years when the average middle-class household had a dozen magazine subscriptions (not counting the catalogs). "Big tent" at one extreme, (extremely) specialized at the other.

Such as a newsletter just for the QX-10. My dad subscribed to one of those. Today it'd be a website.

Going forward, the streaming market in the U.S. will probably be left with Crunchyroll, Funimation and maybe Hulu as the major online anime distributors, with Amazon and Netflix providing a generous but more curated catalog.

At the end of the day, though, when everything shakes out, we'll still have orders of magnitude more choices than the bad old days of praying for a single new anime release to show up at Blockbuster.


June 23, 2016

A slice of Japanese life


The "slice-of-life" genre (manga and anime) intersects, but should not be confused with, "slice-of-realistic-life." Bunny Drop gives us a slice of life, but it's not quite "slice-of-life." Rather, it's better described as a family melodrama (quite a good one, in fact).

To put it in Studio Ghibli terms, Only Yesterday is slice-of-realistic-life (another good one). Whisper of the Heart is slice-of-life.


Of course, genre categories always get blurry at the edges. Hanasaku Iroha qualifies as a standard melodrama, replete with character development, a plot, and an ending. But its setting and emphasis on day-to-day life at a rural inn also tips it toward slice-of-life.

More importantly, a slice-of-life story doesn't weigh down the audience with heavy attitudes or a ponderous plot (at least not for long) and goes easy on the "meaning of it all." The tone is upbeat, the characters optimistic. If there are issues, people get over them.

In short, "stuff happens, mostly pleasant." A healthy serving of moe makes it easy on the eyes too. A touch of magical realism and nostalgia calms the nerves, even in the future. Aria and Yokohama Shopping Trip are two classic slice-of-life science fiction series.



As Wikipedia describes Yokohama Shopping Trip,

Whole chapters are devoted to brewing coffee, taking photographs, or repairing a model aircraft engine, sometimes with only a few lines of dialogue. [This emphasis on] the small wonders of everyday life makes the reader aware of their passing. In evoking a nostalgia for this loss, [the author] is following the Japanese aesthetic tradition of mono no aware.

In fact, the stories can be so plotless and meandering as to create a slight remove from reality. But not too far removed from reality, even when fantasy elements dominate the narrative.

Tamako Market is narrated by a talking bird. Kamichu! starts with Yurie getting turned into a Shinto goddess. Gingitsune is about a shrine maiden who can talk to her shrine's fox god. Flying Witch features, well, a flying witch (who, as it turns out, doesn't fly very much).

Geography can also achieve that "slight remove." Flying Witch, Non Non Biyori, and Hanasaku Iroha are based in rural or exurban Japan, while Kamichu! takes place in a fishing village near Kure on the outskirts of Hiroshima, and Barakamon on a small island off the coast of Kyushu.

To the ninety-plus percent of Japan's urban population, these are magical settings that, as with NHK's perennial historical dramas, conjure up feelings of nostalgia for a bygone age that isn't quite yet gone for good in modern Japan.

(Here and here are side-by-side comparisons of the settings in Flying Witch and their real-life counterparts. Somebody at the studio did a lot of location scouting, probably also using the enormously useful Google Street View.)

Though there's nothing wrong with the cities and the suburbs. Consider the ever-popular K-On and Tamako Market (both recognizably made by the same production crew) and Strawberry Marshmallow.

The slice-of-life comedy typically has one live wire to play the boke (funny man) to the rest of the tsukkomi (straight man) and lead our little gang into one (minor) crisis after another. Our boke needn't be a comedienne or ha-ha funny. Quirky will do. It usually does.


Such as Yui, who joins a band when she can't play an instrument (K-On). Or Miu, a bundle of unconstrained kid id (Strawberry Marshmallow). Dera Mochimazzi, the talking bird in Tamako Market, is basically Bob Hope in the "Road" pictures he did with Bing Crosby.

But in all these cases, "real life" (or a close approximation thereof) eventually asserts itself, though with a focus on finding delight in the run-of-the-mill and beauty in the commonplace.

Related links

Aria (Netflix)
Barakamon (Hulu)
Flying Witch (CR)
Gingitsune (Hulu CR)
Hanasaku Iroha (CR)
Kamichu! (Netflix)
K-On (Hulu)
Non Non Biyori (CR)
Strawberry Marshmallow (Amazon)
Tamako Market (Hulu)
