Saturday, February 28, 2015

The English verb of the Thirty Years War

The verb plunder means “to rob or take by force”. Plunder is often used in the context of warfare: The enemy troops plundered the town.

Plunder comes from the Middle High German word plunderen, which means “to take away household furniture”.

English mercenaries became familiar with the word plunderen during the Thirty Years War of 1618 to 1648, the Catholic vs. Protestant conflict that produced approximately 8 million civilian and military casualties. (The Thirty Years War reduced the civilian population of the German states by some 25% to 40%.)

The anglicized plunder entered common usage in England during the English Civil War of 1642 to 1651. This conflict was not as deadly (probably about 190,000 dead), but it was nonetheless crucial in the development of British parliamentary democracy.

Wednesday, February 25, 2015

Writing: What writers can learn from Ernest Hemingway

Ernest Hemingway is one of the most revered American writers of the twentieth century. Nearly everyone has read at least one of his books. (You were probably forced to read Hemingway in high school, as his novels and short stories are mainstays of high school literature courses.)

While Hemingway is valuable to readers, there is also a lot that you can learn from him as a writer. Hemingway mastered the “short, declarative sentence”—and this is a key characteristic of his style. Hemingway’s novels also provide examples of how the writer can mine his or her life experiences for story ideas. However, this last trait was also one of Hemingway’s limitations, as we’ll see.

The short, declarative sentence

Hemingway was originally a journalist for the Kansas City Star; he later worked as a foreign correspondent in Paris.

His first experiences with professional writing, therefore, involved journalistic nonfiction—a medium that eschews long, complex sentence structures and flowery vocabulary. This training shaped Hemingway’s style as a fiction writer. Hemingway’s style is unadorned and minimalistic. His sentences are usually short, straightforward, and to the point.

This aspect of Hemingway is best explained through example. Below is the opening paragraph of his short story, “In Another Country”:

“In the fall the war was always there, but we did not go to it anymore. It was cold in the fall in Milan, and the dark came very early. Then the electric lights came on, and it was pleasant along the streets looking in the windows. There was much game hanging outside the shops, and the snow powder in the fur of the foxes and the wind blew their tails. The deer hung stiff and heavy and empty, and small birds blew in the wind, and the wind turned their feathers. It was a cold fall and the wind came down from the mountains.”

And below is the opening passage of A Farewell to Arms:

“In the late summer of that year we lived in a house in a village that looked across the river and the plain to the mountains. In the bed of the river there were pebbles and boulders, dry and white in the sun, and the water was clear and swiftly moving and blue in the channels. Troops went by the house and down the road and the dust they raised powdered the leaves of the trees. The trunks of the trees too were dusty and the leaves fell early that year and we saw the troops marching along the road and the dust rising and leaves, stirred by the breeze, falling and the soldiers marching and afterward the road bare and white except for the leaves.
The plain was rich with crops; there were many orchards of fruit trees and beyond the plain the mountains were brown and bare. There was fighting in the mountains and at night we could see the flashes from the artillery. In the dark it was like summer lightning, but the nights were cool and there was not the feeling of a storm coming.”

Some of these sentences are long, of course. But they are also simple and straightforward. Compare the above Hemingway passages to the opening passage of Mrs. Dalloway, Virginia Woolf’s novel of 1925:

“Mrs. Dalloway said she would buy the flowers herself.  
For Lucy had her work cut out for her. The doors would be taken off their hinges; Rumpelmayer's men were coming. And then, thought Clarissa Dalloway, what a morning--fresh as if issued to children on a beach. 
 What a lark! What a plunge! For so it had always seemed to her, when, with a little squeak of the hinges, which she could hear now, she had burst open the French windows and plunged at Bourton into the open air. How fresh, how calm, stiller than this of course, the air was in the early morning; like the flap of a wave; the kiss of a wave; chill and sharp and yet (for a girl of eighteen as she then was) solemn, feeling as she did, standing there at the open window, that something awful was about to happen; looking at the flowers, at the trees with the smoke winding off them and the rooks rising, falling; standing and looking until Peter Walsh said, "Musing among the vegetables?"--was that it?--"I prefer men to cauliflowers"--was that it? He must have said it at breakfast one morning when she had gone out on to the terrace--Peter Walsh. He would be back from India one of these days, June or July, she forgot which, for his letters were awfully dull; it was his sayings one remembered; his eyes, his pocket-knife, his smile, his grumpiness and, when millions of things had utterly vanished--how strange it was!--a few sayings like this about cabbages.”

Mrs. Dalloway was published the year before Hemingway published The Sun Also Rises—his first major novel. Hemingway’s style, therefore, was unique when it first appeared on the early twentieth century literary scene—which was still heavily influenced by the books of the nineteenth century.

Hemingway’s dialogue is similarly terse and unaffected. You will find no long soliloquies or melodrama. Hemingway writes dialogue more or less like people talk.

Hemingway knew how to use a snippet of dialogue to reveal complex subterranean situations and emotions. Consider the following passage of dialogue from “Hills Like White Elephants”.

In this story, a man and a woman are sitting at a table in an outdoor café in Spain, staring at a row of foothills. On the surface they are talking about the hills, but their words suggest an underlying conflict:

"They look like white elephants," she says. 
"I've never seen one," the man says, and drinks his beer. 
"No, you wouldn't have." 
"I might have," the man says. "Just because you say I wouldn't have doesn't prove anything."

There is a lot more going on here than topographical observations; and this turns out to be one of Hemingway’s most emotionally jolting stories. It isn’t very long, but it packs a punch. (Read “Hills Like White Elephants” for yourself, and you’ll see what I mean.)

Why is Hemingway’s work (in particular, his style) so instructive for anyone who wants to write fiction?

The main reason is that most writers love words—often they love them a little too much. At some point, most writers become so enamored with words that their love of words gets in the way of telling the story.

When you sense that you might be falling into that trap, reading a few Hemingway short stories can quickly cure you of it. Hemingway proves that it is possible to tell million-dollar stories without using fifty-cent words.

(I don’t mean to imply, by the way, that it’s a bad idea to develop an extensive vocabulary, or to hone your powers of description. Hemingway’s minimalist style has certainly had its detractors over the years. However, writers (especially those who aspire to write literary fiction) more frequently err in the direction of over-embellishment, rather than excessive minimalism.)

“Write what you know.”

This was one of Hemingway’s dictums. True to his philosophy, Hemingway repeatedly turned the raw experiences of his life into fiction.

As we’ll examine below, Hemingway had a lot of interesting experiences that seemed ready-made for fiction: He put himself in the middle of multiple wars (one as an ambulance driver, and several others as a correspondent). He lived in Paris, and spent time in Madrid and other romanticized European capitals.

This doesn’t imply that Hemingway lacked creativity, mind you. Plenty of men and women have had interesting, dramatic experiences without ever writing a novel. My grandfather experienced naval combat during World War II. Some of his experiences were quite intense—but he spent his productive years as a supervisor in a Ford plant. My grandfather—wonderful man though he was—was no Ernest Hemingway.

Moreover, while some of Hemingway’s fiction involves dramatic physical conflict, not all of it does. One of his best-known short stories, “A Clean, Well-Lighted Place”, is a tale about an old man in a very mundane setting: a café. There is no physical danger, no spies or soldiers, and no femmes fatales. But “A Clean, Well-Lighted Place” is nevertheless a very strong short story, because of the observations it makes about the universal human need for security and familiarity.

Hemingway, then, was an astute observer: He was always on the lookout for “material”.

But while this tendency has an upside, it also has a downside. The downside is that every one of Hemingway’s major works was dependent on something that he had actually experienced. Some of his novels, in fact, contain thinly disguised autobiographical elements.

Hemingway in WWI 

For example:
  • A Farewell to Arms: Hemingway, like the main character in this book, was wounded in WWI. The tragic relationship in the novel mirrors Hemingway’s own unhappy wartime romance with a nurse in a military hospital.
  • For Whom the Bell Tolls: This is a novel about the Spanish Civil War, a conflict that Hemingway was personally involved in as a correspondent.
  • The Old Man and the Sea: Hemingway spent a great deal of time in Cuba, where he often went fishing.
  • The Sun Also Rises: This is a “Lost Generation” tale about American expatriates in Paris during the post-World War I years. Hemingway lived in Paris during this time; and the main character—Jake Barnes—has a personality that is very similar to Hemingway’s. The other characters in the book—Brett Ashley, Mike Campbell and Robert Cohn—have been traced to individuals who hung out with the author during his Paris years.

Hemingway with friends in Paris during the mid-1920s

This reliance on personal experience can work if you have a life like Ernest Hemingway’s. But what if your “day job” consists of processing claims for an insurance company, or working as an accountant?

An absolutist adherence to the principle of “write what you know (and only what you know)” therefore creates severe limitations—for individual authors, and for literature as a whole.

If every writer insisted on writing only from experience, most genre fiction would disappear: There would be no science fiction, and no horror. (Mary Shelley’s novel Frankenstein (1818)—perhaps the first example of modern genre fiction—was based on a dream.) Say goodbye to Stephen King, Ray Bradbury, and Robert Heinlein.

Few crime procedurals are written by ex-cops. Nor are many espionage novels penned by ex-spies. This would mean no Michael Connelly, no Tom Clancy, and no Vince Flynn.

Historical fiction would also be out. No contemporary writer can claim to have “experienced” the distant past. We would therefore lose the work of John Jakes, Edward Rutherfurd, and James Michener.

“Write what you know (and only what you know)” is also physically exhausting (and sometimes dangerous) for the writer who feels compelled to constantly seek out new and exciting experiences to write about. More than one young man volunteered for hazardous military service in WWII, Korea, or Vietnam in the hope of having a Hemingway-like experience that could become the basis of a book. Some of these young men succeeded. (Karl Marlantes’s Matterhorn comes to mind here.) But what about the ones who never lived to write their stories?

Hemingway’s peripatetic lifestyle suggests an ongoing quest for fresh material. Hemingway moved around a lot; and he had a distinct preference for high-testosterone, risky forms of recreation. (Of sports, Hemingway is often credited with saying, “There are only three sports: bullfighting, motor racing, and mountaineering; all the rest are merely games.”)

I recommend that every aspiring writer read Hemingway. He has much to teach you (especially if you find yourself overly enamored with words). It is important, however, to remember that imagination—and not just experience—often becomes the stuff of great fiction.

Tuesday, February 24, 2015

Fiction, research, and nonfiction boondoggles

An article about the link between science and science fiction:

"SF authors do their research. They tend to read widely, to generate ideas, and then think deeply, to focus in on the details. In the age of the author blog, readers can observe (some of) the authorial process. A lot of research can go into a book, much of it hidden, or even discarded. Inferior authors will info-dump every little last detail they’ve discovered; better authors weave their research seamlessly into the story, discarding what doesn’t fit. Sometimes the raw research reappears in footnotes, appendices, or bibliographies, which can be interesting in their own right; for example, Peter Watts’s Blindsight includes a fascinating technical appendix."

I can see both sides of this one.

On one hand, authenticity always enhances credibility.

It doesn't matter whether you’re writing a science fiction novel, or an urban thriller.

A sloppy mistake will always detract from credibility. If you’re writing a novel set in Kentucky, for example, you should take the time to learn that Frankfort is the capital of the Bluegrass State (which is actually structured as a commonwealth, by the way) if such a detail is relevant to your story.

That having been said, not all scientifically detailed science fiction makes for good reading, and a lot of very entertaining science fiction plays fast and loose with the scientific details.

A number of years ago I read Michael Crichton’s novel, State of Fear. (Yes, I realize that hardcore science fiction fans don’t regard Crichton as a science fiction writer, but work with me, please. At the very least, much of Crichton’s work could be described as “science fiction thrillers”, with emphasis on the last word in this description.) 

State of Fear basically represented Crichton’s attempt to debunk global warming alarmism. I don’t want to get into the global warming debate in this post; but suffice it to say that the book overflowed with facts, figures, and no small amount of argumentation.

On a less controversial note, James Michener, the virtuoso of the historical novel, usually managed to “weave the research seamlessly into the story”. But in his 1992 novel, Mexico, Michener goes off on a long tangent (or “info dump”) about the sport of bullfighting. 

Back to Michael Crichton. My favorite Michael Crichton novel, without a doubt, is Timeline. This is the one in which a group of academics travel through time back to medieval France.

Crichton did pay attention to some of the pertinent historical details in Timeline. (For example, he accommodates the fact that residents of medieval France would speak a variety of languages, some of them extinct or nearly extinct in modern times.)

The science of Timeline, however, is basically hokey. You don’t need a PhD in physics to realize that Crichton’s time travel scenario would never stand up to…well, scientific scrutiny.

Here is the crux of the matter: Some fiction writers clearly have nonfiction books lurking inside them. But because of the constraints of the publishing industry, and the need for author branding, it is fairly rare for a writer who is branded as a novelist to publish a book-length work of nonfiction.

And there may be some perfectly valid market reasons for this constraint. Stephen King’s novels all become bestsellers, practically without exception.

However, King’s nonfiction analysis of the horror genre, Danse Macabre, is barely in print.

In fact, unless you’re a Stephen King completist, there is a very good chance that you’ve never even heard of Danse Macabre. You can also be forgiven if you overlooked Faithful: Two Diehard Boston Red Sox Fans Chronicle the Historic 2004 Season. This is a nonfiction book that King co-wrote with Stewart O’Nan, a lesser-known but accomplished literary novelist.

It turns out that Stephen King’s fans want to come to him for horror tales and thrillers, and go elsewhere for sports writing.

Likewise, I’m not sure how a nonfiction Michael Crichton book about global warming would have gone over in the marketplace. Crichton had a medical degree—but he hadn’t practiced medicine for years, and environmental science is of course an entirely different bailiwick.

And we needn’t wonder how many copies a standalone exposition on bullfighting would have sold, even when the late James Michener was at the height of his popularity.

The takeaways:

Fiction is primarily about storytelling. Period. This doesn't mean that an author (of science fiction or any other genre) should play fast and loose with every fact out of laziness. However, the primary function of fiction is to entertain. Instruction and information are what nonfiction is for.

If a novelist wants to write a nonfiction book, he should write a nonfiction book. A nonfiction book shouldn't be embedded in a novel.

Sunday, February 22, 2015

Digital sharecropping and individual “brands” online

Back in 2006, a blogger named Nicholas Carr coined the term “digital sharecropping”:

“One of the fundamental economic characteristics of Web 2.0 is the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few.”

Sonia Simone of Copyblogger further elaborated:

“In other words, anyone can create content on sites like Facebook, but that content effectively belongs to Facebook. The more content we create for free, the more valuable Facebook becomes. We do the work, they reap the profit.”

“Get your own website and hosting account!”

Simone ends her post on digital sharecropping with some advice for avoiding the trap.

Among other steps, she recommends that content creators have their own website addresses and hosting service accounts. This will prevent a catastrophe in the event that Google+ or Facebook goes the way of MySpace.

I don’t use Facebook much for my online presence. (I do have a Facebook account, but it is reserved for personal contacts: old classmates and relatives.) I do, however, blog on Google’s blogging platform, Blogger, and I use YouTube for hosting my videos.

I used to do it Ms. Simone’s way, though. For a long time, I had my own website. In fact, I had several of them.

This was the only option before the Web 2.0 platforms were developed. Way back when (in 2001, I believe), I signed up for my first web hosting account with Interland. I later used GoDaddy.

What I immediately found, however, is that while hosting companies give you land to build on, you still have to build your own house. That means you have to learn all about HTML, JavaScript, server administration, as well as the aesthetics of webpage design.

I often enjoy technical details for their own sake, so I did jump into these topics. I also became proficient with the now defunct Microsoft FrontPage, and moderately proficient with Dreamweaver.

What I found, however, was that I was soon spending more time on webpage development and administration than on writing.

This problem became especially acute after Microsoft discontinued the relatively user-friendly FrontPage, and after Dreamweaver began forcing users to build every page with cascading style sheets (CSS) instead of its far more intuitive WYSIWYG editor.

That was back in the mid- to late-2000s. I understand that intermediate solutions are now available, namely in the form of WordPress templates. And I may migrate to something like that in the future. However, even this is not a panacea. Why not?

Because web hosting companies go out of business, too.

Yes, if a web hosting company goes out of business, you’ll still have your domain name, but you’ll lose all of the hosted content that isn’t backed up.
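For what it’s worth, the backup habit doesn’t need to be elaborate. Here is a minimal sketch in Python, assuming (hypothetically) that you keep a working copy of your site in a local folder called site; both folder names are stand-ins, not anything from a real hosting setup:

```python
import tarfile
from datetime import date
from pathlib import Path

def back_up_site(site_dir="site", backup_dir="backups"):
    """Write a dated .tar.gz archive of site_dir into backup_dir."""
    # "site" and "backups" are hypothetical folder names for this sketch.
    Path(backup_dir).mkdir(exist_ok=True)
    archive = Path(backup_dir) / f"site-{date.today():%Y%m%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(site_dir)  # archives the folder recursively
    return archive
```

Run that on a schedule and a hosting company’s disappearance costs you nothing but the inconvenience of re-uploading.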

Not only do hosting companies go out of business, they often screw things up. One of my hosting companies once disabled 60% of the internal links on my site when it implemented an unannounced and poorly executed server change. I had no choice but to kill an entire weekend recreating the links.

Web hosting has become a low-margin, high-volume business. As a result, many hosting companies rely on low-paid technical support services in India and the Philippines. Very often they can only be reached by email. Good luck with that.

But what about content creator “branding”?

A writer or other content creator can be effectively branded on a site like Facebook, YouTube, Twitter, or Blogger.

This is evidenced by the fact that Twitter followers, blog pageviews, and YouTube subscribers are by no means evenly distributed. JA Konrath and I both use the Google Blogger platform, but his blog receives more pageviews than mine does, because JA Konrath the writer is far better known than yours truly is.

But technically, our blogs are more or less the same.

As a content creator, your brand is your name and the content you provide. It really doesn't matter whether your web presence is hosted by Google, YouTube (a Google company, incidentally) or a private hosting account with GoDaddy.

And while domain names are important, they aren’t as important as they used to be. If someone wants to find a particular personality online, their first inclination nowadays is to Google that person’s name. (Web addresses were a lot more crucial when search engines were less accurate and efficient, circa 1998 to 2001.)

This doesn't mean that you should ignore the vulnerability of Web 2.0 platforms. It is important to have all content backed up.

Before I upload anything to my blog (with the exception of the most ephemeral posts), I compose the content in Microsoft Word. This means that I have backups. If Google were to discontinue Blogger tomorrow, I would be able to upload all of my content somewhere else (like, say, an independently hosted account with WordPress templates).
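If it ever came to that, the migration itself would be mechanical. Blogger’s export feature produces a standard Atom XML file, and pulling the posts back out of one takes only a few lines of Python. (The sample feed below is a made-up stand-in, not a real export, which would also carry comments and other entries.)

```python
import xml.etree.ElementTree as ET

# Atom namespace prefix used by Blogger-style export files.
ATOM = "{http://www.w3.org/2005/Atom}"

def extract_posts(xml_text):
    """Return (title, body) pairs for every entry in an Atom feed."""
    root = ET.fromstring(xml_text)
    posts = []
    for entry in root.iter(f"{ATOM}entry"):
        title = entry.findtext(f"{ATOM}title", default="")
        body = entry.findtext(f"{ATOM}content", default="")
        posts.append((title, body))
    return posts

# A tiny hypothetical feed, standing in for a real export file.
sample = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>Plunder</title><content>The verb plunder...</content></entry>
</feed>"""
print(extract_posts(sample))  # → [('Plunder', 'The verb plunder...')]
```

From there, the extracted posts could be fed into whatever platform comes next.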

So is there, in fact, “digital sharecropping”? And what about Wikipedia?

Well, yes: Technically speaking, anyone who writes for Wikipedia (where individual authors are not credited) is a digital sharecropper.

I suspect that this is the real reason why Jimmy Wales has never taken the logical step of funding the site through advertising. Ad revenue would immediately bring up the uncomfortable and inconvenient digital sharecropping issue, as all of those ad revenues would have to be somehow distributed. So instead Wikipedia annually engages in a far less efficient and less effective campaign for voluntary donations.

I understand that Wikipedia writers (called “Wikipedians”) aren’t in it for the money or the individual recognition to begin with. (So no—you don’t have to email me and tell me that.) But Wikipedians are, in fact, unpaid contributors for an enterprise in which they have no immediate monetary or reputational interest. At the very least, Wikipedians help Wales (who is worth a respectable $1 million) to secure paid speaking gigs. Most Wikipedians have to pay their bills with far less glamorous and anonymous day jobs. So there is a degree of digital sharecropping going on there—even if it’s relatively small-scale and only moderately exploitive.

And let’s not forget all those blog commenters!

I don’t allow comments at my blog, and I feel absolutely no compunction about this. To begin with, I have no double standard: In all my time on the Internet, I’ve probably posted no more than half a dozen comments on other people’s blogs. It is something that I do very, very rarely. Almost never, in fact.

If I feel that I have a brilliant point to make, I would much rather make that point here.

Likewise, if you have a brilliant rebuttal, segue, or point of agreement with one of my posts, I would much rather you make that point on your own Twitter, Facebook, or Blogger account.

Nothing induces more guilt in your host’s heart than seeing an insightful, three-paragraph comment from a reader—which I won’t be able to address because I’ve already moved on to something else.

*    *    *

There is, of course, one exception I should mention: the myriad bloggers and content creators who post only a few videos or blog posts to sites like YouTube or Blogger. These are the folks who upload one cat-on-a-skateboard YouTube video that briefly goes viral, or one confessional blog post that makes a small splash for a short time.

These users will see little to no personal benefit, but they have little invested in terms of time or effort.

It is also true that Google, YouTube, Twitter, etc. reap benefits from these individuals in the aggregate. But Google, YouTube, Twitter, etc. also provide opportunities for individuals who don’t want to be full-time content creators, but occasionally want to be heard in the public square without making the large individual investment that would otherwise be needed to create their own platforms. When you think about it, this represents a fair tradeoff.

So what’s the bottom line? Digital sharecropping is a real issue—but only if you neglect to brand yourself, or deliberately choose to create content in an anonymous venue (such as Wikipedia).

Google, Blogger, Twitter, YouTube, WordPress etc. are merely tools for content creation. These websites and technologies aren’t the content itself. And while discussion of their individual merits and demerits is worthwhile, at the end of the day it is the content—not the platform—that is most likely to determine an individual creator’s success or failure.

Saturday, February 21, 2015

Russell Blake, subscription services, and the debate about “free”

Russell Blake, like a lot of authors, is decidedly pessimistic about subscription services like Kindle Unlimited:

"Subscription services will make it much harder to sell books. The voracious readers who are most likely to try an indie with a “WTF purchase” will instead tend to borrow instead of buy. This will result in drastic reductions in author take-home pay, all assurances of “increased exposure” aside. A whole group of readers are being conditioned to believe that books have little or no value/should be free/should only be read if virtually free. This will continue. For an idea of where this progression ends, look at music. Musicians can’t earn decent money anymore by having a hit, or even several hits. The economic model is broken in such a way that the artist sees virtually nothing, with the intermediary company that enables the download taking the lion’s share of the revenue. Musicians now earn their livings by touring, by selling merchandise (shirts, hats, etc.), by selling virtually anything but music. Alas, authors don’t have the option of filling coliseums at $50 a ticket or being cool or mainstream enough to hawk $22 concert T-shirts with their likenesses on them, so expect things to get much harder."

Blake is right—if (and only if) all books are forced into the subscription model stream.

But this is unlikely to happen, because writers will simply begin to withhold their work from such services. (As I noted earlier, musicians are already pushing back against the subscription model.) Writers are well connected with each other online, and it wouldn't take long for such a revolt to ensue should Amazon (or anyone else) attempt to mandate an exclusively subscription-based model.

And let’s not forget about the corporate publishers: They are unlikely to agree to a business model that basically gives away all of their product.

That said, even well known writers can benefit by making their work free (or practically free) in specific, limited circumstances.

The important factor here is strategy; and there are models to be found in other industries.

Nearly every consumer goods manufacturer gives away samples. (Haven’t you ever received a free sample of toothpaste or detergent in the mail?) These companies all experiment with loss leaders and the like.

The key distinction is that Procter & Gamble or Pepsi would never give away everything for free. Beyond a certain point, you have to pay them.

And herein lies the difference between a writer who strategically uses “free” as part of an overall marketing plan, and an insecure, approval-seeking writer who is eager for any scrap of validation.

Wednesday, February 18, 2015

Classic fiction about Japan

No, James Clavell’s novel Shogun isn’t the latest and greatest bestseller. It isn’t trending on Twitter. Oprah isn’t talking about it.

Shogun is, however, probably the best historical novel about Japan ever written.

I remember discovering Shogun in the late 1980s, when I was learning the Japanese language and just diving into Japanese history. I was “blown away” by this book.

I’ve now read Shogun several times, to the point where I take its plotline for granted. But if you haven’t read Shogun, you owe yourself this experience.

The book is set in Japan around the year 1600. The setup is this: An English sea captain named John Blackthorne is shipwrecked on the Japanese islands.

Blackthorne struggles to learn the Japanese language (a challenge that I was going through myself at the time, as I mention above.) He becomes involved in the battle to unify Japan under a single military ruler, or shogun.

This is a novel, not a history text. Nevertheless, the late James Clavell did include a sizeable chunk of authentic (or semi-authentic) history for those readers who are interested.

The political/military plot is loosely based on the actual wars of unification that took place in Japan at that time. The character of John Blackthorne also has a basis in history. The real-life model for Blackthorne was an English navigator named William Adams, known in Japanese as Miura Anjin.

Tuesday, February 17, 2015

Greg Iles, and the risk-aversion of the publishing industry

It is one thing for the publishing industry to be risk averse when assessing the work of Mr. NoName Author. But what about Greg Iles?

Mississippi author Greg Iles’s best-selling novel Natchez Burning—all 790 pages of it—is just the first book in an expansive trilogy about race and retribution in America.

As Iles relates, even he has struggled with an industry that wants to throw all of its weight behind a smaller number of guaranteed blockbusters.

After taking time off due to an accident that nearly claimed his life, Iles had difficulty securing industry support for a new project a few years ago:

I had to find people willing to go with me on this journey. Within six months I no longer had the same agent or publisher. People are used to the idea of trilogies because of the success of these YA series. But in mainstream fiction you don't see three novels, each of 200,000 words.  
It was a huge risk and nobody knew what would happen. But we just bet the farm on it. Then a couple of reviews came in that were really good.  
And then Ken Follett Tweeted that Natchez Burning was the “best thriller in years.”  
Not of the year, but in years. Then I thought, you know, maybe something’s about to happen here.

Several points are worth noting in closing:

First of all, Iles has always gone off in new directions. His first two bestsellers were World War II espionage novels. Then he wrote several (also bestselling) standalone thrillers.

Then he began writing a series set in Natchez, Mississippi. Since at least 2009, practically all of his fiction has been set there.

Honestly speaking, I preferred his earlier (pre-Natchez) work. (IMO, Iles peaked with 24 Hours and Sleep No More.) And while I appreciate Iles’ desire to produce more message-based fiction, he is hardly the first author who has set out to Say Something Significant about Race Relations in America. Arguably, that ground has been trodden and over-trodden a bit too much in recent years—especially by progressively minded white males.

Finally, even a big-name author like Iles acknowledges the importance of social media nowadays. A very few authors are so big, and so enshrined in household-name status, that they can probably afford to ignore social media altogether. (Stephen King comes to mind here.) But most can’t. The field is simply too crowded and competitive.

Monday, February 16, 2015

Is the Amazon series ‘Bosch’ authentic?

I’ve watched the Amazon series. While Bosch is a worthwhile series in its own right, it does take some notable departures from the Harry Bosch novels of Michael Connelly.

Perhaps the most significant departure is generational. The Harry Bosch of Michael Connelly’s novels is a 60-something Baby Boomer and a Vietnam vet.

The Harry Bosch of the Amazon series is a much younger man—a 40-something veteran of the recent military engagements in the Middle East.

I can understand why this decision was made. The Harry Bosch (fiction) series is now as mature as its lead character. When Michael Connelly wrote the first Bosch novel in the early 1990s, Harry Bosch was a relatively young man in the prime of early middle age.

Michael Connelly has clearly had to struggle with Harry’s age in recent novels in the series. He now has Bosch working for the LAPD on a post-retirement extended contract; and I’ve speculated that he may be grooming Bosch’s daughter Maddie to take over the “family business”.

On the other hand, the Amazon series is brand new, and the producers didn't want to begin a new series with a character who is already past retirement age. Again, perfectly understandable.

And the Amazon series is, once again, pretty darn good. But the Bosch of Bosch isn’t the Bosch of Michael Connelly’s novels, though they could plausibly be related.

The Americans: an accessible spy series

As regular readers will know, I much prefer books to television. But I will gladly put away my books for an hour each week to watch the FX Cold War-era spy series, The Americans.

The Americans has all the expected espionage tropes: assassinations, ruthless secret agents, and fancy gadgets (even though it’s set in the 1980s).

At root, however, The Americans is about family life and marriage; and this, I believe, is why the series has become so popular. Very few of us can relate to James Bond, or Alec Leamas of The Spy Who Came in from the Cold. But you will see yourself (or perhaps someone you live with) in the characters on The Americans.

Sunday, February 15, 2015

What made H.P. Lovecraft unique?

H.P. Lovecraft was a bundle of unusual, somewhat conflicting traits:

"The period during which weird fiction arose was unique, and Lovecraft and his fiction represented that uniqueness. Lovecraft was a patrician: a man born out of his time who lived his life as an English gentleman. At the same time, he was a rationalist, an atheist, and a learned man of scientific bent, with a love for astronomy and the hard sciences. These seemingly incongruous worldviews, rather than hindering him creatively, were what made him so powerful a writer."

Many readers may be surprised to learn that Lovecraft was an atheist.

However, Lovecraft was not one of those atheists who waxes hubristic about the power of human reason. He was no Carl Sagan or Ayn Rand. Quite the contrary, in fact:

"Prior to Lovecraft, the gods were looking out for us. Literary heroes were likely to survive because the gods were benevolent. Lovecraft changed that – he wrote of the indifference of the cosmos and the insignificance of man. Despite several thousand years of religious belief and the inherent hubris of humanity, Lovecraft posited that humankind, instead of being unique and the masters of all we see, was, in fact, insignificant when compared to the backdrop of the larger universe.  
Religious writings have argued that we are the center of the universe, but science has argued otherwise, and Lovecraft’s fiction falls squarely on the side of science. To Lovecraft, we are not the center of the universe; our impact on a cold and unforgiving universe is infinitesimal."

Lovecraft’s fiction, in fact, doesn't offer much of a moral compass, aside from the cautionary dictum: "Watch out for the Old Ones."

Don't get me wrong: Lovecraft's stories are entertaining (to a point). But their basically nihilistic view of the universe and humankind also limits their scope. 

Lovecraft may have been an interesting author. Few would describe him as an inspiring one.

Genre fiction: escapist or ‘morally complex’?

Jonathan Franzen has two occupations.

He is, first of all, a pretty decent literary novelist (if not a very prolific one). I enjoyed The Corrections immensely. I also liked his later novel, Freedom (though not quite as much).

However, Jonathan Franzen is also a professional snob, as indicated by his recent remarks concerning genre fiction and the people who read it.

"Most of what people read, if you go to the bookshelf in the airport convenience store and look at what’s there, even if it doesn’t have a YA on the spine, is YA in its moral simplicity. People don’t want moral complexity. Moral complexity is a luxury. You might be forced to read it in school, but a lot of people have hard lives. They come home at the end of the day, they feel they’ve been jerked around by the world yet again for another day. The last thing they want to do is read Alice Munro, who is always pointing toward the possibility that you’re not the heroic figure you think of yourself as, that you might be the very dubious figure that other people think of you as. That’s the last thing you’d want if you’ve had a hard day. You want to be told good people are good, bad people are bad, and love conquers all. And love is more important than money. You know, all these schmaltzy tropes." 

Personally, I’m glad just to see people reading, versus watching professional sports, listening to rap music, or tuning in to cat skateboarding videos on YouTube.

Moreover, it is fundamentally wrongheaded to dismiss all genre fiction as morally simplistic, as Sarah Seltzer discusses over at Flavorwire.

One must be wary of over-generalizing. No one is going to make the case that Clive Cussler’s novels are anything but escapist entertainment. That is, in fact, all they are.

However, crime fiction (read some of Michael Connelly’s Harry Bosch novels) is often riddled with philosophical ambiguities and moral complexity.

At the other end of the spectrum, not all literary fiction is as profound as its advocates claim it to be. A number of reviewers panned Haruki Murakami’s bloated literary novel, 1Q84, as “overhyped” and filled with pointless navel gazing.

After struggling in vain to get through 1Q84, I tended to agree….

Presidential deals on Amazon

In honor of President's Day tomorrow: several good Kindle deals on presidential biographies:

The following two books are only $2.99 on Amazon Kindle for a limited time:

What’s next for Nelson DeMille?

I recently reviewed a Nelson DeMille short story at this site.

My favorite Nelson DeMille novel is probably Up Country, which is about a Vietnam vet who returns to Vietnam to solve an old crime.

The horror-science fiction connection

Horror and science fiction are fundamentally different, of course.

Horror takes a fearful approach toward the unknown, whereas science fiction generally depicts the unknown more optimistically—or at least as something that humankind can overcome through the very human power of reason.

This is the key difference. There are spaceships in the movie Alien, but Alien views extraterrestrial contact as fundamentally frightening. The tone, the mood, and the horrified reactions of the protagonists in Alien all scream ‘horror movie!’ There is very little of the calm, “We can handle this” attitude that one sees in classic science fiction like Star Trek. 

Nevertheless, there was horror before there was science fiction. And as the article hyperlinked below explains, the earliest works of science fiction were arguably derived from the horror genre:

Frankenstein’s monster stands alongside the sharp-toothed Dracula as one of literature’s most iconic figures. His gruesome visage has appeared in numerous popular representations and is a Halloween mainstay. So popular is the image of the shambling monster with metal bolts in his neck that it has all but broken away from its original incarnation in Mary Shelley’s seminal work…Yet ‘Frankenstein’ remains not only a pillar of the horror genre but one of the earliest works of science fiction and a fantastic reflection on the suffering of existence.