Consider the Lobster

That is the title of a very famous David Foster Wallace piece in Gourmet magazine. It is interesting throughout. For example, this paragraph:

Up until sometime in the 1800s, though, lobster was literally low-class food, eaten only by the poor and institutionalized. Even in the harsh penal environment of early America, some colonies had laws against feeding lobsters to inmates more than once a week because it was thought to be cruel and unusual, like making people eat rats. One reason for their low status was how plentiful lobsters were in old New England. “Unbelievable abundance” is how one source describes the situation, including accounts of Plymouth pilgrims wading out and capturing all they wanted by hand, and of early Boston’s seashore being littered with lobsters after hard storms—these latter were treated as a smelly nuisance and ground up for fertilizer.

But it is most famous because of its ethical argument against eating lobster. In the same way that Hunter S. Thompson was sent to cover the Kentucky Derby for Scanlan’s Monthly 35 years prior and unexpectedly came back with what would become the first ever piece of Gonzo Journalism (“The Kentucky Derby is Decadent and Depraved”), Wallace was sent to cover the Maine Lobster Festival and returned to surprise readers with a thoughtful reflection on the ethics of eating lobster, down to the inner workings of the lobster’s neurological system. Remember again that this was published in Gourmet magazine! Few groups are likely to love lobster more than its readers.

Here is a key passage from Wallace’s piece.

There happen to be two main criteria that most ethicists agree on for determining whether a living creature has the capacity to suffer and so has genuine interests that it may or may not be our moral duty to consider.16 One is how much of the neurological hardware required for pain-experience the animal comes equipped with—nociceptors, prostaglandins, neuronal opioid receptors, etc. The other criterion is whether the animal demonstrates behavior associated with pain. And it takes a lot of intellectual gymnastics and behaviorist hairsplitting not to see struggling, thrashing, and lid-clattering as just such pain-behavior. According to marine zoologists, it usually takes lobsters between 35 and 45 seconds to die in boiling water. (No source I could find talked about how long it takes them to die in superheated steam; one rather hopes it’s faster.)

There are, of course, other fairly common ways to kill your lobster on-site and so achieve maximum freshness. Some cooks’ practice is to drive a sharp heavy knife point-first into a spot just above the midpoint between the lobster’s eyestalks (more or less where the Third Eye is in human foreheads). This is alleged either to kill the lobster instantly or to render it insensate—and is said at least to eliminate the cowardice involved in throwing a creature into boiling water and then fleeing the room. As far as I can tell from talking to proponents of the knife-in-the-head method, the idea is that it’s more violent but ultimately more merciful, plus that a willingness to exert personal agency and accept responsibility for stabbing the lobster’s head honors the lobster somehow and entitles one to eat it. (There’s often a vague sort of Native American spirituality-of-the-hunt flavor to pro-knife arguments.) But the problem with the knife method is basic biology: Lobsters’ nervous systems operate off not one but several ganglia, a.k.a. nerve bundles, which are sort of wired in series and distributed all along the lobster’s underside, from stem to stern. And disabling only the frontal ganglion does not normally result in quick death or unconsciousness. Another alternative is to put the lobster in cold salt water and then very slowly bring it up to a full boil. Cooks who advocate this method are going mostly on the analogy to a frog, which can supposedly be kept from jumping out of a boiling pot by heating the water incrementally. In order to save a lot of research-summarizing, I’ll simply assure you that the analogy between frogs and lobsters turns out not to hold.

Ultimately, the only certain virtues of the home-lobotomy and slow-heating methods are comparative, because there are even worse/crueler ways people prepare lobster. Time-thrifty cooks sometimes microwave them alive (usually after poking several extra vent holes in the carapace, which is a precaution most shellfish-microwavers learn about the hard way). Live dismemberment, on the other hand, is big in Europe: Some chefs cut the lobster in half before cooking; others like to tear off the claws and tail and toss only these parts in the pot.

And there’s more unhappy news respecting suffering-criterion number one. Lobsters don’t have much in the way of eyesight or hearing, but they do have an exquisite tactile sense, one facilitated by hundreds of thousands of tiny hairs that protrude through their carapace. “Thus,” in the words of T.M. Prudden’s industry classic About Lobster, “it is that although encased in what seems a solid, impenetrable armor, the lobster can receive stimuli and impressions from without as readily as if it possessed a soft and delicate skin.” And lobsters do have nociceptors,17 as well as invertebrate versions of the prostaglandins and major neurotransmitters via which our own brains register pain.

Lobsters do not, on the other hand, appear to have the equipment for making or absorbing natural opioids like endorphins and enkephalins, which are what more advanced nervous systems use to try to handle intense pain. From this fact, though, one could conclude either that lobsters are maybe even more vulnerable to pain, since they lack mammalian nervous systems’ built-in analgesia, or, instead, that the absence of natural opioids implies an absence of the really intense pain-sensations that natural opioids are designed to mitigate. I for one can detect a marked upswing in mood as I contemplate this latter possibility: It could be that their lack of endorphin/enkephalin hardware means that lobsters’ raw subjective experience of pain is so radically different from mammals’ that it may not even deserve the term pain. Perhaps lobsters are more like those frontal-lobotomy patients one reads about who report experiencing pain in a totally different way than you and I. These patients evidently do feel physical pain, neurologically speaking, but don’t dislike it—though neither do they like it; it’s more that they feel it but don’t feel anything about it—the point being that the pain is not distressing to them or something they want to get away from. Maybe lobsters, who are also without frontal lobes, are detached from the neurological-registration-of-injury-or-hazard we call pain in just the same way. There is, after all, a difference between (1) pain as a purely neurological event, and (2) actual suffering, which seems crucially to involve an emotional component, an awareness of pain as unpleasant, as something to fear/dislike/want to avoid.

How to Start a Startup Notes

My Cliff’s Notes from Paul Graham’s “How to Start a Startup.”

On the importance of ideas:

What matters is not ideas, but the people who have them. Good people can fix bad ideas, but good ideas can’t save bad people.

On who to sell to:

Start by writing software for smaller companies, because it’s easier to sell to them. It’s worth so much to sell stuff to big companies that the people selling them the crap they currently use spend a lot of time and money to do it. And while you can outhack Oracle with one frontal lobe tied behind your back, you can’t outsell an Oracle salesman. So if you want to win through better technology, aim at smaller customers. [4]

They’re the more strategically valuable part of the market anyway. In technology, the low end always eats the high end. It’s easier to make an inexpensive product more powerful than to make a powerful product cheaper. So the products that start as cheap, simple options tend to gradually grow more powerful till, like water rising in a room, they squash the “high-end” products against the ceiling. Sun did this to mainframes, and Intel is doing it to Sun. Microsoft Word did it to desktop publishing software like Interleaf and Framemaker. Mass-market digital cameras are doing it to the expensive models made for professionals. Avid did it to the manufacturers of specialized video editing systems, and now Apple is doing it to Avid. Henry Ford did it to the car makers that preceded him. If you build the simple, inexpensive option, you’ll not only find it easier to sell at first, but you’ll also be in the best position to conquer the rest of the market.

It’s very dangerous to let anyone fly under you. If you have the cheapest, easiest product, you’ll own the low end. And if you don’t, you’re in the crosshairs of whoever does.

On taking investor money vs. not:

I think it’s wise to take money from investors. To be self-funding, you have to start as a consulting company, and it’s hard to switch from that to a product company.

Financially, a startup is like a pass/fail course. The way to get rich from a startup is to maximize the company’s chances of succeeding, not to maximize the amount of stock you retain. So if you can trade stock for something that improves your odds, it’s probably a smart move.

On choosing a VC:

VCs form a pyramid. At the top are famous ones like Sequoia and Kleiner Perkins, but beneath those are a huge number you’ve never heard of. What they all have in common is that a dollar from them is worth one dollar…Basically, a VC is a source of money. I’d be inclined to go with whoever offered the most money the soonest with the least strings attached.

On spending money:

When and if you get an infusion of real money from investors, what should you do with it? Not spend it, that’s what. In nearly every startup that fails, the proximate cause is running out of money. Usually there is something deeper wrong. But even a proximate cause of death is worth trying hard to avoid.

On being first to market:

I think in most businesses the advantages of being first to market are not so overwhelmingly great

[…]

Since this was the era of “get big fast,” I worried about how small and obscure we were. But in fact we were doing exactly the right thing. Once you get big (in users or employees) it gets hard to change your product. That year was effectively a laboratory for improving our software. By the end of it, we were so far ahead of our competitors that they never had a hope of catching up. And since all the hackers had spent many hours talking to users, we understood online commerce way better than anyone else.

[…]

To make something users love, you have to understand them. And the bigger you are, the harder that is. So I say “get big slow.” The slower you burn through your funding, the more time you have to learn.

Many companies take the advice below too far (as in “we never need to make a profit”):

One of my favorite bumper stickers reads “if the people lead, the leaders will follow.” Paraphrased for the Web, this becomes “get all the users, and the advertisers will follow.” More generally, design your product to please users first, and then think about how to make money from it. If you don’t put users first, you leave a gap for competitors who do.

On maximizing productivity:

The key to productivity is for people to come back to work after dinner. Those hours after the phone stops ringing are by far the best for getting work done. Great things happen when a group of employees go out to dinner together, talk over ideas, and then come back to their offices to implement them. So you want to be in a place where there are a lot of restaurants around, not some dreary office park that’s a wasteland after 6:00 PM. Once a company shifts over into the model where everyone drives home to the suburbs for dinner, however late, you’ve lost something extraordinarily valuable. God help you if you actually start in that mode.

On hiring:

The most important way to not spend money is by not hiring people. I may be an extremist, but I think hiring people is the worst thing a company can do. To start with, people are a recurring expense, which is the worst kind. They also tend to cause you to grow out of your space, and perhaps even move to the sort of uncool office building that will make your software worse. But worst of all, they slow you down: instead of sticking your head in someone’s office and checking out an idea with them, eight people have to have a meeting about it. So the fewer people you can hire, the better.

On lessons learned:

I spent a year working for a software company to pay off my college loans. It was the worst year of my adult life, but I learned, without realizing it at the time, a lot of valuable lessons about the software business. In this case they were mostly negative lessons: don’t have a lot of meetings; don’t have chunks of code that multiple people own; don’t have a sales guy running the company; don’t make a high-end product; don’t let your code get too big; don’t leave finding bugs to QA people; don’t go too long between releases; don’t isolate developers from users; don’t move from Cambridge to Route 128.

On who should start a startup:

If you want to do it, do it. Starting a startup is not the great mystery it seems from outside. It’s not something you have to know about “business” to do. Build something users love, and spend less than you make. How hard is that?

Is Trump Hitler?

Dan McLaughlin, writing for National Review, says no (his piece is about the fronts on which Congressional Republicans should oppose Trump).

But for all of Trump’s authoritarian instincts, he’s not Hitler. Hitler in 1933 was 44, a political fanatic and hardened combat veteran of World War I with a decade’s experience leading a violent street movement full of his fellow veterans. Trump is 70, a political dilettante who’s addicted to cable TV, has spent most of his life making real-estate deals, and commands a political base disproportionately composed of people in their 60s and 70s. Moreover, America is not Weimar Germany, which was then a 15-year-old democracy crumbling amidst hyperinflation, a global Depression, and the loss of a war that killed 13 percent of its military-age men. We have a long history of absorbing and co-opting fringe movements into our remarkably durable two-party system, and that’s exactly what the rest of Republican leadership is trying to do with Trump.

What are People Looking For in White Noise Recordings?

The white noise recording below has nearly 4 million views. But why is “Rain in the Woods Sleep Sounds” so much better than “Rain Showers” (55k views) or “Rain & Thunder Camping” (318k views)? All three recordings came out within a couple of months of one another and are from the same YouTube channel. Doesn’t “Spring Rain with Birds” sound nice? Yet it has only 417k views despite being released two years before “Rain in the Woods Sleep Sounds.”

More confusingly, 7% of the votes on “Rain in the Woods Sleep Sounds” are thumbs down. Not a large percentage, but should it not be 0%? What are you expecting to hear when you click on “Rain in the Woods Sleep Sounds”? Give it a listen. I promise that you will think it sounds exactly like rain in the woods. Thumbs up from me.

Trust and the Blockchain

Automation of trust is illusory.

That is the thesis of this Aeon piece on the failure of Ethereum, a popular blockchain. There are many other points of interest.

How about this question:

Why are people so eager to put their faith in blockchain technology and its human supporters, instead of in other social and economic organisations?

Or this:

What it really exposed was the extent to which trust defines what it is to be human. Trust is about more than making sure I get my orange juice on time. Trust is what makes all relationships meaningful. Yes, we get burned by people we rely on, and this makes us disinclined to trust others. But when our faith is rewarded, it helps us forge closer relationships with others, be they our business partners or BFFs. Risk is a critical component to this bonding process. In a risk-free world, we wouldn’t find anything resembling intimacy, friendship, solidarity or alliance, because nothing would be at stake.

Kenneth Arrow – The Polymath

The great economist Kenneth Arrow has passed away, and the New York Times had a wonderful story about him in its obituary.

Professor Arrow was widely hailed as a polymath, possessing prodigious knowledge of subjects far removed from economics. Eric Maskin, a Harvard economist and fellow Nobel winner, told of a good-natured conspiracy waged by junior faculty to get the better of Professor Arrow, even if artificially. They all agreed to study the breeding habits of gray whales — a suitably abstruse topic — and gathered at an appointed date at a place where Professor Arrow would be sure to visit.

When, as expected, he showed up, they were talking out loud about the theory by a marine biologist — last name, Turner — which purported to explain how gray whales found the same breeding spot year after year. As Professor Maskin recounted the story, “Ken was silent,” and his junior colleagues amused themselves that they had for once bested their formidable professor.

Well, not so fast.

Before leaving, Professor Arrow muttered, “But I thought that Turner’s theory was entirely discredited by Spencer, who showed that the hypothesized homing mechanism couldn’t possibly work.”

Second Life’s Effect on the Disabled

As Fran and Barbara tell it, the more time Fran spent in Second Life, the younger she felt in real life. Watching her avatar hike trails and dance gave her the confidence to try things in the physical world that she hadn’t tried in a half decade — like stepping off a curb or standing up without any help. These were small victories, but they felt significant to Fran.

That is from a fascinating new article on the effect Second Life has had on the disability community. It seems that immersion in online worlds gets a lot of pushback these days, but (at least as this article tells it) immersive virtual worlds can be a major positive influence for those across the disability spectrum.

Here is a short video about Fran, the main character in the story.

What Jane Jacobs Got Right and Wrong

The sad truth is that the saints we revere for thinking for themselves almost always end up thinking by themselves. We are disappointed to find that the self-taught are also self-centered, although a moment’s reflection should tell us that you have to be self-centered to become self-taught. (The more easily instructed are busy brushing their teeth, as pledged.) The independent-minded philosopher-saints are so sure of themselves that they often lose the discipline of any kind of peer review, formal or amateur. They end up opinionated, and alone.

That is from Adam Gopnik’s New Yorker article “Jane Jacobs’s Street Smarts” about what Jacobs got both right and wrong about cities over the course of her career.

There is, of course, a focus on “The Death and Life of Great American Cities,” by far her most famous work.

Two core principles emerge from the book’s delightful and free-flowing observational surface. First, cities are their streets. Streets are not a city’s veins but its neurology, its accumulated intelligence. Second, urban diversity and density reinforce each other in a virtuous circle. The more people there are on the block, the more kinds of shops and social organizations—clubs, broadly put—they demand; and, the more kinds of shops and clubs there are, the more people come to seek them. You can’t have density without producing diversity, and if you have diversity things get dense. The two principles make it plain that any move away from the street—to an encastled arts center or to plaza-and-park housing—is destructive to a city’s health. Jacobs’s idea can be summed up simply: If you don’t build it, they will come.

The book is still relevant today, but not all of it has held up.

Books written in a time of crisis can make bad blueprints for a time of plenty, as polemics made in times of war are not always the best blueprint for policies in times of peace. Jane Jacobs wrote “Death and Life” at a time when it was taken for granted that American cities were riddled with cancer. Endangered then, they are thriving now, with the once abandoned downtowns of Pittsburgh and Philadelphia and even Cleveland blossoming. Our city problems are those of overcharge and hyperabundance—the San Francisco problem, where so many rich young techies have crowded in to enjoy the city’s street ballet that there’s no room left for anyone else to dance.

The old neighborhood is helpless in the face of new pressures, because it had depended on older versions of the same pressures, ones that Jacobs was not entirely willing to name or confront. What kept her street intact was not a mysterious equilibrium of types, or magic folk dancing, but market forces. The butcher and the locksmith on Hudson Street were there because they could make a profit on meat and keys. They weren’t there to dance; they were there to earn. The moment that Mr. Halpert and Mr. Goldstein can’t turn that profit—or that Starbucks and Duane Reade can pay the landlord more—the tempo changes.


On Rudeness

I give [the saleswoman] the dress, and she goes away. I find that I no longer want to be in the shop. I don’t want to try on the dress. I don’t want to take my clothes off or look at myself in a mirror. I consider quietly leaving while the assistant is gone, but the fact that I have caused the dress to be put in the fitting room is too significant. Perhaps it will be transformative after all.

That paragraph connects with me, as I so often experience these little moments of social anxiety myself.

Perhaps surprisingly, that paragraph is from a new and wonderful essay in The New York Times Magazine called “The Age of Rudeness” by Rachel Cusk.

I consider the role that good manners might play in the sphere of rat-eating, and it seems to me an important one. As one who has never been tested, who has never endured famine or war or extremism or even discrimination, and who therefore perhaps does not know whether she is true or false, brave or a coward, selfless or self-serving, righteous or misled, it would be good to have something to navigate by.

There are many, many other points of interest.

…I understood rudeness to be essentially a matter of verbal transgression: It could be defined within the morality of language, without needing to prove itself in a concrete act. A concrete act makes language irrelevant. Once words have been superseded by actions, the time for talking has passed. Rudeness, then, needs to serve as a barrier to action. It is what separates thought from deed; it is the moment when wrongdoing can be identified, in time to stop the wrong from having to occur. Does it follow, then, that a bigoted remark — however ugly to hear — is an important public interface between idea and action? Is rudeness a fundamental aspect of civilization’s immunity, a kind of antibody that is mobilized by the contagious presence of evil?

Or how about:

The liberal elite, as far as I am aware, do not make death threats. Is this because they have better manners? Do they in fact wish that their enemies were dead but would just never say so? And if they do wish it — albeit politely, in the manner of a white lie — is the sin somehow less cardinal for being courteous?

Beautiful sentences throughout:

The moral power of individuality and the poetic power of suffering are the two indispensable components of truth.

Highly recommended.