Category Archives: Anthropology

H. L. Mencken on aristocracy in America

H. L. Mencken is a glorious weirdo. My best attempt to describe him would be “Mark Twain on a Nietzsche bender.” Here’s a sample of his personality:

“American Culture”

From THE NATIONAL LETTERS, PREJUDICES: SECOND SERIES, 1920, pp. 65–78. First printed in the Yale Review, June, 1920, pp. 804–17

The capital defect in the culture of These States is the lack of a civilized aristocracy, secure in its position, animated by an intelligent curiosity, skeptical of all facile generalizations, superior to the sentimentality of the mob, and delighting in the battle of ideas for its own sake. The word I use, despite the qualifying adjective, has got itself meanings, of course, that I by no means intend to convey. Any mention of an aristocracy, to a public fed upon democratic fustian, is bound to bring up images of stockbrokers’ wives lolling obscenely in opera boxes, or of haughty Englishmen slaughtering whole generations of grouse in an inordinate and incomprehensible manner, or of bogus counts coming over to work their magic upon the daughters of breakfast-food and bathtub kings. This misconception belongs to the general American tradition. Its depth and extent are constantly revealed by the naïve assumption that the so-called fashionable folk of the large cities—chiefly wealthy industrials in the interior-decorator and country-club stage of culture—constitute an aristocracy, and by the scarcely less remarkable assumption that the peerage of England is identical with the gentry—that is, that such men as Lord Northcliffe, Lord Riddell and even Lord Reading were English gentlemen.

Here, as always, the worshiper is the father of the gods, and no less when they are evil than when they are benign. The inferior man must find himself superiors, that he may marvel at his political equality with them, and in the absence of recognizable superiors de facto he creates superiors de jure. The sublime principle of one man, one vote must be translated into terms of dollars, diamonds, fashionable intelligence; the equality of all men before the law must have clear and dramatic proofs. Sometimes, perhaps, the thing goes further and is more subtle. The inferior man needs an aristocracy to demonstrate, not only his mere equality, but also his actual superiority. The society columns in the newspapers may have some such origin. They may visualize once more the accomplished journalist’s understanding of the mob mind that he plays upon so skillfully, as upon some immense and cacophonous organ, always going fortissimo. What the inferior man and his wife see in the sinister revels of those brummagem first families, I suspect, is often a massive witness to their own higher rectitude—in brief, to their firmer grasp upon the immutable axioms of Christian virtue, the one sound boast of the nether nine-tenths of humanity in every land under the cross.

But this bugaboo aristocracy is actually bogus, and the evidence of its bogusness lies in the fact that it is insecure. One gets into it only onerously, but out of it very easily. Entrance is effected by dint of a long and bitter struggle, and the chief incidents of that struggle are almost intolerable humiliations. The aspirant must school and steel himself to sniffs and sneers; he must see the door slammed upon him a hundred times before ever it is thrown open to him. To get in at all he must show a talent for abasement—and abasement makes him timorous. Worse, that timorousness is not cured when he succeeds at last. On the contrary, it is made even more tremulous, for what he faces within the gates is a scheme of things made up almost wholly of harsh and often unintelligible taboos, and the penalty for violating even the least of them is swift and disastrous. He must exhibit exactly the right social habits, appetites and prejudices, public and private. He must harbor exactly the right enthusiasms and indignations. He must have a hearty taste for exactly the right sports and games. His attitude toward the fine arts must be properly tolerant and yet not a shade too eager. He must read and like exactly the right books, pamphlets and public journals. He must put up at the right hotels when he travels. His wife must patronize the right milliners. He himself must stick to the right haberdashery. He must live in the right neighborhood. He must even embrace the right doctrines of religion. It would ruin him, for all society column purposes, to move to Union Hill, N. J., or to drink coffee from his saucer, or to marry a chambermaid with a gold tooth, or to join the Seventh Day Adventists. Within the boundaries of his curious order he is worse fettered than a monk in a cell. Its obscure conception of propriety, its nebulous notion that this or that is honorable, hampers him in every direction, and very narrowly.
What he resigns when he enters, even when he makes his first deprecating knock at the door, is every right to attack the ideas that happen to prevail within. Such as they are, he must accept them without question. And as they shift and change he must shift and change with them, silently and quickly.

Obviously, that order cannot constitute a genuine aristocracy, in any rational sense. A genuine aristocracy is grounded upon very much different principles. Its first and most salient character is its interior security, and the chief visible evidence of that security is the freedom that goes with it—not only freedom in act, the divine right of the aristocrat to do what he damn well pleases, so long as he does not violate the primary guarantees and obligations of his class, but also, and more importantly, freedom in thought, the liberty to try and err, the right to be his own man. It is the instinct of a true aristocracy, not to punish eccentricity by expulsion, but to throw a mantle of protection about it—to safeguard it from the suspicions and resentments of the lower orders. Those lower orders are inert, timid, inhospitable to ideas, hostile to changes, faithful to a few maudlin superstitions. All progress goes on on the higher levels. It is there that salient personalities, made secure by artificial immunities, may oscillate most widely from the normal track. It is within that entrenched fold, out of reach of the immemorial certainties of the mob, that extraordinary men of the lower orders may find their city of refuge, and breathe a clear air. This, indeed, is at once the hall-mark and the justification of a genuine aristocracy—that it is beyond responsibility to the general masses of men, and hence superior to both their degraded longings and their no less degraded aversions. It is nothing if it is not autonomous, curious, venturesome, courageous, and everything if it is. It is the custodian of the qualities that make for change and experiment; it is the class that organizes danger to the service of the race; it pays for its high prerogatives by standing in the forefront of the fray.

No such aristocracy, it must be plain, is now on view in the United States. The makings of one were visible in the Virginia of the Eighteenth Century, but with Jefferson and Washington the promise died. In New England, it seems to me, there was never anything of the sort, either in being or in nascency: there was only a theocracy that degenerated very quickly into a plutocracy on the one hand and a caste of sterile pedants on the other—the passion for God splitting into a lust for dollars and a weakness for mere words. Despite the common notion to the contrary—a notion generated by confusing literacy with intelligence—the New England of the great days never showed any genuine enthusiasm for ideas. It began its history as a slaughterhouse of ideas, and it is today not easily distinguishable from a cold-storage plant. Its celebrated adventures in mysticism, once apparently so bold and significant, are now seen to have been little more than an elaborate hocus-pocus—respectable Unitarians shocking the peasantry and scaring the horned cattle in the fields by masquerading in the robes of Rosicrucians. The notions that it embraced in those austere and far-off days were stale, and when it had finished with them they were dead. So in politics. Since the Civil War it has produced fewer political ideas, as political ideas run in the Republic, than any average county in Kansas or Nebraska. Appomattox seemed to be a victory for New England idealism. It was actually a victory for the New England plutocracy, and that plutocracy has dominated thought above the Housatonic ever since. The sect of professional idealists has so far dwindled that it has ceased to be of any importance, even as an opposition. When the plutocracy is challenged now, it is challenged by the proletariat.

Well, what is on view in New England is on view in all other parts of the nation, sometimes with ameliorations, but usually with the colors merely exaggerated. What one beholds, sweeping the eye over the land, is a culture that, like the national literature, is in three layers—the plutocracy on top, a vast mass of undifferentiated human blanks bossed by demagogues at the bottom, and a forlorn intelligentsia gasping out a precarious life between. I need not set out at any length, I hope, the intellectual deficiencies of the plutocracy—its utter failure to show anything even remotely resembling the makings of an aristocracy. It is badly educated, it is stupid, it is full of low-caste superstitions and indignations, it is without decent traditions or informing vision; above all, it is extraordinarily lacking in the most elemental independence and courage. Out of this class comes the grotesque fashionable society of our big towns, already described. It shows all the stigmata of inferiority—moral certainty, cruelty, suspicion of ideas, fear. Never does it function more revealingly than in the recurrent pogroms against radicalism, i.e., against humorless persons who, like Andrew Jackson, take the platitudes of democracy seriously. And what is the theory at the bottom of all these proceedings? So far as it can be reduced to comprehensible terms it is much less a theory than a fear—a shivering, idiotic, discreditable fear of a mere banshee—an overpowering, paralyzing dread that some extra-eloquent Red, permitted to emit his balderdash unwhipped, may eventually convert a couple of courageous men, and that the courageous men, filled with indignation against the plutocracy, may take to the highroad, burn down a nail-factory or two, and slit the throat of some virtuous profiteer.

Obviously, it is out of reason to look for any hospitality to ideas in a class so extravagantly fearful of even the most palpably absurd of them. Its philosophy is firmly grounded upon the thesis that the existing order must stand forever free from attack, and not only from attack, but also from mere academic criticism, and its ethics are as firmly grounded upon the thesis that every attempt at any such criticism is a proof of moral turpitude. Within its own ranks, protected by what may be regarded as the privilege of the order, there is nothing to take the place of this criticism. In other countries the plutocracy has often produced men of reflective and analytical habit, eager to rationalize its instincts and to bring it into some sort of relationship to the main streams of human thought. The case of David Ricardo at once comes to mind, and there have been many others: John Bright, Richard Cobden, George Grote. But in the United States no such phenomenon has been visible. Nor has the plutocracy ever fostered an inquiring spirit among its intellectual valets and footmen, which is to say, among the gentlemen who compose headlines and leading articles for its newspapers. What chiefly distinguishes the daily press of the United States from the press of all other countries pretending to culture is not its lack of truthfulness or even its lack of dignity and honor, for these deficiencies are common to newspapers everywhere, but its incurable fear of ideas, its constant effort to evade the discussion of fundamentals by translating all issues into a few elemental fears, its incessant reduction of all reflection to mere emotion. It is, in the true sense, never well-informed. It is seldom intelligent, save in the arts of the mob-master. It is never courageously honest. Held harshly to a rigid correctness of opinion, it sinks rapidly into formalism and feebleness. Its yellow section is perhaps its best section, for there the only vestige of the old free journalist survives. 
In the more respectable papers one finds only a timid and petulant animosity to all questioning of the existing order, however urbane and sincere—a pervasive and ill-concealed dread that the mob now heated up against the orthodox hobgoblins may suddenly begin to unearth hobgoblins of its own, and so run amok.

For it is upon the emotions of the mob, of course, that the whole comedy is played. Theoretically, the mob is the repository of all political wisdom and virtue; actually, it is the ultimate source of all political power. Even the plutocracy cannot make war upon it openly, or forget the least of its weaknesses. The business of keeping it in order must be done discreetly, warily, with delicate technique. In the main that business consists in keeping alive its deep-seated fears—of strange faces, of unfamiliar ideas, of unhackneyed gestures, of untested liberties and responsibilities. The one permanent emotion of the inferior man, as of all the simpler mammals, is fear—fear of the unknown, the complex, the inexplicable. What he wants beyond everything else is security. His instincts incline him toward a society so organized that it will protect him at all hazards, and not only against perils to his hide but also against assaults upon his mind—against the need to grapple with unaccustomed problems, to weigh ideas, to think things out for himself, to scrutinize the platitudes upon which his everyday thinking is based.

One thing I love about this essay is that it illuminates how liberalism must protect all marginal persons from the tyranny of the majority. Not only the downtrodden, but also, and perhaps more importantly, the uptrodden. Because the uptrodden freak class is the most reliably fertile ground for fruitionist epiphany.

My cultural assimilation

When I entered the work world, I had to abandon many of the cultural habits I’d acquired as a youth growing up weird in rural and semi-urban South Carolina.

Many of us in my social circle had developed a sort of subversive irony and had woven it into our personal styles, manners and subcultural customs. In everything we did and said, we signaled “I only work here.” If we were made to put on a suit and act straight, we wanted our act to be unconvincing: “This is not me.”

We saw everyone who tried to assimilate and achieve as sell-out phonies, and felt that adopting any externally imposed etiquette or joining in any shared effort was beneath our dignity. We were proud not to belong.

After years of professional cultural assimilation, looking back I realize most of this worldview was just a punk-mutated form of standard working class attitudes — devices used to insulate and protect an individual’s dignity from the degradation of low-paying, low-autonomy jobs. My own family history straddles classes, and I believe I got a pretty strong dose of working class attitude as a kid, enough that I found well-adjusted, classier kids uninteresting and unfit for friendship.

Basically, in becoming professional, first through incredibly awkward attempts at code-switching, then later through genuine internalization, I learned a couple of really important things I never could have learned without undergoing this incredibly uncomfortable, occasionally depressing ordeal.

1) We cannot thrive in institutions we secretly despise. If we withhold ourselves, preserve our alienation, participate with reluctance and wear our membership like a mask, instead of figuring out some mode where we can be who we really are within the necessary constraints of social existence, our withholding is palpable to peers and leaders. If you are half-in and half-out, whether you know it or not, everyone around you feels it and knows it with immediate, intuitive certainty. And committed members of an organization will not — and should not — give you responsibility they know you will not own.

2) There is profound wisdom in professionalism. What seems like arbitrary etiquette that only signals in-group from out-group is in fact an organic social technology that permits members of organizations to function effectively and gracefully as collaborators, while protecting everyone from potentially conflicting personal idiosyncrasies. We suppress at work whatever is not needed to get the job done, not because it is essentially unacceptable and unworthy, but because it is sacred, unique and vulnerable, and requires the protection of privacy. Those things we keep to ourselves at work — or at least, in wiser times, used to keep to ourselves — politics, religion, controversial opinions — are the very things that might conflict, cause friction and drive unnecessary wedges between people who need to get along and work together.

I am grateful for the opportunity to be at least somewhat initiated into the professional world. If I’d chosen a counter-cultural life outside of business, I might have clung to my romantic ideal of proud and principled alienation from the superficialities of professional life.

I am even more grateful I was not indoctrinated to believe that my childhood culture determined my essential identity and defined who I am and who I must forever commit to being, lest I become a sell-out phony and a betrayer of my culture.

If I had been taught this, and learned to believe it with all my heart, I would have been left on the margins, locked out by my own refusal just to open the door and walk in. This would have been a disservice, a miseducation — a passing down of a self-defeating tradition.

We are not who we are because of culture, nor are we who we are despite culture. We discover who we are by collaborating with culture, experimenting with who we can be, and maturing into well-socialized but authentic individuals.

Slurpy, mergy, touchy-feely notions of interpersonal being

Wow, this post really sprawled out. It hits a lot of my enduring interests. I’m not sure it is suitable for reading. It might just be a personal journal entry written to myself. Feel free to eavesdrop if you wish, but I cannot promise it will make sense or yield any value.

*

I listened to a fascinating Radio Open Source podcast on Hannah Arendt’s conception of evil, which ended with a wonderful discussion on empathy.

Jerome Kohn: Empathy is a fancy word or fancy theory that she argued passionately against. First of all she thought it was an impossible notion in the sense that it really means feeling what someone else feels. Sympathy, fellow feeling, is another thing. But empathy is the claim that you can actually feel what someone else is feeling. And for that Arendt found no evidence whatsoever. One could say it’s even the opposite of her notion of thinking from another person’s point of view. What you have to be able to do is to see a given issue from different points of view, to make it real. And then through those different points of view, with your own eyes, you don’t feel what the other person is feeling, you see what he is seeing through your own eyes, and then you can make a judgement. The more people you can take into consideration in this enlarged mentality, that actually is the foundation of reality for Arendt, the more valid your judgement will be.

Elisabeth Young-Bruehl: Jerry’s exactly right. Hannah Arendt was always opposed to these slurpy, mergy, touchy-feely notions about what binds people to each other. And she felt very keenly that what really binds one person to another is a commitment to try to see the world from that person’s point of view with your own eyes. Not to subscribe to their point of view or to merge with their point of view, but to be able to walk around and see what the world looks like from where they’re standing. But looking at it with your own eyes, so that you can then, as it were, discuss it with them. Not merge with them in some way, but discuss it with them. She was all about discussion. Not empathy in that sentimental way.

Christopher Lydon (host): And yet, well, there are distinctions without huge differences in some way. To put oneself in another’s mind is the beginning of something important.

EYB: To think that you can put yourself in another’s mind is the beginning of a terrible arrogance which has tremendous consequences. It’s a difference with great consequences. People who think that they can know what another person thinks or feel what another person feels are narcissistic.

CL: Well, ok, I don’t want to make a philosophical or an endless argument about it. Isn’t it the incapacity and the lack of interest in that perspective precisely what she found at the core of Eichmann’s banality and Eichmann’s evil, really?

JK: Well, no, it was his thoughtlessness, his inability to think from any other point of view but his own.

EYB: Exactly. And these are very important distinctions.

This exchange is especially interesting to me for three reasons.

First: as a Human Centered Design researcher/strategist/designer, I am constantly telling people that I am in the “empathy business.” However, I have long been uncomfortable with the characterization of what I do as “empathy”. To characterize understanding another person subjectively as primarily a matter of experiencing how they feel misses the mark in a very Modernist way. (em- ‘in’ + pathos ‘feeling’). While feelings are important to what I do, they are not the primary focus. I would prefer to characterize my work as concrete hermeneutics, but words like that do not fly in the flatlands of business where thinking lags a minimum of three philosophical generations behind. So, I’ve adopted “empathy” and accepted the inevitable misconceptions that attend it, because that’s what it takes to be understood at all by most people.

It is hardly surprising that I see things similarly to Young-Bruehl and Kohn, because I belong to their tradition. Heidegger taught Arendt and Gadamer, who both taught my favorite thinker Richard J. Bernstein. A Clifford Geertz quote from Bernstein’s Beyond Objectivism and Relativism has stayed with me as an anchor for my understanding of what a good human centered designer does.

Second, I think that when we conceive of understanding as empathy, we tend to treat emotionally-oriented people who are very sensitive and sentimentally responsive to people around them as having some kind of monopoly on human understanding. In my experience, there are multiple stages of coming to understanding of another person, and a talent for sensing and responding does not always correspond with a talent for grokking the “logic” of other people’s worldviews, nor an ability to think, speak and create from another worldview. It takes a fairly vast range of talents to function pluralistically.

I think a lot of the political problems we are experiencing today result from shoddy and retrogressive philosophical conceptions of alterity (“otherness”), which still see understanding of other people as very literally empathic. To know what is going on with another person, we must ourselves have had the experiences and emotions that other person has had. In an effort to understand and to demonstrate our understanding we must induce emotions similar to theirs. Two consequences follow: 1) The one who understands must try to produce the right emotions, and this production of emotion is the demonstration of understanding, which leads to some fairly repulsive public displays of political sentimentality. 2) The one who is understood is put in a position of judging the authenticity of those emotional displays, which is more or less being given the role of arbitrary judge. And if the feelings of the understood are viewed as the central datum or a special kind of insight (being “woke”) into a political situation (typically gauging the degree of prejudicial unfairness, its impact on those victimized by that prejudice and what is required to rectify that unfairness) this amounts to extreme epistemological privilege. Only the victim of prejudice has access to the reality of the situation, and those who are not the victims are incapable of perceiving how they participate in the perpetration, so, to use the charming formulation of today’s hyper-just youngsters, it is their job to STFU and to accept the truth dictated to them. It never occurs to anyone within the power hierarchy of wokeness that there’s anything superior to all this illiberal mess to awaken to.
There are philosophical worldviews that are more thorough, more comprehensive and more expansive than the dwarfish ideology of the popular left, but for all the reasons they are eager to point out to anyone who defies them, they are entirely incapable of seeing beyond the motivated reasoning of their own class interests. (This does not mean I think the popular right is any better. It is not. We are in a Weimaresque situation of resentful evil left idiocy vs paranoid evil right idiocy, with the reasonable voices shoved to the margins.)

Third, I’ve found myself misunderstood by many close friends on how I view relationships, and Elisabeth Young-Bruehl did a great job of capturing how people think I see them: a “slurpy, mergy, touchy-feely notion about what binds people to each other.” I think the misunderstanding is rooted in this same conception of human understanding being primarily an emotional phenomenon. When my own ideal of marriage or of friendship is strained through the filter of today’s left worldview, it looks like a mystical merging of souls that arouses (and should arouse!) suspicions of domination and anxieties around loss of self. But any attempt I make to explain what I actually have in mind looks like, well, an attempt at philosophical domination and a threat to the selfhood of whoever is foolish enough to take it seriously. Who am I to tell someone something they don’t already know? And anyway, it smells very cultish to listen to someone claiming to know better than the public what is true and right. So, by the circular logic of the popular worldview of the left, it is superior to form one’s own individual opinion (never mind that this opinion on opinions is a product of an unexamined and manifestly broken worldview.)

Obviously, this means extreme alienation for anyone who adopts a sharply differing worldview that affirms the importance of collaboratively developing shared understandings with those around them. In an environment of extreme ideological conformity (with brutal social consequences for infractions) that exalts above all the importance of intellectual independence — but strictly within its own confined philosophical horizon — a philosophy of interdependence, of collaborative development of the very concepts one uses to form one’s opinions, and of togetherness in a shared worldview is marked for expulsion.

Anyway, what I really have in mind when I imagine ideal personal connections is, once again, that ideal sketched out by Bernstein, captured so well by Geertz, which I will now go ahead and re-re-quote.

…Accounts of other peoples’ subjectivities can be built up without recourse to pretensions to more-than-normal capacities for ego effacement and fellow feeling. Normal capacities in these respects are, of course, essential, as is their cultivation, if we expect people to tolerate our intrusions into their lives at all and accept us as persons worth talking to. I am certainly not arguing for insensitivity here, and hope I have not demonstrated it. But whatever accurate or half-accurate sense one gets of what one’s informants are, as the phrase goes, really like does not come from the experience of that acceptance as such, which is part of one’s own biography, not of theirs. It comes from the ability to construe their modes of expression, what I would call their symbol systems, which such an acceptance allows one to work toward developing. Understanding the form and pressure of, to use the dangerous word one more time, natives’ inner lives is more like grasping a proverb, catching an allusion, seeing a joke — or, as I have suggested, reading a poem — than it is like achieving communion.

And now I will quote myself:

“Understanding the form and pressure of, to use the dangerous word one more time, natives’ inner lives is more like grasping a proverb, catching an allusion, seeing a joke — or, as I have suggested, reading a poem…” or knowing how to design for them.

A design that makes sense, that is easy to interact with, and that is a valuable and welcome addition to a person’s life is proof that this person is understood, that the designer cared enough to develop an understanding and to apply that understanding to that person’s benefit.

A good design shares the essential qualities of a good gift.

The kind of merging I have in mind is just sharing a worldview and using it together to live together, what Husserl (Heidegger’s teacher) called a “lifeworld”. I’ve called the process “enworldment”.

The merging aspect of this ideal enters the stage through my belief (shared, I believe, by Process Theology) that souls are universe-sized. The pragmatic consequence of what one means when one says “everything” is the scope and density of one’s soul. To enworld* with another is to bring two “everythings” into harmonious relationship, and to begin to function more like a culture than two isolated individuals within this isolating milieu so many of us, without ever choosing, without even knowing we had a choice, inhabit as prisoners of our own destitute freedom.

(Note: that “enworld” link above is a pretty old post, and I’m not sure right now how much of it I still agree with. It makes me want to engage my old self in dialogue and try to discover how much common ground we have. How enworlded am I with my 9-years-ago self?)

“The many faces of research”

I just realized I never re-posted my October 2010 article summarizing James Spradley’s incredibly cool way of defining different types of research — by the role of the participant vis-à-vis the researcher.

Here’s the text:

Anyone who has ever commissioned, designed, or conducted research will find these common but thorny questions all too familiar:

  • “What is this research going to give us that we can’t get from analytics and iterative design?”
  • “Don’t you need to ask all your interviewees the same set of questions so you can compare their answers?”
  • “Can you quantify these findings?”
  • And with qualitative research, the dreaded: “That’s an awfully small sample. Are these findings statistically significant?”

These questions can be difficult to answer clearly, succinctly and definitively. Wouldn’t it be helpful to have some kind of framework or model to help people understand how the various kinds of research (especially qualitative and quantitative) fit together to provide an organization with what it needs to effectively engage and serve its customers?

James Spradley in The Ethnographic Interview provides such a framework. His approach is the identification of four different roles a research participant can play, each with a different relationship between researcher and participant and each producing a different kind of finding:

  • Informant – In ethnography, a participant is related to as an informant. Informants are “engaged by the ethnographer to speak in their own language or dialect”, providing “a model for the ethnographer to imitate” so that “the ethnographer can learn to use the native language in the way informants do.” The informant is related to as a teacher. What is learned is how the participant conceptualizes and verbalizes his experience. Informants give the researcher not answers to fixed predetermined questions, but the questions themselves. Informants help define what the researcher needs to learn in subsequent research. (Examples of research techniques with informants: unstructured and semi-structured interviews, diary studies, open card sorting, collaborative design exercises.)
  • Subject – Subjects are participants in social science research, upon whom hypotheses are tested. “Investigators are not primarily interested in discovering the cultural knowledge of the subjects; they seek to confirm or disconfirm a specific hypothesis by studying the subject’s responses. Work with subjects begins with preconceived ideas; work with informants begins with a naive ignorance. Subjects do not define what it is important for the investigator to find out; informants do.” (Examples of research techniques with subjects: usability testing, split testing, concept testing.)
  • Respondent – A respondent is any person who responds to a survey questionnaire or to queries presented by an investigator. “Survey research with respondents almost always employs the language of the social scientist. The questions arise out of the social scientist’s culture. Ethnographic research, on the other hand, depends more fully on the language of the informant. The questions arise out of the informant’s culture.” (Examples of research techniques with respondents: surveys, questionnaires, structured interviews, closed card sorting.)
  • Actor – “An actor is someone who becomes the object of observation in a natural setting.” As with subjects and respondents, when participants are related to as actors, the terms of the description of the actor’s behaviors are those of the researcher, not of the participant. It should be noted, however, that in ethnographic research (and also in contextual inquiry), participants are interviewed as they are observed, which means the participant is still understood primarily as an informant. The actor-informant teaches the researcher through showing and explaining in his own terms the significance of his actions, which allows the researcher to give (to use Clifford Geertz’s term) “thickness” to his descriptions of what he observes. (Examples of research techniques with actors: site analytics, business intelligence analysis, silent observation.)

Over the course of a research program, research participants may at various times be regarded as subjects, actors or respondents — but if the goal is to know what really motivates the participants, to understand how to engage them at an emotional level, and to cultivate an enduring relationship with them, it makes a lot of sense to begin by relating to research participants as informants, beginning with unstructured or semi-structured interviews.

By starting with an informant relationship with research participants, researchers can develop a better idea of what matters to the participants, how they conceptualize and speak about these things, and most importantly how this motivates observable behavior. These insights (that is, findings that illuminate the inner life of participants) can focus subsequent research on the most relevant and impactful questions. This approach also improves the execution of the research by helping researchers use language that’s natural and understandable to participants, earning greater trust and cooperation and minimizing misunderstandings. In analysis, researchers and planners will mine more valid insights from the data, since they understand the motives, thought processes and language behind the responses and behaviors of the respondents, actors and subjects. And the insights will be accurate because they rely far more on fact than on (often unconscious) assumptions.

The other types of research can then report in more quantifiable terms, using much larger samples, how many subjects or actors perform certain behaviors or how many respondents give one answer or another to certain questions on a survey or questionnaire — and these actions and responses will now carry much more meaning because now the researchers have subjective insights to complement the objective data.

Two more points worth making: 1) I haven’t mentioned segmentation in this article, but wherever I mention learning about research participants, I am talking about learning about segments of participants (defined by goals, needs, attitudes and behaviors) and understanding the similarities and differences among them. 2) Generally, it is in the role of informant that research participants provide findings that drive design and creative. Informants inspire empathy and creative approaches. Subjects, respondents and actors tend to yield information useful in making strategy decisions. Using the full range of qualitative and quantitative research methods together intelligently can enable strategists and designers to work together more effectively to harness the full power of experience design.

By understanding research better — recognizing the difference between research that produces subjective insights and research that produces objective data, by not mistaking them for rival methods for producing the same kinds of findings, and by understanding how they can be used together to gain a holistic picture of one’s customers that is far more than the sum of the facts — an organization becomes more capable of understanding its customers without sacrificing their individuality to empty statistics.

“God’s Not Dead”

A church in my neighborhood put a flyer in my mailbox inviting me to a screening of “God’s Not Dead.” I decided to go see it and to meet the people at the church.

The film was interesting, but the church was even more interesting. The people there were extremely nice, both to me and to each other. People of different races sat together, with no trace of self-segregation. It was surprising how surprising this was to see. The children were exceptionally polite, but without any evidence of brokenness. They seemed very happy and alive. The service was moving. Everything centered around love. God loves every one of us. The world is underpinned and saturated with love. We are called to love each other.

The only major problem I had with any of it was the image they had of their non-Christian neighbors. I saw this image both in the film and in how they spoke about the wicked people in the world who make life difficult for everyone — themselves most of all, but also believers. The characters in the film were alarmingly flat and unbelievable. It was nearly as bad as reading Ayn Rand. They had some kind of horrific aversion to God and could not accept his love for various reasons, despite feeling its truth on some level. It made them lash out at God and Jesus and his faithful worshipers.

If I lived in a world full of angry, irrational, evil people like that, and especially if I had children, I would take drastic measures to stop them. But they don’t live in a world full of people like that. These unbelievers were imagined characters — moral straw men. When I tried to tell the church members how they were getting their neighbors wrong, they were uninterested in discussing it. Eventually they stopped answering my emails.

It makes me wonder if we don’t store our own most vicious, hateful and violent impulses in the imputed inner-lives of our enemies.

I hear “God’s Not Dead 2” is coming out soon. Maybe they’ll screen that, too. It might be a good excuse to resume the conversation.

 

Pluritarian Pluriversalism

To someone born into an autistic universe controlled by a single set of strictly logical natural laws, the experience of empathy and the subsequent revelation of an empathic pluriverse redefines the meaning of miracle, and of transcendence, and of religion.

Before, miracles were exceptions to the laws of nature. After, miracles are the irruption of something in the midst of nothingness: other minds, each with a world of its own — each with the power to change the meaning of one’s own world.

Before, transcendence was defined in terms of an infinite reality standing beyond the finite objective world. After, transcendence was defined in terms of an infinite reality standing beyond myriad finite objective worlds, each rooted in the elastic mind of a subject.

Before, religion was an individual’s attempt to commune with a transcendent reality with miraculous powers. After, religion was still an individual’s attempt to commune with a transcendent reality with miraculous powers, but the change in conceptions of transcendence and miracle means that it is the individual and the individual’s world that are transcended, and this means the route to transcendence is not around the world and one’s neighbors, but through them and their worlds. The activity of loving, respecting and learning from one’s neighbors is intrinsic to loving, respecting and learning from the infinite God who cannot be confined to any one world, however vast.

Myriad worship practices are needed to worship myriad aspects of an inexhaustible and inexhaustibly meaningful God. By this understanding, empathy is worship.

Lost in the concrete, lost in the abstract

Watching an occurrence is one kind of observation. Seeing a pattern is another. However, it is rare to find either observation in anything approaching pure form. When we observe an occurrence, often what we are most witnessing is the repetition of a pattern — and little else. And when we see a pattern, we imagine an event or two that lends sense to what repeats — but more vaguely than we suspect.

To watch an occurrence without the guidance of a pattern is disorienting. We don’t know what to make of it. But, conversely, to hear descriptions of patterns of occurrences of which we have no real-life experience and which we cannot imagine is also disorienting, and we don’t know what to make of that either.

In the former case we are lost in the concrete and in the latter case we are lost in the abstract.

But something peculiarly meta happens when we get lost in the abstract: being lost in the abstract is also being lost in the concrete. An occurrence of explanation (of some unfamiliar thing) is happening before us in a conversation or on the pages of a book, and we do not know what is going on, and so we do not know what to make of it. (If you are having trouble recollecting a situation where this has happened to you, and it is preventing you from understanding what the hell I am talking about here, right now — well, now you have your example.) In these situations it is possible to master the mode of explanation (as a language game) without gaining familiarity with the reality to which the explanation refers…

  • A historian can get good at discussing battles and generals without ever knowing what it looks like to give an order or to receive a briefing and lead troops into battle or to be led into battle and to engage in combat.
  • A manager can become fluent with the terms used in his organization without actually knowing how teams collaborate and activities are executed.
  • An armchair politician talks knowingly about Congressional developments without having the slightest insight into how legislation is drafted, debated, negotiated, passed and executed.
  • Statistics demonstrate that women on average make a quarter less than men, but leave explanations of how this happens to other studies — or to the casual speculations of individuals.

Perhaps the vast share of our knowledge is of this second-degree concrete/abstract variety.

 

Ingold on animism

From Tim Ingold’s Being Alive:

In one of the most original and provocative discussions of materiality to have appeared in recent years, Peter Pels characterises the logic of this argument as animist: ‘a way of saying that things are alive because they are animated by something foreign to them, a “soul” or … spirit made to reside in matter’. Whatever its source might be, this animating principle is understood here as additional to the material object on which it has been bestowed.

There is however, according to Pels, another way of understanding how things can act back. This is to say that the spirit that enlivens them is not in but of matter. We do not then look beyond the material constitution of objects in order to discover what makes them tick; rather the power of agency lies with their materiality itself. Pels characterises this alternative logic as fetishist. Thus the fetish is an object that, by virtue of its sheer material presence, affects the course of affairs. This argument is an important step in the right direction, but it takes us only halfway. On the one hand it acknowledges the active power of materials, their capacity to stand forth from the things made of them. Yet it remains trapped in a discourse that opposes the mental and the material, and that cannot therefore countenance the properties of materials, save as aspects of the inherent materiality of objects. Thus the hybrid quality that Pels attributes to the fetish — its capacity at once to set up and disrupt ‘the sensuous border zone between ourselves and the things around us, between mind and matter’ — is in fact a product of the misrecognition of the active properties of materials as a power of the materiality of objects. …

Bringing things to life, then, is a matter not of adding to them a sprinkling of agency but of restoring them to the generative fluxes of the world of materials in which they came into being and continue to subsist. This view, that things are in life rather than life in things, is diametrically opposed to the conventional anthropological understanding of animism, invoked by Pels and harking back to the classic work of Edward Tylor, according to which it entails the attribution of life, spirit or agency to objects that are really inert. It is, however, entirely consistent with the actual ontological commitments of peoples often credited in the literature with an animistic cosmology. In their world there are no objects as such. Things are alive and active not because they are possessed of spirit — whether in or of matter — but because the substances of which they are comprised continue to be swept up in circulations of the surrounding media that alternately portend their dissolution or — characteristically with animate beings — ensure their regeneration. Spirit is the regenerative power of these circulatory flows which, in living organisms, are bound into tightly woven bundles or tissues of extraordinary complexity. All organisms are bundles of this kind. Stripped of the veneer of materiality they are revealed not as quiescent objects but as hives of activity, pulsing with the flows of materials that keep them alive.

This harmonizes with an earlier post I wrote, and nearly rewrote until I remembered I’d already written it.

Practical philosophical reductionism

Less than a month ago I observed that I’d collected three anti-method books in my library (After Method, Beyond Method, Against Method) and noted the absence of any Before Method, Within Method or For Method.

I forgot that I also own For and Against Method, which is half argument for method, and Truth and Method, which argues against the existence of any universally valid hermeneutic technique.

*

I did not set out to collect books on method. I own these books because the concept of good method is one of the most effective (because it is the least questioned/questionable) vehicles for enforcing practical philosophical reductionism.

We fail to recognize how aggressive this is, partly because we tend to harbor a monistic orientation to “best”. We are seeking the best way, and if someone has already found it, we should set aside our own semi-articulate objections, preferences and intuitions and resist the temptation to “reinvent the wheel.”

This aligns with a general moral preference for self-effacement. We are eager to show that we can put our own preferences aside in the interest of a better outcome. This is admirable — if you’ve actually established the superiority of the less preferred method. But all too often we adopt a “pain, therefore gain” attitude that does nobody a bit of good.

Then, of course, many people don’t want to think philosophically. They just want to figure out what they’re doing, so they can get down to the doing. Thought is an unpleasant necessity that precedes making and executing plans. For such minds, method eliminates a lot of crap they didn’t want to do anyway. Re-considering method introduces unwelcome extra work of a kind they’d prefer not to deal with. It’s like making them slaughter the cow that will become their tasty cheeseburger. They’d rather just slap it on the grill, already.

Then finally there’s the “involvement anxiety” toward participatory understanding that’s endemic to the human sciences. We badly want to know without our own selves figuring into the equation. Even people who pride themselves on accounting for context when studying human subjectivity often want to subtract themselves out of the context they do create — and must create — through their practice. A practitioner’s preference of method is taken to be a subjective impurity to be removed in the attempt to understand others’ subjectivity more objectively.

It reminds me of a passage from Italo Calvino’s Invisible Cities:

After a seven days’ march through woodland, the traveler directed toward Baucis cannot see the city and yet he has arrived. The slender stilts that rise from the ground at a great distance from one another and are lost above the clouds support the city. You climb them with ladders. On the ground, the inhabitants rarely show themselves: having already everything they need up there, they prefer not to come down. Nothing of the city touches the earth except those long flamingo legs on which it rests and, when the days are sunny, a pierced, angular shadow that falls on the foliage.

There are three hypotheses about the inhabitants of Baucis: that they hate the earth; that they respect it so much they avoid all contact; that they love it as it was before they existed and with spyglasses and telescopes aimed downward they never tire of examining it, leaf by leaf, stone by stone, ant by ant, contemplating with fascination their own absence.

 

Anthropology = empirical metaphysics

From Reassembling the Social:

What ANT does is that it keeps asking the following question: Since every sociologist loads things into social ties to give them enough weight to account for their durability and extension, why not do this explicitly instead of doing it on the sly? Its slogan, ‘Follow the actors’, becomes, ‘Follow the actors in their weaving through things they have added to social skills so as to render more durable the constantly shifting interactions.’

It’s at this point that the real contrast between sociology of associations and sociology of the social will be most clearly visible. So far, I might have exaggerated the differences between the two viewpoints. After all, many schools of social science might accept the two first uncertainties as their departure point (especially anthropology, which is another name for empirical metaphysics)…

A short-lived fashion from the turn of the millennium?

At the end of her 2000 article “Ethnography in the Field of Design” Christina Wasson issued some warnings:

Although ethnography enjoys a great deal of popularity in the design field at present, I want to close with a cautionary eye to the future. Observing a similar phenomenon in CSCW, Hughes et al. (1994:437) noted: “Ethnography is currently fashionable in CSCW, but if it is to survive this kind of attention then it is important that the method find an effective voice rather than remaining content with ephemeral celebrity.”

. . .

Ten years from now, will ethnography be regarded as a short-lived fashion from the turn of the millennium? Its staying power depends on its ability to accurately purvey a unique kind of useful information to designers. And while the details of design firms’ ethnographic practices may not be public, there is a widespread sense among anthropologists in the design community that the quality of these firms’ research varies widely. The popularity of the approach has led a number of design firms to claim they offer “ethnography” even though none of their employees has a degree in anthropology or a related discipline. Sometimes researchers trained in cognitive psychology adopt observational methods; sometimes designers themselves do the observation. Such design firms are not necessarily averse to hiring anthropologists; they may have been unable to find ones with an adequate knowledge of the private sector.

As a consequence, the concept of ethnography has become a “pale shadow of itself” (Wasson n.d.). In its most emaciated form, the term is simply used to refer to a designer with a video camera. Even in somewhat richer versions, the term has become closely identified with the act of observing naturally occurring consumer behaviors. The need to analyze those behaviors and situate them in their cultural context is poorly understood, even though these activities are essential parts of developing a model of user experience that leads to targeted and far-reaching design conclusions. The anthropological apparatus that stands behind ethnography — the self-reflexivity of participant observation, the training in theory that enables fieldworkers to identify patterns — these are poorly understood in the design field. Indeed, the association between ethnography and anthropology is not widely known. The term “anthropology” is almost never heard. Even Chicago’s Institute of Design, whose faculty has a fairly sophisticated understanding of the topic, describes ethnographic observation merely as “a method borrowed from social science research” on its Web site (Institute of Design 1997).

The tendency for design firms to skimp on analysis is due in part to financial pressures. It can be hard to persuade clients to fund adequate labor time for researchers to develop well-grounded interpretations. Merely claiming to do ethnography costs little; actually conducting substantive anthropological research is much more expensive. Clients are also chronically in a hurry and press for immediate results. Nonetheless, my worry is that the design firms that skimp on analysis will tend to produce less interesting results. In the long run, this could lead to the perception that ethnography doesn’t have much to offer after all. If that should happen — and I certainly hope it does not — an opportunity for anthropologists to help construct the world around us will have been lost. Those of us who are active in the design field can address this issue in several ways. First of all, it is my hope that the mechanism of the market may actually be of use and that anthropologists can create positive publicity for themselves by doing good work on their projects. It seems possible, at least, that clients will realize, over time, that the findings of design firms engaging in richer forms of ethnography outshine the findings of other firms. E-Lab/Sapient’s continued growth is a hopeful sign. . . .

 

Design and the social system

Talcott Parsons, from The Social System:

Reduced to the simplest possible terms, then, a social system consists in a plurality of individual actors interacting with each other in a situation which has at least a physical or environmental aspect, actors who are motivated in terms of a tendency to the “optimization of gratification” and whose relation to their situations, including each other, is defined and mediated in terms of a system of culturally structured and shared symbols.

This is practically an inventory of the elements uncovered in design research.

  • Actors (which Parsons divides into ego, equivalent to “the user” in UX parlance, and alter, which UX treats as an element of social context)
  • Social context (relationships between actors)
  • Physical context (environment)
  • Needs (a.k.a. “optimization of gratification”)
  • Behaviors (involvement with events and artifacts in physical context)
  • Interactions (involvement with actors in social context)
  • Mental models (the defining/mediating symbol structure)
  • Signs (I’m adding this one: affordances that link mental models with real-world events and artifacts)
  • Symbols (I’m adding this one, too: indicators of the value-significance of real-world events and artifacts)

 

Geertz on irony

Geertz, from his essay “Thinking as a Moral Act”:

“Irony rests, of course, on a perception of the way in which reality derides merely human views of it, reduces grand attitudes and large hopes to self-mockery. The common forms of it are familiar enough. In dramatic irony, deflation results from the contrast between what the character perceives the situation to be and what the audience knows it to be; in historical irony, from the inconsistency between the intentions of sovereign personages and the natural outcomes of actions proceeding from those intentions. Literary irony rests on a momentary conspiracy of author and reader against the stupidities and self-deceptions of the everyday world; Socratic, or pedagogical, irony rests on intellectual dissembling in order to parody intellectual pretension.”

It seems to me that systems thinking — at least thinking about systems in which the thinker is a participant — might require a certain degree of irony. Our experience of being caught up in a system is one thing, but what is required to adjust or change the system is another — and the connection is rarely obvious. That experience is an intrinsic part of the workings of many systems, particularly management systems.