Category Archives: Design

Let’s stop engineering philosophies

My pet theory is that philosophies have been developed in an engineerly mode of making, with emphasis on the thought system, and meant to be evaluated primarily epistemically: “is it true?” The Pragmatists improved on this by asking, “does it work?”

But I believe the pluralistic insight requires us to take a designerly approach to philosophy by expanding the questions we ask of philosophies to those of design (as originally posed by Liz Sanders): “is it useful; is it usable; is it desirable?” And human-centered design has taught us always to dimensionalize this triad with “…for whom, in what contexts?”

*

Very few people grasp what philosophies are, and how and why they are so important. They use philosophies that lead them to believe their philosophy is their belief system — the things they believe to be true and the means by which they evaluate these truths. They think they know what their philosophy is, because their philosophy points them only to their explicit assertions and arguments, and this sets sharp limits to what they can think and what they can do with their thoughts.

Philosophies are the tacit thinking that gives us our explicit truths and our sense of reality. We had better design them well! But we keep engineering them… for other engineers.

Maybe philosophy is waiting for its own Steve Jobs.

Design brief

A design brief is a compact design problem definition, carefully designed to inspire designers to produce effective solutions to a real-world problem.

I’ve been thinking about and collecting briefs for years, and I have noticed that the very best briefs do three things well: define the problem, inspire solutions and provide guidance for evaluating solutions.

1) An ideal brief should be informed by an understanding of a real-world problem.

…which means the brief focuses on the right problem, addresses the full problem, and gets the details of the problem right. This way a design team will work on solving the right problem, will consider the most important factors, will incorporate the most important elements and will strike the smartest balances and tradeoffs when designing a solution, and the team’s efforts won’t be guided by assumptions or errors. The elements of a design problem definition will always include who the design is for, why the design will be useful and desirable for them, and all factors from the use context that might affect the solution. It will also include all requirements and constraints from the client organization.

2) An ideal brief should convey the problem powerfully.

…which means the brief communicates its problem clearly, memorably and inspiringly. Clear communication means the problem is crisply and unambiguously defined for the design team. Memorable means each designer can hold the whole problem in their minds, so that the parts of the problem all hang together as a whole, while remaining distinct. Inspiring is the most difficult part: this means that the brief can help produce surprising ideas not specified or implied by the brief, but which still satisfy it. The brief stimulates ingenuity that surpasses anything the brief anticipates. (* Note: A tight, effective brief is likely to require supplemental materials to educate the team and get them aligned with the people and contexts of the design problem. Ideally, the designers will participate in the research and gain a tacit feel for the situation they are designing into. The brief does not have to carry the entire team educational load, only the interpretation of the situation into a design problem.)

3) An ideal brief should provide an explicit evaluative standard.

…which means the brief outlines criteria by which a member of the design team can assess the quality of the solution, and determine where it is and is not successful. This removes arbitrariness from the process and fully empowers designers to try bold, radically novel approaches to solve the problem.
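
To make the three criteria above concrete, here is a minimal sketch of an ideal brief’s skeleton as a small Python data structure. This is only my illustration under the definitions given above, not a canonical format; every field name is a hypothetical placeholder.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class DesignBrief:
        # 1) Understanding of the real-world problem
        who_it_is_for: str
        why_useful_and_desirable: str
        use_context_factors: List[str] = field(default_factory=list)
        requirements_and_constraints: List[str] = field(default_factory=list)

        # 2) Conveying the problem powerfully
        problem_statement: str = ""  # crisp, memorable framing of the problem
        supplemental_materials: List[str] = field(default_factory=list)

        # 3) Explicit evaluative standard
        evaluation_criteria: List[str] = field(default_factory=list)

        def unmet_criteria(self, assessment: Dict[str, bool]) -> List[str]:
            """Given an assessment mapping each criterion to pass/fail, list the failures."""
            return [c for c in self.evaluation_criteria if not assessment.get(c, False)]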

+

I have never created, nor have I seen, a design brief that fully lives up to the ideal sketched above. I believe what I’ve written may be a design brief for designing a design brief.

10,000 foot view

Lately I’ve been reflecting on what strikes me as the most difficult and interesting challenge I’ve faced adjusting to service design after decades of practicing other flavors of human-centered design: the problem of altitude and granularity appropriate to solving service design problems.

In design, thinking about altitudes and zoom levels is common to the point of being a tradition, starting with the Eameses’ classic “Powers of Ten” film. Given its strategic, integrative, multidisciplinary scope, Service Design is particularly zoomy, so it is unsurprising that altitude-based frameworks and analogies are frequently used in the Service Design world.

As important as it is to understand the value of working at multiple altitudes, it is also helpful to be prepared for the experience of changing altitudes, especially within Service Design’s own peculiar range. For the uninitiated, the shifts in granularity, theme and perspective can be a slightly strange experience. As a sort of expectation-setting initiation, I offer an extended, but hopefully not too labored, analogy.

If strategy flies at 30,000 feet (where the ground is so distant it looks like a map) and we agree most design flies at 3 feet (where the ground is so close and so chaotic it is hard to survey), service design flies at 10,000 feet, approximately the cruising altitude of a single-engine prop plane.

10,000 feet is a very useful altitude that bridges the 30,000 foot view and the ground — clarifying relationships between strategy, operations and the experiences real people (real customers, real employees) have as a result — but flying at this altitude does introduce practical challenges.

First, there is the issue of clouds. At 30,000 feet, the clouds are below you. Standing on the ground, the clouds are above you. But at 10,000 feet you are flying in and out of clouds, which can be very disorienting, in the most literal sense. It can be tricky to know which way you are facing or which way is up. And the view is neither clear nor continuous.  One moment you can see a bit of ground, the next you see nothing but your instruments, and you have to use your memory, imagination and your recording and data-gathering tools to form a sense of the whole. But the understanding that develops from these varying sources has far more structural clarity than you can get from the ground, and more human richness than you can detect way up in the cold, thin air of the stratosphere. To put it all together, though, interpretation is necessary. The picture doesn’t automatically emerge by itself. The heterogeneous parts must be skillfully pieced together into a coherent image.

Second, you are dealing with some odd scales of meaning. Looking down at a town, everything looks miniaturized but still human, maybe even exaggeratedly human, because the tedium of life is abstracted away and we relate to it like kids playing with toys. Some homes are big, some are small, some are complexes or towers. Some are arranged in grids, some along winding branches of streets bulbing into cul-de-sacs, and some cluster along lakesides or hillsides. Some homes have trees or yards, pools, trampolines or gardens, driveways or parking lots. You can imagine what life in the neighborhood might be like. But you can also see the layout of the city, and get a sense of how the parts of the town connect up. You can see where the schools, the stores, the churches and the sports fields are. You can see where things have been built up, what has been left in a wild state, and where development is happening.

Now, imagine telling a story about the life you see below, doing justice both to the individual lives taking place in the tiny buildings below and to how it all connects to form a system… this is not the usual storytelling scale. It is neither intimate nor epic. It must generalize, but without blurring key particularities or averaging individuality into bland anonymity. But if you wish to tell the story of how a town works, or if you want to propose significant structural modifications to the town, this is the narrative scale required. Telling such a story requires thoughtful zooming in and zooming out to show connections between whole and part, connecting fine-grained details of breakfasts, meetings and bills with grander-scale phenomena like demographic trends, commerce and traffic patterns. Translated back to Service Design, this means combining stories of cultural and industry trends, corporate strategies and vignettes from customers’ and employees’ lives to show how these macro-level trends and strategies impact the everyday existence of individual people, and conversely, how micro-level behaviors impact strategies and generate trends.

Finally, intervening at this height is strange. Many proposals for change fall somewhere between strategic and tactical. They anticipate details of implementation, but without over-specifying them. Specifications are suggestive and provisional, intended more to clarify a problem than to provide a solution. Many people find the interpretive latitude confusing: what in the recommendation is fixed and what is variable? If everything is open-ended, what use is the recommendation at all? It can all seem vague and insubstantial, yet there is a thrust and lasting momentum in the work that carries initiatives forward in a direction that benefits the organization as a whole, its employees and the people it serves. Somehow the recommendations made from this altitude are capable of creating continuity between the grand plans of strategists and the intricacies of implementers on the ground.

The 10,000 foot view manages to refract the grand plans and sweeping aspirations of the 30,000 foot view into actions on the ground that actualize it and prevent it from remaining mere aspiration and plan. And the 10,000 foot view gives individual actors on the ground a way to connect their efforts to tangible, relatable and realistic goals that link up with the purpose of the organization.

*

Recently it occurred to me that this 10,000 foot theme is closely related to a framework I found useful and amusing earlier in my career, which I’ve called the Bullshit-Chickenshit model:

Bullshit – Meaningful, inspiring ideas that seem to promise practical action but never fulfill that promise and never find application.

Chickenshit – Practical actions that seem like they ought to serve some meaningful purpose, but in fact are meaningless and done for no reason.

Bullshit is meaning without application. Chickenshit is application without meaning.

Flying at 10,000 feet helps prevent strategy from losing sight of concrete application and devolving into idealistic bullshit that gets nodded at and then immediately ignored. And the 10,000 foot view provides context for people working on or near the front lines, helping them remember how their everyday work connects up with larger organizational goals, so the tasks don’t lose their purpose and fragment into procedural chickenshit, obeyed, entered into TPS reports, tracked and graphed for reasons nobody remembers.

I posted something (plus a bunch of comments) on Facebook that belongs here:

  • I propose an amendment to Useful, Usable and Desirable: RELIABLE. Who’s in?
    • Reliability is knowing that when you go to use a thing that it will work as expected.
    • The problem with most software design today is that it is embedded in bad service design, a.k.a. nonintentional service design.
    • Bad design = nonintentional design = mere engineering
    • Engineering is a specialized subdiscipline of design, not the other way around as most industrially-minded folks still think.

Less toxic ideology, more human-centered design

Yesterday, I opened a can of Johnny Letter on Fast Company, for running what I saw as an uninformed and blatantly bigoted opinion piece, “Design needs more feminism, less toxic masculinity”.

Rather than complain about the bigotry, though, I chose instead to focus on what I believe is the root cause of most lousy, unempathic design: the failure to research design problems before attempting to solve them. Far too often we reflexively impose our own perspectives and interpretations upon situations and assume we know what needs doing to improve the situation — neglecting the essential hard work of listening, observing and developing an understanding of people in their contexts.

This is a failure the author herself exemplifies in making reckless assumptions about the cause of the bad design she laments and in her proposed solution to this problem. Here’s the letter I sent (with slight edits):

I am disappointed that Fast Company chose to run “Design needs more feminism, less toxic masculinity”. I’ve worked with many male and female designers, and have found that the difference between those who can empathize and design to the emotional and functional needs of other people and those who cannot has far more to do with a willingness to investigate and to get over our own preconceived notions than with anything else. In this piece Tillyer investigated nothing. She does not know who designed that airport gate. Instead, with no attempt to understand how the design happened or who did it, she applied her preconceived notions about how men essentially are and how women essentially are, and decided to blame men for a design she didn’t like. If I had written that article, I’d have begun by investigating the design process that produced that gate, and if I’d discovered my suspicions were correct — that nobody had looped passengers into the design process — I’d have written an article titled “Design needs more understanding, less toxic uninformed speculation”.

I think rhetorically the choice to deemphasize morality in favor of effectiveness was the right one, but that does not mean I do not see this as a moral issue.

Our social justice discourse has become hopelessly mired in questions of Who. Who is doing the wrong thing to whom? What category of person does it? What category of person suffers? But this is exactly how irresolvable resentments are formed, entrenched and intensified. Justice is traditionally depicted blindfolded for good reason.

If we want to live in a just society, we need to refocus on the How of justice: the How of learning, understanding, interpreting and responding to specific people in specific contexts.

This kind of investigation into particulars is difficult, tiring and uninspiring work, and it is no fun at all. In this work we constantly discover where we were wrong (despite every appearance of self-evident, no-brainer truth), because that is what truth requires.

In pursuit of truth, we lose our sense of omniscience, fiery self-righteousness and uncompromising conviction, and acquire more caution, patience, reticence, reflection, humility, self-skepticism and nuance. These qualities may not be rousing, inspiring, galvanizing, romantically gratifying or revolutionary — but they are judicious.

If we truly want justice — as opposed to revenge, venting of resentment and intoxication of table-turning aggression —  we need to re-acquire a taste for the judicious virtues.

Making conversational space

A post I put on Facebook just now:

This morning I was reading a pdf book (using the Notability app on my iPad) about the relationships people have with the things in their lives. As always, I was writing all over the pages, underlining, starring, etc. However, the book format was cramped, and there was insufficient space to write my own comments in the margin. I was feeling written at. So I reformatted the pdf with generous margins to make room for myself, and turned the monologue into a conversation.
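
(For anyone who wants to try the same trick programmatically rather than in an app, here is a minimal sketch that widens each page of a PDF to leave a blank writing margin. It assumes the third-party pypdf library; the file names and margin size are placeholders, and this is not the exact workflow described above.)

    from pypdf import PdfReader, PdfWriter

    MARGIN = 200  # extra points of blank space added to the right of each page

    reader = PdfReader("book.pdf")
    writer = PdfWriter()

    for page in reader.pages:
        width = float(page.mediabox.width)
        height = float(page.mediabox.height)
        # Create a wider blank page, then stamp the original content onto it,
        # leaving the right-hand strip free for handwritten comments.
        blank = writer.add_blank_page(width=width + MARGIN, height=height)
        blank.merge_page(page)

    with open("book-with-margins.pdf", "wb") as f:
        writer.write(f)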

Maturing

Reading Appendix A of Rorty’s Achieving Our Country, “Campaigns and Movements” I came upon this bit: “Most of us, when young, hope for purity of heart. The easiest way to assure oneself of this purity is to will one thing—but this requires seeing everything as part of a pattern whose center is that single thing. Movements offer such a pattern, and thus offer such assurance of purity. [Irving] Howe’s ability, in his later decades, to retain both critical consciousness and political conscience while not attempting to fuse the two into something larger than either, showed his admirers how to forgo such purity, and such a pattern.”

That brought to mind another passage from the introduction of Nicolai Berdyaev’s Slavery and Freedom: “My thought has always belonged to the existential type of philosophy. The inconsistencies and contradictions which are to be found in my thought are expressions of spiritual conflict, of contradictions which lie at the very heart of existence itself, and are not to be disguised by a facade of logical unity.”

For me, this immediately connects up with three themes from Nietzsche’s thought: youth, wholesale thinking, and the compulsion to systematize. (To poke around in my glorious wiki — and you really should — use the password “generalad”). Rather than explicitly draw every connection, I will juxtapose some passages and make a concept chord meant to convey an ideal of maturity I learned from Nietzsche.

*

Rough consistency. — It is considered a mark of great distinction when people say ‘he is a character!’ — which means no more than that he exhibits a rough consistency, a consistency apparent even to the dullest eye! But when a subtler and profounder spirit reigns and is consistent in its more elevated manner, the spectators deny the existence of character. That is why statesmen with cunning usually act out their comedy beneath a cloak of rough consistency.

*

Beware of systematisers! — Systematisers practise a kind of play-acting: in as much as they want to fill out a system and round off its horizon, they have to try to present their weaker qualities in the same style as their stronger — they try to impersonate whole and uniformly strong natures.

*

I mistrust all systematizers and I avoid them. The will to a system is a lack of integrity.

*

Youth and criticism. — To criticize a book means to a young person no more than to repulse every single productive idea it contains and to defend oneself against it tooth and claw. A youth lives in a condition of perpetual self-defence against everything new that he cannot love wholesale, and in this condition perpetrates a superfluous crime against it as often as ever he can.

*

Consciousness. — Consciousness is the latest development of the organic, and hence also its most unfinished and unrobust feature. Consciousness gives rise to countless mistakes that lead an animal or human being to perish sooner than necessary, ‘beyond destiny’, as Homer puts it. If the preserving alliance of the instincts were not so much more powerful, if it did not serve on the whole as a regulator, humanity would have to perish with open eyes of its misjudging and its fantasizing, of its lack of thoroughness and its incredulity; in short, of its consciousness; or rather, without the instincts, humanity would long have ceased to exist! Before a function is fully developed and mature, it constitutes a danger to the organism; it is a good thing for it to be properly tyrannized in the meantime! Thus, consciousness is properly tyrannized — and not least by one’s pride in it! One thinks it constitutes the kernel of man, what is abiding, eternal, ultimate, most original in him! One takes consciousness to be a given determinate magnitude! One denies its growth and intermittences! Sees it as ‘the unity of the organism’! This ridiculous overestimation and misapprehension of consciousness has the very useful consequence that an all-too-rapid development of consciousness was prevented. Since they thought they already possessed it, human beings did not take much trouble to acquire it, and things are no different today! The task of assimilating knowledge and making it instinctive is still quite new; it is only beginning to dawn on the human eye and is yet barely discernible; it is a task seen only by those who have understood that so far we have incorporated only our errors and that all of our consciousness refers to errors!

*

When one is young, one venerates and despises without that art of nuance which constitutes life’s greatest prize, and it is only fair that one has to pay dearly for having assaulted men and things in this manner with Yes and No. Everything is arranged so that the worst of tastes, the taste for the unconditional, should be cruelly fooled and abused until a man learns to put a little art into his feelings and rather to risk trying even what is artificial: as the real artists of life do. The wrathful and reverent attitudes characteristic of youth do not seem to permit themselves any rest until they have forged men and things in such a way that these attitudes may be vented on them: — after all, youth in itself has something of forgery and deception. Later, when the young soul, tortured by all kinds of disappointments, finally turns suspiciously against itself, still hot and wild, even in its suspicion and pangs of conscience: how angry it is with itself now, how it tears itself to pieces, impatiently, how it takes revenge for its long self-delusion, just as if it had been a deliberate blindness! In this transition one punishes oneself with mistrust against one’s own feelings; one tortures one’s own enthusiasm with doubts, indeed, one experiences even a good conscience as a danger, as if it were a way of wrapping oneself in veils and the exhaustion of subtler honesty; and above all one takes sides, takes sides on principle, against ‘youth.’– A decade later: one comprehends that all this, too–was youth!

*

The so-called soul. — The sum of the inner movements which a man finds easy, and as a consequence performs gracefully and with pleasure, one calls his soul; — if these inner movements are plainly difficult and an effort for him, he is considered soulless.

*

The serious workman. — Do not talk about giftedness, inborn talents! One can name great men of all kinds who were very little gifted. They acquired greatness, became “geniuses” (as we put it), through qualities the lack of which no one who knew what they were would boast of: they all possessed that seriousness of the efficient workman which first learns to construct the parts properly before it ventures to fashion a great whole; they allowed themselves time for it, because they took more pleasure in making the little, secondary things well than in the effect of a dazzling whole. The recipe for becoming a good novelist, for example, is easy to give, but to carry it out presupposes qualities one is accustomed to overlook when one says “I do not have enough talent.” One has only to make a hundred or so sketches for novels, none longer than two pages but of such distinctness that every word in them is necessary; one should write down anecdotes each day until one has learned how to give them the most pregnant and effective form; one should be tireless in collecting and describing human types and characters; one should above all relate things to others and listen to others relate, keeping one’s eyes and ears open for the effect produced on those present; one should travel like a landscape painter or costume designer; one should excerpt for oneself out of the individual sciences everything that will produce an artistic effect when it is well described; one should, finally, reflect on the motives of human actions, disdain no signpost to instruction about them and be a collector of these things by day and night. One should continue in this many-sided exercise some ten years: what is then created in the workshop, however, will be fit to go out into the world. — What, however, do most people do? They begin, not with the parts, but with the whole. Perhaps they chance to strike a right note, excite attention and from then on strike worse and worse notes, for good, natural reasons. — Sometimes, when the character and intellect needed to formulate such a life-plan are lacking, fate and need take their place and lead the future master step by step through all the stipulations of his trade.

*

Learning. — Michelangelo saw in Raphael study, in himself nature: there learning, here talent. This, with all deference to the great pedant, is pedantic. For what is talent but a name for an older piece of learning, experience, practice, appropriation, incorporation, whether at the stage of our fathers or an even earlier stage! And again: he who learns bestows talent upon himself — only it is not so easy to learn, and not only a matter of having the will to do so; one has to be able to learn. In the case of an artist learning is often prevented by envy, or by that pride which puts forth its sting as soon as it senses the presence of something strange and involuntarily assumes a defensive instead of a receptive posture. Raphael, like Goethe, was without pride or envy, and that is why both were great learners and not merely exploiters of those veins of ore washed clean from the siftings of the history of their forefathers. Raphael vanishes as a learner in the midst of appropriating that which his great competitor designated as his ‘nature’: he took away a piece of it every day, this noblest of thieves; but before he had taken over the whole of Michelangelo into himself, he died — and his last series of works is, as the beginning of a new plan of study, less perfect and absolutely good precisely because the great learner was interrupted in his hardest curriculum and took away with him the justificatory ultimate goal towards which he looked.

*

A man’s maturity — consists in having found again the seriousness one had as a child, at play.

*

Human beings are naturally artificial.

It is not our nature that is most precious; it is our hard-won second-nature, that set of artifices that are so well-designed that they disappear into our being and into the world we perceive around us. They become so natural to us that we can no longer experience them as man-made, and we begin to see them as God-given if we see them at all. And they are God-given, if we understand our real relationship with God.

Amen?

Eroding to wisdom

The best quotes are the misattributed ones — overused maxims that tumble from paraphrase to paraphrase until they are worn smooth like river stones.

Whenever I track one of these retroactively adopted orphans back to its birthplace, I discover that its character has almost always been improved by the traumas of public life.

Take, for instance, the famous quote that Yogi Berra should have said, but actually never did say: “In theory there is no difference between theory and practice, but in practice there is.” The original quote appeared in flabbier form in a Usenet proto-meme: “In theory, there is no difference between theory and practice, but in practice there is a great deal of difference.” Incidentally, one quote Berra really did say is “I never said most of the things I said.”

Mark Twain is a popular misattributed source of collaboratively improved quotes, probably because Twain is the only writer of pithy sayings most people know, so if they hear a pithy saying they assume Twain must have said it. A great example of a Twain saying that Twain never said is “If your only tool is a hammer, everything looks like a nail.” Quote Investigator found the earliest example of this quote to be “Give a boy a hammer and chisel; show him how to use them; at once he begins to hack the doorposts, to take off the corners of shutter and window frames, until you teach him a better use for them, and how to keep his activity within bounds.”

Another fake Twain quote: “If I had more time, I would have written a shorter letter.” Quote Investigator explains the earliest English expression of this thought is a translation of a Pascal quote, “My Letters were not wont to come so close one in the neck of another, nor yet to be so large. The short time I have had hath been the cause of both. I had not made this longer then the rest, but that I had not the leisure to make it shorter then it is.” It took 300 years to shorten this quote to its current svelteness.

I even prefer the bastardized versions of properly attributed quotes. William James comes to mind:

When a thing is new, people say: “It is not true.”

Later, when its truth becomes obvious, they say: “It’s not important.”

Finally, when its importance cannot be denied, they say “Anyway, it’s not new.”

Who could possibly prefer the original? “First, you know, a new theory is attacked as absurd; then it is admitted to be true, but obvious and insignificant; finally it is seen to be so important that its adversaries claim that they themselves discovered it.”

This meditation on misattributed quotes hints at something important: The lessons of the “gossip game” might need some qualifications. It is undeniably true that factual information passed from person to person does degrade over the course of minutes, hours, days and months. But is this true of wisdom passed from generation to generation over the course of decades or centuries? Perhaps not. Maybe wisdom seeks its perfect form through wear.

The designer in me wants to include physical objects in the set of examples of “wisdom seeking form”. I have always loved the perfection of tradition-worn objects like houses, tables, chairs, knives, pens, teapots, clothes and bicycles. My love of erosive essentializing could make me look like some sort of conservative Platonist type, except for one subtle but crucial difference: the Platonist ideal lives above humanity in a heavenly realm of preexisting perfect archetypes, whereas my ideal lives among us in an eternal democratic project of iterative design, a trans-generational collaboration to make things better and better, approaching but never quite reaching perfection.

*

A friend tells me I buried the lede on this piece, and that this gives the piece a frivolous effect. One thing I have learned reflecting on philosophical communication and my own characteristic miscommunications is that philosophy tends to reverse normal patterns of explanation. Things don’t progress in the normal subject-to-predicate order. Instead, it goes predicate-predicate-predicate-subject. You don’t exactly know what the work is about until the about finishes abouting about and finally resolves into the “what”. A capacity to enjoy philosophy is tied to an ability to endure whatlessness for long anxious stretches, until the whole mess finally coalesces and crystallizes into a clear conception that makes simple sense of what preceded it.

So there’s just no way I am going to put that lede out in front where it belongs. But, being a good Liberal, I do believe in compromise, so here is what I can do: I will exhume the lede and append it to the end, so anyone who wants to can re-read the original with this explication in mind.

What I wanted to do was to demonstrate a progressive traditionalist attitude.

Progressive traditionalism might seem like a contradiction in terms, but this is a side-effect of unexamined views of tradition that produce two mutually reinforcing oppositions: 1) progressive anti-traditionalism that wants to ignore or trash an unacceptable past in order to clear the way for a better future, and 2) traditional traditionalism that sees the past as better and the present as unacceptable, and therefore wants a future that looks more like the past than the present.

Progressive traditionalism sees tradition as a long process of collaborative improvement. The past is a swirl of good and bad. In humanity, genius is mixed with ignorance and atrocities, and our ability to discern the good and bad is a direct result of the tradition’s progress. We wouldn’t know how appalling our past is if we hadn’t lived through it, learned from it and been changed by it. Further, this work is nowhere close to finished. We are making mistakes this very moment that will be obviously stupid and wicked within a decade. I believe one of those mistakes is thinking we must choose between wholesale condemnation and wholesale worship of the past instead of treating it with the critical respect it deserves.

I wanted to demonstrate this attitude simply, and I believed a good way to do this was to show that old famous sayings can actually improve over time through being worked on by innumerable unfamous people. And I wanted to make fun of our compulsion to project this simplicity back into the past by placing the perfected words into the mouths of acclaimed geniuses. Why would we want to do that? What is the source of this need? The hammer I carry is philosophy, and the nail I see here is the unconscious impulse to preserve the current popular philosophy (also known as “common sense”) at all costs. This current philosophy, by the way, is also producing our political crisis.

There is a lot to say on this subject and it connects with some of the things in my life I value most, including my adopted Jewish religion. But I’ll leave it here for now.

Usefulness, Usability and Desirability of philosophies

Tim Morton explains Speculative realism:

Speculative realism is the umbrella term for a movement that comprises such scholars as Graham Harman, Jane Bennett, Quentin Meillassoux, Patricia Clough, Iain Hamilton Grant, Levi Bryant, Ian Bogost, Steven Shaviro, Reza Negarestani, Ray Brassier, and an emerging host of others such as Ben Woodard and Paul Ennis. All are determined to break the spell that descended on philosophy since the Romantic period. The spell is known as correlationism, the notion that philosophy can only talk within a narrow bandwidth, restricted to the human-world correlate: meaning is only possible between a human mind and what it thinks, its “objects,” flimsy and tenuous as they are. The problem as correlationism sees it is, is the light on in the fridge when you close the door?

So far, 65 pages in, I am seeing absolutely no progress toward transcending the human-world correlate. I am seeing attempts to use measurements and mathematical models as a substitute for intuition, but what could possibly be more human than that, even when, or especially when, such substituting of ratiocination for instinct makes our minds feel abstracted from our animal bodies? I am also seeing speculations about real objects and what they might be like offered as a substitute for access to the first-person being of objects (first-object being?). When you insist the light is on in the fridge when you close the door because it is the nature of light to withdraw when doors are shut, you’ve posed a possibility for humans to consider or for human scientists to investigate, and that should not be confused with seeing the inner light of the fridge with superhuman refrigerated eyes. And even if you place sensors inside the fridge, or discover ways to detect or deduce light inside a closed fridge, or account for your inability to sense, detect or deduce with ontological maxims of withdrawal, you may be “seeing” through long networks of instruments (both physical and mental), but it all converges and terminates in an all-too-human “eye”. And this is true whether that eye is manifested from some archetypal realm, or imagined in a man’s or a god’s mind (along with what is seen), or is an organ emerging from the interplay of matter and energy situated in a space-time container, or is an object within hyperobjects.

For us, once we know a thing it all becomes something for-us, including our conviction that it is not only what is known, and that it is for-itself. As Nietzsche said “We cannot look around our own corner: it is a hopeless curiosity that wants to know what other kinds of intellects and perspectives there might be; for example, whether some beings might be able to experience time backward, or alternately forward and backward (which would involve another direction of life and another concept of cause and effect). But I should think that today we are at least far from the ridiculous immodesty that would be involved in decreeing from our corner that perspectives are permitted only from this corner. Rather has the world become ‘infinite’ for us all over again: inasmuch as we cannot reject the possibility that it may include infinite interpretations. Once more we are seized by a great shudder — but who would feel inclined immediately to deify again after the old manner this monster of an unknown world? And to worship the unknown henceforth as “the Unknown One”? Alas, too many ungodly possibilities of interpretation are included in the unknown, too much devilry, stupidity, and foolishness of interpretation — even our own human, all too human folly itself, which we know…”

So, good for the speculative realists that they have uncovered another human perspective for thinking in a less human-intuitive way. If learning to think that way delivers on the promise to make an ecological ethic more accessible, I’m all for it.

However, I am beginning to worry that this access is most likely to occur through the thin conduit of argument, which rarely fully engages human intuition or taps into the moral impulses which “know” by caring or neglecting.

And as a designer-philosopher, I know hitting all three is paramount. For in design these are the holy trinity of experience, the necessary conditions of adoption: Useful, Usable, and Desirable.

The vice of utilitarian, functionalist folks who fancy themselves objective is they find far too much desirability in mere usefulness, and that desirability motivates them to surmount difficulties in comprehension — and then they find yet more desirability in the accomplishment of having surmounted the difficulties. This is why engineers, left on their own, engineer systems that only other engineers can use, much less love. Design is changing all that (at least for things made for non-engineers) and Human-Centered Design is accelerating that change.

Philosophy has been and is in the same stage as pre-design engineering. Because it requires motivated philosophical investigation even to grasp what philosophy does and is, most people can’t see what it can be used for, or even detect the symptoms of an obsolete or corrupted philosophy (or, as today, the clashing of multiple corrupted, obsolete philosophies). Philosophers engineer philosophies for other philosophers.

When philosophies are popularized, all that changes is the Usability. Now an ordinary above-average-smart person can get a sense of what philosophers are making for each other. They probably can’t get the same jolt of pleasure out of it, since most philosophy exists for academic philosophers’ purposes and tastes, but they can get a bit of that surmounting-difficulties pleasure and they can plume their social personas with the book-learning.

What most needs changing is Usefulness and Desirability.

By usefulness, I mean recognizing that every philosophy enables us to think certain kinds of thoughts. The live problems that orient and motivate philosophical effort tend to produce philosophies well-suited to think similar problem-types. The philosophy will instantly become difficult to distinguish from the reality it understands, so there’s a bit of a trap-like character to it. Philosophies are not tools we hold, look at and manipulate. They are tools we climb inside, see from and act from.

By desirability, I mean that you are moved by it. You don’t force yourself to care, or work yourself up and amp-up what little caring you feel. You don’t get argued out of apathy. The philosophy simply makes self-evident the importance of whatever it is doing. You just do care, and you will not even be able to explain why. Philosophies produce their own motivation.

Actually, in writing this, I changed my mind. Philosophy needs to give far more attention to Usability. Popularization of philosophy might help people absorb the content of a philosophy, but that’s the most superficial aspect. Philosophies are not for knowing, they are for doing, for application to real-world situations. Way too many people, even philosophers, think a philosophy is a thing that is known, an object of knowledge. This is not true. Philosophies are that by which everything else is understood and known. And Usability in philosophy is the degree of ease with which things outside the philosophy itself can be understood. To do that, we must tap into the power of the tacit layer of understanding, intuition. Philosophies ought to be designed for intuitiveness — not a preexisting natural intuition, but an acquired second-natural intuition that operates without conscious effort.

Here, the Usefulness of the philosophy becomes important: useful for what purpose? Because the purpose of the philosophy determines its Usability trade-offs. Scissors make cutting easy and propping open a door or chilling perishable food not-very-easy. A philosophy engineered to make it easier to integrate the latest findings of physics and to overcome the human tendency to think in such a human-centric, human-scale way might be super-useful for writing provocative books and heavily-cited scholarly papers and building a reputation in an emerging school of philosophy, but it might not help many non-academics make sense of the things they encounter all day, reach understandings with people of divergent perspectives, respond effectively to events in their lives, and feel the importance of all this understanding, responding, communicating, doing and being.

This brings me to that passage from Morton’s book that inspired this post, and which brought to mind another passage, which I will quote after this one:

The undulating fronds of space and time float in front of objects. Some speculative realism maintains there is an abyss, an Ungrund (un-ground) deeper than thought, deeper than matter, a surging vortex of dynamism. To understand hyperobjects, however, is to think the abyss in front of things. When I reach out to mop my sweating brow on a hot day, the California sun beating down through the magnifying lens of global warming, I plunge my hand into a fathomless abyss. When I pick a blackberry from a bush, I fall into an abyss of phenotypes, my very act of reaching being an expression of the genome that is not exclusively mine, or exclusively human, or even exclusively alive.

This passage summoned to mind a quote by A. S. Eddington:

I am standing on the threshold about to enter a room. It is a complicated business. In the first place I must shove against an atmosphere pressing with a force of fourteen pounds on every square inch of my body. I must make sure of landing on a plank travelling at twenty miles a second round the sun — a fraction of a second too early or too late, the plank would be miles away. I must do this whilst hanging from a round planet head outward into space, and with a wind of aether blowing at no one knows how many miles a second through every interstice of my body. The plank has no solidity of substance. To step on it is like stepping on a swarm of flies. Shall I not slip through? No, if I make the venture one of the flies hits me and gives a boost up again; I fall again and am knocked upwards by another fly; and so on. I may hope that the net result will be that I remain about steady; but if unfortunately I should slip through the floor or be boosted too violently up to the ceiling, the occurrence would be, not a violation of the laws of Nature, but a rare coincidence. Verily, it is easier for a camel to pass through the eye of a needle than for a scientific man to pass through a door. And whether the door be barn door or church door it might be wiser that he should consent to be an ordinary man and walk in rather than wait till all the difficulties involved in a really scientific ingress are resolved.

Well-designed philosophies open doors, and let our human, all-too-human, irreducibly-human eyes see what is in there so we can understand it and respond as humans and follow our human purposes.

An autobibliobiography

Well, I tried to write about my books and how I want to prune my library, and ended up writing a history of my interests. I know there are loose ends, but I am tired of writing, so blat, here it is:

I used to have strict criteria for book purchases. To earn a place on my shelf (singular) a book had to be either a reference or a landmark. In other words, I had to see it as persistently valuable in my future, or it had to be valuable in my past as something that influenced me. My library was personal.

Somewhere along the way my library became more general. References grew to include whatever I imagined to be the basic texts of whatever subject I cared about. Landmarks expanded to include any book that housed some striking quote that I wanted to bottle up and keep. How did this happen?

When Susan met me, I owned one book, Chaos, by James Gleick. This book is the landmark of landmarks. Reading it was a major life event for me. It introduced me to two of the most crucial concepts in my repertoire: 1) nonlinear processes, and 2) Kuhn’s theory of scientific revolutions. I loved the philosophical fairytale of Benoit Mandelbrot discovering a radical new way of thinking, and then skipping from discipline to discipline, tossing out elegantly simple solutions to their thorniest, nastiest, most intractable problems, simply by glancing at them through his magic intellectual lens. He’d give them the spoiler (“look at it like this, and you’ll probably discover this…”) and then leave the experts to do the tedious work of figuring out that he was exactly right. And I loved it that the simplest algorithmic processes, if ouroborosed into a feedback loop, can produce utterly unpredictable outcomes. We can know the dynamic perfectly, and we can know the inputs feeding into the dynamic perfectly — but we are locked out of the outputs until the process is complete. And then factor in the truth that numbers, however precise, are only approximate templates overlaid upon phenomena! Nothing outside of a mathematician’s imagination is a rational quantity. And in nonlinear systems, every approximation, however minute, rapidly amplifies into total difference. I’d go into ecstasies intuiting a world of irrational quantities interacting in the most rational, orderly ways, producing infinite overlapping interfering butterfly effects, intimating a simultaneously knowable-in-principle, pristinely inaccessible-in-fact reality separated by a sheer membrane of truth-reality noncorrespondence. I used to sit with girls and spin out this vision of truth for them, serene in the belief I was seducing them. Because if this can’t make a girl fall in love, what can? I still hold it against womankind that so few girls ever lost their minds over one of my rhapsodies. They were into other stuff, like being mistaken for a person capable of losing her mind over the beauty of a thought, or being someone who enchants nerds and compels them to rhapsodize seductively. There’s a reason for all of this, and it might be the most important reason in the world, though I must admit, it remains pristinely inaccessible to me and an inexhaustible source of dread-saturated fascination. (If you think this is misogyny, you don’t understand my religion. “Supposing truth is a woman — what then…?”)
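
A minimal sketch of that locked-out-of-the-outputs feeling, using the logistic map (my own illustration, not Gleick’s): the rule is known exactly, yet a starting difference of one part in a billion swamps the prediction within a few dozen steps.

    r = 4.0                      # logistic map x -> r*x*(1-x), in its chaotic regime
    a, b = 0.2, 0.2 + 1e-9       # two "identical" starting values, off by one billionth

    for step in range(1, 51):
        a = r * a * (1 - a)
        b = r * b * (1 - b)
        if step % 10 == 0:
            print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.2e}")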

After I got married, my book collection expanded, reflecting some new interests and enthusiasms: Buddhism, Borges, and stuff related to personality theory, which became my central obsession. Somewhere around 2001 or 2002 I also became a fan of Christopher Alexander’s psychology of architecture, and I had my first inklings of the importance of design. (Incidentally, one of the books I acquired in this period was a bio of Alexander characterizing his approach to architecture as a paradigm shift. This was my second brush with Kuhn.) Until 2003 my book collection still fit on a single shelf.

In the winter of 2003 in Toronto, Nietzsche happened to me. Reading him, fighting with him, and being destroyed by him, I experienced intellectual events that had properties of thought, but which could not be spoken about directly. It wasn’t like an ineffable emotion or something that couldn’t quite be captured in words. These were huge, simple but entirely unsayable truths. I needed concrete anchors — concepts, language, parables, myths, images, exemplars — anything that could collect, formalize, stabilize, contain or convey what I “knew”. This is when books became life-and-death emergencies for me, and sources of extreme pleasure. I couldn’t believe you could buy a copy of Chuang Tzu’s sayings for less than the cost of a new car. From 2003 to 2006 my shelf grew into a library. I accumulated any book that helped reinforce my intense but disturbingly incommunicable sense of truth — what I eventually realized was a faith.

But then the question of this inexplicable state of mind and its contents became a problem to me. What exactly is known? How is it known? Why think of it in terms of knowledge? If it cannot even be said, then how can it be called knowledge? And the isolation was unbearable. I was in a state I called “solitary confinement in plain sight”, with an overwhelming feeling of having something of infinite importance to get across, but I couldn’t get anyone to understand what was going on or to consider it important enough to look into. I got lots of excuses, arguments, rebuffs, cuttings-down-to-size, ridicule and promises to listen in some infinitely receding later, but I could not find any real company at all, anywhere. This was a problem I desperately needed to solve.

Richard J. Bernstein’s hermeneutic Pragmatism is what hoisted me out of this void and gave me back a habitable inhabited world, with his lauded but still-underrated classic Beyond Objectivism and Relativism. Equipped with the language of pragmatism, hermeneutics, phenomenology and post-empiricism (Kuhn, again) I could account for my own experiences and link them to other people’s analogous experiences. Not only that — he began my reconnection with design, which had become a meaningless but necessary source of rent, food and book money. I was able to reengage practical life. But Bernstein’s method was intensely interpersonal, an almost talmudic commentary on commentaries ringing a missing central common text.

Richard J. Bernstein’s bibliography, however, was the flashpoint for my out-of-control library. Each author became a new collection. Kuhn, Feyerabend, Lakatos, and then eventually Latour, and then Harman and now Morton… etc. Geertz seeded an anthropology and sociology shelf, which is now a near-bursting bookcase. Hannah Arendt is a whole shelf, and spawned my collection of political books and my “CDC vault” of toxic ideologies. Gadamer and Heidegger were another space-consuming branch. Dewey, James and Peirce fill about three shelves. And Bernstein’s line of thinking led me directly to Buber, who also breathed fire into my interest in the research side of Human Centered Design (another half a case of books) and sparked a long process of conversion to Judaism (yet another half-case, and growing).

A bunch of these threads, or maybe all of them together, drove me into Bruno Latour’s philosophy. Latour inflicted upon me a painful (and expensive) insight: Everything Is Important. Statistics, accounting, technologies, laws, bacteria, materials, roads. Therefore I must get books on everything, apparently. With this we finally ran out of room in my bookcases, then my library room, then our house. We had to get a storage space to cycle my out-of-season books into and out of again when I realize I must read that book right now. Susan just got a second space. I have books stacked up everywhere. I am a hoarder.

I am considering putting all these books back under review, and keeping only the books that fit those two original criteria. Is it a landmark for me? Is it a reference that I know I will use?

I cannot be everything, and I need to stop trying. I need things that help me stay me, and I need to shed the rest. Good design demands economy, tradeoffs, clarity of intent. I have a bad case of intellectual scope-creep. It is time to decide what is essential, and to prune away nonessentials so the rest can grow in a fuller way.

I have another half-written post I think I’ll finish now.

Hyperobjective spew

I’ve gotten sucked into Tim Morton’s Hyperobjects. I was reading Kaufmann’s book on Hegel, but after sampling a few pages of this book on the recommendation of a friend, Morton’s book felt “next”.

A few random notes:

This territory, settled first by Actor-Network Theory (ANT) and developed further by Speculative Realism, truly feels like where the philosophical action is. It is pro-science but anti-scientism, which matters quite a lot, given the left’s metastasis into an aggressively intensifying and spreading scientistic fundamentalism. It is built on the Pragmatist platform, as all good contemporary thinking is. It addresses our basic moral impulses along with our conceptions, and who cares about whatever doesn’t? This movement is for thinking folks beyond the academy. I have come to loathe papers meant only to boost some professor’s scorecard. (Back in the day I designed the interface for a system for capturing academic accomplishments for evaluation, so I know the game academic careerists must play to win. Whoever let the MBAs into the dean’s office deserves to be shot.)

This book definitely fits in the Object-Oriented Ontology (OOO) genre. As a genre, OOO seems not only influenced by, but highly derivative of ANT, and especially Latour, in its delight in dizzyingly heterogeneous lists designed to inflict ontological whiplash, and its ironic oscillations between light whimsy and the heaviest dread. I am writing this post from Paris, and I have to wonder if this literary texture doesn’t have something to do with Latour’s Frenchness. If there is one thing the French are not, it is streamlined. OOO is an unstreamlined genre. OOO profuses.

I’m struggling for a style for my 4-page pamphlet, so I’m a little genre-sensitized right now. I crave severe streamlining, to the point of geometry. The reason is that I want to provide a minimal skeleton or scaffolding for thoughts, not the thoughts themselves. Now that I’m writing this, maybe my genre is the genre of design brief. This is consistent with one of my core themes, that philosophy is a species of design. If this is true, and I am no longer inclined to doubt this background faith or its implications, wouldn’t this kind of design, like all others, benefit from a design brief? Design is directed by an intuited problem. Normally a problem is implicitly and instinctually felt by isolated individuals (as inspiration), or no problem is felt (as feeling uninspired). If framed explicitly as a brief, inspiration is socialized and made available to groups of collaborators. Briefs themselves are designed things, and my favorite kind of design is brief design. (By the way, a couple of months ago I developed a simple method for codesigning briefs that feels extremely promising, and I need to write about that. Note to self.) I think this pamphlet might be a universal design brief for designing design briefs. You know I love to stack some metas. This insight may be a breakthrough, or a yerba mate overdose, or both.

Another thing I’m noticing that I like about OOO is that its metaphysical surveying work seems right on. The property lines it has drawn between being and alterity, knowledge and reality are very close to my own. The only conception of religion that has ever made sense to me is the cultivation of relationship between the knowing self and the barely-known reality of which the self is part. Speculative Realism seems built on this well-surveyed property, each herm in its proper place.

And if I am not mistaken, according to this survey, transcendental and transcendent are diametric opposites. In understanding, the transcendental is what we bring to the table of knowledge, and the transcendent is what not-we brings.

Drawing on every side of the brain

In high school, all my art teachers taught us to draw and paint the shapes our eyes “really” saw. We were discouraged from drawing the things we believed we were depicting — eyes, noses, vases, cow skulls, gourds, drapes — and encouraged instead to draw the shapes that were said to precede our objective interpretations. We did zillions of blind contour drawings. We drew and painted shapes instead of trying to model the dimensional forms we believed were there. It was an interesting experience. I learned to shift into a trancelike consciousness that made the visual world hyper-vivid, and disabled speech.

Toward the end of college I met a prickly teacher who demanded a different style from her class. Now we were to observe, analyze and model forms. She taught us methods for rendering various three-dimensional effects on flat planes, so we could translate the forms we had learned to understand in space into what charcoal and paper could convey. It was an incredibly difficult shift, which I experienced as an undoing of years of skill development.

In the years after, I developed other kinds of visual thinking, but they were all remote from figurative drawing. I learned to compose pages and screens to aid in comprehending complex information. Shortly after college, I experimented with translating musical compositions into visual ones via the language of mathematical ratios. Most importantly, though, I developed an ability to collapse complexity into simple visual diagrams, which are tools for conceptualizing information: not only organizing existing data, but framing incoming data on an ongoing basis. They are visual hermeneutic tools. I philosophize visually first, and even when I translate the visuals into words, I keep wanting to retain the visual qualities, which might be why I’m tempted toward prosody. Not for the sake of sounds (or not primarily), but for the sake of structure. I want important thoughts to be expressed in linguistic crystals.

Now my job has me doing figurative drawing again, but in a style that drives me back into those left-brained natural habits of seeing and drawing I worked so hard to break and replace in my teen years. Now I am sketching with the goal of communicating complex ideas as simply as possible. It is somewhere between cartooning and writing in pictograms.

My life as a visualizer-thinker has led me on a tour through my brain and shown me how many ways we can bilateralize what we see and know.

Why you should be mad about Lean Startup

Lean Startup externalizes usability costs to users.

To combat this practice, if I find a usability issue I call tech support and have them walk me through the interaction. These calls cost a company a significant amount of money and make it less profitable for them to skip the user-centered design steps that ensure a decent experience for users.

I urge everyone who cares about design to do the same. Stop wasting your time and energy trying to figure out how bad designs are supposed to work, and start wasting the company’s resources instead.

 

The long story on Lean Startup:

Before Lean Startup, companies invested in user-centered design processes, including usability testing, to ensure customers’ tools always worked well. The highest priority was given to protecting customers from design mistakes that inflicted frustration and interfered with their lives. Software was released only when the flaws were fixed and the software was ready for human use.

Lean Startup changed all that. It advises companies to not invest money in design and research, but instead to release the software sooner, even though this is likely to expose customers to usability errors, frustration and confusion. Rapid release cycles enable the problems to be spotted in analytics and quickly corrected. This enables the company to accelerate software improvements and outpace competitors.

With Lean Startup, it’s all about competing to be the best product first. It’s all about the company’s product surpassing the competitor’s product — not about the customer’s tools working as they should and providing a great experience. It’s all about how good the company’s software gets, not how bad their customers feel while using untested, hastily hacked-together interfaces.

 

Tool users vs service users

I am not one of those people who sees service design as the grand catch-all for multi-touchpoint multi-/omni-channel experiences.

I feel the same way about “service” as I did in the early aughts about the term “user”. These words imply relationships between what is designed and the person for whom it is designed. Designing for the wrong relationship means misframing the design problem. “User” implies a tool relationship. Users use things as a means to accomplish something. Of course we can apply the word ‘use’ broadly and see a movie as something an audience uses for entertainment, or a concierge as something a visitor uses to get local information, but this breadth is purchased at the cost of consequential subtleties. What we need and expect from a word processor is different from what we expect from a concert or a bank. Discovering exactly what those needs and expectations are, and developing satisfactory resolutions of them, calls for different methods. The mistakes UX has historically made were often tied up with insufficient sensitivity to these distinctions. The same is true of “services”. We can reduce a drill to one component in a hole-making service that spans a journey from discovering a need all the way to resolving it, and, yes, much is gained from seeing it this way, but if we are not careful, important distinctions can be lost.

And in fact I do believe certain things are currently being lost by this framing. Software as a service (aka cloud computing) has changed norms around how software is supposed to behave. We are now accustomed to thinking of web-based software as something that belongs to someone else, which we are licensed to make use of. A decade ago, users were more likely to perceive software as a tool to own, learn and eventually master. Upgrading was a purchase decision resembling the decision to replace a pen or a hammer with an improved model, not a periodic change that just happens and requires us to adapt.

This seems mostly OK in many cases, especially where tools serve as front ends to services, for instance banking, accounting, or databases. But for software tools used for making things — word processing, image editing, ideating, music creation, even blogging — changes, especially subtle ones, distract from the tool’s purpose, which is to be an invisible extension of a user’s abilities. It is important that such tools be utterly predictable, controllable and unobtrusive, so the user can exercise mastery over the tool and keep complete focus on what is being produced. I am concerned that software designers have lost all awareness of this goal. They are focused on different problems.

Years ago I was struck by the elegance of James Spradley’s research method typology, defining them not by technique, but rather by the role played by the research informant. Surveys are performed with respondents, tests with subjects and ethnography with informants. I think a similar approach could be helpful for classifying design methods. Perhaps we could gain clarity by paying less attention to medium or channel of delivery and more attention to the kind of relationship we are trying to develop through our design between the designed thing and the people for whom it is intended.

Obtrusive conveniences

A design trend that disturbs me intensely: obtrusive conveniences.

What makes these conveniences obtrusive is that they make it incredibly inconvenient to refuse what they offer, so you end up fighting for control over what you are attempting to do.

An example that is driving me away from iOS is text selection. Instead of giving the user direct character-level control over selection, iOS tries to divine the user’s intention. Are they selecting just a character? A word? A text block? It never gets it right, and the effect is one of fighting for control.

Autocorrect also blows it constantly. If you use unusual words, it changes them to common ones for you. It is like one of those idiots who insist on finishing your sentences despite having no idea that you might be saying something they don’t already know. I can’t believe Jony Ive hasn’t done something about how much effort it takes to type his own name against the digital will of the devices he’s made.

And these behaviors are not even bad in a consistent way across apps. Now a new breed of “creative” coder has entered the scene who feels he can improve “the experience” by adding his own innovative flourishes to text editing. Now every editor you use has different behaviors around selection, spell checking, formatting, and so on. Sadly, the more powerful HTML becomes, and the more empowered designers and developers are, the more inconsistent the overall OS platform user experience becomes. “Learn once, use often” has been replaced with the utter chaos of second-rate ingenuity. The very editor I am using now (WordPress) is one of the worst offenders.

And don’t even get me started on autocomplete. When everything is optimal — the device is running smoothly, the internet connection is fast, and the user is typing accurately — autocomplete is great. But things are rarely optimal, so what actually happens is painful delays between keystroke and result, leading to mistyping, leading to attempts to delete and correct, with missed keystrokes and that same desire to escape being helped so ineptly.

Behind it all are philosophical principles I can feel palpably in these interactions. For one thing, there is no awareness that this product is one element of much larger experiences. There is the experience of using the device, something few developers consider anymore. There is the experience of trying to get something done. And of course, there is the experience with organizations over time. Human-centered designers think about these overlapping contexts and design with them in mind, but in recent years companies have come to the opinion that iterative trial and error with ludicrously short development cycles, which leave little or no time for testing, will get them to a great product faster than being thoughtful or thorough. In all of this I detect a relapse, away from empathic discipline (thinking subjectively in terms of experiences) back into obsessive making of objects (which are still called “experiences” by people who like the idealistic tone of the term and the mouthfeel of X). But what bothers me more is the sense that these corralling conveniences are fine for most people, who don’t really need control and who are happy to say and do what is easily expected. In these near-irresistible conveniences I feel a sludgy flow toward a brave new world of lethargic uniformity where everything is dittoing, me-tooing, LOLing, emoticoning from a shrinking repertoire of publicly recognized, standardized experiences.

If any individuals are still out there, consider this a liberal beacon. Hello? Hello?


Gadget-porn addiction

Apple used to innovate by asking “Wouldn’t it be great if people could ____?” This was what made them uniquely great.

Now Apple does what every other banal tech company does and asks “Wouldn’t it be great if we could make a thing that could ____?” Or even worse “Wouldn’t it be great if we made a thing that has ____ characteristics/features/specs?”

This is why Apple keeps coming up with the same ideas as everyone else in the industry and why none of what they do matters one bit, however much their gadgets get hyped by gadget enthusiasts.

This hyping is part of our problem: great designs are better to use than to obsess over and talk about. Most of what is best in great design is hard to talk about and boring to read about. Great design tends to disappear. But cool features, record-setting specs and thrilling visuals generate buzz and drive short-term sales.

*

I think our culture’s gadget porn problem might be destructive in ways that parallel our culture’s sexual porn problem.

Just as pornography confuses and misleads youth about healthy relationships between partners, gadget porn confuses consumers about healthy relationships between people and things. In both cases, what is most healthy is quiet and not much to talk about but makes life much better. Addiction to lust drives people into cycles of craving, temporary satiety and empty boredom.

When design isn’t rewarded in the market, companies stop taking it seriously. They don’t invest in making products that are great to use; they make sexy-looking gimmicks that open wallets. Our tools start out as pleasant diversions and end up as perpetually irritating distractions.


Why I get emotional about design

When I use a product, I feel the milieu that produced it. Products are crystallized philosophies. In a designed object I feel people — the people who produced it and sometimes a precise person for whom an object is intended. This “personal from” and “personal to” is what makes design what it is.

When I am inspired by a design or offended by a bad one, it is precisely this personal from and to that I am reacting to. In objects I sense all kinds of things about the producer: care, contempt, insight, vanity, poetry, banality, tyranny, playfulness, thoroughness, orderliness, arbitrariness, etc. And I also sometimes detect a consumer’s personality and worldview (for better or worse) — a person the producer had in mind, for whom this thing is intended. And all too often I feel an anonymous vacuum where a producer or consumer should be. It is a thing from nobody and for anybody.

*

When I’ve felt betrayed by design it is when an organization did a “personality switch” on me, like an unfaithful friend. I can feel that the organization has come to see the world in a new way where there is no longer space for me to exist. The organization used to make things designed for me, but now they’re designing for someone else, or worse for everyone, which really means nobody.

*

Since we are once again in gift season, I will repost my “Design as gift” idea yet again, with the usual minor variations.

When one person gives another person a perfect gift, the gift is valuable in three ways:

  1. The gift itself is intrinsically valuable to the recipient. The gift is good because it makes life easier, more pleasant or more meaningful.
  2. The gift contributes to the recipient’s own self-understanding and sense of identity. The gift is a concrete example of what the recipient experiences as good. It is a crystallization of the recipient’s ideals that reveals something important about the recipient, something that sometimes cannot be said.
  3. The perfection of the gift is evidence that the giver cares about and understands who the recipient is as an individual. The successful giving of a perfect gift demonstrates that the giver was moved to reflect on the recipient and has real insight into who they are as an individual, what they value and how they fit into the world.

Great design experiences are similar to gifts. When a design is successful, the user gets something valuable, sees tangible proof they are valued and understood, and experiences an intensification or expansion of their sense of self.

 

Formalizing relationships with the formless

Formless realities cannot be grasped with formal thinking, but our relationship with formless realities can be.

Formally grasping our relationship with formless realities makes these relationships with formless realities more bearable.

 

This is mainly a note to myself at this point. It feels important, so I’m posting it.

It’s the experience, stupid

People think software is becoming more frustrating because the world has become more complex.

This is false. Software is worse because development has been drastically accelerated. The shortened cycles leave little or no time for the best design practices that ensure real people experience updates as useful and usable. QA testing often suffers, too, and software is released with major bugs.

This is all by design. The following passage comes from page 4 of the Bible of this development approach, The Lean Startup by Eric Ries:

I’m a cofounder and chief technology officer of this company, which is called IMVU. At this point in our careers, my cofounders and I are determined to make new mistakes. We do everything wrong: instead of spending years perfecting our technology, we build a minimum viable product, an early product that is terrible, full of bugs and crash-your-computer-yes-really stability problems. Then we ship it to customers way before it’s ready. And we charge money for it. After securing initial customers, we change the product constantly — much too fast by traditional standards — shipping new versions of our product dozens of times every single day. We really did have customers in those early days — true visionary early adopters — and we often talked to them and asked for their feedback. But we emphatically did not do what they said. We viewed their input as only one source of information about our product and overall vision. In fact, we were much more likely to run experiments on our customers than we were to cater to their whims.

Traditional business thinking says that this approach shouldn’t work but it does and you don’t have to take my word for it. As you’ll see throughout this book, the approach we pioneered at IMVU has become the basis for a new movement of entrepreneurs around the world. It builds on many previous management and product development ideas, including lean manufacturing, design thinking, customer development, and agile development. It represents a new approach to creating continuous innovation. It’s called the Lean Startup.

If you read the book, it becomes abundantly clear that Ries thinks very much in terms of engineered things: software, organizations, innovations. And what he wants to do with those things is to improve them as rapidly as possible, through trial and error. This makes sense, given his background.

What Ries fails to consider, though, is the experience real people are having while he advances his project of continuous innovation. He is not thinking about what it is like for a real person to try to do something important with his latest “terrible, full of bugs and crash-your-computer” release. And he is certainly not thinking about what it is like to live in a world where most software is developed this way, and consequently is in a state of disrepair and renovation all the time. The “fail fast” trials of innovators translate directly into our own personal failures to get things done with reasonable effort, because our tools never work the way we expect.

This is currently what is thought of as progress in the industry. In the 90s and early 2000s, though, the software industry was progressing in a different direction. Back then, more and more people began talking about designing experiences. What was meant by “designing experiences” is that when we design, our ultimate product is not the object we are engineering but the subjective experiences people have when they use it.

But somewhere along the way, experience became a cool euphemism for “thing” with no reference whatsoever to real people or the experiences they have. People now work on their “experiences” and it doesn’t cross their mind to wonder how you, or any other actual human being, will experience the thing they’re building.

So, the next time you go to open some software and cannot figure out how to use it anymore, or when software updates and crashes on you, or when you feel a pit in your stomach because you notice one of your apps has an update — just know that the owners and investors responsible for creating that software probably read this book and thought it sounded like a pretty great idea.

One day, when we look back at this time in our history, maybe our minds will boggle that the folly of this approach wasn’t obvious to everyone. But for now, we’re just bobbing in this boiling broth, singing “ribbit”, and blaming technological progress and ourselves for what is in fact an industry-wide brain fart.

Luckily, I got out of UX (user experience) just before it was taken over by Lean Startup, and designers were demoted to front-end prettifiers and design researchers were pushed to the margins of the process, if not out of it altogether. I have no professional skin in this game. But as a user, I do still have quite a bit at stake. I would love to spread my enlightened frustration as far as possible.