Tuesday, March 30, 2021

I'm very sympathetic to feminism's push against traditional gender norms, because I myself dislike and distrust such norms. If I don't like to be held to them, and I feel my own way is a better way and the right way, compared to an artificially fixed "standard" that knows nothing about me as a person, then why would I think a woman would like it, especially if those norms have been more restrictive toward women? Though I can't say I'm a woman, I can say, as close as means anything, "I totally get it."

Change takes raising awareness, and raising awareness sometimes seems to demand steps that would otherwise be less than ideal. Though I wholeheartedly support breaking down gender norms and oppression, and often find evidence of such norms (not to mention oppression and subjugation) to be distasteful in old cultural artifacts, there is one step I prefer not to take, myself, because I don't consider it right. That is, old-fashioned gender norms are no good to enforce on anyone, let alone today, now that we know better; yet, personal tastes and distastes aside, I do feel we should stop short - or, anyway, I will - of seeing all evidence of such norms as wrong. If a person wishes to be traditionalist, and this harms no one, then that should be as acceptable as anything else. And if a character is portrayed in fiction as traditionally masculine or traditionally feminine, with the usual specialization and division of responsibilities implied, then that is not in itself evil in any way. If we may be influenced by literature toward what's in it, we may also be influenced by literature away from what's in it. The argument that depicted violence makes us violent ourselves has gone around a million times. Does it? The evidence suggests it does not, or not in particular. There's even a case to be made, though I can't vouch for it with any numerical evidence or even a clear opinion, that fiction and virtuality are the place for violence, and that putting it there makes us safer.

Personally I do suspect that boys/men and girls/women, in aggregate, have a few instincts and motivations that are a bit different - there's plenty of evidence for that in the rest of the tree of life, especially in the branch we live on, and in human history. That is, while I'll hear arguments and evidence with interest and agree with many of them, I don't believe "it's all socialization." Now, we may want to cast the net far and wide so that we can change as much as possible. That's important and probably necessary. But I will, myself, out of genuineness, stop short of supposing that every supposed difference is an oppressive, patriarchal conspiracy. Maybe girls do tend to like dolls better, on average, and that drives the market for dolls, rather than the patriarchy conspiring to get all girls believing they should be baby machines, and brain-washing them with dolls. The latter explanation, I must be honest, sounds like a conspiracy theory, and poorly thought out. Still, the establishment sells dolls to girls at great profit, and mocks boys for liking dolls, and this creates further differences where either there weren't any, or those differences were natural, adaptive, flexible, subject to circumstances and evolution and personal strengths and weaknesses and learning and role models and dreams and so on and so forth.

Having said all that, representation makes a difference. That is also true. I'd like to apply that standard to new representation, and let old representation be what it is and was without cursing it. That's my way. It feels like the best and most sensible way to me, and so I follow it, and will put in a good word for it here and there.

So, speaking of good words, I thought I should say: fuck the patriarchy!

I understand that raising awareness isn't always easy, and doesn't please everyone, and will at least temporarily make a few people enemies, unfortunately. That's to be expected. Count me in on the side of feminists everywhere. Do know that I put intellectual honesty on a pedestal, and that's behind both my criticism and my agreement. The truth is out there, as the tagline goes, and a lot of feminism is right there in that truth.

Thursday, March 25, 2021

Amnesia is one of the oldest tricks in the book in fiction, so much so that a friend of mine who's a writer dismisses it everywhere it occurs. I myself feel similarly about zombies.

Allergy notwithstanding, we both appreciate particularly surprising examples of our respective allergic tropes. For my part, it isn't as if I hate the idea of zombies. Not at all. I thoroughly enjoyed 28 Days Later, Hot Fuzz was pretty great, I'm highly entertained by both cuts of Romero's Dawn of the Dead, and even a cruddy B-movie like Zombi 2 can become something I remember fondly. (On the other hand, I hated the zombie-inspired finale to Skyfall - a huge pile of utterly senseless, numbing violence that's deafening in the theater - though I will agree it's semi-interesting for the fact they aren't actually zombies, and I liked the rest of the movie.) What puts me off is seeing a zombie variety of absolutely everything. That level of popularity feels overpoweringly gullible to me. Meanwhile, amnesia is objectionable because it's such an easy character and plot hack.

Ultimately, it isn't that a story using either of these can't be any good, but eventually it becomes difficult to suppress fatigue and irritation with everyone's copycat gullibility - if too many people do "the wave" for too long, that ends up feeling lazy, uninspired, and uninspiring. Excess unoriginality has a curious way of making originality itself seem futile or worthless. That's depressing.

One trope related to zombies lands better with me, though. This is the one where everyone seems to have been replaced by someone who looks like them, but isn't. The house, town, or world is filling with imposters, and the only person who notices (or the only human left?) talks about this and is treated like a lunatic. A cliche for sure, but it's a bit less common than zombies, and it can be done really interestingly - The Thing, Invasion of the Body Snatchers, etc. There's usually a psychiatrist or house arrest or mental ward (rather overplayed and unrealistic, so I'm less a fan of that aspect). The mental health reference is no coincidence, because the trope comes from a real condition called Capgras Syndrome, which you can look up if you're interested. For what it's worth, I know two people who experience Capgras Syndrome now and then, and I've been on the "imposter" end of it, hearing the news that someone pretending to be me was on the phone earlier - only, it actually was me. In fiction, this typically serves as a metaphor for situations where everyone disagrees with you, but you're convinced you're on to something, and it turns out you are. The perception divide can make the rest of the world seem hostile and dismissive, and we need stories about getting through the situation well.

I'm beginning to feel about AI the way I do about zombies and amnesia. The topic is all the rage, and whenever something's all the rage, you get exposed to a superabundance of stupid remarks and attitudes. Aside from rabies and a few similar biological conditions involving sporulation, I'm not sure zombieism has ever existed in reality. (Of course, by extending the metaphor, we can refer to consumerism and status quo bias, etc.) Forgetting and computation, on the other hand, are here to stay; amnesia and AI are phenomena with perhaps deeper universality. There is no escaping them for long, I'm afraid. But I actually am very interested in AI, to the extent that I took a course in graduate school and have attended a number of AI conferences as a tourist. I'd simply like to see it talked about differently.

Bandwagons can be off-putting, depending on your personality. Another phrase you could associate with the phenomenon is "cargo cult," and that implies the part I don't like: people buying into stuff wholesale because they keep hearing it. A superstition is born. Don't throw the baby out with the bathwater, but don't carry the bathwater with you everywhere you go just because others do. That's gross.

There is no reason not to let your imagination roam, of course. We've seen an explosion in AI, and no one knows exactly what that will mean 20 years hence. But I'd like to see AI included in ways that scrupulously avoid giving the sense that the idea has been inserted to tick boxes that say "I am current" and "I think tech is cool" and "AI is the GOAT" (for the record, that last box would amuse me, as it's the kind of thing my students would say). AI today is like magic. You can do anything with it. Any problem imaginable in your story can be injected with a futuristic AI. It's duct tape for the imagination. It's discount sealant for plot holes.

What I'd love to see more of is AI treated as a response to some unexpected diagnosis. 2001 and Her don't just mention or use AI; they start good conversations that continue years or decades later. Raise a thorny question. Add something no one has said yet, or no one has explored much. Either that, or approach AI (or any popular topic) from an especially realistic angle, so that I feel more informed about what might actually happen in the next years and decades. That kind of use instantly gains my respect.

If you fail, that's ok. I can tell if you're trying, and that itself gains my respect.

Friday, March 19, 2021

A good principle for representation in stories, one I've heard and want to pass on, is that the ratio of races/identities among characters should be the same as in the society at large. If the story digs into inequities, then of course a more exact realism can make sense or become necessary (and the digging will give balance). But, for example, it's at least slightly noxious or untoward for a story about programmers in America to involve only white men. That is, don't break down the current subculture demographically, since its makeup could easily reflect prejudice in hiring. Represent the wider culture. Use the real background numbers. Roll some virtual dice. Or hand-pick your cast, but count. Only deviate from the rule when there's a specific reason. Fiction allows and asks us to draw and change with awareness. And this isn't a strict rule - that is, I'm not proposing eviscerations where it's broken - but it is a good guideline. I can't think of a better one.
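
If you want to take "roll some virtual dice" literally, here's a minimal sketch in Python - the proportions are placeholders, not real census figures, so swap in whatever background numbers you'd actually look up:

    import random

    # Placeholder background proportions - swap in real census figures.
    background = {
        "white": 0.60,
        "hispanic": 0.19,
        "black": 0.13,
        "asian": 0.06,
        "other": 0.02,
    }

    def roll_cast(n, proportions, seed=None):
        # Draw n characters according to the background proportions.
        rng = random.Random(seed)
        groups = list(proportions)
        weights = [proportions[g] for g in groups]
        return rng.choices(groups, weights=weights, k=n)

    print(roll_cast(8, background))  # a supporting cast of eight, say

Hand-picking and counting amounts to the same thing, minus the randomness.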

By the same token, half of characters should be women, or even slightly more than half. I don't care as much about religion, so I'm not emphasizing it, but the argument is the same, and it might make for a better story. To me, trying to match the wider demographic background actually suggests more creative possibilities than trying to profile the audience and cater to them with what they tend to like. The latter easily slips into pandering and cliche. The former is a source of ideas you wouldn't think of. Imagine that.

There is an argument that varying race/gender/etc can make the story seem to be about that when it isn't. But I'd counter that this mainly seems true because the rule isn't followed enough yet, and so when it is, that stands out. If it were usually followed, it wouldn't stand out, and that might result in an actually more effective spin on color-blindness. In other words, with representative mixing we'll stop thinking everything is about race and politics, because representation has reached parity and we've gotten used to seeing what's actually there.

-

Semantics is important because if two people don't agree on the meaning of a word and don't even realize it, they'll be talking past each other. And if they realize that a seemingly larger disagreement depends on differences in understanding a word (or corresponding situation), then they have gotten somewhere.

No one is obligated to agree with anyone else. Too often in a discussion there's this sense that Thou Shalt See It This Way Or Else. It's never said, really, but it's often implied by tone or dogged repetition.

The only legitimate way to command agreement, in my view, is to be transparent about logic + evidence. What are the logic and evidence that led you to believe what you believe? If those pieces hold up, they more or less speak for themselves. If no one can find the problem with those pieces, or provide an equally compelling logic + evidence argument, then this is the reigning answer, not because anyone is bullying anyone, or demanding anything, but because we live in a material universe that operates according to logic. That is how you establish truth if you find it important to do so. There is no better or more reliable way, and there is no reliable shortcut.

You definitely don't have to think so - no one does, our thoughts are free - but if you don't think so, I would bet you large amounts of money (if I have it) that this will show in the inaccuracy of your beliefs and your thinking. And if I make that bet, I will be walking away richer.

-

This is the game of debate. And it is a game. 

And realizing it is a game makes you better at it, because this allows you to step back and see with a bit of detachment, and therefore from more angles. Clinging to one of them isn't conducive to finding the right answer, or proving you've found it if you have.

Ironically the most productive debate is one in which you can humor a position you find incorrect, and others can do the same. It's the attitude of play that brings flexibility to thought where emotions may run high. Expressing a stronger emotion than someone else doesn't establish that you're right. Often what it does is get you caught up in an inflexibility that impairs the process.

Global poverty is serious. But if you can't talk about it as if it weren't serious, you are doing your objectivity - and your debates - a disservice. This is counterintuitive, and that's why so many debates go badly. Debate is a game. It works better that way. It's more productive that way. If you love someone, set them free. If you want shared truth, open your hands, let go of certainty, and play around with ideas that may seem untrue.

The appearance that you will refuse to even go there - refuse to even consider a possibility - may look strong and persuasive to some people, but to a slightly more sophisticated reasoner, it looks as if you're on crutches and don't know what's going on.

Everyone can benefit from watching and rewatching the movie 12 Angry Men, I suspect, including me. The protagonist is a hero because he's a better debater, and he's a better debater because his stance is flexible. Even though his motives and his questioning are principled and insistent, his stance is flexible. Everyone else is on crutches. He can dance. And that gives him more clarity, not less. Compared to them, he is playing. He knows debate is a game, even when it's deadly serious.

Wednesday, March 17, 2021

Wreakhaf

Possibly the most ridiculous thing in my life is my relationship with decaf coffee. It used to be a support I had to rely on. Sounds strange, I know. But I'm what's called a "slow metabolizer" of caffeine, the slowest kind (two copies of the slow gene), so a very small amount (a decaf coffee could have 6-30 mg) can go a long way. Also, caffeine messes me up thoroughly about ten different ways. At the same time, I have absolutely needed help to get to work and focus when my energy levels are low. ADHD, depression, and a sleep disorder can make for a mighty fine punch in the face.

Then you get the first hints of a migraine from the caffeine you either did or didn't have today. Damned if you do, damned if you don't.

Before returning in mid-2019 to the especially mild antidepressant/stimulant first prescribed for me almost 20 years ago (I've found a secret weapon that makes it actually work, a little-known amino acid that's always found inside us), I had this love/hate thing going with caffeine, even decaf coffee, which, as I said, is what I usually used.

Since starting the medication, I react worse to caffeine than before, it helps less, and also I need it less. Yet the thing isn't entirely gone. Sometimes I feel I have to fall back on it. Nothing like teaching or driving with your eyes closing themselves. Don't even try it! The latter can kill you and someone else both, someone who doesn't deserve your stupid end. And the former is sheer misery and almost can't work.

Decaf coffee these days makes me considerably more ADHD. But I still have the addictive yearning for it, even now as I think about it, and it's been maybe a week, and I've been doing rather better. My focus has lengthened. The walls of impossibility have come down. I can decide to work on something, and then it gets worked on, without that much fuss.

-

The best criticism of an idea is that it isn't true, why it isn't true, and in what way it's deceptive and consequentially so. Few ideas are entirely untrue, so to cleanly call untruths untrue, it's important to point out what's true about them, or might be, or could be construed.

There's nothing for exposing the spurious quite like tracking the veritable by its footsteps as someone unwittingly lost it. We banish its harms by untangling a few ways perception is harmless, untangling until revealing the proverbial Hell Blvd. No one who ever stepped on the road of Good Intentions ever stepped on it with Bad Intentions first, yet everyone who stepped on it ended up mistaken. It's the nature of a mistake.

When untruths are discerned, there's nothing wrong with them. Most of us love fiction. The only trouble with fiction, paradoxically, lies in not suspending disbelief. The trouble lies in not needing to - in believing the same way too many drink and eat, to excess - in not realizing we've chosen any excess, in not realizing it's fiction. Then we have, ourselves, by failing to see well, made fiction a lie, rather than a craft, a puzzle, a curiouser turn of the head.

There is no one right way to appreciate art, nor is there only one way to find the truth of the matter. Art, though, undertaken well, absorbed intelligently, can inoculate against the ruse of a more mischievous sort, or a more ignorant one.

Tuesday, March 16, 2021

I have a long-standing fascination with Peter Molyneux and the works associated with him. To some extent, leading others in making a new thing is about selling the dream, making it a mythology as you go. Molyneux was famous for talking big. His studios and the teams he led made some extraordinary works. The ideas were always a little bigger than the products. In my mind, this benefitted them before, during, and after.

Eventually, most people didn't see it that way. When the games he and his teams made stopped seeming extraordinary in any quality, and when the public lost all patience with him and his visionary style - these seemed to happen at the same time - the backlash was severe, downright mean-spirited, frankly vicious and hateful. It was so brutal - with the difficulties of half-failed projects on top of it - that he declared he was going to stop talking up his games. Generally, that's what he's done for the last 5-7 years.

Curiously, his company has released one minor game in that time. It's so uninteresting - I played it for a couple hours - that Wikipedia doesn't have an entry on it after 4 years.

The message implied - almost expressed directly - is that in some sense this is his unique way of working (that's basically what he says) and a gift (it is, though I haven't heard him say it) and somehow related to the creativity of the games (it is, and that should be obvious). People focus on the pragmatic question of whether the released product delivers on the promises in some bullet-by-bullet feature sense. They get disappointed. They believed the talk. They wanted to believe it. The dream was a beautiful one. Then their disappointment affects their impression of what did release.

I've noticed that the gaming public - the vocal complainers about things not working, the ones who go beyond mere complaint into insult and derogation and contempt - can be quite spoiled. Nobody who knows how a game is made, who has any idea how much work that takes and from how many people - and this is true about software in general, or anything very involved, with long development cycles - should feel justified in being so bratty and mean about the failures. Fortunately, most who know aren't bratty and mean about it. Knowing is enough.

Games are always about more than meets the eye. A board game involves - what, little plastic tokens? Slips of cardboard? A video game involves - what, changing the colors of dots on a screen? A few buttons to press? These things are incredibly insubstantial. The substance almost all happens up here, in the mind.

Games associated with Peter Molyneux are hit and miss. This is true of any creative person or group. The most pedigreed ones learn how to hide away their least interesting work. They go on and focus on the next great big idea, and realizing it through a thousand and one great little ideas. The first Fable was groundbreaking. It was a watershed in the development of RPGs as a form. The sequels? Well, I didn't play them. But they weren't quite so groundbreaking, and in the span of a decade or more working on all this, the team, and its outspoken lead, certainly had time to make a bunch of promises that never materialized. But that's what you do - you give it everything you've got, and the spaghetti that doesn't stick on the wall, you don't cry about it.

What the naysayers and wisecrackers and haters of Peter Molyneux fail to recognize is that games from his studios have often enough been cutting-edge. They've introduced new ideas and forms. Some of the fallout is that they may not be as fully realized as the products of teams who rely more heavily on existing tropes, grinding toward the next ship date. All game development is hard, in fact it's very hard, but when you take bigger risks, you face bigger unknowns. A lower proportion of what you propose will pan out.

If you look at Elon Musk, you see a similar pattern. How many of us drive a Tesla? Are we using his batteries daily? Has anyone gone to Mars, or even the Moon, lately? Yet we realize - many of us - that the hype and the machine are not entirely separable. No one is accusing Elon Musk of not working hard. No one is accusing him of lacking vision, or not bothering to communicate it. A few people think he's a dumb figurehead, but they're either ignorant or high on themselves.

Success is the result of a dream that carries on, outlasting countless failures.

(These were my thoughts while reading the beginning of a newer interview with the dude in question, here.)

Breakfast Breaks

If you want to figure out and apply truth in complicated situations, there's a technique. There are two parts, and everyone can improve at both.

First, seek out evidence. Focus as closely as possible on what is most likely to be fact - the "facts" that have the best chance of being factual. When you get a few, you will find that many of these are quite boring. It's easy to overlook them.

The secret is to find the most certain (ie, least uncertain) of the relevant facts. You know belief is all a gamble, but you are playing the long game and looking for the best gambles. How? If you really want to identify the best facts, the trick is extreme sensitivity and openness to the possibility that a "true fact" isn't precise, accurate, or true. That's the only way you can find the really good bets.

But when you do, the really good bets will (usually) look to others like any other statement. Often they are boring. The difference is that on careful examination, you could find no cracks. You strove to break these eggs, began to make an omelette over and over, but you couldn't break one or two. Those may be qualitatively different, more resilient. From a distance, without careful examination, such pleasant tints of nothingness usually don't appear. All the statements will seem like eggs, and most of the time, eggs aren't breaking. Most of the time, eggs are eggs. Most of the time, you can break them and make an omelette if you want. Most of the time, facts are facts. Here, though, you want to make a non-omelette with the rare eggs that don't break.

Second, find the most basic possible logic. The simpler the better. This, too, will seem boring. Like the egg-sorting, it will seem meaningless to the uninitiated, or to those who aren't applying the technique in this context. Simple logic isn't impressive. Simple logic may make you look naive.

So you have to stop to ask yourself. Are you trying to impress people, or do you want to make a non-omelette with the rare eggs that don't break? Making an omelette - impressing others - is a different pastime with different demands, sometimes quite opposed to what's needed for the non-omelette - ie, for finding and applying truth as well as possible. You have to decide what you're trying to do, discern/predict or persuade, and remember not to forget that decision.

It's easy to make an assumption for today and then misplace the memo-to-self that this was an assumption. Everyone does this. Not doing this is very difficult, and you need to choose your battles. In trying to persuade others, we often wind up forgetting about the things we didn't really know. We believe we know them. It's inefficient to consider all the ways we don't know what we're talking about when we're trying to push someone else to decide right now. And on the other side, it's inefficient to worry how others will take things when we're trying to discern and predict as accurately/precisely as possible.

Logic is brittle. It's both fragile and immensely strong. Your best bet with logic is to use the simplest logic. Just as you did for the facts that feed logic, you want to put aside how boring or clever or hurtful or nice or impressive or awkward this is, and look for the ordinary eggs (or ordinary cardboard egg cartons now, if you like) that turn out to be extraordinary by not cracking. The unambiguity of logic can make this logic-soundness egg-carton-sorting much easier than in the preceding "true or false facts" step. Just keep the logic ultra-simple, and if the facts it's working with really don't break, then the result of the logic is guaranteed. The trick is to make sure the logic is correct. It is not sophistication. It is correctness. Nature does not award bonus points or put stickers on more complicated solutions: logic that works works. Humans are not good at complex logical chains. We can only manage this by breaking it all down and using the simplest pieces.

(I can hear some of you thinking, as I am, that this is really my own limitation, which I'm trying to impose on others. Maybe. But I'll bet heavily that it applies no less if you have the highest IQ in the world, and if you do, I bet you already know it.)

This approach to expanding what you know runs against the social intuition that says objectivity and prediction are sophisticated and should involve impressive facts and mind-boggling reasoning. Good technique turns out, very often, to demand quite the opposite.

Let's recap:

- One, the "facts" must be unassailable or as close to that as possible (which means they often won't be surprising or appear any different from other statements that most people take as givens).

- Two, the logic must be unassailable (which means that every piece of logic must be made as simple as humanly possible).

This level of care over certainty is crucial in math, science, engineering, etc. It's how you make sure Mars orbiters don't crash into Mars after assuming the wrong measurement unit in some software recess. It's a lot of what's implied by "rocket science," yet it's accessible to everyone with a bit of practice.
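
To make that unit slip concrete, here's a toy sketch in Python - not the actual orbiter code, obviously, and every name and number is invented - of the pound-force-seconds versus newton-seconds mix-up behind the Mars Climate Orbiter loss, and the boundary conversion that prevents it:

    # One component reports impulse in pound-force seconds; another
    # consumes newton-seconds. Nothing in the types says which is which.

    LBF_S_TO_N_S = 4.44822  # one pound-force second is about 4.44822 N*s

    def telemetry_impulse_lbf_s():
        return 12.0  # impulse reported by the thruster model, in lbf*s

    def nudge_trajectory(impulse_n_s):
        # Expects newton-seconds; has no way to know what it was handed.
        return impulse_n_s * 0.001

    wrong = nudge_trajectory(telemetry_impulse_lbf_s())                 # silently off by ~4.45x
    right = nudge_trajectory(telemetry_impulse_lbf_s() * LBF_S_TO_N_S)  # convert at the boundary
    print(wrong, right)  # 0.012 vs ~0.0534 - same logic, very different orbit

The logic couldn't be simpler; what failed was an unexamined "fact" about which unit a number was in.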

It won't tell you everything. But it will give your perspective on a topic a solid skeleton. With the best skeleton you can get, you'll find a lot else works better. Muscles can do a lot with a skeleton. The sequoia of intuition begins to grow in fertile mud; speculation, while often fun, may not be so idle now, thanks to all the homework. And you may find yourself understanding more of what people tell you, and remembering it better.

We tend to separate "rocket science" and "mathematical proof" from everyday life. In complex real-life scenarios, it doesn't get any less important to find the core, most probable facts. And avoiding logical errors by keeping logic extremely simple doesn't get any less important. It's just that humans need initial training and continual practice to manage that. And most of us, at best, get that training in one or two areas, and don't apply it in other scenarios where we don't recognize it's needed.

What are the core, least fallible facts? What is the simplest, least fallible logic? Start there, if you can. Then take all the feedback you can get.

Either that or recognize that you are not discerning truths to use in understanding how stuff works and making accurate predictions. If you want to relate to other people and perhaps convince them, it's about relating to other people. Then the thing to understand is how they feel, and how you feel, and what stories led or can lead to those feelings. Sometimes it's also about using specific persuasive techniques, either to trick people into agreeing, or to help them find an idea - one you want them to remember and then act on - to be especially memorable. There are times when all of us appreciate the latter (an illustration that hits home), and occasionally most of us can also appreciate the former (just the right push), if it's done with care and concern for our own amusement or best interests, or at least society's.

Grokking versus convincing - they're at odds, but sometimes you're lucky and they can happen half-decently at the same time. Think of both kinds of influence on others (or the world) as writing, for a moment. Writing often involves drafting "hot," without much filter, and later editing "cold," which is almost all filter and correction. We can to some extent combine these modes. It's possible to write a thing once, even creatively, and move on, because it doesn't particularly need editing, or we have more to work on.

I can't comment much on that combining. It applies in writing, and an analogous duality (synthesis of opposites) applies in debate and influence. Some people (I'm one) will find you much less convincing if you don't spend much time in the more objective mode. So to persuade them (me, often), don't be persuasive. Put objectivity and uncertainty first, for us, despite wanting and hoping to convince. (I'll give you a chance. I'll give you many. You're welcome. And thank you.) And on the other hand, sometimes intuition or inexplicable feelings guide us to truths. I'm not going to be a rules Nazi about the advice I'm giving. And I know that there's a lot I don't understand about how these two modes combine. Sometimes they can. Often that goes wrong, though.

If you want to discern truth and apply it constructively, use the two steps. They won't banish intuition or feelings. Those will wait in the periphery and will have useful or crucial things to say. But know the difference.

Saturday, March 6, 2021

There are two kinds of perfectionism that are useful creatively.

One is what you could call "polishing perfectionism." That is, you care about every detail, because you know that the whole is made of parts, and anything off about one of them maybe really could negatively affect the whole. This is what most people usually mean by "perfectionism."

To counteract it, remember that in the beginning, there is nothing to polish. You can polish as you go, but that may be less than perfect as an approach. It can stop progress entirely, or short-circuit discovery. Polishing may be better - more perfect - later. (Perfectionists often like to procrastinate. Sneaky, maybe, but isn't it true?) Also, it helps to look at the words wabi-sabi and sprezzatura, and to remember that a lot of the charm of the hand-made is in the organic unity achieved from imperfect pieces. There is no such thing as every piece being perfect, only a perfect feeling of unity.

The other is what you could go all-out and describe as "creative perfectionism." That is, you are not motivated to make the kind of thing that already exists. It has to be new. (You can already see another bid for excessive procrastination, can't you?) This can work against craft, because so much of craft is learning from the masters and the skillful practitioners. Nothing is wholly new. Everything's made of quarks and bosons and spacetime, at the bottom. Everything's a pattern. All pattern is information, all information pattern. But patterns that are substantially fresh and new and exciting - this is entirely possible. Finding these can be very difficult, so creative perfectionism is about searching for what's actually new and good, not just settling for the tried and true, sometimes not even happy with "tried and true, with a twist." Like polishing perfectionism, this can block you, first because it deemphasizes craft, and second because while you are searching for the actually new, you are often not busy practicing.

To counteract this kind of perfectionism, remember that, much as polish can come later, and even better for that, inspiration, too, can come later. It's often in the middle of work that you see how to twist and turn and end up with something new. You are often very well served to begin with craft, begin with the tried and true, and keep pushing as you go for something unexpected. In other words, it is a near-constant frame of mind. It is usually not something accomplished up front, before even beginning.

You may be prone to both kinds of perfectionism, and to procrastination. I am, so I know what that's like. I have no other data, but my understanding is that these all tend to go together. One of the Achilles heels of people with perfectionism is that we know it can push what we do to a higher level. But we also know how exhausting that can be. As the saying goes, "There's the first 80%, and then the other 80%." There's an enormous difference between "maybe good enough" and "we did everything possible." Sometimes that difference is at least as much effort as it took to get to "maybe good enough." It can even become far, far more effort. But it's also true that it's absolutely necessary to finish things. People with a perfectionistic bent do not suddenly become deaf and blind when they accept that much of what they do will not be finished to their satisfaction, and that they will experiment with and practice on a lot that is hardly new at all. As long as you keep the searching instinct alive, and keep moving in one sense or another, your perfectionism is doing what it needs to do.

Thursday, March 4, 2021

Generally the advice when writing is to pick a topic and cleave to that topic with a vengeance.

There's another approach, though - what you might call "jumping off." Billy Collins these days is the most popular poet in America, some say the new Robert Frost in that sense (though recently Amanda Gorman's got a lot of lift). He describes the "jumping off" approach as starting with a very specific, concrete, easily-imagined thing. It could be a tiny detail. It could be anything you want, but it helps if it's firm enough to hook people and they have no trouble visualizing right off the bat. (Incidentally, "right off the bat" is easy to visualize but spoken by rote. Only rarely will someone imagine the ball. A less remarked on visual will be more actually visual. But it may suffice to take the cliche out of its usual uniform: "as the leather orb splits from a crack of wood" or whatever.)

I mean, let's call that the "diving board." You really can't bounce up and down on a diving board that isn't there. It's gotta be there. And if not, you've gotta have something to put your feet on and jump off of, regardless. Otherwise you won't be diving.

The point is, after you've bounced up and down a couple times, or done a run-and-jump, you're in the air, subject to gravity and air resistance and how you pirouette in free fall.

In this kind of writing, where you end up isn't known to you when you start. Nor do you expect to end up in a place and a fashion that could be predicted from the diving board. That's kinda the whole point.

Maybe the easiest format of all for this is the haiku. Pick two aspects of nature. Now write a few words linking one to the other.

Bippity boppit
bippity boppity boo
ty boppity boo.

With so few moving parts, it's easy to imagine that you don't know the finale when you set down the overture. You could know, of course. But you could also pick a rock and jump off. You have 17 syllables to fall at most. (All right all right, syllable count isn't that strict in traditional haiku, but you get the idea.)

The "jumping off" approach overlaps with "pantsing" (writing a story without an outline, ie, by the seat of your pants) but can differ in emphasis. The one is about starting zoomed in to a detail and discovering a larger tableau in stained glass, potentially universal - beginning and end stuck together with magical thinking or sheer voodoo. The other is about the intoxication of real life that we all feel: we don't know what's going to happen next, even though we have a hand in directing it.

In both cases, the start and the end are different, and the latter isn't quite predictable from the former, though it's connected in a way that makes sense step by step - otherwise it wouldn't be believable, compelling, moving, elevating, tragic, etc.

What's interesting about the jumping off approach is that the end can be - and should be - so radically different from the start. Billy Collins describes this in various ways, especially as "a left turn somewhere." To be sure, full narrative should have twists, too. We've got some overlap here, but there's also parallax.

Maybe that's the best way I can put it. You begin a "jumping off" piece with a super vivid hook that doesn't have to carry any import. The import is where it goes, how you zoom out. As you zoom out, you notice this slippage between where you were and where you seemed to be heading and where you are. It's like looking out the window of a moving car: the planes of distances seem to disagree on how fast you're moving - and whether you're moving at all. But when you look closely, there is a progression. The static background and the blurred foreground are interconnected, flat bead by flat bead, billboard sprite by billboard sprite. You know it's all one big block of invisible jello out there, wobbling, with those moving pieces you see stuck in it, waving about at different distances.

That's a bad metaphor. I won't defend it, but I'm too lazy to try harder. Never mind the jello. But I'll stick with the parallax. I think that can be useful. Also the diving board (probably one thing Billy Collins said, but other writers use it too - a useful cliche). And the relation to haiku, and storytelling in general, but the claim of at least a little distinction.

In a short story, the parallax between beginning and end is about some discrete realization, which usually is either in the character, or in you about the character. Stories without character aren't stories, and short stories are short. This may sound a little pat, but there aren't too many other options, if you think about it. (Try, though. Distrust what sounds a little pat. Even a small exception is interesting.)

In a "jumping off" poem, the parallax is the sleight of hand of a daydream. You take some perhaps mundane but vivid focal point, and then you progress by accelerating shades to daydream about something entirely different, but you leave a trail of breadcrumbs as to how you went.

We started with a diving board. Now we are escaping the Minotaur's labyrinth. What does the labyrinth mean for us as living creatures? What might it mean that isn't that, or that, or that?

Monday, March 1, 2021

I've written hundreds of poems, some really quickly, some through hundreds of revisions spanning years. One thing that's pretty consistent is that a lot of what I think about is - I'm worrying about the poem getting misread. So I'm removing interpretations I don't like, ones I didn't intend. Poetry is inherently multivalent. You're leaving stuff open. You throw open the doors. And you don't know what will walk in - or crawl in - fly in - scuttle in - from your unconscious - from sheer chance - let alone what will come in when a reader gets to the door. But you leave that door open, or it wouldn't be a poem. Sometimes it's only a crack. Sometimes you blow down the whole wall. Sometimes you leave nothing standing but a few chairs and potted plants.

But it's fairly consistent - at least for me - that I have to put in effort to remove interpretations I just don't want to be part of my poem.

The same goes when I write anything - when anyone writes anything - or says anything - whispers anything when alone in nature. It takes effort to cut out misinterpretation.

And if you try too hard at that, you wonder if you're killing everything vibrant. Sometimes you're worrying that you'll kill it all with any further effort whatsoever. Can't you just drop the words like - I don't know, flower petals in the sand at the beach on a windless afternoon, and whatever shape they make, that's the shape they make?

Sometimes you get stubborn. You know you could be misconstrued, and you think, goddamn you, I am not excising this tumor, and you're putting it here, not me. So if you want to take this gory mass through the door with you? Ok. Go ahead. That's you.

But when you invite like that, it continues to worry you. And you start to think, wait a minute, is this why psychopaths are a little more likely to be artists? No fear. Wouldn't that be nice. You just go with it. Someone wants to misinterpret? Fuck 'em. Who cares. Maybe they'll fawn on you anyway. You want them to misinterpret. You meant for them to.

A poem is simultaneously the most genuine and the most artificial expression.

Artificiality still lives fully in natural law. There is no quite unnatural sound or thought. Thoughts are the sounds of the mind, feelings the fragrances.

A poem I wrote in college, toward the last line, mixes the smell of someone taking a crap in a cafe bathroom - only hinted at, but it's there - with the taste of the coffee. That was fully intended. (I mean, it happened. It was part of the memory.) It probably doesn't strike many people as a pleasant or poetic effect, but I wanted to see if I could make it work, make it contribute. Usually I'm a bit embarrassed when I see those last lines, but I worked and reworked the poem enough that I no longer feel like changing it. If that bit sucks - or stinks, more like - or sucks while stinking - then at least I'm proud of a lot of the rest. I don't know what the detail means. But it happened, and it was weird. It means whatever the reality means.

Writing is impression management, but that doesn't mean you have to write like a corporate manager. You may not be aware that you're managing anything. Maybe you aren't. Maybe you aren't even writing when you're writing.

The question is ultimately not, as it is for many when they write, "Am I a good, likable person?" Nor is it "Will someone hire me at a good salary?" Nor "What cause shall I promote today?"

It's some other question, which is why you might leave unpleasant interpretations in, out of respect for an ear that says "But this is more interesting with than without. I am not going to knock down a nest of sparrow eggs to shake out a spider."