Sunday, December 27, 2020

Breakage

I don't believe that you have to know rules to break them well. That's a myth. But I do believe the world bristles and bubbles with rules of thumb that you should listen to, and break freely when you want to and it hurts no one.

Knowing the rule clearly first is not a prerequisite. That's a scare tactic that teachers use to get you to listen to them (and you should).

Much has been done by people who didn't really know what they were doing, and who, as they have said afterwards, were just stupid enough not to realize that what they were doing was impossible.

You do not have to know a rule before breaking it excellently.

It can help. As long as you remember not to be a rules Nazi, you will benefit from knowing more of these rules of thumb, the ones offered because you're supposed to know them.

There is, of course, the question of complacency. Yet breaking complacency isn't about refusing to learn. Don't refuse to learn. Learn rules, break rules, make rules, shake 'em up.

Wednesday, December 23, 2020

The Good Skeptic

Sometimes I think skepticism needs an owner's manual. Skepticism is NOT closed-mindedness. It is NOT a way to conveniently shoot down what doesn't please you or doesn't sound believable to your intuition. It is both LESS and MORE than common sense. It is FAR from militant atheism. It understands and exceeds reductionism.

Skepticism is three things used well: uncertainty, imagination, and curiosity.

You can also use those things poorly and get very confused. Use them well, though, and that's skepticism. There isn't much out there more useful than good skepticism.

Far from giving you license to stand on the sidelines and forbid anyone to speak who doesn't meet your criteria or agree with your set of approved facts and mechanisms, skepticism is a tool for expanding and improving on knowledge wherever you find it.

The skeptic is not the enemy of new or alternative ideas, but of the notion that their own understanding has reached its final best form.

Be skeptical that you've got it all down pat, and do something about that. Now you are a skillful skeptic.

Wednesday, December 16, 2020

Semi or Pseudo

Semicolons are weird. It isn't difficult to use them; it's difficult to use them stylishly. Apart from the self-referential pun, that's one example I don't find too stylish. A comma or a dash would have worked fine, or a new sentence.

You could probably get away with something non-standard like... It isn't difficult to use them; but to use them stylishly. Technically that isn't correct, at least not by the usual simple guidelines. The reason I'm saying it could work is that a reader understands that this version is just the original cropped down. The second part is a clause (a would-be sentence) with a few words removed, and you can feel this as you read. In effect, it is still a full clause. And the reason I think this version might be more stylish than the first is that it leans on the semicolon to do something - cleanly separate two clauses, showing contrast - that neither a comma in its place nor a semicolon in the original context would do. It's "stylish" because it's more compact than the alternatives without losing any coherence. Still, I'm not convinced. I'd feel pretentious writing that.

I'm not sure why semicolons have a reputation for being difficult. Is this why? Style worries? To be fair, rather like colons, they are difficult to get right. You might need to understand fragments, clauses, and conjunctions modestly well to judge whether a semicolon or colon is technically correct, or to clarify the difference for someone.

But really. Take any two sentences. Replace the period with a semicolon. Not hard. (Reverse the procedure to check if one's correct.) The question is, why would you do that? When is it a good idea to set two sentences into extra relief against each other? Let's not forget that any two sentences next to each other are already a juxtaposition, and relatedness is already suggested by the sequence.

A semicolon (when not used in what I call its "supercomma" role, which I'm ignoring in this post and almost never comes up) basically asks a reader to look again—look, look a second time—at how two thoughts are connected.

That's it. It's very simple.

Now the question is, when do you think that's a move that's stylish and helpful, rather than a move that shows off that you know how to replace a period with a semicolon? (That's dead easy. You write a tail onto the period, and you put a dot above it. Then you improvise to make the capital letter lowercase.)

A Trunk is a Chest is a Nest

Sometimes I'm really not smart. (Often.) Early this year I lost my oldest sweater. It was kind of falling apart and made me look homeless. But it was the best sweater I ever had. Deep blue, a zip-down hoodie. It was thicker than usual, slightly padded and snug, and had a green stripe down either side of the zipper, on the inside, which looked interesting. Tragic loss.

Except I didn't lose it. It was in the trunk of my car all along. With some rotten food. And a dead mouse. The poor creature had gotten in somehow, but never got out.

I found it a few months ago, the sweater, height of summer, but only just tried wearing it. Yeah—nope! I reacted to mold. I'll have to wash it more, which might shred it. The mortality of sweaters.

It's moments like these that give me inspirations for stories/songs/games I never write: Sweaters for the Dead. Sweaters from the Dead.

Wednesday, December 9, 2020

Heimweh

"Nostalgia" annoys me sometimes. Bad or mediocre times can seem great in retrospect. But aspects of culture that remain interesting aren't interesting because of nostalgia. Many tastes I've acquired might seem nostalgic, until you realize I'm also this way about times well before I was born, and it isn't about thinking life was better in those days.

There can be a component of deeply evocative familiarity. For the usual example, many people feel their mother's or grandmother's version of a dish is the best ever, just in terms of how they experience it. I've had many good spaghetti bologneses, but the way my dad makes it with two unusual twists (the presence of a "secret ingredient" is such a standard part of the story though!) just seems far better. To me, that's how bolognese should be, and it feels a lot like nobody else gets it right. And traditional Czech cooking the way I know it from my mom, aunt, and grandmother is just the same: no one else gets it right, and there's no other food like it.

There are four things going on, actually, not just "nostalgia" (one of the below is most accurately called that):

1) If people loved a thing back then, it appealed to human minds and hearts and probably still has the same emotional potentials, even though it might take some digging under the surface or into that culture.

2) We sometimes do feel that things were better in some golden past, whether or not we were actually having a good time then.

3) Familiarity from deep pathways, the deer paths we carve out in our own lives, makes some experiences especially resonant later, not least because meaning is created by time and experience and reflection and sharing.

4) Art is supposed to be evocative to begin with, its call-sign being unique feelings and patterns (no one will ever write quite like Jane Austen again, etc), and when it loses something for new audiences—either by comparison, or because old references and values and languages are lost—it retains intellectual interest as an artifact of that time, place, and people, and often for obvious and extensive influences on newer work.

I hate it when people pronounce "nostalgia!" and dismiss all that in one breath because they believe cynicism makes them worldly.

Nostalgia means 2, often mixed with and easily confused with generous heapings of 3.

I have no interest in going back and living in the early 1920s, but I've never seen another movie that feels just like a good screening of the remastered Nosferatu. That isn't nostalgia at all. I wasn't alive then, nor did I watch silent movies as a kid. Its appeal has nothing of 2 and only as much of 3 as any other art that resonates with me through ambient ideas and patterns.

Saturday, December 5, 2020

Poise

The most difficult thing about coding is not 1) arcane stuff 2) logic or 3) bugs. It's finding entry. You could stare at code all year. Until you go, "Ok, I'm gonna change something and see what happens," you're K I N D A N O W H E R E.

Coding is more like music than writing. Until you play some notes, it isn't really music. Constructing the logic is a duet or dialogue. Dialogue is never just staring, nor is it only staring and writing. Dialogue is two or more sides actually speaking, and to each other.

Whenever I've been stuck on a big project for weeks and seem to have written nothing, you know what I do? I hit the run button. I watch it run and prod it this way and that. Then I change something, and run it again.

You feel way too stupid to understand the code (even when you wrote all of it) until you engage with it like this, in my experience. Once you do, quickly you find you're back in the mental space that wrote the code. You didn't get any smarter in a minute. You engaged!

The other central lesson, which you'll probably have to learn over and over and over again, even after you start telling people it's the central lesson, is that under no circumstances does getting upset help you solve a code problem. The solution is always the same: get curious, test your assumptions, put in sanity checks, the most basic ones first. Ask yourself, "What's the dumbest thing it could be?" and almost like magic, it's often exactly that, because that's what you were overlooking.
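To make that concrete, here's a minimal sketch of what I mean by the most basic sanity checks (Python, with made-up names, so don't read it as anyone's real code):

```python
# A minimal sketch of "what's the dumbest thing it could be?" debugging.
# All names here (average_score, scores) are hypothetical.

def average_score(scores):
    # Check the dumbest assumptions first, before anything clever.
    assert scores is not None, "scores is None - did the data load at all?"
    assert len(scores) > 0, "scores is empty - wrong path? wrong filter?"
    assert all(isinstance(s, (int, float)) for s in scores), \
        "non-numeric score - are these still strings from the CSV?"
    return sum(scores) / len(scores)

scores = [88, 92.5, 79]          # stand-in for whatever you actually loaded
print(average_score(scores))     # run it, prod it, change something, run again
```

Nothing sophisticated. The asserts are there to be deleted once they've told you which dumb thing it was.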

Our first instinct when there's a problem is to get hyperactivated. Then we think we need to get sophisticated. With code, the approach that works is calmness and simplicity.

That does not, of course, mean that code doesn't get super hyper (it runs fast) and elaborate (you could spend many lifetimes learning complex code patterns). But as they say in the Marines about dismantling, cleaning, and reassembling firearms: "Slow is smooth, smooth is fast."

Saturday, November 14, 2020

Uses

If you feel like something is wrong, believe me, you can find something that is actually wrong. It isn't even difficult!

Now, if it's getting too much and you can't help it, you can channel this. Never mind what's wrong with where you're looking. There's always something wrong. What's interesting is if you can find something wrong that needs more attention. Something underrecognized, maybe an unsolved problem, or one solved elsewhere but not here. Or maybe you can notice some pattern in a problem here that helps over there.

If you are feeling very much inclined to look in the shadow of every stone, that certainly has its uses.

Thursday, November 12, 2020

Breadcrumbs

One of the most satisfying things about learning to make stuff: you watch feelings of despair and inadequacy dissolve into a better realization. Every reason you aren't good enough, every piece of evidence of your weakness or incompetence, loses its threat when you notice that you are looking at what to do next. Far from saying you cannot, the flaws in your work are leads and ideas. Not happy with this bit here, but you know it's doing something, and you feel it just came out that way and ought to stay? If you have the freedom to experiment, then do so. Between feeling no good and trying stuff out, where's the contest?

It's more disheartening at first when you have less skill and less faith that you can turn critique into improvement. But mere effort and experience are your mentors. One day you will know, with certainty, that you have turned a critique into an improvement. From that day forward, if you wish, if you remember, you can always tell yourself that you know what to do with flaws: work on them, trust in them. They are your leads.

?

If you're like many people who want to be creative, you've spent a lot of time worrying whether you are, or whether you are enough.

Keep at it. In the end you will see, I promise, that the question is not whether you are. Nor is there a quantity to weigh.

Listen, there is joy in exploring and arriving.

Listen, there is work, there is pain in it, also.

If the challenge of "Am I creative?" or "Am I a real artist?" - or whatever your big concern is - motivates you well, then carry on.

But the worry could be joy instead, you know. It could be work instead, you know. It could be pain instead, you know. And in all those you would feel better than you would about the doubt.

Why? Because that's creating.

Saturday, November 7, 2020

Saying Never

People say - often very successful people, especially - never give up. And I've always thought that was particularly delusional advice, or at least incomplete. Everyone gives up on things, and everyone has to, and no one benefits from refusing to give up on even a single thing. On the contrary, the only way you can pursue a big goal well is by giving up much else.

So I adapted that to "Never give up - if it's really important." And how do you know? There isn't a clear criterion. You have to figure that out for yourself. And if someone else is involved, you have to respect their freedom.

Also, you can understand "never give up" less as general life advice than as how to succeed at a particular dream. While it's possible to give up or put aside dreams temporarily, they may not survive that. For me, I've found they can. I give up on things I then continue. It's almost as if these ideas follow me around. But we have to accept that putting something aside may be the end. So, you definitely don't want it to get away from you? Don't put it aside. Don't let it get lost where you might forget it. People lose sight of their dreams for the last time every day, around the world.

You can waste a lot of time losing steam questioning this, that, and the other thing. If and when you get discouraged - yet from experience you already know that, at the end of the day, or the week, or the month, you will be back at this, not having given up, whatever you say to yourself now - you might as well save the energy and the emotional pain, a little, and refuse to waste time on questions about giving up. Ultimately, the wrong kind of "giving up" is a waste of time, and you'll start to notice if you watch out for that.

It's one thing to be unrealistic. It's another to get demotivated and start believing this isn't possible, yet come back around, having lost time and opportunities. You can be realistic and dogged as well. You have to be selective. You can't be that way about everything, no matter what they tell you. But you can pick one or two things to be this way about. Maybe a few more. Not many.

Monday, November 2, 2020

Parallels

It's easy to forget that every experience you've had, someone's having that experience now. We put life in a chain, but it's a fabric. What you started off seeing and ended up seeing—another person has gone through the reverse, and feels just as sure of the new view.

I don't believe that now is all that exists. I don't see the past and future as illusions any more than the present is an illusion.

But what I'm getting at is that your past is someone's present. Your future is many's past.

I get the idea. The past only matters because the present was once it. But still...

Your memories are hooks on aspects of the present you do not see, but know are real.

To say now is the only time is to forget all the links.

Thursday, October 29, 2020

What You Aren't, You Own

If you are your actions, then that means your actions are driven by your identity. If you are still you, your actions won't change.

If you are your thoughts and feelings, then that means your thoughts and feelings could be no other way as long as you are you.

We are not actions, thoughts, or feelings. If we recognize that, then these are all relatively mutable, and our ways can improve over time if we wish.

It's by identifying less, not more, with something about us that we change it, or prove it. That doesn't mean disowning: that means recognizing we never really owned it in the first place, but were clinging to it.

Sunday, October 25, 2020

Cozy in the Metro

There's a lot of weight carried by that thing everyone knows, the thing everyone knows so surely that they don't even know they know it. The sheer tonnage is immense.

Why is a thing true?

Is it true because everyone knows it and there's an explanation that makes sense to you?

That isn't good enough, though.

When everyone knows something and it's wrong, there's always an explanation that makes sense to most people.

It just ain't right.

What I've noticed is that people believe on familiarity. You can show how someone is mistaken, and give them a better explanation, and they can even see it.

But guess what? If it doesn't feel comfortable, if they aren't quite sure about it, if they're just more used to the old explanation, they'll go right on back to it.

You don't believe so much because a thing makes sense. You feel it makes sense because it's familiar. And when you feel it makes sense, you believe it. But that just means most belief is a translation of familiarity.

Plenty of things that make sense aren't true.

Saturday, October 24, 2020

I get stuck.

I don't believe in writer's block, and I don't have it.

For a long time I seemed to have coder's block, until I realized that I believed in it. Once I realized I believed in it, I stopped believing in it, and I stopped getting it.

But it's useful to remind myself that I get stuck. It's a thing that happens—I feel paralyzed. Noticing that I feel paralyzed helps, because I know that I am not paralyzed. This cycle has whirled around its track so many times: I feel paralyzed, believe I'm paralyzed, hopeless, can't get going. But ultimately, I am not paralyzed. I am not hopeless. I can get going.

I get stuck.

That doesn't mean I am stuck. I and stuck are not identical. We are not identical or fraternal twins. Stuck and I are simply beings that understand each other.

When you truly understand a problem, it melts.

Saturday, October 17, 2020

Art is finding a way to say what others will not hear or understand fully by the normal routes.

There is a large field of such things which need to be said, but cannot, quite.

Go and find one. Say it.

Giveaway

When you're very shy, friendlies spend a lot of time convincing you (and you'll try to convince yourself also) that people aren't noticing or judging you the way you fear. This is half true. People aren't noticing you much. But they are judging you the way you fear.

The trick is not to deny that you're misunderstood and misjudged, nor is it to insist on the primacy of the "mis-," because it's quite possible you are judged more accurately than you will admit to yourself. This is, to be sure, pretty unlikely. But the trick certainly isn't gaslighting criticisms away, neither the silent kind you seem to create from nothing, nor the blunt or aggressive or careless kind.

The trick, if there is one, is that you don't know, but you may want to know, and what you want to know is truth, not someone else's opinion.

Monday, July 27, 2020

Page impatience

I used to be self-conscious about reading speed, as if the way to get through books and be well-read and broadly educated is to be a good speed-reader.

Audiobooks taught me otherwise. Even though the narrator tends to read out loud consistently more slowly than an average person reading silently, I found I'd get through audiobooks much more quickly than paper books. Why?

I knew it had to do with focus. I tend to stop and think while I read. Maybe I want to hustle, and think I should, and even feel guilty, but something sparks a train of thought, and I follow that around, and reflect and connect things before remembering that I'm reading. This doesn't happen sometimes. It happens constantly. It's a lot of why I have trouble getting through books. My first audiobook showed me that it was not reading speed or even emotional ease with lifting words from lettered pages that would get me through more books. It was simply momentum. An audiobook provides momentum. It's like going to a museum with a friend who has a small child. If you stop at particular paintings or sculptures for long, you will get left behind. Then they will either forget all about you forever, or they will come back looking for you, complaining that they just spent five minutes trying to find you.

In a museum, there is little or nothing difficult about walking and seeing works of art go by as you walk. What's difficult is knowing that you have so much more to see and understand, and you can only get a fraction of it, and it's on you to grasp for that fraction in a short time.

Most writers write somewhere between 1 and 5 pages a day. If you ever feel that you're reading so slowly that you must be stupid, remember that the page you're kicking yourself for not finishing in 1 minute, or 2 minutes, or even 10 minutes may have been that author's entire day of work when it was written.

(It took me about 15 minutes to write and edit the above. So please don't spend any longer than that on it!)

Friday, July 24, 2020

Fixed

When someone has a conflict of interest, say a scientist hired by sugar manufacturers to look into the safety of sugar, that doesn't mean that person is wrong on that issue. It's something to note, and look at very carefully, but I refuse to treat people as robots. Some scientists will reach and publish accurate conclusions, at least some of the time, if not all of it, regardless of who's paying them and what they're asked to show.

This is my objection to the "follow the money" labyrinths I see in some analyses. The concept of a shill describes a phenomenon, but too many people seem to equate being paid with being a shill to the one paying. They are clearly not the same, and I know that because people are not robots.

The best way to know is to look at the science itself, and find experts and try to understand their explanations. This is not to shop for your favored conclusion. It's to learn from the best sources. Following the money must be interesting, and I have no doubt it uncovers plots and scams and shills. But it doesn't tell you much about the validity of a statement.

Tuesday, July 7, 2020

Arc of the Pendulum

When I was a senior in high school, I had a little problem. Everyone said I was good at writing, but aside from a few poems here and there (something I'd pushed myself to do), I never wrote on my own. The first time I'd thought I really wanted anyone's job, the person was C.S. Lewis, and I wanted to write fantasy novels that transported people to worlds that felt alive and sentient. Like his readers, mine would stop seeing words, forget they were reading, and become real travelers. What a peerless skill! I figured I was much too dumb for that, but maybe I could learn to write pretty well. Maybe I could write good short stories—not novels, just short stories. For my next reading assignment, a chapter log as I reread a book in The Chronicles of Narnia (I read those five times in elementary school and haven't since), I started at the beginning, writing out the first page word for word.

I thought the teacher would understand that I was trying to learn from a master. She didn't. At all. She walked by, saw what I was doing, got tense and then angry, and told me I must never do that, it wasn't allowed, that's plagiarism, etc. And while I felt bad, mostly I felt she simply hadn't understood. Yes, I also wasn't sure what to say about this chapter in a reading log, but I thought copying it out by hand was a neat idea, a good start, a way to figure out what to say in the future. By no means would I claim the words were mine. They obviously weren't and couldn't be. She misunderstood my intent, and I was too shy to make much of it. I was in trouble and kept quiet, stifling my upset. Anyway, I didn't try that again. But it turns out many writers have written out whole novels they admired that way. Some say it works. It wasn't a bad impulse.

That was 3rd grade. In 2nd grade I hadn't the faintest idea what I wanted to be, so when I was supposed to say something that would go on a class poster, I randomly picked "firefighter," feeling vaguely guilty that I didn't know what I was talking about.

Fast-forward and I'm a senior in high school, in my first creative writing class, and people seem to like my poems, stories, descriptions, reflections. The teacher usually seems impressed.

She probably ruined me for life by validating my philosophical reflections (even more than my philosophy teacher had). We'd go to a park once a week, sit in our secret, chosen spots out of sight from every living human soul (perhaps not the dead ones), and just write what came to mind. Before that, I didn't see what came to mind as particularly worth noting. She spoiled me.

This is about the time I came up with a strategy. My whole life, since 3rd grade that is, I'd been meaning to write more. When I got into a short story, I felt like I could go on and on and I didn't want it to end. Soon enough, I'd have to brutally kill it (brutal in the sense of abrupt, not violent) with some sort of ending and turn in the corpse. But a phase transition would happen, and then writing one felt as good as reading one. And when I got to that emotional space, I knew I could always get back there. But somehow, I hadn't. Only the occasional creative writing assignment would lure and scare me back in. Why couldn't I write on my own? I very much intended to. But I didn't.

My whole existence was like that. Maybe I was destined to be a failure at everything. Everyone told me I didn't try, never made an effort at anything, but I felt like I was constantly trying to make more efforts and failing at that, and when I did make an effort, it would fail, and it would hurt, and no one would even notice I'd made an effort. So I'd try, fail, hurt, and immediately get accused of not trying, while I was still hurting. That was reliably upsetting.

So maybe in a way I was shielding myself from the same experience with writing stories, a thing I actually meant to get good at. Worryingly, I was never going to get good at something I didn't do. And I was miserable at deadlines, so I wasn't going to hook myself into the normal social patterns and rely on those for structure. I knew it wouldn't quite work. Besides, all the writers and game designers I admired said pretty much the same thing, whenever asked in an interview: they wrote for themselves. They knew it was worthwhile because they liked it. They told the story they wanted but didn't see anywhere. That was their metric. They loved their fans, but when they created, they had an audience of one: themselves.

The way I always wrote, I'd go slowly and fix errors as I went. By the time I got to the end, I didn't really have any revisions. Well, sometimes I'd print the thing out and mark up the page. That helped. But my writing needed very little in the way of editing. If people said it was good, it seemed to come out good as I wrote. But I wrote too little, too slowly. Maybe that's why I was averse to doing this on my own. If I couldn't persuade myself to strike out and get independent, it didn't matter how much talent I might have. I'd never amount to anything.

Here was my strategy, then: I lowered my standard and with that my inhibitions. Like my older sister, probably the most imaginative person I knew, whose reams of poems would flutter around the room if the window was open and get stepped on and torn, etc, I decided I would write a lot. Artists did that, didn't they? They just kept creating and let others pick things up off the floor and say, "Hey, this is pretty good! You should show your art in the lobby!" or "You should get this published!" That was a secondary concern best left to others. I'd read that Aphex Twin made music as a way of life, every day, because that's what he did. He always had. Once in a while a label would ask him for music, and he'd rummage around through his stacks of tapes, his informal diary of composing, and find some pieces he liked, and ask if that would do. Generally it would, and he'd have another terrific album out, and meanwhile he'd just be doing what he did, making music out of habit. That could have been a slightly tall tale, but it was memorable. Anyway, with these examples, I wasn't going to be so concerned about what I wrote. If it blew away in the wind, it wouldn't matter, because I had been writing so much and could write so much more, and would. This way, I'd get raw experience.

It was a direct counterattack on what had been holding me back.

It worked. At that time, I deeply disbelieved I could become a habitual writer. It seemed nice, but I despaired of any possibility of actually transforming myself. Today, though, I despair of the swing of the pendulum. Now I write too much, sometimes horribly too much, sometimes to the detriment of everything else in my life, so that everyone's angry with me, and I'm almost crashing my car. In fact, I've gotten in three accidents because I was typing thoughts on my phone. (Finally, I think, maybe, hopefully, I have broken that habit. Thank God. But you see what I mean.) For all this time and energy—and, very stupidly and condemnably, danger—I should have more to show for my efforts.

It seems I succeeded and went too far the other way. Most of what I write is garbage, and more dreadfully than that, repetitive. Of course, there's a certain amount of woodshedding you've got to do to refine an idea. Jimi Hendrix - please forgive me for referring to brilliant people as if I am one, but I think it's good to have heroes who are brilliant, whoever you are - would compose most of his songs either on stage, shaping them from performance to performance through repetition, or in the studio, where he all but lived, doing much the same as the magnetic strips whirled by. The vast majority of his studio material has still been heard by virtually no one, because it's this kind of experimental, formative noodling. More recently, Radiohead's composition process bridges years, even decades, and their initial song demos are so chaotic and unmusical that apparently it's difficult to reconcile the starting point with the consistently world-class endpoint. Allen Ginsberg, to revisit great writers, is said to have written 99% unpublishable trash, but the 1% that he allowed to be published was generally stunning. That's how this stuff works, though. It isn't just them. They were just particularly brilliant, industrious, and successful.

So I try to cut myself a break for slipping into repetition. But I also hate being a broken record, and disgust with it is a healthy feeling. If you aren't repeating a thing because you intend to make it great, knowing right now it isn't, then maybe you should quit repeating it entirely.

For the last year or two, I've been trying to reel in the excess. It had become a breeze to write five thousand, ten thousand, even thirteen thousand words in a day without any goal to write many words or even to write at all. It would just happen. But I was getting more and more disgusted with myself and the words I relied on ("way," "people," "result," "logic," "imagination," etc) and my sentences and my paragraphs and most revoltingly of all my boring topics.

Maybe six months ago, I decided I was going to write a story, and instead of thinking up a new idea, or catching one from the air and letting that prompt me to write, I went to one of my digital piles of story ideas and picked one. It was the first time I'd ever done that. I actually went back to a tiny story idea and decided to try it out. Normally I'd either write as much as I could when I had the idea, or jot it down, intending to come back but then never coming back.

It felt liberating. Separating the idea-egg moment from the idea-hatching moment helped. Because I'd written this one down so long ago, I didn't care what happened to it. I could mess it up as much as I liked, or as much as my incompetence would demand. Within a few sentences, I was in that mode I talked about. It felt as if I could write this story forever.

I didn't. I wrote that day for a bit and haven't come back to it, yet. But after I wrote, I checked how much I'd written, and how fast. The calculation showed that I'd been writing at 1/3 my normal rate. Yet I was so much happier with the result, and I'd enjoyed the process so much more. It wasn't habitual word-stringing and idea-slinging. It felt... it felt like dreaming.

Tuesday, May 26, 2020

Picture books

Graphic novels are picture books for adults. Many are, anyway (if they're for adults). There's a hint of trouble already, but it isn't that they're picture books. The trouble is assuming this must be contemptible or atavistic in an adult because it's associated with childhood.

I know many people don't think that way, but enough do.

A movie is based on a storyboard, which is a specialized graphic novel. When you read the storyboard rather than watch a screen, your imagination is asked to produce lots of sound and emotion and logic and action to connect and fill in the frames.

Some see movies with contempt, because they aren't novels. But novels were once seen with contempt. Even writing itself was once seen with contempt.

All of these forms can be challenging.

Some people will completely follow and understand movies like The Thin Man, The Departed, Gosford Park, and Mission Impossible the first time through. But most won't, whatever they tell you. I didn't. It's ok: these were not meant to be easy. And films like Mulholland Drive and The Discreet Charm of the Bourgeoisie were not meant to be definite. All are challenging, each in its own way.

Your mind doesn't have to fill in as much with a live play. Are plays easy or difficult, then? Don't believe a friend who watches a Shakespeare play for the first time and claims to have understood everything. They're probably just happy they could follow the outline of the plot and guess at the meanings of old words. Or they're bluffing entirely. The Bard's plays were supposed to be comprehensible to Elizabethan and Jacobean audiences, but all the same they were quite challenging then. They almost immediately fell out of fashion for a century or two because they were considered too dark and too difficult. Yet some plays are simple. "Stage play" does not specify a level of challenge or a genre of story.

The first comics ("graphic novels" is mostly a pretentious relabeling) I got into were the Sandman collections. Since I'd never read this way outside of a few strips in the paper, it was actually quite difficult at first to know which order to trace the panels in and how, exactly. Um, do you look at the picture first, or any text in it? Do you read the text mainly and let the pictures work around the edges, unconsciously? Do you go through it the way an art museum goes through you, gazing at the drawings for minutes on end? What do you do with the words in caps? Is this person yelling? Are they being sarcastic? Are you meant to intentionally alter the sound in your fabricated scene? Or let the caps work unconsciously? Is it important to hear it in your head? Should every character have a different voice?

I could go on for a while. The point is there aren't strictly right answers to any of these, including the order of the panels. The flow might seem obvious, but it is certainly taken seriously by the artist. And sometimes that's even used for effects like, well, direct ambiguity about the path. That does not exist in film or novels. If it does, I haven't seen it, anyway.

Ok, I've seen similar. In a classic like The Rules of the Game or one of the Hobbit movies presented stereoscopically at high frame rate (HFR 3D), there absolutely is ambiguity about where to put your eyes. In regular film you'll notice this also: you can miss things in the foreground because you were checking out the background, and vice versa. But typical filmgoers (I think) watch mostly unaware of how their eyes are focusing. Intellectually we may know that our eyes saccade over a scene, tactically mining choice spots to smelt an impression quickly and accurately. But usually we are completely blind to those mechanisms. Seeing a film in HFR 3D, or one like The Rules of the Game which uses deep focus shots throughout—shots that are focused everywhere, leaving you all the choice as to where to focus—can feel overwhelming. Where do you look? You suddenly realize that you've been fed from someone's palm all along: you thought you were choosing where to look, and you were, sort of. But like a gambler at a casino, you face odds that are highly rigged. In some ways the outcome of your choosing is all but inevitable.

Isn't it wonderful for artwork to show you this without a word? Without it even being the main point? Reinforcing but not defining the theme? To me, that's incredible.

It may be tempting to see comic books as stupid, but they can easily be more challenging to read than a film is challenging to watch. And if you aren't used to them, you might find yourself reading a graphic novel more awkwardly and gradually than a novel, in terms of time per page. Relative to the amount of text, that's an even bigger slowdown.

So Sandman is a series of picture books for adults. But it will challenge many readers who wouldn't be challenged by a series of picture books for children.

Lately I've been rereading it. The first time, I thought the writing wasn't that good. It was... well, all right, it worked in this format. The stories? Fantastic! But it felt as if the writer couldn't write well in a more traditional setting.

On rereading, I no longer see that. My concept of writing has changed over the years. This dialogue—it's almost all dialogue—is not meant to be lyrical. This is not poetry. People don't speak in poetry. And for sure it isn't realistic. We're talking about the eldritch king of dream-land influencing "the real world," contending with other spirits like him. It doesn't take a literature professor to notice the parallel between that dream king and any writer. Neatly, though, Sandman doesn't fixate on metafiction in a distracting way. A spell must be allowed room to work.

Monday, May 18, 2020

What humor are U in

We owe Carl Jung a giant thank you for discovering introversion versus extraversion and beginning the study of personality as a science. Advances make society better over time.

You can certainly poke holes in his vision of personality. For one thing, it's based on a theory of 4 bodily humors that traces back at least to Plato. You know the one—if you're angry that's yellow bile acting up (hence "bilious"); if you're depressed that's your black bile (hence "melancholy"); if you're anxiously careful that's phlegm (hence "phlegmatic"); if you're impassioned and impulsive that's blood (hence "sanguine"). Spotting all the threads tying the Myers-Briggs Type Indicator derived a couple of decades later by a mother-daughter team (who still get lambasted in an overly sexist way) to the 4 humors is fun, but I'll leave that to you if you're curious.

What I'll say is that although the bodily fluids idea turned out to be rubbish, the personalities pinned to different bodily fluids still make a good dab of sense. This wasn't half-bad for a first try 2400 years ago. No doubt that's what attracted Jung. Fortunately, he threw out the fluids hypothesis. Aside from "we notice these patterns" and "we hypothesize they arise from these bodily fluids"—before real science existed, I don't think anyone even put it forward as properly as that—there was nothing scientific about Plato's model of personality. It was just some good, informal observation tied to a baseless, untested, and unadmitted hypothesis as to the mechanism. (Actually, I don't think Plato brought in the fluids at all. If I remember correctly, that was Empedocles. So you could say Jung went all the way back to Plato's sketch of personality types, skipping Empedocles and all the medieval and wrong ideas about nutrition and medicine that followed.)

Carl Jung made it a little better. He was a trained experimental and observational scientist. Admittedly I haven't read a single one of his numerous books detailing the baroque framework he devised (personality is indeed complicated), but as a researcher and clinician who took copious notes, he based his ideas on some kind of empirical trail. This is so often forgotten, because it seems also very clear that he had a mild form of psychosis, which contributed to his themes of spiritualism and superstition and his love of mythology.

There's nothing necessarily wrong with that, though. It's how he was. We can still credit him with discoveries he made.

Actually, there's something good about it. He influenced the study of folklore and religion and storytelling tremendously. We can thank him, in part, for popularizing some of the most effective techniques in today's books, tv shows, story-based games, and movies.

My reason for saying this is just that I was standing by the microwave waiting for some chocolate milk to heat up, musing about humans as social creatures. Introversion/extraversion clears up a bit of confusion and helps us get along. There's really a ton more to it—that duality is only square one. But it is a first step to really understanding human socialization: conflict, empathy, cooperation, etc.

In fact, Jung came up with the idea while trying to explain to himself why he had such a big falling out with his mentor, Sigmund Freud. The eventual solution, in his mind, was to see this big divide between their two personality styles: Freud was something Jung decided to call "extraverted" (there are stories about how Freud made endless numbers of friends by memorizing everyone's name and details about their lives, so he could ask good questions if he ever met them in the future) while Jung was "introverted" (at one point in grade school, he was so shy he couldn't even go to class and had to be taught at home for a while). Today we wouldn't see that as a reason two people can't get along. But imagine a world before "introvert" and "extravert" existed as words. How would you understand what Jung struggled to understand about this big clash with someone he loved?

Maybe you wouldn't, and maybe people get along better today because of that fight, and the ideas that grew from it.

Sunday, May 10, 2020

Front for a frown

I like things to be neat and tidy in print, but there's a wall. There's a limit. When I'm reading anything but a logo or heading, I want the font invisible. The best font most of the time is the one I'll never notice. That's how I choose fonts 99% of the time: what do I not even see? Good. That's what I want.

I will go through things I've written and replace dashes - like these two - with proper long em dashes, which look a lot better to me. Back in the day, I'd always double the hyphens--like this--and then at a specific moment in my life I switched to space, hyphen, space. But ultimately it all looks wrong to me unless it's an em dash.

I've put down ebooks before (permanently) after noticing a typo and a poorly placed comma or two. It feels like I can't trust the text anymore. (My reading of I, Robot was sunk by sloppy text. One day...)

While I am evidently picky about punctuation, I still have trouble understanding people who get upset about seeing straight quotation marks and apostrophes instead of the curled, slanted ones. That seems like a very high-maintenance preoccupation for very little return. There is no easy way to type oriented quotes and apostrophes. And if you aren't looking, and you are most people, you aren't going to see the difference. I'd really prefer not to see the difference; I wouldn't consider it an improvement if the difference started jumping out at me.

It's already bad enough that when I copy and paste, momentarily paying closer attention to atomic symbols, I often replace any curly quotes with the straight, lower-maintenance, "less correct" ones. Consistency does matter.
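For what it's worth, this kind of cleanup is trivial to script. Here's a throwaway sketch in Python that applies my stated preferences (straight quotes, real em dashes); it's illustrative, not a robust typography tool:

```python
# Throwaway sketch: normalize quotes and dashes my way.
# Straighten curly quotes; promote " - " and "--" to an em dash.

def normalize(text: str) -> str:
    for curly, straight in [("\u201c", '"'), ("\u201d", '"'),
                            ("\u2018", "'"), ("\u2019", "'")]:
        text = text.replace(curly, straight)
    # both of my old dash habits become a proper em dash
    return text.replace(" - ", "\u2014").replace("--", "\u2014")

print(normalize("I used dashes - like these two--for years."))
# -> I used dashes(em dash)like these two(em dash)for years.
```

Consistent, at least.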

See, I'm as bad as people who bitch about the uncurled ones being wrong :p

But I do recognize it's a useless preoccupation (I'm being mean) unless you're typesetting for publication. For my aims, I entirely refuse to go hunting the snark of quotation marks everywhere to curl them so they'll look marginally more proper—unless someone is set on doing that for me—in which case I won't stop them, as long as it's consistent. Entirely refuse, I say!

The curled ones look a little nicer, I admit. But I also don't want to notice. Magic happens when you stop noticing font and every other surface.

As long as both quotation marks are included, nobody spends time wondering whether a particular mark is the start or end of a quotation. I see the potential use, but in practice it's almost never needed. Even where it could help slightly, the amount it helps is negligible. (For some readers, say with dyslexia, this may be different, though. I'm not sure, but I apologize for being inconsiderate.)

The joke's on me, of course, for spending the time and energy to actually put these thoughts in print.

The fractured monarch

The King of Limbs is so experimental it's a fractured album. I even believe shattering is a hidden meaning in the name—"limbs"—but I can't confirm this. A couple years before it came out, Radiohead said they'd never release another album, just singles.

I remember their website saying it. What they eventually did was break parts off the album and release them as connected singles. The art on these non-overlapping fragments plainly shows that they're all part of The King of Limbs. Indeed, they're from the same sessions. The band's recording technique was also intentionally out of order. Staccato. TKOL is one of their least appreciated albums because the weirdness lost even Radiohead fans, haha.

The author of a Stereogum article diving into this ("In Defense Of The King of Limbs") says he's made an uber-playlist of all the associated songs: a reintegrated album that works fantastically. But he doesn't give the ordering. Then someone in the comments, Ass-Kicked89, offers one up. So a while back I made that list into a Spotify playlist. It's pretty good—yeah, yeah to say the least! My guess is that Radiohead wanted to encourage us to make our own playlists... a jigsaw puzzle album.

It would be seriously remiss to talk about The King of Limbs without sharing that many consider the filmed live performance (The King of Limbs: Live from the Basement) better than the original, "better" as in "so much better that these fans see it as the album itself." Definitely check that out, but I say give the complete playlist a try as well! Personally, I love it.

Sunday, May 3, 2020

I used to think I'm a good writer. It's taken a lot of practice to realize I'm not.

Not being a good writer is as constructive as not being a mathematician. When you aren't a good writer, nothing is ever good just because you wrote it. When you aren't a mathematician, nothing makes sense just because someone says it's logic.

The only thing that makes a difference when you aren't any good is your willingness to keep at it.

Generators

Making a music generation engine that could learn different styles from music scores and recordings was my senior tech project in high school. The thought was to get started on something like LucasArts' iMUSE engine or Sid Meier's Bach generator, only shooting for what unites all music. I... didn't get very far. At all. Tried to learn Lisp, studied fractals and neural networks, dipped into papers on other music generators, outlined code that never got to the point of compiling. Getting nowhere was an issue at graduation time. There was even a question about whether I would graduate, and it meant I didn't get a fancy emblem on my diploma.

It was a great excuse to learn music, though. As the year began, I signed up for the available semester-long theory class and convinced my dad to buy me a cheap guitar, which I gradually taught myself to play. And I've been fascinated ever since. Besides failing, which can be a healthy thing—it's better to take on an interesting challenge and fail than not challenge yourself—I realized that I like improvising my own melodies way too much to want a computer to do that for me. Sour grapes? Maybe. Maybe—but I really think they're sweet. Anyway, neural network research keeps closing in on what I over-ambitiously tried to do in 1999-2000. Eerie stuff!

Saturday, May 2, 2020

Rules of democracy

While trying to piece together my memories of the places I lived when I was really little (it's a jumble!), I found myself going down the rabbit hole of Czech political history. Czechoslovakia was a communist state the entire time my mom lived there, from 1951 to sometime in the early 70s when she gave up her only citizenship to get out of there, even though it meant she could never go back.

The first time I went to the country was (I had somehow forgotten this) during the Velvet Revolution of 1989. It was still underway, but the president had resigned. Though I don't remember seeing any protests, I had a faint idea at the time. Change was afoot. After all, we were suddenly able to visit.

We'd watched the Berlin Wall fall not 6 weeks before on television, and Vaclav Havel, the man about to become the last president of Czechoslovakia (later the first president of the Czech Republic), had been arrested only 5 weeks before our arrival, during the first big protest in Wenceslas Square. He became president about 3 days after we flew back.

From my perspective, we went to see family for Christmas, which was wonderful for multiple reasons, not least of which was that it was so different. Everything was utterly grey, though. The Czech capital, Prague, looked nothing like how it looks now; it was a shock a few years later to visit again and discover that it's a gorgeous city. (Digression: history in the making again, take a look at Prague during COVID-19 lockdown in the short film The Silence of Prague. In the second to last shot, just as the title gets to the top of the screen, there's an old city square with a clock tower on the left. My mom was just telling me that she lived right around the corner of that clock tower in college, meters from what you can see in the frame.) My parents had mentioned protests and communism ending in the country, but it was all pretty distant to me, even though technically we were there.

The moment in history is called the Velvet Revolution (or the Gentle Revolution) because it wasn't violent, in general, and this was no mistake. Hundreds of different pamphlets had been circulating in the capital, and the sentiment on the street was to put a premium on peaceful protest and maintain "humanness" under all circumstances. Of the two most famous flyers going around at the time, one was called "The Eight Rules of Dialogue." My mom has mentioned it to me before and read it out loud, translating it for me, but that was much later and I didn't put it in context until now. The Eight Rules set the stage for a new democracy.

More than 100 years before, Marx and Engels had called for a "violent revolution" as the only way to get communism up and running. But once Marx could see the consequences of those words in persistent brutality and dysfunction, he admitted that the statement had been a grievous mistake. Which brings us back to Czech communism: mistakenly brought in with violence, peacefully dismissed with protest when it simply didn't work anymore.

Vaclav Havel, the new president, had long been a popular playwright, poet, revolutionary, and essayist—since the 60s. Here's a quotation I like:

"I really do inhabit a system in which words are capable of shaking the entire structure of government, where words can prove mightier than ten military divisions."

To me, that's a reminder of what democracy is about and why it works. We use words.

Here are the Eight Rules from that famous flyer, courtesy of the book Revolution with a Human Face by James Krapfl:

"The Eight Rules of Dialogue"

1. Your opponent is not an enemy but a partner in search of truth. The goal of our discussion is truth, in no case intellectual competition. Participation in dialogue assumes a triple respect: toward truth, toward the other, and toward the self.

2. Try to understand each other. If you do not correctly understand the opinion of your opponent, you can neither refute his claims nor accept them. Formulate for yourself the objections of your partner, so that it may be clear how you understand him.

3. Don't present insistence without objective reasons as an argument. In such a case it is just a matter of your opinion and your partner need not concede the weight of the argument.

4. Don't skirt the issue. Do not avoid unpleasant questions or arguments by directing the issue elsewhere.

5. Don't try to have the last word at all costs. No quantity of words can make up for a missing argument. Silencing a partner does not mean refutation of his argument or disavowal of his ideas.

6. Don't undercut the personal dignity of your opponent. Whoever attacks the person of his opponent, rather than his thought, loses the right to participate in dialogue.

7. Don't forget that dialogue requires discipline. In the end it is with reason, never with emotion, that we form our claims and judgments. He who is unable intelligibly and calmly to express his opinion cannot conduct a worthwhile conversation with others.

8. Don't confuse dialogue with monologue. Everyone has the same right to express himself. Don't get lost in minor details. Consideration toward everyone else can be expressed by your ability to save time.

I must say that's a truly awesome way to have a revolution.

Thursday, April 9, 2020

Hooking the singing fish of time

One of my favorite things about art is connecting your ear more closely to your mouth, yet over greater time spans. The link may even be a roundabout way of defining art. When you are inexperienced, you see or hear or taste or touch something you like, and you know that you can't make the same. This may upset or excite you; it could encourage you or push you away, spur you on or sap your energy. Your reaction could be any of an endless number of combinations of these facets, and it'll change over time.

But, at the critical core, with practice you slowly, slowly begin to realize something. Whatever you can perceive and appreciate—you eventually begin to feel this is true—you can express the same quality yourself. The fact of your appreciation begins to seem a potential for the same acuity, subtleness, emotional power, whatever it is. This takes quite a long time and piles of hard work, and for much of it you will believe what I've just said is untrue.

A wondrous artist can draw on ancient influences, ones maybe from childhood or a lazy day-trip to a local museum, influences whose sources may rest entirely forgotten, and reproduce the same kind of effect without duplicating the original expression. And what makes the artist truly great is that the artist's "mouth" will still be finding influences a century or five later. Do you see what I mean? Art is a strong connection between ear and mouth over very large distances in space and time.

You can make that connection. You can strengthen it. Many things are mysterious, including the dizzying multitude of potentials in expression and art, but that fact isn't. That fact is as stable as the big rock you lie on for a windy view of a valley.

Know how a hook works

There's still a stigma against both video and board games. It's perfectly socially acceptable and normal to spend countless hours binge-watching shows, and the more hours spent reading, the better. Yet as soon as someone's playing, each minute is seen as squandering life potential. Huh. Tell me how that works.

The alternate reality of a game world is no more a world you can live in instead of this one than the alternate reality of a tv show or book is. In all cases, if you are normal, you know it isn't the real reality, and you balance that against the rest.

As someone who studies and tries to make interactive media, a field that includes (but is not limited to) games of all kinds, I'm almost the opposite of the general public here: I'd find it productive to play more games. Rather than worrying about whether I'm wasting my life playing a game, I worry that I'm not playing enough games (a legitimate worry; since I was a teenager I don't play games very much).

What I can say in light of all this is that my own potential to be addicted to games is very low. It's no more than my potential to be addicted to a good book. I'm constantly bothered that I don't get addicted to all these things enough to go back to them and finish them.

There is a segment of the population with troublesome gaming addictions, but it's 1-3 percent, and you probably know who you are. And I think it can be helped by becoming aware of whatever hooks you, seeing how it works, and distrusting designers who want to hook you as much as possible without providing emotional and intellectual value. For example, though it has plenty of good qualities, World of Warcraft often snags nefariously because the cooperation is so intensive that team members shame each other into attending group battles and quests and dungeon crawls that benefit the entire group. There are professional sports and professional competitive games. There is nothing inherently wrong with taking a game seriously, but anyone whose life is diminished by social demands in a virtual world should massively distrust those demands: they have no real currency. There is no shame in not attending an online raid, whatever anyone tells you or implies to you. Similar realizations—and sticking up for them—can, I think, help other sources of game addiction.

I've spent entire weekends locked in some game or other, barely moving. This was mostly when I was much younger. But you know what? I loved it. I've never felt I was harmfully addicted to a game. It was like getting lost in a book or miniseries or in something I'm writing. It's satisfying. It's meaningful. You remember it the rest of your life. And in balance with other things, like standing up and taking walks so you aren't sedentary for more than, say, an hour, it needn't bring any harm at all. But I will say: I happen to have an almost extreme distrust of things that don't have endings. If a television series runs more than two seasons, I almost certainly won't ever see more than an episode or three to get a sense of what it's like—even if I want to watch all of it!

In a weird way, my form of ADHD might actually protect against game addiction. But according to the numbers in this article, game addiction is rare in the general population regardless. As with plane crashes, we overblow the image until it seems to mean something it doesn't.

And I think we can look to our expectations to remedy our relationship with play. If you see a game as a waste of time, as a low-brow activity, as shameful, necessarily unproductive, then why would you look for an especially good one? And if you are playing just to kill time, why would you judge a game on anything but whether it feels addictive and you want to try another level? If your expectation is that games are for killing time, then that may be the fate of games for you. Personally, I love games, but I never seek to "kill time": I hate the very idea. Is that maybe why I appreciate all the games I get around to playing?

Playing games is a healthy part of life if you approach it in a healthy way. My go-to comparison is drinking. Many people drink, if not most people. Yet the benefits of drinking are less than the benefits of playing games, in my opinion, and the risks are far greater. If anything, drinking should be looked down on, and games celebrated everywhere.

mardi 31 mars 2020

Uselessness

It's fashionable to make fun of what we learn in school as useless. When I was a kid, everyone seemed to agree that math had no value unless you became a math teacher. By the time I was in university, when positions in mathematics, computer science, and data science were the cushiest, most sought-after jobs, the tune had changed. Yet kids still say it: math is useless in regular life.

Let me take up their case for a moment. Even school administrators have been bubbling away recently about algebra being "obsolete." They argue that its usefulness is too specialized. They would rather largely replace the math in grade school with data analysis. In short, they want to train the next generation of data scientists. Computers, after all, can do algebra better and faster than humans can. And surveys of adults do indeed show that few use math beyond arithmetic in their daily and work lives. Interestingly, there's actually a blue-collar effect. Highly skilled blue-collar workers tend to use math more than white-collar workers. Many get promoted out of jobs that require math. They get bigger paychecks and preoccupy themselves with telling others what to do.

Are we just training people for jobs, or are we introducing people to this universe so that they can understand what's going on?

Although most people don't use much math on the job, a majority of people dislike or hate their jobs. The best jobs out there require more than an average level of understanding, including, often enough, in math. Mechanics and welders may use more algebra than you do, but so do people rolling down the tightrope of the frontier, reaching what exists only tomorrow out there by firmly understanding today in here. If his uncle Jakob hadn't taught Einstein math proofs at an early age (and no one else had), he would never have precisely defined the relativistic effects that reveal the large structure of the known universe, nor would he have helped discover quantum mechanics. And no, these aren't so esoteric. Your car battery uses special relativity for 80-85% of its voltage, and that green pigment in plants? It's there to capture individual photons for photosynthesis, a process that would not work without quantum mechanics. That's one reason it's taken us so long to understand. Imagine if we'd grasped photosynthesis and been able to run it with machines a thousand years ago! Life would be radically different today.

We take the power of photosynthesis as a given today, although we haven't quite harnessed it yet. Not long ago, its existence and potential weren't widely appreciated. Take a look at the 1938 movie "You Can't Take It with You," 49.5 minutes in. Between the dawn of history and 1938, I'd be willing to bet most people didn't need to know what photosynthesis was or how it worked to get by in life. A farmer would need to know about the sun and seasons and weather, of course, but apparently none of the details about chloroplasts or chlorophyll. Do you? Is that something you need to know? Does your answer to that question also answer the question "Is it useful?" They're two different questions.

Many people want to be promoted out of technical roles, and that's fine. But I don't really want to do a job that doesn't use more of my internal networks. If you offered me a long-term job that paid much more but required no math or other technical skill, I doubt I would take it; taking it would require some incredible purpose or mission. I enjoy using math and many other problem-solving skills in the same hour, and day to day.

I see the fact that most people don't use math on the job not as a kick at math, but as a knock against our society. It's still feeble at drawing on our collective intelligence.

But let me circle back to my original thought, the reason I started typing this. Simplicity is way more powerful than you think. We have a tendency to dismiss the simple. We think we need to earn our right to answer a question. We think a right answer needs to look a certain way. But solving a problem has nothing to do with rights or earning. If a problem can be solved simply, it doesn't matter how ridiculous people find the solution, or whether you have a PhD in a related field, or who has a right to what. None of that matters. If the solution works, the solution works. That is the character of fact. That is the mechanism of reality: detached from our preconceptions about ought or would.

If you get through university, you have been introduced to a shocking variety of difficult problems and their solutions. Many of those could save lives in the right circumstances, and many have. Do not underestimate the power of simplicity. Do not underestimate the power of complexity.

Let me also address the word "useless." Would you say a medical ventilator is useless, for example? Have you ever used one, personally? Most likely not, right? I would say a medical ventilator is incredibly useful, yet most people haven't used one, and if they have, someone else operated it. If we stopped making them because most people haven't used one, that would suck.

Math and science are astonishingly greater than a medical ventilator, in that a medical ventilator is just one expression of their usefulness.

If you feel you are the kind of person who will never use math in real life, that doesn't mean you couldn't benefit or benefit others by using math. It's a choice you make yourself, and we all have different strengths and weaknesses, and that's a wonderful thing. But just because you aren't going to use it in real life, or not much, doesn't mean you should go around saying it's worthless. Remember the medical ventilator. You might never need it, but if you do, let's be grateful it's there and someone knows how to turn it on.

samedi 28 mars 2020

How to find out

Found a paragraph I copied out three years ago from Bertrand Russell's The Problems of Philosophy, chapter 15:

"The value of philosophy is, in fact, to be sought largely in its very uncertainty. The man who has no tincture of philosophy goes through life imprisoned in the prejudices derived from common sense, from the habitual beliefs of his age or his nation, and from convictions which have grown up in his mind without the co-operation or consent of his deliberate reason. To such a man the world tends to become definite, finite, obvious; common objects rouse no questions, and unfamiliar possibilities are contemptuously rejected. As soon as we begin to philosophize, on the contrary, we find... that even the most everyday things lead to problems to which only very incomplete answers can be given. Philosophy, though unable to tell us with certainty what is the true answer to the doubts it raises, is able to suggest many possibilities which enlarge our thoughts and free them from the tyranny of custom. Thus, while diminishing our feeling of certainty as to what things are, it greatly increases our knowledge as to what they may be; it removes the somewhat arrogant dogmatism of those who have never traveled into the region of liberating doubt, and it keeps alive our sense of wonder by showing familiar things in an unfamiliar aspect."

There is risk in following and in not following the conventional wisdom—all believing comes down to a bet. You can be more informed or less, can cast the bet of your interpretation at a bad time or a good time, but it can only be a bet. And I repeat this too much! But what B.R. calls "liberating doubt" amounts to energy. You could fear, or you could suspend disbelief. Both are doubts, see? There are as many shades of doubt as there are of support. Doubt is chromatic, not monotone. But let's admit this is just another spiritual creed, one I prefer to live by...

jeudi 19 mars 2020

Racin data

The French have this phrase, "raison d'état." It means "official reason," more or less; literally, it's "the reason of the state." It can also be translated as "national interest." A leader using raison d'état seems to justify a political move, even sounds diplomatic. But it's really about a cover story. Raison d'état? Everyone knows it's a lie, but it keeps up appearances, gives a little plausible deniability.

For a commonly cited example, W went into Iraq with the raison d'état of quelling terrorism. It was apparent to many (especially foreigners, at first) that the invasion was tangled up in oil with an extra dressing of finishing up for pops. Retaliating for terrorism was opportunistic: any semi-credible excuse would have done just fine. There was a national interest: resources. There was a sheen of diplomacy: keeping the free world free.

When Trump says that he always knew this was a pandemic and very serious, that's his raison d'état for suddenly ramping up coronavirus testing and emergency stimulus measures very late. The real reason is that he was deluded and spent weeks in flat denial of the situation, trying to confidence his way through it. Today, there's a dire need to scramble for lost time. Rather than saying oops I was wrong, I guess this was 'ugely serious after all, which would sound weak to people he can fool, he instead says he always knew it was a pandemic and very serious. Of course of course, he's gotta ratchet ratchet ratchet things up. He always knew it was serious because it was serious. Plausible deniability! See? That's raison d'état.

Maybe his followers are more permissive or accepting of raison d'état than I and people in my filter bubble are.

I see the lie. It insults the intelligence of too many people. And the horrifying fact is his mistake will lead to thousands of deaths.

Reverse mortality

Virginia had its first coronavirus death on Friday, March 14. Working backwards according to some numbers and calculations in a big article I highly recommend (Coronavirus: Why You Must Act Now), this means that, statistically, we probably had 692 to 1384 cases in the state that day, rather than the 41 confirmed. Obviously the real number of infections could be more or less, since we're talking exponential growth (around 12% per day, given the 6.2-day doubling time used below), which will amplify little differences. The range 692-1384 comes from 1% versus 0.5% mortality when medical services are good. If we assume that healthcare around here is great (0.5% mortality), that actually means 1 death implies 1000+ cases now, and 200 when the person got infected.

Well, hold on.

The big problem with the calculation is that it assumes the first 200 people caught the disease on the same day. That isn't totally impossible, given that COVID and SARS have super-spreading (and people flow across borders), but it isn't likely.

I'll try to fix that "big problem" in a minute. Let's look at the basic reasoning first, though.

The focus is on number of deaths because it's the only reliable count. That will be true regardless of whether people are quarantined, locked down, going to concerts, etc. Asymptomatic passengers get on planes and go to restaurants... corpses don't, really, and it's hard to miss the fact someone died. COVID-19 was discovered when someone died from an unknown virus in China. (It was seen in bats earlier, back in 2015, but never mind.)

For the first critical cases, medical staff can work their miracles. Later on, with overloaded hospitals, mortality goes up to 4 or 5% (it'll look higher thanks to an illusion of record-keeping as partial data pours in), but that isn't relevant early on. Thankfully 0.5% is the best estimate when everything's working (thanks go to Tomas Pueyo for explaining this so well in his article). Around 1 in 200 people wouldn't make it. That's on average, but the virus has not been mutating fast for a virus, and no one has a cure. It should be fairly consistent.

Let's try to address the inaccuracy I mentioned. It changes the emphasis more than the numbers, because of the wonders of exponents.

First we'll assume "low" mortality at 0.5%.

If the very 1st person who caught the virus in a region was the one who died, and we assume steady growth from 1 case, then there could be as few as 7 cases (on average) when that person died 17.3 days later (about how long it takes).

On the other hand, if the 100th sick person was the one who died, there would be 692 cases at that point, on average.

If the last of the 200 people was the one who died, or if they all contracted it on the same day, then there would be 1384 cases at that point.

The article estimated ~800, which is the exact same calculation as the one with "probable" next to it below, only rounding the exponent off to 3 for simplicity. Anyway, here are the calculations if you're curious (doubling happens every 6.2 days according to lots of data now):

1*2^(17.3/6.2) = 6.9
100*2^(17.3/6.2) = 691.8     ← probable
200*2^(17.3/6.2) = 1383.6

And if healthcare isn't quite as great and mortality is 1%, the expected total number of sick people on Friday would be more like:

1*2^(17.3/6.2) = 6.9
50*2^(17.3/6.2) = 345.9     ← probable
100*2^(17.3/6.2) = 691.8

Notice two of the numbers get repeated above (6.9 and 691.8). We're just highlighting different parts of the same curve. Neither mortality rate nor the patient's "position in line" for catching the virus alters that fundamental curve.
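
If you want to poke at these numbers yourself, here's a minimal Python sketch that reproduces both tables. It assumes nothing beyond what's stated above: 17.3 days from infection to death, doubling every 6.2 days, steady exponential growth, and a mortality rate of 0.5% or 1%.

    # Back-of-the-envelope model: total cases when the first death occurs.
    DAYS_TO_DEATH = 17.3   # average days from infection to death
    DOUBLING_TIME = 6.2    # days per doubling
    GROWTH = 2 ** (DAYS_TO_DEATH / DOUBLING_TIME)  # ~6.9x over that window

    def cases_at_first_death(position):
        # Total cases at the moment of death, if the victim was the
        # position-th person infected and growth stayed exponential.
        return position * GROWTH

    for mortality in (0.005, 0.01):
        pool = round(1 / mortality)  # one death expected per `pool` infections
        print(f"mortality {mortality:.1%} (one death per {pool} infections):")
        for position in (1, pool // 2, pool):
            total = cases_at_first_death(position)
            print(f"  victim was case #{position}: {total:.1f} cases at death")

Run it and you get the same 6.9 / 345.9 / 691.8 / 1383.6 figures as above.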

Looking at the reporting for Saturday, March 15, I see there are 6 states with 1 death each. Colorado and Georgia have about 100 cases each, but the others have a lot less: 45 for Virginia, 36 for Oregon, and Kansas and South Dakota with 8 and 9. Those numbers are systematically a lot less than 692, which should be a typical number of cases with 1 death reported so far. (Many would be silent or still incubating, so the discrepancy wouldn't be surprising even in a fantasy land of excessive testing.)

That's an average of 50 cases when we'd expect 692. So... for lack of a better approach, maybe multiply cases reported in the US by 14?

Unless I've missed something else major—other than that people get better and this changes the dynamics (but not much early in an outbreak).

Also, I must say calculus would give a slightly better estimate than 692... wait, scratch that! Exponential curves are neat. Integrate x*2^(17.3/6.2) from 1 to 200 and divide by 200. The mean number of cases by the time of the first death (whatever that person's position was in the first 200, averaging all the possible infection counts when they die, i.e. from 7 to 1384) is precisely the same, 692.
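
If you'd like to check that average yourself, the integral has a simple closed form; here it is in Python, using only the numbers above:

    # Mean cases at the time of the first death, averaged over every possible
    # position of the victim in the first 200 infections:
    # (1/200) * integral from 1 to 200 of x * G dx, where G = 2^(17.3/6.2).
    G = 2 ** (17.3 / 6.2)
    mean_cases = G * (200**2 - 1**2) / (2 * 200)  # closed form of the integral
    print(round(mean_cases, 1))  # 691.8: right back at the "probable" figure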

mercredi 18 mars 2020

Out-rocked

It's easy to think of things like rains, hills, or sunshine as earthly. But think how many places out there have rains, hills, sunshine.

Almost everything you associate with Earth is out there in incomprehensible quantity.

Look straight ahead of you. What's there? Whatever you see—yes that—and about half of the universe behind that.

When you take a long flight over the ocean, and you land, and you notice that this new place is real, just as real as your own, yet feels so different...

That's arriving in a new star system, landing somewhere you can put your feet. Hey look, there's rock. There are breakers. Little puddles of water bake in the sun. Sand is over there, oh wow, and there's a pebble beach. You can pick up pebbles and skip them. The sun is different, but it's more of the same. It's star.

Neither of us has any idea how much of that is out there, other than: it's more than we imagine.

If you think an ant is small and expendable, never forget that you're about the same size. The difference seems big to you, but if it were really big, you'd have difficulty comprehending the scale separation. An ant feels small to you because you're almost the same in size. An atom doesn't feel small to the stomach. We have little comprehension of it.

This isn't abstract. Try to feel how much damn rock is out there, out in space orbiting, spinning, heating and cooling at the same time, forming new chemicals. How many galaxies of that? How many galaxy-sized containers of rock evolving like that? Trying to feel it all is like trying to pick up a brick wall. You're going to lose traction. That's very concrete.

The July 19, 2013 Cassini image of Earth from Saturn (The Day the Earth Smiled), Earth glinting under Saturn's rings, was the prompt for these efforts to express the wonder I feel. It's as if those rings are a foreign airport terminal, a very foreign one.

dimanche 1 mars 2020

Where's that?

The question "where do you get your ideas" is a really weird one. And I've heard a number of creative people answer this, and they always seem bemused. It's as if people think there's this special place you go, a corner cafe maybe, and if they dutifully mosey their flip-flops on down to that same corner cafe, now they will have great ideas. In reality, ideas come from absolutely all over. It isn't a place you go. It's called opening up your skull and using your sense of wonder. Anything can be an idea. Turn your head to the left. There are 5 ideas right there.

To maintain some kind of standard, I just did that. I stared at my closet door for a minute, and I imagined 5 different visual/story ideas, and they were all based on details of or defects in my closet door. (One was a strange pattern of tall, thin windows that give a view of the seaside; one was a bird's eye view of a dad wearing a long, blue dress jacket walking over a desert of dried paint and falling face-first over a large lump of paint that, to little him in that great white paint desert, is almost a hill; one involved a virus spewing like steam out of a cut wire; one was a sliding door on a cozy spaceship; and I forget the other one, but let's take another and say a closet is covered by a big rectangular granite slab that has misshapen eyes near the top, and maybe that kind of interior decoration is fashionable in this time and place.)

It's like asking, "Where do you go to listen to NPR?" That's fine for a person curious about your habits, but otherwise it shows a misunderstanding of the availability of NPR. If you've got a working radio, you just turn it on and tune in, wherever you are. If it isn't working, or you don't have one, you go and get one, or get it fixed.

The most basic trick to ideas is to stop worrying if it's a great idea, or if it works, and start pulling around the clay in your mind, the clay of all those sensations and memories, and, yes, whatever is in front of you at the time. The weirder, queasier, more outlandish, more half-formed and emotional and impossible to describe, the better. If it's terrible, that's like stretching before a run. There's no shame in it. Ignore any suggestion there's shame in it.

Sometimes an idea will punch you in the gut from around a corner you don't even see.

For example, two days ago, a student mentioned... what was it... ah, the French Revolution, when the revolutionaries wanted to change everything, including the calendar. So they adopted a completely new, bizarre, unfamiliar calendar. And it quickly fell through, because no one got used to it. Were the weeks even 7 days? I wanted to find out more. It excited my imagination. I said, that's a really interesting opportunity for a story, that moment in history, the bewildering calendar, what happened.

And I know I'm right.

What did that take? Almost nothing. It's like going to the supermarket. Go there, you'll see things you can eat. Except... the supermarket is everywhere. "Where is the supermarket?" happens to be exactly the wrong question. Where I S N ' T the supermarket? Now we're talking. Now we're cooking with gas.

There's this misconception that unless you are very responsibly worrying all the time that this idea won't work, you won't come up with any ideas that work. Totally wrong. Just come up with ideas. Don't shut off all critical thinking always, but suspend it sometimes, sometimes completely. And don't be too quick to kill something that doesn't pass the critical filter when you turn it on. Often it just needs tweaking or reimagining.

There's no shortage of people in the world who will delight in telling you what they think is wrong with your idea. They won't always be wrong, won't always be right. And you'll need to do some of that work yourself, a lot of it. But you can be different from them. You can do more than just shoot an idea down.

The biggest mistake most people make with ideas is confusing familiarity with plausibility. In cognitive science, that's sometimes called the availability heuristic. Basically, you try to imagine a thang, and if you run into any resistance, you conclude that this indicates the thang is not plausible. For example, if you are really straining to imagine that Puerto Rico could ever become the 51st state in the US, then you may conclude that this means it is correspondingly unlikely to happen, or unlikely to work if it does start to happen. Conversely, if it's easy to imagine your neighborhood's nuclear power plant melting down and spewing radiation all over the continent, or your next flight losing both wings and crashing into the ocean, you conclude that's very likely.

Just reminding you to remind yourself, then: "imaginability" is a well-known and carefully studied fallacy. Your brain is not half as good at all that as you assume. You are often enough drawing the wrong conclusions without realizing it, just on the basis of ease or difficulty of imagining scenarios. To put that differently: on the basis of familiarity.

If you've ever described new ideas to people, you have run into this. Even if the thang is true, even if it works, even if you know this, people will push back because they have trouble imagining, and they are sure this must mean it just isn't realistic at all.

Don't do that, and you're already a million miles ahead of most people. Notice that isn't a place you go! It's just watching out for a bias and shooting it in the head when you see it. With a Nerf crossbow. But it'll get the idea and leave you alone for a few minutes.

vendredi 28 février 2020

This is rarely talked about, at least not around me where I look. When you make something, something expressive, you are going to be misinterpreted. It is inevitable. I'm not saying lay down your white-out brush and never attempt clarity. But if you are always breaking and papering over and whitewashing your efforts to make sure they cannot possibly be misread, what you are doing is technical writing, not expressive writing.

Someone's going to misread you. The best you can do is the best you can do. Sometimes you will have to leave in the possibility of an interpretation you don't like, one you see, one that's painful to know is latent in this, because you do not have the skill to recapture what you would lose if you surgically removed this possibility.

There is no perfect creation. The closest creation to perfect is the millionth, not the millionth revision of one.

jeudi 27 février 2020

These days

The claim that there's too much political correctness these days raises my hackles. On the one hand, I'm really not a fan of "correctness" to begin with. True is true. What's correct? Someone's opinion. But my attitude here is nothing new at all. It isn't "these days." It's my permanent attitude, and I'd apply it to every era of human history.

If you're in a minority and you say "political correctness these days," I will fully sympathize, for the reason I just stated. For you, I will not assume that "THESE DAYS" is code for being unwilling to consider yourself prejudiced when you are. To state that differently, if a lot of prejudice gets directed at you, I will not infer that the actual problem is in you. Furthermore, I do agree that there is something going on THESE DAYS that's relatively different from how things were a generation ago: extreme polarization.

If you're a white guy (doesn't mean you aren't in any minorities, but bear with me) and I hear "political correctness THESE DAYS" as a complaint coming from you, I will be seeing Trump's face in my mind, unfortunately.

It's a double standard, I know. It is not strictly applied. But "political correctness THESE DAYS" is automatically suspicious to me, and I'm very willing to sympathize, but, see, I need an actual reason. I need a reason to see those words as something other than prejudice.

To NOT get this response from me, all you need to do, really, is talk about polarization instead. I don't care who you are or what country you live in. Polarization quickly becomes a sickness. Extreme polarization is ultimately worse than COVID-19.

As for political correctness THESE DAYS, a lot of it is about making sure people who are not like you feel welcome in society. If there are a few verbal misfires, it's well worth it for that larger goal. Most of the change (the "these days"), if we filter out polarization, is absolutely needed, even critical.

We've really gotta watch out for this ready assumption that there are two sides to every coin, and therefore two genres of human, one for each side, and that's that. Come on, fellow beings. Nothing could be further from the truth. We are not a flat surface.

There are not two sides to every story. There are more than two sides to every story. There is not a reason for everything. There are multiple factors behind everything. There is not one causal sequel for an event. There are multiple causal sequelae, probably impossible to list, because there are so many.

That's how reality actually works. Binary numbers are great, but splitting a nation or a world into two parties is an unhelpful fantasy.

So no, I am not worried about people becoming more mindful of others these days. More of that, please! I'm worried about people who stop thinking, who lose curiosity, who are suddenly convinced they have nothing to learn. I worry whenever it's the general belief that to stop thinking is correct and good—versus incorrect and bad, the only other possibility, apparently—which means now we are not mindful enough.

And that is not a kind of person. It is no indication of being on a right side or a wrong side. This mindset, the way it feels to be in it, does not carry a barcode that identifies whether you're mistaken or not. It isn't one kind of person who thinks in black and white. That is what humanity is like. We've all got to watch out for it, and help each other be better, more observant, wiser.

dimanche 16 février 2020

Debatiary

I've noticed there are at least 5 quite different kinds of debate. What they're called isn't that important, so don't mind my names too much.
  • Gladiator
  • Procedural
  • Feminist
  • Academic
  • Collaborative
Televised debates are almost always gladiator debates. Representatives of factions duke it out, striving to show each other up and seize prime moments for one-liners. Two things set this debate style apart from others. First, it's done for the spectacle. Everything is pitched, above all, for the wider audience: pitched to persuade, pitched to win favor. Second, no one on stage can ever admit they're wrong or express much doubt. Not only would such an admission seem weak, but the speaker making it would be seen as failing to represent—or even as betraying—their ardent supporters, and they'd be punished.

The debates that go on in courtrooms and houses of deliberation are what I call procedural. They have clear rules that must be followed closely. These are similar to games, with distinct moves. They have a lot in common with gladiator debates, in that, for example, lawyers aren't primarily concerned with sharing the complete truth. Participants want to win for their clients or constituents, and they will use persuasion to do so. But procedural debates can also have an objective focus. A jury's purpose is to select the true story. A good lawmaker wants the law that works best for everyone.

Feminist debates are similar to what's seen in classroom discussions. Everyone has a perspective and this is politely respected. There is not a pressing need to establish fact versus illusion. That would be seen as rude. Another place this style is seen is in group therapy sessions. The emphasis is on sharing, respect, listening, and turn-taking.

Academic debates prioritize theories, interpretations, argumentation, and evidence. They have some of that in common with procedural debates and feminist debates. They are often competitive in the sense that participants will take clear positions and promote those. Ever human, academic debaters can still become offended when someone doesn't agree, which introduces an ironic element of interpersonal squabbling and feuding where, in theory, it ought to have disappeared.

Collaborative debates are what I consider true debates. They take all the best qualities of the others. Participants strive for truth, uncovering it by sharing and listening to and critiquing evidence. They take positions, but they freely change these, and they know that they can play Devil's Advocate and they won't offend their fellow participants. Everyone there knows that they are all there to advance their knowledge together. Winning means learning something new, helping to build a product, or even just imagining in a fresh way.

That last kind is rarely seen, except maybe between two or three close collaborators.

samedi 15 février 2020

Stuck appearing

I have a recurring dream that there's a class I've been forgetting to go to all semester. However well I do in the others, this one is an F, probably an unavoidable F, come to think of it (how do I keep blotting it out so completely?), and I bet there's a test today or tomorrow. Fear rises. I don't remember how I'd find out when the next test is, or what's on it. Where did I put the syllabus? Don't I have one? It would be so embarrassing to go and see the professor—I probably won't. Sometimes, the dream includes the detail that I never finished high school, so I'm back in high school as an adult to get this out of the way, and I'm the only one like that, which makes me self-conscious. At first, I'm glad to finally get around to this. Then I realize I've somehow completely forgotten that class all quarter, and I'm going to fail it. So I'm back in high school remedially, but I'm failing it.

Considering that I spend so much time teaching high schoolers, and there actually were classes in college just like that, the only thing that's really surprising is that after I wake up, I still fully believe there's that panicky class I haven't been going to, and it takes a few minutes to convince myself it isn't true. On the other hand, I haven't finished my Master's project in 8 years and counting, so there's that.

This time, my mother was driving me to school. We stopped at a supermarket and spent a strange amount of time there, so I prepared mentally to arrive at lunch. She took a photo of the checkout conveyor and put it on Facebook (she's never used social media, and it didn't exist then). Back in the car whirring down the main road to school, I closed my eyes and imagined I was on a yellow bus; and I was, looking out the window. I tried to think of less typical things to photograph and post (in the dream, apparently, my mom's conveyor-belt snapshot was a total cliche). I fell asleep. When I woke up, my brother was to my left driving the car, at his present age. "What are you doing here? How did you get here?" I asked. He sort of chuckled and said, "Hm. How DID I get here?"

mardi 28 janvier 2020

Biosphere routines, feedbacks

I always feel slightly at odds with the term "climate change." Deep political stasis needed shattering with a slogan, and the phrase filled in. We had to unify and focus, pack up our smorgasbord of ecological concerns and send a single must-have item home. The old terrestrial alarms of the 19th and 20th centuries ended up putting decent people off. Numbing them. Too often our best information was seen as alarmism. It wasn't seen as fire alarms or canaries in coal mines, as notices we should at least hear and preferably investigate to see if we can help. As in all eras, some concerns were exaggerated, or turned out to be failed best guesses from partial facts, but so many weren't.

The stasis isn't unlike finding yourself saying "everything causes cancer!" to dismiss a new snippet on how you might get or avoid cancer. The rational action, of course, is to listen to the piece of information. It could save your life. (And it could save someone else's if you share.) If you believe that thinking about cancer will spoil your fun while you smoke barbecued lamb or spray paint a car door in an unventilated basement, I'll tell you what'll spoil your fun: dying unexpectedly. Talk about exaggerated concerns! It is untrue that everything causes cancer. It is true, however, that the dose makes the poison. So if you were rational, you wouldn't say "everything causes cancer!" You would listen and then try to glean what evidence there is and what dose of this factor could be threatening. Even too much water will poison you, but if a companion points out that the wild berries you're eating are poisonous, good luck with the reasoning that "everything is poisonous!"

"Everything's bad for the environment!" Are we on the same page about the difficulty we Earth aliens are having when we think about this? It's a lot easier to ruffle your flightless wings and stick your head in some mud. It's more difficult to think, "Wait a minute. Many things are bad for the environment. It isn't just one. But, hm, not everything is bad for the environment. And not everything is equally bad. Hm. So what are the worst things? And what are the best things? Where can we start today?"

It isn't actually that hard. It's just harder. And many people out there are setting the easy example of head-under-quicksand-it's-really-great-try-it. Humans love to imitate each other.

Climate change has helped focus our concerns. Most of us know it isn't just global warming from CO2 that's looming on our radar. At the same time, I worry that "climate change" is both too vague and too specific. If someone doesn't feel as if a changing climate sounds all that bad—after all, we've lived through ice ages and hot spells, and plants will enjoy some extra CO2—then "climate change" is just vague enough and just specific enough that this person may feel excused from considering any of the problems at all.

This leads to the question of whether we face one problem or many unconnected problems. And while there are many problems, I believe they are connected. These are dangers to the biosphere introduced by inefficiencies and inadequacies in the world's political and economic practices. In my opinion, we have plenty of information about how to upgrade those practices. The trouble is that the upgrading is happening much more slowly than our knowledge and understanding advance.