Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.
There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.
I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.
Tangentially related: the easiest way to come up with a unique and cool idea is to come up with a unique and dumb idea (which is way easier) and then work on it until it becomes cool. (Think how dumb some popular franchises' concepts sound if you take the raw idea out of context.)
I like the way Ted Chiang puts it:
I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.
I like this a lot. I’m going to thieve it.