I'd argue the interesting thing is that these things do study how people put words and pixels together, and that says more about creativity than we want to admit. Like it or not, ChatGPT can write short-form poetry better than 99% of human beings. There's that last 1%--it ain't Shakespeare--but it does say something about creativity not being the spiritual connection to a higher realm they viewed it as in the Romantic era. If a random assortment and probabilistic approach can give you something that's better than what most people can do, maybe this creativity thing isn't this specially human thing we thought after all? Maybe we're just machines made out of C, N, O, H, and a few other things instead of Si and O?
Look at DALL-E 3: over in another thread, people are having fun making custom art for their D&D characters and coming up with increasingly nutty ideas that the computer then brings to life. (And laughing at the ridiculous bugs, of course, but the thing is getting better and better constantly.) I don't know about the more spiritual uses of art, but it seems to be handling 'beauty' pretty well--in some cases a little too well, since the people it draws are often cuter than most actual human beings.
The thing is, as part of being an autocomplete, it also has a probabilistic sense of what the next thing ('token' in the parlance--it may not be a complete word) is supposed to be: 'Flint' might be 50% likely to be followed by 'Michigan', 20% by 'water', and 10% by 'Fireforge' (and 20% everything else), but 'Fireforge' might go up if the last couple of tokens included 'dragon' and 'dwarf'. That's a mathematical relationship, but it also tells us something about context in a text. I've never seen anyone crack open the weights and see which tokens lead to which, but it would be an interesting project to say the least.
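To make the idea concrete, here's a toy sketch of that context-sensitive next-token distribution. The numbers and the bump-for-fantasy-context rule are made up to mirror the 'Flint' example above--a real model has tens of thousands of tokens and the probabilities come out of the network, not a lookup table:

```python
import random

# Made-up next-token distribution for the token 'Flint',
# echoing the percentages in the example above.
BASE = {"Michigan": 0.50, "water": 0.20, "Fireforge": 0.10, "<other>": 0.20}

def next_token_dist(context):
    """Return the distribution, nudged by recent context."""
    dist = dict(BASE)
    # Hypothetical rule: fantasy words in the context boost 'Fireforge'.
    if {"dragon", "dwarf"} & set(context):
        dist["Fireforge"] += 0.30
    total = sum(dist.values())           # renormalize so it sums to 1
    return {tok: p / total for tok, p in dist.items()}

def sample_next(context):
    """Draw one token according to the (context-adjusted) probabilities."""
    dist = next_token_dist(context)
    tokens, probs = zip(*dist.items())
    return random.choices(tokens, weights=probs, k=1)[0]

plain = next_token_dist(["the", "city", "of"])
fantasy = next_token_dist(["the", "dwarf", "smith"])
```

Run it and 'Fireforge' comes up noticeably more often after the fantasy context--which is the whole trick, scaled up enormously.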
One of the things the off-brand tools like NovelAI do is let you tweak some of the parameters. There's a parameter called 'temperature' which makes the generation of less likely tokens more likely and vice versa; effectively a randomness dial. At temperature 1 you get the model's raw probabilities; below 1 it favors the likeliest tokens, and above 1 it flattens the distribution toward pure randomness, so less likely tokens come up more often. This will generate the less-likely juxtapositions people find interesting, in the same way as a Surrealist cut-up. (I think the Surrealists, with their disregard for convention and love of the random, would have loved LLMs.) Again, the thrill of the unexpected, which we see as opposed to a purely mechanistic view of the machine, can be produced by a machine.
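The temperature knob is simple enough to sketch: divide the log-probabilities by the temperature and renormalize. This is the standard technique behind the parameter these tools expose (the probabilities here are invented for illustration):

```python
import math

def apply_temperature(probs, temperature):
    """Rescale a probability distribution by temperature.

    T < 1 sharpens it (likely tokens get even likelier),
    T = 1 leaves it unchanged, and
    T > 1 flattens it toward uniform -- the Surrealist setting.
    """
    logits = [math.log(p) for p in probs]        # back to log-space
    scaled = [l / temperature for l in logits]   # the temperature division
    exps = [math.exp(s) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]             # renormalize (softmax)

base = [0.70, 0.20, 0.10]                 # made-up token probabilities
cold = apply_temperature(base, 0.5)       # sharper: top token dominates
hot = apply_temperature(base, 2.0)        # flatter: rare tokens gain ground
```

At temperature 0.5 the top token climbs past 90%, while at 2.0 the rarest token roughly doubles its share--exactly the "less-likely juxtapositions" effect.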
So I think they do raise questions about human creativity and how 'human' it really is.