eller: iron ball (Default)
[personal profile] eller
AAARGH. I just wanted chatgpt's help to structure a text. You know - what should be in the introduction, how long should each part be for easy reading, and so on. Unsurprisingly, I'm shit at this stuff, but usually, the AI is of great help - at least when it comes to nonfiction with clear structural requirements. (Letting the AI write texts is, of course, hopeless, so I won't even try. Letting the AI organize text structures before I just write stream-of-consciousness stuff, however? I mean, that could save me some headaches.) Trying to let it organize fiction, however? Wow. WOW. Today, I learned that chatgpt is really Very Fucking American.

Things I learned:
- The AI will not just try to reorganize the plot around an acceptable novella structure (which, after all, is what I asked it to do) but flag any character behavior for editing that does not conform to American cultural standards.
- The AI told me that my characters are too obsessed with honor and duty and I should consider editing that. I'm like... WAIT... I'm actually writing a Fantasy!Medieval!North!Germany setting. With Fantasy!Medieval!North!German characters with according cultural background and mindset. (Come on. It's fucking Germany. At least some of the characters take their oaths seriously...) Apparently, Germany written by a German is not acceptable by genre standards...
- The AI completely unasked (!) changed a scene description from a male character making tea for the group to a female character making the tea. Thanks for the casual sexism, I guess.
- The AI described a female character as "flirtatious". She's... not. She is, however, speaking to male characters. In, you know, plot-related ways. Apparently, that's yet another thing the AI can't handle. (Not a problem with the technology itself, I know, but definitely with the training dataset. WTF.)
- The AI completely unasked (!) tried to give a genderfluid character an issuefic subplot centered around Gender!Angst!American!Style. I mean, I obviously don't expect an American piece of software to understand historical German ways of gender expression... which is why I didn't ask it to. This character has a perfectly acceptable subplot centered around military technology and espionage, and no gender issues whatsoever, thanks.
- The AI really wants to change the magic system (which is, of course, North German as fuck, considering the setting) to something ripped off Tolkien.
- The AI is shit at interpreting character motivations in ways that are actually pretty hilarious.

Thanks for the non-help. -_-

Date: 2025-04-13 05:34 pm (UTC)
sabotabby: (books!)
From: [personal profile] sabotabby
I don't know enough about music production or gaming to say. In terms of the how-to-write books, the difference I see is that when one fails, it's critiqued as a failure. If it's useful, people buy it, read it, and use it; if it's useless, they mostly don't. This is a strangely capitalistic argument for me, I know, but even the worst how-to can at least generate discussion. Even the worst book wastes no more trees than a useful book, and hasn't profited by intellectual property theft. It's not forced on anyone, nor do billionaires pump money and resources into making it more influential or widely adopted than it would be on its own merits.

In terms of who I am, eh, I'm someone who can now directly trace how my work was stolen without compensation or consent to make someone rich. So I do have a moral objection as well as an aesthetic one. People can have fun making TTRPG characters with picrew.me if they aren't interested in learning to draw.

And I've seen what students do with these tools. Even when it was "only" Grammarly, the willingness to cede authorial control to software made their work superficially more polished but resulted in sloppier writing and thinking.

This is just in art and writing. My partner teaches science, and recently had to contend with students insisting that HPV isn't a virus, because the AI told them that it wasn't. Even when he explained what it was and how it worked, they refused to believe him. This makes them less likely to get a lifesaving vaccine and more likely to die because they can't differentiate between machine hallucination and actual information. It's not just me and my ego and judgment, it's about how we learn—or don't—to think at a critical and structural level.
Edited Date: 2025-04-13 05:35 pm (UTC)

Date: 2025-04-13 05:47 pm (UTC)
yhlee: Alto clef and whole note (middle C). (Default)
From: [personal profile] yhlee
But again, people have been stupid since time immemorial. If the argument is "AI makes it easier in scale for people to cheat or be stupid," that's one argument; if the argument is "AI causes people to cheat or be stupid," that is a different argument. Which assertion is the one you are making?

We have people believing things based on shitty snake oil advertisements - if you look at the history of medical advertising, you have people buying snake oil magnetized treatments for XYZ long before modern computers were around. Chad Orzel, who's a physics professor at Union College, talks about how he was bemused by the sudden hand-writing hand-wringing [edit: fixed Freudian typo lol] from humanities colleagues around essay writing cheating, because it's so much easier to cheat in a typical math/science exam; this is a Very Old (Sometimes Boring) Problem. People having to distinguish shitty information from good information in general is an Old Hard Problem. The prevalence of machine hallucination exposes that problem in deeply troubling ways, but it's not a new problem. I mean, Herodotus ffs.

If we're looking at compensation schemes vs. IP theft, sure we can look at how tons of works (I think something like ~80 of my works turned up in that Atlantic database of stolen written works) are stolen without permission and used for profit; but this then ties into how the entire compensation system for creative narrative work has been in hell mode for a long time. No one ever adequately squared the circle regarding DRM vs. ebook pricing vs. ebook piracy. If we're at generalized compensation for narrative/creative work, capitalism has a ton of specific problems in this space, but also at the point where the Nibelungenlied has a whole fucking shout-out to PLEASE PAY UR LOCAL MINSTREL KTHX, the general problem of compensation predates capitalism by centuries.
Edited Date: 2025-04-13 05:48 pm (UTC)

Date: 2025-04-13 06:04 pm (UTC)
sabotabby: (teacher lady)
From: [personal profile] sabotabby
I guess both?? Definitely scale—as you say, these are old, old problems. And grifting isn't new, of course, so the grift of "believe this software and not a professional" isn't unique to AI.

But I do believe in affordances, and there are certain tendencies that the technology does encourage, in the same way that affordances in, say, algorithmic social media will lend themselves to bad political thinking over good political thinking. And that's where the latter assertion, that AI causes sloppy thinking rather than allowing the people who would be sloppy thinkers in any event to get away with it, is also something that I believe to be true.

I will say there's substantially more wiggle room in the latter argument. I've been researching the moral panic around cellphones and social media (curiously, in education, this is considered a much larger problem than ChatGPT), and I think the kids who are addicted to social media and phones would probably, in earlier ages, have done other things to avoid learning. But having access to social media and phones is also more distracting to me, an adult who didn't own a phone until I was 30. So while I do think there's a moral panic, I also think that designing apps that work like slot machines to exploit loopholes in human cognition probably results in behaviours that wouldn't otherwise happen.

Likewise, I have a rough idea of the curve of students who are good at writing and interested in learning versus the ones for whom it's pure hell and who will look for any reason to avoid it. If it were purely a matter of "AI makes it easier for people to cheat and be stupid," you'd think that the latter group would be the ones doing it the most. But it's actually the middle to high achievers, who normally might struggle through a difficult task, who are giving up and resorting to ChatGPT. That can't be divorced from other material conditions—namely, grade inflation and economic instability—but I am seeing the students who might otherwise learn often doing far worse because of the affordances of the technology. Some of the richer ones would have traditionally bought term papers, but most wouldn't have had that option, so now there's a whole cohort of kids who are weaker thinkers than they might otherwise be.

Date: 2025-04-13 06:11 pm (UTC)
yhlee: Alto clef and whole note (middle C). (Default)
From: [personal profile] yhlee
This is a much more interesting/stronger argument and I agree with large portions of it. That said, I think we're really now in the absolutely fucked (and out of my scope of knowledge) realm of economic incentives around the entire consumer tech industry and/or capitalism and/or economic instability for average citizens and kids. We're deffo long past the point of "we keep introducing tools (technological or cultural or otherwise) without having any idea, in an era of accelerating change, what the long-term cascading consequences are." Ironically at that point I tap out because I don't understand enough about global economics to even frame the question, and as someone with a background in activism, you're much better equipped to analyze that side of the problem. Sadly, my readings have been pretty narrowly focused on (bluntly) "how can I make things go KABOOM! in entertaining ways in a shitty commercial novel?"

One could make the meta-argument that kids are getting smarter at weaseling out of requirements, for good or ill. :wry: A South Asian physicist I know recounted the gatekeeping ~high school final exam that was basically necessary to pass in order to have ANY kind of future in their country. There was a girl who was a very weak student, just needed to check the box for this exam and get on with her life. This physicist (well, before they became a physicist) - the teacher took them aside and said, "I'm putting her behind you because otherwise she's going to fail." The girl copied the physicist's answers and came in 3rd in the class, which no one believed; but she was able to go on and live her life. Physicist's note: "She did have one excellent academic skill. SHE COULD COPY LIKE THE WIND." Obviously I wasn't there and I haven't been to this country for that matter, but in broad strokes I could well believe that what we have is someone end-running around a fucked SYSTEM so she could (with the aid and abetment of people also forced through the system) move on with her life.
