this post was submitted on 27 Jun 2023
The literal judgement is in on using AI to write speeches

[–] Fisk400@beehaw.org 3 points 1 year ago (1 children)

Yeah, the illusion is quickly dispelled once you spend any time with it. I was trying it out when I was doing worldbuilding for a story I was writing. If you ask it to name 20 towns and describe them it spits out a numbered list. Same with characters. But then I asked it to make the names quirkier and it just used the same names again and described all the towns as quirky. I also asked it to make characters with names that are also adjectives and then describe them. Names like Able, Dusty, Sunny, Major.

The first iteration had a list of names and a description, but the description always related to the adjective. Sunny had a sunny disposition and a bright smile. I told it the description should be unrelated to the name and it did the same thing again. I told it to change the name but not the description and it still rewrote the descriptions to match the name, but didn't change the structure.

Nothing I told it could make it move off the idea that a man called Sunny must be sunny. It basically can't be creative or even random when completing tasks.

This is fine when writing dry material that nobody will read, but if you want someone to enjoy reading or listening to what is written, the human spark is required.

[–] Stumblinbear@pawb.social 1 points 1 year ago (1 children)

I just tested the character name thing and it got it on the first try. Maybe GPT-4 just handles it better?

[–] Fisk400@beehaw.org 0 points 1 year ago* (last edited 1 year ago)

It was GPT-4 I was using. It could be that you wrote it as one instruction and your intentions were very clear from the beginning, while I explained it across multiple changes and clarifications when I noticed it wasn't giving me quite what I wanted.

Part of it is that I was intentionally being very human in my instructions, leaving it open to interpretation and then clarifying or adding things as I brainstormed. It's a messy way of doing it, but AI needs to be able to handle messy instructions in order to be considered on par with people.
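The distinction being drawn here — all constraints stated up front versus drip-fed corrections across turns — maps directly onto how chat models receive context. A minimal sketch of the two shapes, assuming the OpenAI-style role/content message format (the model's replies and the exact wording are illustrative, not from this thread):

```python
def consolidated_request():
    """All constraints stated up front, in a single user message."""
    return [
        {"role": "user", "content": (
            "Invent 5 characters whose names are also adjectives "
            "(e.g. Sunny, Dusty, Major). Important: each character's "
            "description must be unrelated to the meaning of their name."
        )},
    ]

def incremental_request():
    """Constraints drip-fed across turns, as described in the comment above.
    Each correction arrives only after the model has already committed to
    name-matching descriptions earlier in the conversation history."""
    return [
        {"role": "user",
         "content": "Invent 5 characters whose names are also adjectives."},
        {"role": "assistant",
         "content": "(model lists names with matching descriptions)"},
        {"role": "user",
         "content": "Make each description unrelated to the name."},
        {"role": "assistant",
         "content": "(model again ties each description to the name)"},
        {"role": "user",
         "content": "Change the names but keep the descriptions."},
    ]
```

In the incremental case the earlier name-matched descriptions stay in the context window, which is one plausible reason the model keeps drifting back to them.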

Edit: turns out it wasn't GPT-4 I was using; I was using the free chat on OpenAI's website. I was not aware that they were different.