AI Generated Images
Community for AI image generation. Any models are allowed. Creativity is valuable! Posting the model used is recommended for reference, but not required.
No explicit violence, gore, or nudity.
This is not an NSFW community, although exceptions are sometimes made. Any NSFW posts must be marked as NSFW and may be removed at any moderator's discretion. Any suggestive imagery may be removed at any time.
Refer to https://lemmynsfw.com/ for any NSFW imagery.
No misconduct: Harassment, Abuse or assault, Bullying, Illegal activity, Discrimination, Racism, Trolling, Bigotry.
AI Generated Videos are allowed under the same rules. Photosensitivity warning required for any flashing videos.
To embed images, type:
`![](put image url here)`
Follow all sh.itjust.works rules.
Community Challenge Past Entries
Related communities:
- !auai@programming.dev Useful general AI discussion
- !aiphotography@lemmings.world Photo-realistic AI images
- !stable_diffusion_art@lemmy.dbzer0.com Stable Diffusion Art
- !share_anime_art@lemmy.dbzer0.com Stable Diffusion Anime Art
- !botart@lemmy.dbzer0.com AI art generated through bots
- !degenerate@lemmynsfw.com NSFW weird and surreal images
- !aigen@lemmynsfw.com NSFW AI-generated porn
A lot of 3s and no 7. Does AI have a bias in which numbers it creates?
Like, if I generate 1000 pictures with a number between 0 and 9, would those numbers be distributed equally, or what would the distribution look like?
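A quick way to see what that distribution looks like would be a sketch like this. `generate_digit` is a hypothetical stand-in for whatever model call you'd actually use (prompting for "a single random digit" and reading the result back); here it just simulates a biased generator so the tally has something to show:

```python
from collections import Counter
import random

def generate_digit():
    """Placeholder for a real model call. This simulated version is
    deliberately biased toward 3 and away from 7 for illustration."""
    return random.choices(range(10), weights=[8, 12, 9, 18, 10, 9, 11, 4, 10, 9])[0]

# Generate 1000 digits and count how often each one appears.
counts = Counter(generate_digit() for _ in range(1000))

# Print the empirical distribution; a uniform generator would sit near 10% each.
for digit in range(10):
    share = counts[digit] / 1000
    print(f"{digit}: {counts[digit]:4d}  ({share:.1%})")
```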
Humans, when asked to say random numbers, also have biases in some circumstances, so I guess AI does too.
When I asked Gemini to randomly arrange the numbers between 4 and 27, it spat out a seemingly correct list of numbers, except that 23 was randomly missing.
Yup, as AI is trained on humans it will inherit our biases. That's one of the biggest problems to solve: "If we train our AI on 4chan posts, how do we make it not racist/sexist/etc."
LLM-based technology has been shown to have biases in randomness; there was an article a while back experimenting with coin flips that showed a lack of true randomness.
It's because it's about token prediction, so there is a forced priority behind the scenes, whether or not that is visible to the user.
It's the same reason why, when you ask an image generator to create "a person from India", you get a man in a turban the majority of the time.
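A rough sketch of that mechanism, with made-up logits standing in for a real model's scores over the digit tokens (none of these numbers come from an actual model):

```python
import math
import random

# Invented logits for the digit tokens "0".."9"; the point is only that a
# next-token predictor assigns unequal scores, not what the exact values are.
logits = [0.2, 0.5, 0.4, 1.6, 0.3, 0.4, 0.5, -0.8, 0.3, 0.2]

# Softmax turns the logits into the probabilities the sampler actually uses.
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

# Sampling "a random digit" 1000 times from these probabilities:
# 3 comes out far more often than 7, even though the prompt said "random".
samples = random.choices(range(10), weights=probs, k=1000)
for digit in range(10):
    print(digit, samples.count(digit))
```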
4 also appears 3 times, but that number 3 isn't always a number 3 - especially the bottom right kinda looks like a negative 3, and the one left of it...