Every time I see one of these ridiculous AI answers I ask whatever AI I’m using, and I always get a normal response :(
I’m not defending AI, I just wanna know how to get the funni broken answers.
Use old AIs, smaller models. They work heavily on randomness, so even if you set one up with exactly what they’re using, you won’t see that stuff all the time.
You also have a higher chance of seeing bizarre stuff if you ask it about subject matter it’s not well trained on. A lot of the hallucinations come from essentially asking it a question to which it has no good answer. It doesn’t necessarily just miss the mark and come up with another likely answer; when it generates its noise, you get whatever unlikely concept comes out of the randomness.
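If you want to reproduce that at home, here’s a minimal sketch assuming the Hugging Face transformers library and gpt2 (picked purely as an example of a small, older model). Cranking the sampling temperature above 1.0 flattens the token distribution, which is the “random” part: unlikely tokens get picked far more often, and the output drifts into nonsense.

```python
# Minimal sketch: small old model + hot sampling = weirder outputs.
# Assumes `pip install transformers torch`; gpt2 is just an example model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

out = generator(
    "Did you know that cows are",
    max_new_tokens=40,
    do_sample=True,   # sample tokens instead of always taking the most likely one
    temperature=1.8,  # >1.0 flattens the distribution, so unlikely tokens win more often
    top_k=0,          # disable the top-k cutoff so even very unlikely tokens stay in play
)
print(out[0]["generated_text"])
```

Drop the temperature back toward 1.0 (or below) and the same model gets noticeably more coherent, which is part of why the big hosted models rarely give you the funni answers on demand.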
What app is this?
duck.ai website
edit: using hermit app
I grew up in farm country and used to be besties with a girl who, EVERY time she saw a cow, would say “Hey Kit, did you know that those are special cows? They’re outstanding in their field” and she would crack up. This happened multiple times a week for years. I still think of it every time I drive through farm country.
Use an outdated LLM. You’ll notice the screenshot is from GPT-3.5.
These LLMs can be very inconsistent.
It should have spaced “outstanding” as “out standing”. Otherwise, not a bad dad/Christmas-cracker joke.