Incredible, But Way Too Woke

#27
by deleted · opened · edited Apr 24

This LLM can't tell stories without going on tangents about how everything is wonderful, how every person is a unique and beautiful snowflake, how it's important to respect everyone's boundaries, how telling jokes at the expense of others is wrong...

You made this LLM so woke that it's incapable of telling a story that bears even a vague resemblance to the real world.

It's so extreme that a couple of times, in the middle of a poem, when a character did something slightly "inappropriate", the LLM broke away from the story to moralize about how that action was wrong, all while still trying to adhere to rhyme and meter.

Still, I'm in disbelief at how well this tiny 3.8B-parameter LLM performs.

deleted changed discussion status to closed

"Woke" is a meaningless word; everyone's definition is different. Use words that actually mean something.

That said, the model IS way too prone to going off on tangents. It's bad enough for human interactions, but it's outright useless for automated pipelines. You tell it to do a specific task and not to write anything else, and it STILL goes off explaining itself, elaborating, and doing everything you explicitly told it not to do. Not because the task is "woke", just because it likes to yammer. You can tell it to quote everything in an article about dogs that pertains to, say, border collies, as bullet points, and to include nothing except those quotes in bullet points, and it'll still go off on tangents about how it decided whether or not something pertained to a border collie, why it decided to omit certain things, how it thinks the task is too vague, and on and on. Just useless unless you can find a way to filter out its yammering from its actual results.
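A rough sketch of the kind of post-filtering I mean, assuming the actual results come back as bullet lines starting with "-", "*", or "•" (that's an assumption about the output format, not something the model guarantees):

```python
import re

# Hypothetical post-filter: keep only the bullet-point lines from a model
# response and discard the surrounding commentary.
def extract_bullets(model_output: str) -> list[str]:
    bullets = []
    for line in model_output.splitlines():
        stripped = line.strip()
        # A line counts as a bullet if it starts with "-", "*", or "•".
        if re.match(r"^[-*\u2022]\s+", stripped):
            bullets.append(stripped)
    return bullets

# Example: the model was asked for quotes only, but padded them with commentary.
response = """Sure! Here are the relevant quotes I found:
- "Border collies are widely regarded as the most trainable herding breed."
- "A bored border collie will invent its own work."
Note that I omitted one sentence because it only mentioned collies in general."""

for quote in extract_bullets(response):
    print(quote)
```

It's a band-aid rather than a fix: anything the model buries in its commentary instead of the bullets is still lost.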

So much potential, but the finetune undercuts it badly :(

deleted

@Nafnlaus Agreed. "Woke" is used by horrible people to attack reasonable people for not sharing their homophobic, racist, misogynistic, or otherwise abhorrent intolerance of others. It's a shame I have to stop using the word because it was misappropriated. This LLM goes off in sporadic directions, similar to what I just did, and doesn't have enough respect for the system and user prompts. Tell it to be concise, even ask it to define "concise" first, then ask a simple question like who played the role of Forrest Gump, and it will ramble on just as long as it did the last time you asked, when you didn't tell it to be concise.
