Has anybody else noticed responses randomly including bits of other languages like Arabic?

Edit: I've been informed this issue is over-posted, my bad.

I first ran into this problem while doing some anthropological research the cheat way. When bits and pieces were in a script I didn't recognize, I figured these were highly specific cultural terms that were hard to translate or even transliterate, and the model had just given up. But when I asked, they turned out to be common words like 'area' or 'cost.' Now I'm seeing it in other places, even across accounts.

When asked, the model is no help and does its usual mea-culpa-but-let's-move-on routine, and I've long since learned better than to trust any explanation you can wring out of it - it's usually just throwing predictive spaghetti at the wall to see what sticks.

It's not a huge problem, given context usually makes the intention clear and I'll probably know the Arabic for 'cost' by heart before long, but it's a bizarre phenomenon and I was wondering what might be going on.

submitted by /u/SharksWithFlareGuns
