According to this article, ChatGPT "reflects American norms and values - even when queried about other countries and cultures... The AI-spun web of cultural bias is a major problem according to the study's researchers." I do think 'reflects' is a better word to use than 'promotes', because, again, ChatGPT knows nothing about American values; it merely replicates word order from a preponderance of texts that actually do represent those values. The same happens in Chinese. Different language, different values. "The answer depends on the language being used to ask." Maybe the researchers should have conducted their study in Danish.