Discussion about this post

Alex Tolley

The "AIs' that we use are primarily trained on a corpus of English language texts. I have read that because the training sets are very similar, the responses of various models is rather similar. I wonder how much of our culture becomes evident in the AI. If we train an AI on e.g., only Chinese texts, would such an AI reflect more of a Chinese cultural response? I would argue that as we are embedded in our culture, so must AIs be embedded in their training corpus. Could being more selective in the training set be used to tailor the responses of an AI to a desired cultural model?

Kevin

I feel like ChatGPT is already more empathetic than I am!

Often a coworker asks me a technical question and I'm just like: I'm busy, I'm tired, it's difficult, I can't help you right now. Try asking ChatGPT. And ChatGPT is always "Oh, I'm eager to help."

