The headline is that "Claude 3, the latest language model from Anthropic, was released earlier this week," and in general people are very impressed with it. This particular article considers whether Claude 3 could be thought of as anything like conscious, and more specifically reports on "experiments which try to elicit whether there is something that it's like to be Claude 3 Opus." The point is that, as a sophisticated language model, Claude 3 could certainly respond as if it has a sense of self, of needs and wants, and even of sensations, even though we don't think it does. "Even if LLMs are not conscious, their ability to act as if they are has all sorts of implications."