The important part is self-awareness: how can a neural network develop one? Can we say that a machine that only imitates self-consciousness is sentient?
Sentience (classically) means to experience sensations, i.e. feelings that have to do with sensory input. There's zero reason to assume a transformer trained on text can have that. There's also zero reason to assume that it has a meaningful concept of itself, which can easily be seen when it mimics talking about oneself.
this is more of a rhetorical question, not a jab at you