Anything that is not measurable (e.g. awareness, consciousness) is not very useful in practice as a metric. I don't think there is even an agreed definition of what consciousness is, partly because it is not really observable outside of our own minds.
Therefore I think it makes perfect sense that awareness is not discussed in the paper.
Consciousness is observable in others! Our communication and empathy and indeed language depend on the awareness that others share our perceived reality but not our mind. As gp says, this is hard to describe or quantify, but that doesn't mean it's not a necessary trait for general intelligence.
But LLMs have been measured to have some theory of mind abilities at least as strong as humans': https://www.nature.com/articles/s41562-024-01882-z . At this point you need to accept either that LLMs are already conscious, or that faking consciousness is easy enough that it's practically impossible to test for - i.e. philosophical zombies are possible. It doesn't seem to me that LLMs are conscious, so consciousness isn't really observable to others.
That's still just using language, though. My dog has theory of mind in the real world, where things actually exist.
Also, those results don't look as strong to me as you suggest. I do not accept that an LLM is conscious, nor could I ever unless I could have a theory of mind for it - which is impossible, given that it's a stochastic parrot without awareness of the things my five senses and my soul feel in reality.