It's less about the individual words and more about the structure of the text. Normal people say "delve" sometimes, but ChatGPT tends to use a particular structure, and you know it when you see it: the tone is overly excited, it uses more words than it needs, it leans on flowery vocabulary (one ornate word is fine, but ChatGPT piles them on), and it ends everything with "It's important to note..."
Are its outputs in the "ChatGPT style" because of fine-tuning? The style it uses seems relatively uncommon in the data it was pre-trained on, but it could be overrepresented in the fine-tuning data.