Yes, local for anything that can run locally. For higher-end model needs there are privacy platforms like Venice (https://venice.ai/privacy) with ZDR legal contracts and multiple E2EE options for their open-weight models. The OpenAI/Anthropic/Google models are also available through them, but at least your identity is anonymized; the contents of your prompt could still be stored by the destination company.
I would not trust any anonymization service that still connects to Gemini/OpenAI/Claude; those companies simply have too much reach for anyone to be confident they can't [re-]connect a session to you by means other than the login and IP address.
That's why they added a verifiable E2EE mode that encrypts on your device, all the way to the GPU's TEE. You can run the traffic through a proxy and inspect the shape of the request if you like. The platform also supports no-KYC signups, so if you care you can obscure your identity as well.
When using their platform via web/app there's a temporary chat option that avoids local browser storage entirely. You can also wipe the local browser storage whenever you want.
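Wiping that state can also be scripted rather than done by hand. A minimal sketch, assuming the app keeps chat history in localStorage/sessionStorage; in a real browser you'd call this with `window`, and apps that also use IndexedDB, cookies, or service-worker caches would need those APIs (`indexedDB.deleteDatabase`, `caches.delete`, ...) cleared separately:

```javascript
// Hypothetical helper for wiping locally stored chat state. The storage
// objects are injected so the logic can be exercised outside a browser;
// in a browser console you would pass window itself.
function wipeLocalState({ localStorage, sessionStorage }) {
  localStorage.clear();
  sessionStorage.clear();
}

// Minimal in-memory stand-in for the browser Storage interface,
// just enough to demonstrate the helper.
function fakeStorage(entries = {}) {
  let store = { ...entries };
  return {
    clear: () => { store = {}; },
    get length() { return Object.keys(store).length; },
  };
}

const local = fakeStorage({ "chat-history": "[...]" });
const session = fakeStorage({ token: "abc" });
wipeLocalState({ localStorage: local, sessionStorage: session });
console.log(local.length, session.length); // 0 0
```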
Just make sure you do it as a matter of routine policy, rather than in response to a legal issue, lest you get hit with a destruction of evidence charge.