> OpenAI already offers private ChatGPT instances hosted on Azure.
> They don't use it for code/confidential data though.
Yes, private isn't enough. They need to offer a self-hosted option for these types of clients. I imagine most orgs that need self-hosting already have a datacenter to run it in.
I know of a bank that is paranoid enough to run a self-hosted, on-premise GitHub instance, and even they went with the private (off-premise) ChatGPT instance. They still don't use it for code or confidential data, though.