I would not want to use LLMs for something like this. SQL queries or some other kind of computer code would be better: you would have to read the documentation, but a query can be specified more precisely and more accurately. Better still would be a local program that manages these queries, converts them to each remote service's format (a service could publish a file specifying its schema and the estimated cost of different fields), and works with multiple services, including local files. That way you avoid depending on OpenAI, using as much power as OpenAI does, violating privacy more than necessary, and so on.
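As a rough sketch of what I mean (every name here is invented for illustration, not any real service's API): the local tool reads a service-provided schema file listing the queryable fields and their estimated costs, validates the user's query against it, and only then builds the request, so the user can see the cost before anything is sent.

```python
# Hypothetical schema a service might publish: which fields can be
# queried, and an estimated per-field cost. All names are made up.
SCHEMA = {
    "service": "https://example.com/api/search",  # placeholder URL
    "fields": {
        "title":  {"cost": 1},
        "author": {"cost": 1},
        "body":   {"cost": 5},   # full-text search is pricier
    },
}

def translate(select_fields, where):
    """Convert (fields, filters) into the service's request payload,
    rejecting fields the schema doesn't list, and return the
    estimated cost so the tool can warn the user before sending."""
    unknown = [f for f in select_fields if f not in SCHEMA["fields"]]
    if unknown:
        raise ValueError(f"schema has no field(s): {unknown}")
    cost = sum(SCHEMA["fields"][f]["cost"] for f in select_fields)
    payload = {"select": list(select_fields), "filter": dict(where)}
    return payload, cost

payload, cost = translate(["title", "author"], {"author": "lee"})
```

The same front end could target a different back end (another service's format, or plain local files) just by swapping the schema and the payload builder, which is the point: one precise query language locally, many targets remotely.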
However, it might be useful for people who do want to use that instead.