
Until they actually make any of it available in anything but an obscure expensive API you have to request access to, they might as well not even exist.


The landing page says "Easy integration via standard APIs. Claude can be incorporated into any product or toolchain you’re building with minimal effort." Then there is a big "Request Access" button, which for me right now does nothing. OpenAI has actually gone through the pain of making its product available via an API to the general public at scale, but Anthropic, Google, etc. don't quite seem to be there yet. It's frustrating.


I don't think the person you're responding to wants a network based or cloud based solution.

When someone says they want it available they mean running on their own device.

This is Hacker News; nearly everyone on this site should have their own self-hosted LLM running on a computer, server, or device they keep at their house.

Relying on 'the cloud' for everything makes us worse developers in just about every imaginable way: it creates a ton of completely unnecessary and complicated source code, and it generates far too many network calls that don't need to happen. Local hard drives, for example, are thousands of times faster than cloud storage, and we should take advantage of that in the software we write. So instead of making billions of calls to download a terabyte database query-by-query (I've seen this 'industry standard' far too many times), maybe make one call and build it locally.

LLMs, and ML in general, have effectively the same problem, and the same incredible stupidity is being repeated. Download the model once, run your queries locally. That's the solution we should be using.
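The "one bulk call, then query locally" pattern the comment argues for can be sketched with the Python standard library alone. Everything here is illustrative, not from the thread: the table schema, the `fetch_remote` stand-in for a remote bulk export, and the cache path are all assumptions.

```python
import os
import sqlite3

DB_PATH = "local_cache.db"  # hypothetical local copy of the remote dataset


def ensure_local_db(fetch_remote, path=DB_PATH):
    """Download the full dataset ONCE; afterwards every query is a disk read."""
    if not os.path.exists(path):
        rows = fetch_remote()  # one bulk call instead of billions of small ones
        conn = sqlite3.connect(path)
        conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)")
        conn.executemany("INSERT INTO items VALUES (?, ?)", rows)
        conn.commit()
        conn.close()
    return sqlite3.connect(path)


def fetch_remote():
    # Hypothetical stand-in for a remote bulk export (would be one HTTP call)
    return [(i, f"value-{i}") for i in range(1000)]


conn = ensure_local_db(fetch_remote)
# Subsequent queries never touch the network
count = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
print(count)
```

The same shape applies to local LLM inference: fetch the weights once, then serve every prompt from disk and RAM instead of a per-query API call.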


When I want code that has a reasonable chance of working, to bounce ideas off someone decently intelligent, or just to talk philosophy, I’m not going to get great results out of the kind of model I can feasibly run at home. Even 30B parameters isn’t enough, and that’s 75% of what I want out of an LLM.


Try another browser, or a clean profile with ad blocking turned off. It took me a couple of tries to get it working, but when it does you should see a modal with a form.

FYI, the waitlist form submits a regular POST request, so it reloads the main page instead of closing the modal dialog. I had to open the network monitor with logs preserved to double-check that I'd actually made it onto the list :facepalm:


Google's models are now available via API on Google Cloud.


I've been using it through Poe, and I prefer it to ChatGPT but can't pinpoint why. It just "gets" me better, I guess?


There are many services that integrate with them and would let you sign up self-serve.



