
Sometimes you know your scale. Say you have a website for your local area D&D club. You know that nobody outside a 20 mile radius is going to ever really bother looking at your stuff. And you know that there are about 200 people interested in D&D in this area. You need a contact form on your site to email you when someone has a question. How many connections at most would you realistically expect to have to process in this case per second/year/decade?


More than you’d assume, once you factor in search engine crawlers, miscellaneous bots, and the automated tools that bad actors run to probe websites looking for vulnerabilities. However, I’d still expect CGI to stand up against that.

Performance arguments aside and given the type of site you describe, wouldn’t it be more convenient for the developer to use Wordpress or one of those website builder as a service things instead of inventing something from scratch in CGI?

I get the argument that some personal sites wouldn’t get much traffic but the argument that CGI is easier to build than any non-CGI alternative simply isn’t true any more and hasn’t been the case for more than a decade.

I know I’m coming across as passionately against CGI and I assure you that isn’t the case (I recently chose to use CGI as a private API endpoint for some Alexa skills I’d written for myself). But for a public site there isn’t really a strong argument in favour of CGI anymore given the wealth of options we have available.


Point taken on the bots but the static part shouldn’t go through the CGI anyways.

The irony of decrying poor performance yet suggesting WordPress is pretty priceless :)

I agree that most people would be better served with Squarespace/Wix but I run into the situation of static site + 1-2 bits of dynamic form processing frequently enough to warrant having a simple solution for it. When the site doesn’t warrant spending money on, sometimes a shared host + CGI is just right. But I think this is for some very rare cases. Most people should outsource these types of headaches.


> Point taken on the bots but the static part shouldn’t go through the CGI anyways.

Even the non-static stuff will be hit by bad bots (and by "bad bots" I mean any crawler - malicious or otherwise - that doesn't obey robots.txt).

> The irony of decrying poor performance yet suggesting WordPress is pretty priceless :)

It's not ironic at all. I've done extensive benchmarking on this in a previous job and found that even the out-of-the-box Wordpress experience would easily outperform the equivalent written in CGI - and that's without taking into account all the caching plugins available for Wordpress (sure, you could optimize your CGI as well, but you'd have to write all of that yourself, whereas with Wordpress it's a 2 minute install).

Even DB and page caching aside, you still have the problem of CGI forking a new process for each web request, whereas with PHP you have mod_php (Apache) or php-fpm, which keep the interpreter resident and do opcode caching. That alone can make a dramatic difference once you start piling on the requests.

> When the site doesn’t warrant spending money on, sometimes a shared host + CGI is just right. But I think this is for some very rare cases. Most people should outsource these types of headaches.

Do shared hosts even allow CGI? I'd have thought that was a bit of a security faux pas for a shared host. In any case, almost all of them support PHP (and those that don't are special purpose ones for node or other frameworks) so there's no forced requirement to use CGI even on shared hosting. In fact I'd go further and argue that PHP is even easier to write than CGI so you're better off using that regardless of whether CGI is available.

Disclaimer: I pretty much hate PHP as a language and I'm not too fond of Wordpress either. But I'm being pragmatic in this discussion and leaving my personal biases out of it. For what it's worth, I wrote my blog in Go (it was originally written in CGI/Perl, then ported to Apache+mod_perl before being ported to Go about 9 years ago, and it's been running there ever since).


Some kinds of API endpoints and webhooks are exactly what I love CGI for—when what I want to do is exactly “bind an HTTP endpoint to a shell command.”
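For the curious, a minimal version of that looks something like the sketch below (the script name and actions are hypothetical examples; where CGI scripts live and how they're invoked depends on your web server's config):

```shell
#!/bin/sh
# Minimal sketch of "bind an HTTP endpoint to a shell command" via CGI.
# The web server runs this script once per request, passing request
# details in environment variables (QUERY_STRING here); whatever the
# script writes to stdout becomes the HTTP response.

respond() {
    # CGI requires a header block terminated by a blank line.
    printf 'Content-Type: text/plain\r\n\r\n'
    case "${QUERY_STRING:-}" in
        action=date) date -u ;;              # expose a shell command
        *)           echo "unknown action" ;;
    esac
}

respond
```

Drop it in your server's cgi-bin (executable bit set) and `GET /cgi-bin/whatever.cgi?action=date` runs the command.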


I agree but I'd never do that for a production system because it's far too risky from a security standpoint. For personal projects, sure. However even there I'd IP whitelist (wherever possible), sit the CGI behind an authentication screen and put log monitoring on there (eg fail2ban) to auto-blacklist any IPs identified as potentially abusing the endpoint.
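For the IP whitelisting part, assuming Apache 2.4, something like this in the vhost config or .htaccess does it (script name and address range are placeholder examples):

```apache
# Only allow requests to hook.cgi from one trusted range.
# 203.0.113.0/24 is a documentation-only range - substitute your own.
<Files "hook.cgi">
    Require ip 203.0.113.0/24
</Files>
```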

If this isn't all stuff you already have set up on your dev environment then you might find hardening CGI becomes as much work as re-writing those tools in a more secure framework.



