Let’s be clear about one thing: we are not comparing CGI to C++ or Java. PHP, Ruby, JavaScript, Python, etc. are all good languages for rapidly developing backend code. So dev time between CGI and non-CGI is the same, or even quicker with the non-CGI versions, because modern web languages and frameworks are much better than they were in the 90s when CGI was popular. Plus the sysadmin / DevOps time spent deploying and hardening CGI would be greater than with a more modern framework.
Also, your cost estimates are hugely optimistic. I’ve done benchmarks with CGI and non-CGI code at previous companies and found it wasn’t just a couple more servers; it was often 10x more. Even at $0.26 an hour, that quickly adds up over the course of a month and a year. Plus the slower throughput has a knock-on effect on your database as well. Because you’re running fewer connections on each web server, you end up with smaller connection pools per node but more DB connections overall across the farm, with those connections held open longer per web request. That means you then need to beef up your RDBMS instance, and that gets very costly very quickly (even on the cloud)! And we’ve not even touched on options like serverless, which aren’t even available for CGI and would further bring down the cost of hosting a non-CGI site.
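To put rough numbers on that, here’s a back-of-envelope sketch. Only the $0.26/hour rate and the ~10x server count come from my figures above; the baseline fleet size of 4 servers and the 730 hours/month are illustrative assumptions, and the sketch deliberately ignores the DB, bandwidth, and ops costs that make the real gap even wider:

```python
RATE_PER_HOUR = 0.26      # instance price quoted above
BASELINE_SERVERS = 4      # hypothetical non-CGI fleet size (assumption)
CGI_MULTIPLIER = 10       # ~10x server count observed for CGI
HOURS_PER_MONTH = 730     # average hours in a month (assumption)

def monthly_cost(servers: int) -> float:
    """Raw compute cost for a fleet; excludes DB, bandwidth, ops time."""
    return servers * RATE_PER_HOUR * HOURS_PER_MONTH

non_cgi = monthly_cost(BASELINE_SERVERS)
cgi = monthly_cost(BASELINE_SERVERS * CGI_MULTIPLIER)
print(f"non-CGI: ${non_cgi:,.2f}/mo, CGI: ${cgi:,.2f}/mo, "
      f"delta: ${cgi - non_cgi:,.2f}/mo (${(cgi - non_cgi) * 12:,.2f}/yr)")
```

Even with these toy numbers the CGI fleet costs thousands more per month on compute alone, before you add the bigger RDBMS instance it forces on you.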
Let’s also not forget one of the key metrics when building a commercial web platform: performance from a UX perspective. Amazon, Google, etc. have all done studies on page load times and user behaviour. Their somewhat predictable result was that sites with slower response times see more users leave in favour of a competitor’s site than sites with faster response times do. So if your website is your business, running CGI could cost you in lost revenue as well as in running costs.
None of what I say above is theoretical - this is actual data from my experience migrating CGI platforms to non-CGI alternatives. I’ve done the benchmarking, the cost analysis, and so on. The arguments in favour of CGI are simply untrue in the modern era of web development.