Not if you have a keyboard with easily reflashable, programmable firmware (such as the Moonlander). If the firmware has built-in support for layers it becomes trivial: you can just have a QWERTY layer activate while you hold down Ctrl, etc. (the Moonlander supports this too).
I almost exclusively use Colemak-DH and have zero issues typing in QWERTY when I have to. I chalk it up to my phone keyboard still being QWERTY, so I get enough exposure from that not to forget the key locations (it's certainly not as well retained as it used to be; I no longer remember the position of every QWERTY key from memory). I do use a columnar split keyboard (the Moonlander), though, so it's possible the radical difference in how I use the keyboard helped keep the muscle memory separate.
FWIW I have the opposite experience: I never buy digital PlayStation games unless I'm forced to, because they all sell for MSRP in New Zealand ($129/$139 NZD), whereas I can always get them physically for $99/$109 brand new. I would much prefer to buy digital games, but I'm not paying an extra $30 for the privilege of saving someone _else_ money.
Yes, for renting an instance from AWS; the minimum lease time is 24 hours, IIRC. But that's not the case for GitHub Actions (and so, I assume, Azure Pipelines): you pay per minute, it's just 10x the cost of an Ubuntu instance.
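To put the per-minute billing in perspective, here's a quick cost sketch. The rates are placeholder assumptions based on the "10x an Ubuntu instance" figure above (I'm assuming the 10x runner is a macOS one, as the AWS comparison suggests); check the current pricing page for real numbers:

```python
# Rough CI cost comparison under per-minute billing.
# Rates below are illustrative assumptions, not quoted prices.
ubuntu_rate = 0.008            # assumed $/min for a Linux runner
macos_rate = ubuntu_rate * 10  # the "10x" multiplier from the comment

job_minutes = 30
print(f"Ubuntu job: ${ubuntu_rate * job_minutes:.2f}")  # $0.24
print(f"macOS job:  ${macos_rate * job_minutes:.2f}")   # $2.40
```

Even at 10x, a half-hour job costs a couple of dollars, versus committing to a full 24-hour lease on a rented instance.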
From skimming the paper[1] they pulled that number from, it seems... not the best metric. It looks like they estimate the power consumption of the entire internet, divide that by the amount of data transferred, and claim the result is the energy cost per gigabyte.
But that includes a _lot_ of energy cost that isn't related to _transferring_ the data: storing it, generating it, running the services that send it, etc.
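To make the objection concrete, here's a back-of-the-envelope sketch of what that methodology does. All numbers here are made-up placeholders for illustration, not figures from the paper:

```python
# Toy sketch of the "total energy / total traffic" methodology.
# Both inputs are invented placeholder values.
total_internet_energy_kwh = 500e9    # assume: all network + datacenter energy, per year
total_data_transferred_gb = 2_500e9  # assume: total traffic, per year

# Dividing one by the other attributes *all* of that energy to transfer:
average_kwh_per_gb = total_internet_energy_kwh / total_data_transferred_gb
print(f"{average_kwh_per_gb:.2f} kWh/GB")  # 0.20 kWh/GB with these numbers
```

The numerator bundles in storage, computation, and always-on infrastructure, so the resulting average is far higher than the marginal energy of actually moving one more gigabyte.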
> But that includes a _lot_ of energy cost that isn't related to _transferring_ the data: storing it, generating it, running the services that send it, etc.
Well, doesn't that make it an even more accurate measurement of how much our data really impacts the environment? Data doesn't just exist at the moment it's being transferred...
Transit providers accept traffic for delivery anywhere in the world at a tiny fraction of that figure, suggesting the fully allocated cost per GB is not nearly as high as they represent. And I trust that transit service is not a venture-capital-funded bubble subject to collapse any day now.
The same analysis goes for storage and processing. The economic marginal cost of storing and generating data as such is quite low, probably lower than the transit cost, especially since most data stored is either soon discarded or transmitted a large number of times.
Both are among the most successful economies of scale ever achieved, pretty much the opposite of what the article alleges. Measured as a percentage of GDP, or as transit/storage cost per hour of work or entertainment, they are certainly worth more than one airline industry in economic terms: advances we could hardly live without, unless the plan is to roll living standards back to the 1970s or earlier.
> Compilation is an embarrassingly parallel task: if you've got N cores, you can use N cores and get a linear speedup for every expansion in N you are offered.
Somebody should let the JavaScript world know that. Both Webpack and TypeScript are single-threaded by default, and it's a pain in the ass to get them to use more threads. So my 12-core CPU mostly sits around twiddling its thumbs while I compile code for work.
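The "N cores, N-fold speedup" claim can be sketched with a toy driver that farms independent compilation units out to a worker pool. `compile_file` here is a hypothetical stand-in, not a real compiler invocation:

```python
# Toy model of embarrassingly parallel compilation: each source file
# is compiled independently, so N workers scale roughly N-fold.
from concurrent.futures import ProcessPoolExecutor

def compile_file(path: str) -> str:
    """Stand-in for invoking a real compiler on one independent unit."""
    return f"{path} -> {path}.o"

if __name__ == "__main__":
    sources = [f"src/module_{i}.ts" for i in range(12)]
    # No unit depends on another, so the pool can keep all 12 cores busy.
    with ProcessPoolExecutor(max_workers=12) as pool:
        objects = list(pool.map(compile_file, sources))
    print(len(objects))  # 12
```

This only works because the units are independent; build steps with cross-file dependencies (like whole-program type checking) are exactly where tools end up serializing.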