This is 100% the case. I wouldn't be surprised if universities actually have a policy against customizing profile pages. Otherwise you're opening things up to the good old Geocities age.
Not really. If the seller does not have the rights to sell something, then someone who purchases from them is in effect receiving stolen goods. They may not be intentionally violating those rights, but they are still participating in an action that does just that.
Since an e-book is copied at the moment of purchase, the purchaser is the direct cause of the copy being made. So if they knew it was unlicensed, and they consider copyright violation to be wrong, I would think they were doing wrong to cause it.
Are they civilly or criminally liable? Probably not, but IANAL. Going after the bigger pockets would mean suing the store and the publisher, not the clients.
yeah, but intent matters, and someone purchasing an ebook is probably making a good faith attempt to obtain the media in a way that compensates the author - hence my elision in the passage i quoted, which seems to mean well but is badly phrased. a downloader pirating the book is directly violating the author's rights, whereas the purchaser is really also a victim along with the author - they are being deprived of money for something the seller has no legal right to provide. ianal and all that applies, of course!
I've been using Vimwiki for years. I've tried other note taking/todo listing/whatever solutions but I always end up back at Vimwiki. The only thing better than it that I've found is Emacs' OrgMode.
Exactly. Think of it as the classic compiled vs. interpreted language debate. One is clearly faster than the other, but the slower one may have different advantages.
Ahead-of-time vs Just-in-time is a more accurate description.
In this case, there's no evidence that precompiling is faster in theory -- let alone in practice. In the JS framework benchmark suite, SolidJS and InfernoJS performance is almost identical (with SolidJS having a much larger margin for error in most tests).
This is the BEST possible case for precompiling too. In the real world, JITs take a long time to warm up (a couple hundred executions before all the optimizations kick in). With the vDom, you warm up ONE set of code that then runs forever. With the precompile, it has to warm up for EVERY new component and potentially slightly different codepaths within the same component.
The JS framework benchmark reuses the same components for everything which is a huge advantage to precompiled frameworks while not having much impact on vDom ones (as the actual components in both cases usually won't optimize very much due to being polymorphic).
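The polymorphism point above can be sketched in plain JavaScript. This is my own illustration, not code from any framework: a call site that always sees objects of one shape stays monomorphic and optimizes well, while one that sees varying shapes (as generated per-component code tends to produce) goes polymorphic. Both loops here compute the same answers; only the variety of object shapes the shared function sees differs.

```javascript
// One shared code path, as in a vDOM diff loop: readTag is called with
// every node, so the variety of shapes it sees determines how well a
// JIT can optimize it.
function readTag(node) { return node.tag; }

// Uniform shape: every object has the same properties in the same order,
// so the call site stays monomorphic.
const uniformNodes = Array.from({ length: 1000 }, (_, i) => ({ tag: 'div', id: i }));

// Mixed shapes: property order and extra fields vary, so the same call
// site sees multiple hidden classes and goes polymorphic.
const mixedNodes = Array.from({ length: 1000 }, (_, i) =>
  i % 2 === 0 ? { tag: 'div', id: i } : { id: i, tag: 'span', extra: true });

const a = uniformNodes.map(readTag).filter(t => t === 'div').length;
const b = mixedNodes.map(readTag).filter(t => t === 'div').length;
console.log(a, b); // 1000 500
```

The results are identical either way; the difference only shows up in how the engine specializes the call site, which is why microbenchmarks that reuse the same component shapes flatter precompiled frameworks.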
> In this case, there's no evidence that precompiling is faster in theory -- let alone in practice.
It’s absolutely not, because it requires far greater computational effort. The benefit has nothing to do with performance but instead with simplified state management.
I know people desire certain frameworks due to how they perform state management. I have never really understood that motivation myself though because managing state is incredibly simple. Here is a basic outline of how simple it is:
1) realize there are exactly two facets to every component: data, interface.
2) all components should be stored in common locations: a single object stores component data, and a common DOM node stores component interfaces.
3) pick a component facet to update, either data or interface, and never update the other. The other should be automatically updated by your application logic (reflection).
4) store your data on each change. That could be dumping it into local storage. In my current app I send the data to a local node instance to write into a file so that state can be shared across browsers in real time.
5) be able to restore state. On a fresh page, or even page refresh, simply grab the stored data and rebuild the component interfaces from it.
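The steps above can be sketched in a few lines. All the names here are my own, not the parent commenter's, and a `Map` stands in for `localStorage` so the sketch is self-contained: data lives in one object, the "interface" is always derived from data (never edited directly), every change is persisted, and a fresh page rebuilds the interfaces from the stored data.

```javascript
// Stand-in for localStorage so the sketch runs anywhere.
const storage = new Map();

// 2) one common location for all component data
const state = { components: {} };

// the "reflection" step: the interface facet is always derived from data
function render(id) {
  const data = state.components[id];
  return `<div id="${id}">${data.label}: ${data.value}</div>`;
}

// 3) update the data facet only; 4) persist on every change
function update(id, patch) {
  state.components[id] = { ...state.components[id], ...patch };
  storage.set('app-state', JSON.stringify(state));
  return render(id); // the interface follows the data automatically
}

// 5) restore: rebuild every interface from the stored data
function restore() {
  const saved = storage.get('app-state');
  if (saved) Object.assign(state, JSON.parse(saved));
  return Object.keys(state.components).map(render);
}

update('counter', { label: 'Clicks', value: 0 });
update('counter', { value: 1 });
state.components = {};  // simulate a page refresh
console.log(restore()); // [ '<div id="counter">Clicks: 1</div>' ]
```

In a real app `render` would write into the common DOM node rather than return a string, but the shape is the same: one source of truth, one derivation step, one persistence hook.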
My current application is a peer-to-peer, Windows-like GUI that works in the browser and exposes the file system (local device and remote devices) in Windows Explorer-like interfaces. Managing state is the least challenging part of this. The slowest executing parts are long polling operations against large file system trees (it’s about as slow in the native OS interface).
> In the JS framework benchmark suite, SolidJS and InfernoJS performance is almost identical (with SolidJS having a much larger margin for error in most tests).
I mean I agree with most of your post, but I'm not sure I would necessarily make that highlighted claim from the benchmark results. The +- seems to be pretty run dependent for most libraries on there. And while I agree that the difference in performance is negligible, there is one: Solid is clearly faster in most tests, even if by a small amount. Anyone interested can look at https://krausest.github.io/js-framework-benchmark/current.ht... and then isolate Solid and Inferno and do a comparison against one library. It will color-highlight the degree of certainty of the difference between the libraries in terms of significance of the results.
This is one of the few books I've read that I would actually call 'mind bending' (though stretching might be a better word).
Another amazing book of his that I don't see mentioned a lot (perhaps because it is more technical than maybe any of his other books) is 'Fluid Concepts And Creative Analogies'.
I'm reading that one now. I recommend it, along with Perception as Analogy by Melanie Mitchell and the aforementioned Gödel, Escher, Bach, to anyone interested in AI / cognitive science. I've only read a small part of Metamagical Themas to date, but I've read enough to recommend that as well. Just the stuff on self-referential sentences makes it worth reading.