Hacker News

"information" is not a good term for the opposite. Entropy is a well defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.


> Entropy is a well defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.

More entropy means more missing information.

https://iopscience.iop.org/book/mono/978-0-7503-3931-5/chapt...


As I understand it, in information theory "more entropy" means "more information is necessary to fully describe this thing", but I may be wrong.


“more [additional] information is necessary to fully describe this thing” = “more missing information [in the incomplete description of the thing]”


I agree. "Order" (or predictability) is a better term for the opposite. More entropy = more total possible states the system can be in.
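To make the "more possible states = more entropy" point concrete, here is a minimal sketch of Shannon entropy in Python; the die probabilities are made-up illustrative numbers:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair 8-sided die: 8 equally likely states -> maximal entropy (3 bits).
uniform = [1 / 8] * 8

# A loaded die that almost always lands on one face: far more predictable,
# so much less information is needed (on average) to describe an outcome.
loaded = [0.93] + [0.01] * 7

print(shannon_entropy(uniform))  # 3.0 bits
print(shannon_entropy(loaded))   # roughly 0.56 bits
```

The uniform distribution, where every state is equally likely, maximizes entropy; concentrating probability on one state drives it toward zero.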

I tend to use "information" to refer to a statistically-significant signal or data that the application/business can practically utilize. This is definitely not the same as the strict information theoretical definition.



