"information" is not a good term for the opposite. Entropy is a well defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.
> Entropy is a well-defined concept in information theory (and can be connected to the physical concept). More entropy means more information, not less.
I agree. Order is a better term to describe it. Predictability. More entropy = more total possible states the system can be in.
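To make the "more possible states" point concrete, here is a minimal sketch of Shannon entropy in plain Python (the `shannon_entropy` helper is my own illustration, not from the thread). A uniform distribution over more states has higher entropy, while a more predictable (biased) distribution has lower entropy:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]      # 2 equally likely states -> 1 bit
fair_die  = [1 / 6] * 6     # 6 equally likely states -> log2(6) ~ 2.585 bits
biased    = [0.9, 0.1]      # predictable coin -> ~0.469 bits, less than fair

print(shannon_entropy(fair_coin))
print(shannon_entropy(fair_die))
print(shannon_entropy(biased))
```

The fair die beats the fair coin because it has more possible states; the biased coin loses to the fair one because its outcome is more predictable.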
I tend to use "information" to refer to a statistically-significant signal or data that the application/business can practically utilize. This is definitely not the same as the strict information theoretical definition.