Re: Lenses and sharpening

Floyd L. Davidson
Subject: Re: Lenses and sharpening
From: Floyd L. Davidson
Date: 2014-09-21 13:40 (2014-09-21 03:40)
Message-ID: <87fvflp5fs.fld@barrow.com>
Newsgroups: rec.photo.digital
Follows: Sandman
Followups: Sandman (12m) > Floyd L. Davidson

Sandman <mr@sandman.net> wrote:

Sandman
In article <t03s1atqlub50ec92bu1u73pra4ettn4th@4ax.com>, Eric Stevens wrote:

Eric Stevens
"Maximum entropy method in image processing".

Sandman
Has nothing to do with thermodynamics, Eric. You know, the ignorant claim you made that I was laughing at?

See: http://en.wikipedia.org/wiki/Principle_of_maximum_entropy

"This is the way the maximum entropy principle is most often used in statistical thermodynamics."

Apparently it does have something to do with thermodynamics. It turns out that "something" is quite a bit, too:

"In particular, Jaynes offered a new and very general rationale why the Gibbsian method of statistical mechanics works. He argued that the entropy of statistical mechanics and the information entropy of information theory are principally the same thing. Consequently, statistical mechanics should be seen just as a particular application of a general tool of logical inference and information theory."

Hmmm... "a particular application of a general tool of...": that general tool is exactly the same one that gets applied to image editing!

See: http://en.wikipedia.org/wiki/Maximum_entropy_thermodynamics

" ... maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy."
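As a sketch of what "MaxEnt applies inference techniques" means in practice (my own illustration, not something from the post): given only a constraint on the mean of a distribution over discrete states, the maximum-entropy distribution has the Boltzmann form p_i proportional to exp(lam * i), which is exactly the Gibbs distribution of statistical mechanics. The numeric setup below (six states, a target mean, bisection on lam) is entirely my own choice of example.

```python
import math

# Assumed setup for illustration: six states labeled 1..6, and the only
# knowledge we have is the mean of the distribution.  The maximum-entropy
# distribution consistent with that constraint is p_i ∝ exp(lam * i),
# the same functional form as the Boltzmann/Gibbs distribution.

STATES = range(1, 7)

def boltzmann(lam):
    """Normalized distribution p_i ∝ exp(lam * i) over STATES."""
    w = [math.exp(lam * s) for s in STATES]
    z = sum(w)  # the partition function of statistical mechanics
    return [x / z for x in w]

def mean(p):
    return sum(s * pi for s, pi in zip(STATES, p))

def entropy(p):
    """Shannon entropy H = -sum p_i ln p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def maxent_for_mean(target, lo=-10.0, hi=10.0):
    # mean(boltzmann(lam)) increases monotonically with lam,
    # so plain bisection finds the lam matching the target mean.
    for _ in range(100):
        mid = (lo + hi) / 2
        if mean(boltzmann(mid)) < target:
            lo = mid
        else:
            hi = mid
    return boltzmann((lo + hi) / 2)

p = maxent_for_mean(4.5)
print(mean(p), entropy(p))
```

Any other distribution with the same mean, e.g. all mass split between states 4 and 5, has strictly lower entropy; and when the target mean is 3.5 the method recovers the uniform distribution (entropy ln 6), exactly as the principle promises.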

How close a tie can there be? Well, it can be stated even more explicitly:

See: http://en.wikipedia.org/wiki/History_of_entropy

"Shannon's information entropy is a much more general concept than statistical thermodynamic entropy. Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution. In a series of papers by E. T. Jaynes starting in 1957, the statistical thermodynamic entropy can be seen as just a particular application of Shannon's information entropy to the probabilities of particular microstates of a system occurring in order to produce a particular macrostate."

The concept of entropy in thermodynamics was discovered before entropy in information theory was examined, and while one might argue over exactly how the two are mathematically equivalent, there is little doubt that they are.
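To make that equivalence concrete (my own example, not Floyd's): the "entropy of an image" is the Shannon entropy of its pixel-value histogram, H = -sum(p_i * log2(p_i)), while the Gibbs entropy of statistical mechanics, S = -k_B * sum(p_i * ln(p_i)), is the very same sum over microstate probabilities, differing only in logarithm base and a physical constant. The pixel lists below are invented test data.

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Entropy of the empirical pixel-value distribution, in bits per pixel.

    Same formula as the Gibbs entropy S = -k_B * sum(p_i * ln p_i),
    up to the logarithm base and Boltzmann's constant.
    """
    counts = Counter(pixels)
    n = len(pixels)
    h = 0.0
    for c in counts.values():
        p = c / n
        h -= p * math.log2(p)
    return h

flat = [128] * 256            # one gray level: nothing unknown, zero entropy
noisy = list(range(256))      # all 256 levels equally likely: 8 bits/pixel

print(shannon_entropy(flat))   # 0.0
print(shannon_entropy(noisy))  # 8.0
```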

Hence *any* discussion of entropy as it relates to an image is also related directly to entropy in thermodynamics, and vice versa.

-- 
Floyd L. Davidson            http://www.apaflo.com/
Ukpeagvik (Barrow, Alaska)   floyd@apaflo.com