IGS Discussion Forums: Research Related to GS: Bayesian reasoning
Author: Ralph E. Kenyon, Jr. (diogenes) Monday, August 25, 2008 - 02:37 pm

Hi Thomas,
Your quote includes "Popper's idea that there is only falsification and no such thing as confirmation turns out to be incorrect."

Popper himself was aware of this conclusion, as I pointed out in my 1984 paper "Popper's Philosophy of Science".


quote:

It is as much of a mistake to think that these "refuted" hypotheses are "false" as it is to think that the "accepted" theory is "true". Popper himself was aware of this.

but those who uphold it dogmatically [a system] . . . are adopting the very reverse of that critical attitude which in my view is the proper one for the scientist. In point of fact, no conclusive disproof of a theory can ever be produced; [italics mine] . . . If you insist on strict proof (or strict disproof) in the empirical sciences, you will never benefit from experience, and never learn from it how wrong you are.(9)

The mathematics in the article you quote shows what Popper already knew. Nevertheless, thanks for posting the link.

Author: Ralph E. Kenyon, Jr. (diogenes) Tuesday, August 26, 2008 - 10:57 pm

Hi Thomas,

If you start with two-valued logic and add a third value, a fourth, and so on, continuing until you have increased the number of possibilities to infinity; if you also order those truth values on a ratio scale, then you have a system isomorphic to probability when you map the AND function to multiplication and the OR function to the sum less the product.

Explanation for those not well versed in probability and logic:

The probability of A and B, when A and B are independent, is the product of their respective probabilities: P(A AND B) = P(A)P(B).
The probability of A or B is the probability of A plus the probability of B less the probability that both happened (which prevents counting that case twice): P(A OR B) = P(A) + P(B) - P(A AND B). When A and B are independent, that last term is the product P(A)P(B), giving the "sum less the product" above.

Since probabilities are numbers between zero and one inclusive, with 1 corresponding to true and 0 corresponding to false, binary logic is produced as a special case.
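A quick sanity check (a minimal Python sketch, not part of the original post): the two probability rules just described reproduce the classical two-valued truth tables when the inputs are restricted to the endpoints 0 and 1.

```python
def p_and(pa, pb):
    """P(A and B) for independent events: the product."""
    return pa * pb

def p_or(pa, pb):
    """P(A or B) for independent events: the sum less the product."""
    return pa + pb - pa * pb

# At the extremes 0 ("false") and 1 ("true"), the probability rules
# agree with ordinary Boolean AND and OR on every combination.
for a in (0, 1):
    for b in (0, 1):
        assert p_and(a, b) == (a and b)
        assert p_or(a, b) == (a or b)
```

Between the endpoints the same two functions give the intermediate "truth values" of the infinite-valued system, e.g. p_and(.5, .5) is .25.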

Flipping a quarter and a nickel gives
QH or QT, but not both.
NH or NT, but not both.
We take such coins to be "fair", that is, having an equal chance of coming up heads or tails, which makes each probability 50%, or .5.

What's the probability of getting both heads?
P(QH AND NH) = P(QH)P(NH)=.5 x .5 = .25

What's the probability of getting any heads?

P(QH OR NH) = P(QH)+P(NH)-P(QH AND NH) = .5+.5-.25=.75.

End of explanation.

Since there are an infinite number of possible choices for probabilities, Korzybski refers to this as infinite-valued logic. On page 281 of S&S he says "... many-valued 'logic', which merges ultimately with the mathematical theory of probability." A more extensive discussion can be found on page 310.

Author: Ralph E. Kenyon, Jr. (diogenes) Wednesday, August 27, 2008 - 09:25 pm

Thomas,

In the philosophy of logic and mathematics, the word 'valid' refers to rules of inference which are strictly guaranteed to be truth-preserving. Induction, except for mathematical induction, does not satisfy this criterion.

Induction goes from a set of examples, possibly very, very large, to a principle or formula capable of deriving all the examples in the inductive source set, a set which is necessarily finite when applied to what is going on.

Recall the general semantics formulation that just because P(t1), P(t2), P(t3), ... P(tN) all agree does not mean that P(tN+1) will agree. That means that any highly probable theory T that predicts P(ti) cannot be guaranteed to satisfy IF T THEN P(ti) for all i. Validity, in the mathematical and philosophical sense, is strict truth preservation: going only from true premises to true conclusions in any implication. Induction does not satisfy this. The "common man", however, uses the word 'valid' in a much softer sense.
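A classic concrete illustration of this point (Euler's prime-generating polynomial, not an example from the original post): a pattern can hold for forty consecutive cases and still fail at the forty-first, which is exactly the P(t1) ... P(tN) versus P(tN+1) situation.

```python
def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n**0.5) + 1))

# Euler's polynomial n^2 + n + 41 yields a prime for every n from 0 to 39,
# so induction over those forty cases suggests it "always" yields primes...
assert all(is_prime(n * n + n + 41) for n in range(40))

# ...but the very next case refutes the generalization:
# 40^2 + 40 + 41 = 1681 = 41^2, which is composite.
assert not is_prime(40 * 40 + 40 + 41)
```

Forty agreeing instances made the generalization highly probable in the soft sense, yet the inference was not valid in the strict, truth-preserving sense.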

One of Korzybski's main points is that it is an assumption that an inductive reasoning step will yield the correct result in the next situation, that we should be aware of when we make assumptions, and that we should be prepared for them to be wrong. If we consider induction "valid", we contribute to hiding that awareness, and we leave ourselves less prepared to deal with the failed prediction.


(I)diots (M)ay (O)bserve that something works well when invalid reasoning is used whenever the conclusion is itself highly probable by support from unrelated grounds. A case in point:

Nasrudin was spreading bread crumbs along the border of his property when his neighbor asked, "Nasrudin, what are you doing?" Nasrudin replied, "Keeping the tigers away," whereupon his neighbor exclaimed, "There are no tigers within hundreds of miles of here!" Nasrudin calmly replied, "Effective, isn't it?" (I. Shah)