Tuesday, May 7, 2013

Two Stories About Medicine: One Provocative, One Shocking and Alarming

I don't know much of anything about the controversial use of chelation therapy in the treatment of heart disease except that it is controversial.   Chelation is one of those topics that has been deemed to be "woo" by the medical wing of the "Skepticism" industry.  But I don't know enough to know if it might be one of those cases when, oddly enough, their listing it on their Index of Prohibited Ideas is valid.   But a story by Karen Weintraub in the Boston Globe on April 22 was interesting because it cast a light on the idea of "woo" itself.

A rather large study, conducted under the supervision of the federal government, showed a small but positive result for chelation as a therapy for heart disease.  You would think that some unanticipated validation of a treatment that is used, unofficially, by about 50,000 Americans a year would be welcomed by cardiologists and others involved in treating heart disease and public health.   But that is far from being the case.

The positive result triggered a firestorm of opposition from some cardiologists, who dismissed the study as junk science, while others defended the study, published in the prestigious Journal of the American Medical Association, as no more flawed than any other.

The emotional debate reinforces the importance of scientific research,  several doctors and ethicists said, even as it shows weakness in the scientific process.  And it raises questions about the attitudes many doctors have shown toward alternative medicine.

"It challenges the foundation of western medicine to accept alternative medicines,"  said Felician Cohn, bioethics director for Kaisar Permanete Orange County. 

The results surprised even the researchers who conducted the study, who had expected it to give the negative result predicted by the prevailing beliefs of doctors and researchers.

No one was more surprised by the results in the TACT trial than the researchers who conducted it.

A photo of the meeting where they first learned the results shows shock and dismay,  said Gervasio A. Lamas,  chief of Columbia University's division of cardiology at Mt. Sinai Medical Center in Miami Beach, Fla.

" We really do look astonished,"  Lamas said.  "Some people had their head on the table saying; 'no you've got to be kidding me.'"

Lamas said he felt he had to study chelation after a patient came to him asking whether to use the therapy.  His first reaction was "of course not!"  Then he began to research the treatment and realized that the "only correct answer was:  I don't know."

You can look up the reaction online; I have not, yet.   I suspect it will be dominated by the "Skeptical" rejection by people who "know" it can't be true, some of them media doctors such as Steven E. Nissen, who is given prominent mention in the story in the Globe.   And, as I said, they might be right, though, with this study, there is more reason to believe they may not be.  Still, the question remains: why wouldn't these doctors be thrilled to have evidence of another possible means of preventing second and third heart attacks in their patients?  Why wouldn't that make them more willing to look at the possible effectiveness of a widely used unofficial therapy?   What does that say about how they value patient care as opposed to their preexisting bundle of beliefs, held on the basis of no such study?

-------

Yesterday's reading brought up something about medical science that is far more stunning and disturbing, something which throws far more of "the foundation of western medicine," and much more of biology, into far more basic doubt. [Note:  Sorry, just realized I'd forgotten the link.]


New scientific research has cast grave doubt on the safety testing of hundreds of thousands of consumer products, food additives and industrial chemicals.

Everyday products, from soft drinks and baby foods, to paints, gardening products, cosmetics and shampoos, contain numerous synthetic chemicals as preservatives, dyes, active ingredients, or as contaminants. Official assurances of the safety of these chemicals are based largely on animal experiments that use rabbits, mice, rats and dogs. But new results from a consortium of researchers and published in the Proceedings of the National Academy of Sciences suggest such assurances may be worthless (Seok et al. 2013).

The results of these experiments challenge the longstanding scientific presumption holding that animal experiments are of direct relevance to humans. For that reason they potentially invalidate the entire body of safety information that has been built up to distinguish safe chemicals from unsafe ones. The new results arise from basic medical research, which itself rests heavily on the idea that treatments can be developed in animals and transferred to humans.

The research originated when investigators noted that in their medical specialism of inflammatory disease (which includes diabetes, asthma and arthritis), drugs developed using mice have to date had a 100% failure rate in almost 150 clinical trials on humans.

According to Kristie Sullivan, Director of Regulatory Testing Issues at the Physicians Committee for Responsible Medicine (PCRM), this is not unusual: “about 90% of all pharmaceuticals tested for safety in animals fail to reach the market, or are quickly pulled from the market”. Wanting to understand why this might be so, the consortium decided to test the effects of various treatments that lead to inflammation, and systematically compare results between mice and humans. This postulated correlation across different animal species is sometimes known as the concordance assumption.

"The concordance assumption" is something I never encountered in any of the biology classes I took in high school or college, nor do I remember it being mentioned in  anything I've read in the matter of animal research, just about  every single one of which featured either research or experiments conducted on animals, most of which explicitly asserted some rather broad assertions about human beings and between different species on the basis of this unstated "assumption".    How serious is the problem with it?

In a first set of experiments the researchers looked at acute inflammation in mice brought on by various stimuli. These stimuli were bacterial toxins (endotoxaemia), trauma, and burns. To measure responses the authors quantified positive or negative changes in gene activity for thousands of individual genes. The researchers found that changes in activity of a particular mouse gene after treatment typically failed to predict changes in activity in the closest related human gene. This was not the expected result. If humans and mice are meaningfully similar (i.e. concordant) then gene activity changes in mice should have closely resembled those in humans after a similar challenge. But they did not.

In further experiments, the researchers identified another difference. While humans responded with similar patterns of gene changes to each of the three different challenges (trauma, burns, and endotoxaemia), mice did not. The three treatments in mice each resulted in a distinct set of gene activity changes. This confirmed the initial results in the sense that mice and humans responded differently. It also implied that the differences in gene response between mice and humans are attributable not so much to a lot of detailed ‘noise’ but to fundamental differences in the physiology of mice and humans in dealing with these challenges.

Next, the researchers examined the activity of specific biological signaling pathways after similar treatments. These too were highly divergent between mice and humans. Surprised by the consistently poor correlations between the two species, the authors then tested other human/mouse models of inflammatory diseases. Again, the similarity between mice and humans was low.

In summary, repeated experiments confirmed that, when it comes to inflammation, mice and humans have little in common, a finding important enough in itself given the prevalence of inflammation-related diseases in humans. These include allergies, celiac disease, asthma, rheumatoid arthritis, and autoimmune diseases.
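To make the "concordance assumption" concrete, here is a minimal sketch of the kind of comparison the researchers describe: correlating the gene-activity changes measured in mice with those measured in humans across matched genes.  The gene values below are invented for illustration, and the actual analysis in the Seok paper is far more elaborate; this only shows, in numbers, what "failed to predict" means.

```python
# Toy illustration of a concordance check between mouse and human responses.
# The numbers are made up; a real analysis would use measured log fold-changes
# for thousands of orthologous genes after the same challenge (burns, trauma, etc.).
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical log fold-changes for ten matched (orthologous) genes.
human_response = [ 2.1, -1.3, 0.4,  3.0, -0.2, 1.8, -2.5, 0.9,  1.1, -0.7]
mouse_response = [-0.3,  0.8, 1.9, -1.0,  2.2, 0.1,  0.6, -1.4, -0.9,  1.5]

r = pearson(human_response, mouse_response)
print(f"concordance (Pearson r) = {r:.2f}")
# If mice were good models here, r would be close to 1; the Seok group
# reported consistently poor correlations for the responses they examined.
```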

Of the two articles mentioned here, it is this one that points to immediate and extremely disturbing potential for harm.


Thus the Seok study is not the first to conclude that mice are poor models for human disease, but it is notable for being by far the most comprehensive. Combined with results of previous experiments, its conclusions suggest researchers should expect that mouse, and probably other animal testing, is of little use in advancing the treatment of human illnesses, including heart disease and cancer.

In other words, the public is probably being badly served by much of the money spent on medical research. According to PCRM’s Kristie Sullivan, “the National Institutes of Health is giving researchers billions of dollars every year for research on animals”. While missing out on potential cures, the public is also likely being exposed to dangerous or ineffective pharmaceuticals. Animal testing nearly prevented the approval of valuable drugs such as penicillin and subsequent antibiotics, but it did not prevent the thalidomide disaster of the 50s and 60s (Greek and Swingle Greek, 2003)...

... If animals are not useful predictors of important disease responses in humans it is unlikely they are useful as test subjects for toxicological safety. In other words, lack of concordance means that the synthetic chemicals that are found in industrial products, incorporated into food, and otherwise spread throughout the environment, are essentially untested. The regulatory process through which they passed was never a scientifically validated and evidence-based system, but now the evidence shows it to have been functioning as a system of random elimination. “We are not protecting humans” says Kristie Sullivan, noting that “even a National Academy study agrees that many toxicological tests are not human-relevant.”

The effect of this is nothing less than shocking to someone who was brought up with the faith that all of that horrific and cruel animal testing was scientifically valid and a hard but necessary evil.   Now it would seem that even the scientific character of its basic theory was more faith than science.  I will not try to tease out its origins and ancestry, not just now, but will repeat that this article is about the most disturbing thing I've read in years not related to climate change.

As for what this means for the enormous faith in a far less demonstrable "concordance" between our minds and the minds of animals as remotely related to us as other mammals, never mind the ever popular one between human beings and ants, it calls that faith into the most fundamental question.   If science missed the issues discussed in this article for all of those decades, there is absolutely no reason to have any faith in the wild speculations by those who find human consciousness and thought inconvenient for their "scientific" faith.   Unless they can account for it with science as clear as this, it should be considered rank superstition based in something far less valid than human experience:  ideology.  




Who Really "Knows" E8?

On the Impossibility of Knowing It All And What That Means and The Faith Based Rejection of Science

When the E8 figure constructed by a team of mathematicians using linked computers was announced it was kind of thrilling, if you find that kind of thing thrilling.  The sheer size of the effort to describe such an imaginary object in eight dimensions is stunning.  If I recall correctly, one of the more informed newspaper stories I read said that, if printed out, the paper required would cover an area bigger than the island of Manhattan, or maybe it was twice the size.   At any rate, it would take more paper, by a considerable amount, than I ever used in all of the math courses I took.   In this "narrative" account by David Vogan, a leader of the group, of how it was done, if you wade through the stuff that I had to pretty much skim (with time taken out for my mouth to drop open at the size of the numbers involved in the effort) you'll come to this:

So how big a computation is this character table for split E8? Fokko's software immediately told us exactly how many different representations we were talking about (453,060); that's also the number of terms that could appear in the character formulas. So the character table is a square matrix of size 453060. The number of entries is therefore about 200 billion, or 2 x 10^11.

But, not to worry because he continues:

Fortunately the matrix is upper triangular, so we only need 100 billion entries
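The arithmetic behind those two quoted figures is easy enough to check for yourself.  A minimal sketch, using only the count of 453,060 representations given above (the rounding to "about 200 billion" and "100 billion" is the quote's; the code is mine):

```python
# Size of the character table for split E8, using the figure quoted above.
n = 453_060  # number of representations reported by Fokko's software

full_matrix = n * n                   # every (row, column) pair
upper_triangular = n * (n + 1) // 2   # entries on or above the diagonal

print(f"full matrix:      {full_matrix:,} entries (~{full_matrix:.2e})")
print(f"upper triangular: {upper_triangular:,} entries (~{upper_triangular:.2e})")
# full matrix:      205,263,363,600 entries (~2.05e+11)
# upper triangular: 102,631,908,330 entries (~1.03e+11)
```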

Reading even this narrative, for me, is like looking at the shadow of a reflection of the E8 figure at a great distance, through a gauze.   I can gather enough of the achievement to be very impressed but I really can't even understand the terms in the first paragraph.   My friend who teaches mathematics at a quite decent land-grant university and who publishes several papers a decade told me that she doesn't understand much more of it.   I don't think she's just trying to make me feel better; she's admitting the same thing that Richard Lewontin did more generally:

First, no one can know and understand everything. Even individual scientists are ignorant about most of the body of scientific knowledge, and it is not simply that biologists do not understand quantum mechanics. If I were to ask my colleagues in the Museum of Comparative Zoology at Harvard to explain the evolutionary importance of RNA editing in trypanosomes, they would be just as mystified by the question as the typical well-educated reader of this review.*

As soon as I read the size of the effort of constructing the model of the E8 figure my first question was whether anyone, even the most well informed members of the group, could meaningfully claim to understand it, or how confident they could really be in the tightness of their results.   No one could possibly master more than a small part of the topic, and there is not really any such thing as knowledge that is held collectively, not without a great deal of faith in all of the other members of the group, and perhaps in the efficacy of the computers and the intellectual architecture of the attempt.  Faith would, obviously, be a requirement of even a mathematical or scientific "fact" of less daunting dimensions.   Short of many minds being joined as the computers could be, the fact is that no one can really "know" much of anything about even that many dimensions.   To hold that what is "known" about it is actually known stretches the meaning of the word.   That word is stretched even further to cover the entirety of what is included in science. 

While the obvious connections between this issue of faith in science and religious faith are there, those aren't what I'm interested in addressing in this post.   I'm interested in, once again, addressing the annoying and arrogant superstition among the obnoxious scientistic fundamentalists that is getting steadily worse.    Yes, I've been looking at the trash filtered out from my in-box, again. 

Everywhere since the inception of the "Skepticism" industry, and especially since Sam Harris and Richard Dawkins instituted the new atheist fad, the internet has been plagued with masses of these scientistic fundamentalists, many of whom are far more ignorant of even very simple math and science than, for example, I am.   It's not uncommon to find them making baroque and elaborate arguments about, most typically, religion on the basis of string or M-theory that involves speculations about more than eight dimensions, all while appearing unable to solve a linear equation, never mind a quadratic one.  As can be seen in the series I did about James Randi et al., it is possible for a total ignoramus in matters scientific to be revered as an oracular figure of science by a large number of acolytes.   And not only by those quite ignorant of science but by popular figures who are actual scientists.  One person I encountered recently said, well, yeah, Randi doesn't have any training in science but he spent a lot of time with Carl Sagan.   I noted that Carl Sagan spent a lot of time with Ann Druyan but that didn't make him a woman.    What is clear about this is that even such a figure of the church of scientism as Carl Sagan must have known that Randi is a complete non-entity in science, but they were OK with the effort to sell him as a representative of science.  

The reason for that is ideological and political:  Randi shares a faith in materialism with Sagan and other actual scientists in the promotion of scientism.  That shared faith is enough for them not merely to overlook the dishonesty of the effort but often to participate in Randi's PR promotion.   The sheer dishonesty of Randi, and the widespread acceptance of his self-generated career as a spokesman for science by scientists who know it is a total fraud, is certainly a scandal.  I would say it is a scandal big enough to do actual harm to science.  But that is the price that lots of scientists are willing to pay.  I think it's as clear an example as any of the corruption of science from what it is alleged to be into the ideology that it has become in far too many cases. 

On The Faith Based Rejection of Science

As creationism and climate change denial show, science can not only be accepted on the basis of pre-existing faith, it can be rejected on that basis as well.   I would have to say that the materialists' faith-based rejection of science and other ideas is the most basic aspect of their religion.   That is an aspect of "Skepticism" that is too little addressed.   I will point out, again, that several of the pseudo-scientific "voices of science," such as Randi and Penn Jillette, have extended this practice of "Skeptical" denial to climate change science. 

Dr. Dean Radin has posted a linked index of peer-reviewed studies in psi and related phenomena.  Many of the papers lay out quite impressively careful and controlled experiments which have yielded results with far, far more statistical evidence than is generally required by science.   I can understand quite a bit of the math in some of them so I don't have to take that on faith.   I know the "Skeptics" provide a level of oversight that almost certainly guards against lapses in methodology and attention; I have even more faith in the internal critics and referees in that area.  I would say that someone who is even more knowledgeable of statistics than I am would not need as high a level of faith in the results of that science in order to either accept or reject it, though they would be even more aware of the effort required to SCIENTIFICALLY challenge the reviewed results that are reported, requirements that are often not met by the "Skeptics".  I don't think anyone who hasn't even read the abstract of a paper can reject it on the basis of anything but faith.  I'll bet not one in 200 of the blog "Skeptics" could do even that. 

*  Richard Lewontin addresses the problem that we are all at the mercy of an inevitable reliance on authority, and that the choices of even very sophisticated scientists will often be less than universally accepted by those with more knowledge than they have.

Third, it is said that there is no place for an argument from authority in science. The community of science is constantly self-critical, as evidenced by the experience of university colloquia "in which the speaker has hardly gotten 30 seconds into the talk before there are devastating questions and comments from the audience." If Sagan really wants to hear serious disputation about the nature of the universe, he should leave the academic precincts in Ithaca and spend a few minutes in an Orthodox study house in Brooklyn. It is certainly true that within each narrowly defined scientific field there is a constant challenge to new technical claims and to old wisdom. In what my wife calls the Gunfight at the O.K. Corral Syndrome, young scientists on the make will challenge a graybeard, and this adversarial atmosphere for the most part serves the truth. But when scientists transgress the bounds of their own specialty they have no choice but to accept the claims of authority, even though they do not know how solid the grounds of those claims may be. Who am I to believe about quantum physics if not Steven Weinberg, or about the solar system if not Carl Sagan? What worries me is that they may believe what Dawkins and Wilson tell them about evolution. 


Monday, May 6, 2013

Answer to an E-mail

I don't expect to have a high number of readers because reading one of my posts takes some commitment of time and attention.  From what I've seen, that don't make for a popular blog.  Given that, after more than six years of doing this, I remain an, as they say, "rank amateur",  I'm pleasantly surprised anyone reads what I post.  Given my editing, it might take a bit of deciphering, at times, as well.   I know they're not coming for the cute pictures and silly cartoons, cookie cutter remarks about pop culture and celebrity gossip.   Though if I shared what I was told about....

OK, well, never mind that.

I consider my readers to be braver and stronger than I am.

Update: Response to a further e-mail

Why, yes.  That was open flattery.  I figure I should let people know I understand the demands these long posts make.

How Scientists Trick Themselves And Us Into Believing A Tiny Focus Shows The Entire Reality

Before the passage that follows,  Joseph Weizenbaum recalls the joke in which a policeman encounters a drunk on his hands and knees under a lamp post.  He asks him what he's doing.
"Looking for my keys."
"There aren't any keys here.  Where did you drop them."
"Over there."  He points in the dark.
"Why are you looking over here then?"
" Because the light is so much better."
That's how I was told the joke, anyway.

He concentrates on computer scientists who mistake their limited focus for an entire system, but across all of science, up to and including the cosmologists, researchers clearly mistake it for the entire universe.  More about that later.  For now, I couldn't possibly say it better than Weizenbaum did:

Science can proceed only by simplifying reality.  The first step in its process of simplification is abstraction.  And abstraction means leaving out of account all those empirical data which do not fit the particular conceptual framework in which science at the moment happens to be working, which, in other words, are not illuminated by the light of the particular lamp under which science happens to be looking for keys.  Aldous Huxley remarked on this matter with considerable clarity:

" Pragmatically [scientists] are justified in acting in this odd and extremely arbitrary way;  for by concentrating exclusively on the measurable aspects of such elements of experience as can be explained in terms of a causal system they have been able to achieve a great and ever increasing control over the energies of nature.  But power is not the same thing as insight and, as a representation of reality, the scientific picture of the world is inadequate for the simple reason that science does not even profess to deal with experience as a whole, but only with certain aspects of it in certain contexts.  All this is quite clearly understood by the more philosophically minded men of science.  But unfortunately some scientists, many technicians, and most consumers of gadgets have lacked the time and inclination to examine the philosophical foundations and background of the sciences.  Consequently they tend to accept the world picture implicit in the theories of science as a complete and exhaustive account of reality;  they tend to regard those aspects of experience which scientists leave out of account, because they are incompetent to deal with them,  as being somehow less real than the aspects which science has arbitrarily chosen to abstract from out of the infinitely rich totality of given facts."

One of the most explicit statements of the way in which science deliberately and consciously plans to distort reality, and then goes on to accept that distortion as a "complete and exhaustive" account, is that of the computer scientist Herbert A. Simon, concerning his own fundamental theoretical orientation:

"  An ant, viewed as a behaving system, is quite simple.  The apparent complexity of it s behavior over time is largely a reflection of the complexity of the environment in which it finds itself ... the truth or falsity of [this] hypothesis should be independent of whether ants, viewed more microscopically  are simple or complex systems.  At the level of cells or molecules, ants are demonstrably complex; but these microscopic details of the inner environment may be largely irrelevant to the ant's behavior in relation to the outer environment.  That is why an automaton,  though completely different at the microscopic level, might nevertheless simulate the ant's gross behavior...
" I should like to explore this hypothesis,  but with the word 'man' substituted for 'ant'. 
" A man, viewed as a behaving system is quite simple.  The apparent complexity of his behavior over time is largely a reflection of the complexity of the environment in which he finds himself...  I myself believe that the hypothesis holds even for the whole man." 

With a single stroke of the pen, by simply substituting "man" for "ant," the presumed irrelevancy of the microscopic details of the ant's inner environment to its behavior has been elevated to the irrelevancy of the whole man's inner environment to his behavior!  Writing 23 years before Simon, but as if Simon's words were ringing in his ears, Huxley states:

"Because of the prestige of science as a source of power, and because of the general neglect of philosophy,  the popular Weltanschauung of our times contains a large element of what may be called 'nothing-but' thinking.  Human beings, it is more or less tacitly assumed, are nothing but bodies, animals, even machines ... values are nothing but illusions that have somehow got themselves mixed up in our experience of the world;  mental happenings are nothing but epiphenomena... spirituality is nothing but ... and so on."

Except, of course, that here we are not dealing with the "popular" Weltanschauung, but with that of one of the most prestigious of American scientists.  Nor is Simon's assumption of what is irrelevant to the whole man's behavior " more or less tacit"; to the contrary, he has, to his credit, made it quite explicit. 

Simon also provides us with an exceptionally clear and explicit description of how, and how thoroughly, the scientist prevents himself from crossing the boundary between the circle of light cast by his own presuppositions and the darkness beyond.  In discussing how he went about testing the theses that underlie his hypothesis, i.e. that man is quite simple, etc., he writes:

"I have surveyed some of the evidence from a range of human performances, particularly those that have been studied in the psychological laboratory.

The behavior of human subjects in solving cryptarithmetic problems, in attaining concepts, in memorizing, in holding information in short-term memory, in processing visual stimuli, and in performing tasks that use natural languages provides strong support for these theses... generalizations about human thinking... are emerging from the experimental evidence.  They are simple things, just as our hypothesis led us to expect.  Moreover, though the picture will continue to be enlarged and clarified,  we should not expect it to become essentially more complex.  Only human pride argues that the apparent intricacies of our path stem from quite different sources than the intricacy of the ant's path."

The hypothesis to be tested here is, in part, that the inner environment of the whole man is irrelevant to his behavior.  One might suppose that, in order to test it, evidence that might be able to falsify it would be sought.  One might, for example, study man's behavior in the face of grief or of a profound religious experience.  But these examples do not easily lend themselves to the methods for the study of human subjects developed in psychological laboratories.  Nor are they likely to lead to the simple things an experimenter's hypotheses lead him to expect.  They lie in the darkness in which the theorist, in fact, has lost his keys; but the light is so much better under the lamppost he himself has erected.

There is thus no chance whatever that Simon's hypothesis will be falsified by his or his colleagues' minds.  The circle of light that determines and delimits his range of vision simply does not illuminate any areas in which questions of, say, values or subjectivity can possibly arise.  Questions of that kind, being, as they must be, entirely outside his universe of discourse, can therefore not lead him out of his conceptual framework, which, like all other magical explanatory systems, has a ready reserve of possible hypotheses available to explain any conceivable event.

Almost the entire enterprise that is modern science and technology is afflicted with the drunkard's search syndrome and with the myopic vision which is its direct result.  But, as Huxley has pointed out, this myopia cannot sustain itself without being nourished by experiences of success.  Science and technology are sustained by their translations into power and control.  To the extent that computers and computation may be counted as part of science and technology, they feed at the same table.  The extreme phenomenon of the compulsive programmer teaches us that computers have the power to sustain megalomaniac fantasies.  But that power of the computer is merely an extreme version of a power that is inherent in all self-validating systems of thought.  Perhaps we are beginning to understand that the abstract systems - the games computer people can generate in their infinite freedom from the constraints that delimit the dreams of workers in the real world - may fail catastrophically when their rules are applied in earnest.  We must also learn that the same danger is inherent in other magical systems that are equally detached from authentic human experience, and particularly in those sciences that insist they can capture the whole man in the abstract skeletal frameworks. 

"There will be no chance that Simon's hypothesis will be falsified by his or his colleagues' minds."   "Falsification" as the touchstone guaranteeing the presence of the quest of modern alchemy, "science," has been introduced into the popular imagination since Weizenbaum wrote what he did.   You can encounter sci-rangers who demonstrate their profound ignorance of what the word means flashing it like iron pyrite all over the web.  As this passage shows, the concept is little understood by even very sophisticated people, scientists included, perhaps especially.   They certainly will be unaware that even among scientists and the philosophers of science, "Falsifiability" isn't granted the status of a fixed and uncontroversial truth.

Much of the misconception of science, as encountered among the sciency, is encompassed in this passage.  You can read it and remember that it was written during the dawn of Sociobiology, as expounded by E.O. Wilson, the world's foremost expert on ants, and of the reduction of people into "lumbering robots" by the "evolutionary" psychology that quickly succeeded and overtook it.   In that form, as popularized in "The Selfish Gene" and other radically reductionist popularizations, it has taken root among the would-be intelligentsia, both those in their own minds and those with advanced degrees.  It governs how they see other peoples' lives and, astonishingly enough, their own experience.

The TV commercial says, "If love is a chemical reaction...."  Only, in the minds of many, perhaps most, college educated folks these days, there is no "if" about it.   I would say that if only they understood the radical abstraction and the profoundly limited focus of the reductionist psychology that is the origin of that superstition, they might not fall for it.  The chain of assumptions that leads to that belief contains many links that are ideological, I would guess even more of them than the "Intelligent Design" effort might contain.   Only, ideological links that are based in materialism are invisible to any but rather rigorous reviewers of science, because the consideration of material entities is the subject matter of science.

In the 20th century and on to today, there has been an odd form of elevation of scientists, usually at the twilight of their productive careers in science, to the status of popular sage or, more often, oracle.   These people declaim their prophecy on TV and YouTube to an eager lay public and to each other;  scientists are as prone to falling for PR as anyone.  Since the 1970s, these have been prophets of materialistic scientism;  just about any of the big name scientists whose names are recognized would fall into that category.   The editing of popular culture, done mostly by non-scientists who are quite ga-ga over the glamorous cachet that science has, or by wannabes who gave up long ago, disappears scientists who don't teach that dogma.   As real, working scientists forget that they are sampling a very limited amount of human experience when they issue their doctrines and declare their universal efficacy, the rest of us are prone to doing the same thing, adding ignorant credulity to the mix.

Sunday, May 5, 2013

Oscar Peterson:  Clavichord
Joe Pass:  Guitar
I Love You Porgy 

"... scientific demonstrations, even mathematical proofs, are fundamentally acts of persuasion" : More from Computer Power and Human Reason by Joseph Weizenbaum

It may be that human values are illusory, as indeed B. F. Skinner argues.  If they are, then it is presumably up to science to demonstrate that fact, as indeed Skinner (as scientist) attempts to do.  But then science must itself be an illusory system.  For the only certain knowledge science can give us is knowledge of the behavior of formal systems,  that is, systems that are games invented by man himself and in which to assert truth is nothing more or less than to assert that, as in a chess game, a particular board position was arrived at by a sequence of legal moves.  When science purports to make statements about man's experiences, it bases them on identifications between the primitive (that is, undefined) objects of one of its formalisms - the pieces of one of its games - and some set of human observations.  No such sets of correspondences can ever be proved to be correct.  At best, they can be falsified, in the sense that formal manipulations of a system's symbols may lead to symbolic configurations which, when read in the light of the set of correspondences in question, yield interpretations contrary to empirically observed phenomena.  Hence all empirical science is an elaborate structure built on piles that are anchored, not on bedrock as is commonly supposed, but on the shifting sand of fallible human judgment, conjecture, and intuition.  It is not even true, again contrary to common belief, that a single purported counter-instance that, if accepted as genuine, would certainly falsify a specific scientific theory generally leads to the immediate abandonment of that theory.  Probably all scientific theories currently accepted by scientists themselves (excepting only those purely formal theories claiming no relation to the empirical world) are today confronted with contradicting evidence of more than negligible weight that, again if fully credited, would logically invalidate them.  Such evidence is often explained (that is, explained away) by ascribing it to error of some kind, say, observational error, or by characterizing it as inessential, or by the assumption (that is, the faith) that some yet-to-be-discovered way of dealing with it will some day permit it to be acknowledged but nevertheless incorporated into the scientific theories it was originally thought to contradict.  In this way scientists continue to rely on already impaired theories and to infer "scientific fact" from them.   

The man on the street surely believes such scientific facts to be as well-established, as well-proven, as his own existence.  His certitude is an illusion.  Nor is the scientist himself immune to the same illusion.  In his praxis,  he must, after all, suspend disbelief in order to do or think anything at all.  He is rather like a theatergoer, who in order to participate in and understand what is happening on the stage, must for a time pretend to himself that he is witnessing real events.  The scientist must believe his working hypothesis, together with its vast underlying structure of theories and assumptions, even if only for the sake of the argument.  Often the "argument" extends over his entire lifetime.  Gradually he becomes what he at first merely pretended to be; a true believer.  I choose the word "argument" thoughtfully, for scientific demonstrations, even mathematical proofs, are fundamentally acts of persuasion. 

Scientific statements can never be certain;  they can be only more or less credible.  And credibility is a term in individual psychology,  i.e., a term that has meaning only with respect to an individual observer.  To say that some proposition is credible is, after all, to say that it is believed by an agent who is free not to believe it, that is, by an observer who, after exercising judgment and (possibly) intuition, chooses to accept the proposition as worthy of his believing it.  How then can science, which itself surely and ultimately rests on vast arrays of human value judgments, demonstrate that human value judgments are illusory?  It cannot do so without forfeiting its own status as the single legitimate path to understanding man and his world.

But no merely logical argument, no matter how cogent or eloquent can undo this reality  that science has become the sole legitimate form of understanding in the common wisdom.  When I say that science has been gradually converted into a slow-acting poison, I mean that the attribution of certainty to scientific knowledge by the common wisdom, an attribution now made so nearly universally that it has become a commonsense dogma, has virtually delegitimatized all other ways of understanding.  People viewed the arts, especially literature, as sources of intellectual nourishment and understanding  but today the arts are perceived largely as entertainments.  The ancient Greek and Oriental theaters, the Shakespearean stage, the stages peopled by the Ibsens and Chekhovs nearer to our day - these were schools.  The curricula they taught were vehicles for understanding the societies they represented.  Today, although an occasional Arthur Miller or Edward Albee survives and is permitted to teach on the New York or London stage, the people hunger only for what is represented to them to be scientifically validated knowledge.  They seek to satiate themselves at such scientific cafeterias as Psychology Today, or on popularized versions of the works of Masters and Johnson, or on Scientology as revealed by L. Ron Hubbard.  Belief in the rationality-logicality equation has corroded the prophetic power of language itself.  We can count, but we are rapidly forgetting how to say what is worth counting and why. 

As you read that, I hope you took into account that this was written in the mid-1970s, and some then current aspects of pop culture have given way to others.  You can choose your 2013 counterparts but I would certainly replace B. F. Skinner with Richard Dawkins - just as Skinner's defunct behaviorism has been supplanted by Dawkins' evo-psy.   Finding the counterpart for Hubbard is a bit more difficult, but not because there is only one obvious one.   Arthur Miller is, of course, dead, though I believe Albee is still with us and as recently as last year was quite articulate about, among other things, the further decline in the theater.

I can only wonder what Joseph Weizenbaum would have made of the foremost force for the scientism he warned against, "science" blogs.  Since most of his late life was spent in Germany and his further thinking is unavailable in English, perhaps he addressed them.

What Weizenbaum had to say is, if anything, far, far more true today.   I would hold that it is far more true of what is officially denominated to be liberal politics.    Looking back, I would date the late 60s and 1970s as the turning point, when liberal politics, dominated by the enormous moral force of Reverend King and the largely religious and effective civil rights movement and early anti-war movement, gave way to the anti-religious, "scientific" "left" that began replacing them at that time.   That liberal politics reached its height of influence during the other than liberal Johnson and Nixon administrations testifies to the strength of that now lost liberalism.   As the Clinton and Obama administrations prove, the "liberalism" of today doesn't even have the power to move the law when it holds the entire government.

The media, the foremost beneficiaries of the form of libertarianism that posed as liberalism, largely concentrated in the elite press and among those indifferent if not actually hostile to religion, have proven they will sell out the genuine ideals of liberalism for fame and fortune.  The list of putative liberals or leftists of that era who have jumped to what is universally recognized as being "the right" is impressively larger than the list of those who have jumped the other way.   I think it would be useful to come to a better understanding of that phenomenon, which I'd call something like "the Hentoff-Hitchens effect".

UPDATE:  From The Lexicon of Popular Atheist Locutions

Word salad:   One of a number of pat statements that means something is too complex or too long for the post-literate era atheist to understand.  As used it is a variation on the logical positivist practice of declaring a statement not in accord with their ideological framing to be meaningless, though “word salad” is generally far less skillfully deployed.

If “word salad” is used to denote an actual passage that is nonsense, it carries the danger that the user will be suspected of the low level of reading comprehension that those who use the phrase most often have earned for it.   It is more accurate to say "that is nonsense."   However, unlike the use of "word salad," saying that doesn't carry the presumption that the one doing the dismissing is immune from having to be able to say why they have said it.

See also:   But that's haaarrrrrd! 

Saturday, May 4, 2013

Answer Given During a Blog Brawl Elsewhere

James Randi maintains his cult through his "Educational" Foundation, which seems to be dedicated mostly to covering up his many sins.  It's the Scientology of the Scientistic, Randi the El Ron Hubbard of his hucksterism.   He is the Ayn Rand of his own Randians, frequently they're anal Randians.  He's the Lyndon Larouche of his lied to and louche lair of louts. 

It's been a long week, I'm taking the morning off and hope to post tonight.

Friday, May 3, 2013

Oscar Peterson: Clavichord Joe Pass: Guitar

It Ain't Necessarily So

Notice

I've gotten bored with the trolling of Steve Simels so I will not be answering him here anymore but at the blog NY Mary started, handed off to him, and fled for something more productive.  It was probably something like what the oarsman did to the evil queen in The Three Hairs of the Devil.   His act isn't variable enough to maintain interest; as noted here the other day, it doesn't reach the novelty and originality of the late Soupy Sales.  It does remind me of George W. Bush.

If he thinks I'm giving him a link, no. 

Update:  As I knew he would as soon as I answered him at his own blog instead of here, Simels has proven he can't take it.  I will not be posting his comments from now on.  

Upupdate:   Response to an e-mail

Anthony -   Steve Simels is at Eschaton sayi.....

Yeah, whatever. 

People Are Not Machines; Machines Don't Have Rights or Moral Obligations

You will probably hear it today, you will almost certainly hear it this week:  "people are hard-wired to..."  In the last couple of decades, as materialists have presented their ideological metaphors as neuro- or cognitive science, people have been taught to believe that they are "hard-wired" by their "genes" to behave and think and even perceive in the way they do; evo-psy has a large hand in it too.  

The common view of human minds expressed in the media is that "science proves" we are the "moist robots" of Daniel Dennett, the "lumbering robots" of Richard Dawkins, the "computers made of meat" of other materialists.   Even if other expressions are used, that is the point of view enforced by the media, almost certainly as an article of "scientific" faith, presented by people who probably couldn't tell you much of anything about genes and what they do or about how computers work.   That this is a belief entirely congenial to the corporations they work for, as they sell our minds as product to advertisers who see us as units of potential profit, might be seen as ironic, considering the passage I'm about to type into this piece, from Computer Power and Human Reason:  From Judgment to Calculation by the eminent and, I would say, prophetic computer scientist, the late Joseph Weizenbaum:

Introduction

In 1935 Michael Polanyi,  then holder of the Chair of Physical Chemistry at Victoria University of Manchester,  England, was suddenly shocked into a confrontation with philosophical questions that have ever since dominated his life.  The shock was administered by Nicolai Bukharin,  one of the leading theoreticians of the Russian Communist party,  who told Polanyi that "under socialism the conception of science pursued for its own sake would disappear, for the interests of scientists would spontaneously turn to the problems of the current Five Year Plan."  Polanyi sensed then that "the scientific outlook appeared to have produced a mechanical conception of man and history in which there was no place for science itself."  And further that "this conception denied altogether any intrinsic power of thought and thus denied any grounds for claiming freedom of thought."  

I don't know how much time Polanyi thought he would devote to developing an argument for a contrary concept of man and history.  His very shock testifies to the fact that he was in profound disagreement with Bukharin,  therefore that he already conceived of man differently,  even if he could not then give explicit form to his concept.  It may be that he determined to write a counterargument to Bukharin's position,  drawing only on his own experience as a scientist, and to have done with it in short order.  As it turned out,  however, the confrontation with philosophy triggered by Bukharin's revelation was to demand Polanyi's entire attention from then to the present day [c. 1975].

I recite this bit of history for two reasons.  The first is to illustrate that ideas which seem at first glance to be obvious and simple, and which ought therefore to be universally credible once they have been articulated,  are sometimes buoys marking out stormy channels in deep intellectual seas.  That science is creative, that the creative act in science is equivalent to the creative act in art, that creation springs only from autonomous individuals, is such a simple and, one might think, obvious idea.  Yet Polanyi has, as have many others, spent nearly a lifetime exploring the ground in which it is anchored and the turbulent sea of implications which surrounds it. 

The second reason I recite this history is that I feel myself to be reliving part of it.  My own shock was administered not by any important political figure espousing his philosophy of science, but by some people who insisted on misinterpreting a piece of work I had done.  I write this without bitterness and certainly not in a defensive mood.  Indeed, the interpretations I have in mind tended, if anything, to overrate what little I had accomplished and certainly its importance.  No, I recall that piece of work now only because it seems to me to provide the most parsimonious way of identifying the issues I mean to discuss.   

The work was done in the period 1964-1966, and was reported in the computer-science literature in January 1966 and August 1967.  To summarize it briefly,  I composed a computer program with which one could "converse" in English.  The human conversationalist partner would type his portion of the conversation on a typewriter connected to a computer, and the computer, under control of my program, would analyze the message that had so been transmitted to it,  compose a response to it in English, and cause the response to be typed on the computer's typewriter.

I chose the name ELIZA for the language analysis program because, like the Eliza of Pygmalion fame,  it could be taught to "speak" increasingly well.  Because the conversations must be about something, that is, because they must take place within some context,  the program was constructed in a two-tiered arrangement, the first tier consisting of the language analyzer and the second of a script.  The script is a set of rules rather like those that might be given to an actor who is to use them to improvise around a certain theme.  Thus ELIZA could be given a script to enable it to maintain a conversation about cooking eggs or about managing a bank checking account,  and so on.  Each specific script thus enabled ELIZA to play a specific conversational role. 

For my first experiment, I gave ELIZA a script designed to permit it to play (and I should really say parody) the role of a Rogerian psychotherapist engaged in an initial interview with a patient.  The Rogerian psychotherapist is relatively easy to imitate because much of his technique consists of drawing his patient out by reflecting the patient's statements back to him.  The following conversation between a young lady and ELIZA playing doctor illustrates both the Rogerian technique of encouraging a patient to keep talking and the operation of the computer program ELIZA.  The first to "speak" is the young lady.  The computer's responses are printed entirely in capitals.

Men are all alike.
IN WHAT WAY
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I'm depressed much of the time.
I'M SORRY TO HEAR YOU ARE DEPRESSED....

... DOCTOR, as ELIZA playing psychiatrist came to be known, soon became famous around the Massachusetts Institute of Technology,  where it first came into existence, mainly because it was an easy program to demonstrate.  Most other programs could not vividly demonstrate the information-processing power of a computer to visitors who did not already have some specialized knowledge, say of some branch of mathematics.  DOCTOR, on the other hand, could be appreciated on some level by anyone.  Its power as a demonstration vehicle was further enhanced by the fact that the visitor could actually participate in its operation.  Soon copies of DOCTOR, constructed on the basis of my published description of it,  began appearing at other institutions in the United States.  The program became nationally known and even, in certain circles, a national plaything.

The shocks I experienced as DOCTOR became widely known and "played" were due principally to three distinct events.

1.  A number of practicing psychiatrists seriously believed the DOCTOR computer program could grow into a nearly completely automatic form of psychotherapy.  Colby et al.* write, for example, 

"Further work must be done before the program will be ready for clinical use.  If the method proves beneficial,  then it would provide a therapeutic took which can be made widely available to mental hospitals and psychiatric centers suffering a shortage of therapists.  Because of the time-sharing capabilities of modern and future computers, several hundred patients an hour could be handled by a computer system designed for this purpose.  The human therapist, involved in the design and operation of this system, would not be replaced, but would become a much more efficient man since his efforts would no longer be limited to the one-to-one patient-therapist ration as now exists."

I had thought it essential, as a prerequisite to the very possibility that one person might help another learn to cope with his emotional problems, that the helper himself participate in the other's experience of those problems and, in large part by way of his own sympathetic recognition of them, himself come to understand them.  There are undoubtedly many techniques to facilitate the therapist's imaginative projection into the patient's inner life.  But that it was possible for even one practicing psychiatrist to advocate that this crucial component of therapeutic process could be entirely supplanted by pure technique - that I had not imagined!  What must a psychiatrist who makes such a suggestion think he is doing while treating a patient,  that he can view the simplest mechanical parody of a single interviewing technique as having captured anything of the essence of a human encounter?  Perhaps Colby et al. give us the required clue when they write;

"A human therapist can be viewed as an information processor and decision maker with a set of decision rules which are closely linked to short-range and long-range goals, ... He is guided in these decisions by rough empiric rules telling him what is appropriate to say and not to say in certain contexts.  To incorporate these processes, to the degree possessed by a human therapist, in the program would be a considerable undertaking but we are attempting to move in this direction."

What can a psychiatrist's image of his patient be when he sees himself, as therapist, not as an engaged human being acting as a healer, but as an information processor following rules, etc.? 

Such questions were my awakening to what Polanyi had earlier called a "scientific outlook that appeared to have produced a mechanical conception of man."  

* Nor is Dr. Colby alone in his enthusiasm for computer administered psychotherapy.   Dr. Carl Sagan, the astrophysicist, recently commented on ELIZA in Natural History, vol. LXXXIV,  "No such computer program is adequate for psychiatric use today, but the same can be remarked about some human psychotherapists.  In a period when more and more people in our society seem to be in need of psychiatric counseling, and when time sharing of computers is widespread,  I can imagine the development of a network of computer psychotherapeutic terminals, something like arrays of large telephone booths, in which, for a few dollars a session, we would be able to talk with an attentive, tested and largely non-directive psychotherapist." 

For anyone who wants to read ahead, the entire Introduction has been posted online in pdf format.  There are a number of versions of ELIZA available online.   Those which I have tried would require a large amount of credulous acceptance on the part of the human, though I doubt people in 2013 have become any less credulous about the fact that they are interacting with a machine than those in the mid-60s were.  If anything, people are far more impressed with the far more powerful computers and sophisticated programs and far, far less impressed with people, even in their own minds.   The extent to which that is due to their casual experience with using computers, and what influence that has had on the language people use to talk about our minds, I don't know.  I do know that what was commonly believed by people during that time, that people were really thinking, freely choosing, living beings, seems to have given way to exactly the mechanical view that Weizenbaum warned about.    As he was surprised to find, it was among scientists, who he, and earlier Polanyi, believed should have known better, that the mechanical view of humanity was already more common.   That it was, apparently, acceptable among psychotherapists and psychologists should tell us that there was something seriously wrong with the scientific identity of those academic fields.  I would say that in the subsequent decades, as Behaviorism was succeeded by evolutionary psychology, the beliefs, assumptions and attitudes on display here have come to almost entirely dominate those and other "sciences" dealing with our minds.
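For anyone curious what the "script" of rules Weizenbaum describes amounts to in practice, here is a minimal sketch of the keyword-and-reflection trick, written by me as an illustration rather than taken from ELIZA's actual code; the real program and its DOCTOR script were considerably more elaborate.

```python
# A toy, Rogerian-style keyword-and-reflection responder in the spirit of the
# DOCTOR script Weizenbaum describes; an illustration of the idea only, not ELIZA itself.
import re

# Pronoun reflections: "my" -> "your", "me" -> "you", and so on.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you", "your": "my", "you": "I"}

# A tiny "script": each rule is a keyword pattern plus a response template.
RULES = [
    (re.compile(r"\bi need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
    (re.compile(r"\b(always|never)\b", re.I), "Can you think of a specific example?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the reply reads back naturally."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(line: str) -> str:
    """Return the response of the first rule whose keyword appears in the input."""
    line = line.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.search(line)
        if match:
            return template.format(*(reflect(g) for g in match.groups())).upper()
    return "PLEASE GO ON"  # default when no keyword matches

print(respond("My boyfriend made me come here."))
# TELL ME MORE ABOUT YOUR BOYFRIEND MADE YOU COME HERE.
```

Even something this crude produces the uncanny reflections quoted above, which is some measure of how little machinery is needed to invite the credulity Weizenbaum was shocked by.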

I don't think the sci-fi nightmare of our being dictated to by enslaving machines is the problem, though, as scientists in "artificial intelligence" work to give predator drones the ability to "decide" to kill and to carry out those "decisions," that could change in the most drastic way possible.   The more immediate problem is that how people see themselves and, especially, "other people" has a controlling influence on their political choices and on how they will react to the choices made by politicians and courts.   Which is why it is even more important to understand the folly of believing people are computers, and why Weizenbaum's book is so important.

Thursday, May 2, 2013

Thelonious Monk: Well You Needn't  
Gabriele Toia - Clavichord


Night Thought About Economic Democracy and China

It is possible to consider China since the revolution as a series of experiments in a large country officially governed by materialists.   First there was the brutality of the imposition of the Communist government, following an admittedly brutal and chaotic series of awful non-communist regimes.  Then there were the horrible famines brought on by the imposition of high theory and science on agriculture.  Tens of millions of people are estimated to have died in those.  Part of that was the importation of Lysenkoism into China, but much of it was the inability of lower-level communist hacks and lackeys to tell higher-ups that the great applications of science in service to materialist ideology didn't, somehow, seem to work in reality.   Lysenkoism was abandoned within fifteen years, as it was in the Soviet Union after Stalin died.  The bureaucratic establishment under communism has been perpetuated into the present.

Then, I'd guess largely in response to the disasters of the early years of the "People's Republic,"  the different horrors of the Cultural Revolution were imposed as a means of maintaining control, with its large-scale bloodshed, enslavement and other horrors.   I don't think that the frequently encountered stories of families forcibly divided and turned into agit-prop dramas for public consumption were merely coincidental.  Any level of competition for loyalty to the governing establishment was seen as a danger to it.  It is an indictment, especially of people allegedly on the left in the West, that the lives of people destroyed in this modern reign of terror weren't seen as more important than theory, more important than concentrating on the bizarre spectacles of that period and finding them amusing.

Gradually, as the old and true believers died off or became the focus of show trials by those who grabbed the reins, China abandoned pretensions of "socialism"* for a capitalism that has been brutal and destructive, humanly and environmentally,  on a scale that would be hard to match.  Adding scientific efficiency to the enslavement of humanity and the despoliation of the environment, quite intuitively, turns out to bring even more horrific results.  It's so weird that students of Marx, whose critique of industrial capitalism will stand as his greatest work even as so much else of it lies discredited, didn't get that.   Or maybe they did and, in the belief that moral obligations are scientifically discredited, they went with it.   And here we can indict the "right" in the West, which has had no problem with moving in to make deals turning the "People's Republic" into a vast slave market and industrial brownfield, working hand in hand with their "communist" partners, so many of whom became instant billionaires with a vulgarity unmatched by that of the most vulgar "capitalists" our gaudy economy has produced.

For this week, as I turn from encouraging people to read Eddington to promoting Joseph Weizenbaum's great, important and criminally neglected book,  "Computer Power and Human Reason: From Judgment To Calculation,"  I will ask whether the current system in Capitalist-Communist China isn't a continuing experiment in capitalism with the, admittedly imperfect, moral restraints of religion removed, and what that means for people and the biosphere.  If the new atheists' prophecy of an atheist West comes true, it might be a window into our future.

*  I put socialism in quotes because that word has come to mean something quite different from what socialism should have always meant.   When I said "I'm a socialist" I meant nothing except that the means of production rightfully belong to those who use them to produce wealth and not to investors who, through legalized theft, are illegitimately given legal ownership of them.  That turns workers into equipment as it steals the tangible product that they produce, allowing the "owners" to turn them out and ship their jobs to slave labor markets in places such as China.

"Socialism", in that meaning of the word must be democratic, socialism is an aspect of democracy. No political system which is not, actually, democratic could possibly sustain workers' ownership of the means of production.  Some form of theft on behalf of investors will always succeed where democracy is absent.  I would almost guarantee that, as in the United States, the extent to which investors are allowed to steal the products of other peoples' work that democracy can be dependably regarded as being absent.

The appropriation of the word by fascists, both of the "right wing" and "left wing" variety, has made it unusable.  I'm in favor of finding a term that will separate real socialism from what most people mean when they use the word.   "Economic democracy" seems to me to be the best expression of what I meant when I have said I was a socialist.   If I'd said I was an "economic democrat"  from the beginning I might have avoided being willfully blind to the horrors of so much of the brutal history of communist rule and the moral depravity of so much of the pseudo-left here and in Europe.   So, in my very late middle age, I go from being a "socialist" to being an economic democrat,  changing nothing except common misunderstanding and a label.

Wednesday, May 1, 2013

Remains Of The Day

From "The Cool Ones"

Tantrum A-go-go

Does the Flying Spaghetti Monster Exist?

Updated Below

In my never-ending quest to provide service to my readers, I now go the extra mile and respond to my most obvious non-reader, the washed-up pop-music scribbler, Steve Simels.  Or, at least, the person who posts comments here as "Steve Simels".   Since someone using that same name has been caught by different people using other identities in order to attack people on a number of blogs - mine being one of those - it's possible that the "Steve Simels" who has been trolling me here isn't the washed-up pop-music scribbler but some other witzbold, in his own mind, who is assuming his identity for some purpose.   Which would be called "satire" if someone with the technical ability traced him, or his sock puppets, to a less deniable identity.  Though it would be surprising if someone voluntarily chose him to impersonate.  It would be like choosing Cisco Red or MD 20/20 when you could choose something a bit higher up on the wine list at the convenience store.

Now, to start with, and, I suspect, to his bitter disappointment, I wouldn't hold that Steve Simels is God.  Though such an hypothesis might help us to understand The Problem of Pain and other such mysteries, the theory fails on other tests.   Such a god would have to be demoted far down from all-knowing, all-wise, and any number of other partial definitions of God as believed in by most believers.   The sometimes ventured speculation that God has a sense of humor would have to go too.  I mean, God would have to have more of a sense of humor than someone who can't even come up to the level of Soupy Sales and Stubby Kaye.  Not even by stealing their material.  Soupy had some sense of timing, mostly of when not to repeat the pie in the face for the 98,457th time, in order to avoid his audience noticing it had gotten unacceptably old.  If I had a dime for every time that "Steve Simels" has pulled out the lamest of lame satirical clichés, the Flying Spaghetti Monster, I'd be able to bribe the guy with the hook to get him off stage permanently.

God, the creator of the universe, is definitely beyond human definition, as is the universe.

I will pause here to point out that any human conception of God, individually or collectively,  is inevitably incomplete and inadequate.  In that sense, any idea we have about God might be reasonably considered an idol.  Forgetting human inability to conceive of God leads religion and the religious into some of their most serious sins.

Now, remember one of my proudest achievements: getting Sean Carroll to admit that there was not a single object, not the most humble and common molecule, atom or subatomic particle, which physics knows comprehensively and exhaustively.  Since physics doesn't know even one object in the universe completely, it will not soon encompass the entire universe, no matter how fashionable the talk of a "Theory of Everything" is among the trendy and sciency.  I think that when someone claims to have one of those, even as it is taught as such at universities around the world and touted by science reporters who don't have the foggiest idea of what it really means, it will be a relatively short time before the holes, lapses and discrepancies in that materialist desideratum are identified.  There will be lots of physicists with the ability to understand the issues who will want to make a name for themselves.   With them on the case, I have a feeling that the expiration date of theories of everything will come rather sooner than it did for the Newtonian universe after, as I mentioned last week, Lord Kelvin declared the first End of Physics in the 19th century.

The universe, the creation of God, would seem not to be entirely comprehensible by the brightest of the Brights, so many of whom don't seem to understand the wisdom of being rather more modest about their products than they are.   Eddington understood the problem of overestimating how much of even the physical universe is vulnerable to discovery by human abilities.   In a quote already given here he said:

It is one thing for the human mind to extract from the phenomena of nature the laws which it has itself put into them; it may be a far harder thing to extract laws over which it has had no control.  It is even possible that laws which have not their origin in the mind may be irrational, and we can never succeed in formulating them.

Reason is a means for people to make reliable assumptions about the nature of their world of sense. It is applied to levels of the universe that aren't vulnerable to our everyday senses and used to construct ways of understanding things at those levels, with some success, though often with far less than complete success.  What often gets included in the corpus of ideas held to comprise "science" is later found to be, as they say, mistaken.   Often reason lets us down, often due to reliance on incomplete knowledge or understanding; it's no better than the people applying it and their ability to understand what they see.  And no one can see more than they can.  Reason has limits.

As Eddington says, there might well be "laws" of the universe that are irrational - that is, not vulnerable to discovery by human reason - and which would always elude our reason.  Always.  It's pretty amusing to think about that when you consider how many of the true believers in our being on the verge of the great and true Theory of Everything never stop asserting that we're just like other animals or, even worse, computers, which I doubt even they would suspect have the capacity to observe, never mind understand, the entire universe.  I mean, even if you've got a really groovy and powerful computer, do you not find it often doesn't seem to understand even its own instructions?   I'm extremely skeptical of the plainly absurd idea that science hasn't rendered us quite different from other animals and, having eaten from that tree of most efficacious knowledge, far more capable of depravity and the most irrational acts of murder and destruction.   But I really don't think we're any more capable of a comprehensive observation of the universe than a bacterium that shows some response to its environment.   Comprehensive means, well,  absolutely comprehensive.  Anything that isn't comprehended could not be included in the "everything" in a "Theory of EVERYTHING".

Since it would seem wise to be skeptical of the idea that even such Big Thinkers as Sean Carroll are on the verge of understanding the entire universe, the idea that they understand God, who created the universe, would seem even less wise to accept.   Even the most popular current hero of physics and cosmology, Stephen Hawking, hasn't got that ability.  As Peter Woit has pointed out, he's given up* on the quest to explain even physical reality, demanding that the rules be changed to remove the exigencies of the subject of physics as a test for the ideas of "physicists".

We seem to be at a critical point in the history of science, in which we must alter our conception of goals and of what makes a physical theory acceptable. It appears that the fundamental numbers, and even the form, of the apparent laws of nature are not demanded by logic or physical principle. The parameters are free to take on many values and the laws to take on any form that leads to a self-consistent mathematical theory, and they do take on different values and different forms in different universes.

Or, considering the proposal to replace the verifiable physical universe with sci-fi written in equations once they've divorced the observable universe from their "discipline," at least the ideas of people who get called "physicists" for purposes of filling important chairs at important universities.   It was reading Hawking's recent stuff that convinced me physics, though likely not at an end, was certainly well into a decadent phase.  I suspect it is exactly the Cervantine cosmological quest for what can't be had which has helped lead it there.  Even as modest, sober and aware a scientist as Eddington made a more modest version of that mistake with his Fundamental Theory, and Hawking and Carroll ain't exactly modest.  That the quest to use physics, the study of the physical universe, in the attempted hit job on God is carried on by some of the same guys doesn't give my powers of deduction much of a workout.

If the Big Thinkers of atheism can't come up with what is necessary to convince people to give up God, I doubt that Bobby Henderson's claim to fame will do it.  The Flying Spaghetti Monster is pretty lame satire, even by pop-atheist standards.  So, after making Simels suffer through skimming this piece in search of references to himself: the FSM exists as a really stupid example of what gets called satire in this post-literate age, more like something a 5th grader might scribble out in a fantasy of it eating the mean teacher who gave him a D- on his history paper.   It doesn't eat the teacher and it doesn't eat God.  No more than Hawking's imaginary universes "not demanded by logic or physical principle" do.   If physics doesn't have to follow the exigencies of those, and as committed a "naturalist" as Sean Carroll doesn't have any objection to that, their war machine against God and religion has disappeared.  Though it seems that they're even willing to sacrifice physical science in their quest to do that.


* David Gross has in the past invoked the phrase “never, never, never give up”, attributed to Churchill, to describe his view about claims that one should give up on the traditional goals of fundamental physics in favor of anthropic arguments invoking a multiverse. Stephen Hawking has a new book out this week, called The Grand Design and written with Leonard Mlodinow, in which he effectively announces that he has given up:


"We seem to be at a critical point in the history of science, in which we must alter our conception of goals and of what makes a physical theory acceptable. It appears that the fundamental numbers, and even the form, of the apparent laws of nature are not demanded by logic or physical principle. The parameters are free to take on many values and the laws to take on any form that leads to a self-consistent mathematical theory, and they do take on different values and different forms in different universes."

Thirty years ago, in his inaugural lecture as Lucasian professor, Hawking took a very different point of view. He argued that we were quite close to a final unified theory, based on N=8 supergravity, with a 50% chance of complete success by the year 2000. A few years after this, N=8 supergravity fell into disfavor when it was shown that supersymmetry was not enough to cancel possible ultraviolet divergences in the theory. There has been a recent revival of interest as new calculational methods show unexpected and still not completely understood additional cancellations that may fully eliminate ultraviolet divergences. Hawking shows no interest in this, instead signing on to the notion that “M-theory” is the theory of everything. The book doesn’t even really try to explain what “M-theory” is, we’re just told that:

"People are still trying to decipher the nature of M-theory, but that may not be possible. It could be that the physicist’s traditional expectation of a single theory of nature is untenable, and there exists no single formulation. It might be that to describe the universe, we have to employ different theories in different situations"


The book ends with the argument that

Our TOE must contain gravity.
Supersymmetry is required to have a finite theory of gravity.
M-theory is the most general supersymmetric theory of gravity.
ergo

M-theory is the unified theory Einstein was hoping to find. The fact that we human beings – who are ourselves mere collections of fundamental particles of nature – have been able to come this close to an understanding of the laws governing us and our universe is a great triumph.

This isn’t exactly an air-tight argument…

UPDATE:  I don't know if the urgent e-mail I just got, demanding on threat of a lawsuit that I change the reference to Stubby Kaye to Dennis Miller, is authentic.  I will stipulate that it wouldn't work, because "Simels" can match Dennis Miller.  I'd assert that he doesn't appear to have much of a choice in that.  I've never seen Simels, though I wonder if he and Miller have ever been seen together.   Anyone know?