In part one of this two-part article, I made the following (paraphrased) claims:
- The state of science is unwell.
- The majority of published papers are likely irreproducible, or just plain garbage.
I know of no other way to make the case for these propositions than by example, knowing full well that this is subject to the counterclaims that “these are just anecdotes” and/or that they are not representative of the whole of science. To somewhat blunt such criticism, I begin with examples of people who were at the pinnacle of some of the most prestigious scientific and medical journals and let their experiences begin the discussion. I finish my proof by offering a history lesson on the origins of these phenomena, which I believe assists in showing that the “how” of “how we got here” helps explain and buttress the conclusion that “here” is, scientifically speaking, a very bad place.
Ben Goldacre is a British doctor and author who wrote the wonderfully evocative “Bad Science” and its companion follow-up “Bad Pharma.” Ben makes a living speaking and writing about the vagaries of pseudoscience and does so in a wonderfully humorous and accessible way. Having seen him speak in person, I tend to think of him as a kind of Malcolm-Gladwell-meets-Michael-Lewis of junk #science. His examples are relevant, modern, funny (most of the time), pervasive, and cross cultural boundaries. His chapter in “Bad Science” entitled “The Doctor Will Sue You Now” should be read by all high school students as an introduction to science by showing these future adults how governments, physicians, and all of the supposed checks and balances of peer review don’t prevent, but in actuality help enable, the kind of fraud peddled by Dr. Matthias Rath (who claimed he could cure AIDS with his multivitamins and managed to get the support of the South African government for a completely unethical clinical trial on human beings).
Goldacre’s book begins with an offhand remark that I believe merits consideration on this notion of the poor state of science.
“The hole in our culture is gaping: evidence-based medicine, the ultimate applied science, contains some of the cleverest ideas from the past two centuries; it has saved millions of lives, but there has never once been a single exhibit on the subject in London’s Science Museum.”
Bad Science, Preface, p. x (emphasis added).
If Goldacre is correct that medicine is truly an “applied science,” then I feel confident that my claim that science is a mess can be amply proven, because “The Mess” that is modern medicine can be considered an archetype for all of the ills of science. Additionally, the greatest killer of human beings in the United States right now is chronic disease: that is to say, far and away, more Americans die every year as a result of repeated, entirely optional, bad behaviors than from any other single disease or causal factor. We also spend about $0.86 of every healthcare dollar on the various chronic diseases.

As just one salient example: when I was the general counsel for CrossFit, Inc., and we were approaching 7,000 gyms in the United States, we got curious to know what other ‘chains’ were growing as fast. Starbucks and Subway certainly had more locations, but they had started well before us and were no longer opening stores as quickly. After some web searching, we found the only business opening as many new locations was DaVita – kidney dialysis centers – in which Berkshire Hathaway holds a large stake. Diabetes and its associated disease states are a massive drain on the healthcare budget. Then think about adding in coronary artery disease (thanks, govt nutrition guidelines!) and its associated problems, most strokes (thanks, doctors who advocated smoking!), etc. Yet these are diseases of advanced civilization. We continue to pat ourselves on the back at our advanced #science(!) while we kill ourselves with lifestyle behaviors at a rate that approaches the death chambers at Auschwitz.

And speaking of which, one would think that the entirety of WW2 and its aftermath, with the use and application of science to produce more efficient ways of killing, should give us great pause to consider whether our science (with or without the hashtag) might need some recalibration. Yet there is no post-war period of philosophical introspection about science to which one can point. (FN 1)
Richard Smith began his career as a physician in Great Britain, and finished it as the chief editor of the “prestigious” BMJ (previously known as the British Medical Journal) from 1991-2004. He was also head of the BMJ publishing group and worked at BMJ for a total of 25 years, beginning in 1979. Here is his take from 2006 on some ideas for reform, or even abandonment, of the peer review process in medical and scientific journals. Ten years later, however, in a lecture to the International Journal of Epidemiology, he was singing a different tune: blow the entire system up. One might argue that this is simply a case of one man’s bitterness, but Marcia Angell, the former editor-in-chief of the New England Journal of Medicine, arguably the most influential medical journal on the planet, had very similar things to say after her time in academic publishing (in 2009). She was disenchanted enough to write a book entitled “The Truth About the Drug Companies: How They Deceive Us and What to Do About It.” Her conclusion?
“It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.”
Dr. Marcia Angell, 2009
A great place to really get one’s “bad science” on is Retraction Watch. Retraction Watch, as the name implies, began as a simple blog about scientific papers that were subsequently retracted, in part because – just like other publishers, especially newspapers and magazines – science publishers aren’t particularly keen on reporting that something they previously published was completely wrong. More importantly, in addition to burying subsequent retractions, no one is charged with connecting a paper’s retraction to all of the subsequent papers and research that relied upon the conclusions of the original faulty paper. Entire fields of study have been wiped out by faked papers and research.
John Ioannidis wrote a paper attempting to explain mathematically why “Most Published Research Findings Are False.” The sidebar to that article is worth following as Ioannidis responded to the firestorm that his paper generated.
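The nub of his argument fits in a single formula. Write R for the prior odds that a tested relationship is actually true, α for the significance threshold, and 1 − β for the study’s power; then the probability that a ‘positive’ finding is actually true (the positive predictive value) is

$$\mathrm{PPV} = \frac{(1-\beta)\,R}{(1-\beta)\,R + \alpha}$$

Plug in the conventional α = 0.05, a power of 0.8, and long-shot prior odds of 1:25 (numbers chosen here purely for illustration) and you get PPV ≈ 0.39 – most ‘positive’ findings false – and that is before the paper’s additional bias term drags the number down further.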
My own personal favorite is based upon my experience as a criminal defense attorney with government-run forensic laboratories. While this may bum out the viewers who keep the “CSI” franchise and its sponsors in business, I invite anyone to go to a search engine and type in the name of a state agency’s lab – the ones that handle forensic analysis of evidence in criminal trials – and the word “scandal” after it and see what results return. Massachusetts, for example, is still dealing with the fallout from their drug-using forensic chemists. But. Massachusetts. is. hardly. unique. Indeed, one could argue that government attempts to use #science to put people in cages is the perfect jumping-off point for explaining how we got “here” with bad science.
The “Generally Accepted” Test
In Frye v. United States, 293 F. 1013 (D.C. Cir. 1923), a man was convicted of second-degree murder and appealed his conviction. The basis for his appeal was the trial court’s decision to exclude the results from what amounted to an early form of the lie detector, which the defendant had ‘passed’ and wanted to submit to the jury via expert testimony. The trial judge did not allow the evidence, and on appeal the D.C. Circuit upheld the ruling:
Just when a scientific principle or discovery crosses the line between the experimental and demonstrable stages is difficult to define. Somewhere in this twilight zone the evidential force of the principle must be recognized, and while courts will go a long way in admitting expert testimony deduced from a well-recognized scientific principle or discovery, the thing from which the deduction is made must be sufficiently established to have gained general acceptance in the particular field in which it belongs.
We think the systolic blood pressure deception test has not yet gained such standing and scientific recognition among physiological and psychological authorities as would justify the courts in admitting expert testimony deduced from the discovery, development, and experiments thus far made.
Id. at 1014.
This case and its announced legal standard held sway in courts around the Nation for almost seven decades, even surviving the court reforms of the 1950s and the adoption of the Federal Rules of Evidence in 1975. What is important about the decision, however, is that it gave legal definition to “science” in courtrooms across the United States – or at least, that is what subsequent courts did with it. The so-called “Frye standard” was crafted without consulting scientists or, perhaps even more importantly, anyone steeped in the philosophy of science. The standard did nothing more than reify a ‘consensus’ of opinion about some particular technology – in this case, the so-called ‘lie detector’ – at any given moment as the standard of admissibility for ‘scientific evidence’ in federal courts. It is a fundamental misapprehension of what makes something “science.”
The case was, of course, *not* the single causative event of its era that led to the diminution of real science. In point of fact, it may not have been even the second, or third, or fourth most important event, but it was certainly part of a series of events between the World Wars that changed the course of science. (FN 2) The inter-bellum period of the early twentieth century saw radical changes in the prevailing models of economics, politics, public policy, and even of how the universe worked. In the ‘science’ of economics, Marxism rose to ascendance in the same period that eugenics was a very real policy of multiple states in the U.S., born of the (mis)application of Darwin’s theories of evolution to social structures. Einstein and relativity suffered a similar fate.
It is hard to fully appreciate now the impact Einstein’s general theory of relativity – and its “proof” by the eclipse of May 29, 1919 – had on the underlying faith in science and a wide swath of popular culture. As Britannica notes:
The ideas of relativity were widely applied—and misapplied—soon after their advent. Some thinkers interpreted the theory as meaning simply that all things are relative, and they employed this concept in arenas distant from physics. The Spanish humanist philosopher and essayist José Ortega y Gasset, for instance, wrote in The Modern Theme (1923),
‘The theory of Einstein is a marvelous proof of the harmonious multiplicity of all possible points of view. If the idea is extended to morals and aesthetics, we shall come to experience history and life in a new way.’
The revolutionary aspect of Einstein’s thought was also seized upon, as by the American art critic Thomas Craven, who in 1921 compared the break between classical and modern art to the break between Newtonian and Einsteinian ideas about space and time.
Encyclopedia Britannica, “Relativity” entry, accessed 7/8/2019.
David Stove of the University of Sydney makes a compelling case for where the philosophy of science in the western world went awry in his book Popper and After: Four Modern Irrationalists, Pergamon Press, 1982. (FN 3) He also points to the disruption that Einstein’s relativity wrought upon the scientific world:
The crucial event was that one which for almost two hundred years had been felt to be impossible, but which nevertheless took place near the start of this century: the fall of the Newtonian empire in physics. This catastrophe, and the period of extreme turbulence in physics which it inaugurated, changed the entire history of the philosophy of science. Almost all philosophers of the 18th and 19th centuries, it was now clear, had enormously exaggerated the certainty and extent of scientific knowledge.
D.C. Stove, “Popper and After,” p. 51.
Stove takes only 100 pages to fully identify and explicate the source of irrationalism in science, beginning his work with Karl Popper and his fellow scientific irrationalists and tracing their position back to Hume’s extreme inductive skepticism, going so far as to detail the flaw in that skepticism by use of symbolic logic. Stove contends that Hume’s belief that one could draw no conclusions at all about the future from repeated observations in the past was revived by Karl Popper in the aftermath of the “fall” of the Newtonian view of the universe. (FN 4)
In this dependence on Hume, Popper is only an extreme case of a general condition. For the influence of Hume on 20th-century philosophy of science in general is so great that it is scarcely possible to exaggerate it. He looms like a colossus over both of the main tendencies in philosophy of science in the present century: the logical positivist one, and the irrationalist one. His empiricism, his insistence on the fallibility of induction, and on the thesis which flows from those two, of the permanent possibility of the falsity of any scientific theory, are fundamental planks in the platform of both of these schools of thought.
Id. p. 50.
By the time the Supreme Court finally updated the “Frye standard” to address what can qualify as ‘scientific knowledge’ for admissibility, in Daubert v. Merrell Dow Pharmaceuticals, 509 U.S. 579 (1993), the battle had already been lost, as evidenced by Chief Justice Rehnquist’s mewling lament of a partial concurrence:
I defer to no one in my confidence in federal judges; but I am at a loss to know what is meant when it is said that the scientific status of a theory depends on its “falsifiability,” and I suspect some of them will be, too.
I do not doubt that Rule 702 confides to the judge some gatekeeping responsibility in deciding questions of the admissibility of proffered expert testimony. But I do not think it imposes on them either the obligation or the authority to become amateur scientists in order to perform that role.
Daubert, at 600-01. (emphasis and bold added).
Sacré bleu! Why, the very idea that a learned man or woman – and jurist – should know what science is!
AGW being taught in schools as “science” is simply the culmination of a journey that begins with Karl Popper and his intellectual heirs – Kuhn, Lakatos, Feyerabend – coming to dominate the philosophy of science. Some seventy years after “general acceptance” in Frye, the Supreme Court was citing a self-admitted irrationalist, who did not believe that it was possible for human knowledge to be cumulative and advance, for its definition of what “scientific knowledge” is. The opinion itself discusses all manner of non-science as being characteristic of science, or should I say #science, including “peer review” and consensus.
And that, my friends, is how fast we went from a society bursting with innovation and the understanding that science was an attempt to model underlying universal truths, to a level of specialization approached only by insects, and to schools hammering our children with post-modernist ideas about what makes something #science.
FN 1 – There is no post-WW2 ‘scientific reformation,’ for example, with a commitment to use the awesome power of the atom to provide energy for all – there are instead only more bombs that can reach farther faster. This doesn’t even begin to address the science used to make chemical and biological weapons.
FN 2 – In order to avoid claims of plagiarism, I want to be clear that I am hardly the first person to point to the era between the First and Second World Wars as historically significant for the changes that were wrought in this country – and across the world – most specifically in the outcome of the clash of ideas of the day. For example, the influence of Progressivism and the Eugenics movement, as well as Marxism’s influence on Russia and its subsequent Lysenkoism, shows that as early as 1935 people were sounding the alarm on the influence of post-modernist ideas in seemingly diverse fields of human action.
FN 3 – Stove’s work puts me in mind of Albert J. Nock sounding the alarm in 1935 in “Our Enemy, The State,” or “Isaiah’s Job.” Stove also authored a compelling book titled The Plato Cult: And Other Philosophical Follies, Basil Blackwell Ltd., 1991. The book’s jacket has a professor describing Stove as “an entirely worthy member of a distinguished tradition of outrageous curmudgeons.” That phrase immediately invokes Richard Weaver and his “Ideas Have Consequences” (1948) as well.
FN 4 – To be clear, Einstein’s theories did not in any way lessen the utility or power of Newton’s Laws; they only shortened their reach. But Einstein’s ideas about space-time and how gravity would bend light were a departure from Newton’s corpuscular model of light, as well as from what had been the prevailing notion of space as essentially “inert.” Regardless of subsequent interpretations, for many people and pundits in that era, the boundaries of what could be known with certainty certainly seemed to have shrunk.
Malcolm-Gladwell-meets-Michael-Lewis
Bad hair but married well?
I don’t get it.
https://pbs.twimg.com/profile_images/836744595316879363/CWd5Db_i_400x400.jpg
https://i0.lisimg.com/11710400/280full.jpg
Michael Lewis’s Undoing Project is unbearable, but it would be good summer reading for a child before high school. How people misunderstand, how they miscalculate, and how over-confident they are in their beliefs and observations are all useful morals that might be worth the trauma of his meager writing.
I’m an arrogant asshole of underclass origin, so observing how wrong people get things has always been my sport. Little did I know how well this would serve me in life: from government to investment to industrialization, it is good to know that every budget will be too small, every timeline will be too short, and no bill of goods will ever be delivered in total.
Moneyball was good, even if Voros got cut from the movie.
Speaking of which, did he ever visit over here or did he get left behind at TOS?
Moneyball is how I always thought: you’re buying wins, not air. Too bad athletes don’t have assays on the bag like so many bags of fertilizer. Need a shortstop? I’ve got some 12-12-12 right here.
The beginning of Moneyball is the inkling that the experts are shams. It’s not that they don’t know anything; it’s that they pretend to be able to predict everything with almost no accountability.
This is how people are, though. They see a name and think: he’ll be a good general, his father was. So many of the huge, popular swells are just childish bullshit; that various media pump them up or that some scientist can rationalize each urge is no surprise.
I can’t swear to it, but I don’t think VM ever posted here.
+1 igon value
The “bad foundation” portion of this is something I hadn’t spent much time reading about. I’ve spent more time reading about the “bad math” and “bad data sources” sides of things, especially in the social sciences. I don’t trust a single finding that comes from a survey or a poll. They are nearly impossible to get right, even with the best of intentions.
You didn’t vote Boaty McBoatface for Mayor?
Thank you.
1) Anyone interested in this topic should be listening to the podcast Context by Brad Harris. Especially the last episode.
2) Rehnquist was an idiot. But I don’t doubt that he was right that most judges can’t “do science.” In some jurisdictions, the courts keep neutral scientists on staff / on call to advise the judges (or whatever funny furin words they have for judges) on what is good science and what is bad science. This could potentially be problematic in our explicitly adversarial system, but other explicitly adversarial systems seem to manage.
3) Kuhn, of course, shouldn’t be lumped in with the rest on your list. He abhorred the way his work was used.
Read Stove’s “Popper and After” and then talk to me.
‘Falsifiability’ is always going to be the sticking point, because for the majority of humans, there’s few things more unsettling than admitting that there’s a possibility that something you truly and honestly believe, might in fact be ‘wrong.’ In my research, that’s the most fun and exciting part, because that’s my chance to learn something new. But for most people doing non-science related jobs, it’s a non-starter.
It’s a shame that understanding things better isn’t always the goal.
I was struck by the comments on Newton: I don’t think of him as wrong; I think of him as limited. The engineer in me seldom leaves Newton because I’m supremely practical, and he’s relevant to my work most hours of the week. Relativistic physics almost never matters to me, but I do want my GPS to work, so I know some of my cousins in other industries need it from time to time.
We have similar issues with the Bayesian-versus-frequentist debates in statistics. The Bayesians are 115% convinced they’re right, and from an abstract and philosophical standpoint, they are: frequentist models assume things we know not to be true. But I’ve seen more disastrous models because of Bayesian modeling than I ever did with frequentist modeling. The Bayesians usually hand-wave things away with a variation of the “no true Scotsman” argument. But if very well-informed and experienced analysts are constantly having this happen to them, at what point do hard-to-perfect modeling techniques (as opposed to the more straightforward frequentist techniques) start to share the blame?
Statistical models are tools that entail certain assumptions. Those models should be judged by their utility in a particular context. If classical or frequentist models prove themselves a better fit for what you want to know, then they are the best choice for that context. Full stop. And I say this as someone whose work leans Bayesian.
Maybe I’m mistaken, but my understanding is that Newtonian physics isn’t wrong, per se. It’s just a specialized case of Einsteinian physics. The same applies to classical statistics versus Bayesian.
Newtonian physics is enough to cover all of structural, civil, and almost all of mechanical engineering.
I’m not sure I would agree with that. From what I understand, the frequentists and Bayesians differ on the validity of treating a fixed but unknown quantity as random. For frequentists, a quantity can be either fixed or random, but Bayesians argue that through the use of a prior distribution, we can make a sort of cloud of probability to represent that unknown, fixed quantity and work from there. Because the parameters of this probability “cloud” depend on the statistician making a subjective judgment about “degrees of belief,” frequentists argue that it is not an empirical use of statistics.
Again, I view them as different tools for different jobs.
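To make the contrast above concrete, here is a minimal sketch, with made-up numbers, of the two treatments of a coin whose bias p is fixed but unknown:

```python
# A minimal, hypothetical sketch: estimating a coin's fixed-but-unknown
# bias p from 7 heads in 10 flips, the frequentist and Bayesian ways.
from scipy import stats

heads, flips = 7, 10

# Frequentist: p is a fixed constant, not a random variable. Report a
# point estimate and a confidence interval from the sampling distribution.
p_hat = heads / flips  # maximum-likelihood estimate
ci = stats.binomtest(heads, flips).proportion_ci(confidence_level=0.95)

# Bayesian: represent uncertainty about the fixed p with a distribution.
# A flat Beta(1, 1) prior updates to the posterior "cloud of probability."
posterior = stats.beta(1 + heads, 1 + flips - heads)
lo, hi = posterior.interval(0.95)  # 95% credible interval

print(f"frequentist: p_hat = {p_hat:.2f}, 95% CI = ({ci.low:.2f}, {ci.high:.2f})")
print(f"bayesian:    posterior mean = {posterior.mean():.2f}, 95% CrI = ({lo:.2f}, {hi:.2f})")
```

Where that Beta(1, 1) prior comes from is precisely the subjective “degrees of belief” judgment the frequentists object to.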
It is fair to say that the first test of General Relativity entirely encompasses Newton’s Laws, yes.
there’s few things more unsettling than admitting that there’s a possibility that something you truly and honestly believe, might in fact be ‘wrong.’ In my research, that’s the most fun and exciting part, because that’s my chance to learn something new. But for most people doing non-science related jobs, it’s a non-starter.
I have always said, “You (I, anyway) learn more from failure than you do from success.”
Not many people seem to agree.
Not many people seem to agree.
Then they have never accomplished anything. Because success is pretty much always built on the ruins of a shit-ton of failures.
I thought that was basically a truism. But it can be a struggle. I spent a week trying something that did not work, and my first reaction was that I had wasted a week. But in fact I did not waste it; I learned that solution X does not work. Which is valuable.
+1 General Grant
So, it appears that science isn’t quite as ‘rational’ as we’ve been told, huh?
I’m curious about this:
There is no post-WW2 ‘scientific reformation,’ for example, with a commitment to use the awesome power of the atom to provide energy for all – there are instead only more bombs that can reach farther faster. This doesn’t even begin to address the science used to make chemical and biological weapons.
Are you suggesting that a good moral foundation is necessary for proper science? Or am I reading too much into it? It does seem that the intention/motivation of the scientist certainly affects the process and ultimate value of the findings. Norman Borlaug, for instance.
These articles are great for those of us who aren’t scientists. Thanks for writing them.
“Are you suggesting that a good moral foundation is necessary for proper science?”
Perhaps. I think you’re cutting some linguistic corners that matter, so let me state it in a way that I think is more accurate and faithful to my claim:
Science is simply another (one) of (many) human “truth-seeking” endeavors. Music, Art, Literature, Law, and many others have, as a core assumption, the belief in Universals. Contrast this with most humanities and philosophy departments that are all run by people who are – or specifically teaching – explicitly deconstructionist, post-modernist, “relativist” notions of Truth (or maybe that should be ‘truth’ with a small ‘t’). That’s a problem, in my opinion, and what’s happened to Science isn’t an accident.
In other words, if one takes the long view of the historical progress of Science in the West, I would argue that it was not simply because the peoples of western civilization are more ‘science-y’ than others, but rather because there was both an underlying philosophy (Classical Greek) and a methodology (Science) that worked perfectly well together because they both were based upon identical premises regarding the existence of universal truths.
Somewhere in the time between The War to End All Wars and, uhhh, the Next War – the ’20s and ’30s – the academe, in all of its post-structuralist glory, snapped from those notions of Truth. The predominant ideology, spun a million different ways, is simply a will to power. Might makes Right. Consensus = Science. Logical and legal positivism. Eugenics. “Law is what we say it is.”
That’s a pretty long way from Antigone’s claim that Creon may be technically correct, but that he ignores “greater law” by not allowing the burial rites for the dead.
We are thoroughly postmodern now. The hard sciences and engineering are the last to fall, but they are falling.
I suppose this might be as succinct a summary of my observation/point as there is. Richard Weaver says it’s William of Occam’s fault and it’s all been downhill since, but I really see the 20s and 30s as being the point at which the US took the wrong turn, though the Progressive movement as a historical antecedent to that era might move the true “cause” slightly earlier.
The proggie/post-modernist/marxist approach seems pretty fucking lazy. If there are no correct answers, no one ever has to really defend or deliver anything.
Good gig if you can get it, I guess.
It dates to Hume/Kant/Rousseau and their rejection of the Enlightenment. Almost all of philosophy since has been built on the foundations they laid with the conclusions becoming ever more absurd.
And where do we go after that?
You’ll forgive me if I’m not too keen on crossing a bridge designed on the popular feelings of those who built it.
You’re already crossing bridges that are not designed by the most qualified based on skill, but most qualified based on equity of social outcome (I’m looking at you Florida).
I thought exactly about that bridge too. Sad…
This?
https://en.m.wikipedia.org/wiki/Florida_International_University_pedestrian_bridge_collapse
Nani? Add “history” to the list of music, art, and literature and you have “the Humanities”. What do you think they are?
And, for what it’s worth, in my experience a great many philosophers in NA universities are still analytical in orientation. Europe is a whole other ball of wax.
I’m sorry, HM, I don’t understand the “Nani” comment. The list of “Music, Art, Literature, Law” was not intended to be a list of Humanities. In fact, as a matter of categories “Art” would encompass the other three. The later mention of the Humanities as a contradistinction is to highlight the fact that there is a huge swath of majors in universities that purport to be teaching young adults ‘knowledge’ that is premised on the idea that cultures are all “relative” and there is no “better” – an explicit rejection of universals. Which kinda undercuts the entire endeavor, really.
For that I can help! (I think.) It means “what” in Japanese.
I grew up watching stuff like Fist of the North Star
What I’m saying is that the Humanities is the study of music, art, literature, philosophy and history. Within a particular academic discipline, there are various theoretical schools. I do agree that post-structuralist theoretical stances are predominant in certain fields, (e.g., literary criticism, art history, etc.), however, it is not an inherent characteristic of the field.
As for cultural relativity, I would place the blame squarely on anthropology, but that is an article for another time!
I was using Law, Literature, Music, and Art in a different sense, HM. If you look at the sentence right before it, I was referring to each of those as “truth-seeking” human endeavors, not as specific university courses. I was using the “humanities” to mean all of those courses and majors at a university that don’t explicitly lead to a BS. Yes, you can find some law courses in the humanities, or some French or British lit courses, as well as a whole swath of others that don’t fall under that list, too.
And my broad claim is that most of the humanities – indeed, the university system itself – has collapsed because the predominant viewpoint among the teachers and the curriculum is that universals don’t exist.
Have all engineering programs succumbed? No. I don’t think so. But do the engineers have to take courses in the respective humanities departments where they are likely to hear such nonsense? I would respectfully suggest for the majority of folks, yes.
The best piece to support my contention would be Victor Davis Hanson’s book “Who Killed Homer?” It was the Classics professors who killed the Classics. The “why” of that tracks with the claim I’m making.
I think that’s where I was confused, and/or, was my point of contention. A lot of what you are talking about fall under the social sciences (anthro, gender studies, etc.) and are not the humanities.
I think there are some valid arguments against the existence of Platonic ideal forms that don’t necessarily lead to complete skepticism and relativism. Epistemological humility is a good thing. That having been said, if there are some things that are universal, and I think there are, I don’t believe the definitive list has been drawn up yet, so debate about which they are is healthy.
As a Classics minor who was studying when “Black Athena” hit, I know exactly what you and VDH are talking about.
HM – if there are no universals, then our laws about pedophilia are just preferences; quaint, cultural quirks of western civilization. Whenever we (society) start down this road, it’s the absolutists who win. I would contend that the historical evidence is now “in” and it turns out that believing in universals does produce demonstrably better outcomes by almost any metric we want to use. Of course, measurement itself is “comparison against a standard” and (IMO) our ancestors in education lost the fight on “standards.” (See my first article on compulsory education). What those ineluctable Truths are is certainly a subject for ripe debate, across the entirety of human experience, but belief in Truth is essential and I think even higher education has abandoned that notion.
“And my broad claim is that most of the humanities – indeed, the university system itself – has collapsed because the predominant viewpoint among the teachers and the curriculum is that universals don’t exist.”
Which kind of returns us to postmodernism. I would actually argue that postmodernism made some important and positive contributions to history even though I don’t accept all the baggage it brings. I’ve got in mind a short piece on this topic for this site.
“Have all engineering programs succumbed? No. I don’t think so. But do the engineers have to take courses in the respective humanities departments where they are likely to hear such nonsense? I would respectfully suggest for the majority of folks, yes.”
This is also an important point which I have alluded to in the past either here or at TOS. In my experience, most people majoring in the sciences really don’t want to take Gen Ed courses. Their goal is to get through them as quickly and as painlessly as possible. For the most part, this means regurgitating the text or the lectures. In the process, they generally accept what is presented there (unless they come in with strong contrary views) because they don’t really think about those subjects even though they may think very critically about the science they are studying.
Engineering is succumbing in an insidious manner. Engineering is a naturally combative and competitive environment where the best idea should prevail. But when the value of the idea is affected by the social identity of the person who came up with it, all of the profession suffers.
Well, to play devil’s advocate, aren’t they? What empirical proof, based in pure logic, could you show the Greeks to prove, outside of cultural context, that Socrates sliding into the well-oiled thighs of young teenaged boys was against universal morality? The very foundation of so-called “western civilization” was based on the transmission of knowledge and cultural values through the eroticized mentoring relationships of erastai and eromenoi. If western civilization was founded on universal truths, then why the 180 on pederasty?
Oh, let’s not do this, especially since we’re (a) this far into the thread and it makes it impossible to have a coherent give and take, and (b) I already know you don’t believe it, so I’m less inclined to play. It appears to me we’ve both already followed this rabbithole. I doubt I have anything new to offer you; and I feel confident in saying that you wouldn’t vote “not guilty”/nullify on a jury handling a case of an egregious child molester on the basis that you’re putting forth here… So, we’ll save it for the possible in person meet up over a cold beverage?
Publish or perish…
there is a huge swath of majors in universities that purport to be teaching young adults ‘knowledge’ that is premised on the idea that cultures are all “relative” and there is no “better” – an explicit rejection of universals.
Then what makes that professor’s opinion about their topic any more valid than mine? They better get in line, or maybe I’ll just sock ’em in the face.
That they are university professors – QED. Credentialism is another symptom of the underlying disease.
When substance no longer matters – when we vote on science – then you see credentialism, too. They’re both substitutes for actual science.
I suspect that works for a while. But, that’s because they’re drawing from a large reserve of respect for academia built up by far better thinkers than them. Eventually that reserve is exhausted, though. That gets labeled anti-intellectualism when there really isn’t an intellectualism to be anti- about.
wdalasio – I would agree. It’s my sense most profs in colleges today are just gliding on credentials. But credentialism has won out, which I think is part of my point. Look at how people argue nowadays – it’s entirely about what credentials people have, whether or not something is “peer reviewed”, and who else ‘votes’ for the idea. THEN we start to get to the substance of the underlying claims, but by then, the conversation has been sidetracked.
Everything is about power, or more specifically relative power.
The party with more power is morally wrong by comparison to the party with less power, and therefore factually wrong as well.
All under a postmodern view of course.
To some extent, the tenure process provides a temporary stay on credentialism. If you don’t publish sufficiently in that first six-year period, you won’t get tenure. So this requires you to actually produce scholarship. However, that tends to lead to a lull. At least in History, a significant number of faculty never move from associate professor (the immediate post-tenure rank) to full professor. And, if you don’t get that second promotion within five years, the odds are against you ever doing it.
But, then you tend to be stuck within fairly ossified thinking. The old saw used to be that, once you got beyond a certain age in research (maybe 30), you were unlikely to come up with new ideas and, indeed, may even resist new thinking. However, I read somewhere (can’t remember where now) that more recent research has shown it’s not age per se; rather, once you get accepted into the guild, it is harder to think outside the straitjacket.
However, I read somewhere (can’t remember where now) that more recent research has shown it’s not age per se; rather, once you get accepted into the guild, it is harder to think outside the straitjacket.
This makes perfect sense. I was in my mid-40s when I created my most significant and popular invention. But I was an entrepreneur, not a guild member. And I know a lot of smarter guys than I am who did brilliant and fundamental work well into their 60s.
Thank you. Linguistic corner-cutting is a hobby of mine.
For the scientists out there, how do you approach this? Are there Universal Truths? Are we far enough along in the process to know any of them?
Don’t think so; maybe that’s a trap?
I think half of conversations are people bypassing or wrestling over semantic differences.
The most irritating part of post-modern chat is the lack of anchoring. Without givens or at least assumptions, everything in the past four millennia must be re-adjudicated in the comments below every Salon or Slate article.
I think that is the point at which one’s model becomes a “Law”, by definition, no? Isn’t that hierarchy in science an attempt at qualifying the truth-content of a model? Or, to put it slightly differently, isn’t that hierarchy an attempt to qualify just how close the model hews to some underlying Reality, such that the model can be predictive in as many domains as possible?
Not a true scientist, but I offer the following:
If you step in front of a moving bus, you’re going to get run over.
To me the mistake is reducing Hume’s proposition that “our senses cloud our perception of reality” to “we cannot perceive reality”.
I always say that the electron is just a cartoon, a model, but I wouldn’t lick the power supply.
I approach it in a simple and non-philosophical way: are my predictions testable? If so, do they correctly predict my experiments? Have I done proper controls? And then, finally, have I described my research well enough for someone to be able to examine it, punch holes, or (better yet!) replicate?
MOST of the problems in science (or “science”) are in those areas where proper controls are just not feasible – like medicine, epidemiology, climatology. If I say “my model predicts that tetraborane should be linear,” and when I make tetraborane it’s a 3D cage, my model is discarded.
Of course, as a good scientist, I published this result because my model was a common one (Hartree Fock SCF with CI and a few extra basis orbitals, for you quantum chem geeks) and instead of “hey, here’s a new and more stable configuration of tetraborane,” my paper’s theme was “Be careful of using SCF-CI for boron compound calculations, it can easily give you wrong answers.” Less glamorous, but it had the virtue of being true, and people have developed better methods to understand the structures of this fascinating family of molecules.
this fascinating family of molecules
citation needed. :p
Cluster bonding, no classical two electron two-center bonds, cages in different conformations, no linear polymers…
I should also add that if there’s a political stake in the research AND it’s government-funded, it is almost certain to be wrong.
I retreat into method (to the extent that I’m a scientist any more, or ever was…)
“Truth” is a concept that is only so helpful, or maybe I should say it is wildly useful but only in bounded circumstances (like Newtonian physics or the Rational Agent or any number of allegedly universal truths that aren’t). When you really get right down to it, you have propositions that have not yet been proven to be false.
Your model of ‘Truth’ needs to have a defined domain where it isn’t proven to fail (i.e., large bodies for Newtonian physics, institutional actors driven to rational behavior by a market that culls irrational actors), and you need to have a clear sense for where, within that domain, those models fail anyway even though I just said otherwise (i.e., at high speeds, in situations where every possible institution is subject to the same bias [in the mathematical sense]).
That’s a long way of saying *every* Truth has a litany of Yes, Buts proving the Truth wrong. The value of the Truth is tied to how large the domain is where the Yes, Buts don’t matter.
Thanks to all of you.
A very helpful discussion.
truth
I had distilled some definitions from an engineering perspective:
a/ fact > characterization based on the limits of man, such as (not necessarily a correct example): there were 15 known elements in 1700; facts change
b/ theory > relationship demonstrated by a model: Bohr’s model shows the electrons in orbit around a nucleus; theories compete
c/ law > repeatedly demonstrated theory that usefully predicts outcomes, or is at least directionally correct if not quantifiably accurate: laws like gravity can be used in physical analysis and design; supply and demand predicts the directional reaction of a market to a directional change in inputs
d/ hypothesis > testable statement: a mole of sodium will react completely with a mole of chlorine atoms (2Na + Cl2 -> 2NaCl), leaving no excess of either
Truth always seemed emotional; I don’t really use the word.
I hate the term “fact.” I much prefer either trivia or data or observation, depending on the context.
I also really hate the term “ground truth”.
I respect that
but it’s a fact that there are 50 states; things like that don’t bother me
1) That’s trivia, in my head cannon.
2) There are 57 states, shitlord.
3) My head cannon is full of depleted uranium rounds.
“Somewhere in the time between The War to End All Wars and, uhhh, the Next War – the ’20s and ’30s ”
Perhaps not surprisingly, the same time Europe ditched sound money once and for all.
To be fair, holding on to the acoustics for transport made sound money difficult for everyday use.
Okay, I have burned through 25,000 words establishing this land, and getting Dug to a metastable spot where time skips in the narrative are possible. What do I have on my brainstorming sheet?
Dug learns the local language (over the better part of a year).
Dug meets Captain Ottrson, and learns he missed the window for passing around to the eastern side of the continent this year.
Dug has a relationship with Svetlana.
The Amphitheatre subplot introducing unnamed Venator and Drea Wulff. Dug acquires both by betting against Magistrate Husil on the outcome of fights in the amphitheatre.
Cult subplot which puts Dug in personal danger while in the otherwise relatively secure Imperial capital.
Dug arranging for the acquisition of Pygmy Dragons for sale back in his homeland.
Befriending the Emperor and being sent away by the senate with a trade delegation because executing Dug would be bad form.
Several prongs off the cult subplot which bring Dug and Co. to places in the south when they do set out.
The Blind man meeting the Gorgon.
Fighting the “People of Yath” along the coast.
Destruction of a strategic site in the empire’s war in the south.
Exploration of various temples and ruins.
Reaching Atlor and continuing on to Salzheim.
Presenting to ambassadors to the court and generating enough buzz to drive up the prices of his wares.
Profit.
Can I fit that all in?
Yes.
Can I fit that all in?
That’s what she ~~said~~ pondered.
I’ll have to re-read this ’cause I get confused a bit. Hume is an irrationalist (i.e., an empiricist?) but is skeptical of inductive logic? That seems contradictory.
I suspect you didn’t mean that as a reply to my comment.
Yes, but maybe we can play it off.
Arrows are back, and they are spectacular!
I’m so dull I didn’t notice until told.
From last thread: sorry about my thin humor and gross generalizations. FWIW, I’m on record here as a child of Southern, public schools, so I could only be making fun of myself. And I volunteered for winter test repeatedly just for the walleye and was never late to lunch when Michigan Poles rolled out the pierogis on Catholic holidays.
Wild tangents from the reasonable, some brash or cruel, are often viewed as humor here; my form apparently isn’t quite up to par.
No worries! I was responding to the Cajun!
I hadn’t even seen yours. I have been fortunate to do quite a lot of traveling and a lot of eating. Every region has its wonderful and its ‘sweet Jesus, how can people eat this?!?’ It’s what makes America great (again)!
I have two different sets of arrows superimposed on each other. The usual arrows plus some weirdo floating thingie.
Same here, on Brave mobile.
Ditto on FF mobile
This is a very complicated topic for how drunk I am. I blame Cadenhead’s warehouse tasting for this sad state of affairs. Mitch, our guide, was upset that someone smashed into his parked car and then drove away, so he gave us two extra casks. So: 8 tastings of cask-strength whiskey.
I will say this
was the general counsel for CrossFit, Inc – eeewww CrossFit
I had a 39-year-old blended whiskey that even they could not identify. Oldest scotch I ever had.
While you’re setting personal records, there’s a madam in Paisley you might visit.
But how much do you deadlift, Bro?
150 kn with bad form. Less with good form
Kg… as in 2.2 pounds
No, we should all start describing our strength by the amount of newtons we can produce.
No, we should all start describing our strength by the amount of newtons we can ~~produce~~ consume.
Mmmm. Newtons.
I’ve been trying to bring my caloric intake back down again…. now you tempt me.
You bastard.
Our fig tree is about to bear ripe figs. My wife makes homemade newtons that are light-years better. The whole range of critters that scramble when they ripen is astounding: beetles, birds, raccoons, squirrels, you name it. They’re already coming around and testing the figs. Cuts the yield in half because they eat so many.
Back in my youth I could lift a whole pound of fig newtons from the package to my mouth in less than an hour.
Did you know that fig newtons have the loosest USDA standards (with regards to inclusions, i.e. spiders, grasshoppers, etc.) for any baked good?
Extra protein
They have to, because of the nature of figs (specifically, a kind of wasp which breeds inside them and is necessary for pollination).
Is that a 1RM? If so, I suggest more DLs. 😉
“I was the general counsel for CrossFit, Inc.”
Wait, what?
Mrs. Dean did CrossFit and got a lot of benefit from it, prior to the implosion. An inside account of which would be fascinating, but sadly, as their former GC, not one you could probably write.
I probably could get permission to write about aspects of it, as the owner is still a very close friend. It’s also not really any one person’s story and I think it would be incomplete without different perspectives. Mine was just one view of the elephant.
Would you mind summarizing Stove’s argument or pointing to a source when one could read it? I was not aware that the problem of induction had been definitively solved.
Meaning the Raven paradox, or am I hopelessly over my head.
Black swans as well.
Why the logic birds always have to be black?
I believe it’s the same, though I’ve not heard it as the Raven paradox.
And HM, I believe you should read Stove’s “Popper and After” for yourself.
As a terribly short summation, the upshot of Stove’s work on Hume’s inductive skepticism would be: “Not all hypotheses are equal.” That is, there *are* reasons to believe in some hypotheses’ predictability, based upon their past occurrence, more so than others’. But again, that’s a terrible understatement of Stove’s work.
The appearance of swan breeds we didn’t know about isn’t in quite the same category of induction as me knowing that stepping in front of that moving truck ends badly for me, no matter how many times it’s been tried previously or is tried going forward.
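For a sense of how that intuition can be formalized (a standard illustration only, not Stove’s own symbolic-logic argument, which you have to get from the book), Laplace’s rule of succession estimates the probability of success on the next trial, after s successes in n trials, as

$$P(\text{success on trial } n+1 \mid s \text{ successes in } n \text{ trials}) = \frac{s+1}{n+2}$$

A hypothesis that has held in 1,000 of 1,000 past cases is thereby rationally preferred to one that has held in 3 of 10 – contra Hume’s claim that past occurrences license no degree of belief at all.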
Sure, I can buy that. I don’t go to bed worrying if the sun will rise tomorrow (most nights). But like Taleb, I find myself interested in antifragility.
I would be interested in reading the actual logical proof, though, just out of interest.
Looks like it might be here: http://ontology.buffalo.edu/stove/chapter-04.html
Yahtzee. That appears to be it, without doing a line-by-line proofread. I think the Chapters leading up to it are important, but perhaps not necessary for the limited purpose HM might have.
The book has an interesting history; it’s been published under four separate titles, but the one I have is called “Popper and After” and it includes some “updates” Stove made from some of the prior editions. P.83 in my edition contains the entire symbolic logic that Stove uses to describe the various parts of Hume’s irrationalist philosophy of science. I think the entire endeavor is an amazing intellectual achievement, but YMMV.
OT: Sometimes I just don’t even know what to do with other people. I sent out an email giving away some baseball tickets to the department, and I clearly spelled out the fact that they had a week to fill out the survey and that the deadline was 5pm yesterday. I also explained in two separate places that the first 2 tix per person were first-come, first-served, and you could indicate how many “additional tickets (beyond 2)” you wanted in case we had leftovers.
Absolutely clear instructions, super simple process. I got an email at 8:30 last night complaining that my survey was broken (it had expired). I also got an email about how I gave one person an extra pair of tickets (they had put the total number of tickets in the “additional tickets (beyond 2)” field).
I feel like lawyers are generally detail oriented people. This process has really challenged that belief.
I feel like lawyers are generally detail oriented people.
LOL! Good one.
I’ve known a goodly number of lawyers.
Those I’ve known in real life have relied on their legal secretaries/paralegals to do the detail work.
I can’t speak for the crop of Glib Lawyers.
Detail =/= clerical
My paralegal is really good at following a 15 step process to file a patent. She’s not very good at noticing that I cited the wrong statute in a document.
However, paralegals and secretaries are much, much better at following instructions than lawyers.
And? Based upon the sample I had, I still have a strong sense that Lawyers actually reduce the efficacy and accuracy of legal counsel. It could be the individuals in question, but expanding my reference pool gets complicated when I want to compare the accuracy of work product. Something about confidentiality and privilege.
Some people don’t read the instructions, or read full e-mails. They skim, and just pick up on a couple of key words, run with that, and fill in their own details.
Yup, which is why I try to write very brief, skimmable instructions. Apparently, I need to use colors, font changes, and emoji to get people to RTFE.
Hell, I’d just expect them to read the 4-word-long prompt when they’re filling out the damn survey.
*faith in humanity stays pegged at 0*
~~Some people~~ Most people don’t read the instructions, or read full e-mails. They skim, and just pick up on a couple of key words, run with that, and fill in their own details. FTFY
I just think most people don’t read.
People don’t read.
I don’t think people read.
My royalties prove as much.
*sniffles*
I didn’t read any of this.
Perhaps the issue is that most people don’t read.
I had that originally, but decided to be optimistic about humanity today.
The problem, from where I see it, is that people don’t read.
It wasn’t that I didn’t read the article; I just didn’t understand it. You know, an Ed major. I admit to enjoying the discussion, though truthfully I didn’t comprehend a lot of it. Still isn’t too late to learn though, I hope.
Hah, you should try reading questions from students. Most of them can be answered by the phrase, “Read the Syllabus.” Sometimes rendered as RTFS
“What Syllabus?”
/Guy who skipped first day.
The one available online.
Which LMS have you been shackled to? We’ve been enslaved to a heavily nerfed version of Brightspace.
Better or worse than Canvas?
I don’t know. I’ve only ever used Blackboard and Brightspace (aka D2L).
We transitioned to Canvas two years ago from Blackboard. In general, I find Canvas to be more efficient than BB, even though there are one or two things that irritate me (especially in terms of running online quizzes). We’re currently using Zoom for online meetings/office hours, etc.
That said, for my modified flipped classroom approach in Gen Ed, I usually embed a lot of youtube vids rather than creating anything myself.
Canvas is a pretty good choice if your course is media-heavy. Mine is usually just posting a shitload of written stuff, with occasional links to short youtube videos, so it doesn’t matter too much. The gradebook in Canvas can be tricky if you’re not careful in how you tell it to handle missed assignments; this is one main reason I switched my grading to total points rather than percentages.
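The points-versus-percentages trap is easy to see in miniature. A hypothetical sketch (made-up numbers and deliberately simplified logic, not Canvas’s actual gradebook code): if an ungraded assignment silently drops out of a percentage denominator, the course grade inflates, while a total-points scheme never shrinks the denominator.

```python
# Hypothetical sketch: one unsubmitted assignment (hw2), two grading schemes.
scores = {"hw1": 90, "hw2": None, "exam": 80}   # None = never submitted
points = {"hw1": 100, "hw2": 100, "exam": 200}

def percent_ignoring_missing(scores, points):
    """Ungraded items silently drop out of the denominator."""
    graded = [k for k, v in scores.items() if v is not None]
    return 100 * sum(scores[k] for k in graded) / sum(points[k] for k in graded)

def total_points(scores, points):
    """Missing work counts as zero; the denominator never shrinks."""
    return 100 * sum(v or 0 for v in scores.values()) / sum(points.values())

print(percent_ignoring_missing(scores, points))  # ~56.7 -- looks passable
print(total_points(scores, points))              # 42.5  -- the real picture
```

Entering an explicit zero (or deliberately excusing the item) removes the ambiguity under either scheme.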
Sometimes I just don’t even know what to do with other people.
I avoid them, as much as possible.
An atheist, a vegan, and a CrossFitter walk into a bar. I only know because they told everyone within two minutes.
On a serious note, very interesting article.
Entire fields of study have been wiped out by faked papers and research.
And one particular field has survived faked data, faked papers, hidden research, etc…. This is perhaps more maddening.
Especially when you hear things like:
You think you know more than a scientist, or Bill Nye. I Fucking Love Science!
Politics is science, right? It says so right there in the degree program.
I’m not sure how Popper bears the blame for the spread of AGW. Isn’t he the one who insisted on falsifiability? Even Kuhn was really just describing how the scientific community actually works. It’s a separate question whether it should work that way.
The state of science is unwell.
Have you tried rebooting it?
In the reboot, Newton is a Latinx transwoman.
I’ve cycled it repeatedly to no effect, unfortunately.
I’m afraid your product is no longer under support, and we can only offer a best effort, time and materials troubleshooting option.
Check the thermostat.
Speaking of science, let the NYT learn y’all about yet another way in which the Soviet Union was superior. They had better sex AND better science!
Unless you had to build a different type of rocket for each of these, I’m not seeing any achievement there.
Look, they had to redesign the pressure suit at least!
the Soviet Union sent the first woman, the first Asian man, and the first black man into orbit
And?
https://www.npr.org/sections/krulwich/2011/05/02/134597833/cosmonaut-crashed-into-earth-crying-in-rage
open casket.. pure balls
every budget will be too small, every timeline will be too short, and no bill of goods will ever be delivered in total
a percentage of prototypes will fail; be sure you’re not the prototype . . . or driving one
The charred corpse is true enough, but the book’s and NPR’s account of the crying is much less settled.
My guess is that he was a professional until the end. Long, long, read, but really interesting series here.
http://www.thespacereview.com/article/3226/1
This is a great example of the kind of nonsense I’m talking about. As if the skin-color, ethnicity, or chromosome makeup of the person in space had fuck-all to do with the scientific accomplishment, yet here is the Gray Lady, doing it again, telling us how wonderful the Soviets were. I mean, FFS, wasn’t Duranty enough for them? But Nope. Gotta keep carrying the intersectional water and shitting on science at the same time.
I recommend Stephen Hicks’s “Explaining Postmodernism”
It’s just (very) thinly veiled Marxism expanded from pure economics into race, sex, gender, etc. etc. The denial/destruction of objective reality is just a necessary step in the Permanent Revolution.
I’d argue that it’s merely convenient.
At its core, postmodernism is the belief that we cannot “know” anything, therefore everything is relative. However, communism/socialism was sold as being the scientific solution to human utopia.
If we can’t know anything, how can we possibly engineer the perfect society? I think it’s on this point that Zizek departs from the current crop of Marxists. They only seek the revolution these days, because in a postmodern society, there can’t be anything else. It’s only about power.
I’ve come to think of post-modernism – and probably Progressivism before it – as a “spectrum,” like autism. (And I do not in any way mean to denigrate people with autism by the comparison. It’s just a convenient analogy.)
Post-modernists/Leftists come with all levels of understanding about the underlying ideology and philosophical premises that are necessary for it. There are, at one far end, the “useful idiots.” They just buy into the sloganeering. They have almost no intellectual understanding about the tenets of Marxism, for example. They’ve never read Marx, probably would confuse Karl for Groucho, couldn’t tell you what “Wealth of Nations” is, etc. And the same is true across the full range of subjects implicated: from science to philosophy to economics to political theory. At the other end are those who truly understand what the underlying philosophy means, both in theory and in outcome. Nietzsche did, anyway; he predicted the kinds of philosophical gyrations that happened in the 20th century.
An underlying view of the world that is nihilistic leads to bad consequences in human beings. Where there is no such thing as a capital T (transcendent) Truth, there is only Man’s own Ego, manifested through Power, as the determinant. Man as the measure, not of just Man, but of Man’s Universe.
That’s how you get science by voting. Whoever convinces the most people wins, regardless of any moral claim about whether it’s True, or Good, or can be measured against some overarching standard. To measure something is to compare it to an agreed upon (hopefully, universal) standard. That’s what science claims to be doing.
Never change, NYT.
“By Sophie Pinkham”
Sophie PINKham.
Ham usually is pink, yes.
Like, the ham sandwich between her legs?
“first black man into orbit”
Charred cosmonauts don’t count as black.
*cringes, then applauds*
The Soviets made it standard practice to not announce their launches/re-entries until after they were successful simply because so many of the cosmonauts bounced off the atmosphere or were flung out into deep space.
The USSR also sent the first dog in space. What’s your point?
Has she come back down yet?
Re-entry on 4/14/58
We sent the first Ham into space.
William Shatner?
*golf clap*
I was waiting for the Owen Garriott joke.
That’s some real inside baseball right there.
So was the ape name.
There are Blacks in Russia?
Not anymore, they fired the guy into space.
Bro
No, he was Cuban.
Upon returning to Earth, I assume he played for the Yankees.
This guy?
https://roadsandkingdoms.com/2015/superman-of-havana/
:Contemplates making Astros joke:
PS: Hey NYT, you do know that about 75% of Russia is in Asia right? There are millions of Russian shitlord Caucasians that are, technically in the truest sense, Asian.
Shhh… they’re rolling
Since there are (essentially) no people of African descent in Russia, the derogatory term “black” was often used to refer to the people from the Caucus: they sure have darker skin than ethnic Russians. It was interesting to discover that Americans used the word Caucasian as a fancy synonym for white.
from the Caucasus
Just like white Australians calling Aborigines “black” back in the 19th and early 20th centuries.
Caucasian as a fancy synonym for white
Get a load of this uppity cracker.
Origins of the Caucasian people, the Caucasus region, and Racism
https://www.youtube.com/watch?v=BxwOyMXsnNk
Didn’t watch it but I’m sure it’s solid.
Sophie Pinkham
More like Sophie Pinkoham, amirite?
Where can I get more information on the book featured on the main page?
A good example of this is PSSD (post-SSRI sexual dysfunction), a supposed syndrome in which sexual dysfunction continues for months, years, or indefinitely after cessation of SSRI use. There have been sporadic reports of this effect since the late ’90s, but they are exceedingly rare (we’re talking on the order of 1,000 out of the tens of millions of people who take SSRIs). There is a dearth of clinical research on it and no known mechanism for it; the reports are self-reported and anecdotal. However, pressure groups banded together and basically strong-armed drug companies and the FDA into putting warnings on SSRIs. Because of the lack of conclusive evidence of catastrophic sexual dysfunction on a large scale, they have to concoct the notion that it’s a “spectrum” in which, if you occasionally have a hard time keeping an erection or having an orgasm after you quit taking SSRIs, you have been permanently damaged by the drug.
Does the condition exist? I have no idea. Are there some people that are convinced that it does and are suffering? Most definitely and I have great sympathy for them. However, folding to pressure/activist groups and putting (potentially bogus) warnings on drugs for which there is very thin evidence is dangerous in and of itself. It might scare people who really need the drug into non-compliance. It might make doctors reluctant to prescribe them to people who really need them (who needs SSRIs and whether they are overprescribed is a completely separate discussion). It opens the door to even more lawsuits against drug companies and doctors, driving up costs. Bad science and pseudoscience hurts everyone (except for lawyers).
“Regardless of subsequent interpretations, for many people and pundits in that era, the boundaries of what could be known with certainty certainly seemed to have shrunk.”
As some dude pointed out long ago, “The more you know, the more you realize you don’t know.”
The real surprise in this article is that anyone takes Feyerabend seriously.
Also, I think Lakatos gets a bit of a bad rap. I’m only familiar with Proofs and Refutations, but it’s a very good observation of the process of mathematics, if you remember that he condenses 200 years into a few chapters.
Stove puts Feyerabend, Lakatos, Kuhn, and Popper together because of their ‘irrationalist’ view (his well-defended term) of the philosophy of science – i.e., the view that human knowledge is not cumulative. I believe Stove is worth reading simply for this. He also clearly distinguishes Popper as the ‘leader’ in latching onto Hume, with the others no doubt following.
I’m late to the party. Great article Ozymandias.
Thanks, Chafed. Appreciate it.
Good article. Also, many interesting discussions in the comments, although most were over my head.