Carl Gustav Hempel (1905–1997)
Carl Hempel, a German-born philosopher who immigrated to the United States, was one of the most prominent philosophers of science of the twentieth century. His paradox of the ravens, an illustration of the paradoxes of confirmation, has been a constant challenge for theories of confirmation. Together with Paul Oppenheim, he proposed a quantitative account of the degree to which evidence confirms a hypothesis. His deductive-nomological model of scientific explanation put explanations on the same logical footing as predictions: both are deductive arguments. The difference is a matter of pragmatics, namely that in an explanation the argument's conclusion is already taken to be true, whereas in a prediction the aim is to make a convincing case for the conclusion. Hempel also proposed a quantitative measure of the power of a theory to systematize its data. Later in his life, Hempel abandoned the project of an inductive logic. He also emphasized the problems with logical positivism (logical empiricism), especially those concerning the verifiability criterion, and eventually turned away from the logical positivists' analysis of science toward a more empirical analysis in terms of the sociology of science.
Hempel studied mathematics, physics, and philosophy in Göttingen, Heidelberg, Vienna, and Berlin. In Vienna, he attended some of the meetings of the Vienna Circle. With the help of Rudolf Carnap, he managed to leave Europe before the Second World War, coming to Chicago on a research grant secured by Carnap. He later taught at City College and Queens College in New York, at Yale University, and at Princeton University.
Table of Contents
- Life
- Scientific Explanation
- Paradoxes of Confirmation
- Concept Formation in Empirical Science
- The Late Hempel
- References and Further Reading
1. Life

One of the leading members of logical positivism, Hempel was born in Oranienburg, Germany, in 1905. Between March 17 and 24, 1982, he gave an interview to Richard Nolan; the text of that interview was published for the first time in 1988 in Italian translation (Hempel, "Autobiografia intellettuale" in Oltre il positivismo logico, Armando: Rome, Italy, 1988). This interview is the main source of the following biographical notes.
Hempel studied at the Realgymnasium in Berlin and, in 1923, was admitted to the University of Göttingen, where he studied mathematics with David Hilbert and Edmund Landau and symbolic logic with Heinrich Behmann. Hempel was very impressed with Hilbert's program of proving the consistency of mathematics by elementary means; he also studied philosophy, but found mathematical logic more interesting than traditional logic. The same year he moved to the University of Heidelberg, where he studied mathematics, physics, and philosophy. From 1924, Hempel studied at Berlin, where he met Reichenbach, who introduced him to the Berlin Circle. Hempel attended Reichenbach's courses on mathematical logic, the philosophy of space and time, and the theory of probability. He studied physics with Max Planck and logic with John von Neumann.
In 1929, Hempel took part in the first congress on scientific philosophy organized by the logical positivists. He met Carnap and, greatly impressed by him, moved to Vienna, where he attended courses given by Carnap, Schlick, and Waismann and took part in the meetings of the Vienna Circle. In the same years, Hempel qualified as a secondary-school teacher, and in 1934 he gained his doctorate in philosophy at Berlin with a dissertation on the theory of probability. In the same year, he emigrated to Belgium with the help of a friend of Reichenbach, Paul Oppenheim (Reichenbach had introduced Hempel to Oppenheim in 1930). Two years later, Hempel and Oppenheim published the book Der Typusbegriff im Lichte der neuen Logik, on the logical theory of classificatory, comparative, and metric scientific concepts.
In 1937, Hempel was invited—with the help of Carnap—to the University of Chicago as Research Associate in Philosophy. After another brief period in Belgium, Hempel immigrated to the United States in 1939. He taught in New York, at City College (1939-1940) and at Queens College (1940-1948). In those years, he was interested in the theory of confirmation and explanation, and published several articles on that subject: "A Purely Syntactical Definition of Confirmation," in The Journal of Symbolic Logic, 8, 1943; "Studies in the Logic of Confirmation" in Mind, 54, 1945; "A Definition of Degree of Confirmation" (with P. Oppenheim) in Philosophy of Science, 12, 1945; "A Note on the Paradoxes of Confirmation" in Mind, 55, 1946; "Studies in the Logic of Explanation" (with P. Oppenheim) in Philosophy of Science, 15, 1948.
Between 1948 and 1955, Hempel taught at Yale University. His work Fundamentals of Concept Formation in Empirical Science was published in 1952 in the International Encyclopedia of Unified Science. From 1955, he taught at Princeton University. Aspects of Scientific Explanation and Philosophy of Natural Science were published in 1965 and 1966 respectively. After reaching retirement age, he continued teaching at Berkeley, Irvine, Jerusalem, and, from 1976 to 1985, at Pittsburgh. In the meantime, his philosophical perspective was changing, and he moved away from logical positivism: "The Meaning of Theoretical Terms: A Critique of the Standard Empiricist Construal" in Logic, Methodology and Philosophy of Science IV (ed. by Patrick Suppes), 1973; "Valuation and Objectivity in Science" in Physics, Philosophy and Psychoanalysis (ed. by R. S. Cohen and L. Laudan), 1983; "Provisoes: A Problem Concerning the Inferential Function of Scientific Theories" in Erkenntnis, 28, 1988. However, he remained personally attached to logical positivism. In 1975, he undertook the editorship (with W. Stegmüller and W. K. Essler) of the new series of the journal Erkenntnis. Hempel died November 9, 1997, in Princeton Township, New Jersey.
2. Scientific Explanation
Hempel and Oppenheim’s essay "Studies in the Logic of Explanation," published in volume 15 of the journal Philosophy of Science, gave an account of the deductive-nomological explanation. A scientific explanation of a fact is a deduction of a statement (called the explanandum) that describes the fact we want to explain; the premises (called the explanans) are scientific laws and suitable initial conditions. For an explanation to be acceptable, the explanans must be true.
According to the deductive-nomological model, the explanation of a fact is thus reduced to a logical relationship between statements: the explanandum is a consequence of the explanans. This is a common method in the philosophy of logical positivism; pragmatic aspects of explanation are not taken into consideration. Another feature is that an explanation requires scientific laws: facts are explained when they are subsumed under laws. So the question arises about the nature of a scientific law. According to Hempel and Oppenheim, a fundamental theory is defined as a true statement whose quantifiers are not removable (that is, a fundamental theory is not equivalent to a statement without quantifiers) and which does not contain individual constants. Every generalized statement which is a logical consequence of a fundamental theory is a derived theory. The underlying idea of this definition is that a scientific theory deals with general properties expressed by universal statements; references to specific space-time regions or to individual things are not allowed. For example, Newton's laws hold for all bodies, at every time and in every place. But there are laws (for example, Kepler's original laws) that are valid under limited conditions and refer to specific objects, like the Sun and its planets. Hence the distinction between a fundamental theory, which is universal without restrictions, and a derived theory, which can contain references to individual objects. Note that theories are required to be true; implicitly, this means that scientific laws are not mere tools for making predictions but genuine statements that describe the world, a realistic point of view.
There is another intriguing characteristic of the Hempel-Oppenheim model, which is that explanation and prediction have exactly the same logical structure: an explanation can be used to forecast and a forecast is a valid explanation. Finally, the deductive-nomological model accounts also for the explanation of laws; in that case, the explanandum is a scientific law and can be proved with the help of other scientific laws.
Aspects of Scientific Explanation, published in 1965, addresses the problem of inductive explanation, in which the explanans includes statistical laws. According to Hempel, in this kind of explanation the explanans confers only a high degree of probability on the explanandum, which is not a logical consequence of the premises. The following is a very simple example.
The relative frequency of Q with respect to P is r
The object a belongs to P
Thus, a belongs to Q
The conclusion "a belongs to Q" is not certain, for it is not a logical consequence of the two premises. According to Hempel, this explanation gives a degree of probability r to the conclusion. Note that the inductive explanation requires a covering law: the fact is explained by means of scientific laws. But now the laws are not deterministic; statistical laws are admissible. However, in many respects the inductive explanation is similar to the deductive explanation.
- Both deductive and inductive explanations are nomological (that is, they require covering laws).
- The relevant fact is the logical relation between explanans and explanandum: in deductive explanation, the latter is a logical consequence of the former, whereas in inductive explanation the relationship is an inductive one. But in either model, only logical aspects are relevant; pragmatic features are not taken into account.
- The symmetry between explanation and prediction is preserved.
- The explanans must be true.
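The contrast with the deductive case is often displayed schematically. In the usual presentation of Hempel's inductive-statistical schema, a double line marks inductive (rather than deductive) support, with the strength r of that support noted in brackets:

```latex
\begin{array}{l}
p(Q \mid P) = r \\
Pa \\
\hline\hline
Qa
\end{array}
\quad [r]
```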
3. Paradoxes of Confirmation
During his research on confirmation, Hempel formulated the so-called paradoxes of confirmation. Hempel’s paradoxes are a straightforward consequence of the following apparently harmless principles:
- The statement (x)(Rx → Bx) is supported by the statement (Ra & Ba)
- If P1 and P2 are logically equivalent statements and O1 confirms P1, then O1 also supports P2.
Hence, (~Ra & ~Ba), which confirms (x)(~Bx → ~Rx), also supports (x)(Rx → Bx). Now suppose Rx means "x is a raven" and Bx means "x is black." Therefore, "a isn't a raven and isn't black" confirms "all ravens are black." That is, the observation of a red fish supports the hypothesis that all ravens are black.
Note also that the statement (x)((~Rx ∨ Rx) → (~Rx ∨ Bx)) is equivalent to (x)(Rx → Bx). Thus, (~Ra ∨ Ba) supports "all ravens are black" and hence the observation of whatever thing which is not a raven (tennis-ball, paper, elephant, red herring) supports "all ravens are black."
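The equivalence behind the paradox can be checked mechanically. The sketch below (an illustration, not part of Hempel's own apparatus) verifies by truth table that Rx → Bx and ~Bx → ~Rx agree on every assignment, so any positive instance of one generalization counts, by the equivalence principle, as confirming the other:

```python
from itertools import product

def implies(p, q):
    """Material conditional: p -> q is false only when p is true and q is false."""
    return (not p) or q

# Rx -> Bx and its contrapositive ~Bx -> ~Rx agree on every truth-value
# assignment, so the two generalizations are logically equivalent.
for r, b in product((False, True), repeat=2):
    assert implies(r, b) == implies(not b, not r)

# A non-black non-raven (a red herring: r = False, b = False) satisfies
# ~Bx -> ~Rx, and hence, by the equivalence, counts as a positive
# instance of "all ravens are black".
print(implies(not False, not False))  # True
```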
4. Concept Formation in Empirical Science
In his monograph Fundamentals of Concept Formation in Empirical Science (1952), Hempel describes the methods according to which physical quantities are defined. Hempel uses the example of the measurement of mass.
An equal-armed balance is used to determine when two bodies have the same mass and when the mass of one body is greater than that of the other. Two bodies have the same mass if, when they are placed on the pans, the balance remains in equilibrium. If one pan goes down and the other up, the body in the lower pan has the greater mass. From a logical point of view, this procedure defines two relations, say E and G, such that:
- E(a,b) if and only if a and b have the same mass;
- G(a,b) if and only if the mass of a is greater than the mass of b.
The relations E and G satisfy the following conditions:
- E is a reflexive, symmetric and transitive relation.
- G is an irreflexive, asymmetric and transitive relation.
- E and G are mutually exclusive—that is, if E(a,b), then not G(a,b).
- For every a and b, one and only one of the following holds: E(a,b), G(a,b), G(b,a).
Relations E and G thus order bodies by mass: E is an equivalence relation, and G induces a strict linear order on the classes of bodies of equal mass.
The second step consists in defining a function m which satisfies the following three conditions:
- A suitable prototype is chosen, whose mass is one kilogram.
- If E(a,b) then m(a)=m(b).
- There is an operation, say ⊕, which combines two bodies a and b, so that
m(a ⊕ b) = m(a) + m(b)
Conditions (1)-(7) describe the measurement not only of mass but also of length, of time and of every extensive physical quantity. (A quantity is extensive if there is an operation which combines the objects according to condition 7, otherwise it is intensive; temperature, for example, is intensive.)
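These conditions can be exercised in a toy model (an illustrative sketch, not Hempel's own formalism): represent each body as a tuple of component masses, let the combining operation be putting bodies together on one pan, and define E, G, and m from the balance's behavior.

```python
# Toy model of the conditions above: a body is a tuple of component masses
# (a hypothetical representation chosen for illustration); combination is
# tuple concatenation, and the measure m sums the components.

def m(body):
    return sum(body)

def E(a, b):        # same mass: the balance stays in equilibrium
    return m(a) == m(b)

def G(a, b):        # a outweighs b: a's pan goes down
    return m(a) > m(b)

def combine(a, b):  # put both bodies on one pan
    return a + b

prototype = (1.0,)          # the chosen prototype, with mass one kilogram
a, b = (0.4, 0.6), (1.0,)

# Equal balance readings give equal measures: E(a,b) implies m(a) = m(b).
assert E(a, b) and m(a) == m(b)
# The measure is additive over combination: m(a ⊕ b) = m(a) + m(b).
assert m(combine(a, b)) == m(a) + m(b)
# Trichotomy: exactly one of E(a,c), G(a,c), G(c,a) holds for any pair.
c = (0.5,)
assert [E(a, c), G(a, c), G(c, a)].count(True) == 1
print("all conditions satisfied")
```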
5. The Late Hempel
In "The Meaning of Theoretical Terms" (1973), Hempel criticizes an aspect of logical positivism's theory of science: the distinction between observational and theoretical terms and the related problem of the meaning of theoretical terms. According to Hempel, there is an implicit assumption in the neopositivist analysis of science, namely that the meaning of theoretical terms can be explained by linguistic methods. The problem, then, is how to determine a set of statements that gives meaning to theoretical terms. Hempel analyzes the various solutions proposed by logical positivists.
According to Schlick, the meaning of theoretical concepts is determined by the axioms of the theory, which thus play the role of implicit definitions: theoretical terms must be interpreted in a way that makes the theory true. But on such an interpretation, Hempel objects, a scientific theory is always true, for it is true by convention, and thus every scientific theory is a priori true. This, Hempel says, shows that Schlick's account of the meaning of theoretical terms is untenable. The thesis that the meaning of a theoretical term depends on the theory in which that term is used is, according to Hempel, likewise untenable.
Another solution to the problem of the meaning of theoretical terms is based on the rules of correspondence (also known as meaning postulates). They are statements in which observational and theoretical terms occur. Theoretical terms thus gain a partial interpretation by means of observational terms. Hempel raises two objections to this theory. First, he asserts that observational concepts do not exist. When a scientific theory introduces new theoretical terms, they are linked with other old theoretical terms that usually belong to another already consolidated scientific theory. Therefore, the interpretation of new theoretical terms is not based on observational terms but it is given by other theoretical terms that, in a sense, are more familiar than the new ones. The second objection is about the conventional nature of rules of correspondence. A meaning postulate defines the meaning of a concept and therefore, from a logical point of view, it must be true. But every statement in a scientific theory is falsifiable, and thus there is no scientific statement which is beyond the jurisdiction of experience. So, a meaning postulate can be false as well; hence, it is not conventional and thus it does not define the meaning of a concept but it is a genuine physical hypothesis. Meaning postulates do not exist.
"Provisoes: A Problem Concerning the Inferential Function of Scientific Theories," published in Erkenntnis (1988), criticizes another aspect of logical positivism's theory of science: the deductive nature of scientific theories. It is striking that a philosopher famous for his deductive model of scientific explanation came to criticize the deductive conception of science; at the least, this shows Hempel's openness of mind. He argues that it is impossible to derive observational statements from a scientific theory. For example, Newton's theory of gravitation cannot determine the position of the planets, even if the initial conditions are known, for Newton's theory deals only with gravitational force and thus cannot forecast the influences exerted by other kinds of force. In other words, Newton's theory requires an explicit assumption, a proviso in Hempel's terminology, which assures that the planets are subject only to gravitational force. Without such an assumption, it is impossible to apply the theory to the study of planetary motion. But this assumption does not belong to the theory. Therefore, the position of the planets is not determined by the theory alone but by the theory plus appropriate assumptions. Accordingly, not only are observational statements not entailed by the theory, but there are also no deductive links between observational statements. Hence, no observational statement can be a logical consequence of a theory (unless the statement is logically true). This fact has very important consequences.
One of them is that the empirical content of a theory does not exist. Neopositivists defined it as the class of observational statements implied by the theory; but this class is an empty set.
Another consequence is that theoretical terms are not removable from a scientific theory. Known methods for accomplishing this task assert that, for every theory T, it is possible to find a theory T* without theoretical terms such that an observational statement O is a consequence of T* if and only if it is a consequence of T; thus it would be possible to eliminate theoretical terms from T without loss of deductive power. But, Hempel argues, no observational statement O is derivable from T, so that T* lacks empirical consequences.
Suppose T is a falsifiable theory; then there is an observational statement O such that O → ~T. Hence, T → ~O; so T entails the statement ~O. But no observational statement is a consequence of T; thus T is not falsifiable after all, and the same argument shows that no theory is falsifiable. (Note: Hempel's argument is evidently flawed, for, as Popper pointed out, the negation of an observational statement is usually not itself an observational statement.)
Finally, the interpretation of science due to instrumentalism is not tenable. According to such interpretation, scientific theories are rules of inference, that is, they are prescriptions according to which observational statements are derived. Hempel’s analysis shows that these alleged rules of inference are indeed void.
6. References and Further Reading
- Essler, W. K., Putnam, H., & Stegmüller, W. (Eds.). (1985). Epistemology, Methodology, and Philosophy of Science: Essays in Honour of Carl G. Hempel on the Occasion of his 80th Birthday, January 8th, 1985. Dordrecht, Holland: D. Reidel Pub. Co.
- Hempel, C. G. (1934). Beiträge zur logischen Analyse des Wahrscheinlichkeitsbegriffs. Jena: Universitäts-Buchdruckerei G. Neuenhahn.
- Hempel, C. G. (1937). "Le problème de la vérité." Theoria, 3.
- Hempel, C. G. (1942). "The Function of General Laws in History." The Journal of Philosophy, 39.
- Hempel, C. G. (1943). "A Purely Syntactical Definition of Confirmation." The Journal of Symbolic Logic, 8.
- Hempel, C. G. (1945). "Studies in the Logic of Confirmation." Mind, 54.
- Hempel, C. G. (1952). Fundamentals of Concept Formation in Empirical Science. Chicago: University of Chicago Press.
- Hempel, C. G. (1958). "The Theoretician's Dilemma." In H. Feigl, M. Scriven & G. Maxwell (Eds.), Minnesota Studies in the Philosophy of Science (Vol. 2). Minneapolis: University of Minnesota Press.
- Hempel, C. G. (1962). "Deductive-Nomological vs. Statistical Explanation." In H. Feigl & G. Maxwell (Eds.), Minnesota Studies in the Philosophy of Science (Vol. 3). Minneapolis: University of Minnesota Press.
- Hempel, C. G. (1965). Aspects of Scientific Explanation and other Essays in the Philosophy of Science. New York: Free Press.
- Hempel, C. G. (1966). Philosophy of Natural Science. Englewood Cliffs, N.J.: Prentice-Hall.
- Hempel, C. G. (1973). "The Meaning of Theoretical Terms: A Critique of the Standard Empiricist Construal." In Logic, Methodology and Philosophy of Science (Vol. IV). Amsterdam: North-Holland Publishing Company.
- Hempel, C. G. (1981). "Turns in the Evolution of the Problem of Induction." Synthese (46).
- Hempel, C. G. (1983). "Valuation and Objectivity in Science." In R. S. Cohen & L. Laudan (Eds.), Physics, Philosophy and Psychoanalysis. Dordrecht, Holland: D. Reidel Pub. Co.
- Hempel, C. G. (1985). "Thoughts on the Limitation of Discovery by Computer." In K. F. Schaffner (Ed.), Logic of Discovery and Diagnosis in Medicine: University of California Press.
- Hempel, C. G. (1988). "Provisoes: A Problem concerning the Inferential Function of Scientific Theories." Erkenntnis, 28.
- Hempel, C. G., & Oppenheim, P. (1936). Der Typusbegriff im Lichte der neuen Logik. Leiden: A. W. Sijthoff.
- Hempel, C. G., & Oppenheim, P. (1945). "A Definition of Degree of Confirmation." Philosophy of Science, 12.
- Hempel, C. G., & Oppenheim, P. (1948). "Studies in the Logic of Explanation." Philosophy of Science, 15.
- Rescher, N. (Ed.). (1970). Essays in Honor of Carl G. Hempel: A Tribute on the Occasion of his Sixty-fifth Birthday. Dordrecht, Holland: D. Reidel Pub. Co.
- Salmon, W. C. (1989). Four Decades of Scientific Explanation. Minneapolis: University of Minnesota Press.
- Scheffler, I. (1963). The Anatomy of Inquiry. New York: Knopf.
The deductive-nomological model (DN model), also known as Hempel's model, the Hempel–Oppenheim model, the Popper–Hempel model, or the covering law model, is a formal view of scientifically answering the question "Why?". The DN model casts scientific explanation as a deductive structure, one in which the truth of the premises entails the truth of the conclusion, hinged on accurate prediction or postdiction of the phenomenon to be explained.
Because of problems concerning humans' ability to define, discover, and know causality, causality was omitted from initial formulations of the DN model. It was thought to be approximated incidentally through realistic selection of premises that derive the phenomenon of interest from observed starting conditions plus general laws. Still, the DN model formally permitted causally irrelevant factors, and derivability from observations and laws sometimes yielded absurd answers.
Upon the fall of logical empiricism in the 1960s, the DN model was widely seen as a flawed or greatly incomplete model of scientific explanation. Nonetheless, it remained an idealized version of scientific explanation, and one that was rather accurate when applied to modern physics. In the early 1980s, revisions to the DN model emphasized maximal specificity for the relevance of the conditions and axioms stated. Together with Hempel's inductive-statistical model, the DN model forms scientific explanation's covering law model, also termed, from a critical angle, the subsumption theory.
The term deductive distinguishes the DN model's intended determinism from the probabilism of inductive inferences. The term nomological is derived from the Greek word νόμος or nomos, meaning "law". The DN model holds to a view of scientific explanation whose conditions of adequacy (CA)—semiformal but stated classically—are derivability (CA1), lawlikeness (CA2), empirical content (CA3), and truth (CA4).
In the DN model, a law axiomatizes an unrestricted generalization from antecedent A to consequent B by a conditional proposition (if A, then B) and has testable empirical content. A law differs from a mere true regularity (for instance, George always carries only $1 bills in his wallet) by supporting counterfactual claims, thus suggesting what must be true, and by following from a scientific theory's axiomatic structure.
The phenomenon to be explained is the explanandum (an event, law, or theory), whereas the premises that explain it are the explanans: true or highly confirmed statements, containing at least one universal law and entailing the explanandum. Thus, given the explanans as initial, specific conditions C1, C2, ..., Cn plus general laws L1, L2, ..., Ln, the phenomenon E as explanandum is a deductive consequence, and thereby scientifically explained.
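Schematically, a DN explanation is standardly displayed as a derivation, with the explanans above the inference line and the explanandum below it:

```latex
\begin{array}{ll}
C_1, C_2, \ldots, C_n & \text{(statements of particular initial conditions)} \\
L_1, L_2, \ldots, L_n & \text{(general laws)} \\
\hline
E & \text{(description of the phenomenon to be explained)}
\end{array}
```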
Aristotle's model of scientific explanation in the Physics resembles the DN model, an idealized form of scientific explanation. The framework of Aristotelian physics (Aristotelian metaphysics) reflected the perspective of Aristotle, principally a biologist, who, amid the undeniable purposiveness of living things, formalized vitalism and teleology, an intrinsic morality in nature. With the emergence of Copernicanism, however, Descartes introduced mechanical philosophy, and then Newton rigorously posed lawlike explanation, both Descartes and especially Newton shunning teleology within natural philosophy. Around 1740, David Hume staked Hume's fork, highlighted the problem of induction, and found humans ignorant of either necessary or sufficient causality. Hume also highlighted the fact/value gap: what is does not itself reveal what ought to be.
Near 1780, countering Hume's ostensibly radical empiricism, Immanuel Kant highlighted extreme rationalism—as by Descartes or Spinoza—and sought middle ground. Inferring the mind to arrange experience of the world into substance, space, and time, Kant placed the mind as part of the causal constellation of experience and thereby found Newton's theory of motion universally true, yet knowledge of things in themselves impossible. Safeguarding science, then, Kant paradoxically stripped it of scientific realism. Aborting Francis Bacon's inductivist mission to dissolve the veil of appearance to uncover the noumena—metaphysical view of nature's ultimate truths—Kant's transcendental idealism tasked science with simply modeling patterns of phenomena. Safeguarding metaphysics, too, it found the mind's constants holding also universal moral truths, and launched German idealism, increasingly speculative.
Auguste Comte found the problem of induction rather irrelevant since enumerative induction is grounded on the empiricism available, while science's point is not metaphysical truth. Comte found human knowledge had evolved from theological to metaphysical to scientific—the ultimate stage—rejecting both theology and metaphysics as asking questions unanswerable and posing answers unverifiable. Comte in the 1830s expounded positivism—the first modern philosophy of science and simultaneously a political philosophy—rejecting conjectures about unobservables, thus rejecting search for causes. Positivism predicts observations, confirms the predictions, and states a law, thereupon applied to benefit human society. From late 19th century into the early 20th century, the influence of positivism spanned the globe. Meanwhile, evolutionary theory's natural selection brought the Copernican Revolution into biology and eventuated in the first conceptual alternative to vitalism and teleology.
Whereas Comtean positivism posed science as description, logical positivism emerged in the late 1920s and posed science as explanation, perhaps to better unify empirical sciences by covering not only fundamental science—that is, fundamental physics—but special sciences, too, such as biology, psychology, economics, and anthropology. After defeat of National Socialism with World War II's close in 1945, logical positivism shifted to a milder variant, logical empiricism. All variants of the movement, which lasted until 1965, are neopositivism, sharing the quest of verificationism.
Neopositivists led the emergence of the philosophy subdiscipline philosophy of science, which researches such questions and aspects of scientific theory and knowledge. Scientific realism takes scientific theory's statements at face value, according them falsity or truth (probable, approximate, or actual). Neopositivists held the scientific antirealism known as instrumentalism, taking scientific theory to be simply a device for predicting observations and their course, while statements about nature's unobservable aspects are read as elliptical for, or metaphorical of, its observable aspects.
The DN model received its most detailed and influential statement from Carl G. Hempel, first in his 1942 article "The Function of General Laws in History," and more explicitly with Paul Oppenheim in their 1948 article "Studies in the Logic of Explanation." A leading logical empiricist, Hempel embraced the Humean empiricist view that humans observe sequences of sensory events, not cause and effect, since causal relations and causal mechanisms are unobservable. The DN model bypasses causality beyond mere constant conjunction: first an event like A, then always an event like B.
Hempel held natural laws (empirically confirmed regularities) to be satisfactory and, if formulated realistically, to approximate causality. In later articles, Hempel defended the DN model and proposed probabilistic explanation by the inductive-statistical model (IS model). The DN model and the IS model (in which the probability must be high, at least 50%) together form the covering law model, as named by a critic, William Dray. Derivation of statistical laws from other statistical laws goes to the deductive-statistical model (DS model). Georg Henrik von Wright, another critic, named the totality the subsumption theory.
Amid the failure of neopositivism's fundamental tenets, Hempel in 1965 abandoned verificationism, signaling neopositivism's demise. From 1930 onward, Karl Popper had been attacking positivism by asserting falsificationism, which Popper claimed had killed positivism, although, paradoxically, Popper was commonly mistaken for a positivist. Even Popper's 1934 book embraces the DN model, which remained widely accepted as the model of scientific explanation for as long as physics remained the model of science examined by philosophers of science.
In the 1940s, filling the vast observational gap between cytology and biochemistry, cell biology arose and established the existence of cell organelles besides the nucleus. Launched in the late 1930s, the molecular biology research program cracked the genetic code in the early 1960s and then converged with cell biology as cell and molecular biology; its breakthroughs and discoveries defied the DN model by arriving in quest not of lawlike explanation but of causal mechanisms. Biology became a new model of science, and the special sciences were no longer thought defective for lacking the universal laws borne by physics.
In 1948, when explicating DN model and stating scientific explanation's semiformal conditions of adequacy, Hempel and Oppenheim acknowledged redundancy of the third, empirical content, implied by the other three—derivability, lawlikeness, and truth. In the early 1980s, upon widespread view that causality ensures the explanans' relevance, Wesley Salmon called for returning cause to because, and along with James Fetzer helped replace CA3 empirical content with CA3' strict maximal specificity.
Salmon introduced causal mechanical explanation, never clarifying how it proceeds, yet reviving philosophers' interest in such explanation. Via shortcomings of Hempel's inductive-statistical model (IS model), Salmon introduced the statistical-relevance model (SR model). Although the DN model remained an idealized form of scientific explanation, especially in applied sciences, most philosophers of science consider it flawed for excluding many types of explanation generally accepted as scientific.
As theory of knowledge, epistemology differs from ontology, which is a subbranch of metaphysics, theory of reality. Ontology poses which categories of being—what sorts of things exist—and so, although a scientific theory's ontological commitment can be modified in light of experience, an ontological commitment inevitably precedes empirical inquiry.
Natural laws, so called, are statements of humans' observations, thus are epistemological—concerning human knowledge—the epistemic. Causal mechanisms and structures existing putatively independently of minds exist, or would exist, in the natural world's structure itself, and thus are ontological, the ontic. Blurring epistemic with ontic—as by incautiously presuming a natural law to refer to a causal mechanism, or to trace structures realistically during unobserved transitions, or to be true regularities always unvarying—tends to generate a category mistake.
Discarding ontic commitments, including causality per se, the DN model permits a theory's laws to be reduced to—that is, subsumed by—a more fundamental theory's laws. In the DN model, the higher theory's laws are explained by the lower theory's laws. Thus, the epistemic success of Newtonian theory's law of universal gravitation is reduced to—and thereby explained by—Einstein's general theory of relativity, although Einstein's theory discards Newton's ontic claim that universal gravitation's success in predicting Kepler's laws of planetary motion operates through a causal mechanism of a directly attractive force instantly traversing absolute space in absolute time.
The covering law model reflects neopositivism's vision of empirical science, a vision interpreting or presuming the unity of science, whereby all empirical sciences are either fundamental science—that is, fundamental physics—or special sciences, whether astrophysics, chemistry, biology, geology, psychology, economics, and so on. All special sciences would network via the covering law model. By stating boundary conditions while supplying bridge laws, any special law would reduce to a lower special law, ultimately reducing—in principle although generally not in practice—to fundamental science. (Boundary conditions are the specified conditions under which the phenomena of interest occur. Bridge laws translate terms of one science into terms of another.)
By the DN model, if one asks, "Why is that shadow 20 feet long?", another can answer, "Because that flagpole is 15 feet tall, the Sun is at x angle, and the laws of electromagnetism hold". Yet by the problem of symmetry, if one instead asked, "Why is that flagpole 15 feet tall?", another could answer, "Because that shadow is 20 feet long, the Sun is at x angle, and the laws of electromagnetism hold"—likewise a deduction from observed conditions and scientific laws, but an answer clearly incorrect. By the problem of irrelevance, if one asks, "Why did that man not get pregnant?", one could in part answer, among the explanans, "Because he took birth control pills"—if he factually took them—plus the law of their preventing pregnancy, since the covering law model poses no restriction barring that observation from the explanans.
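The symmetry problem can be made concrete with a little trigonometry: the same law of light's rectilinear propagation deduces the shadow from the pole just as readily as the pole from the shadow, yet only the first deduction counts intuitively as an explanation. A minimal sketch (function names and the Sun's elevation angle are illustrative assumptions, chosen so a 15 ft pole casts a 20 ft shadow):

```python
import math

def shadow_length(pole_height_ft, sun_elevation_deg):
    """Deduce shadow length from pole height and the Sun's elevation."""
    return pole_height_ft / math.tan(math.radians(sun_elevation_deg))

def pole_height(shadow_ft, sun_elevation_deg):
    """The same law run 'backwards': deduce pole height from the shadow."""
    return shadow_ft * math.tan(math.radians(sun_elevation_deg))

# Elevation angle at which a 15 ft pole casts a 20 ft shadow:
elev = math.degrees(math.atan(15 / 20))
print(round(shadow_length(15, elev), 1))  # 20.0 -- an explanation
print(round(pole_height(20, elev), 1))    # 15.0 -- a valid deduction, not an explanation
```

Both calls are equally valid DN deductions from laws plus conditions, which is exactly why the model, lacking any causal asymmetry, cannot distinguish them.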
Many philosophers have concluded that causality is integral to scientific explanation. The DN model offers a necessary condition of a causal explanation—successful prediction—but not sufficient conditions, since a universal regularity can include spurious relations or simple correlations—for instance, Z always following Y, yet not Z because of Y, but rather Y and then Z both as effects of X. By relating the temperature, pressure, and volume of a gas within a container, Boyle's law permits prediction of an unknown variable—volume, pressure, or temperature—but does not explain why to expect it unless one adds, perhaps, the kinetic theory of gases.
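Boyle's-law prediction of the kind described can be sketched as a one-line computation (the function name and the sample figures are illustrative assumptions): at fixed temperature, p1·V1 = p2·V2, so any one unknown follows from the other three—prediction without any account of why.

```python
def boyle_unknown_volume(p1_atm, v1_l, p2_atm):
    """Boyle's law at fixed temperature: p1*V1 = p2*V2, so V2 = p1*V1/p2."""
    return p1_atm * v1_l / p2_atm

# Doubling the pressure on 10 L of gas at 1 atm halves its volume:
print(boyle_unknown_volume(1.0, 10.0, 2.0))  # 5.0
```

The regularity licenses the prediction, but only an added theory, such as the kinetic theory of gases, says why molecules confined to half the volume strike the walls twice as often.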
Scientific explanations increasingly pose not determinism's universal laws but probabilism's chance, ceteris paribus laws. Smoking's contribution to lung cancer fails even the inductive-statistical model (IS model), which requires a probability over 0.5 (50%). (Probability standardly ranges from 0 (0%) to 1 (100%).) An applied science that uses statistics to seek associations between events, epidemiology cannot show causality, but it consistently found a higher incidence of lung cancer in smokers versus otherwise similar nonsmokers, although the proportion of smokers who develop lung cancer is modest. Versus nonsmokers, however, smokers as a group showed over 20 times the risk of lung cancer, and in conjunction with basic research, a consensus followed that smoking had been scientifically explained as a cause of lung cancer, responsible for some cases that without smoking would not have occurred—a probabilistic counterfactual causality.
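The epidemiological point can be sketched numerically (the cohort counts below are hypothetical, chosen only to match the article's orders of magnitude): the risk of lung cancer among smokers stays well under the IS model's 0.5 threshold, even while the relative risk versus nonsmokers exceeds 20.

```python
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: incidence among the exposed over incidence among the unexposed."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical cohort: 150 lung-cancer cases per 1,000 smokers,
# 7 per 1,000 nonsmokers. The probability among smokers (0.15)
# fails the IS model's > 0.5 requirement...
risk_smokers = 150 / 1000
print(risk_smokers > 0.5)  # False

# ...yet the relative risk exceeds 20:
print(round(relative_risk(150, 1000, 7, 1000), 1))  # 21.4
```

This is why a counterfactual, probabilistic notion of causality, rather than a high inductive probability, carried the scientific consensus.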
Through lawlike explanation, fundamental physics—often perceived as the fundamental science—has proceeded through intertheory relation and theory reduction, thereby resolving experimental paradoxes to great historical success, resembling the covering law model. In the early 20th century, Ernst Mach as well as Wilhelm Ostwald had resisted Ludwig Boltzmann's reduction of thermodynamics—and thereby Boyle's law—to statistical mechanics partly because it rested on the kinetic theory of gases, hinging in turn on the atomic/molecular theory of matter. Mach as well as Ostwald viewed matter as a variant of energy, and molecules as mathematical illusions, as even Boltzmann thought possible.
In 1905, via statistical mechanics, Albert Einstein predicted the phenomenon of Brownian motion—unexplained since its report in 1827 by botanist Robert Brown. Soon, most physicists accepted that atoms and molecules were unobservable yet real. Also in 1905, Einstein explained the electromagnetic field's energy as distributed in particles, a claim doubted until it helped resolve atomic theory in the 1910s and 1920s. Meanwhile, all known physical phenomena were gravitational or electromagnetic, whose two theories misaligned. Yet belief in aether as the source of all physical phenomena was virtually unanimous. At experimental paradoxes, physicists modified the aether's hypothetical properties.
Finding the luminiferous aether a useless hypothesis, Einstein in 1905 a priori unified all inertial reference frames to state the special principle of relativity, which, by omitting aether, converted space and time into relative phenomena whose relativity aligned electrodynamics with the Newtonian principle of Galilean relativity or invariance. Originally epistemic or instrumental, this was interpreted as ontic or realist—that is, a causal mechanical explanation—and the principle became a theory, refuting Newtonian gravitation. By predictive success in 1919, general relativity apparently overthrew Newton's theory, a revolution in science resisted by many yet fulfilled around 1930.
In 1925, Werner Heisenberg as well as Erwin Schrödinger independently formalized quantum mechanics (QM). Despite clashing explanations, the two theories made identical predictions. Paul Dirac's 1928 model of the electron was set to special relativity, launching QM into the first quantum field theory (QFT), quantum electrodynamics (QED). From it, Dirac interpreted and predicted the electron's antiparticle, soon discovered and termed the positron, but QED failed electrodynamics at high energies. Elsewhere and otherwise, the strong nuclear force and the weak nuclear force were discovered.
In 1941, Richard Feynman introduced QM's path integral formalism, which, if taken toward interpretation as a causal mechanical model, clashes with Heisenberg's matrix formalism and with Schrödinger's wave formalism, although all three are empirically identical, sharing predictions. Next, working on QED, Feynman sought to model particles without fields and to find the vacuum truly empty. As each known fundamental force is apparently an effect of a field, Feynman failed. Louis de Broglie's wave–particle duality had rendered atomism—indivisible particles in a void—untenable, and had highlighted the very notion of discontinuous particles as self-contradictory.
Meeting in 1947, Freeman Dyson, Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga soon introduced renormalization, a procedure converting QED into physics' most predictively precise theory, subsuming chemistry, optics, and statistical mechanics. QED thus won physicists' general acceptance. Paul Dirac criticized its need for renormalization as showing its unnaturalness, and called for an aether. In 1947, Willis Lamb had found unexpected motion of electron orbitals, shifted since the vacuum is not truly empty. Yet emptiness was catchy, abolishing aether conceptually, and physics proceeded ostensibly without it, even suppressing it. Meanwhile, "sickened by untidy math, most philosophers of physics tend to neglect QED".
Physicists have feared even mentioning aether, renamed the vacuum, which—as such—is nonexistent. General philosophers of science commonly believe, rather, that aether is fictitious, "relegated to the dustbin of scientific history ever since" 1905 brought special relativity. Einstein was noncommittal about aether's nonexistence; he simply called it superfluous. Abolishing Newtonian motion for electrodynamic primacy, however, Einstein inadvertently reinforced aether, and to explain motion he was led back to aether in general relativity. Yet resistance to relativity theory became associated with earlier theories of aether, whose word and concept became taboo. Einstein explained special relativity's compatibility with an aether, but the Einstein aether, too, was opposed. Objects became conceived as pinned directly on space and time by abstract geometric relations lacking a ghostly or fluid medium.
By 1970, QED along with the weak nuclear field was reduced to electroweak theory (EWT), and the strong nuclear field was modeled as quantum chromodynamics (QCD). Comprising EWT, QCD, and the Higgs field, this Standard Model of particle physics is an "effective theory", not truly fundamental. As QCD's particles are considered nonexistent in the everyday world, QCD especially suggests an aether, routinely found by physics experiments to exist and to exhibit relativistic symmetry. Confirmation of the Higgs particle, modeled as a condensation within the Higgs field, corroborates aether, although physics need not state or even include aether. Organizing regularities of observations—as in the covering law model—physicists find the quest to discover aether superfluous.
In 1905, from special relativity, Einstein deduced mass–energy equivalence: particles are variant forms of distributed energy, which is how particles colliding at vast speeds experience that energy's transformation into mass, producing heavier particles, although physicists' talk promotes confusion. As "the contemporary locus of metaphysical research", QFTs pose particles not as existing individually but as excitation modes of fields, the particles and their masses being states of aether, apparently unifying all physical phenomena as the more fundamental causal reality, as long ago foreseen. Yet a quantum field is an intricate abstraction—a mathematical field—virtually inconceivable as a classical field's physical properties. Nature's deeper aspects, still unknown, might elude any possible field theory.
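The equivalence itself is the simplest formula in the section, E = mc², and a quick sketch shows the scale involved (the function name is an illustrative assumption; c is the speed of light in vacuum, about 2.998 × 10⁸ m/s):

```python
def rest_energy_joules(mass_kg, c=2.998e8):
    """Mass-energy equivalence: E = m * c**2."""
    return mass_kg * c**2

# One gram of matter corresponds to roughly 9e13 joules:
print(f"{rest_energy_joules(1e-3):.2e}")  # 8.99e+13
```

The enormous factor c² is why even the modest mass transformed in particle collisions corresponds to vast energies, and vice versa.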
Though the discovery of causality is popularly thought science's aim, the search for it was shunned by the Newtonian research program, even more Newtonian than was Isaac Newton. By now, most theoretical physicists infer that the four known fundamental interactions would reduce to superstring theory, whereby atoms and molecules, after all, are energy vibrations holding mathematical, geometric forms. Given the uncertainties of scientific realism, some conclude that the concept causality raises the comprehensibility of scientific explanation, and thus is key to folk science, but compromises the precision of scientific explanation, and so is dropped as a science matures. Even epidemiology is maturing to heed the severe difficulties with presumptions about causality. The covering law model is among Carl G Hempel's admired contributions to the philosophy of science.
- Woodward, "Scientific explanation", §2 "The DN model", in SEP, 2011.
- James Fetzer, ch 3 "The paradoxes of Hempelian explanation", in Fetzer, ed, Science, Explanation, and Rationality (Oxford U P, 2000), p 113.
- Montuschi, Objects in Social Science (Continuum, 2003), pp 61–62.
- Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 2, subch "DN model of explanation and HD model of theory development", pp 25–26.
- Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 2, subch "Axiomatic account of theories", pp 27–29.
- Suppe, "Afterword—1977", "Introduction", §1 "Swan song for positivism", §1A "Explanation and intertheoretical reduction", pp 619–24, in Suppe, ed, Structure of Scientific Theories, 2nd edn (U Illinois P, 1977).
- Kenneth F Schaffner, "Explanation and causation in biomedical sciences", pp 79–125, in Laudan, ed, Mind and Medicine (U California P, 1983), p 81.
- G Montalenti, ch 2 "From Aristotle to Democritus via Darwin", in Ayala & Dobzhansky, eds, Studies in the Philosophy of Biology (U California P, 1974).
- In the 17th century, Descartes as well as Isaac Newton firmly believed in God as nature's designer and thereby firmly believed in natural purposiveness, yet found teleology to be outside science's inquiry (Bolotin, Approach to Aristotle's Physics, pp 31–33). By 1650, formalizing heliocentrism and launching mechanical philosophy, Cartesian physics overthrew geocentrism as well as Aristotelian physics. In the 1660s, Robert Boyle sought to lift chemistry as a new discipline from alchemy. Newton more especially sought the laws of nature—simply the regularities of phenomena—whereby Newtonian physics, reducing celestial science to terrestrial science, ejected from physics the vestige of Aristotelian metaphysics, thus disconnecting physics from alchemy/chemistry, which then followed its own course, yielding chemistry around 1800.
- Nicknames for principles attributed to Hume—Hume's fork, problem of induction, Hume's law—were not created by Hume but by later philosophers labeling them for ease of reference.
- By Hume's fork, the truths of mathematics and logic as formal sciences are universal through "relations of ideas"—simply abstract truths—thus knowable without experience. On the other hand, the claimed truths of empirical sciences are contingent on "fact and real existence", knowable only upon experience. By Hume's fork, the two categories never cross. Any treatises containing neither can contain only "sophistry and illusion". (Flew, Dictionary, "Hume's fork", p 156.)
- Not privy to the world's either necessities or impossibilities, but by force of habit or mental nature, humans experience sequence of sensory events, find seeming constant conjunction, make the unrestricted generalization of an enumerative induction, and justify it by presuming uniformity of nature. Humans thus attempt to justify a minor induction by adding a major induction, both logically invalid and unverified by experience—the problem of induction—how humans irrationally presume discovery of causality. (Chakraborti, Logic, p 381; Flew, Dictionary, "Hume", p 156.)
- For more discursive discussions of types of causality—necessary, sufficient, necessary and sufficient, component, sufficient component, counterfactual—see Rothman & Greenland, Parascandola & Weed, as well as Kundi. Following is more direct elucidation:
A necessary cause is a causal condition required for an event to occur. A sufficient cause is a causal condition complete to produce an event. Necessary is not always sufficient, however, since other causal factors—that is, other component causes—might be required to produce the event. Conversely, a sufficient cause is not always a necessary cause, since differing sufficient causes might likewise produce the event. Strictly speaking, a sufficient cause cannot be a single factor, as any causal factor must act causally through many other factors. And although a necessary cause might exist, humans cannot verify one, since humans cannot check every possible state of affairs. (Language can state necessary causality as a tautology—a statement whose terms' arrangement and meanings render it logically true by mere definition—which, as an analytic statement, is uninformative about the actual world. A statement referring to and contingent on the world's actualities is a synthetic statement, rather.)
Sufficient causality is more accurately sufficient component causality—a complete set of component causes interacting within a causal constellation—which, however, is beyond humans' capacity to fully discover. Yet humans tend intuitively to conceive of causality as necessary and sufficient—a single factor both required and complete—the one and only cause, the cause. One may so view flipping a light switch. The switch's flip was not sufficient cause, however, but contingent on countless factors—intact bulb, intact wiring, circuit box, bill payment, utility company, neighborhood infrastructure, engineering of technology by Thomas Edison and Nikola Tesla, explanation of electricity by James Clerk Maxwell, harnessing of electricity by Benjamin Franklin, metal refining, metal mining, and on and on—while, whatever the tally of events, nature's causal mechanical structure remains a mystery.
From a Humean perspective, the light's putative inability to come on without the switch's flip is neither a logical necessity nor an empirical finding, since no experience ever reveals that the world either is or will remain universally uniform as to the aspects appearing to bind the switch's flip as the necessary event for the light's coming on. If the light comes on without the switch's flip, surprise will affect one's mind, but one's mind cannot know that the event violated nature. As just a mundane possibility, an activity within the wall could have connected the wires and completed the circuit without the switch's flip.
Though apparently enjoying the scandals that trailed his own explanations, Hume was very practical and his skepticism was quite uneven (Flew p 156). Although Hume rejected orthodox theism and sought to reject metaphysics, Hume supposedly extended Newtonian method to the human mind, which Hume, in a sort of anti-Copernican move, placed as the pivot of human knowledge (Flew p 154). Hume thus placed his own theory of knowledge on par with Newton's theory of motion (Buckle pp 70–71, Redman pp 182–83, Schliesser § abstract). Hume found enumerative induction an unavoidable custom required for one to live (Gattei pp 28–29). Hume found constant conjunction to reveal a modest causality type: counterfactual causality. Silent as to causal role—whether necessity, sufficiency, component strength, or mechanism—counterfactual causality is simply that alteration of a factor prevents or produces the event of interest.
- Kundi M (2006). "Causality and the interpretation of epidemiologic evidence". Environmental Health Perspectives. 114 (7): 969–974. doi:10.1289/ehp.8297. PMC 1513293. PMID 16835045.
- Hume noted that authors ubiquitously continue for some time stating facts and then suddenly switch to stating norms—supposedly what should be—with barely an explanation. Yet such values, as in ethics or aesthetics or political philosophy, are not found true merely by stating facts: is does not itself reveal ought. Hume's law is the principle that the fact/value gap is unbridgeable—that no statements of facts can ever justify norms—although Hume himself did not state that. Rather, some later philosophers found Hume to merely stop short of stating it, but to have communicated it. Anyway, Hume found that humans acquired morality through experience by communal reinforcement. (Flew, Dictionary, "Hume's law", p 157 & "Naturalistic fallacy", pp 240–41; Wootton, Modern Political Thought, p 306.)
- Kant inferred that the mind's constants arrange space holding Euclidean geometry—like Newton's absolute space—while objects interact temporally as modeled in Newton's theory of motion, whose law of universal gravitation is a truth synthetic a priori, that is, contingent on experience, indeed, but known universally true without universal experience. Thus, the mind's innate constants cross the tongs of Hume's fork and lay Newton's universal gravitation as a priori truth.
- Chakravartty, "Scientific realism", §1.2 "The three dimensions of realist commitment", in SEP, 2013: "Semantically, realism is committed to a literal interpretation of scientific claims about the world. In common parlance, realists take theoretical statements at 'face value'. According to realism, claims about scientific entities, processes, properties, and relations, whether they be observable or unobservable, should be construed literally as having truth values, whether true or false. This semantic commitment contrasts primarily with those of so-called instrumentalist epistemologies of science, which interpret descriptions of unobservables simply as instruments for the prediction of observable phenomena, or for systematizing observation reports. Traditionally, instrumentalism holds that claims about unobservable things have no literal meaning at all (though the term is often used more liberally in connection with some antirealist positions today). Some antirealists contend that claims involving unobservables should not be interpreted literally, but as elliptical for corresponding claims about observables".
- Challenges to scientific realism are captured succinctly by Bolotin, Approach to Aristotle's Physics (SUNY P, 1998), pp 33–34, commenting about modern science, "But it has not succeeded, of course, in encompassing all phenomena, at least not yet. For its laws are mathematical idealizations, idealizations, moreover, with no immediate basis in experience and with no evident connection to the ultimate causes of the natural world. For instance, Newton's first law of motion (the law of inertia) requires us to imagine a body that is always at rest or else moving aimlessly in a straight line at a constant speed, even though we never see such a body, and even though according to his own theory of universal gravitation, it is impossible that there can be one. This fundamental law, then, which begins with a claim about what would happen in a situation that never exists, carries no conviction except insofar as it helps to predict observable events. Thus, despite the amazing success of Newton's laws in predicting the observed positions of the planets and other bodies, Einstein and Infeld are correct to say, in The Evolution of Physics, that 'we can well imagine another system, based on different assumptions, might work just as well'. Einstein and Infeld go on to assert that 'physical concepts are free creations of the human mind, and are not, however it may seem, uniquely determined by the external world'. To illustrate what they mean by this assertion, they compare the modern scientist to a man trying to understand the mechanism of a closed watch. If he is ingenious, they acknowledge, this man 'may form some picture of a mechanism which would be responsible for all the things he observes'. But they add that he 'may never quite be sure his picture is the only one which could explain his observations. He will never be able to compare his picture with the real mechanism and he cannot even imagine the possibility or the meaning of such a comparison'.
In other words, modern science cannot claim, and it will never be able to claim, that it has the definite understanding of any natural phenomenon".
- Whereas a hypothetical imperative is practical, simply what one ought to do if one seeks a particular outcome, the categorical imperative is morally universal, what everyone always ought to do.
- Bourdeau, "Auguste Comte", §§ "Abstract" & "Introduction", in Zalta, ed, SEP, 2013.
- Comte, A General View of Positivism (Trübner, 1865), pp 49–50, including the following passage: "As long as men persist in attempting to answer the insoluble questions which occupied the attention of the childhood of our race, by far the more rational plan is to do as was done then, that is, simply to give free play to the imagination. These spontaneous beliefs have gradually fallen into disuse, not because they have been disproved, but because humankind has become more enlightened as to its wants and the scope of its powers, and has gradually given an entirely new direction to its speculative efforts".
- Flew, Dictionary (St Martin's, 1984), "Positivism", p 283.
- Woodward, "Scientific explanation", §1 "Background and introduction", in SEP, 2011.
- Friedman, Reconsidering Logical Positivism (Cambridge U P, 1999), p xii.
- Any positivism placed in the 20th century is generally neo, although there was Ernst Mach's positivism nearing 1900, and a general positivistic approach to science—traceable to the inductivist trend from Bacon at 1620, the Newtonian research program at 1687, and Comtean positivism at 1830—that continues in a vague but usually disavowed sense within popular culture and some sciences.
- Neopositivists are sometimes called "verificationists".
- Chakravartty, "Scientific realism", §4 "Antirealism: Foils for scientific realism", §4.1 "Empiricism", in SEP, 2013: "Traditionally, instrumentalists maintain that terms for unobservables, by themselves, have no meaning; construed literally, statements involving them are not even candidates for truth or falsity. The most influential advocates of instrumentalism were the logical empiricists (or logical positivists), including Carnap and Hempel, famously associated with the Vienna Circle group of philosophers and scientists as well as important contributors elsewhere. In order to rationalize the ubiquitous use of terms which might otherwise be taken to refer to unobservables in scientific discourse, they adopted a non-literal semantics according to which these terms acquire meaning by being associated with terms for observables (for example, 'electron' might mean 'white streak in a cloud chamber'), or with demonstrable laboratory procedures (a view called 'operationalism'). Insuperable difficulties with this semantics led ultimately (in large measure) to the demise of logical empiricism and the growth of realism. The contrast here is not merely in semantics and epistemology: a number of logical empiricists also held the neo-Kantian view that ontological questions 'external' to the frameworks for knowledge represented by theories are also meaningless (the choice of a framework is made solely on pragmatic grounds), thereby rejecting the metaphysical dimension of realism (as in Carnap 1950)".
- Okasha, Philosophy of Science (Oxford U P, 2002), p 62: "Strictly we should distinguish two sorts of anti-realism. According to the first sort, talk of unobservable entities is not to be understood literally at all. So when a scientist puts forward a theory about electrons, for example, we should not take him to be asserting the existence of entities called 'electrons'. Rather, his talk of electrons is metaphorical. This form of anti-realism was popular in the first half of the 20th century, but few people advocate it today. It was motivated largely by a doctrine in the philosophy of language, according to which it is not possible to make meaningful assertions about things that cannot in principle be observed, a doctrine that few contemporary philosophers accept. The second sort of anti-realism accepts that talk of unobservable entities should be taken at face value: if a theory says that electrons are negatively charged, it is true if electrons do exist and are negatively charged, but false otherwise. But we will never know which, says the anti-realist. So the correct attitude towards the claims that scientists make about unobservable reality is one of total agnosticism. They are either true or false, but we are incapable of finding out which. Most modern anti-realism is of this second sort".
- Woodward, "Scientific explanation", in Zalta, ed, SEP, 2011, abstract.
- Carl G Hempel & Paul Oppenheim, "Studies in the logic of explanation", Philosophy of Science, 1948 Apr; 15(2):135–175.
- Bechtel, Discovering Cell Mechanisms (Cambridge U P, 2006), esp pp 24–25.
- Woodward, "Scientific explanation", §2 "The DN model", §2.3 "Inductive statistical explanation", in Zalta, ed, SEP, 2011.
- von Wright, Explanation and Understanding (Cornell U P, 1971), p 11.
- Stuart Glennan, "Explanation", § "Covering-law model of explanation", in Sarkar & Pfeifer, eds,