In his masterpiece Insight, Bernard Lonergan offers six ‘canons of empirical method’. His purpose is pedagogical: by exploring the contours of empirical method, he aims to disclose to the reader the dynamic of human knowledge, which he considered a platform for an exploration of the major disciplines of philosophy, such as metaphysics, ethics and epistemology. For a satisfactory understanding of the significance of these canons—which are introduced almost out of the blue near the beginning of Insight—within Lonergan's project, a fuller acquaintance with his philosophy is necessary.
However, the canons have merit simply on their own. I have found myself returning to them when thinking about what natural science actually is and what criteria we can use for delineating proper scientific inquiry from naïve speculation. There are other popular criteria, such as falsifiability, but more often than not it is too simplistic to reduce the method of natural science to a single criterion. Lonergan's six canons interlock in a particularly compelling way. The canons might be organised as follows:
- The pivot between data and scientific theory: canons of (1) selection and (2) operations.
- The nature of scientific theory: canons of (3) relevance and (4) parsimony.
- The concrete realisation of scientific theory: canons of (5) complete explanation and (6) statistical residues.
Empirical method prescinds from all questions and answers that do not involve distinctive, sensible consequences; and it discards all that involve such consequences logically yet fail to be confirmed by the results of observation or experiment.
This canon expresses a basic realisation of modern natural science: theory is not enough. Natural science must, ultimately, be connected to what is accessible to the senses—though, of course, this connexion may be complicated and involve the mediation of instruments. Among all possible theories, only those that predict sensible consequences can be ‘selected’ for consideration by the empirical investigator, and only those whose sensible predictions are realised can be retained.
This is perhaps the most obvious of the canons, but because it is obvious, it is surprisingly easy to overlook. Just as any baseball player needs to be reminded from time to time to ‘keep your eye on the ball’, so the empirical investigator needs to be reminded often to keep theory tied to what has sensible consequences.
Note that it is fallacious to conclude that theories that do not satisfy the canon of selection are worthless; it is just that such theories do not pertain to empirical method.
In virtue of [the canon of operations], fresh data are ever being brought to light to force upon scientific consciousness the inadequacies of existing hypotheses and theories, to provide the evidence for their revision and, in the limit, when minor corrections no longer are capable of meeting the issue, to demand the radical transformation of concepts and postulates that is named a higher viewpoint.
Data are not given once and for all, but rather, new data are added as fresh theories stimulate fresh investigations, lead to new experiments and suggest novel observations. Thus, the verification of theories by the expansion of sensible data is generative of new theories as the need arises to explain an ever-increasing field of observations. Hence, the canons of selection and operations are ‘obverse and reverse of the same coin’. Theories are refined by the canon of selection, and the operations of refinement lead to new data, new questions and new theories.
This canon emphasises the fact that the scientific endeavour is dynamic—man's search for knowledge involves, as Lonergan says elsewhere, a ‘moving viewpoint’. The operations by which the scientist pivots between data, theory and verification are not neutral; rather, they are the matrix in which theories (far from being disjointed) must interconnect.
There is … an intelligibility immanent in the immediate data of sense; it resides in the relations of things, not to our senses, but to one another; it consists not in an absolute necessity, but in a realised possibility.
Data are rich in intelligibility, of two basic sorts. On the one hand, there is an intelligibility that leads away from the data at hand to insights about utility or craftsmanship or design—by considering a cart-wheel (to use Lonergan's example), one can have insights about transportation or wheel-making or materials engineering. On the other hand, there is an intelligibility that resides in the data themselves—the cart-wheel is a disc which has such-and-such a moment of inertia, such-and-such a relationship between its radius and circumference, and so on. Empirical method is concerned with the immanent, not the extrinsic, intelligibility.
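To make the contrast concrete, the immanent relations of the cart-wheel can be written out explicitly. These are standard textbook formulas for a uniform disc of mass $M$ and radius $r$, not drawn from Lonergan's text:

```latex
C = 2\pi r
\qquad\text{(circumference in terms of radius)}
\\
I = \tfrac{1}{2} M r^{2}
\qquad\text{(moment of inertia about the central axis)}
```

Neither relation mentions the wheel's maker or its use; each holds among the data's own measurable features, which is exactly the intelligibility the canon of relevance singles out.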
Classically, the extrinsic intelligibilities are known as final, instrumental and efficient causes, while the intrinsic intelligibility is known as formal cause. Lonergan acknowledges that the latter term is easily misunderstood; nevertheless, it seems to me that many misconceptions about the nature of empirical method come down to the mistaken assumption that the physical sciences study efficient causes. Further, there is the equivocal use of the word ‘matter’ in modern times, which sometimes leads to the notion that the physical sciences study material causes. But in the classical sense, matter is not simply material being, but rather potency; and while it is in a certain sense an intrinsic cause, matter is in fact only understood through the form by which it is enacted. Thus, the focus of empirical investigation remains formal causality.
A final important point is that the formal causes of empirical method are not self-evident or necessary. Once verified, we can affirm that they are real, but they could have been otherwise. ‘As insight grasping possibility [empirical science] is science. As verification selecting the possibilities that in fact are realised, it is empirical.’
The canon of parsimony in its most elementary form excludes from scientific affirmation all statements that are unverified and, still more so, all that are unverifiable.
This canon codifies things hinted at in the canons of selection and relevance. The canon of selection states that scientific knowledge involves sensible data, and the canon of relevance states that scientific knowledge concerns an intelligibility intrinsic to data that is not necessary but realised possibility. The current canon insists that all aspects of a theory must be realisable in actual sensible data. The empirical method can never terminate at hypotheses, for the canon of parsimony claims that every element of a theory must be verifiable.
The canon of parsimony is related to ‘Occam's razor’, which is often vaguely articulated as a preference for simplicity rather than complexity in scientific explanation. This is not wholly satisfactory, for theory ought to be driven by data, whether their explanation be simple or complex. Bertrand Russell is closer to the canon of parsimony when he formulates Occam's razor thus: ‘Whenever possible, substitute constructions out of known entities for inferences to unknown entities.’ In other words, the aim of empirical method is theory in which all components refer to empirically verifiable phenomena.
The goal of empirical method is commonly stated to be the complete explanation of all phenomena or data.
Whereas the canon of parsimony prescribes that theory always be realised, or at least realisable, in sensible data, the canon of complete explanation calls for the converse: all sensible data are to be explained.
This seems uncontroversial—as Lonergan points out above, it is ‘commonly stated’. Nevertheless, there is a strong temptation to cordon off certain phenomena and assume that they are so fundamental that they do not require explanation. The compelling example that Lonergan proposes is the phenomena of space and time. Since at least Galileo and Descartes, and ratified in a peculiar way by Kant, there have been attempts to consider extension and duration as ‘primary qualities’ of the sensible world that do not need to be analysed scientifically in the same way as other phenomena, such as colour. However, as Einstein demonstrated, the phenomena of extension and duration do need to be intelligently abstracted to space and time, just as colour needs to be abstracted to wavelength of light.
Hence, in empirical science, all data need to be explained, not merely described.
It would seem, then, that an understanding of the concrete unfolding of the world process will not be based exclusively on classical laws, however exactly and completely known, but in a fundamental manner will appeal to statistical laws.
This canon is the most difficult, since it requires discarding not only a completely deterministic worldview, but also the notion that the determinism of classical laws is in competition with the lack of determinism in statistical laws.
The determinism of classical laws is a feature of their abstract nature. On the one hand, abstraction is enriching in that it reveals the real relations inherent in data. On the other hand, however, abstract laws are only realised in concrete events, which are contingent in nature. Abstract kinematics is realised when two billiard balls collide, but the conditions for such a collision rely on a diverging chain of conditions that are not under the sway of kinematics: the level of the table, the fact of the manufacturing of the balls, the willingness of a player to push them together, and so on.
Of course, one can imagine that all events in the world can be abstracted to a total, classical system, but this is merely a hypothesis. In fact, the progress of science seems to show that it is much more likely that there is an inherent indeterminacy to events. Hence, if the other canons are to be maintained—especially the canon of complete explanation—statistical laws that describe the average occurrence of non-deterministic events are needed.
In other words, the canon of statistics reminds us that the ‘residue’ of the data after abstraction to classical laws has a further kind of intelligibility: a statistical one.
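As an illustrative sketch (mine, not Lonergan's), the kind of regularity a statistical law expresses can be simulated. No classical law determines any single event below, yet the relative frequency of outcomes settles toward an ideal value—an intelligibility of the residue itself:

```python
import random

def relative_frequency(trials: int, seed: int = 42) -> float:
    """Simulate a run of individually indeterminate events (coin flips)
    and return the relative frequency of heads.

    No law of the sequence predicts any single flip; what a statistical
    law grasps is the ideal frequency (here 0.5) from which actual
    frequencies oscillate non-systematically.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible illustration
    heads = sum(rng.random() < 0.5 for _ in range(trials))
    return heads / trials

# The oscillation around the ideal frequency narrows as trials increase.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

The point of the sketch is only that the average occurrence, unlike the individual event, is intelligible—which is what the canon of statistical residues claims about the concrete unfolding of world process.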