Abstracts of the International Academic Conference on “Scientific Explanation and Scientific Methodology”
2012-09-02
1. Special relativity and the spatiotemporal extent of the present. Katherine Brading. University of Notre Dame, USA
In his comments on Putnam’s (1967) paper on special relativity, Stein (1968) suggests that adopting both
(P1) All and only things that exist now are real
(a standard expression of “presentism”), and
(P2) Special relativity is a complete account of spatiotemporal structure
leads to “the interesting result that special relativity implies a peculiarly extreme (but pluralistic) form of solipsism.” This conclusion depends on an assumption, held in common by presentists and advocates of the block universe, concerning the role of spatiotemporal structure in grounding ontological unity. I argue that if dynamical laws, rather than spatiotemporal structure, bear the ontological weight of grounding the unity of what there is then the core thesis of presentism survives the transition to Minkowski spacetime without entailing “pluralistic solipsism” and without requiring that we reject (P2). Moreover, once unity is grounded in dynamics, the spatiotemporal extent of the present becomes an empirical matter. This latter result follows the general methodological line found in Newton’s work, in which more and more questions previously taken to be a priori metaphysical questions are transformed into empirical ones. To my surprise, not to say astonishment, I suggest that presentism as an empirical thesis is not yet dead.
2.Science Is Dappled; The World Is Not. Michael Strevens. New York University, USA
Science as we know it is “dappled”: even within a discipline, models tend to draw the majority of their empirical content not from some general theory but from observed local regularities. The current scientific picture of the world, then, is a mosaic, with different aspects of the world, different systems, represented by different sub-theories or models that are only loosely or formally connected. Nancy Cartwright has proposed that the best explanation for this disunity in our representation of the world is a disunity in the world itself. The world is dappled, which is to say that, rather than being governed by a single set of omnipotent fundamental laws, what happens is due to a patchwork of principles covering different kinds of systems or segments of reality, each with something less than full omnipotence and with the possibility of anomic indeterminism at the boundaries. I want to undercut this argument for a dappled world by showing that the motley nature of science, both now and even when empirical inquiry is complete, can be equally well explained by proponents of the “fundamentalist” view that the universe’s initial conditions and fundamental laws of nature determine everything that ever happens.
3.How Rich Is Our Consciousness? Allen Y. Houng. Yang Ming University, Taipei, Taiwan
In the recent literature on consciousness studies, there is a heated debate concerning how rich our conscious experience is. Take visual perception as an example: one camp maintains that when we see, say, a picture, we see a great deal of the picture's detail. The other camp contends that we see only what gets processed by our cognitive operations, such as reasoning, attention, verbal report and motor control. Do we see more than what we can cognitively access? Or, to put it another way, is phenomenology richer than cognition? In this essay, I will argue that our consciousness is much richer than what we can access. In addition to philosophical arguments, I will cite empirical research in cognitive neuroscience to support my thesis. I will also discuss the philosophical implications of that result.
4.Rule-Following and Sociology of Scientific Knowledge. Kai-Yuan Cheng. Institute of Philosophy of Mind and Cognition, Yang-Ming University, Taiwan
Philosophical discussions of rule-following in the later Wittgenstein (1953) are an important source of inspiration for the development of views on the social nature of scientific knowledge. Two major opposing views in this inquiry—Bloor’s sociology of scientific knowledge (SSK) (1983, 1991, 1992, 1997) on the one hand and the ethnomethodological studies of work in the sciences and mathematics (ESW) of Garfinkel et al. (1981, 1989) and Lynch et al. (1983, 1992) on the other—represent two positions derived from two different readings of Wittgenstein’s rule-following considerations. Kusch (2004) has recently revived this noted Bloor-Lynch debate by finding fault with the readings of Wittgenstein on both sides of the debate. The aim of this paper is to re-examine, and hopefully to advance, the debate in question in light of Kusch’s (2004) take on these issues and of my own view on the problem of rule-following from the perspective of the study of generics (Cheng, 2011).
5.Two Accounts of Laws and Time. Barry Loewer. Rutgers University, USA
Among the most important questions in the metaphysics of science are “What are the natures of fundamental laws and chances?” and “What grounds the direction of time?” I will examine some connections between these questions, discuss two approaches to answering them, and argue in favor of one. Along the way I will raise and comment on a number of issues concerning the relationship between physics and metaphysics and the consequences for the subject matter and methodology of metaphysics.
6.How physics mandates Darwinian adaptation and makes it irreducible. Alex Rosenberg. Duke University, USA
I show that, given reasonable requirements on any explanation of adaptations, physical processes are necessary and sufficient for Darwinian processes, and they exclude any alternative processes that might produce adaptations. I then go on to show that the thermodynamic character of the process whereby physical processes generate adaptations also makes it difficult systematically to explain them and so gives biology its appearance of autonomy and irreducibility.
7.Cases as Bases for the Epistemology of Science. Anjan Chakravartty. University of Notre Dame, USA
Case studies of scientific theory and practice, past and present, are often offered as evidence for or against scientific realism: the view that our best theories give approximately true descriptions of unobservable aspects of the world, or that their central terms refer to entities that exist in a mind-independent reality, or that they give approximative representations of the structure of scientific phenomena. But how compelling is this evidence? I consider this question in light of three arguments. The first concerns the possible immunity of realism (and antirealism) to case studies of scientific methodology. The second concerns the possible immunity of realism to considerations of the history of scientific theories that are now rejected. The third concerns a suggested inability of case study evidence to indicate which form of scientific realism is most promising.
8.Simplicity, Unification, and Ockham’s Razor. Malcolm Forster. University of Wisconsin-Madison, USA
9.Explanation by idealized theories. Ilkka Niiniluoto. University of Helsinki, Finland
Idealized scientific theories tell how natural and social systems would behave under counterfactual conditions, so that their descriptions of actual situations are at best truthlike. The use of such theories as covering laws in deductive-nomological explanations of empirical facts and regularities is therefore problematic. Duhem observed already in 1906 that Newton’s theory and Kepler’s laws, strictly speaking, in fact contradict each other. Attempts to analyze their relation as approximate explanation were given in the 1960s. A more systematic perspective was developed in the method of idealization and concretization by the Poznan school (Nowak, Krajewski) in the 1970s: ideal assumptions are first made explicit as antecedents of counterfactual conditionals (idealizational laws) and then these assumptions are eliminated or relaxed by modifying the consequent (concretization, factualization). This treatment covers both so-called Galilean idealizations (simplifications for the purpose of computability) and isolations (consideration of essential or pure aspects of phenomena without secondary disturbing factors). It allows us to distinguish several important types of explanations by idealizations. First, a theory T’ may entail an empirical consequence e’ which approximates a well-established empirical statement e. If e’ is a concretization of e, then this explanation is corrective. After the independent empirical confirmation of e’, this argument can be used to answer the contrastive explanatory question “Why e’ rather than e?”. Secondly, T’ may itself be a concretization of some theory T which entails e. Again, the explanation of e by showing that e’ follows from T’ is corrective. Thirdly, while an empirical claim e may be approximately derivable from an idealizational law T or from an approximation of T, this derivation need not be replaced by a corrective explanation, as e may be strictly deducible from a concretization T’ of T.
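A schematic rendering of the idealization-concretization structure described above may help; the notation below (idealizing conditions p_i set to zero and then relaxed) is illustrative only and is not taken from the paper itself.

```latex
% Idealizational law: ideal assumptions appear as antecedents set to zero.
\[ G(x) \wedge p_1(x)=0 \wedge \dots \wedge p_k(x)=0 \;\Rightarrow\; F(x) = f_k(H(x)) \]
% Concretization: relax one idealizing assumption and correct the consequent.
\[ G(x) \wedge p_1(x)=0 \wedge \dots \wedge p_{k-1}(x)=0 \wedge p_k(x) \neq 0 \;\Rightarrow\; F(x) = f_{k-1}\bigl(H(x), p_k(x)\bigr) \]
```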
10.A Muse of Fire. James Robert Brown. University of Toronto, Canada
Daniel Dennett and others have rejected the use of thought experiments. This talk will be an account of Dennett's reasons for doing so, followed by a critical evaluation and rejection of them. The specific issues will involve the difference between philosophical thought experiments (which he criticizes) and those from the sciences, and also his claim that common sense "folk concepts" only get in the way of real knowledge. It will turn out that Dennett has no case against the use of thought experiments, but the reasons for his failure are quite interesting in their own right.
11.Grounding Content. Kelly Trogdon. Lingnan University, Hong Kong
In this paper I develop and defend a novel naturalistic account of mental content, what I call the grounding theory. The account is informational in that it understands content partly in terms of certain mind-world nomic relations, and referential in that it treats the contents of neural states as properties. The account identifies a substantive sufficient condition for a property (e.g. being a horse) to be the content of a neural state (e.g. HORSE), one that appeals to the idea that certain causal facts ground other causal facts. The grounding theory is inspired by Fodor’s asymmetric dependence theory of content. I argue that the grounding theory incorporates the insights of Fodor’s theory while avoiding some of its central problems.
12.Models, Structures, and the Explanatory Role of Mathematics in Empirical Science. Mary Leng. University of York, UK
13.Laplace's Demon and the Adventures of his Apprentice. Roman Frigg. LSE, London UK
Foretelling the future is an age-old human desire. Among the methods for pursuing this goal, mathematical modelling has gained prominence, in particular in climate science. Many of these models promise to make probabilistic forecasts. This raises the question of exactly what these models deliver: can they provide the results as advertised? The aim of this paper is to urge some caution. Using the example of the logistic map, we argue that if a model is non-linear and there is even the slightest model imperfection, then treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on the trustworthiness of model results. This is nothing short of a methodological disaster: probabilistic forecasts are used in many places all the time, and the realisation that such forecasts cannot be trusted pulls the rug from underneath many modelling endeavours.
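The following minimal sketch (illustrative only, not the authors' model) shows the kind of behaviour at issue: iterating an ensemble under the logistic map and under a very slightly imperfect version of it, the trajectories, and hence any event forecasts derived from them, quickly come apart.

```python
# Minimal sketch (not the authors' code): a tiny structural imperfection in a
# non-linear model (the logistic map) quickly degrades trajectory-based forecasts.
import numpy as np

def true_map(x, r=4.0):
    return r * x * (1.0 - x)               # "true" dynamics

def imperfect_map(x, r=4.0, eps=1e-4):
    y = r * x * (1.0 - x)                  # same map with a tiny perturbation
    return np.clip(y + eps * np.sin(2.0 * np.pi * x), 0.0, 1.0)

rng = np.random.default_rng(0)
ensemble = rng.uniform(0.49, 0.51, size=10_000)   # ensemble of initial conditions
truth, model = ensemble.copy(), ensemble.copy()

for step in range(1, 21):
    truth, model = true_map(truth), imperfect_map(model)
    if step % 5 == 0:
        gap = np.mean(np.abs(truth - model))                 # mean trajectory error
        disagree = np.mean((truth < 0.5) != (model < 0.5))   # event-forecast mismatch
        print(f"step {step:2d}: mean |error| = {gap:.3f}, "
              f"fraction disagreeing on x < 0.5: {disagree:.2f}")
```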
14.Dispositions, Twin Earth, and Relativity of Scientific Theories. Sungho Choi. Kyung Hee University, South Korea
In their attempt to provide a better understanding of the semantics of dispositional predicates, Bird and Lipton entertain the insightful idea that there is a remarkable similarity between natural kind terms and dispositional terms. The chief goal of this paper is to give substantive content to their idea, which will lead to the externalist thesis that the prevalent feature of the ascriber’s environment makes a difference to the overall semantic content of the dispositional ascription. To justify this thesis, I will bring into focus the issue of how dispositional concepts are used for a variety of classificatory and inferential purposes in everyday and scientific practice, which will provide a theoretical basis for rejecting what I call the no-interference reasoning, a line of reasoning that I think has been at least implicitly endorsed by a number of metaphysicians of dispositions. I indeed maintain that neglect of this practical dimension of our use of dispositional concepts is largely responsible for the failure of some prominent proposals on the semantics of dispositional predicates. I will close by explaining the lessons regarding scientific practice that we can draw from the externalist thesis advanced in this paper.
15.“Bridled Irrationality”: Historical Antecedents of Bas van Fraassen’s Epistemic Voluntarism. Kathleen Okruhlik. Department of Philosophy, The University of Western Ontario, Canada
Bas van Fraassen’s earliest work was in the philosophy of time and space. He developed his views in the context of the conventionalisms of Hans Reichenbach and Adolf Grünbaum, his dissertation supervisor. Conventionalism highlights the necessity of making choices in science. Whether it should be considered a form of epistemic voluntarism is a more difficult question, which will be examined in the latter parts of this paper.
Van Fraassen’s constructive empiricism and his more recent empiricist structuralism, on the other hand, are both couched in the context of a very explicit commitment to epistemic voluntarism. In his 1984 article called “Belief and the Will”, van Fraassen draws on St. Augustine, Pascal, and William James to make his argument that diachronic incoherence is to be avoided by replacing the descriptivist view of epistemic judgment with an alternative view that makes expressing a belief similar to making a promise. He calls this view “voluntarist” because “it makes judgment in general, and subjective probability in particular, a matter of cognitive commitment, intention, engagement.” Belief, he says, “is a matter of the will.”
In Laws and Symmetry (1989), after coining the slogan that “Rationality is only bridled irrationality”, van Fraassen goes on to say that “in rationality as in morality, what is not required is still not always forbidden. This means…that rational change of opinion is not a case of rule-following. It includes an element of free choice… . The factors relevant to the free choices…are in the domain of pragmatics.” (p. 176) In The Empirical Stance (2002), scientific revolutions (Kuhnian paradigm changes) are described as conversion processes that depend on emotional transformation. Much of the language used here is explicitly Sartrean, and in recent work, van Fraassen’s epistemic agent has come more and more to resemble Sartre’s existential protagonist. In science as in morality, we are condemned to freedom.
This paper will try to sort out the historical traditions that have shaped van Fraassen’s epistemic voluntarism. It will cross the Kantian divide only a little bit. St. Augustine and Pascal figure in the analysis, but the paper will deal mainly with three post-Kantian traditions: conventionalism, pragmatism, and existentialism, all of which underwent periods of rapid development early in the twentieth century. There is ample evidence of mutual influence, and sometimes individuals are located in more than one tradition. So, for example, in her 1912 MA thesis called Pragmatism and French Voluntarism, Susan Stebbing denies the claim made by others that LeRoy and Poincaré can be classified as pragmatists and locates them instead within the tradition of French voluntarism. Others, of course, have classified the same figures as conventionalists or sometimes “radical conventionalists”.
My hope is to tell a story about van Fraassen’s historical antecedents that will explain the prima facie tensions I see between the existentialist and pragmatic dimensions of his epistemic voluntarism.
16.What Kind of Progress Occurred When Genes Went Molecular? Paul E. Griffiths (co-authored with Dr Karola Stotz). University of Sydney, Australia
The result of the molecular revolution in genetics was not that the Mendelian gene was reduced to, or eliminated in favour of, the molecular gene. Although the new, molecular identity of the gene was now its dominant identity, the other, Mendelian identity did not simply go away. The gene retains its Mendelian identity in certain areas of biological research, namely those continuous with classical genetic analysis. We give examples of contemporary research in which it is necessary to think of genes as both Mendelian alleles and molecular genes, even when those two identities do not converge on the same pieces of DNA. This explains the well-known difficulties encountered by philosophers who have tried to explain how the Mendelian gene was reduced to molecular biology. The attempt to assimilate the development of genetics to the replacement of an old theory by a new theory resulted from the failure to recognise what is now widely accepted, namely that Mendelian and molecular genetics are different experimental practices, rather than different theories of the gene.
17.How Fundamental Physics Represents Causality. Andreas Bartels & Daniel Wohlfarth. University of Bonn, Germany
According to the Neo-Russellian view (e.g. Price & Corry 2007), causality has no place within fundamental physics. I discuss some arguments pro and against this thesis (e.g. Norton 2007, Price 2007 on the one, S. Smith 2007 and M. Frisch 2012 on the other side) with the result that (i) no convincing argument has been provided to the effect that fundamental physics doesn’t represent causation, and (ii) even defenders of causality in physics refrain from anchoring causation within fundamental structures (see e.g. Frisch’s account according to which the causal resources of physics models are derived from their embedding into enriched causal structures).
Contrary to the view that causation is not a matter of fundamental physics, I will show that the spacetimes of General Relativity have intrinsic structure to represent causal relations, at least with respect to two fundamental properties, time-asymmetry and energy flow between events. Firstly, it can be shown that “almost all” spacetimes in which a global cosmic time order exists are time-asymmetric with respect to the cosmic time parameter (Wohlfarth 2012). Secondly, there exist physical timelike vector fields by which this global time-asymmetry can be transferred in an overall consistent way to the local proper time parameter. Thus the local time asymmetry grounding causal relations can be derived from the global structure of spacetime. Thirdly, the 4-vector representing local energy flow according to the standard interpretation is a candidate for the above-mentioned physical timelike vector field. It can thus serve to represent causal relations by time-asymmetric energy flow. In sum, this shows that, even if causation is not represented by the fundamental laws of physics directly, fundamental structures within their solutions do have the resources to represent causation.
18.On Forster’s Counterexamples to the Likelihood Theory of Evidence. Jiji Zhang. Lingnan University, Hong Kong
Forster (2006) presented some interesting examples having to do with distinguishing the direction of causal influence between two variables, which he argued are counterexamples to the likelihood theory of evidence (LTE). His more general contention was that a likelihoodist or Bayesian philosophy of science based on the LTE could not capture the consilience of inductions. In this paper we examine Forster’s argument that his examples constitute counterexamples to the LTE, and argue that it relies on an intuition that likelihoodists have forcefully criticized. We then present a theorem that reveals a general and systematic connection between likelihood and (a reasonable interpretation of) consilience, in the kind of causal inference considered by Forster. The theorem shows that at least for such problems, likelihood-based methods can well accommodate the consilience of inductions. (This is joint work with Kun Zhang.)
19.The Promise of Science. Janet A. Kourany. University of Notre Dame, USA
At the dawn of modern science, a promise was made. If society would but support the new enterprise, society would be richly rewarded not only with unprecedented insights into the workings of the universe but also with all the benefits that such insights would provide. Indeed, Francis Bacon, one of the main architects of the new science as well as one of its more exuberant press agents, promised that the knowledge science would offer would “establish and extend the power and dominion of the human race over the universe” for the benefit of all of humankind. Now, centuries later, has the promise been kept? Certainly, science has given us much control over nature—more diverse, more abundant foods to eat, produced more quickly and easily; more comfortable, more attractive places to live with more conveniences, built more efficiently; quicker, more convenient ways to get around and to communicate with each other; and so on. Yet, science has given us far less control over human nature—increased longevity; freedom from disease, disability, and suffering; expanded mental and physical abilities; personality traits more amenable to our needs, and so on. Now, in the 21st century, however, all these are the anticipated outcomes of current technoscientific developments. More specifically, all these are the anticipated outcomes of four emerging technosciences—nanotechnology, biotechnology, cognitive science, and information technology. And since these technosciences are not only emerging, but also converging—that is, since they interconnect and assist each other in myriad ways—the new modes of control are said to lie close to the horizon, and some are available even now. Carried to the utmost, the result will be humans with such radically expanded capacities as to be no longer humans at all—“posthumans.” Or so it is claimed. But is this so-called “human enhancement” project a legitimate part of Bacon’s promise for science? That is the question I would like to explore. I will argue that it is not, at least not at present.
20.Scaffolding Understanding: featuring causality. Sergio F. Martínez. Universidad Nacional Autónoma de México, Mexico
The search for understanding is a major aim of science. Traditionally, understanding has been undervalued in the philosophy of science because it is not possible to detach it from its psychological underpinnings; nowadays, however, it is widely recognized that epistemology cannot be divorced from psychology. This eliminates the main obstacle to giving the concept of understanding due attention in philosophy of science. My aim in this paper is to describe an account of scientific understanding as an emergent feature of our mastering of different (causal) explanatory frameworks. Mastering of explanatory frameworks involves the mastering of scientific practices and the development of appropriate ontologies that allow for the integration of very different phenomena into more general and more tightly related explanations. Clarifying my proposal requires the elaboration of underlying views about the relation between explanation and understanding, as well as about the role of processes of abstraction in generating the appropriate ontologies (promoting understanding). I will use examples from different disciplines, mainly from biology and the cognitive sciences to make my point.
21.The Veracity of Empirical Data: A Causal Account of Observational Error. James W. McAllister. University of Leiden, Netherlands
Almost every writer on science since the seventeenth century has agreed that it is possible that empirical data – the results of observations and measurements – are false, untrustworthy or uninformative. Scientists and philosophers of science use the concept of observational error to refer to the degree or extent to which empirical data fail in this way. In this talk, I argue that it is impossible on conceptual grounds that empirical data are anything other than entirely veridical. I retrace the origin of the concept of observational error to intentional accounts of the production of empirical data, which, though useful in some respects, are misleading. I propose an alternative, causal account that provides a better understanding of how empirical data are produced and in which the concept of observational error does not arise.
22.Lawful Explanation and Manipulationist Account. Wei Wang. Tsinghua University, China
In the past decade, Prof. James Woodward’s manipulationist account has become the most promising approach to scientific explanation. This paper argues for lawful explanation by revisiting the comparison between his notion of invariance and the traditional concepts, e.g. laws and ceteris paribus laws. The author agrees that Woodward’s concept of invariance improves our understanding of laws and CP laws, and therefore of scientific explanation. However, the manipulationist account could be a complement to, rather than a replacement of, lawful explanation. Lawful explanation helps us to “Think Global”, while the manipulationist account helps us to “Act Local”.
23.Hierarchy, Form, and Reality. Gang Cheng. Huazhong University of Science and Technology, China
Scientific progress in the 20th century has shown that the structure of the world is hierarchical. A philosophical analysis of this hierarchy has obvious significance for metaphysics and philosophy in general. Jonathan Schaffer's paper, "Is There a Fundamental Level?", provides a systematic review of work in the field, the difficulties for various versions of fundamentalism, and the prospects for the third option, i.e., treating each level as ontologically equal. The purpose of this paper is to provide an argument for the third option. The author will apply Aristotle's theory of matter and form to the discussion of the hierarchy and develop a form realism, which grants every level "full citizenship in the republic of being." This constitutes an argument against ontological and epistemological reductionism. A non-reductive theory of causation is also developed against the fundamentalist theory of causation.
24.Mechanism and Productive Event. Zhu Xu. Graduate University of Chinese Academy of Sciences, China
This article is about the theory of causation; in particular, it is about the mechanist theory of causation. The regularist theory of the Humean tradition was the received view in the twentieth century. In the last two decades, several philosophers of science have paid more attention than ever before to causation and explanation in the special sciences. Some of them advocate the interventionist theory, which conceptualizes causation in terms of manipulability among variables, namely as “invariance under intervention” (Hausman and Woodward 1999; Woodward 2000, 2003). Others are mechanical theorists, who argue that the notion of mechanism, though it differs from the mechanistic philosophy of the seventeenth century, underwrites causation in the special sciences, such as biology, neuroscience, and the social sciences, in both the ontological and the methodological sense (Glennan 1996, 2002; Machamer, Darden, and Craver 2000). Though both are critical of the regularist view, interventionist and mechanist theories still compete with each other (Machamer 2004; Psillos 2004; Woodward 2002, 2011).
The central problem in the following discussion is the characterization of production. The mechanist theory appeals to a general intuition about causality, namely that causes play a productive role with respect to their effects. Mechanist theorists believe that production cannot be correctly represented by the notion of causal dependence. However, proponents of interventionist theories can insist that there is nothing distinctive about production over and above dependence among variables, so that mechanist theory would provide only detailed spatio-temporal information about “invariance under intervention”, rather than any distinctive position on causation. It is therefore necessary to put forward some plausible characterization of production in order to justify the mechanist theory.
There are three typical attempts at this issue. Machamer (2004) maintains the irreducibility of production, based on the difference between activities, represented as verb-forms in natural language, and correlations among variables described in the usual canonical way. Hall (2004) gives a reductive analysis of production, parallel to the counterfactual analysis of dependence. Craver (2007) seems to hold an intermediate position, which admits manipulability as an empirical criterion for mechanisms while denying that models constructed merely from dependence among variables qualify as mechanisms.
This article opens with a brief introduction to the mechanist theory, discussing its components and its competition with the interventionist theory. The second section turns to the three approaches to conceptualizing production mentioned above. I will argue that all of them are entangled in a deep dilemma which prevents those attempts, however heuristic, from succeeding. One horn of the dilemma is that, if we accept the irreducibility of production, the mechanical description may serve merely as, in Craver’s term, “filler terms”, which characterize the mechanism in a superficial and uninformative way. The other horn is that, if we make the notion explicit by canonical or functional representation, the typically accepted apparatus of analysis in the sciences, we in effect place production, like dependence, under canonical representation, and thereby lose sensitivity to the distinctiveness of production.
My suggestion is to resolve the dilemma by means of a particular theory of events based on the notion of change. A mechanism is a special type of productive event: that is the main point I want to elucidate in this article. I will also argue for why this contributes to the resolution of the dilemma and, further, to the justification of the mechanist theory of causation.
25.Mixed Methods in Social Research: Paradigm and Design Issues. Pei Zhang. Tianjin University of Finance and Economics, China
In spite of its wide use in many of the social sciences including applied linguistics and education, Mixed Methods Research (MMR) remains an area where scholars do not agree on many basic issues. The paradigm and design issues are among the most disputed. Through a critical review of the literature, this paper discusses and critiques some of the most important propositions on MMR’s paradigmatic foundations including pragmatism and the dialectic stance. It suggests that it is the autonomous and independent nature of research methods that provides the foundation for MMR, and that an over-emphasis on paradigmatic issues does no credit to the idea that methods are free and the advocacy that MMR is about ‘the freeing of the investigative impulses.’ This paper also discusses the major factors or dimensions that a researcher needs to consider in the design of MMR. These factors include the priority or dominance given to qualitative or quantitative components, the implementation or sequence of data collection, and the stage of integration. It argues against the idea that there needs to be a core method in MMR because equal priority is both theoretically possible and practically successful. It suggests that in MMR integration can take place at different stages of the inquiry process, yet the decision on where to integrate is a difficult one for the researcher to make as it may involve questions on qualitative/quantitative distinctions, which are inevitably open to debate.
26.The Statistical Explanation, Causal Explanations and Sure Thing Principle. Hao Hu. South China Normal University, China
In population biology, there are three models that use trait fitness to explain and predict changes in the structure of a population. Both the two-force model and the single-force model regard trait fitness as a kind of causal property. Denis Walsh, however, takes it to be statistical. I try to spell out Walsh’s convincing arguments against the causal explanation of trait fitness. In particular, drawing on Judea Pearl’s “sure thing principle”, his argument shows that in Gillespie’s model the causal explanation of trait fitness faces Simpson’s paradox and is therefore inconsistent. Thus the statistical explanation programme currently has some advantages over the causal explanation programme for trait fitness.
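For readers unfamiliar with the reversal at issue, the toy numbers below (hypothetical, not the author's or Gillespie's) show how a trait can do better within each environment yet worse in the pooled population, which is Simpson's paradox.

```python
# Toy illustration of Simpson's paradox with hypothetical survival counts:
# the trait does better within each environment but worse in the pooled data.
from fractions import Fraction

data = {  # (survivors, total) per environment and per group
    "env_1": {"trait": (81, 87),   "no_trait": (234, 270)},
    "env_2": {"trait": (192, 263), "no_trait": (55, 80)},
}

def rate(survived, total):
    return Fraction(survived, total)

for env, groups in data.items():
    print(env,
          "trait:", float(rate(*groups["trait"])),
          "no_trait:", float(rate(*groups["no_trait"])))

# Pool the two environments: the within-environment comparison reverses.
pooled = {g: tuple(sum(data[e][g][i] for e in data) for i in (0, 1))
          for g in ("trait", "no_trait")}
print("pooled",
      "trait:", float(rate(*pooled["trait"])),
      "no_trait:", float(rate(*pooled["no_trait"])))
```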
27.Against “Information-theoretical Paradigm”. Hong Fang Li. Wuhan University, China
We critically discuss the information-theoretical paradigm and its philosophical implications. We hold that information and matter are mutually constitutive and co-existing, and thus enjoy equal ontological status, with quantum information playing an important role in determining the quantum mechanical aspects of matter in concrete contingent situations, and vice versa. This is a kind of informational constructivism.
28.Contextual Analysis of Gauge Argument. Jitang Li, Gui-chun Guo. Soochow University, China
According to contemporary physics, there are four basic interactions in nature, corresponding to the strong, weak, electromagnetic, and gravitational forces. The strong interaction is well described by quantum chromodynamics. The electroweak interaction consists of the electromagnetic and weak interactions, which are described by the unified electroweak theory. The gravitational force is well described by general relativity. At the centre of the unified electroweak theory and quantum chromodynamics is Yang-Mills theory, and these are non-Abelian gauge theories. Electromagnetism was the first gauge theory, and it is an Abelian gauge theory. General relativity can also be counted as a gauge theory beyond the Yang-Mills type. That is to say, electromagnetic gauge theory, general relativity, and Yang-Mills gauge theory can all be regarded as gauge theories within a unified framework. Why are the different interactions gauge theories? The reason is the so-called gauge argument. From the viewpoint of the relationship between mathematical structures and physical structures, we study the gauge argument in detail by means of contextual analysis.
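For orientation, the Abelian (electromagnetic) version of the gauge argument discussed here runs, in textbook form, roughly as follows; this schematic is standard material and not the authors' contextual analysis itself.

```latex
% Free Dirac Lagrangian: invariant under a global, but not a local, phase change.
\[ \mathcal{L}_0 = \bar{\psi}\,(i\gamma^{\mu}\partial_{\mu} - m)\,\psi, \qquad \psi \to e^{i\alpha(x)}\psi \]
% Demanding local invariance forces the introduction of a gauge field A_mu,
% coupled through the covariant derivative and transforming so as to compensate:
\[ D_{\mu} = \partial_{\mu} + i e A_{\mu}, \qquad A_{\mu} \to A_{\mu} - \tfrac{1}{e}\,\partial_{\mu}\alpha(x) \]
% The coupled Lagrangian is then invariant under local U(1) transformations:
\[ \mathcal{L} = \bar{\psi}\,(i\gamma^{\mu}D_{\mu} - m)\,\psi \]
```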
29.Mapping the Intellectual Structure of the Life Cycle Assessments Field through Co-citation Analysis. Ge Qian. Shanghai University of Finance and Economics, China
Life-cycle assessment has been one of the hottest topics in environmental engineering. This paper aims to assess the evolution of life-cycle assessment as a research field by using scientometrics and scientific visualization techniques. CiteSpace II was used to map the intellectual structure of the life cycle assessments field based on 3824 articles on this topic in the ISI Web of Science database between 1972 and 2011, and the co-citation maps analysed and visualized here show the major areas of research, prominent articles, and the major knowledge producers and journals in the life cycle assessments field. The two editions of the standards issued by the International Organisation for Standardization (“ISO 14040”, 1997 and 2006) appear to be the first and second most influential sources, and the book by Guinee JB et al. (“Handbook on life cycle assessment: operational guide to the ISO standards”, 2002) is the third most highly cited document. INT J LIFE CYCLE ASS is the journal most frequently cited by authors writing in the life cycle assessments field. The International Organisation for Standardization (ISO) is the most frequently co-cited author, followed by Guinee JB and Goedkoop M. The hottest keywords in this field appear to be life cycle assessment, energy, systems, emissions, sustainability, management, model, and impact assessment.
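As a minimal illustration of the co-citation counting that underlies such maps (this is not CiteSpace's implementation, and the reference lists below are hypothetical), two documents count as co-cited whenever they appear together in some citing article's reference list.

```python
# Minimal co-citation count over hypothetical reference lists (not CiteSpace).
from itertools import combinations
from collections import Counter

articles = [  # each set is one citing article's reference list
    {"ISO 14040 (1997)", "Guinee (2002)", "Goedkoop (1999)"},
    {"ISO 14040 (2006)", "Guinee (2002)"},
    {"ISO 14040 (1997)", "ISO 14040 (2006)", "Guinee (2002)"},
    {"Guinee (2002)", "Goedkoop (1999)"},
]

cocitations = Counter()
for refs in articles:
    for pair in combinations(sorted(refs), 2):
        cocitations[pair] += 1          # one co-citation per citing article

for pair, count in cocitations.most_common():
    print(count, pair)
```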
30.The frame problem, the relevance problem, and a package solution to both. Yingjin Xu, Pei Wang. Fudan University, China
As many philosophers agree, the frame problem is concerned with how an agent may efficiently filter out irrelevant information in the process of problem-solving. Hence, how to solve this problem hinges on how to properly handle semantic relevance in cognitive modeling, an area of cognitive science that deals with simulating human cognitive processes in a computerized model. By “semantic relevance”, we mean certain inferential relations among acquired beliefs which may facilitate information retrieval and practical reasoning under certain epistemic constraints, e.g., the insufficiency of knowledge, the limitation of the time budget, etc. However, traditional approaches to relevance—as for example relevance logic, the Bayesian approach, and Description Logic—have failed to do justice to the foregoing constraints, and in this sense they are not proper tools for solving the frame problem/relevance problem. As we will argue in this paper, the Non-Axiomatic Reasoning System (NARS) can handle the frame problem in a more proper manner, because the resulting solution takes epistemic constraints on cognition seriously as a fundamental theoretical principle.
31.Do the Computational Discovery Systems Demonstrate an Empirical Refutation of the Strong Programme?: An Analysis of the Debate Arising out of the Computer Simulation of Scientific Discovery. Xiao-hong Wang. Xi'an Jiaotong University, China
In 1989 and 1991, Social Studies of Science, a significant journal in the history and philosophy of science, published a symposium on “Computer Discovery and Sociology of Scientific Knowledge” in volumes 19 and 21 respectively. Its target article was a 40-page essay by Peter Slezak, director of the Cognitive Science Center at the University of New South Wales in Australia, strikingly entitled “Scientific Discovery by Computer as Empirical Refutation of the Strong Programme”. According to Steve Woolgar, the essay gave rise to a “huge chaos” even before its publication, so the editors organized a discussion in which leading figures from the sociology of scientific knowledge, artificial intelligence, philosophy of science, cognitive science and linguistics participated.
Computationalists took the computational discovery systems in AI as an empirical refutation of the ‘strong programme’. This brought forth a debate on whether scientific discoveries are totally causally determined by social factors. This paper analyzes in detail what the ideas of mentalism and of sociologism about scientific discovery are, and shows that the key source of their divergence is whether or not to distinguish two kinds of social contexts, namely intrinsic and extrinsic social factors. We give our view of the results and the significance of this debate. Meanwhile, through an analysis of the heuristics, we conclude that the AI systems’ rediscoveries justify the rationality of distinguishing the two kinds of social factors. We cannot separate a real scientist from the broader social context, but we can do this through AI programs. Moreover, AI programs can instantiate some results of those controversies in the long history of philosophical debate.
32.Abduction, Analogy and The Logic of Discovery. Xiang Huang. Fudan University, China
The logic of discovery is usually modeled by abductive reasoning, which, according to Peirce, has the following inferential form: “The surprising fact, C, is observed; but if A were true, C would be a matter of course; hence, there is reason to suspect that A is true”. In order to model scientific discovery, the contemporary version of Peircean abduction has been related by philosophers of science to the inference to the best explanation. In both abduction and inference to the best explanation, the reason for accepting the inferred conclusion A comes from the epistemic and/or explanatory force of A. However, not all discoveries take this top-down procedure from A to C, in which the result of discovery, A, implies the observed fact C. I contend that discovery can take a bottom-up procedure based on analogy, starting from C in search of the reason A for it. The rules of this analogical reasoning are specified with the medieval Indian logician Dignaga’s triple rules. I argue that 1) both the top-down and the bottom-up approach are legitimate, and a methodological pluralism should be adopted to understand scientific discovery; and 2) many problems of discovery that are crucial for abduction and the inference to the best explanation can be avoided by the bottom-up approach; hence, the analogy-based model of discovery plays an important role in understanding the nature of scientific discovery.
33.The Equivalence and Transformation between Non-Truth-Functions and Truth-Functions: The Fifth Study of Unary Operator Theory. Xiaolong Wan. Huazhong University of Science and Technology, China
There are two main kinds of non-classical logic, modern modal logic and many-valued logic, which are directly related to non-truth-functions and non-classical truth values respectively. The axiom systems of most non-classical logics, chiefly represented by modern modal logic, consist of the axioms for truth-functions in classical logic plus additional axioms for the non-truth-functions formed by the modal word “necessary” and other non-truth-functional connectives. Unary Operator Theory systematically investigates unary operators by analyzing the inverse functions of the classical truth-functions and the relativity of non-truth-functions. It focuses first on the Special Theory of Relativity of Function (STRF), which applies only to two-valued unary non-truth-functions. Moreover, by means of the STRF, new interpretations are given of several problems that could not previously be solved by the logic of truth-functions alone.
34.On Modern Modal Logic from the Special Theory of Relativity of Function. Mingyi Chen, Xiaolong Wan. Huazhong University of Science and Technology, China
The general principle of the Special Theory of Relativity of Function (STRF) is that, for any two-truth-valued variable Hp formed by a two-truth-valued logical variable p and a unary operator H, Hp is always equivalent to a two-truth-valued function formed by p and another two-truth-valued variable q independent of p, whether or not Hp is a truth-function of p. There are only 16 unary operators and 16 corresponding basic two-valued unary non-truth-functions, because there are only 16 two-valued binary connectives and 16 corresponding basic two-valued binary truth functions. Any other two-valued unary non-truth-function simply results from the superposition of these 16 unary operators. In this sense, modern modal logic can be further considered as a classificatory study of classical bivalent truth functions in terms of first-order logical axioms and rules. Thus, any set W of possible worlds in modal propositional logic merely corresponds to a set of binary truth functions, and the relation R between possible worlds can be treated as a kind of set property shared by the set of functions. Moreover, that an axiomatic schema is valid in a frame means that, when every truth function belonging to W is substituted for each “□” in the axiomatic schema in the light of K-2, only a set of classical theorems will be formulated.
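The counting claim above (16 binary connectives, hence 16 basic cases) can be checked directly; the sketch below only illustrates that combinatorial fact and does not reproduce the STRF apparatus itself.

```python
# There are 2**(2**2) = 16 binary truth functions: each is fixed by its outputs
# on the four (p, q) input pairs. This only checks the counting claim above.
from itertools import product

rows = list(product((False, True), repeat=2))     # the four (p, q) input pairs
tables = list(product((False, True), repeat=4))   # every possible output column
print("number of binary truth functions:", len(tables))   # 16

# Example: the material conditional p -> q is one of the 16 tables.
conditional = tuple((not p) or q for (p, q) in rows)
print("truth table of p -> q:", dict(zip(rows, conditional)))
```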
35.Explaining the ‘Needham Question’ in Its Socio-cultural Context. Yi-dong Wei. Shanxi University, China
The ‘Needham question’ has become an important concept in the history of science since it was posed by Joseph Needham, and it has continually attracted historians of science from various countries to discuss it. Some affirm it, some deny it, and others want to correct it. In this paper the author will first argue that the ‘Needham question’ is a set of questions, and that any answer to only one of these questions is a one-sided view. Secondly, the author argues that the ‘Needham question’ contains many important problems concerning the outlook on science, the history of science, and the methodology of the history of science, such as the universality of science, the wholeness of science, the imbalance of science, and the comparative analysis of the context of science. Thirdly, the author argues that Needham was a socio-cultural contextualist, so his idea of the historiography of science was contextualist. Finally, the author argues that the ‘Needham question’ should be understood by comparing the Chinese context with the Western context, gives the general structure of the context of science, and analyses the Chinese and Western contexts.
The ‘Needham question’ has become a widely discussed topic among historians of science throughout the world (Liu Dun, 2000, 293), especially in Chinese academic circles, and its full implications go well beyond the more specific matters of the history of science. In this paper, the author shall use contextual analysis to explore the ‘Needham question’.
36.The Contextualist Basis of Scientific Explanation. Guichun Guo. Shanxi University, China
Because it is highly context-dependent and context-sensitive, the essence of scientific explanation unfolds and is accomplished in a contextualized, dynamic process that integrates contextual structure and elements. This integration comprises the structural unification of the syntactical basis, semantic rules, and pragmatic boundaries, and the functional fusion of explanatory questions, background, and intentionality. Within this combination, the regulation, process, and criteria of explanation, and the diversity, multiplicity, and pluralism of the relevant models, come into being. The contextualist idea can therefore significantly enhance the comprehensiveness and openness of the theory of scientific explanation. Leading to a broader notion and understanding of scientific theory and methodology, it will also encompass the intrinsic unity of, and the common value shared among, different accounts of scientific explanation.
37.Reference and the Scientific Explanation of Concepts. Yi Jiang. Beijing Normal University, China
Jody Azzouni, in his Knowledge and Reference in Empirical Science (2000), argued that confirmation holism and theoretical deductivism are wrong in reducing experience to systematic explanation in science. In this paper I argue against this claim by exploring the reference problem in science and the scientific explanation of concepts. Firstly, I will argue that the problem should be interpreted semantically: it is impossible for reference to be directly reduced to, and fulfilled by, what terms refer to. Secondly, we cannot discuss the reference problem just by questioning whether experience can be reduced to systematic explanation in science; we can get the reference of terms only through such systematic explanations. Thirdly, what we can do here is to pick out concepts for the scientific explanation of terms in dealing with the reference problem.
In his comments on Putnam’s (1967) paper on special relativity, Stein (1968) suggests that adopting both
(P1) All and only things that exist now are real
(a standard expression of “presentism”), and
(P2) Special relativity is a complete account of spatiotemporal structure
leads to “the interesting result that special relativity implies a peculiarly extreme (but pluralistic) form of solipsism.” This conclusion depends on an assumption, held in common by presentists and advocates of the block universe, concerning the role of spatiotemporal structure in grounding ontological unity. I argue that if dynamical laws, rather than spatiotemporal structure, bear the ontological weight of grounding the unity of what there is then the core thesis of presentism survives the transition to Minkowski spacetime without entailing “pluralistic solipsism” and without requiring that we reject (P2). Moreover, once unity is grounded in dynamics, the spatiotemporal extent of the present becomes an empirical matter. This latter result follows the general methodological line found in Newton’s work, in which more and more questions previously taken to be a priori metaphysical questions are transformed into empirical ones. To my surprise, not to say astonishment, I suggest that presentism as an empirical thesis is not yet dead.
2.Science Is Dappled; The World Is Not. Michael Strevens. University of New York, USA
Science as we know it is “dappled”: even within a discipline, models tend to draw the majority of their empirical content not from some general theory but from observed local regularities. The current scientific picture of the world, then, is a mosaic, with different aspects of the world, different systems, represented by different sub-theories or models that are only loosely or formally connected. Nancy Cartwright has proposed that the best explanation for this disunity in our representation of the world is a disunity in the world itself. The world is dappled, which is to say that, rather than being governed by a single set of omnipotent fundamental laws, what happens is due to a patchwork of principles covering different kinds of systems or segments of reality, each with something less than full omnipotence and with the possibility of anomic indeterminism at the boundaries. I want to undercut this argument for a dappled world by showing that the motley nature of science, both now and even when empirical inquiry is complete, can be equally well explained by proponents of the “fundamentalist” view that the universe’s initial conditions and fundamental laws of nature determine everything that ever happens.
3.How Rich Our Consciousness Is?. Allen Y. Houng. Yang Ming University, Taipei, Taiwan
In recent literature of consciousness study, there is a hot debate concerning how rich our conscious experience is. Take visual perception as an example, one camp of people maintain that when we see, say, a picture, we see a great details of the picture. The other camp of people contend that we see only those that get processed by our cognitive operations such as reasoning, attention, verbal report and motor control. Do we see more than what we can cognitively access? Or put it in another way, is phenomenology richer than cognition? In this essay, I will argue that our consciousness is much richer than what we can access. In addition to philosophical arguments, I will cite some empirical researches in cognitive neuroscience to support my thesis. I will also discuss the philosophical implications of that result.
4.Rule-Following and Sociology of Scientific Knowledge. Kai-Yuan Cheng. Institute of Philosophy of Mind and Cognition, Yang-Ming University, Taiwan
Philosophical discussions of rule-following in the later Wittgenstein (1953) are an important source of inspiration for the development of views on the social nature of scientific knowledge. Two major opposing views in this inquiry—Bloor’s sociology of scientific knowledge (SSK) (1983, 1991, 1992, 1997) on the one hand and ethnomethodological studies of work in the sciences and mathematics (ESW) of Garfinkel et. al. (1981, 1989) and Lynch et. al. (1983; 1992) on the other hand—represent two positions derived from two different readings of Wittgenstein’s rule-following considerations. Kusch (2004) has recently revived this noted Bloor-Lynch debate by finding fault with the readings of Wittgenstein on both sides of the debate. The aim of this paper is to re-examine, and hopefully to advance, the debate in question in light of Kusch’s (2004) take on these issues and of my own view on the problem of rule-following from the perspective of the study of generics (Cheng, 2011).
5.Two Accounts of Laws and Time. Barry Loewer. Rutgers University, USA
Among the most important questions in the metaphysics of science are “What are the natures of fundamental laws and chances?” and “What grounds the direction of time? I will examine some connections between these questions, discuss two approaches to answering them and argue in favor of one. Along the way I will raise and comment on a number of issues concerning the relationship between physics and metaphysics and consequences for the subject matter and methodology of metaphysics.
6.How physics mandates Darwinian adaptation and makes it irreducible. Alex Rosenberg. Duke University, USA
I show that, given reasonable requirements on any explanation of adaptations, physical processes are necessary and sufficient for Darwinian processes, and they exclude any alternative processes that might produce adaptations. I then go on to show that the thermodynamic character of the process whereby physical processes generate adaptations also makes it difficult systematically to explain them and so gives biology its appearance of autonomy and irreducibility.
7.Cases as Bases for the Epistemology of Science. Anjan Chakravartty. University of Notre Dame, USA
Case studies of scientific theory and practice, past and present, are often offered as evidence for or against scientific realism: the view that our best theories give approximately true descriptions of unobservable aspects of the world, or that their central terms refer to entities that exist in a mind-independent reality, or that they give approximative representations of the structure of scientific phenomena. But how compelling is this evidence? I consider this question in light of three arguments. The first concerns the possible immunity of realism (and antirealism) to case studies of scientific methodology. The second concerns the possible immunity of realism to considerations of the history of scientific theories that are now rejected. The third concerns a suggested inability of case study evidence to indicate which form of scientific realism is most promising.
8.Simplicity, Unification, and Ockham’s Razor. Malcolm Forster. University of Wisconsin-Madison, USA
9.Explanation by idealized theories. Ilkka Niiniluoto. University of Helsinki, Finland
Idealized scientific theories tell how natural and social systems would behave under counterfactual conditions, so that their descriptions of actual situations are at best truthlike. The use of such theories as covering laws in deductive-nomological explanations of empirical facts and regularities is therefore problematic. Duhem observed already in 1906 that Newton’s theory and Kepler’s laws in fact strictly speaking contradict each other. Attempts to analyze their relation as approximate explanation were given in the 1960s. A more systematic perspective was developed in the method of idealization and concretization by the Poznan school (Nowak, Krajewski) in the 1970s: ideal assumptions are first made explicit as antecedents of counterfactual conditionals (idealizational laws) and then these assumptions are eliminated or relaxed by modifying the consequent (concretization, factualization). This treatment covers both the so-called Galilean idealizations (simplifications for the purpose computability) and isolations (consideration of essential or pure aspects of phenomena without secondary disturbing factors). It allows us to distinguish several important types of explanations by idealizations. First, a theory T’ may entail an empirical consequence e’ which approximates a well-established empirical statement e. If e’ is a concretization of e, then this explanation is corrective. After the independent empirical confirmation of e’, this argument can used to answer the contrastive explanatory question “Why e’ rather than e?”. Secondly, T’ may itself be a concretization of some theory T which entails e. Again, the explanation of e by showing that e’ follows from T’ is corrective. Thirdly, while an empirical claim e may be approximately derivable from an idealizational law T or from an approximation of T, this derivation need not be replaced by a corrective explanation, as e may be strictly deducible from a concretization T’ of T.
10.A Muse of Fire. James Robert Brown. Toronto University, Canada
Daniel Dennett and others have rejected the use of thought experiments. This talk will be an account of Dennett's reasons for doing so, followed by a critical evaluation and rejection of them. The specific issues will involve the difference between philosophical thought experiments (which he criticizes) and those from the sciences, and also his claim that common sense "folk concepts" only get in the way of real knowledge. It will turn out that Dennetthas no case against the use of thought experiments, but the reasons for his failure are quite interesting in their own right.
11.Grounding Content. Kelly Trogdon. Lingnan University, Hong Kong
In this paper I develop and defend a novel naturalistic account of mental content, what I call the grounding theory. The account is informational in that it understands content partly in terms of certain mind-world nomic relations, and referential in that it treats the contents of neural states as properties. The account identifies a substantive sufficient condition for a property (e.g. being a horse) to be the content of a neural state (e.g. HORSE), one that appeals to the idea that certain causal facts ground other causal facts. The grounding theory is inspired by Fodor’s asymmetric dependence theory of content. I argue that the grounding theory incorporates the insights of Fodor’s theory while avoiding some of its central problems.
12.Models, Structures, and the Explanatory Role of Mathematics in Empirical Science. Mary Leng. University of York, UK
13.Laplace's Demon and the Adventures of his Apprentice. Roman Frigg. LSE, London UK
Foretelling the future is an age-old human desire. Among the methods used to pursue this goal, mathematical modelling has gained prominence, in particular in climate science. Many of these models promise to make probabilistic forecasts. This raises the question of exactly what these models deliver: can they provide the results as advertised? The aim of this paper is to urge some caution. Using the example of the logistic map, we argue that if a model is non-linear and there is even the slightest model imperfection, then treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on the trustworthiness of model results. This is nothing short of a methodological disaster: probabilistic forecasts are used in many places all the time, and the realisation that such forecasts cannot be trusted pulls the rug from underneath many modelling endeavours.
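To see the kind of effect at issue, consider the following minimal sketch in Python (not the authors' own calculation; the parameter values, the form of the structural perturbation, the lead time, and the forecast event are all assumptions chosen purely for illustration). An ensemble of nearby initial conditions is evolved once under the logistic map and once under a slightly imperfect version of it; after only a modest number of steps the two ensembles can assign very different probabilities to the same event.

    import numpy as np

    R, EPS, STEPS = 3.99, 1e-3, 10      # illustrative choices (assumptions)

    def true_map(x):
        # "True" dynamics: the logistic map x -> R x (1 - x).
        return R * x * (1.0 - x)

    def model_map(x):
        # Structurally imperfect model: the same map plus a tiny perturbation.
        return R * x * (1.0 - x) + EPS * np.sin(2.0 * np.pi * x)

    rng = np.random.default_rng(0)
    ensemble = 0.3 + 1e-4 * rng.standard_normal(10_000)   # nearby initial conditions
    truth, model = ensemble.copy(), ensemble.copy()

    for _ in range(STEPS):
        truth, model = true_map(truth), model_map(model)

    # Probabilistic forecast of the event "x > 0.5" after STEPS iterations.
    print("true system     :", np.mean(truth > 0.5))
    print("imperfect model :", np.mean(model > 0.5))

Nothing here reproduces the paper's analysis; it only illustrates how a non-linear map can amplify a small structural error into a large difference between forecast probabilities.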
14.Dispositions, Twin Earth, and Relativity of Scientific Theories. Sungho Choi. Kyung Hee University, South Korea
In their attempt to provide a better understanding of the semantics of dispositional predicates, Bird and Lipton entertain the insightful idea that there is a remarkable similarity between natural kind terms and dispositional terms. The chief goal of this paper is to give substantive content to their idea, which will lead to the externalist thesis that the prevalent feature of the ascriber’s environment makes a difference to the overall semantic content of a dispositional ascription. To justify this thesis, I will bring into focus the issue of how dispositional concepts are used for a variety of classificatory and inferential purposes in everyday and scientific practice, which will provide a theoretical basis for rejecting what I call the no-interference reasoning, a line of reasoning that I think has been at least implicitly endorsed by a number of metaphysicians of dispositions. I maintain that the neglect of this practical dimension of our use of dispositional concepts is largely responsible for the failure of some prominent proposals on the semantics of dispositional predicates. I will close by explaining the lessons regarding scientific practice that we can draw from the externalist thesis advanced in this paper.
15.“Bridled Irrationality”: Historical Antecedents of Bas van Fraassen’s Epistemic Voluntarism. Kathleen Okruhlik. Department of Philosophy, The University of Western Ontario, Canada
Bas van Fraassen’s earliest work was in the philosophy of time and space. He developed his views in the context of the conventionalisms of Hans Reichenbach and Adolf Grünbaum, his dissertation supervisor. Conventionalism highlights the necessity of making choices in science. Whether it should be considered a form of epistemic voluntarism is a more difficult question, which will be examined in the latter parts of this paper.
Van Fraassen’s constructive empiricism and his more recent empiricist structuralism, on the other hand, are both couched in the context of a very explicit commitment to epistemic voluntarism. In his 1984 article called “Belief and the Will”, van Fraassen draws on St. Augustine, Pascal, and William James to make his argument that diachronic incoherence is to be avoided by replacing the descriptivist view of epistemic judgment with an alternative view that makes expressing a belief similar to making a promise. He calls this view “voluntarist” because “it makes judgment in general, and subjective probability in particular, a matter of cognitive commitment, intention, engagement.” Belief, he says, “is a matter of the will.”
In Laws and Symmetry (1989), after coining the slogan that “Rationality is only bridled irrationality”, van Fraassen goes on to say that “in rationality as in morality, what is not required is still not always forbidden. This means…that rational change of opinion is not a case of rule-following. It includes an element of free choice… . The factors relevant to the free choices…are in the domain of pragmatics.” (p. 176) In The Empirical Stance (2002), scientific revolutions (Kuhnian paradigm changes) are described as conversion processes that depend on emotional transformation. Much of the language used here is explicitly Sartrean, and in recent work, van Fraassen’s epistemic agent has come more and more to resemble Sartre’s existential protagonist. In science as in morality, we are condemned to freedom.
This paper will try to sort out the historical traditions that have shaped van Fraassen’s epistemic voluntarism. It will cross the Kantian divide only a little bit. St. Augustine and Pascal figure in the analysis, but the paper will deal mainly with three post-Kantian traditions: conventionalism, pragmatism, and existentialism, all of which underwent periods of rapid development early in the twentieth century. There is ample evidence of mutual influence, and sometimes individuals are located in more than one tradition. So, for example, in her 1912 MA thesis called Pragmatism and French Voluntarism, Susan Stebbing denies the claim made by others that LeRoy and Poincaré can be classified as pragmatists and locates them instead within the tradition of French voluntarism. Others, of course, have classified the same figures as conventionalists or sometimes “radical conventionalists”.
My hope is to tell a story about van Fraassen’s historical antecedents that will explain the prima facie tensions I see between the existentialist and pragmatic dimensions of his epistemic voluntarism.
16.What Kind of Progress Occurred When Genes Went Molecular?. Paul E. Griffiths (co-authored with Dr Karola Stotz). University of Sydney, Australia
The result of the molecular revolution in genetics was not that the Mendelian gene was reduced to, or eliminated in favour of, the molecular gene. Although the new, molecular identity of the gene became its dominant identity, the other, Mendelian identity did not simply go away. The gene retains its Mendelian identity in certain areas of biological research, namely those continuous with classical genetic analysis. We give examples of contemporary research in which it is necessary to think of genes as both Mendelian alleles and molecular genes, even when those two identities do not converge on the same pieces of DNA. This explains the well-known difficulties encountered by philosophers who have tried to explain how the Mendelian gene was reduced to molecular biology. The attempt to assimilate the development of genetics to the replacement of an old theory by a new theory resulted from the failure to recognise what is now widely accepted, namely that Mendelian and molecular genetics are different experimental practices, rather than different theories of the gene.
17.How Fundamental Physics Represents Causality. Andreas Bartels & Daniel Wohlfarth. University of Bonn, Germany
According to the Neo-Russellian view (e.g. Price & Corry 2007), causality has no place within fundamental physics. I discuss some arguments for and against this thesis (e.g. Norton 2007 and Price 2007 on the one side, S. Smith 2007 and M. Frisch 2012 on the other), with the result that (i) no convincing argument has been provided to the effect that fundamental physics does not represent causation, and (ii) even defenders of causality in physics refrain from anchoring causation within fundamental structures (see e.g. Frisch’s account, according to which the causal resources of physics models are derived from their embedding into enriched causal structures).
Contrary to the view that causation is not a matter of fundamental physics, I will show that the spacetimes of General Relativity have intrinsic structure to represent causal relations, at least with respect to two fundamental properties, time-asymmetry and energy flow between events. Firstly, it can be shown that “almost all” spacetimes in which a global cosmic time order exists are time-asymmetric with respect to the cosmic time parameter (Wohlfarth 2012). Secondly, there exist physical timelike vector fields by which this global time-asymmetry can be transferred in an overall consistent way to the local proper time parameter. Thus the local time asymmetry grounding causal relations can be derived from the global structure of spacetime. Thirdly, the 4-vector representing local energy flow according to the standard interpretation is a candidate for the above-mentioned physical timelike vector field. It can thus serve to represent causal relations by time-asymmetric energy flow. In sum, this shows that, even if causation is not represented by the fundamental laws of physics directly, fundamental structures within their solutions do have the resources to represent causation.
18.On Forster’s Counterexamples to the Likelihood Theory of Evidence. Jiji Zhang. Lingnan University, Hong Kong
Forster (2006) presented some interesting examples having to do with distinguishing the direction of causal influence between two variables, which he argued are counterexamples to the likelihood theory of evidence (LTE). His more general contention was that a likelihoodist or Bayesian philosophy of science based on the LTE could not capture the consilience of inductions. In this paper we examine Forster’s argument that his examples constitute counterexamples to the LTE, and argue that it relies on an intuition that likelihoodists have forcefully criticized. We then present a theorem that reveals a general and systematic connection between likelihood and (a reasonable interpretation of) consilience, in the kind of causal inference considered by Forster. The theorem shows that at least for such problems, likelihood-based methods can well accommodate the consilience of inductions. (This is joint work with Kun Zhang.)
19.The Promise of Science. Janet A. Kourany. University of Notre Dame, USA
At the dawn of modern science, a promise was made. If society would but support the new enterprise, society would be richly rewarded not only with unprecedented insights into the workings of the universe but also with all the benefits that such insights would provide. Indeed, Francis Bacon, one of the main architects of the new science as well as one of its more exuberant press agents, promised that the knowledge science would offer would “establish and extend the power and dominion of the human race over the universe” for the benefit of all of humankind. Now, centuries later, has the promise been kept? Certainly, science has given us much control over nature—more diverse, more abundant foods to eat, produced more quickly and easily; more comfortable, more attractive places to live with more conveniences, built more efficiently; quicker, more convenient ways to get around and to communicate with each other; and so on. Yet, science has given us far less control over human nature—increased longevity; freedom from disease, disability, and suffering; expanded mental and physical abilities; personality traits more amenable to our needs, and so on. Now, in the 21st century, however, all these are the anticipated outcomes of current technoscientific developments. More specifically, all these are the anticipated outcomes of four emerging technosciences—nanotechnology, biotechnology, cognitive science, and information technology. And since these technosciences are not only emerging, but also converging—that is, since they interconnect and assist each other in myriad ways—the new modes of control are said to lie close to the horizon, and some are available even now. Carried to the utmost, the result will be humans with such radically expanded capacities as to be no longer humans at all—“posthumans.” Or so it is claimed. But is this so-called “human enhancement” project a legitimate part of Bacon’s promise for science? That is the question I would like to explore. I will argue that it is not, at least not at present.
20.Scaffolding Understanding: featuring causality. Sergio F. Martínez. Universidad Nacional Autónoma de México, Mexico
The search for understanding is a major aim of science. Traditionally, understanding has been undervalued in the philosophy of science because it is not possible to detach it from its psychological underpinnings; nowadays, however, it is widely recognized that epistemology cannot be divorced from psychology. This eliminates the main obstacle to giving the concept of understanding due attention in philosophy of science. My aim in this paper is to describe an account of scientific understanding as an emergent feature of our mastering of different (causal) explanatory frameworks. Mastering of explanatory frameworks involves the mastering of scientific practices and the development of appropriate ontologies that allow for the integration of very different phenomena into more general and more tightly related explanations. Clarifying my proposal requires the elaboration of underlying views about the relation between explanation and understanding, as well as about the role of processes of abstraction in generating the appropriate ontologies (promoting understanding). I will use examples from different disciplines, mainly from biology and the cognitive sciences to make my point.
21.The Veracity of Empirical Data: A Causal Account of Observational Error. James W. McAllister. University of Leiden, Netherlands
Almost every writer on science since the seventeenth century has agreed that it is possible that empirical data – the results of observations and measurements – are false, untrustworthy or uninformative. Scientists and philosophers of science use the concept of observational error to refer to the degree or extent to which empirical data fail in this way. In this talk, I argue that it is impossible on conceptual grounds that empirical data are anything other than entirely veridical. I retrace the origin of the concept of observational error to intentional accounts of the production of empirical data, which, though useful in some respects, are misleading. I propose an alternative, causal account that provides a better understanding of how empirical data are produced and in which the concept of observational error does not arise.
22.Lawful Explanation and Manipulationist Account. Wei Wang. Tsinghua University, China
Over the past decade, James Woodward’s manipulationist account has become the most promising approach to scientific explanation. This paper argues for lawful explanation by revisiting the comparison between his notion of invariance and traditional concepts such as laws and ceteris paribus (CP) laws. The author agrees that Woodward’s concept of invariance improves our understanding of laws and CP laws, and therefore of scientific explanation. However, the manipulationist account should be a complement to, rather than a replacement of, lawful explanation. Lawful explanation helps us to “Think Global”, while the manipulationist account is good for “Act Local”.
23.Hierarchy, Form, and Reality. Gang Cheng. Huazhong University of Science and Technology, China
Scientific progress in the 20th century has shown that the structure of the world is hierarchical. A philosophical analysis of this hierarchy has obvious significance for metaphysics and for philosophy in general. Jonathan Schaffer's paper "Is There a Fundamental Level?" provides a systematic review of work in the field, of the difficulties facing various versions of fundamentalism, and of the prospects for the third option, i.e., treating each level as ontologically equal. The purpose of this paper is to provide an argument for the third option. The author applies Aristotle's theory of matter and form to the discussion of the hierarchy and develops a form realism, which grants every level "full citizenship in the republic of being." This constitutes an argument against ontological and epistemological reductionism. A non-reductive theory of causation is also developed against the fundamentalist theory of causation.
24.Mechanism and Productive Event. Zhu Xu. Graduate University of China Academy of Sciences, China
This article is about the theory of causation; in particular, it is about the mechanist theory of causation. The regularist theory of the Humean tradition was the received view in the twentieth century. In the last two decades, several philosophers of science have paid more attention than ever before to causation and explanation in the special sciences. Some of them defend the interventionist theory, which conceptualizes causation in terms of manipulability among variables, namely as “invariance under intervention” (Hausman and Woodward 1999; Woodward 2000, 2003). Others are mechanist theorists, who argue that the notion of mechanism, though it differs from the mechanistic philosophy of the seventeenth century, captures causation in the special sciences, such as biology, neuroscience, and the social sciences, in both an ontological and a methodological sense (Glennan 1996, 2002; Machamer, Darden, and Craver 2000). Though both are critical of the regularist view, the interventionist and mechanist theories still compete with each other (Machamer 2004; Psillos 2004; Woodward 2002, 2011).
The central problem in the following discussion is the characterization of production. The mechanist theory appeals to a general intuition about causality, namely that causes play a productive role in bringing about their effects. Mechanist theorists believe that production cannot be correctly represented in terms of causal dependence. Proponents of interventionist theories, however, can insist that there is nothing distinctive about production over and above dependence among variables, and that the mechanist theory merely provides detailed spatiotemporal information about “invariance under intervention” rather than any distinctive position on causation. Therefore, a plausible characterization of production must be put forward in order to justify the mechanist theory.
There are three typical attempts at this issue. Machamer (2004) maintains the irreducibility of production, based on the difference between activities, represented as verb-forms in natural language, and correlations among variables, usually described in a canonical way. Hall (2004) gives a reductive analysis of production, parallel to the counterfactual analysis of dependence. Craver (2007) seems to hold an intermediate position, which admits manipulability as an empirical criterion for mechanisms while denying that models constructed merely from dependence among variables qualify as mechanisms.
This article opens with a brief introduction to the mechanist theory, discussing its components and its competition with the interventionist theory. The second section turns to the three approaches to conceptualizing production mentioned above. I will argue that all of them are entangled in a deep dilemma which prevents these attempts, however heuristically valuable, from succeeding. One horn of the dilemma is that, if we accept the irreducibility of production, the mechanistic description may serve only as what Craver calls “filler terms”, which characterize the mechanism in a merely superficial and uninformative way. The other horn is that, if we make the notion explicit through canonical or functional representation, the typically accepted apparatus of analysis in the sciences, we in effect place production, alongside dependence, under canonical representation, and thereby lose sensitivity to the distinctiveness of production.
My suggestion is that the dilemma can be resolved by a particular theory of events based on the notion of change. A mechanism is a special type of productive event; that is the main point I want to elucidate in this article. I will also argue that this view contributes to the resolution of the dilemma and, further, to the justification of the mechanist theory of causation.
25.Mixed Methods in Social Research: Paradigm and Design Issues. Pei Zhang. Tianjin University of Finance and Economics, China
In spite of its wide use in many of the social sciences including applied linguistics and education, Mixed Methods Research (MMR) remains an area where scholars do not agree on many basic issues. The paradigm and design issues are among the most disputed. Through a critical review of the literature, this paper discusses and critiques some of the most important propositions on MMR’s paradigmatic foundations including pragmatism and the dialectic stance. It suggests that it is the autonomous and independent nature of research methods that provides the foundation for MMR, and that an over-emphasis on paradigmatic issues does no credit to the idea that methods are free and the advocacy that MMR is about ‘the freeing of the investigative impulses.’ This paper also discusses the major factors or dimensions that a researcher needs to consider in the design of MMR. These factors include the priority or dominance given to qualitative or quantitative components, the implementation or sequence of data collection, and the stage of integration. It argues against the idea that there needs to be a core method in MMR because equal priority is both theoretically possible and practically successful. It suggests that in MMR integration can take place at different stages of the inquiry process, yet the decision on where to integrate is a difficult one for the researcher to make as it may involve questions on qualitative/quantitative distinctions, which are inevitably open to debate.
26.The Statistical Explanation, Causal Explanations and Sure Thing Principle. Hao Hu. South China Normal University, China
In population biology, there are three models that use trait fitness to explain and predict changes in the structure of a population. Both the two-force model and the single-force model regard trait fitness as a kind of causal property; Denis Walsh, however, takes it to be statistical. I try to spell out Walsh’s convincing arguments against the causal explanation of trait fitness. In particular, drawing on Judea Pearl’s “sure thing principle”, his argument shows that in Gillespie’s model the causal explanation of trait fitness faces Simpson’s paradox and is therefore inconsistent. Thus, at present the statistical explanation programme has some advantages over the causal explanation programme for trait fitness.
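The kind of reversal at issue can be illustrated with a toy calculation (the counts below are invented for illustration and are not drawn from Gillespie's model or Walsh's discussion). A trait type can have the higher survival rate within each sub-environment and yet the lower survival rate in the pooled population, so that pooled and within-context fitness comparisons point in opposite directions:

    # Invented survivor/total counts for two trait types in two environments.
    data = {
        "env1": {"A": (81, 87),   "B": (234, 270)},
        "env2": {"A": (192, 263), "B": (55, 80)},
    }

    # Within each environment, trait A has the higher survival rate.
    for env, groups in data.items():
        for trait, (s, n) in groups.items():
            print(f"{env}, trait {trait}: survival rate {s / n:.2f}")

    # Pooling the environments reverses the ordering (Simpson's paradox).
    for trait in ("A", "B"):
        s = sum(data[env][trait][0] for env in data)
        n = sum(data[env][trait][1] for env in data)
        print(f"pooled, trait {trait}: survival rate {s / n:.2f}")

With these numbers trait A does better in each environment (0.93 vs 0.87, and 0.73 vs 0.69) but worse overall (0.78 vs 0.83), because the two traits are unevenly distributed across environments with very different baseline survival.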
27.Against “Information-theoretical Paradigm”. Hong Fang Li. Wuhan University, China
We critically discuss the information-theoretical paradigm and its philosophical implications. We hold that information and matter are mutually constituting and co-existing, and thus enjoy equal ontological status: quantum information plays an important role in determining the quantum mechanical aspects of matter in concrete contingent situations, and vice versa. This is a kind of informational constructivism.
28.Contextual Analysis of Gauge Argument. Jitang Li, Gui-chun Guo. Soochow University, China
According to contemporary physics, there are four basic interactions in nature, corresponding to the strong, weak, electromagnetic, and gravitational forces. The strong interaction is well described by quantum chromodynamics. The electroweak interaction consists of the electromagnetic and weak interactions, which are described by the unified electroweak theory. The gravitational force is well described by general relativity. At the centre of the unified electroweak theory and quantum chromodynamics lies Yang-Mills theory, and both are non-Abelian gauge theories. Electromagnetism was the first gauge theory, and it is an Abelian gauge theory. General relativity can also be counted as a gauge theory distinct from Yang-Mills theory. That is to say, electromagnetic gauge theory, general relativity, and Yang-Mills gauge theory can all be regarded as gauge theories within a unified framework. Why are the different interactions gauge theories? The reason offered is the so-called gauge argument. From the viewpoint of the relationship between mathematical structures and physical structures, we study the gauge argument in detail by means of contextual analysis.
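For orientation, the textbook form of the gauge argument in the Abelian (electromagnetic) case can be sketched as follows (a standard schema, not this paper's own analysis): a matter field with a global phase symmetry is required to be invariant under local transformations with a spacetime-dependent parameter, which forces the replacement of the ordinary derivative by a covariant derivative and thereby introduces the gauge potential,

    \psi(x) \to e^{iq\alpha(x)}\,\psi(x), \qquad
    \partial_\mu \to D_\mu = \partial_\mu + iqA_\mu(x), \qquad
    A_\mu(x) \to A_\mu(x) - \partial_\mu\alpha(x),

so that demanding local symmetry appears to generate the interaction field itself. How much physical content this derivation really carries is what a contextual analysis of the gauge argument is meant to assess.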
29.Mapping the Intellectual Structure of the Life Cycle Assessments Field through Co-citation Analysis. Ge Qian. Shanghai University of Finance and Economics, China
Life cycle assessment has been one of the hottest topics in environmental engineering. This paper aims to assess the evolution of life cycle assessment as a research field by using scientometrics and scientific visualization techniques. CiteSpace II was used to map the intellectual structure of the life cycle assessment field on the basis of 3824 articles on this topic in the ISI Web of Science database published between 1972 and 2011, and the co-citation maps analysed and visualized here show the major areas of research, prominent articles, and the major knowledge producers and journals in the field. The two editions of the standard issued by the International Organisation for Standardization (“ISO 14040”, 1997 and 2006) appear to be the first and second most influential sources, and the book by Guinee JB et al. (“Handbook on life cycle assessment: operational guide to the ISO standards”, 2002) is the third most highly cited document. INT J LIFE CYCLE ASS is the journal most frequently cited by authors writing on life cycle assessment. The International Organisation for Standardization (ISO) is the most frequently co-cited author, followed by Guinee JB and Goedkoop M. The hottest keywords in this field appear to be life cycle assessment, energy, systems, emissions, sustainability, management, model, and impact assessment.
30.The frame problem, the relevance problem, and a package solution to both. Yingjin Xu, Pei Wang. Fudan University, China
As many philosophers agree, the frame problem concerns how an agent may efficiently filter out irrelevant information in the process of problem-solving. Hence, how to solve this problem hinges on how to properly handle semantic relevance in cognitive modeling, an area of cognitive science that deals with simulating human cognitive processes in a computerized model. By “semantic relevance”, we mean certain inferential relations among acquired beliefs which may facilitate information retrieval and practical reasoning under certain epistemic constraints, e.g., the insufficiency of knowledge, the limitation of the time budget, etc. However, traditional approaches to relevance (for example, relevance logic, the Bayesian approach, and Description Logic) have failed to do justice to the foregoing constraints, and in this sense they are not proper tools for solving the frame problem/relevance problem. As we will argue in this paper, the Non-Axiomatic Reasoning System (NARS) can handle the frame problem in a more proper manner, because the resulting solution takes epistemic constraints on cognition seriously as a fundamental theoretical principle.
31.Do the Computational Discovery Systems Demonstrate an Empirical Refutation of the Strong Programme?: An Analysis of the Debate Arising out of the Computer Simulation of Scientific Discovery. Xiao-hong Wang. Xi'an Jiaotong University, China
In 1989 and 1991, Social Studies of Science, a significant journal in the history and philosophy of science, published a symposium on “Computer Discovery and the Sociology of Scientific Knowledge” in volumes 19 and 21 respectively. The target article was a 40-page essay strikingly entitled “Scientific Discovery by Computer as Empirical Refutation of the Strong Programme” by Peter Slezak, director of the Cognitive Science Centre at the University of New South Wales, Australia. According to Steve Woolgar, the essay gave rise to “huge chaos” even before its publication, so the editors organized this discussion, with participants drawn from leading figures in the sociology of scientific knowledge, artificial intelligence, philosophy of science, cognitive science and linguistics.
Computationalists took the computational discovery systems in AI as an empirical refutation of the ‘strong programme’. This brought forth a debate on whether scientific discoveries are wholly causally determined by social factors. This paper analyzes in detail the mentalist and sociologist positions on scientific discovery, and shows that the key source of their divergence is whether two kinds of social context, intrinsic and extrinsic social factors, are distinguished. We give our view of the results and significance of this debate. Meanwhile, through an analysis of the heuristics involved, we conclude that the AI systems’ rediscoveries justify distinguishing the two kinds of social factor. We cannot separate real scientists from their broader social contexts, but we can do so through AI programs. Moreover, AI programs can instantiate some results of these controversies from the long history of philosophical debate.
32.Abduction, Analogy and The Logic of Discovery. Xiang Huang. Fudan University, China
The logic of discovery is usually modeled by abductive reasoning, which, according to Peirce, has the following inferential form: “the surprising fact, C, is observed; but if A were true, C would be a matter of course; hence, there is reason to suspect that A is true”. In order to model scientific discovery, the contemporary version of Peircean abduction has been related by philosophers of science to inference to the best explanation. In both abduction and inference to the best explanation, the reason for accepting the inferred conclusion A comes from the epistemic and/or explanatory force of A. However, not all discoveries take this top-down procedure from A to C, in which the result of discovery, A, implies the observed fact C. I contend that discovery can also take a bottom-up procedure based on analogy, starting from C in search of the reason A for it. The rules of this analogical reasoning are specified by means of the medieval Indian logician Dignaga’s triple rules. I argue that (1) both the top-down and the bottom-up approaches are legitimate, and a methodological pluralism should be adopted in understanding scientific discovery; and (2) many problems of discovery that are crucial for abduction and inference to the best explanation can be avoided by the bottom-up approach; hence, the analogy-based model of discovery plays an important role in understanding the nature of scientific discovery.
33.The Equivalence and Transformation between Non-Truth-Function and Truth-Function: The Fifth Research on Unary Operator Theory. Xiaolong Wan. Huazhong University of Science and Technology, China
There are two main kinds of non-classical logic, modern modal logic and many-valued logic, which are directly related to non-truth-functions and non-classical truth-values respectively. The axiom systems of most non-classical logics, chiefly represented by modern modal logic, consist of the axioms for truth-functions in classical logic plus additional axioms for non-truth-functions formed by the modal word "necessary" and other non-truth-functional connectives. Unary Operator Theory systematically studies unary operators by analyzing the inverses of classical truth-functions, on the basis of the relativity of non-truth-functions. It focuses first on the Special Theory of Relativity of Function (STRF), which applies only to two-valued unary non-truth-functions. Moreover, the STRF yields new interpretations of several problems that could not previously be solved by the logic of truth-functions alone.
34.On Modern Modal Logic from the Special Theory of Relativity of Function. Mingyi Chen, Xiaolong Wan. Huazhong University of Science and Technology, China
The general principle of the Special Theory of Relativity of Function (STRF) is that, for any two-valued variable Hp, formed from a two-valued logical variable p and a unary operator H, Hp is always equivalent to a two-valued function of p and another two-valued variable q independent of p, whether or not Hp is a truth-function of p. There are only 16 unary operators and 16 corresponding basic two-valued unary non-truth-functions, because there are only 16 binary two-valued connectives and 16 corresponding basic binary two-valued truth-functions. Any other two-valued unary non-truth-function results from the superposition of these 16 unary operators. In this sense, modern modal logic can be regarded as a classificatory study of classical bivalent truth-functions in terms of first-order logical axioms and rules. Thus any set W of possible worlds in modal propositional logic corresponds merely to a set of binary truth-functions, and the relation R between possible worlds can be treated as a kind of set property shared by that set of functions. Moreover, that an axiom schema is valid in a frame means that substituting every truth-function belonging to W for each “□” in the schema, in the light of K-2, yields only a set of classical theorems.
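The counting claim the framework relies on, that there are exactly 16 binary two-valued connectives, is easy to check by brute force. The short Python sketch below (ours, added purely for illustration) enumerates all 2^4 truth tables and labels a few familiar ones:

    from itertools import product

    # Each binary connective is a map from the four input pairs (p, q) in {0,1}^2
    # to an output in {0,1}; there are 2**4 = 16 such maps.
    inputs = list(product([0, 1], repeat=2))        # (0,0), (0,1), (1,0), (1,1)
    tables = list(product([0, 1], repeat=4))        # one output per input pair
    print(len(tables))                              # prints 16

    familiar = {(0, 0, 0, 1): "and", (0, 1, 1, 1): "or",
                (0, 1, 1, 0): "xor", (1, 1, 0, 1): "if-then"}
    for table in tables:
        print(dict(zip(inputs, table)), familiar.get(table, ""))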
35.Explaining the ‘Needham Question’ in Its Socio-cultural Context. Yi-dong Wei. Shanxi University, China
The ‘Needham question’ has become an important topic in the history of science since it was posed by Joseph Needham, and it has attracted continual discussion from historians of science in many countries. Some affirm it, some deny it, and others want to correct it. In this paper, the author will first argue that the ‘Needham question’ is a set of questions, and that any answer to just one of these questions is a one-sided view. Secondly, the author argues that the ‘Needham question’ contains many important problems concerning views of science, the history of science, and the methodology of the history of science, such as the universality of science, the wholeness of science, the imbalance of science, and the comparative analysis of the context of science. Thirdly, the author argues that Needham was a socio-cultural contextualist, so his historiography of science was contextualist. Finally, the author argues that the ‘Needham question’ should be understood by comparing the Chinese and Western contexts; the paper sets out the general structure of the context of science and analyses the Chinese and Western contexts.
The ‘Needham question’ has become a widely discussed topic among historians of science throughout the world (Liu Dun, 2000, 293), especially in Chinese academic circles, and its full implications go well beyond the more specific matters of the history of science. In this paper, the author uses contextual analysis to explore the ‘Needham question’.
36.The Contextualist Basis of Scientific Explanation. Guichun Guo. Shanxi University, China
Being highly context-dependent and context-sensitive, scientific explanation unfolds and is accomplished in a contextualized dynamic process, an integration of contextual structure and elements. This integration comprises the structural unification of syntactic basis, semantic rules, and pragmatic boundaries, together with the functional fusion of explanatory question, background, and intentionality. Within this combination, the rules, process, and criteria of explanation, as well as the diversity, multiplicity, and pluralism of the relevant models, come into being. The contextualist idea can thus significantly enhance the comprehensiveness and openness of the theory of scientific explanation. Leading to a broader notion and understanding of scientific theory and methodology, it also encompasses the intrinsic unity and common value shared among different accounts of scientific explanation.
37.Reference and the Scientific Explanation of Concepts. Yi Jiang. Beijing Normal University, China
In Knowledge and Reference in Empirical Science (2000), Jody Azzouni argued that confirmation holism and theoretical deductivism are wrong to reduce experience to systematic explanation in science. In this paper I argue against this claim by exploring the reference problem in science and the scientific explanation of concepts. First, I argue that the problem should be interpreted semantically: reference cannot be reduced directly and exhaustively to what terms refer to. Second, we cannot discuss the reference problem merely by questioning whether experience can be reduced to systematic explanation in science; we can get at the reference of terms only through such systematic explanations. Third, in dealing with the reference problem, what we can do is to pick out concepts for the scientific explanation of terms.