Section editor: Vincenzo Fano


Luciano Boi, Fondamenti geometrici e problemi filosofici dello spazio-tempo. Dalla relatività generale alla teoria delle supercorde, 16-2-2012 ©
The answers to some of the longstanding issues of twentieth-century theoretical physics, such as the incompatibility between general relativity and quantum mechanics, the broken symmetries of the electroweak force acting at the subatomic scale, the missing mass of the Higgs particle, and also the cosmic singularity and dark matter and energy, appear to be closely related to the problem of the quantum texture of space-time and the fluctuations of its underlying geometry. Each region of the space landscape seems to be filled with woven and knotted spacetime networks: spacetime has immaterial curvature and structures, such as topological singularities, and obeys the laws of quantum physics. Thus, it is filled with potential particles, pairs of virtual matter and anti-matter units, and potentialities at the quantum scale. For example, quantum entities (like fields and particles) have both wave (i.e., continuous) and particle (i.e., discrete) properties and behaviors. At the quantum level of space-time (precisely, the Planck scale) such properties and behaviors could emerge from some underlying (dynamic) phase space related to some field theory. Accordingly, these properties and behaviors leave their signature on objects and phenomena in the real Universe. In this paper we consider some conceptual issues of this question.


Angelo Vistoli, Groupoids: a local theory of symmetry, 26-9-2011 ©
The theme of symmetry is of great interest to mathematicians, physicists, chemists, biologists, psychologists, philosophers, and others. The very word "symmetry" is used with a wide variety of meanings; I will only discuss the way it is used in mathematics. In fact, even this seems to me too ambitious a goal. Symmetry permeates every field of mathematics, and I do not have the intention, and even less the ability, to give a comprehensive picture of its multifaceted aspects. The mathematical analysis of the concept has traditionally been based on the theory of group actions. As we shall discuss, this notion is global; that is, the symmetries of a structure (geometric or otherwise) always involve the whole structure. It is natural, on the other hand, to talk about local symmetries, symmetries that appear only among certain parts of the structure itself. Mathematicians have a local theory of symmetry, which is known as the theory of groupoids. However, its existence does not seem to have been really noticed outside of the communities of mathematicians and theoretical physicists; the only place in the philosophical literature where I have seen it discussed is Corfield's book (Corfield 2003), which, I am afraid, has not been read by many philosophers, because of the vast mathematical background it requires. The very modest purpose of this note is to give a quick introduction to symmetry in mathematics, and to point out the existence of a mathematical analysis of the notion of local symmetry to philosophers and others who may be interested in this theme. No originality whatsoever is claimed for any of the ideas presented here.

Stefano Bordoni, Re-thinking a Scientific Revolution: An inquiry into late nineteenth-century theoretical physics, 20-9-2011 ©
In the early 1890s, before his well-known experiments on cathode rays, J.J. Thomson outlined a discrete model of electromagnetic radiation. In the same years, Larmor was trying to match continuous with discrete models of matter and electricity. Starting from Faraday's tubes of force, J.J. Thomson put forward a reinterpretation of the electromagnetic field: energy, placed both in the tubes of force and in their motion, spread and propagated in discrete units, in accordance with a theoretical model quite different from Maxwell and Heaviside's. Larmor developed a different theoretical model, wherein electrons, discrete units of matter and electricity, stemmed from the continuous structure of the aether. Both of them tried to realise an original integration between two different British traditions: Maxwell's contiguous action applied to electrodynamics, and W. Thomson's kinetic model of matter. Although Larmor and J.J. Thomson's specific theoretical models were formally dismissed after the deep transformations which took place in theoretical physics in the first decades of the twentieth century, I find a persistence of commitments and conceptions. This conceptual link has been overlooked in more recent secondary literature. What appears as a sort of missing link in recent historical studies was seriously taken into account by contemporary physicists. Nevertheless, authoritative physicists like Planck and Millikan were led astray by oversimplifications and misinterpretations. In order to appreciate the continuity between the late nineteenth-century electromagnetic theories which emerged in the British context and early twentieth-century new physics, we should disentangle the different levels of that theoretical physics. We should distinguish first-level specific theoretical models from second-level, more general conceptions or conceptual streams.

Carlo Maria Cirino, Osservazioni sulla nozione di rovesciamento freudiano nell’opera di Georges Lapassade, 26-6-2011 ©
This paper aims to follow the steps that led Freud to abandon the paradigm of consciousness dissociation, useful for explaining the psychical mechanism of hysterical phenomena and, more generally, of personality disorders. By analyzing Georges Lapassade's observations, we will show how, despite his attempts to assert the contrary, Freud is in line with his historical predecessors, Franz Anton Mesmer and Pierre Janet. Hence, within the new historical framework, the interpretation of disease symptoms (the cornerstone of the psychoanalytic therapy developed by Freud) becomes only a means to conceal the reality of consciousness dissociation. For fear of the unknown, Freud turns away from one of the most burning pieces of evidence about the nature of the human mind: the likelihood that consciousness is not unique and indivisible.

Clelia Sedda, Gino Tarozzi, Vedere e rivedere l’inosservabile doppia natura della realtà quantistica, 10/11/2010 ©
The very historical origin of cinema appears strongly connected to the demands of scientific research, owing to the aptitude of the language of motion pictures both to record physical reality in its dynamic aspect, allowing one to analyze those types of motion either too slow or too fast to be immediately perceived, and to give access to those phenomena which are not directly observable but are able to impress photographic emulsions.
With the transition from the individual vision of the scientist to collective viewing in movie theatres, cinematographic language seems to have gradually lost its function of objective representation of reality, and in particular of those forms and aspects of reality which cannot be directly observed.
Two short films, made between the '70s and '80s of the last century, the former by the physicists Merli, Missiroli and Pozzi of the University of Bologna, the latter by a group of Japanese physicists directed by Tonomura, seem, however, to confirm the possibility that cinema may recover its historic role as a scientific language. These pictures, showing the phenomenon of interference or rather, as we shall see, self-interference of electrons, allow one, in fact, to view the dual nature, undulatory and corpuscular, of physical reality at the elementary level, a duality considered unobservable on the grounds of the standard interpretations of quantum mechanics.

Timothy Tambassi, Un’ontologia quadripartita. La proposta di Jonathan Lowe, 10/1/2010 ©
In the framework of analytical ontology there has recently been a renewed interest in categorization, which has contributed, in addition to a greater awareness of its fields of application and limits, to a lively discussion about what ontological categories should be considered as fundamental and how they should be organised into a system containing all types of existing entities. Part of this debate involves Jonathan Lowe's ontological proposal, here presented and discussed through the analysis of his ontological system, which recognizes four basic categories: Kinds, Attributes, Objects and Modes. According to Lowe, these categories have their own a priori ontological status and supply a reference frame by which the conditions of existence and identity of all entities are defined.

Giovanni Macchia, La metafisica dei poteri causali, 17/5/2009 ©
The necessitarian position on laws of nature roughly states that nomological statements are necessary: they could not have been otherwise than they are, at least in our world. Even though conceptions of laws are largely independent of the syntactic or semantic views on theories, they have been developed mostly in the context of syntactic views (since laws are statements). The semantic conception, developed in the 1960s, understands theories as classes of models, rather than as linguistic items organized in an axiomatic system, attaining, in this way, a philosophical account of theories more in line with the practices of scientists. Accordingly, laws are about entities and processes represented by a model. In this semantic context (which even permits the elimination of laws altogether, as in van Fraassen's thought), the central idea concerning the necessitarian approach we are going to analyse is that the truth of (necessary) nomological statements, expressing regularities, is grounded on other necessary statements dealing with particular dispositions, called causal powers, regarded as properties that things essentially possess and in virtue of which they tend to display a characteristic behaviour under suitable circumstances. According to this view, laws are not merely contingently necessary statements, as in the ADT necessitarian view, but hold true in all metaphysically possible worlds.
We will survey the philosophical debate on causal powers, focusing both on their richness and their ambiguities, with an in-depth exploration of Michel Ghins’ thought and his defence of this metaphysics.

Giovanni Macchia, Leggi di natura: metafisica nella scienza?, 24/12/2008 ©
In many fields of human knowledge, the laws of nature play an unquestionably central role. It is usually thought that one of science's chief goals is to discover these laws, while philosophy, in turn, has to explicate what laws really are. In this paper we will attempt to realize this last aim: to describe the vigorous discussions in contemporary metaphysics concerning the epistemological status of the laws of nature and the role they play in scientific and philosophical reasoning. The main difficulty in developing an account of laws is to distinguish genuine laws from accidental truths. Historically, two principal competing views have tried to tackle this task: the regularity view (also called Humean), and the necessity view. Humeans hold that in nature there are no laws at all: at best experience shows that only regularities exist. There is no evidence and therefore no demonstration of the existence of some underlying necessity, because our data are about how the world is, not how it must be. Therefore, "laws" are only our statements, or descriptions, of the uniformities that happen in constant conjunction. A more sophisticated Humean position is defended by David Lewis (the historical roots of his account trace back to John Stuart Mill and Frank Plumpton Ramsey's views, and for this reason it is also called the MRL approach), according to which laws are those generalizations that are axioms or theorems in true deductive systems that achieve the best combination of simplicity and strength. Unlike Humeans, the necessitarians believe that laws of nature describe how the world must be. One of the most influential necessitarian positions is David Armstrong's (also called the ADT approach, as Fred Dretske and Michael Tooley are the other two main proponents), according to which laws are not just universal generalizations, but are singular propositions which state relations between universals. We will examine these main schools of thought and assess their weak and strong points.
In this debate on laws of nature, special emphasis will be laid upon how metaphysics inevitably intertwines with science in one of the most profound, inextricable, elusive and fascinating ways.

Mario Alai, Realistic and antirealistic attitudes in natural science, 24/12/2008 ©
Throughout history scientists and philosophers have discussed whether the theoretical descriptions of unobservable entities and facts were credible or not. The problem, I claim, is not due to an epistemic boundary demarcating the observable from the unobservable, but to methodological problems concerning theories, both local (such as incompatibility with background theories, or lack of independent evidence) and global (such as empirical underdetermination and basic human fallibility). This is seen by reviewing Celsus' account of controversies in Hellenistic medicine, selected contemporary sources on ancient, medieval and modern cosmology, some outlines of developments in atomistic theory, wave theories, classical and relativistic mechanics, and the most recent debates on scientific revolutions.

Adriano Angelucci, Un contributo di M. W. Drobisch al dibattito tedesco sulla questione logica, 11/10/2008 ©
According to a widespread opinion in last century's historiography of logic, the first part of the nineteenth century represents a period of little or no importance for the history of this science. This point of view is largely due to the misleading tendency to focus attention uniquely on the important turning points of the last part of the century without paying proper attention to their philosophical background. As a matter of fact, after the end of the idealistic period, a prolific philosophical discussion arose in German-speaking countries concerning the necessity of reforming logic. This discussion, often referred to with the expression "logische Frage", the logic question, singled out a series of theoretical questions which would characterize subsequent investigations of the subject. One of the central figures of the debate was the Leipzig mathematician and philosopher Moritz Wilhelm Drobisch, whose logical work is worth closer attention. For this reason, what follows provides an Italian translation of his short 1857 essay «Ueber logische Analysis und Synthesis», which may be regarded as the clearest overview of his logical thought.

Andrea Tontini, La formula chimica di struttura: un problema per l'epistemologia popperiana?, 28/09/2008 ©
Twentieth-century philosophy of science should be considered mainly a philosophy of physics, one that has not taken into account the originality and autonomy of chemical knowledge and has consequently left the analysis of this fundamental branch of modern science underdeveloped. This paper begins by studying the constituent elements of chemistry, thereby highlighting the profound difference between its method and scope and those of physics. The generation of chemical knowledge involves as a primary operation the isolation of elements and compounds and their structural characterization. Structural formulas must be consistent with experimental data from diverse spectroscopic techniques, which vouches for their truthfulness and irrefutability. It is argued that a structural formula is not to be held as an arbitrary intellectual construct possessing high heuristic value, but rather as a definitive, if rudimentary, statement regarding the authentic structure of a molecule, i.e. its atomic frame. This is at odds with Kuhn's interpretation of scientific activity as well as with Feyerabend's epistemology. Furthermore, the fact that structural formulas are not liable to falsification appears to contrast with Popperian fallibilism.

Marco Toscano, Praeter Quantitatem. Spunti di ricerca per una lettura epistemologica dell'analisi qualitativa in Henri Poincaré, 26/5/2008 ©
The paper analyzes the epistemological meaning of Poincaré's qualitative approach. Poincaré used this approach in his studies on differential equations and, later, on the three-body problem. The development of Poincaré's topological researches will be explained by considering two aspects of his scientific journey. The first concerns his deep interest in the three-body problem. The second deals with his awareness that such a problem required new mathematical tools, alternative to the classical quantitative ones in use at that time. This will make it possible to recognize the epistemological influence of Leibniz on Poincaré and to analyze the fundamental role that Leibnizian thought played in Poincaré's philosophical education. In conclusion I will also introduce the epistemological meaning of Analysis Situs in Poincaré's thought. The present essay should not be expected to be a complete treatment of these subjects; rather, it suggests some original starting points for new research on Poincaré's philosophy.

As Popper emphasizes, most modern and contemporary physicists are aware of the philosophical meaning of their work in physical research. We argue that Silvio Bergia has a similar awareness of the philosophical ideas underpinning his research programme and of the epistemological consequences of his work in physics. We analyze some of Bergia's contributions to the philosophy of physics, to special and general relativity, and to quantum mechanics, and identify empirical realism as an underlying philosophical theme of his work and of his effort to describe reality while doing physics.

Giorgio Fontana, In defence of a minimal idealism, 7/4/2007 ©
In recent years, the panorama of analytical philosophy has been characterized by a strong revival of metaphysical realism, combined with a more general attack on idealism. In this paper I will try to show, at least in the form of some prolegomena, how we can commit ourselves to a minimal form of transcendental idealism without being afraid of losing the world. The strategy is to consider reality as depending on possible experience, and to admit the plausibility of an ontological pluralism, taking the world as a noumenal transcendental idea.

Nicola Toro, La coscienza e l’organizzazione funzionale dei sistemi fisici, 24/1/2007 ©
In the first part of the article, Chalmers's theory of consciousness and its applications in the field of artificial intelligence are presented. He proposes that physical systems which share the same functional organization have the same conscious experience (the principle of organizational invariance). Starting from this principle, Chalmers defends the thesis of strong artificial intelligence: the execution of certain algorithms by a physical system generates conscious experiences. In the second part, the reasons why Chalmers's thesis cannot be true, and therefore also its implications in the field of artificial intelligence, are presented. The definition of functional organization coincides with the input-state-output (ISO) modelling of physical systems which is used in the field of automatic control. Analyzing some properties of ISO systems, we can see how a system can be represented by many models, so there are multiple functional organizations (on Chalmers's definition) which describe the same system. According to the principle of organizational invariance, such a system would cause a multiplicity of conscious experiences, but this seems absurd.

Giulia Giannini, Il convenzionalismo geometrico di Poincaré. La nozione di gruppo e il "doppio ruolo" dell'esperienza, 23/12/2006 ©
Until at least the seventies, the image of Poincaré given by physicists and mathematicians was that of a scientist stuck to an out-of-date idea of science. Moreover, several interpretations of Poincaré's geometrical conventionalism largely ignored the role played in it by the mathematical concept of transformation group and by the closely linked notion of isomorphism among different groups. This has led to great ambiguity and confusion as regards the meaning of Poincaré's geometrical conventionalism: Poincaré's epistemological position, often accused of "nominalism" and sometimes even associated with Carnap's or Reichenbach's "conventionalism", has been re-evaluated only recently. A better understanding of Poincaré's thought is made possible through an analysis of the role played by Lie's theory of transformation groups and by the notion of group itself. Starting from this point of view, we are able to understand the originality of Poincaré, which placed him between empiricism, which he harshly criticized on several occasions, and Kant's rationalism, which he believed necessary to reform.

Giovanni Macchia, L'Argomento del Buco di Einstein nel recente dibattito sull'ontologia dello spaziotempo, 30/9/2006 ©
This paper tries to set out the main features of the debate on the ontological nature of spacetime since 1987, when John Earman and John Norton animated the discussion by renewing and epistemologically extending Albert Einstein's original Hole Argument. This matter, which around 1913 raised deep conceptual problems for Einstein in his early development of General Relativity, concerns from a philosophical viewpoint manifold spacetime substantivalism, which leads to a radical form of indeterminism for theories with generally covariant field equations. But, while the traditional debate about the substantival or relational nature of spacetime tries to resolve the Hole Argument, new structural realist interpretations of spacetime theories are rising on the ontological horizon.

Giorgio Volpe, Sulle proposizioni (e altre entità) pleonastiche. Alcune considerazioni in margine alla teoria di Stephen Schiffer, 4/9/2006 ©
Beginning in the early 1990s, Stephen Schiffer has developed increasingly refined versions of the view that propositions are 'pleonastic entities', i.e., ontologically minimal entities that are admitted into our ontology as a result of engagement in certain linguistic practices and whose conditions of individuation are entirely dependent upon such practices. This paper argues that the way Schiffer formulates the pleonastic conception of propositions is not entirely satisfactory, and draws on some suggestions by Tobias Rosefeldt and Wolfgang Künne to arrive at a more compelling formulation. The new version maintains Schiffer's claim that talk of propositions is licensed through 'something-from-nothing transformations' that take us from statements in which no reference is made to such entities to statements in which there is explicit reference to them; however, this claim is divorced from the 'face-value theory' of (propositional) attitude attributions and combined with a different account of the premises of the relevant transformations, an account that opens the way to the formulation of a pleonastic conception of facts.

Mario Alai, Ontologia, spiegazione e interpretazione di Copenaghen della meccanica quantistica, 22/5/2006 ©
As no non-paradoxical ontological interpretation of Quantum Mechanics has been found up to now, I argue that (despite some criticisms by J.G. Cramer) the Copenhagen School successfully avoids the paradoxes by the positivist strategy of accepting only an empirical interpretation and rejecting any ontological interpretation of the theory. In this way, however, it cannot explain the various regularities exhibited by the empirical data, nor draw from the formalism any knowledge concerning the deep structure of physical reality.

Mario Valentino Bramè, La metafisica di Whitehead nelle formule di Whitehead, Ph.D. ©
Alfred North Whitehead formulated his Relativity Theory in 1922, moving from typically metaphysical issues. His metaphysics of prehensions led him to reject the variably curved spacetime of A. Einstein's General Relativity. In this paper I try to point out how typical features of Whitehead's metaphysics can be found among the formulas of his physical theory.

Laura Felline, Nonlocality in Everettian Accounts of Quantum Mechanics, 8/4/2006©
In this work we investigate the problem of locality in theories inspired by Everett's Relative State Theory. Specifically, we address the Many Worlds theory by Deutsch, the Many Minds theory by David Albert and Barry Loewer, and the Relational theory by Simon Saunders, and we carry out our inquiry in view of recent work by Meir Hemmo and Itamar Pitowsky. Our aim is, on the one hand, to clarify the remarkably important points which have been put forward in Hemmo and Pitowsky's work on the interpretation of probability in Many Minds, and, on the other, to argue against a remark on the Relational theory stating that the analysis carried out on the Many Minds interpretation could be applied, mutatis mutandis, to Saunders' theory.

Mario Valentino Bramè, Dalla metafisica alla fisica: la relatività di Whitehead, 21/12/2005©
The Relativity Theory of Alfred North Whitehead was formulated in 1922. It is an alternative to A. Einstein's Theory of General Relativity, for it accepts the results of the Special Theory of Relativity but denies the possibility of a variable curvature of space-time. The great appeal of that theory is due to its metaphysical origin, for it comes directly out of Whitehead's process metaphysics. From an experimental point of view, it is worth saying that Whitehead's Relativity Theory has not yet been definitively disconfirmed. In this paper we analyse the conceptual aspects of that theory and how the metaphysical thought of A.N. Whitehead came to produce it.

Federica Russo, Dal realismo scientifico all'interpretazione della probabilità. Salmon e van Fraassen a confronto, 20/4/2005©
A careful analysis of Salmon's Theoretical Realism and van Fraassen's Constructive Empiricism shows that both share a common origin: the requirement of a literal construal of theories inherited from the Standard View. However, despite this common starting point, Salmon and van Fraassen strongly disagree on the existence of unobservable entities. I argue that their different ontological commitments towards the existence of unobservables trace back to their different views on the interpretation of probability, via different conceptions of induction. In fact, inferences to statements claiming the existence of unobservable entities are inferences to probabilistic statements, whence the crucial importance of the interpretation of probability.

Alberto Gualandi, La rottura e l'evento, 25/9/2004 ©
This essay reconstructs the problem of the relationship between science and philosophy in two generations of twentieth-century French thinkers from a theoretical-critical point of view. The first part of the essay focuses on the analysis of the theoretical consequences which Gaston Bachelard's idea of the epistemological break produced in some French thinkers of the years after World War Two: Michel Serres, Gilles Deleuze, Jean-François Lyotard and Michel Foucault, authors who made the category of the event the core of their reflections. From the inadequacies and metatheoretical paradoxes in the philosophies of the event of these exemplary thinkers, the second part of the essay tries to reconstruct the genesis of the idea of the epistemological break in those authors of the years after World War One (Henri Bergson, Émile Meyerson and, particularly, Léon Brunschvicg) who paved the way to the "Bachelardian break" through a critical rereading of Kantian epistemology. The last part of the essay then tries to show that Brunschvicg's doctrine of judgment had in itself some theoretical possibilities, set aside by the later authors, which could have given the idea of the epistemological break a less paradoxical and more positive meaning for philosophy. According to the author, these theoretical possibilities could even today take on their full cognitive meaning if put on an anthropological and linguistic ground, present only in a metaphorical and sublimated form in Brunschvicg and in the other pre- and post-Bachelardian French authors, which constitutes its "transcendental concrete" foundation.

Maurizio Ferraris, Necessità materiale, 15/12/2004 ©
The aim of this article is to define the concept of "material necessity", which is ultimately defined as "unamendability", i.e. the impossibility of correction typical of sensible perceptions.

Beatrice Mezzacapa, Metzger e Damasio su percezione e metodo in psicologia, 25/9/2004 ©
Moving from some problems in perception theory and their solution in Gestalt theory, we suggest a method for research in experimental psychology and in neuropsychology. It is an epistemological constraint that forces us to speak of mind and body as two different substances, since we become acquainted with the mind through introspection, and with the body and the external world through observation. Research is itself constrained by this, but not in a negative way; rather, in a heuristic way: admitting the epistemological difference there is between knowledge of the mind and knowledge of the body, research should be able to use it (through the method of the autoconsistent field) to escape the problems which arise from not distinguishing the two levels of knowledge.

Paola Belpassi, La teoria modulare e i nuovi paradigmi sperimentali, 25/9/2004 ©
Jean Piaget's studies of the development of intelligence in the child remain fundamental in the field. Along with the theoretical basis of "constructivism", Piaget elaborated a research technique able to define experimentally the cognitive conducts specific to each evolutionary stage of the child. Conversely, the subsequent modular model has substantially modified both the object of observation and the methodology used to detect children's capabilities. In this model, such capabilities are considered genetically programmed and must therefore be observed in the earliest phases of the child's development and with specific procedures that should stimulate spontaneous reactions to each stimulus.

Domenico Mancuso, Dai futuri contingenti all'irrealtà del tempo. Una versione indeterminista del paradosso di McTaggart, 2/8/2004 ©
The purpose of this paper is to apply the logical structure of McTaggart's paradox (more precisely, the step on the contradiction of the A-series) to the problem of so-called future contingents. After an overview and a short discussion of McTaggart's argument, I will introduce the main issue of my work which concerns the truth-values of propositions on future events. Beginning with an alternative between empty and non-empty values, I will construct an infinite regress modeled on McTaggart's, and involving a sequence of nested propositions of growing complexity. I will then propose an Idealistic reading of such a regress, according to which future events are progressively acknowledged as mental anticipations belonging to the subject's actual present; a symmetric argument for the past will also be worked out in a separate section. Finally, I will briefly examine some problems of internal consistency connected with my 'timeless present' thesis.

Paola Belpassi, Costruttivismo e modularismo: un dibattito e le sue implicazioni pedagogiche, 27/1/2004 ©
The text is articulated as follows: in Piaget's theory of cognitive development, the relationship between perception and cognition presents the characteristics of an evolutionary or "gradual" dualism. Perception and action are at the center of the "intelligent" activity of the child in the pre-verbal phase, but such intelligence is not equilibrated and has various limitations. Only by overcoming these limitations through a system of logical operations will objectivity and stability of knowledge be guaranteed.
Fodor, on the other hand, by proposing his "functional taxonomy of cognitive mechanisms", distinguishes "input systems", modules employed to elaborate perceptions, from "central processors", which are the seat of the higher functions of thought. However, in Fodor's opinion, both the former and the latter act on the basis of a logical model of the inferential type, based on the formulation and confirmation of hypotheses. "Input systems" operate on a restricted number of data, whilst "central processors" operate on a quasi-unlimited number of assumptions that lead to the "fixation of beliefs".
This debate, with its evident psycho-pedagogical implications, is still at an initial stage, although it is already possible to anticipate its fruitful future developments in the didactic field.

Eddy Carli, Intenzioni e intenzionalità collettiva, 27/1/2004 ©
The main purpose of this short paper is to achieve the following outcomes: (1) to analyse John Searle's theory of collective intentionality, which constitutes the background of his social ontology (1995, 1999); (2) to point out some problems connected with the internalist approach to collective intentionality. An internalist perspective such as Searle's, in particular, does not seem to take into account certain relational and normative aspects which characterise social phenomena. A radical alternative would be a theory which also considers the relational aspects involved in collective intentionality, and which refers to a holistic perspective on intentional action (the relational approach: Bratman, 1993; Meijers, 2003; Pettit, 1998).

Massimo Dell'Utri, Conoscenza e verità, 29/12/2003 ©
The aim of the paper is to shed some light on the concept of truth by means of an analysis of the relation which could tie truth to knowledge. Accordingly, various conceptions of truth will be discerned, depending on whether they allow the possibility of knowing the truth-values of beliefs and sentences. The analysis will avail itself of some ‘pre-philosophical intuitions’ regarding the situations in which we say something is ‘true’ or ‘false’, on the one hand, and regarding knowledge in general, on the other. These pre-philosophical intuitions will help to isolate a sort of minimal interpretation of truth in terms of correspondence, and – by way of examples – brief descriptions of Ludwig Wittgenstein’s and John Austin’s correspondentist positions will be presented. In the end, the epistemological thesis of fallibilism will be used methodologically to point out what will be regarded as the most plausible conception of truth. Again, brief descriptions of Charles S. Peirce’s and Karl Popper’s accounts of truth and fallibilism will be put forward in order to clarify the issue at stake.

Angelina De Luca, La teoria della doppia soluzione: un punto di vista realista sulla fisica dei quanti, 29/12/2003 ©
Louis de Broglie is one of the founders of quantum mechanics. Rather than ascribing to the physical world supposed new paradoxical features, he acknowledges the practical success of quantum mechanics, but considers it a merely statistically exact theory; in addition, de Broglie aims to recover a causal and spatio-temporal description of phenomena. Unlike other detractors of the orthodox paradigm, he elaborates his own pars construens, an unambiguous alternative to standard quantum physics. The theory of the double solution, in fact, attributes reality both to particles and to waves, and represents an attempt to overcome the conceptual difficulties of the prevailing interpretation. Moreover, apart from its limits, de Broglie's theory is able to restrict the force of the formal prohibitions set by orthodox physicists against "heretical" explanations of empirical facts.

Alexander Afriat, Calzini di Bertlmann e sviluppi multiortogonali
It is argued that perfect quantum correlations are not due to additive conservation.

Mario Alai, Informatica e didattica della filosofia, 12/12/2003 ©
Computer programs may be used to model human intelligent practices in flexible and detailed ways. This is one of the reasons why computing can be very useful to philosophers both in research and in teaching. Two examples are suggested here, one drawn from philosophy of science and the other from epistemology. The former concerns the debate on the ‘logic of discovery': after a short review of the basic terms and present state of the question, instances are given of how this topic might be taught by simultaneously introducing the students to conceptual thinking, history of science, history of Artificial Intelligence and programming languages. The latter example concerns the classical problem of innate ideas: the application of computing to this question is mainly discussed theoretically, while only hints are given to possible classroom presentations.

Claudio Mantovani, Coscienza ed entanglement quantistico, 11/10/2003 ©
The problem of measurement in quantum mechanics arises from the application of the mathematical formalism to macroscopic situations. The central position of the observer has furthermore produced a deviation towards a metaphysical subjectivism. Some controversial aspects of the role of consciousness in the process of reduction of the wave function are discussed. By means of a version of the ‘Schrödinger's cat' Gedankenexperiment, and moving from some simple assumptions about the nature of consciousness, we try to show how this argument narrows the field of validity of some fundamental principles (superposition and reduction) during the interaction between microsystems and macrosystems. From this follows a different definition of the ontological status of consciousness and reality, in line with the position backed by the critics of the Copenhagen School.

Mario Alai, Artificial intelligence, logic of discovery and scientific realism, 30/12/2002 ©
Epistemologists have debated at length whether scientific discovery is a rational and logical process. If it is, according to the Artificial Intelligence hypothesis, it should be possible to write computer programs able to discover laws or theories; and if such programs were written, this would definitely prove the existence of a logic of discovery.
Attempts in this direction, however, have been unsuccessful: the programs written by Simon's group do indeed infer famous laws of physics and chemistry; but, having found no new law, they cannot properly be considered discovery machines. The programs written in the "Turing tradition", instead, have produced new and useful empirical generalizations, but no theoretical discovery, thus failing to prove the logical character of the most significant kinds of discovery.
A new cognitivist and connectionist approach by Holland, Holyoak, Nisbett and Thagard looks more promising. Reflection on their proposals helps to understand the complex character of discovery processes, the logical positivists' abandonment of belief in a logic of discovery, and the necessity of a realist interpretation of scientific research.

Vittorio De Palma, La ripresa brentaniana della teoria aristotelica delle categorie e il suo influsso su Husserl, 30/12/2002 ©
The paper moves from the distinction between being according to the figures of the categories and being as true – which Brentano puts at the basis of his interpretation of Aristotle – and tries to reconstruct its reception in the work of Husserl. It is shown that the Husserlian dichotomies between formalization and generalization, formal categories and material categories, formal ontology and material ontologies, formal a priori and material a priori correspond to the above-mentioned distinction. Contrary to Kant and to transcendental philosophy, indeed, for Husserl sensuous objects have categorial determinations in the Aristotelian sense, i.e. determinations which belong to them as a consequence of their peculiarity, independently of the fact that they are thought by a subject. The paper then deals with Husserl's inquiry into the relation between logical forms and the structure of experience, showing that the phenomenological distinction between substrate and determination, which Husserl puts at the basis of the structure of experience and judgement, is de facto an experiential foundation of the ontological distinction between substance and accident, which Aristotle puts at the basis of logic and metaphysics. The paper ends by establishing that both Aristotle and Husserl conceive reality as an objective structure independent of the subject; but whereas Aristotle puts at the basis of experience an unmoved substance, i.e. something which escapes possible experience, for Husserl "real" is only what can be sensuously experienced, and consequently something which escapes experience is nonsense.

Gennaro Auletta, Critical Examination of the Conceptual Foundations of Classical Mechanics in the Light of Quantum Physics, 14/9/2002 ©
As is well known, classical mechanics is characterized by several basic features such as determinism, reductionism, completeness of knowledge and mechanism. In this article the basic assumptions which underlie those features are discussed. It is shown that these basic assumptions – though universally accepted up to the beginning of the twentieth century – are far from obvious. Finally, it is shown that, to a certain extent, there is nothing wrong in assuming these basic postulates; rather, the error lies in the epistemological absolutization of the theory, which was considered a mirroring of Nature.
Keywords: Perfect determination, determinism, mechanism, completeness, mirroring, causality.

Eddy Carli, Cause, ragioni, intenzioni: spiegazione causale e comprensione di senso, 14/9/2002 ©
The topic of this paper is the connection between the concept of “reason” and the concept of “cause”, and the question of how they relate to action.
In particular, we analyze some specific paragraphs of Elizabeth Anscombe’s book Intention (1957), §§10-19, to underline her criticism of causalism and of the causal explanation of action, based on the identity theory between “reasons” and (mental) “causes”. We identify three main arguments in Anscombe’s theory of intention and action. Anscombe’s thesis is that reasons are not causes of action, and, in our view, it rests on the following three arguments: (i) the question “Why?”, (ii) the logical connection argument, and (iii) the general argument for intentional action. We then develop a critical confrontation with the opposite thesis, supported by Donald Davidson (1963), who maintains a particular version of the causalist thesis: the identity between reasons and causes. We could summarize the “causalist thesis” as follows: intentional actions are actions caused by certain mental states or events, whose occurrence explains the occurrence of the action. Anscombe legislates against this thesis, and against the existence of mental causes, in her specification of when the question “Why?”, in its special sense, has no application. It has no application in cases where the answer does not give the agent’s reason for acting – when the answer states a cause, including a mental cause (§16). Anscombe’s criticism of causalism is based on the legacy of Wittgenstein’s research on intention and action (Philosophical Investigations; Blue Book) and is strongly influenced by Aristotle’s theory of the practical syllogism (Nicomachean Ethics, III, VI, VII; De Motu Animalium, VII). We find the same influence of Aristotle’s practical syllogism in Davidson’s theory of action and in his idea of a “practical rationality”. So, in this confrontation between two radically different theses about causation we find a common background, Aristotelian practical philosophy, focused on the theory of the practical syllogism and its conclusion in an action (De Motu, VII, 701a).

Massimiliano Carrara, Sull'identità mente/corpo, 14/9/2002 ©
This paper is an introduction to the type-type identity theory of mind and body. In general, the identity theory of mind holds that every mental state is identical with some state of the brain. The inspiration for this theory is the way in which science expresses many of its discoveries in terms of identity. In the paper I start by analysing these scientific identities. Then, I propose a model for the identity-sentences about mind and body used by type-type identity theorists. Finally, I analyse some standard criticisms of the theory.

Beatrice Mezzacapa, Il posto degli schemi concettuali nella filosofia di W.V. Quine, 31/5/2002 ©
The conceptual scheme – the way we see the world, our point of view in seeing it – is central both to W.V.O. Quine’s view of language (on this point we shall speak of the indeterminacy of translation and the inscrutability of reference) and to his philosophy of science (T.S. Kuhn will also be of help here). We will see how our communication is affected by our personal or cultural conceptual schemes, and how this has an effect on science too, and on its evolution as well. After these reflections, scientific realism will seem defeated; but Quine himself gives us a solution, showing how we can find a sort of objectivity for meaning, reference, and scientific theories in the relations between the elements of our language or theories. This gives us a starting point for a new reflection on language and science, and on their possibility of bringing us to truth, even if it will no longer be an absolute Truth.
