In this paper we view the first-order set theory ZFC under the canonical first-order semantics and the second-order set theory ZFC_2 under the Henkin semantics. The main results are: (i) let M_st^ZFC be a standard model of ZFC; then ¬Con(ZFC + ∃M_st^ZFC). (ii) Let M_st^ZFC_2 be a standard model of ZFC_2 with Henkin semantics; then ¬Con(ZFC_2 + ∃M_st^ZFC_2). (iii) Let κ be an inaccessible cardinal; then ¬Con(ZFC + ∃κ). In order to obtain statements (i) and (ii), examples of an inconsistent countable set in the theory ZFC + ∃M_st^ZFC and in the theory ZFC_2 + ∃M_st^ZFC_2 are derived. It is widely believed that ZFC + ∃M_st^ZFC and ZFC_2 + ∃M_st^ZFC_2 are consistent, i.e. that ZFC and ZFC_2 have standard models. Unfortunately this belief is wrong. Book: Advances in Mathematics and Computer Science Vol. 1, Chapter 3, "There is No Standard Model of ZFC and ZFC_2", ISBN-13 (15) 978-81-934224-1-0. See Part II of this paper. DOI: 10.4236/apm.2019.99034.
In this article we proved so-called strong reflection principles corresponding to formal theories Th which have ω-models or a nonstandard model with a standard part. A possible generalization of Löb's theorem is considered. The main results are: (i) ¬Con(ZFC + ∃M_st^ZFC), (ii) ¬Con(ZF + (V = L)), (iii) ¬Con(NF + ∃M_st^NF), (iv) ¬Con(ZFC_2), (v) let κ be an inaccessible cardinal; then ¬Con(ZFC + ∃κ).
In this article we derived an important example of an inconsistent countable set in second-order ZFC (ZFC_2) with the full second-order semantics. Main results: (i) ¬Con(ZFC_2); (ii) let κ be an inaccessible cardinal, let V be a standard model of ZFC (ZFC_2), and let H_κ be the set of all sets having hereditary size less than κ; then ¬Con(ZFC + ∃V(V = H_κ)).
Boolean-valued models of set theory were independently introduced by Scott, Solovay and Vopěnka in 1965, offering a natural and rich alternative for describing forcing. The original method was adapted by Takeuti, Titani, Kozawa and Ozawa to lattice-valued models of set theory. After this, Löwe and Tarafder proposed a class of algebras based on a certain kind of implication which satisfy several axioms of ZF. From this class, they found a specific 3-valued model called PS3 which satisfies all the axioms of ZF, and which can be expanded with a paraconsistent negation *, thus obtaining a paraconsistent model of ZF. The logic (PS3, *) coincides (up to language) with da Costa and D'Ottaviano's logic J3, a 3-valued paraconsistent logic that has been proposed independently in the literature by several authors and with different motivations, such as CluNs, LFI1 and MPT. We propose in this paper a family of algebraic models of ZFC based on LPT0, another linguistic variant of J3 introduced by us in 2016. The semantics of LPT0, as well as of its first-order version QLPT0, is given by twist structures defined over Boolean algebras. From this, it is possible to adapt the standard Boolean-valued models of (classical) ZFC to twist-valued models of an expansion of ZFC by adding a paraconsistent negation. We argue that the implication operator of LPT0 is more suitable for a paraconsistent set theory than the implication of PS3, since it allows for genuinely inconsistent sets w such that [(w = w)] = 1/2. This implication is not a 'reasonable implication' as defined by Löwe and Tarafder. This suggests that 'reasonable implication algebras' are just one way to define a paraconsistent set theory. Our twist-valued models are adapted to provide a class of twist-valued models for (PS3, *), thus generalizing Löwe and Tarafder's result. It is shown that they are in fact models of ZFC (not only of ZF).
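The kind of paraconsistency at issue, a set w with [(w = w)] = 1/2 such that both the identity and its negation hold, can be illustrated with a small sketch. This is a toy rendering of a J3-style 3-valued logic (truth values {0, 1/2, 1}, designated values {1/2, 1}, Łukasiewicz-style negation 1 − x), not the authors' twist-structure semantics:

```python
# Truth values: 0 (false), 1/2 (inconsistent/"both"), 1 (true).
# Designated (i.e. "holding") values: 1/2 and 1 -- a standard choice for J3.
from fractions import Fraction

HALF = Fraction(1, 2)
DESIGNATED = {HALF, Fraction(1)}

def neg(x):      # paraconsistent negation: 1 - x
    return 1 - x

def conj(x, y):  # conjunction as minimum
    return min(x, y)

def designated(x):
    return x in DESIGNATED

# An "inconsistent set" w with [[w = w]] = 1/2:
w_eq_w = HALF
print(designated(w_eq_w), designated(neg(w_eq_w)))  # True True
print(designated(conj(w_eq_w, neg(w_eq_w))))        # True: no explosion
```

Since 1/2 is designated, w = w and its negation both hold without trivializing the theory, which is the sense in which w is a genuinely inconsistent set.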
We examine some of Connes' criticisms of Robinson's infinitesimals starting in 1995. Connes sought to exploit the Solovay model S as ammunition against non-standard analysis, but the model tends to boomerang, undercutting Connes' own earlier work in functional analysis. Connes described the hyperreals as both a "virtual theory" and a "chimera", yet acknowledged that his argument relies on the transfer principle. We analyze Connes' "dart-throwing" thought experiment, but reach an opposite conclusion. In S, all definable sets of reals are Lebesgue measurable, suggesting that Connes views a theory as being "virtual" if it is not definable in a suitable model of ZFC. If so, Connes' claim that a theory of the hyperreals is "virtual" is refuted by the existence of a definable model of the hyperreal field due to Kanovei and Shelah. Free ultrafilters aren't definable, yet Connes exploited such ultrafilters both in his own earlier work on the classification of factors in the 1970s and 80s, and in Noncommutative Geometry, raising the question whether the latter may not be vulnerable to Connes' criticism of virtuality. We analyze the philosophical underpinnings of Connes' argument based on Gödel's incompleteness theorem, and detect an apparent circularity in Connes' logic. We document the reliance on non-constructive foundational material, and specifically on the Dixmier trace −∫ (featured on the front cover of Connes' magnum opus) and the Hahn–Banach theorem, in Connes' own framework. We also note an inaccuracy in Machover's critique of infinitesimal-based pedagogy.
The impossibility theorem of Dekel, Lipman and Rustichini has been thought to demonstrate that standard state-space models cannot be used to represent unawareness. We first show that Dekel, Lipman and Rustichini do not establish this claim. We then distinguish three notions of awareness, and argue that although one of them may not be adequately modeled using standard state spaces, there is no reason to think that standard state spaces cannot provide models of the other two notions. In fact, standard state-space models of these forms of awareness are attractively simple. They allow us to prove completeness and decidability results with ease, to carry over standard techniques from decision theory, and to add propositional quantifiers straightforwardly.
We present a model of the distribution of labour in science. Such models tend to rely on the mechanism of the invisible hand. Our analysis starts from the necessity of standards in distributed processes and the possibility of multiple standards in science. Invisible hand models turn out to have only limited scope because they are restricted to describing the atypical single-standard case. Our model is a generalisation of these models to J standards; single-standard models such as Kitcher's are a limiting case. We introduce and formalise this model, demonstrate its dynamics, and conclude that the conclusions commonly derived from invisible hand models about the distribution of labour in science are not robust against changes in the number of standards.
A theory of truth is usually required to be consistent, but ω-consistency is less frequently requested. Recently, Yatabe has argued in favour of ω-inconsistent first-order theories of truth, minimising their odd consequences. In view of this fact, in this paper we present five arguments against ω-inconsistent theories of truth. In order to bring out this point, we will focus on two very well-known ω-inconsistent theories of truth: the classical theory of symmetric truth FS and the non-classical theory of naïve truth based on Łukasiewicz infinitely-valued logic: PAŁT.
In this article, a possible generalization of Löb's theorem is considered. The main result is: let κ be an inaccessible cardinal; then ¬Con(ZFC + ∃κ).
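For contrast with the proposed generalization, Löb's theorem in its standard form states, for any sentence φ in the language of Peano Arithmetic:

```latex
\text{If } \mathrm{PA} \vdash \mathrm{Prov}(\ulcorner\varphi\urcorner) \rightarrow \varphi, \text{ then } \mathrm{PA} \vdash \varphi.
```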
This paper intends to further the understanding of the formal properties of (higher-order) vagueness by connecting theories of (higher-order) vagueness with more recent work in topology. First, we provide a "translation" of Bobzien's account of columnar higher-order vagueness into the logic of topological spaces. Since columnar vagueness is an essential ingredient of her solution to the Sorites paradox, a central problem of any theory of vagueness comes into contact with the modern mathematical theory of topology. Second, Rumfitt's recent topological reconstruction of Sainsbury's theory of prototypically defined concepts is shown to lead to the same class of spaces that characterize Bobzien's account of columnar vagueness, namely, weakly scattered spaces. Rumfitt calls these spaces polar spaces. They turn out to be closely related to Gärdenfors' conceptual spaces, which have come to play an ever more important role in cognitive science and related disciplines. Finally, Williamson's "logic of clarity" is explicated in terms of a generalized topology ("locology") that can be considered an alternative to standard topology. Arguably, locology has some conceptual advantages over topology with respect to the conceptualization of a boundary and a borderline. Moreover, in Williamson's logic of clarity, vague concepts with respect to a locologically inspired notion of a "slim boundary" are (stably) columnar. Thus, Williamson's logic of clarity also exhibits a certain affinity for columnar vagueness. In sum, a topological perspective is useful for a conceptual elucidation and unification of central aspects of a variety of contemporary accounts of vagueness.
In most accounts of the realization of computational processes by physical mechanisms, it is presupposed that there is a one-to-one correspondence between the causally active states of the physical process and the states of the computation. Yet such proposals either stipulate that only one model of computation is implemented, or they do not reflect upon the variety of models that could be implemented physically. In this paper, I claim that mechanistic accounts of computation should allow for a broad variation of models of computation. In particular, some non-standard models should not be excluded a priori. The relationship between mathematical models of computation and mechanistically adequate models is studied in more detail.
According to standard rational choice theory, as commonly used in political science and economics, an agent's fundamental preferences are exogenously fixed, and any preference change over decision options is due to Bayesian information learning. Although elegant and parsimonious, such a model fails to account for preference change driven by experiences or psychological changes distinct from information learning. We develop a model of non-informational preference change. Alternatives are modelled as points in some multidimensional space, only some of whose dimensions play a role in shaping the agent's preferences. Any change in these "motivationally salient" dimensions can change the agent's preferences. How it does so is described by a new representation theorem. Our model not only captures a wide range of frequently observed phenomena, but also generalizes some standard representations of preferences in political science and economics.
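The core idea, that a change in which dimensions are motivationally salient can reverse a preference without any new information, admits a toy sketch. This is my own illustration, not the authors' representation theorem; the dimension names, equal weights, and numbers are invented:

```python
# Alternatives as points in R^3; dimensions: (price, quality, prestige).
a = (1.0, 9.0, 2.0)
b = (8.0, 3.0, 7.0)

def utility(x, salient):
    # Only motivationally salient dimensions contribute (equal weights,
    # an assumption made purely for illustration).
    return sum(v for v, s in zip(x, salient) if s)

# Initially only quality is salient: a is preferred.
before = (False, True, False)
# After a non-informational change, only prestige is salient: b is preferred.
after = (False, False, True)

print(utility(a, before) > utility(b, before))  # True  (a preferred to b)
print(utility(a, after) > utility(b, after))    # False (b now preferred)
```

Nothing about a or b changed and nothing was learned; only the salience mask moved, yet the induced preference over the same two options reversed.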
Schaffner's model of theory reduction has played an important role in the philosophy of science and philosophy of biology. Here, the model is found to be problematic because of an internal tension. Indeed, standard antireductionist external criticisms concerning reduction functions and laws in biology do not provide a full picture of the limits of Schaffner's model. However, despite the internal tension, his model usefully highlights the importance of regulative ideals associated with the search for derivational, and embedding, deductive relations among mathematical structures in theoretical biology. A reconstructed Schaffnerian model could therefore shed light on mathematical theory development in the biological sciences and on the epistemology of mathematical practices more generally. *Received November 2006; revised March 2009. †To contact the author, please write to: Philosophy Department, University of California, Santa Cruz, 1156 High St., Santa Cruz, CA 95064; e‐mail: [email protected]
The way in which quantum information can unify quantum mechanics (and therefore the standard model) and general relativity is investigated. Quantum information is defined as the generalization of the concept of information as to the choice among infinite sets of alternatives. Relevantly, the axiom of choice is necessary in general. The unit of quantum information, a qubit, is interpreted as a relevant elementary choice among an infinite set of alternatives, generalizing that of a bit. The invariance to the axiom of choice shared by quantum mechanics is introduced: it constitutes quantum information as the relation of any state unorderable in principle (e.g. any coherent quantum state before measurement) and the same state already well-ordered (e.g. the well-ordered statistical ensemble of the measurement of the quantum system at issue). This allows equating the classical and quantum time correspondingly as the well-ordering of any physical quantity or quantities and their coherent superposition. That equating is interpretable as the isomorphism of Minkowski space and Hilbert space. Quantum information is the structure interpretable in both ways and thus underlying their unification. Its deformation is representable correspondingly as gravitation in the deformed pseudo-Riemannian space of general relativity and as the entanglement of two or more quantum systems. The standard model studies a single quantum system and thus privileges a single reference frame, which turns out to be inertial for the generalized symmetry U(1)×SU(2)×SU(3) "gauging" the standard model. As the standard model refers to a single quantum system, it is necessarily linear, and thus the corresponding privileged reference frame is necessarily inertial. The Higgs mechanism U(1) → U(1)×SU(2), already sufficiently confirmed experimentally, describes exactly the choice of the initial position of a privileged reference frame as the corresponding breaking of the symmetry.
The standard model defines 'mass at rest' linearly and absolutely, but general relativity does so non-linearly and relatively. The "Big Bang" hypothesis additionally interprets that position as that of the "Big Bang". It also serves to reconcile the linear standard model in the singularity of the "Big Bang" with the observed nonlinearity of the further expansion of the universe, described very well by general relativity. Quantum information links the standard model and general relativity in another way, by the mediation of entanglement. The linearity and absoluteness of the former and the nonlinearity and relativity of the latter can be considered as the relation of a whole and the same whole divided into parts entangled in general.
John Broome has developed an account of rationality and reasoning which gives philosophical foundations for choice theory and the psychology of rational agents. We formalize his account into a model that differs from ordinary choice-theoretic models through its focus on psychology and the reasoning process. Within that model, we ask Broome's central question of whether reasoning can make us more rational: whether it allows us to acquire transitive preferences, consistent beliefs, non-akratic intentions, and so on. We identify three structural types of rationality requirements: consistency requirements, completeness requirements, and closedness requirements. Many standard rationality requirements fall under this typology. Based on three theorems, we argue that reasoning is successful in achieving closedness requirements, but not in achieving consistency or completeness requirements. We assess how far our negative results reveal gaps in Broome's theory, or deficiencies in choice theory and behavioral economics.
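One of the three types can be made concrete: a closedness requirement such as "preferences are closed under transitivity" can be met by a mechanical reasoning step that simply adds the entailed attitudes. The sketch below is my own illustration (a Warshall-style closure over a strict preference relation), not the paper's formal model:

```python
def transitive_closure(pairs):
    """Close a binary relation under transitivity by repeatedly adding
    the pairs it entails, until a fixed point is reached."""
    rel = set(pairs)
    items = {x for pair in pairs for x in pair}
    changed = True
    while changed:
        changed = False
        for a, b in list(rel):
            for c in items:
                if (b, c) in rel and (a, c) not in rel:
                    rel.add((a, c))
                    changed = True
    return rel

# An agent who prefers x to y and y to z can "reason to" x over z:
prefs = {("x", "y"), ("y", "z")}
print(transitive_closure(prefs))  # includes ("x", "z")
```

Reasoning of this additive kind can always reach closedness; by contrast, a consistency requirement may demand *retracting* an attitude, which no such additive step supplies, and that asymmetry is the intuition behind the paper's contrast.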
The main foundations of the standard ΛCDM model of cosmology are that: the redshifts of the galaxies are due to the expansion of the Universe plus peculiar motions; the cosmic microwave background radiation and its anisotropies derive from the high-energy primordial Universe when matter and radiation became decoupled; the abundance pattern of the light elements is explained in terms of primordial nucleosynthesis; and the formation and evolution of galaxies can be explained only in terms of gravitation within an inflation + dark matter + dark energy scenario. Numerous tests have been carried out on these ideas and, although the standard model works pretty well in fitting many observations, there are also many data that present apparent caveats to be understood within it. In this paper, I offer a review of these tests and problems, as well as some examples of alternative models.
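For reference, the expansion invoked in the first foundation is standardly described by the Friedmann equation, with scale factor a(t), energy density ρ, spatial curvature k, and cosmological constant Λ:

```latex
H^2 \;=\; \left(\frac{\dot a}{a}\right)^{2} \;=\; \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}
```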
In recent decades, the intertwining ideas of self-determination and well-being have received tremendous support in bioethics. Discussions regarding self-determination, or autonomy, often focus on two dimensions—the capacity of the patient and the freedom from external coercion. The practice of obtaining informed consent, for example, has become a standard procedure in therapeutic and research medicine. On the surface, it appears that patients now have more opportunities to exercise their self-determination than ever. Nonetheless, discussions of patient autonomy in the bioethics literature, which focus on individual patients making particular decisions, neglect the social structure within which health-care decisions are made. Looking through the lens of disability and informed by the feminist conception of relational autonomy, this essay argues that the issue of autonomy is much more complex than the individualist model suggests. The social system and the ableist ideology impose various forms of pressure or oppressive power that can affect people's ability to choose according to their value system. Even if such powers are not directly coercive, they influence potential parents' decisions indirectly—they structure their alternatives in such a way that certain options are never considered as viable and other decisions must be made. This paper argues that, instead of only focusing on the individual act of decision-making, we need to pay attention to the social structure that frames people's decisions.
Building upon Nancy Cartwright's discussion of models in How the Laws of Physics Lie, this paper addresses solid state research in transition metal oxides. Historical analysis reveals that in this domain models function both as the culmination of phenomenology and as the commencement of theoretical explanation. Those solid state chemists who concentrate on the description of phenomena pertinent to specific elements or compounds assess models according to different standards than those who seek explanation grounded in approximate applications of the Schrödinger equation. Accurate accounts of scientific debate in this field must include both perspectives.
Artificial models of cognition serve different purposes, and their use determines the way they should be evaluated. There are also models that do not represent any particular biological agents, and there is controversy as to how they should be assessed. At the same time, modelers do evaluate such models as better or worse. There is also a widespread tendency to call for publicly available standards of replicability and benchmarking for such models. In this paper, I argue that proper evaluation of models does not depend on whether they target real biological agents or not; instead, the standards of evaluation depend on the use of models rather than on the reality of their targets. I discuss how models are validated depending on their use and argue that all-encompassing benchmarks for models may be well beyond reach.
This study analyses the predictions of the General Theory of Relativity (GTR) against a slightly modified version of the standard central mass solution (the Schwarzschild solution). It is applied to central gravity in the solar system, the Pioneer spacecraft anomalies (which GTR fails to predict correctly), and planetary orbit distances and times, etc. (where GTR is thought consistent). The modified gravity equation was motivated by a theory originally called 'TFP' (Time Flow Physics, 2004). This is now replaced by the 'Geometric Model', 2014 [20], which retains the same theory of gravity. This analysis is offered partially as supporting detail for the claim in [20] that the theory is realistic in the solar system and explains the Pioneer anomalies. The overall conclusion is that the model can claim to explain the Pioneer anomalies, contingent on the analysis being independently verified and duplicated, of course. However, the interest lies beyond testing this theory. To start with, it gives us a realistic scale on which gravity might vary from the accepted theory and remain consistent with most solar-scale astronomical observations. It is found here that the modified gravity equation would appear consistent with GTR for most phenomena, but it would retard the Pioneer spacecraft by about the observed amount (15 seconds or so at the time). Hence it is a possible explanation of this anomaly, which as far as I know remains unexplained now for 20 years. It also shows what many philosophers of science have emphasized: the pivotal role of counterfactual reasoning. By putting forward an exact alternative solution, and working through the full explanation, we discover a surprising 'counterfactual paradox': the modified theory slightly weakens GTR gravity, and yet the effect is to slow down the Pioneer trajectory, making it appear as if gravity is stronger than in GTR. The inference that "there must be some tiny extra force…" (Musser, 1998 [1]) is wrong: there is a second option: "…or there may be a slightly weaker form of gravity than GTR."
Today's climate models are supported in a couple of ways that receive little attention from philosophers or climate scientists. In addition to standard 'model fit', wherein a model's simulation is compared to observational data, there is an additional type of confirmation available through the variety of instances of model fit. When a model performs well at fitting first one variable and then another, the probability of the model under some standard confirmation function, say, likelihood, goes up more than under each individual case of fit alone. Thus, two instances of fit of distinct variables of a global climate model using distinct data sets, considered collectively, will provide stronger evidence for a model than either one of the instances considered individually. This has consequences for model robustness. Sets of models that produce robust results will, if their assumptions vary enough and they each are observationally sound, provide reasons to endorse common structures found in those models. Finally, independent empirical support for aspects and assumptions of the model provides an additional confirmational virtue for climate models, contrary to what is implied by some current philosophical writing on this topic.
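The claim that two instances of fit raise a model's probability more than either alone can be sketched numerically with Bayes' theorem. The likelihood ratios below are purely illustrative assumptions, as is the independence of the two fits (which lets them multiply):

```python
def posterior(prior, likelihood_ratio):
    """Prior probability + likelihood ratio
    P(data | model) / P(data | rival) -> posterior probability,
    via odds form of Bayes' theorem."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

prior = 0.5
lr_temperature = 3.0    # fit to one variable (invented number)
lr_precipitation = 2.0  # fit to a second, distinct variable (invented number)

p_one = posterior(prior, lr_temperature)
p_both = posterior(prior, lr_temperature * lr_precipitation)
print(round(p_one, 3), round(p_both, 3))  # 0.75 0.857
```

The collective boost (0.5 → 0.857) exceeds what either fit provides on its own, which is the abstract's point about variety of evidence.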
Research into two-stage models of "free will" – first "free" random generation of alternative possibilities, followed by "willed" adequately determined decisions consistent with character, values, and desires – suggests that William James was in 1884 the first of a dozen philosophers and scientists to propose such a two-stage model for free will. We review the later work to establish James's priority. By limiting chance to the generation of alternative possibilities, James was the first to overcome the standard two-part argument against free will, i.e., that the will is either determined or random. James gave it elements of both, to establish freedom but preserve responsibility. We show that James was influenced by Darwin's model of natural selection, as were most recent thinkers with a two-stage model. In view of James's famous decision to make his first act of freedom a choice to believe that his will is free, it is most fitting to celebrate James's priority in the free will debates by naming the two-stage model – first chance, then choice – "Jamesian" free will.
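The two-stage structure ("first chance, then choice") can be sketched in a few lines. This is a toy illustration of the general scheme, not James's own formulation: chance generates the alternative possibilities, and an adequately determined evaluation by the agent's fixed values then selects among them:

```python
import random

def two_stage_choice(options, value, k, rng):
    # Stage 1 ("free"): chance generates k alternative possibilities.
    considered = rng.sample(options, k)
    # Stage 2 ("willed"): an adequately determined choice by the
    # agent's values -- deterministic given the generated alternatives.
    return max(considered, key=value)

rng = random.Random(0)
options = list(range(100))
value = lambda x: -abs(x - 42)  # the agent's (fixed) evaluation function
print(two_stage_choice(options, value, 5, rng))
```

Randomness enters only at the generation stage; given the same generated alternatives, the decision is fully determined by the value function, which is how the model keeps responsibility while admitting chance.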
The 'yes means yes' model of sexual consent and the political and ethical commitments that underpin this model have three fundamental disadvantages. This position unfairly polices the sexual expression of participants; it demands an unreasonably high standard for defining sexual interaction as consensual; and by denying the body's capacity for expressing sexual consent this model allows perpetrators of sexual violence to define consent. I argue that a critical examination of Marquis de Sade's novel Juliette can provide the basis for a model of sexual consent that avoids these problems by refraining from pre-judging the means by which consent is communicated.
Simulation models of the Reiterated Prisoner's Dilemma have been popular for studying the evolution of cooperation for more than 30 years now. However, there have been practically no successful instances of empirical application of any of these models. At the same time, this lack of empirical testing and confirmation has almost entirely been ignored by the modelers' community. In this paper, I examine some of the typical narratives and standard arguments with which these models are justified by their authors despite the lack of empirical validation. I find that most of the narratives and arguments are not at all compelling. Nonetheless, they seem to serve an important function in keeping the simulation business running despite its empirical shortcomings.
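For readers unfamiliar with the genre being criticized, a minimal Reiterated Prisoner's Dilemma simulation looks like this (standard payoffs T=5, R=3, P=1, S=0; the two strategies are textbook examples, not any particular model from the literature):

```python
# Payoffs (my move, opponent's move) -> (my payoff, opponent's payoff).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opp_history):   # cooperate first, then mirror the opponent
    return "C" if not opp_history else opp_history[-1]

def always_defect(opp_history):
    return "D"

def play(s1, s2, rounds):
    h1, h2 = [], []             # move histories of player 1 and player 2
    p1 = p2 = 0
    for _ in range(rounds):
        m1, m2 = s1(h2), s2(h1) # each strategy sees the opponent's past moves
        a, b = PAYOFF[(m1, m2)]
        p1, p2 = p1 + a, p2 + b
        h1.append(m1)
        h2.append(m2)
    return p1, p2

print(play(tit_for_tat, always_defect, 10))  # (9, 14)
```

Over 10 rounds, tit-for-tat loses the first round and then locks into mutual defection. Results of this stylized kind are exactly what such models generate and, as the abstract argues, what rarely gets tested empirically.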
In this paper I begin by extending two results of Solovay; the first characterizes the possible Turing degrees of models of True Arithmetic (TA), the complete first-order theory of the standard model of PA, while the second characterizes the possible Turing degrees of arbitrary completions of PA. I extend these two results to characterize the possible Turing degrees of m-diagrams of models of TA and of arbitrary complete extensions of PA. I next give a construction showing that the conditions Solovay identified for his characterization of degrees of models of arbitrary completions of PA cannot be dropped; I also show in the paper that these conditions cannot be simplified.
The purpose of this article is to present several immediate consequences of the introduction of a new constant called Lambda, in order to represent the object "nothing" or "void" within a standard set theory. The use of Lambda will appear natural thanks to its role as a condition of possibility of sets. On a conceptual level, the use of Lambda leads to a legitimation of the empty set and to a redefinition of the notion of set. It also clearly brings out the distinction between the empty set, the nothing and the ur-elements. On a technical level, we introduce the notion of pre-element and we suggest a formal definition of the nothing distinct from that of the null-class. Among other results, we get a relative resolution of the anomaly of the intersection of a family free of sets and the possibility of building the empty set from "nothing". The theory is presented with equi-consistency results. On both conceptual and technical levels, the introduction of Lambda leads to a resolution of Russell's puzzle of the null-class.
We call for a new philosophical conception of models in physics. Some standard conceptions take models to be useful approximations to theorems, and the chief means to test theories. Hence the heuristics of model building is dictated by the requirements and practice of theory-testing. In this paper we argue that a theory-driven view of models cannot account for common procedures used by scientists to model phenomena. We illustrate this thesis with a case study: the construction of one of the first comprehensive models of superconductivity, by the London brothers in 1934. Instead of a theory-driven view of models, we suggest a phenomenologically-driven one.
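For context, the Londons' phenomenological account centres on the London equation, which implies that a magnetic field is expelled from a superconductor over the penetration depth λ_L (here m and e are the carrier mass and charge and n_s the superconducting carrier density):

```latex
\nabla^{2}\mathbf{B} \;=\; \frac{\mathbf{B}}{\lambda_{L}^{2}},
\qquad
\lambda_{L} \;=\; \sqrt{\frac{m}{\mu_{0}\, n_{s}\, e^{2}}}
```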
Classical interpretations of Gödel's formal reasoning, and of his conclusions, implicitly imply that mathematical languages are essentially incomplete, in the sense that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is both non-algorithmic and essentially unverifiable. However, a language of general, scientific discourse, which intends to mathematically express, and unambiguously communicate, intuitive concepts that correspond to scientific investigations, cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical, Tarskian, truth, and of Gödel's reasoning, under which any formal system of Peano Arithmetic (classically accepted as the foundation of all our mathematical languages) is verifiably complete in the above sense. We show how some paradoxical concepts of quantum mechanics can then be expressed, and interpreted, naturally under a constructive definition of mathematical truth.
The Emergic Cognitive Model (ECM) is a unified computational model of visual filling-in based on the Emergic Network architecture. The Emergic Network was designed to help realize systems undergoing continuous change. In this thesis, eight different filling-in phenomena are demonstrated under a regime of continuous eye movement (and under static eye conditions as well). ECM indirectly demonstrates the power of unification inherent in Emergic Networks when cognition is decomposed according to finer-grained functions supporting change. These can interact to raise additional emergent behaviours via cognitive re-use, hence the Emergic prefix throughout. Nevertheless, the model is robust and parameter-free. Differential re-use occurs in the nature of model interaction with a particular testing paradigm. ECM has a novel decomposition due to the requirements of handling motion and of supporting unified modelling via finer functional grains. The breadth of phenomenal behaviour covered is largely to lend credence to our novel decomposition. The Emergic Network architecture is a hybrid between classical connectionism and classical computationalism that facilitates the construction of unified cognitive models. It helps cut functionalism into finer grains distributed over space (by harnessing massive recurrence) and over time (by harnessing continuous change), yet simplifies by using standard computer code to focus on the interaction of information flows. Thus, while the structure of the network looks neurocentric, the dynamics are best understood in flowcentric terms. Surprisingly, dynamic system analysis (as usually understood) is not involved. An Emergic Network is engineered much like straightforward software or hardware systems that deal with continuously varying inputs.
Ultimately, this thesis addresses the problem of reduction and induction over complex systems, and the Emergic Network architecture is merely a tool to assist in this epistemic endeavour. ECM is strictly a sensory model and apart from perception, yet it is informed by phenomenology. It addresses the attribution problem of how much of a phenomenon is best explained at a sensory level of analysis, rather than at a perceptual one. As the causal information flows are stable under eye movement, we hypothesize that they are the locus of consciousness, howsoever it is ultimately realized.
Simulation models of the Reiterated Prisoner's Dilemma have been popular for studying the evolution of cooperation for more than 30 years now. However, there have been practically no successful instances of empirical application of any of these models. At the same time, this lack of empirical testing and confirmation has almost entirely been ignored by the modelers' community. In this paper, I examine some of the typical narratives and standard arguments with which these models are justified by their authors despite the lack of empirical validation. I find that most of the narratives and arguments are not at all compelling. Nonetheless, they seem to serve an important function in keeping the simulation business running despite its empirical shortcomings.
The exploration of the literature indicated that many studies abound in related areas. Much, however, remains to be known about the nature of the relationship between managerial variables and the sustainability of graduate programmes. To bridge this gap, we utilized a standardised multiple regression approach to build linear models that examine three managerial processes (strategic planning, staff management, and information/communication management) and how they affect three proxies of the sustainability of graduate programmes (availability of funds and facilities, as well as supervision), respectively and cumulatively. Quantitative data were obtained from an entire population of 149 managers (Heads of Departments and Deans) at the University of Calabar, Calabar, Nigeria, using a questionnaire. Findings revealed, among others, a significant relationship between each of the managerial processes and the sustainability of graduate programmes, generally and particularly in terms of availability of funds, facilities, and supervision; the three predictor variables partially and jointly accounted for significant proportions of the sustainability of graduate programmes, generally and specifically in terms of the various dimensions. Based on this evidence, one general and three partial predictive linear equations, as well as models, were derived, while relevant action-based implications for effective management and practice are discussed.
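The multiple-regression design described above can be sketched in miniature. This is a minimal illustration with synthetic data: the coefficients, noise level, and variable names are assumptions for demonstration, not the study's actual instrument or results; only the sample size (149) is taken from the abstract.

```python
import numpy as np

# Hypothetical sketch: three standardised managerial predictors
# (strategic planning, staff management, information/communication
# management) regressed on a single sustainability score.
rng = np.random.default_rng(0)
n = 149  # population size reported in the study
X = rng.normal(size=(n, 3))            # synthetic predictor scores
true_beta = np.array([0.4, 0.3, 0.2])  # assumed joint effects
y = X @ true_beta + rng.normal(scale=0.1, size=n)

# Fit y = b0 + X @ beta by ordinary least squares.
design = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(design, y, rcond=None)
print(np.round(beta_hat[1:], 2))  # recovered partial effects
```

With predictors this cleanly generated, the fitted coefficients closely recover the assumed joint effects, mirroring how the study derives one general and several partial predictive equations from the same design matrix.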
[This paper is written in Czech.] The aim of the article is to re-evaluate the still-surviving anthropological trope which, in reaction to an inquiry into the essence of man, compares humans with animals and points to culture as the means by which humans complete their “deficient” nature. This motif contrasting humans with animals was extended by A. Gehlen, who characterises humans as “beings of deficiencies”. In his view, the morphological-instinctive insufficiency of the human being must be stabilised by cultural institutions, i.e. complexes of habitual actions. Merleau-Ponty, however, demonstrates that bodily beings always relate to their environment in an indirect way, on the basis of certain “standards” and “norms” of interaction, which exist by way of institution. The anthropological trope confronting humans and animals thus cannot produce, as in Gehlen, a contrast between an allegedly “direct” relationship to the world in animals and a supposedly “indirect” relationship to the world in humans. It can be meaningfully retained only if it is interpreted in a Merleau-Pontyan way, that is, as an invitation to understand the transformation of the norms of indirect interaction with the world found in animals into those found in people, that is, if viewed as a comparison of their respective institutions.
Recent studies tend to explain the importance of communication in the organisation as well as prescribing the techniques most commonly adopted by school managers. Studies on financial management are quite limited, with the available ones suggesting that poor financial management is a source of conflict between school leaders and host communities. Little seems to be known about the connection between principals’ communication patterns and funds management as predictors of school-community relationship. This study builds on existing studies and appears to be the first to assess the linkages between principals’ communication patterns, fund management practices and school-community relationship based on empirical rather than subjective data. A structural modelling approach was adopted to examine the nexus using quantitative primary data obtained from a random sample of 2108 respondents. A questionnaire designed and validated by the researchers served as the data collection device. Collected data were subjected to Exploratory and Confirmatory Factor Analyses, as well as Multiple Regression Analysis. Findings revealed various significant communication, funds management and school-community relationship practices that are available for adoption. However, it was found that the extent to which principals utilised such practices was below expected minimum standards. It was also found that there were no significant partial and composite effects of principals’ communication and funds management practices on school-community relationship (SCR). Based on these results, relevant theoretical, policy and practical implications are discussed.
According to a common conception of legal proof, satisfying a legal burden requires establishing a claim to a numerical threshold. Beyond reasonable doubt, for example, is often glossed as 90% or 95% likelihood given the evidence. Preponderance of evidence is interpreted as meaning at least 50% likelihood given the evidence. In light of problems with the common conception, I propose a new ‘relevant alternatives’ framework for legal standards of proof. Relevant alternatives accounts of knowledge state that a person knows a proposition when their evidence rules out all relevant error possibilities. I adapt this framework to model three legal standards of proof—the preponderance of evidence, clear and convincing evidence, and beyond reasonable doubt standards. I describe virtues of this framework. I argue that, by eschewing numerical thresholds, the relevant alternatives framework avoids problems inherent to rival models. I conclude by articulating aspects of legal normativity and practice illuminated by the relevant alternatives framework.
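The contrast between the two accounts can be made concrete with a toy sketch. The function names, numbers, and example alternatives below are illustrative assumptions, not the paper's formalism; the point is only that one account checks a number against a threshold while the other checks whether every relevant error possibility is ruled out.

```python
def threshold_account(likelihood, standard):
    """Common conception: the claim is proved iff its likelihood
    given the evidence meets a numeric threshold
    (e.g. 0.5 for preponderance of evidence)."""
    return likelihood >= standard

def relevant_alternatives_account(relevant_alternatives, ruled_out):
    """Relevant alternatives framework: the claim is proved iff the
    evidence rules out every relevant error possibility; no numeric
    threshold is involved."""
    return relevant_alternatives.issubset(ruled_out)

# Preponderance glossed as at least 50% likelihood:
print(threshold_account(0.51, 0.50))  # True

# Same case under relevant alternatives: one relevant error
# possibility ('mistaken identity') is not ruled out, so not proved.
print(relevant_alternatives_account({"mistaken identity", "alibi"},
                                    {"alibi"}))  # False
```

The divergence between the two outputs for the same case is the kind of structural difference the framework exploits: a claim can clear a numeric bar while a relevant alternative remains live.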
We here make preliminary investigations into the model theory of DeMorgan logics. We demonstrate that Łoś's Theorem holds with respect to these logics and make some remarks about standard model-theoretic properties in such contexts. More concretely, as a case study we examine the fate of Cantor's Theorem that the classical theory of dense linear orderings without endpoints is $\aleph_{0}$-categorical, and we show that the taking of ultraproducts commutes with respect to previously established methods of constructing nonclassical structures, namely, Priest's Collapsing Lemma and Dunn's Theorem in 3-Valued Logic.
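For orientation, the classical (two-valued) statement of Łoś's Theorem, which the abstract reports carries over to DeMorgan logics, can be recalled as follows:

```latex
% Classical Łoś's Theorem: truth in an ultraproduct reduces to
% U-largeness of the set of indices at which the formula holds.
\[
  \prod_{i \in I} \mathcal{M}_i \big/ \mathcal{U} \models
    \varphi\big([f_1],\dots,[f_n]\big)
  \quad\Longleftrightarrow\quad
  \big\{\, i \in I : \mathcal{M}_i \models
    \varphi\big(f_1(i),\dots,f_n(i)\big) \,\big\} \in \mathcal{U},
\]
% where U is an ultrafilter on the index set I and [f] denotes the
% U-equivalence class of the choice function f.
```

In the many-valued setting of the paper, the right-hand condition must be reinterpreted relative to the logic's truth values; the classical form above is stated only as the familiar baseline.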
The study aimed to identify the availability of the resource and partnership standard as one of the possibilities of excellence in Palestinian universities according to the European model. The study used the analytical descriptive method. The study was conducted on the university leadership at Al-Azhar and Islamic Universities, where the study population consisted of (282) individuals. The study sample consisted of (135) individuals, (119) of whom responded, and the questionnaire was used in collecting the data. The study reached a number of results, the most important of which is the existence of a high standard of resources and partnership in the Palestinian public universities operating in the southern governorates. The fields of the resource and partnership criteria were as follows: management of internal and external partnerships (79.8%); management of technical resources (technology) (76.4%); buildings, equipment and resources (76%); finance management (72.8%). The study presented a number of recommendations, the most important of which are: increasing the interest of universities in applying the criteria of resources and partnership as one of the criteria of excellence, increasing the interest of universities in managing finance and obtaining funding for their various activities, and periodic maintenance of buildings and equipment to ensure their suitability to the requirements of the educational process.
This article presents important properties of standard discrete distributions and their conjugate densities. The Bernoulli and Poisson processes are described as generators of such discrete models. A characterization of distributions by mixtures is also introduced. This article adopts a novel singular notation and representation. Singular representations are unusual in statistical texts. Nevertheless, the singular notation makes it simpler to extend and generalize theoretical results and greatly facilitates numerical and computational implementation.
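A standard instance of the conjugacy discussed above is the Beta-Bernoulli pair: a Beta(a, b) prior is conjugate to Bernoulli observations, so the posterior after k successes in n trials is Beta(a + k, b + n - k). The sketch below illustrates only this textbook fact; the particular numbers are made up and the article's singular notation is not reproduced.

```python
def beta_bernoulli_posterior(a, b, successes, trials):
    """Conjugate update: Beta(a, b) prior plus Bernoulli data with
    `successes` out of `trials` yields Beta(a + k, b + n - k)."""
    return a + successes, b + (trials - successes)

# Uniform Beta(1, 1) prior, then 7 successes in 10 Bernoulli trials:
post_a, post_b = beta_bernoulli_posterior(1, 1, 7, 10)
print(post_a, post_b)                        # 8 4
print(round(post_a / (post_a + post_b), 3))  # posterior mean: 0.667
```

The closed-form update is what makes conjugate families computationally convenient: the posterior stays in the same parametric family as the prior.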
This article argues that common intuitions regarding (a) the specialness of ‘use-novel’ data for confirmation and (b) the claim that this specialness implies the ‘no-double-counting rule’, which says that data used in ‘constructing’ (calibrating) a model cannot also play a role in confirming the model’s predictions, are too crude. The intuitions in question are pertinent in all the sciences, but we appeal to a climate science case study to illustrate what is at stake. Our strategy is to analyse the intuitive claims in light of prominent accounts of confirmation of model predictions. We show that on the Bayesian account of confirmation, and also on the standard classical hypothesis-testing account, claims (a) and (b) are not generally true; but for some select cases, it is possible to distinguish data used for calibration from use-novel data, where only the latter confirm. The more specialized classical model-selection methods, on the other hand, uphold a nuanced version of claim (a), but this comes apart from (b), which must be rejected in favour of a more refined account of the relationship between calibration and confirmation. Thus, depending on the framework of confirmation, either the scope or the simplicity of the intuitive position must be revised.
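The Bayesian account of confirmation invoked above turns on a simple criterion: evidence E confirms hypothesis H just in case P(H | E) > P(H). A minimal numerical sketch, with entirely made-up probabilities, shows the mechanics; nothing here reproduces the paper's climate case study.

```python
def posterior(prior, likelihood, likelihood_alt):
    """Bayes' theorem over a two-hypothesis partition {H, not-H}:
    P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|not-H)P(not-H)]."""
    evidence = likelihood * prior + likelihood_alt * (1 - prior)
    return likelihood * prior / evidence

# Illustrative numbers: the model fits the data better under H.
prior = 0.5
p = posterior(prior, likelihood=0.8, likelihood_alt=0.4)
print(round(p, 3))  # 0.667 -> E confirms H, since 0.667 > 0.5
```

On this account, whether the data were previously used for calibration plays no formal role: confirmation depends only on the probabilities, which is one way of seeing why claims (a) and (b) are not generally true in the Bayesian framework.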
The main objective of this paper is to describe the emergence of the Polish Universities of the Third Age model. These are multidisciplinary non-formal education centres which allow the formation of positive responses to the challenges of an ageing population. The article indicates the main organizational changes of these institutions, conditioned by internal and external factors. It describes the transformation, differentiation factors, and characteristics of these institutions for the elderly, based on a critical analysis of the literature.
Based on de Broglie’s wave hypothesis and the covariant ether, the Three Wave Hypothesis (TWH) was proposed and developed in the last century. In 2007, the author found that the TWH may be attributed to a kinematical classical system of two perpendicular rolling circles. In 2012, the author showed that the position vector of a point in a model of two rolling circles in a plane can be transformed into a complex vector under a proposed effect of partial observation. In the present project, this concept of transformation is developed into a lab-observation concept. Under this transformation of the lab observer, it is found that the velocity equation of the motion of the point is transformed into an equation analogous to the relativistic quantum mechanics equation (the Dirac equation). Many other analogies have been found and are listed in a comparison table. The analogy tries to explain entanglement within the scope of the transformation. These analogies may suggest that both quantum mechanics and special relativity are emergent, that they are unified, and that they are of the same origin. The similarities suggest analogies and pose questions of interpretation for the standard quantum theory, without any possible causal claims.
I consider the field of aesthetics to be at its most productive and engaging when adopting a broadly philosophically informative approach to its core issues (e.g., shaping and testing putative art-theoretic commitments against the relevant standard models employed in philosophy of language, metaphysics, and philosophy of mind) and to be at its most impotent and bewildering when cultivating a philosophically insular character (e.g., selecting interpretative, ontological, or conceptual models solely for fit with pre-fixed art-theoretic commitments). For example, when philosophical aesthetics tends toward insularity, we shouldn’t be surprised to find standard art-ontological categories incongruous with those standardly employed in contemporary metaphysics. Of course, when contemporary metaphysics tends to ignore aesthetic and art-theoretic concerns, perhaps we likewise shouldn’t be surprised to find the climate of contemporary metaphysics inhospitable for a theory of art. While this may seem to suggest at least a prima facie tension between our basic art-theoretic commitments considered from within philosophical aesthetics and our standard ontological commitments considered from without, I think any perceived tension or antagonism is largely due to metaphysicians and aestheticians (at least implicitly) assuming there to be but two available methodological positions with respect to the relationship between contemporary metaphysics and philosophical aesthetics (in the relevant overlap areas). I call these two opposing views the Deference View and the Independence View. I argue that either view leads to what I call the Paradox of Standards.
Simulation models of the Reiterated Prisoner's Dilemma (in the following: RPD models) have for 30 years been considered one of the standard tools for studying the evolution of cooperation (Rangoni 2013; Hoffmann 2000). A considerable number of such simulation models has been produced by scientists. Unfortunately, though, none of these models has been empirically verified, and there exists no example of empirical research where any of the RPD models has successfully been applied to a particular instance of cooperation. Surprisingly, this has not kept scientists from continuing to produce simulation models in the same tradition and from writing their own history as a history of success. In a recent simulation study -- which does not make use of the RPD but otherwise follows the same pattern of research -- Robert Axelrod's (1984) original role model for this kind of simulation study is praised as "an extremely effective means for investigating the evolution of cooperation" and considered "widely credited with invigorating that field" (Rendell et al. 2010).
Evolutionary psychology and the selectionist theories of neural development are usually regarded as two unrelated theories addressing two logically distinct questions. The focus of evolutionary psychology is the phylogeny of the human mind, whereas the selectionist theories of neural development analyse the ontogeny of the mind. This paper endeavours to combine these two approaches in the explanation of the human mind. Doing so might help in overcoming some of the criticisms of both theories. The first part of the paper mentions three standard objections to evolutionary psychology and then outlines three philosophical problems to which evolutionary psychology has to offer a solution. The second part tries to show that an approach combining evolutionary psychology and the selectionist theory of neural development might overcome some of these objections.
We introduce a novel point of view on the “models as mediators” framework in order to emphasize certain important epistemological questions about models in science which have so far been little investigated. To illustrate how this perspective can help answer these kinds of questions, we explore the use of simplified models in high energy physics research beyond the Standard Model. We show in detail how the construction of simplified models is grounded in the need to mitigate pressing epistemic problems concerning the uncertainty inherent in the present theoretical and experimental contexts.
We respond to Stephen T. Davis’ criticism of our earlier essay, “Assessing the Resurrection Hypothesis.” We argue that the Standard Model of physics is relevant and decisive in establishing the implausibility and low explanatory power of the Resurrection hypothesis. We also argue that the laws of physics have entailments regarding God and the supernatural and, against Alvin Plantinga, that these same laws lack the proviso “no agent supernaturally interferes.” Finally, we offer Bayesian arguments for the Legend hypothesis and against the Resurrection hypothesis.
A standard way of representing causation is with neuron diagrams. This has become popular since the influential work of David Lewis. But it should not be assumed that such representations are metaphysically neutral and amenable to any theory of causation. On the contrary, this way of representing causation already makes several Humean assumptions about what causation is, and which suit Lewis’s programme of Humean Supervenience. An alternative of a vector diagram is better suited for a powers ontology. Causation should be understood as connecting property types and tokens where there are dispositions towards some properties rather than others. Such a model illustrates how an effect is typically polygenous: caused by many powers acting with each other, and sometimes against each other. It models causation as a tendency towards an effect which can be counteracted. The model can represent cases of causal complexity, interference, over-determination and causation of absence (equilibrium).
In recent years, there have been many reforms in the field of accounting. At the same time, scientists focus on the leading methods of accounting, financial management and economic opportunities for the additional use of accounting tools to introduce reforms in the field of the accounting of public sector entities. The main goal of this paper is to reveal the leading features of the accounting system of public sector entities and to study the aspects of a new accounting system which in the future can be implemented into the activities of public sector entities. The paper provides a scientific vision of the accounting reform of public sector entities. Our vision of forming a new accounting system in the public sector is based on the accounting model used in Italy and takes into account the peculiarities of the methodology and accounting organization in accordance with the International Accounting Standards. We highlight the main problems of introducing a new accounting system for public sector entities based on the International Accounting Standards. In our opinion, this research can form new knowledge in the national literature on the accounting of public sector entities and highlight the main problems that arise while implementing accounting reforms. In addition, our research results can serve as a basis for the implementation of the accounting of public sector entities on the basis of International Accounting Standards for the public sector and the accrual principle. We believe that the main scientific aspects of public sector accounting will be the basis for future reforms based on the implementation of International Accounting Standards in the activities of public sector entities.