This paper presents a praxeological analysis of artificial intelligence and algorithmic governance, challenging assumptions about the capacity of machine systems to sustain economic and epistemic order. Drawing on Misesian a priori reasoning and Austrian theories of entrepreneurship, we argue that AI systems are incapable of performing the core functions of economic coordination: interpreting ends, discovering means, and communicating subjective value through prices. Where neoclassical and behavioural models treat decisions as optimisation under constraint, we frame them as purposive actions under uncertainty. We critique dominant ethical AI frameworks such as Fairness, Accountability, and Transparency (FAT) as extensions of constructivist rationalism, which conflict with a liberal order grounded in voluntary action and property rights. Attempts to encode moral reasoning in algorithms reflect a misunderstanding of ethics and economics. However complex, AI systems cannot originate norms, interpret institutions, or bear responsibility. They remain opaque, misaligned, and inert. Using the concept of epistemic scarcity, we explore how information abundance degrades truth discernment, enabling both entrepreneurial insight and soft totalitarianism. Our analysis ends with a civilisational claim: the debate over AI concerns the future of human autonomy, institutional evolution, and reasoned choice. The Austrian tradition, focused on action, subjectivity, and spontaneous order, offers the only coherent alternative to rising computational social control.
Inspired by the Turing test, we present a novel methodological framework to assess the extent to which a population of machines mirrors the philosophical views of a population of humans. The framework consists of three steps: (i) instructing machines to impersonate each human in the population, reflecting their backgrounds and beliefs, (ii) administering a questionnaire covering various philosophical positions to both humans and machines, and (iii) statistically analyzing the resulting responses. We apply this methodology to the debate on scientific realism, a long-standing philosophical inquiry exploring the relationship between science and reality. By considering the outcome of a survey of over 500 human participants, including both physicists and philosophers of science, we generate their machine personas using an artificial intelligence engine based on a generative large language model. We reveal that the philosophical views of a population of machines are, on average, similar to those endorsed by a population of humans, irrespective of whether they are physicists or philosophers of science. As compared to humans, however, machines exhibit a weaker inclination toward scientific realism and a stronger coherence in their philosophical positions. Given the observed similarities between the populations of humans and machines, this methodological framework may offer unprecedented opportunities for advancing research in experimental philosophy by replacing human participants with their machine-impersonated counterparts, possibly mitigating the efficiency and reproducibility issues that affect survey-based empirical studies.
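A minimal sketch of how the three-step framework could be prototyped, assuming a generic chat-completion function (a stand-in named ask_llm below), a handful of Likert-style items, and placeholder data; none of the questionnaire items, model choices, or numbers are taken from the paper itself.

import random
import statistics

QUESTIONS = [
    "Our best scientific theories are true or approximately true.",
    "Unobservable entities postulated by science (e.g. electrons) exist.",
    "The success of science would be a miracle if its theories were not at least approximately true.",
]

def ask_llm(prompt: str) -> int:
    """Placeholder for a real chat-completion call; returns a Likert score from 1 to 5."""
    return random.randint(1, 5)

def persona_prompt(profile: dict, question: str) -> str:
    # Step (i): instruct the machine to impersonate a specific participant.
    return (
        f"Impersonate a {profile['role']} whose background is {profile['background']}. "
        f"On a scale from 1 (strongly disagree) to 5 (strongly agree), rate: {question}"
    )

# Step (ii): administer the questionnaire to every machine persona.
profiles = [
    {"role": "physicist", "background": "experimental particle physics"},
    {"role": "philosopher of science", "background": "philosophy of physics"},
]
machine_scores = [ask_llm(persona_prompt(p, q)) for p in profiles for q in QUESTIONS]

# Step (iii): statistical comparison; real human scores would come from the survey data.
human_scores = [4, 3, 5, 2, 4, 3]  # placeholder values, not real survey responses
print("machine mean:", statistics.mean(machine_scores))
print("human mean:  ", statistics.mean(human_scores))

A real run would replace ask_llm with calls to the language model used in the study and the placeholder lists with the actual survey responses and persona descriptions.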
This work explores the connection between logical independence and the algebraic structure of quantum mechanics. Building on results by Brukner et al., it introduces the notion of onto-epistemic ignorance: situations in which the truth of a proposition is not deducible due to an objective breakdown in the phenomenal chain that transmits information from a system A to a system B, rather than to any subjective lack of knowledge. It is shown that, under such conditions, the probabilities accessible to a real observer are necessarily conditioned by decidability and obey a non-commutative algebra, formally equivalent to the fundamental postulates of quantum mechanics.
This paper develops the approach to special relativity put forward by John S. Bell. The classical dynamics of an electron orbiting a nucleus in uniform motion is solved analytically and compared to numerical simulations for an accelerated nucleus. The relativistic phenomena of length contraction and time dilation are shown to result from the electric and magnetic forces on the electron when its motion is analyzed in a single frame of reference. The relevance of these results for understanding the theory of special relativity is discussed.
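For reference, the single-frame treatment summarised here rests on the standard fields of a uniformly moving point charge together with the Lorentz force on the electron; the following is the textbook form in our notation, not necessarily that of the paper: \mathbf{E}(\mathbf{r}) = \frac{q}{4\pi\epsilon_0}\,\frac{(1-\beta^2)\,\hat{\mathbf{r}}}{r^2\,(1-\beta^2\sin^2\theta)^{3/2}}, \qquad \mathbf{B} = \frac{1}{c^2}\,\mathbf{v}\times\mathbf{E}, \qquad m\,\frac{d(\gamma_u\mathbf{u})}{dt} = -e\,(\mathbf{E}+\mathbf{u}\times\mathbf{B}), where \beta = v/c, \theta is the angle between \mathbf{r} and the nucleus velocity \mathbf{v}, \mathbf{u} is the electron velocity, and \gamma_u = (1-u^2/c^2)^{-1/2}. The flattening of the field pattern at large \beta is what produces the contracted, slowed orbit in this single-frame picture.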
This paper reconceptualises peer review as structured public commentary. Traditional academic validation is hindered by anonymity, latency, and gatekeeping. We propose a transparent, identity-linked, and reproducible system of scholarly evaluation anchored in open commentary. Leveraging blockchain for immutable audit trails and AI for iterative synthesis, we design a framework that incentivises intellectual contribution, captures epistemic evolution, and enables traceable reputational dynamics. This model empowers fields from computational science to the humanities, reframing academic knowledge as a living process rather than a static credential.
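As a rough illustration of the immutable audit trail idea (our own sketch; the paper's blockchain and AI components are not specified at this level of detail), a public comment record can be made tamper-evident simply by hash-chaining entries:

import hashlib
import json
import time

def add_comment(chain: list, author: str, text: str) -> None:
    # Each record commits to the hash of the previous one.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"author": author, "text": text, "time": time.time(), "prev": prev_hash}
    record["hash"] = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify(chain: list) -> bool:
    # Recompute every hash and check every back-link.
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
            return False
        if i > 0 and rec["prev"] != chain[i - 1]["hash"]:
            return False
    return True

trail = []
add_comment(trail, "reviewer_a", "The derivation in Sec. 3 needs a convergence argument.")
add_comment(trail, "author", "Added a proof sketch in the revised version.")
print(verify(trail))  # True; altering any earlier entry breaks every later link

Editing or reordering an earlier comment changes its hash and invalidates all subsequent links, which is the property an identity-linked public commentary record relies on; a deployed system would add signatures and distributed consensus on top of this.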
In this paper I analyze the context in which Cassini first described lunar libration and proposed its interpretation.
Ronald Wilfrid Gurney is one of the lesser-known research students of the Cavendish Laboratory in the mid-1920s. Gurney made significant contributions to the application of quantum mechanics to the tunneling of alpha particles from nuclei, the formation of images in photographic plates, the understanding of the origin of colour centres in salt crystals, and the theory of semiconductors. He was the first physicist to apply quantum mechanics to the theory of electrochemistry and ionic solutions. He also made fundamental contributions to ballistics research. Gurney wrote a number of textbooks on fundamental and applied quantum mechanics in a distinctive style which are still useful as educational resources. In addition to his scientific contributions, he travelled extensively, and during and after World War II he worked in the United States. During the Cold War he became entangled in the Klaus Fuchs affair and lost his employment. He died of a stroke in 1953 at the age of 54. With the approach of the 100th anniversary of quantum mechanics, it is timely to commemorate the life and contributions of this somewhat forgotten physicist.
A theory of quantum gravity consists of a gravitational framework which, unlike general relativity, takes into account the quantum character of matter. In spite of impressive advances, no fully satisfactory, self-consistent and empirically viable theory with those characteristics has ever been constructed. A successful semiclassical gravity model, in which the classical Einstein tensor couples to the expectation value of the energy-momentum tensor of quantum matter fields, would, at the very least, constitute a useful stepping stone towards quantum gravity. However, not only has no empirically viable semiclassical theory ever been proposed, but the self-consistency of semiclassical gravity itself has been called into question repeatedly over the years. Here, we put forward a fully self-consistent, empirically viable semiclassical gravity framework, in which the expectation value of the energy-momentum tensor of a quantum field, evolving via a relativistic objective collapse dynamics, couples to a fully classical Einstein tensor. We present the general framework, a concrete example, and briefly explore possible empirical consequences of our model.
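Schematically, the coupling described here is the semiclassical Einstein equation (in units with c = 1), G_{\mu\nu}[g] = 8\pi G\,\langle\psi|\hat{T}_{\mu\nu}[g]|\psi\rangle, with the state |\psi\rangle evolving under a relativistic objective-collapse dynamics rather than purely unitarily; the concrete collapse law and the regularisation of the expectation value are what the proposed framework has to supply.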
General Relativity (GR) was created in November 1915 and has undergone many tests since its creation. The first realistic cosmological models were proposed in the works of Friedman, written in the 1920s. For a long time Friedman's cosmological works were effectively banned in the Soviet Union for philosophical reasons, since models in which the Universe is born and evolves were considered ideologically unacceptable. In view of the great achievements in relativity and cosmology, and of the increasing interest in these branches of science in recent decades, we recall the development of relativistic astrophysics and the contribution of Russian researchers to these studies. Since A. A. Friedman, one of the world leaders in physical cosmology, passed away in September 1925, it is reasonable to outline the main achievements of physical cosmology over the past 100 years. We also discuss observational and theoretical achievements in confirming relativistic predictions for black holes, including the closest supermassive black hole in our Galactic Center. We outline the evolution of the black hole shadow from a purely theoretical concept to an observable quantity for the supermassive black holes Sgr A* and M87*.
For the simple system of a point-like particle on the real line I compile and contrast, starting with a concise table, the structural elements of quantum mechanics with those of classical (statistical) mechanics. Despite many amazing similarities, there are well-known fundamental differences. The basic reason for all of them is the algebraic non-commutativity of the quantal structure. It was discovered by Werner Heisenberg (1901-1976) in June 1925 on the small island of Helgoland in the North Sea, as a consequence of his success in understanding atomic spectral data within a matrix scheme consistent with energy conservation. I discuss the differences and exemplify quantifications of them by the variance and entropic indeterminacy inequalities, by (pseudo-)classical bounds on quantum canonical partition functions, and by the correlation inequalities of John Bell (1928-1990) and others.
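For concreteness, the variance and entropic indeterminacy inequalities referred to here take their standard textbook forms, \Delta A\,\Delta B \ge \tfrac{1}{2}\big|\langle[\hat{A},\hat{B}]\rangle\big|, hence \Delta x\,\Delta p \ge \hbar/2, and h(x) + h(p) \ge \ln(e\pi\hbar), where h denotes the differential entropy of the position or momentum distribution; both bounds express the same underlying non-commutativity [\hat{x},\hat{p}] = i\hbar.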
There is a contemporary trend toward geometrizing all mathematical theories, as proposed by the Langlands program, and, by extension, physical theories as well. Within this paradigm, it becomes possible to represent physical objects as formal geometric entities. This opens the door, within the framework of logical empiricism, pragmatism, and constructive realism, to the development of a new philosophical approach grounded in a logical-mathematical perspective. The aim is to integrate both the physics and metaphysics of the objects of the world into a unified system of knowledge, based on the geometric and physical interpretations of their mathematical representations. This article is presented as a foundational step in the development of that approach.
This note addresses the relevance of rare events in system dynamics, inspired by Jill North's reflections on the origin of the arrow of time in thermodynamics. After identifying the existence of rare events, characterized by a Pareto distribution, within a simple gas particle simulation, we investigate their impact on entropy evolution. These rare events are associated with microstates that locally decrease entropy, in contrast to the overall entropy increase observed in the bulk of the system. We present numerical simulations of gas particles, both without and with a gravity-like attractive force, to explore the fate of these rare events. Our results show that, while rare events can transiently generate local decreases in entropy, global entropy may continue to increase in accordance with the second law of thermodynamics. The introduction of gravity-like attraction stabilizes these low-entropy configurations, allowing them to persist longer. This study highlights the interplay between rare statistical fluctuations and macroscopic thermodynamic behavior, providing new insights into the emergence and stability of order in complex systems.
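A minimal sketch of the kind of simulation described (our own illustration, without the gravity-like attraction and far smaller than the study's runs): free particles in a box, with entropy estimated from a coarse-grained occupation histogram.

import numpy as np

rng = np.random.default_rng(0)
N, L, ncell, steps, dt = 2000, 1.0, 10, 200, 0.005

# Start from a low-entropy state: all particles in the lower-left corner.
pos = rng.uniform(0.0, 0.2, size=(N, 2))
vel = rng.normal(0.0, 1.0, size=(N, 2))

def coarse_entropy(pos):
    # Shannon entropy of the coarse-grained position distribution.
    counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=ncell, range=[[0, L], [0, L]])
    p = counts.ravel() / N
    p = p[p > 0]
    return -np.sum(p * np.log(p))

for step in range(steps):
    pos += vel * dt
    for d in range(2):  # reflecting walls
        low, high = pos[:, d] < 0.0, pos[:, d] > L
        pos[low, d], pos[high, d] = -pos[low, d], 2.0 * L - pos[high, d]
        vel[low | high, d] *= -1.0
    if step % 50 == 0:
        print(step, round(float(coarse_entropy(pos)), 3))

Starting from the corner-concentrated state, the coarse-grained entropy rises towards its equilibrium value; over long runs, the fluctuations about that value, including occasional transient decreases, are the kind of rare, entropy-lowering events the note analyses.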
This paper presents an annotated English translation of Daniel Bernoulli's 1727 work A New Theory on the Motion of Waters through Channels of Any Kind, originally published in the Commentarii Academiae Scientiarum Imperialis Petropolitanae. Written over a decade before his renowned Hydrodynamica (1738), this early treatise reveals D. Bernoulli's foundational approach to fluid motion based on the conservation of vis viva, the kinetic energy of moving bodies, at a time when the principle was still under active debate. In this work, D. Bernoulli applies mechanical reasoning to fluids flowing through channels of arbitrary shape, deriving relationships between velocity, cross-sectional area, and efflux under ideal conditions. He anticipates core results of Hydrodynamica, including the inverse relationship between flow velocity and cross-sectional area, and emphasizes the role of energy balance in analyzing steady flow. The text also includes reflections on experimental limitations, the influence of friction, and the boundaries of theoretical applicability. This translation highlights the historical and conceptual significance of the 1727 paper as a precursor to D. Bernoulli's mature hydrodynamic theory.
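In modern notation (not Bernoulli's original formulation), the two results highlighted above amount to the continuity condition for steady incompressible flow and a Torricelli-type efflux relation obtained from the conservation of vis viva, A_1 v_1 = A_2 v_2 and \tfrac{1}{2}v^2 = g\,h: the velocity varies inversely with the cross-sectional area, and the kinetic energy gained by the effluent equals that acquired in falling through the corresponding height.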
This paper argues that the practice of fault-tolerant quantum computation, specifically the mechanism of Quantum Error Correction (QEC), offers a profoundly new lens through which to examine foundational questions of ontology, emergence, and interpretation. We move beyond the standard debate on quantum speedup to ask: What is the nature of the entity being protected, the logical qubit, and what does the active, goal-directed process of its protection reveal about physical reality? We argue that the logical qubit presents a unique case study in the metaphysics of identity, functioning as a quantifiable "Ship of Theseus" in Hilbert space. We introduce the concept of "engineered emergence" to describe the active, information-driven stabilization of the logical qubit, distinguishing it from passive forms of emergence and positioning it as a new category of causal structure. Finally, we demonstrate that the logical qubit serves as a powerful new testbed for major interpretations of quantum mechanics (including agent-centered, Many-Worlds, and Bohmian views), revealing novel strengths and challenges for each. We conclude that the technological imperative of fault-tolerance is not merely an engineering problem but a catalyst for deep philosophical insight, transforming abstract debates into concrete physical questions.
We reanalyze from a modern perspective the bold idea of G. Helm, W. Ostwald, P. Duhem and others that energy is the fundamental entity composing the physical world. We start from a broad perspective, recalling the search for a fundamental "substance" (perhaps better referred to as ousía, the original Greek word) from the pre-Socratics to the important debate between Ostwald and Boltzmann about energy vs. atoms at the end of the 19th century. While atoms were eventually accepted (even by Ostwald himself), the emergence of Quantum Mechanics and Relativity was crucial in suggesting that the dismissal of energy in favor of atoms was perhaps premature and should be revisited. We discuss how the so-called primitive ontology programme can be implemented with energy as the fundamental entity, and why fields (and their quanta, particles) should rather be considered as non-fundamental. We sketch some of the difficulties introduced by the attempt to include gravitation in the general scheme.
The world of particle physics was revolutionised in November 1974 by the discovery of the J/psi particle, the first particle to be identified as a quarkonium state composed of charm quarks and antiquarks. The charmonium interpretation of the J/psi was cemented by the subsequent observations of a spectrum of related c \bar c states, and finally by the discovery of charmed particles in 1976. The discovery of charmonium was followed in 1977 by the discovery of bottomonium mesons and particles containing bottom quarks. Toponium bound states of top quarks and antiquarks were predicted to exist in principle but, following the discovery of the top quark in 1995, most physicists thought that their observation would have to wait for a next-generation e^+ e^- collider. However, in the second half of 2024 the CMS Collaboration reported an excess of events near the threshold for \bar t t production at the LHC that is most plausibly interpreted as the lowest-lying toponium state. These are the personal recollections of an eyewitness who closely followed these 50 years of quarkonium discoveries.
This paper recounts the process launched by Lobachevsky, the movement of the Kazan school of geometry towards physics, and personal memories of Alexei Zinovievich Petrov, the great Kazan geometer and theoretical physicist, who became the author's guiding star. Keywords: Alexey Zinovievich Petrov, geometry, general theory of relativity, Kazan University, Department of Relativity and Gravitation Theory, methods of teaching exact sciences.
A simplified method to calculate the critical mass of a fissile material sphere is presented. This is a purely pedagogical study, in part to elucidate the historical evolution of criticality calculations. This method employs only elementary calculus and straightforward statistical arguments by formulating the problem in terms of the threshold condition that the number of neutrons in the sphere does not change with time: the average neutron path length in the material must be long enough to produce enough fission neutrons to balance losses by absorption due to nuclear reactions and leakage through the surface. This separates the nuclear reaction part of the problem from the geometry and mechanics of neutron transport, the only connection being the total path length, which, together with the distance between scatterings, determines the sphere radius. This leads to an expression for the critical radius without the need to solve the diffusion equation. Comparison with known critical masses shows agreement at the few-percent level. The analysis can also be applied to impure materials, isotopically or otherwise, and can be extended to general neutronics estimations as a design guide or for order-of-magnitude checking of Monte Carlo N-Particle (MCNP) simulations. A comparison is made with the Oppenheimer-Bethe criticality formula, with the results of other calculations, and with the diffusion equation approach via a new treatment of the boundary conditions.
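For comparison, the diffusion-equation approach mentioned at the end treats the one-group neutron flux \phi(r) in a bare sphere via D\,\nabla^2\phi + (\nu\Sigma_f - \Sigma_a)\,\phi = 0, whose solution regular at the origin is \phi(r) \propto \sin(\pi r/\tilde{R})/r, giving an extrapolated critical radius \tilde{R} = \pi\,\sqrt{D/(\nu\Sigma_f - \Sigma_a)}; here \nu is the mean number of neutrons per fission, \Sigma_f and \Sigma_a are the macroscopic fission and absorption cross-sections, and D is the diffusion coefficient. The method of the paper arrives at a comparable estimate from the path-length balance alone, without solving this equation.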