Articles tagged ‘High Energy Physics – Phenomenology (hep-ph)’

Constraints on the Universe as a Numerical Simulation

By • 27 Dec 2012 • Category: Opinion

Observable consequences of the hypothesis that the observed universe is a numerical simulation performed on a cubic space-time lattice or grid are explored. The simulation scenario is first motivated by extrapolating current trends in computational resource requirements for lattice QCD into the future. Using the historical development of lattice gauge theory technology as a guide, we assume that our universe is an early numerical simulation with unimproved Wilson fermion discretization and investigate potentially observable consequences. Among the observables considered are the muon $g-2$ and the current differences between determinations of $\alpha$, but the most stringent bound on the inverse lattice spacing of the universe, $b^{-1} \gtrsim 10^{11}$ GeV, is derived from the high-energy cutoff of the cosmic ray spectrum. The numerical simulation scenario could reveal itself in the distributions of the highest-energy cosmic rays, which would exhibit a degree of rotational symmetry breaking that reflects the structure of the underlying lattice.
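The rotational symmetry breaking invoked here can be illustrated with a toy calculation (our own sketch, not the paper's analysis): on a cubic lattice the naive free dispersion relation replaces the continuum $p^2$ with $\hat{p}^2 = \sum_i \sin^2(p_i b)/b^2$, a quantity that depends on the direction of the momentum relative to the lattice axes, not only on its magnitude.

```python
import numpy as np

# Toy lattice dispersion (illustrative sketch; units and the choice b = 1
# are ours). The naive lattice momentum-squared replaces continuum p^2.
b = 1.0                        # lattice spacing (arbitrary units)

def lattice_p2(p):
    """Naive lattice dispersion: sum_i sin^2(p_i * b) / b^2."""
    return np.sum(np.sin(p * b) ** 2) / b**2

pmag = 0.5 * np.pi / b         # momentum halfway to the cutoff pi/b
axis = pmag * np.array([1.0, 0.0, 0.0])            # along a lattice axis
diag = pmag * np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)  # body diagonal

print(pmag**2)                 # continuum p^2, same for both directions
print(lattice_p2(axis))        # → 1.0 exactly
print(lattice_p2(diag))        # ≈ 1.861: same |p|, different lattice value
```

The two directions have identical continuum $p^2 \approx 2.467$ but different lattice values, and the discrepancy grows as the momentum approaches the cutoff $\pi/b$, which is the sense in which the highest-energy cosmic rays would be the most sensitive probe.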

On the concepts of vacuum and mass and the search for higgs

By • 9 Dec 2012 • Category: Laws

Some recollections on the recent history of the concepts of vacuum and mass and the search for higgs. According to widespread terminology, the Higgs field permeates the vacuum and serves as the origin of the masses of all fundamental particles, including the Higgs boson, the higgs.

Effective Field Theories and the Role of Consistency in Theory Choice

By • 13 Nov 2012 • Category: Science and Technology

Promoting a theory with a finite number of terms into an effective field theory with an infinite number of terms worsens simplicity, predictability, falsifiability, and other attributes often favored in theory choice. However, the importance of these attributes pales in comparison with consistency, both observational and mathematical consistency, which propels the effective theory to be superior to its simpler truncated version of finite terms, whether that theory be renormalizable (e.g., Standard Model of particle physics) or nonrenormalizable (e.g., gravity). Some implications for the Large Hadron Collider and beyond are discussed, including comments on how directly acknowledging the preeminence of consistency can affect future theory work.

Numerical Simulations of the Dark Universe: State of the Art and the Next Decade

By • 1 Oct 2012 • Category: Laws

We present a review of the current state of the art of cosmological dark matter simulations, with particular emphasis on the implications for dark matter detection efforts and studies of dark energy. This review is intended both for particle physicists, who may find the cosmological simulation literature opaque or confusing, and for astrophysicists, who may not be familiar with the role of simulations for observational and experimental probes of dark matter and dark energy. Truly massive dark matter-only simulations are being conducted on national supercomputing centers, employing from several billion to over half a trillion particles to simulate the formation and evolution of cosmologically representative volumes (cosmic scale) or to zoom in on individual halos (cluster and galactic scale). These simulations cost millions of core-hours, require tens to hundreds of terabytes of memory, and use up to petabytes of disk storage. The field is quite internationally diverse, with top simulations having been run in China, France, Germany, Korea, Spain, and the USA. Predictions from such simulations touch on almost every aspect of dark matter and dark energy studies, and we give a comprehensive overview of this connection.
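A direct-sum toy version of such an N-body simulation can be sketched in a few lines (our illustrative construction with arbitrary units, a softened force law, and only 64 particles; the production codes reviewed here use tree or particle-mesh algorithms at vastly larger scale):

```python
import numpy as np

def accelerations(pos, masses, eps=0.05):
    """Softened pairwise gravitational accelerations, G = 1 (toy units)."""
    diff = pos[None, :, :] - pos[:, None, :]      # r_j - r_i for all pairs
    dist2 = np.sum(diff**2, axis=-1) + eps**2     # Plummer softening
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # no self-force
    return np.sum(diff * (masses[None, :, None] * inv_r3[:, :, None]), axis=1)

def leapfrog(pos, vel, masses, dt=0.01, steps=200):
    """Kick-drift-kick integrator, the standard choice for N-body work."""
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc                     # half kick
        pos += dt * vel                           # drift
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc                     # half kick
    return pos, vel

rng = np.random.default_rng(0)
n = 64
pos = rng.normal(size=(n, 3))
vel = rng.normal(scale=0.1, size=(n, 3))
vel -= vel.mean(axis=0)                           # zero net momentum
masses = np.full(n, 1.0 / n)
pos, vel = leapfrog(pos, vel, masses)
# The symmetric pairwise forces conserve total momentum to rounding error.
print(np.abs((masses[:, None] * vel).sum(axis=0)).max())
```

The direct sum costs O(N²) per step, which is why the half-trillion-particle runs described above rely on approximate force solvers rather than this brute-force approach.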

Comment on the evidence of the Higgs boson at LHC

By • 27 Sep 2012 • Category: Laws

We comment on the Standard Model Higgs boson evidence from the LHC. We propose that the new resonance at 125 GeV could be interpreted as a pseudoscalar meson with quantum numbers $J^{PC} = 0^{-+}$. We show that this pseudoscalar could mimic the decays of the Standard Model Higgs boson in all channels, with the exception of the decay into two leptons, which is strongly suppressed due to charge-conjugation invariance.

Lattice Gauge Theory and the Origin of Mass

By • 18 Sep 2012 • Category: Opinion

Most of the mass of everyday objects resides in atomic nuclei; the total of the electrons’ mass adds up to less than one part in a thousand. The nuclei are composed of nucleons—protons and neutrons—whose nuclear binding energy, though tremendous on a human scale, is small compared to their rest energy. The nucleons are, in turn, composites of massless gluons and nearly massless quarks. It is the energy of these confined objects, via $M=E/c^2$, that is responsible for everyday mass. This article discusses the physics of this mechanism and the role of lattice gauge theory in establishing its connection to quantum chromodynamics.
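The mass budget described here can be checked with a back-of-the-envelope calculation (the inputs are approximate PDG-style values chosen by us for illustration, not figures from the article):

```python
# Approximate inputs (illustrative): proton mass and current-quark masses.
m_proton = 938.27              # MeV
m_up, m_down = 2.2, 4.7        # MeV (approximate current-quark masses)

# A proton is (uud): the quark rest masses sum to only about 1% of it.
quark_rest = 2 * m_up + m_down
print(quark_rest / m_proton)   # ≈ 0.0097
```

The remaining ~99% is the confined energy of the quarks and gluons, counted as mass via $M = E/c^2$, which is precisely the contribution that lattice gauge theory computes from first principles.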

Origins of Mass

By • 14 Jul 2012 • Category: Laws

Newtonian mechanics posited mass as a primary quality of matter, incapable of further elucidation. We now see Newtonian mass as an emergent property. Most of the mass of standard matter, by far, arises dynamically, from back-reaction of the color gluon fields of quantum chromodynamics (QCD). The equations for massless particles support extra symmetries – specifically scale, chiral, and gauge symmetries. The consistency of the standard model relies on a high degree of underlying gauge and chiral symmetry, so the observed non-zero masses of many elementary particles ($W$ and $Z$ bosons, quarks, and leptons) require spontaneous symmetry breaking. Superconductivity is a prototype for spontaneous symmetry breaking and for mass-generation, since photons acquire mass inside superconductors. A conceptually similar but more intricate form of all-pervasive (i.e. cosmic) superconductivity, in the context of the electroweak standard model, gives us a successful, economical account of $W$ and $Z$ boson masses. It also allows a phenomenologically successful, though profligate, accommodation of quark and lepton masses. The new cosmic superconductivity, when implemented in a straightforward, minimal way, suggests the existence of a remarkable new particle, the so-called Higgs particle.
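The "economical account of $W$ and $Z$ boson masses" can be made concrete with the tree-level formulas $m_W = gv/2$ and $m_Z = \sqrt{g^2 + g'^2}\,v/2$ (the numerical inputs below are standard approximate textbook values, assumed here for illustration):

```python
import math

# Assumed approximate electroweak inputs (not taken from the article):
v = 246.0                      # Higgs vacuum expectation value, GeV
g, gp = 0.653, 0.350           # SU(2) and U(1) gauge couplings

# Tree-level boson masses from the cosmic-superconductivity (Higgs) mechanism.
m_W = g * v / 2
m_Z = math.sqrt(g**2 + gp**2) * v / 2
print(m_W, m_Z)                # ≈ 80.3 GeV and ≈ 91.1 GeV
```

Both come out close to the measured values (about 80.4 and 91.2 GeV), which is the quantitative success the abstract refers to; the photon, coupling to the unbroken combination, stays massless.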

Quantum Theory without Planck’s Constant

By • 15 Jun 2012 • Category: Philosophy

Planck’s constant was introduced as a fundamental scale in the early history of quantum mechanics. We find a modern approach where Planck’s constant is absent: it is unobservable except as a constant of human convention. Despite long reference to experiment, review shows that Planck’s constant cannot be obtained from the data of Rydberg, Davisson and Germer, Compton, or that used by Planck himself. In the new approach Planck’s constant is tied to macroscopic conventions of Newtonian origin, which are dispensable. The precision of other fundamental constants is substantially improved by eliminating Planck’s constant. The electron mass is determined about 67 times more precisely, and the unit of electric charge 139 times more precisely. The improvement in the experimental value of the fine structure constant allows new types of experiments to be compared in the search for “new physics.” The long-standing goal of eliminating reliance on the artifact known as the International Prototype Kilogram can be accomplished, assisting progress in fundamental physics.
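The claim about the Rydberg data can be illustrated numerically (our own check, using rounded CODATA-style values): hydrogen spectroscopy determines the Rydberg constant, which fixes only a combination of constants in which $h$ appears entangled with $m_e$, $e$, and $\varepsilon_0$, so the spectra alone cannot isolate $h$.

```python
# Rounded CODATA-style values (illustrative inputs, SI units):
m_e  = 9.10938e-31             # electron mass, kg
e    = 1.60218e-19             # elementary charge, C
eps0 = 8.85419e-12             # vacuum permittivity, F/m
h    = 6.62607e-34             # Planck constant, J s
c    = 2.99792e8               # speed of light, m/s

# Spectroscopy pins down only this combination, not h by itself:
R_inf = m_e * e**4 / (8 * eps0**2 * h**3 * c)
print(R_inf)                   # ≈ 1.0974e7 m^-1, the Rydberg constant
```

Rescaling $h$ while compensating in $m_e$ and $e$ leaves $R_\infty$ unchanged, which is one way to see the paper's point that $h$ acts as a convention rather than an independently measurable quantity in such data.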