What Is the Higgs Boson Particle? Part 2

Editor's note: On October 8, 2013, Peter Higgs and Francois Englert won the Nobel Prize in Physics for their work on the Higgs boson. Below, our science columnist Brian Greene explains the science behind the discovery.

A famous story in the annals of physics tells of a 5-year-old Albert Einstein, sick in bed, receiving a toy compass from his father.

The boy was both puzzled and mesmerized by the invisible forces at work, redirecting the compass needle to point north whenever its resting position was disturbed.

That experience, Einstein would later say, convinced him that there was a deep hidden order to nature, and impelled him to spend his life trying to reveal it.

Although the story is more than a century old, the conundrum young Einstein encountered resonates with a key theme in contemporary physics, one that’s essential to the most important experimental achievement in the field of the last 50 years: the discovery, a year ago this July, of the Higgs boson.

Let me explain.

Science in general, and physics in particular, seek patterns. Stretch a spring twice as far, and feel twice the resistance. A pattern. Increase the volume an object occupies while keeping its mass fixed, and the higher it floats in water. A pattern. By carefully observing patterns, researchers uncover physical laws that can be expressed in the language of mathematical equations.

A clear pattern is also evident in the case of a compass: Move it and the needle points north again. I can imagine a young Einstein thinking there must be a general law stipulating that suspended metallic needles are pushed north.

But no such law exists. When there is a magnetic field in a region, certain metallic objects experience a force that aligns them along the field’s direction, whatever that direction happens to be.

And Earth’s magnetic field happens to point north.

The example is simple but the lesson profound. Nature’s patterns sometimes reflect two intertwined features: fundamental physical laws and environmental influences. It’s nature’s version of nature versus nurture. In the case of a compass, disentangling the two is not difficult.

By manipulating it with a magnet, you readily conclude the magnet’s orientation determines the needle’s direction.

But there can be other situations where environmental influences are so pervasive, and so beyond our ability to manipulate, it would be far more challenging to recognize their influence.

Physicists tell a parable about fish investigating the laws of physics but so habituated to their watery world they fail to consider its influence. The fish struggle mightily to explain the gentle swaying of plants as well as their own locomotion. The laws they ultimately find are complex and unwieldy. Then, one brilliant fish has a breakthrough.

Maybe the complexity reflects simple fundamental laws acting themselves out in a complex environment—one that’s filled with a viscous, incompressible and pervasive fluid: the ocean. At first, the insightful fish is ignored, even ridiculed.

But slowly, the others, too, realize that their environment, its familiarity notwithstanding, has a significant impact on everything they observe.

Does the parable cut closer to home than we might have thought? Might there be other, subtle yet pervasive features of the environment that, so far, we’ve failed to properly fold into our understanding? The discovery of the Higgs particle by the Large Hadron Collider in Geneva has convinced physicists that the answer is a resounding yes.

Nearly a half-century ago, Peter Higgs and a handful of other physicists were trying to understand the origin of a basic physical feature: mass. You can think of mass as an object’s heft or, a little more precisely, as the resistance it offers to having its motion changed.

Push on a freight train (or a feather) to increase its speed, and the resistance you feel reflects its mass. At a microscopic level, the freight train’s mass comes from its constituent molecules and atoms, which are themselves built from fundamental particles, electrons and quarks.

But where do the masses of these and other fundamental particles come from?

One more piece in the puzzle of the universe—a Higgs-shaped one

People have always been curious about how the world works, and what everything is fundamentally made of.

The ancient Greeks thought it all boiled down to four elemental substances: earth, air, fire and water. Over the centuries, physicists have taken these ideas much, much further.

Modern-day particle physics is concerned with how the world works on a subatomic (really, really small) scale.

For a while, we thought that the atom was the smallest component of matter and could not be broken up into anything smaller (the name comes from the Greek word atomon, meaning ‘that which cannot be divided’), and that a hydrogen atom was the smallest possible particle of matter. We now know that the fundamental world extends far beyond the basics of earth, air, fire and water, and that there are many things smaller than a hydrogen atom. In fact, there are more than 200 types of subatomic particles, which interact in very complex ways.

Of the 200 sub-atomic particles that we know about, there are some that cannot be split into smaller components, and these are called elementary particles. These elementary particles combine to form composite particles.

There are also four fundamental forces, or fields, within the universe: the strong nuclear force, the weak nuclear force, the electromagnetic force and the gravitational force.

These forces act upon different types of particles and control how they interact with each other.

The Standard Model is theoretical physicists’ best attempt yet to describe the elementary particles, and to explain how three of the four forces govern the way they behave. The model is a sort of instruction manual for the universe: 12 elementary particles of matter, with the fundamental forces providing the methods for putting that matter together.

The particles

There are two main categories of particles that make up the Standard Model: fermions and bosons. Fermions make up matter, and bosons are responsible for transmitting the forces that control how matter behaves.

Fermions are divided into quarks and leptons. Quarks can be further split up into six different types: up, down, charm, strange, top, bottom.

Leptons also come in six varieties: electron, electron neutrino, muon, muon neutrino, tau and tau neutrino. Electrons, muons and taus all have an electric charge, which is important because it determines which forces can act on them.

Quarks can combine to make composite particles, such as a hadron (remember this one for later).

Fermions are the matter particles, the basic building blocks of everything in the universe. They are split into two groups: quarks and leptons.


  • u Up
  • c Charm
  • t Top
  • d Down
  • s Strange
  • b Bottom


  • e Electron
  • μ Muon
  • τ Tau
  • νe Electron neutrino
  • νμ Muon neutrino
  • ντ Tau neutrino

All matter as we know it is made from various combinations of elementary particles, stuck together by one of the fundamental forces.
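The twelve fermions listed above can be captured in a small data structure. The grouping below is just an illustrative Python sketch of the Standard Model's matter content, not anything from the source:

```python
# The 12 matter particles of the Standard Model, grouped as in the text:
# six quarks and six leptons.
FERMIONS = {
    "quarks": [
        ("u", "up"), ("c", "charm"), ("t", "top"),
        ("d", "down"), ("s", "strange"), ("b", "bottom"),
    ],
    "leptons": [
        ("e", "electron"), ("mu", "muon"), ("tau", "tau"),
        ("nu_e", "electron neutrino"), ("nu_mu", "muon neutrino"),
        ("nu_tau", "tau neutrino"),
    ],
}

# Sanity check: 6 quarks + 6 leptons = the 12 elementary matter particles.
assert sum(len(group) for group in FERMIONS.values()) == 12
```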

Stuck together how?

This is where the bosons come in. The four fundamental forces (the weak nuclear force, the strong nuclear force, the electromagnetic force and the gravitational force) each have their own specific type of boson, known as a gauge boson. The exchange of gauge bosons gives rise to the forces that control our natural world.

The weak nuclear force is carried by the W and Z bosons and is responsible for the radioactive decay of subatomic particles.

The Higgs boson: the hunt, the discovery, the study and some future perspectives | ATLAS Experiment at CERN

Many questions in particle physics are related to the existence of particle mass. The “Higgs mechanism,” which consists of the Higgs field and its corresponding Higgs boson, is said to give mass to elementary particles.

By “mass” we mean the inertial mass, which resists when we try to accelerate an object, rather than the gravitational mass, which is sensitive to gravity. In Einstein’s celebrated formula E = mc², the “m” is the inertial mass of the particle.
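As a quick numeric check of E = mc², the rest energy of the electron can be computed from its mass. The constants below are standard CODATA values:

```python
# Rest energy E = m c^2, evaluated for the electron.
c = 2.99792458e8        # speed of light in vacuum, m/s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # joules per electron-volt

E_joules = m_e * c**2
E_MeV = E_joules / eV / 1e6
print(f"electron rest energy ≈ {E_MeV:.3f} MeV")  # ≈ 0.511 MeV
```

The familiar 0.511 MeV electron rest energy drops straight out, which is why particle physicists quote masses in energy units.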

In a sense, this mass is the essential quantity: it defines that at this place there is a particle rather than nothing.

In the early 1960s, physicists had a powerful theory of electromagnetic interactions and a descriptive model of the weak nuclear interaction – the force that is at play in many radioactive decays and in the reactions which make the Sun shine. They had identified deep similarities between the structure of these two interactions, but a unified theory at the deeper level seemed to require that particles be massless even though real particles in nature have mass.

In 1964, theorists proposed a solution to this puzzle. Independent efforts by Robert Brout and François Englert in Brussels, Peter Higgs at the University of Edinburgh, and others led to a concrete model known as the Brout-Englert-Higgs (BEH) mechanism.

The peculiarity of this mechanism is that it can give mass to elementary particles while retaining the nice structure of their original interactions. Importantly, this structure ensures that the theory remains predictive at very high energy.

Particles that carry the weak interaction would acquire masses through their interaction with the Higgs field, as would all matter particles. The photon, which carries the electromagnetic interaction, would remain massless.

In the history of the universe, particles interacted with the Higgs field just 10⁻¹² seconds after the Big Bang. Before this phase transition, all particles were massless and travelled at the speed of light.

After the universe expanded and cooled, particles interacted with the Higgs field and this interaction gave them mass. The BEH mechanism implies that the values of the elementary particle masses are linked to how strongly each particle couples to the Higgs field. These values are not predicted by current theories.

However, once the mass of a particle is measured, its interaction with the Higgs boson can be determined.

The BEH mechanism had several implications: first, that the weak interaction was mediated by heavy particles, namely the W and Z bosons, which were discovered at CERN in 1983. Second, the new field itself would materialize in another particle.

The mass of this particle was unknown, but researchers knew it should be lower than 1 TeV – a value well beyond the then conceivable limits of accelerators.

This particle was later called the Higgs boson and would become the most sought-after particle in all of particle physics.

The Large Electron-Positron collider (LEP), which operated at CERN from 1989 to 2000, was the first accelerator to have significant reach into the potential mass range of the Higgs boson. Though LEP did not find the Higgs boson, it made significant headway in the search, determining that the mass should be larger than 114 GeV.

In 1984, a few physicists and engineers at CERN were exploring the possibility of installing a proton-proton accelerator with a very high collision energy of 10-20 TeV in the same tunnel as LEP. This accelerator would probe the full possible mass range for the Higgs, provided that the luminosity[1] was very high.

However, this high luminosity would mean that each interesting collision would be accompanied by tens of background collisions. Given the state of detector technology of the time, this seemed a formidable challenge. CERN wisely launched a strong R&D programme, which enabled fast progress on the detectors.

This seeded the early collaborations, which would later become ATLAS, CMS and the other LHC experiments.

On the theory side, the 1990s saw much progress: physicists studied the production of the Higgs boson in proton-proton collisions and all its different decay modes.

As each of these decay modes depends strongly on the unknown Higgs boson mass, future detectors would need to measure all possible kinds of particles to cover the wide mass range.

Each decay mode was studied using intensive simulations and the important Higgs decay modes were amongst the benchmarks used to design the detector.

Meanwhile, at the Fermi National Accelerator Laboratory (Fermilab) outside of Chicago, Illinois, the Tevatron collider was beginning to have some discovery potential for a Higgs boson with mass around 160 GeV. Tevatron, the scientific predecessor of the LHC, collided protons with antiprotons from 1986 to 2011.

In 2008, after a long and intense period of construction, the LHC and its detectors were ready for the first beams. On 10 September 2008, the first injection of beams into the LHC was a big event at CERN, with the international press and authorities invited.

The machine worked beautifully and we had very high hopes. Alas, ten days later, a problem in the superconducting magnets significantly damaged the LHC. A full year was necessary for repairs and to install a better protection system.

The incident revealed a weakness in the magnets, which limited the collision energy to 7 TeV.

When restarting, we faced a difficult decision: should we take another year to repair the weaknesses all around the ring, enabling operation at 13 TeV? Or should we immediately start and operate the LHC at 7 TeV, even though a factor of three fewer Higgs bosons would be produced? Detailed simulations showed that there was a chance of discovering the Higgs boson at the reduced energy, in particular in the range where the competition of the Tevatron was the most pressing, so we decided that starting immediately at 7 TeV was worth the chance.

The LHC restarted in 2010 at 7 TeV with a modest luminosity – a luminosity that would increase in 2011. The ATLAS Collaboration had made good use of the forced stop of 2009 to better understand the detector and prepare the analyses.

In 2010, Higgs experts from the experiments and from theory created the LHC Higgs Cross-Section[2] Working Group (LHCHXSWG), which proved invaluable as a forum for assembling the best calculations and for discussing the difficult aspects of Higgs production and decay.

These results have since been regularly documented in the “LHCHXSWG Yellow Reports,” famous in the community.

The discovery of the Higgs boson

The Higgs boson and the resonances at the Large Hadron Collider

Review Article Volume 2 Issue 5

Department of Physical Electronics, Russian State Pedagogical University, Russia

Correspondence: Konstantinov Stanislav Ivanovich, Department of Physical Electronics, Russian State Pedagogical University, St. Petersburg, RSC "Energy", Russia, Tel 8911 7159 176

Received: July 06, 2018 | Published: October 23, 2018

Citation: Ivanovich KS. The higgs boson and the resonances at the Large Hadron Collider. Phys Astron Int J. 2018;2(5):481-487. DOI: 10.15406/paij.2018.02.00130


The article presents experimental discoveries recently obtained at the Large Hadron Collider (LHC) and at the Alpha–magnetic spectrometer AMS–02, which was placed on the International Space Station (ISS).

To explain their results, it is proposed to consider the polarization of the quantum vacuum (dark matter) and the resonances accompanying the formation of unstable particles, including heavy resonances of the Higgs boson, in collisions of bunches of relativistic protons.

In this connection, it is necessary to reconsider the Standard Model.

Keywords: vacuum, dark matter, proton, electron, positron, resonance, mass, energy

The experimental discoveries made recently at the Large Hadron Collider (LHC) include the discovery of the Higgs boson, the increase in the proton interaction cross-section with increasing energy, the increase in the fraction of elastic scattering processes in this energy interval (that is, an increase in the stability of the protons), and the emission of jets in inelastic processes with a large multiplicity.1 Most striking is that the interval of resonant proton energy in the LHC at which the greatest probability of inelastic proton collisions and the creation of new particles is observed corresponds to this energy interval;1 however, with increasing energy of the relativistic protons, the effect of their stability after collision increases.1 Noteworthy is the fact that in the alpha-magnetic spectrometer AMS-02 the resonance maximum of the total energy spectrum of the secondary electrons and positrons,2 as well as the maxima of the energy spectra obtained separately for positrons2 and for electrons,2 also corresponds to this energy interval. It can be assumed that the creation of new particles in this energy range is associated with the polarization of the quantum vacuum (dark matter). Such a picture contradicts the notions of classical physics and goes beyond the framework of the Standard Model (SM). The latest experimental discoveries at the LHC still await explanation. Their interpretation is difficult, since all the observed effects at the LHC are associated with the manifestation of strong-interaction forces and should be described by quantum chromodynamics (QCD). Today in theoretical physics the perturbation-theory method is used, in which expansions in power series with respect to the coupling constant are performed when its values are small.

However, this method is not suitable for describing the production of particles in so-called "soft" interaction events with small transferred momenta, for which the coupling constant is large. In interpreting experimental data on such events, models with many adjustable parameters are usually involved. As a result, the predictions of these models become less certain and not reliable enough. The article offers a different interpretation of the "soft" events of the creation of unstable particles at the LHC, on the basis of the polarization of the quantum vacuum (dark matter) and the resonances. The creation of unstable particles at the LHC at proton energies of this order is indeed observed,1 as is the production of electron-positron pairs in the cosmic AMS-02 detector,2 but this effect can be explained by the polarization of the quantum vacuum (dark matter), which is the third full participant in collisions of protons at the LHC and whose presence the apologists of Einstein's Special Relativity Theory (SRT), dominant in physics for 100 years, deny.3 Now that, within the framework of the ELI and XCELS projects, the laser-radiation intensity available for experiments has increased, it has become possible to study nonlinear vacuum effects that until now have not been studied experimentally. Thus, at the level of ultrahigh intensities of 10²³ W·cm⁻², the scattering of a laser pulse on an electron beam with an energy of 46.6 GeV (the nonlinear Compton effect) at the SLAC linear accelerator causes such cascades of successive hard-photon emissions (W_phot = hν) that the creation of secondary electron-positron pairs in vacuum becomes a chain reaction, continuing up to the moment of complete loss of energy by the charged particles. This is very reminiscent of the extensive atmospheric showers generated by cosmic particles.4,5 This comparison of space observations with the results of laboratory studies demonstrates deep analogies, evidencing, at a minimum, the unity of the physical principles of the behaviour of matter over a wide range of densities (approximately 42 orders of magnitude) and temperatures (up to 10¹³ K). Perhaps the creation of electron-positron pairs in a vacuum is a manifestation of the instability of dark matter. Today, on the basis of experiments at the SLAC linear accelerator, many physicists believe that at the LHC the emission of jets in inelastic processes with a large multiplicity, including protons and antiprotons, is associated with dark matter.6 In quantum electrodynamics (QED) there is still no complete clarity on how to solve the problem of the production of pairs of elementary particles and antiparticles in a vacuum under the action of external fields, relying on the corresponding Klein-Gordon-Fock and Dirac equations. For some of these fields it is possible to construct the corresponding quantum theory of the Dirac field, but on the whole there are insurmountable difficulties connected with the creation of electron-positron pairs from the vacuum, leading to nonlinear many-particle problems.7

In quantum electrodynamics (QED), the instability of the physical vacuum under the influence of high-energy photons of cosmic radiation, relativistic protons, peak electric and magnetic fields, or high-intensity laser radiation is called vacuum polarization and is characterized by the formation of electron-positron pairs, which makes the vacuum itself unstable.8

In the LHC, the electric current arising from the motion of charged beams of relativistic protons, in accordance with Maxwell's equations, generates strong and non-strong, toroidal and poloidal electromagnetic fields.9

It has been experimentally established that in the presence of a strong magnetic field H ≈ 10¹⁶ T, or a peak electric field strength E ≈ 10¹⁶ V·cm⁻¹, relatively stable particles (lifetime 16·10⁻²³ s) are formed in the quantum vacuum from virtual particles. With the polarization of the quantum vacuum and its transformation into matter, the change in the energy of the vacuum w can be represented as a sum:

  • w = w_p + w_a
  • where w_p is the vacuum polarization,

The Higgs Boson discovery

From Scholarpedia

Curator: Chris Seez

The Higgs boson discovery was announced by the ATLAS and CMS collaborations on 4th July 2012.
Evidence for a new particle with a mass of about 125 GeV and the properties of the Standard Model Higgs boson was seen in the three decay modes H → ZZ* → ℓℓ ℓℓ, H → γγ, and H → WW* → ℓν ℓν in both experiments.


The Standard Model of particle physics takes quarks and leptons to be fundamental, elementary particles, and describes the forces that govern their interactions as mediated through the exchange of further elementary particles.

The exchanged particles are photons in the case of the electromagnetic interaction, W and Z bosons in the case of the weak interaction, and gluons in the case of the strong interaction.

After the discovery of the W and Z bosons in the early 1980s, the elucidation of the mechanism by which they acquire mass became an important goal for particle physics.

Within the Standard Model the W and Z bosons have masses generated via the symmetry breaking Englert-Brout-Higgs-Guralnik-Hagen-Kibble mechanism, proposed in 1964 and giving rise to a massive scalar particle, the Standard Model Higgs boson.

The Large Hadron Collider

The Large Hadron Collider (LHC) was built at CERN in the 27-kilometre LEP tunnel with the aim of probing the TeV energy scale.

A key element in the scientific goals of the LHC was the elucidation of the electroweak symmetry breaking mechanism and the search for the Higgs boson postulated in the Standard Model.

The LHC is designed to accelerate and collide protons at a centre-of-mass energy of 14 TeV, and to achieve an instantaneous luminosity exceeding 10³⁴ cm⁻²s⁻¹ with the counter-rotating proton bunches separated by 25 ns, resulting in a bunch crossing rate of 40 MHz.
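The quoted 40 MHz crossing rate follows directly from the 25 ns bunch spacing, since the rate is just the reciprocal of the spacing:

```python
# Bunch spacing determines the maximum bunch-crossing rate: f = 1 / Δt.
spacing_ns = 25.0
crossing_rate_hz = 1.0 / (spacing_ns * 1e-9)
print(f"{crossing_rate_hz / 1e6:.0f} MHz")  # 40 MHz

# A 50 ns spacing, as used in 2011, halves this to 20 MHz.
```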

Stable operation of the LHC began in 2010 at a centre-of-mass energy of 7 TeV.

In 2011 the number of bunches in the beams was increased to 1380, corresponding to a bunch separation of 50 ns, and significantly increasing the luminosity.

In 2012 the centre-of-mass energy was raised to 8 TeV, and the luminosity was further increased during the course of the year, reaching peak luminosities of up to 7 × 10³³ cm⁻²s⁻¹.


The instantaneous luminosity, ℒ, provides a measure of the beam intensity and depends on the number of particles in the circulating beams and how closely they are squeezed in space at the collision point. The usual unit of measurement is cm⁻²s⁻¹. The cross section of a process, σ, multiplied by the instantaneous luminosity then gives the rate of interactions, Ṅ = ℒσ. For example, the Higgs boson production cross section multiplied by the instantaneous luminosity yields the expected production rate of Higgs bosons, i.e. the number of Higgs bosons produced per second. The integrated luminosity is the instantaneous luminosity integrated over time. The data collected by the ATLAS and CMS experiments in 2011 and in the first months of data taking in 2012, and used in the analyses that resulted in the first observation of the Higgs boson, corresponded to slightly more than 5 inverse femtobarns (1 fb = 10⁻³⁹ cm²) in each year for each experiment. This integrated luminosity resulted in about 10¹⁵ proton-proton collisions at the centre of each detector.
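The rate formula Ṅ = ℒσ can be tried numerically. In the sketch below, the luminosity is the 2012 peak value from the text, but the Higgs production cross-section is an assumed round number of roughly the right order for these energies, not a quoted measurement:

```python
# Event rate N_dot = L * sigma, and expected events N = sigma * integrated L.
barn = 1e-24                 # cm^2
pb, fb = 1e-12 * barn, 1e-15 * barn

L_inst = 7e33                # instantaneous luminosity, cm^-2 s^-1 (2012 peak)
sigma_higgs = 20 * pb        # assumed Higgs cross-section, ~O(20 pb); illustrative

rate = L_inst * sigma_higgs          # Higgs bosons produced per second
n_events = sigma_higgs * (5 / fb)    # expected Higgs bosons in 5 fb^-1 of data
print(f"{rate:.2g} Higgs/s, {n_events:.0f} Higgs in 5 inverse fb")
```

A rate of a fraction of a Higgs boson per second, accumulating to roughly a hundred thousand in a year's dataset, shows why the painstaking selection of rare decay modes described later was necessary.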


An event recorded by a detector may contain signals from more than one proton-proton collision and from several bunch crossings. This effect is known as pileup. The high intensity of the proton beams results in multiple proton-proton collisions occurring during each bunch crossing. The average number of interactions per bunch crossing was about 10 in 2011 and increased to about 20 in 2012. Advances in understanding of the performance of the detectors, and improved analysis techniques, were used to associate tracks and energy deposits to the different interactions, thereby mitigating the effects of the harsher environment as the LHC instantaneous luminosity increased.
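The quoted pileup levels are consistent with the mean number of interactions per crossing, μ = σ_inel·ℒ/f_crossing. In the sketch below the inelastic proton-proton cross-section (~70 mb) is an assumed round value, not a figure from the text:

```python
# Mean pileup per bunch crossing: mu = sigma_inel * L / f_crossing.
mb = 1e-27                    # millibarn in cm^2
sigma_inel = 70 * mb          # assumed inelastic pp cross-section, ~70 mb
L_inst = 7e33                 # instantaneous luminosity, cm^-2 s^-1 (2012 peak)
f_crossing = 1.0 / 50e-9      # 50 ns bunch spacing -> 20 MHz crossing rate

mu = sigma_inel * L_inst / f_crossing
print(f"mean interactions per crossing ≈ {mu:.1f}")
```

With these inputs μ comes out in the low twenties, matching the "about 20 in 2012" quoted above.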

Experimental variables

The pseudorapidity, η, is defined as η = −ln[tan(θ/2)], where θ is the polar angle measured from the anticlockwise beam direction. The azimuthal angle, measured about the axis defined by the beam directions, is denoted by φ. The transverse momentum, pT, denotes the component of momentum perpendicular to the beam axis.
Given the relativistic energy-momentum relation E = √(p²c² + m²c⁴), and using the natural unit system commonly used in particle physics, in which the speed of light in vacuum c is set to 1, masses and momenta are expressed in units of energy: eV (electron-volts), or keV, MeV, GeV, TeV.
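Both definitions are straightforward to code; a minimal sketch, with energies and momenta in GeV under the natural-unit convention (c = 1):

```python
import math

def pseudorapidity(theta: float) -> float:
    """eta = -ln(tan(theta/2)); theta is the polar angle from the beam axis, in radians."""
    return -math.log(math.tan(theta / 2.0))

def invariant_mass(E: float, p: float) -> float:
    """m = sqrt(E^2 - p^2) in natural units (c = 1); E, p and m all in GeV."""
    return math.sqrt(E**2 - p**2)

# A particle emitted perpendicular to the beam has eta = 0.
assert abs(pseudorapidity(math.pi / 2)) < 1e-12
```

Pseudorapidity is preferred over the polar angle itself because differences in η are approximately invariant under boosts along the beam axis for highly relativistic particles.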

The ATLAS and CMS experiments
