[fluka-discuss]: Energy loss of relativistic nuclei in a crystal

From: maestro <paolo.maestro_at_pi.infn.it>
Date: Sat, 22 Jun 2019 09:02:40 +0200

Hello fluka developers,
I am simulating the energy deposited by high-energy oxygen nuclei in a bar of PWO crystal (density 8.28 g/cm^3, 2 cm thick).
The oxygen nuclei are generated according to a KE^-1 spectrum, where KE is the kinetic energy per particle, spanning from 178 GeV to 316 TeV.
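(For concreteness, a KE^-1 spectrum over that range can be drawn by inverse-transform sampling; the short Python sketch below is only illustrative and uses the two energy bounds quoted above, not the actual FLUKA source setup.)

```python
import numpy as np

KE_MIN = 178.0     # GeV, lower bound of the kinetic energy per particle
KE_MAX = 316.0e3   # GeV, upper bound (316 TeV)

def sample_ke_inverse_power(n, rng=None):
    """Draw n kinetic energies from dN/dKE ~ KE^-1 between KE_MIN and KE_MAX.

    Inverse transform for a 1/KE spectrum:
    KE = KE_MIN * (KE_MAX / KE_MIN)**u, with u uniform in [0, 1).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.random(n)
    return KE_MIN * (KE_MAX / KE_MIN) ** u
```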
I made plots (see the figure attached to this email) of the energy-loss (dE/dx) distributions, computed from the energy deposited in the bar and divided by the square of the particle charge, for different intervals of KE.
In these plots, nuclei undergoing a hadronic inelastic interaction (anywhere in the crystal) are not included.
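(As a minimal sketch of the normalization just described: the conversion of the per-event deposited energy to MeV cm^2/g through the areal density, and the way hadronic-interaction events are flagged, are assumptions for illustration only, not the actual analysis code.)

```python
import numpy as np

Z   = 8      # oxygen charge
RHO = 8.28   # g/cm^3, PWO density
T   = 2.0    # cm, bar thickness

def de_dx_over_z2(edep_mev, had_inelastic):
    """Per-event dE/dx in MeV cm^2/g, divided by Z^2.

    edep_mev      : energy deposited in the bar (MeV), one entry per nucleus
    had_inelastic : boolean array, True if the nucleus had a hadronic
                    inelastic interaction in the crystal (event excluded)
    """
    edep = np.asarray(edep_mev)[~np.asarray(had_inelastic, dtype=bool)]
    return edep / (RHO * T) / Z**2
```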
As you can see, dE/dx has a nearly constant value (around 2 MeV cm^2/g, as expected) up to KE ~ 10 TeV, then it increases dramatically for higher KE.
I wonder what the physical reason for this increase in dE/dx might be.
I would have expected an almost constant value for relativistic charged particles (in the Fermi plateau).
Are there any radiative effects turned on for ultra-relativistic charged particles?
Thanks for your help.
        Paolo Maestro





