(no subject)

From: Francesco Cerutti (Francesco.Cerutti@cern.ch)
Date: Tue Nov 13 2007 - 17:50:44 CET

  • Next message: Alberto Fasso': "Re: USRTRACK fluence spectrum scoring in GeV/nucleon instead of GeV?"

    Dear Denis,

    > I have the following experiment to simulate:
    > Projectile: a gold beam at 25 GeV/nucleon
    > Fixed target: a 1% interaction Au target (corresponds to ~250 micrometers)
    > Iron beam dump at the end of the cave.
    > Now I would like to study the fluences of electrons, protons, pions and
    > mostly neutrons inside the cave.
    > For minimum bias events I use the FLUKA heavy ion definition with DPMJET-III.
    > For Au-Au collisions I use an external source (using source.f), reading
    > a UrQMD output file and filling the stack.
    > Now I want to estimate the fluence of all particles in the cave.
    > My question is: how do I modify source.f in order to have the correct
    > normalization for the fluence?
    > In source.f there is the line calculating the total weight of primaries:
    > should I modify it to
    > WEIPRI = 1/100 ?

    if you want to use your own list of secondary particles - so that each
    of them represents a primary history - and to assume an interaction
    probability of 1% for the incident Au projectiles, you should in
    principle randomly select one Au+Au event from your list and then load
    all of its secondary products one after the other, each as a primary.
    At the same time I would keep the weight definition in source.f unchanged:

    * Wt is the weight of the particle
           WTFLK (NPFLKA) = ONEONE

    [by the way, WEIPRI = 1/100 evaluates to WEIPRI = 0, since it is a
    division between two integers; you would need 1.D0/100.D0]
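    The integer-division pitfall above is not specific to Fortran; a
    minimal C illustration of the same truncation (variable names are
    mine, for demonstration only):

    ```c
    #include <stdio.h>

    int main(void) {
        int as_int = 1 / 100;           /* integer division: truncates to 0 */
        double as_dbl = 1.0 / 100.0;    /* floating-point division: 0.01 */
        printf("%d %.2f\n", as_int, as_dbl);
        return 0;
    }
    ```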

    and I would rescale the fluences of each cycle by multiplying them by
    (SCAFAC * 1D-2), where 1D-2 is the assumed ratio between the interacting
    gold ions and the incident ones, and SCAFAC is the ratio between the
    number of transported reaction products (i.e. your number of histories N)
    and the number of considered collisions.
    Then, when merging results from different cycles, you should NOT weight
    them according to their number of histories N_i, but according to
    N_i/SCAFAC_i.
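    A minimal C sketch of this normalization arithmetic, with made-up
    per-cycle numbers (N_i, C_i and the raw fluences are assumptions, not
    values from any real run; note that N_i/SCAFAC_i is simply the number
    of considered collisions C_i):

    ```c
    #include <stdio.h>

    int main(void) {
        /* assumed interaction probability of the thin Au target (1D-2) */
        const double p_int = 1e-2;

        /* made-up per-cycle data: N_i transported reaction products,
           C_i considered Au+Au collisions, Phi_i raw scored fluence */
        double N[]   = { 5000.0, 8000.0 };
        double C[]   = {  100.0,  160.0 };
        double phi[] = {    2.0,    2.2 };
        int ncycles  = 2;

        double wsum = 0.0, phisum = 0.0;
        for (int i = 0; i < ncycles; ++i) {
            double scafac   = N[i] / C[i];               /* SCAFAC_i */
            double rescaled = phi[i] * scafac * p_int;   /* per-cycle rescaling */
            double w        = N[i] / scafac;             /* merge weight N_i/SCAFAC_i = C_i */
            wsum   += w;
            phisum += w * rescaled;
        }
        printf("merged fluence per incident ion: %.6f\n", phisum / wsum);
        return 0;
    }
    ```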

    On the other hand, to properly overcome the low interaction probability
    due to your very thin target, you might want to apply biasing: the
    LAM-BIAS card allows you to reduce the hadronic inelastic interaction
    length (by the factor WHAT(2) with SDUM blank) for a given material
    (WHAT(3), gold in your case).
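    A hedged sketch of such an input card (the reduction factor 0.02 and
    the material name GOLD are my assumptions; check the WHAT numbering
    and the column alignment against the FLUKA manual for your version):

    * Reduce the hadronic inelastic interaction length in gold by a factor 50
    LAM-BIAS         0.0      0.02      GOLD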

    Hope this helps


    Francesco Cerutti
    CH-1211 Geneva 23
    tel. ++41 22 7678962
    fax ++41 22 7668854


    This archive was generated by hypermail 2.1.6 : Tue Nov 13 2007 - 23:29:32 CET