Role of DCYTIMES in finding time evolution of dose eq rate

From: Mina Nozar <nozarm_at_triumf.ca>
Date: Mon, 23 Jul 2012 17:54:08 -0700

Hello everyone,

I have come across a feature that is puzzling to me.

A bit of context:
While summarizing the time evolution of the activities in different regions and of the DoseEq rate at a detector
placed a meter away from various targets, and comparing the results with measurements, I found a bug in my input files:
I had accidentally set the wrong region in one of the RESNUCLEi cards. After correcting this mistake and rerunning,
I noticed that the DoseEq rate values in my detector had also changed, which should not have happened. After further
investigation and many sets of runs, ruling out every other difference between the first set of runs a few months ago
and the recent run (including different FLUKA versions), I have come to the conclusion, by a process of elimination,
that the difference in the DoseEq values is due to two different intermediate ramp-up times defined in my DCYTIMES
cards! This is the last place I would have looked for the source of the difference.

                                       Dose Eq. (uSv/h)                     Delta (%)
 #  Decay time   Time after SOB (s)    Ta26_Wrong-DCYTIMES  Ta26            (Wrong - Ta26) / Wrong * 100
 1  1h_SOB           3.6000e+03        2.635534e+05         2.635534e+05     0.00
 2  1d_SOB           8.6400e+04        1.405412e+06*        1.405412e+06     0.00
 3  3d_SOB           2.5920e+05        1.677836e+06*        1.631715e+06     2.75
 4  10d_SOB          8.6400e+05        1.685668e+06         1.677836e+06     0.46
 5  0s_EOB           2.1600e+06        1.697290e+06         1.697290e+06     0.00
 6  1h_EOB           2.1636e+06        1.450112e+06         1.448546e+06     0.11
 7  1d_EOB           2.2464e+06        2.875528e+05         2.859396e+05     0.56
 8  10d_EOB          3.0240e+06        2.326512e+04         2.173031e+04     6.60
 9  40d_EOB          5.6160e+06        8.896961e+03         7.884914e+03    11.38
10  1y_EOB           3.3696e+07        1.709451e+03         1.289858e+03    24.55
11  2y_EOB           6.5232e+07        1.063496e+03         7.738430e+02    27.24
12  3y_EOB           9.6768e+07        7.320424e+02         5.321056e+02    27.31
13  5y_EOB           1.5984e+08        3.542361e+02         2.589828e+02    26.89

NOTE: the two starred values, 1.405412e+06 and 1.677836e+06, are the dose rates at 10 and 14 days after SOB
(Start Of Beam), respectively, as opposed to 1 and 3 days after SOB in the Ta26 case. The last column shows the
relative difference between the dose rates in the two sets of runs. The difference grows to around 27%, while the
statistical errors in each case are on the order of 0.3%!
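
To put the last column next to the statistics, here is a quick illustrative check in Python (the 0.3% is used as a
rough relative statistical error for each run; these are not numbers taken from my scoring output):

# Quick check of the last table row: relative difference vs. statistics.
wrong = 3.542361e+02   # Dose Eq. (uSv/h), Ta26_Wrong-DCYTIMES, 5 years after EOB
ta26  = 2.589828e+02   # Dose Eq. (uSv/h), Ta26,                5 years after EOB
rel_err = 0.003        # assumed relative statistical error of each run (~0.3%)

delta_pct = (wrong - ta26) / wrong * 100.0      # ~26.9 %, as in the table
stat_pct  = (2 * rel_err**2) ** 0.5 * 100.0     # ~0.4 %, rough quadrature sum of the two errors

print(f"Delta = {delta_pct:.2f} %  vs. combined statistical error ~ {stat_pct:.2f} %")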

The IRRPROFI and DCYTIMES cards for the original run (Ta26_Wrong-DCYTIMES):

IRRPROFI 2.16E6 4.994E14
* Ramp-up (grow-in) times -- negative values are times before EOB (conversion sketched after the two card listings)
* 1 hour after SOB ===> 2.16e6 - 3600 = -2.156e6 s
* 1 day after SOB ===> 2.16e6 - 24*3600 = -2.074e6 s
* 10 days after SOB ===> 2.16e6 - 3*24*3600 = -1.901e6 s
* 14 days after SOB ===> 2.16e6 - 10*24*3600 = -1.296e6 s
DCYTIMES -2.156E6 -2.074E6 -1.296E6 -9.36E5 <========
* Cool down times
* 0 sec after EOB ===> 0 s
* 1 hour after EOB ===> 3600 s
* 1 day after EOB ===> 24*3600 = 86400 s
* 10 days after EOB ===> 10*24*3600 = 864000 s
* 40 days after EOB ===> 40*24*3600 = 3.456e6 s
* 1 year after EOB ===> 365*24*3600 = 3.154e7 s
DCYTIMES 0. 3600. 86400. 864000. 3.456E6 3.154E7
* 2 years after EOB ===> 2*365*24*3600 = 6.307e7 s
* 3 years after EOB ===> 3*365*24*3600 = 9.461e7 s
* 5 years after EOB ===> 5*365*24*3600 = 1.577e8 s
DCYTIMES 6.307E7 9.461E7 1.577E8

For the recent run (Ta26):
IRRPROFI 2.16E6 4.994E14
DCYTIMES -2.156E6 -2.074E6 -1.901E6 -1.296E6 <=======
DCYTIMES 0. 3600. 86400. 864000. 3.456E6 3.154E7
DCYTIMES 6.307E7 9.461E7 1.577E8
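
For reference, the conversion used in the comments above, from times measured after SOB (or after EOB) to the values
that go on the DCYTIMES cards, can be written out in a few lines of Python. This is only an illustration of the
arithmetic, not FLUKA input; the 2.16e6 s irradiation length is the one from the IRRPROFI card:

# Illustration only: convert requested decay times to the DCYTIMES convention,
# where times are measured from EOB, so times during irradiation are negative.
T_IRR = 2.16e6          # irradiation length (s), from the IRRPROFI card
day = 24 * 3600

def dcytime_after_sob(t):
    """DCYTIMES value for a time t (s) measured from the start of bombardment."""
    return t - T_IRR    # negative while the beam is still on

def dcytime_after_eob(t):
    """DCYTIMES value for a cooling time t (s) measured from EOB."""
    return t            # already in the card's convention

for label, t in [("1h after SOB", 3600), ("1d after SOB", day),
                 ("3d after SOB", 3 * day), ("10d after SOB", 10 * day)]:
    print(f"{label:>13}: {dcytime_after_sob(t): .3e} s")
for label, t in [("1d after EOB", day), ("1y after EOB", 365 * day)]:
    print(f"{label:>13}: {dcytime_after_eob(t): .3e} s")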

There are no other differences between the two input files. My question is this: how could the dose rates at, say,
3 years after EOB be different when only one or two of the intermediate ramp-up or cool-down times differ? What am I
missing?
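
To spell out why I do not expect any dependence: for a single nuclide produced at a constant rate R during an
irradiation of length T, the activity at a cooling time t after EOB is A(t) = R * (1 - exp(-lambda*T)) * exp(-lambda*t),
which depends only on T and t, not on which other decay times happen to be requested. A toy Python example with
made-up numbers (not my actual target):

import math

# Toy single-nuclide activation example (made-up half-life and production rate).
lam = math.log(2) / (30.0 * 24 * 3600)   # made-up 30-day half-life
R   = 1.0e9                              # made-up production rate (nuclei/s)
T   = 2.16e6                             # irradiation length (s), as in IRRPROFI

def activity(t_cool):
    """Activity (decays/s) at cooling time t_cool (s) after EOB."""
    return R * (1.0 - math.exp(-lam * T)) * math.exp(-lam * t_cool)

# The value at 3 years after EOB is the same regardless of which other
# cooling or grow-in times are evaluated alongside it.
print(activity(3 * 365 * 24 * 3600))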

Thank you in advance,
Mina
