From: Valery Taranenko (TaranV@rpi.edu)
Date: Tue Apr 10 2007 - 19:09:58 CEST
1) Recompiling Fluka with, say, 1 GB of memory. I think 1 GB should work: 400
million voxels (2 bytes per voxel), 36 organs, 20--25 materials (different
densities of the same material should probably be counted separately). The
number of organs and materials may grow to about 100 in the future (as you
confirmed, the number of materials is independent of the voxel memory
allocation, so it shouldn't be a problem).
2) If I want to score average organ dose (energy deposited in all voxels
with the same voxel ID, binned separately for each voxel ID), should I use
the SCORE (scoring-by-region) method? This is my program minimum, and I'd
like to start with the easy things first. I also expect this method to have
less memory overhead (Fluka doesn't need to store per-voxel depositions in RAM).
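The per-organ aggregation described in this item (sum the energy deposited in all voxels sharing an organ ID, then normalize by mass outside Fluka) can be sketched as follows; a minimal Python illustration with hypothetical per-voxel arrays, not FLUKA's own scoring machinery:

```python
# Sketch (not FLUKA itself): average organ dose from per-voxel energy
# depositions. organ_id, edep, and organ_mass are hypothetical inputs;
# in practice they would come from the run output and the phantom data.

def organ_doses(organ_id, edep, organ_mass):
    """Return {organ ID: average dose} = total deposited energy / organ mass."""
    total = {}
    for oid, e in zip(organ_id, edep):
        total[oid] = total.get(oid, 0.0) + e  # sum energy per organ ID
    return {oid: e / organ_mass[oid] for oid, e in total.items()}

# Tiny example: four voxels belonging to two organs.
doses = organ_doses(
    organ_id=[1, 1, 2, 2],
    edep=[0.5, 0.5, 1.0, 3.0],    # energy deposited in each voxel
    organ_mass={1: 2.0, 2: 4.0},  # organ masses (consistent units assumed)
)
print(doses)  # {1: 0.5, 2: 1.0}
```

This is exactly the "program minimum" above: only one accumulator per organ is needed, not one per voxel.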
3) USRBIN XYZ scoring. I don't think I need it at the moment, and if you
include it now in the special Fluka binary it will probably slow down normal
runs that don't use USRBIN (because Fluka will reserve this huge memory
block every time). I also need to figure out how USRBIN works. It's not
quite clear to me right now how you arrive at the 8-bytes-per-voxel memory
requirement for USRBIN XYZ scoring, given that 2 bytes per voxel are used in
the geometry description. Voxel indices go up to 1600 at most, so basically
each voxel index fits in 2 bytes; then all three indices together take 6
bytes. OK, I have to admit I need to study USRBIN more, so let's leave its
memory allocation for now. I don't think I really need to score in every
voxel over the whole body; that is quite unrealistic for now. Maybe just in
part of the body--like the region under radiation treatment.
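For orientation, the memory figures quoted in this thread (2 bytes/voxel of integer*2 geometry storage, plus 8 bytes/voxel for an R*8 USRBIN accumulator, or 4 for R*4) give a quick back-of-envelope estimate; a sketch:

```python
# Back-of-envelope memory estimate using the figures from this thread:
# 2 B/voxel (integer*2) for the voxel geometry, plus 8 B/voxel for an
# R*8 USRBIN accumulator (4 B/voxel if R*4 were used instead).

def memory_gb(n_voxels, geometry_bytes=2, scoring_bytes=8):
    """Rough total memory in GB for geometry plus scoring storage."""
    return n_voxels * (geometry_bytes + scoring_bytes) / 1e9

full = memory_gb(400_000_000)                   # geometry + R*8 USRBIN
geom = memory_gb(400_000_000, scoring_bytes=0)  # geometry alone
print(full, geom)  # 4.0 0.8
```

So a whole-body R*8 USRBIN at 1-mm resolution would roughly quintuple the memory footprint compared to the geometry alone, which is why restricting the binning to the treated region matters.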
4) Multi-group neutron cross sections for dosimetry calculations. There are
about 70 groups, if I remember correctly. So basically you are saying that I
shouldn't start source neutrons with energies belonging to the same group,
because Fluka will obviously treat them as having the same group energy. I
think we typically need about 20 source energies from thermal up to 20 MeV,
so really no problem here. However, I believe there must be differences
between the multi-group and continuous-energy treatments from the dosimetry
point of view. If you have any reference on this, would you mind sharing it?
Valery Taranenko, Ph.D.
Postdoc Research Associate, RPI, http://rrmdg.rpi.edu
From: Alfredo Ferrari [mailto:firstname.lastname@example.org]
Sent: Tuesday, April 10, 2007 10:50 AM
To: Valery Taranenko
Cc: email@example.com; firstname.lastname@example.org; email@example.com;
Subject: RE: max number of voxels
The answers from me and Paola are among your lines below...
On Thu, 5 Apr 2007, Valery Taranenko wrote:
> Dear Paola,
> Thank you for the exhaustive explanation.
> In our research group we have recently finished construction of a series
> of phantoms representing a pregnant female at different periods of gestation.
> The organs are represented by 3D polygonal meshes and we can voxelize them
> virtually at any resolution. The thing is, 1-mm voxels are quite common
> today, and although we can voxelize at lower resolution (thanks to the
> original organ meshes) we would like to stick with 1-mm for direct
> comparison with Penelope and EGSnrc codes. Standard MCNP/X can't deal with
> more than about 25 million voxels, but we are working on it.
> Can you set dynamic memory allocation for voxel input? This could be a
> solution (although I'm not sure how this fits in your development habits).
At present we cannot use dynamic memory allocation for two reasons:
a) we should move from g77
b) all FLUKA memory allocation is deeply interlinked and moving to
dynamic allocation would require a major reworking, not only
of the voxel part
> Strategically I would expect an increase in demand from people to run
> phantoms at higher resolution. Here dynamic memory allocation would
> (presumably) keep the overhead low for smaller phantoms while also allowing
> higher voxel counts.
> You know, today 2 GB of random-access memory is really common, so the
> physical memory (RAM) limit shouldn't be a problem. Presently we have 36
> organs (unique voxel IDs) and 20 unique materials, a couple of which could
> come with different densities. Ultimately we may need to go to 100 organs
> (this is very typical nowadays in the business of whole-body phantoms). I
> assume the limit on the number of materials is independent of the voxel
> input (it shouldn't be a problem).
Your assumption is correct. In principle, recompiling the code with, say,
1 GB of memory would solve your problem, unless you ask for a lot of
memory for scoring as well.
> Scoring. The program minimum is to calculate the average organ absorbed
> dose for each organ; that is, calculate the absorbed energy in all voxels
> (mass normalization can obviously be done outside of Fluka without problem).
> calculation would be of interest as well. The program maximum (for
> radiotherapy) is to calculate the spatial distribution of dose in each voxel
> (averaged over a voxel). Although I found a very clear explanation of how to
> input a voxel phantom, I didn't really find anything on how to score in it
> (when I searched for "voxel" in the manual PDF, all results related to
> input). I know I need to read the whole manual... but if you could comment
> on how to score (at least a simple average organ dose), that would be very
> helpful.
It is easy to use USRBIN XYZ scoring so that it exactly matches your
voxels (see the manual); you can even ask for lower or higher resolution.
However, while you need 2 bytes/voxel in the geometry, you would need
8 bytes per voxel in the scoring (unless you ask for an R*4 USRBIN
instead of an R*8 one, which is however dangerous if you have a lot
of small energy-deposition events), which makes the memory issue very
much different. Please let me know how much scoring you really need.
> Multi-group low-energy neutrons. As far as I know, people hesitate to use
> Fluka at low energies(?) in dosimetry applications for neutrons (only?) due
> to the multi-group approach at energies below 20(?) MeV. My task is very
> simple--I need to calculate dose conversion coefficients (basically, dose
> with appropriate normalization to the source fluence) for external
> neutrons from thermal energies up to 100 GeV. Obviously, following
> secondaries is important. Typically people split the energy range in two
> parts: a table-based region below 20 or maybe 150 MeV (like in MCNP/X), and
> the rest. Tell me, is it too bad to use Fluka at low neutron energies? What
> about protons (100 MeV--100 GeV)?
I don't see an issue with multi-group cross sections for dosimetry
calculations, particularly for absorbed-dose ones like those you plan.
They have been used for ages by many codes and they are perfectly OK. The
issue would be if you wanted a lot of (primary) energy points, more than
the number of groups. Protons are fine; we use models for them all the
time, from threshold on.
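The caveat above--that more source energy points than groups buys nothing, since two energies falling in the same group are transported identically--can be checked with a sketch. The group boundaries below are hypothetical (simply log-spaced over 72 groups); FLUKA's actual multi-group structure has its own edges:

```python
import bisect

# Hypothetical 72-group structure, log-spaced from 1e-5 eV to 20 MeV.
# FLUKA's real group boundaries differ; this only illustrates the check.
N_GROUPS = 72
LO, HI = 1e-5, 2e7  # eV
boundaries = [LO * (HI / LO) ** (i / N_GROUPS) for i in range(N_GROUPS + 1)]

def group_index(energy_ev):
    """Index of the energy group containing energy_ev."""
    return bisect.bisect_right(boundaries, energy_ev) - 1

# ~20 log-spaced source energies from thermal up to 20 MeV:
# verify that no two of them fall in the same group.
sources = [LO * (HI / LO) ** (i / 19) for i in range(20)]
groups = [group_index(e) for e in sources]
all_distinct = len(set(groups)) == len(groups)
print(all_distinct)  # True
```

With only ~20 source points spread over ~70 groups, each point lands in its own group, which matches the "really no problem here" assessment earlier in the thread.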
> I'm aware of the extensive work done by Maurizio Pelliccioni in radiation
> protection. Dr. Pelliccioni is like an apostle of Fluka phantom
> calculations, with many papers, which I'm going to look at in the meantime.
> As a quick solution, can you recompile the code with a 400-million-voxel
> capacity?
In principle yes, but please look at my comment about scoring above.
> Please excuse the large number of my questions. I've just embarked on
> Fluka and expect to have a great time with it!
> Thank you,
> Valery Taranenko, Ph.D.
> Postdoc Research Associate, RPI, http://rrmdg.rpi.edu
> Troy, NY
> -----Original Message-----
> From: firstname.lastname@example.org
> [mailto:email@example.com] On Behalf Of paola sala
> Sent: Thursday, April 05, 2007 4:19 AM
> To: Valery Taranenko
> Cc: firstname.lastname@example.org; email@example.com
> Subject: Re: max number of voxels
> there is no fixed limit for the number of voxels, BUT
> there is a limit given by memory:
> FLUKA stores all initialization and scoring data in an
> indexed blank common. At present this common is set to a
> dimension of 200MB. Since each voxel needs an integer*2 storage,
> if all the memory could be used for the voxels (that is never true), the
> max number of voxels would be 100 million.
> In practice, the max. number will depend on the problem : n. of
> materials, binning structures etc. The program stops with an error
> message if the memory limit is exceeded.
> In case you really need 400 million voxels, the program should be
> recompiled with a much larger memory limit. This could lead to a
> severe loss of computing efficiency if the computer does not have enough
> physical memory... let me know
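Paola's 100-million figure follows directly from the numbers she gives; a one-line check, assuming the decimal 200 MB she quotes:

```python
# 200 MB blank common, integer*2 (2 bytes) per voxel: upper bound on the
# voxel count if (unrealistically) all the memory went to voxel storage.
blank_common_bytes = 200 * 10**6
bytes_per_voxel = 2
max_voxels = blank_common_bytes // bytes_per_voxel
print(max_voxels)  # 100000000, i.e. 100 million
```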
> On Wed, 2007-04-04 at 18:01 -0400, Valery Taranenko wrote:
>> I would like to input 400 million voxel phantom into Fluka (these are
>> 1-mm voxels).
>> I could not find in the manual any limitations associated with the
>> maximum voxel number.
>> The number of organs is below 100.
>> Thank you.
>> Valery Taranenko, Ph.D.
>> Postdoc Research Associate, RPI, http://rrmdg.rpi.edu
>> Troy, NY
--
+-----------------------------------------------------------+
| Alfredo Ferrari   || Tel.: +41.22.767.6119                 |
| CERN-AB           || Fax.: +41.22.767.7555                 |
| 1211 Geneva 23    || e-mail: Alfredo.Ferrari@cern.ch       |
| Switzerland       ||         Alfredo.Ferrari@mi.infn.it    |
+-----------------------------------------------------------+
This archive was generated by hypermail 2.1.6 : Wed Apr 11 2007 - 00:13:52 CEST