RE: max number of voxels

From: Alfredo Ferrari (alfredo.ferrari@cern.ch)
Date: Tue Apr 10 2007 - 16:50:08 CEST

  • Next message: Valery Taranenko: "RE: max number of voxels"

    Hi Valery

    the answers from Paola and me are interspersed among your lines below...

    On Thu, 5 Apr 2007, Valery Taranenko wrote:

    > Dear Paola,
    >
    > Thank you for the exhaustive explanation.
    >
    > In our research group we have recently finished constructing a series of
    > phantoms representing a pregnant female at different periods of gestation.
    > The organs are represented by 3D polygonal meshes and we can voxelize them
    > at virtually any resolution. The thing is, 1-mm voxels are quite common
    > today, and although we can voxelize at a lower resolution (thanks to the
    > original organ meshes) we would like to stick with 1 mm for direct
    > comparison with the Penelope and EGSnrc codes. Standard MCNP/X can't deal
    > with more than about 25 million voxels, but we are working on it.
    >
    > Can you set up dynamic memory allocation for the voxel input? This could be
    > the best solution (although I'm not sure how this fits in with your
    > development habits).

    At present we cannot use dynamic memory allocation, for two reasons:
    a) we would first have to move away from g77
    b) all FLUKA memory allocation is deeply interlinked, and moving to
        dynamic allocation would require a major reworking, not only
        of the voxel part

    > Strategically, I would expect an increase in demand from people to run voxel
    > phantoms at higher resolution. Here dynamic memory allocation would
    > presumably keep the overhead low for smaller phantoms while also allowing
    > higher voxel counts.
    >
    > You know, today 2 GB of random-access memory is really common, so the
    > physical memory (RAM) limit shouldn't be a problem. Presently we have 36
    > organs (unique voxel IDs) and 20 unique materials, a couple of which may
    > come with different densities. Ultimately we may need to go to 100 organs
    > (this is very typical nowadays in this business of whole-body phantoms). I
    > assume the limit on the number of materials is independent of the voxel
    > input (it should be).

    Your assumption is correct. In principle, recompiling the code with, say,
    1 GB of RAM would solve your problem, unless you ask for a lot of
    memory for scoring as well.
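
    (As a rough estimate, using the figure of 2 bytes per voxel mentioned
    below: 400 million voxels already take about 800 MB for the geometry
    alone, so even a 1 GB blank common would leave little headroom for
    anything else.)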

    >
    > Scoring. The minimum program is to calculate the average absorbed dose in
    > each organ; that is, calculate the absorbed energy in all voxels (mass
    > normalization can obviously be done outside of Fluka without problem). A
    > kerma calculation would be of interest as well. The maximum program (for
    > radiotherapy) is to calculate the spatial distribution of dose in each voxel
    > (averaged over a voxel). Although I found a very clear explanation of how to
    > input a voxel phantom, I didn't really find anything on how to score within
    > it (when I searched for "voxel" in the manual PDF, all results relate to
    > geometry input). I know I need to read the whole manual... but if you could
    > comment on how to score (at least a simple average organ dose) this would be
    > very helpful.
    >

    It is easy to set up a USRBIN XYZ scoring so that it exactly matches your
    voxels (see the manual); you can even request a coarser or finer resolution.
    However, while you need 2 bytes per voxel in the geometry, you would need
    8 bytes per voxel in the scoring (unless you ask for an R*4 USRBIN
    instead of an R*8 one, which however is dangerous if you have a lot
    of small energy deposition events), which makes the memory issue very
    much different. Please let me know how much scoring you really need.
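
    Just as an illustration (the mesh limits and bin numbers below are
    placeholders to be replaced with your actual phantom extent; please check
    the USRBIN entry in the manual for the exact meaning of each field), a
    Cartesian energy-deposition binning matching a voxel mesh would look
    roughly like:

      USRBIN        10.0    ENERGY     -21.0      Xmax      Ymax      Zmax VoxEdep
      USRBIN        Xmin      Ymin      Zmin        NX        NY        NZ &

    with NX, NY, NZ equal to the numbers of voxels along each axis and the
    (unformatted) output written to unit 21. Keep in mind the arithmetic above:
    400 million bins scored in double precision (8 bytes each) take about
    3.2 GB, or about 1.6 GB with an R*4 USRBIN.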

    > Multi-group low energy neutrons. As far as I know people hesitate to use
    > Fluka at low energies(?) in dosimetry applications for neutrons (only?) due
    > to the multi-group approach at energies below 20(?) MeV. My task is very
    > simple--I need to calculate dose conversion coefficients (basically absorbed
    > dose with appropriate normalization to the source fluence) for external
    > neutrons from thermal energies up to 100 GeV. Obviously following
    > secondaries is important. Typically people split the energy range into two
    > parts: a table-based region--below maybe 20 or 150 MeV (like in MCNP/X)--and
    > the rest. Tell me, is it too bad to use Fluka at low neutron energies? What
    > about protons (100 MeV--100 GeV)?
    >

    I don't see any issue with multi-group cross sections for dosimetry
    calculations, particularly for absorbed dose ones like those you plan.
    They have been used for ages by many codes and they are perfectly ok; the
    issue would arise only if you wanted more (primary) energy points than
    there are groups. Protons are fine: we use models for them all the time,
    from threshold on.

    > I'm aware of the extensive work done by Maurizio Pelliccioni in radiation
    > protection. Dr. Pelliccioni is like an apostle of all Fluka phantom
    > calculations, with many papers which I'm going to look at in the meantime.
    >
    > As a quick solution, can you recompile the code with a 400-million-voxel
    > array size?
    >

    In principle yes, but please look at my comment about scoring above.

    > Please excuse the large number of my questions. I've just embarked on
    > Fluka and expect to have a great time with it!
    >
    > Thank you,
    > Valery
    >
    >
    > Valery Taranenko, Ph.D.
    > Postdoc Research Associate, RPI, http://rrmdg.rpi.edu
    > Troy, NY
    >
    >
    >
    > -----Original Message-----
    > From: owner-fluka-discuss@fisica.unimi.it
    > [mailto:owner-fluka-discuss@fisica.unimi.it] On Behalf Of paola sala
    > Sent: Thursday, April 05, 2007 4:19 AM
    > To: Valery Taranenko
    > Cc: fluka-discuss@fluka.org; alfredo.ferrari@cern.ch
    > Subject: Re: max number of voxels
    >
    > Hi
    >
    > there is no fixed limit on the number of voxels, BUT
    > there is a limit given by memory:
    > FLUKA stores all initialization and scoring data in an
    > indexed blank common. At present this common is set to a
    > dimension of 200 MB. Since each voxel needs an integer*2 (2 bytes) of
    > storage, if all the memory could be used for the voxels (that is never
    > true) the max number of voxels would be 100 million.
    >
    > In practice, the max number will depend on the problem: number of
    > materials, binning structures, etc. The program stops with an error
    > message if the memory limit is exceeded.
    >
    > In case you really need 400 million voxels, the program should be
    > re-compiled with a much larger memory limit. This could lead to a
    > severe loss of computing efficiency if the computer does not have enough
    > physical memory... let me know.
    > Paola
    >
    > On Wed, 2007-04-04 at 18:01 -0400, Valery Taranenko wrote:
    >> Hi,
    >>
    >> I would like to input a 400 million voxel phantom into Fluka (these are
    >> 1-mm voxels).
    >>
    >> I could not find in the manual any limitations associated with the
    >> maximum voxel number.
    >>
    >> The number of organs is below 100.
    >>
    >> Thank you.
    >>
    >> Regards,
    >>
    >> Valery Taranenko, Ph.D.
    >>
    >> Postdoc Research Associate, RPI, http://rrmdg.rpi.edu
    >>
    >> Troy, NY

    -- 
    +----------------------------------------------------------------------------+
    |  Alfredo Ferrari                ||  Tel.: +41.22.767.6119                  |
    |  CERN-AB                        ||  Fax.: +41.22.767.7555                  |
    |  1211 Geneva 23                 ||  e-mail: Alfredo.Ferrari@cern.ch        |
    |  Switzerland                    ||          Alfredo.Ferrari@mi.infn.it     |
    +----------------------------------------------------------------------------+
    
