Hello,
I am trying to generate a random pebble distribution with Serpent 2, and the meaning of parameters such as the particle shake factor or the growth rate factor, and how they affect the calculation, is not quite clear to me.
Could you suggest some reasonable values for these parameters, and perhaps a reference to the specific algorithms? I couldn't find anything in the manual or on the forum.
Furthermore, which kind of algorithm is used if you don't choose the grow-and-shake algorithm?
Re: generate random particle or pebble bed files for HTGR calc
The original algorithm used by SERPENT to generate random particle dispersions was based on a simple location sampling and rejection routine. Points were sampled at random within the geometry and checked to ensure that a particle placed at the point would neither clip the boundaries (i.e., the center had to be at least one particle radius from every boundary) nor overlap any previously placed particle. If the sampled point satisfied both requirements, a particle was placed there; otherwise the point was rejected and no particle was placed. The routine then continued with a new randomly sampled point, and the process was repeated until enough particles had been placed to reach the desired packing fraction for the given particle size and geometry volume.
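To make the description concrete, here is a minimal sketch of such a sampling-and-rejection loop for equal spheres in a box-shaped container. The function names and the box geometry are illustrative choices for this example, not the original SERPENT source:

```c
#include <stdlib.h>

/* Minimal sketch of the sampling-and-rejection approach for spheres of
   radius r inside an axis-aligned box [0,L]^3.  Centers are stored as
   x[3*i], x[3*i+1], x[3*i+2].  Returns the number of spheres actually
   placed, which may be less than n if the packing jams prematurely. */
static double frand(void) { return (double)rand() / RAND_MAX; }

long place_spheres(double *x, long n, double r, double L, long max_tries)
{
  long placed = 0;

  for (long t = 0; t < max_tries && placed < n; t++)
  {
    /* Sample a point at least one radius away from every boundary */
    double p[3];
    for (int k = 0; k < 3; k++)
      p[k] = r + (L - 2.0*r)*frand();

    /* Reject the point if it would overlap any previously placed sphere */
    int ok = 1;
    for (long i = 0; i < placed && ok; i++)
    {
      double dx = p[0] - x[3*i];
      double dy = p[1] - x[3*i + 1];
      double dz = p[2] - x[3*i + 2];
      if (dx*dx + dy*dy + dz*dz < 4.0*r*r)  /* centers closer than 2r */
        ok = 0;
    }

    if (ok)
    {
      x[3*placed]     = p[0];
      x[3*placed + 1] = p[1];
      x[3*placed + 2] = p[2];
      placed++;
    }
  }

  return placed;
}
```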
Unfortunately, this naïve approach leads to sub-optimal packing, because particles have no way of moving into a more favorable arrangement. This is illustrated in the picture below: in the optimal case four particles fit, but when the first two particles are placed poorly, no room is left to place additional particles in a volume that could have held four. As a result, SERPENT could only generate random dispersion realizations with packing fractions below about 35% before the packing jammed artificially early, which is a problem since some fuel designs call for packing fractions of 40% or more. The theoretical limit for random close-packed spheres is around 64%, and although the current manufacturing limit for most fuels is well below that, it is preferable to have an algorithm that approaches the theoretical limit, if only for academic purposes.
The challenge of obtaining the largest packing fraction of randomly dispersed spheres is not new, and several algorithms have been developed to achieve this goal. One such algorithm from Tobochnik and Chapin was selected as the basis for the revised random dispersion subroutine because it is easy to implement and still reaches the theoretical maximum for random close packing of spheres, roughly 64% [1]. As implemented in SERPENT 2, the algorithm starts by sampling random points within the geometry, accepting a point as long as it lies inside the geometry. Zero-sized particles, whose number equals the number of full-sized particles needed to reach the desired packing fraction, are placed at these points, and the routine then iterates, randomly shifting the particles while gradually allowing them to grow. In each iteration every particle, one at a time, attempts to grow by a user-specified amount (the growth factor) and to move in a random direction over a random distance whose maximum magnitude (the shake factor) is also user-specified. If the growth would cause particle overlap, boundary clipping, or exceed the target particle size, the particle keeps its current size until the next iteration. If the random move would produce overlap or clipping, the particle keeps its current position until the next iteration. This gradual growing and shifting continues until all particles have reached the target size. This was implemented in SERPENT 2 and was found to achieve at least a 60% packing fraction, as shown in the picture below.
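A condensed sketch of one iteration of this grow-and-shake loop is shown below. It reuses the flat coordinate array from the previous sketch, assumes both factors scale with the target radius, and delegates the overlap/boundary test to a hypothetical fits() helper, so it should be read as an outline of the idea rather than the SERPENT 2 implementation:

```c
#include <math.h>
#include <stdlib.h>

static const double PI = 3.14159265358979323846;

static double frand(void) { return (double)rand() / RAND_MAX; }

/* One iteration of the grow-and-shake loop over all n particles.
   x holds the centers as x[3*i + k], r the current radii, r_t the
   target radius, f_grow and f_shake the user-given growth and shake
   factors (assumed here to be fractions of the target radius).
   fits() is assumed to return 1 if a sphere of radius ri centered at p
   stays inside the geometry and does not overlap any particle other
   than particle i.  Returns the number of particles at target size. */
long grow_and_shake(double *x, double *r, long n, double r_t,
                    double f_grow, double f_shake,
                    int (*fits)(const double *p, double ri, long i,
                                const double *x, const double *r, long n))
{
  long done = 0;

  for (long i = 0; i < n; i++)
  {
    double *p = &x[3*i];

    /* 1. Try to grow, never exceeding the target size */
    double r_new = r[i] + f_grow*r_t;
    if (r_new > r_t)
      r_new = r_t;
    if (fits(p, r_new, i, x, r, n))
      r[i] = r_new;

    /* 2. Try a random shake: uniform random direction, random distance
          of at most f_shake times the target radius */
    double u   = 2.0*frand() - 1.0;       /* cos(polar angle) in [-1,1] */
    double phi = 2.0*PI*frand();
    double s   = sqrt(1.0 - u*u);
    double d   = f_shake*r_t*frand();
    double q[3] = { p[0] + d*s*cos(phi),
                    p[1] + d*s*sin(phi),
                    p[2] + d*u };
    if (fits(q, r[i], i, x, r, n))
    {
      p[0] = q[0];
      p[1] = q[1];
      p[2] = q[2];
    }

    if (r[i] >= r_t)
      done++;
  }

  return done;
}
```

The caller would simply repeat grow_and_shake() until the return value equals the number of particles, i.e. until every particle has reached its target size.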
It should be noted that there are other algorithms that converge faster (e.g., algorithms that give the particles momentum and let them bounce off each other and off surfaces, rather than shaking them randomly), but I was simply seeking an easy-to-implement algorithm that did the job.
The growth and shake factors are specified as fractions of the particle radius. The optimal settings for these two are the ones that produce the desired realization in the shortest amount of time. Though I'm not entirely familiar with the theory, I believe/assume that the optimal settings are a function of the container geometry, particle geometry, and packing fraction, so I use trial and error. For my models I found that a growth factor of 0.05 and a shake factor of 0.1 were sufficient to reach a 60% packing fraction.
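To put those numbers in perspective, here is a quick back-of-the-envelope check (illustrative code only; that both factors scale with the target radius is my assumption):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
  /* With the values quoted above, each accepted growth step adds 5% of
     the target radius, so a particle needs at least 1/0.05 = 20 accepted
     growth steps to reach full size; rejected grow and shake attempts
     only add to the total iteration count. */
  double f_grow = 0.05, f_shake = 0.10;

  printf("minimum iterations to reach full size: %.0f\n", ceil(1.0/f_grow));
  printf("maximum displacement per shake: %.2f target radii\n", f_shake);

  return 0;
}
```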
Let me know if you have any additional questions, comments, or concerns.
Cole
1. J. Tobochnik and P.M. Chapin “Monte Carlo Simulation of Hard Spheres Near Random Closest Packing Using Spherical Boundary Conditions”, Journal of Chemical Physics, 88, 5824 (1988)
Re: generate random particle or pebble bed files for HTGR calc
I was wondering if there are any plans to extend the current capabilities of this method to allow generating random pebble distributions in cylinders with a conical section.
Thanks.
Re: generate random particle or pebble bed files for HTGR calc
I am seconding this request. I would use it for modeling a PB-FHR. We are using LAMMPS right now, and it doesn't fit very well into our workflow.
Re: generate random particle or pebble bed files for HTGR calc
The random sampling is done in the subroutine disperse.c. Implementing a new geometry shape basically requires rejecting sampled positions that fall outside the shape or intersect its boundary. The existing options are handled by if-else structures, so it should be relatively easy to add new shapes.
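For illustration, a fit/reject test for a cylinder with a converging cone at the bottom could look roughly like the sketch below. The function name, argument list, and geometry convention are hypothetical choices for this example and not what disperse.c actually uses:

```c
#include <math.h>

/* Hypothetical rejection test in the spirit of the if-else checks
   described above: does a pebble of radius r centered at (x,y,z) fit
   inside a vertical cylinder of radius R spanning 0 <= z <= H, with a
   converging cone below it whose apex lies on the axis at z = -Hc?
   Returns 1 if the pebble fits, 0 if the position should be rejected. */
int fits_cyl_cone(double x, double y, double z, double r,
                  double R, double H, double Hc)
{
  double s = sqrt(x*x + y*y);   /* radial distance from the axis */
  double a = atan2(R, Hc);      /* cone half-angle               */

  /* Top lid of the cylinder */
  if (z + r > H)
    return 0;

  if (z - r >= 0.0)
  {
    /* Pebble entirely in the cylindrical section: only the wall matters */
    return (s + r <= R);
  }
  else
  {
    /* Pebble reaches into the conical section: require clearance from
       both the cylinder wall and the cone's lateral surface.  The cone
       clearance is the perpendicular distance from the center to the
       lateral surface, a slight simplification near the apex and the
       cone/cylinder junction. */
    double d_cone = (z + Hc)*sin(a) - s*cos(a);
    return (s + r <= R) && (d_cone >= r);
  }
}
```

A check like this would simply be added as another branch of the existing if-else structure.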
- Jaakko