Hello,
I'm new to EDEM simulation and would be very happy to get some advice.

My project is to model a planetary ball mill, but first I just want to see how metallic powder is mixed in the vial. The powder is spherical with a very narrow particle size distribution around 50 micrometers. There are several grams of powder in the vial, which gives a number of particles on the order of millions. The vial itself is a cylinder several centimeters in diameter, and it rotates at about 400 rpm, as in the model I'm attaching. I don't want to introduce any breakage criteria; I just want to see how the powder behaves in the vial.

I have read a lot about EDEM, and it seems that with a GPU it should be possible to model millions of particles in a reasonable time. For this project we just bought a reasonably good PC, precisely speaking: an Asus PRO WS WRX80E-SAGE SE WIFI board, a Threadripper PRO 5965WX (24 cores), 256 GB DDR4 RAM, 3x Nvidia RTX 3090 24 GB, and 14 TB of 7 GB/s NVMe SSD. Not much more I can imagine; the PC is worth >10k EUR.

So I thought that with 3 GPUs I should be able to proceed reasonably fast with the calculations, but it seems that's not the case. If I put in large balls (millimeters), in the tens, hundreds, or thousands, my 24-core CPU is much, much faster than the 3 GPUs, but still not very fast; the GPU is just slow. When I try to model millions of particles, the GPU is really slow, and the CPU sometimes doesn't even start: I get a solver error. The automatic calculation of the integration step gives a 1e-11 s step, which is crazy, while 1e-6 s results in balls "escaping" from the vial.

Can anyone please help me configure the simulation so that I can get some reasonable results for the powder being mixed? I assume that all the tutorials one can find were not made on such a powerful PC, so I would be really happy to find out what I'm doing wrong. I'm attaching the files. Thank you!
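For context on the timestep question, here is how I understand the automatic step: DEM solvers typically base it on the Rayleigh time, which scales with particle radius. A rough estimate for my 50 µm powder, assuming steel-like material properties (density, shear modulus, and Poisson's ratio are my guesses, not values from my actual setup):

```python
import math

def rayleigh_timestep(radius, density, shear_modulus, poisson):
    """Rayleigh critical timestep for a sphere (common DEM estimate)."""
    return (math.pi * radius * math.sqrt(density / shear_modulus)
            / (0.1631 * poisson + 0.8766))

# Assumed steel-like powder properties (placeholders, not from my files):
dt = rayleigh_timestep(radius=25e-6,        # 50 um diameter -> 25 um radius
                       density=7800.0,      # kg/m^3
                       shear_modulus=80e9,  # Pa
                       poisson=0.3)
print(f"Rayleigh timestep: {dt:.2e} s")
# A stable fixed step is usually taken as 20-40% of this value, so on the
# order of 1e-8 s for 50 um steel spheres: far smaller than 1e-6 s (which
# would explain the escaping balls), but far larger than 1e-11 s.
```

If this estimate is right, then 1e-6 s is simply too large for this powder, and the 1e-11 s the solver suggests might point to an unusually stiff material setting or a smaller minimum radius somewhere in my setup, but I'm not sure.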