Replies: 1 comment 1 reply
As the error message says, your GPU does not have enough memory. You can reduce your system size or use MPI to spread the simulation over multiple GPUs.
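For example, a minimal sketch of such a launch, assuming LAMMPS was built with the GPU package; the executable name (lmp), input file (in.lammps), MPI rank count, and GPU count below are placeholders for your own setup:

mpirun -np 4 lmp -sf gpu -pk gpu 2 -in in.lammps

Here -np 4 starts four MPI ranks, -sf gpu switches supported styles to their /gpu variants, and -pk gpu 2 tells LAMMPS to use two GPUs per node, so each GPU only has to hold part of the domain-decomposed system.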
When I try to run a simulation using LAMMPS, I always get the error "CUDA_ERROR_OUT_OF_MEMORY: out of memory". I hope you can help me fix this problem. Any discussion is welcome. Thank you very much.