I have some code that I have tried running on both the CPU and the GPU. A typical log probability value from the CPU run is
-66.78053371
while the GPU run gives numbers like
-98.88531779
I know the CPU uses double precision while the GPU may not, and I could understand the GPU failing at some point due to out-of-range problems, but how can the same code produce such different numbers while it runs?
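
To make the precision concern concrete, here is a minimal sketch (plain NumPy, not my actual model) of how I have been gauging whether single versus double precision alone could explain a gap like this; the array `logp64` is just a hypothetical stand-in for the per-sample log probabilities my model produces, and the naive loop mimics a worst-case float32 accumulation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the per-sample log probabilities the model produces.
logp64 = rng.uniform(-1.0, 0.0, size=200_000)
logp32 = logp64.astype(np.float32)

# Double-precision sum (roughly what the CPU run does).
total64 = logp64.sum()

# Naive single-precision accumulation (a worst case for a kernel that
# keeps everything in float32 and sums sequentially).
acc = np.float32(0.0)
for x in logp32:
    acc += x

print(f"float64 sum: {total64:.8f}")
print(f"float32 sum: {float(acc):.8f}")
print(f"difference : {total64 - float(acc):.8f}")
```

The drift this produces is small relative to the total, nothing like the -66 versus -98 gap above, which is why I suspect something other than precision is going on.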