Description
When I started working on loltorch, I used benchmark/threadedtrain.lua as an example of how to use the torch threads library for training. Unfortunately, after much trial and error, I realized that the threads library provides no concurrency guarantees on its own. In loltorch's train.lua:256-321 I addressed the issue by using concurrency primitives to ensure the proper order of execution.
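For illustration, here is a minimal sketch (not the actual loltorch code) of guarding a shared update with the `threads.Mutex` primitive that the threads package exposes; the pool size and job count are arbitrary, and `threads.Condition` can be used in the same id-passing style when a specific ordering must be enforced rather than just mutual exclusion:

```lua
local threads = require 'threads'

-- Create the mutex on the main thread and pass its id to the workers;
-- each worker rebuilds a handle to the same underlying mutex from the id.
local mutex = threads.Mutex()
local mutexId = mutex:id()

local pool = threads.Threads(4, function()
   local threads = require 'threads'
   workerMutex = threads.Mutex(mutexId)  -- handle to the shared mutex
end)

for i = 1, 8 do
   pool:addjob(function()
      workerMutex:lock()
      -- critical section: e.g. apply a parameter update that must not
      -- interleave with updates from other workers
      workerMutex:unlock()
      return __threadid
   end)
end

pool:synchronize()
pool:terminate()
mutex:free()
```

Without the lock around the critical section, nothing stops two workers from interleaving their updates, which is the kind of silent hazard described above.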
It seems like a disclaimer in benchmark/README.md might be useful, or alternatively the issues in the benchmark/threadedtrain.lua file could be fixed. As it stands, others might run into similar issues without realizing it, since the first line of the README is: "This is a benchmark of the threads package, as well as a good real use-case example, using Torch neural network packages."
I am not sure how many people would use torch threads for training; in the open source projects I looked at, I mostly saw it used for data loading. I did training on my MacBook Pro, where the GPU was slower than the CPU (despite being a dedicated GPU), hence my need for the threading library.