(You may want to see the already-built [releases](https://github.com/edumucelli/build-torch/releases) before building it yourself.)

This project builds PyTorch from source using a Dockerfile.
The idea is to simplify the build so that one can choose the CUDA/CuDNN
versions that best fit one's environment.

As PyTorch pre-built binaries require specific CUDA/CuDNN versions (see, e.g.,
[#10971](https://github.com/pytorch/pytorch/issues/10971)), you can't have a pre-built
PyTorch with CUDA 9.1 after the 0.4.0 release. For instance, if you are using
Debian Stretch with the Nvidia distribution packages, you won't be able to
use newer PyTorch releases, as they are built against CUDA 9.2.

The default configuration of the Dockerfile will build:

* Python 3.5
* PyTorch 1.3
* CUDA 9.1.85
* CuDNN 7.1.3

As it downloads the packages straight from Nvidia, you can build with whichever CUDA
version you prefer; the packages come directly from [here](https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64).
The following versions are available: just replace the `CUDA_VERSION`
variable with one of them (see the sketch after the list):

* 8.0.44
* 8.0.61
* 9.0.176
* 9.1.85
* 9.2.88
* 9.2.148
* 10.0.130
* 10.1.105
* 10.1.168
* 10.1.243
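
For example, to target CUDA 10.0 instead of the default 9.1.85, point the variable
at the corresponding value before building. The snippet below is only a sketch: it
assumes the Dockerfile declares the version as `CUDA_VERSION=9.1.85` and that GNU
`sed` is available; if the declaration looks different, just edit it by hand.

```
# Sketch: swap the default CUDA version for another entry from the list above.
# Assumes the Dockerfile contains a literal CUDA_VERSION=9.1.85 assignment.
sed -i 's/CUDA_VERSION=9.1.85/CUDA_VERSION=10.0.130/' Dockerfile
```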

Likewise, the CuDNN version can be one of the following; just replace the `CUDNN_PKG_VERSION`
variable with the one you prefer (see the sketch after the list). The packages come directly from [here](https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64).

* 7.0.1.13-1+cuda8.0
* 7.0.2.38-1+cuda8.0
* 7.0.3.11-1+cuda8.0
* 7.0.3.11-1+cuda9.0
* 7.0.4.31-1+cuda8.0
* 7.0.4.31-1+cuda9.0
* 7.0.5.15-1+cuda8.0
* 7.0.5.15-1+cuda9.0
* 7.0.5.15-1+cuda9.1
* 7.1.1.5-1+cuda8.0
* 7.1.1.5-1+cuda9.0
* 7.1.1.5-1+cuda9.1
* 7.1.2.21-1+cuda8.0
* 7.1.2.21-1+cuda9.0
* 7.1.2.21-1+cuda9.1
* 7.1.3.16-1+cuda8.0
* 7.1.3.16-1+cuda9.0
* 7.1.3.16-1+cuda9.1
* 7.1.4.18-1+cuda8.0
* 7.1.4.18-1+cuda9.0
* 7.1.4.18-1+cuda9.2
* 7.2.1.38-1+cuda8.0
* 7.2.1.38-1+cuda9.0
* 7.2.1.38-1+cuda9.2
* 7.3.0.29-1+cuda9.0
* 7.3.0.29-1+cuda10.0
* 7.3.1.20-1+cuda9.0
* 7.3.1.20-1+cuda9.2
* 7.3.1.20-1+cuda10.0
* 7.4.1.5-1+cuda9.0
* 7.4.1.5-1+cuda9.2
* 7.4.1.5-1+cuda10.0
* 7.4.2.24-1+cuda9.0
* 7.4.2.24-1+cuda9.2
* 7.4.2.24-1+cuda10.0
* 7.5.0.56-1+cuda9.0
* 7.5.0.56-1+cuda9.2
* 7.5.0.56-1+cuda10.0
* 7.5.0.56-1+cuda10.1
* 7.5.1.10-1+cuda9.0
* 7.5.1.10-1+cuda9.2
* 7.5.1.10-1+cuda10.0
* 7.5.1.10-1+cuda10.1
* 7.6.0.64-1+cuda9.0
* 7.6.0.64-1+cuda9.2
* 7.6.0.64-1+cuda10.0
* 7.6.0.64-1+cuda10.1
* 7.6.1.34-1+cuda9.0
* 7.6.1.34-1+cuda9.2
* 7.6.1.34-1+cuda10.0
* 7.6.1.34-1+cuda10.1
* 7.6.2.24-1+cuda9.0
* 7.6.2.24-1+cuda9.2
* 7.6.2.24-1+cuda10.0
* 7.6.2.24-1+cuda10.1
* 7.6.3.30-1+cuda9.0
* 7.6.3.30-1+cuda9.2
* 7.6.3.30-1+cuda10.0
* 7.6.3.30-1+cuda10.1
* 7.6.4.38-1+cuda9.0
* 7.6.4.38-1+cuda9.2
* 7.6.4.38-1+cuda10.0
* 7.6.4.38-1+cuda10.1
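
Note that the `+cudaX.Y` suffix of the CuDNN package should match the CUDA version
chosen above. For the CUDA 10.0 example shown earlier, a matching choice would be
(again a sketch, assuming the default `7.1.3.16-1+cuda9.1` value is still in the
Dockerfile and GNU `sed` is available):

```
# Sketch: pick a CuDNN build whose +cuda suffix matches the chosen CUDA version.
sed -i 's/CUDNN_PKG_VERSION=7.1.3.16-1+cuda9.1/CUDNN_PKG_VERSION=7.6.4.38-1+cuda10.0/' Dockerfile
```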

One thing to note is that some of the package names encode the CUDA version
you are building for; e.g., the packages installed via

```
apt-get install ...
    cuda-command-line-tools-9-1 \
    cuda-cublas-dev-9-1 \
    cuda-cudart-dev-9-1 \
    cuda-cufft-dev-9-1 \
    cuda-curand-dev-9-1 \
    cuda-cusolver-dev-9-1 \
    cuda-cusparse-dev-9-1 \
```

have the `9-1` suffix in their names, so if you are building for CUDA `10.1`
you should replace the `9-1` everywhere with `10-1` (see the sketch below).
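
A quick way to do that bulk replacement is a one-liner over the Dockerfile (a sketch;
it assumes GNU `sed`, and it is worth reviewing the resulting diff before building):

```
# Sketch: rename the versioned apt package suffixes from CUDA 9.1 to CUDA 10.1.
sed -i 's/-9-1/-10-1/g' Dockerfile
```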
## Building

Just run `(sudo) docker build -t build-torch .` from the directory where
the Dockerfile is stored. The PyTorch wheel will end up in the
`/pytorch/dist` directory of the built image.
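
For example (a sketch; the second command simply lists the wheel and assumes the
image does not define an entrypoint that would get in the way):

```
sudo docker build -t build-torch .
# The exact wheel filename depends on the Python/PyTorch/CUDA versions you built.
sudo docker run --rm build-torch ls /pytorch/dist
```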
## Getting the wheel out of the container

There are [several ways](https://stackoverflow.com/questions/22049212/copying-files-from-docker-container-to-host) to copy
data from a container to the host machine. Here is a suggestion:
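
One option is to create a throwaway container from the image and use `docker cp`
(a sketch; the container name is arbitrary and the copied filenames depend on the
versions you built):

```
# Create a stopped container from the built image, copy the wheel directory out,
# then clean the container up. 'true' is just a placeholder command.
sudo docker create --name torch-build build-torch true
sudo docker cp torch-build:/pytorch/dist ./dist
sudo docker rm torch-build
```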