
Commit fede876

Release 0.1.6 (#137)
* Modified parameter order of DecoderRNN.forward (#85)
* Updated TopKDecoder; fixed topk decoder (#86)
* Use torchtext from PyPI; fixed torchtext sorting order (#87)
* Attention is not required when only using teacher forcing in the decoder (#90)
* Updated docs and version; fixed code style
* Bugfix: fixed field arguments validation (#92)
* Removed `initial_lr` when resuming optimizer with scheduler (#95)
* Shuffle the training data (#97)
* 0.1.5 (#91): release rollup of the changes above (#85, #86, #87, #90) plus the docs, code-style, and training-data shuffling fixes
* Fix example of inflate function in TopKDecoder.py (#98)
* Fix hidden_layer size for one-directional decoder (#99): the decoder's hidden layer size was `hidden_size * 2 if bidirectional else 1`, causing a dimensionality error for non-bidirectional decoders; changed `1` to `hidden_size`
* Adapt load to allow CPU loading of GPU models (#100): added a storage parameter to torch.load so models trained on a GPU can be loaded on a CPU, depending on CUDA availability
* Fix wrong parameter use on DecoderRNN (#103)
* Upgrade to pytorch-0.3.0 (#111); use pytorch 0.3.0 in the Travis env
* Make sure the tensor is contiguous when attention is not used (#112)
* Implement the predict_n method: using the beam search outputs, it returns several sequences for a given input sequence (#116); adds a predictor method returning n predicted sequences for a src_seq input, intended to be used with beam search via TopKDecoder
* Checkpoint after batches, not epochs (#119)
* Pytorch 0.4 (#134): add a contiguous call to the tensor (#127), since with attention turned off pytorch 0.4 raises an error when view is called on a non-contiguous tensor; fixed shape documentation (#131); update to pytorch-0.4; remove the manual pytorch install in Travis
* Allow using pre-trained embedding (#135)
* Updated docs
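The contiguity fixes (#112, #127) reflect a general PyTorch constraint rather than anything specific to this repo: `Tensor.view` requires contiguous memory, so a transposed (or otherwise strided) tensor must be copied first. A minimal sketch of the pattern, not the project's actual code:

```python
import torch

x = torch.arange(12).reshape(3, 4)
t = x.t()                          # transpose shares storage -> non-contiguous
assert not t.is_contiguous()

# t.view(-1) would raise a RuntimeError here; copying into
# contiguous memory first makes view() legal.
flat = t.contiguous().view(-1)
```

Alternatively, `t.reshape(-1)` copies only when needed, which is why later PyTorch code often prefers it over `contiguous().view()`.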
1 parent: 4c661ca
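The CPU-loading change (#100) uses the standard `torch.load` remapping hook: pass a `map_location` callable so storages saved on a GPU land on the CPU when CUDA is unavailable. A hedged sketch with a throwaway checkpoint (the file and state dict are illustrative, not from this repo):

```python
import tempfile

import torch

# Save a toy checkpoint, as if it were produced on another machine.
state = {"weight": torch.ones(2, 3)}
with tempfile.NamedTemporaryFile(suffix=".pt", delete=False) as f:
    path = f.name
torch.save(state, path)

# Remap GPU storages onto the CPU when CUDA is unavailable;
# on a CUDA machine, None keeps the default device placement.
map_location = None if torch.cuda.is_available() else (lambda storage, loc: storage)
loaded = torch.load(path, map_location=map_location)
```

Newer PyTorch versions also accept the simpler `map_location="cpu"`.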


48 files changed (+267, -5850 lines)
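The dimensionality bug fixed in #99 is easy to illustrate in isolation: a bidirectional encoder concatenates two directions, so the decoder's hidden size must double; in the one-directional case it must stay at `hidden_size`, whereas the old code returned `1`. A pure-Python sketch (the function name is hypothetical, not an API in this repo):

```python
def decoder_hidden_size(hidden_size: int, bidirectional: bool) -> int:
    """Hidden size the decoder needs to match the encoder's output.

    The pre-#99 code was `hidden_size * 2 if bidirectional else 1`,
    which broke every one-directional decoder with a shape mismatch.
    """
    return hidden_size * 2 if bidirectional else hidden_size
```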

.travis.yml (+1, -3)

```diff
@@ -10,8 +10,6 @@ python:
 install:
   - pip install -U pip
   - pip -q install -r requirements.txt
-  - pip -q install "http://download.pytorch.org/whl/cu75/torch-0.2.0.post1-cp27-cp27mu-manylinux1_x86_64.whl; python_version == '2.7'"
-  - pip -q install "http://download.pytorch.org/whl/cu75/torch-0.2.0.post1-cp36-cp36m-manylinux1_x86_64.whl; python_version == '3.6'"

   # dev dependencies
   - pip install flake8
@@ -32,4 +30,4 @@ script:
   # Unit test
   - nosetests --with-coverage --cover-erase --cover-package=seq2seq
   # Integration test
-  - "if [[ $TRAVIS_BRANCH =~ (master|develop) ]]; then python setup.py install && scripts/integration_test.sh; fi"
+  - "if [[ $TRAVIS_BRANCH =~ (master|develop) ]]; then python setup.py install && scripts/integration_test.sh; fi"
```
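The deleted install lines selected a wheel per interpreter with PEP 508 environment markers appended to the requirement. Once PyTorch moved to PyPI, the same conditional pinning can live in `requirements.txt` instead of the CI script; a sketch with an illustrative version:

```
# Only installed when the marker matches the running interpreter (PEP 508)
torch==0.4.0; python_version >= "3.5"
```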

docs/public/_modules/index.html (+4, -7)

```diff
@@ -8,7 +8,7 @@
   <meta name="viewport" content="width=device-width, initial-scale=1.0">
-  <title>Overview: module code &mdash; pytorch-seq2seq 0.1.5 documentation</title>
+  <title>Overview: module code &mdash; pytorch-seq2seq 0.1.6 documentation</title>
@@ -35,7 +35,7 @@
   <link rel="index" title="Index" href="../genindex.html"/>
   <link rel="search" title="Search" href="../search.html"/>
-  <link rel="top" title="pytorch-seq2seq 0.1.5 documentation" href="../index.html"/>
+  <link rel="top" title="pytorch-seq2seq 0.1.6 documentation" href="../index.html"/>
@@ -64,7 +64,7 @@
   <div class="version">
-    0.1.5
+    0.1.6
   </div>
@@ -164,8 +164,6 @@
   <h1>All modules for which code is available</h1>
   <ul><li><a href="seq2seq/dataset/fields.html">seq2seq.dataset.fields</a></li>
-  <li><a href="seq2seq/dataset/utils.html">seq2seq.dataset.utils</a></li>
-  <li><a href="seq2seq/dataset/vocabulary.html">seq2seq.dataset.vocabulary</a></li>
   <li><a href="seq2seq/evaluator/evaluator.html">seq2seq.evaluator.evaluator</a></li>
   <li><a href="seq2seq/evaluator/predictor.html">seq2seq.evaluator.predictor</a></li>
   <li><a href="seq2seq/loss/loss.html">seq2seq.loss.loss</a></li>
@@ -178,7 +176,6 @@ <h1>All modules for which code is available</h1>
   <li><a href="seq2seq/optim/optim.html">seq2seq.optim.optim</a></li>
   <li><a href="seq2seq/trainer/supervised_trainer.html">seq2seq.trainer.supervised_trainer</a></li>
   <li><a href="seq2seq/util/checkpoint.html">seq2seq.util.checkpoint</a></li>
-  <li><a href="seq2seq/util/custom_time.html">seq2seq.util.custom_time</a></li>
   </ul>
@@ -215,7 +212,7 @@ <h1>All modules for which code is available</h1>
   <script type="text/javascript">
   var DOCUMENTATION_OPTIONS = {
     URL_ROOT:'../',
-    VERSION:'0.1.5',
+    VERSION:'0.1.6',
     COLLAPSE_INDEX:false,
     FILE_SUFFIX:'.html',
     HAS_SOURCE: true,
```

docs/public/_modules/seq2seq/dataset/dataset.html (-420): This file was deleted.

docs/public/_modules/seq2seq/dataset/utils.html (-358): This file was deleted.

docs/public/_modules/seq2seq/dataset/vocabulary.html (-431): This file was deleted.

docs/public/_modules/seq2seq/evaluator/evaluator.html (-285): This file was deleted.

0 commit comments
