Releases
v0.2.1
New features
- Add support for the GPT-2 345M model in `examples/gpt-2`. (#156)
- Add BERT modules, including `texar.modules.BERTEncoder` and `texar.modules.BERTClassifier`; a usage sketch follows below. (#167)
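
A minimal sketch of the new BERT modules. The constructor arguments, hparams, and return values below are assumptions to verify against the `texar.modules.BERTEncoder` and `texar.modules.BERTClassifier` docs (the checkpoint name `"bert-base-uncased"` and the `num_classes` hparam in particular).

```python
import tensorflow as tf
import texar as tx

# Placeholders for a batch of WordPiece token ids (shapes assumed).
input_ids = tf.placeholder(tf.int32, shape=[None, 128])
seq_length = tf.placeholder(tf.int32, shape=[None])

# Standalone BERT encoder; loads pre-trained weights by name.
encoder = tx.modules.BERTEncoder(pretrained_model_name="bert-base-uncased")
outputs, pooled = encoder(input_ids, sequence_length=seq_length)

# BERTClassifier wraps the encoder with a classification head.
classifier = tx.modules.BERTClassifier(
    pretrained_model_name="bert-base-uncased",
    hparams={"num_classes": 2})
logits, preds = classifier(input_ids, sequence_length=seq_length)
```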
Feature improvements
- Refactor `TransformerEncoder` and `TransformerDecoder` to separate position embeddings from the modules. (#126)
- Allow passing a Tensor to the `output_layer` argument of decoder constructors, used to tie weights between the output layer and the input embedding matrix (see the sketch after this list). (#126)
- Make the `TransformerDecoder` constructor interface identical to that of the RNN decoders. (#126)
- Refactor decoder `Helper`s to allow a two-argument `embedding_fn`, supporting position embedding (see the sketch after this list). (#126)
- Refactor `SinusoidsPositionEmbedder` to support arbitrarily large or negative position indexes (see the second sketch after this list). (#176)
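
To illustrate the #126 changes together, here is a minimal sketch in the style of `examples/transformer`: position embeddings live outside the decoder, the input embedding matrix is reused as the output layer, and inference passes a two-argument `embedding_fn`. The hyperparameter values and the exact call arguments are assumptions; check the `TransformerDecoder` docs.

```python
import tensorflow as tf
import texar as tx

vocab_size, dim, max_decoding_length = 30000, 512, 200

# Position embeddings are now separate from the Transformer modules.
word_embedder = tx.modules.WordEmbedder(
    vocab_size=vocab_size, hparams={"dim": dim})
pos_embedder = tx.modules.SinusoidsPositionEmbedder(
    position_size=max_decoding_length, hparams={"dim": dim})

# Weight tying: pass the (transposed) input embedding matrix as the
# decoder's output layer.
decoder = tx.modules.TransformerDecoder(
    vocab_size=vocab_size,
    output_layer=tf.transpose(word_embedder.embedding))

# Two-argument embedding_fn: the refactored Helpers pass both the
# token ids and their time steps, so position embeddings can be added.
def embedding_fn(ids, times):
    return word_embedder(ids) * dim ** 0.5 + pos_embedder(times)

# Inference-time decoding; `encoder_output`, `src_length`,
# `start_tokens`, and `eos_token_id` are assumed to be defined
# elsewhere in the model, so the call is shown commented out.
# predictions = decoder(
#     memory=encoder_output,
#     memory_sequence_length=src_length,
#     decoding_strategy="infer_greedy",
#     embedding=embedding_fn,
#     start_tokens=start_tokens,
#     end_token=eos_token_id,
#     max_decoding_length=max_decoding_length)
```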
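
And a small sketch of the refactored `SinusoidsPositionEmbedder` (#176). That `position_size=None` permits unbounded indexes, and that the embedder can be called with an explicit tensor of (possibly negative) positions, are both assumptions here.

```python
import tensorflow as tf
import texar as tx

# position_size=None is assumed to enable unbounded position indexes.
pos_embedder = tx.modules.SinusoidsPositionEmbedder(
    position_size=None, hparams={"dim": 512})

# Arbitrarily large or negative position indexes are now accepted.
positions = tf.constant([[-2, -1, 0, 1, 2]])
pos_embeds = pos_embedder(positions)  # assumed shape: [1, 5, 512]
```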
Fixes
- Fix `texar.losses.reduce_batch_time` when `sequence` has a dtype other than `tf.float32`. (#143)
- Fix `texar.losses.reduce_dimensions` when `average_axes` or `sum_axes` is an `int`. (#141)
- Fix the GPT-2 tokenization loading path. (#165)
- Fix the EOS bug in `examples/vae_text`. (#168)
- Fix transformer `bleu_tool.py` when `translation_length` is 0. (#176)
- Fix `StochasticConnector` and `ReparameterizedStochasticConnector` when `transform=False`. (#179)