probnmn.models.program_generator

class probnmn.models.program_generator.ProgramGenerator(vocabulary: allennlp.data.vocabulary.Vocabulary, input_size: int = 256, hidden_size: int = 256, num_layers: int = 2, dropout: float = 0.0)

Bases: probnmn.modules.seq2seq_base.Seq2SeqBase

A wrapper over probnmn.modules.seq2seq_base.Seq2SeqBase. This sequence-to-sequence model accepts tokenized and padded question sequences and decodes them into program sequences.

Parameters
- vocabulary: allennlp.data.vocabulary.Vocabulary
AllenNLP’s vocabulary. This vocabulary has three namespaces: “questions”, “programs”, and “answers”, each containing its own token-to-integer mapping.
- input_size: int, optional (default = 256)
The dimension of the inputs to the LSTM.
- hidden_size: int, optional (default = 256)
The dimension of the outputs of the LSTM.
- num_layers: int, optional (default = 2)
Number of recurrent layers in the LSTM.
- dropout: float, optional (default = 0.0)
Dropout probability applied to the outputs of the LSTM at every layer except the last.
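The constructor arguments above can be sketched with a stdlib stand-in. This is a hypothetical illustration only: `ProgramGeneratorConfig` is not part of probnmn, and the real `ProgramGenerator` additionally requires an AllenNLP `Vocabulary` as its first argument.

```python
from dataclasses import dataclass


@dataclass
class ProgramGeneratorConfig:
    # Hypothetical stand-in mirroring ProgramGenerator's keyword arguments
    # and their documented defaults (the real class also takes `vocabulary`).
    input_size: int = 256    # dimension of the inputs to the LSTM
    hidden_size: int = 256   # dimension of the outputs of the LSTM
    num_layers: int = 2      # number of recurrent layers in the LSTM
    dropout: float = 0.0     # dropout on LSTM outputs, all layers except last


# Defaults match the signature; any field can be overridden per experiment.
default_cfg = ProgramGeneratorConfig()
wide_cfg = ProgramGeneratorConfig(hidden_size=512, dropout=0.1)
print(default_cfg.hidden_size, wide_cfg.hidden_size)
```

A config object like this is a common pattern for keeping hyperparameters (here, the LSTM dimensions and dropout) in one serializable place before passing them to the model constructor.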