seq2seq-transformer / config.json
{
  "num_encoder_layers": 6,
  "num_decoder_layers": 6,
  "d_model": 512,
  "d_ff": 2048,
  "src_vocab_size": 10000,
  "tgt_vocab_size": 10000,
  "num_heads": 8,
  "dropout": 0.1
}
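A minimal sketch of how this config might be consumed when rebuilding the model: the `Seq2SeqConfig` dataclass below is a hypothetical name (the repo's actual model code is not shown), but the fields mirror the JSON keys above, and `head_dim` illustrates the usual constraint that `d_model` must split evenly across the attention heads.

```python
import json
from dataclasses import dataclass

# Contents of config.json, as published in the repo.
CONFIG_JSON = """
{"num_encoder_layers": 6, "num_decoder_layers": 6, "d_model": 512,
 "d_ff": 2048, "src_vocab_size": 10000, "tgt_vocab_size": 10000,
 "num_heads": 8, "dropout": 0.1}
"""

@dataclass
class Seq2SeqConfig:  # hypothetical name; actual class not shown in the repo
    num_encoder_layers: int
    num_decoder_layers: int
    d_model: int
    d_ff: int
    src_vocab_size: int
    tgt_vocab_size: int
    num_heads: int
    dropout: float

    @property
    def head_dim(self) -> int:
        # Multi-head attention splits d_model evenly across heads.
        assert self.d_model % self.num_heads == 0, "d_model must be divisible by num_heads"
        return self.d_model // self.num_heads


cfg = Seq2SeqConfig(**json.loads(CONFIG_JSON))
print(cfg.head_dim)  # 512 / 8 = 64
```

With these values the model matches the base configuration from "Attention Is All You Need" (6+6 layers, d_model 512, 8 heads, d_ff 2048, dropout 0.1), with 10k-entry source and target vocabularies.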