Question

I'm doing some research on the summarization task and found out that BERT is derived from the Transformer model. Every blog about BERT that I have read focuses on explaining what a bidirectional encoder is, so I think this is what makes BERT different from the vanilla Transformer model. But as far as I know, the Transformer reads the entire sequence of words at once, so it should be considered bidirectional too. Can someone point out what I'm missing?

Was it helpful?

Solution

The name provides a clue. BERT stands for Bidirectional Encoder Representations from Transformers, so essentially BERT = Transformer minus the decoder.

BERT ends with the final representations of the words once the encoder has finished processing them.

In the vanilla Transformer, those encoder outputs are then fed into the decoder; that piece of the architecture is simply not present in BERT.
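To make the structural difference concrete, here is a minimal PyTorch sketch (not the actual BERT implementation; the layer count and dimensions are illustrative, e.g. BERT-base actually uses 12 layers). The BERT-style model stops at the encoder output, while the vanilla Transformer passes that same output on to a decoder:

```python
import torch
import torch.nn as nn

# Illustrative sizes only (assumed for this sketch)
vocab_size, d_model, nhead, seq_len, batch = 30522, 768, 12, 16, 2

embed = nn.Embedding(vocab_size, d_model)
tokens = torch.randint(0, vocab_size, (batch, seq_len))
x = embed(tokens)  # (batch, seq_len, d_model)

# BERT-style: encoder stack only; its output IS the final word representations
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
bert_like_output = encoder(x)  # (batch, seq_len, d_model) -- BERT stops here

# Vanilla Transformer: the same encoder output becomes the decoder's "memory"
decoder_layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
decoder = nn.TransformerDecoder(decoder_layer, num_layers=2)
target = embed(torch.randint(0, vocab_size, (batch, seq_len)))
transformer_output = decoder(target, memory=bert_like_output)  # decoder consumes encoder output
```

In the sketch, `bert_like_output` is what BERT hands to downstream tasks, whereas the original Transformer only uses it internally as input to the decoder for generation.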

Licensed under: CC-BY-SA with attribution