
bidirectional encoder representations from transformers (Q61726893)

From Wikidata
deep learning artificial neural network language model
  • BERT
  • bidirectional encoder representations from transformer

    Statements

      • 2018
        1 reference: Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing (English); retrieved 31 July 2022
      • 110,000,000 parameters
        0 references
      • 340,000,000 parameters
        0 references
      • BERT
        0 references
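The two parameter counts in the statements above correspond to BERT-base (110,000,000) and BERT-large (340,000,000). A minimal sketch of how those figures arise, assuming the architecture published with BERT (base: 12 layers, hidden size 768; large: 24 layers, hidden size 1024; WordPiece vocabulary of 30,522; 512 positions; 4x feed-forward width) — an approximation for illustration, not an official tally:

```python
def bert_param_count(layers, hidden, vocab=30522, max_pos=512, ffn_mult=4):
    """Approximate parameter count of a BERT-style encoder.

    Assumes learned token/position/segment embeddings, multi-head
    self-attention with Q/K/V/output projections, a feed-forward block
    of width ffn_mult * hidden, LayerNorms, and a pooler layer.
    """
    # Token + position + segment embeddings, plus the embedding LayerNorm.
    emb = (vocab + max_pos + 2) * hidden + 2 * hidden
    # Q, K, V and output projections, each hidden x hidden with a bias.
    attn = 4 * (hidden * hidden + hidden)
    # Two linear layers of the feed-forward block, with biases.
    ffn = hidden * (ffn_mult * hidden) + ffn_mult * hidden \
        + (ffn_mult * hidden) * hidden + hidden
    # Each encoder layer also has two LayerNorms (scale + bias each).
    layer = attn + ffn + 2 * (2 * hidden)
    # Pooler: one hidden x hidden projection with bias.
    pooler = hidden * hidden + hidden
    return emb + layers * layer + pooler

base = bert_param_count(12, 768)     # ~109.5M, reported as 110,000,000
large = bert_param_count(24, 1024)   # ~335.1M, reported as 340,000,000
```

The exact totals land slightly below the rounded Wikidata values, which is consistent with the figures being quoted to two significant digits.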

    Identifiers

     