Bidirectional Encoder Representations from Transformers

deep-learning language model based on the transformer artificial neural network architecture

DBpedia resource is: http://dbpedia.org/resource/BERT_(language_model)

Abstract is: Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google. In 2019, Google announced that it had begun leveraging BERT in its search engine, and by late 2020 it was using BERT in almost every English-language query. A 2020 literature survey concluded that "in a little over a year, BERT has become a ubiquitous baseline in NLP experiments", counting over 150 research publications analyzing and improving the model. The original English-language BERT has two models: (1) BERT_BASE, with 12 encoders and 12 bidirectional self-attention heads, and (2) BERT_LARGE, with 24 encoders and 16 bidirectional self-attention heads. Both models are pre-trained from unlabeled data extracted from the BooksCorpus (800M words) and English Wikipedia (2,500M words).
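The layer and head counts in the abstract can be read directly from the published configurations. The following is a minimal sketch, assuming the Hugging Face transformers library and its hosted bert-base-uncased and bert-large-uncased checkpoints (neither is named in this record); it only inspects configuration values.

    # Minimal sketch: read the layer and head counts of the two original
    # English BERT configurations (assumes Hugging Face transformers).
    from transformers import BertConfig

    base = BertConfig.from_pretrained("bert-base-uncased")
    large = BertConfig.from_pretrained("bert-large-uncased")

    # BERT_BASE: 12 encoder layers, each with 12 self-attention heads
    print(base.num_hidden_layers, base.num_attention_heads)    # expected: 12 12
    # BERT_LARGE: 24 encoder layers, each with 16 self-attention heads
    print(large.num_hidden_layers, large.num_attention_heads)  # expected: 24 16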

Wikimedia Commons category is BERT

Bidirectional Encoder Representations from Transformers is …
instance of (P31):
masked language model (Q24868018), illustrated in the fill-mask sketch after this list
transformer (Q85810444)
large language model (Q115305900)
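As a reference for the "masked language model" classification above, here is a minimal fill-mask sketch. It assumes the Hugging Face transformers library and its hosted bert-base-uncased checkpoint, neither of which is part of this record; the example sentence is invented for illustration.

    # Minimal sketch of the masked-language-model objective: BERT predicts the
    # token hidden behind [MASK] from both left and right context.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="bert-base-uncased")
    for prediction in fill_mask("BERT is a [MASK] model."):
        # each prediction carries a proposed token and its probability
        print(prediction["token_str"], round(prediction["score"], 3))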

subclass of (P279):
transformer (Q85810444)

External links are
P973 (described at URL): https://ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html
P973 (described at URL): https://devopedia.org/bert-language-model
P9100 (GitHub topic): bert
P2671 (Google Knowledge Graph ID): /g/11h75tkhxs
P856 (official website): https://arxiv.org/abs/1810.04805
P1324 (source code repository URL): https://github.com/google-research/bert

P275 (copyright license): Apache Software License 2.0 (Q13785927)
P6216 (copyright status): copyrighted (Q50423863)
P3575 (data size): 110,000,000; 340,000,000 (see the sketch after this list)
P1343 (described by source): BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Q57267388)
P178 (developer): Google Research (Q28943742)
P366 (has use): natural language processing (Q30642)
P571 (inception): 2018-01-01
P138 (named after): Bert (Q584184)
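The two P3575 values presumably correspond to the parameter counts of BERT_BASE and BERT_LARGE (about 110 million and 340 million, as cited in the original paper). A minimal sketch to check this, assuming the Hugging Face transformers library with PyTorch and the hosted bert-base-uncased / bert-large-uncased checkpoints:

    # Minimal sketch: sum the parameters of the two published checkpoints and
    # compare with the P3575 figures (roughly 110,000,000 and 340,000,000).
    from transformers import BertModel

    for name in ("bert-base-uncased", "bert-large-uncased"):
        model = BertModel.from_pretrained(name)
        n_params = sum(p.numel() for p in model.parameters())
        print(name, f"{n_params:,}")  # approximately 110M and 340M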

Reverse relations

based on (P144)
Q75720911 CamemBERT
Q106201926 Weaviate

subclass of (P279)
Q107031872 ALBERT
Q85124095 RoBERTa

describes a project that uses (P4510)
Q108039873 A Simple Yet Strong Pipeline for HotpotQA
Q104418048 An Empirical Study of Using Pre-trained BERT Models for Vietnamese Relation Extraction Task at VLSP 2020
Q61726888 Assessing BERT's Syntactic Abilities
Q118823928 Assessing the impact of Word Embeddings for Relation Prediction: An Empirical Study
Q65924660 Augmenting Chinese WordNet semantic relations with contextualized embeddings
Q112084338 Automated Crossword Solving
Q85528867 Automatic Fact-guided Sentence Modification
Q70186975 BERT for Named Entity Recognition in Contemporary and Historic German
Q57267388 BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Q116250705 Biographical Semi-Supervised Relation Extraction Dataset
Q112075162 BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions
Q112254431 Code and Named Entity Recognition in StackOverflow
Q118894562 Combining Semantic Web and Machine Learning for Auditable Legal Key Element Extraction
Q62101883 Combining embedding methods for a word intrusion task
Q70188422 Combining embedding methods for a word intrusion task
Q119525793 CrossRE: A Cross-Domain Dataset for Relation Extraction
Q70444897 Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings
Q109943627 Does Typological Blinding Impede Cross-Lingual Sharing?
Q115698532 Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing
Q125749900 Entity Linking for Short Text Using Structured Knowledge Graph via Multi-Grained Text Matching
Q112119127 Entity Linking in 100 Languages
Q125947627 Event Extraction for Portuguese: A QA-Driven Approach Using ACE-2005
Q113770518 Find the Funding: Entity Linking with Incomplete Funding Knowledge Bases
Q109283586 Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics
Q113663041 Harnessing privileged information for hyperbole detection
Q126087116 Image classification with symbolic hints using limited resources
Q123260150 Investigating Entity Knowledge in BERT with Simple Neural End-To-End Entity Linking
Q118809860 Knowledge graph data enrichment based on a software library for text mapping to the Sustainable Development Goals
Q125422072 Knowledge graphs for empirical concept retrieval
Q67482086 Language Models as Knowledge Bases?
Q122922648 Large language models converge toward human-like concept organization
Q84876760 Learning to Retrieve Reasoning Paths over Wikipedia Graph for Question Answering
Q112667375 MAGNET: Multi-Label Text Classification using Attention-based Graph Neural Network
Q113662028 MOVER: Mask, Over-generate and Rank for Hyperbole Generation
Q109944224 Multi-Hop Fact Checking of Political Claims
Q84078340 Multi-hop Reading Comprehension through Question Decomposition and Rescoring
Q118867783 NASTyLinker: NIL-Aware Scalable Transformer-Based Entity Linker
Q123020081 Native Language Prediction from Gaze: a Reproducibility Study
Q109285514 Revealing the Dark Secrets of BERT
Q109283441 Right for the Wrong Reasons: Diagnosing Syntactic Heuristics in Natural Language Inference
Q85124231 Select, Answer and Explain: Interpretable Multi-hop Reading Comprehension over Multiple Documents
Q118814258 Semi-supervised Construction of Domain-specific Knowledge Graphs
Q65933760 Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation
Q76470494 SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
Q118610590 The finer they get: Aggregating fine-tuned models improves lexical semantic change detection
Q86591666 Towards Detection of Subjective Bias using Contextualized Word Embeddings
Q118812268 Towards Tailored Knowledge Base Modeling using Masked Language Models
Q118867488 Transformer Based Semantic Relation Typing for Knowledge Graph Integration
Q87402253 What the [MASK]? Making Sense of Language-Specific BERT Models

main subject (P921)
Q86695949 A Primer in BERTology: What we know about how BERT works
Q107763843 ANDES at SemEval-2020 Task 12: A jointly-trained BERT multilingual model for offensive language detection
Q123494755 An Analysis of BERT (NLP) for Assisted Subject Indexing for Project Gutenberg
Q104418048 An Empirical Study of Using Pre-trained BERT Models for Vietnamese Relation Extraction Task at VLSP 2020
Q61726888 Assessing BERT's Syntactic Abilities
Q109283281 BERT Busters: Outlier Dimensions that Disrupt Transformers
Q112119511 BERT for Coreference Resolution: Baselines and Analysis
Q104798629 BERT-GT: Cross-sentence n-ary relation extraction with BERT and graph transformer
Q57267388 BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Q123966253 BERTologyNavigator: Advanced Question Answering with BERT-based Semantics
Q126726880 BRep-BERT: Pre-training Boundary Representation BERT with Sub-graph Node Contrastive Learning
Q90006632 BioBERT: a pre-trained biomedical language representation model for biomedical text mining
Q125526133 Broadening BERT vocabulary for Knowledge Graph Construction using Wikipedia2Vec
Q105703238 Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT
Q89653510 Does BERT need domain adaptation for clinical negation detection?
Q104090794 DynaBERT: Dynamic BERT with Adaptive Width and Depth
Q84574679 Enriching BERT with Knowledge Graph Embeddings for Document Classification
Q125526132 Expanding the Vocabulary of BERT for Knowledge Base Construction
Q114795666 Fine-Tuning BERT Models to Extract Named Entities from Archival Finding Aids.
Q98775064 Hate speech detection and racial bias mitigation in social media based on BERT model
Q108533266 KBLab
Q100393501 Korean clinical entity recognition from diagnosis text using BERT
Q108824029 Lacking the embedding of a word? Look it up into a traditional dictionary
Q119982575 Load What You Need: Smaller Versions of Multilingual BERT
Q111697654 MetaMap versus BERT models with explainable active learning: ontology-based experiments with prior knowledge for COVID-19
Q110756024 New explainability method for BERT-based model in fake news detection
Q105592627 Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles
Q125809751 Personalizing Type-Based Facet Ranking Using BERT Embeddings
Q108750211 Playing with Words at the National Library of Sweden -- Making a Swedish BERT
Q109286905 Positional Artefacts Propagate Through Masked Language Model Embeddings
Q104388228 Relation Extraction with BERT-based Pre-trained Model
Q109285514 Revealing the Dark Secrets of BERT
Q108146683 STonKGs: A Sophisticated Transformer Trained on Biomedical Text and Knowledge Graphs
Q117199937 ScholarBERT: Bigger is Not Always Better
Q108712147 Scholarly Text Classification with Sentence BERT and Entity Embeddings
Q104433073 SciBERT: A Pretrained Language Model for Scientific Text
Q104091325 The Lottery Ticket Hypothesis for Pre-trained BERT Networks
Q76472200 Visualizing and Measuring the Geometry of BERT
Q116449197 W2v-BERT: Combining Contrastive Learning and Masked Language Modeling for Self-Supervised Speech Pre-Training
Q87402253 What the [MASK]? Making Sense of Language-Specific BERT Models
Q109283171 When BERT Plays the Lottery, All Tickets Are Winning
Q107059867 WikiBERT models: deep transfer learning for many languages

named after (P138)
Q112063555 BioBERT

uses (P2283)
Q70444897 Does BERT Make Any Sense? Interpretable Word Sense Disambiguation with Contextualized Embeddings

Articles in Wikimedia projects and languages

Category:BERT (wikimedia)
Категория:BERT (языковая модель) (wikinews)
Catalan (ca / Q7026): BERT (model de llenguatge) (wikipedia)
BERT (wikipedia)
BERT (language model) (wikipedia)
BERT (modelo de lenguaje) (wikipedia)
Basque (eu / Q8752): BERT (hizkuntz eredua) (wikipedia)
Persian (fa / Q9168): برت (مدل زبانی) (wikipedia)
BERT (modèle de langage) (wikipedia)
Hindi (hi): बर्ट (भाषा मॉडल) (wikipedia)
BERT (wikipedia)
BERT (言語モデル) (wikipedia)
BERT (언어 모델) (wikipedia)
BERT (wikipedia)
BERT (modelo de linguagem) (wikipedia)
BERT (модель мови) (wikipedia)
BERT (mô hình ngôn ngữ) (wikipedia)
Cantonese (yue): BERT (wikipedia)
BERT (wikipedia)
