P31 | instance of | scholarly article | Q13442814
P7505 | ACL Anthology article ID | D19-1250
P820 | arXiv classification | cs.CL
P818 | arXiv ID | 1909.01066
P356 | DOI | 10.18653/V1/D19-1250
P953 | full work available at URL | https://arxiv.org/pdf/1909.01066.pdf
https://aclanthology.org/D19-1250.pdf
P50 | author | Sebastian Riedel | Q39128623
Tim Rocktäschel | Q50281591
Fabio Petroni | Q123558967
P2093 | author name string | Patrick Lewis
Yuxiang Wu
Anton Bakhtin
Alexander H. Miller
P2860 | cites work | Long short-term memory | Q24805158
SimLex-999: Evaluating Semantic Models with (Genuine) Similarity Estimation | Q28732668
A Neural Probabilistic Language Model | Q29517961
Translating Embeddings for Modeling Multi-relational Data | Q32892039
Reading Wikipedia to Answer Open-Domain Questions | Q32904318
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Q57267388
Assessing BERT's Syntactic Abilities | Q61726888
P4510 | describes a project that uses | ConceptNet | Q19776339
Stanford Question Answering Dataset | Q32923098
Bidirectional Encoder Representations from Transformers | Q61726893
Google RE | Q125847896
P407 | language of work or name | English | Q1860
P921 | main subject | knowledge base | Q593744
language model | Q3621696
P304 | page(s) | 2463-2473
P577 | publication date | 2019-09-04
P1433 | published in | Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP) | Q77199054
P1476 | title | Language Models as Knowledge Bases?
cited by | A Unified Generative Framework based on Prompt Learning for Various Information Extraction Tasks | Q123244749
Extracting Multi-valued Relations from Language Models | Q120970016
LLM2KB: Constructing Knowledge Bases using instruction tuned context aware Large Language Models | Q125526134
LM-KBC 2023: 2nd Challenge on Knowledge Base Construction from Pre-trained Language Models | Q125567785
LM-KBC: Knowledge Base Construction from Pre-trained Language Models | Q115265833
Language Models As or For Knowledge Bases | Q109286381
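Statements like the ones above can be retrieved programmatically from the public Wikidata Query Service. A minimal sketch (the endpoint URL is the standard public one; the function names here are illustrative, not from any library) that looks up the article's item by its DOI (P356) and lists the works it cites (P2860):

```python
# Sketch: find the Wikidata item whose DOI (P356) matches this article's
# DOI, then list its cited works (P2860). Helper names are illustrative;
# running run_query() requires network access to query.wikidata.org.
import json
import urllib.parse
import urllib.request

WDQS = "https://query.wikidata.org/sparql"


def build_query(doi: str) -> str:
    """Build a SPARQL query matching an item whose P356 value equals `doi`."""
    return f'''
    SELECT ?article ?cited ?citedLabel WHERE {{
      ?article wdt:P356 "{doi}" .
      OPTIONAL {{ ?article wdt:P2860 ?cited . }}
      SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
    }}'''


def run_query(query: str) -> dict:
    """Send the query to the Wikidata Query Service and parse the JSON reply."""
    url = WDQS + "?" + urllib.parse.urlencode({"query": query, "format": "json"})
    req = urllib.request.Request(url, headers={"User-Agent": "doi-lookup-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    results = run_query(build_query("10.18653/V1/D19-1250"))
    for row in results["results"]["bindings"]:
        print(row.get("citedLabel", {}).get("value", ""))
```

Note that SPARQL string matching is exact, so the DOI must be given with the casing stored in the item (here, the uppercased form `10.18653/V1/D19-1250` from the P356 statement above).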