P31 | instance of | scholarly article | Q13442814 |
P50 | author | Yada Pruksachatkun | Q125961858 |
P2093 | author name string | Amanpreet Singh | |
P2093 | author name string | Felix Hill | |
P2093 | author name string | Omer Levy | |
P2093 | author name string | Alex Wang | |
P2093 | author name string | Julian Michael | |
P2093 | author name string | Nikita Nangia | |
P2093 | author name string | Samuel Bowman | |
P2093 | author name string | Yada Pruksachatkun | |
P2860 | cites work | BoolQ: Exploring the Surprising Difficulty of Natural Yes/No Questions | Q112075162 |
P4510 | describes a project that uses | Amazon Mechanical Turk | Q733115 |
P4510 | describes a project that uses | Bidirectional Encoder Representations from Transformers | Q61726893 |
P4510 | describes a project that uses | SuperGLUE | Q112066118 |
P4510 | describes a project that uses | BoolQ | Q112075146 |
P407 | language of work or name | English | Q1860 |
P921 | main subject | SuperGLUE | Q112066118 |
P1104 | number of pages | 15 | |
P577 | publication date | 2019-01-01 | |
P1433 | published in | Advances in Neural Information Processing Systems 32 | Q68600639 |
P1476 | title | SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems | |
Q108941092 | Multitask Prompted Training Enables Zero-Shot Task Generalization |
Q109285514 | Revealing the Dark Secrets of BERT |
Q110887400 | The Power of Scale for Parameter-Efficient Prompt Tuning |
Q107059867 | WikiBERT models: deep transfer learning for many languages |
Q112066118 | SuperGLUE | described by source | P1343 |
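The statement rows above follow a simple pipe-delimited layout (property ID | property label | value | value QID). As a minimal sketch, assuming that four-column layout, the rows can be collected into a claims mapping in Python; the `parse_claims` helper and the sample rows below are illustrative, not part of any Wikidata library.

```python
def parse_claims(rows):
    """Parse 'PID | property label | value | value QID' rows into a
    {(PID, label): [(value, QID), ...]} dict. The four-column layout
    is an assumption based on this dump's formatting."""
    claims = {}
    for row in rows:
        cells = [c.strip() for c in row.split("|")]
        cells += [""] * (4 - len(cells))  # pad rows with missing cells
        pid, label, value, qid = cells[:4]
        claims.setdefault((pid, label), []).append((value, qid))
    return claims

# Sample rows taken from the statements above.
rows = [
    "P50 | author | Yada Pruksachatkun | Q125961858 |",
    "P2093 | author name string | Amanpreet Singh | |",
    "P2093 | author name string | Felix Hill | |",
]
claims = parse_claims(rows)
print(claims[("P50", "author")])  # → [('Yada Pruksachatkun', 'Q125961858')]
```

Grouping values under a `(PID, label)` key keeps multi-valued properties such as P2093 together while preserving their order in the dump.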