【Conference】Annual Meeting of the Society for Computation in Linguistics

Jabberwocky Parsing: Dependency Parsing with Lexical Noise



【Abstract】Parsing models have long benefited from the use of lexical information, and indeed current state-of-the-art neural network models for dependency parsing achieve substantial improvements from distributed representations of lexical information. At the same time, humans can easily parse sentences with unknown or even novel words, as in Lewis Carroll's poem Jabberwocky. In this paper, we carry out jabberwocky parsing experiments, exploring how robust a state-of-the-art neural network parser is to the absence of lexical information. We find that current parsing models, at least under usual training regimens, are in fact overly dependent on lexical information and perform badly in the jabberwocky context. We also demonstrate that the technique of word dropout drastically improves parsing robustness in this setting and leads to significant improvements in out-of-domain parsing.
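The abstract credits word dropout for the robustness gains but this record does not spell out the mechanism. The following is a minimal Python sketch of one common form of word dropout, in which word IDs are stochastically replaced with an UNK token during training, more aggressively for rare words; the function name, the alpha/(alpha + count) schedule, and the toy data are illustrative assumptions, not details taken from the paper.

import random
from collections import Counter

def word_dropout(token_ids, counts, unk_id, alpha=0.25):
    """Stochastically replace word ids with UNK during training.

    Frequency-dependent dropout: a token with corpus count c is dropped
    with probability alpha / (alpha + c), so rare words are masked more
    often and the parser must rely on context rather than lexical identity.
    """
    dropped = []
    for tid in token_ids:
        p_drop = alpha / (alpha + counts.get(tid, 0))
        dropped.append(unk_id if random.random() < p_drop else tid)
    return dropped

# Toy usage: id -> training-corpus frequency
counts = Counter({1: 100, 2: 5, 3: 1})
sentence = [1, 2, 3]
print(word_dropout(sentence, counts, unk_id=0))

In this scheme the dropout acts only on the input word embeddings at training time; at test time no words are dropped, which matches the intended jabberwocky setting where lexical information may simply be unavailable.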

【Authors】Jungo Kasai; Robert Frank

【Affiliations】University of Washington; Yale University

【Year (Volume), Issue】2019

【Pages】113-123

【Total Pages】11

【Language】English


