
Jabberwocky Parsing: Dependency Parsing with Lexical Noise



【Abstract】Parsing models have long benefited from the use of lexical information, and indeed current state-of-the-art neural network models for dependency parsing achieve substantial improvements by benefiting from distributed representations of lexical information. At the same time, humans can easily parse sentences with unknown or even novel words, as in Lewis Carroll's poem Jabberwocky. In this paper, we carry out jabberwocky parsing experiments, exploring how robust a state-of-the-art neural network parser is to the absence of lexical information. We find that current parsing models, at least under usual training regimens, are in fact overly dependent on lexical information, and perform badly in the jabberwocky context. We also demonstrate that the technique of word dropout drastically improves parsing robustness in this setting, and also leads to significant improvements in out-of-domain parsing.
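The word dropout technique mentioned in the abstract is commonly implemented by stochastically replacing training words with an unknown-word token at a rate that depends on word frequency. The sketch below is a minimal illustration of that common formulation (replacement probability alpha / (alpha + freq(w))); the function name, the alpha value, and the toy corpus are illustrative assumptions, not details taken from the paper.

    import random
    from collections import Counter

    def word_dropout(tokens, freq, alpha=0.25, unk="<unk>"):
        # Replace each word w with the unknown token with probability
        # alpha / (alpha + freq[w]): rarer words are masked more often,
        # so the parser cannot rely exclusively on lexical identity.
        return [unk if random.random() < alpha / (alpha + freq[t]) else t
                for t in tokens]

    # Toy example: build frequencies from a tiny corpus, then corrupt a sentence.
    corpus = [["the", "dog", "barks"], ["the", "cat", "sleeps"]]
    freq = Counter(t for sent in corpus for t in sent)
    print(word_dropout(["the", "dog", "sleeps"], freq))

In a jabberwocky-style evaluation, a similar masking of content words can be applied at test time to approximate a sentence whose open-class vocabulary is entirely novel.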

【Authors】Jungo Kasai; Robert Frank;

【Affiliations】University of Washington; Yale University;

【Year (Volume), Issue】2019

【Pages】113-123

【Total Pages】11

【Language】eng

【CLC Classification】;

【Keywords】;

