
text: An R package for analyzing human language

In the field of artificial intelligence (AI), transformers have revolutionized language analysis. Never before has a new technology so broadly improved the benchmarks of nearly all language processing tasks: for example, general language understanding, question answering, and web search. The transformer method itself, which probabilistically models words in their context (i.e., "language modeling"), was introduced in 2017, and the first large-scale pre-trained general-purpose transformer, BERT, was released as open source by Google in 2018. Since then, BERT has been followed by a wave of new transformer models, including GPT, RoBERTa, DistilBERT, XLNet, Transformer-XL, CamemBERT, and XLM-RoBERTa. The text package makes all of these language models easily accessible to R users, and includes functions optimized for human-level analyses tailored for social scientists.
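
As a brief illustration of how these models become accessible from R, the minimal sketch below follows the package's documented workflow. The function names (textrpp_install(), textrpp_initialize(), textEmbed()) and the example model are assumptions based on the package documentation and may differ across versions.

```r
# Minimal sketch: retrieving transformer embeddings with the text package.
# install.packages("text")
library(text)

# One-time setup of the Python backend (transformers, torch) that the package wraps.
# textrpp_install()
# textrpp_initialize(save_profile = TRUE)

# Embed a few free-text responses with a pre-trained language model
# (here BERT; other models hosted on Hugging Face can be named instead).
responses <- c("I feel calm and hopeful today.",
               "This has been a stressful and difficult week.")
embeddings <- textEmbed(responses, model = "bert-base-uncased")

# The returned object holds text-level (and word-level) embeddings that can be
# passed on to downstream functions such as textTrain() or textSimilarity().
embeddings
```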