News recommendation is a widely adopted technique to provide personalized news feeds for users. Recently, pre-trained language models (PLMs) have demonstrated a great capability of natural language understanding and have benefited news recommendation by improving news modeling. However, most existing works simply finetune the PLM on the news recommendation task, which may suffer from the known domain shift problem between the pre-training corpus and downstream news texts. Moreover, PLMs usually contain a large number of parameters and have high computational overhead, which imposes a great burden on low-latency online services. In this paper, we propose Tiny-NewsRec, which can improve both the effectiveness and the efficiency of PLM-based news recommendation. We first design a self-supervised domain-specific post-training method to better adapt the general PLM to the news domain with a contrastive matching task between news titles and news bodies. We further propose a two-stage knowledge distillation method to improve the efficiency of the large PLM-based news recommendation model while maintaining its performance. Multiple teacher models originating from different time steps of our post-training procedure are used to transfer comprehensive knowledge to the student model in both its post-training stage and its finetuning stage. Extensive experiments on two real-world datasets validate the effectiveness and efficiency of our method.

HieRec: Hierarchical User Interest Modeling for Personalized News Recommendation
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

User interest modeling is critical for personalized news recommendation. Existing news recommendation methods usually learn a single user embedding for each user from their previous behaviors to represent their overall interest. However, user interest is usually diverse and multi-grained, which is difficult to model accurately with a single user embedding. In this paper, we propose a news recommendation method with hierarchical user interest modeling, named HieRec. Instead of a single user embedding, in our method each user is represented by a hierarchical interest tree to better capture their diverse and multi-grained interests in news. We use a three-level hierarchy to represent 1) overall user interest, 2) user interest in coarse-grained topics like sports, and 3) user interest in fine-grained topics like football. Moreover, we propose a hierarchical user interest matching framework to match candidate news with different levels of user interest for more accurate interest targeting. Extensive experiments on two real-world datasets validate that our method can effectively improve the performance of user modeling for personalized news recommendation.

NewsBERT: Distilling Pre-trained Language Model for Intelligent News Applications
Findings of the Association for Computational Linguistics: EMNLP 2021

Pre-trained language models (PLMs) like BERT have made great progress in NLP. News articles usually contain rich textual information, and PLMs have the potential to enhance news text modeling for various intelligent news applications like news recommendation and retrieval. However, most existing PLMs are huge, with hundreds of millions of parameters.
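The contrastive title-body matching task described in the Tiny-NewsRec abstract above can be illustrated with a small sketch. This is a minimal, assumed implementation using in-batch negatives and an InfoNCE-style loss; the function name, temperature, and the stand-in encoder outputs are placeholders, not the paper's actual setup.

```python
# Sketch of a contrastive title-body matching objective: titles and bodies from
# the same article form positive pairs, other in-batch articles act as negatives.
import torch
import torch.nn.functional as F

def title_body_contrastive_loss(title_emb, body_emb, temperature=0.05):
    """InfoNCE-style loss over in-batch title/body pairs.

    title_emb, body_emb: (batch, dim) embeddings from the text encoder.
    Row i of each tensor comes from the same news article.
    """
    title_emb = F.normalize(title_emb, dim=-1)
    body_emb = F.normalize(body_emb, dim=-1)
    logits = title_emb @ body_emb.t() / temperature   # (batch, batch) similarities
    labels = torch.arange(logits.size(0))             # matching body is on the diagonal
    # Symmetric loss: match titles to bodies and bodies to titles.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))

# Toy usage with random vectors standing in for a PLM's pooled outputs.
torch.manual_seed(0)
titles = torch.randn(8, 64)
bodies = titles + 0.1 * torch.randn(8, 64)   # bodies correlated with their titles
print(title_body_contrastive_loss(titles, bodies).item())
```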
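The multi-teacher knowledge distillation idea from the same abstract can be sketched in a similar spirit: several teachers (e.g. checkpoints taken at different post-training steps) jointly supervise one small student. The soft-label averaging and temperature below are assumptions chosen for illustration, not necessarily the loss used in the paper.

```python
# Sketch of distilling from multiple teacher checkpoints into one student model.
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list, temperature=2.0):
    """KL divergence between the student and the averaged teacher distributions.

    student_logits: (batch, num_classes) from the small student model.
    teacher_logits_list: list of (batch, num_classes) tensors, one per teacher.
    """
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)                                     # average the teachers' soft labels
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature ** 2

# Toy usage: two teachers supervising a student on a 5-way task.
torch.manual_seed(0)
teachers = [torch.randn(4, 5), torch.randn(4, 5)]
student = torch.randn(4, 5)
print(multi_teacher_distillation_loss(student, teachers).item())
```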
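The three-level interest hierarchy and matching framework from the HieRec abstract above can also be sketched at a high level. All names, dimensions, the dot-product scoring, and the level weights below are hypothetical; the sketch only shows how a candidate news item might be scored against overall, coarse-grained, and fine-grained interest vectors.

```python
# Sketch of hierarchical user interest matching with three granularity levels.
import numpy as np

class HierarchicalUserInterest:
    """One user at three granularities: an overall vector, per-coarse-topic
    vectors (e.g. sports), and per-fine-topic vectors (e.g. football)."""

    def __init__(self, overall, coarse, fine):
        self.overall = overall   # shape (d,)
        self.coarse = coarse     # dict: coarse topic name -> (d,) vector
        self.fine = fine         # dict: fine topic name  -> (d,) vector

def match_score(user, news_vec, news_coarse_topic, news_fine_topic,
                weights=(0.15, 0.35, 0.5)):
    """Weighted combination of matching scores from the three interest levels.

    The candidate news is compared with the overall interest, with the user's
    interest in its coarse topic, and with the interest in its fine topic.
    Topics the user has never interacted with contribute zero.
    """
    w_overall, w_coarse, w_fine = weights
    zero = np.zeros_like(news_vec)
    s_overall = float(user.overall @ news_vec)
    s_coarse = float(user.coarse.get(news_coarse_topic, zero) @ news_vec)
    s_fine = float(user.fine.get(news_fine_topic, zero) @ news_vec)
    return w_overall * s_overall + w_coarse * s_coarse + w_fine * s_fine

# Toy usage: a user with a strong football interest scores football news highly.
rng = np.random.default_rng(0)
football_vec = rng.normal(size=4)
user = HierarchicalUserInterest(
    overall=rng.normal(size=4),
    coarse={"sports": football_vec + 0.1 * rng.normal(size=4)},
    fine={"football": football_vec},
)
print(match_score(user, football_vec, "sports", "football"))
```

Weighting the fine-grained level most heavily reflects the abstract's motivation that fine-grained topics capture the most specific user preferences, but the actual combination used by HieRec may differ.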
Yang Yu (simplified Chinese: 杨宇; traditional Chinese: 楊宇; pinyin: Yáng Yǔ; born 18 April 1985) is a Chinese professional football player who currently plays as a midfielder for Qingdao Huanghai.

Yang Yu broke into the senior side of Liaoning FC on September 12, 2006, coming on as a late substitute in a 2–1 league victory over Dalian Shide. After making his debut he remained a fringe player within the squad, but he saw enough playing time to score his first goal against Shandong Luneng in a league game in August, a 2–1 victory. Often a peripheral member of the squad, it was only once Liaoning were relegated at the end of the 2008 league season and playing in the second tier that Yang Yu was given the chance to establish himself in the team. Even during the 2009 league season in the second tier he still had to wait for a regular place, and he only became a vital member of the side during the second half of the season, when Liaoning were pushing for the division title and promotion back to the top tier.

On 2 February 2018, Yang transferred to China League One side Qingdao Huanghai. He made his debut for the club in a league game against Dalian Transcendence F.C. After that game he became an integral member of the team that went on to win the 2019 China League One title and promotion to the top tier.