Rating prediction recommendation model based on dual-level attention mechanism
  • Chinese title: 基于双层注意力机制的评分预测推荐模型
  • Authors: LI Yuyu (李钰钰); LANG Congyan (郎丛妍); FENG Songhe (冯松鹤)
  • Affiliation: School of Computer Science and Information Technology, Beijing Jiaotong University
  • Keywords: rating prediction; deep learning; convolutional neural network; attention mechanism
  • Journal: China Sciencepaper (中国科技论文), journal code ZKZX
  • Publication date: 2018-09-23
  • Year/Volume/Issue: 2018, v.13, No.18
  • Language: Chinese
  • Pages: 25-30 (6 pages)
  • Article ID: ZKZX201818005
  • CN: 10-1033/N
Abstract

This paper designs two parallel convolutional neural networks to jointly learn the hidden feature representations of users and products. The model considers both the fine-grained word level and the coarse-grained review level: concatenated word vectors and review vectors serve as the network input, and a semantic first-order jump method based on Word2vec is used to represent the review vectors, further enriching the semantic expression of the reviews. An attention layer placed before the convolutional layers reinforces the contribution of important features to rating prediction and increases the interpretability of the model. The top layer uses a factorization machine to model the interaction of higher-order latent features for rating prediction. Experimental results show that the proposed method achieves a lower root mean square error (RMSE) than the benchmark methods, effectively improving rating prediction accuracy.
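The pipeline the abstract describes (attention weighting before convolution on each of two parallel text towers, then a factorization machine over the joint latent vector) can be sketched in a minimal numpy forward pass. Everything here is illustrative: the dimensions, the random weights, and the helper names (`attention`, `conv1d_maxpool`, `fm_predict`) are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention(E):
    """Score each word vector, softmax-normalize, and reweight it (word-level attention sketch)."""
    w = rng.normal(size=E.shape[1])              # hypothetical attention scoring vector
    s = E @ w
    a = np.exp(s - s.max()); a /= a.sum()        # softmax over words
    return E * a[:, None]

def conv1d_maxpool(E, K):
    """Valid 1-D convolution over the word axis, then max-over-time pooling to one feature."""
    n, k = E.shape[0], K.shape[0]
    feats = np.array([np.sum(E[i:i + k] * K) for i in range(n - k + 1)])
    return feats.max()

def fm_predict(x, w0, w, V):
    """Factorization machine: bias + linear term + pairwise interactions via factor matrix V."""
    inter = 0.5 * np.sum((x @ V) ** 2 - (x ** 2) @ (V ** 2))
    return w0 + x @ w + inter

# Toy inputs: 8 words x 16-dim embeddings for the user-side and item-side review documents.
Eu, Ei = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))
kernels = rng.normal(size=(4, 3, 16))            # 4 convolution kernels of width 3

u = np.array([conv1d_maxpool(attention(Eu), K) for K in kernels])   # user latent features
v = np.array([conv1d_maxpool(attention(Ei), K) for K in kernels])   # item latent features

x = np.concatenate([u, v])                       # joint vector fed to the FM top layer
w0, w, V = 0.0, rng.normal(size=8), rng.normal(size=(8, 4))
rating = fm_predict(x, w0, w, V)
print(float(rating))
```

The FM second-order term uses Rendle's identity, which reduces the pairwise interaction sum to linear time in the feature dimension; in the full model the weights would of course be learned against the RMSE objective rather than drawn at random.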
References
[1] CHEN Y, SUN X Y, GONG D W, et al. Personalized search inspired fast interactive estimation of distribution algorithm and its application[J]. IEEE Transactions on Evolutionary Computation, 2017, 21(4): 588-600.
[2] BOBADILLA J, ORTEGA F, HERNANDO A, et al. Recommender systems survey[J]. Knowledge-Based Systems, 2013, 46(1): 109-132.
[3] GLOROT X, BORDES A, BENGIO Y. Deep sparse rectifier neural networks[C]∥Proceedings of the 14th International Conference on Artificial Intelligence and Statistics. Fort Lauderdale, USA: JMLR, 2011: 315-323.
[4] MCAULEY J, LESKOVEC J. Hidden factors and hidden topics: understanding rating dimensions with review text[C]∥Proceedings of the 7th ACM Conference on Recommender Systems. Hong Kong, China: ACM, 2013: 165-172.
[5] LING G, LYU M R, KING I. Ratings meet reviews, a combined approach to recommend[C]∥Proceedings of the 8th ACM Conference on Recommender Systems. Foster City, USA: ACM, 2014: 105-112.
[6] BLEI D M, NG A Y, JORDAN M I. Latent Dirichlet allocation[J]. Journal of Machine Learning Research, 2003, 3: 993-1022.
[7] LECUN Y, BENGIO Y, HINTON G. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
[8] WANG H, WANG N Y, YEUNG D Y. Collaborative deep learning for recommender systems[C]∥Proceedings of the 21st ACM International Conference on Knowledge Discovery and Data Mining. Sydney, Australia: ACM, 2015: 1235-1244.
[9] VINCENT P, LAROCHELLE H, LAJOIE I, et al. Stacked denoising autoencoders: learning useful representations in a deep network with a local denoising criterion[J]. Journal of Machine Learning Research, 2010, 11: 3371-3408.
[10] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]∥Proceedings of the 26th International Conference on Neural Information Processing Systems. Lake Tahoe, USA: MIT Press, 2013, 26: 3111-3119.
[11] PENNINGTON J, SOCHER R, MANNING C. GloVe: global vectors for word representation[C]∥Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, Qatar: ACL, 2014: 1532-1543.
[12] KIM D, PARK C, OH J, et al. Convolutional matrix factorization for document context-aware recommendation[C]∥Proceedings of the 10th ACM Conference on Recommender Systems. Boston, USA: ACM, 2016: 233-240.
[13] SALAKHUTDINOV R, MNIH A. Probabilistic matrix factorization[C]∥Proceedings of the 20th International Conference on Neural Information Processing Systems. Vancouver, Canada: MIT Press, 2007: 1257-1264.
[14] ZHENG L, NOROOZI V, YU P S. Joint deep modeling of users and items using reviews for recommendation[C]∥Proceedings of the 10th ACM International Conference on Web Search and Data Mining. Cambridge, UK: ACM, 2017: 425-434.
[15] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[Z]. arXiv preprint arXiv:1409.0473, 2014.
[16] XIAO J, YE H, HE X N, et al. Attentional factorization machines: learning the weight of feature interactions via attention networks[C]∥Proceedings of the 26th International Joint Conference on Artificial Intelligence. Melbourne, Australia: Morgan Kaufmann, 2017: 3119-3125.
[17] RENDLE S. Factorization machines[C]∥Proceedings of the 10th IEEE International Conference on Data Mining. Sydney, Australia: IEEE, 2010: 995-1000.
[18] SEO S Y, HUANG J, YANG H, et al. Interpretable convolutional neural networks with dual local and global attention for review rating prediction[C]∥Proceedings of the 11th ACM Conference on Recommender Systems. Como, Italy: ACM, 2017: 297-305.
[19] LIN M, CHEN Q, YAN S C. Network in network[Z]. arXiv preprint arXiv:1312.4400, 2013.
[20] KIM Y. Convolutional neural networks for sentence classification[C]∥Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, Qatar: ACL, 2014: 1746-1751.
[21] KOREN Y, BELL R, VOLINSKY C. Matrix factorization techniques for recommender systems[J]. Computer, 2009, 42(8): 30-37.
[22] KINGMA D P, BA J. Adam: a method for stochastic optimization[Z]. arXiv preprint arXiv:1412.6980, 2014.
[23] HE R N, MCAULEY J. Ups and downs: modeling the visual evolution of fashion trends with one-class collaborative filtering[C]∥Proceedings of the 25th International Conference on World Wide Web. Montreal, Canada: ACM, 2016: 507-517.
