As research in machine learning continues to progress, achieving good performance without large amounts of complicated data has become a higher priority than pushing a model toward good performance on huge data. In the field of recommendation systems, mining users' interests from limited data is one of the popular research directions.
Session-based recommendation with Graph Neural Networks is a very popular approach: it can make good recommendations from nothing more than simple user browsing records. However, this kind of model has an obvious disadvantage: it cannot perform any action on an item that it has not seen during the training phase, even when that item is not a cold-start item. This is a serious problem in practical applications, since repeatedly retraining a large model is unrealistic and consumes a large amount of resources.
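This limitation is easiest to see in a typical implementation, where item embeddings are looked up from a table whose size is fixed at training time. The following minimal PyTorch sketch (the table sizes and variable names are illustrative assumptions, not the thesis code) shows how an item ID outside the trained vocabulary breaks inference:

```python
import torch
import torch.nn as nn

# Minimal sketch (illustrative sizes, not the thesis code): the item
# embedding table of a session-based GNN recommender is fixed at
# training time, so any item ID outside that table is unusable.
num_trained_items, embed_dim = 5, 4
item_emb = nn.Embedding(num_trained_items, embed_dim)

seen_session = torch.tensor([0, 2, 4])        # items observed during training
print(item_emb(seen_session).shape)           # torch.Size([3, 4])

unseen_session = torch.tensor([0, 2, 7])      # item 7 never appeared in training
try:
    item_emb(unseen_session)                  # lookup has no row for item 7
except IndexError as err:
    # The conventional remedy is to enlarge the table and retrain the
    # whole model, which is exactly the costly step we want to avoid.
    print("unknown item breaks inference:", err)
```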
To solve this problem, a novel controllable addition method is proposed that adds useful representations while affecting the original performance as little as possible. Extensive experiments conducted on several real-world datasets demonstrate the effectiveness and flexibility of our method, which also has the potential to be applied to other models and other tasks.
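To make the general idea concrete, the sketch below is a hypothetical illustration, not the proposed method itself: the function name, the mean initialization of new rows, and the gradient mask are all assumptions. It shows one simple way representations for newly appearing items could be appended to a trained embedding table while the original rows are kept intact, so behaviour on previously seen items is preserved:

```python
import torch
import torch.nn as nn

def add_item_rows(trained_emb: nn.Embedding, num_new: int) -> nn.Embedding:
    """Append rows for newly appearing items to a trained embedding table.

    Hypothetical illustration only: the mean initialization and the
    gradient mask are assumptions, not the proposed controllable method.
    The original rows are copied unchanged and shielded from gradients,
    so the original performance on previously seen items is preserved.
    """
    old = trained_emb.weight.data
    new_rows = old.mean(dim=0, keepdim=True).repeat(num_new, 1)
    grown = nn.Embedding(old.size(0) + num_new, old.size(1))
    grown.weight.data = torch.cat([old, new_rows], dim=0)

    def mask_old_rows(grad: torch.Tensor) -> torch.Tensor:
        # Zero the gradient of the original rows so later fine-tuning
        # only touches the representations that were just added.
        grad = grad.clone()
        grad[: old.size(0)] = 0
        return grad

    grown.weight.register_hook(mask_old_rows)
    return grown

item_emb = add_item_rows(nn.Embedding(5, 4), num_new=3)
print(item_emb(torch.tensor([0, 2, 7])).shape)   # item 7 is now representable
```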