What I Read: Contrastive Representation Learning
https://lilianweng.github.io/lil-log/2021/05/31/contrastive-representation-learning.html
"Contrastive Representation Learning" by Lilian Weng, May 31, 2021. "The goal of contrastive representation learning is to learn such an embedding space in which similar sample pairs stay close to each …"
What I Read: Knowledge Graphs with Language Model
https://ai.googleblog.com/2021/05/kelm-integrating-knowledge-graphs-with.html
"KELM: Integrating Knowledge Graphs with Language Model Pre-training Corpora," Thursday, May 20, 2021. Posted by Siamak Shakeri, Staff Software Engineer, and Oshin Agarwal, Research Intern, Google Research. "Alternate sources of information …"
What I Read: Transformer Networks to Answer Questions About Images
https://medium.com/dataseries/microsoft-uses-transformer-networks-to-answer-questions-about-images-with-minimum-training-f978c018bb72
"Microsoft Uses Transformer Networks to Answer Questions About Images With Minimum Training: Unified VLP can understand concepts about scenic images by using pretrained models," by Jesus Rodriguez, Jan 12. "Can we build deep …"