
Generating Similar Words (or Synonyms) with Word Embeddings (Word2Vec)



We have a search engine, and when a user types in Tacos, we also want to search for similar words, such as Chilis or Burritos.

However, the user may also search with multiple keywords, such as Tacos Mexican Restaurants, and we still want to find similar words such as Chilis or Burritos.

What we do is add the vectors of all the keywords together. This sometimes works, but with more keywords the combined vector tends to end up in a region of the embedding space with no close neighbors.
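For reference, here is a minimal sketch of this add-the-vectors approach using gensim (an assumption on my part; the GloVe file name, its prior conversion to word2vec format, and the query words are placeholders, not part of the original setup):

```python
import numpy as np
from gensim.models import KeyedVectors

# Load pre-trained GloVe vectors. The file is assumed to have been converted
# to word2vec text format beforehand (e.g. with gensim's glove2word2vec
# script, or by passing no_header=True in gensim >= 4.0).
kv = KeyedVectors.load_word2vec_format("glove.6B.100d.word2vec.txt", binary=False)

query = ["tacos", "mexican", "restaurants"]
query = [w for w in query if w in kv]  # drop out-of-vocabulary words

# Option 1: let gensim combine the query words itself.
# most_similar() averages the unit-normalized vectors of the positive words.
print(kv.most_similar(positive=query, topn=10))

# Option 2: combine by hand, then look up neighbors of the resulting vector.
mean_vec = np.mean([kv[w] for w in query], axis=0)
print(kv.similar_by_vector(mean_vec, topn=10))
```

Normalizing each word vector to unit length before averaging (which is what `most_similar` does internally) keeps any single high-magnitude word vector from dominating the combined query direction, which often mitigates the "no neighbors" effect described above.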



Is there an approach that works with not just one word but multiple words and still returns similar results? We are using the pre-trained GloVe vectors from Stanford; would it help to train on food-related articles and use those domain-specific word embeddings for this task?
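As for the last point, a rough sketch of training domain-specific embeddings on a food-related corpus with gensim is shown below (the corpus file and hyperparameters are placeholder assumptions; whether this actually outperforms the general-purpose GloVe vectors would need to be checked empirically):

```python
from gensim.models import Word2Vec
from gensim.models.word2vec import LineSentence

# "food_articles.txt" is a hypothetical corpus: one pre-tokenized,
# lowercased sentence per line, e.g. "the taco stand also serves burritos".
corpus = LineSentence("food_articles.txt")

model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # embedding dimension (gensim >= 4.0 parameter name)
    window=5,          # context window size
    min_count=5,       # ignore words appearing fewer than 5 times
    workers=4,
    epochs=10,
)

# Query the domain-specific embeddings the same way as before.
print(model.wv.most_similar("tacos", topn=10))
```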










Tags: word2vec, word-embeddings






asked 5 hours ago by user1157751