


Resource and useful tips on Transfer Learning in NLP



























I have a small amount of labeled data for training and testing a DNN. The main goal of my work is to train a model that can do binary classification of text. For this purpose, I have around 3,000 labeled examples and 60,000 unlabeled examples available. The data consists of instructions (e.g., "open the door" [label 1], "give me a cup of water" [label 1], "give me money" [label 0]). I have heard that transferring knowledge from other models can help a lot in this setting. Can anyone point me to useful resources on transfer learning in the NLP domain?



I have already run a few experiments. I used GloVe as pretrained embeddings and then tested on my labeled data, but only got around 70% accuracy. I also tried embeddings built on my own data (63k examples) and then trained the model, which got 75% accuracy on the test data. My model architecture is given below.

[model architecture diagram]
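Roughly, the GloVe version of my setup looks like the sketch below; the file name, dimensions, and layer sizes here are illustrative assumptions, not my exact architecture from the diagram:

    import numpy as np
    from tensorflow.keras.preprocessing.text import Tokenizer
    from tensorflow.keras.preprocessing.sequence import pad_sequences
    from tensorflow.keras.initializers import Constant
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

    # Toy instruction data in the same style as my corpus.
    texts = ["open the door", "give me a cup of water", "give me money"]
    labels = np.array([1, 1, 0])
    embed_dim = 100

    tokenizer = Tokenizer()
    tokenizer.fit_on_texts(texts)
    x = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=20)

    # Load pretrained GloVe vectors (e.g. glove.6B.100d.txt, downloaded separately).
    glove = {}
    with open("glove.6B.100d.txt", encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split()
            glove[parts[0]] = np.asarray(parts[1:], dtype="float32")

    # Copy GloVe vectors into the embedding matrix; out-of-vocabulary rows stay zero.
    vocab_size = len(tokenizer.word_index) + 1
    embedding_matrix = np.zeros((vocab_size, embed_dim))
    for word, i in tokenizer.word_index.items():
        if word in glove:
            embedding_matrix[i] = glove[word]

    model = Sequential([
        Embedding(vocab_size, embed_dim,
                  embeddings_initializer=Constant(embedding_matrix),
                  trainable=False),        # keep the pretrained vectors fixed
        GlobalAveragePooling1D(),          # average word vectors per sentence
        Dense(32, activation="relu"),
        Dense(1, activation="sigmoid"),    # binary instruction label
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, labels, epochs=5)

The embedding layer is frozen (trainable=False), so the GloVe vectors are used as-is rather than fine-tuned.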



Q1: A quick question: would it count as transfer learning if I use GloVe embeddings in my model?



Any kind of help is welcome, including ideas for building a model without using transfer learning.










Tags: deep-learning, nlp, convnet, word-embeddings, transfer-learning






asked Aug 20 '18 at 5:12 by faysal
1 Answer



















          If you use pre-trained models on data that are distinct from the data they were originally trained on, it's transfer learning. Your two-class sentence corpus is distinct from the data that the GloVe embeddings were generated on, so this could be considered a form of transfer learning. This might be a helpful explainer for general ideas around pre-training (and why it's a worthy pursuit).



Recent work in the NLP transfer learning space that I'm aware of is ULMFiT by Howard and Ruder of fast.ai (here's the paper if you prefer that). OpenAI also has recent work extending the Transformer model with an unsupervised pre-training, task-specific fine-tuning approach.
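In code, the general pattern behind both approaches looks roughly like the sketch below; the encoder here is a dummy stand-in for whatever pretrained language model you would actually load, not a specific library API:

    import torch
    import torch.nn as nn

    class SentenceClassifier(nn.Module):
        """Pretrained encoder body plus a new, randomly initialised task head."""
        def __init__(self, encoder, hidden_dim, n_classes=2):
            super().__init__()
            self.encoder = encoder                        # reused pretrained weights
            self.head = nn.Linear(hidden_dim, n_classes)  # trained from scratch

        def forward(self, x):
            # Assume the encoder emits one feature vector per sentence.
            return self.head(self.encoder(x))

    # Dummy stand-in for a real pretrained language-model encoder.
    pretrained_encoder = nn.Sequential(nn.Linear(100, 64), nn.ReLU())
    model = SentenceClassifier(pretrained_encoder, hidden_dim=64)

    # Stage 1: freeze the encoder; train only the head on the labeled task data.
    for p in model.encoder.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)

    # Stage 2: unfreeze and fine-tune everything, giving the pretrained weights a
    # much smaller learning rate (ULMFiT calls this discriminative fine-tuning).
    for p in model.encoder.parameters():
        p.requires_grad = True
    opt = torch.optim.Adam([
        {"params": model.encoder.parameters(), "lr": 1e-5},
        {"params": model.head.parameters(), "lr": 1e-3},
    ])

    logits = model(torch.randn(4, 100))  # shape: (batch, n_classes)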



As for your task, I think it might be more helpful to explore research around sentence classification than to dig deeply into transfer learning. For your purposes, embeddings seem to be a means of getting a reasonable representation of your data, rather than a way to prove that Common Crawl (or some other dataset) generalizes to your corpus.



          Hope that helps, good luck!






answered Aug 20 '18 at 15:14 by tm1212












– faysal (Aug 26 '18 at 20:33): Your comments are really helpful. Yes, I am aware of fast.ai. I was also thinking of doing sentence classification without transfer learning, and I am getting around 87-90% accuracy that way. Do you think that is reasonable accuracy when training 673,606 parameters with around 3k labeled examples?










– tm1212 (Aug 27 '18 at 15:10): I don't know enough about the data, difficulty of the task, or the final application of what you're working on to have a well-founded answer for you.










