SVM radial basis: generating the equation for the hyperplane



I would be very grateful for some help with generating the hyperplane equation. I have two independent variables and one binary dependent variable, and I need the equation of the separating hyperplane.

I am working from the following SVM decision function:

$f(x) = \operatorname{sgn}\left( \sum_i \alpha_i \, y_i \, K(\mathrm{sv}_i, x) + b \right)$

My two independent variables (say P and Q) have 130 observed values each. I used an SVM with a radial basis function kernel for binary classification (0 and 1). From the fit I now have one column of 51 values of $y_i \alpha_i$ (the dual coefficients), two columns of 51 support vectors (one for P and one for Q), and a single value for $b$. I obtained these using scikit-learn's SVC:

https://scikit-learn.org/stable/modules/svm.html

So how can I generate the equation now? Can I multiply the 51 dual coefficients $y_i \alpha_i$ with the 51 support-vector values of each variable P and Q, so that I get one coefficient per variable and the equation becomes $f(x) = \operatorname{sgn}(mP + nQ + b)$, where $m$ is the sum of the products of the 51 support-vector values of P with the 51 dual coefficients, and $n$ is the corresponding sum for Q?
I would be grateful for any kind of suggestion. Many thanks in advance.
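
To make the setup concrete, here is a minimal self-contained sketch of what I did (synthetic data and an arbitrary gamma standing in for my real P, Q, and settings), including how the decision value is reassembled from the dual coefficients, the support vectors, and $b$:

```python
# Minimal sketch of the setup; synthetic data stands in for the real P and Q.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(130, 2))             # columns: P and Q
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # binary target (0/1)

clf = SVC(kernel="rbf", gamma=0.5).fit(X, y)

# The fitted pieces referred to above:
dual = clf.dual_coef_        # y_i * alpha_i, shape (1, n_SV)
sv = clf.support_vectors_    # support vectors, shape (n_SV, 2)
b = clf.intercept_           # the single value b

# Reassembling the decision value for a new point from the formula:
x_new = np.array([[0.3, -0.1]])
K = rbf_kernel(sv, x_new, gamma=0.5)      # K(sv_i, x_new), shape (n_SV, 1)
manual = (dual @ K + b).ravel()
assert np.allclose(manual, clf.decision_function(x_new))
```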










machine-learning python scikit-learn svm machine-learning-model

asked Feb 25 at 17:31 by Alejandro



















2 Answers



















I'm not sure if I've fully understood you. The radial basis kernel implicitly transforms your features into an infinite-dimensional space, and the dot product of the transformed vectors is exactly the radial basis kernel:

$k(x, y) = \phi(x) \cdot \phi(y)$

where $\phi(x)$ is the feature mapping.

The main reason for using the kernel trick is the ability to work with features in this higher-dimensional space without knowing the mapping explicitly. As a result, your hyperplane has an infinite number of coefficients. You can always expand the radial basis kernel into a Taylor series and obtain some of the initial coefficients.
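
If a finite set of coefficients is acceptable, one related option (an approximation, not the exact Taylor route) is to approximate $\phi(x)$ with random Fourier features via scikit-learn's RBFSampler and then fit a linear SVM on the transformed features. The weights you get live in that approximate feature space, one per random component rather than per original variable. A rough sketch with synthetic data:

```python
# Sketch: approximate the infinite-dimensional phi(x) with a finite number
# of random Fourier features, then fit a linear SVM whose weights live in
# that approximate feature space. Data and gamma are made up.
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X = rng.normal(size=(130, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1).astype(int)

feature_map = RBFSampler(gamma=0.5, n_components=100, random_state=0)
Z = feature_map.fit_transform(X)      # approximate phi(X), shape (130, 100)

lin = LinearSVC(max_iter=10_000).fit(Z, y)
print(lin.coef_.shape)                # (1, 100): one weight per random feature
```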






answered Feb 25 at 20:47 (edited Feb 25 at 20:52) by Michał Kardach
• Let's say I have two independent variables (P and Q) and a binary variable C. I use logistic regression to calculate individual coefficients for P and Q (m, n) plus a constant (b). The equation of the generalized linear model will be $mP + nQ + b$, and I can use it to calculate probabilities. Similarly, if I use a support vector machine, how do I get this kind of generalized-linear-model equation? I have used scikit-learn in Python and also R; all I get is the total number of support vectors, their values, and the values of $\alpha_i x_i$.
  – Alejandro Feb 26 at 6:02



















Let me explain my question in terms of logistic regression. Suppose I am using logistic regression to predict probabilities, with two independent variables (P and Q) and a binary dependent variable C. Logistic regression gives me individual coefficients for P and Q (say m and n respectively) plus a constant (say b). The fundamental equation of the generalized linear model is then $mP + nQ + b$, and I can use it to calculate probabilities.

Similarly, if I am using a support vector machine, how do I get this kind of generalized-linear-model equation? I have used scikit-learn in Python and also R, and all I get is the total number of support vectors, their values, and the values of $\alpha_i x_i$. I need to assign an individual weight to each of the two variables P and Q, plus a bias $b$, so that I can use the equation as a generalized linear model. I do get the constant term from Python, but how am I supposed to generate the coefficients of P and Q with an SVM radial kernel? Is there some way I could assign weights to my two variables and create a linear function that I could use? I would be very grateful for an explanation.
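
For comparison, this is the kind of output I am after. With a linear kernel, scikit-learn does expose exactly these per-variable weights (sketch with synthetic data); I am asking whether anything analogous is possible with the radial kernel:

```python
# Sketch: with a *linear* kernel, SVC exposes per-variable weights (coef_),
# i.e. exactly the m, n, b form described above. Data is synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(130, 2))             # columns: P and Q
y = (2 * X[:, 0] - X[:, 1] > 0).astype(int)

lin = SVC(kernel="linear").fit(X, y)
m, n = lin.coef_[0]                       # weights for P and Q
b = lin.intercept_[0]
print(f"f(P, Q) = sgn({m:.3f}*P + {n:.3f}*Q + {b:.3f})")
```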






answered Feb 26 at 5:52 by Alejandro