Eigen: Is it possible to create LeastSquareDiagonalPreconditioner-like conditioner if i only can compute Aty...























I want to solve a least-squares-like system A^t * A * x = -A^t * b. (I'm implementing the Gauss-Newton method for a special problem.)



I wrote special routines that allow me to compute the products A * x and A^t * y. With such routines it's easy to use matrix-free solvers, thanks to Eigen.



But my approach doesn't converge as well as Eigen::LeastSquaresConjugateGradient. I ran a small test, and it looks like LeastSquareDiagonalPreconditioner speeds up convergence a lot.



My question is: how can I use LeastSquareDiagonalPreconditioner, or implement my own preconditioner, if I can only compute matrix products?
I'm not very familiar with preconditioning/conjugate-gradient techniques, by the way.



EDIT



For clarity: I want to use Eigen's matrix-free solvers with my product routines.



EDIT 2



The matrix-vector products are obtained by applying forward- and reverse-mode autodiff to some objective functions.
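As an aside for readers unfamiliar with how autodiff yields such product routines: forward mode propagates a tangent alongside each value, so one pass through the residual function produces J * x without ever forming the Jacobian J. Below is a minimal dual-number sketch; the residual function r(u) is made up purely for illustration and is not the asker's objective.

```cpp
#include <vector>

// A dual number carries a primal value and a tangent; arithmetic on duals
// applies the product/difference rules, so running the residual function on
// duals yields the Jacobian-vector product J*x in the tangent slots.
struct Dual {
  double val;  // primal value
  double dot;  // tangent (directional derivative along x)
};
Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.val * b.dot + a.dot * b.val}; }
Dual operator-(Dual a, Dual b) { return {a.val - b.val, a.dot - b.dot}; }

// Made-up residual r(u) = (u0*u1 - 2, u1*u1 - 3); returns J(u) * x.
std::vector<double> jacobian_vector_product(const std::vector<double>& u,
                                            const std::vector<double>& x) {
  Dual u0{u[0], x[0]}, u1{u[1], x[1]};
  Dual c2{2.0, 0.0}, c3{3.0, 0.0};  // constants have zero tangent
  Dual r0 = u0 * u1 - c2;
  Dual r1 = u1 * u1 - c3;
  return {r0.dot, r1.dot};
}
```

Reverse mode works analogously but accumulates adjoints backwards, which is what gives the transposed product A^t * y.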


































  • In my understanding, the preconditioner simply consists of replacing an equation Ax=b by AtAx=Atb, to get an equation with a symmetric positive-definite matrix; this allows the use of iterative methods. You already have such a symmetric matrix. It's difficult to answer without more details about your implementation of the Gauss-Newton method.
    – Damien
    Nov 8 at 9:54










  • I don't understand whether you want to use Eigen, only a subset of Eigen's functions, or no Eigen functions at all.
    – Damien
    Nov 8 at 9:57










  • @Damien thank you for the comment! I edited the post: I want to use Eigen solvers with my product routines.
    – Dark_Daiver
    Nov 8 at 10:15






  • Is the actual question how to compute the diagonal of A^t*A or, if you have that, how to use it as a preconditioner? The second problem is answered by @ggael; for the first problem you'd need to describe more about A (and maybe it will be more a math question than a programming question).
    – chtz
    Nov 8 at 16:47






  • I don't see any obvious solution except computing the squared norm of each A*Unit(i) -- maybe you can accelerate computing these, since you know that only a single element is non-zero each time.
    – chtz
    Nov 9 at 14:41
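The unit-vector idea from the comment above can be sketched in plain C++ without Eigen: given only a routine that computes y = A * x, the diagonal of A^t*A can be recovered one entry at a time, since (A^t*A)(j,j) = ||A * e_j||^2 for the j-th unit vector e_j. The `MatVec` callback here is a hypothetical stand-in for the asker's product routine.

```cpp
#include <functional>
#include <vector>

// Hypothetical signature for a user-supplied y = A*x routine.
using MatVec = std::function<std::vector<double>(const std::vector<double>&)>;

// Recover diag(A^t*A) using only products with unit vectors:
// (A^t*A)(j,j) = ||A * e_j||^2.
std::vector<double> diag_of_AtA(const MatVec& matvec, std::size_t ncols) {
  std::vector<double> diag(ncols, 0.0);
  std::vector<double> e(ncols, 0.0);
  for (std::size_t j = 0; j < ncols; ++j) {
    e[j] = 1.0;                                 // j-th unit vector
    const std::vector<double> col = matvec(e);  // = j-th column of A
    double sq = 0.0;
    for (double v : col) sq += v * v;           // ||A e_j||^2
    diag[j] = sq;
    e[j] = 0.0;                                 // reset for the next column
  }
  return diag;
}
```

This costs one A*x product per column, which is exactly the overhead chtz alludes to; exploiting the single-nonzero structure of e_j inside the product routine is what would make it cheap.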















c++ linear-algebra eigen least-squares eigen3


edited Nov 9 at 7:26


asked Nov 8 at 8:24

Dark_Daiver





1 Answer

















accepted










The easiest might be to implement your own preconditioner class, inheriting from DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically you need to compute:



 m_invdiag(j) = 1./mat.col(j).squaredNorm();


for each column j, using a strategy similar to what you already implemented for the product operators.






answered Nov 8 at 14:51

ggael





























