Eigen: Is it possible to create a LeastSquareDiagonalPreconditioner-like conditioner if I can only compute Aty...
I want to solve a least-squares-like system A^t * A * x = -A^t * b (I'm implementing the Gauss-Newton method for a special problem).
I wrote special routines that allow me to compute the A * x and A^t * y products. With such routines it's easy to use matrix-free solvers, thanks to Eigen.
But my approach does not converge as well as Eigen::LeastSquaresConjugateGradient. I made a small test, and it looks like LeastSquareDiagonalPreconditioner speeds up convergence a lot.
My question is: how can I use LeastSquareDiagonalPreconditioner, or implement my own preconditioner, if I can only compute matrix products?
I'm not very good at understanding preconditioning/conjugate-gradient stuff, btw.
EDIT
For clarity: I want to use the matrix-free solvers from Eigen with my product routines.
EDIT 2
The matrix-vector products were obtained by applying forward- and reverse-mode autodiff to some objective functions.
c++ linear-algebra eigen least-squares eigen3
In my understanding, the preconditioning simply consists of replacing an equation Ax=b by AtAx=Atb, to get an equation with a symmetric positive-definite matrix. This allows the use of iterative methods. You already have such a symmetric matrix. It is difficult to answer you without more details about your implementation of the Gauss-Newton method
– Damien
Nov 8 at 9:54
I don't understand if you want to use Eigen, or only a subset of Eigen functions, or no Eigen function at all
– Damien
Nov 8 at 9:57
@Damien thank you for the comment! I edited the post - I want to use Eigen solvers with my product routines
– Dark_Daiver
Nov 8 at 10:15
Is the actual question how to compute the diagonal of A^t*A, or, if you have that, how to use this as a preconditioner? The second problem is answered by @ggael; for the first problem you'd need to describe more about A (and maybe it will be more a math question than a programming question).
– chtz
Nov 8 at 16:47
I don't see any obvious solution except computing the squared norm of each A*Unit(i) -- maybe you can accelerate computing these, since you know that only a single element is non-zero each time.
– chtz
Nov 9 at 14:41
edited Nov 9 at 7:26
asked Nov 8 at 8:24
Dark_Daiver
1 Answer
accepted
The easiest might be to implement your own preconditioner class inheriting DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically you need to compute:
m_invdiag(j) = 1./mat.col(j).squaredNorm();
for all columns j, using a strategy similar to what you already implemented for the product operators.
answered Nov 8 at 14:51
ggael