Issue
I have to solve a linear system of equations with multiple right-hand sides, A*X=B, where both A and B are (upper) triangular, real, square matrices. The size is about 200 by 200. Is there a fast method for this in Python/NumPy?
I was considering looping over the columns: [edit: included a 7x7 example as requested in the comments.]
import numpy as np
import scipy.linalg as sp

A = np.array(
    [[1., 0.44615865, 0.39541532, 0.24977742,  0.0881614,  0.26116991,  0.4138066 ],
     [0., 0.89495389, 0.24253783, 0.4514874,   0.12356345, 0.22552021,  0.48408527],
     [0., 0.,         0.88590187, 0.03860599,  0.19887529, 0.03114347, -0.02639242],
     [0., 0.,         0.,         0.85573357, -0.05867366, 0.85120741,  0.25861816],
     [0., 0.,         0.,         0.,          0.96641899, 0.14020408,  0.26514478],
     [0., 0.,         0.,         0.,          0.,         0.36844234,  0.50505032],
     [0., 0.,         0.,         0.,          0.,         0.,          0.44885192]])
B = np.triu(np.array(
    [[949.43526038, 550.35234482, 232.34981032, -176.85444188, -143.39220636, 198.43783458, 60.7140828]]
    ).T @ np.ones((1, 7)))

n = A.shape[0]
X = np.zeros((n, n))
for i in range(n):
    # Solve the leading (i+1)x(i+1) system against the nonzero part of column i of B.
    X[:i+1, i] = sp.solve_triangular(A[:i+1, :i+1], B[:i+1, i])
But this does not use fast matrix-matrix operations.
I could also do all right-hand sides simultaneously, X = solve_triangular(A, B), but this does not take the triangular structure of B into account.
Finally, I could invert A and multiply by B, X = inv(A) @ B, but inverting matrices is generally discouraged.
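For reference, a minimal sketch (reusing A, B, and the X computed by the loop above) that carries out both alternatives and checks that they agree with the column-by-column solve:

# Full-matrix solve: correct, but ignores the zeros in the lower part of B.
X_full = sp.solve_triangular(A, B)

# Explicit inverse: also correct on this example, but generally less accurate and more costly.
X_inv = np.linalg.inv(A) @ B

print(np.allclose(X, X_full), np.allclose(X, X_inv))  # expected: True True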
Solution
To answer my own question: I couldn't find a single routine that exploits the triangular structure of B and uses BLAS3 operations. I ended up using a blocked version of the loop-over-columns approach from my question, treating a block of columns at a time.
X = np.zeros((n, n))
bs = 32  # block size
for bst in range(0, n, bs):   # bst = block start
    bsn = min(bst + bs, n)    # bsn = start of the next block (end of this one)
    X[:bsn, bst:bsn] = sp.solve_triangular(
        A[:bsn, :bsn], B[:bsn, bst:bsn])
On the plus side, this exploits the triangular structure of B and uses BLAS3 operations. The disadvantage is that the block size has to be tuned.
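Wrapped in a small helper (the function name blocked_triu_solve and the default block size are my own choices for illustration, not part of the original answer), the blocked solve can be checked against the plain full-matrix solve:

def blocked_triu_solve(A, B, bs=32):
    # Solve A @ X = B for upper-triangular A and B, one block of columns at a time.
    n = A.shape[0]
    X = np.zeros_like(B)
    for bst in range(0, n, bs):
        bsn = min(bst + bs, n)
        X[:bsn, bst:bsn] = sp.solve_triangular(A[:bsn, :bsn], B[:bsn, bst:bsn])
    return X

print(np.allclose(blocked_triu_solve(A, B), sp.solve_triangular(A, B)))  # expected: True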
Answered By - C.Schroeder