Issue
Is there a simple way to reduce the size of a 3D matrix by averaging blocks of a certain size, in NumPy, SciPy, or even NetCDF tools or something similar? I wrote a 2D version using strides a while back, but a ready-to-use function would help a lot.
Edit:
Example of what I'd like my input and output to look like:
Input's shape: (500, 500, 100)
Calling the function: downsize(input, 10, 10, 10, func)
Output's shape: (50, 50, 10)
where every cell's value is the result of func
on consecutive 10x10x10 submatrices.
Alternatively, the code can get the desired matrix size as input instead of the size of the submatrices and figure them out.
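The alternative interface is straightforward: the block size along each axis is just the integer quotient of the input and output dimensions. The helper below, blocks_from_target, is an illustrative sketch of that idea (the name is hypothetical, not an existing NumPy function), assuming each input dimension is an exact multiple of the corresponding output dimension:

```python
def blocks_from_target(in_shape, out_shape):
    # Derive the per-axis block size from the desired output shape.
    # Assumes every input dimension is an exact multiple of the
    # corresponding output dimension; otherwise no uniform blocking exists.
    if any(i % o for i, o in zip(in_shape, out_shape)):
        raise ValueError("output shape must evenly divide the input shape")
    return tuple(i // o for i, o in zip(in_shape, out_shape))

# For the shapes in the example above:
# blocks_from_target((500, 500, 100), (50, 50, 10)) -> (10, 10, 10)
```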
Thanks
Solution
Here's an approach using reshaping: split each axis into two, giving six axes in total, then average over the second axis of each pair (axes 1, 3 and 5) to get the blockwise average -
import numpy as np

def blockwise_average_3D(A, S):
    # A is the 3D input array
    # S is the blocksize on which averaging is to be performed
    # (each dimension of A must be an exact multiple of the matching entry of S)
    m, n, r = np.array(A.shape) // S  # number of blocks along each axis
    return A.reshape(m, S[0], n, S[1], r, S[2]).mean(axis=(1, 3, 5))
Sample runs -
In [107]: A = np.random.randint(0,255,(500,500,100)) # 3D Input array
...: S = (10,10,10) # Blocksize
...:
In [108]: out = blockwise_average_3D(A,S)
In [109]: out[0,0,0]
Out[109]: 124.242
In [110]: A[:10,:10,:10].mean()
Out[110]: 124.242
In [111]: out[0,1,0]
Out[111]: 129.89400000000001
In [112]: A[:10,10:20,:10].mean()
Out[112]: 129.89400000000001
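The question asked for an arbitrary func, not just averaging. The same reshape trick works for any NumPy reduction that accepts a tuple of axes (np.mean, np.sum, np.max, np.median, ...). The following generalization is a sketch along those lines; the name blockwise_reduce_3D is illustrative, not part of the original answer:

```python
import numpy as np

def blockwise_reduce_3D(A, S, func=np.mean):
    # Apply func over each S[0] x S[1] x S[2] block of the 3D array A.
    # func must accept an `axis` keyword with a tuple of axes, as the
    # standard NumPy reductions do. Assumes A's dimensions are exact
    # multiples of the matching entries of S.
    m, n, r = np.array(A.shape) // S
    return func(A.reshape(m, S[0], n, S[1], r, S[2]), axis=(1, 3, 5))
```

With func=np.mean this reproduces blockwise_average_3D; passing np.max instead gives a blockwise maximum, and so on.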
Answered By - Divakar