Issue
I'm trying to adapt a big radiative transfer code written in Python to use the GPU, since I perform the same computation many times and it can be done in parallel. I'm a newbie when it comes to shaders, but found this, which seems to offer what I want. It uses the Arcade Python module and runs fine, as is, on my machine. I started to implement this in my radiative transfer code, but got an error that I can't find anywhere on the internet:
(python:20832): GLib-GIO-CRITICAL **: 14:10:15.559: g_application_run() cannot acquire the default main context because it is already acquired by another thread!
(python:20832): Gdk-WARNING **: 14:10:15.563: gdk_gl_context_make_current() failed
I believe it is linked to the use of matplotlib in the same program. I could reproduce this error in the demo code (their GitHub) from Arcade by importing matplotlib.pyplot in the main script and trying to plot a dummy figure, like so:
import matplotlib.pyplot as plt
import arcade
from arcade.gl import BufferDescription
# ... rest of the demo's main.py file (MyWindow definition, etc.) ...

if __name__ == '__main__':
    f = plt.figure()
    plt.plot(range(10), range(10))
    plt.show()

    app = MyWindow()
    arcade.run()
I've read a lot of forums and manuals about threads and contexts, but couldn't find any mention of this error or how to solve it. Can I set up two separate contexts for matplotlib and the compute shader? In my code I will use matplotlib before or after the compute shader, but not at the same time. Also, the shader will run once, not every frame as in the demo example.

I'm aware that this might be a dumb question, because I don't really understand what I'm doing. However, all the resources I found on the subject are not oriented toward beginners, and speak a foreign language to me. So I would gladly take a beginner's tutorial if you know a good one, or more suitable methods to use compute shaders in Python.
Thanks in advance for your help
Solution
After a lot of searching around and waiting for a miracle below this post, I think I found a better way of doing what I'm trying to do. First, I found the ModernGL Python module, which seems more adequate for my purpose because it is not a game engine and focuses on shaders. Then, using their compute shader example on GitHub, I was able to experiment. I found out that, at the very end of the file, I could use matplotlib if I released the ModernGL context first, like so:
# Beginning of the example file
...
import matplotlib.pyplot as plt
...

# First plot with matplotlib
plt.plot(range(10), range(10), 'r')
...

# Creation of the ModernGL context (compute shaders require OpenGL 4.3)
context = moderngl.create_standalone_context(require=430)

# Compute shader usage
...

# End of the example file: release the ModernGL context so matplotlib
# can make its own context current again and plotting works
context.release()

# Plot a second graph and display both
# (note: range(10) + 1 is a TypeError in Python 3, so build the list explicitly)
plt.plot(range(10), [i + 1 for i in range(10)], 'b')
plt.show()
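To make the pattern above concrete, here is a minimal self-contained sketch, assuming `moderngl` and `numpy` are installed and the GPU supports OpenGL 4.3. The GLSL kernel and the function names (`run_on_gpu`, `double_on_cpu`) are illustrative, not from the original code; the kernel just doubles every element of a float buffer, and the context is released before returning so matplotlib can be used afterwards.

```python
import numpy as np

# GLSL compute kernel: doubles every element of a float storage buffer.
COMPUTE_SHADER = """
#version 430
layout(local_size_x = 64) in;
layout(std430, binding = 0) buffer Data { float values[]; };
void main() {
    uint i = gl_GlobalInvocationID.x;
    values[i] = 2.0 * values[i];
}
"""

def double_on_cpu(data):
    # CPU reference implementation of the kernel above.
    return 2.0 * np.asarray(data, dtype="f4")

def run_on_gpu(data):
    # One-shot dispatch: create the context, run the kernel once,
    # read the result back, then release the context so matplotlib
    # can take over afterwards.
    import moderngl
    data = np.asarray(data, dtype="f4")
    ctx = moderngl.create_standalone_context(require=430)
    shader = ctx.compute_shader(COMPUTE_SHADER)
    buf = ctx.buffer(data.tobytes())
    buf.bind_to_storage_buffer(binding=0)
    shader.run(group_x=max(1, data.size // 64))
    result = np.frombuffer(buf.read(), dtype="f4")
    ctx.release()  # release before any matplotlib call
    return result
```

On a machine with a suitable GPU, `run_on_gpu(np.arange(128))` should match `double_on_cpu(np.arange(128))`, and matplotlib plotting works normally once `ctx.release()` has run.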
I hope this can help some lost souls someday ;)
Answered By - Azireo