Issue
How do I use torch.utils.cpp_extension.load to link a shared or static library from an external source?
I wrote some functions in C++ and use them in PyTorch, so I call the load function from torch.utils.cpp_extension to build and load a PyTorch C++ extension just-in-time (JIT). This is the content of wrapper.py:
import os
from torch.utils.cpp_extension import load

dir_path = os.path.dirname(os.path.realpath(__file__))
my_func = load(name='my_func',
               sources=[os.path.join(dir_path, 'my_func.cpp')],
               extra_cflags=['-fopenmp', '-O2'],
               extra_ldflags=['-lgomp', '-lrt'])
my_func.cpp
uses OpenMP, so I use the above flags.
Now I want to additionally use several functions from the zstd library in my_func.cpp. After cloning the zstd repository and running make, shared libraries such as libzstd.so, libzstd.so.1, and libzstd.so.1.5.3, as well as the static library libzstd.a, were created.
I've included #include <zstd.h>
inside my_func.cpp
and used zstd's functions.
I now have to modify wrapper.py to tell the compiler that I am using functions from the zstd library.
How can I successfully compile my_func.cpp using PyTorch's torch.utils.cpp_extension.load, and which arguments should I modify? Is it even possible to add an external shared or static library with this method?
Frankly, I'm not familiar with the difference between a static and a shared library, but it seems I can compile my_func.cpp with either one of them; both
g++ -fopenmp -O2 -lgomp -lrt -o my_func my_func.cpp lib/libzstd.so.1.5.3
and
g++ -fopenmp -O2 -lgomp -lrt -o my_func my_func.cpp lib/libzstd.a
work.
I just can't figure out how to do the same compilation using torch.utils.cpp_extension.load.
Sorry for the somewhat lengthy question; I just wanted to make things clear.
Solution
I've figured this out. The extra_ldflags argument of torch.utils.cpp_extension.load can handle this. In my case, I added the libzstd.so file to my repository and passed -lzstd in that argument.
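For reference, here is a minimal sketch of the modified wrapper.py. It assumes libzstd.so sits in a lib/ directory next to wrapper.py; that directory name is my assumption, so point -L at wherever your copy of the library actually lives.

import os
from torch.utils.cpp_extension import load

dir_path = os.path.dirname(os.path.realpath(__file__))
lib_dir = os.path.join(dir_path, 'lib')  # assumed location of libzstd.so

my_func = load(name='my_func',
               sources=[os.path.join(dir_path, 'my_func.cpp')],
               extra_cflags=['-fopenmp', '-O2'],
               # -L<dir> adds a linker search path; -lzstd links against libzstd
               extra_ldflags=['-lgomp', '-lrt', '-L' + lib_dir, '-lzstd'])

Note that when linking against the shared library, the dynamic loader also has to find libzstd.so when the extension is imported, e.g. via LD_LIBRARY_PATH or a system-wide zstd install, unless you link the static libzstd.a instead.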
Answered By - SHM