Which version of PyTorch is available with nnstreamer-pytorch?


Hi devs,
I'm trying to use a yolov5s TorchScript (.pt) file, exported from the yolov5 GitHub code, with an nnstreamer pipeline.
My development environment is built from nnstreamer/tools/docker/ubuntu18.04-run/Dockerfile.

Below is my test pipeline:

const char *string = "rtspsrc location=rtsp://address:port/mount latency=0 \
protocols=4 ! rtph265depay ! avdec_h265 ! \
videoscale ! videoconvert ! video/x-raw,format=RGB,width=640,height=640 ! \
tensor_converter ! tensor_filter framework=pytorch \
model=../../tf_model/yolov5s.torchscript.pt \
input=3:640:640:1 inputname=x inputtype=float32 \
output=1:25200:85 outputname=416 outputtype=float32 ! \
tensor_sink name=tensor_sink";

But with this pipeline, I got the error below:

failed to initialize the object: PyTorch
0:00:00.376511035    29 0x559a1ba3d4c0 WARN                GST_PADS gstpad.c:1149:gst_pad_set_active:<tensorfilter0:sink> Failed to activate pad
** Message: 05:11:55.607: gpu = 0, accl = cpu

** (tester:29): CRITICAL **: 05:11:55.610: Exception while loading the model: 

aten::_convolution(Tensor input, Tensor weight, Tensor? bias, int[] stride, int[] padding, int[] dilation, bool transposed, int[] output_padding, int groups, bool benchmark, bool deterministic, bool cudnn_enabled) -> (Tensor):
Expected at most 12 arguments but found 13 positional arguments.
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/conv.py(442): _conv_forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/conv.py(446): forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1090): _slow_forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1102): _call_impl
/usr/src/app/yolov5/models/common.py(49): forward_fuse
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1090): _slow_forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1102): _call_impl
/usr/src/app/yolov5/models/common.py(207): forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1090): _slow_forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1102): _call_impl
/usr/src/app/yolov5/models/yolo.py(149): _forward_once
/usr/src/app/yolov5/models/yolo.py(126): forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1090): _slow_forward
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py(1102): _call_impl
/opt/conda/lib/python3.8/site-packages/torch/jit/_trace.py(958): trace_module
/opt/conda/lib/python3.8/site-packages/torch/jit/_trace.py(741): trace
export.py(71): export_torchscript
export.py(372): run
/opt/conda/lib/python3.8/site-packages/torch/autograd/grad_mode.py(28): decorate_context
export.py(430): main
export.py(435): <module>
Serialized   File "code/__torch__/torch/nn/modules/conv.py", line 12
    bias = self.bias
    weight = self.weight
    input0 = torch._convolution(input, weight, bias, [1, 1], [1, 1], [1, 1], False, [0, 0], 1, False, False, True, True)
             ~~~~~~~~~~~~~~~~~~ <--- HERE
    return input0

** (tester:29): CRITICAL **: 05:11:55.610: Failed to load model

This looks like what happens when the PyTorch version used to export the yolov5 model differs from the PyTorch version of the nnstreamer runtime.
But I had heard that PyTorch is backward compatible.
The yolov5 model was exported with PyTorch 1.10.1, while, according to the nnstreamer-pytorch description on GitHub, the runtime seems to be built against PyTorch 1.3.1.
At this point I'm stuck.

If you have any advice, please let me know.

Additionally, for the tensor_filter element, how can I find the values for the "inputname" and "outputname" properties of a PyTorch model?
When converting a PyTorch model to ONNX, there are arguments that let me specify those names.
But I cannot find such arguments when exporting the model as a plain .pt or TorchScript file.

