Some pool options' output_size datatype is incorrect; it should probably be LongOptionalVector #1586
FractionalMaxPool2dImpl has a bug: its FractionalMaxPool2dOptions only receives the first value of kernel_size. @saudet, we need to fix this before the javacpp-pytorch 2.6 release.
Let us check the raw JavaCPP code in Scala exercising the javacpp-pytorch Fra..Pool layers. The test entry point begins:

def main(args: Array[String]): Unit = {

(remainder of the code and the resulting console error logs were not captured)
@saudet, I have now supplied the raw javacpp-pytorch code for these pool layers; they all exhibit this bug. Please fix it, thanks.
Perhaps the LongExpandingArrayOptional and DoubleExpandingArrayOptional classes have a bug.
Sounds like toNative isn't able to set kernel_size properly for some reason. Please try to set the values manually.
It really is a bug, I promise. Please run the code once: the console log shows that the value read back from the options differs from the raw value that was set. This is pure JavaCPP code; it does not go through toNative or anything similar. AdaptiveMaxPool3dImpl, AdaptiveMaxPool2dImpl, AdaptiveAvgPool3dImpl, and AdaptiveAvgPool2dImpl are affected as well. Thanks @saudet.

(console log omitted)
Hi @saudet, FractionalMaxPool2d and FractionalMaxPool3d cannot work; perhaps the options simply never get set to the correct values.
Hi @saudet, AdaptiveMaxPool2d's output_size really is buggy, please check: the second element of output_size cannot be set, and it comes back as an apparently uninitialized value such as 216232169515805804. If you can get it to set correctly, please paste the correct code, thanks.

(console output omitted)
Please try setting the "org.bytedeco.javacpp.nopointergc" system property to "true".
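For reference, a minimal sketch of setting that property (plain JDK only, no JavaCPP on the classpath; the class name is just a stand-in). The property has to be in place before any JavaCPP Pointer subclass is loaded, so set it at the very start of main or pass it as a -D flag to the JVM:

```java
public class NoPointerGcDemo {
    public static void main(String[] args) {
        // Equivalent to launching with -Dorg.bytedeco.javacpp.nopointergc=true.
        // Must run before any JavaCPP Pointer class initializes, otherwise
        // the pointer garbage-collection thread has already been started.
        System.setProperty("org.bytedeco.javacpp.nopointergc", "true");
        System.out.println(System.getProperty("org.bytedeco.javacpp.nopointergc"));
    }
}
```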
(console log omitted)
You'll need to allocate memory for kernel_size and output_size for this to work.
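As an illustration of what "allocate memory" means here, a sketch using only the JDK (no JavaCPP types, since the exact binding API is what is under discussion): a two-element kernel_size needs real backing storage before values can survive a set/get round trip, and a direct buffer plays the role of the native allocation:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.LongBuffer;

public class AllocDemo {
    public static void main(String[] args) {
        // Off-heap storage for two 64-bit sizes, laid out as native code sees it.
        LongBuffer kernelSize = ByteBuffer.allocateDirect(2 * Long.BYTES)
                .order(ByteOrder.nativeOrder())
                .asLongBuffer();
        kernelSize.put(0, 3L); // kernel_size = (3, 3)
        kernelSize.put(1, 3L);
        System.out.println(kernelSize.get(0) + "," + kernelSize.get(1));
    }
}
```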
Hi, some pool layers are unusable: AdaptiveMaxPool2d, AdaptiveMaxPool3d, AdaptiveAvgPool2d, AdaptiveAvgPool3d. These need an outputSize of two or three elements (a tuple2 or tuple3), but the generated bindings map output_size to a single LongOptional:

2d:
public native @cast("torch::ExpandingArrayWithOptionalElem<2>*") @ByRef @NoException(true) LongOptional output_size();
3d:
public native @cast("torch::ExpandingArrayWithOptionalElem<3>*") @ByRef @NoException(true) LongOptional output_size();

so we will hit an error.
I am also afraid it cannot work as declared; please check the code, thanks.
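To make the datatype mismatch concrete, here is a plain-Java simulation (no JavaCPP; the class name is only a stand-in): torch::ExpandingArrayWithOptionalElem<2> holds two independently optional sizes, so something shaped like a vector of optionals is needed to represent it, while a single optional long can only ever carry the first dimension:

```java
import java.util.List;
import java.util.Optional;

public class OutputSizeDemo {
    public static void main(String[] args) {
        // What AdaptiveMaxPool2d's output_size must represent, e.g. (5, 7):
        List<Optional<Long>> outputSize = List.of(Optional.of(5L), Optional.of(7L));

        // A binding typed as one optional long surfaces only one element;
        // the second spatial dimension has nowhere to live.
        Optional<Long> singleOptionalView = outputSize.get(0);

        System.out.println(singleOptionalView.get()); // first dimension only
        System.out.println(outputSize.size());        // the real type needs 2 slots
    }
}
```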