
[Installation] <title> #343

Open · 2 tasks done
zx1384187 opened this issue Feb 12, 2025 · 1 comment

Comments

zx1384187 commented Feb 12, 2025

Is there an existing issue for this?

  • I have searched the existing issues

Have you followed all the steps in the FAQ?

  • I have tried the steps in the FAQ.

Current Behavior

When I try to run "pip install ./torchsparse", it fails with the error "torchsparse/backend/others/query_cpu.cpp:6:10: fatal error: google/dense_hash_map: No such file or directory". Before this, I had installed sparsehash following section 4.3 of https://github.com/PJLab-ADG/OpenPCSeg/blob/master/docs/INSTALL.md.
Could someone give me some assistance?
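
For context, the missing header comes from Google's sparsehash library. A typical way to install it so that gcc can find <google/dense_hash_map> looks roughly like the sketch below; the apt route, the repository URL, and the ~/sparsehash prefix are illustrative assumptions rather than my exact steps:

# Option A (Debian/Ubuntu, system-wide): install the packaged headers.
sudo apt-get install libsparsehash-dev

# Option B (no root, from source): install into a user prefix.
git clone https://github.com/sparsehash/sparsehash.git
cd sparsehash
./configure --prefix=$HOME/sparsehash   # assumed prefix
make && make install

# Make <google/dense_hash_map> findable by the compiler during the build,
# then re-run the install from the directory containing the torchsparse checkout.
export CPLUS_INCLUDE_PATH=$HOME/sparsehash/include:$CPLUS_INCLUDE_PATH
pip install ./torchsparse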

Error Line

torchsparse/backend/others/query_cpu.cpp:6:10: fatal error: google/dense_hash_map: No such file or directory
6 | #include <google/dense_hash_map>
| ^~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1
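
As a quick sanity check that the header is visible to the compiler at all (independent of the torchsparse build), a one-liner like this should exit silently once sparsehash is installed; -std=c++14 is only chosen to match the build log:

echo '#include <google/dense_hash_map>' | g++ -x c++ -std=c++14 -fsyntax-only -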

Environment

- GCC: 9.3.0
- NVCC: 11.3
- PyTorch: 1.10.0
- PyTorch CUDA: 11.3
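
For completeness, the versions above can be confirmed with commands along these lines (the python one-liner assumes PyTorch is importable in the same environment used for pip):

gcc --version
nvcc --version
python -c "import torch; print(torch.__version__, torch.version.cuda)"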

Full Error Log

Error Log Looking in indexes: http://mirrors.aliyun.com/pypi/simple Processing ./torchsparse DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. Building wheels for collected packages: torchsparse Building wheel for torchsparse (setup.py) ... error ERROR: Command errored out with exit status 1: command: /root/miniconda3/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-gev9gav0 cwd: /tmp/pip-req-build-hotmfuez/ Complete output (72 lines): running bdist_wheel running build running build_py creating build creating build/lib.linux-x86_64-3.8 creating build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/version.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/operators.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/tensor.py -> build/lib.linux-x86_64-3.8/torchsparse creating build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/utils.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/quantize.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/collate.py -> build/lib.linux-x86_64-3.8/torchsparse/utils creating build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/unet.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/resnet.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones creating build/lib.linux-x86_64-3.8/torchsparse/nn copying torchsparse/nn/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn creating build/lib.linux-x86_64-3.8/torchsparse/backbones/modules copying torchsparse/backbones/modules/blocks.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones/modules copying torchsparse/backbones/modules/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones/modules creating build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/devoxelize.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/voxelize.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/query.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/pooling.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/activation.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/count.py -> 
build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/hash.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/conv.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/crop.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/downsample.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional creating build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/norm.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/bev.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/activation.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/conv.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/crop.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/pooling.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules creating build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/kernel.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/apply.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils running build_ext building 'torchsparse.backend' extension creating build/temp.linux-x86_64-3.8 creating build/temp.linux-x86_64-3.8/torchsparse creating build/temp.linux-x86_64-3.8/torchsparse/backend creating build/temp.linux-x86_64-3.8/torchsparse/backend/others creating build/temp.linux-x86_64-3.8/torchsparse/backend/hash creating build/temp.linux-x86_64-3.8/torchsparse/backend/hashmap creating build/temp.linux-x86_64-3.8/torchsparse/backend/devoxelize creating build/temp.linux-x86_64-3.8/torchsparse/backend/voxelize creating build/temp.linux-x86_64-3.8/torchsparse/backend/convolution gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/pybind_cuda.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/pybind_cuda.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ /usr/local/cuda/bin/nvcc -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/query_cuda.cu -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/query_cuda.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ 
-D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options '-fPIC' -O3 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++14 gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/count_cpu.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/count_cpu.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ /usr/local/cuda/bin/nvcc -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/count_cuda.cu -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/count_cuda.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options '-fPIC' -O3 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++14 gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/query_cpu.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/query_cpu.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ torchsparse/backend/others/query_cpu.cpp:6:10: fatal error: google/dense_hash_map: No such file or directory 6 | #include | ^~~~~~~~~~~~~~~~~~~~~~~ compilation terminated. 
error: command 'gcc' failed with exit status 1 ---------------------------------------- ERROR: Failed building wheel for torchsparse Running setup.py clean for torchsparse Failed to build torchsparse Installing collected packages: torchsparse Running setup.py install for torchsparse ... error ERROR: Command errored out with exit status 1: command: /root/miniconda3/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-pkv0itmw/install-record.txt --single-version-externally-managed --compile --install-headers /root/miniconda3/include/python3.8/torchsparse cwd: /tmp/pip-req-build-hotmfuez/ Complete output (72 lines): running install running build running build_py creating build creating build/lib.linux-x86_64-3.8 creating build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/version.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/operators.py -> build/lib.linux-x86_64-3.8/torchsparse copying torchsparse/tensor.py -> build/lib.linux-x86_64-3.8/torchsparse creating build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/utils.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/quantize.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/utils copying torchsparse/utils/collate.py -> build/lib.linux-x86_64-3.8/torchsparse/utils creating build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/unet.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones copying torchsparse/backbones/resnet.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones creating build/lib.linux-x86_64-3.8/torchsparse/nn copying torchsparse/nn/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn creating build/lib.linux-x86_64-3.8/torchsparse/backbones/modules copying torchsparse/backbones/modules/blocks.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones/modules copying torchsparse/backbones/modules/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/backbones/modules creating build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/devoxelize.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/voxelize.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/query.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/pooling.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/activation.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/count.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/hash.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/conv.py -> 
build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/crop.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional copying torchsparse/nn/functional/downsample.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/functional creating build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/norm.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/bev.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/activation.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/conv.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/crop.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules copying torchsparse/nn/modules/pooling.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/modules creating build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/kernel.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/apply.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils copying torchsparse/nn/utils/__init__.py -> build/lib.linux-x86_64-3.8/torchsparse/nn/utils running build_ext building 'torchsparse.backend' extension creating build/temp.linux-x86_64-3.8 creating build/temp.linux-x86_64-3.8/torchsparse creating build/temp.linux-x86_64-3.8/torchsparse/backend creating build/temp.linux-x86_64-3.8/torchsparse/backend/others creating build/temp.linux-x86_64-3.8/torchsparse/backend/hash creating build/temp.linux-x86_64-3.8/torchsparse/backend/hashmap creating build/temp.linux-x86_64-3.8/torchsparse/backend/devoxelize creating build/temp.linux-x86_64-3.8/torchsparse/backend/voxelize creating build/temp.linux-x86_64-3.8/torchsparse/backend/convolution gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/pybind_cuda.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/pybind_cuda.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ /usr/local/cuda/bin/nvcc -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/query_cuda.cu -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/query_cuda.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options '-fPIC' -O3 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" 
-DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++14 gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/count_cpu.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/count_cpu.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ /usr/local/cuda/bin/nvcc -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/count_cuda.cu -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/count_cuda.o -D__CUDA_NO_HALF_OPERATORS__ -D__CUDA_NO_HALF_CONVERSIONS__ -D__CUDA_NO_BFLOAT16_CONVERSIONS__ -D__CUDA_NO_HALF2_OPERATORS__ --expt-relaxed-constexpr --compiler-options '-fPIC' -O3 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -gencode=arch=compute_86,code=compute_86 -gencode=arch=compute_86,code=sm_86 -std=c++14 gcc -pthread -B /root/miniconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/root/miniconda3/lib/python3.8/site-packages/torch/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/torch/csrc/api/include -I/root/miniconda3/lib/python3.8/site-packages/torch/include/TH -I/root/miniconda3/lib/python3.8/site-packages/torch/include/THC -I/usr/local/cuda/include -I/root/miniconda3/include/python3.8 -c torchsparse/backend/others/query_cpu.cpp -o build/temp.linux-x86_64-3.8/torchsparse/backend/others/query_cpu.o -g -O3 -fopenmp -lgomp -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=backend -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++14 cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ torchsparse/backend/others/query_cpu.cpp:6:10: fatal error: google/dense_hash_map: No such file or directory 6 | #include | ^~~~~~~~~~~~~~~~~~~~~~~ compilation terminated. 
error: command 'gcc' failed with exit status 1
----------------------------------------
ERROR: Command errored out with exit status 1: /root/miniconda3/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-hotmfuez/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-pkv0itmw/install-record.txt --single-version-externally-managed --compile --install-headers /root/miniconda3/include/python3.8/torchsparse
Check the logs for full command output.
pphuangyi commented Feb 16, 2025

Could you try python setup.py install inside the torchsparse folder? I also wrote a tentative solution in #340; please give it a try and let me know how it goes. (That said, I think your current gcc/nvcc/PyTorch combination should work.)
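
In case it helps, that would look roughly like the following; the export line is only needed if sparsehash lives in a non-standard prefix, and ~/sparsehash/include is an assumed path:

cd torchsparse
# Only if sparsehash was installed into a custom prefix (assumed path):
export CPLUS_INCLUDE_PATH=$HOME/sparsehash/include:$CPLUS_INCLUDE_PATH
python setup.py install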
