## News

- (2022.9.27) This repo has been added to the MMDeploy CI.
- (2022.9.29) This repo has been added to the OpenMMLab ecosystem.

## Prerequisites

The following guidance is tested on Ubuntu on an x86_64 device. To make sure this repo builds successfully, you should install some prerequisites first.

### Install Clang and Rust, required by bindgen

```shell
apt install curl
curl --proto '=https' --tlsv1.2 -sSf | sh
```

### Download and install a pre-built MMDeploy package

Currently, mmdeploy-sys is built upon the pre-built packages of MMDeploy, so this repo only supports the ONNXRuntime and TensorRT backends. Note that these scripts are only supported on machines with CUDA 11.x, so pay attention to your CUDA version. Don't be disappointed: a build-from-source script is in progress, and once it is finished we will be able to deploy models with all backends supported by MMDeploy in Rust.

If you want to deploy models with TensorRT:

If you want to deploy models with ONNXRuntime:

```shell
# Download and link to the MMDeploy-onnxruntime pre-built package
wget
tar -zxvf mmdeploy-0.9.0-linux-x86_64-onnxruntime1.8.1.tar.gz
pushd mmdeploy-0.9.0-linux-x86_64-onnxruntime1.8.1
export MMDEPLOY_DIR=$(pwd)/sdk
export LD_LIBRARY_PATH=$MMDEPLOY_DIR/lib:$LD_LIBRARY_PATH
popd

# Download and link to the ONNXRuntime engine
wget
tar -zxvf onnxruntime-linux-x64-1.8.1.tgz
cd onnxruntime-linux-x64-1.8.1
export ONNXRUNTIME_DIR=$(pwd)
export LD_LIBRARY_PATH=$ONNXRUNTIME_DIR/lib:$LD_LIBRARY_PATH
```
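The two `export LD_LIBRARY_PATH` lines above each prepend an SDK's `lib` directory to the dynamic loader's search path, so both libraries are found at runtime. A minimal sketch of how the exports compose (the paths below are illustrative placeholders, not the real extraction locations):

```shell
# Illustrative install locations, standing in for the directories
# produced by the tar extractions above
MMDEPLOY_DIR=/opt/mmdeploy/sdk
ONNXRUNTIME_DIR=/opt/onnxruntime

# Prepend each lib directory, mirroring the setup commands above
LD_LIBRARY_PATH=$MMDEPLOY_DIR/lib
LD_LIBRARY_PATH=$ONNXRUNTIME_DIR/lib:$LD_LIBRARY_PATH

echo "$LD_LIBRARY_PATH"   # -> /opt/onnxruntime/lib:/opt/mmdeploy/sdk/lib
```

Because later entries are prepended, the most recently added `lib` directory is searched first; keep that in mind if multiple ONNXRuntime versions are installed.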
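Since the pre-built packages target CUDA 11.x, it can help to check the toolkit's major version before installing. A small sketch that parses the major version out of `nvcc --version`-style output (the `sample` variable is a hard-coded example line, not live output from your machine):

```shell
# Hard-coded sample of an `nvcc --version` line; on a real machine you
# would capture it with:  nvcc --version | grep release
sample='Cuda compilation tools, release 11.3, V11.3.109'

# Extract the major version number that follows "release"
major=$(printf '%s' "$sample" | sed -n 's/.*release \([0-9][0-9]*\)\..*/\1/p')

echo "$major"   # -> 11
```

If the printed major version is not 11, the pre-built packages above are not expected to work on that machine.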