nvidia@tegra-ubuntu:~/TensoRT_C++/TensorRT-6.0.1.8/targets/x86_64-linux-gnu/bin$ ./sample_onnx_mnist
&&&& RUNNING TensorRT.sample_onnx_mnist # ./sample_onnx_mnist
[04/24/2021-14:52:43] [I] Building and running a GPU inference engine for Onnx MNIST
----------------------------------------------------------------
Input filename:   ../../../data/mnist/mnist.onnx
ONNX IR version:  0.0.3
Opset version:    8
Producer name:    CNTK
Producer version: 2.5.1
Domain:           ai.cntk
Model version:    1
Doc string:
----------------------------------------------------------------
ERROR: ModelImporter.cpp:463 In function importModel:
[4] Assertion failed: !_importer_ctx.network()->hasImplicitBatchDimension() && "This version of the ONNX parser only supports TensorRT INetworkDefinitions with an explicit batch dimension. Please ensure the network was created using the EXPLICIT_BATCH NetworkDefinitionCreationFlag."
&&&& FAILED TensorRT.sample_onnx_mnist # ./sample_onnx_mnist
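The assertion itself points at the fix: the ONNX parser shipped with TensorRT 6 only accepts networks created in explicit-batch mode, while the sample here built its network in the older implicit-batch mode. A minimal sketch of the network-creation code with the EXPLICIT_BATCH flag set (file name and logger are placeholders; this fragment only compiles and links against the TensorRT 6+ headers and libraries):

```cpp
#include "NvInfer.h"
#include "NvOnnxParser.h"

// Build the bitmask that requests an explicit batch dimension.
const uint32_t explicitBatch = 1U << static_cast<uint32_t>(
    nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);

// `logger` is assumed to be an existing nvinfer1::ILogger implementation.
nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(logger);

// createNetworkV2 with the flag replaces the old createNetwork() call,
// which produced an implicit-batch network and triggers this assertion.
nvinfer1::INetworkDefinition* network = builder->createNetworkV2(explicitBatch);

// The ONNX parser now accepts the network, since
// network->hasImplicitBatchDimension() is false.
auto parser = nvonnxparser::createParser(*network, logger);
parser->parseFromFile("../../../data/mnist/mnist.onnx",
    static_cast<int>(nvinfer1::ILogger::Severity::kWARNING));
```

In the sampleOnnxMNIST sources this corresponds to changing the `createNetwork()`/`createNetworkV2(0)` call in the engine-building step to pass the explicit-batch flag, then rebuilding the sample.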