Commit f4e24d7

Fix returning error when tracing is off (triton-inference-server#295)
1 parent 66f5e1e

File tree

1 file changed: +5 -2 lines changed

src/python_be.cc

Lines changed: 5 additions & 2 deletions
@@ -365,8 +365,11 @@ ModelInstanceState::SaveRequestsToSharedMemory(
   RETURN_IF_ERROR(TRITONBACKEND_RequestFlags(request, &flags));
 
   TRITONSERVER_InferenceTrace* triton_trace;
-  RETURN_IF_ERROR(TRITONBACKEND_RequestTrace(request, &triton_trace));
-
+  auto err = TRITONBACKEND_RequestTrace(request, &triton_trace);
+  if (err != nullptr) {
+    triton_trace = nullptr;
+    TRITONSERVER_ErrorDelete(err);
+  }
   InferenceTrace trace = InferenceTrace(triton_trace);
 
   std::unique_ptr<InferRequest> infer_request;
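
The change replaces an unconditional RETURN_IF_ERROR on TRITONBACKEND_RequestTrace with explicit handling: when tracing is disabled, the returned error is deleted and the trace pointer is set to nullptr, so the request proceeds untraced instead of failing. The standalone sketch below illustrates the same control flow; FakeError, FakeTrace, FakeRequestTrace, and FakeErrorDelete are hypothetical stand-ins for the Triton C API (TRITONBACKEND_RequestTrace, TRITONSERVER_ErrorDelete), not the real signatures.

// Minimal sketch of the error-handling pattern in this commit: an error
// from an optional facility (tracing) is swallowed and treated as
// "feature off" rather than propagated as a request failure. All Fake*
// names are hypothetical stand-ins, not the Triton C API.
#include <cstdio>

struct FakeError { const char* msg; };
struct FakeTrace { int id; };

// Stand-in for TRITONBACKEND_RequestTrace: returns an error when tracing
// is off, otherwise fills *trace and returns nullptr (success).
FakeError* FakeRequestTrace(bool tracing_enabled, FakeTrace** trace) {
  static FakeTrace the_trace{42};
  if (!tracing_enabled) {
    static FakeError off{"tracing is not enabled"};
    return &off;
  }
  *trace = &the_trace;
  return nullptr;
}

// Stand-in for TRITONSERVER_ErrorDelete; the real call frees the error.
void FakeErrorDelete(FakeError* /*err*/) {}

int main() {
  const bool cases[] = {true, false};
  for (bool enabled : cases) {
    FakeTrace* triton_trace = nullptr;
    // Before the fix, the backend did RETURN_IF_ERROR(...) here, so a
    // disabled tracer failed the whole request. After the fix, the error
    // is deleted and the trace is simply left null.
    FakeError* err = FakeRequestTrace(enabled, &triton_trace);
    if (err != nullptr) {
      triton_trace = nullptr;
      FakeErrorDelete(err);
    }
    std::printf("tracing %s -> trace is %s\n", enabled ? "on" : "off",
                triton_trace != nullptr ? "set" : "null");
  }
  return 0;
}

Running the sketch prints one line per case, with a null trace (and no failure) when tracing is off, matching the post-fix behavior of SaveRequestsToSharedMemory.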
