@rmccorm4 (Contributor) commented on Jul 31, 2023

As far as I can tell, the Python backend is thread-safe on calls to TRITONBACKEND_ModelInstanceInitialize.

All use of model_state (the problem area for instance initialization in the current backends) appears to be read-only.

It is simply passed to the ModelInstanceState constructor.

L0_backend_python has been hanging on the BLS model load tests, but this is now happening on the main branch as well, so I don't believe it is caused by any of the parallel-instance changes.

Corresponding tests: triton-inference-server/server#6126


@rmccorm4 (Contributor, Author) commented on Aug 1, 2023

Going to merge now so that more pipelines run before the code freeze, to catch any unforeseen issues as early as possible. The tests may need some minor changes, as discussed above, but I don't anticipate any Python-backend-specific changes beyond enabling the attribute. If we run into issues before the release, I can disable the attribute.

I'll update some docs separately where applicable.
