Triton inference server Type Error #8320
Unanswered
Jalitha-QrioMatrix asked this question in Q&A
Replies: 0 comments
I’m using the Merlin multi-stage recommender system example, and both notebooks ran successfully with the default dataset. However, when I switch to my own dataset, inference against the Triton server fails with a NoneType error related to the Feast feature repository.
I've confirmed that the Triton server loads all models successfully and that the Feast data exists locally. I've tried several approaches, but none has resolved the issue so far.
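A common cause of this symptom is that entity keys from the new dataset were never materialized into Feast's online store, so the feature lookup returns None and the request-assembly code crashes. Below is a minimal sketch of that failure mode and a fail-fast guard. The feature names and the plain dict standing in for the online store are illustrative assumptions, not the Merlin example's actual schema; in the real setup the lookup would go through `FeatureStore(repo_path=...).get_online_features(...)`.

```python
# Stand-in for the Feast online store: maps user_id -> feature dict.
# (Hypothetical feature names; the real Merlin example's schema differs.)
ONLINE_STORE = {
    1: {"user_age": 34, "user_country": "LK"},
    2: {"user_age": 27, "user_country": "US"},
}

def get_user_features(user_id):
    """Unguarded lookup: a missing entity comes back as None, mirroring how
    absent entities surface as None values from a Feast online lookup."""
    return ONLINE_STORE.get(user_id)

def get_user_features_checked(user_id):
    """Guarded lookup: raise a clear error instead of letting None propagate
    into the Triton request and crash later with a NoneType error."""
    feats = get_user_features(user_id)
    if feats is None:
        raise KeyError(
            f"user_id {user_id} not found in the Feast online store; "
            "re-materialize features after switching datasets so the "
            "new entities are ingested"
        )
    return feats

print(get_user_features_checked(1))  # known entity -> feature dict
print(get_user_features(999))        # unknown entity -> None (the failure mode)
```

If the guarded lookup raises for your dataset's IDs, the fix is on the Feast side (re-running materialization against the new data), not in the Triton models themselves.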
