Using a String parameter for NVIDIA Triton

I'm trying to deploy a simple model on the Triton Inference Server. The model loads fine, but I'm having trouble formatting the input to make a proper inference request.

My model has a config.pbtxt set up like this:

  max_batch_size: 1
  input: [
    {
      name: "examples"
      data_type: TYPE_STRING
      format: FORMAT_NONE
      dims: [ -1 ]
      is_shape_tensor: false
      allow_ragged_batch: false
      optional: false
    }
  ]
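
As I understand it, Triton's TYPE_STRING corresponds to the "BYTES" datatype on the client side, and the numpy array has to hold already-encoded strings; a minimal sketch of what the client sends:

    import numpy as np

    # TYPE_STRING inputs travel as "BYTES": an object-dtype numpy array
    # of already-encoded strings
    data = np.array(["some text".encode("utf-8")], dtype=np.object_)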

I've tried some pretty straightforward Python code to set up the input data, like this (the outputs are not shown here but are set up correctly):

        import numpy as np
        import tritonclient.http as httpclient

        # Encode the string as UTF-8 and wrap it in an object-dtype array,
        # reshaped to [batch_size, 1]
        bytes_data = [input_data.encode('utf-8')]
        bytes_data = np.array(bytes_data, dtype=np.object_)
        bytes_data = bytes_data.reshape([-1, 1])
        inputs = [
            httpclient.InferInput('examples', bytes_data.shape, "BYTES"),
        ]
        inputs[0].set_data_from_numpy(bytes_data)

But I keep getting the same error message:

tritonclient.utils.InferenceServerException: Could not parse example input, value: '[my text input here]'
         [[{{node ParseExample/ParseExampleV2}}]]

I've tried multiple ways of encoding the input, as raw bytes or even base64-encoded the way TFX serving used to expect it, like this: { "instances": [{"b64": "CjEKLwoJdXR0ZXJhbmNlEiIKIAoecmVuZGV6LXZvdXMgYXZlYyB1biBjb25zZWlsbGVy"}]}
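
For reference, decoding that base64 payload shows what was actually being sent; a quick inspection sketch, assuming the tensorflow package is available:

    import base64
    import tensorflow as tf

    raw = base64.b64decode(
        "CjEKLwoJdXR0ZXJhbmNlEiIKIAoecmVuZGV6LXZvdXMgYXZlYyB1biBjb25zZWlsbGVy")
    # Parses as a tf.train.Example holding a single 'utterance' feature
    print(tf.train.Example.FromString(raw))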

I'm not exactly sure where the problem comes from; does anyone know?



Solution 1 [1]

If anyone gets this same problem, this solved it. The ParseExample/ParseExampleV2 node in the error shows that the model graph parses its input with TensorFlow's ParseExample op, which expects serialized tf.train.Example protos rather than raw strings, so I had to create a tf.train.Example() and set the data correctly:

import numpy as np
import tensorflow as tf
import tritonclient.http as httpclient

# The ParseExample op in the graph expects a serialized tf.train.Example,
# with the string stored under the 'utterance' feature
example = tf.train.Example()
example_bytes = str.encode(input_data)
example.features.feature['utterance'].bytes_list.value.extend([example_bytes])
inputs = [
    httpclient.InferInput('examples', [1], "BYTES"),
]
inputs[0].set_data_from_numpy(np.asarray(example.SerializeToString()).reshape([1]), binary_data=False)
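
For completeness, a full request would look roughly like this; the server URL, model name, and output tensor name below ('my_model', 'output_0') are placeholders to adapt to your own deployment:

    import numpy as np
    import tensorflow as tf
    import tritonclient.http as httpclient

    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Build the serialized tf.train.Example as above
    example = tf.train.Example()
    example.features.feature['utterance'].bytes_list.value.extend([b"some text"])

    inputs = [httpclient.InferInput('examples', [1], "BYTES")]
    inputs[0].set_data_from_numpy(
        np.asarray(example.SerializeToString()).reshape([1]), binary_data=False)

    # 'my_model' and 'output_0' are hypothetical names
    outputs = [httpclient.InferRequestedOutput('output_0')]
    result = client.infer(model_name='my_model', inputs=inputs, outputs=outputs)
    print(result.as_numpy('output_0'))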
    

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: Regalia