The parameters for the forecast request.

inference_batch_size (optional)
The batch size used during inference. When multiple time series are present, inference is conducted in batches. If not specified, the model's default batch size is used.

prediction_length (optional)
The prediction length for the forecast. The service will return this many periods beyond the last timestamp in the inference data payload. If specified, prediction_length must be an integer >= 1 and no greater than the model's default prediction length. If omitted, the model's default prediction_length is used.
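As a minimal sketch, a forecast request setting both parameters might look like the following. The endpoint URL, authentication header, and the shape of the inference data payload are assumptions for illustration only; substitute the values and schema from your deployment.

```python
import requests

# Hypothetical endpoint and token; replace with your deployment's values.
ENDPOINT = "https://example.com/v1/forecast"  # assumed URL
API_TOKEN = "YOUR_API_TOKEN"

payload = {
    # Inference data payload: timestamps and values for a time series.
    # The exact field names here are assumed for this sketch.
    "inference_data": {
        "timestamp": ["2024-01-01", "2024-01-02", "2024-01-03"],
        "value": [10.0, 12.5, 11.8],
    },
    "parameters": {
        "inference_batch_size": 32,  # optional; defaults to the model's batch size
        "prediction_length": 7,      # optional; >= 1 and <= model default length
    },
}

response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
response.raise_for_status()
# The response should contain forecasts for the next `prediction_length`
# periods after the last timestamp in the payload.
print(response.json())
```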