Constructs an instance of BatchInference with the passed-in options and external configuration.
The parameters to send to the service.
Private Files instance for managing batch inference files.
URL required for watsonx inference endpoints.
The version date for the API of the form YYYY-MM-DD.
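Since the version parameter is a plain date string, a minimal client-side check can catch malformed values before a request is sent. This is an illustrative sketch, not part of the SDK; the helper name `isValidVersionDate` is an assumption:

```typescript
// Hypothetical helper: validates the API version date format (YYYY-MM-DD).
function isValidVersionDate(version: string): boolean {
  if (!/^\d{4}-\d{2}-\d{2}$/.test(version)) return false;
  // Reject strings that match the pattern but are not real calendar dates,
  // e.g. '2024-13-01'. ISO parsing yields an Invalid Date for those.
  const parsed = new Date(version);
  return (
    !Number.isNaN(parsed.getTime()) &&
    parsed.toISOString().slice(0, 10) === version
  );
}
```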
URL required for dataplatform endpoints.
Performs a POST request to the specified URL and returns a stream.
The parameters for the POST request.
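A streamed POST response is typically consumed chunk by chunk from a web `ReadableStream`. The sketch below shows only that reading loop, with a locally constructed stream standing in for a response body; it does not reflect the SDK's internal transport:

```typescript
import { ReadableStream } from 'node:stream/web';

// Illustrative only: drains a ReadableStream of bytes into decoded text chunks,
// the same shape a streamed POST response body would have.
async function readStream(body: ReadableStream<Uint8Array>): Promise<string[]> {
  const decoder = new TextDecoder();
  const chunks: string[] = [];
  const reader = body.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(decoder.decode(value, { stream: true }));
  }
  return chunks;
}

// Stand-in for a streamed response body (no network involved).
const fakeBody = new ReadableStream<Uint8Array>({
  start(controller) {
    const encoder = new TextEncoder();
    controller.enqueue(encoder.encode('{"result":'));
    controller.enqueue(encoder.encode('"ok"}'));
    controller.close();
  },
});
```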
Cancel a batch inference job.
Cancels a batch inference job that is in progress. Once cancelled, the job cannot be resumed.
The parameters to send to the service.
A promise that resolves to the response with the cancelled batch job's data.
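Because a cancelled job cannot be resumed, client code often guards the cancel call against the job's current state. The status names below are illustrative, not the service's exact enum:

```typescript
// Illustrative status model; actual status strings come from the service.
type BatchStatus = 'queued' | 'in_progress' | 'completed' | 'cancelled' | 'failed';

// A job can only be cancelled while it is still waiting or running;
// terminal states (completed, cancelled, failed) cannot transition.
function canCancel(status: BatchStatus): boolean {
  return status === 'queued' || status === 'in_progress';
}
```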
Create a new batch inference job.
Creates a batch inference job with the specified input file, endpoint, and completion window. The batch job will process requests from the input file and generate results.
The parameters to send to the service.
A promise that resolves to the response with the created batch job's data.
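The create call is driven by the input file, the target endpoint, and the completion window. A hedged sketch of that parameter shape and a basic validation pass follows; the field names are assumptions, not copied from the SDK:

```typescript
// Assumed parameter shape for creating a batch job; real field names may differ.
interface CreateBatchParams {
  inputFileId: string;       // file containing the batched requests
  endpoint: string;          // inference endpoint the requests target
  completionWindow: string;  // e.g. '24h'
  spaceId?: string;          // one of spaceId / projectId scopes the job
  projectId?: string;
}

// Collects missing-field errors instead of throwing on the first one.
function validateCreateParams(p: CreateBatchParams): string[] {
  const errors: string[] = [];
  if (!p.inputFileId) errors.push('inputFileId is required');
  if (!p.endpoint) errors.push('endpoint is required');
  if (!p.completionWindow) errors.push('completionWindow is required');
  if (!p.spaceId && !p.projectId) errors.push('spaceId or projectId is required');
  return errors;
}
```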
Retrieve batch job details or list all batch jobs.
When called with batchId, retrieves details of a specific batch job. When called without
batchId, retrieves a list of all batch jobs.
The parameters to send to the service.
A promise that resolves to either a single batch job or a collection of batch jobs.
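The one-method, two-behaviours pattern (single job with batchId, collection without) can be sketched with a union return type. The job shape and the in-memory store here are stand-ins, not the service's real response types:

```typescript
// Minimal stand-ins for the two response shapes.
interface BatchJob { id: string; status: string; }
interface BatchJobList { jobs: BatchJob[]; }

const store: BatchJob[] = [
  { id: 'batch-1', status: 'completed' },
  { id: 'batch-2', status: 'in_progress' },
];

// With batchId: one job (or undefined if unknown); without: the whole collection.
function getBatch(batchId?: string): BatchJob | BatchJobList | undefined {
  if (batchId !== undefined) return store.find((j) => j.id === batchId);
  return { jobs: store };
}
```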
List all batch jobs.
Retrieves a list of all batch jobs for the specified space or project.
The parameters to send to the service.
A promise that resolves to an array of batch jobs.
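Listing is scoped to a space or a project; the filtering that scoping implies can be sketched as below. The scope field names are assumptions for illustration:

```typescript
// Assumed job shape: each job belongs to either a space or a project.
interface ScopedJob { id: string; spaceId?: string; projectId?: string; }

// Keep only jobs belonging to the given space or project.
function listJobs(
  jobs: ScopedJob[],
  scope: { spaceId?: string; projectId?: string },
): ScopedJob[] {
  return jobs.filter(
    (j) =>
      (scope.spaceId !== undefined && j.spaceId === scope.spaceId) ||
      (scope.projectId !== undefined && j.projectId === scope.projectId),
  );
}
```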
Service for managing batch inference operations in watsonx.ai.
Provides methods to create, retrieve, list, and cancel batch inference jobs, as well as manage associated files through the Files instance.
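Taken together, the surface described above amounts to an interface like the following sketch, exercised here with an in-memory stub. The method names, parameter shapes, and return types are assumed from the descriptions, not copied from the SDK:

```typescript
// Hypothetical shape of the service surface described above.
interface BatchInferenceLike {
  createBatch(params: object): Promise<{ id: string; status: string }>;
  getBatch(params: { batchId?: string }): Promise<object>;
  listBatches(params: object): Promise<object[]>;
  cancelBatch(params: { batchId: string }): Promise<{ status: string }>;
}

// In-memory stub implementing the sketch, for illustration only.
class InMemoryBatchInference implements BatchInferenceLike {
  private jobs = new Map<string, { id: string; status: string }>();
  private next = 1;

  async createBatch(): Promise<{ id: string; status: string }> {
    const job = { id: `batch-${this.next++}`, status: 'queued' };
    this.jobs.set(job.id, job);
    return job;
  }

  async getBatch(params: { batchId?: string }): Promise<object> {
    if (params.batchId) return this.jobs.get(params.batchId) ?? { status: 'not_found' };
    return { jobs: [...this.jobs.values()] };
  }

  async listBatches(): Promise<object[]> {
    return [...this.jobs.values()];
  }

  async cancelBatch(params: { batchId: string }): Promise<{ status: string }> {
    const job = this.jobs.get(params.batchId);
    if (job) job.status = 'cancelled';
    return { status: job?.status ?? 'not_found' };
  }
}
```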