Choosing an AI stack

There are several options for deploying an AI stack to ingest IBM i data, train models, and deploy to production. This guide aims to summarize some of these options.
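
Whichever stack you choose, the first step is usually the same: getting data off IBM i and into a form your training tools can work with. Below is a minimal sketch, assuming Python with the pyodbc package and the IBM i Access ODBC driver; the DSN, credentials, and table name are placeholders.

    # Minimal sketch: pull IBM i data into a pandas DataFrame for training.
    # Assumes the IBM i Access ODBC driver is installed and a DSN named
    # "MYIBMI" is configured; the schema and table below are hypothetical.
    import pyodbc
    import pandas as pd

    conn = pyodbc.connect("DSN=MYIBMI;UID=MYUSER;PWD=MYPASSWORD")

    # Read the rows you want to train on via SQL.
    df = pd.read_sql("SELECT * FROM SALES.ORDERS", conn)
    conn.close()

    # From here, feed the DataFrame to whichever framework your chosen
    # stack provides (PyTorch, TensorFlow, scikit-learn, ...).
    print(df.head())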

Rocket Cognitive Environment (RocketCE)

RocketCE offers a POWER-optimized software stack for running AI workloads, with built-in exploitation of the AI acceleration in the Power chipset. The product aims to lower the barrier to entry for AI by running Python packages natively on Linux LPARs, with no container platform required. It provides over 200 packages optimized for IBM Power10, backed by enterprise support from IBM and Rocket.
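
As an illustration of the no-container workflow, RocketCE content is delivered as conda packages, so getting started can be as simple as installing a framework into a conda environment on the LPAR and importing it. The channel and package names below are assumptions for this sketch; consult Rocket's documentation for the supported install procedure.

    # Sketch of using a RocketCE-provided framework on a Linux LPAR.
    # Assumed install step (channel/package names may differ):
    #   conda install -c rocketce pytorch-cpu
    import torch

    # A tiny tensor operation to confirm the Power-optimized build works.
    x = torch.rand(4, 4)
    print(torch.matmul(x, x.T))

    # Show how this particular build was compiled (useful for checking
    # which accelerated backends are enabled).
    print(torch.__config__.show())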

Why choose it?

  • Keep your entire AI lifecycle on premises
  • Exploit AI acceleration in IBM’s Power hardware
  • No container platform (such as OpenShift or Kubernetes) needed
  • Evaluate at no cost

Resources

Rocket AI Hub

Rocket AI Hub provides a development-centric data & AI environment for data scientists, developers, operations staff, and other practitioners, with access to the most up-to-date frameworks and tools at no license cost, running on top of Red Hat OpenShift or vanilla Kubernetes. Like RocketCE, AI Hub takes advantage of the AI acceleration of the Power chipset.

Since AI Hub runs on a container platform, it allows you to start small and scale big over time. Plus, it includes a suite of advanced tooling to make the AI lifecycle easier, such as:

  • KServe, an open source framework for serving ML models on Kubernetes. It provides a standardized, framework-agnostic inference protocol.
  • Kubeflow, an open source machine learning toolkit for Kubernetes, originally developed at Google.
  • Kubeflow Pipelines, an open source framework for building and deploying machine learning pipelines. It includes components to deploy, manage, and monitor ML models (see the sketch after this list).
  • Katib, an open source framework for automated machine learning (AutoML), covering tasks such as hyperparameter tuning and neural architecture search.
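
To give a feel for this tooling, the sketch below defines and compiles a trivial pipeline with the Kubeflow Pipelines Python SDK. It assumes kfp v2 is installed; the component and pipeline names are illustrative only.

    # Minimal Kubeflow Pipelines sketch using the kfp v2 Python SDK.
    from kfp import dsl, compiler

    @dsl.component
    def train_model(epochs: int) -> str:
        # In a real pipeline, this step would ingest IBM i data and train a model.
        return f"trained for {epochs} epochs"

    @dsl.pipeline(name="ibmi-demo-pipeline")
    def demo_pipeline(epochs: int = 5):
        train_model(epochs=epochs)

    if __name__ == "__main__":
        # Compile to a pipeline definition that can be uploaded to the
        # Kubeflow Pipelines UI or submitted through its API.
        compiler.Compiler().compile(demo_pipeline, "demo_pipeline.yaml")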

Why choose it?

  • Keep your entire AI lifecycle on premises
  • Exploit AI acceleration in IBM’s Power hardware
  • Leverage a container platform (such as OpenShift or Kubernetes) and advanced orchestration and deployment tools
  • Evaluate at no cost

Resources

Watsonx

IBM watsonx is the company’s flagship AI suite. It can be deployed as a service or on premises and consists of several components, including:

  • watsonx.ai, an enterprise-grade AI studio where you can train, refine, and deploy AI models (see the sketch after this list).
  • watsonx.data, a data lakehouse and data federation offering to power AI and analytics with all your data.
  • watsonx.governance, a toolkit to manage risk, compliance, and the AI lifecycle.
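
As an illustration of the as-a-service consumption model, the sketch below calls a foundation model through watsonx.ai using the ibm-watsonx-ai Python SDK. The endpoint URL, API key, project ID, and model ID are placeholders, and the SDK interface can differ between versions, so treat this as a sketch rather than reference code.

    # Sketch of calling a watsonx.ai foundation model from Python.
    # Assumes the ibm-watsonx-ai SDK (pip install ibm-watsonx-ai);
    # all credentials and IDs below are placeholders.
    from ibm_watsonx_ai import Credentials
    from ibm_watsonx_ai.foundation_models import ModelInference

    credentials = Credentials(
        url="https://us-south.ml.cloud.ibm.com",
        api_key="YOUR_API_KEY",
    )

    model = ModelInference(
        model_id="ibm/granite-13b-instruct-v2",
        credentials=credentials,
        project_id="YOUR_PROJECT_ID",
    )

    print(model.generate_text(prompt="Summarize last month's order volume trend."))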

Why choose it?

  • Flexibility to consume and produce AI models in a cloud environment if needed
  • Utilize tools designed with non-programmers in mind
  • Many features have trial options to help you get started

Resources

Native IBM i libraries

Some technologies are available to run natively on IBM i. See “Running AI Natively on IBM i” for more information.

Why choose it?

  • Enjoy the simplicity of having everything deployed on IBM i
  • A good fit if you can tolerate that the latest framework versions may not yet be available on IBM i

Others

The AI stacks above are explicitly tested and certified for use with IBM i and AI workloads. However, there are many other viable options, including: