K3ai (keɪ3ai)
K3ai is a lightweight infrastructure-in-a-box built specifically to install and configure AI tools and platforms, so you can quickly experiment with them or run them in production on edge devices.
All you have to do is download the binary for your operating system, move it to your path (if you like easy things), and use it.
Windows: once downloaded, unzip the file and move it to your path, or run it from a folder of your choice (e.g. k3ai.exe -h)
Linux: once downloaded, untar the file and move it to your path
Mac: once downloaded, untar the file and move it to your path
ARM: once downloaded, untar the file and move it to your path
If for any reason this fails, go straight to https://github.com/kf5i/k3ai-core/releases and download the binary for your platform. Place it in your path and that's it.
Or use the command line directly.
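For example, a minimal sketch for Linux (the release tag, archive name, and the binary name inside the archive are placeholders and assumptions, not confirmed by this page; check the releases page for the actual file names. Windows ships a zip containing k3ai.exe instead):

```bash
# Placeholder tag and asset name: substitute the latest ones listed at
# https://github.com/kf5i/k3ai-core/releases
wget https://github.com/kf5i/k3ai-core/releases/download/v0.2.0/k3ai-core_linux_amd64.tar.gz

# Unpack the archive and move the binary somewhere in your PATH
# (binary name assumed to be "k3ai", matching k3ai.exe on Windows)
tar -xzf k3ai-core_linux_amd64.tar.gz
sudo mv k3ai /usr/local/bin/

# Verify the install
k3ai -h
```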
Looking for more interaction? Join our Slack channel here.
NOTE: Unfortunately, not all plugins work on ARM. We are working on a way to warn you about incompatible plugins before you install them.
Currently, we install the following components (the list keeps changing and growing); a sample install command is sketched after the list:
Kubernetes based on K3s from Rancher: https://k3s.io/
Kubernetes based on K0s from Mirantis: https://k0sproject.io
Kubernetes KinD: https://kind.sigs.k8s.io/
Kubeflow pipelines: https://github.com/kubeflow/pipelines
Argo Workflows: https://github.com/argoproj/argo
H2O Community: https://h2o.ai
Kubeflow: https://www.kubeflow.org/ - (coming soon)
NVIDIA GPU support: https://docs.nvidia.com/datacenter/cloud-native/index.html
NVIDIA Triton inference server: https://github.com/triton-inference-server/server/tree/master/deploy/single_server (coming soon)
TensorFlow Serving: https://www.tensorflow.org/tfx/serving/serving_kubernetes, serving:
ResNet
MNIST (coming soon)
and many, many others...
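These components are delivered as k3ai plugins. As a rough, non-authoritative sketch of how one of them might be installed (the apply subcommand and the plugin name shown here are assumptions, not confirmed by this page; run k3ai -h to see the actual commands and plugin names):

```bash
# Hypothetical usage: subcommand and plugin name are assumptions;
# check `k3ai -h` for the real CLI syntax and available plugins
k3ai apply kubeflow-pipelines
```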