GPU Instances on Kubernetes
I noticed GPU instances aren't supported for cluster nodes on LKE. Has anyone tried manually running a GPU instance as a node on LKE?
Also, has anyone heard whether there are plans to add support for this? I read an article that makes it sound like not much further work is planned for GPUs.
You are correct: currently you are not able to deploy GPU Linodes as part of your LKE cluster's node pools. It's worth mentioning that I'm not an expert in all the intricacies of Kubernetes, but from reviewing the LKE guide, it doesn't appear that you can manually add a GPU instance as a node within the LKE cluster either.
I believe you could instead deploy a GPU Linode external to the cluster and then use a network service inside the cluster to give your containers access to that external GPU Linode. That said, I haven't tried this personally, so your mileage may vary.
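One way the "network service" approach could look is a Kubernetes Service without a selector, paired with a manually defined Endpoints object that points at the external GPU Linode. This is a sketch only: the name `gpu-inference`, the port `8000`, and the IP `192.0.2.10` are all hypothetical stand-ins for whatever service you actually run on the GPU box.

```yaml
# Service with no selector, so Kubernetes will not create
# Endpoints for it automatically.
apiVersion: v1
kind: Service
metadata:
  name: gpu-inference
spec:
  ports:
    - port: 8000
      targetPort: 8000
---
# Manually defined Endpoints pointing at the external GPU
# Linode's IP (192.0.2.10 is a hypothetical placeholder).
apiVersion: v1
kind: Endpoints
metadata:
  name: gpu-inference   # must match the Service name
subsets:
  - addresses:
      - ip: 192.0.2.10
    ports:
      - port: 8000
```

Pods in the cluster could then reach the GPU Linode at `gpu-inference:8000` as if it were an in-cluster service.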
An alternative option might be to stand up your own unmanaged Kubernetes cluster; this should give you the ability to have a GPU Linode as a worker node. While researching this topic, I came across a page in the Kubernetes documentation that describes how to schedule GPUs. In case it's of interest, I'll link it below:
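For reference, the scheduling approach the Kubernetes docs describe relies on a device plugin (e.g. NVIDIA's) running on the GPU worker node, which advertises an extended resource such as `nvidia.com/gpu` that pods can then request. A minimal pod sketch, assuming the NVIDIA device plugin is installed and the image tag is just an example:

```yaml
# Pod requesting one NVIDIA GPU via the extended resource
# advertised by the NVIDIA device plugin on the worker node.
apiVersion: v1
kind: Pod
metadata:
  name: cuda-test
spec:
  restartPolicy: Never
  containers:
    - name: cuda
      image: nvidia/cuda:12.2.0-base-ubuntu22.04  # example image
      command: ["nvidia-smi"]
      resources:
        limits:
          nvidia.com/gpu: 1  # GPUs can only be requested as limits
```

The scheduler will only place this pod on a node that advertises at least one `nvidia.com/gpu`, which on an unmanaged cluster would be your GPU Linode worker.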
With regards to future plans for GPU Linodes and LKE, I'm not sure. In my opinion it would be really cool and I definitely see the value for other users as well. I'll pass your interest in this functionality along to our teams responsible for making those sorts of decisions.
This would be a very useful feature and would bring LKE closer to feature parity with the larger cloud providers (e.g. AWS, GCP, Azure). In fact, it may be the one feature that's preventing us from using Linode for our production workloads.
@RosebudAI-Alex Thanks for sharing your thoughts about this. While I don't have an ETA on this feature yet, I have created an escalation about allowing GPUs to be used for LKE nodes. Please let us know if you have any additional feature requests.