How can I use ssh to access my k8s cluster
I just created a k8s cluster and I want to access it remotely. However, the terminal asks me for a password, and I don't know which password to use. I tried the new root password I set on the k8s node, but it doesn't work.
It is unclear what specifically you are trying to log in to. It is also unknown whether you are using the Linode Kubernetes Engine (LKE), a third-party managed Kubernetes service, or your own self-hosted Kubernetes cluster.
In general, a Kubernetes cluster is orchestrated by its control plane. Control of the cluster is performed through the Kubernetes API, which is an HTTP-based REST API. A cluster has worker nodes, worker nodes run pods, and pods contain one or more containers that run microservices such as web servers and databases.
If you are using LKE, management of the control plane is performed for you by Linode and you cannot SSH into a master node. You can use the Cloud Manager to add or remove clusters and allocate worker nodes.
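For LKE you interact with the cluster through the API rather than SSH. A minimal sketch, assuming you have downloaded your cluster's kubeconfig file from the Cloud Manager (the filename below is hypothetical):

```shell
# Point kubectl at the kubeconfig downloaded from the Cloud Manager
# (Kubernetes > your cluster > Download Kubeconfig):
export KUBECONFIG=~/Downloads/my-cluster-kubeconfig.yaml

# Confirm you can reach the LKE control plane over the Kubernetes API:
kubectl get nodes
```

No password prompt is involved here; authentication comes from the credentials embedded in the kubeconfig file.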
Independent K8S Installation
If you have set up your own Kubernetes cluster by installing it on one or more servers, you would generally manage its control plane with ‘kubeadm’, which allows you to do things like initialize the cluster and start or stop the control plane.
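A rough sketch of how a kubeadm-based cluster is typically bootstrapped; the pod network CIDR is an assumption and depends on the CNI plugin you choose:

```shell
# On the server chosen as the control plane node (run as root):
kubeadm init --pod-network-cidr=10.244.0.0/16

# kubeadm init prints a join command for the workers; run it on each
# worker node. It has the general shape:
# kubeadm join <control-plane-ip>:6443 --token <token> \
#   --discovery-token-ca-cert-hash sha256:<hash>
```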
In this case, your master node may have root login via SSH disabled. If the master node is on a Linode, you can log in as root via the LISH Console, and from there configure SSH to permit root login.
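As a sketch, from the LISH Console this usually means editing the OpenSSH server configuration; `prohibit-password` below is a deliberate choice to allow root only with SSH keys, not passwords:

```shell
# Logged in as root via the LISH Console; allow root SSH login
# (key-based only) by adjusting /etc/ssh/sshd_config:
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin prohibit-password/' /etc/ssh/sshd_config

# Reload the SSH daemon to apply the change:
systemctl restart sshd    # on Debian/Ubuntu the unit may be named "ssh"
```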
You would otherwise use programs such as ‘kubectl’ that take your commands and execute them through the Kubernetes API behind the scenes.
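You can see this API traffic directly: kubectl's verbosity flag prints the HTTP requests it issues to the API server. A quick illustration, assuming a working kubeconfig:

```shell
# List pods; -v=8 logs the underlying HTTP calls to the Kubernetes API,
# showing that kubectl is just an API client:
kubectl get pods -v=8
```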
Shell Access to objects in a cluster
Kubectl can give you shell access to a worker node, a pod, or a container within a pod.
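For pods and containers, `kubectl exec` is the usual route. A minimal sketch; the pod and container names are hypothetical:

```shell
# Open an interactive shell inside a running pod:
kubectl exec -it my-pod -- /bin/sh

# If the pod has more than one container, select one with -c:
kubectl exec -it my-pod -c my-container -- /bin/sh
```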
To get access to a worker node, this article may be helpful:
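On clusters running Kubernetes 1.20 or later, `kubectl debug` can also start a debugging pod directly on a worker node; the node name below is hypothetical:

```shell
# Launch an interactive debug container on a worker node:
kubectl debug node/my-worker-node -it --image=busybox
# Inside the debug container, the node's root filesystem is mounted at /host.
```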