#aws
2020-11-30
zackteo 02:11:05

Could someone help me understand what I am missing to allow SSH into my EC2 instances in my CloudFormation template? I'm guessing my connections are being blocked somewhere, but I can't find any semblance of network configuration options for the VPC or subnet https://gist.github.com/zackteo/cc5f3d718f4ada748c1d56805cc97a37
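
For reference: in a CloudFormation-defined VPC, SSH usually fails until the template has an internet gateway, a default route to it, and a security group that opens port 22, and the instance has a public IP. A minimal sketch of those pieces follows; the Vpc and Subnet references are illustrative names, not resources taken from the gist.

```yaml
# Minimal sketch of the network plumbing an instance needs to accept SSH.
# Assumes Vpc and Subnet resources already exist elsewhere in the template.
Resources:
  InternetGateway:
    Type: AWS::EC2::InternetGateway
  GatewayAttachment:
    Type: AWS::EC2::VPCGatewayAttachment
    Properties:
      VpcId: !Ref Vpc                  # illustrative resource name
      InternetGatewayId: !Ref InternetGateway
  PublicRouteTable:
    Type: AWS::EC2::RouteTable
    Properties:
      VpcId: !Ref Vpc
  DefaultRoute:
    Type: AWS::EC2::Route
    DependsOn: GatewayAttachment
    Properties:
      RouteTableId: !Ref PublicRouteTable
      DestinationCidrBlock: 0.0.0.0/0  # all outbound-bound traffic via the IGW
      GatewayId: !Ref InternetGateway
  SubnetRouteAssociation:
    Type: AWS::EC2::SubnetRouteTableAssociation
    Properties:
      SubnetId: !Ref Subnet            # illustrative resource name
      RouteTableId: !Ref PublicRouteTable
  SshSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound SSH
      VpcId: !Ref Vpc
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 22
          ToPort: 22
          CidrIp: 0.0.0.0/0            # tighten to your own IP range in practice
```

The instance then needs this group attached (SecurityGroupIds) and a public address, either AssociatePublicIpAddress on the instance's network interface or MapPublicIpOnLaunch on the subnet.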

zackteo 10:11:15

Also, does anyone know the best way to automate copying the public key from the namenode to the worker nodes in an EC2 cluster in CloudFormation? I want to run ssh hadoop-worker-1 'cat >> ~/.ssh/authorized_keys' < ~/.ssh/id_rsa.pub on my namenode, but I get stopped by a password prompt. I believe I need my AWS key, but I'm not sure what a reasonable way is for my namenode to get it, or what might be a good workaround.

lukasz 14:11:53

I believe AWS Systems Manager is what you're looking for: it can run scripts etc. on your behalf across your EC2 instances https://docs.aws.amazon.com/systems-manager/latest/userguide/execute-remote-commands.html Another approach is to use user-data and cloud-init to build the authorized_keys file on machine boot (that's something I used to do: we'd pull public keys from GitHub's user API and bake an authorized_keys file for a jump host).
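
A hedged sketch of that user-data approach in CloudFormation: each worker appends a known public key on first boot. The NamenodePublicKey parameter, the placeholder AMI id, and the ec2-user home directory (correct for Amazon Linux, different on other AMIs) are all assumptions for illustration.

```yaml
Parameters:
  NamenodePublicKey:
    Type: String                 # paste the namenode's id_rsa.pub here (illustrative)
Resources:
  WorkerInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-xxxxxxxx      # placeholder AMI id
      InstanceType: t3.medium
      UserData:
        Fn::Base64: !Sub |
          #!/bin/bash
          # Runs once at first boot via cloud-init: trust the namenode's key
          echo "${NamenodePublicKey}" >> /home/ec2-user/.ssh/authorized_keys
          chown ec2-user:ec2-user /home/ec2-user/.ssh/authorized_keys
          chmod 600 /home/ec2-user/.ssh/authorized_keys
```

This sidesteps the password prompt entirely: no key ever has to be copied from the namenode at runtime, since the workers already trust it when they come up.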

orestis 15:11:22

I dimly remember there's a way to SSH into an EC2 instance using the AMI instead of dealing with keys etc.

lukasz 15:11:26

You can configure your EC2 instance to accept an SSH key when building it, or pre-bake the authorized_keys file into your AMI

lukasz 15:11:48

Nowadays the best practice is to go through AWS SSM and not use SSH keys at all, delegating instead to your AWS credentials/identity
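
For context, the SSM setup boils down to giving each instance a role that carries the AmazonSSMManagedInstanceCore managed policy. A rough CloudFormation sketch, with illustrative resource names:

```yaml
Resources:
  SsmRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:        # let EC2 assume this role
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal: { Service: ec2.amazonaws.com }
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore
  SsmInstanceProfile:
    Type: AWS::IAM::InstanceProfile
    Properties:
      Roles: [!Ref SsmRole]
  # Attach with IamInstanceProfile: !Ref SsmInstanceProfile on the instance
```

With that in place, and the SSM agent running (it is preinstalled on Amazon Linux 2), you connect with aws ssm start-session --target <instance-id> instead of SSH, so access is governed by IAM rather than key files.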

lukasz 15:11:31

Something I really miss from Google Cloud - you can magically SSH to an instance, provided you have access to the GCP project

lukasz 15:11:38

in AWS it requires some setup