Monday, October 31, 2022

Rotate Keys on multiple EC2 instances using Ansible

Description: Here I explain how to rotate/replace key pairs on EC2 instances using Ansible.

Setup:

  • 2 instances with the same key pair
  • 2 key pairs [one existing and one new]
  • IAM user with Administrator privileges

Procedure:

After creating the two virtual machines with the same key pair, create one more key pair to use as the replacement.
The key names are as follows:
  • Old-key.pem [current key]
  • New-Key.pem [new key]
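For reference, the playbook below generates the new key material itself through the openssh_keypair module; if you would rather create New-Key up front, a roughly equivalent manual command is:

# ssh-keygen -t rsa -b 4096 -f New-Key -N ''

This produces the private key New-Key and the public key New-Key.pub in the current directory.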
Now I am creating an IAM user with Administrator privileges from the AWS console.

After creating the user, download the credentials CSV file; its access key and secret key are referenced in the variable file for authentication.
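The console steps are not shown here; if you prefer the AWS CLI, a roughly equivalent sequence is the following (the user name key-rotation-user is just an example):

# aws iam create-user --user-name key-rotation-user
# aws iam attach-user-policy --user-name key-rotation-user --policy-arn arn:aws:iam::aws:policy/AdministratorAccess
# aws iam create-access-key --user-name key-rotation-user

The last command prints the access key and secret key needed below.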

First, create the variable file key.vars as follows:

access_key: "XXXXXX5UIGMCGDXXXXXX"
secret_key: "XXXXXXVxsLGhbdrqz+I2IhnnrG+XXXXXXX"
region: "us-west-2"        #----> Example: "ap-south-1"
old_key: "Old-key"         #----> Upload this PEM file to the same directory with 400 permission.
new_key: "New-Key"
system_user: "ubuntu"
ssh_port: 22

Note:
  • access_key = IAM user access key
  • secret_key = IAM user secret key
  • region = region hosting the infrastructure
  • old_key = current/existing key name
  • new_key = new key that replaces the old one
  • system_user = ubuntu [I have used Ubuntu as the operating system, so the default user is ubuntu]
  • ssh_port = SSH port of the instances (22 by default)
After creating the variable file, change the permission of both key files to 0400 using chmod.
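With both files in the playbook directory:

# chmod 0400 Old-key.pem New-Key.pem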

Now I am going to create main.yml as follows to replace the key:



---
- name: "Creation of the Ansible Inventory Of EC2 Instances in which Key To Be Rotated"
  hosts: localhost
  vars_files:
    - key.vars
  tasks:

    # ---------------------------------------------------------------
    # Getting Information of the EC2 instances in which Key To Be Rotated
    # ---------------------------------------------------------------
    - name: "Fetching Details About EC2 Instance"
      ec2_instance_info:
        aws_access_key: "{{ access_key }}"
        aws_secret_key: "{{ secret_key }}"
        region: "{{ region }}"
        filters:
          "key-name": "{{ old_key }}"
          instance-state-name: [ "running" ]
      register: ec2

    # ------------------------------------------------------------
    # Creating Inventory Of EC2 With Old SSH-keyPair
    # ------------------------------------------------------------
    - name: "Creating Inventory"
      add_host:
        name: "{{ item.public_ip_address }}"
        groups: "aws"
        ansible_host: "{{ item.public_ip_address }}"
        ansible_port: "{{ ssh_port }}"
        ansible_user: "{{ system_user }}"
        ansible_ssh_private_key_file: "{{ old_key }}.pem"
        ansible_ssh_common_args: "-o StrictHostKeyChecking=no"
      with_items:
        - "{{ ec2.instances }}"
      no_log: true

- name: "Updating SSH-Key Material"
  hosts: aws
  become: true
  gather_facts: false
  vars_files:
    - key.vars
  tasks:

    - name: "Register current SSH authorized_keys file of the system user"
      shell: "cat /home/{{ system_user }}/.ssh/authorized_keys"
      register: oldauth

    - name: "Creating New SSH-Key Material"
      delegate_to: localhost
      run_once: true
      openssh_keypair:
        path: "{{ new_key }}"
        type: rsa
        size: 4096
        state: present

    - name: "Adding New SSH-Key Material"
      authorized_key:
        user: "{{ system_user }}"
        state: present
        key: "{{ lookup('file', new_key + '.pub') }}"

    - name: "Creating SSH Connection Command"
      set_fact:
        ssh_connection: "ssh -o StrictHostKeyChecking=no -i {{ new_key }} {{ ansible_user }}@{{ ansible_host }} -p {{ ansible_port }} 'uptime'"

    - name: "Checking Connectivity To EC2 Using Newly Added Key"
      ignore_errors: true
      delegate_to: localhost
      shell: "{{ ssh_connection }}"

    - name: "Executing the Uptime command on remote servers"
      command: "uptime"
      register: uptimeoutput

    - debug:
        var: uptimeoutput.stdout_lines

    - name: "Removing Old SSH Public Key and adding New SSH Public Key to authorized_keys"
      authorized_key:
        user: "{{ system_user }}"
        state: present
        key: "{{ lookup('file', new_key + '.pub') }}"
        exclusive: true

    - name: "Print Old authorized_keys file"
      debug:
        msg: "SSH Public Keys in Old authorized_keys file are '{{ oldauth.stdout }}'"

    - name: "Print New authorized_keys file"
      shell: "cat /home/{{ system_user }}/.ssh/authorized_keys"
      register: newauth

    - debug:
        msg: "SSH Public Keys in New authorized_keys file are '{{ newauth.stdout }}'"

    - name: "Renaming new Private Key Locally"
      delegate_to: localhost
      run_once: true
      shell: |
        mv {{ new_key }} {{ new_key }}.pem
        chmod 400 {{ new_key }}.pem

    - name: "Removing Old SSH public key From AWS Account"
      delegate_to: localhost
      run_once: true
      ec2_key:
        aws_access_key: "{{ access_key }}"
        aws_secret_key: "{{ secret_key }}"
        region: "{{ region }}"
        name: "{{ old_key }}"
        state: absent

    - name: "Adding New SSH public key to AWS Account"
      delegate_to: localhost
      run_once: true
      ec2_key:
        aws_access_key: "{{ access_key }}"
        aws_secret_key: "{{ secret_key }}"
        region: "{{ region }}"
        name: "{{ new_key }}"
        key_material: "{{ lookup('file', new_key + '.pub') }}"
        state: present
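A note on dependencies: ec2_instance_info and ec2_key require boto3/botocore on the control node, and on Ansible 2.10+ they ship in the amazon.aws collection (openssh_keypair in community.crypto). If they are missing, something like this should set them up:

# pip install boto3 botocore
# ansible-galaxy collection install amazon.aws community.crypto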

After saving the above file, open a terminal and run the Ansible playbook with the following command. Kindly note that the ansible-playbook command should be run as the root user or as a user with sudo rights.

# ansible-playbook main.yml

Now verify the SSH connection with the new key.
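For example, assuming 203.0.113.10 is one of the instances' public IPs (a placeholder value):

# ssh -i New-Key.pem ubuntu@203.0.113.10 'uptime'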
