Automated Remote Script Execution on EC2 Instances

Published on 30 June 2022

There are times when you need to run a script automatically on an EC2 instance, but do not want to have to copy an updated version of the script onto the instance each time. This post will look at how you can use EventBridge to fire off a command on an EC2 instance (driven by an event, rather than a schedule) that copies a script from an S3 bucket and executes it locally on the instance.

Create the Bucket and Script

First of all, you need an S3 bucket containing the script that the EC2 instance will call. So you need to create the bucket, then put the file in.

Create the bucket via the Console/CLI/Terraform/CloudFormation, etc. This is how to do it via CLI:

    aws s3api create-bucket --bucket mybucket-gh01 --region eu-west-2 --create-bucket-configuration LocationConstraint=eu-west-2

You can then use the AWS CLI again to copy your script up, or use the console. An example with the CLI:

    aws s3 cp /temp/myscript.ps1 s3://mybucket-gh01/ 

My script was a really simple example that would give me some output:

    $ErrorActionPreference = 'SilentlyContinue'
    $processes = Get-Process
    $time = Get-Date
    foreach ($p in $processes) {
        "$time - $($p.ProcessName)" >> c:\temp\processes.log
    }

SSM Agent

SSM is going to be the tool of choice to run the actual script. The SSM Agent comes preinstalled on many AWS AMIs these days, but you will need to ensure that the IAM role attached to the instance has the right permissions. You can check that the agent is present and running with this on a Linux instance:

    sudo systemctl status amazon-ssm-agent

Or on a Windows instance, this:

    Get-Service AmazonSSMAgent


Don't forget your IAM role for the EC2 instance. See the AWS documentation for details, but if you are adding to an existing instance profile, you need to attach the AmazonSSMManagedInstanceCore policy to your role.

The other thing we want to do is grant the role rights to the S3 bucket that contains the script. The reason for doing that is that you can then edit the script as required without changing command lines, SSM documents, etc. One thing to remember, though, is that this can also be a very bad idea: if someone with nefarious intent gets access to the script, they can do damage. So make sure the security on your bucket is right, and give the IAM role read-only access to the bucket.
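As a sketch, a read-only policy for the instance role might look like this (the bucket name matches the example above; swap in your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::mybucket-gh01/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::mybucket-gh01"
    }
  ]
}
```

Keeping the statement scoped to a single bucket limits the blast radius if the role is ever misused.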

EventBridge Configuration

EventBridge is the newish name for CloudWatch Events, and a crucial part of most automation within AWS.

Once you have created your bucket, copied your script, and added/edited your IAM role, you are ready for the EventBridge part. Go to EventBridge and choose to create a new rule.

Enter the details (name, description) and you then have a couple of choices on how you want to run this: on a schedule, or based on an event. A schedule is pretty simple, and uses cron syntax. The "rule with an event pattern" is really interesting, because that is the one that allows you to do event-based tasks. As an example, you may want to run the script only when a CodeDeploy deployment state changes, when an object is uploaded to S3, or when an EC2 snapshot notification happens. So choose to use an event pattern, and press next.


I chose to match basically any EBS snapshot notification for any resource:


    {
        "source": ["aws.ec2"],
        "detail-type": ["EBS Snapshot Notification"]
    }
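If you only want to react to snapshots that finished successfully, the pattern can be narrowed on the event detail. As a sketch (the detail field names below are assumptions based on the EBS snapshot event schema, so verify them against a sample event in your account):

```json
{
  "source": ["aws.ec2"],
  "detail-type": ["EBS Snapshot Notification"],
  "detail": {
    "event": ["createSnapshot"],
    "result": ["succeeded"]
  }
}
```

Anything not listed in the pattern is ignored when matching, so the broad version above and this narrower one differ only in how many events they let through.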

You then need to choose the destination: when something matches the event pattern, where does the action happen? In my case, I have an instance tagged with SSMRun:Yes.

Next you need to select the target (the destination is where the action will be performed; the target is the action itself). As I wanted to run a PowerShell script, I used "Systems Manager Run Command" as the target, with AWS-RunPowerShellScript as the document. I then set the target key (tag:SSMRun) and the value, "Yes".


If you were going to run bash, you would use AWS-RunShellScript instead.

The document parameters are where we put the code we want to run on the instance. In this case, I want to download the script we uploaded above, and run it. You can add multiple commands here, but I just put it all on one line, separated by semicolons.

    md c:\temp -Force; cd c:\temp; Copy-S3Object -BucketName mybucket-gh01 -Key myscript.ps1 -LocalFile c:\temp\myscript.ps1; c:\temp\myscript.ps1
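Equivalently, the AWS-RunPowerShellScript document takes a commands parameter that accepts a list, so the same steps can be expressed as separate entries. A sketch of what the parameter input looks like (note the doubled backslashes, which are JSON escaping):

```json
{
  "commands": [
    "md c:\\temp -Force",
    "Copy-S3Object -BucketName mybucket-gh01 -Key myscript.ps1 -LocalFile c:\\temp\\myscript.ps1",
    "c:\\temp\\myscript.ps1"
  ]
}
```

Separate entries make the output in the Run Command console easier to read than one long semicolon-joined line.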

AWS will automatically create the IAM role for you, but you might want to check that it is what you need.

So now you need to trigger the event. You can do that by creating a snapshot of a volume attached to the instance; that should match the rule and fire the event.
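From the CLI, that looks like this (the volume ID below is a placeholder; substitute one of your own volumes):

```
aws ec2 create-snapshot \
    --volume-id vol-0123456789abcdef0 \
    --description "EventBridge rule test"
```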

Give it a few minutes, and once it has run, if you look in c:\temp you should see the script copied from the bucket, and processes.log showing the script has run.



You can use the rule's metrics to check when it has run, although it can take a few minutes for the metrics to actually update. You can also use the EventBridge Archive and Replay functionality to replay events (useful if you are testing and don't want to have to keep creating snapshots!)

Apart from that, start simple. There is no CloudWatch Logs integration for the EventBridge target that I could find, but if you run the same commands directly through SSM Run Command with AWS-RunPowerShellScript, it does integrate there, so that is a good option for debugging.
