Ansible - Upload files to Amazon Web Services (AWS) S3 Buckets using the s3_object module
by Jeremy Canfield | Updated: July 31, 2023
If you are not familiar with modules, check out Ansible - Getting Started with Modules.
Prerequisites
- Before you can use the Ansible Amazon Web Services (AWS) modules, you will need to install the AWS CLI tool on the hosts that will be running the modules. Check out my article on Getting Started with the Ansible Amazon Web Services (AWS) modules.
- You will also need to set your Amazon Web Services (AWS) Profile Configurations. Check out my article Set Amazon Web Services (AWS) Profile Configurations.
- The s3_object module requires the following packages. Check out my article Resolve "boto3 required for this module".
- botocore version 1.25.0 or higher
- boto3 version 1.22.0 or higher
- Python 3.6 or higher must be used. The ansible --version command can be used to list the version of Python being used with Ansible. If your Ansible installation is using a version of Python lower than 3.6, one solution would be to install Ansible in a Python virtual environment using Python 3.6 or higher.
- The amazon.aws collection will need to be installed. Check out my article on Install a collection using the ansible-galaxy collection install command. Example commands for these prerequisites are shown below this list.
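Here is a minimal sketch of the prerequisite checks and installs, assuming pip and ansible-galaxy are available on the control node (adjust the pip command if you are using a virtual environment or a different package manager). The version pins simply match the minimums listed above.

  ansible --version | grep "python version"
  pip install "botocore>=1.25.0" "boto3>=1.22.0"
  ansible-galaxy collection install amazon.aws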
s3_object can be used to upload files to your Amazon Web Services (AWS) S3 Buckets.
---
- hosts: localhost
  tasks:
    - name: find files
      find:
        paths: /path/to/directory/that/contains/files/you/want/to/upload/to/S3/bucket
      register: find_files

    - name: append files to the 'files' list
      set_fact:
        files: "{{ files | default([]) + [ item.path ] }}"
      with_items: "{{ find_files.files }}"

    - debug:
        var: files

    - name: upload files to S3 bucket
      amazon.aws.s3_object:
        bucket: "{{ s3_bucket_name }}"
        object: "some/folder/in/your/s3/bucket/{{ item | basename }}"
        src: "{{ item }}"
        mode: put
      with_items: "{{ files }}"
...
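Assuming the playbook above is saved as s3_upload.yml (a hypothetical file name) and s3_bucket_name is not defined elsewhere, the playbook could be run like this, where my-example-bucket is a placeholder for your bucket name.

  ansible-playbook s3_upload.yml --extra-vars "s3_bucket_name=my-example-bucket"

Each file returned by the find task should then appear in the bucket below some/folder/in/your/s3/bucket/ with its original file name, since the object parameter appends {{ item | basename }}.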