Amazon S3 (Simple Storage Service) is an Amazon Web Services offering that provides object storage through a web service interface. Amazon S3 can be used to store and retrieve any amount of data from anywhere. Integrating AWS S3 with DronaHQ gives you the flexibility to connect to AWS services easily and work with Amazon S3 data as objects within buckets. You can also share objects in your buckets for a limited period of time using pre-signed URLs.
Configuring S3 connector
The Amazon AWS S3 connector is available under Connectors > APIs.
Add an Amazon AWS S3 account to authenticate, using Connect Amazon AWS Account. You need to first configure the account and then proceed. Once all details are added, click Save.
While generating keys for the IAM role, you need to grant permissions from the AWS console for the account you are trying to connect to. You can also add permissions later, either by using an existing group or by attaching permissions under Security and Credentials and editing the user's permissions.
Users can give the IAM role access to the EC2 machine on AWS for S3 buckets and connect without an explicit AWS secret or other credentials. Simply type Yes in the Use EC2 IAM Access Role field.
Your connector account configuration is now done.
You can also configure S3 URL options to use path-style access and a custom endpoint base URL. You can use these to connect to S3-compatible services like DigitalOcean Spaces or Wasabi.
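The difference between the two addressing styles, and where a custom endpoint fits in, can be sketched as follows. The bucket, key, and the DigitalOcean Spaces endpoint below are illustrative examples, not values from your account:

```python
# Sketch: path-style vs. virtual-hosted-style S3 URLs, and how a custom
# endpoint base URL (for an S3-compatible service) slots in.

def build_object_url(bucket, key, path_style=False, endpoint="s3.amazonaws.com"):
    """Return the URL for an object under the chosen addressing style."""
    if path_style:
        # Path-style: the bucket name appears in the URL path.
        return f"https://{endpoint}/{bucket}/{key}"
    # Virtual-hosted-style: the bucket name appears in the hostname.
    return f"https://{bucket}.{endpoint}/{key}"

# Virtual-hosted style against AWS:
print(build_object_url("my-bucket", "docs/report.pdf"))
# -> https://my-bucket.s3.amazonaws.com/docs/report.pdf

# Path-style against a custom endpoint (e.g. DigitalOcean Spaces):
print(build_object_url("my-bucket", "docs/report.pdf",
                       path_style=True,
                       endpoint="nyc3.digitaloceanspaces.com"))
# -> https://nyc3.digitaloceanspaces.com/my-bucket/docs/report.pdf
```

Some S3-compatible services only support path-style access, which is why the connector exposes both options.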
If you have already configured your account, simply choose it from the list. If you want to make changes to the account configuration, hover over the account to reveal the Edit and Delete options, then click Edit.
Using AWS S3 connector
Now let us consider a few scenarios to understand some of the functionalities available with this integration. You can make use of the connector actions by adding the connector from an Action Flow or a Workflow. You can also use the Bind Data to fetch data from the connector.
When you want to use the AWS S3 connector within your apps, go to the UI Builder > Connectors > Connected tab. All connectors that have accounts configured, in other words those that are connected, are listed here. If you click Manage Account, you can see the list of all available actions for the connector. Add the respective action whenever you want to define further operations. Once you have fetched data using an action, you can use Bind Data to bring that data into a control.
Get a list of buckets
Let us consider a simple example: fetching the list of buckets from your AWS S3 account and using it to provide options in a Dropdown control. Add a Dropdown control and go to Bind Data. Click Select Connector to get the list of connectors. Select Amazon AWS S3 and click Continue. You get the list of available actions for the connector.
Select ListBuckets and click Continue. Select the Account to authenticate.
Add the connector name. If you want to transform the response or add transform keys, make the necessary changes here, then click Finish.
Once the configuration is done you need to select the keys to be used to display data. Click Save.
Now if you run the form, you get the list of buckets from your AWS S3 account.
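Behind the scenes, the ListBuckets action corresponds to the S3 ListBuckets API, which returns an XML document of bucket names. A minimal sketch of extracting those names (the kind of data the dropdown displays) from a sample response; the bucket names and dates below are made up for illustration:

```python
import xml.etree.ElementTree as ET

# A sample (illustrative) ListBuckets XML response, in the shape the
# S3 ListBuckets API returns.
SAMPLE = """\
<ListAllMyBucketsResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Buckets>
    <Bucket><Name>invoices</Name><CreationDate>2023-01-10T00:00:00.000Z</CreationDate></Bucket>
    <Bucket><Name>backups</Name><CreationDate>2023-02-05T00:00:00.000Z</CreationDate></Bucket>
  </Buckets>
</ListAllMyBucketsResult>"""

def bucket_names(xml_text):
    """Pull every <Bucket><Name> out of a ListBuckets response."""
    ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
    root = ET.fromstring(xml_text)
    return [el.text for el in root.findall(".//s3:Bucket/s3:Name", ns)]

print(bucket_names(SAMPLE))  # -> ['invoices', 'backups']
```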
List Objects in a Bucket
Continuing with the same example to get a list of buckets, you can use the selected item from the Dropdown control and use it to display the list of objects from the selected bucket.
Add a Tablegrid control and go to Bind Data. Add the connector and select the ListObjects action. Select the configured account.
You need to provide the Bucket name from where you would list out the objects. Select the dropdown control name from the Use Keywords.
Test the connection and click Finish.
Select the Keys to get the data from the connector.
Now, whenever you preview the app, note that the objects for the selected bucket are listed out.
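It helps to know that S3 has no real folders: ListObjects emulates them with prefix and delimiter parameters. A pure-Python model of that behavior, with illustrative key names, shows how keys under a selected "folder" split into files and sub-folders:

```python
# Sketch: how S3's ListObjects groups keys into files (Contents) and
# folder-like CommonPrefixes, given a Prefix and Delimiter.

def list_objects(keys, prefix="", delimiter="/"):
    files, folders = [], []
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        if delimiter in rest:
            # Everything up to the first delimiter becomes a CommonPrefix.
            folder = prefix + rest.split(delimiter, 1)[0] + delimiter
            if folder not in folders:
                folders.append(folder)
        else:
            files.append(key)
    return {"Contents": files, "CommonPrefixes": folders}

keys = ["reports/2024/q1.pdf", "reports/2024/q2.pdf", "reports/readme.txt"]
print(list_objects(keys, prefix="reports/"))
# -> {'Contents': ['reports/readme.txt'], 'CommonPrefixes': ['reports/2024/']}
```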
Uploading files to the bucket
There are different scenarios where uploading files is useful, be it storing large numbers of supporting documents such as images or PDFs, or backing up files to the S3 bucket. You can use the S3 connector's UploadFile action for this purpose.
Let us assume that we have a form where we select the Bucket name from the dropdown, the folder name to upload the files to, and the File upload control to enable uploading of the file/s. Let us add an action button that triggers the action flow to upload files.
In this scenario, on the button_click event of the action button, you would add the Server-side action > AWS S3 connector and choose the UploadFile action.
Select the Connected account and click Continue.
Use the Dropdown control as the Bucket from the Use keywords, add the folder name (which is optional), and provide the Files using the Fileupload control’s name from the Use keywords. Click Continue.
Add the action name and provide the variable that returns the URL.
If you want, you can use this variable to display the URL in a Text control for the sake of understanding. You can see the selected bucket to which your file is uploaded.
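Conceptually, the bucket, optional folder name, and uploaded file name combine into the object key, and the action's returned URL is derived from that key. A small sketch of that combination, using made-up names (the real connector performs the HTTP upload itself):

```python
# Sketch: how UploadFile's inputs (bucket, optional folder, file name)
# could combine into the object key and the resulting object URL.

def object_key(folder, filename):
    """Join an optional folder prefix and a file name into an S3 key."""
    if folder:
        return folder.strip("/") + "/" + filename
    return filename

def uploaded_url(bucket, folder, filename):
    return f"https://{bucket}.s3.amazonaws.com/{object_key(folder, filename)}"

print(uploaded_url("invoices", "2024/march", "receipt.pdf"))
# -> https://invoices.s3.amazonaws.com/2024/march/receipt.pdf
```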
Generating a pre-signed URL
At times you may want to allow your customers to access a specific object in your bucket for a limited period without AWS security credentials. In that case, you can generate a pre-signed URL, which lets you temporarily share the object without making it public. You can use the GetPreSignedUrl action for this.
Let us take the example where you select the Bucket name from the list of available buckets and display the files and folders in the table grid control.
When you select the file the action is triggered that is defined under action1_click.
Select the Bucketname from the Use Keywords, get the Filename (the object's Key Name) from the Tablegrid as seen in the illustration above, and provide the time in seconds after which the link expires.
So now, whenever you run the form, the URL is returned as a response variable and shown in the Text control.
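For the curious, what GetPreSignedUrl produces under the hood is an AWS Signature Version 4 query-signed URL: the expiry and credentials are baked into the query string and signed, so no AWS secret is exposed to the recipient. A minimal sketch of that signing flow, with entirely fake credentials, bucket, and key; in practice you would let the connector (or an AWS SDK) do this:

```python
import hashlib
import hmac
from urllib.parse import quote

# Sketch of AWS SigV4 query presigning for a GET on an object.
# All inputs below are illustrative placeholders, not real credentials.

def presign_get(bucket, key, access_key, secret_key, region,
                amz_date, expires_seconds):
    host = f"{bucket}.s3.{region}.amazonaws.com"
    datestamp = amz_date[:8]
    scope = f"{datestamp}/{region}/s3/aws4_request"
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires_seconds),
        "X-Amz-SignedHeaders": "host",
    }
    canonical_query = "&".join(
        f"{k}={quote(v, safe='')}" for k, v in sorted(params.items()))
    canonical_request = "\n".join([
        "GET", "/" + quote(key), canonical_query,
        f"host:{host}", "", "host", "UNSIGNED-PAYLOAD"])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest()])

    def _hmac(k, msg):
        return hmac.new(k, msg.encode(), hashlib.sha256).digest()

    # Derive the signing key: date -> region -> service -> "aws4_request".
    signing_key = _hmac(_hmac(_hmac(_hmac(
        ("AWS4" + secret_key).encode(), datestamp), region), "s3"),
        "aws4_request")
    signature = hmac.new(signing_key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return (f"https://{host}/{quote(key)}?{canonical_query}"
            f"&X-Amz-Signature={signature}")

url = presign_get("invoices", "2024/receipt.pdf", "AKIAEXAMPLE",
                  "secretExample", "us-east-1", "20240301T000000Z", 300)
print(url)
```

The expiry you enter in the action becomes the `X-Amz-Expires` parameter; once that many seconds pass after `X-Amz-Date`, S3 rejects the URL.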
Copy files from one bucket to another
To create a copy of a file from one bucket to another, use the CopyObject action, which can copy any object from one bucket to another with a distinct, user-chosen name, provided the object and both buckets are in the same account.
Select the CopyObject endpoint and choose the AWS S3 account.
Fill in the required details such as:
- Copy Source: The source location along with the object name which you want to copy. The format example is provided right below.
- Destination Bucket: The destination bucket name where the object will be copied.
- Destination File Key: The user-chosen name for the copied object.
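The three inputs above can be modelled as a small helper that validates the `bucket/object-key` Copy Source format and assembles the copy request; the names below are illustrative:

```python
# Sketch: assembling CopyObject's inputs. The Copy Source is assumed to be
# in "source-bucket/object-key" format, as the action's hint shows.

def copy_object_request(copy_source, dest_bucket, dest_key):
    src_bucket, _, src_key = copy_source.partition("/")
    if not src_bucket or not src_key:
        raise ValueError("Copy Source must be in 'bucket/object-key' format")
    return {
        "CopySource": {"Bucket": src_bucket, "Key": src_key},
        "Bucket": dest_bucket,
        "Key": dest_key,  # the user-chosen name for the copy
    }

print(copy_object_request("invoices/2024/receipt.pdf",
                          "backups", "receipt-copy.pdf"))
```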
Changes after copy:
Deleting an object
When building a CRUD app, deleting is an essential operation for working with any data store. Studio provides the DeleteObjects action for this purpose.
Select the DeleteObjects endpoint and choose the AWS S3 account.
Provide the necessary details, such as the bucket name where the object to be deleted resides, along with the object name (with its extension). If you pass an array of keys, bulk deletion of objects is also supported.
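The single-key and bulk cases can be sketched as one helper that builds the request in the shape of the S3 DeleteObjects API's `Objects` list; the bucket and key names are illustrative:

```python
# Sketch: DeleteObjects accepts a single key or an array of keys for bulk
# deletion; either way the request carries a list of {"Key": ...} entries.

def delete_objects_request(bucket, keys):
    if isinstance(keys, str):       # allow a single key for convenience
        keys = [keys]
    return {"Bucket": bucket,
            "Delete": {"Objects": [{"Key": k} for k in keys]}}

print(delete_objects_request("invoices", "old/a.pdf"))
print(delete_objects_request("invoices", ["old/a.pdf", "old/b.pdf"]))
```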