---
id: s3
title: Amazon S3
---
# Amazon S3
ToolJet can connect to Amazon S3 buckets and perform various operations on them.
## Connection
To establish a connection with the Amazon S3 data source, you can either click on the `+Add new Data source` button located on the query panel or navigate to the [Data Sources](/docs/data-sources/overview/) page from the ToolJet dashboard.
ToolJet supports connecting to AWS S3 using **IAM credentials**, **AWS Instance Profile** or **AWS ARN Role**.
If you are using **IAM credentials**, you will need to provide the following details:
- **Region**
- **Access key**
- **Secret key**
It is recommended to create a dedicated IAM user for this data source so that you can control the access levels of ToolJet.
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/iamnew.png" alt="aws s3 modal" width="600" />
</div>
To connect to AWS S3 using an **AWS Instance Profile**, select the **Use AWS Instance Profile** option. This uses the IAM role attached to the EC2 instance where ToolJet is running.
To access the metadata service of an ECS container and the underlying EC2 instance, ToolJet uses the WebIdentityToken parameter, which is obtained from a successful login with an identity provider.
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/instanew.png" alt="aws s3 modal" width="600" />
</div>
If you are using **AWS ARN Role**, you will need to provide the following details:
- **Region**
- **Role ARN**
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/arnnew.png" alt="aws s3 modal" width="600" />
</div>
Click on the **Test connection** button to verify that the credentials are correct and that the bucket is accessible to the ToolJet server. Click on the **Save** button to save the data source.
:::tip
You can now connect to **[different S3 Hosts using custom endpoints](/docs/how-to/s3-custom-endpoints)**.
:::
## Querying AWS S3
Click on the `+Add` button of the [query manager](/docs/app-builder/query-panel/#add) and select the data source added in the previous step. Select the operation that you want to perform, fill in the required parameters, and click on the **Run** button to run the query.
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/operations3.png" alt="aws s3 query" />
</div>
<br/>
:::info
Query results can be transformed using transformations. Read our [transformations documentation](/docs/tutorial/transformations).
:::
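For example, a transformation can flatten a **List objects in a bucket** result into rows for a table. The response shape below (a `Contents` array of `{Key, Size}` entries) follows the AWS ListObjectsV2 API; verify it against your own query output before relying on it.

```javascript
// Sketch of a transformation over a list-objects result.
// Assumes the AWS ListObjectsV2 response shape: { Contents: [{ Key, Size, ... }] }.
function summarizeListing(data) {
  return (data.Contents || []).map(({ Key, Size }) => ({
    key: Key,
    sizeKB: Math.round(Size / 1024), // human-friendly size in kilobytes
  }));
}
```

Inside a ToolJet transformation the query result is exposed as `data`, so the body would simply be `return summarizeListing(data);`.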
## Query operations
You can create queries for the AWS S3 data source to perform several operations, such as:
1. **[Create a new bucket](#create-a-new-bucket)**
2. **[Read object](#read-object)**
3. **[Upload object](#upload-object)**
4. **[Remove object](#remove-object)**
5. **[List buckets](#list-buckets)**
6. **[List objects in a bucket](#list-objects-in-a-bucket)**
7. **[Signed url for download](#signed-url-for-download)**
8. **[Signed url for upload](#signed-url-for-upload)**
### Create a new bucket
You can create a new bucket in your S3 by using this operation. It requires one parameter - **Bucket** name.
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/createbucket.png" alt="Create a new bucket - S3 operation" />
</div>
### Read object
You can read an object in a bucket by using this operation. It requires two parameters - **Bucket** name and **Key**.
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/readv2.png" alt="aws s3 read object" />
### Upload object
You can use this operation to upload objects (files) to your S3 bucket. It requires four parameters:
1. **Bucket**: Specify the bucket name
2. **Key**: Key of the object/file
3. **Content type**: Specify file type such as text, image etc.
4. **Upload data**: File/object that is to be uploaded.
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/uplobjv2.png" alt="aws s3 upload"/>
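As an illustration, the four parameters fit together like this. All values below are hypothetical; the point is that **Content type** should describe the payload placed in **Upload data**.

```javascript
// Illustrative values only: the bucket name, key, and payload are hypothetical.
// A JSON payload is serialized to a string and uploaded with a matching Content type.
const uploadParams = {
  bucket: 'my-app-bucket',             // hypothetical bucket name
  key: 'reports/summary.json',         // object key to create in the bucket
  contentType: 'application/json',     // must match the payload format
  uploadData: JSON.stringify({ status: 'ok', rows: 42 }),
};
```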
### Remove object
You can use this operation to remove an object from your S3 bucket. It requires two parameters:
1. **Bucket**: Specify the bucket name
2. **Key**: Key of the object/file
<div style={{textAlign: 'center'}}>
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/removeobject.png" alt="Remove object - S3 operation" />
</div>
### List buckets
This operation lists all the buckets in your S3 account. It does not require any parameters.
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/listbucketsv2.png" alt="aws s3 bucket" />
### List objects in a bucket
This operation fetches the list of files in a bucket. It supports the following parameters:
1. **Bucket**: Bucket name (mandatory)
2. **Prefix**: To limit the response to keys that begin with the specified prefix (optional)
3. **Max keys**: The maximum number of keys returned in the response body (optional). Default value is 1000.
4. **Offset**: The key to start with when listing objects in a bucket (optional).
5. **Next Continuation Token**: Indicates to Amazon S3 that the listing is being continued on this bucket with a token. The ContinuationToken is obfuscated and is not a real key (optional).
:::info
**Prefix and Next Continuation Token**
To list only the objects whose keys begin with a specific character or string, use the `Prefix` parameter. For example, to list all the objects that begin with `a`, set the `Prefix` parameter to `a`; to list all the objects that begin with `ab`, set it to `ab`.
The `Next Continuation Token` is used to list the next set of objects in a bucket. The API returns it when the response is truncated, which means there are more keys in the bucket that satisfy the list query. To get the next set of objects, set the `Next Continuation Token` parameter to the returned token and run the query again. The results continue from where the last listing finished.
:::
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/listobjectsv2.png" alt="aws s3 list object" />
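The token-driven loop can be sketched as follows. Here `listPage` is a stand-in for running the **List objects in a bucket** query with the token parameter set; it is an assumption for illustration, not a ToolJet API.

```javascript
// Sketch of paginating through a bucket with Next Continuation Token.
// `listPage(token)` stands in for one run of the list-objects query;
// it resolves to a ListObjectsV2-style page: { Contents, NextContinuationToken }.
async function listAllKeys(listPage) {
  const keys = [];
  let token; // undefined on the first request
  do {
    const page = await listPage(token);
    keys.push(...(page.Contents || []).map((o) => o.Key));
    token = page.NextContinuationToken; // absent when the listing is complete
  } while (token);
  return keys;
}
```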
### Signed url for download
The object owner can share objects with others by creating a presigned URL that uses their own security credentials to grant time-limited permission to download the objects. Creating a presigned URL requires the following parameters:
1. **Bucket**: Name of the bucket that contains the object
2. **Key**: The object key
3. **Expires in**: The expiration time of the URL
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/urldownv2.png" alt="aws s3 signed download" />
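Because the URL is time-limited, a client may want to check whether it is still usable before retrying a download. SigV4 presigned URLs carry `X-Amz-Date` and `X-Amz-Expires` query parameters, which makes a local expiry check possible; this is a sketch against that URL format, not ToolJet-specific behavior.

```javascript
// Sketch: check whether a SigV4 presigned URL has expired, using the
// X-Amz-Date (issue time) and X-Amz-Expires (lifetime in seconds)
// query parameters that such URLs include.
function isSignedUrlExpired(signedUrl, now = new Date()) {
  const u = new URL(signedUrl);
  const issued = u.searchParams.get('X-Amz-Date'); // e.g. 20240329T101500Z
  const ttlSeconds = Number(u.searchParams.get('X-Amz-Expires'));
  // Convert the compact timestamp into an ISO 8601 form Date.parse accepts.
  const issuedMs = Date.parse(
    issued.replace(/^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/, '$1-$2-$3T$4:$5:$6Z')
  );
  return now.getTime() > issuedMs + ttlSeconds * 1000;
}
```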
### Signed url for upload
Presigned URLs are useful when you want a user or customer to be able to upload a specific object to your bucket without requiring them to have AWS security credentials or permissions. Creating a presigned URL requires the following parameters:
1. **Bucket**: Name of the bucket to upload the file to
2. **Key**: The object key
3. **Expires in**: The expiration time of the URL
4. **Content type**: The content type of the object, such as text, image, etc.
<img className="screenshot-full" src="/img/datasource-reference/aws-s3/urluplv2.png" alt="aws s3 signed upload" />
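Once the query returns the presigned URL, the client performs a plain HTTP `PUT` to it. This is a generic sketch of that step; note that the `Content-Type` header of the request must match the **Content type** used when the URL was generated, or S3 typically rejects the upload with a signature error.

```javascript
// Sketch of the client-side upload to a presigned PUT URL.
// `fetchImpl` defaults to the global fetch (Node 18+ / browsers) and is
// injectable so the logic can be exercised without a network call.
async function uploadToSignedUrl(url, body, contentType, fetchImpl = fetch) {
  const res = await fetchImpl(url, {
    method: 'PUT',
    headers: { 'Content-Type': contentType }, // must match the signed Content type
    body,
  });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res.status;
}
```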
:::info
We built an app to view and upload files to AWS S3 buckets. Check out the complete tutorial **[here](https://blog.tooljet.com/build-an-aws-s3-broswer-with-tooljet/)**.
:::