| id | title |
|---|---|
| s3 | Amazon S3 |
Amazon S3
ToolJet can connect to Amazon S3 buckets and perform various operations on them.
Connection
To establish a connection with the Amazon S3 data source, you can either click on the +Add new Data source button located on the query panel or navigate to the Data Sources page from the ToolJet dashboard.
ToolJet supports connecting to AWS S3 using IAM credentials, an AWS Instance Profile, or an AWS ARN Role.
If you are using IAM credentials, you will need to provide the following details:
- Region
- Access key
- Secret key
It is recommended to create a dedicated IAM user for this data source so that you can control the access levels of ToolJet.
To connect to AWS S3 using an AWS Instance Profile, select the Use AWS Instance Profile option. This uses the IAM role attached to the EC2 instance where ToolJet is running. To access the metadata service of an ECS container or an EC2 instance, ToolJet uses the WebIdentityToken parameter, which is obtained from a successful login with an identity provider.
If you are using AWS ARN Role, you will need to provide the following details:
- Region
- Role ARN
Click on the Test connection button to verify that the credentials are correct and that the ToolJet server can access S3. Click on the Save button to save the data source.
:::tip You can now connect to different S3 Hosts using custom endpoints. :::
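Under the hood, AWS SDKs accept a custom endpoint URL for S3-compatible hosts. A minimal sketch of how such a client configuration could be assembled (the helper function and the MinIO endpoint are illustrative, not part of ToolJet):

```python
def build_s3_client_config(region, access_key, secret_key, endpoint=None):
    """Assemble keyword arguments for an S3 client.

    `endpoint` is only needed for S3-compatible hosts (e.g. a self-hosted
    MinIO server); when omitted, the SDK targets AWS's own endpoints.
    """
    config = {
        "region_name": region,
        "aws_access_key_id": access_key,
        "aws_secret_access_key": secret_key,
    }
    if endpoint:
        # boto3 and most AWS SDKs accept a custom endpoint URL this way.
        config["endpoint_url"] = endpoint
    return config

# Hypothetical values for illustration only.
cfg = build_s3_client_config("us-east-1", "AKIAEXAMPLE", "secret",
                             endpoint="https://minio.example.com:9000")
```

When the endpoint is omitted, the configuration contains only the region and credentials, and the SDK resolves the standard AWS S3 endpoint for that region.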
Querying AWS S3
Click on the +Add button in the query manager and select the data source added in the previous step. Select the operation that you want to perform, fill in the required parameters, and click on the Run button to run the query.
:::info Query results can be transformed using transformations. Read our transformations documentation. :::
Query operations
You can create queries for the AWS S3 data source to perform several actions such as:
- Create a new bucket
- Read object
- Upload object
- Remove object
- List buckets
- List objects in a bucket
- Signed url for download
- Signed url for upload
Create a new bucket
You can create a new bucket in S3 using this operation. It requires one parameter - Bucket name.
Read object
You can read an object in a bucket by using this operation. It requires two parameters - Bucket name and Key.
Upload object
You can use this operation to upload objects (files) to your S3 bucket. It requires four parameters:
- Bucket: Specify the bucket name
- Key: Key of the object/file
- Content type: Specify file type such as text, image etc.
- Upload data: File/object that is to be uploaded.
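These four parameters correspond to the fields of an S3 PutObject-style request. A small pure-Python sketch of how they fit together (the helper name is ours; the standard `mimetypes` module guesses a content type from the key's extension when one isn't supplied):

```python
import mimetypes

def build_put_object_params(bucket, key, upload_data, content_type=None):
    """Assemble the parameters for an S3 PutObject-style upload."""
    if content_type is None:
        # Guess from the key's extension, falling back to generic binary.
        content_type = (mimetypes.guess_type(key)[0]
                        or "application/octet-stream")
    return {
        "Bucket": bucket,
        "Key": key,                  # the object's full key, e.g. "notes/hello.txt"
        "ContentType": content_type,
        "Body": upload_data,         # bytes or a file-like object
    }

params = build_put_object_params("my-bucket", "notes/hello.txt", b"hi")
```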
Remove object
You can use this operation to remove an object from your S3 bucket. It requires two parameters:
- Bucket: Specify the bucket name
- Key: Key of the object/file
List buckets
This operation will list all the buckets in your S3. This does not require any parameter.
List objects in a bucket
This operation will fetch the list of all the files in your bucket. It takes the following parameters:
- Bucket: Bucket name (mandatory)
- Prefix: Limits the response to keys that begin with the specified prefix (optional)
- Max keys: The maximum number of keys returned in the response body (optional). The default value is 1000.
- Offset: The key to start with when listing objects in a bucket (optional).
- Next Continuation Token: Indicates to Amazon S3 that the listing of this bucket is being continued with a token. The token is obfuscated and is not a real key (optional).
:::info
Next Continuation Token
To list the objects in a bucket that begin with a specific character or prefix, use the Offset parameter. For example, to list all the objects that begin with a, set the Offset parameter to a. Similarly, to list all the objects that begin with ab, set the Offset parameter to ab.
The Next Continuation Token is used to list the next set of objects in a bucket. It is returned by the API when the response is truncated: the results will contain a Next Continuation Token if there are more keys in the bucket that satisfy the list query. To get the next set of objects, set the Next Continuation Token parameter and run the query again. The results will continue from where the last listing finished.
:::
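The continuation flow above can be sketched with a stand-in for the S3 API: each call returns at most Max keys results, plus a continuation token while the listing is truncated, and the caller feeds that token back in until it disappears. (The `fake_list_objects` function below only simulates S3's behaviour; it is not a real client.)

```python
ALL_KEYS = [f"logs/day-{i:02d}.txt" for i in range(1, 8)]  # 7 objects total

def fake_list_objects(prefix="", max_keys=3, continuation_token=None):
    """Stand-in for S3 listing: prefix filter plus a page of max_keys."""
    matches = [k for k in ALL_KEYS if k.startswith(prefix)]
    start = int(continuation_token) if continuation_token else 0
    resp = {"Contents": matches[start:start + max_keys]}
    if start + max_keys < len(matches):
        # Real S3 returns an opaque, obfuscated token; an index stands in here.
        resp["NextContinuationToken"] = str(start + max_keys)
    return resp

def list_all(prefix=""):
    """Keep re-querying with the returned token until the listing completes."""
    keys, token = [], None
    while True:
        resp = fake_list_objects(prefix=prefix, continuation_token=token)
        keys.extend(resp["Contents"])
        token = resp.get("NextContinuationToken")
        if token is None:          # no token => no more keys to list
            return keys
```

Each iteration continues from where the previous page finished, so `list_all` eventually accumulates every matching key.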
Signed url for download
The object owner can share objects with others by creating a presigned URL, using their own security credentials, to grant time-limited permission to download the objects. To create a presigned URL, the required parameters are:
- Bucket: name of the bucket containing the file
- Key: the object key
- Expires in: the expiration time of the URL
Signed url for upload
Presigned URLs are useful when you want a user or customer to be able to upload a specific object to your bucket without needing AWS security credentials or permissions. To create a presigned URL, the required parameters are:
- Bucket: name of the bucket to upload the file to
- Key: the object key
- Expires in: the expiration time of the URL
- Content type: the content type, such as text, image, etc.
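Both signed-URL operations rely on AWS Signature Version 4 query-string signing: the URL carries the credential scope, timestamp, expiry, and an HMAC-SHA256 signature, all computed locally with the caller's credentials (no call to AWS is needed to create the URL). A condensed pure-Python sketch of that signing flow for a download URL (illustrative only; real SDKs also handle session tokens, encoding edge cases, and endpoint variants):

```python
import hashlib
import hmac
from datetime import datetime, timezone
from urllib.parse import quote, urlencode

def presign_get(bucket, key, region, access_key, secret_key, expires_in=3600):
    """Build an S3 presigned GET URL (AWS Signature Version 4,
    query-string style). Simplified: virtual-hosted URL, no session token."""
    now = datetime.now(timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    host = f"{bucket}.s3.{region}.amazonaws.com"
    scope = f"{datestamp}/{region}/s3/aws4_request"

    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires_in),   # the "Expires in" parameter
        "X-Amz-SignedHeaders": "host",
    }
    canonical_query = urlencode(sorted(params.items()), quote_via=quote)
    canonical_request = "\n".join([
        "GET", "/" + quote(key), canonical_query,
        f"host:{host}\n", "host", "UNSIGNED-PAYLOAD",
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key: secret -> date -> region -> service -> request.
    k = ("AWS4" + secret_key).encode()
    for part in (datestamp, region, "s3", "aws4_request"):
        k = hmac.new(k, part.encode(), hashlib.sha256).digest()
    signature = hmac.new(k, string_to_sign.encode(), hashlib.sha256).hexdigest()
    return f"https://{host}/{quote(key)}?{canonical_query}&X-Amz-Signature={signature}"

url = presign_get("my-bucket", "report.pdf", "us-east-1", "AKIAEXAMPLE", "secret", 900)
```

Anyone holding the URL can fetch the object until the expiry elapses, at which point S3 rejects the signature; the upload variant works the same way but signs a PUT request and includes the content type.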
:::info We built an app to view and upload files to AWS S3 buckets. Check out the complete tutorial here. :::