Compare commits

...

761 commits

Author SHA1 Message Date
Adish M
e995fc1c5f
Merge pull request #16027 from ToolJet/test/workflow-update-develop
2026-04-21 15:38:42 +05:30
ajith-k-v
81d10bed0a Update platform cypress workflow on develop 2026-04-21 15:37:05 +05:30
Adish M
d9071ad6d6
9PM IST (#16014) 2026-04-21 11:40:01 +05:30
Souvik
14ad0f7a19 9PM IST 2026-04-21 00:14:36 +05:30
Adish M
afab2c6247
Merge pull request #15928 from ToolJet/adishM98-patch-2
Update Node setup and dependency installation steps
2026-04-15 11:55:14 +05:30
Adish M
fa1eaf26f4
Update Node setup and dependency installation steps 2026-04-15 11:54:55 +05:30
Adish M
c2106ca487
Merge pull request #15927 from ToolJet/adishM98-patch-2
Add GitHub Actions workflow for Storybook deployment
2026-04-15 11:51:03 +05:30
Adish M
507d693c83
Add GitHub Actions workflow for Storybook deployment
This workflow automates the deployment of Storybook to Netlify upon pull request closure or manual trigger.
2026-04-15 11:50:27 +05:30
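A workflow with the triggers this commit describes (deploy on pull request closure or manual trigger) would look roughly like the sketch below. The job layout, build command, and secret names are assumptions for illustration, not taken from the repository:

```yaml
# Hypothetical sketch of a Storybook -> Netlify deploy workflow.
name: deploy-storybook
on:
  pull_request:
    types: [closed]
  workflow_dispatch:

jobs:
  deploy:
    # deploy only when the PR was actually merged, not merely closed
    if: github.event_name == 'workflow_dispatch' || github.event.pull_request.merged == true
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build-storybook
      - run: npx netlify deploy --dir=storybook-static --prod
        env:
          NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}  # assumed secret name
          NETLIFY_SITE_ID: ${{ secrets.NETLIFY_SITE_ID }}        # assumed secret name
```

The `merged == true` guard matters because `pull_request: types: [closed]` also fires for PRs closed without merging.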
Adish M
87a935e63d
Merge pull request #15915 from ToolJet/adishM98-patch-2
Change base and ref branches in workflow file
2026-04-14 11:37:32 +05:30
Adish M
105531b8ce
Change base and ref branches in workflow file 2026-04-14 11:37:11 +05:30
Adish M
8500dc66f9
Merge pull request #15913 from ToolJet/adishM98-patch-2
Update deployment targets from Netlify to Cloudflare
2026-04-14 11:06:32 +05:30
Adish M
c8fe0c6bf2
Update deployment targets from Netlify to Cloudflare 2026-04-14 11:06:10 +05:30
github-actions[bot]
ccf6496958
docs: update LTS version table (#15905)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-13 15:13:02 +05:30
github-actions[bot]
c5b483da06
docs: update LTS version table (#15886)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-10 15:03:50 +05:30
github-actions[bot]
d103bc54fa
docs: update LTS version table (#15864)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-08 15:00:49 +05:30
github-actions[bot]
be16229ec3
docs: update LTS version table (#15839)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-06 15:05:12 +05:30
github-actions[bot]
861115284f
docs: update LTS version table (#15820)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-03 14:54:23 +05:30
github-actions[bot]
108f4f259f
docs: update LTS version table (#15790)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-04-01 15:02:18 +05:30
Adish M
e21d6087fd
Merge pull request #15761 from ToolJet/license-compliance-workflow-01
Added license Compliance check for default branch
2026-03-31 17:13:04 +05:30
Adish M
0357060e63
Added security fixes (#15766) 2026-03-31 15:08:50 +05:30
Souvik
5e370dece9 Added security fixes 2026-03-30 21:32:40 +05:30
Souvik
fdc10d7dc8 Added license Compliance check for default branch 2026-03-30 20:10:28 +05:30
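A license-compliance gate on the default branch, as this commit describes, is commonly a small workflow wrapped around a checker CLI. A sketch under assumptions — the tool choice, branch name, and disallowed-license list are illustrative, not from the repository:

```yaml
# Hypothetical license-compliance workflow for the default branch.
name: license-compliance
on:
  push:
    branches: [develop]  # default branch name is an assumption

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      # fail the build if any dependency carries a disallowed license
      - run: npx license-checker --failOn "GPL;AGPL"
```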
github-actions[bot]
91db41c758
docs: update LTS version table (#15758)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-30 15:08:39 +05:30
github-actions[bot]
1d51dc9cd6
docs: update LTS version table (#15735)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-27 14:53:35 +05:30
github-actions[bot]
5bb37c36c3
docs: update LTS version table (#15691)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-25 14:53:26 +05:30
Midhun Kumar E
ea8f4b03a6
update seeding (#15679) 2026-03-25 08:10:04 +05:30
Adish M
d963256e66
Merge pull request #15672 from ToolJet/adishM98-patch-2
Update docs-netlify.yml
2026-03-24 11:03:31 +05:30
Adish M
ef02052ab5
Update docs-netlify.yml 2026-03-24 11:03:14 +05:30
Adish M
6804a0afb1
updated new host (#15652) 2026-03-23 15:43:54 +05:30
Yukti Goyal
f7dde80801 updated new host 2026-03-23 15:42:00 +05:30
github-actions[bot]
53780ad9b1
docs: update LTS version table (#15650)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-23 14:58:38 +05:30
github-actions[bot]
ddeb0d2e90
docs: update LTS version table (#15638)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-20 14:48:39 +05:30
Adish M
3d9666428f
Merge pull request #15636 from ToolJet/fix-render-auto-suspender
Aligning Auto-Suspend with lts-3.16
2026-03-20 13:27:37 +05:30
Souvik
180a3554df Aligning Auto-Suspend with lts-3.16 2026-03-20 13:24:26 +05:30
Adish M
178b479d73 fix: update Netlify deployment trigger from pull_request to push 2026-03-20 10:56:07 +05:30
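Switching a workflow's trigger from `pull_request` to `push`, as this fix describes, amounts to swapping the `on:` block. Roughly (the branch name is an assumption):

```yaml
# Before: ran when pull requests were opened or updated
# on:
#   pull_request:
#     branches: [develop]

# After: runs on pushes to the branch itself (i.e. after merge)
on:
  push:
    branches: [develop]  # branch name is an assumption
```

The practical difference is that `push` fires once per merged commit on the target branch, while `pull_request` fires on every PR update and runs against the merge preview.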
github-actions[bot]
5be73f5e20
docs: update LTS version table (#15604)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-18 14:54:25 +05:30
Adish M
7354cbd604
Merge pull request #15586 from ToolJet/adishM98-patch-2
Add pull_request trigger for Netlify deployment
2026-03-17 16:53:48 +05:30
Adish M
af244ab6fb
Add pull_request trigger for Netlify deployment 2026-03-17 16:53:34 +05:30
Adish M
fd6611dadc fix: update branch input references in deployment workflow 2026-03-17 11:44:09 +05:30
Adish M
02d7124f24
Merge pull request #15584 from ToolJet/automation/mv-neto-page
refactor: update deployment workflow for Cloudflare Pages integration
2026-03-17 11:24:11 +05:30
Adish M
d315f7e9f7 refactor: update deployment workflow for Cloudflare Pages integration 2026-03-17 11:23:21 +05:30
Adish M
4d954d46f2
Update Netlify workflow for documentation deployment (#15573) 2026-03-16 18:30:45 +05:30
Adish M
ed5b7919ab
Update Netlify workflow for documentation deployment 2026-03-16 18:29:59 +05:30
github-actions[bot]
9e93225f8f
docs: update LTS version table (#15568)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-16 15:00:42 +05:30
Adish M
625c91b9d3
Merge pull request #15561 from ToolJet/gyrpe-automated-slack-notify-develop 2026-03-13 21:04:12 +05:30
Adish M
3821ae520a
Merge pull request #15560 from ToolJet/vulnerability-ci-update-2.0-develop 2026-03-13 21:03:42 +05:30
Souvik
53498e96c3 Aligning to lts-3.16 2026-03-13 21:00:46 +05:30
Souvik
b0b7c786f2 Aligning to lts-3.16 2026-03-13 20:38:30 +05:30
github-actions[bot]
dc7aa5c07c
docs: update LTS version table (#15550)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-13 14:48:13 +05:30
Adish M
3621f2569f
Merge pull request #15546 from ToolJet/adishM98-patch-2
Update Netlify workflow for documentation branch
2026-03-13 12:41:55 +05:30
Adish M
5fa054215f
Update Netlify workflow for documentation branch 2026-03-13 12:41:42 +05:30
Adish M
230c058993
Merge pull request #15541 from ToolJet/docs/remove-deprecated-versions
[docs]: Remove Deprecated Versions
2026-03-13 11:11:53 +05:30
rudrapratik30
7a303663ae [docs]: Remove Deprecated Versions 2026-03-12 18:10:59 +05:30
Pratik Agrawal
727b7ae951
[docs]: Workflow Versions & Environments (#15478)
* basic changes

* final changes

* remove edges
2026-03-12 11:27:01 +05:30
Aditya Joshi
79ddbaf42c
[docs]: Added examples to workflow nodes (#14729)
* [docs]: Added examples to workflow nodes

* [docs]: Added more examples to workflow nodes & further simplified the explanation

* [docs]: Updated package-lock.json to latest develop branch

* [docs]: Fixed changes received in comments for workflow nodes examples

* [docs]: Started with example 2 on loop node

* [docs]: Added another example to loop node

* [docs]: Updated workflow example nodes with the comments

* [docs]: Resolved comments on GitHub PR

* [docs]: update screenshots in postgresql data source

* [docs]: Added GUI mode operations Postgresql

* [docs]: re-update gui mode in postgresql

* Update postgresql.md

---------

Co-authored-by: Nayana N <nayananagaraj10@gmail.com>
Co-authored-by: Pratik Agrawal <pratik104agrawal@gmail.com>
2026-03-12 10:42:28 +05:30
github-actions[bot]
7e945976ff
docs: update LTS version table (#15527)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-11 14:49:57 +05:30
Adish M
985bcfd2bb
Update Cypress Marketplace workflow to align with Platform pipeline (#15503)
- Remove CE build, keep EE only with run-cypress-marketplace-ee label
- Add missing steps: disk cleanup, debug labels, log matrix, view docker-compose,
  test DB connection, create delete_user procedure
- Update Docker Compose to v2.27.0
- Add permissions block and TIMESTAMP env
- Align environment variables with Platform
- Disable Cypress action caching (install: false + manual npm ci)
- Add server readiness check via log-based detection

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-10 15:41:38 +05:30
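Two of the bullet points above — the label-gated EE run and disabled Cypress action caching with a manual `npm ci` — could look roughly like the following. The label name comes from the commit message; the step details, log line, and timeout are assumptions:

```yaml
# Hypothetical sketch of the label-gated EE Cypress job.
jobs:
  cypress-ee:
    # run only when the PR carries the gating label named in the commit
    if: contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ee')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci  # manual install, since the action's caching is disabled below
      - name: Wait for server readiness via log-based detection
        run: |
          timeout 300 bash -c \
            'until docker compose logs server 2>/dev/null | grep -q "Ready to use"; do sleep 5; done'
        # the grepped log line is an assumption
      - uses: cypress-io/github-action@v6
        with:
          install: false  # skip the action's own install/cache step
```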
emidhun
cf2b8e29cb Update Cypress Marketplace workflow to align with Platform pipeline
- Remove CE build, keep EE only with run-cypress-marketplace-ee label
- Add missing steps: disk cleanup, debug labels, log matrix, view docker-compose,
  test DB connection, create delete_user procedure
- Update Docker Compose to v2.27.0
- Add permissions block and TIMESTAMP env
- Align environment variables with Platform
- Disable Cypress action caching (install: false + manual npm ci)
- Add server readiness check via log-based detection

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-10 21:06:38 +11:00
github-actions[bot]
13bddce444
docs: update LTS version table (#15481)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-09 14:52:58 +05:30
Adish M
927ebbd172
Merge pull request #15279 from ToolJet/docs/algolia-broken-url
[docs]: Fix Algolia Broken URLs
2026-03-09 12:13:17 +05:30
Pratik Agrawal
1b699934b7
Merge branch 'develop' into docs/algolia-broken-url 2026-03-09 12:11:36 +05:30
Adish M
c9f71055b6
[docs]: Update AI Data Policy (#15464)
* [docs]: Accordion Component

* [docs]: Reorderable List

* [docs]: Update AI Data Policy
2026-03-06 16:55:08 +05:30
Pratik Agrawal
b49b7bdd14
Merge pull request #15459 from ToolJet/docs/accordion-component
[docs]: Accordion Component
2026-03-06 16:51:03 +05:30
Pratik Agrawal
e62b6b9ef5
Merge pull request #15460 from ToolJet/docs/reorderable-list
[docs]: Reorderable List
2026-03-06 16:50:33 +05:30
rudrapratik30
5d4a2fe715 [docs]: Update AI Data Policy 2026-03-06 16:49:06 +05:30
github-actions[bot]
52aa260371
docs: update LTS version table (#15463)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-06 14:47:15 +05:30
rudrapratik30
88334da193 [docs]: Reorderable List 2026-03-06 12:41:22 +05:30
rudrapratik30
cbdceb450d [docs]: Accordion Component 2026-03-06 12:14:15 +05:30
Adish M
a2fa230eae
[docs]: Update AI Privacy Policy (#15437)
* [docs]: added ssh tunnelling in mssql data source page

* [docs]: added ssh tunnelling in mysql data source page

* improvements in mssql.md

* improvements in mysql.md

* [docs]: added awsv4 auth connection in restapi data source

* Update authentication.md

* [docs]: added ssh tunnelling connection in mongodb datasource

* [docs]: added ssh tunnelling connection in pgsql datasource

* Update mongodb.md

* Update postgresql.md

* [docs]: Update AI Privacy Policy

---------

Co-authored-by: Nayana N <nayananagaraj10@gmail.com>
2026-03-05 11:37:46 +05:30
Pratik Agrawal
2496f74270
Merge pull request #15426 from ToolJet/docs/aws-restapi
[docs]: AWSv4-Restapi Auth Configuration
2026-03-05 11:35:41 +05:30
rudrapratik30
0ca9e5b90d [docs]: Update AI Privacy Policy 2026-03-05 11:34:52 +05:30
Pratik Agrawal
43c3bd5ff6
Merge pull request #15401 from ToolJet/docs/ssh-tunnelling-datasources
[docs]: add SSH tunnelling in data sources
2026-03-05 11:34:04 +05:30
Pratik Agrawal
ad1c57bc26
Update postgresql.md 2026-03-05 11:32:46 +05:30
Pratik Agrawal
4239fbef9a
Update mongodb.md 2026-03-05 11:32:29 +05:30
Nayana N
0a79df9333 [docs]: added ssh tunnelling connection in pgsql datasource 2026-03-05 05:19:34 +00:00
Nayana N
702e3d1186 [docs]: added ssh tunnelling connection in mongodb datasource 2026-03-04 12:46:01 +00:00
github-actions[bot]
40bec54eee
docs: update LTS version table (#15427)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-04 14:49:00 +05:30
Pratik Agrawal
a95aadcc7b
Update authentication.md 2026-03-04 14:29:12 +05:30
Adish M
c5fc95a86c
Merge pull request #15400 from ToolJet/feat/cf-pages-prod
feat: add Cloudflare Pages production deployment workflow
2026-03-04 13:40:08 +05:30
Nayana N
3f18babc81 [docs]: added awsv4 auth connection in restapi data source 2026-03-04 08:02:21 +00:00
Adish M
3458600254
Merge pull request #15421 from ToolJet/update-render-deploy-workflow-develop-02
Added changes from lts-3.16
2026-03-03 20:54:41 +05:30
Souvik
48209f6035 added changes from lts-3.16 2026-03-03 20:49:19 +05:30
Pratik Agrawal
73a23412ed
improvements in mysql.md 2026-03-03 13:56:22 +05:30
Pratik Agrawal
db2cef397b
improvements in mssql.md 2026-03-03 13:54:00 +05:30
Nayana N
3dbe04c500 [docs]: added ssh tunnelling in mysql data source page 2026-03-03 06:59:04 +00:00
Nayana N
9b46ca939a [docs]: added ssh tunnelling in mssql data source page 2026-03-02 12:35:16 +00:00
Adish M
35853a7342 fix: update Cloudflare account ID reference in production deployment workflow 2026-03-02 16:00:59 +05:30
Adish M
40cfe5f4fd fix: update Cloudflare API token reference in deployment workflow 2026-03-02 15:55:25 +05:30
Adish M
cb4b792897 fix: update Cloudflare Pages API token reference in deployment workflow 2026-03-02 15:53:53 +05:30
Adish M
f74af24b37 feat: add Cloudflare Pages production deployment workflow 2026-03-02 15:27:59 +05:30
github-actions[bot]
9326d284a3
docs: update LTS version table (#15398)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-03-02 14:53:50 +05:30
github-actions[bot]
c2903613a0
docs: update LTS version table (#15383)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-02-27 14:49:35 +05:30
Adish M
0f600aac5f
Merge pull request #15339 from ToolJet/docs/app-gen-with-existing-db
[docs]: App gen with existing DB tables
2026-02-26 13:58:48 +05:30
Adish M
2568090c1e
Merge pull request #15365 from ToolJet/adishM98-patch-2
Refactor Cloudflare Pages deployment command
2026-02-26 13:36:34 +05:30
Adish M
4895ebdd46 fix: update Cloudflare Pages deployment to set production branch dynamically 2026-02-26 12:07:12 +05:30
Adish M
25bef13b58 fix: update deployment script to target production branch directly 2026-02-26 11:44:49 +05:30
Adish M
fa6f52db73 fix: update build command to include plugin and frontend builds 2026-02-26 10:22:35 +05:30
Adish M
4fcb65a8bf fix: update build command to use cloud build process 2026-02-26 10:17:00 +05:30
Adish M
e36629dd74 refactor: remove large asset upload process from Cloudflare Pages deployment 2026-02-26 10:14:18 +05:30
Adish M
f4161d3cf5 fix: update large asset upload process to add redirects and improve handling 2026-02-25 18:27:33 +05:30
Adish M
c2d9198cb9 feat: add large asset handling to S3 in deployment workflow 2026-02-25 18:15:00 +05:30
Adish M
524cc210b8 fix: update deployment script to use production branch alias 2026-02-25 17:40:36 +05:30
Adish M
f6bdfb3c6a
Refactor Cloudflare Pages deployment command 2026-02-25 17:36:56 +05:30
Adish M
10857e1678
Add Cloudflare Pages deployment workflow (#15364)
This workflow automates deployment to Cloudflare Pages for the Cloud Frontend project, including user authorization checks, Git operations, dependency installation, project building, and cache purging.
2026-02-25 16:13:31 +05:30
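The deploy step of a workflow like the one described here typically reduces to a single `wrangler` invocation; passing `--branch` is also how the production branch can be set dynamically, as the later fix commits in this range mention. A sketch — the project name, build command, and secret names are assumptions:

```yaml
# Hypothetical deploy step for Cloudflare Pages.
steps:
  - uses: actions/checkout@v4
  - run: npm ci && npm run build
  - name: Deploy to Cloudflare Pages
    # --branch controls which branch alias Wrangler deploys to
    run: >
      npx wrangler pages deploy ./build
      --project-name=cloud-frontend
      --branch=${{ github.ref_name }}
    env:
      CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}    # assumed secret name
      CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}  # assumed secret name
```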
Adish M
1ecd1eca10
Add Cloudflare Pages deployment workflow
This workflow automates deployment to Cloudflare Pages for the Cloud Frontend project, including user authorization checks, Git operations, dependency installation, project building, and cache purging.
2026-02-25 16:13:15 +05:30
github-actions[bot]
da967142d6
docs: update LTS version table (#15363)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-02-25 14:53:38 +05:30
rudrapratik30
1457bc5998 screenshot updates 2026-02-25 12:16:21 +05:30
rudrapratik30
d8f197aaa9 feedback update 2026-02-25 10:37:28 +05:30
Adish M
14ef545d0c
Merge pull request #15319 from ToolJet/docs/pocketbase-salesforce-supabase
[docs] : update images in pocketbase-salesforce-supabase
2026-02-24 15:10:25 +05:30
Pratik Agrawal
d281f59e20
Merge pull request #15093 from ToolJet/docs/cohere-easypost-engagespot
[docs]: update images in cohere-easypost-engagespot plugin
2026-02-24 15:09:18 +05:30
Pratik Agrawal
14140db4b9
Merge branch 'docs/pocketbase-salesforce-supabase' into docs/cohere-easypost-engagespot 2026-02-24 15:08:53 +05:30
rudrapratik30
904eea8400 final changes 2026-02-24 15:07:55 +05:30
rudrapratik30
18be5b8e46 finishing changes 2026-02-24 15:01:42 +05:30
rudrapratik30
bec189419a add to sidebar 2026-02-24 10:48:10 +05:30
rudrapratik30
a500ebb736 privacy 2026-02-23 15:18:54 +05:30
rudrapratik30
7ec84c4d78 [docs]: App gen with existing DB tables 2026-02-23 15:13:56 +05:30
github-actions[bot]
e78c2f4cf7
docs: update LTS version table (#15337)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-02-23 14:55:01 +05:30
Nayana N
5f0cf66b8d [docs]: update images, query's broken url in supabase plugin 2026-02-23 07:14:42 +00:00
Adish M
40f8d33777
Merge pull request #15335 from ToolJet/docs/change-components-numbers
[docs] : change components numbers
2026-02-23 12:13:57 +05:30
Pratik Agrawal
5e1e6aa33e
change it to only 80 2026-02-23 12:09:08 +05:30
Pratik Agrawal
15deefc56d
change it to only 80 2026-02-23 12:08:29 +05:30
Nayana N
d164f08962 [docs]: Update number of components in platform overview V2.27.0 2026-02-23 05:09:39 +00:00
Nayana N
b362fc5983 [docs]: Update number of components in TJ concepts V3.5.0 LTS 2026-02-23 04:59:47 +00:00
Karan Rathod
73e5bd60d9
[docs]: Revise privacy policy for data sharing and security (#15329) 2026-02-21 12:16:31 +05:30
Adish M
637515a108
Merge pull request #15322 from ToolJet/automation/maketplace-plugin
feat: update Node.js version and add production deployment workflow for marketplace plugins
2026-02-20 17:23:27 +05:30
Adish M
09cdb05285 feat: update Node.js version and add production deployment workflow for marketplace plugins 2026-02-20 17:22:14 +05:30
Nayana N
e4289884e6 [docs]: update all images, query builder broken url in salesforce 2026-02-20 10:15:37 +00:00
Adish M
2d3f239010
Merge pull request #15276 from ToolJet/docs/ebs-ami-changes
Update AMI deployment instructions and add region-specific guidance
2026-02-20 15:10:54 +05:30
Adish M
60bea3c847 Update Packer workflow to support region-specific AMI builds and cleanup EC2 instances 2026-02-20 15:10:06 +05:30
github-actions[bot]
4bf70a9888
docs: update LTS version table (#15318)
Co-authored-by: adishM98 <44204658+adishM98@users.noreply.github.com>
2026-02-20 14:49:27 +05:30
Nayana N
f0ac50ba3e [docs]: update images, query broken url in pocketbase plugin 2026-02-20 07:18:41 +00:00
Nayana N
656c846813 [docs]: update the query builder's broken url in engagespot plugin 2026-02-20 06:11:55 +00:00
Adish M
dc6362df74
Merge pull request #15311 from ToolJet/docs/version-automation
[docs]: Automate Latest Docker Image
2026-02-20 10:43:22 +05:30
Adish M
8f6a820f0b [docs]: Enhance LTS table automation with improved tag fetching and markdown integration 2026-02-20 10:16:16 +05:30
Adish M
987e9b8fea Update LTS table automation and improve markdown generation 2026-02-19 18:28:37 +05:30
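The recurring `docs: update LTS version table` commits from github-actions[bot] throughout this range come from an automation like the one this commit touches: a scheduled job regenerates the table and opens a PR. The general shape, with the schedule, script name, and PR action as assumptions:

```yaml
# Hypothetical sketch of the LTS-table automation.
name: update-lts-table
on:
  schedule:
    - cron: '0 9 * * *'  # assumed daily schedule
  workflow_dispatch:

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # fetch image tags and regenerate the markdown table
      - run: node scripts/update-lts-table.js  # script name is an assumption
      - uses: peter-evans/create-pull-request@v6
        with:
          title: 'docs: update LTS version table'
          commit-message: 'docs: update LTS version table'
```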
Adish M
a113f500bf
Merge pull request #15305 from ToolJet/contributing-docker-guide-fix
[Docs] contributing docker guide
2026-02-19 17:19:19 +05:30
rudrapratik30
794297efdb add tooltip 2026-02-19 14:25:42 +05:30
rudrapratik30
a6016523d8 [docs]: Automate Latest Docker Image 2026-02-19 14:17:47 +05:30
Souvik
d7ace8f9ea [Docs] contributing docker guide 2026-02-18 23:19:20 +05:30
Adish M
a4eae42594 Merge branch 'develop' into docs/ebs-ami-changes 2026-02-18 14:40:56 +05:30
Adish M
81537637f7 Update AMI region from us-east-1 to us-west-1 in deployment instructions 2026-02-18 14:39:49 +05:30
rudrapratik30
bfce5c559f [docs]: Fix Algolia Broken URLs 2026-02-18 10:40:00 +05:30
Adish M
b457ce5e4c Reduce volume size from 30 to 15 in AMI configuration 2026-02-17 18:00:31 +05:30
Adish M
9e712a4a69 Enhance Packer workflow with region-specific inputs and update default region to us-east-1 2026-02-17 17:56:21 +05:30
Adish M
9d5f91f954
Merge pull request #15273 from ToolJet/docs/update-readme-marketplace
[docs]: Add AWS and Azure Marketplace to Docs
2026-02-17 16:32:29 +05:30
Adish M
7ef4cdd831 Update AMI deployment instructions and add region-specific guidance 2026-02-17 16:19:14 +05:30
Adish M
6b2840df9a
Merge pull request #15270 from ToolJet/docker-local-setup-fix-contributors
Fix customer deployment issue - Docker (Contributors)
2026-02-17 13:06:49 +05:30
rudrapratik30
39d5edfc56 add it to deploy tooljet index 2026-02-17 12:26:55 +05:30
rudrapratik30
0b5c0ed889 [docs]: Add AWS and Azure Marketplace to Docs 2026-02-17 11:55:28 +05:30
Adish M
383730e240
Merge pull request #15268 from ToolJet/docs/deployment-restructure
[docs]: Deployment Folder Restructure
2026-02-17 11:16:05 +05:30
rudrapratik30
bd6566ba4e few more changes 2026-02-17 10:59:21 +05:30
rudrapratik30
dec64398ba fix build error 2026-02-16 23:19:02 +05:30
Souvik
40f1254d71 Fix customer deployment issue - Docker (Contributors) 2026-02-16 19:56:11 +05:30
rudrapratik30
ecb8988797 update url 2026-02-16 19:28:21 +05:30
rudrapratik30
825c7c1697 [docs]: Deployment Folder Restructure 2026-02-16 16:05:23 +05:30
Adish M
114e9acf74
Merge pull request #15254 from ToolJet/docs/anti-patterns
[docs]: Update Anti Patterns Docs
2026-02-16 14:25:56 +05:30
Pratik Agrawal
88037958d0
Merge pull request #15264 from ToolJet/docs/gs2.0-v2
[docs]: Google Sheets2.0 V2
2026-02-16 14:19:46 +05:30
Nayana N
af9e359d58 [docs]: update code & images in gs2.0 docs 2026-02-16 07:34:48 +00:00
Johnson Cherian
269ffeddc2
Merge pull request #15192 from ToolJet/docs/py-wf
[docs]: Workflow Python Node
2026-02-13 19:57:28 +05:30
rudrapratik30
650685f6ac [docs]: Update Anti Patterns Docs 2026-02-13 15:43:25 +05:30
Adish M
c11d8aa414
Merge pull request #15178 from ToolJet/docs/agent-node
[docs]: Agent Node
2026-02-13 14:41:45 +05:30
Adish M
696409afab
Merge pull request #15253 from ToolJet/adishM98-patch-2
Update Docker Compose version in workflow
2026-02-13 14:17:51 +05:30
Adish M
98b5cd7469
Update Docker Compose version in workflow 2026-02-13 14:17:26 +05:30
rudrapratik30
f4804d2db9 images 2026-02-13 14:01:49 +05:30
rudrapratik30
74058a06cc flow charts 2026-02-13 13:56:52 +05:30
Aditya Joshi
f736af6313
[docs]: Update docker upgradation guide (#15247)
* [docs]: Updated docker upgradation guide

* [docs]: Typo fix
2026-02-13 12:38:09 +05:30
rudrapratik30
96093ef381 temp change 2026-02-13 12:26:01 +05:30
Adish M
f4627faf89
Merge pull request #15236 from ToolJet/docs/update-robot-txt
[docs]: Update robot.txt and DS Screenshots
2026-02-13 11:10:55 +05:30
Pratik Agrawal
39589176de
Merge pull request #15238 from ToolJet/docs/all-datasources-merge
[docs]: Merge all data sources PRs into one PR
2026-02-13 11:09:33 +05:30
rudrapratik30
70c1c9febd Merge branch 'develop' into docs/update-robot-txt 2026-02-13 11:08:43 +05:30
rudrapratik30
c41a3c1ce4 build errors 2026-02-13 11:07:22 +05:30
rudrapratik30
b6932fae44 Merge branch 'develop' into docs/all-datasources-merge 2026-02-13 10:58:59 +05:30
Adish M
0a7925b4ad
Merge pull request #15239 from ToolJet/docs/google-translate-url
[docs]: enable URL-based Google Translate
2026-02-12 17:41:56 +05:30
Aman Regu
41dcae0af9 docs: harden Google Translate lang param handling 2026-02-12 17:23:43 +05:30
Aman Regu
4d943ea2bd docs: apply Google Translate via lang URL param 2026-02-12 16:41:10 +05:30
Johnson Cherian
89aff818eb
Merge pull request #15212 from ToolJet/docs/json-editor-explorer
[docs]: JSON Editor & Explorer & Key Value
2026-02-12 16:01:40 +05:30
Nayana N
00d673991d Merge remote-tracking branch 'origin/docs/typesense-woocommerce-zendesk' into docs/all-datasources-merge 2026-02-12 10:28:05 +00:00
Nayana N
e5620809d6 Merge remote-tracking branch 'origin/docs/snowflake-stripe-twilio' into docs/all-datasources-merge 2026-02-12 10:27:32 +00:00
Nayana N
1c50a5ac0e Merge remote-tracking branch 'origin/docs/saphana-runjs-runpython' into docs/all-datasources-merge 2026-02-12 10:27:13 +00:00
Nayana N
064cec751d Merge remote-tracking branch 'origin/docs/rethink-slack-smtp' into docs/all-datasources-merge 2026-02-12 10:27:05 +00:00
Nayana N
357694fa06 Merge remote-tracking branch 'origin/docs/restapi' into docs/all-datasources-merge 2026-02-12 10:26:55 +00:00
Nayana N
c95279b931 Merge remote-tracking branch 'origin/docs/oracledb-redis-soapApi' into docs/all-datasources-merge 2026-02-12 10:26:25 +00:00
Nayana N
cfb901dae0 Merge remote-tracking branch 'origin/docs/nocodb-notion-openapi' into docs/all-datasources-merge 2026-02-12 10:24:47 +00:00
Nayana N
f17c2a9713 Merge remote-tracking branch 'origin/docs/mysql' into docs/all-datasources-merge 2026-02-12 10:24:37 +00:00
Nayana N
c37f659c49 Merge remote-tracking branch 'origin/docs/mongoDB' into docs/all-datasources-merge 2026-02-12 10:24:28 +00:00
Nayana N
13bfc34e5b Merge remote-tracking branch 'origin/docs/minio-MSsql-n8n' into docs/all-datasources-merge 2026-02-12 10:24:19 +00:00
Nayana N
1f5219b556 Merge remote-tracking branch 'origin/docs/influxdb-mailgun-mariadb' into docs/all-datasources-merge 2026-02-12 10:24:06 +00:00
Nayana N
1a69e61eb6 Merge remote-tracking branch 'origin/docs/graphql-grpc-grpc2.0' into docs/all-datasources-merge 2026-02-12 10:23:54 +00:00
Nayana N
1e0fdea3c3 Merge remote-tracking branch 'origin/docs/elasticsearch-googlecloud-googlesheets' into docs/all-datasources-merge 2026-02-12 10:23:42 +00:00
Nayana N
a038d7249d Merge remote-tracking branch 'origin/docs/couchdb-databricks-dynamodb' into docs/all-datasources-merge 2026-02-12 10:23:22 +00:00
Nayana N
2a2a7c27de Merge remote-tracking branch 'origin/docs/clickhouse' into docs/all-datasources-merge 2026-02-12 10:22:55 +00:00
Nayana N
8c49905aed Merge remote-tracking branch 'origin/docs/bigquery-cloudfirestore-cosmosdb' into docs/all-datasources-merge 2026-02-12 10:22:46 +00:00
Nayana N
fc4bda417a Merge remote-tracking branch 'origin/docs/athena-azureblob-baserow' into docs/all-datasources-merge 2026-02-12 10:22:21 +00:00
Nayana N
b620535dcc Merge remote-tracking branch 'origin/docs/AseriesDatasources' into docs/all-datasources-merge 2026-02-12 10:20:29 +00:00
rudrapratik30
796f842dec [docs]: Update robot.txt 2026-02-12 15:49:40 +05:30
Nayana N
511261b1ce Merge remote-tracking branch 'origin/docs/datasources-overview' into docs/all-datasources-merge 2026-02-12 10:13:30 +00:00
Pratik Agrawal
102ce4c239
Merge pull request #15200 from ToolJet/docs/keyvalue
[docs]: Key Value Pair Component
2026-02-12 15:35:48 +05:30
Adish M
c31a5ec49f
Merge pull request #15235 from ToolJet/docs/url-fix
[docs]: Algolia Fixes
2026-02-12 15:31:52 +05:30
rudrapratik30
d03964332f some more fixes 2026-02-12 15:30:38 +05:30
Nayana N
5d42a01522 [docs]: update files types in multipart form data 2026-02-12 09:48:51 +00:00
rudrapratik30
10a37eb8ce [docs]: Small URL Fix 2026-02-12 15:07:11 +05:30
rudrapratik30
9b1d44cc8f final changes 2026-02-12 14:54:04 +05:30
rudrapratik30
5e131dd9c5 final changes 2026-02-12 14:48:19 +05:30
rudrapratik30
8b9117b0ae final changes 2026-02-12 14:42:43 +05:30
rudrapratik30
2d41c6fb2c final changes 2026-02-12 14:27:32 +05:30
rudrapratik30
99eb38ec8b changes requested 2026-02-12 12:54:36 +05:30
rudrapratik30
72ae316af0 final changes 2026-02-12 12:38:36 +05:30
rudrapratik30
48ec3b7b78 final changes 2026-02-12 12:27:02 +05:30
rudrapratik30
65694e8525 final changes 2026-02-12 12:19:30 +05:30
rudrapratik30
5f43d0d2e6 final changes 2026-02-12 12:15:46 +05:30
rudrapratik30
da494229f1 final changes 2026-02-12 12:11:13 +05:30
rudrapratik30
318c1251f5 final changes 2026-02-12 12:00:39 +05:30
Adish M
345232bf69
Merge pull request #15229 from ToolJet/docs/fix-broken-url
[docs]: Fix Broken URL
2026-02-12 11:21:20 +05:30
Pratik Agrawal
802953f405
Merge pull request #15220 from ToolJet/docs/custom-oidc-claims
[docs]: Add info about Custom OIDC Claims Support
2026-02-12 11:18:14 +05:30
rudrapratik30
a268ee7192 [docs]: Fix Broken URL 2026-02-12 11:10:46 +05:30
Aditya Joshi
3186851d71 [docs]: Indentation fix 2026-02-11 23:15:27 +05:30
Aditya Joshi
fabccee363 [docs]: Changed claims to variables 2026-02-11 20:33:29 +05:30
Aditya Joshi
34a29d24bc [docs]: Added details about custom attributes in SAML 2026-02-11 17:45:12 +05:30
rudrapratik30
4d5581e53f final changes 2026-02-11 12:56:53 +05:30
rudrapratik30
8668132655 final changes 2026-02-11 12:46:58 +05:30
rudrapratik30
764885d2fe final changes 2026-02-11 12:34:32 +05:30
rudrapratik30
778f891f61 final changes 2026-02-11 12:29:46 +05:30
rudrapratik30
b2dadb2363 final changes 2026-02-11 12:00:57 +05:30
rudrapratik30
146d91f919 final changes 2026-02-11 11:32:38 +05:30
Adish M
cbde3d50c1
Merge pull request #15224 from ToolJet/search-patch-1
Enable insights for Algolia search
2026-02-11 11:10:15 +05:30
Shubham Gupta
28aab7e663
Enable insights for Algolia search 2026-02-11 11:03:22 +05:30
Adish M
009e179b98
Merge pull request #15167 from ToolJet/docs/couchbase
[docs]: Couchbase Marketplace Plugin
2026-02-11 11:02:01 +05:30
rudrapratik30
ecdb07bfe7 changes - pratik 2026-02-11 10:40:55 +05:30
Adish M
e38329f484
Merge pull request #15184 from Yunus96/docs-header-css-fix
[docs]: Fix Dark Mode Navigation Buttons
2026-02-11 10:28:55 +05:30
Aditya Joshi
77e4991346 [docs]: Added info about Custom OIDC Claims Support 2026-02-10 17:36:11 +05:30
Pratik Agrawal
7d03c5b5d5
[docs]: Horizontal Progress Bar (#15133)
* [docs]: Horizontal Progress Bar

* update id
2026-02-10 14:41:35 +05:30
rudrapratik30
417b20e6ee done 2026-02-10 12:49:18 +05:30
rudrapratik30
d06ddc0740 init 2026-02-10 12:14:37 +05:30
Johnson Cherian
0e639d7ccc
Merge pull request #15203 from ToolJet/docs/update-rls-hyperlinks
[docs]: Update RLS Hyperlinks
2026-02-09 15:46:33 +05:30
Aditya Joshi
2b71169732 [docs]: Updated RLS Hyperlinks 2026-02-09 15:44:15 +05:30
rudrapratik30
cc8c063aee Merge branch 'develop' into docs/keyvalue 2026-02-09 12:45:11 +05:30
rudrapratik30
2e6caf2960 [docs]: Key Value Pair Component 2026-02-09 12:43:40 +05:30
rudrapratik30
7c782cb6f5 [docs]: data source googlesheets2.0 plugin (#15003)
* [docs]: Stage1 added in googlesheets2.0 plugin

* [docs]: stage2 added all ops data in gs2.0 plugin

* [docs]: Added images for all operations, fetch spreadsheet button in GS 2.0 plugin

* [docs]: added multi auth in gs2.0 plugin

* [docs]: added sample outputs and param description in gs2.0

* [docs]: added all sample outputs for ops in gs2.0 plugin

* update id

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2026-02-09 12:01:30 +05:30
rudrapratik30
f13bd4d7aa need help section 2026-02-09 11:57:15 +05:30
rudrapratik30
dccaf3221f [docs]: Workflow Python Node 2026-02-06 16:18:06 +05:30
rudrapratik30
50afddea64 add more example 2026-02-06 13:16:51 +05:30
Mohammad Yunus
bb93c91152 docs: update documentation header styles 2026-02-05 14:00:30 +00:00
rudrapratik30
7f9f6015c9 complete 2026-02-05 15:49:49 +05:30
rudrapratik30
4d5d986038 create file 2026-02-05 14:51:52 +05:30
Aditya Joshi
36551c742a
[docs]: Add docs for LDAP Group Sync (#15052)
* [docs]: Added documentation for allowed domains

* [docs]: Added domain constraints as a new heading instead of individual subheadings for allowed and restricted domains

* [docs]: Added docs for LDAP group sync

* Revert "[docs]: Added domain constraints as a new heading instead of individual subheadings for allowed and restricted domains"

This reverts commit 63327d2495.

Removed domain constraints changes

* Revert "[docs]: Added documentation for allowed domains"

This reverts commit 4cf7a60fc0.

Removed domain constraint changes
2026-02-05 14:43:56 +05:30
Adish M
610dbc3c05
Merge pull request #15164 from ToolJet/docs/rating-column
[docs]: Rating Column in the table
2026-02-04 15:12:46 +05:30
Nayana N
f7a2d36bba [docs]: Add Couchbase marketplace plugin 2026-02-04 09:23:47 +00:00
rudrapratik30
44c898679b [docs]: Rating Column in the table 2026-02-04 11:43:35 +05:30
Adish M
640288c04e
Merge pull request #15162 from ToolJet/docs/tags-input
[docs]: Tag Input
2026-02-04 11:33:06 +05:30
rudrapratik30
ad2678fef0 [docs]: Tag Input 2026-02-04 11:31:55 +05:30
Adish M
bbf1121478
Merge pull request #15153 from ToolJet/docs/anthropic-gemini-openai
[docs] : Update marketplace plugins - anthropic, gemini, openai
2026-02-04 11:06:07 +05:30
rudrapratik30
edc17cbafd Merge branch 'develop' into docs/anthropic-gemini-openai 2026-02-04 10:50:03 +05:30
Pratik Agrawal
ac4a13ebd9
Update openai.md 2026-02-04 10:25:16 +05:30
Pratik Agrawal
afb70f260c
Update gemini.md 2026-02-04 10:19:09 +05:30
Pratik Agrawal
451ed039cd
Update gemini.md 2026-02-04 10:18:03 +05:30
Pratik Agrawal
9336cbc08b
Update gemini.md 2026-02-04 10:17:19 +05:30
Pratik Agrawal
980190e94c
Update anthropic.md 2026-02-04 10:16:17 +05:30
Nayana N
8c59f5b560 [docs]: update models and images, remove outdated model in openai plugin 2026-02-03 12:33:24 +00:00
Nayana N
ce9f724de6 [docs]: Update models and query images in gemini plugin 2026-02-03 11:55:31 +00:00
Adish M
ae0e197adf
Merge pull request #15155 from ToolJet/adishM98-patch-2
Remove continue-on-error from Cypress test steps
2026-02-03 17:19:32 +05:30
Adish M
b7af237500
Remove continue-on-error from Cypress test steps 2026-02-03 17:19:11 +05:30
Nayana N
f7d87f4131 [docs]: Update models and query image in anthropic plugin 2026-02-03 11:23:23 +00:00
Adish M
8c77672a6c
Refactor update-test-system workflow for clarity (#15149) 2026-02-03 11:44:27 +05:30
Adish M
04202be837
Refactor update-test-system workflow for clarity 2026-02-03 11:43:53 +05:30
Prajwal Pai
f488d5e4f1
Feature/add couchbase support (#14518)
* Add support for couchbase

* Update readme

* Update query operation

---------

Co-authored-by: Midhun G S <gsmithun4@gmail.com>
2026-02-03 08:54:34 +05:30
Aditya Joshi
f3b9b3870e
[docs]: Azure Container Apps - Postgrest Version Correction (#15134) 2026-02-02 12:26:55 +05:30
Pratik Agrawal
a4151fab31
[docs]: Currency Input Component (#15132)
* [docs]: Currency Input Component

* number format
2026-02-02 11:34:59 +05:30
Nayana Nagaraj
75b1562da6
[docs]: data source googlesheets2.0 plugin (#15003)
* [docs]: Stage1 added in googlesheets2.0 plugin

* [docs]: stage2 added all ops data in gs2.0 plugin

* [docs]: Added images for all operations, fetch spreadsheet button in GS 2.0 plugin

* [docs]: added multi auth in gs2.0 plugin

* [docs]: added sample outputs and param description in gs2.0

* [docs]: added all sample outputs for ops in gs2.0 plugin

* update id

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2026-01-30 17:11:46 +05:30
Midhun G S
4196b30cb3
refactor: remove Sentry observability module and related components (#15121) 2026-01-30 16:27:03 +05:30
Pratik Agrawal
fb609b0ed5
[docs]: Iframe Component (#15118) 2026-01-30 11:38:50 +05:30
Aditya Joshi
4b61037603
[docs]: Add documentation for allowed domains (#14953)
* [docs]: Added documentation for allowed domains

* [docs]: Added domain constraints as a new heading instead of individual subheadings for allowed and restricted domains

* [docs]: Updated examples for allowed domains
2026-01-29 16:36:31 +05:30
Nayana N
3ddb59174c [docs]: updated all images in engagespot plugin 2026-01-28 12:04:01 +00:00
Nayana N
8db4ee0ca7 [docs]: updated images in easypost plugin 2026-01-28 10:35:47 +00:00
Nayana N
6fb7077eba [docs]: updated images in cohere plugin 2026-01-28 09:27:29 +00:00
Midhun G S
b81ed81f93
Merge pull request #15066 from ToolJet/rebase/develop-main
Merge main to develop
2026-01-27 11:44:00 +05:30
Pratik Agrawal
6041a17265
[docs]: Delete unwanted docs (#15078) 2026-01-27 10:49:52 +05:30
Pratik Agrawal
50ab9fae32
[docs]: Update NPM and NVM version (#15049) 2026-01-27 10:12:21 +05:30
Adish M
5c09a1b16e
Merge pull request #14950 from ToolJet/fix/ce-dockerfile-compose
Add Docker Compose and environment configuration for built-in and external PostgreSQL setups
2026-01-23 17:00:51 +05:30
Nayana N
c89455520c [docs]: added Multipart-form data doc in RestApi 2026-01-23 10:21:44 +00:00
gsmithun4
921714d8a9 Refactor code structure for improved readability and maintainability 2026-01-23 15:41:49 +05:30
Pratik Agrawal
a2eebf8595
[docs]: Improve Navigation Sidebar (#15047)
* nav sidebar label and auto open

* paid and self hosted tags

* navbar icons

* correct typo
2026-01-22 15:47:07 +05:30
Nayana N
58581e3a9b [docs]: updated all images in zendesk plugin 2026-01-22 06:29:35 +00:00
Nayana N
6be8c18887 [docs]: Added images for resources & description in woocommerce plugin 2026-01-22 05:51:30 +00:00
Adish M
c48a380d89
Merge pull request #15030 from ToolJet/docs/app-builder-permissions
[docs]: Dynamic Access Rules
2026-01-22 11:20:18 +05:30
Pratik Agrawal
45150bcd81
[docs]: Update Marketplace and Deployment Overview (#15016)
* [docs]: Update Marketplace Overview and Deployment Overview

* fix build issues
2026-01-21 15:28:42 +05:30
Nayana N
9451002ba4 [docs]: update all images in typesense plugin 2026-01-21 09:39:24 +00:00
Nayana N
03b2d83732 [docs]: update all images in twilio plugin 2026-01-21 07:53:26 +00:00
rudrapratik30
add8a4e317 [docs]: Dynamic Access Rules 2026-01-21 12:49:15 +05:30
Nayana N
56b84982bc [docs]: update stripe dashboard image, added listops image 2026-01-21 07:17:23 +00:00
Nayana N
3899d9ba40 [docs]: update all images, connection methods in snowflake plugin 2026-01-21 06:23:15 +00:00
Pratik Agrawal
27e78409d7
[docs]: App Builder Sprint (#14879)
* [docs]: App Builder Sprint

* minor formatting

* html and reset query

* form styles

* minor updates

* Components UI/UX style enhancements

* table img updates
2026-01-21 11:23:40 +05:30
Souvik
3e78e1e319 added platform tag for tooljet and updated postgres 13 -> 16 2026-01-21 02:36:35 +05:30
Nayana N
d78f6ca853 [docs]: update data source image in run-python doc file 2026-01-20 12:48:55 +00:00
Nayana N
9521b8e9d7 [docs]: update all screenshots in runjs doc file 2026-01-20 12:33:52 +00:00
Nayana N
4ba1a066eb [docs]: update connection images with description in nocodb plugin 2026-01-20 09:51:20 +00:00
Nayana N
adda4e7f2e [docs]: updated default ds image in DS overview 2026-01-20 09:21:50 +00:00
rudrapratik30
7c5718d873 minor changes 2026-01-20 12:08:22 +05:30
Souvik
f929911d54 ToolJet host changed to 80 2026-01-20 00:44:08 +05:30
Adish M
dbe1d5ef91
Added pre-release support (#15005) 2026-01-19 18:35:53 +05:30
Souvik
4b1d4d466c Added pre-release support 2026-01-19 18:31:31 +05:30
Nayana N
717a629310 [docs]: screenshots updates in restapi doc 2026-01-19 08:11:45 +00:00
Pratik Agrawal
f531b80f0d
[docs]: App Builder Sprint 21 (#14983)
* audio and camera docs

* iframe

* iframe and tabs component
2026-01-19 09:50:50 +05:30
Nayana N
7357042472 [docs]: update query image in saphana docs 2026-01-16 11:28:29 +00:00
Nayana N
16846f1397 [docs]: image update in restapi's metadata-cookies file 2026-01-16 11:17:10 +00:00
Nayana N
b4e98cd176 [docs]: stage1 image updates in restapi's auth file 2026-01-16 11:05:37 +00:00
Nayana N
08a10b7e72 [docs]: remove connection gif in restapi config 2026-01-16 09:44:51 +00:00
Adish M
df311eeb03
Enhance ToolJet deployment documentation for version 3.16.0-LTS (#14896)
* Enhance ToolJet deployment documentation for version 3.16.0-LTS

- Updated Google Cloud Run setup to clarify architecture and environment variables.
- Improved Kubernetes setup documentation for AKS, EKS, GKE, and general Kubernetes, emphasizing the need for two separate PostgreSQL databases.
- Revised OpenShift deployment instructions to include detailed environment variable configurations and deployment steps.
- Added system requirements for PostgreSQL and Redis, specifying recommended versions and minimum specifications.
- Included warnings about database naming conflicts and critical configurations for successful deployment.

* Enhance upgrade documentation for ToolJet LTS version

- Added critical backup reminder for PostgreSQL instance before upgrading.
- Clarified database requirements, emphasizing the need for two separate database names (PG_DB and TOOLJET_DB).
- Included deployment flexibility options for database hosting.
- Updated installation notes to specify that the upgrade guide is only for existing installations.
- Improved formatting and clarity in various setup guides (Azure, DigitalOcean, Docker, ECS, Google Cloud Run, Kubernetes, OpenShift).

* Enhance deployment documentation for ToolJet on various platforms, including AWS AMI, Azure Container Apps, AWS ECS, Google Cloud Run, Kubernetes (AKS, EKS, GKE), and Openshift. Updated warnings and notes regarding PostgreSQL database setup and ToolJet AI features.

* Enhance deployment documentation for ToolJet LTS version 3.16.0 by updating backup instructions, removing redundant version requirements, and clarifying Redis configuration for external instances.

* Refine deployment documentation for ToolJet by removing redundant environment variable references and adding notes on securing Redis passwords across various platforms including AWS AMI, Docker, Google Cloud Run, Kubernetes (AKS, EKS, GKE), and OpenShift.

* Refine Redis configuration instructions in AWS AMI deployment documentation for clarity and formatting consistency.

* Add new environment variables for bulk upload limits in deployment documentation

* Refine deployment documentation for ToolJet subpath installation by improving environment variable formatting and clarifying upgrade prerequisites.

* Update PostgreSQL database links in deployment documentation for consistency across ECS, Helm, Kubernetes (AKS, EKS, GKE), and OpenShift setups.

* Remove warning about whitelisting ToolJet AI features from Helm deployment documentation

* Refine environment variable configuration for workflow scheduling in Helm deployment documentation

* formatting updates till gke

* final formatting updates

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2026-01-16 14:10:34 +05:30
Nayana N
ef9e2e37ae [docs]: update query image in smtp plugin 2026-01-16 08:11:58 +00:00
Nayana N
1b75d0ada8 [docs]: update images in slack doc file 2026-01-16 07:33:14 +00:00
Nayana N
d0261393dc [docs]: Update all images in RethinkDB doc 2026-01-16 06:54:11 +00:00
Adish M
2a813eb7a4
Merge pull request #14974 from ToolJet/adishM98-patch-1
Add Vercel deployment workflow
2026-01-15 12:38:40 +05:30
Adish M
d0b49b962c
Add Vercel deployment workflow
This workflow automates the deployment of the ToolJet frontend to Vercel, including user authorization, Git checkout with submodules, dependency installation, and build steps.
2026-01-15 12:37:39 +05:30
Adish M
72aba149df
Merge pull request #14973 from ToolJet/docs/update-no-ds
[docs]: Update no. of data sources
2026-01-15 11:22:09 +05:30
rudrapratik30
b0bcb2880c [docs]: Update no. of data sources 2026-01-15 11:20:47 +05:30
Nayana N
f3f9efaa8b [docs]: Update all images in Soap API plugin 2026-01-14 12:04:37 +00:00
Nayana N
dcc1406e8a [docs]: update all images in redis plugin 2026-01-14 10:47:39 +00:00
Nayana N
23a2577d4d [docs]: update images in oracledb doc 2026-01-14 10:28:36 +00:00
Nayana N
e370d9b6ae [docs]: Update all images in openapi doc 2026-01-14 08:56:21 +00:00
Nayana N
ddc47ef8df [docs]: Update all images in notion docs 2026-01-14 08:27:35 +00:00
Nayana N
d815b7f9a6 [docs]: Update all screenshots in nocodb 2026-01-14 07:03:20 +00:00
Nayana N
033f9efd55 [docs]: update query builder images in n8n docs 2026-01-14 05:38:09 +00:00
Nayana N
797a3778d0 [docs]: Update images in mssql docs 2026-01-14 05:20:36 +00:00
Adish M
3a05564099
refined a bit (#14963) 2026-01-13 19:43:58 +05:30
Nayana N
7f7fbaae29 [docs]: Update all images in minio doc 2026-01-13 11:31:18 +00:00
Nayana N
1ceb999d73 [docs]: Updated all images in MariaDB docs 2026-01-13 10:09:55 +00:00
Nayana N
39bf3db655 [docs]: update query builder image in mailgun docs 2026-01-13 08:52:11 +00:00
Nayana N
267d6e7201 [docs]: update all screenshots in influxdb docs 2026-01-13 07:52:28 +00:00
Aditya Joshi
741e5df1d3
[docs]: Add Observability Docs (#14871)
* [docs]: Add documentation for setting up observability using Datadog

* Updated doc flow for observability using datadog

* Updated doc flow for observability using datadog

* [docs]: Add docs for setting up observability using New Relic

* [docs]: Add docs for observability setup using Grafana

* Add intro line for setting up observability using Grafana docs

* [docs]: Add changes to Observability docs

* [docs]: Add documentation for observability using Last9

* [docs]: Fixed typo on setup page

* [docs]: Updated setup from observability docs

* [docs]: Changes in Observability docs wrt comments

* [docs]: Added screenshot for observability
2026-01-13 11:30:58 +05:30
Souvik
1eacba8ddc refined a bit 2026-01-12 18:45:17 +05:30
Nayana N
2828c12767 [docs]: Update mysql doc with connection methods and modified images, deleted unwanted images 2026-01-12 10:39:22 +00:00
Adish M
4fd8eb80e7
Merge pull request #14960 from ToolJet/adishM98-patch-1
Change default branch in CI workflow to lts-3.16
2026-01-12 14:00:42 +05:30
Adish M
88ff720c37
Change default branch in CI workflow to lts-3.16 2026-01-12 13:57:48 +05:30
Nayana N
3eb94c263f [docs]: Update images in graphql doc 2026-01-12 06:26:18 +00:00
Pratik Agrawal
ceb00ec33b
[docs]: Editable Tags (#14894)
* [docs]: Editable Tags

* new updates

* formatting changes

* auto pick chip color
2026-01-12 11:03:55 +05:30
Nayana N
68df3ba0a9 [docs]: Update screenshots, added authentication method in GS doc 2026-01-09 12:24:32 +00:00
Nayana N
cedfbbdeaa [docs]: Update all screenshots in gcs doc 2026-01-09 11:28:50 +00:00
Adish M
681e80ac01 Add Docker Compose and environment configuration for built-in and external PostgreSQL setups 2026-01-09 16:30:41 +05:30
Nayana N
75e59f56ce [docs]: Update all screenshots in Elasticsearch doc 2026-01-09 10:26:10 +00:00
Adish M
aa84035333
Merge pull request #14945 from ToolJet/docs/sso-docs-merged
[docs]: Merge SSO docs into one PR
2026-01-09 14:26:14 +05:30
Aditya Joshi
74fbc77f49 [docs]: Updated img classes and sidebar order 2026-01-09 14:21:56 +05:30
Adish M
1b63833526
Merge pull request #14647 from ToolJet/docs/pat-integration
[docs]: Embed ToolJet Application
2026-01-09 14:04:29 +05:30
Aditya Joshi
776cb46ad6 [docs]: Merged SSO docs in one branch 2026-01-09 13:57:15 +05:30
Aditya Joshi
db952bc7f7 [docs] Merge OneLogin OIDC 2026-01-09 13:56:18 +05:30
rudrapratik30
c9e5cb8562 add link 2026-01-09 13:54:51 +05:30
Aditya Joshi
27464b9221 Merge branch 'docs/microsoft-entra-saml' into docs/sso-docs-merged
Merge Entra ID SAML
2026-01-09 13:51:48 +05:30
Adish M
61a8867c5c
Merge pull request #14944 from ToolJet/docs/open-api-ai
[docs]: OpenAPI data source in the AI
2026-01-09 13:51:47 +05:30
Aditya Joshi
cdf86568e1 Merge branch 'sso-oidc' into docs/sso-docs-merged
Merge Auth0
2026-01-09 13:51:07 +05:30
rudrapratik30
c23c0bfadc [docs]: OpenAPI data source in the AI 2026-01-09 13:49:09 +05:30
Nayana N
4d799b250b [docs]: Update screenshots in dynamoDB doc 2026-01-09 06:45:48 +00:00
rudrapratik30
fc461798a4 final 2026-01-09 10:07:44 +05:30
Aditya Joshi
6f289bd5a0 [docs]: Changes in auth0 docs as per doc review call 2026-01-08 19:47:11 +05:30
Aditya Joshi
c707b2b635 [docs]: Changes in auth0 docs as per doc review call 2026-01-08 19:39:05 +05:30
Aditya Joshi
9d47b638ae [docs]: Changes in auth0 docs as per doc review call 2026-01-08 19:34:17 +05:30
Aditya Joshi
ace6ec9795 [docs]: Changes in auth0 docs as per doc review call 2026-01-08 19:29:15 +05:30
Nayana N
750380f781 [docs]: Update screenshots and remove gifs in databricks doc 2026-01-08 12:10:48 +00:00
Nayana N
ad1e671a5c [docs]: Update all screenshots in couchDB doc 2026-01-08 08:26:52 +00:00
rudrapratik30
42357386e9 improved auth flow 2026-01-08 12:36:51 +05:30
rudrapratik30
ae45c1ebe7 almost final 2026-01-08 12:34:53 +05:30
Nayana N
b18a0eb6d8 [docs]: Update all screenshots and dynamic connection in mysql doc 2026-01-08 06:28:12 +00:00
rudrapratik30
e433551599 add code snippets 2026-01-08 10:18:31 +05:30
Adish M
350129652e
Merge pull request #14930 from ToolJet/render-preview-develop 2026-01-07 22:52:04 +05:30
Souvik
7c78b4c292 fixed render image issue 2026-01-07 22:46:48 +05:30
Souvik
0659e7cdd3 using pull_request 2026-01-07 22:44:03 +05:30
Adish M
3a9b9882e6
Merge pull request #14926 from ToolJet/render-docker-develop-v2.4
v2.4 added free up space
2026-01-07 13:53:47 +05:30
Souvik
acdf6b8abb v2.4 added free up space 2026-01-07 13:38:19 +05:30
Nayana N
2804e39264 [docs]: Update all screenshots in CosmosDB doc 2026-01-07 07:54:58 +00:00
Adish M
1e89efb4c6
Merge pull request #14924 from ToolJet/render-docker-develop-v2.3
updated with LTS
2026-01-07 12:53:50 +05:30
Souvik
777b149c42 removed recursive 2026-01-07 12:48:35 +05:30
Souvik
492384aff6 updated with LTS 2026-01-07 12:30:12 +05:30
Adish M
4f51346960
Merge pull request #14916 from ToolJet/render-docker-develop
added to develop
2026-01-06 17:35:49 +05:30
Souvik
dded9a4c9e added to develop 2026-01-06 17:29:22 +05:30
Nayana N
35b85316ae [docs]: Updated all screenshots in cloud firestore doc 2026-01-06 11:40:06 +00:00
Nayana N
d4ba0aaaae [docs]: Updated all screenshots in Bigquery doc 2026-01-06 10:23:05 +00:00
Nayana N
1638f15167 [docs]: removed image, update padding in s3 doc 2026-01-06 09:34:06 +00:00
Nayana N
2907fb991d [docs]: update images in appwrite doc 2026-01-06 09:06:19 +00:00
Nayana N
1c49e1aa3c [docs]: Remove unwanted image in amazon ses doc 2026-01-06 08:39:09 +00:00
Nayana N
2a1059774c [docs]: Remove unwanted images from airtable doc 2026-01-06 08:32:54 +00:00
Nayana N
460a4da2a6 [docs]: update padding and image in athena doc 2026-01-06 07:08:15 +00:00
Nayana N
44fc122d4c [docs]: update image, etc in clickhouse file 2026-01-06 06:10:21 +00:00
rudrapratik30
c8a322bf44 pvt app wip 2026-01-06 11:22:31 +05:30
Nayana N
02ef742bbf update data sources count to 80+ in Readme file 2026-01-06 05:14:33 +00:00
rudrapratik30
adbefd9135 public application 2026-01-06 10:35:15 +05:30
Nayana N
f72e32fabe [docs]: Update no. of data sources in platform overview doc 2026-01-06 04:57:36 +00:00
rudrapratik30
1c728eb50d new structure 2026-01-05 15:12:51 +05:30
Aditya Joshi
5a4507fba9 [docs]: Updated blurred image in Microsoft Entra ID OIDC doc 2026-01-02 21:23:27 +05:30
Aditya Joshi
064b98aeb5 [docs]: Updated example URL for Auth0 OIDC 2026-01-02 18:58:24 +05:30
Aditya Joshi
044d2aa648 [docs]: Added changes to Entra ID OIDC Docs 2026-01-02 18:47:18 +05:30
Aditya Joshi
e13ecb1f3e [docs]: Added changes to OneLogin OIDC docs 2026-01-02 18:03:19 +05:30
Aditya Joshi
9995d6a505 [docs]: Added changes in Auth0 OIDC docs 2026-01-02 17:51:52 +05:30
Nayana N
0eb5b36b30 [docs]: Update Data Sources Overview 2026-01-02 11:49:19 +00:00
Pratik Agrawal
66cecf98fd
Update intro para 2026-01-02 15:09:04 +05:30
Nayana N
c1db9379fd [docs]: update intro in mongodb doc 2026-01-02 09:34:34 +00:00
Pratik Agrawal
8deb03170b
[docs]: Trust Center (#14895)
* [docs]: Security Center

* de list docs v2

* materials to certifications
2026-01-02 14:28:15 +05:30
Nayana N
7e91607eac [docs]: Updated all the images in Baserow data source doc 2026-01-02 06:56:45 +00:00
Nayana N
a921bef17a [docs]: Updated all the images in Azureblob data source doc 2025-12-31 12:47:14 +00:00
Nayana N
dbba507bdc [docs]: Updated all the images in Athena plugin docs 2025-12-31 11:15:16 +00:00
Nayana N
3f5a5107d4 [docs]: remove accidentally added Xero image in clickhouse file 2025-12-31 08:16:04 +00:00
Nayana N
103c6ca268 [docs]: Reverted clickhouse doc file in the right path folder 2025-12-31 07:17:15 +00:00
Nayana N
d9aff720e4 [docs]: Updated images in appwrite datasource doc 2025-12-30 07:22:04 +00:00
Nayana N
627d1c8a0b [docs] : Updated S3 doc with h3 for authentication methods and removed one data source image 2025-12-30 06:41:59 +00:00
Nayana N
820e213746 [docs]: Updated Query image and line rephrasing in AWS SES plugin docs 2025-12-29 06:01:58 +00:00
Nayana N
7c332a9491 [docs]: added connection image in airtable plugin doc 2025-12-29 05:40:47 +00:00
Nayana N
9bb2c0d5ab [docs]: Updated images and renamed to snake case naming 2025-12-29 05:37:16 +00:00
Nayana N
c63d6ff002 [docs]: Updated all the images and added a connection method in Appwrite Plugin docs 2025-12-26 07:23:53 +00:00
Adish M
1a2a2d643b
Merge pull request #14868 from ToolJet/adishM98-patch-1
Enhance Cypress workflow with permissions and debug steps
2025-12-24 14:29:48 +05:30
Adish M
c7a340b5bb
Enhance Cypress workflow with permissions and debug steps
Updated Cypress workflow to include permissions and debugging steps. Removed deprecated environment variables and adjusted Cypress test configurations.
2025-12-24 14:29:34 +05:30
Adish M
897dbaf442
docs: LOCKBOX_MASTER_KEY rotation script (#14856)
* docs: LOCKBOX_MASTER_KEY rotation script

* Clarify LOCKBOX_MASTER_KEY rotation instructions and improve formatting

* Update LOCKBOX_MASTER_KEY rotation instructions for serverless deployments

* Add collapsible sections for deployment instructions in LOCKBOX_MASTER_KEY rotation guide

* Removes key rotation timing and scenarios guidance

* Update prerequisites for LOCKBOX_MASTER_KEY rotation to specify database user access requirements

* Correct title casing for LOCKBOX_MASTER_KEY rotation guide and section headers
2025-12-23 22:04:20 +05:30
Nayana N
066a729325 [docs] : Updated all images and added connection method in Amazon S3 plugin file 2025-12-23 06:25:55 +00:00
Adish M
862c06d2e0
Merge pull request #14828 from ToolJet/docs/multi-tenant-oidc
[docs]: Add documentation for multiple OIDC Providers
2025-12-23 10:12:27 +05:30
Aditya Joshi
d85af30d97 [docs]: Removed unnecessary padding from images 2025-12-22 17:15:27 +05:30
Aditya Joshi
bb0608ce1b [docs]: Updated multi tenant docs with higher resolution images 2025-12-22 17:09:43 +05:30
Nayana N
0669b799de [docs]: Updated images and added a text in Amazon SES Plugin file doc 2025-12-22 06:52:22 +00:00
Nayana N
393ff106ef [docs]: Updated with the Query builder images of Airtable Plugin 2025-12-22 05:57:53 +00:00
Aditya Joshi
762736d431 [docs]: Add docs for configuring SAML using Microsoft Entra ID 2025-12-19 23:30:22 +05:30
Aditya Joshi
fd5ab07a3b [docs]: Added group sync docs to the OneLogin OIDC Configuration 2025-12-19 19:07:08 +05:30
Aditya Joshi
f03f261059 [docs]: Microsoft entra id saml init 2025-12-19 17:58:21 +05:30
Adish M
89b4f44024
Merge pull request #14642 from ToolJet/test-system-v2
Version 2 of Test-system
2025-12-19 17:03:12 +05:30
Nayana N
409696f798 [docs] : Updated all the images in airtable plugin docs 2025-12-19 10:24:52 +00:00
Aditya Joshi
aa2e9d959f [docs]: Formatting changes to OneLogin OIDC docs 2025-12-19 15:35:09 +05:30
Aditya Joshi
433a0fbf88 [docs]: Add documentation for configuring OIDC using OneLogin 2025-12-19 15:12:02 +05:30
Nayana N
9a855f4722 [docs] : Updated Clickhouse doc file with images and outdated data 2025-12-19 06:23:11 +00:00
Nayana N
acc3a3b8af [docs] : Added title info, updated other changes in MongoDB doc file 2025-12-19 05:23:43 +00:00
Aditya Joshi
46af29a542 [docs]: Init OneLogin OIDC Setup 2025-12-19 10:23:49 +05:30
Adish M
812d4e92a0
Merge pull request #14769 from ToolJet/docs/draft-version
[docs]: Updated the development lifecycle docs and images to comply with the latest draft version
2025-12-18 18:12:38 +05:30
Aditya Joshi
d2d76cab19 [docs]: Add documentation for group sync using Microsoft Entra ID using OIDC 2025-12-18 18:04:30 +05:30
Aditya Joshi
91f56fab5d [docs]: Add docs for OIDC using Microsoft Entra ID 2025-12-18 15:28:00 +05:30
Nayana N
9f3dc356c8 added the images 2025-12-17 12:18:28 +00:00
Aditya Joshi
357573db60 [docs]: Add documentation for multiple OIDC Providers 2025-12-17 17:41:15 +05:30
Nayana N
e7fba25846 [docs] : Updated all the images, mainly Connection String, and added Connection Format texts 2025-12-17 11:17:22 +00:00
Aditya Joshi
4ac81242dc [docs]: Added padding to smaller images 2025-12-17 15:23:26 +05:30
Adish M
f0126f8345
Merge pull request #14822 from ToolJet/docs/gitsync-initial-commit
[docs]: Updated gitsync docs - Added information about initial commit and Homepage URL
2025-12-17 15:05:21 +05:30
Adish M
588f3f35d5
Merge pull request #14804 from ToolJet/docs/unique-oidc-id
[docs]: Added documentation for OIDC using non email identifier
2025-12-17 15:05:01 +05:30
Aditya Joshi
66cd31f8a7 Merge branch 'develop' into docs/unique-oidc-id 2025-12-17 14:44:29 +05:30
Aditya Joshi
a89ea366d3 [docs]: Styling changes in GitSync docs 2025-12-17 12:56:38 +05:30
Aditya Joshi
d51e86dfe0 [docs]: Updated gitsync docs - Added information about initial commit and Homepage URL 2025-12-17 12:06:45 +05:30
Aditya Joshi
cf9afd12d8 [docs]: Updated images resolution in draft version docs 2025-12-16 19:26:29 +05:30
Aditya Joshi
82f5b6c651 [docs]: Updated images resolution in draft version docs 2025-12-16 19:13:09 +05:30
Nayana N
6cb2aa0a3a [docs]: Updated all the images and removed outdated requirements 2025-12-16 10:03:59 +00:00
Aditya Joshi
e1b474bdc4 [docs]: Updated images with higher resolution ones 2025-12-16 14:42:51 +05:30
Adish M
51e329123a
Merge pull request #14814 from ToolJet/docs/sendgrid
[docs]: Update SendGrid Plugin
2025-12-16 13:11:02 +05:30
rudrapratik30
c6c3d03c1e [docs] : delete zone identifier file 2025-12-16 13:09:10 +05:30
Aditya Joshi
c5fc4e2b2a [docs]: Updated explanation for OIDC login using Non email identifier. 2025-12-16 12:36:38 +05:30
Nayana Nagaraj
235a3e39d3 [docs] : Updated image sizes and text bullets 2025-12-16 10:45:49 +05:30
Aditya Joshi
886af71905 [docs]: Updated doc with lower resolution images 2025-12-15 18:09:43 +05:30
Aditya Joshi
dadc2c0756 [docs]: Updated blurred screenshots and added a few pointers in draft version docs 2025-12-15 17:56:50 +05:30
Nayana N
44a82740b3 [docs]: Updated filename in sendgrid document 2025-12-15 12:15:58 +00:00
Nayana N
724952835e [docs]: Added sendgrid image 2025-12-15 12:09:36 +00:00
Aditya Joshi
e162f0e444 [docs]: Updated the environment variables for non email identifier in OIDC 2025-12-15 15:35:53 +05:30
Aditya Joshi
c6d9cb4aac [docs]: Added documentation for OIDC using non email identifier 2025-12-15 15:01:29 +05:30
Adish M
9c9d98bcf4
Merge pull request #14798 from ToolJet/docs/update-nvm-npm-version
[docs]: NVM and NPM Version Update
2025-12-15 13:02:23 +05:30
rudrapratik30
de0638b993 [docs]: NVM and NPM Version Update 2025-12-15 12:59:26 +05:30
Adish M
ed2dbe4938
Merge pull request #14787 from ToolJet/adishM98-patch-1
Add disk space cleanup step in deployment workflow
2025-12-12 16:52:08 +05:30
Adish M
e9d6282c40
Add disk space cleanup step in deployment workflow
Added a step to free up disk space before checking out the repository.
2025-12-12 16:51:45 +05:30
Adish M
e678669ea6
Merge pull request #14783 from ToolJet/docs/update-readme-domain
[docs]: Update Readme for Domain Change
2025-12-12 14:52:27 +05:30
rudrapratik30
f1e04ce5bc [docs]: Update Readme for Domain Change 2025-12-12 14:48:11 +05:30
Adish M
273f420cd3
Added documentation for configuring stronger password validation rules. (#14779)
* [docs]: Added documentation for configuring stronger password validation rules

* [docs]: Updated documentation for configuring stronger password validation rules

* [docs]: Updated documentation for configuring stronger password validation rules - Added a few infos
2025-12-12 14:31:31 +05:30
Aditya Joshi
a7cc192fce [docs]: Updated documentation for configuring stronger password validation rules - Added a few infos 2025-12-12 13:04:06 +05:30
Aditya Joshi
0d90a329f7 [docs]: Updated documentation for configuring stronger password validation rules 2025-12-12 11:49:58 +05:30
Aditya Joshi
6bbdd76ebd [docs]: Added documentation for configuring stronger password validation rules 2025-12-12 11:23:54 +05:30
Aditya Joshi
ae89645fb7 [docs]: Updated the development lifecycle docs and images to comply with the latest draft version 2025-12-11 18:13:35 +05:30
Adish M
b5ab988397
Merge pull request #14762 from ToolJet/docs/dropdown-v2-example-schema
[docs]: Dropdown Schema Example
2025-12-11 14:39:51 +05:30
rudrapratik30
101650a0ba [docs]: Dropdown Schema Example 2025-12-11 14:38:54 +05:30
Aditya Joshi
d743c33933 [docs]: Updated images to follow highlights from the design convention and updated formatting from the style doc 2025-12-11 12:48:09 +05:30
Adish M
24c5a08d33
Merge pull request #14760 from ToolJet/docs/small-fix-tj-api
[docs]: Small Fix in TJ API
2025-12-11 12:41:10 +05:30
rudrapratik30
f5c0cf35a7 [docs]: Small Fix in TJ API 2025-12-11 12:40:32 +05:30
Adish M
231f44c066
[docs]: TJ API Updated End Points (#14758)
* [docs]: TJ API Updated End Points

* move gitsync api
2025-12-11 10:23:17 +05:30
rudrapratik30
88e069c8d4 move gitsync api 2025-12-11 10:20:23 +05:30
rudrapratik30
e48a40a081 [docs]: TJ API Updated End Points 2025-12-11 10:01:10 +05:30
Adish M
6a0472641b
Merge pull request #14750 from ToolJet/docs/remove-python-install-tip
[docs]: Removed tip from import-libraries/python informing about running query on page load
2025-12-10 18:27:49 +05:30
Aditya Joshi
7f0fb2ff04 [docs]: Removed tip from import-libraries/python informing about running query on page load 2025-12-10 18:18:31 +05:30
Aditya Joshi
5a4e30495d [docs]: Updated image borders for Auth0 OIDC documentation 2025-12-10 17:32:51 +05:30
Aditya Joshi
6daf862997 [docs]: Added documentation for configuring SSO using Auth0 OIDC 2025-12-10 11:46:29 +05:30
Adish M
bee5513f0b
Merge pull request #14737 from ToolJet/adishM98-patch-1
Update cypress platform action
2025-12-09 12:03:23 +05:30
Adish M
6056a3e420
Update fmt.Println message from 'Hello' to 'Goodbye' 2025-12-09 12:02:41 +05:30
Adish M
f954edcb86
Merge pull request #14728 from ToolJet/docs/authorize.net
[docs]: Authorize.net Plugin
2025-12-05 18:42:59 +05:30
rudrapratik30
4fa379e0f4 [docs]: Authorize.net Plugin 2025-12-05 15:12:09 +05:30
Adish M
7d3c1895b5
Merge pull request #14698 from ToolJet/docs/saml-group-sync
[docs]: SAML Enable Group Sync UI
2025-12-04 12:22:08 +05:30
Souvik
14774cdb70 compiled in 2 jobs 2025-12-04 00:52:41 +05:30
rudrapratik30
03b3d05ef2 [docs]: SAML Enable Group Sync UI 2025-12-03 11:22:40 +05:30
Souvik
1087ec76a1 updated docker pat in LTS and pre-release 2025-12-02 22:55:25 +05:30
Pratik Agrawal
eeb7b3b7a6
[docs]: Update API Server Gateway Domain (#14691) 2025-12-02 16:21:58 +05:30
rudrapratik30
b6556fe4da minor updates 2025-12-02 10:08:34 +05:30
Souvik
6027e5eb03 updated manual docker build file 2025-12-01 13:34:35 +05:30
Souvik
6c2367f23d eof issue fixed 2025-11-26 21:01:53 +05:30
Souvik
163ae40ba6 added debug 2025-11-26 20:43:28 +05:30
Souvik
05f1293cca small fix 2025-11-26 02:24:03 +05:30
Souvik
12862f3010 removed view and made simple 2025-11-26 01:48:37 +05:30
rudrapratik30
c8f32b522e [docs]: Using PAT 2025-11-25 14:10:51 +05:30
Souvik
b08730bf4d Version 2 of Test-system 2025-11-24 20:36:20 +05:30
Adish M
2f8236cb8e
Merge pull request #12378 from ToolJet/docs/smtp
[docs]: Fix SMTP Attachment
2025-11-24 14:37:43 +05:30
rudrapratik30
cf1a58b2b1 update 3.5 and 3.16 2025-11-24 14:36:49 +05:30
rudrapratik30
718dfeed37 Merge branch 'develop' into docs/smtp 2025-11-24 14:32:43 +05:30
Adish M
d04b7f22b0
Merge pull request #14216 from ToolJet/docs/workflow-revamp
[docs]: Workflow Revamp
2025-11-24 14:31:47 +05:30
rudrapratik30
6bce6ed4e7 Merge branch 'develop' into docs/workflow-revamp 2025-11-24 14:22:09 +05:30
Adish M
ca2514e284
Merge pull request #14638 from ToolJet/docs/correction-Observability
Rename "Platform Ultimate Dashboard" to "Platform Metrics Dashboard" in observability documentation and update sidebar to include observability-otel setup.
2025-11-24 14:05:56 +05:30
adishM98 Bot
204983dee5 Add OpenTelemetry observability documentation and update sidebar reference 2025-11-24 14:05:22 +05:30
adishM98 Bot
a3ca14eb97 Rename "Platform Ultimate Dashboard" to "Platform Metrics Dashboard" in observability documentation and update sidebar to include observability-otel setup. 2025-11-24 14:02:12 +05:30
Adish M
ca13f81147
Merge pull request #14631 from ToolJet/docs/domain-0change
[docs]: Domain Change
2025-11-24 14:01:14 +05:30
Adish M
5185951b21
Merge pull request #14635 from ToolJet/docs/bigquery-scope
[docs]: BigQuery Scope Update
2025-11-24 12:09:29 +05:30
rudrapratik30
c9998f02c2 [docs]: BigQuery Scope Update 2025-11-24 11:19:30 +05:30
Adish M
450bf4a12c
Merge pull request #14633 from ToolJet/adishM98-patch-1
Update WEBSITE_SIGNUP_URL in cloud-frontend.yml
2025-11-24 11:04:07 +05:30
Adish M
48b0194de8
Update WEBSITE_SIGNUP_URL in cloud-frontend.yml 2025-11-24 11:03:33 +05:30
rudrapratik30
c255f95a6e config file changes 2025-11-24 10:44:49 +05:30
rudrapratik30
efbe3556e7 [docs]: Domain Change 2025-11-24 10:32:41 +05:30
Adish M
56fd2ff5e3
Merge pull request #14629 from ToolJet/adishM98-patch-1
Update WEBSITE_SIGNUP_URL in cloud-frontend.yml
2025-11-24 09:08:23 +05:30
Adish M
9c160bc04a
Update WEBSITE_SIGNUP_URL in cloud-frontend.yml 2025-11-24 09:08:05 +05:30
Pratik Agrawal
32b0f7785a
[docs]: OpenAPI for API References (#14531)
* [docs]: OpenAPI for API References

* wip

* scim doc

* update openapi spec

* feedback update
2025-11-20 14:04:07 +05:30
Adish M
98f21440e3
Add workflows section to Helm documentation with configuration details for scheduling and Redis setup (#14620)
* Add workflows section to Helm documentation with configuration details for scheduling and Redis setup

* link update

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2025-11-20 14:03:57 +05:30
Pratik Agrawal
107a185a8c
[docs]: Branching and PR (#14601)
* [docs]: Branching and PR

* feedback updates
2025-11-19 10:46:02 +05:30
Pratik Agrawal
d278946373
[docs]: Prometheus Plugin 3.16 (#14609) 2025-11-18 16:33:33 +05:30
Adish M
7f58263378
Merge pull request #14602 from ToolJet/adishM98-patch-1
Refactor workflow input descriptions and cleanup steps
2025-11-17 17:42:06 +05:30
Adish M
8870c472be
Refactor workflow input descriptions and cleanup steps
Updated descriptions in the workflow inputs and added disk cleanup steps.
2025-11-17 17:41:26 +05:30
Adish M
46626980dc
Merge pull request #14600 from ToolJet/adishM98-patch-1
Modify vulnerability CI schedule and notifications
2025-11-17 12:51:32 +05:30
Adish M
c2e1564533
Modify vulnerability CI schedule and notifications
Updated the vulnerability CI workflow to run weekly instead of bi-weekly. Enhanced Slack notifications with structured payloads and added output retention for audit reports.
2025-11-17 12:51:09 +05:30
Adish M
373082c880
docs: Add observability documentation (#14419)
* docs: Add observability documentation

* minor formatting update

* Enhance OpenTelemetry configuration details and warnings for high cardinality metrics

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2025-11-15 00:17:37 +05:30
Adish M
ebcd5936f5
Merge pull request #14597 from ToolJet/adishM98-patch-1
Update manual Docker build workflow descriptions
2025-11-14 23:29:27 +05:30
Adish M
33caa93fd6
Update manual Docker build workflow descriptions 2025-11-14 23:29:11 +05:30
Adish M
b6b23c8c40
Merge pull request #14588 from ToolJet/docs/domain-change
[docs] Add domain change details
2025-11-14 15:54:30 +05:30
Karan Rathod
6d272cc63e disable current version 2025-11-14 14:09:18 +05:30
Karan Rathod
ccffc0fabc add domain change details 2025-11-14 14:06:19 +05:30
Adish M
03e074ad48
docs: Add workflow scheduling instructions and Redis configuration guidance across deployment documentation (#14307)
* Update workflow scheduling instructions across multiple deployment documents

* refactor docs for support new workflow style with redis

* Add warning about whitelisting ToolJet AI features in DigitalOcean and Try ToolJet setup documentation

* Remove "High-Level" from the "How It Works" section title in the migration guide

* Add note about ToolJet version compatibility in migration guide

* Add migration guide reference for users transitioning from Temporal-based workflows

* Update documentation to recommend external Redis for multiple workflow workers across various deployment setups

* Update Docker setup instructions and clarify migration guide version compatibility

* Remove rollback plan section from the Temporal to BullMQ migration guide

* Update documentation to clarify external Redis requirements for multiple worker setups

* Remove TOOLJET_QUEUE_DASH_PASSWORD references from environment variable documentation across multiple setup guides

* Update Redis configuration documentation to clarify external instance requirements and add workflow scheduling variables

* Update migration guide to reflect correct ToolJet version for BullMQ transition
2025-11-12 16:25:56 +05:30
Adish M
c9a95dfaa6
[docs]: Streaming Rsyslog Audit Logs to Datadog (#14384)
* Add guide for streaming Rsyslog audit logs to Datadog

* add use case, add doc to sidebar

* feedback update

---------

Co-authored-by: rudrapratik30 <pratik104agrawal@gmail.com>
2025-11-04 10:51:43 +05:30
Pratik Agrawal
08f22b68d0
[docs]: OIDC PKCE (#14476) 2025-11-03 13:42:05 +05:30
Pratik Agrawal
6bb03104dc
[docs]: Prompting 101 (#14322)
* [docs]: Prompting 101

* feedback update
2025-11-03 12:53:25 +05:30
Adish M
7d4709efe7
Merge pull request #14462 from ToolJet/docs/update-support-email
[docs]: Update Support Mail
2025-10-31 14:27:05 +05:30
Adish M
662a4f3ae0
Merge pull request #14474 from ToolJet/docs/marketing-tracking-script
[docs]: Marketing Script
2025-10-31 11:36:39 +05:30
rudrapratik30
0e4e23879a [docs]: Marketing Script 2025-10-31 11:23:58 +05:30
rudrapratik30
97b2ee83ed [docs]: Update Support Mail 2025-10-30 12:20:40 +05:30
Adish M
c491dadfd8
Merge pull request #14460 from ToolJet/docs/minor-changes-oct-2025
[docs]: Minor Changes
2025-10-30 11:59:40 +05:30
rudrapratik30
d364827ba7 [docs]: Minor Changes 2025-10-30 11:58:56 +05:30
Adish M
4acc87eb36
Merge pull request #14458 from ToolJet/azure-container-entrypoint-fix
[Doc] Azure container apps V3.16 LTS
2025-10-30 11:46:28 +05:30
Adish M
7540c7dfc5
Merge pull request #14459 from ToolJet/docs/filepicker-component
[docs]: File Picker Component
2025-10-30 11:39:44 +05:30
rudrapratik30
84913a658f [docs]: File Picker Component 2025-10-30 11:39:05 +05:30
Souvik
117e2ef27c small fix 2025-10-30 01:15:28 +05:30
Adish M
9d92b4b848
Merge pull request #14448 from ToolJet/render-preview-fix
Render preview fix
2025-10-29 16:50:33 +05:30
Souvik
1f7e2e9d45 small fix 2025-10-29 16:47:54 +05:30
Souvik
738ecf1576 updated fix 2025-10-29 16:38:09 +05:30
Souvik
9cf86c862e
Update render-preview-deploy.yml 2025-10-29 16:12:17 +05:30
Souvik
e25f1ae6f5 Merge branch 'develop' into render-preview-fix 2025-10-29 13:48:25 +05:30
Souvik
dcf1e4c16d added fix 2025-10-29 13:48:01 +05:30
Adish M
52b15814fe
Merge pull request #14446 from ToolJet/fix/api-cache-develop
Add Cloudflare cache purge step with user authorization check to develop branch
2025-10-29 13:09:34 +05:30
adishM98 Bot
04d1c25f0b Add Cloudflare cache purge step with user authorization check to develop branch 2025-10-29 13:08:36 +05:30
Adish M
9c2ac21206
modified to target (#14445) 2025-10-29 12:38:55 +05:30
Souvik
9569be1412 modified to target 2025-10-29 12:35:56 +05:30
Adish M
7f95723f43
Merge pull request #14436 from ToolJet/render-preview-fix
Render-preview-deploy file FIX
2025-10-29 10:40:51 +05:30
Souvik
762c42fccb Use checkout-branch default 2025-10-28 23:38:23 +05:30
Pratik Agrawal
df07b3216b
[docs]: ToolJet API cURL (#14338)
* [docs]: ToolJet API cURL

* testing updates
2025-10-28 12:18:05 +05:30
Adish M
56b50c8ab5
Merge pull request #14416 from ToolJet/docs/marketplace-sprint-13
[docs]: Marketplace Sprint 13
2025-10-27 11:21:36 +05:30
rudrapratik30
08aca1e21e wip waiting for rest api changes 2025-10-27 11:14:42 +05:30
Adish M
7539a385b6
Merge pull request #14402 from ToolJet/docs/gemini-model-update
[docs]: Update Gemini Model
2025-10-24 14:35:09 +05:30
rudrapratik30
a945731e6a [docs]: Update Gemini Model 2025-10-24 14:34:19 +05:30
Adish M
60f69b79a2
Merge pull request #14393 from ToolJet/adishM98-patch-1
Disable scheduled cron job for LTS updates
2025-10-23 14:28:35 +05:30
Adish M
1df259e816
Disable scheduled cron job for LTS updates
Comment out the scheduled cron job for LTS updates.
2025-10-23 14:28:20 +05:30
Pratik Agrawal
a01089680e
[docs]: App Builder Feedback Update (#14327)
* nav bar changes

* update custom themes

* wip

* feedback complete
2025-10-22 11:30:58 +05:30
Adish M
b2b8de1a3d
Merge pull request #14350 from ToolJet/docs/end-user-app-preview
[docs]: Preview Access Control for End Users
2025-10-17 11:30:17 +05:30
Adish M
e4dafda7a8
Merge pull request #14356 from ToolJet/updated-render-preview 2025-10-17 06:50:18 +05:30
Souvik
87ab1b12da added event.issue.number 2025-10-16 22:51:15 +05:30
Adish M
760b2ae895
Merge pull request #14352 from ToolJet/updated-render-preview
updated render-preview-deploy file
2025-10-16 19:45:28 +05:30
Souvik
10b334b5a4 updated render-preview-deploy file 2025-10-16 19:12:13 +05:30
rudrapratik30
bf36a125db [docs]: Preview Access Control for End Users 2025-10-16 12:38:02 +05:30
Pratik Agrawal
2dbe792741
[docs]: Global App Version (#14328) 2025-10-14 12:27:02 +05:30
Pratik Agrawal
b8115a8ef1
[docs]: Add Custom Style (#14324) 2025-10-14 10:47:31 +05:30
adishM98 Bot
b165ead455 Revert "Update workflow scheduling instructions across deployment documentation"
This reverts commit 618ac7399a.
2025-10-10 13:36:37 +05:30
adishM98 Bot
618ac7399a Update workflow scheduling instructions across deployment documentation 2025-10-10 13:35:33 +05:30
rudrapratik30
0752bb3fbf build fix 2025-10-08 14:24:21 +05:30
rudrapratik30
664828e8c3 revamp done 2025-10-08 12:13:11 +05:30
Adish M
f836095a54
Merge pull request #14269 from ToolJet/adishM98-patch-1
Modify branch reference and remove cypress checks
2025-10-07 15:54:21 +05:30
Adish M
1568fe1331
Modify branch reference and remove cypress checks
Updated the GitHub Actions workflow to change the branch reference from 'main' to 'lts-3.16' for multiple jobs and removed the cypress vulnerability check section.
2025-10-07 15:53:13 +05:30
Adish M
0d6a364fe5
Add ALLOWED_USER14_TEST_SYSTEM to test system (#14225) 2025-09-30 19:35:18 +05:30
Adish M
91364a2460
Add ALLOWED_USER14_TEST_SYSTEM to test system 2025-09-30 19:34:55 +05:30
Pratik Agrawal
ea2327066a
[docs]: API User Metadata (#14217) 2025-09-30 19:28:52 +05:30
rudrapratik30
b2305bf9aa wip 2025-09-30 14:03:28 +05:30
Adish M
f9dc6e3afb
Add ALLOWED_USER13_TEST_SYSTEM to workflow (#14197) 2025-09-29 11:38:13 +05:30
Adish M
b2cc63f6a8
reverting netlify changes (#14193) 2025-09-28 23:45:55 +05:30
Adish M
ffb4e2ecbd
Use npx for Netlify CLI deployment (#14191)
* Use npx for Netlify CLI deployment

Replaced global installation of Netlify CLI with local installation using npx.

* Update Netlify CLI installation method in workflow
2025-09-28 23:14:03 +05:30
Adish M
c8168fa0e1
Merge pull request #14179 from ToolJet/docs/remove-slash
remove slash
2025-09-26 14:40:42 +05:30
rudrapratik30
8427b2ab95 remove slash 2025-09-26 14:40:01 +05:30
Adish M
1025196829
Merge pull request #14178 from ToolJet/docs/ai-whitelist
[docs]: AI Server URL Whitelisting
2025-09-26 14:30:30 +05:30
rudrapratik30
6f93abf529 [docs]: AI Server URL Whitelisting 2025-09-26 14:29:36 +05:30
Adish M
402b9d1d20
Merge pull request #14157 from ToolJet/docs/ldap-ou-sso
[docs]: LDAP Multi OU
2025-09-24 14:13:49 +05:30
rudrapratik30
e3436d75a1 [docs]: LDAP Multi OU 2025-09-24 14:12:01 +05:30
rudrapratik30
1476e102a5 wip 2025-09-24 14:08:51 +05:30
Adish M
8440a41b4b
Merge pull request #14131 from ToolJet/docs/ai-sept-release
[docs]: AI Sept Release
2025-09-22 12:33:52 +05:30
rudrapratik30
59f50898ff remove modify specs 2025-09-22 12:32:13 +05:30
Adish M
13793ff148
Merge pull request #14133 from ToolJet/adishM98-patch-1
Update WEBSITE_SIGNUP_URL in workflow configuration
2025-09-22 11:21:15 +05:30
Adish M
692a816aeb
Update WEBSITE_SIGNUP_URL in workflow configuration 2025-09-22 11:20:24 +05:30
rudrapratik30
1c84f10dc0 update docs version 2025-09-22 10:59:00 +05:30
rudrapratik30
d553e47fc2 Generate Code 2025-09-22 09:25:52 +05:30
rudrapratik30
81f13c41ff gen code wip 2025-09-20 15:48:16 +05:30
Adish M
a5f53b9493
Merge pull request #14115 from ToolJet/terraform-vm-added
Added GCP and EC2_AMI
2025-09-19 16:58:40 +05:30
rudrapratik30
ca04352ba6 buy ai credits 2025-09-19 15:24:27 +05:30
rudrapratik30
6b9b902137 debug components 2025-09-19 10:33:40 +05:30
rudrapratik30
8790c674ae privacy policy 2025-09-19 09:31:10 +05:30
Souvik
b44cac88b4 small fix 2025-09-19 01:29:34 +05:30
Souvik
0a47a27d7a Updated docs for ami and docker 2025-09-19 01:25:59 +05:30
Souvik
b179e92bb5 Added GCP and EC2_AMI 2025-09-19 01:12:30 +05:30
rudrapratik30
f976ab01e9 wip 2025-09-18 12:05:25 +05:30
Adish M
ae7e7431b8
Merge pull request #14034 from ToolJet/contributing-docker-fix
Fix no file error
2025-09-16 17:07:52 +05:30
Souvik
9f3ed98f2b updated filename 2025-09-16 17:07:10 +05:30
Adish M
d34d0c10fe
Merge pull request #14026 from ToolJet/docs/website-utm
[docs]: Website UTM Parameters
2025-09-16 14:28:49 +05:30
rudrapratik30
bcba8d6467 generate app steps 2025-09-16 12:35:42 +05:30
Adish M
d4aea0f621
Merge pull request #14066 from ToolJet/adishM98-patch-1
Change default branch for deployment to 'lts-3.16'
2025-09-16 12:27:16 +05:30
Adish M
e91ff22a3b
Change default branch for deployment to 'lts-3.16' 2025-09-16 12:27:05 +05:30
Adish M
bc4e5fe48a
Merge pull request #14065 from ToolJet/docs/fix-interlink-dev-lifecycle-rollback 2025-09-16 09:31:47 +05:30
rudrapratik30
b7b8a54c41 [docs]: fix interlinking 2025-09-16 09:05:47 +05:30
Souvik
1e6688415b Updated 3.16 lts docs 2025-09-16 00:49:29 +05:30
Souvik
aa8fe75998 Postgrest connection fix 2025-09-16 00:38:42 +05:30
Pratik Agrawal
a4bc5b92e9
Merge pull request #14053 from ToolJet/docs/fedex
[docs]: Marketplace FedEx Plugin
2025-09-15 20:57:41 +05:30
Adish M
1ac3451102
Merge pull request #13971 from ToolJet/docs/permissions
[docs]: App Builder Permissions
2025-09-15 14:16:32 +05:30
rudrapratik30
6b490d019e Merge branch 'docs/update-custom-themes' into docs/permissions 2025-09-15 14:15:24 +05:30
Adish M
7a9099c173
Merge pull request #14011 from ToolJet/docs/appbuilder-sprint-15
[docs]: App Builder Sprint 15
2025-09-15 14:00:36 +05:30
Adish M
5f319aa71c
Merge pull request #14049 from ToolJet/adishM98-patch-1
Update WEBSITE_SIGNUP_URL in workflow file
2025-09-12 17:05:37 +05:30
Adish M
29460a2d9a
Update WEBSITE_SIGNUP_URL in workflow file 2025-09-12 17:05:17 +05:30
Adish M
f6357fede7
Merge pull request #14048 from ToolJet/adishM98-patch-1
Update WEBSITE_SIGNUP_URL in workflow configuration
2025-09-12 16:42:22 +05:30
Adish M
61cc77bbcb
Update WEBSITE_SIGNUP_URL in workflow configuration 2025-09-12 16:42:09 +05:30
Adish M
12d9d89e4b
Merge pull request #13472 from ToolJet/docs/max-file-size
[docs]: Max JSON Size Env Var
2025-09-12 12:44:08 +05:30
rudrapratik30
10e54ae276 replicate 3.16 2025-09-12 12:42:28 +05:30
rudrapratik30
a9f8a15b98 Merge branch 'develop' into docs/max-file-size 2025-09-12 12:39:10 +05:30
Adish M
bdcfba912a
Merge pull request #13704 from ToolJet/docs/gmail
[docs]: Gmail Plugin
2025-09-12 12:10:33 +05:30
Adish M
a1d586e1bf
Merge pull request #14037 from ToolJet/adishM98-patch-1
Fix formatting in deploy-to-stage.yml
2025-09-12 11:37:01 +05:30
Adish M
e1d9544c05
Fix formatting in deploy-to-stage.yml 2025-09-12 11:36:20 +05:30
Souvik
e22c30a679 Fix no file error 2025-09-12 01:10:24 +05:30
rudrapratik30
228c80bbd5 Merge branch 'docs/easypost' into docs/gmail 2025-09-11 19:38:04 +05:30
Adish M
08552c750d
Merge pull request #14032 from ToolJet/adishM98-patch-1
Refactor branch handling in deployment workflow
2025-09-11 19:13:44 +05:30
adishM98 Bot
6f82a769e6 Fix indentation for user authorization step in deployment workflow 2025-09-11 19:13:15 +05:30
Adish M
c7dac6cfd1
Refactor branch handling in deployment workflow 2025-09-11 19:11:36 +05:30
rudrapratik30
d0c27ae6e3 [docs]: Website UTM Parameters 2025-09-11 15:27:12 +05:30
rudrapratik30
870923aea0 label width alignment 2025-09-11 15:07:31 +05:30
rudrapratik30
555bc93da1 star rating feedback update 2025-09-11 12:58:23 +05:30
rudrapratik30
5221dc268b typo 2025-09-10 14:48:31 +05:30
rudrapratik30
2ddac4d119 tags component 2025-09-10 14:48:09 +05:30
rudrapratik30
97c8302376 Merge branch 'develop' into docs/appbuilder-sprint-15 2025-09-10 12:43:18 +05:30
Adish M
ca87b15366
Merge pull request #13982 from ToolJet/docs/ai-credits 2025-09-10 10:28:24 +05:30
rudrapratik30
7d4eed342c all user to edit 2025-09-09 17:17:27 +05:30
rudrapratik30
52d256ec5e star-rating 2025-09-09 17:08:36 +05:30
rudrapratik30
69e979987e popover menu 2025-09-09 15:29:57 +05:30
Adish M
b15bafe1c2
Merge pull request #14007 from ToolJet/docs/deployment-explore-more
[docs]: deployment explore more page
2025-09-09 14:17:36 +05:30
rudrapratik30
c6c5270f73 [docs]: Setup Layout 2025-09-09 13:06:30 +05:30
rudrapratik30
2c5b7d2415 this much is working 2025-09-09 12:26:08 +05:30
rudrapratik30
5c70d0a838 wip 2025-09-09 11:52:00 +05:30
rudrapratik30
60fa3a4d7d banner note update 2025-09-09 11:30:42 +05:30
Adish M
7843a7b4fd
Merge pull request #13980 from ToolJet/Navaneeth-pk-patch-1
[docs] Reposition ToolJet as AI-native internal application & agent builder (README + repo description)
2025-09-09 09:53:06 +05:30
Karan Rathod
549683d090 update platform overview for LLM SEO 2025-09-08 17:23:43 +05:30
rudrapratik30
af38c008c7 feedback 2 2025-09-08 17:09:13 +05:30
Souvik
0f4e89f06e
Terraform added (#13973)
* Terraform added

* docs updated

* EOL
2025-09-08 16:58:07 +05:30
rudrapratik30
ed099dd70b feedback update 2025-09-08 14:56:33 +05:30
Adish M
bc336a1fd4
Fix security audit command for frontend (#13986) 2025-09-08 11:07:18 +05:30
rudrapratik30
4146961a04 replicate debug component in 3.16 2025-09-05 16:39:38 +05:30
rudrapratik30
18aa06288d [docs]: AI Credits 2025-09-05 15:04:46 +05:30
Karan Rathod
4cbabf8cae add more enterprise features 2025-09-05 12:25:07 +05:30
Karan Rathod
c32c562a6e update the list of features for CE and EE 2025-09-05 11:39:11 +05:30
Navaneeth Pk
84e2abd8f6
[docs] clarify community edition vs ToolJet AI (AI-native platform) 2025-09-04 12:39:47 -07:00
Pratik Agrawal
16b9d92146
[docs]: TJ Postgres Upgrade guide (#13960)
* [docs]: TJ Postgres Upgrade guide

* deleted the whole files
2025-09-04 23:03:01 +05:30
rudrapratik30
2204623574 [docs]: EasyPost Marketplace Plugin 2025-09-04 17:05:14 +05:30
rudrapratik30
a49692e0a7 wip 2025-09-04 12:52:07 +05:30
rudrapratik30
77abba5e68 [docs]: App Builder Permissions 2025-09-03 17:20:32 +05:30
Pratik Agrawal
3f51421330
[docs]: OIDC Custom Scope (#13969)
* [docs]: OIDC Custom Scope

* add self hosted
2025-09-03 16:02:39 +05:30
Adish M
d1414fd2da
Revise PostgreSQL 13.x end-of-life information (#13962) 2025-09-02 14:37:20 +05:30
Adish M
dfb7669283
Change RAM requirement for VM deployments (#13961)
Updated RAM requirement from 2GB to 4GB for VM deployments.
2025-09-02 13:58:49 +05:30
rudrapratik30
95b043a382 final updates 2025-09-02 12:32:56 +05:30
Pratik Agrawal
c9fe192c44
[docs]: OpenAPI Multi User Auth (#13951) 2025-09-01 11:09:51 +05:30
Adish M
a6b3f486e3
Update steps.md (#13949) 2025-08-29 22:25:31 +05:30
rudrapratik30
d6cf5365e6 wip 2025-08-29 16:31:25 +05:30
rudrapratik30
ca85271ccb wip 2025-08-29 16:30:30 +05:30
rudrapratik30
7789284ef8 Merge branch 'develop' into docs/update-custom-themes 2025-08-29 12:22:20 +05:30
Pratik Agrawal
6e216dc444
Update Readme File (#13943) 2025-08-29 11:38:40 +05:30
Dheeraj-P-Girish
756cf88a2f
fix:storing the UTM params in the local storage and passing the UTM params to tooljet.ai (#13930)
Co-authored-by: Dheeraj P Girish <dheerajpgirish@Dheeraj-ka-MacBook-Pro.local>
2025-08-28 14:08:14 +05:30
Pratik Agrawal
3891f1952c
[docs]: Slack in 3.16 (#13935) 2025-08-28 11:15:57 +05:30
Pratik Agrawal
d3b190f712
[docs]: CDN in 3.16 (#13934) 2025-08-28 10:56:38 +05:30
Pratik Agrawal
6bf74ffb23
[docs]: fix broken table in image (#13923) 2025-08-25 22:36:59 +05:30
Pratik Agrawal
cea8e70028
[docs]: Clickup Plugin Docs in 3.16 (#13921) 2025-08-25 17:25:11 +05:30
Johnson Cherian
e6ef9b6c5e
Merge pull request #13917 from ToolJet/docs/statistics
[docs]: Statistics Component
2025-08-25 17:18:37 +05:30
rudrapratik30
cdcd05330b feedback changes 2025-08-25 16:53:18 +05:30
rudrapratik30
567fbbd4af [docs]: Statistics Component 2025-08-25 15:07:01 +05:30
Pratik Agrawal
dc22bafac5
[docs]: HubSpot Marketplace Plugin (#13906) 2025-08-22 20:40:18 +05:30
Adish M
0a699a74bd
Update cloud-frontend-gcp.yml (#13913)
* Update cloud-frontend-gcp.yml

* Update cloud-frontend.yml
2025-08-22 20:20:23 +05:30
Adish M
741f00bf7c
Update cloud-frontend-gcp.yml (#13911) 2025-08-22 20:08:29 +05:30
Adish M
d648d14f79
Update cloud-frontend-gcp.yml (#13910) 2025-08-22 20:02:39 +05:30
Pratik Agrawal
12d0760816
[docs]: Embed Application (#13754)
* [docs]: Embed Application

* feedback update

* table left align
2025-08-22 17:45:52 +05:30
Pratik Agrawal
538516016b
[docs]: gRPC 2.0 Guide (#13844)
* grpc draft1

* sidebar update

* grpc doc

* update ss

* update

* update ss
2025-08-21 16:51:27 +05:30
Pratik Agrawal
14c3071efa
[docs]: Migrate WF from 3.5 to 3.16 (#13898) 2025-08-21 16:12:34 +05:30
Vrushabh Gawas
0b60e50f12
[docs]: Fix Typo in Docker Setup Guide (#13823)
* fixed typo in docker setup docs

* Fix Typo in Versioned Docs, Docker Setup Guide
2025-08-21 12:19:42 +05:30
Adish M
1da955cbda
Update try-tooljet.md (#13896) 2025-08-21 11:27:42 +05:30
Adish M
0edfc084cc
Update try-tooljet.md (#13895) 2025-08-21 11:18:08 +05:30
Johnson Cherian
f77f4e7432
Merge pull request #13848 from ToolJet/docs/app-builder-nav
[docs]: Update Navigation Bar
2025-08-20 13:21:45 +05:30
Pratik Agrawal
d74107afcf
[docs]: AWS Bedrock Plugin (#13705) 2025-08-20 11:57:00 +05:30
rudrapratik30
708955146a wip 2025-08-20 11:54:05 +05:30
Adish M
e5c9ab8a81
Fix formatting and enhance Docker deployment instructions for AI feature whitelisting (#13862) 2025-08-19 17:16:32 +05:30
rudrapratik30
9d3fcffebc feedback update 2025-08-19 15:43:31 +05:30
Adish M
afcbef5dc8
Merge pull request #13599 from ToolJet/docs/ups
[docs]: UPS Plugin
2025-08-19 13:56:40 +05:30
Adish M
45395aeb2f
Merge pull request #13852 from ToolJet/fix/deployment-docs
Update deployment documentation for ToolJet 3.16.0-LTS
2025-08-19 12:38:35 +05:30
adishM98 Bot
0c9dead723 Fix link to system requirements in AMI deployment documentation 2025-08-19 12:37:55 +05:30
rudrapratik30
70dbe32b32 formatting updates 2025-08-19 12:25:29 +05:30
adishM98 Bot
9f9473a4d5 Enhance AMI deployment documentation with AI feature whitelisting and upgrade steps 2025-08-19 10:57:53 +05:30
adishM98 Bot
515bfb216d Update deployment documentation for ToolJet 3.16.0-LTS
- Added warnings for enabling ToolJet AI features across various setup guides.
- Clarified PostgreSQL database setup instructions and recommended using specific services (Cloud SQL, Azure Database for PostgreSQL, RDS).
- Introduced Redis configuration guidelines for multi-service setups in Helm, AKS, EKS, GKE, and OpenShift documentation.
- Updated environment variable requirements for ToolJet and PostgREST, ensuring unique database names and proper configurations.
- Included SSL configuration instructions for AWS RDS PostgreSQL connections.
- Changed Docker image tags in the Try ToolJet setup to use the latest LTS version.
- Enhanced overall clarity and structure of the documentation for better user guidance.
2025-08-19 10:34:33 +05:30
rudrapratik30
ac103dcd22 [docs]: Update Navigation Bar 2025-08-18 16:55:42 +05:30
Adish M
17169cb3ac
Merge pull request #13847 from ToolJet/adishM98-patch-1
Update manual-docker-build.yml
2025-08-18 16:34:40 +05:30
Adish M
6aa68ed9e3
Update manual-docker-build.yml 2025-08-18 16:34:24 +05:30
Adish M
22030ed16e
Merge pull request #13774 from ToolJet/docs/postgresql-migratio
[docs]: PostgreSQL Upgrade Guides
2025-08-18 14:58:24 +05:30
adishM98 Bot
c850549e41 update system requirements for PostgreSQL version and clarify Redis usage 2025-08-18 14:58:04 +05:30
Adish M
be07d87824
remove Using Cloud SQL Proxy with Cloud Run 2025-08-18 12:51:59 +05:30
rudrapratik30
84c65b6bcd update gcp 2025-08-18 12:30:42 +05:30
rudrapratik30
78b26487cc tooljet 2025-08-18 11:36:56 +05:30
rudrapratik30
2f42d6427c Merge branch 'develop' into docs/ups 2025-08-14 18:44:32 +05:30
rudrapratik30
44434dcf50 remove from 3.5 2025-08-14 18:34:03 +05:30
rudrapratik30
c46cb7d8b9 Merge branch 'docs/cloud-spanner' into docs/ups 2025-08-14 18:32:41 +05:30
Pratik Agrawal
c5962e50fb
[docs]: GitSync Migration Changes (#13735)
* wip

* gitsync migration changes

* feedback updates
2025-08-14 18:29:20 +05:30
Adish M
978dbf51d2
Merge pull request #13812 from ToolJet/docs-ami-all-lts
[Docs] Updated ami.md 2.50-LTS to 3.16-LTS
2025-08-14 17:08:12 +05:30
Souvik
11a4e1ec2f Updated ami docs 2.50-LTS to 3.16-LTS 2025-08-13 23:53:18 +05:30
Adish M
a07e21fa21
Merge pull request #13575 from ToolJet/docs/aftership
[docs]: Aftership Plugin
2025-08-13 12:09:03 +05:30
rudrapratik30
82929db3d3 Merge confilicts 2025-08-13 12:01:00 +05:30
Pratik Agrawal
855ab19f4a
[docs]: Microsoft Graph (#13554)
* [wip] microsoft graph

* [docs]: Microsoft Graph

* replicate to 3.5

* replicate 3.16
2025-08-13 11:55:37 +05:30
rudrapratik30
8d8201bc32 minor formatting updates for headings 2025-08-12 21:33:09 +05:30
rudrapratik30
b283fe9781 gcp 2025-08-12 21:28:09 +05:30
Adish M
b7ff0a9fd6
Merge pull request #13781 from ToolJet/automation-test-system
Update test system fix
2025-08-12 21:22:44 +05:30
rudrapratik30
057168b95b azure 2025-08-12 20:51:58 +05:30
Souvik
67548b68f0 added fix 2025-08-12 18:26:50 +05:30
Adish M
3bb580198e
Merge pull request #13779 from ToolJet/adishM98-patch-1
Update manual-docker-build.yml
2025-08-12 17:47:52 +05:30
Adish M
a7195b588e
Update manual-docker-build.yml 2025-08-12 17:47:37 +05:30
Adish M
d36ed961c3
Merge pull request #13775 from ToolJet/adishM98-patch-1
Update update-test-system.yml
2025-08-12 16:46:34 +05:30
Adish M
1246f23fc6
Update update-test-system.yml 2025-08-12 16:46:20 +05:30
Adish M
6e2eb08afc
Merge pull request #13659 from ToolJet/automation-update-test-system
Automation: update test-system
2025-08-12 16:44:13 +05:30
Adish M
14b3318b34
updated the allow username to avoid conflict with existing secrets 2025-08-12 16:31:16 +05:30
rudrapratik30
cddd5ea8d4 aws 2025-08-12 16:03:51 +05:30
Adish M
77af1217be
Fix EOF 2025-08-12 12:59:22 +05:30
rudrapratik30
c22ae7f495 aftershipppppp 2025-08-11 17:11:55 +05:30
rudrapratik30
3b2a2dddae fix typo and wrap endpoints 2025-08-11 16:47:30 +05:30
rudrapratik30
895451a6c5 [docs]: Embed Application 2025-08-11 16:35:13 +05:30
Souvik
751eaaa56b Updates added 2025-08-11 12:25:30 +05:30
rudrapratik30
a4ff317901 replicate 3.16 2025-08-08 18:59:45 +05:30
rudrapratik30
8e808e09d6 Merge branch 'develop' into docs/aftership 2025-08-08 18:53:10 +05:30
rudrapratik30
a07a867748 replicate 3.16 2025-08-08 18:40:19 +05:30
rudrapratik30
ebd18a8757 Merge branch 'develop' into docs/ups 2025-08-08 18:35:29 +05:30
rudrapratik30
2c4c659217 [docs]: Google Cloud Spanner Plugin 2025-08-08 16:50:09 +05:30
Johnson Cherian
795762b10c
Merge pull request #13667 from ToolJet/docs/release-interlinking
[docs]: Release Interlinking
2025-08-08 12:59:57 +05:30
Pratik Agrawal
ac0a288891
revert changes 2025-08-08 11:46:02 +05:30
Pratik Agrawal
79e872226d
revert changes 3.5 listview 2025-08-08 11:44:14 +05:30
Pratik Agrawal
accd1ed04b
revert changes 2025-08-08 11:41:48 +05:30
Pratik Agrawal
8126623dd5
revert changes - bounded-box 3.5 2025-08-08 11:40:59 +05:30
rudrapratik30
86f0288442 fix broken action reference link 2025-08-08 11:09:41 +05:30
rudrapratik30
3c797da2c8 fix component library and 3.5 home page 2025-08-08 11:05:59 +05:30
rudrapratik30
1eed85af9e fix beta links 2025-08-08 10:52:02 +05:30
rudrapratik30
572310af62 Merge branch 'develop' into docs/release-interlinking 2025-08-08 10:23:19 +05:30
Pratik Agrawal
43c6735dd5
[docs]: Update Env Var (#13707) 2025-08-08 10:22:16 +05:30
Pratik Agrawal
ba3912ce53
[docs]: Post Release Platform and App Builder Changes (#13644)
* [docs]: Post Release Platform and App Builder Changes

* gitsync replica

* gitsync custom branch
2025-08-07 12:09:43 +05:30
rudrapratik30
b420cfd498 [docs]: Gmail Plugin 2025-08-06 18:07:53 +05:30
Pratik Agrawal
8a7dc03e8a
[docs]: Add Self Hosted Tag in Workflows (#13692) 2025-08-06 11:18:04 +05:30
Pratik Agrawal
e4228489b7
[docs]: Update LDAP OU Env Var (#13682)
* [docs]: Update LDAP OU Env Var

* update
2025-08-05 17:49:32 +05:30
rudrapratik30
91d9da714e completed 2025-08-05 16:56:16 +05:30
Pratik Agrawal
ef7a535362
[docs]: LDAP Multiple OU (#13607)
* [docs]: LDAP Multiple OU

* replicate 3.5

* remove changes from beta
2025-08-05 16:11:55 +05:30
Pratik Agrawal
62ef4461c3
[docs]: Post Release Updates (#13649)
* [docs]: Add New Docs Version and Docker Tag

* remove beta version and setup updates

* build errors
2025-08-05 14:55:44 +05:30
Midhun G S
cf03b5ef51
Merge pull request #13672 from ToolJet/main
Merge main to develop
2025-08-05 12:57:47 +05:30
rudrapratik30
c290082868 interlinking 2.5, 3, 3.5 2025-08-05 11:40:15 +05:30
Souvik
23315c57dd Automation: update test-system 2025-08-04 20:51:07 +05:30
rudrapratik30
7e76d54688 wip 2025-08-01 17:44:14 +05:30
rudrapratik30
b0a9b4f258 [docs]: UPS Plugin 2025-08-01 12:01:52 +05:30
rudrapratik30
5739a7fee5 [docs]: Aftership Plugin 2025-07-31 14:43:17 +05:30
rudrapratik30
8425cc2045 add quotations 2025-07-18 11:47:17 +05:30
rudrapratik30
2e86a9e83c replicate env var to beta 2025-07-18 11:26:23 +05:30
rudrapratik30
a25a320b9d [docs]: Max JSON Size Env Var 2025-07-17 18:02:57 +05:30
Vaishnavi Joshi
122474ada3 [docs]: fix smtp attachment 2025-03-26 10:31:24 +05:30
12531 changed files with 157256 additions and 2304090 deletions


@@ -0,0 +1,203 @@
name: Deploy to Cloudflare Pages prod (Cloud Frontend)

on:
  workflow_dispatch:
    inputs:
      branch:
        description: 'Git branch to deploy (must start with "lts-", e.g., lts-3.6)'
        required: true

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: ✅ Check user authorization
        run: |
          allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
          allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
          allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}

          if [[ "${{ github.actor }}" != "$allowed_user1" && \
                "${{ github.actor }}" != "$allowed_user2" && \
                "${{ github.actor }}" != "$allowed_user3" ]]; then
            echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
            exit 1
          else
            echo "✅ User '${{ github.actor }}' is authorized."
          fi

      - name: 📥 Manual Git checkout with submodules
        run: |
          set -e
          BRANCH="${{ github.event.inputs.branch }}"
          REPO="https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/${{ github.repository }}"

          git config --global url."https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/".insteadOf "https://github.com/"
          git config --global http.version HTTP/1.1
          git config --global http.postBuffer 524288000

          echo "👉 Cloning $REPO (branch: $BRANCH)"
          git clone --recurse-submodules --depth=1 --branch "$BRANCH" "$REPO" repo
          cd repo

          echo "🔎 Main repo: verifying checkout"
          MAIN_CURRENT=$(git rev-parse --abbrev-ref HEAD)
          echo "✅ Main repo: successfully checked out branch $MAIN_CURRENT"
          echo "📍 Main repo: current commit $(git rev-parse --short HEAD): $(git log -1 --pretty=%s)"

          echo "🔁 Updating submodules"
          git submodule update --init --recursive

          echo "🔀 Attempting to checkout '$BRANCH' in each submodule and validating"
          BRANCH="$BRANCH" git submodule foreach --recursive bash -c '
            name="$sm_path"
            echo ""
            echo "Entering '\''$name'\''"
            echo "↪ $name: trying to checkout branch '\''$BRANCH'\''"

            if git ls-remote --exit-code --heads origin "$BRANCH" >/dev/null; then
              git fetch origin "$BRANCH:$BRANCH" || {
                echo "❌ $name: fetch failed for $BRANCH"
                exit 1
              }
              PREV=$(git rev-parse --short HEAD || echo "unknown")
              git checkout "$BRANCH" || {
                echo "❌ $name: checkout failed for $BRANCH"
                exit 1
              }
              echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
              echo "✅ $name: checked out branch $BRANCH"
            else
              echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'lts-3.16'"
              PREV=$(git rev-parse --short HEAD || echo "unknown")
              git fetch origin lts-3.16:lts-3.16 || {
                echo "❌ $name: fetch failed for lts-3.16"
                exit 1
              }
              git checkout lts-3.16 || {
                echo "❌ $name: fallback to lts-3.16 failed"
                exit 1
              }
              echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
              echo "✅ $name: now on branch lts-3.16"
            fi

            CURRENT=$(git rev-parse --abbrev-ref HEAD)
            echo "🔎 $name: current branch = $CURRENT"
            if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "lts-3.16" ]; then
              echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'lts-3.16', got '$CURRENT'"
              exit 1
            fi
          '

      - name: 🧰 Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: 22.15.1

      - name: 📦 Install dependencies
        run: npm install
        working-directory: repo

      - name: 🛠️ Build the project
        run: npm run build:plugins:prod && npm run build:frontend:cloud
        working-directory: repo
        env:
          GOOGLE_MAPS_API_KEY: ${{ secrets.CLOUD_PROD_CLOUD_GOOGLE_MAPS_API_KEY }}
          NODE_ENV: ${{ secrets.CLOUD_PROD_CLOUD_NODE_ENV }}
          NODE_OPTIONS: ${{ secrets.CLOUD_PROD_CLOUD_NODE_OPTIONS }}
          SENTRY_AUTH_TOKEN: ${{ secrets.CLOUD_PROD_CLOUD_SENTRY_AUTH_TOKEN }}
          SENTRY_ORG: ${{ secrets.CLOUD_PROD_CLOUD_SENTRY_ORG }}
          SENTRY_PROJECT: ${{ secrets.CLOUD_PROD_CLOUD_SENTRY_PROJECT }}
          SERVE_CLIENT: ${{ secrets.CLOUD_PROD_CLOUD_SERVE_CLIENT }}
          SERVER_IP: ${{ secrets.CLOUD_PROD_CLOUD_SERVER_IP }}
          TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_PROD_CLOUD_TJDB_SQL_MODE_DISABLE }}
          TOOLJET_SERVER_URL: ${{ secrets.CLOUD_PROD_CLOUD_TOOLJET_SERVER_URL }}
          WEBSITE_SIGNUP_URL: https://www.tooljet.com/create-account
          TOOLJET_EDITION: cloud

      - name: 📝 Add SPA routing redirect rule
        run: echo "/* /index.html 200" > repo/frontend/build/_redirects
- name: 🔧 Set CF Pages production branch to input branch
run: |
echo "🔄 Updating CF Pages production branch to: ${{ github.event.inputs.branch }}"
response=$(curl -s -w "\n%{http_code}" -X PATCH \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CF_PAGES_ACCOUNT_ID }}/pages/projects/${{ secrets.CF_PAGES_PROJECT_NAME_PROD }}" \
-H "Authorization: Bearer ${{ secrets.CF_PAGES_API_TOKEN }}" \
-H "Content-Type: application/json" \
--data '{"production_branch": "${{ github.event.inputs.branch }}"}')
http_code=$(echo "$response" | tail -n1)
if [ "$http_code" = "200" ]; then
echo "✅ Production branch updated to: ${{ github.event.inputs.branch }}"
else
echo "❌ Failed to update production branch (HTTP $http_code)"
echo "$response"
exit 1
fi
- name: 🚀 Deploy to Cloudflare Pages
run: |
echo "📦 Built from source branch: ${{ github.event.inputs.branch }}"
echo "🎯 Targeting CF Pages production slot (branch alias: ${{ github.event.inputs.branch }})"
npx wrangler pages deploy frontend/build \
--project-name=${{ secrets.CF_PAGES_PROJECT_NAME_PROD }} \
--branch=${{ github.event.inputs.branch }} \
--commit-dirty=true
working-directory: repo
env:
CLOUDFLARE_API_TOKEN: ${{ secrets.CF_PAGES_API_TOKEN }}
CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CF_PAGES_ACCOUNT_ID }}
purge_cache:
needs: deploy
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: 🧹 Purge Cloudflare Cache
continue-on-error: true
run: |
echo "🔄 Purging Cloudflare cache for specific URLs..."
response=$(curl -s -w "\n%{http_code}" -X POST \
"https://api.cloudflare.com/client/v4/zones/${{ secrets.CLOUDFLARE_ZONE_ID_PROD }}/purge_cache" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN_PROD }}" \
-H "Content-Type: application/json" \
--data '{
"files": [
"${{ secrets.CLOUDFLARE_CONFIG_URL_PROD }}",
"${{ secrets.CLOUDFLARE_METADATA_URL_PROD }}"
]
}')
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')
if [ "$http_code" = "200" ]; then
echo "✅ Cloudflare cache purged successfully for specified URLs"
echo "$body"
else
echo "⚠️ Cloudflare cache purge failed with status code: $http_code"
echo "$body"
exit 1
fi
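Both jobs accept a free-form `branch` input whose description says it must start with `lts-`, but nothing in the workflow enforces that before cloning. A minimal guard could look like the sketch below; the `validate_branch` helper name is ours, not part of the workflow:

```shell
# Hypothetical guard for the workflow's branch input; the real workflow
# only documents the "lts-" convention without enforcing it.
validate_branch() {
  case "$1" in
    lts-*) return 0 ;;                                       # accepted: lts-3.6, lts-3.16, ...
    *)     echo "branch '$1' must start with 'lts-'" >&2; return 1 ;;
  esac
}

validate_branch "lts-3.6" && echo "branch accepted"
```

Run as an early step, a failed check would stop the job before the expensive clone and build.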


@@ -0,0 +1,203 @@
name: Deploy to Cloudflare Pages stage (Cloud Frontend)
on:
workflow_dispatch:
inputs:
branch:
description: 'Git branch to deploy (must start with "lts-", e.g., lts-3.6)'
required: true
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: 📥 Manual Git checkout with submodules
run: |
set -e
BRANCH="${{ github.event.inputs.branch }}"
REPO="https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/${{ github.repository }}"
git config --global url."https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/".insteadOf "https://github.com/"
git config --global http.version HTTP/1.1
git config --global http.postBuffer 524288000
echo "👉 Cloning $REPO (branch: $BRANCH)"
git clone --recurse-submodules --depth=1 --branch "$BRANCH" "$REPO" repo
cd repo
echo "🔎 Main repo: verifying checkout"
MAIN_CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "✅ Main repo: successfully checked out branch $MAIN_CURRENT"
echo "📍 Main repo: current commit $(git rev-parse --short HEAD): $(git log -1 --pretty=%s)"
echo "🔁 Updating submodules"
git submodule update --init --recursive
echo "🔀 Attempting to checkout '$BRANCH' in each submodule and validating"
BRANCH="$BRANCH" git submodule foreach --recursive bash -c '
name="$sm_path"
echo ""
echo "Entering '\''$name'\''"
echo "↪ $name: trying to checkout branch '\''$BRANCH'\''"
if git ls-remote --exit-code --heads origin "$BRANCH" >/dev/null; then
git fetch origin "$BRANCH:$BRANCH" || {
echo "❌ $name: fetch failed for $BRANCH"
exit 1
}
PREV=$(git rev-parse --short HEAD || echo "unknown")
git checkout "$BRANCH" || {
echo "❌ $name: checkout failed for $BRANCH"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: checked out branch $BRANCH"
else
echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'lts-3.16'"
PREV=$(git rev-parse --short HEAD || echo "unknown")
git fetch origin lts-3.16:lts-3.16 || {
echo "❌ $name: fetch failed for lts-3.16"
exit 1
}
git checkout lts-3.16 || {
echo "❌ $name: fallback to lts-3.16 failed"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: now on branch lts-3.16"
fi
CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "🔎 $name: current branch = $CURRENT"
if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "lts-3.16" ]; then
echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'lts-3.16', got '$CURRENT'"
exit 1
fi
'
- name: 🧰 Setup Node.js
uses: actions/setup-node@v2
with:
node-version: 22.15.1
- name: 📦 Install dependencies
run: npm install
working-directory: repo
- name: 🛠️ Build the project
run: npm run build:plugins:prod && npm run build:frontend:cloud
working-directory: repo
env:
GOOGLE_MAPS_API_KEY: ${{ secrets.CLOUD_GOOGLE_MAPS_API_KEY }}
NODE_ENV: ${{ secrets.CLOUD_NODE_ENV }}
NODE_OPTIONS: ${{ secrets.CLOUD_NODE_OPTIONS }}
SENTRY_AUTH_TOKEN: ${{ secrets.CLOUD_SENTRY_AUTH_TOKEN }}
SENTRY_ORG: ${{ secrets.CLOUD_SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.CLOUD_SENTRY_PROJECT }}
SERVE_CLIENT: ${{ secrets.CLOUD_SERVE_CLIENT }}
SERVER_IP: ${{ secrets.CLOUD_SERVER_IP }}
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_TOOLJET_SERVER_URL }}
TOOLJET_EDITION: cloud
WEBSITE_SIGNUP_URL: https://website-stage.tooljet.ai/signup
- name: 📝 Add SPA routing redirect rule
run: echo "/* /index.html 200" > repo/frontend/build/_redirects
- name: 🔧 Set CF Pages production branch to input branch
run: |
echo "🔄 Updating CF Pages production branch to: ${{ github.event.inputs.branch }}"
response=$(curl -s -w "\n%{http_code}" -X PATCH \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CF_PAGES_ACCOUNT_ID }}/pages/projects/${{ secrets.CF_PAGES_PROJECT_NAME }}" \
-H "Authorization: Bearer ${{ secrets.CF_PAGES_API_TOKEN }}" \
-H "Content-Type: application/json" \
--data '{"production_branch": "${{ github.event.inputs.branch }}"}')
http_code=$(echo "$response" | tail -n1)
if [ "$http_code" = "200" ]; then
echo "✅ Production branch updated to: ${{ github.event.inputs.branch }}"
else
echo "❌ Failed to update production branch (HTTP $http_code)"
echo "$response"
exit 1
fi
- name: 🚀 Deploy to Cloudflare Pages
run: |
echo "📦 Built from source branch: ${{ github.event.inputs.branch }}"
echo "🎯 Targeting CF Pages production slot (branch alias: ${{ github.event.inputs.branch }})"
npx wrangler pages deploy frontend/build \
--project-name=${{ secrets.CF_PAGES_PROJECT_NAME }} \
--branch=${{ github.event.inputs.branch }} \
--commit-dirty=true
working-directory: repo
env:
CLOUDFLARE_API_TOKEN: ${{ secrets.CF_PAGES_API_TOKEN }}
CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CF_PAGES_ACCOUNT_ID }}
purge_cache:
needs: deploy
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: 🧹 Purge Cloudflare Cache
continue-on-error: true
run: |
echo "🔄 Purging Cloudflare cache for specific URLs..."
response=$(curl -s -w "\n%{http_code}" -X POST \
"https://api.cloudflare.com/client/v4/zones/${{ secrets.CLOUDFLARE_ZONE_ID_PROD }}/purge_cache" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN_PROD }}" \
-H "Content-Type: application/json" \
--data '{
"files": [
"${{ secrets.CLOUDFLARE_CONFIG_URL_STAGE }}",
"${{ secrets.CLOUDFLARE_METADATA_URL_STAGE }}"
]
}')
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')
if [ "$http_code" = "200" ]; then
echo "✅ Cloudflare cache purged successfully for specified URLs"
echo "$body"
else
echo "⚠️ Cloudflare cache purge failed with status code: $http_code"
echo "$body"
exit 1
fi
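The Cloudflare API steps above rely on `curl -s -w "\n%{http_code}"` appending the HTTP status code as a final line, then split it back out with `tail -n1` and `sed '$d'`. That parsing can be exercised without any network call by faking the response; the JSON body here is illustrative:

```shell
# Simulate curl output: response body followed by the status code that
# `-w "\n%{http_code}"` appends on its own line.
response=$(printf '{"success":true}\n200')

http_code=$(echo "$response" | tail -n1)   # last line: the status code
body=$(echo "$response" | sed '$d')        # everything except the last line

echo "code=$http_code body=$body"
```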


@@ -1,133 +0,0 @@
name: Deploy to cloud frontend stage
on:
workflow_dispatch:
inputs:
branch:
description: 'Git branch to deploy (must start with "lts-", e.g., lts-3.6)'
required: true
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: 📥 Manual Git checkout with submodules
run: |
set -e
BRANCH="${{ github.event.inputs.branch }}"
REPO="https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/${{ github.repository }}"
git config --global url."https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/".insteadOf "https://github.com/"
git config --global http.version HTTP/1.1
git config --global http.postBuffer 524288000
echo "👉 Cloning $REPO (branch: $BRANCH)"
git clone --recurse-submodules --depth=1 --branch "$BRANCH" "$REPO" repo
cd repo
echo "🔁 Updating submodules"
git submodule update --init --recursive
echo "🔀 Attempting to checkout '$BRANCH' in each submodule and validating"
BRANCH="$BRANCH" git submodule foreach --recursive bash -c '
name="$sm_path"
echo ""
echo "Entering '\''$name'\''"
echo "↪ $name: trying to checkout branch '\''$BRANCH'\''"
if git ls-remote --exit-code --heads origin "$BRANCH" >/dev/null; then
git fetch origin "$BRANCH:$BRANCH" || {
echo "❌ $name: fetch failed for $BRANCH"
exit 1
}
PREV=$(git rev-parse --short HEAD || echo "unknown")
git checkout "$BRANCH" || {
echo "❌ $name: checkout failed for $BRANCH"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: checked out branch $BRANCH"
else
echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'main'"
PREV=$(git rev-parse --short HEAD || echo "unknown")
git checkout main && git pull origin main || {
echo "❌ $name: fallback to main failed"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: now on branch main"
fi
CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "🔎 $name: current branch = $CURRENT"
if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "main" ]; then
echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'main', got '$CURRENT'"
exit 1
fi
'
- name: 🧰 Setup Node.js
uses: actions/setup-node@v2
with:
node-version: 22.15.1
- name: 📦 Install dependencies
run: npm install
working-directory: repo
- name: 🛠️ Build the project
run: npm run build:plugins:prod && npm run build:frontend
working-directory: repo
env:
GOOGLE_MAPS_API_KEY: ${{ secrets.CLOUD_GOOGLE_MAPS_API_KEY }}
NODE_ENV: ${{ secrets.CLOUD_NODE_ENV }}
NODE_OPTIONS: ${{ secrets.CLOUD_NODE_OPTIONS }}
SENTRY_AUTH_TOKEN: ${{ secrets.CLOUD_SENTRY_AUTH_TOKEN }}
SENTRY_ORG: ${{ secrets.CLOUD_SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.CLOUD_SENTRY_PROJECT }}
SERVE_CLIENT: ${{ secrets.CLOUD_SERVE_CLIENT }}
SERVER_IP: ${{ secrets.CLOUD_SERVER_IP }}
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_TOOLJET_SERVER_URL }}
TOOLJET_EDITION: cloud
WEBSITE_SIGNUP_URL: https://website-stage.tooljet.ai/ai-create-account
- name: 🚀 Deploy to Netlify
run: |
npm install -g netlify-cli
netlify deploy --prod --dir=frontend/build --auth=$NETLIFY_AUTH_TOKEN --site=${{ secrets.CLOUD_NETLIFY_SITE_ID }}
working-directory: repo
env:
NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}
GOOGLE_MAPS_API_KEY: ${{ secrets.CLOUD_GOOGLE_MAPS_API_KEY }}
NODE_ENV: ${{ secrets.CLOUD_NODE_ENV }}
NODE_OPTIONS: ${{ secrets.CLOUD_NODE_OPTIONS }}
SENTRY_AUTH_TOKEN: ${{ secrets.CLOUD_SENTRY_AUTH_TOKEN }}
SENTRY_ORG: ${{ secrets.CLOUD_SENTRY_ORG }}
SENTRY_PROJECT: ${{ secrets.CLOUD_SENTRY_PROJECT }}
SERVE_CLIENT: ${{ secrets.CLOUD_SERVE_CLIENT }}
SERVER_IP: ${{ secrets.CLOUD_SERVER_IP }}
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_TOOLJET_SERVER_URL }}
WEBSITE_SIGNUP_URL: https://website-stage.tooljet.ai/ai-create-account
TOOLJET_EDITION: cloud


@@ -42,6 +42,11 @@ jobs:
git clone --recurse-submodules --depth=1 --branch "$BRANCH" "$REPO" repo
cd repo
echo "🔎 Main repo: verifying checkout"
MAIN_CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "✅ Main repo: successfully checked out branch $MAIN_CURRENT"
echo "📍 Main repo: current commit $(git rev-parse --short HEAD): $(git log -1 --pretty=%s)"
echo "🔁 Updating submodules"
git submodule update --init --recursive
@@ -68,20 +73,24 @@ jobs:
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: checked out branch $BRANCH"
else
echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'main'"
echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'lts-3.16'"
PREV=$(git rev-parse --short HEAD || echo "unknown")
git checkout main && git pull origin main || {
echo "❌ $name: fallback to main failed"
git fetch origin lts-3.16:lts-3.16 || {
echo "❌ $name: fetch failed for lts-3.16"
exit 1
}
git checkout lts-3.16 || {
echo "❌ $name: fallback to lts-3.16 failed"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: now on branch main"
echo "✅ $name: now on branch lts-3.16"
fi
CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "🔎 $name: current branch = $CURRENT"
if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "main" ]; then
echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'main', got '$CURRENT'"
if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "lts-3.16" ]; then
echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'lts-3.16', got '$CURRENT'"
exit 1
fi
'
@@ -109,7 +118,7 @@ jobs:
SERVER_IP: ${{ secrets.CLOUD_PROD_CLOUD_SERVER_IP }}
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_TOOLJET_SERVER_URL }}
WEBSITE_SIGNUP_URL: https://tooljet.ai/ai-create-account
WEBSITE_SIGNUP_URL: https://www.tooljet.com/create-account
TOOLJET_EDITION: cloud
- name: 🚀 Deploy to Netlify
@@ -129,5 +138,52 @@ jobs:
SERVER_IP: ${{ secrets.CLOUD_PROD_CLOUD_SERVER_IP }}
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_PROD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_PROD_TOOLJET_SERVER_URL }}
WEBSITE_SIGNUP_URL: https://tooljet.ai/ai-create-account
WEBSITE_SIGNUP_URL: https://www.tooljet.com/create-account
TOOLJET_EDITION: cloud
Purge_Cloudflare_Cache:
needs: deploy
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1=${{ secrets.ALLOWED_USER1_USERNAME }}
allowed_user2=${{ secrets.ALLOWED_USER2_USERNAME }}
allowed_user3=${{ secrets.ALLOWED_USER3_USERNAME }}
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: 🧹 Purge Cloudflare Cache
continue-on-error: true
run: |
echo "🔄 Purging Cloudflare cache for specific URLs..."
response=$(curl -s -w "\n%{http_code}" -X POST \
"https://api.cloudflare.com/client/v4/zones/${{ secrets.CLOUDFLARE_ZONE_ID_PROD }}/purge_cache" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_API_TOKEN_PROD }}" \
-H "Content-Type: application/json" \
--data '{
"files": [
"${{ secrets.CLOUDFLARE_CONFIG_URL_PROD }}",
"${{ secrets.CLOUDFLARE_METADATA_URL_PROD }}"
]
}')
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')
if [ "$http_code" = "200" ]; then
echo "✅ Cloudflare cache purged successfully for specified URLs"
echo "$body"
else
echo "⚠️ Cloudflare cache purge failed with status code: $http_code"
echo "$body"
exit 1
fi
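The submodule loop's branch selection (use the requested branch when it exists on origin, otherwise fall back to the pinned `lts-3.16`) can be sketched as a pure function. Here the remote branch list is passed in explicitly as a stand-in for the workflow's `git ls-remote` check:

```shell
# Decide which branch a submodule should end up on, mirroring the
# workflow's ls-remote check. `available` stands in for the remote's
# branch list (an assumption made for this sketch).
pick_branch() {
  local requested="$1" available="$2"
  case " $available " in
    *" $requested "*) echo "$requested" ;;
    *)                echo "lts-3.16" ;;    # pinned fallback from the workflow
  esac
}

pick_branch "lts-3.6" "main lts-3.6 lts-3.16"   # branch exists: use it
pick_branch "lts-9.9" "main lts-3.6 lts-3.16"   # missing: fall back
```

Keeping the fallback pinned (rather than `main`, as the older workflow did) means a typo in the input degrades to a known LTS state instead of an unreleased branch.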


@@ -1,13 +1,13 @@
name: Cypress Code-Coverage
on:
pull_request_target:
types: [labeled, unlabeled, closed]
pull_request:
types: [labeled, unlabeled, synchronize, closed]
workflow_dispatch:
env:
PR_NUMBER: ${{ github.event.number }}
PR_NUMBER: ${{ github.event.pull_request.number }}
BRANCH_NAME: ${{ github.head_ref || github.ref_name }}
jobs:
@@ -15,26 +15,34 @@ jobs:
name: Code coverage
runs-on: ubuntu-22.04
if: ${{ github.event.action == 'labeled' && (github.event.label.name == 'check-coverage') }}
if: >-
(
(github.event.action == 'labeled' && github.event.label.name == 'check-coverage')
|| (github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'check-coverage'))
)
&& github.event.pull_request.head.repo.full_name == github.repository
steps:
- name: Setup Node.js
uses: actions/setup-node@v2
uses: actions/setup-node@v4
with:
node-version: 18.18.2
- name: Set up Docker
uses: docker-practice/actions-setup-docker@master
- name: Run PosgtreSQL Database Docker Container
- name: Run PostgreSQL Database Docker Container
run: |
sudo docker network create tooljet
sudo docker run -d --name postgres --network tooljet -p 5432:5432 -e POSTGRES_PASSWORD=postgres -e POSTGRES_USER=postgres -e POSTGRES_PORT=5432 -d postgres:13
sudo docker run -d --name postgres --network tooljet \
-p 5432:5432 \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_USER=postgres \
-e POSTGRES_PORT=5432 \
postgres:13
- name: Checkout
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.ref }}
uses: actions/checkout@v4
- name: Install and build dependencies
run: |
@@ -61,7 +69,7 @@ jobs:
echo "TOOLJET_DB_PASS=postgres" >> .env
echo "PGRST_JWT_SECRET=r9iMKoe5CRMgvJBBtp4HrqN7QiPpUToj" >> .env
echo "PGRST_HOST=localhost:3001" >> .env
echo "NODE_ENV=developement" >> .env
echo "NODE_ENV=development" >> .env
- name: Set up database
run: |
@@ -69,14 +77,16 @@ jobs:
npm run --prefix server db:reset
npm run --prefix server db:seed
- name: sleep 5
- name: Wait for database seed to settle
run: sleep 5
- name: Run PostgREST Docker Container
run: |
sudo docker run -d --name postgrest --network tooljet -p 3001:3000 \
-e PGRST_DB_URI="postgres://postgres:postgres@postgres:5432/tooljet" -e PGRST_DB_ANON_ROLE="postgres" -e PGRST_JWT_SECRET="r9iMKoe5CRMgvJBBtp4HrqN7QiPpUToj" \
postgrest/postgrest:v10.1.1.20221215
-e PGRST_DB_URI="postgres://postgres:postgres@postgres:5432/tooljet" \
-e PGRST_DB_ANON_ROLE="postgres" \
-e PGRST_JWT_SECRET="r9iMKoe5CRMgvJBBtp4HrqN7QiPpUToj" \
postgrest/postgrest:v10.1.1.20221215
- name: Run plugins compilation in watch mode
run: cd plugins && npm start &
@@ -94,7 +104,7 @@ jobs:
sleep 5
done'
- name: docker logs
- name: PostgREST logs
run: sudo docker logs postgrest
- name: Create Cypress environment file
@@ -107,25 +117,22 @@ jobs:
- name: Install Cypress
working-directory: ./cypress-tests
run: |
npm install
run: npm install
- name: Run Cypress tests
working-directory: ./cypress-tests
run: |
npm run cy:run
run: npm run cy:run
- name: Debugging
if: always()
run: |
ls -R cypress-tests
ls -R /home/runner/work/ToolJet/ToolJet/cypress-tests
cat /home/runner/work/ToolJet/ToolJet/cypress-tests/.nyc_output/out.json
- name: Upload Coverage Report
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
if: always()
with:
name: coverage
path: cypress-tests/coverage
path: cypress-tests/coverage
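The `timeout 1500 bash -c 'until curl ...'` readiness wait used above can be factored into a reusable helper. `wait_for_url` is our name for this sketch, and the `file://` URL in the usage line is a network-free stand-in for the real `http://localhost:3000` target:

```shell
# Poll a URL until it answers or a deadline passes, like the workflow's
# server-readiness loop (5-second interval, overall timeout in seconds).
wait_for_url() {
  local url="$1" timeout="$2" elapsed=0
  until curl --silent --fail "$url" >/dev/null 2>&1; do
    elapsed=$((elapsed + 5))
    [ "$elapsed" -ge "$timeout" ] && return 1   # deadline exceeded
    sleep 5
  done
}

wait_for_url "file:///etc/hostname" 15 && echo "ready"
```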


@@ -6,51 +6,65 @@ on:
workflow_dispatch:
permissions:
contents: read
pull-requests: write
issues: write
env:
PR_NUMBER: ${{ github.event.number }}
BRANCH_NAME: ${{ github.head_ref || github.ref_name }}
TIMESTAMP: ${{ github.run_number }}-${{ github.run_attempt }}
jobs:
Cypress-Marketplace:
runs-on: ubuntu-22.04
if: contains(github.event.pull_request.labels.*.name, 'run-cypress') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ce') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ee') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-ce')
if: contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ee')
strategy:
fail-fast: false
matrix:
edition: >-
${{
contains(github.event.pull_request.labels.*.name, 'run-cypress') && fromJson('["ce", "ee"]') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-ce') && fromJson('["ce"]') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ce') && fromJson('["ce"]') ||
contains(github.event.pull_request.labels.*.name, 'run-cypress-marketplace-ee') && fromJson('["ee"]') ||
fromJson('[]')
}}
edition:
- ee
steps:
- name: Debug labels and matrix edition
run: |
echo "Labels: ${{ toJSON(github.event.pull_request.labels.*.name) }}"
echo "Matrix edition: ${{ matrix.edition }}"
- name: Free up disk space
run: |
echo "Available disk space before cleanup:"
df -h
# Remove unnecessary packages
sudo apt-get remove -y '^aspnetcore-.*' '^dotnet-.*' '^llvm-.*' '^php.*' '^mongodb-.*' '^mysql-.*' azure-cli google-cloud-sdk hhvm firefox powershell mono-devel || true
sudo apt-get autoremove -y
sudo apt-get clean
# Remove large directories
sudo rm -rf /usr/share/dotnet
sudo rm -rf /usr/local/lib/android
sudo rm -rf /opt/ghc
sudo rm -rf /usr/local/share/boost
sudo rm -rf "$AGENT_TOOLSDIRECTORY"
# Clean Docker
docker system prune -af --volumes
echo "Available disk space after cleanup:"
df -h
- name: Checkout
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.ref }}
# Create Docker Buildx builder with platform configuration
- name: Set up Docker Buildx
run: |
mkdir -p ~/.docker/cli-plugins
curl -SL https://github.com/docker/buildx/releases/download/v0.11.0/buildx-v0.11.0.linux-amd64 -o ~/.docker/cli-plugins/docker-buildx
chmod a+x ~/.docker/cli-plugins/docker-buildx
docker buildx create --name mybuilder --platform linux/arm64,linux/amd64
docker buildx use mybuilder
- name: Set DOCKER_CLI_EXPERIMENTAL
run: echo "DOCKER_CLI_EXPERIMENTAL=enabled" >> $GITHUB_ENV
- name: use mybuilder buildx
run: docker buildx use mybuilder
- name: Docker Login
uses: docker/login-action@v2
with:
@@ -60,34 +74,37 @@ jobs:
- name: Set SAFE_BRANCH_NAME
run: echo "SAFE_BRANCH_NAME=$(echo ${{ env.BRANCH_NAME }} | tr '/' '-')" >> $GITHUB_ENV
- name: Build CE Docker image
uses: docker/build-push-action@v4
with:
context: .
file: docker/ce-production.Dockerfile
push: true
tags: tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ce
platforms: linux/amd64
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
- name: Log selected matrix
run: |
echo "Selected edition: ${{ matrix.edition }}"
echo "Matrix: ${{ toJSON(matrix) }}"
- name: Build EE Docker image
if: matrix.edition == 'ee'
uses: docker/build-push-action@v4
with:
context: .
file: docker/ee/ee-production.Dockerfile
push: true
tags: tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ee
platforms: linux/amd64
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
run: |
echo "Building EE Docker image..."
docker buildx build \
--platform=linux/amd64 \
-f cypress-tests/cypress-lts.Dockerfile \
--build-arg CUSTOM_GITHUB_TOKEN=${{ secrets.CUSTOM_GITHUB_TOKEN }} \
--build-arg BRANCH_NAME=${{ github.event.pull_request.head.ref }} \
-t tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ee \
--no-cache \
--load \
.
echo "Pushing EE Docker image..."
docker push tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ee
echo "Cleaning up build cache..."
docker builder prune -af
echo "Disk space after build:"
df -h
- name: Set up environment variables
run: |
echo "TOOLJET_EDITION=${{ matrix.edition == 'ee' && 'EE' || 'CE' }}" >> .env
echo "TOOLJET_EDITION=${{ matrix.edition }}" >> .env
echo "TOOLJET_HOST=http://localhost:3000" >> .env
echo "LOCKBOX_MASTER_KEY=cd97331a419c09387bef49787f7da8d2a81d30733f0de6bed23ad8356d2068b2" >> .env
echo "SECRET_KEY_BASE=7073b9a35a15dd20914ae17e36a693093f25b74b96517a5fec461fc901c51e011cd142c731bee48c5081ec8bac321c1f259ef097ef2a16f25df17a3798c03426" >> .env
@@ -97,37 +114,61 @@ jobs:
echo "PG_PASS=postgres" >> .env
echo "PG_PORT=5432" >> .env
echo "ENABLE_TOOLJET_DB=true" >> .env
echo "PGRST_DB_PRE_CONFIG=postgrest.pre_config" >> .env
echo "TOOLJET_DB=tooljet_db" >> .env
echo "TOOLJET_DB_USER=postgres" >> .env
echo "TOOLJET_DB_HOST=postgres" >> .env
echo "TOOLJET_DB_PASS=postgres" >> .env
echo "TOOLJET_DB_STATEMENT_TIMEOUT=60000" >> .env
echo "TOOLJET_DB_RECONFIG=true" >> .env
echo "PGRST_JWT_SECRET=r9iMKoe5CRMgvJBBtp4HrqN7QiPpUToj" >> .env
echo "PGRST_HOST=postgrest" >> .env
echo "PGRST_HOST=localhost:3001" >> .env
echo "PGRST_DB_PRE_CONFIG=postgrest.pre_config" >> .env
echo "PGRST_DB_URI=postgres://postgres:postgres@postgres/tooljet_db" >> .env
echo "SSO_GIT_OAUTH2_CLIENT_ID=dummy" >> .env
echo "SSO_GIT_OAUTH2_CLIENT_SECRET=dummy" >> .env
echo "SSO_GIT_OAUTH2_HOST=dummy" >> .env
echo "SSO_GOOGLE_OAUTH2_CLIENT_ID=dummy" >> .env
echo "PGRST_SERVER_PORT=3001" >> .env
echo "ENABLE_MARKETPLACE_FEATURE=true" >> .env
echo "ENABLE_MARKETPLACE_DEV_MODE=true" >> .env
echo "ENABLE_PRIVATE_APP_EMBED=true" >> .env
echo "SSO_GOOGLE_OAUTH2_CLIENT_ID=123456789.apps.googleusercontent.com" >> .env
echo "SSO_GOOGLE_OAUTH2_CLIENT_SECRET=ABCGFDNF-FHSDVFY-bskfh6234" >> .env
echo "SSO_GIT_OAUTH2_CLIENT_ID=1234567890" >> .env
echo "SSO_GIT_OAUTH2_CLIENT_SECRET=3346shfvkdjjsfkvxce32854e026a4531ed" >> .env
echo "SSO_OPENID_NAME=tj-oidc-simulator" >> .env
echo "SSO_OPENID_CLIENT_ID=${{ secrets.SSO_OPENID_CLIENT_ID }}" >> .env
echo "SSO_OPENID_CLIENT_SECRET=${{ secrets.SSO_OPENID_CLIENT_SECRET }}" >> .env
echo "ENABLE_EXTERNAL_API=true" >> .env
echo "EXTERNAL_API_ACCESS_TOKEN=d980eb3af24d783991cee51a2d84dce9f9bd41d4b46f441cc691ccebbecd3cbc" >> .env
echo "TOOLJET_GLOBAL_CONSTANTS__development='{\"envConstant\":\"globalUI\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/development\",\"headerValue\":\"key=value\"}'" >> .env
echo "TOOLJET_SECRET_CONSTANTS__development='{\"envSecret\":\"secret\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/development\",\"headerValue\":\"key=value\"}'" >> .env
echo "TOOLJET_GLOBAL_CONSTANTS__staging='{\"envConstant\":\"globalUI\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/staging\",\"headerValue\":\"key=value\"}'" >> .env
echo "TOOLJET_SECRET_CONSTANTS__staging='{\"envSecret\":\"secret\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/staging\",\"headerValue\":\"key=value\"}'" >> .env
echo "TOOLJET_GLOBAL_CONSTANTS__production='{\"envConstant\":\"globalUI\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/production\",\"headerValue\":\"key=value\"}'" >> .env
echo "TOOLJET_SECRET_CONSTANTS__production='{\"envSecret\":\"secret\",\"headerKey\":\"customHeader\",\"ui_url\":\"http://20.29.40.108:4000/production\",\"headerValue\":\"key=value\"}'" >> .env
echo "SAML_SET_ENTITY_ID_REDIRECT_URL=true" >> .env
- name: clean up old docker containers
run: |
docker system prune -af --volumes
echo "Disk space after Docker cleanup:"
df -h
- name: Pulling the docker-compose file
run: curl -LO https://tooljet-test.s3.us-west-1.amazonaws.com/docker-compose.yaml && mkdir postgres_data
- name: Update docker-compose file for CE
- name: Update docker-compose file
run: |
# Update docker-compose.yaml with the new image
sed -i '/^[[:space:]]*tooljet:/,/^$/ s|^\([[:space:]]*image:[[:space:]]*\).*|\1tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ce|' docker-compose.yaml
# Update docker-compose.yaml with the appropriate image based on edition
if [ "${{ matrix.edition }}" = "ce" ]; then
sed -i '/^[[:space:]]*tooljet:/,/^$/ s|^\([[:space:]]*image:[[:space:]]*\).*|\1tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ce|' docker-compose.yaml
elif [ "${{ matrix.edition }}" = "ee" ]; then
sed -i '/^[[:space:]]*tooljet:/,/^$/ s|^\([[:space:]]*image:[[:space:]]*\).*|\1tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ee|' docker-compose.yaml
fi
- name: Update docker-compose file for EE
if: matrix.edition == 'ee'
run: |
# Update docker-compose.yaml with the new image
sed -i '/^[[:space:]]*tooljet:/,/^$/ s|^\([[:space:]]*image:[[:space:]]*\).*|\1tooljet/tj-osv:${{ env.SAFE_BRANCH_NAME }}-ee|' docker-compose.yaml
- name: view docker-compose file
run: cat docker-compose.yaml
- name: Install Docker Compose
run: |
curl -L "https://github.com/docker/compose/releases/download/v2.10.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
curl -L "https://github.com/docker/compose/releases/download/v2.27.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
- name: Run docker-compose file
@@ -136,31 +177,216 @@ jobs:
- name: Checking containers
run: docker ps -a
- name: sleep
run: sleep 80
- name: docker logs
run: docker-compose logs tooljet
- name: Wait for the server to be ready
run: |
echo "Waiting for ToolJet to start (timeout: 700 seconds)..."
SUCCESS_FOUND=false
TIMEOUT=700
ELAPSED=0
while [ $ELAPSED -lt $TIMEOUT ]; do
# Check for success message in logs
if docker-compose logs tooljet 2>/dev/null | grep -qE "TOOLJET APPLICATION STARTED SUCCESSFULLY|Ready to use at http://localhost:82|Ready to use at http://localhost:80"; then
echo "Found success message in logs!"
SUCCESS_FOUND=true
break
fi
echo "Still waiting... (${ELAPSED}s elapsed)"
sleep 10
ELAPSED=$((ELAPSED + 10))
done
if [ "$SUCCESS_FOUND" = false ]; then
echo "Timeout reached without finding success logs"
echo "Showing current logs for troubleshooting..."
docker-compose logs --tail=100 tooljet
exit 1
fi
echo "Server is ready!"
- name: Test database connection
run: |
# Wait for database to be ready
echo "Testing database connection..."
docker-compose exec -T postgres psql -U postgres -d tooljet_development -c "SELECT current_database();"
- name: Create delete_user procedure
run: |
echo "Creating delete_users stored procedure..."
docker-compose exec -T postgres psql -U postgres -d tooljet_development -c "
CREATE OR REPLACE PROCEDURE delete_users(p_emails TEXT[])
LANGUAGE plpgsql
AS \$\$
DECLARE
v_email TEXT;
v_user_id UUID;
v_organization_ids UUID[] := ARRAY[]::UUID[];
v_organizations_to_delete UUID[] := ARRAY[]::UUID[];
v_log_message TEXT;
BEGIN
IF COALESCE(array_length(p_emails, 1), 0) = 0 THEN
RAISE NOTICE 'delete_users: no emails provided';
RETURN;
END IF;
FOREACH v_email IN ARRAY p_emails LOOP
BEGIN
RAISE NOTICE '========================================';
RAISE NOTICE 'Starting user deletion for email: %', v_email;
-- Fetch user id
SELECT id INTO v_user_id
FROM users
WHERE email = v_email;
IF v_user_id IS NULL THEN
RAISE NOTICE 'User with email % not found. Skipping.', v_email;
CONTINUE;
END IF;
RAISE NOTICE 'User found with id: %', v_user_id;
-- Collect organization memberships
SELECT COALESCE(ARRAY_AGG(organization_id), ARRAY[]::UUID[])
INTO v_organization_ids
FROM organization_users
WHERE user_id = v_user_id;
RAISE NOTICE 'Found % organizations for user',
COALESCE(array_length(v_organization_ids, 1), 0);
-- Find organizations with that single user
IF array_length(v_organization_ids, 1) > 0 THEN
SELECT COALESCE(ARRAY_AGG(organization_id), ARRAY[]::UUID[])
INTO v_organizations_to_delete
FROM (
SELECT organization_id
FROM organization_users
WHERE organization_id = ANY(v_organization_ids)
GROUP BY organization_id
HAVING COUNT(*) = 1
) subquery;
ELSE
v_organizations_to_delete := ARRAY[]::UUID[];
END IF;
RAISE NOTICE 'Found % organizations to delete',
COALESCE(array_length(v_organizations_to_delete, 1), 0);
-- Cascade delete records for orgs slated for removal
IF array_length(v_organizations_to_delete, 1) > 0 THEN
WITH deleted_apps AS (
DELETE FROM apps
WHERE organization_id = ANY(v_organizations_to_delete)
RETURNING id
)
SELECT 'Deleted ' || COUNT(*) || ' apps'
INTO v_log_message FROM deleted_apps;
RAISE NOTICE '%', v_log_message;
WITH deleted_data_sources AS (
DELETE FROM data_sources
WHERE organization_id = ANY(v_organizations_to_delete)
RETURNING id
)
SELECT 'Deleted ' || COUNT(*) || ' data sources'
INTO v_log_message FROM deleted_data_sources;
RAISE NOTICE '%', v_log_message;
WITH deleted_organizations AS (
DELETE FROM organizations
WHERE id = ANY(v_organizations_to_delete)
RETURNING id
)
SELECT 'Deleted ' || COUNT(*) || ' organizations'
INTO v_log_message FROM deleted_organizations;
RAISE NOTICE '%', v_log_message;
ELSE
RAISE NOTICE 'No organizations removed for user %', v_email;
END IF;
-- Delete audit logs for orgs (if any) and user
WITH deleted_audit_logs AS (
DELETE FROM audit_logs
WHERE user_id = v_user_id
OR organization_id = ANY(v_organizations_to_delete)
RETURNING id
)
SELECT 'Deleted ' || COUNT(*) || ' audit logs'
INTO v_log_message FROM deleted_audit_logs;
RAISE NOTICE '%', v_log_message;
-- Delete organization membership records
DELETE FROM organization_users
WHERE user_id = v_user_id;
-- Delete the user
DELETE FROM users
WHERE id = v_user_id;
RAISE NOTICE 'Deleted user with id: %', v_user_id;
RAISE NOTICE 'User deletion completed for email: %', v_email;
EXCEPTION
WHEN OTHERS THEN
RAISE NOTICE 'Error deleting user %: %', v_email, SQLERRM;
-- continue with next email
END;
END LOOP;
RAISE NOTICE '========================================';
RAISE NOTICE 'delete_users procedure finished.';
END;
\$\$;"
echo "delete_users procedure created successfully"
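With the procedure in place, cleanup between test suites reduces to a single `CALL` through the same compose service. A sketch under the container and database names used above; the email addresses are illustrative only:

```shell
# Hypothetical cleanup invocation between suites (emails are examples).
docker-compose exec -T postgres psql -U postgres -d tooljet_development \
  -c "CALL delete_users(ARRAY['user1@example.com', 'user2@example.com']);"
```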
- name: Seeding (Setup Super Admin)
run: |
curl --fail-with-body 'http://localhost:3000/api/onboarding/setup-super-admin' \
-H 'Content-Type: application/json' \
--data-raw '{
"companyName": "ToolJet",
"name": "The Developer",
"workspaceName": "My workspace",
"email": "dev@tooljet.io",
"password": "password"
}'
- name: Seeding (Authenticate)
run: |
AUTH_RESPONSE=$(curl --fail-with-body \
-c /tmp/tj_cookies.txt \
'http://localhost:3000/api/authenticate' \
-H 'Content-Type: application/json' \
--data-raw '{
"email": "dev@tooljet.io",
"password": "password"
}')
echo "$AUTH_RESPONSE"
# Extract org ID and export for the next step via GITHUB_ENV
ORG_ID=$(echo "$AUTH_RESPONSE" | jq -r '.current_organization_id')
echo "TJ_ORG_ID=$ORG_ID" >> $GITHUB_ENV
- name: Seeding (Complete Onboarding)
run: |
# Sets onboarding_status = onboarding_completed so the frontend does not
# redirect every page visit to /setup, which would break all UI tests.
# tj-workspace-id header is required by JwtStrategy to resolve the user.
curl --fail-with-body \
-b /tmp/tj_cookies.txt \
-X POST \
'http://localhost:3000/api/onboarding/finish' \
-H 'Content-Type: application/json' \
-H "tj-workspace-id: $TJ_ORG_ID" \
--data-raw '{"region": "us"}'
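The two seeding steps above share one contract: the authenticate response carries `current_organization_id`, which the onboarding call must echo back in the `tj-workspace-id` header. A network-free sketch of the extraction; the response body here is fabricated, and only the one field actually used is assumed to exist:

```shell
# Fabricated /api/authenticate response (shape assumed from the step above).
AUTH_RESPONSE='{"email":"dev@tooljet.io","current_organization_id":"0a1b2c3d-1111-2222-3333-444455556666"}'

# Same jq extraction the workflow performs before exporting TJ_ORG_ID.
ORG_ID=$(echo "$AUTH_RESPONSE" | jq -r '.current_organization_id')
echo "ORG_ID=$ORG_ID"
```

In the workflow the value then crosses steps via `$GITHUB_ENV`, since each `run` block is a fresh shell.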
- name: Create Cypress environment file
id: create-json
uses: jsdaniell/create-json@1.1.2
@@ -170,8 +396,9 @@ jobs:
dir: "./cypress-tests"
- name: Marketplace
uses: cypress-io/github-action@v6
with:
browser: chrome
working-directory: ./cypress-tests
config: "baseUrl=http://localhost:3000"
config-file: cypress-marketplace.config.js
@@ -180,7 +407,7 @@ jobs:
uses: actions/upload-artifact@v4
if: always()
with:
name: screenshots-${{ matrix.edition }}
path: cypress-tests/cypress/screenshots
Cypress-Marketplace-Subpath:
@@ -238,6 +465,7 @@ jobs:
echo "SUB_PATH=/apps/tooljet/" >> .env
echo "NODE_ENV=production" >> .env
echo "SERVE_CLIENT=true" >> .env
echo "LICENSE_KEY=${{ secrets.RENDER_LICENSE_KEY }}" >> .env
- name: Pulling the docker-compose file
run: curl -LO https://tooljet-test.s3.us-west-1.amazonaws.com/docker-compose.yaml && mkdir postgres_data
@@ -281,4 +509,4 @@ jobs:
if: always()
with:
name: screenshots
path: cypress-tests/cypress/screenshots



@@ -4,23 +4,23 @@ on:
workflow_dispatch:
inputs:
branch_name:
description: "Git branch to build from"
required: true
default: "lts-3.16"
dockerfile_path:
description: "Path to Dockerfile"
required: true
default: "./docker/LTS/cloud/cloud-server.Dockerfile"
type: choice
options:
- ./docker/LTS/cloud/cloud-server.Dockerfile
docker_tag:
description: "Docker tag suffix (e.g., cloud-staging-v14)"
required: true
jobs:
full-deploy:
name: Build Image, Deploy to AKS & Cloudflare
runs-on: ubuntu-latest
steps:
@@ -39,6 +49,16 @@ jobs:
echo "✅ User '${{ github.actor }}' is authorized."
fi
- name: Free up disk space
run: |
sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf /usr/local/share/boost
sudo rm -rf "$AGENT_TOOLSDIRECTORY"
sudo docker system prune -af
sudo apt-get clean
df -h
- name: Checkout Repo
uses: actions/checkout@v4
with:
@@ -82,7 +92,7 @@ jobs:
BRANCH_NAME=${{ github.event.inputs.branch_name }}
- name: Show the full Docker tag
run: |
echo "✅ Docker image tagged as: $IMAGE_TAG"
# Deploy to AKS
@@ -121,7 +131,7 @@ jobs:
exit 1
deploy-frontend:
name: Deploy Frontend to Cloudflare
runs-on: ubuntu-latest
needs: full-deploy
@@ -144,6 +154,7 @@ jobs:
- name: 📥 Manual Git checkout with submodules
run: |
set -e
BRANCH="${{ github.event.inputs.branch_name }}"
REPO="https://x-access-token:${{ secrets.CUSTOM_GITHUB_TOKEN }}@github.com/${{ github.repository }}"
@@ -155,6 +166,11 @@ jobs:
git clone --recurse-submodules --depth=1 --branch "$BRANCH" "$REPO" repo
cd repo
echo "🔎 Main repo: verifying checkout"
MAIN_CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "✅ Main repo: successfully checked out branch $MAIN_CURRENT"
echo "📍 Main repo: current commit $(git rev-parse --short HEAD): $(git log -1 --pretty=%s)"
echo "🔁 Updating submodules"
git submodule update --init --recursive
@@ -162,13 +178,44 @@ jobs:
BRANCH="$BRANCH" git submodule foreach --recursive bash -c '
name="$sm_path"
echo ""
echo "Entering '\''$name'\''"
echo "↪ $name: trying to checkout branch '\''$BRANCH'\''"
if git ls-remote --exit-code --heads origin "$BRANCH" >/dev/null; then
git fetch origin "$BRANCH:$BRANCH" || {
echo "❌ $name: fetch failed for $BRANCH"
exit 1
}
PREV=$(git rev-parse --short HEAD || echo "unknown")
git checkout "$BRANCH" || {
echo "❌ $name: checkout failed for $BRANCH"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: checked out branch $BRANCH"
else
echo "⚠️ $name: branch '$BRANCH' not found on origin. Falling back to 'lts-3.16'"
PREV=$(git rev-parse --short HEAD || echo "unknown")
git fetch origin lts-3.16:lts-3.16 || {
echo "❌ $name: fetch failed for lts-3.16"
exit 1
}
git checkout lts-3.16 || {
echo "❌ $name: fallback to lts-3.16 failed"
exit 1
}
echo "Previous HEAD position was $PREV: $(git log -1 --pretty=%s || echo 'unknown')"
echo "✅ $name: now on branch lts-3.16"
fi
CURRENT=$(git rev-parse --abbrev-ref HEAD)
echo "🔎 $name: current branch = $CURRENT"
if [ "$CURRENT" != "$BRANCH" ] && [ "$CURRENT" != "lts-3.16" ]; then
echo "❌ $name: unexpected branch state — wanted '$BRANCH' or fallback 'lts-3.16', got '$CURRENT'"
exit 1
fi
'
@@ -181,8 +228,8 @@ jobs:
run: npm install
working-directory: repo
- name: 🛠️ Build the project
run: npm run build:plugins:prod && npm run build:frontend:cloud
working-directory: repo
env:
GOOGLE_MAPS_API_KEY: ${{ secrets.CLOUD_GOOGLE_MAPS_API_KEY }}
@@ -196,24 +243,38 @@ jobs:
TJDB_SQL_MODE_DISABLE: ${{ secrets.CLOUD_TJDB_SQL_MODE_DISABLE }}
TOOLJET_SERVER_URL: ${{ secrets.CLOUD_TOOLJET_SERVER_URL }}
TOOLJET_EDITION: cloud
WEBSITE_SIGNUP_URL: https://website-stage.tooljet.ai/signup
- name: 📝 Add SPA routing redirect rule
run: echo "/* /index.html 200" > repo/frontend/build/_redirects
- name: 🔧 Set CF Pages production branch to input branch
run: |
echo "🔄 Updating CF Pages production branch to: ${{ github.event.inputs.branch_name }}"
response=$(curl -s -w "\n%{http_code}" -X PATCH \
"https://api.cloudflare.com/client/v4/accounts/${{ secrets.CF_PAGES_ACCOUNT_ID }}/pages/projects/${{ secrets.CF_PAGES_PROJECT_NAME }}" \
-H "Authorization: Bearer ${{ secrets.CF_PAGES_API_TOKEN }}" \
-H "Content-Type: application/json" \
--data '{"production_branch": "${{ github.event.inputs.branch_name }}"}')
http_code=$(echo "$response" | tail -n1)
if [ "$http_code" = "200" ]; then
echo "✅ Production branch updated to: ${{ github.event.inputs.branch_name }}"
else
echo "❌ Failed to update production branch (HTTP $http_code)"
echo "$response"
exit 1
fi
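The status check above relies on curl's `-w "\n%{http_code}"` convention: the HTTP status is appended as a final line after the response body, so `tail -n1` yields the code and deleting the last line recovers the body. A network-free sketch of that parsing, with the response string simulated:

```shell
# Simulated output of: curl -s -w "\n%{http_code}" ... (body, then status line).
response='{"success":true}
200'

http_code=$(echo "$response" | tail -n1)   # last line: the status code
body=$(echo "$response" | sed '$d')        # everything above it: the body

echo "status=$http_code"
echo "body=$body"
```

The same split works for multi-line JSON bodies, since only the final line is ever the status code.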
- name: 🚀 Deploy to Cloudflare Pages
run: |
echo "📦 Built from source branch: ${{ github.event.inputs.branch_name }}"
echo "🎯 Targeting CF Pages production slot (branch alias: ${{ github.event.inputs.branch_name }})"
npx wrangler pages deploy frontend/build \
--project-name=${{ secrets.CF_PAGES_PROJECT_NAME }} \
--branch=${{ github.event.inputs.branch_name }} \
--commit-dirty=true
working-directory: repo
env:
TOOLJET_EDITION: cloud
CLOUDFLARE_API_TOKEN: ${{ secrets.CF_PAGES_API_TOKEN }}
CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CF_PAGES_ACCOUNT_ID }}


@@ -4,34 +4,40 @@ on:
workflow_dispatch:
push:
branches:
- documentation
paths:
- docs/**
schedule:
- cron: '30 16 * * *' # 10:00 PM IST
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout documentation branch
uses: actions/checkout@v4
with:
ref: documentation
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 18.18.2
- name: Install dependencies
working-directory: docs
run: npm install
- name: Build the project
working-directory: docs
run: |
GTM=${{ secrets.GTM }} ALGOLIA_API_KEY=${{ secrets.ALGOLIA_API_KEY }} npm run build
- name: Deploy to Netlify
run: |
npm install -g netlify-cli
netlify deploy --prod --dir=docs/build --site=${{ secrets.NETLIFY_SITE_ID }}

.github/workflows/grype-slack-notify.yml

@@ -0,0 +1,194 @@
name: Grype - Docker Image Vulnerability Scan
on:
workflow_dispatch:
schedule:
- cron: "30 6 * * 1"
jobs:
PeriodicVulnerability-CheckOn-docker-image-lts:
if: github.event_name == 'schedule' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Free up disk space
run: |
echo "=== Disk space before cleanup ==="
df -h
sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf /usr/local/share/boost
sudo rm -rf "$AGENT_TOOLSDIRECTORY"
sudo docker system prune -af
sudo apt-get clean
echo "=== Disk space after cleanup ==="
df -h
- name: Pull ToolJet LTS Docker image
run: docker pull tooljet/tooljet:ee-lts-latest
- name: Grype Scan - Table Output (visible in logs)
uses: anchore/scan-action@v7
with:
image: 'tooljet/tooljet:ee-lts-latest'
fail-build: false
severity-cutoff: high
output-format: table
only-fixed: true
- name: Grype Scan - JSON Output (for report)
uses: anchore/scan-action@v7
with:
image: 'tooljet/tooljet:ee-lts-latest'
fail-build: false
severity-cutoff: high
output-format: json
output-file: grype-lts-results.json
only-fixed: true
- name: Parse Results
id: parse-grype
run: |
if [ -f grype-lts-results.json ]; then
critical=$(jq '[.matches[]? | select(.vulnerability.severity=="Critical")] | length' grype-lts-results.json)
high=$(jq '[.matches[]? | select(.vulnerability.severity=="High")] | length' grype-lts-results.json)
else
critical=0
high=0
fi
total=$((critical + high))
echo "critical=$critical" >> $GITHUB_OUTPUT
echo "high=$high" >> $GITHUB_OUTPUT
echo "total=$total" >> $GITHUB_OUTPUT
echo "=== Vulnerability Summary ==="
echo "Critical: $critical"
echo "High: $high"
echo "Total: $total"
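The jq filters above count matches by severity; the `.matches[]?` form tolerates a missing or null `matches` array instead of erroring. The same counting against a fabricated report — the JSON below is an assumed minimal subset of the real Grype output shape:

```shell
# Fabricated Grype-style report (minimal subset of the real schema).
cat > /tmp/grype-sample.json <<'EOF'
{"matches":[
  {"vulnerability":{"severity":"Critical"}},
  {"vulnerability":{"severity":"High"}},
  {"vulnerability":{"severity":"High"}},
  {"vulnerability":{"severity":"Low"}}
]}
EOF

# Same filters as the parse step: build an array per severity, take its length.
critical=$(jq '[.matches[]? | select(.vulnerability.severity=="Critical")] | length' /tmp/grype-sample.json)
high=$(jq '[.matches[]? | select(.vulnerability.severity=="High")] | length' /tmp/grype-sample.json)
echo "critical=$critical high=$high total=$((critical + high))"
```

Note the Low entry is deliberately excluded, mirroring the workflow's High-and-above cutoff.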
- name: Upload JSON Report as Artifact
if: always()
uses: actions/upload-artifact@v4
with:
name: grype-lts-scan-report
path: grype-lts-results.json
retention-days: 7
if-no-files-found: warn
- name: Determine notification color
id: determine-color
run: |
critical=${{ steps.parse-grype.outputs.critical }}
high=${{ steps.parse-grype.outputs.high }}
if [ "$critical" -gt 0 ]; then
echo "color=#FF0000" >> $GITHUB_OUTPUT
elif [ "$high" -gt 0 ]; then
echo "color=#FFA500" >> $GITHUB_OUTPUT
else
echo "color=#36A64F" >> $GITHUB_OUTPUT
fi
- name: Send Slack Notification
run: |
payload=$(cat <<EOF
{
"attachments": [
{
"color": "${{ steps.determine-color.outputs.color }}",
"blocks": [
{
"type": "header",
"text": {
"type": "plain_text",
"text": "🐳 Docker Image Vulnerability Scan Report",
"emoji": true
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "*Repository:*\n${{ github.repository }}"
},
{
"type": "mrkdwn",
"text": "*Image:*\ntooljet/tooljet:ee-lts-latest"
},
{
"type": "mrkdwn",
"text": "*Scanner:*\nGrype"
},
{
"type": "mrkdwn",
"text": "*Scan Time:*\n$(date -u +"%Y-%m-%d %H:%M UTC")"
}
]
},
{
"type": "divider"
},
{
"type": "section",
"text": {
"type": "mrkdwn",
"text": "*Docker Image Vulnerabilities (fixable only):*"
}
},
{
"type": "section",
"fields": [
{
"type": "mrkdwn",
"text": "🔴 *Critical:*\n${{ steps.parse-grype.outputs.critical }}"
},
{
"type": "mrkdwn",
"text": "🟠 *High:*\n${{ steps.parse-grype.outputs.high }}"
},
{
"type": "mrkdwn",
"text": "📊 *Total:*\n${{ steps.parse-grype.outputs.total }}"
}
]
},
{
"type": "divider"
},
{
"type": "actions",
"elements": [
{
"type": "button",
"text": {
"type": "plain_text",
"text": "📥 Download Full Report",
"emoji": true
},
"url": "https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}",
"style": "primary"
}
]
}
]
}
]
}
EOF
)
response=$(curl -s -w "%{http_code}" -X POST \
-H 'Content-type: application/json' \
--data "$payload" \
"${{ secrets.SLACK_WEBHOOK_URL_VUR }}")
http_code="${response: -3}"
if [ "$http_code" != "200" ]; then
echo "Slack notification failed with HTTP $http_code"
exit 1
fi
echo "Slack notification sent successfully"

.github/workflows/license-compliance.yml

@@ -0,0 +1,195 @@
name: License Compliance Check
on:
pull_request:
types: [opened, synchronize, reopened]
jobs:
license-check:
name: Check New Package Licenses
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: read
steps:
- name: Check licenses of new packages
uses: actions/github-script@v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const https = require('https');
// ── Fetch license from npm registry ──────────────────────────────
function fetchLicense(packageName) {
return new Promise((resolve) => {
const encoded = packageName.replace('/', '%2F');
const url = `https://registry.npmjs.org/${encoded}/latest`;
https.get(url, { headers: { 'User-Agent': 'tooljet-license-checker' } }, (res) => {
let data = '';
res.on('data', chunk => data += chunk);
res.on('end', () => {
try {
const json = JSON.parse(data);
resolve(json.license || 'UNKNOWN');
} catch {
resolve('UNKNOWN');
}
});
}).on('error', () => resolve('UNKNOWN'));
});
}
// ── License check — ONLY exact MIT or Apache-2.0 ─────────────────
// Dual licenses like "(MIT OR GPL-3.0-or-later)" are NOT permitted.
function isPermitted(license) {
if (!license || license === 'UNKNOWN') return false;
const l = license.trim();
return l === 'MIT' || l === 'Apache-2.0';
}
// ── Get PR diff files from GitHub API ─────────────────────────────
const prFiles = await github.rest.pulls.listFiles({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number,
per_page: 100,
});
const pkgFiles = prFiles.data.filter(f =>
f.filename.endsWith('package.json') &&
!f.filename.includes('node_modules')
);
if (pkgFiles.length === 0) {
console.log('No package.json files changed in this PR. Skipping.');
return;
}
console.log(`package.json files changed: ${pkgFiles.map(f => f.filename).join(', ')}`);
// ── Extract newly added packages from diff patch ──────────────────
function extractAddedPackages(patch) {
if (!patch) return [];
const packages = [];
for (const line of patch.split('\n')) {
if (!line.startsWith('+') || line.startsWith('+++')) continue;
const match = line.match(/^\+\s*"(@?[a-zA-Z0-9][\w\-\.\/]*)"\s*:\s*"\^?[\d~*]/);
if (match) {
packages.push(match[1]);
}
}
return packages;
}
// ── Main scan ─────────────────────────────────────────────────────
const violations = [];
const permitted = [];
for (const file of pkgFiles) {
console.log(`\n── Scanning: ${file.filename}`);
const addedPackages = extractAddedPackages(file.patch);
if (addedPackages.length === 0) {
console.log(' No new packages added.');
continue;
}
console.log(` New packages found: ${addedPackages.join(', ')}`);
for (const pkg of addedPackages) {
const license = await fetchLicense(pkg);
const ok = isPermitted(license);
if (ok) {
console.log(` [OK] ${pkg} — ${license}`);
permitted.push({ pkg, license, file: file.filename });
} else {
console.log(` [FAIL] ${pkg} — ${license}`);
violations.push({ pkg, license, file: file.filename });
}
}
}
console.log(`\n── Summary`);
console.log(` Permitted : ${permitted.length}`);
console.log(` Violations: ${violations.length}`);
// ── Delete previous bot comment if any ────────────────────────────
const comments = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
});
for (const comment of comments.data) {
if (comment.body.includes('<!-- license-compliance-bot -->')) {
await github.rest.issues.deleteComment({
owner: context.repo.owner,
repo: context.repo.repo,
comment_id: comment.id,
});
}
}
// ── Skip comment if nothing new was added ─────────────────────────
if (permitted.length === 0 && violations.length === 0) {
console.log('No new packages detected in diff. Skipping comment.');
return;
}
// ── Build and post comment ────────────────────────────────────────
let body = `<!-- license-compliance-bot -->\n`;
if (violations.length === 0) {
body += `## ✅ License Compliance Check Passed\n\n`;
body += `All new packages added in this PR use permitted licenses (MIT or Apache-2.0).\n\n`;
body += `| Package | License | File |\n|---|---|---|\n`;
body += permitted.map(p =>
`| \`${p.pkg}\` | \`${p.license}\` | \`${p.file}\` |`
).join('\n');
body += '\n';
} else {
body += `## ❌ License Compliance Check Failed\n\n`;
body += `This PR adds package(s) with licenses that are **not permitted**.\n`;
body += `Only \`MIT\` and \`Apache-2.0\` licenses are allowed.\n\n`;
body += `### 🚫 Not Permitted\n\n`;
body += `| Package | License | File |\n|---|---|---|\n`;
body += violations.map(v =>
`| \`${v.pkg}\` | \`${v.license}\` | \`${v.file}\` |`
).join('\n');
body += `\n\n`;
body += `> ❌ The package(s) above are not permitted. Please replace them with an equivalent that uses an MIT or Apache-2.0 license.\n`;
body += `> If this package genuinely needs to be exempted, a maintainer can bypass this check using the bypass rules option on this PR.\n\n`;
if (permitted.length > 0) {
body += `### ✅ Permitted Packages\n\n`;
body += `| Package | License | File |\n|---|---|---|\n`;
body += permitted.map(p =>
`| \`${p.pkg}\` | \`${p.license}\` | \`${p.file}\` |`
).join('\n');
body += '\n';
}
}
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.issue.number,
body,
});
if (violations.length > 0) {
core.setFailed(
`License check failed: ${violations.length} package(s) with non-permitted licenses. See PR comment for details.`
);
}


@@ -38,7 +38,7 @@ jobs:
- name: Setup Node.js
uses: actions/setup-node@v2
with:
node-version: 22.15.1
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
@@ -63,12 +63,11 @@ jobs:
npm run build --workspace=plugins/$plugin || exit 1
done
- name: Build marketplace plugins and capture summary
run: |
cd marketplace
echo "🚀 Uploading to S3"
AWS_BUCKET=tooljet-plugins-stage bash scripts/upload-to-s3.sh | tee upload_summary.log
- name: Extract upload summary
id: upload_summary
@@ -133,4 +132,4 @@ jobs:
owner: context.repo.owner,
repo: context.repo.repo,
body: `❌ Marketplace Plugin deployment failed.\n\n🔍 [View Deployment Logs & Summary](${runUrl})`
});


@@ -4,14 +4,19 @@ on:
workflow_dispatch:
inputs:
branch_name:
description: "Git branch to build from"
required: true
default: "main"
dockerfile_path:
description: "Path to Dockerfile"
required: true
type: choice
options:
- ./docker/LTS/ee/ee-production.Dockerfile
- ./docker/pre-release/ee/ee-production.Dockerfile
- ./docker/LTS/cloud/cloud-server.Dockerfile
docker_tag:
description: "Docker tag suffix (e.g., pre-release-14)"
required: true
jobs:
@@ -19,6 +24,16 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Free up disk space
run: |
sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf /usr/local/share/boost
sudo rm -rf "$AGENT_TOOLSDIRECTORY"
sudo docker system prune -af
sudo apt-get clean
df -h
- name: Checkout repo
uses: actions/checkout@v4
with:
@@ -32,7 +47,7 @@ jobs:
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PAT }}
- name: Generate full Docker tag
id: taggen
@@ -52,6 +67,8 @@ jobs:
push: true
tags: ${{ steps.taggen.outputs.tag }}
platforms: linux/amd64
cache-from: type=gha
cache-to: type=gha,mode=max
build-args: |
CUSTOM_GITHUB_TOKEN=${{ secrets.CUSTOM_GITHUB_TOKEN }}
BRANCH_NAME=${{ github.event.inputs.branch_name }}


@@ -0,0 +1,180 @@
name: Marketplace Plugin Production Deploy
on:
pull_request_target:
types: [labeled, unlabeled, closed]
workflow_dispatch:
inputs:
branch:
description: "Branch to deploy from"
required: true
default: "lts-3.16"
plugin_name:
description: "Plugin name to deploy individually (leave empty to deploy all)"
required: false
default: ""
env:
PR_NUMBER: ${{ github.event.number }}
BRANCH_NAME: ${{ github.event.inputs.branch || github.head_ref || github.ref_name }}
jobs:
deploy-marketplace-plugin-production:
if: |
github.event_name == 'workflow_dispatch' ||
(github.event.action == 'labeled' && github.event.label.name == 'deploy-marketplace-plugin-prod')
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_user1="${{ secrets.ALLOWED_USER1_USERNAME }}"
allowed_user2="${{ secrets.ALLOWED_USER2_USERNAME }}"
allowed_user3="${{ secrets.ALLOWED_USER3_USERNAME }}"
if [[ "${{ github.actor }}" != "$allowed_user1" && \
"${{ github.actor }}" != "$allowed_user2" && \
"${{ github.actor }}" != "$allowed_user3" ]]; then
echo "❌ User '${{ github.actor }}' is not authorized to trigger this workflow."
echo "Only the following users can deploy to production:"
echo " - $allowed_user1"
echo " - $allowed_user2"
echo " - $allowed_user3"
exit 1
else
echo "✅ User '${{ github.actor }}' is authorized to deploy to production."
fi
- name: Sync repo
uses: actions/checkout@v3
- name: Check if PR is from the same repo
if: github.event_name == 'pull_request_target'
id: check_repo
run: echo "is_fork=$(if [[ '${{ github.event.pull_request.head.repo.full_name }}' != '${{ github.event.pull_request.base.repo.full_name }}' ]]; then echo true; else echo false; fi)" >> $GITHUB_OUTPUT
- name: Fetch the remote branch if it's a forked PR
if: github.event_name == 'pull_request_target' && steps.check_repo.outputs.is_fork == 'true'
run: |
git fetch origin pull/${{ github.event.number }}/head:${{ env.BRANCH_NAME }}
git checkout ${{ env.BRANCH_NAME }}
- name: Checkout PR branch
if: github.event_name == 'pull_request_target' && steps.check_repo.outputs.is_fork == 'false'
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.ref }}
- name: Checkout specified branch
if: github.event_name == 'workflow_dispatch'
uses: actions/checkout@v3
with:
ref: ${{ github.event.inputs.branch }}
- name: Setup Node.js
uses: actions/setup-node@v2
with:
node-version: 22.15.1
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_PROD_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_PROD_ACCESS_KEY }}
aws-region: us-east-2
- name: Install and build dependencies in order
run: |
cd marketplace
echo "🔧 Installing all workspace dependencies"
npm install
echo "🏗️ Building 'common' plugin first"
npm run build --workspace=plugins/common || exit 1
echo "🔁 Building all remaining plugins"
PLUGINS=$(ls plugins | grep -v '^common$')
for plugin in $PLUGINS; do
echo "🔨 Building plugin: $plugin"
npm run build --workspace=plugins/$plugin || exit 1
done
- name: Build marketplace plugins and capture summary
run: |
cd marketplace
echo "🚀 Uploading to S3 Production"
AWS_BUCKET=${{ secrets.S3_BUCKET_PRODUCTION }} bash scripts/upload-to-s3.sh ${{ github.event.inputs.plugin_name }} | tee upload_summary.log
- name: Extract upload summary
id: upload_summary
run: |
SUMMARY=$(awk '/UPLOAD SUMMARY/,0' marketplace/upload_summary.log)
echo "UPLOAD_SUMMARY<<EOF" >> $GITHUB_ENV
echo "$SUMMARY" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV
- name: Output summary to logs (manual trigger)
if: success() && github.event_name == 'workflow_dispatch'
run: |
echo "========================================="
echo "PRODUCTION DEPLOYMENT SUMMARY"
echo "========================================="
echo "${{ env.UPLOAD_SUMMARY }}"
echo "========================================="
- name: Comment on success
if: success() && github.event_name == 'pull_request_target'
uses: actions/github-script@v5
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const runId = process.env.GITHUB_RUN_ID;
const runUrl = `https://github.com/${{ github.repository }}/actions/runs/${runId}`;
const summary = process.env.UPLOAD_SUMMARY;
const body = `✅ Marketplace Plugin deployed to **PRODUCTION** bucket\n\n🔍 [View Deployment Logs & Summary](${runUrl})\n\n\`\`\`\n${summary}\n\`\`\``;
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body
});
- name: Label update on success
if: success() && github.event_name == 'pull_request_target'
uses: actions/github-script@v6
with:
script: |
try {
await github.rest.issues.removeLabel({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'deploy-marketplace-plugin-prod'
})
} catch (e) {
console.log(e)
}
await github.rest.issues.addLabels({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['plugin-deployed-production']
})
- name: Comment on failure
if: failure() && github.event_name == 'pull_request_target'
uses: actions/github-script@v6
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
const runId = process.env.GITHUB_RUN_ID;
const runUrl = `https://github.com/${{ github.repository }}/actions/runs/${runId}`;
await github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: `❌ Marketplace Plugin deployment to **PRODUCTION** failed.\n\n🔍 [View Deployment Logs & Summary](${runUrl})`
});


@@ -7,20 +7,55 @@ on:
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
inputs:
branch:
description: "Branch to build from (e.g. lts-3.16)"
required: true
default: "lts-3.16"
version:
description: "RELEASE_VERSION"
required: true
region:
description: "AWS region to build AMI in (default: us-west-1)"
required: false
default: "us-west-1"
jobs:
check-version:
runs-on: ubuntu-latest
name: check-version
outputs:
should_build: ${{ steps.check.outputs.should_build }}
steps:
- name: Check if version is AMI-eligible (multiple of 10)
id: check
run: |
if [[ "${{ github.event_name }}" == "workflow_dispatch" ]]; then
echo "Manual dispatch — always build"
echo "should_build=true" >> $GITHUB_OUTPUT
else
TAG="${GITHUB_REF#refs/*/}"
# Extract patch number: v3.20.100-lts → 100
PATCH=$(echo "$TAG" | sed 's/^v//' | cut -d'.' -f3 | cut -d'-' -f1)
if (( PATCH % 10 == 0 )); then
echo "Version $TAG is AMI-eligible (patch $PATCH is multiple of 10)"
echo "should_build=true" >> $GITHUB_OUTPUT
else
echo "Skipping AMI build — $TAG patch $PATCH is not a multiple of 10"
echo "should_build=false" >> $GITHUB_OUTPUT
fi
fi
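The tag parsing above can be exercised outside the workflow. A minimal sketch of the same eligibility rule, using hypothetical release tags (the `eligible` helper is illustrative, not part of the workflow):

```shell
#!/usr/bin/env bash
# Returns "true" when the patch component of a tag is a multiple of 10,
# mirroring the AMI-eligibility check in the check-version job.
eligible() {
  local tag="$1" patch
  # v3.20.100-lts -> 3.20.100-lts -> 100-lts -> 100
  patch=$(echo "$tag" | sed 's/^v//' | cut -d'.' -f3 | cut -d'-' -f1)
  (( patch % 10 == 0 )) && echo true || echo false
}

echo "v3.20.100-lts -> $(eligible v3.20.100-lts)"
echo "v3.20.101-lts -> $(eligible v3.20.101-lts)"
```

Note that a `.0` patch (e.g. `v3.16.0-lts`) is also treated as eligible, since `0 % 10 == 0`.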
packer-ee:
needs: check-version
if: needs.check-version.outputs.should_build == 'true'
runs-on: ubuntu-latest
name: packer-ee
steps:
- name: Checkout code to lts-3.16 branch
if: contains(github.event.release.tag_name, '-lts')
- name: Checkout code
uses: actions/checkout@v2
with:
ref: refs/heads/lts-3.16
ref: refs/heads/${{ github.event.inputs.branch || 'lts-3.16' }}
- name: Setting tag
if: "${{ github.event.inputs.version != '' }}"
@@ -35,7 +70,7 @@ jobs:
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-west-1
aws-region: ${{ github.event.inputs.region || 'us-west-1' }}
# Initialize Packer templates
- name: Initialize Packer Template
@@ -61,26 +96,43 @@ jobs:
# Dynamically update setup_machine.sh with PAT
- name: Validate PAT
run: |
sed -i "s|git config --global url."https://x-access-token:CUSTOM_GITHUB_TOKEN@github.com/".insteadOf "https://github.com/"|git config --global url."https://x-access-token:${ secrets.CUSTOM_GITHUB_TOKEN }@github.com/".insteadOf "https://github.com/"|g" ./deploy/ec2/ee/setup_machine.sh
sed -i "s|CUSTOM_GITHUB_TOKEN|${{ secrets.CUSTOM_GITHUB_TOKEN }}|g" ./deploy/ec2/ee/setup_machine.sh
# build artifact
- name: Build Artifact
id: packer-build
uses: hashicorp/packer-github-actions@master
with:
command: build
# The argument below is specific to building the EE AMI image
arguments: -color=false -on-error=abort -var ami_name=tooljet_${{ env.RELEASE_VERSION }}.ubuntu_jammy
arguments: -color=false -on-error=abort -var ami_name=tooljet_${{ env.RELEASE_VERSION }}.ubuntu_jammy -var ami_region=${{ github.event.inputs.region || 'us-west-1' }}
target: .
working_directory: deploy/ec2/ee
env:
PACKER_LOG: 1
- name: Send Slack Notification
- name: Cleanup EC2 instances
if: always()
run: |
if [[ "${{ job.status }}" == "success" ]]; then
message="ToolJet enterprise AWS AMI published:\\n\`tooljet_${{ env.RELEASE_VERSION }}.ubuntu-jammy\`"
echo "Listing all EC2 instances..."
INSTANCE_IDS=$(aws ec2 describe-instances \
--region ${{ github.event.inputs.region || 'us-west-1' }} \
--query 'Reservations[*].Instances[*].InstanceId' \
--output text)
if [ -n "$INSTANCE_IDS" ] && [ "$INSTANCE_IDS" != "None" ]; then
echo "Found instances: $INSTANCE_IDS"
aws ec2 terminate-instances --region ${{ github.event.inputs.region || 'us-west-1' }} --instance-ids $INSTANCE_IDS
echo "Terminated instances: $INSTANCE_IDS"
else
message="ToolJet enterprise AWS AMI release failed! \\n\`tooljet_${{ env.RELEASE_VERSION }}.ubuntu-jammy\`"
echo "No instances found to cleanup"
fi
- name: Send Slack Notification
if: success()
run: |
ami_name="tooljet_${{ env.RELEASE_VERSION }}.ubuntu_jammy"
message="✅ *ToolJet Enterprise AWS AMI Published*\nVersion: \`${{ env.RELEASE_VERSION }}\`\nType: 🔒 LTS Release\nBranch: \`${{ github.event.inputs.branch || 'lts-3.16' }}\`\nRegion: \`${{ github.event.inputs.region || 'us-west-1' }}\`\nAMI Name: \`${ami_name}\`"
curl -X POST -H 'Content-type: application/json' --data "{\"text\":\"$message\"}" ${{ secrets.SLACK_WEBHOOK_URL }}
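The notification step interpolates the message straight into a JSON string, which breaks if the message ever contains quotes or backslashes. A hedged alternative sketch that builds the payload with jq (assuming `jq` is available on the runner, as it is elsewhere in these workflows):

```shell
#!/usr/bin/env bash
# Hypothetical message containing characters that would break naive interpolation.
message='AMI "tooljet_v3.20.100" published'

# jq -cn emits a compact JSON object with all escaping handled.
payload=$(jq -cn --arg text "$message" '{text: $text}')
echo "$payload"
```

The resulting `$payload` can then be passed to `curl --data "$payload"` in place of the hand-built string.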

File diff suppressed because it is too large


@@ -5,8 +5,8 @@ on:
issue_comment:
types: [created, edited, deleted]
env:
PR_NUMBER: ${{ github.event.number }}
BRANCH_NAME: ${{ github.head_ref || github.ref_name }}
PR_NUMBER: ${{ github.event.pull_request.number || github.event.issue.number }}
BRANCH_NAME: ${{ github.event.pull_request.head.ref || github.head_ref || github.ref_name }}
permissions:
pull-requests: write
@@ -15,11 +15,26 @@ permissions:
jobs:
# Community Edition CE
create-ce-review-app:
if: ${{ github.event.action == 'labeled' && (github.event.label.name == 'create-ce-review-app' || github.event.label.name == 'review-app') }}
create-ce-review-app-old:
if: ${{ github.event.action == 'labeled' && (github.event.label.name == 'create-ce-review-app-old' || github.event.label.name == 'review-app') }}
runs-on: ubuntu-latest
steps:
- name: Get PR details for issue_comment events
if: github.event_name == 'issue_comment'
uses: actions/github-script@v6
with:
script: |
const pr = await github.rest.pulls.get({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number
});
core.exportVariable('PR_NUMBER', pr.data.number);
core.exportVariable('BRANCH_NAME', pr.data.head.ref);
console.log(`✅ PR Number: ${pr.data.number}`);
console.log(`✅ Branch Name: ${pr.data.head.ref}`);
- name: Sync repo
uses: actions/checkout@v3
@@ -218,7 +233,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'create-ce-review-app'
name: 'create-ce-review-app-old'
})
} catch (e) {
console.log(e)
@@ -228,11 +243,11 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['active-ce-review-app']
labels: ['active-ce-review-app-old']
})
destroy-ce-review-app:
if: ${{ (github.event.action == 'labeled' && github.event.label.name == 'destroy-ce-review-app') || github.event.action == 'closed' }}
destroy-ce-review-app-old:
if: ${{ (github.event.action == 'labeled' && github.event.label.name == 'destroy-ce-review-app-old') || github.event.action == 'closed' }}
runs-on: ubuntu-latest
steps:
@@ -257,7 +272,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'destroy-ce-review-app'
name: 'destroy-ce-review-app-old'
})
} catch (e) {
console.log(e)
@@ -268,7 +283,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'suspend-ce-review-app'
name: 'suspend-ce-review-app-old'
})
} catch (e) {
console.log(e)
@@ -279,44 +294,15 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'active-ce-review-app'
name: 'active-ce-review-app-old'
})
} catch (e) {
console.log(e)
}
# - name: Install PostgreSQL client
# run: |
# sudo apt update
# sudo apt install postgresql-client -y
# - name: Wait after installing PostgreSQL
# run: sleep 25
# - name: Drop PostgreSQL PR databases
# env:
# PGHOST: ${{ secrets.RENDER_DS_PG_HOST }}
# PGPORT: 5432
# PGUSER: ${{ secrets.RENDER_DS_PG_USER }}
# PGDATABASE: ${{ env.PR_NUMBER }}-ce
# PGTJBDATABASE: ${{ env.PR_NUMBER }}-ce-tjdb
# run: |
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGDATABASE; then
# echo "Database $PGDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGDATABASE\" ;"
# else
# echo "Database $PGDATABASE does not exist."
# fi
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGTJBDATABASE; then
# echo "Database $PGTJBDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGTJBDATABASE\" ;"
# else
# echo "Database $PGTJBDATABASE does not exist."
# fi
suspend-ce-review-app:
if: ${{ github.event.action == 'labeled' && github.event.label.name == 'suspend-ce-review-app' }}
suspend-ce-review-app-old:
if: ${{ github.event.action == 'labeled' && github.event.label.name == 'suspend-ce-review-app-old' }}
runs-on: ubuntu-latest
steps:
@@ -341,14 +327,14 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'active-ce-review-app'
name: 'active-ce-review-app-old'
})
} catch (e) {
console.log(e)
}
resume-ce-review-app:
if: ${{ github.event.action == 'unlabeled' && github.event.label.name == 'suspend-ce-review-app' }}
resume-ce-review-app-old:
if: ${{ github.event.action == 'unlabeled' && github.event.label.name == 'suspend-ce-review-app-old' }}
runs-on: ubuntu-latest
steps:
@@ -372,7 +358,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['active-ce-review-app']
labels: ['active-ce-review-app-old']
})
try {
@@ -380,7 +366,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'suspend-ce-review-app'
name: 'suspend-ce-review-app-old'
})
} catch (e) {
console.log(e)
@@ -390,11 +376,28 @@ jobs:
# Enterprise Edition
create-ee-review-app:
if: ${{ github.event.action == 'labeled' && (github.event.label.name == 'create-ee-review-app' || github.event.label.name == 'review-app') }}
create-ee-review-app-old:
if: |
(github.event.action == 'labeled' && (github.event.label.name == 'create-ee-review-app-old' || github.event.label.name == 'create-ee-lts-review-app-old' || github.event.label.name == 'review-app-old')) ||
(github.event.action == 'created' && (contains(github.event.comment.body, '/deploy-ee') || contains(github.event.comment.body, '/deploy-ee-lts')))
runs-on: ubuntu-latest
steps:
- name: Get PR details for issue_comment events
if: github.event_name == 'issue_comment'
uses: actions/github-script@v6
with:
script: |
const pr = await github.rest.pulls.get({
owner: context.repo.owner,
repo: context.repo.repo,
pull_number: context.issue.number
});
core.exportVariable('PR_NUMBER', pr.data.number);
core.exportVariable('BRANCH_NAME', pr.data.head.ref);
console.log(`✅ PR Number: ${pr.data.number}`);
console.log(`✅ Branch Name: ${pr.data.head.ref}`);
- name: Sync repo
uses: actions/checkout@v3
@@ -426,25 +429,26 @@ jobs:
if: env.is_fork == 'false'
uses: actions/checkout@v3
- name: Detect base branch and set Dockerfile path
- name: Determine Dockerfile path
run: |
BASE_BRANCH="${{ github.event.pull_request.base.ref }}"
echo "Base branch: $BASE_BRANCH"
if [[ "$BASE_BRANCH" == "main" || "$BASE_BRANCH" == release/* ]]; then
DOCKERFILE="./docker/pre-release/ee/ee-preview.Dockerfile"
echo "Using pre-release track"
elif [[ "$BASE_BRANCH" == "lts-3.16" || "$BASE_BRANCH" == release-lts/* ]]; then
# Check if LTS deployment is requested via comment or label
if [[ "${{ github.event.action }}" == "labeled" && "${{ github.event.label.name }}" == "create-ee-lts-review-app-old" ]]; then
DOCKERFILE="./docker/LTS/ee/ee-preview.Dockerfile"
echo "Using LTS track"
EDITION_TYPE="LTS"
echo "Using LTS EE Dockerfile (triggered by label)"
elif [[ "${{ github.event.action }}" == "created" && "${{ github.event.comment.body }}" == *"/deploy-ee-lts"* ]]; then
DOCKERFILE="./docker/LTS/ee/ee-preview.Dockerfile"
EDITION_TYPE="LTS"
echo "Using LTS EE Dockerfile (triggered by comment)"
else
echo "Error: Unsupported base branch '$BASE_BRANCH'"
echo "Supported branches: main, release/*, lts-3.16, release-lts/*"
exit 1
DOCKERFILE="./docker/pre-release/ee/ee-preview.Dockerfile"
EDITION_TYPE="pre-release"
echo "Using pre-release EE Dockerfile"
fi
echo "Edition Type: $EDITION_TYPE"
echo "Selected Dockerfile: $DOCKERFILE"
echo "DOCKERFILE=$DOCKERFILE" >> $GITHUB_ENV
echo "EDITION_TYPE=$EDITION_TYPE" >> $GITHUB_ENV
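The label/comment branching above reduces to a small decision function. A sketch with hypothetical trigger values (`select_edition` is an illustrative name, not part of the workflow):

```shell
#!/usr/bin/env bash
# LTS is chosen when either the LTS label was applied or an issue comment
# contains /deploy-ee-lts; everything else falls through to pre-release.
select_edition() {
  local action="$1" label="$2" comment="$3"
  if [[ "$action" == "labeled" && "$label" == "create-ee-lts-review-app-old" ]]; then
    echo "LTS"
  elif [[ "$action" == "created" && "$comment" == *"/deploy-ee-lts"* ]]; then
    echo "LTS"
  else
    echo "pre-release"
  fi
}

select_edition labeled create-ee-lts-review-app-old ""
select_edition created "" "/deploy-ee please"
```

One subtlety carried over from the workflow trigger: a comment containing `/deploy-ee-lts` also contains `/deploy-ee` as a substring, so the LTS check must run first.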
- name: Creating deployment for Enterprise Edition
id: create-ee-deployment
@@ -561,23 +565,27 @@ jobs:
},
{
"key": "REDIS_HOST",
"value": "${{ secrets.RENDER_REDIS_HOST }}"
"value": "localhost"
},
{
"key": "REDIS_PORT",
"value": "${{ secrets.RENDER_REDIS_PORT }}"
"value": "6379"
},
{
"key": "TEMPORAL_SERVER_ADDRESS",
"value": "https://auto-setup-1-25-1.onrender.com"
"key": "REDIS_DB",
"value": "0"
},
{
"key": "TEMPORAL_TASK_QUEUE_NAME_FOR_WORKFLOWS",
"value": "tooljet-ee-pr-${{ env.PR_NUMBER }}"
"key": "REDIS_TLS_ENABLED",
"value": "false"
},
{
"key": "TOOLJET_WORKFLOWS_TEMPORAL_NAMESPACE",
"value": "default"
"key": "REDIS_PASSWORD",
"value": ""
},
{
"key": "WORKER",
"value": "true"
},
{
"key": "TOOLJET_MARKETPLACE_URL",
@@ -626,11 +634,12 @@ jobs:
with:
github-token: ${{secrets.GITHUB_TOKEN}}
script: |
const editionType = '${{ env.EDITION_TYPE }}' === 'LTS' ? '(LTS)' : '';
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: 'Enterpise Edition: \n Deployment: https://tooljet-ee-pr-${{ env.PR_NUMBER }}.onrender.com \n Dashboard: https://dashboard.render.com/web/${{ env.SERVICE_ID }}'
body: `Enterprise Edition ${editionType}: \n Deployment: https://tooljet-ee-pr-${{ env.PR_NUMBER }}.onrender.com \n Dashboard: https://dashboard.render.com/web/${{ env.SERVICE_ID }}`
})
- uses: actions/github-script@v6
@@ -641,7 +650,18 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'create-ee-review-app'
name: 'create-ee-review-app-old'
})
} catch (e) {
console.log(e)
}
try {
await github.rest.issues.removeLabel({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'create-ee-lts-review-app-old'
})
} catch (e) {
console.log(e)
@@ -651,11 +671,11 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['active-ee-review-app']
labels: ['active-ee-review-app-old']
})
destroy-ee-review-app:
if: ${{ (github.event.action == 'labeled' && github.event.label.name == 'destroy-ee-review-app') || github.event.action == 'closed' }}
destroy-ee-review-app-old:
if: ${{ (github.event.action == 'labeled' && github.event.label.name == 'destroy-ee-review-app-old') || github.event.action == 'closed' }}
runs-on: ubuntu-latest
steps:
@@ -680,7 +700,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'destroy-ee-review-app'
name: 'destroy-ee-review-app-old'
})
} catch (e) {
console.log(e)
@@ -691,7 +711,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'suspend-ee-review-app'
name: 'suspend-ee-review-app-old'
})
} catch (e) {
console.log(e)
@@ -702,44 +722,16 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'active-ee-review-app'
name: 'active-ee-review-app-old'
})
} catch (e) {
console.log(e)
}
# - name: Install PostgreSQL client
# run: |
# sudo apt update
# sudo apt install postgresql-client -y
# - name: Wait after installing PostgreSQL
# run: sleep 25
# - name: Drop PostgreSQL PR databases
# env:
# PGHOST: ${{ secrets.RENDER_DS_PG_HOST }}
# PGPORT: 5432
# PGUSER: ${{ secrets.RENDER_DS_PG_USER }}
# PGDATABASE: ${{ env.PR_NUMBER }}-ee
# PGTJBDATABASE: ${{ env.PR_NUMBER }}-ee-tjdb
# run: |
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGDATABASE; then
# echo "Database $PGDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGDATABASE\" ;"
# else
# echo "Database $PGDATABASE does not exist."
# fi
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGTJBDATABASE; then
# echo "Database $PGTJBDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGTJBDATABASE\" ;"
# else
# echo "Database $PGTJBDATABASE does not exist."
# fi
suspend-ee-review-app:
if: ${{ github.event.action == 'labeled' && github.event.label.name == 'suspend-ee-review-app' }}
suspend-ee-review-app-old:
if: ${{ github.event.action == 'labeled' && github.event.label.name == 'suspend-ee-review-app-old' }}
runs-on: ubuntu-latest
steps:
@@ -764,14 +756,14 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'active-ee-review-app'
name: 'active-ee-review-app-old'
})
} catch (e) {
console.log(e)
}
resume-ee-review-app:
if: ${{ github.event.action == 'unlabeled' && github.event.label.name == 'suspend-ee-review-app' }}
resume-ee-review-app-old:
if: ${{ github.event.action == 'unlabeled' && github.event.label.name == 'suspend-ee-review-app-old' }}
runs-on: ubuntu-latest
steps:
@@ -795,7 +787,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['active-ee-review-app']
labels: ['active-ee-review-app-old']
})
try {
@@ -803,7 +795,7 @@ jobs:
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
name: 'suspend-ee-review-app'
name: 'suspend-ee-review-app-old'
})
} catch (e) {
console.log(e)
@@ -811,369 +803,6 @@ jobs:
# Cloud Edition
# create-cloud-review-app:
# if: ${{ github.event.action == 'labeled' && (github.event.label.name == 'create-cloud-review-app' || github.event.label.name == 'review-app') }}
# runs-on: ubuntu-latest
# steps:
# - name: Creating deployment for Cloud Edition
# id: create-cloud-deployment
# run: |
# export RESPONSE=$(curl --request POST \
# --url https://api.render.com/v1/services \
# --header 'accept: application/json' \
# --header 'content-type: application/json' \
# --header 'Authorization: Bearer ${{ secrets.RENDER_API_KEY }}' \
# --data '
# {
# "autoDeploy": "yes",
# "branch": "${{ env.BRANCH_NAME }}",
# "name": "ToolJet Cloud PR #${{ env.PR_NUMBER }}",
# "notifyOnFail": "default",
# "ownerId": "tea-caeo4bj19n072h3dddc0",
# "repo": "https://github.com/ToolJet/ToolJet",
# "slug": "tooljet-cloud-pr-${{ env.PR_NUMBER }}",
# "suspended": "not_suspended",
# "suspenders": [],
# "type": "web_service",
# "envVars": [
# {
# "key": "PG_HOST",
# "value": "${{ secrets.RENDER_PG_HOST }}"
# },
# {
# "key": "PG_PORT",
# "value": "5432"
# },
# {
# "key": "PG_USER",
# "value": "${{ secrets.RENDER_PG_USER }}"
# },
# {
# "key": "PG_PASS",
# "value": "${{ secrets.RENDER_PG_PASS }}"
# },
# {
# "key": "PG_DB",
# "value": "${{ env.PR_NUMBER }}-cloud"
# },
# {
# "key": "TOOLJET_DB",
# "value": "${{ env.PR_NUMBER }}-cloud-tjdb"
# },
# {
# "key": "TOOLJET_DB_HOST",
# "value": "${{ secrets.RENDER_PG_HOST }}"
# },
# {
# "key": "TOOLJET_DB_USER",
# "value": "${{ secrets.RENDER_PG_USER }}"
# },
# {
# "key": "TOOLJET_DB_PASS",
# "value": "${{ secrets.RENDER_PG_PASS }}"
# },
# {
# "key": "TOOLJET_DB_PORT",
# "value": "5432"
# },
# {
# "key": "PGRST_DB_PRE_CONFIG",
# "value": "postgrest.pre_config"
# },
# {
# "key": "PGRST_DB_URI",
# "value": "postgres://${{ secrets.RENDER_PG_USER }}:${{ secrets.RENDER_PG_PASS }}@${{ secrets.RENDER_PG_HOST }}/${{ env.PR_NUMBER }}-cloud-tjdb"
# },
# {
# "key": "PGRST_HOST",
# "value": "127.0.0.1:3000"
# },
# {
# "key": "PGRST_JWT_SECRET",
# "value": "r9iMKoe5CRMgvJBBtp4HrqN7QiPpUToj"
# },
# {
# "key": "PGRST_LOG_LEVEL",
# "value": "info"
# },
# {
# "key": "PORT",
# "value": "80"
# },
# {
# "key": "TOOLJET_HOST",
# "value": "https://tooljet-cloud-pr-${{ env.PR_NUMBER }}.onrender.com"
# },
# {
# "key": "DISABLE_TOOLJET_TELEMETRY",
# "value": "true"
# },
# {
# "key": "SMTP_ADDRESS",
# "value": "smtp.mailtrap.io"
# },
# {
# "key": "SMTP_DOMAIN",
# "value": "smtp.mailtrap.io"
# },
# {
# "key": "SMTP_PORT",
# "value": "2525"
# },
# {
# "key": "SMTP_USERNAME",
# "value": "${{ secrets.RENDER_SMTP_USERNAME }}"
# },
# {
# "key": "SMTP_PASSWORD",
# "value": "${{ secrets.RENDER_SMTP_PASSWORD }}"
# },
# {
# "key": "REDIS_HOST",
# "value": "${{ secrets.RENDER_REDIS_HOST }}"
# },
# {
# "key": "REDIS_PORT",
# "value": "${{ secrets.RENDER_REDIS_PORT }}"
# },
# {
# "key": "TEMPORAL_SERVER_ADDRESS",
# "value": "https://auto-setup-1-25-1.onrender.com"
# },
# {
# "key": "TEMPORAL_TASK_QUEUE_NAME_FOR_WORKFLOWS",
# "value": "tooljet-cloud-pr-${{ env.PR_NUMBER }}"
# },
# {
# "key": "TOOLJET_WORKFLOWS_TEMPORAL_NAMESPACE",
# "value": "default"
# },
# {
# "key": "TOOLJET_MARKETPLACE_URL",
# "value": "${{ secrets.MARKETPLACE_BUCKET }}"
# },
# {
# "key": "CUSTOM_GITHUB_TOKEN",
# "value": "${{ secrets.CUSTOM_GITHUB_TOKEN }}"
# }
# ],
# "serviceDetails": {
# "disk": null,
# "env": "docker",
# "envSpecificDetails": {
# "dockerCommand": "",
# "dockerContext": "./",
# "dockerfilePath": "./docker/cloud/cloud-preview.Dockerfile"
# },
# "healthCheckPath": "/api/health",
# "numInstances": 1,
# "openPorts": [{
# "port": 80,
# "protocol": "TCP"
# }],
# "plan": "starter",
# "pullRequestPreviewsEnabled": "no",
# "region": "oregon",
# "url": "https://tooljet-cloud-pr-${{ env.PR_NUMBER }}.onrender.com"
# }
# }')
# echo "response: $RESPONSE"
# export SERVICE_ID=$(echo $RESPONSE | jq -r '.service.id')
# echo "SERVICE_ID=$SERVICE_ID" >> $GITHUB_ENV
# - name: Comment deployment URL
# uses: actions/github-script@v5
# with:
# github-token: ${{secrets.GITHUB_TOKEN}}
# script: |
# github.rest.issues.createComment({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# body: 'Cloud Edition: \n Deployment: https://tooljet-cloud-pr-${{ env.PR_NUMBER }}.onrender.com \n Dashboard: https://dashboard.render.com/web/${{ env.SERVICE_ID }}'
# })
# - uses: actions/github-script@v6
# with:
# script: |
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'create-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
# await github.rest.issues.addLabels({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# labels: ['active-cloud-review-app']
# })
# destroy-cloud-review-app:
# if: ${{ (github.event.action == 'labeled' && github.event.label.name == 'destroy-cloud-review-app') || github.event.action == 'closed' }}
# runs-on: ubuntu-latest
# steps:
# - name: Delete service
# run: |
# export SERVICE_ID=$(curl --request GET \
# --url 'https://api.render.com/v1/services?name=ToolJet%20PR%20%23${{ env.PR_NUMBER }}&limit=1' \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}' | \
# jq -r '.[0].service.id')
# curl --request DELETE \
# --url https://api.render.com/v1/services/$SERVICE_ID \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}'
# - uses: actions/github-script@v6
# with:
# script: |
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'destroy-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'suspend-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'active-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
# - name: Install PostgreSQL client
# run: |
# sudo apt update
# sudo apt install postgresql-client -y
# - name: Wait after installing PostgreSQL
# run: sleep 25
# - name: Drop PostgreSQL PR databases
# env:
# PGHOST: ${{ secrets.RENDER_DS_PG_HOST }}
# PGPORT: 5432
# PGUSER: ${{ secrets.RENDER_DS_PG_USER }}
# PGDATABASE: ${{ env.PR_NUMBER }}-cloud
# PGTJBDATABASE: ${{ env.PR_NUMBER }}-cloud-tjdb
# run: |
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGDATABASE; then
# echo "Database $PGDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGDATABASE\" ;"
# else
# echo "Database $PGDATABASE does not exist."
# fi
# if PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -lqt | cut -d \| -f 1 | grep -qw $PGTJBDATABASE; then
# echo "Database $PGTJBDATABASE exists, deleting..."
# PGPASSWORD=${{ secrets.RENDER_DS_PG_PASS }} psql -h $PGHOST -p $PGPORT -U $PGUSER -d postgres -c "drop database \"$PGTJBDATABASE\" ;"
# else
# echo "Database $PGTJBDATABASE does not exist."
# fi
# suspend-cloud-review-app:
# if: ${{ github.event.action == 'labeled' && github.event.label.name == 'suspend-cloud-review-app' }}
# runs-on: ubuntu-latest
# steps:
# - name: Suspend service
# run: |
# export SERVICE_ID=$(curl --request GET \
# --url 'https://api.render.com/v1/services?name=ToolJet%20PR%20%23${{ env.PR_NUMBER }}&limit=1' \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}' | \
# jq -r '.[0].service.id')
# curl --request POST \
# --url https://api.render.com/v1/services/$SERVICE_ID/suspend \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}'
# - uses: actions/github-script@v6
# with:
# script: |
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'active-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
# resume-cloud-review-app:
# if: ${{ github.event.action == 'unlabeled' && github.event.label.name == 'suspend-cloud-review-app' }}
# runs-on: ubuntu-latest
# steps:
# - name: Resume service
# run: |
# export SERVICE_ID=$(curl --request GET \
# --url 'https://api.render.com/v1/services?name=ToolJet%20PR%20%23${{ env.PR_NUMBER }}&limit=1' \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}' | \
# jq -r '.[0].service.id')
# curl --request POST \
# --url https://api.render.com/v1/services/$SERVICE_ID/resume \
# --header 'accept: application/json' \
# --header 'authorization: Bearer ${{ secrets.RENDER_API_KEY }}'
# - uses: actions/github-script@v6
# with:
# script: |
# await github.rest.issues.addLabels({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# labels: ['active-cloud-review-app']
# })
# try {
# await github.rest.issues.removeLabel({
# issue_number: context.issue.number,
# owner: context.repo.owner,
# repo: context.repo.repo,
# name: 'suspend-cloud-review-app'
# })
# } catch (e) {
# console.log(e)
# }
redeploy-review-app:
if: ${{ github.event.action == 'synchronize' || github.event.action == 'opened' }}
runs-on: ubuntu-latest
@@ -1191,7 +820,7 @@ jobs:
return labels.data.map(l => l.name);
- name: Redeploy CE review app if active
if: contains(steps.get_labels.outputs.result, 'active-ce-review-app')
if: contains(steps.get_labels.outputs.result, 'active-ce-review-app-old')
id: redeploy_ce
env:
RENDER_API_KEY: ${{ secrets.RENDER_API_KEY }}
@@ -1213,7 +842,7 @@ jobs:
- name: Redeploy EE review app if active
if: contains(steps.get_labels.outputs.result, 'active-ee-review-app')
if: contains(steps.get_labels.outputs.result, 'active-ee-review-app-old')
id: redeploy_ee
env:
RENDER_API_KEY: ${{ secrets.RENDER_API_KEY }}
@@ -1404,3 +1033,4 @@ jobs:
} catch (e) {
console.log(e)
}


@@ -2,7 +2,7 @@ name: Label for stale render deploys
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *'
- cron: '30 15 * * *'
permissions:
issues: write
@@ -79,3 +79,75 @@ jobs:
labels: ['suspend-ee-review-app']
})
}
label-stale-ee-lts-deploys:
runs-on: ubuntu-latest
permissions:
pull-requests: write
steps:
- uses: akshaysasidrn/stale-label-fetch@v1.1
id: stale-label
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
stale-label: 'active-ee-lts-review-app'
stale-time: '86400'
type: 'pull_request'
- name: Get stale numbers
run: echo "Matched PR numbers - ${{ steps.stale-label.outputs.stale-numbers }}"
- name: Add suspend label
uses: actions/github-script@v6
env:
STALE_NUMBERS: ${{ steps.stale-label.outputs.stale-numbers }}
with:
github-token: ${{ secrets.TJ_BOT_PAT }}
script: |
if (!process.env.STALE_NUMBERS) return
const prNumbers = process.env.STALE_NUMBERS.split(",")
console.log(`Adding suspend labels for EE LTS: ${prNumbers}`)
for (const prNumber of prNumbers) {
github.rest.issues.addLabels({
issue_number: prNumber,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['suspend-ee-lts-review-app']
})
}
label-stale-ee-pre-release-deploys:
runs-on: ubuntu-latest
permissions:
pull-requests: write
steps:
- uses: akshaysasidrn/stale-label-fetch@v1.1
id: stale-label
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
stale-label: 'active-ee-pre-release-review-app'
stale-time: '86400'
type: 'pull_request'
- name: Get stale numbers
run: echo "Matched PR numbers - ${{ steps.stale-label.outputs.stale-numbers }}"
- name: Add suspend label
uses: actions/github-script@v6
env:
STALE_NUMBERS: ${{ steps.stale-label.outputs.stale-numbers }}
with:
github-token: ${{ secrets.TJ_BOT_PAT }}
script: |
if (!process.env.STALE_NUMBERS) return
const prNumbers = process.env.STALE_NUMBERS.split(",")
console.log(`Adding suspend labels for EE Pre-release: ${prNumbers}`)
for (const prNumber of prNumbers) {
github.rest.issues.addLabels({
issue_number: prNumber,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['suspend-ee-pre-release-review-app']
})
}


@@ -0,0 +1,53 @@
name: Deploy Storybook to Netlify
on:
pull_request:
types: [closed]
branches:
- lts-3.16
workflow_dispatch:
inputs:
branch:
description: "Branch to deploy"
required: true
default: "lts-3.16"
jobs:
deploy-storybook:
if: github.event_name == 'workflow_dispatch' || github.event.pull_request.merged == true
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event_name == 'workflow_dispatch' && inputs.branch || 'lts-3.16' }}
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: "22.15.1"
cache: "npm"
cache-dependency-path: frontend/package.json
- name: Install dependencies
working-directory: frontend
run: npm install
- name: Build Storybook
working-directory: frontend
run: npx storybook build
- name: Deploy to Netlify
uses: nwtgck/actions-netlify@v3
with:
publish-dir: frontend/storybook-static
production-branch: lts-3.16
production-deploy: ${{ github.event_name == 'pull_request' || inputs.branch == 'lts-3.16' }}
deploy-message: |
Storybook deploy from ${{ github.event_name == 'workflow_dispatch' && inputs.branch || github.ref_name }} @ ${{ github.sha }}
github-token: ${{ secrets.GITHUB_TOKEN }}
enable-commit-comment: false
env:
NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}
NETLIFY_SITE_ID: ${{ secrets.NETLIFY_STORYBOOK_SITE_ID }}
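The checkout step of this workflow relies on the `a && b || c` expression idiom, which behaves like a ternary only while the middle operand is non-empty. A shell sketch of the equivalent fallback, using hypothetical inputs (`resolve_ref` is an illustrative name):

```shell
#!/usr/bin/env bash
# Mirrors `github.event_name == 'workflow_dispatch' && inputs.branch || 'lts-3.16'`:
# use the dispatched branch when present, otherwise fall back to lts-3.16.
resolve_ref() {
  local event_name="$1" input_branch="$2"
  if [[ "$event_name" == "workflow_dispatch" && -n "$input_branch" ]]; then
    echo "$input_branch"
  else
    echo "lts-3.16"
  fi
}

resolve_ref workflow_dispatch feature/storybook-fix
resolve_ref pull_request ""
```

Since the `branch` input is marked required with a default, the empty-branch edge case should not occur in practice.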

.github/workflows/update-test-system.yml (new file, 459 lines)

@@ -0,0 +1,459 @@
name: Update test system (LTS and pre-release)
on:
workflow_dispatch:
inputs:
branch_name:
description: "Git branch to build from (required for deploy operations)"
required: false
default: "main"
dockerfile_path:
description: "Select Dockerfile (required for deploy operations)"
required: false
type: choice
options:
- ./docker/LTS/ee/ee-production.Dockerfile
- ./docker/pre-release/ee/ee-production.Dockerfile
docker_tag:
description: "Docker tag suffix (e.g., pre-release-14). Leave blank if only managing env vars."
required: false
default: ""
env_changes:
description: "Environment changes (Format: ADD KEY=value, EDIT KEY=value, REMOVE KEY) - one per line. Leave blank if only deploying."
required: false
default: ""
test_system:
description: "Select test system"
required: true
type: choice
options:
- app-builder-3.16-lts
- app-builder-pre-release
- platform-3.16-lts
- platform-pre-release
- marketplace-3.16-lts
- marketplace-pre-release
- ai-3.16-lts
- ai-pre-release
jobs:
manage-environment:
if: ${{ github.event.inputs.env_changes != '' }}
runs-on: ubuntu-latest
steps:
- name: ✅ Check user authorization
run: |
allowed_users=(
"${{ secrets.ALLOWED_USER1_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER2_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER3_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER4_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER5_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER6_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER7_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER8_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER9_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER10_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER11_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER12_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER13_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER14_TEST_SYSTEM }}"
)
current_user="${{ github.actor }}"
authorized=false
for user in "${allowed_users[@]}"; do
if [[ "$current_user" == "$user" ]]; then
authorized=true
break
fi
done
if [[ "$authorized" == "false" ]]; then
echo "❌ User '$current_user' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '$current_user' is authorized."
fi
- name: Install SSH and JQ
run: sudo apt-get update && sudo apt-get install -y jq openssh-client
- name: Determine target host
id: vmhost
run: |
test_system="${{ github.event.inputs.test_system }}"
vm_host=$(echo '${{ secrets.VM_HOST_MAP_JSON }}' | jq -r --arg sys "$test_system" '.[$sys]')
if [[ -z "$vm_host" || "$vm_host" == "null" ]]; then
echo "VM mapping not found for $test_system"
exit 1
fi
echo "host=$vm_host" >> $GITHUB_OUTPUT
- name: Update environment variables
run: |
echo "$SSH_KEY" > key.pem
chmod 600 key.pem
TARGET_SYSTEM="${{ github.event.inputs.test_system }}"
ENV_CHANGES="${{ github.event.inputs.env_changes }}"
ssh -o StrictHostKeyChecking=no -o LogLevel=ERROR -i key.pem $SSH_USER@${{ steps.vmhost.outputs.host }} << EOF
set -e
TARGET_SYSTEM="$TARGET_SYSTEM"
ENV_CHANGES="$ENV_CHANGES"
cd ~
echo "📁 Finding correct deployment directory"
if [[ "\$TARGET_SYSTEM" == *-3.16-lts ]]; then
echo "Detected LTS system: \$TARGET_SYSTEM"
echo "🔍 Searching for LTS directories..."
LTS_DIRS=\$(ls -1d ./*-lts 2>/dev/null | grep -E '[0-9]+\.[0-9]+' | sed 's|^\./||' | sort -V; \\
ls -1d ./*-lts 2>/dev/null | grep -Ev '[0-9]+\.[0-9]+' | sed 's|^\./||' | sort)
if [[ -z "\$LTS_DIRS" ]]; then
echo "❌ No LTS directories found!"
echo "Available directories:"
ls -la | grep "^d"
exit 1
fi
echo "Available LTS directories:"
echo "\$LTS_DIRS"
SELECTED_LTS_DIR=\$(echo "\$LTS_DIRS" | head -n 1)
echo "📂 Selected LTS directory: \$SELECTED_LTS_DIR"
cd "\$SELECTED_LTS_DIR"
echo "✅ Now in directory: \$(pwd)"
else
echo "Detected pre-release system: \$TARGET_SYSTEM"
echo "📂 Working in home directory: \$(pwd)"
fi
echo ""
echo "🔧 PROCESSING ENVIRONMENT VARIABLE CHANGES"
BACKUP_FILE=".env.backup.\$(date +%s)"
sudo cp .env "\$BACKUP_FILE"
echo "✅ Backup created: \$BACKUP_FILE"
PROTECTED_VARS="TOOLJET_HOST|LOCKBOX_MASTER_KEY|SECRET_KEY_BASE|ORM_LOGGING|PG_DB|PG_USER|PG_HOST|PG_PASS|TOOLJET_DB|TOOLJET_DB_USER|TOOLJET_DB_HOST|TOOLJET_DB_PASS|PGRST_DB_URI|PGRST_HOST|PGRST_JWT_SECRET|PGRST_SERVER_PORT|REDIS_HOST|REDIS_PORT|REDIS_USER|REDIS_PASSWORD|OLD_IMAGE|TOOLJET_IMAGE"
ADD_SUCCESS=0
ADD_FAIL=0
EDIT_SUCCESS=0
EDIT_FAIL=0
REMOVE_SUCCESS=0
REMOVE_FAIL=0
while IFS= read -r line; do
line=\$(echo "\$line" | xargs)
[[ -z "\$line" || "\$line" =~ ^# ]] && continue
if [[ "\$line" =~ ^ADD[[:space:]]+([^=]+)=(.*)$ ]]; then
KEY="\${BASH_REMATCH[1]}"
VALUE="\${BASH_REMATCH[2]}"
if echo "\$KEY" | grep -qE "\$PROTECTED_VARS"; then
echo "❌ FAILED: Cannot add protected variable '\$KEY'"
ADD_FAIL=\$((ADD_FAIL + 1))
continue
fi
if grep -q "^\${KEY}=" .env; then
echo "⚠️ SKIPPED: Variable '\$KEY' already exists (use EDIT)"
continue
else
if echo "\${KEY}=\${VALUE}" | sudo tee -a .env > /dev/null; then
echo "✅ SUCCESS: Added '\$KEY'"
ADD_SUCCESS=\$((ADD_SUCCESS + 1))
else
echo "❌ FAILED: Could not add '\$KEY'"
ADD_FAIL=\$((ADD_FAIL + 1))
fi
fi
elif [[ "\$line" =~ ^EDIT[[:space:]]+([^=]+)=(.*)$ ]]; then
KEY="\${BASH_REMATCH[1]}"
VALUE="\${BASH_REMATCH[2]}"
if echo "\$KEY" | grep -qE "\$PROTECTED_VARS"; then
echo "❌ FAILED: Cannot edit protected variable '\$KEY'"
EDIT_FAIL=\$((EDIT_FAIL + 1))
continue
fi
if grep -q "^\${KEY}=" .env; then
if sudo sed -i "s|^\${KEY}=.*|\${KEY}=\${VALUE}|" .env; then
echo "✅ SUCCESS: Edited '\$KEY'"
EDIT_SUCCESS=\$((EDIT_SUCCESS + 1))
else
echo "❌ FAILED: Could not edit '\$KEY'"
EDIT_FAIL=\$((EDIT_FAIL + 1))
fi
else
echo "⚠️ SKIPPED: Variable '\$KEY' not found (use ADD)"
continue
fi
elif [[ "\$line" =~ ^REMOVE[[:space:]]+([^=[:space:]]+)$ ]]; then
KEY="\${BASH_REMATCH[1]}"
if echo "\$KEY" | grep -qE "\$PROTECTED_VARS"; then
echo "❌ FAILED: Cannot remove protected variable '\$KEY'"
REMOVE_FAIL=\$((REMOVE_FAIL + 1))
continue
fi
if grep -q "^\${KEY}=" .env; then
if sudo sed -i "/^\${KEY}=/d" .env; then
echo "✅ SUCCESS: Removed '\$KEY'"
REMOVE_SUCCESS=\$((REMOVE_SUCCESS + 1))
else
echo "❌ FAILED: Could not remove '\$KEY'"
REMOVE_FAIL=\$((REMOVE_FAIL + 1))
fi
else
echo "⚠️ SKIPPED: Variable '\$KEY' not found"
continue
fi
else
echo "⚠️ INVALID FORMAT: \$line"
fi
done <<< "\$ENV_CHANGES"
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "📊 SUMMARY"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "ADD: ✅ \$ADD_SUCCESS succeeded | ❌ \$ADD_FAIL failed"
echo "EDIT: ✅ \$EDIT_SUCCESS succeeded | ❌ \$EDIT_FAIL failed"
echo "REMOVE: ✅ \$REMOVE_SUCCESS succeeded | ❌ \$REMOVE_FAIL failed"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
TOTAL_SUCCESS=\$((ADD_SUCCESS + EDIT_SUCCESS + REMOVE_SUCCESS))
TOTAL_FAIL=\$((ADD_FAIL + EDIT_FAIL + REMOVE_FAIL))
if [ \$TOTAL_SUCCESS -eq 0 ]; then
echo "⚠️ No changes were applied"
sudo cp "\$BACKUP_FILE" .env
exit 1
fi
echo ""
echo "🔄 Restarting containers..."
sudo docker-compose down
sudo docker-compose up -d
echo "⏳ Waiting for containers (timeout: 120s)..."
SUCCESS_FOUND=false
TIMEOUT=120
ELAPSED=0
while [ \$ELAPSED -lt \$TIMEOUT ]; do
if sudo docker-compose logs 2>/dev/null | grep -qE "🚀 TOOLJET APPLICATION STARTED SUCCESSFULLY|Ready to use at http://localhost:82 🚀|Ready to use at http://localhost:80"; then
SUCCESS_FOUND=true
break
fi
sleep 10
ELAPSED=\$((ELAPSED + 10))
done
if [ "\$SUCCESS_FOUND" = false ]; then
echo "❌ Container startup failed"
echo "🔄 Rolling back..."
sudo cp "\$BACKUP_FILE" .env
sudo docker-compose down
sudo docker-compose up -d
echo "✅ Rollback completed"
exit 1
fi
echo "✅ Environment variables updated successfully!"
echo "🧹 Cleaning up old backups..."
ls -t .env.backup.* 2>/dev/null | tail -n +2 | xargs -r sudo rm -f
EOF
env:
SSH_USER: ${{ secrets.AZURE_VM_USER }}
SSH_KEY: ${{ secrets.AZURE_VM_KEY }}
build-and-deploy:
if: ${{ !cancelled() && github.event.inputs.docker_tag != '' }}
needs: manage-environment
runs-on: ubuntu-latest
steps:
- name: Validate required inputs
run: |
if [[ -z "${{ github.event.inputs.branch_name }}" ]]; then
echo "❌ Error: branch_name is required"
exit 1
fi
if [[ -z "${{ github.event.inputs.dockerfile_path }}" ]]; then
echo "❌ Error: dockerfile_path is required"
exit 1
fi
- name: Free up disk space
run: |
sudo rm -rf /usr/share/dotnet
sudo rm -rf /opt/ghc
sudo rm -rf /usr/local/share/boost
sudo rm -rf "$AGENT_TOOLSDIRECTORY"
sudo docker system prune -af
sudo apt-get clean
df -h
- name: ✅ Check user authorization
run: |
allowed_users=(
"${{ secrets.ALLOWED_USER1_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER2_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER3_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER4_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER5_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER6_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER7_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER8_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER9_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER10_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER11_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER12_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER13_TEST_SYSTEM }}"
"${{ secrets.ALLOWED_USER14_TEST_SYSTEM }}"
)
current_user="${{ github.actor }}"
authorized=false
for user in "${allowed_users[@]}"; do
if [[ "$current_user" == "$user" ]]; then
authorized=true
break
fi
done
if [[ "$authorized" == "false" ]]; then
echo "❌ User '$current_user' is not authorized to trigger this workflow."
exit 1
else
echo "✅ User '$current_user' is authorized."
fi
- name: Checkout code
uses: actions/checkout@v4
with:
ref: ${{ github.event.inputs.branch_name }}
fetch-depth: 0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to DockerHub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PAT }}
- name: Generate full Docker tag
id: taggen
run: |
input_tag="${{ github.event.inputs.docker_tag }}"
if [[ "$input_tag" == *"/"* ]]; then
echo "tag=$input_tag" >> $GITHUB_OUTPUT
else
echo "tag=tooljet/tj-osv:$input_tag" >> $GITHUB_OUTPUT
fi
- name: Build and Push Docker image
uses: docker/build-push-action@v4
with:
context: .
file: ${{ github.event.inputs.dockerfile_path }}
push: true
tags: ${{ steps.taggen.outputs.tag }}
platforms: linux/amd64
build-args: |
CUSTOM_GITHUB_TOKEN=${{ secrets.CUSTOM_GITHUB_TOKEN }}
BRANCH_NAME=${{ github.event.inputs.branch_name }}
- name: Show the full Docker tag
run: echo "✅ Docker image built and pushed:${{ steps.taggen.outputs.tag }}"
- name: Install SSH and JQ
run: sudo apt-get update && sudo apt-get install -y jq openssh-client
- name: Determine target host
id: vmhost
run: |
test_system="${{ github.event.inputs.test_system }}"
vm_host=$(echo '${{ secrets.VM_HOST_MAP_JSON }}' | jq -r --arg sys "$test_system" '.[$sys]')
if [[ -z "$vm_host" || "$vm_host" == "null" ]]; then
echo "VM mapping not found for $test_system"
exit 1
fi
echo "host=$vm_host" >> $GITHUB_OUTPUT
- name: Deploy to target environment
run: |
echo "$SSH_KEY" > key.pem
chmod 600 key.pem
IMAGE_TAG="${{ steps.taggen.outputs.tag }}"
TARGET_SYSTEM="${{ github.event.inputs.test_system }}"
ssh -o StrictHostKeyChecking=no -o LogLevel=ERROR -i key.pem $SSH_USER@${{ steps.vmhost.outputs.host }} << EOF
set -e
IMAGE_TAG="$IMAGE_TAG"
TARGET_SYSTEM="$TARGET_SYSTEM"
cd ~
echo "📁 Finding correct deployment directory"
if [[ "\$TARGET_SYSTEM" == *-3.16-lts ]]; then
echo "Detected LTS system: \$TARGET_SYSTEM"
echo "🔍 Searching for LTS directories..."
LTS_DIRS=\$(ls -1d ./*-lts 2>/dev/null | grep -E '[0-9]+\.[0-9]+' | sed 's|^\./||' | sort -V; \\
ls -1d ./*-lts 2>/dev/null | grep -Ev '[0-9]+\.[0-9]+' | sed 's|^\./||' | sort)
if [[ -z "\$LTS_DIRS" ]]; then
echo "❌ No LTS directories found!"
echo "Available directories:"
ls -la | grep "^d"
exit 1
fi
echo "Available LTS directories:"
echo "\$LTS_DIRS"
SELECTED_LTS_DIR=\$(echo "\$LTS_DIRS" | head -n 1)
echo "📂 Selected LTS directory: \$SELECTED_LTS_DIR"
cd "\$SELECTED_LTS_DIR"
echo "✅ Now in directory: \$(pwd)"
else
echo "Detected pre-release system: \$TARGET_SYSTEM"
echo "📂 Moving to target directory: \$TARGET_SYSTEM"
cd ~
echo "✅ Now in directory: \$(pwd)"
fi
echo "🔐 Docker login"
echo "${{ secrets.DOCKER_PAT }}" | sudo docker login --username "${{ secrets.DOCKER_USERNAME }}" --password-stdin
echo "current image"
cat .env | grep TOOLJET_IMAGE
echo "📦 Reading current TOOLJET_IMAGE from .env"
CURRENT_IMAGE=\$(grep '^TOOLJET_IMAGE=' .env | cut -d '=' -f2- | tr -d '"' | tr -d "'")
echo "Found CURRENT_IMAGE: \$CURRENT_IMAGE"
echo "🛑 Stopping containers"
sudo docker-compose down
echo "📝 Updating .env with new image"
sudo sed -i "s|^TOOLJET_IMAGE=.*|TOOLJET_IMAGE=\$IMAGE_TAG|" .env
echo "📥 Pulling new image: \$IMAGE_TAG"
if [ -z "\$IMAGE_TAG" ]; then
echo "❌ IMAGE_TAG is empty!"
exit 1
fi
sudo docker pull "\$IMAGE_TAG"
echo "🚀 Starting container in background"
sudo docker-compose up -d
echo "⏳ Waiting for ToolJet to start (timeout: 300 seconds)..."
SUCCESS_FOUND=false
TIMEOUT=300
ELAPSED=0
while [ \$ELAPSED -lt \$TIMEOUT ]; do
if sudo docker-compose logs 2>/dev/null | grep -qE "🚀 TOOLJET APPLICATION STARTED SUCCESSFULLY|Ready to use at http://localhost:82 🚀|Ready to use at http://localhost:80"; then
echo "✅ Found success message in logs!"
SUCCESS_FOUND=true
break
fi
echo "⏳ Still waiting... (\${ELAPSED}s elapsed)"
sleep 10
ELAPSED=\$((ELAPSED + 10))
done
if [ "\$SUCCESS_FOUND" = false ]; then
echo "❌ Timeout reached without finding success logs"
echo "📄 Showing current logs for troubleshooting..."
sudo docker-compose logs --tail=50
echo ""
echo "=== CONTAINER STATUS ==="
sudo docker-compose ps
echo ""
echo "🛑 Starting rollback process..."
sudo docker-compose down
echo "🔄 Reverting to previous image: \$CURRENT_IMAGE"
sudo sed -i "s|^TOOLJET_IMAGE=.*|TOOLJET_IMAGE=\$CURRENT_IMAGE|" .env
echo "🔄 Starting previous image..."
sudo docker-compose up -d
echo "✅ Rollback completed!"
exit 1
fi
echo "✅ Deployment successful!"
echo "📌 Storing successful deployment info in .env"
sudo sed -i "/^OLD_IMAGE=/d" .env
echo "OLD_IMAGE=\$CURRENT_IMAGE" | sudo tee -a .env
echo "📄 Final application logs:"
sudo docker-compose logs --tail=50
echo "🧹 Pruning old Docker images"
sudo docker image prune -a -f
EOF
env:
SSH_USER: ${{ secrets.AZURE_VM_USER }}
SSH_KEY: ${{ secrets.AZURE_VM_KEY }}
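Two bits of inline logic in this workflow are easy to miss in the SSH heredocs: the taggen step's Docker-tag normalization, and the ADD/EDIT/REMOVE grammar parsed from the `env_changes` input. A minimal standalone sketch of both follows — function names are hypothetical, GNU `sed` is assumed, and the real workflow additionally runs over SSH with `sudo tee`/`sudo sed` and protected-variable checks that this sketch omits:

```shell
# Hypothetical helpers mirroring the workflow's inline logic (GNU sed assumed).

# taggen step: a bare tag gets the tooljet/tj-osv: repository prefix;
# anything containing "/" is treated as a full image reference already.
normalize_tag() {
  case "$1" in
    */*) echo "$1" ;;
    *)   echo "tooljet/tj-osv:$1" ;;
  esac
}

# env_changes grammar: one "ADD KEY=value", "EDIT KEY=value", or "REMOVE KEY"
# per line, applied to an .env file (protected-variable checks omitted here).
apply_env_change() {
  env_file="$1" line="$2"
  op=${line%% *} rest=${line#* }
  key=${rest%%=*} value=${rest#*=}
  case "$op" in
    ADD)    grep -q "^${key}=" "$env_file" || echo "${key}=${value}" >> "$env_file" ;;
    EDIT)   sed -i "s|^${key}=.*|${key}=${value}|" "$env_file" ;;
    REMOVE) sed -i "/^${rest}=/d" "$env_file" ;;
  esac
}

normalize_tag "pre-release-14"   # -> tooljet/tj-osv:pre-release-14
normalize_tag "myorg/custom:v1"  # -> myorg/custom:v1
```

Note that, as in the workflow, an `ADD` for an existing key is silently skipped rather than overwriting it — updates must be spelled `EDIT`.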


@@ -3,7 +3,7 @@ name: Update LTS Table
on:
workflow_dispatch: # manually triggered
schedule:
- cron: '30 5 * * 1,4'
- cron: '0 9 * * 1,3,5' # 9am UTC — Monday, Wednesday, Friday
jobs:
update-lts:
@@ -14,7 +14,7 @@ jobs:
- name: ⬇️ Checkout repo
uses: actions/checkout@v4
with:
ref: develop
ref: documentation
- name: 🛠 Setup Git
run: |
@@ -29,7 +29,19 @@ jobs:
- name: 🧠 Run regenerate_lts_table.sh
run: bash ./docs/regenerate_lts_table.sh
- name: 🔍 Check for changes
id: changes
run: |
if git diff --quiet; then
echo "changed=false" >> $GITHUB_OUTPUT
echo " No new patch found — skipping PR."
else
echo "changed=true" >> $GITHUB_OUTPUT
echo "✅ New patch detected — creating PR."
fi
- name: 📦 Create Pull Request
if: steps.changes.outputs.changed == 'true'
id: cpr
uses: peter-evans/create-pull-request@v6
with:
@@ -37,9 +49,10 @@ jobs:
branch: auto/update-lts-${{ github.run_id }}
title: "docs: update LTS version table"
body: "Automated update of the LTS version table from DockerHub."
base: develop
base: documentation
- name: 🤖 Auto-merge the PR (fallback if needed)
if: steps.changes.outputs.changed == 'true'
run: |
echo " Attempting auto-merge of PR #${PR_NUMBER}..."
gh pr merge --squash --auto "$PR_NUMBER" --repo ToolJet/ToolJet || {

File diff suppressed because it is too large.

README.md

@@ -1,8 +1,8 @@
ToolJet is an **open-source low-code framework** to build and deploy internal tools with minimal engineering effort. ToolJet's drag-and-drop frontend builder allows you to create complex, responsive frontends within minutes. Additionally, you can integrate various data sources, including databases like PostgreSQL, MongoDB, and Elasticsearch; API endpoints with OpenAPI spec and OAuth2 support; SaaS tools such as Stripe, Slack, Google Sheets, Airtable, and Notion; as well as object storage services like S3, GCS, and Minio, to fetch and write data.
ToolJet is the open-source foundation of ToolJet AI - the AI-native platform for building and deploying internal tools, workflows and AI agents. The community edition provides a powerful visual builder, drag-and-drop UI, and integrations with databases, APIs, SaaS apps, and object storage. For AI-powered UI generation, query building, debugging, and enterprise features, see ToolJet AI.
:star: If you find ToolJet useful, please consider giving us a star on GitHub! Your support helps us continue to innovate and deliver exciting features.
![Docker Cloud Build Status](https://img.shields.io/docker/cloud/build/tooljet/tooljet-ce)
![Docker Cloud Build Status](https://img.shields.io/docker/automated/tooljet/tooljet-ce)
![Number of GitHub contributors](https://img.shields.io/github/contributors/tooljet/tooljet)
[![Number of GitHub issues that are open](https://img.shields.io/github/issues/ToolJet/ToolJet)](https://github.com/ToolJet/ToolJet/issues)
[![Number of GitHub stars](https://img.shields.io/github/stars/ToolJet/ToolJet)](https://github.com/ToolJet/ToolJet/stargazers)
@@ -14,39 +14,45 @@ ToolJet is an **open-source low-code framework** to build and deploy internal to
[![Follow us on X, formerly Twitter](https://img.shields.io/twitter/follow/ToolJet?style=social)](https://twitter.com/ToolJet)
<p align="center">
<img src="https://user-images.githubusercontent.com/7828962/211444352-4d6d2e4a-13c9-4980-9e16-4aed4af9811b.png" alt="Tooljet dashboard showing inventory and orders"/>
<img src="docs/static/img/readme/banner.png" alt="Tooljet dashboard showing inventory and orders"/>
</p>
<p align="center">
<img src="https://github.com/ToolJet/ToolJet/assets/25361949/0e711f3a-edb7-496b-8833-107de3826933"/>
<img src="docs/static/img/readme/flowchart.png"/>
</p>
## All features
## Features
- **Visual App Builder:** 45+ built-in responsive components, including Tables, Charts, Lists, Forms, and Progress Bars.
- **ToolJet Database:** Built-in no-code database.
- **Multi-Page:** Build an application with multiple pages.
- **Multiplayer editing:** Allows simultaneous app building by multiple developers.
- **50+ data sources:** Integrate with external databases, cloud storage, and APIs.
- **Desktop & mobile:** Customize layout widths to fit various screen sizes.
- **Self-host:** Supports Docker, Kubernetes, AWS EC2, Google Cloud Run, and more.
- **Collaborate:** Add comments anywhere on the canvas and tag your team members.
- **Extend with plugins:** Use our [command-line tool](https://www.npmjs.com/package/@tooljet/cli) to easily bootstrap new connectors.
- **Version control:** Manage multiple application versions with a structured release cycle.
- **Run JS & Python code:** Execute custom JavaScript and Python snippets.
- **Granular access control:** Set permissions at both group and app levels.
- **Low-code:** Use JS code almost anywhere within the builder, such as setting text color based on status with
`status === 'success' ? 'green' : 'red'`.
- **No-code query editors:** Query Editors are available for all supported data sources.
- **Join and transform data:** Transform query results using JavaScript or Python code.
- **Secure:** All the credentials are securely encrypted using `aes-256-gcm`.
- **Data Privacy:** ToolJet serves solely as a proxy and does not store data.
- **SSO:** Supports multiple Single Sign-On providers.
### Community Edition (CE)
- **Visual App Builder:** 60+ responsive components (Tables, Charts, Forms, Lists, Progress Bars, and more).
- **ToolJet Database:** Built-in no-code database.
- **Multi-page Apps & Multiplayer Editing:** Build complex apps collaboratively.
- **80+ Data Sources:** Connect to databases, APIs, cloud storage, and SaaS tools.
- **Flexible Deployment:** Self-host with Docker, Kubernetes, AWS, GCP, Azure, and more.
- **Collaboration Tools:** Inline comments, mentions, and granular access control.
- **Extensibility:** Create plugins and connectors with the [ToolJet CLI](https://www.npmjs.com/package/@tooljet/cli).
- **Code Anywhere:** Run JavaScript and Python inside your apps.
- **Secure by Design:** AES-256-GCM encryption, proxy-only data flow, SSO support.
### ToolJet AI (Enterprise)
Everything in CE, plus:
- **AI App Generation:** Create apps instantly from natural language prompts.
- **AI Query Builder:** Generate and transform queries with AI assistance.
- **AI Debugging:** Identify and fix issues with one click.
- **Agent Builder:** Create intelligent agents to automate workflows and orchestrate processes.
- **Enterprise-grade Security & Compliance:** SOC 2 and GDPR readiness, audit logs, and advanced access control.
- **User Management:** Role-based access (RBAC), custom groups, and granular app/data permissions.
- **Multi-environment Management:** Seamless dev/stage/prod environments.
- **GitSync & CI/CD:** Integrate with GitHub/GitLab for version control and streamlined deployments.
- **Branding & Customization:** White-labeling, and custom theming for organizational branding.
- **Fine-Grained Access Control:** Secure data and actions at the row, component, page, and query levels.
- **Embedded Apps:** Embed ToolJet apps securely within other applications or portals.
- **Enterprise Support:** SLAs, priority bug fixes, and onboarding assistance.
<hr>
## Quickstart
The easiest way to get started with ToolJet is by creating a [ToolJet Cloud](https://tooljet.ai) account. ToolJet Cloud offers a hosted solution of ToolJet. If you want to self-host ToolJet, kindly proceed to [deployment documentation](https://docs.tooljet.ai/docs/setup/).
The easiest way to get started with ToolJet is by creating a [ToolJet Cloud](https://tooljet.com) account. ToolJet Cloud offers a hosted solution of ToolJet. If you want to self-host ToolJet, kindly proceed to [deployment documentation](https://docs.tooljet.com/docs/setup/).
### Try using Docker
Want to give ToolJet a quick spin on your local machine? You can run the following command from your terminal to have ToolJet up and running right away.
@@ -66,35 +72,35 @@ docker run \
## Tutorials and examples
[Time Tracker Application](https://docs.tooljet.ai/docs/#quickstart-guide)<br>
[Build your own CMS using low-code](https://blog.tooljet.ai/build-cms-using-lowcode-and-mongodb/)<br>
[AWS S3 Browser](https://blog.tooljet.ai/build-an-aws-s3-broswer-with-tooljet/)<br>
[Time Tracker Application](https://docs.tooljet.com/docs/#quickstart-guide)<br>
[Build your own CMS using low-code](https://blog.tooljet.com/build-cms-using-lowcode-and-mongodb/)<br>
[AWS S3 Browser](https://blog.tooljet.com/build-an-aws-s3-broswer-with-tooljet/)<br>
## Documentation
Documentation is available at https://docs.tooljet.ai.
Documentation is available at https://docs.tooljet.com.
- [Getting Started](https://docs.tooljet.ai)<br>
- [Data source Reference](https://docs.tooljet.ai/docs/data-sources/airtable/)<br>
- [Component Reference](https://docs.tooljet.ai/docs/widgets/button)
- [Getting Started](https://docs.tooljet.com)<br>
- [Data source Reference](https://docs.tooljet.com/docs/data-sources/airtable/)<br>
- [Component Reference](https://docs.tooljet.com/docs/widgets/button)
## Self-hosted
You can use ToolJet Cloud for a fully managed solution. If you want to self-host ToolJet, we have guides on deploying ToolJet on Kubernetes, AWS EC2, Docker, and more.
| Provider | Documentation |
| :------------- | :------------- |
| Digital Ocean | [Link](https://docs.tooljet.ai/docs/setup/digitalocean) |
| Docker | [Link](https://docs.tooljet.ai/docs/setup/docker) |
| AWS EC2 | [Link](https://docs.tooljet.ai/docs/setup/ec2) |
| AWS ECS | [Link](https://docs.tooljet.ai/docs/setup/ecs) |
| OpenShift | [Link](https://docs.tooljet.ai/docs/setup/openshift) |
| Helm | [Link](https://docs.tooljet.ai/docs/setup/helm) |
| AWS EKS (Kubernetes) | [Link](https://docs.tooljet.ai/docs/setup/kubernetes) |
| GCP GKE (Kubernetes) | [Link](https://docs.tooljet.ai/docs/setup/kubernetes-gke) |
| Azure AKS (Kubernetes) | [Link](https://docs.tooljet.ai/docs/setup/kubernetes-aks) |
| Azure Container | [Link](https://docs.tooljet.ai/docs/setup/azure-container) |
| Google Cloud Run | [Link](https://docs.tooljet.ai/docs/setup/google-cloud-run) |
| Deploying ToolJet client | [Link](https://docs.tooljet.ai/docs/setup/client) |
| Deploying ToolJet on a Subpath | [Link](https://docs.tooljet.ai/docs/setup/tooljet-subpath/) |
| Digital Ocean | [Link](https://docs.tooljet.com/docs/setup/digitalocean) |
| Docker | [Link](https://docs.tooljet.com/docs/setup/docker) |
| AWS EC2 | [Link](https://docs.tooljet.com/docs/setup/ec2) |
| AWS ECS | [Link](https://docs.tooljet.com/docs/setup/ecs) |
| OpenShift | [Link](https://docs.tooljet.com/docs/setup/openshift) |
| Helm | [Link](https://docs.tooljet.com/docs/setup/helm) |
| AWS EKS (Kubernetes) | [Link](https://docs.tooljet.com/docs/setup/kubernetes) |
| GCP GKE (Kubernetes) | [Link](https://docs.tooljet.com/docs/setup/kubernetes-gke) |
| Azure AKS (Kubernetes) | [Link](https://docs.tooljet.com/docs/setup/kubernetes-aks) |
| Azure Container | [Link](https://docs.tooljet.com/docs/setup/azure-container) |
| Google Cloud Run | [Link](https://docs.tooljet.com/docs/setup/google-cloud-run) |
| Deploying ToolJet client | [Link](https://docs.tooljet.com/docs/setup/client) |
| Deploying ToolJet on a Subpath | [Link](https://docs.tooljet.com/docs/setup/tooljet-subpath/) |
## Marketplace
ToolJet can now be found on both AWS and Azure Marketplaces, making it simpler than ever to access and deploy our app-building platform.
@@ -102,9 +108,9 @@ ToolJet can now be found on both AWS and Azure Marketplaces, making it simpler t
Find ToolJet on AWS Marketplace [here](https://aws.amazon.com/marketplace/pp/prodview-fxjto27jkpqfg?sr=0-1&ref_=beagle&applicationId=AWSMPContessa) and explore seamless integration on Azure Marketplace [here](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/tooljetsolutioninc1679496832216.tooljet?tab=Overview).
## Community support
For general help using ToolJet, please refer to the official [documentation](https://docs.tooljet.ai/docs/). For additional help, you can use one of these channels to ask a question:
For general help using ToolJet, please refer to the official [documentation](https://docs.tooljet.com/docs/). For additional help, you can use one of these channels to ask a question:
- [Slack](https://tooljet.ai/slack) - Discussions with the community and the team.
- [Slack](https://tooljet.com/slack) - Discussions with the community and the team.
- [GitHub](https://github.com/ToolJet/ToolJet/issues) - For bug reports and feature requests.
- [𝕏 (Twitter)](https://twitter.com/ToolJet) - Get the product updates quickly.


@@ -5,4 +5,6 @@
/cypress/downloads
/cypress/videos
/coverage
/.nyc_output
/.nyc_output
/.claude
/.webpack_cache


@@ -0,0 +1,49 @@
const { defineConfig } = require("cypress");
module.exports = defineConfig({
execTimeout: 1800000,
defaultCommandTimeout: 30000,
requestTimeout: 30000,
pageLoadTimeout: 30000,
responseTimeout: 30000,
viewportWidth: 1440,
viewportHeight: 960,
chromeWebSecurity: false,
projectId: "sk3oji",
e2e: {
setupNodeEvents(on, config) {
require("./cypress/config/tasks")(on);
require("./cypress/config/browserConfig")(on);
return require("./cypress/plugins/index.js")(on, config);
},
baseUrl: "http://localhost:3000", // Default for local development (GitHub workflow overrides this)
specPattern: [
"cypress/e2e/happyPath/platform/firstUser/firstUserOnboarding.cy.js",
"cypress/e2e/happyPath/platform/eeTestcases/licensing/updateLicense.cy.js",
"cypress/e2e/happyPath/platform/eeTestcases/gitSync/**/*.cy.js",
],
testIsolation: true,
redirectionLimit: 10,
numTestsKeptInMemory: 0,
experimentalMemoryManagement: true,
experimentalRunAllSpecs: true,
experimentalModifyObstructiveThirdPartyCode: true,
experimentalOriginDependencies: true,
downloadsFolder: "cypress/downloads",
trashAssetsBeforeRuns: true,
video: false,
videoUploadOnPasses: false,
screenshotOnRunFailure: true,
screenshotsFolder: "cypress/screenshots",
coverage: false,
codeCoverageTasksRegistered: false,
},
});


@@ -1,75 +0,0 @@
# Create .env from this example file and replace values for the environment.
# The application expects a separate .env.test for test environment configuration
# Get detailed information about each variable here: https://docs.tooljet.com/docs/setup/env-vars
TOOLJET_HOST=http://localhost:8082
LOCKBOX_MASTER_KEY= # replace_with_lockbox_master_key
SECRET_KEY_BASE= # replace_with_secret_key_base
# DATABASE CONFIG
ORM_LOGGING=all
PG_DB=tooljet_production
PG_USER=postgres
PG_HOST=postgresql
PG_PASS= # postgres database password
# The above postgres values is set to its default state. If necessary, kindly modify it according to your personal preference.
# TOOLJET DATABASE
TOOLJET_DB=tooljet_db
TOOLJET_DB_USER=postgres
TOOLJET_DB_HOST=postgresql
TOOLJET_DB_PASS=
PGRST_DB_URI= # postgres://<postgres_username>:<postgres_password><@postgres_hostname>/<database_name>
PGRST_HOST=postgrest
PGRST_JWT_SECRET= # If you have openssl installed, you can run the following command openssl rand -hex 32 to generate the value for PGRST_JWT_SECRET.
# Checks every 24 hours to see if a new version of ToolJet is available
# (Enabled by default. Set false to disable)
CHECK_FOR_UPDATES=true
# Checks every 24 hours to update app telemetry data to ToolJet hub.
# (Telemetry is enabled by default. Set value to true to disable.)
# DISABLE_TOOLJET_TELEMETRY=false
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# EMAIL CONFIGURATION
DEFAULT_FROM_EMAIL=hello@tooljet.io
SMTP_USERNAME=
SMTP_PASSWORD=
SMTP_DOMAIN=
SMTP_PORT=
# DISABLE USER SIGNUPS (true or false). only applicable if Multi-Workspace feature is enabled
DISABLE_SIGNUPS=
# OBSERVABILITY
APM_VENDOR=
SENTRY_DNS=
SENTRY_DEBUG=
# FEATURE TOGGLE
COMMENT_FEATURE_ENABLE=
ENABLE_MULTIPLAYER_EDITING=true
# SSO (Applicable only for Multi-Workspace)
SSO_GOOGLE_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_SECRET=
SSO_GIT_OAUTH2_HOST=
SSO_ACCEPTED_DOMAINS=
SSO_DISABLE_SIGNUPS=
#ONBOARDING
ENABLE_ONBOARDING_QUESTIONS_FOR_ALL_SIGN_UPS=
#session expiry in minutes
USER_SESSION_EXPIRY=2880
#TELEMETRY
DEPLOYMENT_PLATFORM=docker


@@ -1,51 +0,0 @@
version: "3"
services:
tooljet:
tty: true
stdin_open: true
container_name: Tooljet-app
image: tooljet/tooljet-ce:latest
restart: always
env_file: .env
ports:
- 80:80
depends_on:
- postgres
environment:
SERVE_CLIENT: "true"
PORT: "80"
command: npm run start:prod
postgres:
container_name: ${PG_HOST}
image: postgres:13
restart: always
volumes:
- postgres:/var/lib/postgresql/data
env_file: .env
environment:
- POSTGRES_USER=${PG_USER}
- POSTGRES_PASSWORD=${PG_PASS}
postgrest:
container_name: postgrest
image: postgrest/postgrest:v12.0.2
restart: always
depends_on:
- postgres
env_file: .env
environment:
- PGRST_SERVER_PORT=80
- PGRST_DB_PRE_CONFIG=postgrest.pre_config
volumes:
postgres:
driver: local
driver_opts:
o: bind
type: none
device: ${PWD}/postgres_data
certs:
logs:
fallbackcerts:


@@ -0,0 +1,77 @@
# Create .env from this example file and replace values for the environment.
# The application expects a separate .env.test for test environment configuration
# Get detailed information about each variable here: https://docs.tooljet.com/docs/setup/env-vars
TOOLJET_HOST=http://localhost:80
LOCKBOX_MASTER_KEY= # replace_with_lockbox_master_key
SECRET_KEY_BASE= # replace_with_secret_key_base
# DATABASE CONFIG
ORM_LOGGING=all
PG_DB=tooljet_production
PG_USER=postgres
PG_HOST=postgresql
PG_PASS= # postgres database password
# The above postgres values is set to its default state. If necessary, kindly modify it according to your personal preference.
# TOOLJET DATABASE
TOOLJET_DB=tooljet_db
TOOLJET_DB_USER=postgres
TOOLJET_DB_HOST=postgresql
TOOLJET_DB_PASS=
PGRST_DB_URI= # postgres://<postgres_username>:<postgres_password><@postgres_hostname>/<database_name>
PGRST_HOST=localhost:3001
PGRST_JWT_SECRET= # If you have openssl installed, you can run the following command openssl rand -hex 32 to generate the value for PGRST_JWT_SECRET.
PGRST_SERVER_PORT=3001
PGRST_DB_PRE_CONFIG=postgrest.pre_config
# Checks every 24 hours to see if a new version of ToolJet is available
# (Enabled by default. Set false to disable)
CHECK_FOR_UPDATES=true
# Checks every 24 hours to update app telemetry data to ToolJet hub.
# (Telemetry is enabled by default. Set value to true to disable.)
# DISABLE_TOOLJET_TELEMETRY=false
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# EMAIL CONFIGURATION
DEFAULT_FROM_EMAIL=hello@tooljet.io
SMTP_USERNAME=
SMTP_PASSWORD=
SMTP_DOMAIN=
SMTP_PORT=
# DISABLE USER SIGNUPS (true or false). only applicable if Multi-Workspace feature is enabled
DISABLE_SIGNUPS=
# OBSERVABILITY
APM_VENDOR=
SENTRY_DNS=
SENTRY_DEBUG=
# FEATURE TOGGLE
COMMENT_FEATURE_ENABLE=
ENABLE_MULTIPLAYER_EDITING=true
# SSO (Applicable only for Multi-Workspace)
SSO_GOOGLE_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_SECRET=
SSO_GIT_OAUTH2_HOST=
SSO_ACCEPTED_DOMAINS=
SSO_DISABLE_SIGNUPS=
#ONBOARDING
ENABLE_ONBOARDING_QUESTIONS_FOR_ALL_SIGN_UPS=
#session expiry in minutes
USER_SESSION_EXPIRY=2880
#TELEMETRY
DEPLOYMENT_PLATFORM=docker


@ -0,0 +1,41 @@
version: "3"
services:
tooljet:
tty: true
stdin_open: true
container_name: Tooljet-app
image: tooljet/tooljet-ce:ce-lts-latest
platform: linux/amd64
restart: always
env_file: .env
ports:
- 80:80
depends_on:
- postgres
environment:
SERVE_CLIENT: "true"
PORT: "80"
command: npm run start:prod
postgres:
container_name: ${PG_HOST}
image: postgres:16
restart: always
volumes:
- postgres:/var/lib/postgresql/data
env_file: .env
environment:
- POSTGRES_USER=${PG_USER}
- POSTGRES_PASSWORD=${PG_PASS}
volumes:
postgres:
driver: local
driver_opts:
o: bind
type: none
device: ${PWD}/postgres_data
certs:
logs:
fallbackcerts:
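One operational note on the `postgres` volume above: with `driver_opts` of `o: bind` and `device: ${PWD}/postgres_data`, Docker binds an existing host directory rather than creating one, so the directory must exist before `docker compose up`. A minimal sketch (the path mirrors the compose file):

```shell
# Create the host directory backing the bind-mounted postgres volume.
# With driver_opts (o: bind), volume creation fails if it is missing.
mkdir -p "${PWD}/postgres_data"
ls -ld "${PWD}/postgres_data"
```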


@ -22,10 +22,11 @@ TOOLJET_DB_USER= # Postgres database username
TOOLJET_DB_HOST= # Postgres database host
TOOLJET_DB_PASS= # Postgres database password
PGRST_HOST=postgrest
PGRST_HOST=localhost:3001
PGRST_DB_URI=
PGRST_JWT_SECRET= # If you have openssl installed, you can run the following command openssl rand -hex 32 to generate the value for PGRST_JWT_SECRET.
PGRST_SERVER_PORT=3001
PGRST_DB_PRE_CONFIG=postgrest.pre_config
# Checks every 24 hours to see if a new version of ToolJet is available
# (Enabled by default. Set false to disable)


@ -0,0 +1,17 @@
version: "3"
services:
tooljet:
tty: true
stdin_open: true
container_name: Tooljet-app
image: tooljet/tooljet-ce:ce-lts-latest
platform: linux/amd64
restart: always
env_file: .env
ports:
- 80:80
environment:
SERVE_CLIENT: "true"
PORT: "80"
command: npm run start:prod


@ -1,24 +0,0 @@
version: "3"
services:
tooljet:
tty: true
stdin_open: true
container_name: Tooljet-app
image: tooljet/tooljet-ce:latest
restart: always
env_file: .env
ports:
- 80:80
environment:
SERVE_CLIENT: "true"
PORT: "80"
command: npm run start:prod
# Uncomment if ENABLE_TOOLJET_DB=true
postgrest:
image: postgrest/postgrest:v12.0.2
restart: always
env_file: .env
environment:
- PGRST_SERVER_PORT=80
- PGRST_DB_PRE_CONFIG=postgrest.pre_config


@ -11,7 +11,6 @@ source "amazon-ebs" "ubuntu" {
ami_name = "${var.ami_name}"
instance_type = "${var.instance_type}"
region = "${var.ami_region}"
ami_regions = "${var.ami_regions}"
ami_groups = "${var.ami_groups}"
source_ami_filter {
@ -30,7 +29,7 @@ source "amazon-ebs" "ubuntu" {
launch_block_device_mappings {
device_name = "/dev/sda1"
volume_size = 30
volume_size = 15
delete_on_termination = true
}


@ -9,7 +9,7 @@ variable "instance_type" {
variable "ami_region" {
type = string
default = "us-west-2"
default = "us-east-1"
}
variable "ami_groups" {
@ -17,11 +17,6 @@ variable "ami_groups" {
default = ["all"]
}
variable "ami_regions" {
type = list(string)
default = ["us-west-1","us-east-1", "us-east-2", "eu-central-1", "ap-northeast-1", "ca-central-1"]
}
variable "PACKER_BUILDER_TYPE" {
type = string
default = "amazon-ebs"


@ -40,6 +40,8 @@ services:
platform: linux/x86_64
depends_on:
- postgres
env_file:
- .env
volumes:
- ./server:/app/server:delegated
- ./plugins:/app/plugins
@ -57,7 +59,7 @@ services:
container_name: postgrest
image: postgrest/postgrest:v12.0.2
ports:
- "3001:3000"
- "3002:3002"
env_file:
- .env
depends_on:
@ -76,5 +78,22 @@ services:
- POSTGRES_USER=${PG_USER}
- POSTGRES_PASSWORD=${PG_PASS}
redis:
container_name: redis
image: redis:6.2
restart: always
deploy:
resources:
limits:
cpus: '0.5'
memory: 1G
env_file:
- .env
environment:
- MASTER=redis
- REDIS_USER=${REDIS_USER}
- REDIS_PASSWORD=${REDIS_PASSWORD}
volumes:
postgres:
redis:


@ -0,0 +1,82 @@
# Create .env from this example file and replace values for the environment.
# The application expects a separate .env.test for test environment configuration
# Get detailed information about each variable here: https://docs.tooljet.com/docs/setup/env-vars
TOOLJET_HOST=http://localhost:8082
LOCKBOX_MASTER_KEY= # replace_with_lockbox_master_key
SECRET_KEY_BASE= # replace_with_secret_key_base
# DATABASE CONFIG
ORM_LOGGING=all
PG_DB=tooljet_production
PG_USER=postgres
PG_HOST=postgresql
PG_PASS= # postgres database password
# The Postgres values above are set to their defaults. Modify them as needed for your environment.
# TOOLJET DATABASE
TOOLJET_DB=tooljet_db
TOOLJET_DB_USER=postgres
TOOLJET_DB_HOST=postgresql
TOOLJET_DB_PASS=
PGRST_DB_URI= # postgres://<postgres_username>:<postgres_password>@<postgres_hostname>/<database_name>
PGRST_HOST=postgrest:3002
PGRST_JWT_SECRET= # If you have openssl installed, you can run the following command openssl rand -hex 32 to generate the value for PGRST_JWT_SECRET.
PGRST_SERVER_PORT=3002
PGRST_DB_PRE_CONFIG=postgrest.pre_config
# Redis configuration
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_USER=default
REDIS_PASSWORD=
# Checks every 24 hours to see if a new version of ToolJet is available
# (Enabled by default. Set false to disable)
CHECK_FOR_UPDATES=true
# Checks every 24 hours to update app telemetry data to ToolJet hub.
# (Telemetry is enabled by default. Set value to true to disable.)
# DISABLE_TOOLJET_TELEMETRY=false
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
# EMAIL CONFIGURATION
DEFAULT_FROM_EMAIL=hello@tooljet.io
SMTP_USERNAME=
SMTP_PASSWORD=
SMTP_DOMAIN=
SMTP_PORT=
# DISABLE USER SIGNUPS (true or false). Only applicable if the Multi-Workspace feature is enabled
DISABLE_SIGNUPS=
# OBSERVABILITY
APM_VENDOR=
SENTRY_DNS=
SENTRY_DEBUG=
# FEATURE TOGGLE
COMMENT_FEATURE_ENABLE=
ENABLE_MULTIPLAYER_EDITING=true
# SSO (Applicable only for Multi-Workspace)
SSO_GOOGLE_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_ID=
SSO_GIT_OAUTH2_CLIENT_SECRET=
SSO_GIT_OAUTH2_HOST=
SSO_ACCEPTED_DOMAINS=
SSO_DISABLE_SIGNUPS=
#ONBOARDING
ENABLE_ONBOARDING_QUESTIONS_FOR_ALL_SIGN_UPS=
#session expiry in minutes
USER_SESSION_EXPIRY=2880
#TELEMETRY
DEPLOYMENT_PLATFORM=docker


@ -5,6 +5,24 @@ if [ -f "./.env" ]; then
export $(grep -v '^#' ./.env | xargs -d '\n') || true
fi
# Check if PGRST_HOST starts with "localhost"
if [[ "$PGRST_HOST" == localhost:* ]]; then
echo "Starting PostgREST server locally..."
# Generate PostgREST configuration in a writable directory
POSTGREST_CONFIG_PATH="/tmp/postgrest.conf"
echo "db-uri = \"${PGRST_DB_URI}\"" > "$POSTGREST_CONFIG_PATH"
echo "db-pre-config = \"postgrest.pre_config\"" >> "$POSTGREST_CONFIG_PATH"
echo "server-port = \"${PGRST_SERVER_PORT}\"" >> "$POSTGREST_CONFIG_PATH"
# Starting PostgREST
echo "Starting PostgREST..."
postgrest "$POSTGREST_CONFIG_PATH" &
else
echo "Using external PostgREST at $PGRST_HOST."
fi
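For a local `PGRST_HOST`, the three `echo` lines above write a minimal PostgREST config file. A sketch with illustrative values (the URI and password are placeholders):

```shell
# Reproduce the config generation above with example values.
PGRST_DB_URI="postgres://postgres:secret@postgresql/tooljet_db"   # illustrative
PGRST_SERVER_PORT=3001
POSTGREST_CONFIG_PATH="/tmp/postgrest.conf"
echo "db-uri = \"${PGRST_DB_URI}\"" > "$POSTGREST_CONFIG_PATH"
echo "db-pre-config = \"postgrest.pre_config\"" >> "$POSTGREST_CONFIG_PATH"
echo "server-port = \"${PGRST_SERVER_PORT}\"" >> "$POSTGREST_CONFIG_PATH"
cat "$POSTGREST_CONFIG_PATH"
```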
if [ -d "./server/dist" ]; then
SETUP_CMD='npm run db:setup:prod'
else


@ -35,6 +35,19 @@ RUN npm install -g @nestjs/cli
RUN npm install -g copyfiles
RUN npm --prefix server run build
# Install dependencies for PostgREST, curl, tar, etc.
RUN apt-get update && apt-get install -y \
curl ca-certificates tar \
&& rm -rf /var/lib/apt/lists/*
ENV POSTGREST_VERSION=v12.2.0
RUN curl -Lo postgrest.tar.xz https://github.com/PostgREST/postgrest/releases/download/${POSTGREST_VERSION}/postgrest-v12.2.0-linux-static-x64.tar.xz && \
tar -xf postgrest.tar.xz && \
mv postgrest /postgrest && \
rm postgrest.tar.xz && \
chmod +x /postgrest
FROM debian:12
RUN apt-get update -yq \
@ -103,6 +116,13 @@ RUN useradd --create-home --home-dir /home/appuser appuser \
&& chmod u+x /app \
&& chmod -R g=u /app
# Use the PostgREST binary from the builder stage
COPY --from=builder --chown=appuser:0 /postgrest /usr/local/bin/postgrest
RUN mv /usr/local/bin/postgrest /usr/local/bin/postgrest-original && \
echo '#!/bin/bash\nexec /usr/local/bin/postgrest-original "$@" 2>&1 | sed "s/^/[PostgREST] /"' > /usr/local/bin/postgrest && \
chmod +x /usr/local/bin/postgrest
# Set npm cache directory
ENV npm_config_cache /home/appuser/.npm


@ -1,9 +1,9 @@
# pull official base image
FROM node:18.18.2-buster
FROM node:22.15.1-bullseye
ENV NODE_ENV=development
RUN npm i -g npm@9.8.1
RUN npm i -g npm@10.9.2
# set working directory
WORKDIR /app

docker/internal.sh Executable file

@ -0,0 +1,101 @@
#!/bin/bash
# Load the .env file
source .env
# Check if LOCKBOX_MASTER_KEY is present or empty
if [[ -z "$LOCKBOX_MASTER_KEY" ]]; then
# Generate LOCKBOX_MASTER_KEY
LOCKBOX_MASTER_KEY=$(openssl rand -hex 32)
# Update .env file
awk -v key="$LOCKBOX_MASTER_KEY" '
BEGIN { FS=OFS="=" }
/^LOCKBOX_MASTER_KEY=/ { $2=key; found=1 }
1
END { if (!found) print "LOCKBOX_MASTER_KEY="key }
' .env > temp.env && mv temp.env .env
echo "Generated a secure master key for the lockbox"
else
echo "The lockbox master key already exists."
fi
# Check if SECRET_KEY_BASE is present or empty
if [[ -z "$SECRET_KEY_BASE" ]]; then
# Generate SECRET_KEY_BASE
SECRET_KEY_BASE=$(openssl rand -hex 64)
# Update .env file
awk -v key="$SECRET_KEY_BASE" '
BEGIN { FS=OFS="=" }
/^SECRET_KEY_BASE=/ { $2=key; found=1 }
1
END { if (!found) print "SECRET_KEY_BASE="key }
' .env > temp.env && mv temp.env .env
echo "Created a secret key for secure operations."
else
echo "The secret key base is already in place."
fi
# Check if PGRST_JWT_SECRET is present or empty
if [[ -z "$PGRST_JWT_SECRET" ]]; then
# Generate PGRST_JWT_SECRET
PGRST_JWT_SECRET=$(openssl rand -hex 32)
# Update .env file
awk -v key="$PGRST_JWT_SECRET" '
BEGIN { FS=OFS="=" }
/^PGRST_JWT_SECRET=/ { $2=key; found=1 }
1
END { if (!found) print "PGRST_JWT_SECRET="key }
' .env > temp.env && mv temp.env .env
echo "Generated a unique secret for PGRST authentication."
else
echo "The PGRST JWT secret is already generated and in place."
fi
# Function to generate a random password
generate_password() {
openssl rand -base64 12 | tr -d '/+' | cut -c1-16
}
# Check if PG_PASS and TOOLJET_DB_PASS are present or empty
if [[ -z "$PG_PASS" ]] && [[ -z "$TOOLJET_DB_PASS" ]]; then
# Generate random passwords
PASSWORD=$(generate_password)
# Update .env file
awk -v pass="$PASSWORD" '
BEGIN { FS=OFS="=" }
/^(PG_PASS|TOOLJET_DB_PASS)=/ { $2=pass; found=1 }
1
END { if (!found) print "PG_PASS="pass ORS "TOOLJET_DB_PASS="pass }
' .env > temp.env && mv temp.env .env
echo "Successfully generated a secure password for the PostgreSQL database."
else
echo "Postgres password already exists."
fi
# Check if PGRST_DB_URI is present or empty
if [[ -z "$PGRST_DB_URI" ]]; then
# Construct PGRST_DB_URI with PG_PASS
PGRST_DB_URI="postgres://postgres:$PASSWORD@postgresql/tooljet_db"
# Update .env file for PGRST_DB_URI
awk -v uri="$PGRST_DB_URI" '
BEGIN { FS=OFS="=" }
/^PGRST_DB_URI=/ { $2=uri; found=1 }
1
END { if (!found) print "PGRST_DB_URI="uri }
' .env > temp.env && mv temp.env .env
echo "Successfully updated PGRST database URI"
else
echo "The PGRST DB URI is already configured and in use."
fi
exec "$@"
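The awk idiom used throughout this script (update the key in place, or append it if absent) can be exercised in isolation; the file name, key, and value below are illustrative:

```shell
# Demonstrate the update-or-append .env idiom from internal.sh.
printf 'FOO=old\nBAR=1\n' > /tmp/demo.env
KEY_VALUE="newvalue"   # illustrative; internal.sh uses openssl rand here
awk -v key="$KEY_VALUE" '
  BEGIN { FS=OFS="=" }
  /^FOO=/ { $2=key; found=1 }
  1
  END { if (!found) print "FOO="key }
' /tmp/demo.env > /tmp/demo.tmp && mv /tmp/demo.tmp /tmp/demo.env
cat /tmp/demo.env
```

Because the `1` pattern prints every line, untouched keys such as `BAR` pass through unchanged.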


@ -1,7 +1,7 @@
# pull official base image
FROM node:18.18.2-buster
FROM node:22.15.1-bullseye
RUN npm i -g npm@9.8.1
RUN npm i -g npm@10.9.2
# set working directory
WORKDIR /app


@ -1,5 +1,5 @@
# pull official base image
FROM node:18.18.2-buster
FROM node:22.15.1-bullseye
RUN apt-get update && apt-get install -y postgresql-client freetds-dev libaio1 wget
# Install Instantclient Basic Light Oracle and Dependencies
@ -19,7 +19,7 @@ WORKDIR /
ENV NODE_ENV=development
ENV NODE_OPTIONS="--max-old-space-size=4096"
RUN npm i -g npm@9.8.1
RUN npm i -g npm@10.9.2
RUN mkdir -p /app
WORKDIR /app
@ -30,4 +30,4 @@ COPY ./server/package.json ./server/package-lock.json ./server/
RUN npm --prefix server install
COPY ./server/ ./server/
ENTRYPOINT ["./server/entrypoint.sh"]
ENTRYPOINT ["./server/local-ce-entrypoint.sh"]


@ -3,7 +3,7 @@
<h1 align="center">ToolJet Documentation</h1>
</p>
The directory "ToolJet/docs/" holds the code and markdown source files for the ToolJet documentation website, which is accessible at [docs.tooljet.ai](docs.tooljet.ai)
The directory "ToolJet/docs/" holds the code and markdown source files for the ToolJet documentation website, which is accessible at [docs.tooljet.com](docs.tooljet.com)
## Index
- [Feedback](#feedback)
@ -26,7 +26,7 @@ In case you encounter any issues with the ToolJet product, please select the rel
To contribute to ToolJet documentation, you need to fork this repository and submit a pull request for the Markdown and/or image changes that you're proposing.
### Repository organization
The content in this directory follows the organization of documentation at https://docs.tooljet.ai
The content in this directory follows the organization of documentation at https://docs.tooljet.com
This directory contains the following folders:
@ -41,15 +41,15 @@ This directory contains the following folders:
├── versioned_docs
│ ├── version-x.x.x # Current/latest version (set it in docusaurus.config.js)
│ │ ├── Enterprise
│ │ │ └── multi-environment.md # https://docs.tooljet.ai/docs/Enterprise/multi-environment
│ │ └── tooljet-database.md. # https://docs.tooljet.ai/docs/tooljet-database
│ │ │ └── multi-environment.md # https://docs.tooljet.com/docs/Enterprise/multi-environment
│ │ └── tooljet-database.md. # https://docs.tooljet.com/docs/tooljet-database
│ └── version-2.0.0
│ │ ├── Enterprise
│ │ │ └── multi-environment.md # https://docs.tooljet.ai/docs/2.0.0/Enterprise/multi-environment
│ │ │ └── multi-environment.md # https://docs.tooljet.com/docs/2.0.0/Enterprise/multi-environment
│ │ └── tooljet-database.md
│ └── version-1.0.0
│ ├── Enterprise
│ │ └── multi-environment.md # https://docs.tooljet.ai/docs/1.0.0/Enterprise/multi-environment
│ │ └── multi-environment.md # https://docs.tooljet.com/docs/1.0.0/Enterprise/multi-environment
│ └── tooljet-database.md
├── versioned_sidebars # includes sidebar for the specific versions
│ ├── version-x.x.x-sidebars.json


@ -39,16 +39,16 @@ Make sure to run it within the WSL2 terminal.
git clone https://github.com/<your-username>/ToolJet.git
```
3. Create a `.env` file by copying `.env.example`. More information on the variables that can be set is given in the **[environment variables reference](/docs/setup/env-vars)**.
3. Create a `.env` file by copying `.env.internal.example`. More information on the variables that can be set is given in the **[environment variables reference](/docs/setup/env-vars)**.
```bash
cp ./deploy/docker/.env.internal.example .env
cp ./docker/.env.internal.example .env
```
4. Populate the keys in the `.env` using the below the command:
```bash
chmod +x ./deploy/docker/internal.sh && ./deploy/docker/internal.sh
chmod +x ./docker/internal.sh && ./docker/internal.sh
```
:::warning


@ -67,7 +67,7 @@ To create a query for sending an email, follow these steps:
- **CC mail to** : Email address of the recipients that will receive a copy of the email, and their email addresses will be visible to other recipients.
- **BCC mail to** : Email address of the recipients that will receive a copy of the email, but their email addresses will be hidden from other recipients.
- **Attachments** : You can add attachments to an SMTP query by referencing the file from the File Picker component in the attachments field.
- For instance, you can set the `Attachments` field value to `{{ components.filepicker1.file }}` or pass an object `{{ name: 'filename.jpg', dataURL: '......' }}` to include attachments.
- For instance, you can set the `Attachments` field value to `{{ components.filepicker1.file }}` or pass an object `{{[{ name: "filename.jpg", dataURL: " " }]}}` to include attachments.
<img className="screenshot-full" src="/img/datasource-reference/smtp/querysmtp-v2.png" alt="smtp connect" />


@ -55,6 +55,8 @@ To remove a plugin, follow these steps:
- On the `Installed` page, click on the `Remove` button of the related plugin that you wish to remove.
## Available Plugins
- **[Aftership](/docs/marketplace/plugins/marketplace-plugin-aftership)**
- **[Anthropic](/docs/marketplace/plugins/marketplace-plugin-anthropic)**
- **[AWS Redshift](/docs/marketplace/plugins/marketplace-plugin-awsredshift)**
- **[AWS Textract](/docs/marketplace/plugins/marketplace-plugin-textract)**
@ -67,6 +69,7 @@ To remove a plugin, follow these steps:
- **[HarperDB](/docs/marketplace/plugins/marketplace-plugin-harperdb)**
- **[Hugging Face](/docs/marketplace/plugins/marketplace-plugin-hugging_face)**
- **[Jira](/docs/marketplace/plugins/marketplace-plugin-jira)**
- **[Microsoft Graph](/docs/marketplace/plugins/marketplace-plugin-microsoft_graph)**
- **[Mistral AI](/docs/marketplace/plugins/marketplace-plugin-mistral_ai)**
- **[OpenAI](/docs/marketplace/plugins/marketplace-plugin-openai)**
- **[Pinecone](/docs/marketplace/plugins/marketplace-plugin-pinecone)**
@ -78,6 +81,7 @@ To remove a plugin, follow these steps:
- **[Salesforce](/docs/marketplace/plugins/marketplace-plugin-salesforce)**
- **[Sharepoint](/docs/marketplace/plugins/marketplace-plugin-sharepoint)**
- **[Supabase](/docs/marketplace/plugins/marketplace-plugin-supabase)**
- **[UPS](/docs/marketplace/plugins/marketplace-plugin-ups)**
- **[Weaviate](/docs/marketplace/plugins/marketplace-plugin-weaviate)**
:::info For Plugin Developers


@ -0,0 +1,185 @@
---
id: marketplace-plugin-aftership
title: Aftership
---
Integrating AfterShip with ToolJet enables teams to build custom internal tools for tracking and managing shipments in real time. With this integration, you can fetch delivery statuses, monitor carrier updates, and centralize logistics data within your ToolJet application, streamlining operations and improving customer support efficiency.
## Connection
To connect AfterShip with ToolJet, you will need an API key, which you can generate from the [AfterShip Tracking API](https://www.aftership.com/tracking-api).
<img className="screenshot-full img-full" src="/img/marketplace/plugins/aftership/connection.png" alt="Aftership Configuration" />
## Supported Operations
### Tracking
#### Basic Tracking Operations
| Method | Endpoint | Description |
| ------ | ------------ | --------------------------- |
| GET | `/trackings` | Retrieve list of trackings. |
| POST | `/trackings` | Create a new tracking. |
| GET | `/couriers` | Get supported courier list. |
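As a sketch, the `POST /trackings` row above corresponds to a direct API call like the one printed below. The `v4` base URL and `aftership-api-key` header follow AfterShip's public REST API and are assumptions here; the tracking number and slug are placeholders, and the ToolJet plugin issues this call on your behalf:

```shell
# Print (not send) an example create-tracking request; substitute your
# API key and run the printed command to execute it.
REQ=$(cat <<'EOF'
curl -X POST https://api.aftership.com/v4/trackings \
  -H 'aftership-api-key: <your-api-key>' \
  -H 'Content-Type: application/json' \
  -d '{"tracking":{"tracking_number":"1234567890","slug":"fedex"}}'
EOF
)
echo "$REQ"
```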
#### ID
| Method | Endpoint | Description |
| ------ | ----------------------------------- | ---------------------------- |
| GET | `/trackings/{id}` | Get tracking by ID. |
| PUT | `/trackings/{id}` | Update tracking by ID. |
| DELETE | `/trackings/{id}` | Delete tracking by ID. |
| POST | `/trackings/{id}/retrack` | Retrack an expired tracking. |
| POST | `/trackings/{id}/mark-as-completed` | Mark tracking as completed. |
#### Detect
| Method | Endpoint | Description |
| ------ | ------------------ | ---------------------------------- |
| POST | `/couriers/detect` | Detect courier by tracking number. |
#### All
| Method | Endpoint | Description |
| ------ | ------------------ | ---------------------------------- |
| GET | `/couriers/all` | Get all available couriers. |
#### Predict Batch
| Method | Endpoint | Description |
| ------ | ---------------------------------------- | ------------------------------------- |
| POST | `/estimated-delivery-date/predict-batch` | Predict estimated delivery for batch. |
### Shipping
#### Labels
| Method | Endpoint | Description |
| ------ | -------------- | ----------------- |
| GET | `/labels` | Get labels |
| POST | `/labels` | Create a label |
| GET | `/labels/{id}` | Get a label by ID |
#### Cancel Labels
| Method | Endpoint | Description |
| ------ | --------------------- | --------------------------- |
| GET | `/cancel-labels` | Get the cancelled labels |
| POST | `/cancel-labels` | Cancel a label |
| GET | `/cancel-labels/{id}` | Get a cancelled label by ID |
#### Rates
| Method | Endpoint | Description |
| ------ | ------------- | ---------------- |
| GET | `/rates` | Get rates |
| POST | `/rates` | Calculate rates |
| GET | `/rates/{id}` | Get a rate by ID |
#### Manifests
| Method | Endpoint | Description |
| ------ | ----------------- | -------------------- |
| GET | `/manifests` | Get manifests |
| POST | `/manifests` | Create a manifest |
| GET | `/manifests/{id}` | Get a manifest by ID |
#### Couriers
| Method | Endpoint | Description |
| ------ | ----------- | ---------------- |
| GET | `/couriers` | Get all couriers |
#### Address Validations
| Method | Endpoint | Description |
| ------ | ---------------------- | ---------------------------- |
| POST | `/address-validations` | Create an address validation |
#### Location
| Method | Endpoint | Description |
| ------ | ------------ | ------------------------------------------- |
| GET | `/locations` | Get carrier locations (requires production) |
#### Pickup
| Method | Endpoint | Description |
| ------ | --------------- | ---------------------------------------------------- |
| GET | `/pickups` | Get pickups |
| POST | `/pickups` | Create a pickup (FedEx, UPS, DHL Express, Purolator) |
| GET | `/pickups/{id}` | Get a pickup by ID |
#### Cancel Pickups
| Method | Endpoint | Description |
| ------ | ---------------------- | ---------------------------- |
| GET | `/cancel-pickups` | Get the cancelled pickups |
| POST | `/cancel-pickups` | Cancel a pickup |
| GET | `/cancel-pickups/{id}` | Get a cancelled pickup by ID |
#### Shipper Accounts
| Method | Endpoint | Description |
| ------ | ------------------------------------ | ----------------------------------------- |
| GET | `/shipper-accounts` | Get shipper accounts |
| POST | `/shipper-accounts` | Create a shipper account |
| GET | `/shipper-accounts/{id}` | Get a shipper account by ID |
| DELETE | `/shipper-accounts/{id}` | Delete a shipper account |
| PUT | `/shipper-accounts/{id}/info` | Update shipper account's information |
| PUT | `/shipper-accounts/{id}/credentials` | Update shipper account's credentials |
| PUT | `/shipper-accounts/{id}/settings` | Update shipper account's settings (FedEx) |
### Return
#### Returns Management
| Method | Endpoint | Description |
| ------ | --------------------------- | -------------------------------------------- |
| GET | `/returns` | Get returns with optional filtering |
| POST | `/returns` | Create a new return (supports only "Refund") |
| GET | `/returns/{return_id}` | Get return detail by return ID |
| GET | `/returns/rma/{rma_number}` | Get return detail by RMA number |
#### Return Status Management
| Method | Endpoint | Description |
| ------ | ----------------------------------- | ---------------------------- |
| POST | `/returns/{return_id}/approve` | Approve return by return ID |
| POST | `/returns/rma/{rma_number}/approve` | Approve return by RMA number |
| POST | `/returns/{return_id}/resolve` | Resolve return by return ID |
| POST | `/returns/rma/{rma_number}/resolve` | Resolve return by RMA number |
| POST | `/returns/{return_id}/reject` | Reject return by return ID |
| POST | `/returns/rma/{rma_number}/reject` | Reject return by RMA number |
#### Item Management
| Method | Endpoint | Description |
| ------ | ------------------------------------------- | ---------------------------------------------- |
| POST | `/returns/{return_id}/receive-items` | Record received items by return ID |
| POST | `/returns/rma/{rma_number}/receive-items` | Record received items by RMA number |
| PUT | `/returns/{return_id}/items/{item_id}` | Update return item (tags/images) by return ID |
| PUT | `/returns/rma/{rma_number}/items/{item_id}` | Update return item (tags/images) by RMA number |
| POST | `/returns/{return_id}/remove-items` | Remove items from return by return ID |
| POST | `/returns/rma/{rma_number}/remove-items` | Remove items from return by RMA number |
#### Shipping Management
| Method | Endpoint | Description |
| ------ | -------------------------------------------- | ---------------------------------- |
| POST | `/returns/{return_id}/attach-shipments` | Upload shipment info by return ID |
| POST | `/returns/rma/{rma_number}/attach-shipments` | Upload shipment info by RMA number |
#### Dropoff Management
| Method | Endpoint | Description |
| ------ | ------------------------------------------------------- | -------------------------------------- |
| POST | `/returns/rma/{rma_number}/dropoffs/{dropoff_id}/drops` | Record dropped-off items (QR dropoffs) |
#### Utility Endpoints
| Method | Endpoint | Description |
| ------ | --------------- | ---------------------------------------------------- |
| POST | `/returns/link` | Generate returns page deep link with pre-filled info |
| GET | `/item-tags` | Retrieve all available item tags |


@ -0,0 +1,246 @@
---
id: marketplace-plugin-couchbase
title: Couchbase
---
ToolJet integrates with Couchbase to utilize its NoSQL database capabilities and advanced vector search features. This integration enables ToolJet to perform document operations such as creating, reading, updating, and deleting documents, as well as executing SQL++ queries and Full-Text Search (FTS) operations in Couchbase databases. With Couchbase's vector store capabilities, ToolJet can leverage semantic search and hybrid search that combines traditional and AI-powered queries to build intelligent applications.
:::note
Before following this guide, it is assumed that you have already completed the process of **[Using Marketplace plugins](/docs/marketplace/marketplace-overview#using-marketplace-plugins)**.
:::
## Connection
For connecting to Couchbase, the following credentials are required:
- **Data API Endpoint**: Your Couchbase Data API endpoint URL
- **Username**: Your Couchbase username
- **Password**: Your Couchbase password
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/connection.png" alt="Configuring Couchbase in ToolJet" />
## Supported Operations
- **[Get Document](#get-document)**
- **[Create Document](#create-document)**
- **[Update Document](#update-document)**
- **[Delete Document](#delete-document)**
- **[Query](#query)**
- **[FTS Search](#fts-search)**
### Get Document
This operation retrieves a specific document by its ID from a Couchbase collection.
#### Required Parameters
- **Bucket**: The name of the bucket containing the document
- **Document ID**: The unique identifier of the document to retrieve
- **Scope**: The scope name
- **Collection**: The collection name
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/get-document.png" alt="Get Document Operation" />
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```json
{
"id": "user::123",
"name": "John Doe",
"email": "john@example.com",
"age": 30,
"created_at": "2023-01-15T10:30:00Z"
}
```
</details>
### Create Document
This operation creates a new document in a Couchbase collection.
#### Required Parameters
- **Bucket**: The name of the bucket to create the document in
- **Scope**: The scope name
- **Collection**: The collection name
- **Document ID**: The unique identifier for the new document
- **Document**: The document data as a JSON object
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/create-document.png" alt="Create Document Operation" />
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```yaml
Created successfully
```
</details>
### Update Document
This operation updates an existing document in a Couchbase collection.
#### Required Parameters
- **Bucket**: The name of the bucket containing the document
- **Scope**: The scope name
- **Collection**: The collection name
- **Document ID**: The unique identifier of the document to update
- **Document**: The updated document data as a JSON object
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/update-document.png" alt="Update Document Operation" />
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```yaml
Updated successfully
```
</details>
Note: The update operation replaces the original document with the document value passed.
### Delete Document
This operation deletes a document from a Couchbase collection.
#### Required Parameters
- **Bucket**: The name of the bucket containing the document
- **Scope**: The scope name
- **Collection**: The collection name
- **Document ID**: The unique identifier of the document to delete
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/delete-document.png" alt="Delete Document Operation" />
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```yaml
Deleted successfully
```
</details>
### Query
This operation executes SQL++ queries against your Couchbase database.
#### Required Parameters
- **SQL++ Query**: The SQL++ statement to execute (use `$parameter` placeholders for named parameters)
#### Optional Parameters
- **Arguments (Key-Value)**: Key-value object for named parameters that replace `$parameter` placeholders in the query
- **Query Options**: JSON object containing additional query options like `readonly`, `timeout`, etc.
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/query.png" alt="Query Operation" />
<details id="tj-dropdown">
<summary>**Example Query**</summary>
```sql
SELECT * FROM `travel-sample`.`inventory`.`airline` WHERE country = $country LIMIT 10
```
**Arguments (Key-Value)**: `{ "$country": "France" }`
**Query Options**: `{ "readonly": true, "query_context": "travel-sample.inventory" }`
Refer to the [request parameters](https://docs.couchbase.com/server/current/n1ql-rest-query/index.html#Request) for supported query options.
</details>
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```json
{
"results": [
{
"airline": {
"id": 137,
"type": "airline",
"name": "Air France",
"iata": "AF",
"icao": "AFR",
"callsign": "AIRFRANS",
"country": "France"
}
}
],
"status": "success",
"metrics": {
"elapsedTime": "15.2ms",
"executionTime": "14.8ms",
"resultCount": 1,
"resultSize": 234
}
}
```
</details>
### FTS Search
This operation performs Full-Text Search queries against a Couchbase FTS index.
#### Required Parameters
- **Bucket**: The name of the bucket to search in
- **Scope**: The scope name
- **Index Name**: The name of the FTS index to search against
- **Search Query**: The FTS search query as a JSON object
<img className="screenshot-full" src="/img/marketplace/plugins/couchbase/fts-search.png" alt="FTS Search Operation" />
<details id="tj-dropdown">
<summary>**Example Search Query**</summary>
```json
{
"query": {
"match": "hotel",
"field": "name"
}
}
```
</details>
<details id="tj-dropdown">
<summary>**Example Response**</summary>
```json
{
"status": {
"total": 1,
"failed": 0,
"successful": 1
},
"request": {
"query": {
"match": "hotel",
"field": "name"
}
},
"hits": [
{
"index": "hotel-index",
"id": "hotel_123",
"score": 0.8567,
"fields": {
"name": "Grand Hotel",
"city": "Paris",
"country": "France"
}
}
],
"total_hits": 1,
"max_score": 0.8567,
"took": 12
}
```
</details>


@ -0,0 +1,231 @@
---
id: marketplace-plugin-microsoft_graph
title: Microsoft Graph
---
By integrating Microsoft Graph with ToolJet, you can interact with Microsoft 365 services such as Outlook Mail, Calendar, Users, and OneDrive.
## Connection
To connect ToolJet with Microsoft Graph, you'll need the following credentials:
- Tenant
- Access token URL
- Client ID
- Client secret
Follow this [Microsoft guide](https://learn.microsoft.com/en-us/graph/auth-register-app-v2) to register an app and generate the required credentials.
You can enable the **Authentication required for all users** toggle in the configuration panel. When enabled, each user will be redirected to the OAuth consent screen the first time a query from this data source is triggered in your application. This ensures that every user connects with their own Microsoft account securely.
**Note**: After completing the OAuth flow, the query must be triggered again to fetch data from Microsoft Graph.
<img className="screenshot-full img-full" src="/img/marketplace/plugins/microsoft-graph/connection.png" alt="Microsoft Graph Configuration" />
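Behind the scenes, the tenant, client ID, and client secret map onto the Microsoft identity platform's v2.0 token endpoint. A hedged sketch of the token request the data source performs (printed, not sent; `<tenant>`, `<client-id>`, and `<client-secret>` are placeholders, and the exact flow ToolJet uses may differ):

```shell
# Print an example client-credentials token request against the
# Microsoft identity platform v2.0 endpoint.
REQ=$(cat <<'EOF'
curl -X POST https://login.microsoftonline.com/<tenant>/oauth2/v2.0/token \
  -d 'client_id=<client-id>' \
  -d 'client_secret=<client-secret>' \
  -d 'scope=https://graph.microsoft.com/.default' \
  -d 'grant_type=client_credentials'
EOF
)
echo "$REQ"
```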
## Supported Operations
### Outlook
#### Messages
| Method | Endpoint | Description |
| ------ | ------------------------------------------ | ------------------------------------ |
| GET | `/me/messages` | List messages in the user's mailbox. |
| POST | `/me/messages` | Create a new draft message. |
| GET | `/me/messages/{message-id}` | Get a specific message by ID. |
| PATCH | `/me/messages/{message-id}` | Update a message. |
| DELETE | `/me/messages/{message-id}` | Delete a message. |
| POST | `/me/messages/{message-id}/forward` | Forward an existing message. |
| POST | `/me/messages/{message-id}/createForward` | Create a forward draft. |
| POST | `/me/messages/{message-id}/reply` | Reply to a message. |
| POST | `/me/messages/{message-id}/createReply` | Create a reply draft. |
| POST | `/me/messages/{message-id}/replyAll` | Reply all to a message. |
| POST | `/me/messages/{message-id}/createReplyAll` | Create a reply-all draft. |
| POST | `/me/messages/{message-id}/send` | Send a draft message. |
| POST | `/me/messages/{message-id}/move` | Move a message. |
| POST | `/me/messages/{message-id}/copy` | Copy a message. |
| POST | `/me/sendMail` | Send mail without creating a draft. |
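As an illustration of the `/me/sendMail` row above, the request can be exercised outside ToolJet with `curl` (a sketch: the base URL is Microsoft Graph's documented `https://graph.microsoft.com/v1.0`, while the recipient, subject, and `ACCESS_TOKEN` are placeholders):

```shell
# Assemble a sendMail request body (payload shape per Microsoft Graph's sendMail schema).
GRAPH_BASE="https://graph.microsoft.com/v1.0"
PAYLOAD='{
  "message": {
    "subject": "Weekly report",
    "body": { "contentType": "Text", "content": "Report attached." },
    "toRecipients": [ { "emailAddress": { "address": "user@example.com" } } ]
  },
  "saveToSentItems": "true"
}'
# Uncomment to send for real (requires a valid OAuth access token):
# curl -s -X POST "$GRAPH_BASE/me/sendMail" \
#   -H "Authorization: Bearer $ACCESS_TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
echo "$GRAPH_BASE/me/sendMail"
```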
#### Mail Folders
| Method | Endpoint | Description |
| ------ | ------------------------------------------------ | ------------------------------------- |
| GET | `/me/mailFolders` | List mail folders. |
| POST | `/me/mailFolders` | Create a mail folder. |
| GET | `/me/mailFolders/{mailFolder-id}` | Get specific mail folder. |
| PATCH | `/me/mailFolders/{mailFolder-id}` | Update a mail folder. |
| DELETE | `/me/mailFolders/{mailFolder-id}` | Delete a mail folder. |
| GET | `/me/mailFolders/{mailFolder-id}/messages` | List messages inside a folder. |
| GET | `/me/mailFolders/Inbox/messages/delta` | Track changes to inbox messages. |
| GET | `/me/mailFolders/{mailFolder-id}/messages/delta` | Track changes to a folder's messages. |
| GET | `/me/mailFolders/delta` | Track changes to all folders. |
#### Categories and Rooms
| Method | API Endpoint | Description |
| ------ | --------------------------------------------------- | ----------------------- |
| GET | `/me/outlook/masterCategories` | List master categories |
| POST | `/me/outlook/masterCategories` | Create a new category |
| GET | `/me/outlook/masterCategories/{outlookCategory-id}` | Get a specific category |
| PATCH | `/me/outlook/masterCategories/{outlookCategory-id}` | Update a category |
| DELETE | `/me/outlook/masterCategories/{outlookCategory-id}` | Delete a category |
| GET | `/me/findRooms` | List available rooms |
| GET | `/me/findRooms(RoomList='{roomList-emailAddress}')` | Find rooms by room list |
| GET | `/me/findRoomLists` | List room lists |
### Calendar
#### Default Calendar
| Method | API Endpoint | Description |
| ------ | ------------------------------------------------- | ------------------------------------- |
| GET | `/me/calendar` | Get default calendar |
| PATCH | `/me/calendar` | Update default calendar |
| GET | `/me/calendar/events` | List events from default calendar |
| POST | `/me/calendar/events` | Create an event in default calendar |
| GET | `/me/calendar/calendarPermissions` | List calendar permissions |
| POST | `/me/calendar/calendarPermissions` | Grant permissions to default calendar |
| GET | `/me/calendar/calendarPermissions/{permissionId}` | Get specific calendar permission |
| PATCH | `/me/calendar/calendarPermissions/{permissionId}` | Update calendar permission |
| DELETE | `/me/calendar/calendarPermissions/{permissionId}` | Delete calendar permission |
| POST | `/me/calendar/getSchedule` | Get free/busy schedule info |
#### User Calendars and Groups
| Method | API Endpoint | Description |
| ------ | ---------------------------------------- | --------------------------------------- |
| GET | `/user/{userId}/calendar` | Get default calendar of a specific user |
| GET | `/me/calendars` | List user calendars |
| POST | `/me/calendars` | Create a new calendar |
| GET | `/me/calendars/{calendarId}` | Get a specific calendar |
| PATCH | `/me/calendars/{calendarId}` | Update a calendar |
| DELETE | `/me/calendars/{calendarId}` | Delete a calendar |
| GET | `/me/calendars/{calendarId}/events` | List events in a specific calendar |
| POST | `/me/calendars/{calendarId}/events` | Create event in a specific calendar |
| GET | `/me/calendarGroups` | List calendar groups |
| POST | `/me/calendarGroups` | Create a calendar group |
| GET | `/me/calendarGroups/{groupId}/calendars` | Get calendars in a group |
| POST | `/me/calendarGroups/{groupId}/calendars` | Add calendar to a group |
#### Events
| Method | API Endpoint | Description |
| ------ | ---------------------------------- | ----------------------------------- |
| GET | `/me/events/{eventId}` | Get an event by ID |
| PATCH | `/me/events/{eventId}` | Update an event |
| DELETE | `/me/events/{eventId}` | Delete an event |
| GET | `/me/events/{eventId}/instances` | List instances of a recurring event |
| GET | `/me/events/{eventId}/attachments` | List attachments of an event |
| POST | `/me/events/{eventId}/attachments` | Add attachments to an event |
| GET | `/me/calendarView` | Get calendar view of events |
| POST | `/me/findMeetingTimes` | Find meeting times |
### Users
#### User Management
| Method | API Endpoint | Description |
| ------ | ------------------ | ---------------------- |
| GET | `/users` | List all users |
| POST | `/users` | Create a user |
| GET | `/users/{user-id}` | Get a specific user |
| PATCH | `/users/{user-id}` | Update a specific user |
| DELETE | `/users/{user-id}` | Delete a specific user |
#### Profile
| Method | API Endpoint | Description |
| ------ | ------------ | -------------------------------- |
| GET | `/me` | Get profile of signed-in user |
| PATCH | `/me` | Update profile of signed-in user |
### Teams
#### Teams and Chats
| Method | API Endpoint | Description |
| ------ | ----------------- | ------------------------------ |
| GET | `/teams` | List teams |
| POST | `/teams` | Create a team |
| GET | `/chats` | List chats |
| POST | `/chats` | Create a chat |
| GET | `/me/joinedTeams` | List teams the user has joined |
#### Chat Operations
| Method | API Endpoint | Description |
| ------ | -------------------------------------------------- | -------------------------------- |
| GET | `/chats/{chat-id}` | Get a chat |
| PATCH | `/chats/{chat-id}` | Update a chat |
| DELETE | `/chats/{chat-id}` | Delete a chat |
| GET | `/chats/{chat-id}/members` | List members in a chat |
| POST | `/chats/{chat-id}/members` | Add members to a chat |
| POST | `/chats/{chat-id}/members/add` | Add members (alternate endpoint) |
| GET | `/chats/{chat-id}/members/{conversationMember-id}` | Get chat member details |
| PATCH | `/chats/{chat-id}/members/{conversationMember-id}` | Update chat member |
| DELETE | `/chats/{chat-id}/members/{conversationMember-id}` | Remove chat member |
| GET | `/chats/{chat-id}/messages` | List messages in a chat |
| POST | `/chats/{chat-id}/messages` | Send message in a chat |
| GET | `/chats/{chat-id}/messages/{chatMessage-id}` | Get a specific chat message |
| PATCH | `/chats/{chat-id}/messages/{chatMessage-id}` | Update a chat message |
| DELETE | `/chats/{chat-id}/messages/{chatMessage-id}` | Delete a chat message |
| GET | `/chats/getAllMessages` | Get all messages across chats |
#### Team Operations
| Method | API Endpoint | Description |
| ------ | ------------------------------ | -------------------------------- |
| GET | `/teams/{team-id}` | Get a team |
| PATCH | `/teams/{team-id}` | Update a team |
| DELETE | `/teams/{team-id}` | Delete a team |
| POST | `/teams/{team-id}/archive` | Archive a team |
| POST | `/teams/{team-id}/unarchive` | Unarchive a team |
| GET | `/teams/{team-id}/members` | List team members |
| POST | `/teams/{team-id}/members` | Add team members |
| POST | `/teams/{team-id}/members/add` | Add members (alternate endpoint) |
#### Channels and Messages
| Method | API Endpoint | Description |
| ------ | ------------------------------------------------------------------ | --------------------------------------- |
| GET | `/teams/{team-id}/allChannels` | List all channels in a team |
| GET | `/teams/{team-id}/channels` | List standard channels in a team |
| POST | `/teams/{team-id}/channels` | Create a channel in a team |
| GET | `/teams/{team-id}/channels/{channel-id}` | Get channel details |
| PATCH | `/teams/{team-id}/channels/{channel-id}` | Update a channel |
| DELETE | `/teams/{team-id}/channels/{channel-id}` | Delete a channel |
| GET | `/teams/{team-id}/channels/{channel-id}/members` | List members in a channel |
| POST | `/teams/{team-id}/channels/{channel-id}/members` | Add members to a channel |
| GET | `/teams/{team-id}/channels/{channel-id}/messages` | List messages in a channel |
| POST | `/teams/{team-id}/channels/{channel-id}/messages` | Send message in a channel |
| GET | `/teams/{team-id}/channels/{channel-id}/messages/{chatMessage-id}` | Get a specific channel message |
| PATCH | `/teams/{team-id}/channels/{channel-id}/messages/{chatMessage-id}` | Update a channel message |
| DELETE | `/teams/{team-id}/channels/{channel-id}/messages/{chatMessage-id}` | Delete a channel message |
| GET | `/teams/{team-id}/allChannels/{channel-id}` | Get specific channel under all channels |
### OneDrive
#### Root and Shared Content
| Method | API Endpoint | Description |
| ------ | ------------------------------------------- | ----------------------------------- |
| GET | `/me/drive/root/children` | List items in root folder |
| POST | `/me/drive/root/children` | Create a new file or folder in root |
| GET | `/me/drive/recent` | List recent files |
| GET | `/me/drive/sharedWithMe` | List files shared with the user |
| GET | `/me/drive/root/search(q='{search-query}')` | Search files by query |
#### Specific Drives and Items
| Method | API Endpoint | Description |
| ------ | ------------------------------------------------ | ------------------------------------- |
| GET | `/drives/{drive-id}/root/children` | List items in a specific drive's root |
| GET | `/drives/{drive-id}/items/{item-id}/children` | List children of a folder |
| POST | `/drives/{drive-id}/items/{item-id}/children` | Add item to folder |
| GET | `/drives/{drive-id}/items/{item-id}` | Get metadata for an item |
| PATCH | `/drives/{drive-id}/items/{item-id}` | Update metadata of an item |
| DELETE | `/drives/{drive-id}/items/{item-id}` | Delete an item |
| GET | `/drives/{drive-id}/items/{item-id}/content` | Download file content |
| PUT | `/drives/{drive-id}/items/{item-id}/content` | Upload file content |
| POST | `/drives/{drive-id}/items/{item-id}/createLink` | Create sharing link |
| GET | `/drives/{drive-id}/items/{item-id}/permissions` | Get item permissions |
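As a sketch of the upload row above, a small file can be pushed with a single PUT. The drive ID, item ID, file name, and token below are placeholders:

```shell
DRIVE_ID="driveIdPlaceholder"
ITEM_ID="itemIdPlaceholder"
URL="https://graph.microsoft.com/v1.0/drives/$DRIVE_ID/items/$ITEM_ID/content"
# Uncomment to upload for real (requires a valid OAuth access token):
# curl -s -X PUT "$URL" \
#   -H "Authorization: Bearer $ACCESS_TOKEN" \
#   -H "Content-Type: text/plain" \
#   --data-binary @report.txt
echo "$URL"
```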

@@ -0,0 +1,98 @@
---
id: marketplace-plugin-ups
title: UPS
---
By integrating UPS with ToolJet, you can track packages, calculate shipping rates, validate addresses, and automate logistics processes, all within your ToolJet applications to enhance operational visibility and reduce manual overhead.
## Connection
To connect with UPS, you need the following credentials:
- Client ID
- Client secret
- Shipper number
You can follow the steps in the [Getting Started with UPS APIs](https://developer.ups.com/get-started) guide to generate these credentials.
<img className="screenshot-full img-full" src="/img/marketplace/plugins/ups/connection.png" alt="UPS Install" />
## Supported Operations
### Shipping
#### Version
| **Method** | **Endpoint** | **Description** |
| ---------- | --------------| -----------------|
| POST | `/shipments/{version}/ship` | Create a new shipment. |
| DELETE | `/shipments/{version}/void/cancel/{shipmentIdentificationNumber}` | Cancel a shipment using its shipment ID. |
| POST | `/labels/{version}/recovery` | Recover a label for a previously created shipment. |
#### Deprecated Version
| **Method** | **Endpoint** | **Description** |
| ---------- | --------------| -----------------|
| POST | `/shipments/{deprecatedversion}/ship` | Create shipment using an older API version. |
| DELETE | `/shipments/{deprecatedversion}/void/cancel/{shipmentIdentificationNumber}` | Cancel shipment using an older API version. |
### Rating
#### Version
| Method | API Endpoint | Description |
| ------ | ----------------------------------- | ------------------------------------------------ |
| POST | `/rating/{version}/{requestoption}` | Retrieve or calculate shipping rate quotes (UPS) |
#### Deprecated Version
| Method | API Endpoint | Description |
| ------ | --------------| ------------ |
| POST | `/rating/{deprecatedVersion}/{requestoption}` | Retrieve shipping rate quotes using a deprecated UPS API version |
### Tracking
| Method | API Endpoint | Description |
| ------ | -------------| -------------|
| GET | `/track/v1/details/{inquiryNumber}` | Retrieve shipment tracking details using the tracking (inquiry) number |
| GET | `/track/v1/reference/details/{referenceNumber}` | Retrieve tracking information using a shipment reference number |
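For example, the details endpoint above can be composed like this (a sketch: the production base URL is assumed to be `https://onlinetools.ups.com/api`, and the tracking number and token are placeholders):

```shell
UPS_BASE="https://onlinetools.ups.com/api"   # assumed production base URL
INQUIRY_NUMBER="1Z999AA10123456784"          # placeholder tracking number
URL="$UPS_BASE/track/v1/details/$INQUIRY_NUMBER"
# Uncomment to call for real (requires an OAuth token generated from your Client ID and secret):
# curl -s "$URL" \
#   -H "Authorization: Bearer $ACCESS_TOKEN" \
#   -H "transId: some-unique-id" \
#   -H "transactionSrc: tooljet"
echo "$URL"
```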
### Address Validation
#### Version
| Method | API Endpoint | Description |
| ------ | -------------| -------------|
| POST | `/addressvalidation/{version}/{requestoption}` | Validate and verify shipping addresses to ensure accuracy (UPS) |
#### Deprecated Version
| Method | API Endpoint | Description |
| ------ | -------------| -------------|
| POST | `/addressvalidation/{deprecatedVersion}/{requestoption}` | Validate shipping addresses using a deprecated UPS API version |
### Time In Transit
| Method | API Endpoint | Description |
| ------ | ------------ | ----------- |
| POST | `/shipments/{version}/{transittimes}` | Retrieve estimated transit times for shipments using UPS API |
### Pickup
#### Versions
| Method | API Endpoint | Description |
| ------ | ------------ | ----------- |
| POST | `/shipments/{version}/pickup/{pickuptype}` | Schedule a shipment pickup based on pickup type |
| GET | `/shipments/{version}/pickup/{pickuptype}` | Retrieve pickup availability or details for a specific pickup type |
| DELETE | `/shipments/{version}/pickup/{CancelBy}` | Cancel a scheduled pickup using a specified cancellation method |
| POST | `/pickupcreation/{version}/pickup` | Create a new UPS pickup request |
| GET | `/pickup/{version}/countries/{countrycode}` | Get pickup service availability for a specified country |
| POST | `/pickup/{version}/servicecenterlocations` | Locate nearby UPS service centers for pickup services |
#### Deprecated Version
| Method | API Endpoint | Description |
| ------ | -------------| ------------|
| DELETE | `/shipments/{deprecatedVersion}/pickup/{CancelBy}` | Cancel a scheduled pickup using a deprecated UPS API version |
| POST | `/pickupcreation/{deprecatedVersion}/pickup` | Create a new pickup request using a deprecated UPS API version |

@@ -16,9 +16,10 @@ Please find the latest LTS version here: <br/>
Starting from **`v3.5.0-ee-lts`** all releases are AI releases. Checkout the **[Build with AI](/docs/build-with-ai/overview)** section for more information. If you have any questions feel free to join our [Slack Community](https://join.slack.com/t/tooljet/shared_invite/zt-2rk4w42t0-ZV_KJcWU9VL1BBEjnSHLCA) or send us an email at hello@tooljet.com.
:::
| Version | Release Date | Docker Pull Command |
| ---------| ------------- | ----------------------|
| Latest EE-LTS | N/A | `docker pull tooljet/tooljet:ee-lts-latest` |
| [v3.16.0-lts](https://hub.docker.com/layers/tooljet/tooljet/v3.16.0-lts/images/sha256-626a6463504f74659e1468a69edbdacc264eded5867ae159a18358fc43d47b48) | August 4, 2025 | `docker pull tooljet/tooljet:v3.16.0-lts` |
| [v3.5.0-ee-lts](https://hub.docker.com/layers/tooljet/tooljet/v3.5.0-ee-lts/images/sha256-9580d2377d17ce0c26fca0535eca51bce899015f26bfc81769d032b4b15a5da5) | February 12, 2025 | `docker pull tooljet/tooljet:v3.5.0-ee-lts` |
| [v3.0.24-ee-lts](https://hub.docker.com/layers/tooljet/tooljet/v3.0.24-ee-lts/images/sha256-33494c8ee72c440ce0ded925cdeb15507cd87f2b7c3fe172dd1cbee790e3b96f?context=explore) | January 3, 2025 | `docker pull tooljet/tooljet:v3.0.24-ee-lts` |
| [v3.0.23-ee-lts](https://hub.docker.com/layers/tooljet/tooljet/v3.0.23-ee-lts/images/sha256-1ca2bcb5dac66b1d3d089bd8300b7077c0dcd27bb2cfe6665bf388b680294467?context=explore) | January 2, 2025 | `docker pull tooljet/tooljet:v3.0.23-ee-lts` |

@@ -129,7 +129,7 @@ To use ToolJet Database, you'd have to set up and deploy PostgREST server which
Deploying ToolJet Database is mandatory from ToolJet 3.0; otherwise the migration might break. Check out the following docs to learn more about the new major version, including breaking changes that require you to adjust your applications accordingly:
- [ToolJet 3.0 Migration Guide for Self-Hosted Versions](/docs/setup/upgrade-to-v3/)
- [Cloud](./cloud-v3-migration.md)
Follow the steps below to deploy PostgREST on an ECS cluster.

@@ -1,368 +1,248 @@
---
id: env-vars
title: Environment Variables
---
ToolJet requires several environment variables to function properly. Below is a simplified guide to setting them up.
## ToolJet Server
_If you have any questions feel free to join our [Slack Community](https://join.slack.com/t/tooljet/shared_invite/zt-2rk4w42t0-ZV_KJcWU9VL1BBEjnSHLCA) or send us an email at hello@tooljet.com._
### Required Variables
#### ToolJet Host
- `TOOLJET_HOST`: Public URL of ToolJet (e.g., `https://app.tooljet.ai`)
#### Lockbox Configuration
- `LOCKBOX_MASTER_KEY`: 32-byte hex string for encrypting datasource credentials
- Generate using: `openssl rand -hex 32`
#### Application Secret
- `SECRET_KEY_BASE`: 64-byte hex string for encrypting session cookies
- Generate using: `openssl rand -hex 64`
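The two `openssl` commands can be combined into a small script that writes both values to a `.env` file (a sketch; the `.env` location is an assumption about your setup):

```shell
# Generate both secrets and append them to .env.
LOCKBOX_MASTER_KEY="$(openssl rand -hex 32)"   # 32 random bytes -> 64 hex characters
SECRET_KEY_BASE="$(openssl rand -hex 64)"      # 64 random bytes -> 128 hex characters
printf 'LOCKBOX_MASTER_KEY=%s\nSECRET_KEY_BASE=%s\n' \
  "$LOCKBOX_MASTER_KEY" "$SECRET_KEY_BASE" >> .env
```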
#### Database Configuration
- `PG_HOST`: PostgreSQL database host
- `PG_DB`: Database name
- `PG_USER`: Username
- `PG_PASS`: Password
- `PG_PORT`: Port
**Docker Compose Setup:** If you are using a Docker Compose setup with an in-built PostgreSQL instance, set `PG_HOST` to `postgres`. This ensures that Docker's internal DNS resolves the hostname correctly, allowing the ToolJet server to connect to the database seamlessly.
**Database Connection URL:** If you intend to use the database connection URL and your database does not support SSL, use the following format when setting the `DATABASE_URL` variable:
```
DATABASE_URL=postgres://PG_USER:PG_PASS@PG_HOST:5432/PG_DB?sslmode=disable
```
Replace `PG_USER`, `PG_PASS`, `PG_HOST`, and `PG_DB` with your actual database details.
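With concrete (hypothetical) values substituted, the assembled connection URL looks like this:

```shell
# Placeholder connection details; substitute your own.
PG_USER="postgres"
PG_PASS="secret"
PG_HOST="localhost"
PG_DB="tooljet_production"
DATABASE_URL="postgres://${PG_USER}:${PG_PASS}@${PG_HOST}:5432/${PG_DB}?sslmode=disable"
echo "$DATABASE_URL"
```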
#### Disabling Automatic Database & Extension Creation (Optional)
- `PG_DB_OWNER=false`: By default, ToolJet tries to create a database based on the `PG_DB` variable and may additionally try to create Postgres extensions. This requires the Postgres user to have the `CREATEDB` permission. If this cannot be granted, set `PG_DB_OWNER` to `false` and run these steps manually.
#### ToolJet Database
- `TOOLJET_DB`: Default database name (`tooljet_db`)
- `TOOLJET_DB_HOST`: Database host
- `TOOLJET_DB_USER`: Database username
- `TOOLJET_DB_PASS`: Database password
- `TOOLJET_DB_PORT`: Database port
#### Why ToolJet Requires Two Databases
ToolJet requires two separate databases for optimal functionality. **TOOLJET_DB** is used to store the platform's internal metadata, including tables created within ToolJet. On the other hand, **PG_DB** acts as the primary database for application data, handling end-user data managed by the apps built on ToolJet.
**Automatic Database Creation:** The database name specified in `TOOLJET_DB` will be automatically created during the server boot process in all production deployment setups.
#### PostgREST
ToolJet uses **PostgREST (v12.2.0)** for API access. The following environment variables are required for PostgREST:
- `PGRST_JWT_SECRET`: JWT secret (Generate using `openssl rand -hex 32`). If this parameter is not specified, PostgREST will refuse authentication requests.
- `PGRST_DB_URI`: Database connection string
- `PGRST_LOG_LEVEL=info`
If you intend to make changes to the above configuration, refer to the [PostgREST configuration docs](https://postgrest.org/en/stable/configuration.html#environment-variables).
#### Configuring PGRST_DB_URI
`PGRST_DB_URI` is required for PostgREST, which is responsible for exposing the database as a REST API. It must be explicitly set to ensure proper functionality.
This follows the format:
```
PGRST_DB_URI=postgres://TOOLJET_DB_USER:TOOLJET_DB_PASS@TOOLJET_DB_HOST:5432/TOOLJET_DB
```
Ensure that:
- `username` and `password` match the credentials for the PostgREST database user.
- `hostname` is correctly set (`postgres` if using Docker Compose setup with an in-built PostgreSQL).
- `port` is the PostgreSQL port (default: `5432`).
- `database_name` is the database used for PostgREST (`tooljet_db` in this example).
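Putting the checklist above together with placeholder credentials (and the Docker Compose hostname `postgres`), the assembled value would be:

```shell
# Placeholder credentials; substitute those of your PostgREST database user.
TOOLJET_DB_USER="postgres"
TOOLJET_DB_PASS="tooljet"
TOOLJET_DB_HOST="postgres"   # "postgres" when using the in-built Docker Compose database
TOOLJET_DB="tooljet_db"
PGRST_DB_URI="postgres://${TOOLJET_DB_USER}:${TOOLJET_DB_PASS}@${TOOLJET_DB_HOST}:5432/${TOOLJET_DB}"
echo "$PGRST_DB_URI"
```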
#### Redis Configuration
Include the following Redis environment variables within the ToolJet deployment only if you are connecting to an external **Redis instance (v6.2)** for a multi-service or multi-pod setup and have followed the necessary steps to create Redis.
```
REDIS_HOST=
REDIS_PORT=
REDIS_USER=
REDIS_PASSWORD=
```
### Optional Configurations
#### Comments Feature
- `COMMENT_FEATURE_ENABLE=true/false`: Use this environment variable to enable/disable the feature that allows you to add comments on the canvas. To configure this environment variable, ensure that multiplayer editing is enabled in the Settings.
#### User Session Expiry
- `USER_SESSION_EXPIRY`: Controls the session expiry time in minutes. Default: **10 days**. For example, `USER_SESSION_EXPIRY=120` sets the expiry to 2 hours.
#### Password Retry Limit
By default, an account is locked after 5 failed login attempts. You can control this with:
- `DISABLE_PASSWORD_RETRY_LIMIT=true`: Disables the retry limit.
- `PASSWORD_RETRY_LIMIT=<number>`: Sets a custom retry limit (default is 5).
#### Hide Account Setup Link
- `HIDE_ACCOUNT_SETUP_LINK`: Set to `true` to hide the account setup link from the admin in the manage user page. Ensure SMTP is configured to send welcome emails.
#### Restrict Signups
Set `DISABLE_SIGNUPS=true` to allow only invited users to sign up. The signup page will still be visible but unusable.
#### SMTP Configuration
ToolJet sends emails via SMTP.
:::info
If you have upgraded from a version prior to v2.62.0, the SMTP variables in your .env file will automatically be mapped to the UI. For versions v2.62.0 and later, SMTP configuration will no longer be picked up from the .env file for Enterprise Edition. You must configure SMTP through the UI. You can safely remove these variables from your .env file after ensuring that the configuration is properly set up in the UI.
:::
For **Enterprise Edition**, configure SMTP in the ToolJet Settings UI.
For **Community Edition**, use these environment variables:
- `DEFAULT_FROM_EMAIL`: Sender email address
- `SMTP_USERNAME`: SMTP username
- `SMTP_PASSWORD`: SMTP password
- `SMTP_DOMAIN`: SMTP host
- `SMTP_PORT`: SMTP port
#### Custom CA Certificate
If ToolJet needs to connect to self-signed HTTPS endpoints, ensure the `NODE_EXTRA_CA_CERTS` environment variable is set to the absolute path of the CA certificate file.
- `NODE_EXTRA_CA_CERTS=/path/to/cert.pem`: Absolute path to the PEM file (can contain multiple certificates).
#### ToolJet API Import Application
By default, the server accepts a maximum JSON size of 50 MB. To increase this limit, use the following environment variable:
- `MAX_JSON_SIZE="150mb"`
### Third-Party Integrations
#### Slack
To use Slack as a data source in ToolJet, create a Slack app and set:
- `SLACK_CLIENT_ID`: Slack app client ID
- `SLACK_CLIENT_SECRET`: Slack app client secret
### APM VENDOR ( optional )
#### Google OAuth
To connect ToolJet with Google services like Google Sheets, create OAuth credentials in Google Cloud Console.
Specify application monitoring vendor. Currently supported values - `sentry`.
- `GOOGLE_CLIENT_ID`: Google OAuth client ID
- `GOOGLE_CLIENT_SECRET`: Google OAuth client secret
| variable | description |
| ---------- | ----------------------------------------- |
| APM_VENDOR | Application performance monitoring vendor |
#### Google Maps API
To use the Maps widget in ToolJet, create a Google Maps API key and set:
### SENTRY DNS ( optional )
- `GOOGLE_MAPS_API_KEY`: Google Maps API key
| variable | description |
| ---------- | ------------------------------------------------------------------------------------------------- |
| SENTRY_DNS | DSN tells a Sentry SDK where to send events so the events are associated with the correct project |
#### Application Monitoring (APM)
- `APM_VENDOR=sentry`: Set APM vendor.
- `SENTRY_DNS`: Sentry project DSN.
- `SENTRY_DEBUG=true/false`: Enable/disable Sentry debugging.
### SENTRY DEBUG ( optional )

Prints logs for Sentry.

| Variable | Description |
| ------------ | ------------------------------------------- |
| SENTRY_DEBUG | `true` or `false`. Default value is `false` |

### Server URL ( optional )

Used to set up CSP headers and to attach trace info for use with APM vendors.

| Variable | Description |
| ------------------ | ------------------------------------------------------------------ |
| TOOLJET_SERVER_URL | the URL of the ToolJet server ( eg: `https://server.tooljet.com` ) |

### RELEASE VERSION ( optional )

Once set, any APM provider that supports segregation by release will track it.

### NODE_EXTRA_CA_CERTS ( optional )

To trust and establish connections to self-signed HTTPS endpoints, ToolJet needs to be configured with a custom CA certificate. Set the `NODE_EXTRA_CA_CERTS` environment variable to the absolute path of your CA certificate file. The file (eg: `cert.pem`) must be in PEM format and can contain more than one certificate.

| Variable | Description |
| ------------------- | ------------------------------------------------------------------ |
| NODE_EXTRA_CA_CERTS | absolute path to certificate PEM file ( eg: /ToolJet/ca/cert.pem ) |
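For example (the path below is illustrative, not a ToolJet default), you can point Node.js at a CA bundle and sanity-check it before starting the server:

```shell
# Illustrative path — use the absolute path of your own PEM bundle.
export NODE_EXTRA_CA_CERTS="/ToolJet/ca/cert.pem"

# Optional sanity check: if the bundle exists, count the certificates in it.
if [ -f "$NODE_EXTRA_CA_CERTS" ]; then
  grep -c "BEGIN CERTIFICATE" "$NODE_EXTRA_CA_CERTS"
fi
```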
### Disable telemetry ( optional )

ToolJet pings our server to update the total user count every 24 hours. This feature is enabled by default; you can disable it by setting the `DISABLE_TOOLJET_TELEMETRY` environment variable to `true`.

### Password Retry Limit (Optional)

By default, the maximum retry limit for a user's login password is 5, and the account is locked after 5 unsuccessful login attempts. Use the variables below to control this behavior:

| Variable | Description |
| ---------------------------- | --------------------------------------------------------------------------- |
| DISABLE_PASSWORD_RETRY_LIMIT | `true` or `false`. If `true`, the password retry check is disabled entirely |
| PASSWORD_RETRY_LIMIT | changes the default password retry limit (5) |
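A quick sketch of the two options (the limit `10` is an example value, not a ToolJet default):

```shell
# Raise the lockout threshold from 5 to 10 failed attempts (example value).
export PASSWORD_RETRY_LIMIT="10"
# Or disable the retry check entirely (not recommended for production):
# export DISABLE_PASSWORD_RETRY_LIMIT="true"
```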
### SSO Configurations (Optional)

Configurations for instance-level SSO.

| Variable | Description |
| ---------------------------- | -------------------------------------------------------------- |
| SSO_GOOGLE_OAUTH2_CLIENT_ID | Google OAuth client ID |
| SSO_GIT_OAUTH2_CLIENT_ID | GitHub OAuth client ID |
| SSO_GIT_OAUTH2_CLIENT_SECRET | GitHub OAuth client secret |
| SSO_GIT_OAUTH2_HOST | GitHub OAuth host name if GitHub is self-hosted |
| SSO_ACCEPTED_DOMAINS | comma-separated email domains that support SSO authentication |
| SSO_DISABLE_SIGNUPS | disables user sign-up if the authenticated user does not exist |

### Enable Cookie Forwarding to REST API (Optional)

By default, the ToolJet server does not forward cookies along with REST API requests. You can enable this functionality by setting the `FORWARD_RESTAPI_COOKIES` environment variable to `true`. This option is available only in the self-hosted version of ToolJet.

| Variable | Description |
| ----------------------- | ----------------- |
| FORWARD_RESTAPI_COOKIES | `true` or `false` |

## ToolJet client

### Server URL ( optionally required )

This is required when the client is built separately.

| Variable | Description |
| ------------------ | ------------------------------------------------------------------ |
| TOOLJET_SERVER_URL | the URL of the ToolJet server ( eg: `https://server.tooljet.com` ) |

### Server Port ( optional )

This can be used for local development; it sets the server URL as `http://localhost:<TOOLJET_SERVER_PORT>`.

| Variable | Description |
| ------------------- | ------------------------------------------- |
| TOOLJET_SERVER_PORT | the port of the ToolJet server ( eg: 3000 ) |
### Asset path ( optionally required )

This is required when the assets for the client are to be loaded from elsewhere (eg: a CDN).
This can be an absolute path, or relative to the main HTML file.

| Variable | Description |
| ---------- | ------------------------------------------------------------- |
| ASSET_PATH | the asset path for the website ( eg: https://app.tooljet.ai/) |

### Serve client as a server end-point ( optional )

By default, the client is built to be served by the ToolJet server.
If you intend to serve the client separately, set `SERVE_CLIENT` to `false`.
## PostgREST server (required)

| Variable | Description |
| ---------------- | --------------------------------------------------- |
| PGRST_JWT_SECRET | secret used to verify JWT tokens for authentication |
| PGRST_DB_URI | database connection string for the ToolJet database |
| PGRST_LOG_LEVEL | `info` |

If you intend to change the above configuration, please refer to the [PostgREST configuration docs](https://postgrest.org/en/stable/configuration.html#environment-variables).

:::tip
If you have openssl installed, you can run the
command `openssl rand -hex 32` to generate the value for `PGRST_JWT_SECRET`.
If this parameter is not specified, PostgREST will refuse authentication requests.
:::

:::info
Please make sure that `PGRST_DB_URI` is given in the format `postgres://[USERNAME]:[PASSWORD]@[HOST]:[PORT]/[DATABASE]`
:::
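Putting the tip above into practice — a sketch that assumes `openssl` is installed and uses placeholder database credentials:

```shell
# Generate a random 32-byte secret; PostgREST refuses auth requests without one.
PGRST_JWT_SECRET="$(openssl rand -hex 32)"
export PGRST_JWT_SECRET

# Connection string in the required format (placeholder credentials).
export PGRST_DB_URI="postgres://tooljet:secret@localhost:5432/tooljet_db"

# 32 random bytes hex-encode to a 64-character string.
echo "${#PGRST_JWT_SECRET}"
```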
## Log file path ( Optional )
If a log file path is specified in environment variables, a log file containing all the data from audit logs will be created at the specified path. The file will be updated every time a new audit log is created.
| Variable | Description |
| ------------- | -------------------------------------------------------------------------------- |
| LOG_FILE_PATH | the path where the log file will be created ( eg: tooljet/log/tooljet-audit.log) |
## ToolJet Apps
### Enabling embedding of private apps
By default, only embedding of public apps is permitted. By setting this variable, users will be able to embed private ToolJet Apps.
| Variable | Description |
| ------------------------ | ----------------- |
| ENABLE_PRIVATE_APP_EMBED | `true` or `false` |
:::caution
The option is only available starting from ToolJet Enterprise Edition `2.8.0` or higher, and `2.10.0` for the Community edition and cloud version.
:::
## Configuring the Default Language
To change the default language, set the `LANGUAGE` variable to your desired language code.
| Variable | Description |
| -------- | --------------- |
| LANGUAGE | `LANGUAGE_CODE` |
Available Languages with their codes and native names:
| Language | Code | Native Name |
| ---------- | ---- | ---------------- |
| English | en | English |
| French | fr | Français |
| Spanish | es | Español |
| Italian | it | Italiano |
| Indonesian | id | Bahasa Indonesia |
| Ukrainian | uk | Українська |
| Russian | ru | Русский |
| German | de | Deutsch |
For instance, to set the language to French, you can set the LANGUAGE variable to `fr`.
:::info
The option to set a default language is not available on cloud version of ToolJet.
:::
@ -0,0 +1,46 @@
---
id: tooljet-domain-change
title: ToolJet Domain Change
---
We're updating our domain from `tooljet.ai` to `tooljet.com`.
## When is This Happening?
The change will take place at the following times:
- **ET (Eastern Time):** Sunday, November 23, 2025 11:00 PM
- **PT (Pacific Time):** Sunday, November 23, 2025 8:00 PM
- **GMT (Greenwich Mean Time):** Monday, November 24, 2025 4:00 AM
- **IST (Indian Standard Time):** Monday, November 24, 2025 9:30 AM
## What This Means For You
If your organization uses **Single Sign-On (SSO)** to access ToolJet, you'll need to update your SSO redirect URLs to continue signing in after the domain change.
- This change is **only applicable to ToolJet Cloud** users.
- If you do **not** update your SSO configuration, **SSO login will stop working** after the domain change.
## How to Fix it
You'll need to regenerate and update the redirect URL for **each SSO provider** you have configured.
### Steps
1. Log in to ToolJet at **`https://app.tooljet.com`**.
2. Go to: **Workspace settings → Workspace login**.
3. Under SSO providers (Google, OIDC, SAML, etc.):
- Click on each configured provider.
- Copy the new redirect URL (it will now contain `tooljet.com` instead of `tooljet.ai`).
4. Go to your SSO provider's admin console (e.g., Google, Okta, Azure AD).
5. Update the redirect/callback URL with the new **`tooljet.com`** URL.
6. Save the changes.
7. Test SSO login to ensure everything is working correctly.
For provider-specific configuration details, refer to your usual **[SSO](/docs/user-management/sso/overview)** setup guides.
## When to Update
To ensure uninterrupted access for your team, **please complete this update by Sunday, November 23, 2025, at 8:00 PM** (aligned with your relevant timezone in the schedule above).
If the redirect URLs are not updated before the domain change window, users relying on SSO will not be able to sign in until the configuration is updated.
@ -34,11 +34,6 @@ ToolJet API allows you to interact with the ToolJet platform programmatically. Y
- [Replace User Workspaces Relations](#replace-user-workspaces-relations)
- [Export Application](#export-application)
- [Import Application](#import-application)
- [Add HTTPS Git Config for an Organization](#add-https-git-config-for-an-organization)
- [Push an App Version to GitHub](#push-an-app-version-to-github)
- [Create a New App from GitHub](#create-a-new-app-from-github)
- [Sync and Pull Changes to Existing App from Git](#sync-and-pull-changes-to-existing-app-from-git)
- [Auto Promote App](#auto-promote-app)
## Enabling ToolJet API
@ -942,6 +937,8 @@ From version **`v3.5.7-ee-lts`**, you can use ToolJet API to import application.
:::info
By default, server accepts maximum JSON size as 50 MB. To increase this limit, use the following environment variable:
`MAX_JSON_SIZE`
**Example**: `MAX_JSON_SIZE = "250mb"`
:::
<details id="tj-dropdown">
@ -12,14 +12,10 @@ To set up LDAP as Single Sign-On (SSO) for ToolJet, follow these steps:
Role Required: **Admin** <br/>
1. Click on the settings icon (⚙️) on the bottom left of your dashboard.
2. Go to **Workspace settings > Workspace login**. <br/>
(Example URL - `https://app.corp.com/nexus/workspace-settings/workspace-login`)
<img className="screenshot-full" src="/img/sso/ldap/url-v4.png" alt="SSO :LDAP"/>
<img style={{ marginBottom:'15px', marginTop: '15px' }} className="screenshot-full" src="/img/sso/ldap/url-v4.png" alt="SSO :LDAP"/>
3. To **enable** LDAP, toggle the switch. Then, add the configuration:
- **Name**: Enter the name of the SSO.
- **Hostname**: Provide the hostname or IP address of your LDAP server.
- **Port**: Enter the Port number of LDAP server.
@ -27,17 +23,12 @@ Role Required: **Admin** <br/>
- **SSL**: Toggle this option to enable the SSL. After enabling you can select the type of SSL: **None** or **Certificates**. If you choose Certificates, you'll need to provide the **Client Key**, **Client Certificate**, and **Server Certificate**.
<br/>
<img className="screenshot-full img-l" src="/img/sso/ldap/fields-v2.png" alt="SSO :LDAP"/>
4. After making the necessary configurations, click the **Save Changes** button located at the bottom.
5. Next, proceed to the **Workspace login** and copy the **Login URL** provided.
6. The **Login URL** obtained can be utilized for accessing the workspace. Please note that ToolJet supports LDAP login at the workspace level and not at the instance level. Thus, users will be logged in specifically to the chosen workspace.
<img className="screenshot-full" src="/img/sso/ldap/login-v2.png" alt="SSO :LDAP"/>
<img style={{ marginBottom:'15px', marginTop: '15px' }} className="screenshot-full" src="/img/sso/ldap/login-v2.png" alt="SSO :LDAP"/>
7. Click on the **Sign in with `<LDAP Name>`** button, and provide your username and password to log in to the workspace. For signing in, ToolJet uses the **common name (cn)** associated with each LDAP server user as the **Username**. Upon the initial login, users will be redirected to the **Workspace Invite** page, while subsequent logins will lead them directly to the ToolJet dashboard.
:::info
During the first login, ToolJet performs additional checks. It verifies the user groups in the LDAP server, and if the corresponding group exists in the ToolJet workspace, the user will be automatically added to that group. Additionally, ToolJet also looks for the user's profile picture in the LDAP server and updates the ToolJet account accordingly.
:::
@ -5,23 +5,26 @@ title: ToolJet Documentation Versions
## Current LTS Versions (Stable)
| Version | Documentation |
|------------|-------------------------------------------------|
| 3.5.0-LTS | [Documentation](https://docs.tooljet.ai/docs/) |
| 3.0.0-LTS | [Documentation](https://docs.tooljet.ai/docs/3.0.0-LTS/) |
| Version | Documentation |
|------------|------------------------------------------------------------|
| 3.16.0-LTS | [Documentation](https://docs.tooljet.ai/docs/) |
| 3.5.0-LTS | [Documentation](https://docs.tooljet.ai/docs/3.5.0-LTS/) |
| 3.0.0-LTS | [Documentation](https://docs.tooljet.ai/docs/3.0.0-LTS/) |
| 2.50.0-LTS | [Documentation](https://docs.tooljet.ai/docs/2.50.0-LTS/) |
<!--
## Beta Version (Pre-Release)
| Version | Documentation |
|-------------|-------------------------------------------|
| 3.11.0-Beta 🚧 | [Documentation](https://docs.tooljet.ai/docs/beta/) |
| Version | Documentation |
|------------|------------------------------------------------------|
| beta 🚧 | [Documentation](https://docs.tooljet.ai/docs/beta/) |
-->
## Past Versions (Not Maintained Anymore)
| Version | Documentation |
|-------------|-------------------------------------------|
| 2.65.0 | [Documentation](https://archived-docs.tooljet.com/docs/) |
| Version | Documentation |
|-------------|----------------------------------------------------------------|
| 2.65.0 | [Documentation](https://archived-docs.tooljet.com/docs/) |
| 2.62.0 | [Documentation](https://archived-docs.tooljet.com/docs/2.62.0) |
| 2.61.0 | [Documentation](https://archived-docs.tooljet.com/docs/2.61.0) |
| 2.43.0 | [Documentation](https://archived-docs.tooljet.com/docs/2.43.0) |
@ -16,7 +16,7 @@ const isProd = process.env.NODE_ENV === 'production';
module.exports = {
title: 'ToolJet',
tagline: 'Low-code framework to Build internal tools and business apps.',
url: 'https://docs.tooljet.ai',
url: 'https://docs.tooljet.com',
baseUrl: '/',
onBrokenLinks: 'ignore',
onBrokenMarkdownLinks: 'warn',
@ -60,21 +60,21 @@ module.exports = {
position: 'right',
},
{
href: 'https://www.tooljet.ai/',
href: 'https://www.tooljet.com/',
position: 'right',
label: 'Website',
className: 'navbar-signin',
'aria-label': 'Visit ToolJet Website',
},
{
href: 'https://www.tooljet.ai/login',
href: 'https://www.tooljet.com/login',
position: 'right',
label: 'Sign in',
className: 'navbar-signin',
'aria-label': 'Signin to ToolJet',
},
{
href: 'https://www.tooljet.ai/create-account',
href: 'https://www.tooljet.com/create-account',
position: 'right',
label: 'Try for free',
className: 'navbar-website',
@ -83,7 +83,7 @@ module.exports = {
],
},
footer: {
style: 'light',
style: 'light',
logo: {
alt: 'ToolJet Logo',
src: '/img/docs_logo.svg',
@ -93,35 +93,36 @@ module.exports = {
{
title: 'Platform',
items: [
{ label: 'App builder', to: 'https://www.tooljet.ai/visual-app-builder' },
{ label: 'AI Agent builder', to: 'https://www.tooljet.ai/ai-agent-builder' },
{ label: 'ToolJet Database', to: 'https://www.tooljet.ai/database' },
{ label: 'App builder', to: 'https://www.tooljet.com/visual-app-builder' },
{ label: 'AI Agent builder', to: 'https://www.tooljet.com/ai-agent-builder' },
{ label: 'ToolJet Database', to: 'https://www.tooljet.com/database' },
{ label: 'Trust Center', to: 'https://trust.tooljet.com/' },
],
},
{
title: 'Solutions',
items: [
{ label: 'Back office tools', to: 'https://www.tooljet.ai/building-back-office-apps' },
{ label: 'Business applications', to: 'https://www.tooljet.ai/business-applications' },
{ label: 'Back office tools', to: 'https://www.tooljet.com/building-back-office-apps' },
{ label: 'Business applications', to: 'https://www.tooljet.com/business-applications' },
],
},
{
title: 'Developers',
items: [
{ label: 'Blogs', to: 'https://blog.tooljet.ai/' },
{ label: 'Events', to: 'https://www.tooljet.ai/events' },
{ label: 'Blogs', to: 'https://blog.tooljet.com/' },
{ label: 'Events', to: 'https://www.tooljet.com/events' },
{ label: 'GitHub', href: 'https://github.com/ToolJet/ToolJet' },
{ label: 'Slack', href: 'https://tooljet.ai/slack' },
{ label: 'Slack', href: 'https://tooljet.com/slack' },
],
},
{
title: 'Templates',
items: [
{ label: 'Lead management', to: 'https://www.tooljet.ai/templates/lead-management-system' },
{ label: 'KPI management', to: 'https://www.tooljet.ai/templates/kpi-management-dashboard' },
{ label: 'Inventory management', to: 'https://www.tooljet.ai/templates/inventory-management-system' },
{ label: 'Leave management', to: 'https://www.tooljet.ai/templates/leave-management-portal' },
{ label: 'Applicant tracking', to: 'https://www.tooljet.ai/templates/applicant-tracking-system' },
{ label: 'Lead management', to: 'https://www.tooljet.com/templates/lead-management-system' },
{ label: 'KPI management', to: 'https://www.tooljet.com/templates/kpi-management-dashboard' },
{ label: 'Inventory management', to: 'https://www.tooljet.com/templates/inventory-management-system' },
{ label: 'Leave management', to: 'https://www.tooljet.com/templates/leave-management-portal' },
{ label: 'Applicant tracking', to: 'https://www.tooljet.com/templates/applicant-tracking-system' },
],
},
{
@ -242,13 +243,47 @@ module.exports = {
window.buildUrlWithStoredParams = buildUrlWithStoredParams; // NEW: Build URLs with UTM params
})();
</script>
<script>
document.addEventListener('DOMContentLoaded', function () {
console.log("Script for cookie called");
var cookieName = "source_page";
var domain = ".tooljet.ai";
var maxAge = 7 * 24 * 60 * 60; // 7 days
var currentHost = window.location.hostname;
var fullUrl = window.location.href;
// Helper: read cookie
function getCookie(name) {
var match = document.cookie.match(new RegExp('(^| )' + name + '=([^;]+)'));
return match ? decodeURIComponent(match[2]) : null;
}
// Helper: set cookie
function setCookie(name, value, maxAgeSeconds, domain) {
document.cookie =
name + "=" + encodeURIComponent(value) +
"; path=/; domain=" + domain +
"; max-age=" + maxAgeSeconds + ";";
}
// If user is on blog.tooljet.ai → always update cookie with latest blog URL
// Else → do not overwrite, just keep existing one
if (currentHost.includes("blog.tooljet.ai")) {
setCookie(cookieName, fullUrl, maxAge, domain);
console.log("Updated source_page cookie with latest blog URL: " + fullUrl);
} else {
console.log("Not on blog domain — keeping existing source_page: " + getCookie(cookieName));
}
});
</script>
<!-- Start of HubSpot Embed Code -->
<script type="text/javascript" id="hs-script-loader" async defer src="//js.hs-scripts.com/39494431.js"></script>
<!-- End of HubSpot Embed Code -->
`,
},
},
algolia: {
appId: 'O8HQRLI0WA',
apiKey: process.env.ALGOLIA_API_KEY || 'development', // Public API key: it is safe to commit it
indexName: 'tooljet',
contextualSearch: true,
insights: true,
externalUrlRegex: 'external\\.com|domain\\.com',
},
},
@ -260,16 +295,18 @@ module.exports = {
sidebarPath: require.resolve('./sidebars.js'),
// Please change this to your repo.
editUrl: 'https://github.com/ToolJet/Tooljet/blob/develop/docs/',
includeCurrentVersion: true,
includeCurrentVersion: false, // Set to true if you want to include the beta version in the sidebar
lastVersion: '3.16.0-LTS',
versions: {
current: {
label: 'beta 🚧',
path: 'beta',
banner: 'none',
badge: false
},
// Uncomment the following line to include the beta version in the sidebar
// current: {
// label: 'beta 🚧',
// path: 'beta',
// banner: 'none',
// badge: false,
// },
"2.50.0-LTS": {
label: '2.50.0-LTS (Legacy)',
banner: 'none',
badge: false
},
@ -282,6 +319,7 @@ module.exports = {
badge: false
},
"3.16.0-LTS": {
label: '3.16 - 3.20 LTS',
banner: 'none',
badge: false
}
@ -296,6 +334,7 @@ module.exports = {
ignorePatterns: ['/docs/1.x.x/**'],
filename: 'sitemap.xml',
},
googleTagManager: isProd
? {
containerId: process.env.GTM || 'development',
@ -303,6 +342,18 @@ module.exports = {
: undefined,
},
],
[
'redocusaurus',
{
openapi: {
path: 'openapi', // scans all folders inside openapi/, e.g., scim, tj-api
routeBasePath: '/api', // pages will be /api/scim, /api/tj-api
},
theme: {
primaryColor: '#1890ff', // customize the color
},
},
],
],
plugins: [
devServerPlugin,
@ -343,122 +394,6 @@ module.exports = {
to: '/docs/user-management/authentication/self-hosted/instance-login',
from: '/docs/enterprise/superadmin',
},
{
to: '/docs/beta/user-management/sso/oidc/setup',
from: '/docs/beta/category/openid-connect/',
},
{
to: '/docs/beta/development-lifecycle/release/share-app/',
from: '/docs/beta/dashboard',
},
{
to: '/docs/beta/security/audit-logs',
from: '/docs/beta/enterprise/audit_logs',
},
{
to: '/docs/beta/user-management/role-based-access/super-admin',
from: '/docs/beta/enterprise/superadmin',
},
{
to: '/docs/beta/tj-setup/org-branding/white-labeling',
from: '/docs/beta/enterprise/white-label',
},
{
to: '/docs/beta/development-lifecycle/gitsync/overview',
from: '/docs/beta/gitsync',
},
{
to: '/docs/beta/tj-setup/licensing/self-hosted',
from: '/docs/beta/org-management/licensing/self-hosted/',
},
{
to: '/docs/beta/user-management/role-based-access/access-control',
from: '/docs/beta/org-management/permissions',
},
{
to: '/docs/beta/tj-setup/smtp-setup/configuration',
from: '/docs/beta/org-management/smtp-configuration',
},
{
to: '/docs/beta/security/constants/',
from: '/docs/beta/org-management/workspaces/workspace_constants/',
},
{
to: '/docs/beta/tj-setup/workspaces',
from: '/docs/beta/org-management/workspaces/workspace_overview/',
},
{
to: '/docs/beta/security/constants/variables',
from: '/docs/beta/org-management/workspaces/workspace-variables-migration',
},
{
to: '/docs/beta/development-lifecycle/gitsync/pull',
from: '/docs/beta/release-management/gitsync/git-pull',
},
{
to: '/docs/beta/development-lifecycle/gitsync/gitsync-config',
from: '/docs/beta/release-management/gitsync/tj-config/',
},
{
to: '/docs/beta/security/compliance',
from: '/docs/beta/security',
},
{
to: '/docs/beta/build-with-ai/overview',
from: '/docs/beta/tooljet-copilot',
},
{
to: '/docs/beta/user-management/role-based-access/custom-groups',
from: '/docs/beta/tutorial/manage-users-groups',
},
{
to: '/docs/beta/tooljet-api',
from: '/docs/beta/tutorial/tooljet-api',
},
{
to: '/docs/beta/user-management/authentication/self-hosted/overview',
from: '/docs/beta/user-authentication/general-settings/',
},
{
to: '/docs/beta/user-management/authentication/self-hosted/instance-login',
from: '/docs/beta/user-authentication/password-login',
},
{
to: '/docs/beta/user-management/authentication/self-hosted/instance-login',
from: '/docs/beta/user-authentication/sso/auto-sso-login',
},
{
to: '/docs/user-management/sso/github',
from: '/docs/beta/user-authentication/sso/github',
},
{
to: '/docs/user-management/sso/ldap',
from: '/docs/beta/user-authentication/sso/ldap',
},
{
to: '/docs/beta/user-management/sso/oidc/azuread',
from: '/docs/beta/user-authentication/sso/openid/azuread/',
},
{
to: '/docs/beta/user-management/sso/oidc/google',
from: '/docs/beta/user-authentication/sso/openid/google-openid',
},
{
to: '/docs/beta/user-management/sso/oidc/okta',
from: '/docs/beta/user-authentication/sso/openid/okta',
},
{
to: '/docs/beta/user-management/sso/saml/setup',
from: '/docs/beta/user-authentication/sso/saml',
},
{
to: '/docs/beta/user-management/onboard-users/overview',
from: '/docs/beta/user-authentication/user-lifecycle/',
},
{
to: '/docs/beta/user-management/authentication/self-hosted/workspace-login',
from: '/docs/beta/user-authentication/workspace-login',
},
{
to: '/docs/user-management/sso/oidc/setup',
from: '/docs/category/openid-connect',
@ -577,11 +512,15 @@ module.exports = {
from: '/docs/widgets/table/table-properties',
},
{
to: '/docs/workflows/how-to/trigger-workflow-from-app',
from: '/docs/workflows/trigger-workflow-from-app',
}
to: '/docs/setup/upgrade-to-v3',
from: '/docs/setup/cloud-v3-migration',
},
// {
// to: '/docs/workflows/how-to/trigger-workflow-from-app',
// from: '/docs/workflows/trigger-workflow-from-app',
// }
],
},
],
],
};
};
@ -0,0 +1,419 @@
openapi: 3.0.3
info:
title: ToolJet SCIM API
version: 1.0.0
description: >
ToolJet supports SCIM 2.0 for automated user and group provisioning.
All standard SCIM endpoints are supported — including `/Schemas`, `/ResourceTypes`, `/Users`, and `/Groups`.
servers:
- url: https://app.tooljet.com/api/scim/v2
description: Production server
- url: http://localhost:3000/api/scim/v2
description: Local development server
paths:
/Users:
get:
summary: List Users
responses:
"200":
description: List of users
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMListResponse'
post:
summary: Create User
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserRequest'
responses:
"201":
description: User created
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserResponse'
/Users/{id}:
get:
summary: Get User by ID
parameters:
- name: id
in: path
required: true
schema:
type: string
responses:
"200":
description: User details
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserResponse'
put:
summary: Replace User
parameters:
- name: id
in: path
required: true
schema:
type: string
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserRequest'
responses:
"200":
description: Updated user
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserResponse'
patch:
summary: Patch User
parameters:
- name: id
in: path
required: true
schema:
type: string
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMPatchRequest'
responses:
"200":
description: User updated
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMUserResponse'
delete:
summary: Delete User
parameters:
- name: id
in: path
required: true
schema:
type: string
responses:
"204":
description: User deleted
/Groups:
get:
summary: List Groups
responses:
"200":
description: List of groups
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupListResponse'
post:
summary: Create Group
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupRequest'
responses:
"201":
description: Group created
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupResponse'
/Groups/{id}:
get:
summary: Get Group by ID
parameters:
- name: id
in: path
required: true
schema:
type: string
responses:
"200":
description: Group details
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupResponse'
put:
summary: Replace Group
parameters:
- name: id
in: path
required: true
schema:
type: string
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupRequest'
responses:
"200":
description: Group updated
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupResponse'
patch:
summary: Patch Group
parameters:
- name: id
in: path
required: true
schema:
type: string
requestBody:
required: true
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMPatchRequest'
responses:
"200":
description: Group updated
content:
application/scim+json:
schema:
$ref: '#/components/schemas/SCIMGroupResponse'
delete:
summary: Delete Group
parameters:
- name: id
in: path
required: true
schema:
type: string
responses:
"204":
description: Group deleted
components:
schemas:
SCIMUserRequest:
type: object
required:
- schemas
- userName
properties:
schemas:
type: array
items:
type: string
example:
- "urn:ietf:params:scim:schemas:core:2.0:User"
- "urn:ietf:params:scim:schemas:extension:tooljet:User:2.0"
userName:
type: string
name:
type: object
properties:
givenName:
type: string
familyName:
type: string
active:
type: boolean
password:
type: string
description: User password for creation.
emails:
type: array
items:
type: object
properties:
value:
type: string
primary:
type: boolean
type:
type: string
groups:
type: array
items:
type: object
properties:
value:
type: string
display:
type: string
meta:
type: object
properties:
resourceType:
type: string
urn:ietf:params:scim:schemas:extension:tooljet:User:2.0:
type: object
        description: ToolJet's custom SCIM extension (only used in requests)
properties:
role:
type: string
SCIMUserResponse:
type: object
properties:
schemas:
type: array
items:
type: string
example:
- "urn:ietf:params:scim:schemas:core:2.0:User"
id:
type: string
format: uuid
userName:
type: string
name:
type: object
properties:
givenName:
type: string
familyName:
type: string
active:
type: boolean
emails:
type: array
items:
type: object
properties:
value:
type: string
primary:
type: boolean
type:
type: string
meta:
type: object
properties:
resourceType:
type: string
example: "User"
created:
type: string
format: date-time
lastModified:
type: string
format: date-time
SCIMListResponse:
type: object
properties:
totalResults:
type: integer
startIndex:
type: integer
itemsPerPage:
type: integer
Resources:
type: array
items:
$ref: '#/components/schemas/SCIMUserResponse'
SCIMPatchRequest:
type: object
required:
- schemas
- Operations
properties:
schemas:
type: array
items:
type: string
example:
- "urn:ietf:params:scim:api:messages:2.0:PatchOp"
Operations:
type: array
items:
type: object
properties:
op:
type: string
enum: [add, remove, replace]
path:
type: string
value: {}
SCIMGroupRequest:
type: object
required:
- schemas
- displayName
properties:
schemas:
type: array
items:
type: string
example:
- "urn:ietf:params:scim:schemas:core:2.0:Group"
displayName:
type: string
example: "Developers"
members:
type: array
description: List of User members belonging to this group.
items:
type: object
properties:
value:
type: string
format: uuid
description: User ID
display:
type: string
description: User display name
SCIMGroupResponse:
allOf:
- $ref: '#/components/schemas/SCIMGroupRequest'
- type: object
properties:
id:
type: string
format: uuid
meta:
type: object
properties:
resourceType:
type: string
example: "Group"
created:
type: string
format: date-time
lastModified:
type: string
format: date-time
SCIMGroupListResponse:
type: object
properties:
totalResults:
type: integer
startIndex:
type: integer
itemsPerPage:
type: integer
Resources:
type: array
items:
$ref: '#/components/schemas/SCIMGroupResponse'
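As a usage sketch of the `SCIMPatchRequest` schema above: the payload below adds one member to a group via `PATCH /Groups/{id}`. The host, base path, token, and IDs in the commented `curl` line are placeholders, not confirmed values from this spec.

```shell
# Build a SCIM PatchOp payload that adds one member to a group.
# The UUID and display name are illustrative placeholders.
PAYLOAD=$(cat <<'EOF'
{
  "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
  "Operations": [
    {
      "op": "add",
      "path": "members",
      "value": [
        { "value": "00000000-0000-0000-0000-000000000000", "display": "Jane Doe" }
      ]
    }
  ]
}
EOF
)
echo "$PAYLOAD"

# Sending it (commented out; requires a live SCIM server and a real token):
# curl -X PATCH "https://your-tooljet-host/scim/v2/Groups/$GROUP_ID" \
#   -H "Authorization: Bearer $SCIM_TOKEN" \
#   -H "Content-Type: application/scim+json" \
#   -d "$PAYLOAD"
```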

docs/package-lock.json (generated, 1518 lines changed): diff suppressed because it is too large.
@@ -28,6 +28,7 @@
"prism-react-renderer": "^2.1.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"redocusaurus": "^2.5.0",
"tailwindcss": "^3.4.12"
},
"devDependencies": {


@@ -3,11 +3,16 @@ set -e
# Configuration
DOCKER_REPO="tooljet/tooljet"
MARKDOWN_FILE="docs/versioned_docs/version-3.5.0-LTS/setup/choose-your-tooljet.md"
MARKDOWN_FILE="docs/versioned_docs/version-3.16.0-LTS/setup/overview/choose-your-tooljet.mdx"
TABLE_HEADER="| Version | Release Date | Docker Pull Command |"
TABLE_DIVIDER="|---------|--------------|----------------------|"
DRY_RUN=false
# Current LTS line — update CURRENT_LTS_PREFIX when a new LTS series starts (e.g. v3.21)
CURRENT_LTS_PREFIX="v3.20"
CURRENT_LTS_TAG_PATTERN="^v3\\.20\\.[0-9]+-lts$"
CURRENT_LTS_MAX=6
# Enhanced logging function
log() {
echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" >&2
@@ -81,34 +86,45 @@ get_lts_tags() {
fi
local page_tags
page_tags=$(echo "$resp" | jq -r '.results[]? | select(.name | test("^v.*ee-lts$")) | .name' 2>/dev/null)
page_tags=$(echo "$resp" | jq -r --arg pat "$CURRENT_LTS_TAG_PATTERN" '.results[]? | select(.name | test($pat)) | .name' 2>/dev/null)
if [[ -n "$page_tags" ]]; then
while IFS= read -r tag; do
[[ -n "$tag" ]] && tags+=("$tag")
done <<< "$page_tags"
fi
url=$(echo "$resp" | jq -r '.next // empty' 2>/dev/null)
[[ "$url" == "null" || -z "$url" ]] && break
# Stop early once we have enough tags to avoid unnecessary pages
if [[ ${#tags[@]} -ge $CURRENT_LTS_MAX ]]; then
log "✅ Collected enough tags (${#tags[@]}), stopping pagination"
break
fi
# Safety check to prevent infinite loops
if [[ $page_count -gt 10 ]]; then
log "⚠️ Reached maximum page limit (10), stopping pagination"
break
fi
done
if [[ ${#tags[@]} -eq 0 ]]; then
log_error "No LTS tags found"
return 1
fi
log "✅ Found ${#tags[@]} LTS tags"
# Sort tags by version (reverse)
# Sort tags by version (reverse) and cap to CURRENT_LTS_MAX
IFS=$'\n' tags=($(printf '%s\n' "${tags[@]}" | sort -Vr))
if [[ ${#tags[@]} -gt $CURRENT_LTS_MAX ]]; then
tags=("${tags[@]:0:$CURRENT_LTS_MAX}")
log "✂️ Capped to ${CURRENT_LTS_MAX} most recent ${CURRENT_LTS_PREFIX} tags"
fi
log "📋 LTS tags (sorted):"
printf ' %s\n' "${tags[@]}"
@@ -196,14 +212,7 @@ build_table_rows() {
return 1
fi
# Add latest EE-LTS row (always use ee-lts-latest tag)
local latest="${tags[0]}"
log "⭐ Latest EE-LTS is: $latest"
local latest_row="| Latest EE-LTS | N/A | \`docker pull tooljet/tooljet:ee-lts-latest\` |"
log "📝 Generated ${#rows[@]} table rows plus Latest EE-LTS row"
echo "$latest_row"
log "📝 Generated ${#rows[@]} table rows"
printf "%s\n" "${rows[@]}"
}
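The sort-and-cap step in `get_lts_tags` relies on `sort -Vr` ordering version strings numerically per component rather than lexically. A standalone sketch (tag values invented for illustration; the real script caps at 6 via `CURRENT_LTS_MAX`):

```shell
# Version-sort tags newest-first, then keep only the top N,
# mirroring the CURRENT_LTS_MAX capping in the script above.
CURRENT_LTS_MAX=3
tags="v3.20.2-lts
v3.20.10-lts
v3.20.1-lts
v3.20.4-lts"
capped=$(printf '%s\n' "$tags" | sort -Vr | head -n "$CURRENT_LTS_MAX")
echo "$capped"
# -V places v3.20.10 above v3.20.4; plain lexical sort would not.
```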
@@ -239,38 +248,41 @@ replace_table_in_file() {
log "✍️ Writing to $MARKDOWN_FILE..."
local tmp_md="${MARKDOWN_FILE}.tmp"
local in_table=false
local lines_written=0
local table_lines=0
> "$tmp_md"
while IFS= read -r line || [[ -n "$line" ]]; do
if [[ "$line" == "$TABLE_HEADER" ]]; then
log "🔍 Found table header, replacing table content"
echo "$TABLE_HEADER" >> "$tmp_md"
echo "$TABLE_DIVIDER" >> "$tmp_md"
echo "$table_body" >> "$tmp_md"
in_table=true
table_lines=$(echo "$table_body" | wc -l)
lines_written=$((lines_written + 2 + table_lines))
elif [[ "$in_table" == true && "$line" == :::* ]]; then
echo "$line" >> "$tmp_md"
in_table=false
lines_written=$((lines_written + 1))
log "✅ Table replacement complete"
elif [[ "$in_table" == false ]]; then
echo "$line" >> "$tmp_md"
lines_written=$((lines_written + 1))
fi
done < "$MARKDOWN_FILE"
# Extract preserved rows — older LTS lines that are NOT part of the current LTS series.
# These rows sit below the current LTS entries and are never auto-updated.
local preserved_rows
preserved_rows=$(awk '
/^### Latest Patch$/ { in_section=1; next }
/^### Base Versions$/ { in_section=0 }
in_section && /^\| \[/ { print }
' "$MARKDOWN_FILE" | grep -v "${CURRENT_LTS_PREFIX}\.")
# Combine: new current-LTS rows on top, preserved older-LTS rows below
local full_body
if [[ -n "$preserved_rows" ]]; then
full_body="${table_body}
${preserved_rows}"
else
full_body="$table_body"
fi
local new_table="${TABLE_HEADER}
${TABLE_DIVIDER}
${full_body}"
awk -v tbl="$new_table" '
/^### Latest Patch$/ { print; print ""; print tbl; skip=1; next }
/^### Base Versions$/ { skip=0 }
!skip { print }
' "$MARKDOWN_FILE" > "$tmp_md"
if ! mv "$tmp_md" "$MARKDOWN_FILE"; then
log_error "Failed to move temporary file to $MARKDOWN_FILE"
return 1
fi
log "✅ Markdown updated successfully ($lines_written lines written, $table_lines table rows)"
log "✅ Markdown updated successfully"
}
main() {
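The awk program in `replace_table_in_file` uses a skip flag: print the rebuilt table right after the `### Latest Patch` heading, suppress the stale lines, and resume output at `### Base Versions`. The same idiom can be exercised on a throwaway file (filenames and contents here are toy data):

```shell
# Demonstrate the skip-flag idiom from replace_table_in_file
# on a temporary toy markdown file.
md=$(mktemp)
cat > "$md" <<'EOF'
### Latest Patch

| old | row |

### Base Versions
keep me
EOF
new_table="| new | row |"
result=$(awk -v tbl="$new_table" '
  /^### Latest Patch$/ { print; print ""; print tbl; skip=1; next }
  /^### Base Versions$/ { skip=0 }
  !skip { print }
' "$md")
rm -f "$md"
echo "$result"
```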


@@ -86,7 +86,7 @@ const sidebars = {
'setup/upgrade-to-v3',
'setup/cloud-v3-migration',
'setup/upgrade-to-v3.16',
'setup/tooljet-domain-change'
]
}
],
@@ -438,7 +438,10 @@ const sidebars = {
'marketplace/plugins/marketplace-plugin-weaviate',
'marketplace/plugins/marketplace-plugin-qdrant',
'marketplace/plugins/marketplace-plugin-azurerepos',
'marketplace/plugins/marketplace-plugin-googlecalendar'
'marketplace/plugins/marketplace-plugin-googlecalendar',
'marketplace/plugins/marketplace-plugin-ups',
'marketplace/plugins/marketplace-plugin-aftership',
'marketplace/plugins/marketplace-plugin-microsoft_graph'
],
},
],


@@ -1,54 +1,57 @@
import React, { useEffect, useState } from 'react'
import styles from './DocsCard.css'
import React, { useEffect, useState } from 'react'
import styles from './DocsCard.css'
export const DocsCard = ({ label, imgSrc, link, height = 40, width = 40, title }) => {
const kubernetesSvg = '/img/setup/icons/kubernetes.svg'
export const DocsCard = ({ label, imgSrc, link, height = 40, width = 40, title }) => {
const kubernetesSvg = '/img/setup/icons/kubernetes.svg'
const imagePath = imgSrc && imgSrc.includes('kubernetes')
? kubernetesSvg
: imgSrc
? `/img/setup/icons/${imgSrc}.svg`
: '/img/setup/icons/default.svg';
const imagePath = imgSrc.includes('kubernetes') ? kubernetesSvg : `/img/setup/icons/${imgSrc}.svg`
const description = {
"Try ToolJet": "Try out ToolJet with single docker command",
"Choose Your ToolJet": "Important information on which version of ToolJet to use.",
"System Requirements": "Learn about system requirements for running ToolJet",
DigitalOcean: "Quickly deploy ToolJet using the Deploy to DigitalOcean button",
Docker: "Deploy ToolJet on a server using docker-compose",
Heroku: "Deploy ToolJet on Heroku using the one-click-deployment button",
"AWS AMI": "Deploy ToolJet on AWS AMI instances",
"AWS ECS": "Deploy ToolJet on AWS ECS instances",
Openshift: "Deploy ToolJet on Openshift",
Helm: "Deploy ToolJet with Helm Chart",
Kubernetes: "Deploy ToolJet on a Kubernetes cluster",
"Kubernetes (GKE)": "Deploy ToolJet on a GKE Kubernetes cluster",
"Kubernetes (AKS)": "Deploy ToolJet on a AKS Kubernetes cluster",
"Kubernetes (EKS)": "Deploy ToolJet on a EKS Kubernetes cluster",
const description = {
"Try ToolJet": "Try out ToolJet with single docker command",
"Choose Your ToolJet": "Important information on which version of ToolJet to use.",
"System Requirements": "Learn about system requirements for running ToolJet",
DigitalOcean: "Quickly deploy ToolJet using the Deploy to DigitalOcean button",
Docker: "Deploy ToolJet on a server using docker-compose",
Heroku: "Deploy ToolJet on Heroku using the one-click-deployment button",
"AWS AMI": "Deploy ToolJet on AWS AMI instances",
"AWS ECS": "Deploy ToolJet on AWS ECS instances",
Openshift: "Deploy ToolJet on Openshift",
Helm: "Deploy ToolJet with Helm Chart",
Kubernetes: "Deploy ToolJet on a Kubernetes cluster",
"Kubernetes (GKE)": "Deploy ToolJet on a GKE Kubernetes cluster",
"Kubernetes (AKS)": "Deploy ToolJet on a AKS Kubernetes cluster",
"Kubernetes (EKS)": "Deploy ToolJet on a EKS Kubernetes cluster",
"Azure container apps": "Deploy ToolJet on a Azure Container Apps",
"Google Cloud Run": "Deploy ToolJet on Cloud Run with GCloud CLI",
"Deploying ToolJet client": "Deploy ToolJet Client on static website hosting services",
"Environment variables": "Environment variables required by ToolJet Client and Server to start running",
"Connecting via HTTP proxy": "Environment variables required by ToolJet to connect via HTTP proxy",
"Deploying ToolJet on a subpath": "Steps to deploy ToolJet on a subpath rather than root of domain",
"V2 migration guide": "Things to know before migrating to ToolJet V2",
"Upgrading ToolJet to the LTS Version": "Guide to upgrade ToolJet to the latest LTS Version.",
"ToolJet v3 (Beta) Migration Guide": "Breaking changes and migration guide for ToolJet v3",
"ToolJet Cloud v3 Migration Guide": "Breaking changes and migration guide for ToolJet Cloud v3",
}
return (
<a href={link} className="card" style={{ textDecoration: "none", color: "inherit" }}>
<div className="card-body">
<div className="card-icon">
<img className='img' src={imagePath} width="100%" />
</div>
<div className="card-info">
<h3 style={{ margin: "0", paddingBottom: "0.5rem" }}>{label}</h3>
<p>
{description[label]}
</p>
</div>
</div>
</a>
)
}
"Google Cloud Run": "Deploy ToolJet on Cloud Run with GCloud CLI",
"Deploying ToolJet client": "Deploy ToolJet Client on static website hosting services",
"Environment variables": "Environment variables required by ToolJet Client and Server to start running",
"Connecting via HTTP proxy": "Environment variables required by ToolJet to connect via HTTP proxy",
"Deploying ToolJet on a subpath": "Steps to deploy ToolJet on a subpath rather than root of domain",
"V2 migration guide": "Things to know before migrating to ToolJet V2",
"Upgrading ToolJet to the LTS Version": "Guide to upgrade ToolJet to the latest LTS Version.",
"ToolJet v3 (Beta) Migration Guide": "Breaking changes and migration guide for ToolJet v3",
"ToolJet Cloud v3 Migration Guide": "Breaking changes and migration guide for ToolJet Cloud v3",
}
return (
<a href={link} className="card" style={{ textDecoration: "none", color: "inherit" }}>
<div className="card-body">
<div className="card-icon">
<img className='img' src={imagePath} width="100%" />
</div>
<div className="card-info">
<h3 style={{ margin: "0", paddingBottom: "0.5rem" }}>{label}</h3>
<p>
{description[label]}
</p>
</div>
</div>
</a>
)
}


@@ -1,12 +1,24 @@
import React from 'react'
import { DocsCard } from './';
import styles from './DocsCard.css'
export const DocsCardList = ({ list }) => {
return (
<div className='card-container-setup'>
{list.map(item => <DocsCard key={item.docId} label={item.label} imgSrc={item.docId.split('/')[1]} link={item.href} />)}
</div>
)
}
import React from 'react'
import { DocsCard } from './';
import styles from './DocsCard.css'
export const DocsCardList = ({ list }) => {
return (
<div className='card-container-setup'>
{list.map(item => {
const docId = item?.docId || "";
const parts = docId.split("/");
const imgSrc = item.customProps?.icon || parts[parts.length - 1];
return (
<DocsCard
key={docId || item.label}
label={item.label}
imgSrc={imgSrc}
link={item.href}
/>
);
})}
</div>
)
}


@@ -536,12 +536,16 @@ img {
}
[data-theme='dark'] .navbar-signin,
[data-theme='dark'] .navbar-signin:hover,
[data-theme='dark'] .navbar-website,
[data-theme='dark'] .navbar-website:hover {
[data-theme='dark'] .navbar-website
{
color: white;
}
[data-theme='dark'] .navbar-signin:hover,
[data-theme='dark'] .navbar-website:hover {
color: black;
}
.navbar-website {
box-shadow: 0px 0px 1px 0px rgba(48, 50, 51, 0.05),
0px 1px 1px 0px rgba(48, 50, 51, 0.10);
@@ -890,6 +894,16 @@ button[title="Switch between dark and light mode (currently light mode)"] svg,
background-image: url('../../static/img/sidebar-icons/resources.svg');
}
/* Self-hosted only indicator for sidebar category */
.self-hosted-icon {
width: 14px;
height: 14px;
margin-left: 6px;
vertical-align: middle;
display: inline-block;
box-shadow: none !important;
border-radius: 0 !important;
}
/* Dropdownns */
@@ -967,3 +981,50 @@ button[title="Switch between dark and light mode (currently light mode)"] svg,
.footer {
padding: 0px 70px !important;
}
.sc-kSaXSp.dkiBl { /* This hides the redocly branding at the bottom on the API pages.*/
visibility: hidden;
}
.jcDBan {
border: 0px !important;
margin-left: 0px !important;
}
.gHrCVQ {
padding: 30px 0 !important;
}
.hgujxv {
margin: 0 !important;
}
.sc-dTvVRJ {
padding: 10px 0px !important;
}
/* Tooltip for outdated version in navbar dropdown */
.dropdown__menu {
overflow: visible !important;
}
.dropdown__link[href*="2.50.0-LTS"] {
position: relative;
}
.dropdown__link[href*="2.50.0-LTS"]:hover::after {
content: "No longer maintained — upgrade to the latest LTS version.";
position: absolute;
bottom: calc(100% + 6px);
left: 50%;
transform: translateX(-50%);
background: #1b1f24;
color: #fff;
font-size: 11px;
padding: 5px 10px;
border-radius: 6px;
white-space: nowrap;
pointer-events: none;
z-index: 9999;
box-shadow: 0 2px 8px rgba(0,0,0,0.25);
}


@@ -0,0 +1,25 @@
import React from 'react';
import Category from '@theme-original/DocSidebarItem/Category';
export default function CategoryWrapper(props) {
const isSelfHosted = props.item.customProps?.selfHosted === true;
if (isSelfHosted) {
const modifiedItem = {
...props.item,
label: (
<>
{props.item.label}
<img
src="/img/badge-icons/premium.svg"
alt="Self-hosted"
className="self-hosted-icon"
/>
</>
),
};
return <Category {...props} item={modifiedItem} />;
}
return <Category {...props} />;
}


@@ -0,0 +1,25 @@
import React from 'react';
import Link from '@theme-original/DocSidebarItem/Link';
export default function LinkWrapper(props) {
const isSelfHosted = props.item.customProps?.selfHosted === true;
if (isSelfHosted) {
const modifiedItem = {
...props.item,
label: (
<>
{props.item.label}
<img
src="/img/badge-icons/premium.svg"
alt="Self-hosted"
className="self-hosted-icon"
/>
</>
),
};
return <Link {...props} item={modifiedItem} />;
}
return <Link {...props} />;
}

docs/src/theme/Root.js (new file, 189 lines)

@@ -0,0 +1,189 @@
import React, { useCallback, useEffect } from "react";
import { useLocation, useHistory } from "@docusaurus/router";
const GOOGLE_TRANSLATE_SCRIPT_ID = "tooljet-google-translate-script";
const GOOGLE_TRANSLATE_CALLBACK = "tooljetGoogleTranslateInit";
const GOOGLE_TRANSLATE_CONTAINER_ID = "tooljet-google-translate-runtime";
const GOOGLE_TRANSLATE_SOURCE_LANGUAGE = "en";
const GOOGLE_TRANSLATE_PARAM = "lang";
const LANGUAGE_CODE_REGEX =
/^[A-Za-z]{2,3}(?:-[A-Za-z]{4})?(?:-(?:[A-Za-z]{2}|\d{3}))?(?:-[A-Za-z0-9]{4,8})*$/;
function normalizeLanguageCode(value) {
if (!value) return null;
const languageCode = value.trim();
if (!LANGUAGE_CODE_REGEX.test(languageCode)) return null;
const parts = languageCode.split("-");
const normalizedParts = [parts[0].toLowerCase()];
let index = 1;
if (parts[index] && /^[A-Za-z]{4}$/.test(parts[index])) {
const script = parts[index];
normalizedParts.push(
`${script.charAt(0).toUpperCase()}${script.slice(1).toLowerCase()}`
);
index += 1;
}
if (
parts[index] &&
(/^[A-Za-z]{2}$/.test(parts[index]) || /^\d{3}$/.test(parts[index]))
) {
const region = parts[index];
normalizedParts.push(/^\d{3}$/.test(region) ? region : region.toUpperCase());
index += 1;
}
while (index < parts.length) {
normalizedParts.push(parts[index].toLowerCase());
index += 1;
}
return normalizedParts.join("-");
}
function setGoogleTranslateCookie(targetLanguage) {
const cookieValue = `/${GOOGLE_TRANSLATE_SOURCE_LANGUAGE}/${targetLanguage}`;
const maxAge = 60 * 60 * 24 * 365;
const secure = window.location.protocol === "https:" ? ";secure" : "";
document.cookie = `googtrans=${cookieValue};path=/;max-age=${maxAge};SameSite=Lax${secure}`;
}
function ensureTranslateRuntimeContainer() {
let container = document.getElementById(GOOGLE_TRANSLATE_CONTAINER_ID);
if (container) return container;
container = document.createElement("div");
container.id = GOOGLE_TRANSLATE_CONTAINER_ID;
container.setAttribute("aria-hidden", "true");
container.style.position = "absolute";
container.style.left = "-9999px";
container.style.width = "1px";
container.style.height = "1px";
container.style.overflow = "hidden";
container.dataset.tooljetGoogleTranslateRuntime = "true";
document.body.appendChild(container);
return container;
}
export default function Root({ children }) {
const location = useLocation();
const history = useHistory();
function getStoredUTMParams() {
return JSON.parse(sessionStorage.getItem("utmParams") || "{}");
}
const initializeTranslate = useCallback(() => {
if (!window.google?.translate?.TranslateElement) return;
if (window.__tooljetGoogleTranslateInitialized) return;
ensureTranslateRuntimeContainer();
new window.google.translate.TranslateElement(
{
pageLanguage: GOOGLE_TRANSLATE_SOURCE_LANGUAGE,
autoDisplay: true,
},
GOOGLE_TRANSLATE_CONTAINER_ID
);
window.__tooljetGoogleTranslateInitialized = true;
}, []);
// Store UTMs on first page load
useEffect(() => {
const urlParams = new URLSearchParams(window.location.search);
const storedParams = JSON.parse(
sessionStorage.getItem("utmParams") || "{}"
);
let hasNewParams = false;
urlParams.forEach((value, key) => {
if (key.startsWith("utm_")) {
storedParams[key] = value;
hasNewParams = true;
}
});
if (hasNewParams) {
sessionStorage.setItem("utmParams", JSON.stringify(storedParams));
}
}, []);
// Append UTMs on every route change
useEffect(() => {
const storedParams = getStoredUTMParams();
if (Object.keys(storedParams).length === 0) return;
const url = new URL(window.location.href);
// Append UTMs only if they're not already present
Object.entries(storedParams).forEach(([key, value]) => {
if (!url.searchParams.has(key)) {
url.searchParams.set(key, value);
}
});
const newUrl = url.pathname + url.search + url.hash;
if (newUrl !== location.pathname + location.search + location.hash) {
history.replace(newUrl); // update URL without reloading
}
}, [location.pathname, location.search, location.hash, history]);
// Support ?lang=<code> links and sync through Google's cookie mechanism.
useEffect(() => {
const url = new URL(window.location.href);
const requestedLanguage = normalizeLanguageCode(
url.searchParams.get(GOOGLE_TRANSLATE_PARAM)
);
if (!requestedLanguage) return;
setGoogleTranslateCookie(requestedLanguage);
url.searchParams.delete(GOOGLE_TRANSLATE_PARAM);
const updatedUrl = `${url.pathname}${url.search}${url.hash}`;
const currentUrl = `${location.pathname}${location.search}${location.hash}`;
if (updatedUrl !== currentUrl) {
history.replace(updatedUrl);
}
}, [
history,
location.pathname,
location.search,
location.hash,
]);
// Initialize Google Translate globally once.
useEffect(() => {
window[GOOGLE_TRANSLATE_CALLBACK] = initializeTranslate;
if (window.google?.translate?.TranslateElement) {
initializeTranslate();
} else if (!document.getElementById(GOOGLE_TRANSLATE_SCRIPT_ID)) {
const script = document.createElement("script");
script.id = GOOGLE_TRANSLATE_SCRIPT_ID;
script.src = `https://translate.google.com/translate_a/element.js?cb=${GOOGLE_TRANSLATE_CALLBACK}`;
script.async = true;
document.body.appendChild(script);
}
return () => {
if (window[GOOGLE_TRANSLATE_CALLBACK] === initializeTranslate) {
delete window[GOOGLE_TRANSLATE_CALLBACK];
}
delete window.__tooljetGoogleTranslateInitialized;
const container = document.getElementById(GOOGLE_TRANSLATE_CONTAINER_ID);
if (container?.dataset.tooljetGoogleTranslateRuntime === "true") {
container.remove();
}
};
}, [initializeTranslate]);
return <>{children}</>;
}
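A small sketch of the cookie value `setGoogleTranslateCookie` composes: visiting a docs URL with `?lang=fr` results in a `googtrans` cookie of the form `/en/fr`, with the attribute list mirroring the function above (the shell variable names are illustrative only):

```shell
# Compose the googtrans cookie string for a target language,
# mirroring setGoogleTranslateCookie in Root.js above.
SOURCE_LANG="en"
TARGET_LANG="fr"
MAX_AGE=$((60 * 60 * 24 * 365))   # one year, as in the component
COOKIE="googtrans=/${SOURCE_LANG}/${TARGET_LANG};path=/;max-age=${MAX_AGE};SameSite=Lax"
echo "$COOKIE"
```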

(Binary image files in this diff are not shown: a number of images were added or modified, ranging from roughly 29 KiB to 1.3 MiB.)

Some files were not shown because too many files have changed in this diff.