feat: Migrate documentation to Mintlify and implement Helper Agent with search functionality (#15443)

This commit is contained in:
Abdul Rahman 2025-10-31 14:47:54 +05:30 committed by GitHub
parent 5211d6ac7e
commit 2c39fc04c2
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
307 changed files with 14027 additions and 74 deletions

@@ -226,6 +226,7 @@
"packages/twenty-utils",
"packages/twenty-zapier",
"packages/twenty-website",
"packages/twenty-docs",
"packages/twenty-e2e-testing",
"packages/twenty-shared",
"packages/twenty-sdk",

@@ -0,0 +1,95 @@
# Mintlify Migration Summary
## What Was Migrated
### Documentation Files
- **69 MDX files** copied from twenty-website to twenty-docs
- **45 User Guide articles**
- **22 Developer documentation articles**
- **2 Getting Started guides** (existing)
### Images & Assets
- **81 images** copied to `public/images/`
- User guide screenshots
- Developer documentation images
- Logo and branding assets
### Navigation Structure
- Complete `mint.json` configuration with tabs and nested navigation
- User Guide tab with 11 sections
- Developers tab with 6 sections
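The tabs and nested groups live in `mint.json`. A trimmed, illustrative sketch of the structure (the actual tab, group, and page names in the repo may differ, and `user-guide/getting-started/what-is-twenty` is a hypothetical page path):

```json
{
  "name": "Twenty",
  "tabs": [
    { "name": "User Guide", "url": "user-guide" },
    { "name": "Developers", "url": "developers" }
  ],
  "navigation": [
    {
      "group": "Getting Started",
      "pages": ["user-guide/getting-started/what-is-twenty"]
    }
  ]
}
```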
## Components Converted
### Custom Components → Mintlify Equivalents
- `<ArticleWarning>` → `<Warning>`
- `<ArticleLink href="...">text</ArticleLink>` → `[text](...)`
- `<ArticleEditContent>` → Removed (not needed)
### Still Need Manual Review
Some components may need additional conversion:
- `<ArticleTabs>` - Mintlify uses `<Tabs>` component
- Embedded iframes/videos - May need adjustment
- Custom styled elements - Review for Mintlify compatibility
## Directory Structure
```
packages/twenty-docs/
├── mint.json # Main configuration
├── user-guide/
│ ├── getting-started/ # 7 files
│ ├── data-model/ # 6 files
│ ├── crm-essentials/ # 4 files
│ ├── views/ # 2 files
│ ├── workflows/ # 7 files
│ ├── collaboration/ # 3 files
│ ├── integrations-api/ # 3 files
│ ├── reporting/ # 1 file
│ ├── settings/ # 9 files
│ ├── pricing/ # 1 file
│ └── resources/ # 2 files
├── developers/
│ ├── self-hosting/ # 5 files
│ ├── api-and-webhooks/ # 2 files
│ ├── frontend-development/ # 8 files
│ ├── backend-development/ # 7 files
│ ├── local-setup.mdx
│ └── bug-and-requests.mdx
└── public/
└── images/ # 81 images
```
## Testing
Start the local Mintlify dev server:
```bash
npx nx run twenty-docs:dev
```
Open http://localhost:3000 to preview all migrated documentation.
## Deployment
To deploy to Mintlify:
1. Push changes to GitHub
2. Connect the repo in Mintlify dashboard
3. Set subdirectory to `packages/twenty-docs`
4. Mintlify will auto-deploy and generate search embeddings
## Next Steps
1. **Manual Review** - Check for any component conversion issues
2. **Fix Image Paths** - Verify all images render correctly
3. **Test Navigation** - Ensure all internal links work
4. **Deploy** - Push to production Mintlify
5. **Update Helper Agent** - Verify searchArticles tool works with full content
6. **Deprecate twenty-website docs** - Once migration is confirmed working
## Known Issues to Review
- ArticleTabs components may need manual conversion
- Some images may have incorrect paths
- Custom styled components may need adjustment
- Video embeds might need review

@@ -0,0 +1,49 @@
---
title: API
image: /images/docs/getting-started/api.png
info: Discover how to use our APIs.
---
<Frame>
<img src="/images/docs/getting-started/api.png" alt="Header" />
</Frame>
## Overview
The Twenty API allows developers to interact programmatically with the Twenty CRM platform. Using the API, you can integrate Twenty with other systems, automate data synchronization, and build custom solutions around your customer data. The API provides endpoints to **create, read, update, and delete** core CRM objects (such as people and companies) as well as access metadata configuration.
**API Playground:** You can now access the API Playground within the app's settings. To try out API calls in real-time, log in to your Twenty workspace and navigate to **Settings → APIs & Webhooks**. This opens the in-app API Playground and the settings for API keys.
**[Go to API Settings](https://app.twenty.com/settings)**
## Authentication
Twenty's API uses API keys for authentication. Every request to protected endpoints must include an API key in the header.
* **API Keys:** You can generate a new API key from your Twenty app's **API settings** page. Each API key is a secret token that grants access to your CRM data, so keep it safe. If a key is compromised, revoke it from the settings and generate a new one.
* **Auth Header:** Once you have an API key, include it in the `Authorization` header of your HTTP requests. Use the Bearer token scheme. For example:
```
Authorization: Bearer YOUR_API_KEY
```
Replace `YOUR_API_KEY` with the key you obtained. This header must be present on **all API requests**. If the token is missing or invalid, the API will respond with an authentication error (HTTP 401 Unauthorized).
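As a minimal TypeScript sketch, the required header can be built like this (`buildAuthHeaders` is an illustrative helper, not part of any Twenty SDK):

```typescript
// Illustrative helper (not part of any Twenty SDK): build the headers
// that every authenticated request must carry.
function buildAuthHeaders(apiKey: string): Record<string, string> {
  return {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
}

// Example: attach the headers to a REST call with fetch.
// const res = await fetch("https://api.twenty.com/rest/people", {
//   headers: buildAuthHeaders(process.env.TWENTY_API_KEY ?? ""),
// });
```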
## API Endpoints
All resources can be accessed via REST or GraphQL.
* **Cloud:** `https://api.twenty.com/` or your custom domain / sub-domain
* **Self-Hosted Instances:** If you are running Twenty on your own server, use your own domain in place of `api.twenty.com` (for example, `https://<your-domain>/rest/`).
Endpoints are grouped into two categories: **Core API** and **Metadata API**. The **Core API** deals with primary CRM data (e.g. people, companies, notes, tasks), while the **Metadata API** covers configuration data (like custom fields or object definitions). Most integrations will primarily use the Core API.
### Core API
Accessed on `/rest/` or `/graphql/`.
The **Core API** serves as a unified interface for managing core CRM entities (people, companies, notes, tasks) and their relationships, offering **both REST and GraphQL** interaction models.
### Metadata API
Accessed on `/rest/metadata/` or `/metadata/`.
The Metadata API endpoints allow you to retrieve information about your schema and settings. For instance, you can fetch definitions of custom fields, object schemas, etc.
* **Example Endpoints:**
* `GET /rest/metadata/objects`: List all object types and their metadata (fields, relationships).
* `GET /rest/metadata/objects/{objectName}`: Get metadata for a specific object (e.g., `people`, `companies`).
* `GET /rest/metadata/picklists`: Retrieve picklist (dropdown) field options defined in the CRM.
Typically, the metadata endpoints are used to understand the structure of data (for dynamic integrations or form-building) rather than to manage actual records. They are read-only in most cases. Authentication is required for these as well (use your API key).
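Since the metadata endpoints describe your schema, a typical use is driving dynamic UIs. A hedged sketch (the response shape below is simplified and illustrative; the real payload from the metadata API contains more fields):

```typescript
// Simplified, illustrative shape of an object-metadata response;
// the real API payload contains more fields.
type FieldMetadata = { name: string; label: string; type: string };
type ObjectMetadata = { nameSingular: string; fields: FieldMetadata[] };

// Derive form inputs from metadata instead of hard-coding them.
function buildFormFields(meta: ObjectMetadata): string[] {
  return meta.fields.map((field) => `${field.label} (${field.type})`);
}

const person: ObjectMetadata = {
  nameSingular: "person",
  fields: [
    { name: "firstName", label: "First Name", type: "TEXT" },
    { name: "email", label: "Email", type: "EMAIL" },
  ],
};
// buildFormFields(person) → ["First Name (TEXT)", "Email (EMAIL)"]
```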

@@ -0,0 +1,82 @@
---
title: Webhooks
image: /images/docs/getting-started/webhooks.png
info: Discover how to use our Webhooks.
---
<Frame>
<img src="/images/docs/getting-started/webhooks.png" alt="Header" />
</Frame>
## Overview
Webhooks in Twenty complement the API by enabling **real-time notifications** to your own applications when certain events happen in your CRM. Instead of continuously polling the API for changes, you can set up webhooks to have Twenty **push** data to your system whenever specific events occur (for example, when a new record is created or an existing record is updated). This helps keep external systems in sync with Twenty instantly and efficiently.
With webhooks, Twenty will send an HTTP POST request to a URL you specify, containing details about the event. You can then handle that data in your application (e.g., to update your external database, trigger workflows, or send alerts).
## Setting Up a Webhook
To create a webhook in Twenty, use the **APIs & Webhooks** settings in your Twenty app:
1. **Navigate to Settings:** In your Twenty application, go to **Settings → APIs & Webhooks**.
2. **Create a Webhook:** Under **Webhooks** click on **+ Create webhook**.
3. **Enter URL:** Provide the endpoint URL on your server where you want Twenty to send webhook requests. This should be a publicly accessible URL that can handle POST requests.
4. **Save:** Click **Save** to create the webhook. The new webhook will be active immediately.
You can create multiple webhooks if you need to send different events to different endpoints. Each webhook is essentially a subscription for all relevant events (at this time, Twenty sends all event types to the given URL; filtering specific event types may be configurable in the UI). If you ever need to remove a webhook, you can delete it from the same settings page (select the webhook and choose delete).
## Events and Payloads
Once a webhook is set up, Twenty will send an HTTP POST request to your specified URL whenever a trigger event occurs in your CRM data. Common events that trigger webhooks include:
* **Record Created:** e.g. a new person is added (`person.created`), a new company is created (`company.created`), a note is created (`note.created`), etc.
* **Record Updated:** e.g. an existing person's information is updated (`person.updated`), a company record is edited (`company.updated`), etc.
* **Record Deleted:** e.g. a person or company is deleted (`person.deleted`, `company.deleted`).
* **Other Events:** If applicable, other object events or custom triggers (for instance, if tasks or other objects are updated, similar event types would be used like `task.created`, `note.updated`, etc.).
The webhook POST request contains a JSON payload in its body. The payload will generally include at least two things: the type of event, and the data related to that event (often the record that was created/updated). For example, a webhook for a newly created person might send a payload like:
```
{
"event": "person.created",
"data": {
"id": "abc12345",
"firstName": "Alice",
"lastName": "Doe",
"email": "alice@example.com",
"createdAt": "2025-02-10T15:30:45Z",
"createdBy": "user_123"
},
"timestamp": "2025-02-10T15:30:50Z"
}
```
In this example:
* `"event"` specifies what happened (`person.created`).
* `"data"` contains the new record's details (the same information you would get if you requested that person via the API).
* `"timestamp"` is when the event occurred (in UTC).
Your endpoint should be prepared to receive such JSON data via POST. Typically, you'll parse the JSON, look at the `"event"` type to understand what happened, and then use the `"data"` accordingly (e.g., create a new contact in your system, or update an existing one).
**Note:** It's important to respond with a **2xx HTTP status** from your webhook endpoint to acknowledge successful receipt. If the Twenty webhook sender does not get a 2xx response, it may consider the delivery failed. (In the future, retry logic might attempt to resend failed webhooks, so always strive to return a 200 OK as quickly as possible after processing the data.)
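Putting this together, a receiving endpoint might look like the following sketch (plain TypeScript; the payload type and handler are illustrative, and you would wire `handleWebhook` into your own HTTP server):

```typescript
// Illustrative payload shape matching the example above.
type WebhookPayload = {
  event: string;
  data: Record<string, unknown>;
  timestamp: string;
};

// Parse the body, dispatch on the event type, and return the HTTP status
// your server should respond with (a 2xx acknowledges receipt).
function handleWebhook(rawBody: string): number {
  const payload: WebhookPayload = JSON.parse(rawBody);
  switch (payload.event) {
    case "person.created":
      // e.g. create the contact in your external system
      break;
    case "person.updated":
      // e.g. sync the changed fields
      break;
    default:
      // unknown events are still acknowledged
      break;
  }
  return 200;
}
```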
## Webhook Validation
To ensure the security of your webhook endpoints, Twenty includes a signature in the `X-Twenty-Webhook-Signature` header.
This signature is an HMAC SHA256 hash of the request payload, computed using your webhook secret.
To validate the signature, you'll need to:
1. Concatenate the timestamp (from `X-Twenty-Webhook-Timestamp` header), a colon, and the JSON string of the payload
2. Compute the HMAC SHA256 hash using your webhook secret as the key
3. Compare the resulting hex digest with the signature header
Here's an example in Node.js:
```javascript
const crypto = require("crypto");
const timestamp = "1735066639761"; // value of the X-Twenty-Webhook-Timestamp header
const payload = JSON.stringify({...}); // the JSON string of the request body
const secret = "your-secret";

const stringToSign = `${timestamp}:${payload}`;
const signature = crypto
  .createHmac("sha256", secret)
  .update(stringToSign)
  .digest("hex");
```
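To complete step 3, compare the computed digest with the header value; a constant-time comparison avoids leaking timing information. A sketch, again assuming Node.js, with hypothetical variable and function names:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Recompute the signature and compare it with the value of the
// X-Twenty-Webhook-Signature header in constant time.
function isValidSignature(
  timestamp: string,
  rawBody: string,
  secret: string,
  receivedSignature: string,
): boolean {
  const expected = createHmac("sha256", secret)
    .update(`${timestamp}:${rawBody}`)
    .digest("hex");
  const a = Buffer.from(expected, "hex");
  const b = Buffer.from(receivedSignature, "hex");
  // timingSafeEqual throws on length mismatch, so check lengths first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```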

@@ -0,0 +1,27 @@
---
title: Best Practices
image: /images/user-guide/tips/light-bulb.png
---
<Frame>
<img src="/images/user-guide/tips/light-bulb.png" alt="Header" />
</Frame>
This document outlines the best practices you should follow when working on the backend.
## Follow a modular approach
The backend follows a modular approach, which is a fundamental principle when working with NestJS. Make sure you break down your code into reusable modules to maintain a clean and organized codebase.
Each module should encapsulate a particular feature or functionality and have a well-defined scope. This modular approach enables clear separation of concerns and removes unnecessary complexities.
## Expose services to use in modules
Always create services that have a clear and single responsibility, which enhances code readability and maintainability. Name the services descriptively and consistently.
You should also expose services that you want to use in other modules. Exposing services to other modules is possible through NestJS's powerful dependency injection system, and promotes loose coupling between components.
## Avoid using `any` type
When you declare a variable as `any`, TypeScript's type checker doesn't perform any type checking, making it possible to assign any type of values to the variable. TypeScript uses type inference to determine the type of variable based on the value. By declaring it as `any`, TypeScript can no longer infer the type. This makes it hard to catch type-related errors during development, leading to runtime errors and makes the code less maintainable, less reliable, and harder to understand for others.
This is why everything should have a type. So if you create a new object with a first name and last name, you should create an interface or type that contains a first name and last name that defines the shape of the object you are manipulating.
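For example (a minimal sketch; the `Person` type and names are illustrative):

```typescript
// ❌ Bad: `any` disables type checking entirely.
// const person: any = { firstName: "Alice", lastName: "Doe" };

// ✅ Good: define the shape of the object you are manipulating.
type Person = {
  firstName: string;
  lastName: string;
};

const person: Person = { firstName: "Alice", lastName: "Doe" };

function getFullName(p: Person): string {
  return `${p.firstName} ${p.lastName}`;
}
// getFullName(person) → "Alice Doe"
```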

@@ -0,0 +1,44 @@
---
title: Custom Objects
image: /images/user-guide/objects/objects.png
---
<Frame>
<img src="/images/user-guide/objects/objects.png" alt="Header" />
</Frame>
Objects are structures that allow you to store data (records, attributes, and values) specific to an organization. Twenty provides both standard and custom objects.
Standard objects are built-in objects with a set of attributes available to all users. Examples of standard objects in Twenty include Company and Person. Standard objects have standard fields that are also available to all Twenty users, like Company.displayName.
Custom objects are objects that you can create to store information that is unique to your organization. They are not built-in; members of your workspace can create and customize custom objects to hold information that standard objects aren't suitable for.
## High-level schema
<div style={{textAlign: 'center'}}>
<img src="/images/docs/server/custom-object-schema.png" alt="High level schema" />
</div>
<br/>
## How it works
Custom objects come from metadata tables that determine the shape, name, and type of the objects. All this information is present in the metadata schema database, consisting of tables:
- **DataSource**: Details where the data is present.
- **Object**: Describes the object and links to a DataSource.
- **Field**: Outlines an Object's fields and connects to the Object.
To add a custom object, the workspaceMember will query the /metadata API. This updates the metadata accordingly and computes a GraphQL schema based on the metadata, storing it in a GQL cache for later use.
<div style={{textAlign: 'center'}}>
<img src="/images/docs/server/add-custom-objects.jpeg" alt="Query the /metadata API to add custom objects" />
</div>
<br/>
To fetch data, the process involves making queries through the /graphql endpoint and passing them through the Query Resolver.
<div style={{textAlign: 'center'}}>
<img src="/images/docs/server/custom-object-schema.png" alt="Query the /graphql endpoint to fetch data" />
</div>
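The relationship between the three metadata tables can be sketched in TypeScript (hypothetical, heavily simplified shapes; the real column names in Twenty's metadata schema differ):

```typescript
// Hypothetical, simplified shapes for the metadata tables described above.
type DataSource = { id: string; schema: string };
type ObjectMetadata = { id: string; dataSourceId: string; nameSingular: string };
type FieldMetadata = { id: string; objectMetadataId: string; name: string; type: string };

// A Field connects to its Object, which links to a DataSource:
const dataSource: DataSource = { id: "ds_1", schema: "workspace_abc" };
const companyObject: ObjectMetadata = {
  id: "obj_1",
  dataSourceId: dataSource.id,
  nameSingular: "company",
};
const domainField: FieldMetadata = {
  id: "field_1",
  objectMetadataId: companyObject.id,
  name: "domainName",
  type: "TEXT",
};
```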

@@ -0,0 +1,54 @@
---
title: Feature Flags
image: /images/user-guide/table-views/table.png
---
<Frame>
<img src="/images/user-guide/table-views/table.png" alt="Header" />
</Frame>
Feature flags are used to hide experimental features. For Twenty, they are set on workspace level and not on a user level.
## Adding a new feature flag
In `FeatureFlagKey.ts` add the feature flag:
```ts
type FeatureFlagKey =
| 'IS_FEATURENAME_ENABLED'
| ...;
```
Also add it to the enum in `feature-flag.entity.ts`:
```ts
enum FeatureFlagKeys {
IsFeatureNameEnabled = 'IS_FEATURENAME_ENABLED',
...
}
```
To apply a feature flag on a **backend** feature use:
```ts
@Gate({
featureFlag: 'IS_FEATURENAME_ENABLED',
})
```
To apply a feature flag on a **frontend** feature use:
```ts
const isFeatureNameEnabled = useIsFeatureEnabled('IS_FEATURENAME_ENABLED');
```
## Configure feature flags for the deployment
Change the corresponding record in the Table `core.featureFlag`:
| id | key | workspaceId | value |
|----------|--------------------------|---------------|--------|
| Random | `IS_FEATURENAME_ENABLED` | WorkspaceID | `true` |

@@ -0,0 +1,130 @@
---
title: Folder Architecture
info: A detailed look into our server folder architecture
image: /images/user-guide/fields/field.png
---
<Frame>
<img src="/images/user-guide/fields/field.png" alt="Header" />
</Frame>
The backend directory structure is as follows:
```
server
└───ability
└───constants
└───core
└───database
└───decorators
└───filters
└───guards
└───health
└───integrations
└───metadata
└───workspace
└───utils
```
## Ability
Defines permissions and includes handlers for each entity.
## Decorators
Defines custom decorators in NestJS for added functionality.
See [custom decorators](https://docs.nestjs.com/custom-decorators) for more details.
## Filters
Includes exception filters to handle exceptions that might occur in GraphQL endpoints.
## Guards
See [guards](https://docs.nestjs.com/guards) for more details.
## Health
Includes a publicly available REST endpoint (`healthz`) that returns a JSON response confirming whether the database is working as expected.
## Metadata
Defines custom objects and makes available a GraphQL API (graphql/metadata).
## Workspace
Generates and serves custom GraphQL schema based on the metadata.
### Workspace Directory Structure
```
workspace
└───workspace-schema-builder
└───factories
└───graphql-types
└───database
└───interfaces
└───object-definitions
└───services
└───storage
└───utils
└───workspace-resolver-builder
└───factories
└───interfaces
└───workspace-query-builder
└───factories
└───interfaces
└───workspace-query-runner
└───interfaces
└───utils
└───workspace-datasource
└───workspace-manager
└───workspace-migration-runner
└───utils
└───workspace.module.ts
└───workspace.factory.spec.ts
└───workspace.factory.ts
```
The root of the workspace directory includes `workspace.factory.ts`, a file containing the `createGraphQLSchema` function. This function uses the metadata to generate a schema tailored to each individual workspace. Schema and resolver construction are kept separate, and the `makeExecutableSchema` function combines these discrete elements.
This strategy is not just about organization, but also helps with optimization, such as caching generated type definitions to enhance performance and scalability.
### Workspace Schema builder
Generates the GraphQL schema, and includes:
#### Factories:
Specialised constructors to generate GraphQL-related constructs.
- The `type.factory` translates field metadata into GraphQL types using `TypeMapperService`.
- The `type-definition.factory` creates GraphQL input or output objects derived from `objectMetadata`.
#### GraphQL Types
Includes enumerations, inputs, objects, and scalars, and serves as the building blocks for the schema construction.
#### Interfaces and Object Definitions
Contains the blueprints for GraphQL entities, and includes both predefined and custom types like `MONEY` or `URL`.
#### Services
Contains the service responsible for associating FieldMetadataType with its appropriate GraphQL scalar or query modifiers.
#### Storage
Includes the `TypeDefinitionsStorage` class that contains reusable type definitions, preventing duplication of GraphQL types.
### Workspace Resolver Builder
Creates resolver functions for querying and mutating the GraphQL schema.
Each factory in this directory is responsible for producing a distinct resolver type, such as the `FindManyResolverFactory`, designed for adaptable application across various tables.
### Workspace Query Runner
Runs the generated queries on the database and parses the result.

@@ -0,0 +1,46 @@
---
title: Message Queue
image: /images/user-guide/emails/emails_header.png
---
<Frame>
<img src="/images/user-guide/emails/emails_header.png" alt="Header" />
</Frame>
Queues allow operations to be performed asynchronously. They can be used for background tasks such as sending a welcome email on registration.
Each use case will have its own queue class extended from `MessageQueueServiceBase`.
Currently, [BullMQ](https://bullmq.io/) is the only supported queue driver.
## Steps to create and use a new queue
1. Add a queue name for your new queue under enum `MESSAGE_QUEUES`.
2. Provide the factory implementation of the queue with the queue name as the dependency token.
3. Inject the queue that you created in the required module/service with the queue name as the dependency token.
4. Add worker class with token based injection just like producer.
### Example usage
```ts
class Resolver {
constructor(@Inject(MESSAGE_QUEUES.custom) private queue: MessageQueueService) {}
async onSomeAction() {
//business logic
await this.queue.add(someData);
}
}
//async worker
class CustomWorker {
constructor(@Inject(MESSAGE_QUEUES.custom) private queue: MessageQueueService) {
this.initWorker();
}
async initWorker() {
await this.queue.work(async ({ id, data }) => {
//worker logic
});
}
}
```

@@ -0,0 +1,103 @@
---
title: Backend Commands
image: /images/user-guide/kanban-views/kanban.png
---
<Frame>
<img src="/images/user-guide/kanban-views/kanban.png" alt="Header" />
</Frame>
## Useful commands
These commands should be executed from the `packages/twenty-server` folder.
From any other folder, you can run `npx nx <command> twenty-server` (or `npx nx run twenty-server:<command>`).
### First time setup
```
npx nx database:reset twenty-server # setup the database with dev seeds
```
### Starting the server
```
npx nx run twenty-server:start
```
### Lint
```
npx nx run twenty-server:lint # pass --fix to fix lint errors
```
### Test
```
npx nx run twenty-server:test:unit # run unit tests
npx nx run twenty-server:test:integration # run integration tests
```
Note: you can run `npx nx run twenty-server:test:integration:with-db-reset` in case you need to reset the database before running the integration tests.
### Resetting the database
If you want to reset and seed the database, you can run the following command:
```bash
npx nx run twenty-server:database:reset
```
<Warning>
This will drop the database and re-run the migrations and seed.
Make sure to back up any data you want to keep before running this command.
</Warning>
### Migrations
#### For objects in Core/Metadata schemas (TypeORM)
```bash
npx nx run twenty-server:typeorm migration:generate src/database/typeorm/core/migrations/nameOfYourMigration -d src/database/typeorm/core/core.datasource.ts
```
#### For Workspace objects
There are no migration files; migrations are generated automatically for each workspace, stored in the database, and applied with this command:
```bash
npx nx run twenty-server:command workspace:sync-metadata -f
```
## Tech Stack
Twenty primarily uses NestJS for the backend.
Prisma was the first ORM the project used. But to allow users to create custom fields and custom objects, a lower-level approach made more sense, since fine-grained control is needed. The project now uses TypeORM.
Here's what the tech stack now looks like.
**Core**
- [NestJS](https://nestjs.com/)
- [TypeORM](https://typeorm.io/)
- [GraphQL Yoga](https://the-guild.dev/graphql/yoga-server)
**Database**
- [Postgres](https://www.postgresql.org/)
**Third-party integrations**
- [Sentry](https://sentry.io/welcome/) for tracking bugs
**Testing**
- [Jest](https://jestjs.io/)
**Tooling**
- [Yarn](https://yarnpkg.com/)
- [ESLint](https://eslint.org/)
**Development**
- [AWS EKS](https://aws.amazon.com/eks/)

@@ -0,0 +1,79 @@
---
title: Zapier App
image: /images/user-guide/integrations/plug.png
---
<Frame>
<img src="/images/user-guide/integrations/plug.png" alt="Header" />
</Frame>
Effortlessly sync Twenty with 3000+ apps using [Zapier](https://zapier.com/). Automate tasks, boost productivity, and supercharge your customer relationships!
## About Zapier
Zapier is a tool that allows you to automate workflows by connecting the apps your team uses every day. The fundamental concept of Zapier is automation workflows, called Zaps, which consist of triggers and actions.
You can learn more about how Zapier works [here](https://zapier.com/how-it-works).
## Setup
### Step 1: Install Zapier packages
```bash
cd packages/twenty-zapier
yarn
```
### Step 2: Login with the CLI
Use your Zapier credentials to log in using the CLI:
```bash
zapier login
```
### Step 3: Set environment variables
From the `packages/twenty-zapier` folder, run:
```bash
cp .env.example .env
```
Run the application locally, go to [http://localhost:3000/settings/api-webhooks](http://localhost:3000/settings/api-webhooks), and generate an API key.
Replace the **YOUR_API_KEY** value in the `.env` file with the API key you just generated.
## Development
<Warning>
Make sure to run `yarn build` before any `zapier` command.
</Warning>
### Test
```bash
yarn test
```
### Lint
```bash
yarn format
```
### Watch and compile as you edit code
```bash
yarn watch
```
### Validate your Zapier app
```bash
yarn validate
```
### Deploy your Zapier app
```bash
yarn deploy
```
### List all Zapier CLI commands
```bash
zapier
```

@@ -0,0 +1,18 @@
---
title: Bugs and Requests
image: /images/user-guide/api/api.png
info: Ask for help on GitHub or Discord
---
<Frame>
<img src="/images/user-guide/api/api.png" alt="Header" />
</Frame>
## Reporting Bugs
To report a bug, please [create an issue on GitHub](https://github.com/twentyhq/twenty/issues/new).
You can also ask for help on [Discord](https://discord.gg/cx5n4Jzs57).
## Feature Requests
If you're not sure if it's a bug, and you feel it's closer to a feature request, then you should probably [open a discussion instead](https://github.com/twentyhq/twenty/discussions/new).

@@ -0,0 +1,332 @@
---
title: Best Practices
image: /images/user-guide/tips/light-bulb.png
---
<Frame>
<img src="/images/user-guide/tips/light-bulb.png" alt="Header" />
</Frame>
This document outlines the best practices you should follow when working on the frontend.
## State management
React and Recoil handle state management in the codebase.
### Use `useRecoilState` to store state
It's good practice to create as many atoms as you need to store your state.
<Warning>
It's better to use extra atoms than to resort to prop drilling.
</Warning>
```tsx
export const myAtomState = atom({
key: 'myAtomState',
default: 'default value',
});
export const MyComponent = () => {
const [myAtom, setMyAtom] = useRecoilState(myAtomState);
return (
<div>
<input
value={myAtom}
onChange={(e) => setMyAtom(e.target.value)}
/>
</div>
);
}
```
### Do not use `useRef` to store state
Avoid using `useRef` to store state.
If you want to store state, you should use `useState` or `useRecoilState`.
See [how to manage re-renders](#managing-re-renders) if you feel like you need `useRef` to prevent some re-renders from happening.
## Managing re-renders
Re-renders can be hard to manage in React.
Here are some rules to follow to avoid unnecessary re-renders.
Keep in mind that you can **always** avoid re-renders by understanding their cause.
### Work at the root level
The easiest way to avoid re-renders in new features is to eliminate them at the root level.
The `PageChangeEffect` sidecar component contains just one `useEffect` that holds all the logic to execute on a page change.
That way you know that there's just one place that can trigger a re-render.
### Always think twice before adding `useEffect` in your codebase
Re-renders are often caused by unnecessary `useEffect`.
You should think about whether you really need `useEffect`, or whether you can move the logic into an event handler function.
You'll generally find it easy to move the logic into a `handleClick` or `handleChange` function.
You can also find them in libraries like Apollo: `onCompleted`, `onError`, etc.
### Use a sibling component to extract `useEffect` or data fetching logic
If you feel like you need to add a `useEffect` in your root component, you should consider extracting it in a sidecar component.
You can apply the same for data fetching logic, with Apollo hooks.
```tsx
// ❌ Bad, will cause re-renders even if data is not changing,
// because useEffect needs to be re-evaluated
export const PageComponent = () => {
const [data, setData] = useRecoilState(dataState);
const [someDependency] = useRecoilState(someDependencyState);
useEffect(() => {
if(someDependency !== data) {
setData(someDependency);
}
}, [someDependency]);
return <div>{data}</div>;
};
export const App = () => (
<RecoilRoot>
<PageComponent />
</RecoilRoot>
);
```
```tsx
// ✅ Good, will not cause re-renders if data is not changing,
// because useEffect is re-evaluated in another sibling component
export const PageComponent = () => {
const [data, setData] = useRecoilState(dataState);
return <div>{data}</div>;
};
export const PageData = () => {
const [data, setData] = useRecoilState(dataState);
const [someDependency] = useRecoilState(someDependencyState);
useEffect(() => {
if(someDependency !== data) {
setData(someDependency);
}
}, [someDependency]);
return <></>;
};
export const App = () => (
<RecoilRoot>
<PageData />
<PageComponent />
</RecoilRoot>
);
```
### Use recoil family states and recoil family selectors
Recoil family states and selectors are a great way to avoid re-renders.
They are useful when you need to store a list of items.
### You shouldn't use `React.memo(MyComponent)`
Avoid using `React.memo()` because it does not solve the cause of the re-render, but instead breaks the re-render chain, which can lead to unexpected behavior and make the code very hard to refactor.
### Limit `useCallback` or `useMemo` usage
They are often unnecessary and make the code harder to read and maintain, for a performance gain that is unnoticeable.
## Console.logs
`console.log` statements are valuable during development, offering real-time insights into variable values and code flow. But, leaving them in production code can lead to several issues:
1. **Performance**: Excessive logging can affect the runtime performance, especially on client-side applications.
2. **Security**: Logging sensitive data can expose critical information to anyone who inspects the browser's console.
3. **Cleanliness**: Filling up the console with logs can obscure important warnings or errors that developers or tools need to see.
4. **Professionalism**: End users or clients checking the console and seeing a myriad of log statements might question the code's quality and polish.
Make sure you remove all `console.log` statements before pushing code to production.
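One way to enforce this (a hedged sketch of ESLint's built-in `no-console` rule, not necessarily this repository's actual configuration) is:

```json
{
  "rules": {
    "no-console": ["warn", { "allow": ["error", "warn"] }]
  }
}
```

With this rule, the linter flags every stray `console.log` while still allowing intentional `console.error` and `console.warn` calls.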
## Naming
### Variable Naming
Variable names ought to precisely depict the purpose or function of the variable.
#### The issue with generic names
Generic names in programming are not ideal because they lack specificity, leading to ambiguity and reduced code readability. Such names fail to convey the variable or function's purpose, making it challenging for developers to understand the code's intent without deeper investigation. This can result in increased debugging time, higher susceptibility to errors, and difficulties in maintenance and collaboration. Meanwhile, descriptive naming makes the code self-explanatory and easier to navigate, enhancing code quality and developer productivity.
```tsx
// ❌ Bad, uses a generic name that doesn't communicate its
// purpose or content clearly
const [value, setValue] = useState('');
```
```tsx
// ✅ Good, uses a descriptive name
const [email, setEmail] = useState('');
```
#### Some words to avoid in variable names
- dummy
### Event handlers
Event handler names should start with `handle`, while `on` is a prefix used to name events in components props.
```tsx
// ❌ Bad
const onEmailChange = (val: string) => {
// ...
};
```
```tsx
// ✅ Good
const handleEmailChange = (val: string) => {
// ...
};
```
## Optional Props
Avoid passing the default value for an optional prop.
**EXAMPLE**
Take the `EmailField` component defined below:
```tsx
type EmailFieldProps = {
value: string;
disabled?: boolean;
};
const EmailField = ({ value, disabled = false }: EmailFieldProps) => (
<TextInput value={value} disabled={disabled} fullWidth />
);
```
**Usage**
```tsx
// ❌ Bad, passing in the same value as the default value adds no value
const Form = () => <EmailField value="username@email.com" disabled={false} />;
```
```tsx
// ✅ Good, assumes the default value
const Form = () => <EmailField value="username@email.com" />;
```
## Component as props
Whenever possible, pass uninstantiated components as props, so that children can decide for themselves which props to pass.
The most common example of this is icon components:
```tsx
const SomeParentComponent = () => <MyComponent MyIcon={MyIcon} />;

// In MyComponent
const MyComponent = ({ MyIcon }: { MyIcon: IconComponent }) => {
  const theme = useTheme();

  return (
    <div>
      <MyIcon size={theme.icon.size.md} />
    </div>
  );
};
```
For React to recognize the component as a component, its name must be in PascalCase, so you can later instantiate it with `<MyIcon />`.
## Prop Drilling: Keep It Minimal
Prop drilling, in the React context, refers to the practice of passing state variables and their setters through many component layers, even if intermediary components don't use them. While sometimes necessary, excessive prop drilling can lead to:
1. **Decreased Readability**: Tracing where a prop originates or where it's utilized can become convoluted in a deeply nested component structure.
2. **Maintenance Challenges**: Changes in one component's prop structure might require adjustments in several components, even if they don't directly use the prop.
3. **Reduced Component Reusability**: A component receiving a lot of props solely for passing them down becomes less general-purpose and harder to reuse in different contexts.
If you feel that you are using excessive prop drilling, see [state management best practices](#state-management).
## Imports
When importing, opt for the designated aliases rather than specifying complete or relative paths.
**The Aliases**
```js
{
alias: {
"~": path.resolve(__dirname, "src"),
"@": path.resolve(__dirname, "src/modules"),
"@testing": path.resolve(__dirname, "src/testing"),
},
}
```
**Usage**
```tsx
// ❌ Bad, specifies the entire relative path
import {
CatalogDecorator
} from '../../../../../testing/decorators/CatalogDecorator';
import {
ComponentDecorator
} from '../../../../../testing/decorators/ComponentDecorator';
```
```tsx
// ✅ Good, utilizes the designated aliases
import { CatalogDecorator } from '~/testing/decorators/CatalogDecorator';
import { ComponentDecorator } from 'twenty-ui/testing';
```
## Schema Validation
[Zod](https://github.com/colinhacks/zod) is the schema validator for untyped objects:
```ts
const validationSchema = z
.object({
exist: z.boolean(),
email: z
.string()
.email('Email must be a valid email'),
password: z
.string()
.regex(PASSWORD_REGEX, 'Password must contain at least 8 characters'),
})
.required();
type Form = z.infer<typeof validationSchema>;
```
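As a hedged usage sketch (hypothetical input values, using Zod's `safeParse` API), this is how such a schema is typically consumed:

```ts
// safeParse returns a result object instead of throwing on failure
const result = validationSchema.safeParse({
  exist: true,
  email: 'not-an-email',
  password: 'short',
});

if (result.success) {
  // result.data is fully typed as Form
  const form: Form = result.data;
} else {
  // result.error.issues lists each failed field with its message
  const messages = result.error.issues.map((issue) => issue.message);
}
```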
## Breaking Changes
Given that tests are not yet extensively integrated, always perform thorough manual testing before merging to make sure your modifications haven't caused disruptions elsewhere.

---
title: Folder Architecture
info: A detailed look into our folder architecture
image: /images/user-guide/fields/field.png
---
<Frame>
<img src="/images/user-guide/fields/field.png" alt="Header" />
</Frame>
In this guide, you will explore the details of the project directory structure and how it contributes to the organization and maintainability of Twenty.
By following this folder architecture convention, it's easier to find the files related to specific features and ensure that the application is scalable and maintainable.
```
front
└───modules
│ └───module1
│ │ └───submodule1
│ └───module2
│ └───ui
│ │ └───display
│ │ └───inputs
│ │ │ └───buttons
│ │ └───...
└───pages
└───...
```
## Pages
Includes the top-level components defined by the application routes. They import lower-level components from the modules folder (more details below).
## Modules
Each module represents a feature or a group of features, comprising its specific components, states, and operational logic.
They should all follow the structure below. You can nest modules within modules (referred to as submodules) and the same rules will apply.
```
module1
└───components
│ └───component1
│ └───component2
└───constants
└───contexts
└───graphql
│ └───fragments
│ └───queries
│ └───mutations
└───hooks
│ └───internal
└───states
│ └───selectors
└───types
└───utils
```
### Contexts
A context is a way to pass data through the component tree without having to pass props down manually at every level.
See [React Context](https://react.dev/reference/react#context-hooks) for more details.
### GraphQL
Includes fragments, queries, and mutations.
See [GraphQL](https://graphql.org/learn/) for more details.
- Fragments
A fragment is a reusable piece of a query, which you can use in different places. By using fragments, it's easier to avoid duplicating code.
See [GraphQL Fragments](https://graphql.org/learn/queries/#fragments) for more details.
- Queries
See [GraphQL Queries](https://graphql.org/learn/queries/) for more details.
- Mutations
See [GraphQL Mutations](https://graphql.org/learn/queries/#mutations) for more details.
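As a hedged sketch (hypothetical types, not from Twenty's actual schema), a fragment reused by a query looks like:

```graphql
# Hypothetical fragment: a reusable field selection on a Person type
fragment PersonFields on Person {
  id
  name
  email
}

# The query reuses the fragment instead of repeating the fields
query GetPeople {
  people {
    ...PersonFields
  }
}
```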
### Hooks
See [Hooks](https://react.dev/learn/reusing-logic-with-custom-hooks) for more details.
### States
Contains the state management logic. [RecoilJS](https://recoiljs.org) handles this.
- Selectors: See [RecoilJS Selectors](https://recoiljs.org/docs/basic-tutorial/selectors) for more details.
React's built-in state management still handles state within a component.
### Utils
Should just contain reusable pure functions. Otherwise, create custom hooks in the `hooks` folder.
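As a sketch (hypothetical util, not an actual file in the codebase), a `utils` entry is a pure function with no state and no side effects:

```typescript
// utils/formatDisplayName.ts (hypothetical)
// Pure: same inputs always produce the same output, no side effects
export const formatDisplayName = (firstName: string, lastName: string): string =>
  `${firstName} ${lastName}`.trim();
```

Anything that needs React state or side effects belongs in a custom hook in the `hooks` folder instead.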
## UI
Contains all the reusable UI components used in the application.
This folder can contain sub-folders, like `data`, `display`, `feedback`, and `input` for specific types of components. Each component should be self-contained and reusable, so that you can use it in different parts of the application.
By separating the UI components from the other components in the `modules` folder, it's easier to maintain a consistent design and to make changes to the UI without affecting other parts (business logic) of the codebase.
## Interface and dependencies
A module can import code from any other module, except from the `ui` folder, which must not depend on module code. This keeps UI code easy to test.
### Internal
Each part (hooks, states, ...) of a module can have an `internal` folder, which contains the parts that are only used within the module.

---
title: Frontend Commands
image: /images/user-guide/create-workspace/workspace-cover.png
---
<Frame>
<img src="/images/user-guide/create-workspace/workspace-cover.png" alt="Header" />
</Frame>
## Useful commands
### Starting the app
```bash
npx nx start twenty-front
```
### Regenerate the GraphQL schema based on the API GraphQL schema
```bash
npx nx run twenty-front:graphql:generate --configuration=metadata
```
OR
```bash
npx nx run twenty-front:graphql:generate
```
### Lint
```bash
npx nx run twenty-front:lint # pass --fix to fix lint errors
```
### Translations
```bash
npx nx run twenty-front:lingui:extract
npx nx run twenty-front:lingui:compile
```
### Test
```bash
npx nx run twenty-front:test # run jest tests
npx nx run twenty-front:storybook:serve:dev # run storybook
npx nx run twenty-front:storybook:test # run storybook tests (needs storybook:serve:dev to be running)
npx nx run twenty-front:storybook:coverage # (needs storybook:serve:dev to be running)
```
## Tech Stack
The project has a clean and simple stack, with minimal boilerplate code.
**App**
- [React](https://react.dev/)
- [Apollo](https://www.apollographql.com/docs/)
- [GraphQL Codegen](https://the-guild.dev/graphql/codegen)
- [Recoil](https://recoiljs.org/docs/introduction/core-concepts)
- [TypeScript](https://www.typescriptlang.org/)
**Testing**
- [Jest](https://jestjs.io/)
- [Storybook](https://storybook.js.org/)
**Tooling**
- [Yarn](https://yarnpkg.com/)
- [Craco](https://craco.js.org/docs/)
- [ESLint](https://eslint.org/)
## Architecture
### Routing
[React Router](https://reactrouter.com/) handles the routing.
To avoid unnecessary [re-renders](/contributor/frontend/best-practices#managing-re-renders), all the routing logic lives in a `useEffect` in `PageChangeEffect`.
### State Management
[Recoil](https://recoiljs.org/docs/introduction/core-concepts) handles state management.
See [best practices](/developers/section/frontend-development/best-practices-front#state-management) for more information on state management.
## Testing
[Jest](https://jestjs.io/) serves as the tool for unit testing while [Storybook](https://storybook.js.org/) is for component testing.
Jest is mainly for testing utility functions, and not components themselves.
Storybook is for testing the behavior of isolated components, as well as displaying the design system.

---
title: Hotkeys
image: /images/user-guide/table-views/table.png
---
<Frame>
<img src="/images/user-guide/table-views/table.png" alt="Header" />
</Frame>
## Introduction
When you need to listen to a hotkey, you would normally use the `onKeyDown` event listener.
In `twenty-front` however, you might have conflicts between same hotkeys that are used in different components, mounted at the same time.
For example, if you have a page that listens for the Enter key, and a modal that listens for the Enter key, with a Select component inside that modal that listens for the Enter key, you might have a conflict when all are mounted at the same time.
## The `useScopedHotkeys` hook
To handle this problem, we have a custom hook that makes it possible to listen to hotkeys without any conflict.
You place it in a component, and it will listen to the hotkeys only when the component is mounted AND when the specified **hotkey scope** is active.
## How to listen for hotkeys in practice?
There are two steps involved in setting up hotkey listening:
1. Set the [hotkey scope](#what-is-a-hotkey-scope-) that will listen to hotkeys
2. Use the `useScopedHotkeys` hook to listen to hotkeys
Setting up hotkey scopes is required even on simple pages, because other UI elements like the left menu or the command menu might also listen to hotkeys.
## Use cases for hotkeys
In general, you'll have two use cases that require hotkeys:
1. In a page or a component mounted in a page
2. In a modal-type component that takes the focus due to a user action
The second use case can happen recursively: a dropdown inside a modal, for example.
### Listening to hotkeys in a page
Example:
```tsx
const PageListeningEnter = () => {
const {
setHotkeyScopeAndMemorizePreviousScope,
goBackToPreviousHotkeyScope,
} = usePreviousHotkeyScope();
// 1. Set the hotkey scope in a useEffect
useEffect(() => {
setHotkeyScopeAndMemorizePreviousScope(
ExampleHotkeyScopes.ExampleEnterPage,
);
// Revert to the previous hotkey scope when the component is unmounted
return () => {
goBackToPreviousHotkeyScope();
};
}, [goBackToPreviousHotkeyScope, setHotkeyScopeAndMemorizePreviousScope]);
// 2. Use the useScopedHotkeys hook
useScopedHotkeys(
Key.Enter,
() => {
// Some logic executed on this page when the user presses Enter
// ...
},
ExampleHotkeyScopes.ExampleEnterPage,
);
return <div>My page that listens for Enter</div>;
};
```
### Listening to hotkeys in a modal-type component
For this example we'll use a modal component that listens for the Escape key to tell its parent to close it.
Here, it's the user interaction that changes the scope.
```tsx
const ExamplePageWithModal = () => {
const [showModal, setShowModal] = useState(false);
const {
setHotkeyScopeAndMemorizePreviousScope,
goBackToPreviousHotkeyScope,
} = usePreviousHotkeyScope();
const handleOpenModalClick = () => {
// 1. Set the hotkey scope when user opens the modal
setShowModal(true);
setHotkeyScopeAndMemorizePreviousScope(
ExampleHotkeyScopes.ExampleModal,
);
};
const handleModalClose = () => {
// 1. Revert to the previous hotkey scope when the modal is closed
setShowModal(false);
goBackToPreviousHotkeyScope();
};
return <div>
<h1>My page with a modal</h1>
<button onClick={handleOpenModalClick}>Open modal</button>
{showModal && <MyModalComponent onClose={handleModalClose} />}
</div>;
};
```
Then, in the modal component:
```tsx
const MyModalComponent = ({ onClose }: { onClose: () => void }) => {
// 2. Use the useScopedHotkeys hook to listen for Escape.
// Note that escape is a common hotkey that could be used by many other components
// So it's important to use a hotkey scope to avoid conflicts
useScopedHotkeys(
Key.Escape,
() => {
      onClose();
},
ExampleHotkeyScopes.ExampleModal,
);
return <div>My modal component</div>;
};
```
Use this pattern whenever you're not sure that a simple `useEffect` with mount/unmount logic is enough to avoid conflicts.
Such conflicts can be hard to debug, and with `useEffect`-based approaches they happen more often than you might expect.
## What is a hotkey scope?
A hotkey scope is a string that represents a context in which the hotkeys are active. It is generally encoded as an enum.
When you change the hotkey scope, the hotkeys that are listening to this scope will be enabled and the hotkeys that are listening to other scopes will be disabled.
You can set only one scope at a time.
As an example, the hotkey scopes for each page are defined in the `PageHotkeyScope` enum:
```tsx
export enum PageHotkeyScope {
Settings = 'settings',
CreateWorkspace = 'create-workspace',
SignInUp = 'sign-in-up',
CreateProfile = 'create-profile',
PlanRequired = 'plan-required',
ShowPage = 'show-page',
PersonShowPage = 'person-show-page',
CompanyShowPage = 'company-show-page',
CompaniesPage = 'companies-page',
PeoplePage = 'people-page',
OpportunitiesPage = 'opportunities-page',
ProfilePage = 'profile-page',
WorkspaceMemberPage = 'workspace-member-page',
TaskPage = 'task-page',
}
```
Internally, the currently selected scope is stored in a Recoil state that is shared across the application:
```tsx
export const currentHotkeyScopeState = createState<HotkeyScope>({
key: 'currentHotkeyScopeState',
defaultValue: INITIAL_HOTKEYS_SCOPE,
});
```
You should never manipulate this Recoil state directly; always go through the hooks described above.
## How does it work internally?
We made a thin wrapper on top of [react-hotkeys-hook](https://react-hotkeys-hook.vercel.app/docs/intro) that makes it more performant and avoids unnecessary re-renders.
We also create a Recoil state to handle the hotkey scope state and make it available everywhere in the application.

---
title: Storybook
description: Browse Twenty's UI component library
---
View our complete component library and documentation in Storybook.
[Open Storybook →](https://storybook.twenty.com)

---
title: Style Guide
image: /images/user-guide/notes/notes_header.png
---
<Frame>
<img src="/images/user-guide/notes/notes_header.png" alt="Header" />
</Frame>
This document includes the rules to follow when writing code.
The goal here is to have a consistent codebase, which is easy to read and easy to maintain.
For this, it's better to be a bit more verbose than to be too concise.
Always keep in mind that people read code more often than they write it, especially on an open source project where anyone can contribute.
There are a lot of rules that are not defined here, but that are automatically checked by linters.
## React
### Use functional components
Always use TSX functional components.
Do not use default exports; they are harder to read and harder to import with code completion.
```tsx
// ❌ Bad, default export is harder to import with code completion
const MyComponent = () => {
  return <div>Hello World</div>;
};
export default MyComponent;

// ✅ Good, named export is easy to import with code completion
export const MyComponent = () => {
  return <div>Hello World</div>;
};
```
### Props
Create the type of the props and call it `(ComponentName)Props` if there's no need to export it.
Use props destructuring.
```tsx
// ❌ Bad, no type
export const MyComponent = (props) => <div>Hello {props.name}</div>;
// ✅ Good, type
type MyComponentProps = {
name: string;
};
export const MyComponent = ({ name }: MyComponentProps) => <div>Hello {name}</div>;
```
#### Refrain from using `React.FC` or `React.FunctionComponent` to define prop types
```tsx
/* ❌ - Bad, defines the component type annotations with `FC`
* - With `React.FC`, the component implicitly accepts a `children` prop
* even if it's not defined in the prop type. This might not always be
* desirable, especially if the component doesn't intend to render
* children.
*/
const EmailField: React.FC<{
value: string;
}> = ({ value }) => <TextInput value={value} disabled fullWidth />;
```
```tsx
/* ✅ - Good, a separate type (EmailFieldProps) is explicitly defined for the
 * component's props
 * - This method doesn't automatically include the children prop. If
 * you want to include it, you have to specify it in EmailFieldProps.
 */
type EmailFieldProps = {
value: string;
};
const EmailField = ({ value }: EmailFieldProps) => (
<TextInput value={value} disabled fullWidth />
);
```
#### No Single Variable Prop Spreading in JSX Elements
Avoid using single variable prop spreading in JSX elements, like `{...props}`. This practice often results in code that is less readable and harder to maintain because it's unclear which props the component is receiving.
```tsx
/* ❌ - Bad, spreads a single variable prop into the underlying component
*/
const MyComponent = (props: OwnProps) => {
return <OtherComponent {...props} />;
}
```
```tsx
/* ✅ - Good, Explicitly lists all props
* - Enhances readability and maintainability
*/
const MyComponent = ({ prop1, prop2, prop3 }: MyComponentProps) => {
return <OtherComponent {...{ prop1, prop2, prop3 }} />;
};
```
Rationale:
- At a glance, it's clearer which props the code passes down, making it easier to understand and maintain.
- It helps to prevent tight coupling between components via their props.
- Linting tools make it easier to identify misspelled or unused props when you list props explicitly.
## JavaScript
### Use nullish-coalescing operator `??`
```tsx
// ❌ Bad, can return 'default' even if value is 0 or ''
const value = process.env.MY_VALUE || 'default';
// ✅ Good, will return 'default' only if value is null or undefined
const value = process.env.MY_VALUE ?? 'default';
```
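A runnable illustration of the difference (hypothetical `count` value):

```typescript
const count = 0;

// || treats every falsy value (0, '', false, NaN) as missing
const withOr = count || 10; // 10 — the real 0 is lost

// ?? only falls back on null or undefined
const withNullish = count ?? 10; // 0 — the real 0 is kept
```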
### Use optional chaining `?.`
```tsx
// ❌ Bad
onClick && onClick();
// ✅ Good
onClick?.();
```
## TypeScript
### Use `type` instead of `interface`
Always use `type` instead of `interface`, because they almost always overlap, and `type` is more flexible.
```tsx
// ❌ Bad
interface MyInterface {
name: string;
}
// ✅ Good
type MyType = {
name: string;
};
```
### Use string literals instead of enums
[String literals](https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#literal-types) are the go-to way to handle enum-like values in TypeScript. They are easier to extend with Pick and Omit, and offer a better developer experience, especially with code completion.
You can see why TypeScript recommends avoiding enums [here](https://www.typescriptlang.org/docs/handbook/2/everyday-types.html#enums).
```tsx
// ❌ Bad, utilizes an enum
enum Color {
Red = "red",
Green = "green",
Blue = "blue",
}
let color = Color.Red;
```
```tsx
// ✅ Good, utilizes a string literal
let color: "red" | "green" | "blue" = "red";
```
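As a sketch of the extensibility claim (here using `Exclude`, the union-type counterpart of `Pick`/`Omit`), narrowing a string-literal union is a one-liner:

```typescript
type Color = 'red' | 'green' | 'blue';

// Deriving a narrower type requires no changes to the original
type WarmColor = Exclude<Color, 'blue'>; // 'red' | 'green'

const warm: WarmColor = 'red';
// const invalid: WarmColor = 'blue'; // ✗ compile-time error
```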
#### GraphQL and internal libraries
You should use enums that GraphQL codegen generates.
It's also better to use an enum when using an internal library, so the internal library doesn't have to expose a string literal type that is not related to the internal API.
Example:
```tsx
const {
setHotkeyScopeAndMemorizePreviousScope,
goBackToPreviousHotkeyScope,
} = usePreviousHotkeyScope();
setHotkeyScopeAndMemorizePreviousScope(
RelationPickerHotkeyScope.RelationPicker,
);
```
## Styling
### Use StyledComponents
Style the components with Emotion's [`styled` API](https://emotion.sh/docs/styled).
```tsx
// ❌ Bad
<div className="my-class">Hello World</div>
```
```tsx
// ✅ Good
const StyledTitle = styled.div`
color: red;
`;
```
Prefix styled components with "Styled" to differentiate them from "real" components.
```tsx
// ❌ Bad
const Title = styled.div`
color: red;
`;
```
```tsx
// ✅ Good
const StyledTitle = styled.div`
color: red;
`;
```
### Theming
Utilizing the theme for the majority of component styling is the preferred approach.
#### Units of measurement
Avoid using `px` or `rem` values directly within styled components. The necessary values are generally already defined in the theme, so it's recommended to use the theme for these purposes.
#### Colors
Refrain from introducing new colors; instead, use the existing palette from the theme. Should there be a situation where the palette does not align, please leave a comment so that the team can rectify it.
```tsx
// ❌ Bad, directly specifies style values without utilizing the theme
const StyledButton = styled.button`
color: #333333;
font-size: 1rem;
font-weight: 400;
margin-left: 4px;
border-radius: 50px;
`;
```
```tsx
// ✅ Good, utilizes the theme
const StyledButton = styled.button`
color: ${({ theme }) => theme.font.color.primary};
font-size: ${({ theme }) => theme.font.size.md};
font-weight: ${({ theme }) => theme.font.weight.regular};
margin-left: ${({ theme }) => theme.spacing(1)};
border-radius: ${({ theme }) => theme.border.rounded};
`;
```
## Enforcing No-Type Imports
Avoid type imports. To enforce this standard, an ESLint rule checks for and reports any type imports. This helps maintain consistency and readability in the TypeScript code.
```tsx
// ❌ Bad
import { type Meta, type StoryObj } from '@storybook/react';
// ❌ Bad
import type { Meta, StoryObj } from '@storybook/react';
// ✅ Good
import { Meta, StoryObj } from '@storybook/react';
```
### Why No-Type Imports
- **Consistency**: By avoiding type imports and using a single approach for both type and value imports, the codebase remains consistent in its module import style.
- **Readability**: No-type imports improve code readability by making it clear when you're importing values or types. This reduces ambiguity and makes it easier to understand the purpose of imported symbols.
- **Maintainability**: It enhances codebase maintainability because developers can identify and locate type-only imports when reviewing or modifying code.
### ESLint Rule
An ESLint rule, `@typescript-eslint/consistent-type-imports`, enforces the no-type import standard. This rule will generate errors or warnings for any type import violations.
Please note that this rule specifically addresses rare edge cases where unintentional type imports occur. TypeScript itself discourages this practice, as mentioned in the [TypeScript 3.8 release notes](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-8.html). In most situations, you should not need to use type-only imports.
To ensure your code complies with this rule, make sure to run ESLint as part of your development workflow.

---
title: Work with Figma
info: Learn how you can collaborate with Twenty's Figma
image: /images/user-guide/objects/objects.png
---
<Frame>
<img src="/images/user-guide/objects/objects.png" alt="Header" />
</Frame>
Figma is a collaborative interface design tool that aids in bridging the communication barrier between designers and developers.
This guide explains how you can collaborate with Figma.
## Access
1. **Access the shared link:** You can access the project's Figma file [here](https://www.figma.com/file/xt8O9mFeLl46C5InWwoMrN/Twenty).
2. **Sign in:** If you're not already signed in, Figma will prompt you to do so.
Some key features, such as Dev Mode and the ability to select a dedicated frame, are only available to logged-in users.
<Warning>
You will not be able to collaborate effectively without an account.
</Warning>
## Figma structure
On the left sidebar, you can access the different pages of Twenty's Figma. This is how they're organized:
- **Components page:** This is the first page. The designer uses it to create and organize the reusable design elements used throughout the design file. For example, buttons, icons, symbols, or any other reusable components. It serves to maintain consistency across the design.
- **Main page:** The second page is the main page, which shows the complete user interface of the project. You can press ***Play*** to use the full app prototype.
- **Features pages:** The other pages are typically dedicated to features in progress. They contain the design of specific features or modules of the application or website. They are typically still in progress.
## Useful Tips
With read-only access, you can't edit the design, but you can access all features that will be useful to convert the designs into code.
### Use the Dev mode
Figma's Dev Mode enhances developers' productivity by providing easy design navigation, effective asset management, efficient communication tools, toolbox integrations, quick code snippets, and key layer information, bridging the gap between design and development. You can learn more about Dev Mode [here](https://www.figma.com/dev-mode/).
Switch to the "Developer" mode in the right part of the toolbar to see design specs, copy CSS, and access assets.
### Use the Prototype
Click on any element on the canvas and press the “Play” button at the top right edge of the interface to access the prototype view. Prototype mode allows you to interact with the design as if it were the final product. It demonstrates the flow between screens and how interface elements like buttons, links, or menus behave when interacted with.
1. **Understanding transitions and animations:** In the Prototype mode, you can view any transitions or animations added by a designer between screens or UI elements, providing clear visual instructions to developers on the intended behavior and style.
2. **Implementation clarification:** A prototype can also help reduce ambiguities. Developers can interact with it to gain a better understanding of the functionality or appearance of particular elements.
For more comprehensive details and guidance on learning the Figma platform, you can visit the official [Figma Documentation](https://help.figma.com/hc/en-us).
### Measure distances
Select an element, hold `Option` key (Mac) or `Alt` key (Windows), then hover over another element to see the distance between them.
### Figma extension for VSCode (Recommended)
[Figma for VS Code](https://marketplace.visualstudio.com/items?itemName=figma.figma-vscode-extension)
lets you navigate and inspect design files, collaborate with designers, track changes, and speed up implementation - all without leaving your text editor.
It's part of our recommended extensions.
## Collaboration
1. **Using Comments:** You are welcome to use the comment feature by clicking on the bubble icon in the left part of the toolbar.
2. **Cursor chat:** A nice feature of Figma is Cursor chat. Just press `;` on Mac or `/` on Windows to send a message if you see someone else using Figma at the same time as you.

---
title: Overview
description: Technical documentation for contributors and developers working with Twenty
---
## Getting started
<CardGroup cols={2}>
<Card title="Local Setup" href="/developers/local-setup" img="/images/user-guide/fields/field.png">
The guide for contributors (or curious developers) who want to run Twenty locally (on laptop, PC...)
</Card>
<Card title="Self-Hosting" href="/developers/self-hosting/docker-compose" img="/images/user-guide/integrations/plug.png">
Learn how to host Twenty on your own server
</Card>
<Card title="API and Webhooks" href="/developers/api-and-webhooks/api" img="/images/user-guide/api/api.png">
REST and GraphQL APIs, webhooks, and integrations
</Card>
</CardGroup>
## Contributing
<CardGroup cols={2}>
<Card title="Bugs and Requests" href="/developers/bug-and-requests" img="/images/user-guide/api/api.png">
Ask for help on GitHub or Discord
</Card>
<Card title="Frontend Development" href="/developers/frontend-development/frontend-commands" img="/images/user-guide/create-workspace/workspace-cover.png">
Frontend commands, Figma, React Best Practices...
</Card>
<Card title="Backend Development" href="/developers/backend-development/server-commands" img="/images/user-guide/kanban-views/kanban.png">
NestJS, Custom Objects, Queues...
</Card>
</CardGroup>

---
title: Local Setup
description: "The guide for contributors (or curious developers) who want to run Twenty locally."
image: /images/user-guide/fields/field.png
---
<Frame>
<img src="/images/user-guide/fields/field.png" alt="Header" />
</Frame>
## Prerequisites
<Tabs>
<Tab title="Linux and MacOS">
Before you can install and use Twenty, make sure you install the following on your computer:
- [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
- [Node v24.5.0](https://nodejs.org/en/download)
- [yarn v4](https://yarnpkg.com/getting-started/install)
- [nvm](https://github.com/nvm-sh/nvm/blob/master/README.md)
<Warning>
`npm` won't work; use `yarn` instead. Yarn now ships with Node.js, so you don't need to install it separately.
You only have to run `corepack enable` to enable Yarn if you haven't done it yet.
</Warning>
</Tab>
<Tab title="Windows (WSL)">
1. Install WSL
Open PowerShell as Administrator and run:
```powershell
wsl --install
```
You should now see a prompt to restart your computer. If not, restart it manually.
Upon restart, a PowerShell window will open and install Ubuntu. This may take some time.
You'll see a prompt to create a username and password for your Ubuntu installation.
2. Install and configure git
```bash
sudo apt-get install git
git config --global user.name "Your Name"
git config --global user.email "youremail@domain.com"
```
3. Install nvm, node.js and yarn
<Warning>
Use `nvm` to install the correct `node` version. The `.nvmrc` ensures all contributors use the same version.
</Warning>
```bash
sudo apt-get install curl
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/master/install.sh | bash
```
Close and reopen your terminal to use nvm. Then run the following commands.
```bash
nvm install # installs recommended node version
nvm use # use recommended node version
corepack enable
```
</Tab>
</Tabs>
---
## Step 1: Git Clone
In your terminal, run the following command.
<Tabs>
<Tab title="SSH (Recommended)">
If you haven't already set up SSH keys, you can learn how to do so [here](https://docs.github.com/en/authentication/connecting-to-github-with-ssh/about-ssh).
```bash
git clone git@github.com:twentyhq/twenty.git
```
</Tab>
<Tab title="HTTPS">
```bash
git clone https://github.com/twentyhq/twenty.git
```
</Tab>
</Tabs>
## Step 2: Position yourself at the root
```bash
cd twenty
```
You should run all commands in the following steps from the root of the project.
## Step 3: Set up a PostgreSQL Database
<Tabs>
<Tab title="Linux">
**Option 1 (preferred):** To provision your database locally:
Use the following link to install Postgresql on your Linux machine: [Postgresql Installation](https://www.postgresql.org/download/linux/)
```bash
psql postgres -c "CREATE DATABASE \"default\";" -c "CREATE DATABASE test;"
```
Note: You might need to prefix the `psql` command with `sudo -u postgres` to avoid permission errors.
**Option 2:** If you have docker installed:
```bash
make postgres-on-docker
```
</Tab>
<Tab title="Mac OS">
**Option 1 (preferred):** To provision your database locally with `brew`:
```bash
brew install postgresql@16
export PATH="/opt/homebrew/opt/postgresql@16/bin:$PATH"
brew services start postgresql@16
psql postgres -c "CREATE DATABASE \"default\";" -c "CREATE DATABASE test;"
```
You can verify if the PostgreSQL server is running by executing:
```bash
brew services list
```
The installer might not create the `postgres` user by default when installing via Homebrew on macOS. Instead, it creates a PostgreSQL role that matches your macOS username (e.g., "john").
To check and create the `postgres` user if necessary, follow these steps:
```bash
# Connect to PostgreSQL
psql postgres
# or, if the postgres role doesn't exist yet:
psql -U $(whoami) -d postgres
```
Once at the psql prompt (postgres=#), run:
```bash
# List existing PostgreSQL roles
\du
```
You'll see output similar to:
```bash
Role name | Attributes | Member of
-----------+-------------+-----------
john | Superuser | {}
```
If you do not see a `postgres` role listed, proceed to the next step.
Create the `postgres` role manually:
```bash
CREATE ROLE postgres WITH SUPERUSER LOGIN;
```
This creates a superuser role named `postgres` with login access.
**Option 2:** If you have docker installed:
```bash
make postgres-on-docker
```
</Tab>
<Tab title="Windows (WSL)">
All the following steps are to be run in the WSL terminal (within your virtual machine).
**Option 1:** To provision your PostgreSQL locally:
Use the following link to install PostgreSQL on your Linux virtual machine: [PostgreSQL Installation](https://www.postgresql.org/download/linux/)
```bash
psql postgres -c "CREATE DATABASE \"default\";" -c "CREATE DATABASE test;"
```
Note: You might need to add `sudo -u postgres` to the command before `psql` to avoid permission errors.
**Option 2:** If you have docker installed:
Running Docker on WSL adds an extra layer of complexity.
Only use this option if you are comfortable with the extra steps involved, including turning on [Docker Desktop WSL2](https://docs.docker.com/desktop/wsl).
```bash
make postgres-on-docker
```
</Tab>
</Tabs>
You can now access the database at `localhost:5432`, with user `postgres` and password `postgres`.
## Step 4: Set up a Redis Database (cache)
Twenty requires a Redis cache for the best performance.
<Tabs>
<Tab title="Linux">
**Option 1:** To provision your Redis locally:
Use the following link to install Redis on your Linux machine: [Redis Installation](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/install-redis-on-linux/)
**Option 2:** If you have docker installed:
```bash
make redis-on-docker
```
</Tab>
<Tab title="Mac OS">
**Option 1 (preferred):** To provision your Redis locally with `brew`:
```bash
brew install redis
```
Start your Redis server:
```bash
brew services start redis
```
**Option 2:** If you have docker installed:
```bash
make redis-on-docker
```
</Tab>
<Tab title="Windows (WSL)">
**Option 1:** To provision your Redis locally:
Use the following link to install Redis on your Linux virtual machine: [Redis Installation](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/install-redis-on-linux/)
**Option 2:** If you have docker installed:
```bash
make redis-on-docker
```
</Tab>
</Tabs>
If you need a client GUI, we recommend [Redis Insight](https://redis.io/insight/) (free version available).
## Step 5: Set up environment variables
Use environment variables or `.env` files to configure your project. More info [here](https://twenty.com/developers/section/self-hosting/setup)
Copy the `.env.example` files in `twenty-front` and `twenty-server`:
```bash
cp ./packages/twenty-front/.env.example ./packages/twenty-front/.env
cp ./packages/twenty-server/.env.example ./packages/twenty-server/.env
```
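As a quick sanity check, the copy step above can be verified with a short script. This sketch recreates the layout with stand-in files (so it runs anywhere) and fails fast if either `.env` is missing:

```shell
# Stand-in tree mirroring the repo layout (illustrative only; in the real
# repo the .env.example files already exist)
mkdir -p packages/twenty-front packages/twenty-server
touch packages/twenty-front/.env.example packages/twenty-server/.env.example

# The copy step from the guide
cp ./packages/twenty-front/.env.example ./packages/twenty-front/.env
cp ./packages/twenty-server/.env.example ./packages/twenty-server/.env

# Fail fast if either file is missing
for f in packages/twenty-front/.env packages/twenty-server/.env; do
  [ -f "$f" ] || { echo "missing $f" >&2; exit 1; }
done
echo "env files in place"
```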
## Step 6: Installing dependencies
To install the project's dependencies, run the following command:
```bash
yarn
```
Note that `npm` and `pnpm` are not supported; use `yarn`.
## Step 7: Running the project
<Tabs>
<Tab title="Linux">
Depending on your Linux distribution, Redis server might be started automatically.
If not, check the [Redis installation guide](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/) for your distro.
</Tab>
<Tab title="Mac OS">
Redis should already be running. If not, run:
```bash
brew services start redis
```
</Tab>
<Tab title="Windows (WSL)">
Depending on your Linux distribution, Redis server might be started automatically.
If not, check the [Redis installation guide](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/) for your distro.
</Tab>
</Tabs>
Set up your database with the following command:
```bash
npx nx database:reset twenty-server
```
Start the server, the worker and the frontend services:
```bash
npx nx start twenty-server
npx nx worker twenty-server
npx nx start twenty-front
```
Alternatively, you can start all services at once:
```bash
npx nx start
```
## Step 8: Use Twenty
**Frontend**
Twenty's frontend will be running at [http://localhost:3001](http://localhost:3001).
You can log in using the default demo account: `tim@apple.dev` (password: `tim@apple.dev`)
**Backend**
- Twenty's server will be up and running at [http://localhost:3000](http://localhost:3000)
- The GraphQL API can be accessed at [http://localhost:3000/graphql](http://localhost:3000/graphql)
- The REST API can be reached at [http://localhost:3000/rest](http://localhost:3000/rest)
## Troubleshooting
If you encounter any problem, check [Troubleshooting](https://twenty.com/developers/section/self-hosting/troubleshooting) for solutions.
---
title: Other methods
image: /images/user-guide/notes/notes_header.png
---
<Frame>
<img src="/images/user-guide/notes/notes_header.png" alt="Header" />
</Frame>
<Warning>
This document is maintained by the community. It might contain issues.
</Warning>
## Kubernetes via Terraform and Manifests
Community-led documentation for Kubernetes deployment is available [here](https://github.com/twentyhq/twenty/tree/main/packages/twenty-docker/k8s)
### Coolify
Deploy Twenty on servers using Coolify. (official image on Coolify will be available soon)
[Coolify documentation](https://coolify.io/docs/get-started/introduction)
### EasyPanel
Deploy Twenty on EasyPanel with the community maintained template below.
[Deploy on EasyPanel](https://easypanel.io/docs/templates/twenty)
### Elest.io
Deploy Twenty on servers with Elest.io using the link below.
[Deploy on Elest.io](https://elest.io/open-source/twenty)
### Twenty on Railway
Deploy Twenty on Railway with the community maintained template below.
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.com/deploy/nAL3hA)
## Others
Please feel free to open a PR to add more cloud provider options.
---
title: 1-Click w/ Docker Compose
image: /images/user-guide/objects/objects.png
---
<Frame>
<img src="/images/user-guide/objects/objects.png" alt="Header" />
</Frame>
<Warning>
Docker containers are for production hosting or self-hosting. For contributing, please check the [Local Setup](https://twenty.com/developers/local-setup) guide instead.
</Warning>
## Overview
This guide provides step-by-step instructions to install and configure the Twenty application using Docker Compose. The aim is to make the process straightforward and prevent common pitfalls that could break your setup.
**Important:** Only modify settings explicitly mentioned in this guide. Altering other configurations may lead to issues.
See the [Setup Environment Variables](https://twenty.com/developers/section/self-hosting/setup) docs for advanced configuration. All environment variables must be declared in the `docker-compose.yml` file at the server and/or worker level, depending on the variable.
## System Requirements
- RAM: Ensure your environment has at least 2GB of RAM. Insufficient memory can cause processes to crash.
- Docker & Docker Compose: Make sure both are installed and up-to-date.
## Option 1: One-line script
Install the latest stable version of Twenty with a single command:
```bash
bash <(curl -sL https://raw.githubusercontent.com/twentyhq/twenty/main/packages/twenty-docker/scripts/install.sh)
```
To install a specific version or branch:
```bash
VERSION=vx.y.z BRANCH=branch-name bash <(curl -sL https://raw.githubusercontent.com/twentyhq/twenty/main/packages/twenty-docker/scripts/install.sh)
```
- Replace `x.y.z` with the desired version number.
- Replace `branch-name` with the name of the branch you want to install.
## Option 2: Manual steps
Follow these steps for a manual setup.
### Step 1: Set Up the Environment File
1. **Create the .env File**
Copy the example environment file to a new .env file in your working directory:
```bash
curl -o .env https://raw.githubusercontent.com/twentyhq/twenty/refs/heads/main/packages/twenty-docker/.env.example
```
2. **Generate Secret Tokens**
Run the following command to generate a unique random string:
```bash
openssl rand -base64 32
```
**Important:** Keep this value secret and do not share it.
3. **Update the `.env`**
Replace the placeholder value in your .env file with the generated token:
```ini
APP_SECRET=first_random_string
```
4. **Set the Postgres Password**
Update the `PG_DATABASE_PASSWORD` value in the .env file with a strong password without special characters.
```ini
PG_DATABASE_PASSWORD=my_strong_password
```
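Steps 2 and 3 can be combined into one small script. A sketch, assuming GNU `sed` (on macOS use `sed -i ''`); the stand-in `.env` below takes the place of the downloaded file:

```shell
# Stand-in .env containing the placeholder from the example above
printf 'APP_SECRET=first_random_string\n' > .env

# Generate a 32-byte random secret and splice it into .env
SECRET="$(openssl rand -base64 32)"
sed -i "s|^APP_SECRET=.*|APP_SECRET=${SECRET}|" .env

grep '^APP_SECRET=' .env   # confirm the placeholder is gone
```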
### Step 2: Obtain the Docker Compose File
Download the `docker-compose.yml` file to your working directory:
```bash
curl -o docker-compose.yml https://raw.githubusercontent.com/twentyhq/twenty/refs/heads/main/packages/twenty-docker/docker-compose.yml
```
### Step 3: Launch the Application
Start the Docker containers:
```bash
docker compose up -d
```
### Step 4: Access the Application
If you host Twenty on your own computer, open your browser and navigate to [http://localhost:3000](http://localhost:3000).
If you host it on a server, check that the server is up and responding with:
```bash
curl http://localhost:3000
```
## Configuration
### Expose Twenty to External Access
By default, Twenty runs on `localhost` at port `3000`. To access it via an external domain or IP address, you need to configure the `SERVER_URL` in your `.env` file.
#### Understanding `SERVER_URL`
- **Protocol:** Use `http` or `https` depending on your setup.
- Use `http` if you haven't set up SSL.
- Use `https` if you have SSL configured.
- **Domain/IP:** This is the domain name or IP address where your application is accessible.
- **Port:** Include the port number if you're not using the default ports (`80` for `http`, `443` for `https`).
### SSL Requirements
SSL (HTTPS) is required for certain browser features to work properly. While these features might work during local development (as browsers treat localhost differently), a proper SSL setup is needed when hosting Twenty on a regular domain.
For example, the clipboard API might require a secure context - some features like copy buttons throughout the application might not work without HTTPS enabled.
We strongly recommend setting up Twenty behind a reverse proxy with SSL termination for optimal security and functionality.
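For reference, a minimal nginx server block for SSL termination in front of Twenty might look like the sketch below. The domain and certificate paths are hypothetical; adapt them to your setup:

```nginx
server {
    listen 443 ssl;
    server_name mytwentyapp.com;                   # your domain

    ssl_certificate     /etc/ssl/certs/twenty.pem; # hypothetical cert paths
    ssl_certificate_key /etc/ssl/private/twenty.key;

    location / {
        proxy_pass http://127.0.0.1:3000;          # Twenty's default port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Upgrade $http_upgrade;    # keep websockets working
        proxy_set_header Connection "upgrade";
    }
}
```

With this in place, `SERVER_URL=https://mytwentyapp.com` in your `.env` matches how users reach the app.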
#### Configuring `SERVER_URL`
1. **Determine Your Access URL**
- **Without Reverse Proxy (Direct Access):**
If you're accessing the application directly without a reverse proxy:
```ini
SERVER_URL=http://your-domain-or-ip:3000
```
- **With Reverse Proxy (Standard Ports):**
If you're using a reverse proxy like Nginx or Traefik and have SSL configured:
```ini
SERVER_URL=https://your-domain-or-ip
```
- **With Reverse Proxy (Custom Ports):**
If you're using non-standard ports:
```ini
SERVER_URL=https://your-domain-or-ip:custom-port
```
2. **Update the `.env` File**
Open your `.env` file and update the `SERVER_URL`:
```ini
SERVER_URL=http(s)://your-domain-or-ip:your-port
```
**Examples:**
- Direct access without SSL:
```ini
SERVER_URL=http://123.45.67.89:3000
```
- Access via domain with SSL:
```ini
SERVER_URL=https://mytwentyapp.com
```
3. **Restart the Application**
For changes to take effect, restart the Docker containers:
```bash
docker compose down
docker compose up -d
```
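Before restarting, a tiny guard can catch malformed values. A sketch that only checks the scheme (anything stricter is up to you):

```shell
# Reject SERVER_URL values that are missing the scheme (sketch)
check_server_url() {
  case "$1" in
    http://*|https://*) return 0 ;;
    *) echo "invalid SERVER_URL: $1 (must start with http:// or https://)" >&2; return 1 ;;
  esac
}

check_server_url "https://mytwentyapp.com" && echo "SERVER_URL looks valid"
```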
#### Considerations
- **Reverse Proxy Configuration:**
Ensure your reverse proxy forwards requests to the correct internal port (`3000` by default). Configure SSL termination and any necessary headers.
- **Firewall Settings:**
Open necessary ports in your firewall to allow external access.
- **Consistency:**
The `SERVER_URL` must match how users access your application in their browsers.
#### Persistence
- **Data Volumes:**
The Docker Compose configuration uses volumes to persist data for the database and server storage.
- **Stateless Environments:**
If deploying to a stateless environment (e.g., certain cloud services), configure external storage to persist data.
## Troubleshooting
If you encounter any problem, check [Troubleshooting](https://twenty.com/developers/section/self-hosting/troubleshooting) for solutions.
---
title: Setup
image: /images/user-guide/table-views/table.png
---
<Frame>
<img src="/images/user-guide/table-views/table.png" alt="Header" />
</Frame>
# Configuration Management
<Warning>
**First time installing?** Follow the [Docker Compose installation guide](https://twenty.com/developers/section/self-hosting/docker-compose) to get Twenty running, then return here for configuration.
</Warning>
Twenty offers **two configuration modes** to suit different deployment needs:
**Admin panel access:** Only users with admin privileges (`canAccessFullAdminPanel: true`) can access the configuration interface.
## 1. Admin Panel Configuration (Default)
```bash
IS_CONFIG_VARIABLES_IN_DB_ENABLED=true # default
```
**Most configuration happens through the UI** after installation:
1. Access your Twenty instance (usually `http://localhost:3000`)
2. Go to **Settings / Admin Panel / Configuration Variables**
3. Configure integrations, email, storage, and more
4. Changes take effect immediately (within 15 seconds for multi-container deployments)
<Warning>
**Multi-Container Deployments:** When using database configuration (`IS_CONFIG_VARIABLES_IN_DB_ENABLED=true`), both server and worker containers read from the same database. Admin panel changes affect both automatically, eliminating the need to duplicate environment variables between containers (except for infrastructure variables).
</Warning>
**What you can configure through the admin panel:**
- **Authentication** - Google/Microsoft OAuth, password settings
- **Email** - SMTP settings, templates, verification
- **Storage** - S3 configuration, local storage paths
- **Integrations** - Gmail, Google Calendar, Microsoft services
- **Workflow & Rate Limiting** - Execution limits, API throttling
- **And much more...**
![Admin Panel Configuration Variables](/images/user-guide/setup/admin-panel-config-variables.png)
<Warning>
Each variable is documented with descriptions in your admin panel at **Settings → Admin Panel → Configuration Variables**.
Some infrastructure settings like database connections (`PG_DATABASE_URL`), server URLs (`SERVER_URL`), and app secrets (`APP_SECRET`) can only be configured via `.env` file.
[Complete technical reference →](https://github.com/twentyhq/twenty/blob/main/packages/twenty-server/src/engine/core-modules/twenty-config/config-variables.ts)
</Warning>
## 2. Environment-Only Configuration
```bash
IS_CONFIG_VARIABLES_IN_DB_ENABLED=false
```
**All configuration managed through `.env` files:**
1. Set `IS_CONFIG_VARIABLES_IN_DB_ENABLED=false` in your `.env` file
2. Add all configuration variables to your `.env` file
3. Restart containers for changes to take effect
4. Admin panel will show current values but cannot modify them
## Gmail & Google Calendar Integration
### Create Google Cloud Project
1. Go to [Google Cloud Console](https://console.cloud.google.com/)
2. Create a new project or select existing one
3. Enable these APIs:
- [Gmail API](https://console.cloud.google.com/apis/library/gmail.googleapis.com)
- [Google Calendar API](https://console.cloud.google.com/apis/library/calendar-json.googleapis.com)
- [People API](https://console.cloud.google.com/apis/library/people.googleapis.com)
### Configure OAuth
1. Go to [Credentials](https://console.cloud.google.com/apis/credentials)
2. Create OAuth 2.0 Client ID
3. Add these redirect URIs:
- `https://<your-domain>/auth/google/redirect` (for SSO)
- `https://<your-domain>/auth/google-apis/get-access-token` (for integrations)
### Configure in Twenty
1. Go to **Settings → Admin Panel → Configuration Variables**
2. Find the **Google Auth** section
3. Set these variables:
- `MESSAGING_PROVIDER_GMAIL_ENABLED=true`
- `CALENDAR_PROVIDER_GOOGLE_ENABLED=true`
- `AUTH_GOOGLE_CLIENT_ID=<client-id>`
- `AUTH_GOOGLE_CLIENT_SECRET=<client-secret>`
- `AUTH_GOOGLE_CALLBACK_URL=https://<your-domain>/auth/google/redirect`
- `AUTH_GOOGLE_APIS_CALLBACK_URL=https://<your-domain>/auth/google-apis/get-access-token`
<Warning>
**Environment-only mode:** If you set `IS_CONFIG_VARIABLES_IN_DB_ENABLED=false`, add these variables to your `.env` file instead.
</Warning>
**Required scopes** (automatically configured):
[See relevant source code](https://github.com/twentyhq/twenty/blob/main/packages/twenty-server/src/engine/core-modules/auth/utils/get-google-apis-oauth-scopes.ts#L4-L10)
- `https://www.googleapis.com/auth/calendar.events`
- `https://www.googleapis.com/auth/gmail.readonly`
- `https://www.googleapis.com/auth/profile.emails.read`
### If your app is in test mode
If your app is in test mode, you will need to add test users to your project.
Under [OAuth consent screen](https://console.cloud.google.com/apis/credentials/consent), add your test users to the "Test users" section.
## Microsoft 365 Integration
<Warning>
Users must have a [Microsoft 365 Licence](https://admin.microsoft.com/Adminportal/Home) to be able to use the Calendar and Messaging API. They will not be able to sync their account on Twenty without one.
</Warning>
### Create a project in Microsoft Azure
You will need to create a project in [Microsoft Azure](https://portal.azure.com/#view/Microsoft_AAD_IAM/AppGalleryBladeV2) and get the credentials.
### Enable APIs
On Microsoft Azure Console enable the following APIs in "Permissions":
- Microsoft Graph: Mail.ReadWrite
- Microsoft Graph: Mail.Send
- Microsoft Graph: Calendars.Read
- Microsoft Graph: User.Read
- Microsoft Graph: openid
- Microsoft Graph: email
- Microsoft Graph: profile
- Microsoft Graph: offline_access
Note: "Mail.ReadWrite" and "Mail.Send" are only mandatory if you want to send emails using our workflow actions. You can use "Mail.Read" instead if you only want to receive emails.
### Authorized redirect URIs
You need to add the following redirect URIs to your project:
- `https://<your-domain>/auth/microsoft/redirect` if you want to use Microsoft SSO
- `https://<your-domain>/auth/microsoft-apis/get-access-token`
### Configure in Twenty
1. Go to **Settings → Admin Panel → Configuration Variables**
2. Find the **Microsoft Auth** section
3. Set these variables:
- `MESSAGING_PROVIDER_MICROSOFT_ENABLED=true`
- `CALENDAR_PROVIDER_MICROSOFT_ENABLED=true`
- `AUTH_MICROSOFT_ENABLED=true`
- `AUTH_MICROSOFT_CLIENT_ID=<client-id>`
- `AUTH_MICROSOFT_CLIENT_SECRET=<client-secret>`
- `AUTH_MICROSOFT_CALLBACK_URL=https://<your-domain>/auth/microsoft/redirect`
- `AUTH_MICROSOFT_APIS_CALLBACK_URL=https://<your-domain>/auth/microsoft-apis/get-access-token`
<Warning>
**Environment-only mode:** If you set `IS_CONFIG_VARIABLES_IN_DB_ENABLED=false`, add these variables to your `.env` file instead.
</Warning>
### Configure scopes
[See relevant source code](https://github.com/twentyhq/twenty/blob/main/packages/twenty-server/src/engine/core-modules/auth/utils/get-microsoft-apis-oauth-scopes.ts#L2-L9)
- `openid`
- `email`
- `profile`
- `offline_access`
- `Mail.ReadWrite`
- `Mail.Send`
- `Calendars.Read`
### If your app is in test mode
If your app is in test mode, you will need to add test users to your project.
Add your test users to the "Users and groups" section.
## Background Jobs for Calendar & Messaging
After configuring Gmail, Google Calendar, or Microsoft 365 integrations, you need to start the background jobs that sync data.
Register the following recurring jobs in your worker container:
```bash
# from your worker container
yarn command:prod cron:messaging:messages-import
yarn command:prod cron:messaging:message-list-fetch
yarn command:prod cron:calendar:calendar-event-list-fetch
yarn command:prod cron:calendar:calendar-events-import
yarn command:prod cron:messaging:ongoing-stale
yarn command:prod cron:calendar:ongoing-stale
yarn command:prod cron:workflow:automated-cron-trigger
```
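The same registrations can be scripted so none are missed. A sketch that loops over the job names from the block above (it only prints the commands; drop the `echo` to execute them inside the worker container):

```shell
JOBS="cron:messaging:messages-import
cron:messaging:message-list-fetch
cron:calendar:calendar-event-list-fetch
cron:calendar:calendar-events-import
cron:messaging:ongoing-stale
cron:calendar:ongoing-stale
cron:workflow:automated-cron-trigger"

printf '%s\n' "$JOBS" | while read -r job; do
  echo yarn command:prod "$job"   # remove `echo` to actually register the job
done
```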
## Email Configuration
1. Go to **Settings → Admin Panel → Configuration Variables**
2. Find the **Email** section
3. Configure your SMTP settings:
<Tabs>
<Tab title="Gmail">
You will need to provision an [App Password](https://support.google.com/accounts/answer/185833).
```ini
EMAIL_DRIVER=smtp
EMAIL_SMTP_HOST=smtp.gmail.com
EMAIL_SMTP_PORT=465
EMAIL_SMTP_USER=gmail_email_address
EMAIL_SMTP_PASSWORD='gmail_app_password'
```
</Tab>
<Tab title="Office365">
Keep in mind that if you have 2FA enabled, you will need to provision an [App Password](https://support.microsoft.com/en-us/account-billing/manage-app-passwords-for-two-step-verification-d6dc8c6d-4bf7-4851-ad95-6d07799387e9).
```ini
EMAIL_DRIVER=smtp
EMAIL_SMTP_HOST=smtp.office365.com
EMAIL_SMTP_PORT=587
EMAIL_SMTP_USER=office365_email_address
EMAIL_SMTP_PASSWORD='office365_password'
```
</Tab>
<Tab title="Smtp4dev">
**smtp4dev** is a fake SMTP email server for development and testing.
- Run the smtp4dev image: `docker run --rm -it -p 8090:80 -p 2525:25 rnwood/smtp4dev`
- Access the smtp4dev UI at [http://localhost:8090](http://localhost:8090)
- Set the following variables:
```ini
EMAIL_DRIVER=smtp
EMAIL_SMTP_HOST=localhost
EMAIL_SMTP_PORT=2525
```
</Tab>
</Tabs>
<Warning>
**Environment-only mode:** If you set `IS_CONFIG_VARIABLES_IN_DB_ENABLED=false`, add these variables to your `.env` file instead.
</Warning>
---
title: Troubleshooting
image: /images/user-guide/what-is-twenty/20.png
---
<Frame>
<img src="/images/user-guide/what-is-twenty/20.png" alt="Header" />
</Frame>
## Troubleshooting
If you encounter any problem while setting up environment for development, upgrading your instance or self-hosting,
here are some solutions for common problems.
### Self-hosting
#### First install results in `password authentication failed for user "postgres"`
🚨 **IMPORTANT: This solution is ONLY for fresh installations** 🚨
If you have an existing Twenty instance with production data, **DO NOT** follow these steps as they will permanently delete your database!
While installing Twenty for the first time, you might want to change the default database password.
The password you set during the first installation becomes permanently stored in the database volume. If you later try to change this password in your configuration without removing the old volume, you'll get authentication errors because the database is still using the original password.
⚠️ WARNING: Following steps will PERMANENTLY DELETE all database data! ⚠️
Only proceed if this is a fresh installation with no important data.
In order to update the `PG_DATABASE_PASSWORD` you need to:
```sh
# Update the PG_DATABASE_PASSWORD in .env
docker compose down --volumes
docker compose up -d
```
#### CR line breaks found [Windows]
This is due to the line break characters of Windows and the git configuration. Try running:
```bash
git config --global core.autocrlf false
```
Then delete the repository and clone it again.
#### Missing metadata schema
During Twenty installation, you need to provision your postgres database with the right schemas, extensions, and users.
If you're successful in running this provisioning, you should have `default` and `metadata` schemas in your database.
If you don't, make sure you don't have more than one postgres instance running on your computer.
#### Cannot find module 'twenty-emails' or its corresponding type declarations.
You have to build the `twenty-emails` package before initializing the database, with `npx nx run twenty-emails:build`.
#### Missing twenty-x package
Make sure to run `yarn` in the root directory and then run `npx nx server:dev twenty-server`. If this still doesn't work, try building the missing package manually.
#### Lint on Save not working
This should work out of the box with the ESLint extension installed. If it doesn't, try adding this to your VS Code settings (in the dev container scope):
```json
"editor.codeActionsOnSave": {
  "source.fixAll.eslint": "explicit"
}
```
#### While running `npx nx start` or `npx nx start twenty-front`, an out-of-memory error is thrown
In `packages/twenty-front/.env`, uncomment `VITE_DISABLE_TYPESCRIPT_CHECKER=true` and `VITE_DISABLE_ESLINT_CHECKER=true` to disable background checks, reducing the amount of RAM needed.
**If it does not work:**
Run only the services you need, instead of `npx nx start`. For instance, if you work on the server, run only `npx nx worker twenty-server`
**If it does not work:**
If you tried to run only `npx nx run twenty-server:start` on WSL and it fails with the memory error below:
`FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory`
A workaround is to execute the command below in your terminal, or add it to your `.bashrc` profile so it's applied automatically:
`export NODE_OPTIONS="--max-old-space-size=8192"`
The `--max-old-space-size=8192` flag sets an upper limit of 8GB for the Node.js heap; actual usage scales with application demand.
Reference: [Where do I set NODE_OPTIONS?](https://stackoverflow.com/questions/56982005/where-do-i-set-node-options-max-old-space-size-2048)
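To persist the setting without duplicating the line on every run, an idempotent append works well. A sketch using a stand-in file in place of `~/.bashrc` so it runs anywhere:

```shell
PROFILE=./bashrc_demo   # stand-in for ~/.bashrc
touch "$PROFILE"

LINE='export NODE_OPTIONS="--max-old-space-size=8192"'
# Append only if the exact line isn't already present
grep -qxF "$LINE" "$PROFILE" || echo "$LINE" >> "$PROFILE"
grep -qxF "$LINE" "$PROFILE" || echo "$LINE" >> "$PROFILE"   # second run is a no-op

grep -c 'max-old-space-size' "$PROFILE"   # prints 1
```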
**If it does not work:**
Investigate which processes are consuming most of your machine's RAM. At Twenty, we noticed that some VS Code extensions were taking a lot of RAM, so we temporarily disabled them.
**If it does not work:**
Restarting your machine helps clean up ghost processes.
#### While running `npx nx start` there are weird [0] and [1] in logs
That's expected, as `npx nx start` runs several commands under the hood.
#### No emails are sent
Most of the time, it's because the `worker` is not running in the background. Try to run
```bash
npx nx worker twenty-server
```
#### Cannot connect my Microsoft 365 account
Most of the time, it's because your admin has not enabled the Microsoft 365 Licence for your account. Check [https://admin.microsoft.com/](https://admin.microsoft.com/Adminportal/Home).
If you have an error code `AADSTS50020`, it probably means that you are using a personal Microsoft account. This is not supported yet. More info [here](https://learn.microsoft.com/fr-fr/troubleshoot/entra/entra-id/app-integration/error-code-aadsts50020-user-account-identity-provider-does-not-exist)
#### While running `yarn` warnings appear in console
Warnings inform you about additional dependencies being pulled that aren't explicitly stated in `package.json`; as long as no breaking error appears, everything should work as expected.
#### When user accesses login page, error about unauthorized user trying to access workspace appears in logs
That's expected: a logged-out user is unauthorized because their identity hasn't been verified yet.
#### How to check if your worker is running?
- Go to [webhook-test.com](https://webhook-test.com/) and copy **Your Unique Webhook URL**.
<Frame>
  <img src="/images/docs/developers/self-hosting/webhook-test.jpg" alt="Webhook test" />
</Frame>
- Open your Twenty app, navigate to `/settings`, and enable the **Advanced** toggle at the bottom left of the screen.
- Create a new webhook.
- Paste **Your Unique Webhook URL** in the **Endpoint Url** field in Twenty. Set the **Filters** to `Companies` and `Created`.
<Frame>
  <img src="/images/docs/developers/self-hosting/webhook-settings.jpg" alt="Webhook settings" />
</Frame>
- Go to `/objects/companies` and create a new company record.
- Return to [webhook-test.com](https://webhook-test.com/) and check if a new **POST request** has been received.
<Frame>
  <img src="/images/docs/developers/self-hosting/webhook-test-result.jpg" alt="Webhook test result" />
</Frame>
- If a **POST request** is received, your worker is running successfully. Otherwise, you need to troubleshoot your worker.
#### Front-end fails to start and returns error TS5042: Option 'project' cannot be mixed with source files on a command line
Comment out the checker plugin in `packages/twenty-ui/vite.config.ts` as in the example below:
```ts
plugins: [
react({ jsxImportSource: '@emotion/react' }),
tsconfigPaths(),
svgr(),
dts(dtsConfig),
// checker(checkersConfig),
wyw({
include: [
'**/OverflowingTextWithTooltip.tsx',
'**/Chip.tsx',
'**/Tag.tsx',
'**/Avatar.tsx',
'**/AvatarChip.tsx',
],
babelOptions: {
presets: ['@babel/preset-typescript', '@babel/preset-react'],
},
}),
],
```
#### Admin panel not accessible
To gain access to the admin panel, run `UPDATE core."user" SET "canAccessFullAdminPanel" = TRUE WHERE email = 'you@yourdomain.com';` in your database container.
### 1-click Docker compose
#### Unable to Log In
If you can't log in after setup:
1. Run the following commands:
```bash
docker exec -it twenty-server-1 yarn
docker exec -it twenty-server-1 npx nx database:reset --configuration=no-seed
```
2. Restart the Docker containers:
```bash
docker compose down
docker compose up -d
```
Note: the `database:reset` command will completely erase your database and recreate it from scratch.
#### Connection Issues Behind a Reverse Proxy
If you're running Twenty behind a reverse proxy and experiencing connection issues:
1. **Verify SERVER_URL:**
Ensure `SERVER_URL` in your `.env` file matches your external access URL, including `https` if SSL is enabled.
2. **Check Reverse Proxy Settings:**
- Confirm that your reverse proxy is correctly forwarding requests to the Twenty server.
- Ensure headers like `X-Forwarded-For` and `X-Forwarded-Proto` are properly set.
3. **Restart Services:**
After making changes, restart both the reverse proxy and Twenty containers.
#### Error when uploading an image - permission denied
Switching the data folder ownership on the host from root to another user and group resolves this problem.
## Getting Help
If you encounter issues not covered in this guide:
- Check Logs:
View container logs for error messages:
```bash
docker compose logs
```
- Community Support:
Reach out to the [Twenty community](https://github.com/twentyhq/twenty/issues) or [support channels](https://discord.gg/cx5n4Jzs57) for assistance.
---
title: Upgrade guide
image: /images/user-guide/notes/notes_header.png
---
<Frame>
<img src="/images/user-guide/notes/notes_header.png" alt="Header" />
</Frame>
## General guidelines
**Always make sure to back up your database before starting the upgrade process** by running `docker exec -it <db_container_name_or_id> pg_dumpall -U <postgres_user> > databases_backup.sql`.
To restore the backup, run `cat databases_backup.sql | docker exec -i <db_container_name_or_id> psql -U <postgres_user>`.
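Since each sequential upgrade step needs its own backup, a date-stamped filename avoids overwriting the previous one. A sketch (the `docker exec` line is the command above, commented out so the snippet runs anywhere):

```shell
# Build a unique, timestamped backup filename
BACKUP="databases_backup_$(date +%Y%m%d_%H%M%S).sql"
echo "will write: $BACKUP"
# docker exec -it <db_container_name_or_id> pg_dumpall -U <postgres_user> > "$BACKUP"
```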
If you used Docker Compose, follow these steps:
1. In a terminal, on the host where Twenty is running, turn off Twenty: `docker compose down`
2. Upgrade the version by changing the `TAG` value in the `.env` file next to your `docker-compose.yml` (we recommend pinning a `major.minor` version such as `v0.53`)
3. Bring Twenty back online with `docker compose up -d`
If you want to upgrade your instance by several versions, e.g. from v0.33.0 to v0.35.0, you have to upgrade your instance sequentially: in this example, from v0.33.0 to v0.34.0, then from v0.34.0 to v0.35.0.
**Make sure you have a valid, non-corrupted backup after each upgraded version.**
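Under the assumption that every intermediate `major.minor` release exists, the sequential constraint can be sketched as a small helper (illustrative only — always check the actual release list on GitHub before upgrading):

```python
def upgrade_path(current: str, target: str) -> list[str]:
    """List the sequential major.minor upgrade steps from current to target.

    Assumes consecutive 0.x minor releases within the same major version
    (an illustration of the rule, not an exhaustive release list).
    """
    cur_major, cur_minor = (int(p) for p in current.lstrip("v").split(".")[:2])
    tgt_major, tgt_minor = (int(p) for p in target.lstrip("v").split(".")[:2])
    if (tgt_major, tgt_minor) < (cur_major, cur_minor):
        raise ValueError("target must be >= current")
    # One step per intermediate minor version: v0.33 -> v0.34 -> v0.35
    return [f"v{cur_major}.{m}" for m in range(cur_minor + 1, tgt_minor + 1)]

print(upgrade_path("v0.33.0", "v0.35.0"))  # ['v0.34', 'v0.35']
```

Each step in the returned list corresponds to one full `down → change TAG → up` cycle from the guidelines above.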
## Version-specific upgrade steps
### v1.0
Hello Twenty v1.0! 🎉
### v0.60
### Performance Enhancements
All interactions with the metadata API have been optimized for better performance, particularly for object metadata manipulation and workspace creation operations.
We've refactored our caching strategy to prioritize cache hits over database queries when possible, significantly improving the performance of metadata API operations.
If you encounter any runtime issues after upgrading, you may need to flush your cache to ensure it's synchronized with the latest changes. Run this command in your twenty-server container:
```bash
yarn command:prod cache:flush
```
### v0.55
Upgrade your Twenty instance to use the v0.55 image.
You no longer need to run any commands; the new image automatically takes care of running all required migrations.
#### `User does not have permission` error
If you encounter authorization errors on most requests after upgrading, you may need to flush your cache to recompute the latest permissions.
In your `twenty-server` container, run:
```bash
yarn command:prod cache:flush
```
This issue is specific to this Twenty version and should not be required for future upgrades.
### v0.54
Since version `0.53`, no manual actions are needed.
#### Metadata schema deprecation
We've merged the `metadata` schema into the `core` one to simplify data retrieval with `TypeORM`.
We have merged the `migrate` command step within the `upgrade` command. We do not recommend running `migrate` manually within any of your server/worker containers.
### Since v0.53
Starting from `0.53`, the upgrade is performed programmatically within the `Dockerfile`, which means that from now on you shouldn't have to run any commands manually.
Make sure to keep upgrading your instance sequentially, without skipping any major version (e.g. `0.43.3` to `0.44.0` is allowed, but `0.43.1` to `0.45.0` isn't); otherwise, workspace versions can become desynchronized, which can result in runtime errors and missing functionality.
To check whether a workspace has been correctly migrated, review its version in the `core.workspace` table of the database.
It should always be within the range of your Twenty instance's `major.minor` version. You can view your instance version in the admin panel (at `/settings/admin-panel`, accessible if your user has the `canAccessFullAdminPanel` property set to `true` in the database) or by running `echo $APP_VERSION` in your `twenty-server` container.
To fix a desynchronized workspace version, upgrade from the corresponding Twenty version, following the related upgrade guides sequentially until you reach the desired version.
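The range check described above boils down to comparing `major.minor` prefixes; a minimal sketch (not the actual server code):

```python
def same_minor_range(workspace_version: str, app_version: str) -> bool:
    """True when both versions share the same major.minor prefix,
    e.g. workspace '0.53.2' matches an instance running '0.53.0'."""
    ws = workspace_version.lstrip("v").split(".")[:2]
    app = app_version.lstrip("v").split(".")[:2]
    return ws == app

print(same_minor_range("0.53.2", "0.53.0"))  # True
print(same_minor_range("0.52.6", "0.53.0"))  # False
```

Compare the value from `core.workspace` against `APP_VERSION`; a `False` result indicates a desynchronized workspace.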
#### `auditLog` removal
We've removed the `auditLog` standard object, which means your backup size might be significantly reduced after this migration.
### v0.51 to v0.52
Upgrade your Twenty instance to use the v0.52 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade
```
#### I have a workspace blocked on a version between `0.52.0` and `0.52.6`
Unfortunately, versions `0.52.0` to `0.52.6` have been completely removed from Docker Hub.
You will have to manually update your workspace version to `0.51.0` in the database and upgrade using Twenty version `0.52.11`, following its upgrade guide just above.
### v0.50 to v0.51
Upgrade your Twenty instance to use the v0.51 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade
```
### v0.44.0 to v0.50.0
Upgrade your Twenty instance to use the v0.50.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade
```
#### docker-compose.yml update
This version includes a `docker-compose.yml` change that gives the `worker` service access to the `server-local-data` volume.
Please update your local `docker-compose.yml` to match the [v0.50.0 docker-compose.yml](https://github.com/twentyhq/twenty/blob/v0.50.0/packages/twenty-docker/docker-compose.yml).
### v0.43.0 to v0.44.0
Upgrade your Twenty instance to use the v0.44.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade
```
### v0.42.0 to v0.43.0
Upgrade your Twenty instance to use the v0.43.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade
```
In this version, we have also switched to the `postgres:16` image in `docker-compose.yml`.
#### (Option 1) Keep the existing database image
Keeping the existing postgres-spilo image is fine, but you will have to pin the version in your `docker-compose.yml` to `0.43.0`.
#### (Option 2) Migrate to `postgres:16`
If you want to migrate your database to the new `postgres:16` image, please follow these steps:
1. Dump your database from the old postgres-spilo container:
```bash
docker exec -it twenty-db-1 sh
pg_dump -U {YOUR_POSTGRES_USER} -d {YOUR_POSTGRES_DB} > databases_backup.sql
exit
docker cp twenty-db-1:/home/postgres/databases_backup.sql .
```
Make sure your dump file is not empty.
2. Upgrade your `docker-compose.yml` to use the `postgres:16` image, as in the [docker-compose.yml](https://raw.githubusercontent.com/twentyhq/twenty/main/packages/twenty-docker/docker-compose.yml) file.
3. Restore the database to the new `postgres:16` container:
```bash
docker cp databases_backup.sql twenty-db-1:/databases_backup.sql
docker exec -it twenty-db-1 sh
psql -U {YOUR_POSTGRES_USER} -d {YOUR_POSTGRES_DB} -f databases_backup.sql
exit
```
### v0.41.0 to v0.42.0
Upgrade your Twenty instance to use the v0.42.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.42
```
**Environment Variables**
- Removed: `FRONT_PORT`, `FRONT_PROTOCOL`, `FRONT_DOMAIN`, `PORT`
- Added: `FRONTEND_URL`, `NODE_PORT`, `MAX_NUMBER_OF_WORKSPACES_DELETED_PER_EXECUTION`, `MESSAGING_PROVIDER_MICROSOFT_ENABLED`, `CALENDAR_PROVIDER_MICROSOFT_ENABLED`, `IS_MICROSOFT_SYNC_ENABLED`
### v0.40.0 to v0.41.0
Upgrade your Twenty instance to use the v0.41.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.41
```
**Environment Variables**
- Removed: `AUTH_MICROSOFT_TENANT_ID`
### v0.35.0 to v0.40.0
Upgrade your Twenty instance to use the v0.40.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.40
```
**Environment Variables**
- Added: `IS_EMAIL_VERIFICATION_REQUIRED`, `EMAIL_VERIFICATION_TOKEN_EXPIRES_IN`, `WORKFLOW_EXEC_THROTTLE_LIMIT`, `WORKFLOW_EXEC_THROTTLE_TTL`
### v0.34.0 to v0.35.0
Upgrade your Twenty instance to use the v0.35.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.35
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.35` command takes care of the data migration of all workspaces.
**Environment Variables**
- We replaced `ENABLE_DB_MIGRATIONS` with `DISABLE_DB_MIGRATIONS` (default value is now `false`, you probably don't have to set anything)
### v0.33.0 to v0.34.0
Upgrade your Twenty instance to use the v0.34.0 image, then run:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.34
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.34` command takes care of the data migration of all workspaces.
**Environment Variables**
- Removed: `FRONT_BASE_URL`
- Added: `FRONT_DOMAIN`, `FRONT_PROTOCOL`, `FRONT_PORT`
We have updated the way we handle the frontend URL.
You can now set the frontend URL using the `FRONT_DOMAIN`, `FRONT_PROTOCOL` and `FRONT_PORT` variables.
If `FRONT_DOMAIN` is not set, the frontend URL will fall back to `SERVER_URL`.
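The fallback rule can be sketched as follows (illustrative pseudologic, not the server's actual implementation; the default protocol and the port-omission behavior are assumptions):

```python
def frontend_url(env: dict) -> str:
    """Compose the frontend URL from the FRONT_* variables,
    falling back to SERVER_URL when FRONT_DOMAIN is unset."""
    domain = env.get("FRONT_DOMAIN")
    if not domain:
        return env["SERVER_URL"]
    protocol = env.get("FRONT_PROTOCOL", "https")  # assumed default
    port = env.get("FRONT_PORT")
    return f"{protocol}://{domain}:{port}" if port else f"{protocol}://{domain}"

print(frontend_url({"SERVER_URL": "https://crm.example.com"}))
print(frontend_url({"FRONT_DOMAIN": "app.example.com", "FRONT_PORT": "3001",
                    "SERVER_URL": "https://crm.example.com"}))
```

The example hostnames are placeholders; substitute your own values from `.env`.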
### v0.32.0 to v0.33.0
Upgrade your Twenty instance to use the v0.33.0 image, then run:
```bash
yarn command:prod cache:flush
yarn database:migrate:prod
yarn command:prod upgrade-0.33
```
The `yarn command:prod cache:flush` command will flush the Redis cache.
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.33` command takes care of the data migration of all workspaces.
Starting from this version, the `twenty-postgres` DB image is deprecated and `twenty-postgres-spilo` is used instead.
If you want to keep using the `twenty-postgres` image, simply replace `twentycrm/twenty-postgres:${TAG}` with `twentycrm/twenty-postgres` in `docker-compose.yml`.
### v0.31.0 to v0.32.0
Upgrade your Twenty instance to use the v0.32.0 image.
**Schema and data migration**
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.32
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.32` command takes care of the data migration of all workspaces.
**Environment Variables**
We have updated the way we handle the Redis connection.
- Removed: `REDIS_HOST`, `REDIS_PORT`, `REDIS_USERNAME`, `REDIS_PASSWORD`
- Added: `REDIS_URL`
Update your `.env` file to use the new `REDIS_URL` variable instead of the individual Redis connection parameters.
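`REDIS_URL` follows the standard `redis://` URL scheme, so the old parameters map into it directly; a sketch of the mapping (example values only):

```python
def to_redis_url(host, port, username=None, password=None):
    """Build a redis:// URL from the legacy connection parameters."""
    auth = ""
    if username or password:
        auth = f"{username or ''}:{password or ''}@"
    return f"redis://{auth}{host}:{port}"

print(to_redis_url("localhost", 6379))  # redis://localhost:6379
print(to_redis_url("redis.internal", 6379, "app", "s3cr3t"))
```

For a password-protected Redis without a username, the URL takes the form `redis://:password@host:port`.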
We have also simplified the way we handle the JWT tokens.
- Removed: `ACCESS_TOKEN_SECRET`, `LOGIN_TOKEN_SECRET`, `REFRESH_TOKEN_SECRET`, `FILE_TOKEN_SECRET`
- Added: `APP_SECRET`
Update your `.env` file to use the new `APP_SECRET` variable instead of the individual token secrets (you can use the same secret as before or generate a new random string).
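One possible way to generate a new random secret is Python's `secrets` module (any cryptographically random string of sufficient length works):

```python
import secrets

# 32 random bytes, hex-encoded -> a 64-character secret suitable for APP_SECRET
app_secret = secrets.token_hex(32)
print(f"APP_SECRET={app_secret}")
```

Paste the printed value into your `.env` file.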
**Connected Account**
If you are using a connected account to synchronize your Google emails and calendars, you will need to activate the [People API](https://developers.google.com/people) in your Google Admin console.
### v0.30.0 to v0.31.0
Upgrade your Twenty instance to use the v0.31.0 image.
**Schema and data migration**:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.31
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.31` command takes care of the data migration of all workspaces.
### v0.24.0 to v0.30.0
Upgrade your Twenty instance to use the v0.30.0 image.
**Breaking change**:
To enhance performance, Twenty now requires a Redis cache to be configured. We have updated our [docker-compose.yml](https://raw.githubusercontent.com/twentyhq/twenty/main/packages/twenty-docker/docker-compose.yml) to reflect this.
Make sure to update your configuration and your environment variables accordingly:
```
REDIS_HOST={your-redis-host}
REDIS_PORT={your-redis-port}
CACHE_STORAGE_TYPE=redis
```
**Schema and data migration**:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.30
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.30` command takes care of the data migration of all workspaces.
### v0.23.0 to v0.24.0
Upgrade your Twenty instance to use the v0.24.0 image, then run the following commands:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.24
```
The `yarn database:migrate:prod` command will apply the migrations to the database structure (core and metadata schemas).
The `yarn command:prod upgrade-0.24` command takes care of the data migration of all workspaces.
### v0.22.0 to v0.23.0
Upgrade your Twenty instance to use the v0.23.0 image, then run the following commands:
```bash
yarn database:migrate:prod
yarn command:prod upgrade-0.23
```
The `yarn database:migrate:prod` command will apply the migrations to the database.
The `yarn command:prod upgrade-0.23` command takes care of the data migration, including transferring activities to tasks/notes.
### v0.21.0 to v0.22.0
Upgrade your Twenty instance to use the v0.22.0 image, then run the following commands:
```bash
yarn database:migrate:prod
yarn command:prod workspace:sync-metadata -f
yarn command:prod upgrade-0.22
```
The `yarn database:migrate:prod` command will apply the migrations to the database.
The `yarn command:prod workspace:sync-metadata -f` command will sync the definition of standard objects to the metadata tables and apply the required migrations to existing workspaces.
The `yarn command:prod upgrade-0.22` command will apply specific data transformations to adapt to the new object model.

{
"$schema": "https://mintlify.com/schema.json",
"name": "Twenty Documentation",
"theme": "almond",
"logo": {
"light": "/logo.svg",
"dark": "/logo.svg"
},
"favicon": "/favicon.png",
"colors": {
"primary": "#141414",
"light": "#fafafa",
"dark": "#141414"
},
"interaction": {
"drilldown": false
},
"navbar": {
"primary": {
"type": "button",
"label": "Get Started",
"href": "https://app.twenty.com/welcome"
}
},
"styling": {
"eyebrows": "breadcrumbs"
},
"navigation": {
"tabs": [
{
"tab": "User Guide",
"groups": [
{
"group": "Getting Started",
"icon": "rocket",
"pages": [
"user-guide/introduction",
"user-guide/getting-started/what-is-twenty",
"user-guide/getting-started/create-workspace",
"user-guide/getting-started/getting-around-twenty",
"user-guide/getting-started/configure-your-workspace",
"user-guide/getting-started/implementation-services",
"user-guide/getting-started/migrating-from-other-crms",
"user-guide/getting-started/import-export-data"
]
},
{
"group": "Data Model",
"icon": "database",
"pages": [
"user-guide/data-model/customize-your-data-model",
"user-guide/data-model/objects",
"user-guide/data-model/fields",
"user-guide/data-model/creating-records",
"user-guide/data-model/relation-fields",
"user-guide/data-model/data-model-faq"
]
},
{
"group": "CRM Essentials",
"icon": "users",
"pages": [
"user-guide/crm-essentials/contact-and-account-management",
"user-guide/crm-essentials/pipeline",
"user-guide/crm-essentials/view-management",
"user-guide/crm-essentials/sales-use-cases"
]
},
{
"group": "Views",
"icon": "table",
"pages": [
"user-guide/views/kanban-views",
"user-guide/views/views-sort-filter"
]
},
{
"group": "Workflows",
"icon": "bolt",
"pages": [
"user-guide/workflows/getting-started-workflows",
"user-guide/workflows/workflow-features",
"user-guide/workflows/internal-automations",
"user-guide/workflows/external-tool-integration",
"user-guide/workflows/workflow-troubleshooting",
"user-guide/workflows/workflow-credits",
"user-guide/workflows/professional-services"
]
},
{
"group": "Collaboration",
"icon": "envelope",
"pages": [
"user-guide/collaboration/emails-and-calendars",
"user-guide/collaboration/notes",
"user-guide/collaboration/tasks"
]
},
{
"group": "Integrations & API",
"icon": "plug",
"pages": [
"user-guide/integrations-api/apis-overview",
"user-guide/integrations-api/api-webhooks",
"user-guide/integrations-api/integrations"
]
},
{
"group": "Reporting",
"icon": "chart-bar",
"pages": ["user-guide/reporting/reporting-overview"]
},
{
"group": "Settings",
"icon": "gear",
"pages": [
"user-guide/settings/profile-settings",
"user-guide/settings/experience-settings",
"user-guide/settings/email-calendar-setup",
"user-guide/settings/workspace-settings",
"user-guide/settings/member-management",
"user-guide/settings/permissions",
"user-guide/settings/domains-settings",
"user-guide/settings/releases-settings",
"user-guide/settings/settings-faq"
]
},
{
"group": "Pricing",
"icon": "credit-card",
"pages": ["user-guide/pricing/billing-and-pricing-faq"]
},
{
"group": "Resources",
"icon": "book-open",
"pages": [
"user-guide/resources/glossary",
"user-guide/resources/github"
]
}
]
},
{
"tab": "Developers",
"groups": [
{
"group": "Developers"
},
{
"group": "Getting Started",
"icon": "rocket",
"pages": [
"developers/introduction",
"developers/local-setup",
{
"group": "Self-Hosting",
"pages": [
"developers/self-hosting/docker-compose",
"developers/self-hosting/setup",
"developers/self-hosting/upgrade-guide",
"developers/self-hosting/cloud-providers",
"developers/self-hosting/troubleshooting"
]
},
{
"group": "API and Webhooks",
"pages": [
"developers/api-and-webhooks/api",
"developers/api-and-webhooks/webhooks"
]
}
]
},
{
"group": "Contributing",
"icon": "code-branch",
"pages": [
"developers/bug-and-requests",
{
"group": "Frontend Development",
"pages": [
"developers/frontend-development/storybook",
{
"group": "Twenty UI",
"pages": [
"twenty-ui/introduction",
{
"group": "Display",
"pages": [
"twenty-ui/display/checkmark",
"twenty-ui/display/chip",
"twenty-ui/display/icons",
"twenty-ui/display/soon-pill",
"twenty-ui/display/tag",
"twenty-ui/display/app-tooltip"
]
},
{
"group": "Feedback",
"pages": ["twenty-ui/progress-bar"]
},
{
"group": "Input",
"pages": [
"twenty-ui/input/buttons",
"twenty-ui/input/color-scheme",
"twenty-ui/input/text",
"twenty-ui/input/checkbox",
"twenty-ui/input/icon-picker",
"twenty-ui/input/image-input",
"twenty-ui/input/radio",
"twenty-ui/input/select",
"twenty-ui/input/toggle",
"twenty-ui/input/block-editor"
]
},
{
"group": "Navigation",
"pages": [
"twenty-ui/navigation",
"twenty-ui/navigation/breadcrumb",
"twenty-ui/navigation/links",
"twenty-ui/navigation/menu-item",
"twenty-ui/navigation/navigation-bar",
"twenty-ui/navigation/step-bar"
]
}
]
},
"developers/frontend-development/frontend-commands",
"developers/frontend-development/work-with-figma",
"developers/frontend-development/best-practices-front",
"developers/frontend-development/style-guide",
"developers/frontend-development/folder-architecture-front",
"developers/frontend-development/hotkeys"
]
},
{
"group": "Backend Development",
"pages": [
"developers/backend-development/server-commands",
"developers/backend-development/feature-flags",
"developers/backend-development/folder-architecture-server",
"developers/backend-development/zapier",
"developers/backend-development/best-practices-server",
"developers/backend-development/custom-objects",
"developers/backend-development/queue"
]
}
]
}
]
}
]
},
"footer": {
"socials": {
"github": "https://github.com/twentyhq/twenty",
"twitter": "https://twitter.com/twentycrm",
"discord": "https://discord.gg/cx5n4Jzs57"
}
}
}

import baseConfig from '../../eslint.config.mjs';
export default [
...baseConfig,
{
files: ['**/*.mdx'],
rules: {
// MDX-specific rules if needed
},
},
];

---
title: Create a Workspace
description: "Follow a step-by-step guide on how to register on Twenty, choose a subscription plan, confirm your payment and set up your account."
---
## Step 1: Registration
1. Navigate to [Twenty Sign Up](https://app.twenty.com).
2. Select your preferred sign-up method:
- **Continue with Google** for Google account registration.
- **Continue with Microsoft** for Microsoft account registration.
- Or, **Continue With Email** for email registration.
## Step 2: Choosing a Trial Period
Choose between two trial periods:
- **30 days** with a credit card
- **7 days** without a credit card
Both trials include:
- Full access
- Unlimited contacts
- Email integration
- Custom objects
- API & Webhooks
You can click on "Change plan" to choose a different plan or billing interval.
## Step 3: Payment Confirmation & Account Setup
After your payment is approved via Stripe, you're directed to create your workspace and user profile. Remember that you can cancel your subscription anytime.
## Support
For queries or help, connect with the dedicated support team at [contact@twenty.com](mailto:contact@twenty.com) or send a message on [Discord](https://discord.gg/cx5n4Jzs57).

---
title: What is Twenty
description: "Discover Twenty, an open-source CRM, its features, benefits, system requirements, and how to get involved."
---
Twenty is the leading open-source CRM, crafted by hundreds of contributors to suit your unique business needs.
## Vision
Creating a good CRM is hard because it's a balancing act.
For each business, the requirements seem straightforward, yet everyone's needs are distinct.
The result is a CRM that's either too basic, or one that's attempting to be a jack-of-all-trades but ending up as a master of none.
At first, Twenty looks like most CRMs you already know: you can track deals, organize contacts, manage tasks and notes.
But what sets it apart is our approach to extensibility. We are building an open platform that provides the building blocks for you to solve your unique business problems.
We prioritize universal principles and common patterns over feature lists.
We don't try to have all the answers and instead empower users to find what works best for them.
Open-source is the bedrock of our approach, ensuring that Twenty evolves with its community, for its community.
## Benefits
**Customizable:** Designed to fit your business needs.
**Community-driven:** Built and maintained by a large open-source community.
**Cost-effective:** You'll never be vendor-locked, because you can always self-host.
## Main Features
**Contact Management:** Efficiently store and manage customer data.
**Custom Objects:** Create and customize objects to fit your business needs.
**Custom Fields:** Tailor data fields to capture and organize information specific to your operations.
**Deal Management:** Track and manage your sales opportunities through customizable Pipeline stages.
**Kanban & Table Views:** Make data actionable with flexible kanban and table views.
**Workflows:** Automate your business processes and integrate with external tools using powerful workflow automation.
**Email Integration:** View the emails of a specific customer or company within your workspace.
**Notes:** Create detailed notes for each record to share knowledge more effectively.
**Tasks:** Schedule tasks to track customer interactions.
**Permissions:** Control access and manage user roles with flexible workspace and object-level permissions.
**API & Webhooks:** Connect to other apps and automate workflows with API and Webhooks.
## Join now
[Register here](https://app.twenty.com) or [become a contributor on GitHub](https://github.com/twentyhq/twenty).
