# Access Control

::: tip **Interactive Learning Available! 🚀**
Looking to get hands-on with this topic? Try out our new interactive tutorial on Access Control, where you can explore and practice directly in the browser. This guided experience offers step-by-step lessons to help you master Access Control in Remult with practical examples and exercises.
:::

Access control is essential for ensuring that users can only access the resources they are authorized to in web applications. This article explores the various layers of access control, focusing on a framework that provides a granular approach to securing your application.

## Entity-Level Authorization

Entity-level authorization governs CRUD operations at the entity level. Each entity can define permissions for these operations using the following options:

- `allowApiRead`: Controls read access.
- `allowApiInsert`: Controls insert access.
- `allowApiUpdate`: Controls update access.
- `allowApiDelete`: Controls delete access.

Each option can be set to a boolean, a string role, an array of string roles, or an arrow function.

## Row-Level Authorization

Row-level authorization allows control over which rows a user can access or modify.

### Authorization on Specific Rows

The `allowApiUpdate`, `allowApiDelete`, and `allowApiInsert` options can also accept a function that receives the specific item as the first parameter, allowing row-level authorization.

### Filtering Accessible Rows

To limit the rows a user has access to, use the `apiPrefilter` option. `apiPrefilter` adds a filter to all CRUD API requests, ensuring that only authorized data is accessible through the API.
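A minimal sketch of how these options can look on an entity. The `Task` entity, its fields, and the `ownerId` ownership convention are illustrative assumptions, not part of this article:

```typescript
import { Entity, Fields, Allow, remult } from 'remult'

@Entity<Task>('tasks', {
  allowApiRead: true, // anyone can read
  allowApiInsert: Allow.authenticated, // only signed-in users can insert
  allowApiUpdate: 'admin', // only users with the 'admin' role
  allowApiDelete: ['admin', 'manager'], // any one of these roles
  // Row-level filtering: admins see every row,
  // other users only see rows they own.
  apiPrefilter: () =>
    remult.isAllowed('admin') ? {} : { ownerId: remult.user?.id ?? '' },
})
export class Task {
  @Fields.uuid()
  id = ''
  @Fields.string()
  title = ''
  @Fields.string()
  ownerId = ''
}
```

The same option shapes (boolean, role, roles array, or function) apply to all four `allowApi...` options.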
### Preprocessing Filters for API Requests

For more complex scenarios, you can use `apiPreprocessFilter` to dynamically modify the filter based on the specific request and additional filter information. For example, `apiPreprocessFilter` can use the `getPreciseValues` method to require that users specify a valid `customerId` filter when querying tasks, allowing for more granular control over the data that is accessible through the API.

**Note:** The `preciseValues` object includes the actual values used in the filter. For example, if the `customerId` filter specifies the values `'1'`, `'2'`, and `'3'`, then `preciseValues.customerId` will be an array containing these values. This allows you to check and enforce specific filter criteria in your preprocessing logic.

### Warning: API Filters Do Not Affect Backend Queries

It's important to note that `apiPrefilter` and `apiPreprocessFilter` only apply to API requests. They do not affect backend queries, such as those executed through backend methods or non-Remult routes. For instance, in a sign-in scenario, a backend method might need to check all user records to verify a user's existence without exposing all user data through the API. Once authenticated, the user should only have access to their own record for updates.

### Backend Filters for Consistent Access Control

To apply similar filtering logic to backend queries, use `backendPrefilter` and `backendPreprocessFilter`. These ensure that non-admin users can only access their own rows in backend queries as well, providing consistent access control across both API and backend operations.
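A sketch combining both ideas, assuming a `Task` entity with `customerId` and `ownerId` fields (the entity and field names are illustrative):

```typescript
import { Entity, Fields, ForbiddenError, remult } from 'remult'

@Entity<Task>('tasks', {
  // Applies to API requests only: reject queries that don't
  // constrain customerId to specific values.
  apiPreprocessFilter: async (filter, { getPreciseValues }) => {
    const preciseValues = await getPreciseValues()
    if (!preciseValues.customerId)
      throw new ForbiddenError('You must specify a valid customerId filter')
    return filter
  },
  // Applies to backend queries as well as API requests:
  backendPrefilter: () =>
    remult.isAllowed('admin') ? {} : { ownerId: remult.user?.id ?? '' },
})
export class Task {
  @Fields.uuid()
  id = ''
  @Fields.string()
  customerId = ''
  @Fields.string()
  ownerId = ''
}
```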
## Field-Level Authorization

Field-level authorization allows control over individual fields within an entity:

- `includeInApi`: Determines if the field is included in the API response.
- `allowApiUpdate`: Controls if a field can be updated. If false, any change to the field is ignored.

_Field-level authorization happens after entity-level authorization, and only if the operation is allowed at the entity level._

### Field Masking

To mask a field, combine a non-API field with a `serverExpression` that returns the masked value.

## BackendMethod Authorization

Backend methods use the `allowed` option to determine authorization. The `allowed` option can receive a boolean, a string role, an array of role strings, or a function.

## Reusing Access Control Definitions in the Frontend

Access control definitions set in entities can be reused as a single source of truth in the frontend. This allows for consistent and centralized management of access control logic across your application. For example, in a React component, you can conditionally render UI elements based on the access control rules defined in the entity.

## Additional Resources

Check out this informative video. It discusses the concepts covered in this article and provides practical examples to help you understand how to implement robust access control in your applications.

---

This article provides a comprehensive overview of the layers of access control in web applications, offering a granular approach to securing your application at the entity, row, field, and method levels.

# Mutability and the Active Record Pattern

The Active Record pattern is a concept in software architecture, particularly useful when working with mutable objects whose state may change over time. This design pattern facilitates direct interaction with the database through the object representing a row of the data table.
In this article, we'll delve into the fundamentals of the Active Record pattern, contrast it with immutable patterns, and explore its implementation and advantages in software development.

### Immutable vs. Mutable Patterns

In modern software development, handling data objects can generally be approached in two ways: immutable and mutable patterns.

**Immutable objects** do not change once they are created; any modification of an immutable object results in a new object. In the React framework, for example, immutability is often preferred. However, libraries like MobX offer the flexibility to work with mutable objects while still providing the reactivity that React components need.

**Mutable objects**, on the other hand, allow for changes directly on the object itself. Mutable patterns are especially prevalent in scenarios where the state of objects changes frequently, making them a staple in many programming environments outside of React.

### The Role of the Active Record Pattern

The Active Record pattern embodies the concept of mutability by binding business logic to object data models. Typically, each model instance corresponds to a row in the database, with the class methods providing the functionality to create, read, update, and delete records.

### Warning: Mutable Objects in React

Using mutable objects with the Active Record pattern in React requires careful handling. React's rendering cycle is built around the premise of immutability; it typically relies on immutable state management to trigger re-renders. When mutable objects change state outside the scope of React's `useState` or `useReducer`, React does not automatically know to re-render the affected components. This can lead to issues where the UI does not reflect the current application state. These challenges can be mitigated by integrating state management tools that are designed to work well with mutable objects, such as MobX.
MobX provides mechanisms to track changes in data and automatically re-render components when mutations occur. This aligns more naturally with the Active Record pattern within the context of React, ensuring that the UI stays in sync with the underlying data.

#### Using EntityBase and IdEntity

In practice, leveraging the Active Record pattern often involves inheriting from classes such as `EntityBase` or `IdEntity`. These base classes enrich models with methods that simplify manipulating their attributes and persisting them to the database.

For example, a `Person` class representing individuals in the 'people' table can inherit from `IdEntity`. This inheritance means there is no need to explicitly define an `id` field for the class, as `IdEntity` automatically includes a UUID field. Consequently, `Person` benefits from all the functionality of `EntityBase`, including change tracking and CRUD operations, while automatically gaining a UUID as its identifier.

### Mutable vs. EntityBase

Compared to the traditional repository-based approach, using Active Record with `EntityBase` simplifies updates and other common operations by letting you mutate an instance and save it directly.

#### Helper Members in EntityBase

`EntityBase` provides additional utility members to facilitate more complex interactions:

- **`_`:** Allows performing operations on a specific instance of an entity.
- **`$`:** Provides access to detailed information about each field in the current instance, such as its original and current values.

### Alternative Implementations

Even without direct inheritance from `EntityBase`, similar functionality can be achieved using helper functions such as `getEntityRef`, which encapsulates an entity instance for manipulation and persistence.

### Conclusion

The Active Record pattern offers a straightforward and intuitive approach to interacting with database records through object-oriented models.
It is particularly beneficial in environments where business logic needs to be tightly coupled with data manipulation, providing a clear and efficient way to handle data state changes. However, as discussed above, integrating the Active Record pattern with mutable objects in React can be challenging.

---
outline:
---

# Add Remult to your App

::: tip New Project?
For a new project, we suggest using one of the starter kits to get a fresh, good start.
:::

::: tip Adding to an existing project?
You are at the right place. Remult is designed to be added to existing projects. Embark on the Remult journey at your own pace and start reaping the benefits from day one. Gradual adoption is the preferred route for many, allowing for a smooth integration into your workflow.
:::

## Installation

**The _remult_ package is one and the same for both the frontend bundle and the backend server.**

- If you're using one `package.json` for both frontend and backend, **install Remult once** in the project's root folder.
- If you're using multiple `package.json` files, **install Remult in both the server and client folders**.

## Server-side Initialization

Remult is initialized on the server side as a request-handling middleware, with **a single line of code**. The same pattern applies to all supported servers: Express, Fastify, Next.js (Pages Router and App Router), SvelteKit, Hapi, Nest, and Koa.

## Client-side Initialization

On the client side, `remult` can use any standard JavaScript HTTP client to call the data API. **By default, remult uses the browser's `fetch` API and makes data API calls using the base URL `/api`.** Axios and Angular's `HttpClient` can be used instead of `fetch`.

### Changing the default API base URL

By default, remult makes data API calls to routes based at the `/api` route of the origin of the client-side app. To use a different base URL for API calls, set the remult object's `apiClient.url` property.
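A minimal sketch of the server-side setup with Express; the empty `entities` array is a placeholder for your entity classes:

```typescript
import express from 'express'
import { remultExpress } from 'remult/remult-express'

const app = express()

// Remult is registered as a request-handling middleware in a single line:
app.use(remultExpress({ entities: [] }))

app.listen(3000)
```

On the client, when the API is served from a different origin or path, the base URL can be changed with `remult.apiClient.url = 'https://example.com/api'` (the URL here is hypothetical).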
::: warning CORS
Handling CORS is outside the scope of Remult.
:::

## Database Initialization

Got a database ready? Fantastic! Unleash the full potential of your existing setup by generating your entities directly from the database itself. Check out the entity generator and see how it can generate your entities in one shot, in no time.

# Adding GraphQL

To add GraphQL to a `remult` application, follow these steps:

1. Install the `graphql-yoga` package.
2. Register the GraphQL endpoint: for Express, add the setup code to the `/src/server/index.ts` file; for the Next.js App Router, add an API route handler; for Svelte, add it at `src/routes/api/graphql/+server.ts`.

# Adding Swagger and OpenAPI

In short, Swagger provides a quick UI that describes the API exposed by the application. To add Swagger to a `remult` application, follow these steps:

1. Install the `swagger-ui-express` package.
2. In the `/src/server/index.ts` file, add the setup code.

## Adding Swagger UI to a Next.js App

To add Swagger UI to a `Next.js` application, follow these steps:

1. Install the required packages.
2. Get the openApi document from `RemultNextAppServer`.
3. Create a new page to render Swagger UI.
4. Navigate to `http://localhost:3000/api-doc` to see the Swagger UI.

# Adding OpenAPI-specific field options

Check out the example project that demonstrates how to add `openApi`-specific options to field options.

# Admin UI

Enjoy a fully featured Admin UI for your entities: you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.

## Enabling the Admin UI

Add the Admin UI to your application by setting the `admin` option to `true` in the remult configuration.

### Tuning the Admin UI

You can pass some options to `admin` as well:

- `allow`: an `Allowed` value, e.g. `true`, `"admin"`, ...
- `customHtmlHead`: adds custom HTML to the head of the admin page. It's a function that receives `remult` as an argument.
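A sketch of enabling and tuning the Admin UI, shown with Express; the options follow the description above, and the HTML returned by `customHtmlHead` is illustrative:

```typescript
import express from 'express'
import { remultExpress } from 'remult/remult-express'

const app = express()
app.use(
  remultExpress({
    entities: [], // placeholder for your entity classes
    admin: {
      allow: 'admin', // an Allowed value: true, a role, roles, or a function
      customHtmlHead: (remult) =>
        `<title>Admin - ${remult.user?.name ?? 'guest'}</title>`,
    },
  }),
)
app.listen(3000)
```

Setting `admin: true` instead of an options object enables the UI with its defaults.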
## Accessing and Using the Admin UI

Navigate to `/api/admin` to access the Admin UI. Here, you can perform CRUD operations on your entities, view their relationships via the Diagram entry, and ensure secure management with the same validations and authorizations as your application.

## Features

- **Entity List**: On the left side of the screen you have the entity list; you can use the search field to search for entities.
- **Entity Details**: Clicking on an entity in the menu opens the entity details screen, where you can view, filter, and paginate your data. You can also see all relations of an entity by clicking on the arrow on the left of each row. The last column is dedicated to actions, where you can edit or delete an entity. At the top left you can also add a new entity by clicking on the `+`.
- **Entity Diagram**: Clicking on the Diagram entry opens the entity diagram screen, where you can see the entity relationships.
- **Settings**: At the top left, there is a menu with various settings for your admin UI:
  - Do you want to confirm every delete?
  - Do you want to display captions or keys?
  - Multiple options for automatic diagram layout.
  - You don't use cookies? No problem, you can set your bearer token.

## Demo in video

Watch this quick demo to see the Remult Admin UI in action. It showcases the key features and functionality of the Remult Admin UI, giving you a practical overview of how it can streamline your entity management process.

# Allowed

Throughout the API you'll see methods that use the `Allowed` data type, for example `allowApiRead`. The `Allowed` data type can be set to one of the following values:

- true/false
- A role - checks if the current user has this role,
  either as a string or with a constant.
- An array of roles - checks if the current user has at least one of the roles in the array.
- A function that gets a `remult` object as a parameter and returns true or false.

# AllowedForInstance

In some cases, the allowed value can be evaluated with regard to a specific instance; for example, `allowApiUpdate` can consider specific row values. The AllowedForInstance function accepts two parameters:

1. The relevant `remult` object
2. The relevant entity instance

# Backend Methods

Backend methods run on the backend and are used to improve performance, execute server-only code, or perform operations not accessible through the API.

## Static Backend Methods

Static backend methods represent the most straightforward type, transmitting their parameters to the backend and delivering their outcome to the frontend.

1. **Define the Backend Method:** Each controller can house one or more backend methods, each serving distinct purposes tailored to your application's needs. For example, a `TasksController` class might contain a single backend method named `setAll`, responsible for setting the completion status of all tasks. The method name, such as `setAll`, serves as the URL for the corresponding REST endpoint on the backend server. You can configure a prefix for these endpoints using the `apiPrefix` option, providing flexibility in structuring your backend API routes. The `allowed: true` option signifies that the backend method can be invoked by anyone. Alternatively, you can customize the authorization settings for finer control over who can access the method: setting `allowed: Allow.authenticated` restricts access to authenticated users only, ensuring that only logged-in users can utilize the method, while specifying `allowed: 'admin'` limits access to users with administrative privileges.
These options offer granular control over authorization, allowing you to tailor access permissions based on your application's specific requirements and security considerations.

2. **Register the Controller:** Add the controller class to the server middleware options.

3. **Call from the Frontend:** Invoke the static method directly from frontend code; when called, the `setAll` method sets the completion status of all tasks to the specified value. The method leverages Remult's `BackendMethod` decorator to handle the communication between the frontend and backend seamlessly.

# Creating a Remult Project

_The easiest way to start building a Remult app_

::: tip
Let us know how you liked the process!
:::

## What you get

### 1. **Tailored Setup**

Answer a few questions about your preferred tech stack and project requirements:

- `Project name`: The name of your project.
- `Choose your Framework`
- `Choose your Web Server`
- `Choose your Database`
- `Authentication`: Do you want to add `auth.js` to your project directly? Includes a complete implementation for the `credentials` and `github` providers.
- `Add CRUD demo`: A comprehensive example of how to use an entity. It will show you how to create, read, update and delete data.
- `Admin UI`: Will then be available at `/api/admin`.

### 2. **Instant Configuration**

Based on your answers, Remult will configure the project with the best-suited options. With all combinations of frameworks, servers, databases and authentication, we manage more than `180 different project flavors`! Are we missing yours? Let us know!

### 3. **Feature-Rich Demo**

Once you run your project, you'll be greeted with a comprehensive dashboard that showcases all of Remult's powerful features. Each tile is a fully functional example of a feature that you selected.

### 4. **Easy Eject**

Simply remove the demo folder to eject the demo components.
# CRUD your first Entity

## Define an Entity Model Class

Remult entity classes are shared between frontend and backend code.

## Register the Entity on the Server

All Remult server middleware options contain an `entities` array. Use it to register your entity.

## Query and Mutate data in Front-end code

# Leveraging Custom Filters for Enhanced Data Filtering

In modern web applications, efficiently filtering data is essential for providing a seamless user experience. Whether it's an e-commerce platform filtering products, a task management system sorting tasks, or any other application that requires data manipulation, the ability to apply complex filters is crucial. Custom filters offer a powerful solution, enabling developers to create reusable, declarative, and versatile filters that are executed on the backend and easily utilized from the frontend. This article delves into the concept of custom filters, illustrating their advantages and practical applications.

## The Advantages of Custom Filters

Custom filters provide several benefits that make them an attractive choice for handling data filtering in web applications:

1. **Declarative and Readable:** Custom filters allow you to express filtering logic in a clear, declarative manner. This improves code readability and maintainability, making it easier to understand and modify filtering criteria.
2. **Reusability:** By encapsulating filtering logic in custom filters, you can reuse the same filters across different parts of your application, reducing code duplication and ensuring consistency in filtering behavior.
3. **Backend Execution:** Custom filters are evaluated on the backend, leveraging the full capabilities of the underlying database or data provider. This enables more efficient data processing and allows you to perform complex operations that would be difficult or impossible to handle on the frontend.
4. **Composability:** Custom filters can be combined with other filters, both custom and standard, allowing you to build complex filtering logic in a modular and maintainable way.
5. **Flexibility with Data Providers:** Custom filters can be used with various data providers, including SQL databases, in-memory JSON arrays, and others. This flexibility allows you to apply custom filters in different contexts and with different data storage solutions.
6. **Enhanced Security:** When using custom filters with parameterized queries or data provider-specific filtering methods, you can mitigate the risk of injection attacks and ensure that user input is properly sanitized.

## Practical Example: Filtering Orders in an E-Commerce Application

Consider an e-commerce application where you need to filter orders based on their status and creation year. Without custom filters, the filtering logic might be repetitive and scattered throughout the codebase. By using custom filters, you can encapsulate this logic in a reusable component, simplifying the code and making it more maintainable. In the following sections, we'll explore how to implement custom filters in this scenario, demonstrating their advantages and how they can be used to create more efficient and readable code.

## The Problem with Repetitive Filtering

Consider a scenario where you have an `Order` entity, and you frequently need to filter orders that are considered "active" based on their status and creation year. Without custom filters, every query repeats the same `status` and `createdAt` conditions. This code is not only repetitive but also clutters your application, making it harder to maintain. Moreover, it generates lengthy REST API calls.

## Introducing Custom Filters

Custom filters allow you to refactor your filtering logic into a reusable and declarative component.
Here's how a custom filter for active orders is defined, using `Filter.createCustom`:

- **First Generic Parameter:** Specifies the entity class that the filter is associated with, in this case the `Order` class. This is important because it ensures that the filter criteria you define are compatible with the fields and types of the `Order` entity.
- **Second Generic Parameter:** Defines the type of the argument that the filter receives when executed, in this example an object with a single property `year` of type `number`. This allows you to pass dynamic values to the filter when you use it in a query, making the filter more flexible and reusable.
- **Callback Function:** This is where you define the actual filtering criteria. It receives an argument matching the type specified in the second generic parameter and returns an object representing the filter conditions, here based on the `status` and `createdAt` fields of the `Order` entity.

You can then use this custom filter in your queries, which generates a much simpler REST API call.

## Composability of Custom Filters

One of the key advantages of custom filters is their ability to be composed with other filters. This means you can combine custom filters with regular filters or even other custom filters to build complex filtering logic. For example, a query can filter orders based on two criteria:

1. The `customerId` should be "123".
2. The order should satisfy the conditions defined in the `activeOrders` custom filter for the specified year.

By using the `$and` operator, we're able to combine the custom filter with a regular filter. This demonstrates the composability of custom filters, allowing you to build more complex and nuanced filtering logic while maintaining readability and reusability.
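A sketch of such a filter and its composition. The `Order` fields and the status values are assumptions drawn from this article's description:

```typescript
import { Entity, Fields, Filter, remult } from 'remult'

@Entity('orders')
export class Order {
  @Fields.uuid()
  id = ''
  @Fields.string()
  customerId = ''
  @Fields.string()
  status = ''
  @Fields.date()
  createdAt = new Date()

  // A reusable, declarative custom filter, evaluated on the backend.
  static activeOrders = Filter.createCustom<Order, { year: number }>(
    ({ year }) => ({
      status: ['created', 'confirmed', 'pending'], // assumed status values
      createdAt: {
        $gte: new Date(year, 0, 1), // from Jan 1 of the year...
        $lt: new Date(year + 1, 0, 1), // ...up to (excluding) Jan 1 of the next
      },
    }),
  )
}

async function examples() {
  // Used alone:
  const active = await remult.repo(Order).find({
    where: Order.activeOrders({ year: 2023 }),
  })
  // Composed with a regular filter via $and:
  const composed = await remult.repo(Order).find({
    where: { customerId: '123', $and: [Order.activeOrders({ year: 2023 })] },
  })
}
```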
### More on Composability

The power of composability doesn't stop there. You can also combine multiple custom filters to create even more specific filters. For example, suppose you have another custom filter called `highValueOrders` that filters orders based on their total value. You can then combine this with the `activeOrders` filter to find high-value active orders for a specific year. This ability to compose filters allows you to create modular and reusable filtering logic, which can significantly improve the maintainability and clarity of your code.

### Evaluating Custom Filters on the Backend

One of the significant advantages of custom filters is that they are evaluated on the backend. This allows you to perform complex data-related operations that would be inefficient or impossible to do solely on the frontend. For instance, you can leverage database queries or other server-side logic to build your filtering criteria.

For example, the `activeOrders` custom filter can take an additional parameter, `customerCity`. The filter then performs a database query to fetch all customers from the specified city and uses the IDs of these customers to filter orders that belong to them, combined with the existing criteria of filtering orders based on their status and creation year.

::: tip Key Points
- **Backend Evaluation:** The filter is evaluated on the backend, where it has access to the database and can perform efficient queries. This offloads complex data processing from the frontend to the backend, where it can be handled more effectively.
- **Complex Filtering:** By leveraging backend capabilities, you can create filters that involve complex operations, such as fetching related data from other tables or entities.
- **Asynchronous Operations:** A custom filter's callback can be `async`, which allows you to perform asynchronous operations, such as database queries, within your custom filter.
:::

## Leveraging Database Capabilities with Raw SQL in Custom Filters

Since custom filters are **evaluated on the backend**, you have the opportunity to harness the raw capabilities of the underlying database. This can be particularly useful when you need to perform complex operations that are more efficiently handled by the database itself. For instance, you can use raw SQL queries to improve the performance or functionality of your custom filters.

The `activeOrders` custom filter can be modified to filter orders based on the customer's city with a `$and` condition that uses `SqlDatabase.rawFilter` to include a raw SQL fragment in the filter. The SQL fragment selects the IDs of customers from the specified city and uses them to filter the orders.

#### Important Notes

- **Parameterized Queries:** It's crucial to use parameterized queries when incorporating user-supplied values into your SQL. This helps prevent SQL injection attacks by ensuring that user input is properly escaped.
- **Performance Considerations:** Leveraging raw SQL can lead to significant performance improvements, especially for complex queries. However, it's important to ensure that your SQL queries are well-optimized to avoid potential performance issues.

#### Usage Example

Using the custom filter remains straightforward: callers invoke it exactly as before, unaware that raw SQL is used internally.

### Using `dbNamesOf` with Table Names and Aliases

The `dbNamesOf` utility function can be customized to include the table name in SQL queries. This is particularly useful for ensuring consistency between your entity definitions and your raw SQL queries. For example, the `activeOrders` custom filter can use `dbNamesOf` so that the `Order` table is referenced with its full name, while the `Customer` table is aliased as `"c"` and that alias is used in the SQL query.
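A sketch of the raw-SQL variant. The `Customer` entity is assumed, and the exact shape of the `rawFilter` builder callback (setting `sql` and calling `param` for parameterized values) may vary between Remult versions:

```typescript
import { Entity, Fields, Filter, SqlDatabase } from 'remult'
import { dbNamesOf } from 'remult'

@Entity('customers')
export class Customer {
  @Fields.uuid()
  id = ''
  @Fields.string()
  city = ''
}

@Entity('orders')
export class Order {
  @Fields.uuid()
  id = ''
  @Fields.string()
  customerId = ''
  @Fields.string()
  status = ''
  @Fields.date()
  createdAt = new Date()

  static activeOrders = Filter.createCustom<
    Order,
    { year: number; customerCity: string }
  >(async ({ year, customerCity }) => {
    // tableName: true yields qualified names like "orders"."customerId"
    const orders = await dbNamesOf(Order, { tableName: true })
    const customers = await dbNamesOf(Customer, { tableName: true })
    return {
      status: ['created', 'confirmed', 'pending'],
      createdAt: { $gte: new Date(year, 0, 1), $lt: new Date(year + 1, 0, 1) },
      $and: [
        SqlDatabase.rawFilter(async (fragment) => {
          // param() registers a parameterized value, preventing SQL injection
          fragment.sql = `${orders.customerId} in
            (select ${customers.id} from ${customers}
             where ${customers.city} = ${fragment.param(customerCity)})`
        }),
      ],
    }
  })
}
```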
### Explanation of `tableName` and Aliases

- **`tableName: true`:** Setting `tableName: true` indicates that you want to include the table name when referring to fields, resulting in SQL expressions like `"customer"."id"`.
- **Aliases:** You can use aliases for table names, which is particularly useful in complex join scenarios. For example, setting `tableName: "c"` would use the alias `"c"` for the table name in the SQL query.

### Resulting SQL Query

With the enhancements mentioned above, the `activeOrders` custom filter generates a SQL query in which the `Customer` table is aliased as `"c"`, and this alias is used throughout the query to ensure consistency with the entity definitions and to handle complex join scenarios effectively.

### SQL-Based Custom Filters: Unleashing the Power of Composability

The greatest advantage of using SQL-based custom filters lies in their composability and their ability to handle complex situations. By breaking filtering logic down into smaller, atomic custom filters, developers can compose these filters to create more sophisticated and nuanced filtering criteria. This modular approach not only enhances the readability and maintainability of the code but also allows for greater flexibility in constructing complex queries.

For instance, consider a scenario where you need to filter orders based on multiple criteria, such as status, creation year, customer location, and order value. By creating separate custom filters for each of these criteria, you can easily combine them to form a comprehensive filtering solution. This composability ensures that your filtering logic can adapt to various requirements without becoming convoluted or difficult to manage. Furthermore, the ability to handle complex situations is a significant advantage of SQL-based custom filters.
By leveraging the raw power of SQL, you can perform advanced operations such as subqueries, joins, and aggregate functions directly within your filters. This opens up a wide range of possibilities for data analysis and manipulation, enabling you to tackle complex filtering scenarios with ease.

SQL is also a language that is widely recognized and understood by AI assistants such as ChatGPT, Copilot, and others, which makes it possible to generate highly optimized queries with ease. These tools can assist in writing SQL queries, ensuring they are efficient and effective. This is particularly beneficial when dealing with complex data structures and large datasets, where writing optimal queries can be challenging. With the assistance of AI, developers can focus more on the logic of their applications, while the AI handles the intricacies of SQL query optimization.

In summary, the composability of SQL-based custom filters, coupled with their ability to handle complex situations, makes them an invaluable tool for developers seeking to create flexible, efficient, and powerful data filtering solutions in their web applications.

### Using Raw Filters with Different Data Providers

Custom filters with raw filters are not limited to SQL databases. You can also use raw filters with other data providers, such as Knex or an in-memory JSON data provider. This flexibility allows you to leverage the power of raw filters in various contexts, depending on your application's needs.

#### Knex Example

Knex is a popular SQL query builder for Node.js. You can use Knex with custom filters to define complex filtering logic directly in the Knex query-builder syntax: for example, an `idBetween` custom filter that uses Knex to filter `Task` entities whose `id` falls between specified `from` and `to` values.

#### JSON Example

For applications that use an in-memory JSON data provider, you can define custom filters that operate directly on the JSON data.
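A sketch of an in-memory raw filter; the `Task` fields are illustrative, and the predicate form of `ArrayEntityDataProvider.rawFilter` shown here is an assumption that may differ between Remult versions:

```typescript
import { Entity, Fields, Filter } from 'remult'
import { ArrayEntityDataProvider } from 'remult' // assumed export location

@Entity('tasks')
export class Task {
  @Fields.uuid()
  id = ''
  @Fields.string()
  title = ''

  // Keeps only tasks whose title is longer than minLength,
  // evaluated in-memory by the JSON/array data provider:
  static titleLengthFilter = Filter.createCustom<Task, { minLength: number }>(
    ({ minLength }) =>
      ArrayEntityDataProvider.rawFilter(
        (item) => item.title.length > minLength,
      ),
  )
}
```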
In this example, the `titleLengthFilter` custom filter filters `Task` entities based on the length of their `title` property, ensuring that it exceeds the specified `minLength`.

## Conclusion

Custom filters represent a powerful tool in the arsenal of web developers, offering a flexible and efficient way to handle data filtering in web applications. By encapsulating filtering logic into reusable components, custom filters not only enhance code readability and maintainability but also enable the execution of complex filtering operations on the backend. This leads to improved performance and security, as well as the ability to compose intricate filtering criteria with ease.

The versatility of custom filters extends to their compatibility with various data providers, from SQL databases to in-memory JSON arrays, allowing developers to leverage the most suitable data handling mechanisms for their specific use cases. Moreover, the declarative nature of custom filters ensures that the filtering logic remains clear and concise, facilitating easier debugging and future modifications.

In conclusion, adopting custom filters in your web development projects can significantly streamline the process of data filtering, resulting in cleaner, more efficient, and more secure code. By embracing this approach, developers can focus on delivering a seamless user experience, confident in the knowledge that their data filtering logic is both robust and adaptable.

---
tags:
  - options
  - bespoke options
  - customizing options
  - type augmentation
  - module augmentation
  - UserInfo
  - RemultContext
  - context
---

# Extensibility

Module augmentation in TypeScript allows you to extend existing types with custom properties or methods. This enhances the functionality of third-party libraries like `remult` without altering their source code, enabling seamless integration of custom features while maintaining type safety.

In Remult, you can use TypeScript's module augmentation to enhance your application with custom features.
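The mechanism behind this is TypeScript declaration merging. The sketch below shows it within a single file for illustration; in a real project the second `UserInfo` declaration would live in a `types.d.ts` file inside a `declare module 'remult'` block, and the `email`/`phone` fields are examples:

```typescript
// Single-file illustration of interface merging. In practice the
// augmenting declaration goes in types.d.ts via module augmentation.
interface UserInfo {
  id: string
  name?: string
}
// A second declaration with the same name merges with the first,
// adding the extra fields:
interface UserInfo {
  email?: string
  phone?: string
}

const user: UserInfo = { id: '1', name: 'Jane', email: 'jane@example.com' }
console.log(user.email)
```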
Here are some examples:

1. **Add more fields to the User object:** Extend the `UserInfo` interface to include additional fields like `email` and `phone`.
2. **Add custom options/metadata to fields and entities:** Extend the `FieldOptions` or `EntityOptions` interfaces to include custom properties such as `placeholderText` or `helpText`.
3. **Add fields/methods to the `remult.context` object:** Extend the `RemultContext` interface to include additional properties or methods that can be accessed throughout your code.

## Setting Up the types.d.ts File for Custom Type Extensions

To set up the `types.d.ts` file for custom type extensions in Remult:

1. **Create a TypeScript Declaration File:** Add a file named `types.d.ts` in the `src` folder of your project. This file will be used to declare custom type extensions, such as additional user info fields. The `export {}` is required to indicate that this file is a module, as per the TypeScript documentation.
2. **Include the Declaration File in tsconfig:** Make sure that the `types.d.ts` file is included in the `include` section of your `tsconfig.json` file. If you have a separate `tsconfig` for the server, ensure that it's also added there.
3. **Utilize the Custom Fields in Your Code:** Once you've defined custom fields in the `types.d.ts` file and ensured they're included in your `tsconfig.json`, you can start using them throughout your application. For instance, if you've added `phone` and `email` to the `UserInfo` interface, you can access these properties in your code as follows:

This enables you to seamlessly integrate the new fields into your application's logic and user interface.

## Enhancing Field and Entity Definitions with Custom Options

One of the key motivations for adding custom options to `FieldOptions` or `EntityOptions` is to maintain consistency and centralize the definition of entities and fields in your application.
By keeping these definitions close to the entity or field, you ensure a single source of truth for your application's data model. This approach enhances maintainability and readability, as all relevant information and metadata about an entity or field are located in one place. Additionally, it allows for easier integration with UI components, as custom options like `placeholderText` can be directly accessed and used in your frontend code.

To add custom options such as `placeholderText` to `FieldOptions` or `EntityOptions`:

1. **Extend FieldOptions:** In your `types.d.ts` file, extend the `FieldOptions` interface to include your custom options. For example:
2. **Set Custom Option:** Specify the `placeholderText` in your entity field options:
3. **Use in UI:** Access the custom option in your UI components:

By following these steps, you can extend `FieldOptions` with custom options that can be utilized throughout your project.

### Extending Remult's `context` Property for Request-Specific Information

Augmenting Remult's `context` property is particularly useful because it allows you to store and access request-specific information throughout your code. This can be especially handy for including data from the request and utilizing it in entities or backend methods.

For example, you can add a custom property `origin` to the `RemultContext` interface:

Then, set the `origin` property in the `initRequest` option in the `api.ts` file:

You can now use the `origin` property anywhere in your code, for example:

or in an entity's saving event:

By leveraging module augmentation, you can tailor Remult to your specific needs, adding custom options and extending interfaces to suit your application's requirements.

# Generate Entities from Existing Database

## Remult kit

Want to use Remult for full-stack CRUD with your existing database?
Check out this video to see how to connect http://remult.dev to your existing database and start building type-safe #fullstack apps with any #typescript frontend, backend, and any DB! Watch now 👉 https://youtu.be/5QCzJEO-qQ0

# Entity Instance Backend Methods

When leveraging the Active Record pattern, backend methods for entity instances offer a powerful way to integrate client-side behavior with server-side logic. These methods, when invoked, transport the entire entity's state from the client to the server and vice versa, even if the data has not yet been saved. This feature is particularly useful for executing entity-specific operations that require a round trip to the server to maintain consistency and integrity.

## Overview of Entity Backend Methods

Entity backend methods send all the fields of an entity, including unsaved values, to and from the server during the method's execution. This approach is essential for operations that rely on the most current state of an entity, whether or not the changes have been persisted to the database.

### Defining a Backend Method

To define a backend method, use the `@BackendMethod` decorator to annotate methods within an entity class. This decorator ensures that the method is executed on the server, taking advantage of server-side resources and permissions.

Here is an example demonstrating how to define and use a backend method in an entity class:

### Calling the Backend Method from the Frontend

Once the backend method is defined, it can be called from the client-side code. This process typically involves fetching an entity instance and then invoking the backend method as shown below:

### Security Considerations

::: danger
It's important to note that backend methods bypass certain API restrictions that might be set on the entity, such as `allowApiUpdate=false`.
This means that even if an entity is configured not to allow updates through standard API operations, it can still be modified through backend methods if they are permitted by their `allowed` setting. Consequently, developers must explicitly handle security and validation within these methods to prevent unauthorized actions. The principle here is that if a user has permission to execute the `BackendMethod`, then all operations within that method are considered authorized. It is up to the developer to implement any necessary restrictions within the method itself.
:::

---
outline:
---

# Relations Between Entities

::: tip
**Interactive Learning Available! 🚀**

Looking to get hands-on with this topic? Try out our new interactive tutorial on Relations, where you can explore and practice directly in the browser. This guided experience offers step-by-step lessons to help you master relations in Remult with practical examples and exercises.
:::

### Understanding Entity Relations in Remult

In Remult, entity relations play a key role in modeling and navigating the complex relationships that exist within your data. To illustrate this concept, we will use two primary entities: `Customer` and `Order`. These entities will serve as the foundation for discussing various types of relations and how to define and work with them.

To experiment with these entities online, you can access the following CodeSandbox link, which is preconfigured with these two entities and a Postgres database:

Feel free to explore and experiment with the provided entities and their relations in the CodeSandbox environment.

#### Customer Entity

The `Customer` entity represents individuals or organizations with attributes such as an ID, name, and city. Each customer can be uniquely identified by their `id`.

#### Order Entity

The `Order` entity represents transactions or purchases made by customers.
Each order is associated with a `customer`, representing the customer who placed the order, and has an `amount` attribute indicating the total purchase amount.

Throughout the following discussion, we will explore how to define and use relations between these entities, enabling you to create sophisticated data models and efficiently query and manipulate data using Remult. Whether you are dealing with one-to-one, one-to-many, or many-to-many relationships, understanding entity relations is essential for building robust and feature-rich applications with Remult.

## Simple Many-to-One

In Remult, many-to-one relations allow you to establish connections between entities, where multiple records of one entity are associated with a single record in another entity. Let's delve into a common use case of a many-to-one relation, specifically the relationship between the `Order` and `Customer` entities.

### Defining the Relation

To establish a many-to-one relation from the `Order` entity to the `Customer` entity, you can use the `@Relations.toOne` decorator in your entity definition:

In this example, each `Order` is associated with a single `Customer`. The `customer` property in the `Order` entity represents this relationship.

### Fetching Relational Data

When querying data that involves a many-to-one relation, you can use the `include` option to specify which related entity you want to include in the result set. In this case, we want to include the associated `Customer` when querying `Order` records.

Here's how you can include the relation in a query using Remult:

#### Resulting Data Structure

The result of the query will contain the related `Customer` information within each `Order` record, creating a nested structure. Here's an example result of running `JSON.stringify` on the `orders` array:

As shown in the result, each `Order` object contains a nested `customer` object, which holds the details of the associated customer, including their `id`, `name`, and `city`.
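As an illustration (all ids, names, and amounts below are made up), the nested structure looks roughly like this:

```typescript
// Illustrative result shape: each order carries its related
// customer as a nested object.
const orders = [
  {
    id: 1,
    amount: 150,
    customer: { id: 'c1', name: 'Abshire Inc', city: 'London' },
  },
  {
    id: 2,
    amount: 70,
    customer: { id: 'c2', name: 'Fay Ltd', city: 'Paris' },
  },
]
console.log(orders[0].customer.city) // London
```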
This structured data allows you to work seamlessly with the many-to-one relationship between `Order` and `Customer` entities.

### Querying a Single Item

To retrieve a single `Order` item along with its associated `Customer`, you can use the `findFirst` method provided by your repository. Here's an example of how to perform this query:

### Relation Loading

In Remult, by default, a relation is not loaded unless explicitly specified in the `include` statement of a query. This behavior ensures that you only load the related data you require for a specific task, optimizing performance and minimizing unnecessary data retrieval. Here's an example:

In the above query, the `customer` relation will not be loaded and will have the value `undefined`, because it is not specified in the `include` statement.

#### Overriding Default Behavior with `defaultIncluded`

Sometimes, you may have scenarios where you want a relation to be included by default in most queries, but you also want the flexibility to exclude it in specific cases. Remult allows you to control this behavior by using the `defaultIncluded` setting in the relation definition.

In this example, we set `defaultIncluded` to `true` for the `customer` relation in the `Order` entity. This means that, by default, the `customer` relation will be loaded in most queries unless explicitly excluded.

#### Example: Excluding the `customer` Relation in a Specific Query

In this query, we override the default behavior by explicitly setting `customer: false` in the `include` statement. This instructs Remult not to load the `customer` relation for this specific query, even though it is set to be included by default.

By combining the default behavior with the ability to override it in specific queries, Remult provides you with fine-grained control over relation loading, ensuring that you can optimize data retrieval based on your application's requirements and performance considerations.
## Advanced Many-to-One

In certain scenarios, you may require more granular control over the behavior of relations and want to access specific related data without loading the entire related entity. Remult provides advanced configuration options to meet these requirements. Let's explore how to achieve this level of control through advanced relation configurations.

### Custom Relation Field

In Remult, you can define custom relation fields that allow you to access the `id` without loading the entire related entity. To define a custom relation field, follow these steps:

#### Step 1: Define a Custom Field in the Entity

In your entity definition, define a custom field that will hold the identifier or key of the related entity. This field serves as a reference to the related entity without loading the entity itself.

In this example, we define a custom field called `customerId`, which stores the identifier of the related `Customer` entity.

#### Step 2: Define the Relation Using `toOne`

Use the `@Relations.toOne` decorator to define the relation, specifying the types for the `fromEntity` and `toEntity` in the generic parameters. Additionally, provide the name of the custom field as the third argument.

This configuration establishes a relation between `Order` and `Customer` using the `customerId` field as the reference.

#### Migrating from a Simple `toOne` Relation to a Custom Field Relation with Existing Data

When transitioning from a simple `toOne` relation to a custom field relation in Remult and you already have existing data, it's important to ensure a smooth migration. In this scenario, you need to make sure that the newly introduced custom field can access the existing data in your database. This is accomplished using the `dbName` option. Here's how to perform this migration:

##### 1. Understand the Existing Data Structure

Before making any changes, it's crucial to understand the structure of your existing data.
In the case of a simple `toOne` relation, there may be rows in your database where a field holds the identifier of the related entity.

##### 2. Define the Custom Field with `dbName`

When defining the custom field in your entity, use the `dbName` option to specify the name of the database column where the related entity's identifier is stored. This ensures that the custom field correctly accesses the existing data in your database.

In this example, we use the `dbName` option to specify that the `customerId` field corresponds to the `customer` column in the database. This mapping ensures that the custom field can access the existing data that uses the `customer` column for the related entity's identifier.

#### Using the `field` Option for Custom Relation Configuration

When you require additional customization for a relation field in Remult, you can utilize the `field` option to specify additional options for the related field.

In this example, we use the `field` option to define a custom relation between the `Order` and `Customer` entities. Here are some key points to understand about using the `field` option:

1. **Custom Relation Field**: The `field` option allows you to specify a custom field name that represents the relationship between entities. This field can be used to access related data without loading the entire related entity.
2. **Additional Configuration**: In addition to specifying the `field`, you can include other options as well. In this example, we set the `caption` option to provide a descriptive caption for the relation field.

Using the `field` option provides you with granular control over how the relation field is configured and accessed. You can customize various aspects of the relation to meet your specific requirements, enhance documentation, and improve the overall usability of your codebase.
### Relation Based on Multiple Fields

In some scenarios, establishing a relation between entities requires considering multiple fields to ensure the correct association. Remult provides the flexibility to define relations based on multiple fields using the `fields` option. Here's how to create a relation based on multiple fields in Remult:

#### Defining Entities

Let's consider a scenario where both `Order` and `Customer` entities belong to specific branches, so the `branchId` fields must also match to ensure the correct association. First, define your entities with the relevant fields:

In this example, we have two entities: `Customer` and `Order`. Both entities have a `branchId` field that represents the branch they belong to. To create a relation based on these fields, we specify the `fields` option in the relation configuration.

#### Using the `fields` Option

In the `@Relations.toOne` decorator, use the `fields` option to specify the mapping between fields in the related entity and your entity. Each entry in the `fields` object corresponds to a field in the related entity and maps it to a field in your entity.

In this configuration:

- `branchId` from the `Customer` entity is mapped to `branchId` in the `Order` entity.
- `id` from the `Customer` entity is mapped to `customerId` in the `Order` entity.

This ensures that the relation between `Order` and `Customer` is based on both the `branchId` and `customerId` fields, providing a comprehensive association between the entities.

By utilizing the `fields` option, you can create relations that consider multiple fields, ensuring accurate and meaningful associations between your entities in Remult.

## One-to-Many

In Remult, you can easily define a `toMany` relation to retrieve multiple related records. Let's consider a scenario where you want to retrieve a list of orders for each customer.
We'll start with the basic `toOne` relation example and then add a `toMany` relation to achieve this:

#### Basic `toOne` Relation Example

First, let's define the `Customer` and `Order` entities with a basic `toOne` relation:

In this initial setup:

- The `Order` entity has a property `customer`, which is decorated with `@Relations.toOne(() => Customer)`. This establishes a relation between an order and its associated customer.

### Adding a `toMany` Relation

Now, let's enhance this setup to include a `toMany` relation that allows you to retrieve a customer's orders:

In this updated configuration:

- The `Customer` entity has a property `orders`, which is decorated with `@Relations.toMany(() => Order)`. This indicates that a customer can have multiple orders.

With this setup, you can use the `orders` property of a `Customer` entity to retrieve all the orders associated with that customer. This provides a convenient way to access and work with a customer's orders. By defining a `toMany` relation, you can easily retrieve and manage multiple related records, such as a customer's orders.

### Fetching Relational Data

To retrieve customers along with their associated orders in Remult, you can use the `include` option in your query. Let's see how to fetch customers with their orders using the `include` option:

In this code snippet:

- We first obtain a repository for the `Customer` entity using `repo`.
- Next, we use the `find` method to query the `Customer` entity. Within the query options, we specify the `include` option to indicate that we want to include related records.
- Inside the `include` option, we specify `orders: true`, indicating that we want to fetch the associated orders for each customer.

As a result, the `customers` variable will contain an array of customer records, with each customer's associated orders included. This allows you to easily access and work with both customer and order data.
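Approximated in plain TypeScript (a self-contained sketch over in-memory arrays with made-up data, not an actual remult query), `include: { orders: true }` produces something like:

```typescript
// Hypothetical flat records standing in for database rows
interface Customer {
  id: string
  name: string
}
interface Order {
  id: number
  customerId: string
  amount: number
}

const customers: Customer[] = [{ id: 'c1', name: 'Abshire Inc' }]
const orders: Order[] = [
  { id: 1, customerId: 'c1', amount: 15 },
  { id: 2, customerId: 'c1', amount: 7 },
]

// Attach each customer's related orders, as `include` would
const withOrders = customers.map((c) => ({
  ...c,
  orders: orders.filter((o) => o.customerId === c.id),
}))
console.log(withOrders[0].orders.length) // 2
```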
#### Resulting Data Structure

When you fetch customers along with their associated orders using the `include` option in Remult, the result will be an array that includes both customer and order data. Here's an example result of running `JSON.stringify` on the `customers` array:

In this example:

- Each customer is represented as an object with properties such as `id`, `name`, and `city`.
- The `orders` property within each customer object contains an array of associated order records.
- Each order record within the `orders` array includes properties like `id` and `amount`.

This structured result allows you to easily navigate and manipulate the data. You can access customer information as well as the details of their associated orders, making it convenient to work with related records in your application's logic and UI.

### Specifying Reference Fields

In Remult, you can specify a field or fields for `toMany` relations to have more control over how related records are retrieved. This can be useful when you want to customize the behavior of the relation. Here's how you can specify a field or fields for `toMany` relations:

#### Specifying a Single Field

To specify a single field for a `toMany` relation, you can use the `field` option. This option allows you to define the field in your entity that establishes the relation. For example:

In this case, the `field` option is set to `"customer"`, indicating that the `customer` field in the `Order` entity establishes the relation between customers and their orders.

#### Specifying Multiple Fields

In some cases, you may need to specify multiple fields to establish a `toMany` relation. To do this, you can use the `fields` option, which allows you to define a mapping of fields between entities.
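Conceptually, such a mapping pairs up fields on both sides: a row matches only when every mapped field lines up. As plain matching logic (a self-contained sketch with hypothetical entities, not remult's query engine):

```typescript
// Hypothetical two-field association: customerId AND branchId must match
interface Customer {
  id: string
  branchId: number
}
interface Order {
  id: number
  customerId: string
  branchId: number
}

const customerOf = (order: Order, customers: Customer[]) =>
  customers.find(
    (c) => c.id === order.customerId && c.branchId === order.branchId,
  )

const customers: Customer[] = [
  { id: 'c1', branchId: 1 },
  { id: 'c1', branchId: 2 }, // same id, different branch
]
const order: Order = { id: 10, customerId: 'c1', branchId: 2 }
console.log(customerOf(order, customers)?.branchId) // 2
```

Matching on the id alone would be ambiguous here; the second field disambiguates the association.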
Here's an example:

In this example, the `fields` option is used to specify that the `branchId` field in the `Order` entity corresponds to the `branchId` field in the `Customer` entity, and the `customerId` field in the `Order` entity corresponds to the `id` field in the `Customer` entity. By specifying fields in this manner, you have fine-grained control over how the relation is established and how related records are retrieved. This allows you to tailor the behavior of `toMany` relations to your specific use case and data model.

### Customizing a `toMany` Relation

In Remult, you can exercise precise control over a `toMany` relation by utilizing the `findOptions` option. This option allows you to define specific criteria and behaviors for retrieving related records. Here's how you can use `findOptions` to fine-tune a `toMany` relation:

In this example, we've specified the following `findOptions`:

- `limit: 5`: Limits the number of related records to 5. Only the first 5 related records will be included.
- `orderBy: { amount: "desc" }`: Orders the related records by the `amount` field in descending order. This means that records with higher `amount` values will appear first in the result.
- `where: { amount: { $gt: 10 } }`: Applies a filter to include only related records where the `amount` is greater than 10. This filters out records with an `amount` of 10 or lower.

By using `findOptions` in this manner, you gain precise control over how related records are retrieved and included in your query results. This flexibility allows you to tailor the behavior of the `toMany` relation to suit your specific application requirements and use cases.

#### Fine-Tuning a `toMany` Relation with `include`

In Remult, you can exercise even more control over a `toMany` relation by using the `include` option within your queries. This option allows you to further customize the behavior of the relation for a specific query.
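For intuition, the combined effect of `where`, `orderBy`, and `limit` (including the "and" combination of an include-level `where` with the relation's `findOptions` `where`) can be approximated over a plain array. This is an illustrative sketch with made-up values; remult evaluates these options in the database:

```typescript
interface Order {
  id: number
  amount: number
  completed: boolean
}

const orders: Order[] = [
  { id: 1, amount: 5, completed: true },
  { id: 2, amount: 30, completed: true },
  { id: 3, amount: 12, completed: false },
  { id: 4, amount: 50, completed: true },
]

// findOptions where ({ amount: { $gt: 10 } }) and an include-level
// where ({ completed: true }) combine with an "and" relationship:
const result = orders
  .filter((o) => o.amount > 10 && o.completed) // where (and-combined)
  .sort((a, b) => b.amount - a.amount) // orderBy: { amount: 'desc' }
  .slice(0, 2) // limit: 2 (illustrative)

console.log(result.map((o) => o.id)) // ids: [4, 2]
```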
Here's how you can use `include` to fine-tune a `toMany` relation:

In this code snippet:

- We use the `include` option within our query to specify that we want to include the related `orders` for each customer.
- Inside the `include` block, we can provide additional options to control the behavior of this specific inclusion. For example:
  - `limit: 10` limits the number of related orders to 10 per customer. This will override the `limit` set in the original relation.
  - `where: { completed: true }` filters the included orders to only include those that have been marked as completed.

The `where` option specified within `include` will be combined with the `where` conditions defined in the `findOptions` of the relation using an "and" relationship. This means that both sets of conditions must be satisfied for related records to be included.

Using `include` in this way allows you to fine-tune the behavior of your `toMany` relation to meet the specific requirements of each query, making Remult a powerful tool for building flexible and customized data retrieval logic in your application.

## Repository `relations`

In Remult, managing relationships between entities is a crucial aspect of working with your data. When dealing with a `toMany` relationship, Remult provides you with powerful tools through the repository's `relations` property to handle related rows efficiently, whether you want to retrieve them or insert new related records.

### Inserting Related Records

Consider a scenario where you have a `Customer` entity with a `toMany` relationship to `Order` entities. You can create a new customer and insert related orders in a straightforward manner:

In this example, you first create a new `Customer` entity with the name "Abshire Inc." Then, using the `relations` method, you access the related `orders`. By calling the `insert` method on the `orders` relation, you can add new order records.
Remult automatically sets the `customer` field for these orders based on the specific customer associated with the `relations` call.

### Loading Unfetched Relations

Another powerful use of the repository's `relations` property is to load related records that were not initially retrieved. Let's say you have found a specific customer and want to access their related orders:

Here, you first search for a customer with the name "Abshire Inc." After locating the customer, you can use the `relations` method again to access their related orders. By calling the `find` method on the `orders` relation, you retrieve all related order records associated with the customer.

#### Contextual Repository: Tailored Operations for Related Data

The `relations` method serves as a specialized repository, tightly associated with the particular customer you supply to it. This dedicated repository offers a tailored context for performing operations related to the specific customer's connection to orders. It enables you to seamlessly find related records, insert new ones, calculate counts, and perform other relevant actions within the precise scope of that customer's relationship with orders. This versatile capability streamlines the management of intricate relationships in your application, ensuring your data interactions remain organized and efficient.

Remult's repository methods empower you to seamlessly manage and interact with related data, making it easier to work with complex data structures and relationships in your applications. Whether you need to insert related records or load unfetched relations, these tools provide the flexibility and control you need to handle your data efficiently.
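The "contextual repository" idea can be sketched as a tiny stand-alone helper: scoped inserts fill in the relation field automatically, and scoped finds filter by it. This is illustrative only, not the remult API:

```typescript
interface Order {
  id: number
  customerId: string
  amount: number
}

const orders: Order[] = []
let nextId = 1

// A minimal stand-in for a relations repository scoped to one customer
const relationsOf = (customerId: string) => ({
  orders: {
    insert(items: { amount: number }[]) {
      // the relation field is set automatically from the scope
      for (const item of items)
        orders.push({ id: nextId++, customerId, amount: item.amount })
    },
    find(): Order[] {
      return orders.filter((o) => o.customerId === customerId)
    },
  },
})

const abshire = relationsOf('c1')
abshire.orders.insert([{ amount: 10 }, { amount: 45 }])
console.log(abshire.orders.find().length) // 2
```

The caller never mentions `customerId` when inserting or finding; the scope supplies it, which is exactly the convenience the contextual repository provides.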
### Fetching Unloaded `toOne` Relations with `findOne`

In addition to loading unfetched `toMany` relations, Remult offers a convenient way to retrieve `toOne` relations that were not initially loaded. This capability is especially useful when dealing with many-to-one relationships.

Consider the following example, where we have a many-to-one relation between orders and customers. We want to fetch the customer related to a specific order, even if we didn't load it initially:

In this code snippet:

1. We first obtain the order using the `findFirst` function, providing the order's unique identifier.
2. Next, we use the `relations` method to access the repository's relations and then chain the `customer` relation using dot notation.
3. Finally, we call `findOne` on the `customer` relation to efficiently retrieve the related customer information.

This approach allows you to access and load related data on-demand, providing flexibility and control over your data retrieval process. Whether you're working with loaded or unloaded relations, Remult's intuitive functions give you the power to seamlessly access the data you need.

---

### Accessing Relations with `activeRecord`

If you're following the `activeRecord` pattern and your entity inherits from `EntityBase` or `IdEntity`, you can access relations directly from the entity instance. This approach offers a convenient and straightforward way to work with relations.

#### Inserting Related Records

You can insert related records directly from the entity instance.
For example, consider a scenario where you have a `Customer` entity and a `toMany` relation with `Order` entities. Here's how you can insert related orders for a specific customer:

In this code:

- We create a new `Customer` instance using `customerRepo.insert` and set its properties.
- Using `customer._.relations.orders`, we access the `orders` relation of the customer.
- We insert two orders related to the customer by calling `.insert` on the `orders` relation.

#### Retrieving Related Records

Fetching related records is just as straightforward. Let's say you want to find a customer by name and then retrieve their related orders:

In this code:

- We search for a customer with the specified name using `customerRepo.findFirst`.
- Once we have the customer instance, we access their `orders` relation with `customer._.relations.orders`.
- We use `.find` to retrieve all related orders associated with the customer.

Using the `activeRecord` pattern and direct access to relations simplifies the management of related data, making it more intuitive and efficient.

## Many-to-Many

In Remult, you can effectively handle many-to-many relationships between entities by using an intermediate table. This approach is especially useful when you need to associate multiple instances of one entity with multiple instances of another entity. In this section, we'll walk through the process of defining and working with many-to-many relationships using this intermediate table concept.

#### Entity Definitions

To illustrate this concept, let's consider two entities: `Customer` and `Tag`. In this scenario, multiple customers can be associated with multiple tags.

### Intermediate Table

To establish this relationship, we'll create an intermediate table called `tagsToCustomers`. In this table, both `customerId` and `tagId` fields are combined as the primary key.
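As plain data, the intermediate table is just rows of id pairs, and resolving a customer's tags is a two-step lookup (a self-contained sketch with made-up ids, not the remult entity definitions):

```typescript
interface Tag {
  id: string
  name: string
}
interface TagsToCustomers {
  customerId: string
  tagId: string
}

const tags: Tag[] = [
  { id: 't1', name: 'vip' },
  { id: 't2', name: 'influencer' },
]
// Each row links one customer to one tag; (customerId, tagId) is the key
const links: TagsToCustomers[] = [
  { customerId: 'c1', tagId: 't1' },
  { customerId: 'c1', tagId: 't2' },
]

// Step 1: find the link rows for the customer; step 2: resolve each tag
const tagsOf = (customerId: string): Tag[] =>
  links
    .filter((l) => l.customerId === customerId)
    .map((l) => tags.find((t) => t.id === l.tagId)!)

console.log(tagsOf('c1').map((t) => t.name)) // ['vip', 'influencer']
```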
- To uniquely identify associations between customers and tags in a many-to-many relationship, we use the combined `customerId` and `tagId` fields as the primary key, specified using the 'id' option in the `@Entity` decorator. - In this scenario, we've defined a `toOne` relation to the `Tag` entity within the `TagsToCustomers` entity to efficiently retrieve tags associated with a specific customer. This approach simplifies the management of many-to-many relationships while ensuring unique identification of each association. Now, let's enhance our customer entity with a `toMany` relationship, enabling us to fetch all of its associated tags effortlessly. ### Working with Many-to-Many Relationships Let's explore how to interact with many-to-many relationships using an intermediate table in Remult. #### 1. Adding Tags to a Customer: To associate a tag with a customer, consider the following code: Here's an explanation of what's happening in this code: 1. We first insert some tags into the "tags" entity. 2. We then create a repository instance for the "customer" entity using `repo`. 3. We retrieve a specific customer by searching for one with the name "Abshire Inc" using `customerRepo.findFirst`. The `customer` variable now holds the customer entity. 4. To associate tags with the customer, we use the `relations` method provided by the repository. This method allows us to work with the customer's related entities, in this case, the "tags" relation to the TagsToCustomers entity. 5. Finally, we call the `insert` method on the "tags" relationship and provide an array of tag objects to insert. In this example, we associate the customer with the "vip" tag and the "influencer" tag by specifying the tags' indices in the `tags` array. **2. Retrieving Tags for a Customer:** To fetch the tags associated with a specific customer: In this code, we're querying the "customer" entity to find a customer named "Abshire Inc."
We're also including the related "tags" for that customer, along with the details of each tag. This allows us to fetch both customer and tag data in a single query, making it more efficient when working with related entities. ### Resulting Data Structure Here's an example result of running `JSON.stringify` on the `customer` object: Utilizing an intermediate table for managing many-to-many relationships in Remult allows for a flexible and efficient approach to handle complex data associations. Whether you are connecting customers with tags or other entities, this method provides a powerful way to maintain data integrity and perform queries effectively within your application. --- In this guide, we've explored the essential concepts of managing entity relations within the Remult library. From one-to-one to many-to-many relationships, we've covered the declaration, customization, and querying of these relations. By understanding the nuances of entity relations, users can harness the full potential of Remult to build robust TypeScript applications with ease.

---
tags:
  - Where
  - Filter
  - Entity Where
  - Entity Filter
---

# EntityFilter Used to filter the desired result set ### Basic example This will include only items where the status is equal to 1. ### In Statement ### Not Equal ### Not in ### Comparison operators ### Contains ### Not Contains ### Starts With ### Ends With ### Id Equal ### Multiple conditions have an `and` relationship ### $and ### $or ### $not # Example Apps We already have a _ton_ of examples! Pick and choose the one that fits your needs 😊 ## Todo MVC ## CRM Demo A fully featured CRM! Make sure to check out the link: Dev / Admin on top right!
## Shadcn React Table Using remult with server side sorting, filtering, paging & CRUD ## TanStack React Table Example of using remult with react table - most basic design, with server side sorting, paging & filtering ## 🚀 Ready to play An environment to reproduce issues using stackblitz, with optional sqlite database ## Group by Example An example of the usage of `groupBy` ## Todo for most frameworks - - - - - - - - - - ## Other examples - - - - - # Field Types ## Common field types There are also several built-in Field decorators for common use cases: ### @Fields.string A field of type string ### @Fields.number Just like TypeScript, by default any number is a decimal. ### @Fields.integer For cases where you don't want to have decimal values, you can use the `@Fields.integer` decorator ### @Fields.boolean ### @Fields.date ### @Fields.dateOnly Just like TypeScript, by default any `Date` field includes the time as well. For cases where you only want a date, and don't want to meddle with time and time zone issues, use the `@Fields.dateOnly` ### @Fields.createdAt Automatically set on the backend on insert, and can't be set through the API ### @Fields.updatedAt Automatically set on the backend on update, and can't be set through the API ## JSON Field You can store JSON data and arrays in fields. ## Auto Generated Id Field Types ### @Fields.uuid This id value is determined on the backend on insert, and can't be updated through the API. ### @Fields.cuid This id value is determined on the backend on insert, and can't be updated through the API. Uses the package ### @Fields.autoIncrement This id value is determined by the underlying database on insert, and can't be updated through the API. ### MongoDB ObjectId Field To indicate that a field is of type object id, change its `fieldTypeInDb` to `dbid`. ## Enum Field Enum fields allow you to define a field that can only hold values from a specific enumeration.
The `@Fields.enum` decorator is used to specify that a field is an enum type. When using the `@Fields.enum` decorator, an automatic validation is added that checks if the value is valid in the specified enum. In this example, the `priority` field is defined as an enum type using the `@Fields.enum` decorator. The `Priority` enum is passed as an argument to the decorator, ensuring that only valid `Priority` enum values can be assigned to the `priority` field. The `Validators.enum` validation is used and ensures that any value assigned to this field must be a member of the `Priority` enum, providing type safety and preventing invalid values. ## Literal Fields Literal fields let you restrict a field to a specific set of string values using the `@Fields.literal` decorator. This is useful for fields with a finite set of possible values. In this example, we use the `as const` assertion to ensure that the array `` is treated as a readonly array, which allows TypeScript to infer the literal types 'open', 'closed', 'frozen', and 'in progress' for the elements of the array. This is important for the type safety of the `status` field. The `status` field is typed as `'open' | 'closed' | 'frozen' | 'in progress'`, which means it can only hold one of these string literals. The `@Fields.literal` decorator is used to specify that the `status` field can hold values from this set of strings, and it uses the `Validators.in` validator to ensure that the value of `status` matches one of the allowed values. For better reusability and maintainability, and to follow the DRY principle, it is recommended to refactor the literal type and the array of allowed values into separate declarations: In this refactored example, `statuses` is a readonly array of the allowed values, and `StatusType` is a type derived from the elements of `statuses`. The `@Fields.literal` decorator is then used with the `statuses` array, and the `status` field is typed as `StatusType`. 
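The refactored declarations described above can be sketched as follows (the entity wiring is shown as a comment, since it requires the `remult` package):

```typescript
// Allowed values declared once, as a readonly tuple
export const statuses = ['open', 'closed', 'frozen', 'in progress'] as const

// Literal union type derived from the array elements:
// 'open' | 'closed' | 'frozen' | 'in progress'
export type StatusType = (typeof statuses)[number]

// In the entity, the field would then be declared like this (sketch):
//
//   @Fields.literal(() => statuses)
//   status: StatusType = 'open'
```

Adding a new status now only requires editing the `statuses` array; the `StatusType` union and the validation follow automatically.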
This approach makes it easier to manage and update the allowed values for the `status` field, reducing duplication and making the code more robust and easier to maintain. ## ValueListFieldType ### Overview The `ValueListFieldType` is useful in cases where simple enums and unions are not enough, such as when you want to have more properties for each value. For example, consider representing countries where you want to have a country code, description, currency, and international phone prefix. ### Defining a ValueListFieldType Using enums or union types for this purpose can be challenging. Instead, you can use the `ValueListFieldType`: ### Using in an Entity In your entity, you can define the field as follows: ### Accessing Properties The property called `id` will be stored in the database and used through the API, while in the code itself, you can use each property: Note: Only the `id` property is saved in the database and used in the API. Other properties, such as `caption`, `currency`, and `phonePrefix`, are only accessible in the code and are not persisted in the database. ### Getting Optional Values To get the optional values for `Country`, you can use the `getValueList` function, which is useful for populating combo boxes: ### Special Properties: id and caption The `id` and `caption` properties are special in that the `id` will be used to save and load from the database, and the `caption` will be used as the display value. ### Automatic Generation of id and caption If `id` and/or `caption` are not provided, they are automatically generated based on the static member name. For example: In this case, the `open` member will have an `id` of `'open'` and a `caption` of `'Open'`, and similarly for the `closed` member. 
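As a sketch, the automatic generation of `id` and `caption` described above might look like this (assuming `ValueListFieldType` is imported from `remult`; exact behavior may vary by version):

```typescript
import { ValueListFieldType } from 'remult'

@ValueListFieldType()
export class TaskStatus {
  // id 'open' and caption 'Open' are derived from the member name
  static open = new TaskStatus()
  // id 'closed' and caption 'Closed' are derived the same way
  static closed = new TaskStatus()
  id!: string
  caption!: string
}
```

Only the `id` values (`'open'`, `'closed'`) travel through the API and the database; the captions live in code only.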
### Handling Partial Lists of Values In cases where you only want to generate members for a subset of values, you can use the `getValues` option of `@ValueListFieldType` to specify which values should be included: This approach is useful when you want to limit the options available for a field to a specific subset of values, without needing to define all possible values as static members. ::: warning Warning: TypeScript may throw an error similar to `Uncaught TypeError: Currency_1 is not a constructor`. This happens in TypeScript versions <5.1.6 and target es2022. It's a TypeScript bug. To fix it, upgrade to version >=5.1.6 or change the target from es2022. Alternatively, you can call the `ValueListFieldType` decorator as a function after the type: ::: ### Summary The `ValueListFieldType` enables the creation of more complex value lists that provide greater flexibility and functionality for your application's needs beyond what enums and unions can offer. By allowing for additional properties and partial lists of values, it offers a versatile solution for representing and managing data with multiple attributes. ## Control Field Type in Database In some cases, you may want to explicitly specify the type of a field in the database. This can be useful when you need to ensure a specific data type or precision for your field. To control the field type in the database, you can use the `fieldTypeInDb` option within the `valueConverter` property of a field decorator. For example, if you want to ensure that a numeric field is stored as a decimal with specific precision in the database, you can specify the `fieldTypeInDb` as follows: In this example, the `price` field will be stored as a `decimal` with 16 digits in total and 8 digits after the decimal point in the database. This allows you to control the storage format and precision of numeric fields in your database schema. 
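The `price` example described above might be sketched like this (the `fieldTypeInDb` string is a database-specific type declaration; the entity name and fields are illustrative):

```typescript
import { Entity, Fields } from 'remult'

@Entity('products', { allowApiCrud: true })
export class Product {
  @Fields.cuid()
  id = ''
  // Stored in the database as decimal(16,8): 16 digits total, 8 after the point
  @Fields.number({
    valueConverter: {
      fieldTypeInDb: 'decimal(16,8)',
    },
  })
  price = 0
}
```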
## Creating Custom Field Types Sometimes, you may need to create custom field types to handle specific requirements or use cases in your application. By creating custom field types, you can encapsulate the logic for generating, validating, and converting field values. ### Example: Creating a Custom ID Field Type with NanoID NanoID is a tiny, secure, URL-friendly, unique string ID generator. You can create a custom field type using NanoID to generate unique IDs for your entities. Here's an example of how to create a custom NanoID field type: In this example, the `NanoIdField` function creates a custom field type based on the `Fields.string` type. It uses the `nanoid` function to generate a unique ID as the default value and ensures that the ID is generated before saving the record if it hasn't been set yet. This custom field type can be used in your entities to automatically generate and assign unique IDs using NanoID. ## Customize DB Value Conversions Sometimes you want to control how data is saved to the db, or the dto object. You can do that using the `valueConverter` option. For example, the following code will save the `tags` as a comma separated string in the db. You can also refactor it to create your own FieldType And then use it: There are several ready made valueConverters included in the `remult` package, which can be found in `remult/valueConverters` ## Class Fields Sometimes you may want a field type to be a class, you can do that, you just need to provide an implementation for its transition from and to JSON. For example: Alternatively you can decorate the `Phone` class with the `FieldType` decorator, so that whenever you use it, its `valueConverter` will be used. # Filtering and Relations ::: tip **Interactive Learning Available! 🚀** Looking to get hands-on with this topic? Try out our new on Filtering relations, where you can explore and practice directly in the browser. 
This guided experience offers step-by-step lessons to help you master filtering in Remult with practical examples and exercises. ::: In this article, we'll discuss several relevant techniques for one-to-many relations. Consider the following scenario where we have a customer entity and an Orders entity. We'll use the following entities and data for this article. ::: tip Use Case in this article Let's say that we want to filter all the orders of customers who are in London. Let's have a look at the different options to achieve this. ::: ## Option 1 - Use In Statement Add the `where` inline to the `find` method. ## Option 2 - Use Custom Filter We can refactor this to a custom filter that will be easier to use and will run on the backend. And then we can use it: ## Option 3 - Custom Filter We can improve on the custom filter by using the database's in statement capabilities: We can also reuse the entity definitions by using `dbNamesOf` and `filterToRaw` ## Option 4 - sqlExpression field - This adds a calculated `city` field to the `Order` entity that we can use to order by or filter ::: details Side Note In this option, `city` is always calculated, and the `sqlExpression` is always executed. Not a big deal, but it's worth mentioning. ::: ## Option 5 - Dedicated entity Like this, in your code, you can use `OrderWithCity` or `Order` depending on your needs. ::: tip As `OrderWithCity` extends `Order`, everything in `Order` is also available in `OrderWithCity` 🎉. ::: **Remult** is a fullstack CRUD framework that uses your TypeScript model types to provide: - Secure REST API - Type-safe frontend API client - Type-safe backend query builder #### Use the same model classes for both frontend and backend code With Remult you can increase development speed and maintainability by defining a single TypeScript model class and sharing it between your frontend and backend code.
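For example, a single model class shared by both sides might look like this (a minimal sketch; the entity key and fields are illustrative):

```typescript
import { Entity, Fields } from 'remult'

@Entity('tasks', { allowApiCrud: true })
export class Task {
  @Fields.uuid()
  id = ''
  @Fields.string()
  title = ''
  @Fields.boolean()
  completed = false
}
```

The same class drives the REST API and query builder on the backend and the type-safe API client on the frontend.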
As Remult is "aware" of the runtime context, data validations and entity lifecycle hooks can be written in layer-agnostic TypeScript which will run, as needed, on either the frontend, the backend, or both. ## Choose Your Remult Learning Path Explore the flexibility of Remult through different learning paths tailored to match your style and project needs. ### `Option A`: Start with the Interactive Online Tutorial If you're new to Remult or prefer a guided, hands-on approach, we recommend starting with our . This tutorial will walk you through building a full-stack application step by step, providing immediate feedback and insights as you learn. ### `Option B`: Create a new Project ### `Option C`: Follow a Step-by-step Tutorial
### `Option D`: Quickstart Use this guide to quickly set up and try out Remult or add Remult to an existing app. ### `Option E`: Browse Example Apps ### `Option F`: Video Tutorials Check out these official . ### Better-sqlite3 To use **Better-sqlite3** as the database provider for your Remult application, follow these steps: ### Step 1: Install Better-sqlite3 Run the following command to install the `better-sqlite3` package: ### Step 2: Configure the `dataProvider` In your `api.ts` or server file, configure the `dataProvider` to connect to the SQLite database using **Better-sqlite3**: This setup connects to an SQLite database stored in the `mydb.sqlite` file. The `BetterSqlite3DataProvider` is wrapped inside the `SqlDatabase` class to allow Remult to interact with SQLite efficiently. ### Bun:SQLite ### Step 1: Configure the `dataProvider` In your `api.ts` or server file, configure the `dataProvider` to use `bun:sqlite` as follows: ### Explanation: - **bun:sqlite**: This uses Bun's native SQLite database, `bun:sqlite`, to manage SQLite databases efficiently in a Bun-based environment. - **BunSqliteDataProvider**: The `BunSqliteDataProvider` integrates the Bun SQLite database as a data provider for Remult. - **SqlDatabase**: Wraps the `BunSqliteDataProvider` to make it compatible with Remult's SQL-based data provider system. This setup allows you to use Bun's SQLite implementation as the database provider for your Remult application, leveraging Bun's performance benefits with SQLite. ## DuckDB To use DuckDB as the database provider in your Remult-based application, follow these steps: ### Step 1: Install DuckDB Run the following command to install `duckdb`: ### Step 2: Configure the `dataProvider` In your `index.ts`, configure the `dataProvider` to use DuckDB: ### Explanation: - **DuckDB setup**: The database is initialized with `new Database` to create an in-memory database.
Replace `':memory:'` with a file path if you want to persist the database to disk. - **SqlDatabase**: `SqlDatabase` is used to connect Remult with DuckDB through the `DuckDBDataProvider`. This setup allows you to use DuckDB as your database provider in a Remult project. # Choose a Database By default, if no database provider is specified, Remult will use a simple JSON file-based database. This will store your data in JSON files located in the `db` folder at the root of your project.
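For example, a server with no `dataProvider` at all falls back to the JSON-file database (a sketch assuming an Express server and a shared `Task` entity; the import path is illustrative):

```typescript
import express from 'express'
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task' // assumed path to a shared entity

const app = express()
// No dataProvider specified: data is stored as JSON files in ./db
app.use(remultExpress({ entities: [Task] }))
app.listen(3002)
```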
## JSON Files You can store data in JSON files using Remult. Here's how to configure your server: ### Step 1: Configure the `dataProvider` In your `index.ts`, configure the `dataProvider` to use JSON files as the storage mechanism: ### Explanation: - **`JsonDataProvider`**: This is the data provider that will store your data in JSON format. - **`JsonEntityFileStorage`**: Specifies the directory where the JSON files will be stored. - **`"./db"`**: The path where JSON files for entities will be created. The folder will be created automatically if it doesn't exist. This configuration allows you to store and manage your application data in JSON files, ideal for small projects or quick setups. ## MongoDB To use MongoDB as the database provider for your Remult application, follow the steps below. ### Step 1: Install MongoDB Driver Run the following command to install the `mongodb` package: ### Step 2: Set the `dataProvider` Property In your `api.ts` or server file, configure the `dataProvider` to connect to your MongoDB database: This setup connects to a MongoDB instance running on `localhost` and uses the `test` database. The `MongoDataProvider` manages the connection, allowing Remult to interact with MongoDB seamlessly. # Microsoft SQL Server ### Step 1: Install Required Packages Install `knex` and `tedious` to enable Microsoft SQL Server integration. ### Step 2: Configure the `dataProvider` In your `index.ts`, configure the `dataProvider` to use Microsoft SQL Server with the following `knex` client configuration: ### Step 3: Use an Existing `knex` Provider If you have an existing `knex` instance, you can easily integrate it with Remult like this: ### Explanation: - **`tedious`**: The underlying driver used by `knex` to connect to SQL Server. - **`client: "mssql"`**: Specifies that we are using Microsoft SQL Server. - **`createKnexDataProvider`**: Allows you to use `knex` to connect to SQL Server as the data provider for Remult.
- **`options`**: The additional configuration for SQL Server, including `enableArithAbort` and `encrypt`. This setup lets you easily connect Remult to Microsoft SQL Server using `knex` for query building and `tedious` as the driver. # MySQL ### Step 1: Install `knex` and `mysql2` Run the following command to install the required packages: ### Step 2: Set the `dataProvider` Property In your `api.ts` file, configure the `dataProvider` to connect to your MySQL database using `Knex`: ### Alternative: Use an Existing Knex Provider If you're already using a `knex` instance in your project, you can pass it directly to Remult: ## Oracle Database To use an Oracle database as the data provider for your Remult-based application, follow these steps: ### Step 1: Install Required Packages Install `knex` and `oracledb`: ### Step 2: Configure the `dataProvider` In your `index.ts`, configure the `dataProvider` to use Oracle through `knex`: ### Step 3: Using an Existing `knex` Provider If you're already using a `knex` instance, you can easily plug it into Remult: ### Explanation: - **Knex configuration**: `client: "oracledb"` indicates you're using Oracle, and `connection` contains the necessary credentials and connection string. - **Existing knex provider**: If you already have a `knex` instance, it can be reused directly with Remult. This setup integrates Oracle into your Remult-based application. # PostgreSQL To set up PostgreSQL as the database provider for your Remult application, you'll need to configure the `dataProvider` property in the `api.ts` file.
### Step 1: Install the `node-postgres` package Run the following command to install the necessary PostgreSQL client for Node.js: ### Step 2: Set the `dataProvider` Property In the `api.ts` file, configure the `dataProvider` property to connect to your PostgreSQL database: ### Alternative: Use an Existing PostgreSQL Connection If you already have a PostgreSQL connection set up, you can pass it directly to Remult: In this example, the `pg.Pool` is used to create the PostgreSQL connection, and `SqlDatabase` is used to interface with the `PostgresDataProvider`. ### SQLite3 Setup This version of **SQLite3** works well even on platforms like StackBlitz. ### Step 1: Install SQLite3 Run the following command to install the `sqlite3` package: ### Step 2: Configure the `dataProvider` In your `api.ts` or server file, configure the `dataProvider` to connect to the SQLite database using **sqlite3**: This configuration connects to an SQLite database stored in the `mydb.sqlite` file. The `Sqlite3DataProvider` is wrapped inside the `SqlDatabase` class, enabling Remult to work with SQLite databases smoothly across different environments, including StackBlitz. ### sql.js ### Step 1: Install sql.js Run the following command to install the `sql.js` package: ### Step 2: Configure the `dataProvider` In your `api.ts` or server file, configure the `dataProvider` to use `sql.js`: ### Explanation: - **sql.js**: This setup initializes an in-memory SQLite database using `sql.js`, a library that runs SQLite in the browser or in Node.js. - **SqlJsDataProvider**: The `SqlJsDataProvider` is used to integrate the `sql.js` database as a Remult data provider. - **Async Initialization**: The `initSqlJs` function initializes the SQL.js engine and sets up the database instance. This configuration allows you to use an in-memory SQLite database in your Remult application, powered by `sql.js`.
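The `sql.js` setup described above might be sketched like this (the `remult/remult-sql-js` import path and the exact provider constructor are assumptions; check the Remult docs for your version):

```typescript
import initSqlJs from 'sql.js'
import { SqlDatabase } from 'remult'
import { SqlJsDataProvider } from 'remult/remult-sql-js' // assumed import path
import { remultExpress } from 'remult/remult-express'

export const api = remultExpress({
  // Async initialization: load the SQL.js engine, then create an in-memory db
  dataProvider: new SqlDatabase(
    new SqlJsDataProvider(initSqlJs().then((SQL) => new SQL.Database())),
  ),
})
```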
### Turso Setup ### Step 1: Install Turso Client Run the following command to install the `@libsql/client` package: ### Step 2: Configure the `dataProvider` In your `api.ts` or server file, configure the `dataProvider` to connect to Turso using the Turso client: ### Explanation: - **Turso Client**: This configuration uses the `@libsql/client` package to connect to the Turso database. - **Environment Variables**: Ensure you have `TURSO_DATABASE_URL` and `TURSO_AUTH_TOKEN` defined in your environment to securely pass the database connection URL and authentication token. - **SqlDatabase**: The `TursoDataProvider` is wrapped with the `SqlDatabase` class, allowing seamless integration of Turso as a Remult data provider. This setup allows you to use Turso as the backend database for your application. # Angular ## Create an Angular Project To set up a new Angular project, use the Angular CLI: ## Install Remult Install the latest version of Remult in your Angular project: ## Proxy API Requests from Angular DevServer to the API Server In development, your Angular app will be served from `http://localhost:4200`, while the API server will run on `http://localhost:3002`. To allow the Angular app to communicate with the API server during development, you can use Angular's feature. 1. Create a file named `proxy.conf.json` in the root folder of your project with the following content: This configuration redirects all API calls from the Angular dev server to the API server running at `http://localhost:3002`. ## Adjust the `package.json` Modify the `package.json` to use the newly created proxy configuration when serving the Angular app: Running the `dev` script will start the Angular dev server with the proxy configuration enabled. ## Configure a Server Now that the app is set up, # Select a framework
# Next.js ## Create a Next.js Project To create a new Next.js project, run the following command: When prompted, use these answers: Afterward, navigate into the newly created project folder: ## Install Remult Install the latest version of Remult: ## Bootstrap Remult in the Backend Remult is bootstrapped in a Next.js app by creating a . This route will pass API requests to an object created using the `remultNextApp` function. 1. **Create an API file** In the `src/` directory, create a file called `api.ts` with the following code to set up Remult: 2. **Create the API Route** In the `src/app/api` directory, create a `` subdirectory. Inside that directory, create a `route.ts` file with the following code: This file serves as a catch-all route for the Next.js API, handling all API requests by routing them through Remult. ## Enable TypeScript Decorators To enable the use of decorators in your Next.js app, modify the `tsconfig.json` file. Add the following entry under the `compilerOptions` section: ## Run the App To start the development server, open a terminal and run the following command: Your Next.js app is now running with Remult integrated and listening for API requests. ### Create a Nuxt Project To create a new Nuxt project, run the following command: ### Install Remult Install Remult in your Nuxt project by running the following command: ### Enable TypeScript Decorators To enable the use of TypeScript decorators in your Nuxt project, modify the `nuxt.config.ts` file as follows: ### Bootstrap Remult 1. **Create the API File** In the `server/api/` directory, create a dynamic API route that integrates Remult with Nuxt. The following code sets up the API and defines the entities to be used: This setup uses the Remult `Task` entity and registers the API routes dynamically for the entities within the app. ### Run the App To start the development server, run: The Nuxt app will now be running on the default address . 
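The dynamic server route described above might be sketched like this (the route file name and the `remultNuxt` helper are assumptions; check the Remult docs for your version):

```typescript
// server/api/[...remult].ts — the catch-all file name is an assumption
import { remultNuxt } from 'remult/remult-nuxt' // assumed import path
import { Task } from '~~/shared/Task' // illustrative entity path

export const api = remultNuxt({ entities: [Task] })
export default defineEventHandler(api)
```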
### Setup Completed Your Nuxt app with Remult is now set up and ready to go. You can now move on to defining your entities and building your task list app. # React ## Create a React Project with Vite To set up a new React project using Vite, run the following commands: ## Install Remult Install the latest version of Remult: ## Enable TypeScript Decorators in Vite To enable the use of decorators in your React app, modify the `vite.config.ts` file by adding the following to the `defineConfig` section: This configuration ensures that TypeScript decorators are enabled for the project. ## Proxy API Requests from Vite DevServer to the API Server In development, your React app will be served from `http://localhost:5173`, while the API server will run on `http://localhost:3002`. To allow the React app to communicate with the API server during development, use Vite's feature. Add the following proxy configuration to the `vite.config.ts` file: This setup proxies all requests starting with `/api` from `http://localhost:5173` to your API server running at `http://localhost:3002`. ## Configure a Server Now that the app is set up, # SolidStart ### Step 1: Create a New SolidStart Project Run the following command to initialize a new SolidStart project: Answer the prompts as follows: Once completed, navigate to the project directory: ### Step 2: Install Remult To install the Remult package, run: ### Step 3: Bootstrap Remult in the Backend Remult is integrated into `SolidStart` using a , which passes API requests to a handler created using the `remultSolidStart` function. 1. **Create the Remult API Configuration File** In the `src` directory, create a file named `api.ts` with the following code: 2. **Set Up the Catch-All API Route** In the `src/routes/api/` directory, create a file named `.ts` with the following code: ### Step 4: Enable TypeScript Decorators 1. **Install Babel Plugins for Decorators**: 2. 
**Configure Babel Plugins in SolidStart**: Add the following configuration to the `app.config.ts` file to enable TypeScript decorators: ### Setup Complete Your SolidStart project is now set up with Remult and ready to run. You can now proceed to the next steps of building your application. # SvelteKit ## Create a SvelteKit Project To create a new SvelteKit project, run the following command: During the setup, answer the prompts as follows: 1. **Which Svelte app template?**: ... `minimal` Project 2. **Add type checking with TypeScript?** ... Yes, using `TypeScript` syntax 3. **Select additional options**: ... We didn't select anything for this tutorial. Feel free to adapt it to your needs. 4. **Which package manager?**: ... We chose `npm`; if you prefer another package manager, feel free to use it. Once the setup is complete, navigate into the project directory: ## Install Required Packages and Remult Install Remult and any necessary dependencies by running: ## Bootstrap Remult To set up Remult in your SvelteKit project: 1. Create your remult `api` ::: code-group ::: 2. Create a remult `api route` ::: code-group ::: ## Final Tweaks Remult uses TypeScript decorators to enhance classes into entities. To enable decorators in your SvelteKit project, modify the `tsconfig.json` file by adding the following to the `compilerOptions` section: ## Run the App To start the development server, run the following command: Your SvelteKit app will be available at . Your SvelteKit project with Remult is now up and running. # Extra ## Extra - Remult in other SvelteKit routes To enable Remult across all SvelteKit routes ::: code-group ::: ## Extra - Universal load & SSR To use Remult in an SSR `PageLoad` - this will leverage the `event`'s fetch to load data on the server without reloading it on the frontend, while abiding by all API rules even when it runs on the server ::: code-group ::: ::: tip You can add this in `+layout.ts` as well and all routes **under** will have the correct fetch out of the box.
::: ## Extra - Server load If you return a remult entity from the `load` function of a `+page.server.ts`, SvelteKit will complain and show this error: To fix this, you can use `repo.toJson` in the server load function and `repo.fromJson` in the .svelte file to properly serialize and deserialize the entity. ::: code-group ::: --- #### Since `@sveltejs/kit@2.11.0`, there is a new feature: With this new feature, you can get rid of `repo.toJson` and `repo.fromJson` thanks to this file: `hooks.ts`. ::: code-group ::: ## Extra - Svelte 5 & Reactivity Remult is fully compatible with Svelte 5, Runes, and reactivity. To take full advantage of it, add this snippet: ::: code-group ::: Then you can use `$state`, `$derived` like anywhere else ::: code-group ::: ### Focus on auth reactivity Anywhere in your frontend code you can set `remult.user = xxx` and all remult auth reactivity will work. If you want `remult.user` to be filled in SSR, here is the code: ::: code-group ::: And you can trigger this with : # Vue ## Create a Vue Project with Vite To set up a new Vue project using Vite, run the following commands: ## Install Remult Install the latest version of Remult: ## Enable TypeScript Decorators in Vite To enable the use of decorators in your Vue app, modify the `vite.config.ts` file by adding the following to the `defineConfig` section: This configuration ensures that TypeScript decorators are enabled for the project. ## Proxy API Requests from Vite DevServer to the API Server In development, your Vue app will be served from `http://localhost:5173`, while the API server will run on `http://localhost:3002`. To allow the Vue app to communicate with the API server during development, use Vite's feature. Add the following proxy configuration to the `vite.config.ts` file: This setup proxies all requests starting with `/api` from `http://localhost:5173` to your API server running at `http://localhost:3002`.
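Combined, the decorator and proxy settings described above might look like this in `vite.config.ts` (a sketch; adjust ports and plugins to your setup):

```typescript
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'

export default defineConfig({
  plugins: [vue()],
  // Enable TypeScript decorators for esbuild
  esbuild: {
    tsconfigRaw: {
      compilerOptions: { experimentalDecorators: true },
    },
  },
  server: {
    // Forward /api requests from the dev server to the API server
    proxy: { '/api': 'http://localhost:3002' },
  },
})
```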
## Configure a Server Now that the app is set up, the next step is to configure a backend server for your API. # Stacks ## Frameworks
## Servers
## Databases
# Express ### Install Required Packages To set up your Express server with Remult, run the following commands to install the necessary packages: ### Bootstrap Remult in the Backend Remult is integrated into your backend as an Express middleware. 1. **Create the API File** Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware: 2. **Register the Middleware** Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines: ::: warning ESM Configuration In this tutorial, we are using ECMAScript modules for the Node.js server. This means that when importing files, you must include the `.js` suffix. Additionally, make sure to set `"type": "module"` in your `package.json` file. ::: #### Create the Server's TypeScript Configuration In the root folder, create a TypeScript configuration file named `tsconfig.server.json` to manage the server's settings: This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code. #### Create an `npm` Script to Start the API Server To simplify the development process, add a new script in your `package.json` file to start the Express server in development mode: - `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save. - `--env-file=.env`: Ensures environment variables are loaded from the `.env` file. - `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server. #### Start the Node Server Finally, open a new terminal and run the following command to start the development server: The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the server whenever updates are made.
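The Express bootstrap described above might look roughly like this. This is a minimal sketch: `Task` is a placeholder entity assumed to live in `src/shared/`.

```typescript
// src/server/api.ts — register entities with the Remult Express middleware
import { remultExpress } from 'remult/remult-express'
import { Task } from '../shared/Task.js' // note the .js suffix required under ESM

export const api = remultExpress({
  entities: [Task], // entities exposed through the CRUD API
})
```

In `index.ts`, the middleware is then registered with `app.use(api)` before calling `app.listen(3002)`.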
# Fastify ### Install Required Packages To set up your Fastify server with Remult, run the following commands to install the necessary packages: ### Bootstrap Remult in the Backend Remult is integrated into your backend as Fastify middleware. 1. **Create the API File** Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Fastify: 2. **Register the Middleware** Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines: ::: warning ESM Configuration Similar to the Express setup, when using ECMAScript modules in Fastify, you must include the `.js` suffix when importing files. Also, ensure that `"type": "module"` is set in your `package.json`. ::: #### Create the Server's TypeScript Configuration In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the server project: This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code. #### Create an `npm` Script to Start the API Server To simplify the development process, add a new script in your `package.json` to start the Fastify server in development mode: - `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save. - `--env-file=.env`: Ensures environment variables are loaded from the `.env` file. - `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server. #### Start the Fastify Server Open a new terminal and run the following command to start the development server: The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the Fastify server whenever updates are made.
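For reference, the `tsconfig.server.json` referenced in these server setups might look along these lines. This is a sketch under stated assumptions, not the tutorial's exact file; adjust the target and include paths to your project layout.

```json
{
  "compilerOptions": {
    "experimentalDecorators": true,
    "module": "nodenext",
    "moduleResolution": "nodenext",
    "target": "es2022",
    "strict": true,
    "outDir": "dist"
  },
  "include": ["src/server/**/*", "src/shared/**/*"]
}
```

The key settings are `experimentalDecorators` (for Remult entities) and an ESM-compatible `module`/`moduleResolution` pair.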
# Hapi ### Install Required Packages To set up your Hapi server with Remult, install the necessary packages: ### Bootstrap Remult in the Backend Remult is integrated into your backend as a Hapi plugin. 1. **Create the API File** Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Hapi: 2. **Register the Middleware** Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following code: ::: warning ESM Configuration When using ECMAScript modules in Hapi, ensure you include the `.js` suffix when importing files, as shown in the `import { api } from './api.js'` statement. Also, make sure that `"type": "module"` is set in your `package.json`. ::: #### Create the Server's TypeScript Configuration In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the Hapi server: This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code. #### Create an `npm` Script to Start the API Server Add a new script in your `package.json` to start the Hapi server in development mode: - `tsx`: A TypeScript execution environment that watches for file changes and automatically restarts the server on each save. - `--env-file=.env`: Ensures environment variables are loaded from the `.env` file. - `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server. #### Start the Hapi Server Open a new terminal and run the following command to start the development server: The server will now run on port 3002. `tsx` will watch for file changes, automatically restarting the Hapi server whenever updates are made. # Hono ### Install Required Packages To set up your Hono server with Remult, install the necessary packages: ### Bootstrap Remult in the Backend Remult is integrated into your backend using the `remultHono` adapter for Hono. 1. 
**Create the API File** Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware for Hono: 2. **Register the Middleware** Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following code: ::: warning ESM Configuration When using ECMAScript modules in Hono, ensure you include the `.js` suffix when importing files, as shown in the `import { api } from './api.js'` statement. Also, make sure that `"type": "module"` is set in your `package.json`. ::: #### Create the Server's TypeScript Configuration In the root folder, create a TypeScript configuration file named `tsconfig.server.json` for the Hono server: This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code. #### Create an `npm` Script to Start the API Server Add a new script in your `package.json` to start the Hono server in development mode: - `tsx`: A TypeScript execution environment that watches for file changes and automatically restarts the server on each save. - `--env-file=.env`: Ensures environment variables are loaded from the `.env` file. - `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server. #### Start the Hono Server Open a new terminal and run the following command to start the development server: The server will now run on port 3002. `tsx` will watch for file changes, automatically restarting the Hono server whenever updates are made. # Select a server
# Koa ### Install Required Packages To set up your Koa server with Remult, run the following commands to install the necessary packages: ### Bootstrap Remult in the Backend Remult is integrated into your backend as middleware for Koa. 1. **Create the API File** Create a new `api.ts` file in the `src/server/` folder with the following code to set up the Remult middleware: 2. **Register the Middleware** Update the `index.ts` file in your `src/server/` folder to include the Remult middleware. Add the following lines: ::: warning ESM Configuration In this tutorial, we are using ECMAScript modules for the Node.js server. When importing files, you must include the `.js` suffix. Additionally, make sure to set `"type": "module"` in your `package.json` file. ::: #### Create the Server's TypeScript Configuration In the root folder, create a TypeScript configuration file named `tsconfig.server.json` to manage the server's settings: This configuration enables TypeScript decorators, ensures compatibility with ECMAScript modules, and specifies the file paths for the server and shared code. #### Create an `npm` Script to Start the API Server To simplify the development process, add a new script in your `package.json` file to start the Koa server in development mode: - `tsx`: A TypeScript Node.js execution environment that watches for file changes and automatically restarts the server on each save. - `--env-file=.env`: Ensures environment variables are loaded from the `.env` file. - `--tsconfig tsconfig.server.json`: Specifies the TypeScript configuration file for the server. #### Start the Koa Server Finally, open a new terminal and run the following command to start the development server: The server will now run on port 3002. `tsx` will watch for any file changes, automatically restarting the server whenever updates are made. # Nest.js ### Bootstrap Remult in the Nest.js back-end 1. Create a `main.ts` file in the `src/` folder with the following code: 2.
Add a simple `AppModule` in `src/app.module.ts`: ### Run the Nest.js server Run the server with: Your Nest.js app with Remult is now up and running on port `3002`. # Lazy loading of related entities When an `entity` is loaded, its `many to one` relation fields are also automatically loaded, using a cache mechanism to prevent the reload of an already loaded `entity`. To disable this, set the `lazy` option to `true`. Let's use the example in #### Working with Lazy - To manually load a related entity, use its `FieldRef`'s load method. - If the field was not loaded and you access it, it'll return `undefined` and issue a request to load the related entity. Once that entity is loaded, the field will return its value. - To check whether a field has a value, you can use the `valueIsNull` method of its `FieldRef`. - You can override the default `lazy` definitions by setting the `load` option of the repository's `find` method. - To load none of the related entities use: - To specify which fields to load: # Entity Lifecycle Hooks In Remult, you can take advantage of Entity Lifecycle Hooks to add custom logic and actions at specific stages of an entity's lifecycle. There are five lifecycle events available: `validation`, `saving`, `saved`, `deleting`, and `deleted`. These hooks allow you to perform actions or validations when specific events occur in the entity's lifecycle. ## Validation - **Runs On**: Backend and Frontend. - **Purpose**: To perform validations on the entity's data before saving. - **Example**: You can run custom validation like in this example, and you can also use . ## Saving - **Runs On**: Backend. - **Purpose**: To execute custom logic before an entity is saved. - **Example**: ## Saved - **Runs On**: Backend. - **Purpose**: To perform actions after an entity has been successfully saved. - **Example**: Useful for triggering additional processes or updates after saving. ## Deleting - **Runs On**: Backend.
- **Purpose**: To execute custom logic before an entity is deleted. - **Example**: You can use this to ensure related data is properly cleaned up or archived. ## Deleted - **Runs On**: Backend. - **Purpose**: To perform actions after an entity has been successfully deleted. - **Example**: Similar to the `saved` event, this is useful for any post-deletion processes. ## Field Saving Hook Additionally, you can define a field-specific `saving` hook that allows you to perform custom logic on a specific field before the entity `saving` hook. It has the following signature: or, using the `fieldRef`: You can use the field `saving` hook to perform specialized actions on individual fields during the entity's saving process. ## Lifecycle Event Args Each lifecycle event receives an instance of the relevant entity and an event argument of type `LifecycleEvent`. The `LifecycleEvent` object provides various fields and methods to interact with the entity and its context. Here are the fields available in the `LifecycleEvent`: - `isNew`: A boolean indicating whether the entity is new. - `fields`: A reference to the entity's fields, allowing you to access and modify field values. - `id`: The ID of the entity. - `originalId`: The original ID of the entity, which may differ during certain operations. - `repository`: The repository associated with the entity. - `metadata`: The metadata of the entity, providing information about its structure. - `preventDefault`: A method to prevent the default behavior associated with the event. - `relations`: Access to repository relations for the entity, allowing you to work with related data. ## Example Usage Here's an example of how to use Entity Lifecycle Hooks to add custom logic to the `saving` event: In this example, we've defined a `saving` event for the `Task` entity. When a task is being saved, the event handler is called. If the task is new, we set its `createdAt` field to the current date.
In either case, we update the `lastUpdated` field with the current date. Entity Lifecycle Hooks provide a powerful way to customize the behavior of your entities and ensure that specific actions or validations are performed at the right time in the entity's lifecycle. You can use these hooks to streamline your application's data management and enforce business rules. # Docs for LLMs We support the convention for making documentation available to large language models and the applications that make use of them. Currently, we have these different levels: - — a listing of the available files - — complete documentation for remult - — complete documentation for remult with all tutorials # Migrations Managing database schemas is crucial in web development. Traditional migration approaches introduce complexity and risks. Remult, designed for data-driven web apps with TypeScript, offers a simpler method. ## You Don't Necessarily Need Migrations Migration files are standard but can complicate database schema management. They're prone to errors, potentially leading to conflicts or downtime. Remult proposes a streamlined alternative: automatic schema synchronization. This approach simplifies schema management by ensuring your database schema aligns with your application code without the manual overhead of traditional migrations. ### Embracing Schema Synchronization with Remult Remult offers an alternative: automatic schema synchronization. **By default, Remult checks for and synchronizes your database schema with the entity types** provided in the `RemultServerOptions.entities` property when the server loads. This feature automatically adds any missing tables or columns, significantly simplifying schema management. ::: tip No Data Loss with Remult's Safe Schema Updates **Remult's schema synchronization** ensures **safe and automatic updates** to your database schema. By only adding new tables or columns without altering existing ones, Remult prevents data loss.
This design offers a secure way to evolve your application's database schema. ::: #### Disabling Automatic Schema Synchronization For manual control, Remult allows disabling automatic schema synchronization: #### Manually Triggering Schema Synchronization In certain scenarios, you might want to manually trigger the `ensureSchema` function to ensure that your database schema is up-to-date with your entity definitions. Here's how you can do it: ## Quick Start: Introducing Migrations to Your Application Introducing migrations to your Remult application involves a few straightforward steps. The goal is to ensure that your migrations and API share the same data provider and entity definitions. Here's how you can do it: ### 1. Refactor Your Configuration Start by refactoring the `dataProvider` and `entities` definitions from the `api.ts` file to a new file named `src/server/config.ts`. This allows you to use the same configurations for both your API and migrations. In your `src/server/config.ts` file, define your entities and data provider as follows: :::tip Using environment variables In most cases, the connection string for your database will not be hard-coded but stored in an environment variable for security and flexibility. A common practice is to use a `.env` file to store environment variables in development and load them using the `dotenv` npm package. Here's how you can set it up: 1. Install the `dotenv` package: 2. Create a `.env` file in the root of your project and add your database connection string: 3. At the beginning of your `src/server/config.ts` file, load the environment variables: 4. Access the connection string using `process.env`: By following these steps, you ensure that your application securely and flexibly manages the database connection string. ::: ### 2. Adjust the API Configuration Next, adjust your `api.ts` file to use the configurations from the `config.ts` file, and disable the `ensureSchema` migrations: ### 3. 
Generate the migration ::: tip Prettier The migration generator uses `prettier` to format the generated code for better readability and consistency. If you don't already have `prettier` installed in your project, we recommend installing it as a development dependency using the following command: ::: To enable automatic generation of migration scripts, follow these steps: 1. **Create the Migrations Folder:** In your `src/server` directory, create a new folder named `migrations`. This folder will hold all your migration scripts. 2. **Create the Migration Generator File:** Inside the `migrations` folder, create a file named `generate-migrations.ts`. This file will contain the script that generates migration scripts based on changes in your entities. 3. **Populate the Generator File:** Add the following code to `generate-migrations.ts`: This script generates migration scripts based on changes in your entities. If you're calling this method on a server where the database connection should remain open, omit the `endConnection` parameter or set it to `false`. 4. **Generate Migrations:** To generate the migration scripts, run the `generate-migrations.ts` script using the following command: This command will create two important files: 1. **`migrations-snapshot.json`**: This file stores the last known state of your entities. It helps the migration generator understand what changes have been made since the last migration was generated. 2. **`migrations.ts`**: This file contains the actual migration scripts that need to be run to update your database schema. The structure of this file is as follows: Each migration script is associated with a unique identifier and contains the SQL commands necessary to update the database schema. By running this script whenever you make changes to your entities, you can automatically generate the necessary migration scripts to keep your database schema in sync with your application's data model.
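As a rough illustration of that structure, a generated `migrations.ts` has a shape along these lines. The key and SQL below are invented placeholders; the generator emits commands matching your actual entities.

```typescript
// src/server/migrations/migrations.ts — illustrative sketch of the generated shape
import type { Migrations } from 'remult/migrations'

export const migrations: Migrations = {
  0: async ({ sql }) => {
    // each entry is keyed by a unique index and receives a `sql` helper
    await sql(`ALTER TABLE "tasks" ADD COLUMN "priority" integer default 0 not null`)
  },
}
```

Migrations run in order of their keys, and the last-applied index is tracked so each script runs only once.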
It's important to note that each migration can include any code that the developer wishes to include, not just SQL statements. The `sql` parameter is provided to facilitate running SQL commands, but you can also include other logic or code as needed. Additionally, developers are encouraged to add their own custom migrations to address specific requirements or changes that may not be covered by automatically generated migrations. This flexibility allows for a more tailored approach to managing database schema changes. ### 4. Run the Migrations To apply the migrations to your database, you'll need to create a script that executes them. #### Setting Up the Migration Script 1. **Create the Migration Script:** In the `src/server/migrations` folder, add a file named `migrate.ts`. 2. **Populate the Script:** Add the following code to `migrate.ts`: This script sets up the migration process. The `migrate` function checks the last migration executed on the database and runs all subsequent migrations based on their index in the `migrations` file. The entire call to `migrate` is executed in a transaction, ensuring that either all required migration steps are executed or none at all, maintaining the integrity of your database schema. ::: warning Warning: Database Transaction Support for Structural Changes It's important to note that some databases, like MySQL, do not support rolling back structural changes as part of a transaction. This means that if you make changes to the database schema and something goes wrong, those changes might not be automatically rolled back. Developers need to be aware of this limitation and plan their migrations accordingly to avoid leaving the database in an inconsistent state. Always consult your database's documentation to understand the specifics of transaction support and plan your migrations accordingly. ::: 3. 
**Execute the Script:** Run the migration script using the following command: ## Integrating Migrations into Your Deployment Process You have a couple of options for when and how to run your migrations: - **As Part of the Build Step:** You can include the migration script as part of your build or deployment process. This way, if the migration fails, the deployment will also fail, preventing potential issues with an inconsistent database state. - **During Application Initialization:** Alternatively, you can run the migrations when your application loads by using the `initApi` option in your `api.ts` file: This approach ensures that the migrations are applied each time the API initializes. Note that the `migrate` and `generateMigrations` functions typically close the connection used by the `dataProvider` when they complete. In this code, we disable this behavior using the `endConnection: false` option, instructing the `migrate` function to keep the `dataProvider` connection open when it completes. Choose the approach that best fits your application's deployment and initialization process. ### Migration Philosophy: Embracing Backward Compatibility We believe in designing migrations with a backward compatibility mindset. This approach ensures that older versions of the code can operate smoothly with newer versions of the database. To achieve this, we recommend: - Never dropping columns or tables. - Instead of altering a column, adding a new column and copying the data to it as part of the migration process. This philosophy minimizes disruptions and ensures a smoother transition during database schema updates. # Introduction to Mutable Controllers and Backend Methods In web development architectures, mutable controllers offer a convenient way to manage state and facilitate interactions between the client and the server.
These controllers are useful in scenarios where state needs to be maintained and manipulated across server calls, providing a streamlined approach to handling data. ## Overview of Controller Backend Methods A Controller is a class designed to encapsulate business logic and data processing. When a backend method in a controller is called, it ensures that all field values are preserved and appropriately transferred between the frontend and backend, maintaining state throughout the process. ### Defining a Mutable Controller The mutable controller is typically defined in a shared module, allowing both the frontend and backend to interact with it efficiently. Below is an example of how to define such a controller and a backend method within it. ### Explanation with Data Flow and Example Usage This example demonstrates the use of a mutable controller, `UserSignInController`, to handle the sign-in process for users in a web application. Let's break down the key components of this example: 1. **Controller Definition**: The `UserSignInController` is a class annotated with `@Controller`, indicating that it serves as a controller for handling user sign-in operations. 2. **Data Flow**: When the `signInUser` backend method is called from the frontend, all the values of the controller fields will be sent to the backend for processing. Once the method completes its execution, the updated values will be sent back to the frontend. ### Example Usage Here's how you can use the `UserSignInController` on the frontend to initiate the sign-in process: In this example, we create an instance of `UserSignInController` and set its `email`, `password`, and `rememberMe` fields with the appropriate values. We then call the `signInUser` method to initiate the sign-in process. If successful, we log a message indicating that the user has signed in. If an error occurs during the sign-in process, we catch the error and log a corresponding error message. 
This usage demonstrates how to interact with the mutable controller to handle user sign-in operations seamlessly within a web application. ### Summary Mutable controllers and backend methods provide a powerful mechanism for managing state and handling user interactions in web applications. By encapsulating business logic and data processing within controllers, developers can ensure consistent behavior and efficient data flow between the frontend and backend. With the ability to preserve and transfer field values during server calls, mutable controllers facilitate a smooth and responsive user experience, enhancing the overall functionality and performance of web applications. # Offline Support In modern web applications, providing a seamless user experience often involves enabling offline functionality. This ensures that users can continue to interact with the application even without an active internet connection. Remult supports several offline databases that can be used to store data in the browser for offline scenarios, enhancing the application's resilience and usability. ## Using Local Database for Specific Calls To utilize a local database for a specific call, you can pass the `dataProvider` as a second parameter to the `repo` function. This allows you to specify which database should be used for that particular operation. In this example, `localDb` is used as the data provider for the `Task` repository, enabling data fetching from the local database. ## JSON in LocalStorage / SessionStorage For simple data storage needs, you can use JSON data providers that leverage the browser's `localStorage` or `sessionStorage`. This approach is straightforward and suitable for small datasets that need to persist across sessions or page reloads. ## JSON Storage in IndexedDB For more complex offline storage needs, such as larger datasets and structured queries, `IndexedDB` provides a robust solution. 
Using Remult’s `JsonEntityIndexedDbStorage`, you can store entities in `IndexedDB`, which is supported across all major browsers. This allows for efficient offline data management while offering support for larger volumes of data compared to `localStorage` or `sessionStorage`. In this example, `JsonEntityIndexedDbStorage` is used to persist the data to `IndexedDB`. This method is ideal for applications with large data sets or those requiring more complex interactions with the stored data in offline mode. ## JSON Storage in OPFS The Origin Private File System (OPFS) is a modern browser feature supported by Chrome and Safari, allowing for more structured and efficient data storage in the frontend. Using OPFS with Remult's `JsonDataProvider` provides a robust solution for storing entities in the frontend, especially for applications requiring more complex data handling than what `localStorage` or `sessionStorage` can offer. ## `sql.js`: A SQLite Implementation for the Frontend For applications requiring advanced database functionality, `sql.js` provides a SQLite implementation that runs entirely in the frontend. This allows you to use SQL queries and transactions, offering a powerful and flexible data management solution for offline scenarios. Before using `sql.js` in your project, you need to install the package and its TypeScript definitions. Run the following commands in your terminal: After installing the necessary packages, you can use the following code sample in your project: This code sets up a SQLite database using `sql.js` in your Remult project, with support for saving to and loading from `localStorage`. ## Summary Remult's support for various offline databases empowers developers to create web applications that provide a seamless user experience, even in offline scenarios.
Whether using simple JSON storage in `localStorage` or more advanced solutions like OPFS or `sql.js`, Remult offers the flexibility to choose the right data storage solution for your application's needs. By leveraging these offline capabilities, you can ensure that your application remains functional and responsive, regardless of the user's connectivity status. # Quickstart Jumpstart your development with this Quickstart guide. Learn to seamlessly integrate Remult in various stacks, from installation to defining entities for efficient data querying and manipulation. ### Experience Remult with an Interactive Tutorial For a guided, hands-on experience, try the interactive tutorial. It's the fastest way to get up and running with Remult and understand its powerful features. ## Installation The _remult_ package is all you need for both frontend and backend code. If you're using one `package.json` for both frontend and backend - **install Remult once** in the project's root folder. If you're using multiple `package.json` files - **install Remult in both server and client folders**. ::: code-group ::: ## Server-side Initialization Remult is initialized on the server-side as a request handling middleware, with **a single line of code**. Here is the code for setting up the Remult middleware: ::: code-group ::: ## Connecting a Database Use the `dataProvider` property of Remult's server middleware to set up a database connection for Remult. ::: tip Recommended - Use default local JSON files and connect a database later If the `dataProvider` property is not set, Remult stores data as JSON files under the `./db` folder.
::: Here are examples of connecting to some commonly used back-end databases: ::: tabs == Postgres Install node-postgres: Set the `dataProvider` property: Or use your existing postgres connection == MySQL Install knex and mysql2: Set the `dataProvider` property: Or use your existing knex provider == MongoDB Install mongodb: Set the `dataProvider` property: == SQLite There are several supported SQLite providers: ### Better-sqlite3 Install better-sqlite3: Set the `dataProvider` property: ### sqlite3 This version of sqlite3 works even on StackBlitz. Install sqlite3: Set the `dataProvider` property: ### bun:sqlite Set the `dataProvider` property: ### sql.js Install sql.js: Set the `dataProvider` property: ### Turso Install turso: Set the `dataProvider` property: == Microsoft SQL Server Install knex and tedious: Set the `dataProvider` property: Or use your existing knex provider == DuckDB Install DuckDB: Set the `dataProvider` property: == Oracle Install knex and oracledb: Set the `dataProvider` property: Or use your existing knex provider == JSON Files Set the `dataProvider` property: ::: ## Integrate Auth **Remult is completely unopinionated when it comes to user authentication.** You are free to use any kind of authentication mechanism, and only required to provide Remult with a function that extracts a user object from a request. Here are examples of integrating some commonly used auth providers: ::: code-group ::: ## Defining and Serving an Entity Remult entity classes are shared between frontend and backend code. Alternatively, you can generate entities from an existing Postgres database. ### Serve Entity CRUD API All Remult server middleware options contain an `entities` array. Use it to register your entity. ## Using your Entity on the Client To start querying and mutating data from the client-side using Remult, use the `repo` function to create a repository object for your entity class. This approach simplifies data operations, allowing you to interact with your backend with the assurance of type safety.
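For example, a minimal client-side sketch might look like this. `Task` here is a hypothetical shared entity with `title` and `completed` fields; adjust the import path to your project.

```typescript
import { repo } from 'remult'
import { Task } from './shared/Task.js' // hypothetical shared entity

const taskRepo = repo(Task)

// query — filters are type-checked against the entity's fields
const openTasks = await taskRepo.find({ where: { completed: false } })

// mutate — inserts and updates go through the same CRUD API rules
await taskRepo.insert({ title: 'Buy milk' })
```

The same repository API works on the backend, where it talks to the database directly instead of issuing HTTP calls.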
## Client-side Customization ::: tip Recommended Defaults By default, remult uses the browser's `fetch` API, and makes data API calls using the base URL `/api`. ::: ### Changing the default API base URL To use a different origin or base URL for API calls, set the remult object's `apiClient.url` property. ### Using an alternative HTTP client Set the `remult` object's `apiClient.httpClient` property to customize the HTTP client used by Remult: ::: code-group ::: # ApiClient Interface for configuring the API client used by Remult to perform HTTP calls to the backend. ## httpClient The HTTP client to use when making API calls. It can be set to a function with the `fetch` signature or an object that has `post`, `put`, `delete`, and `get` methods. This can also be used to inject logic before each HTTP call, such as adding authorization headers. #### example: #### example: #### see: If you want to add headers using angular httpClient, see: https://medium.com/angular-shots/shot-3-how-to-add-http-headers-to-every-request-in-angular-fab3d10edc26 #### example: #### example: ## url The base URL for making API calls. By default, it is set to '/api'. It can be modified to be relative or to use a different domain for the server. #### example: #### example: ## subscriptionClient The subscription client used for real-time data updates. By default, it is set to use Server-Sent Events (SSE). It can be set to any subscription provider as illustrated in the Remult tutorial for deploying to a serverless environment. #### see: https://remult.dev/tutorials/react-next/deployment.html#deploying-to-a-serverless-environment ## wrapMessageHandling A function that wraps message handling for subscriptions. This is useful for executing some code before or after any message arrives from the subscription. For example, in Angular, to refresh a specific part of the UI, you can call the `NgZone` run method at this time. #### example: # BackendMethod Decorator indicating that the decorated method runs on the backend.
It allows the method to be invoked from the frontend while ensuring that the execution happens on the server side. By default, the method runs within a database transaction, meaning it will either complete entirely or fail without making any partial changes. This behavior can be controlled using the `transactional` option in the `BackendMethodOptions`, documented below. #### example: ## allowed Determines when this `BackendMethod` can execute, see: ## apiPrefix Used to determine the route for the BackendMethod. #### example: ## transactional Controls whether this `BackendMethod` runs within a database transaction. If set to `true`, the method will either complete entirely or fail without making any partial changes. If set to `false`, the method will not be transactional and may result in partial changes if it fails. #### default: #### example: ## queue EXPERIMENTAL: Determines if this method should be queued for later execution ## blockUser EXPERIMENTAL: Determines if the user should be blocked while this `BackendMethod` is running ## paramTypes * **paramTypes** # Entity Decorates classes that should be used as entities. Receives a key and an array of EntityOptions. #### example: #### note: EntityOptions can be set in two ways: #### example: #### example: ## caption A human readable name for the entity ## allowApiRead Determines if this Entity is available for `get` requests using the REST API #### description: Determines if one has any access to the data of an entity. #### see: - - to restrict data based on criteria, use ## allowApiUpdate Determines if this entity can be updated through the api. #### see: - - ## allowApiDelete Determines if entries for this entity can be deleted through the api. #### see: - - ## allowApiInsert Determines if new entries for this entity can be posted through the api.
#### see: - - ## allowApiCrud Sets the `allowApiUpdate`, `allowApiDelete` and `allowApiInsert` properties with a single setting ## apiPrefilter An optional filter that determines which rows can be queried using the API. This filter is applied to all CRUD operations to ensure that only authorized data is accessible. Use `apiPrefilter` to restrict data based on user profile or other conditions. #### example: #### example: #### see: ## apiPreprocessFilter An optional function that allows for preprocessing or modifying the EntityFilter for a specific entity type before it is used in API CRUD operations. This function can be used to enforce additional access control rules or adjust the filter based on the current context or specific request. #### example: ## backendPreprocessFilter Similar to apiPreprocessFilter, but for backend operations. ## backendPrefilter A filter that will be used for all queries from this entity both from the API and from within the backend. #### example: #### see: ## defaultOrderBy The order by to use when no order by is specified #### example: #### example: ## saving An event that fires before the Entity is saved to the database. If the `error` property of the entity's ref or any of its fields is set, the save is aborted and an exception is thrown. This is the place to run logic that should always run before an entity is saved. #### example: #### link: LifeCycleEvent object #### see: ## saved A hook that runs after an entity has been successfully saved. #### link: LifeCycleEvent object #### see: ## deleting A hook that runs before an entity is deleted. #### link: LifeCycleEvent object #### see: ## deleted A hook that runs after an entity has been successfully deleted. #### link: LifeCycleEvent object #### see: ## validation A hook that runs to perform validation checks on an entity before saving. This hook is also executed on the frontend.
#### link: LifeCycleEvent object #### see: ## dbName The name of the table in the database that holds the data for this entity. If no name is set, the `key` will be used instead. #### example: #### example: ## sqlExpression For entities that are based on SQL expressions instead of a physical table or view #### example: ## id An arrow function that identifies the `id` column to use for this entity #### example: #### example: ## entityRefInit Arguments: * **ref** * **row** ## apiRequireId * **apiRequireId** # EntityBase * **EntityBase** ## constructor * **new EntityBase** ## $ * **$** ## _ * **_** ## assign * **assign** Arguments: * **values** ## delete * **delete** ## isNew * **isNew** ## save * **save** # EntityMetadata Metadata for an `Entity`, this metadata can be used in the user interface to provide a richer UI experience ## entityType The class type of the entity ## key The Entity's key, also used as its URL ## fields Metadata for the Entity's fields ## caption A human readable caption for the entity. Can be used to achieve a consistent caption for the entity throughout the app #### example: #### see: EntityOptions.caption ## dbName The name of the table in the database that holds the data for this entity. If no name is set in the entity options, the `key` will be used instead.
#### see: EntityOptions.dbName ## options The options sent to the `Entity`'s decorator #### see: EntityOptions ## apiUpdateAllowed true if the current user is allowed to update an entity instance #### see: EntityOptions.allowApiUpdate #### example: Arguments: * **item** ## apiReadAllowed true if the current user is allowed to read from this entity #### see: EntityOptions.allowApiRead #### example: ## apiDeleteAllowed true if the current user is allowed to delete an entity instance #### see: EntityOptions.allowApiDelete #### example: Arguments: * **item** ## apiInsertAllowed true if the current user is allowed to create an entity instance #### see: EntityOptions.allowApiInsert #### example: Arguments: * **item** ## getDbName * **getDbName** ## idMetadata Metadata for the Entity's id #### see: EntityOptions.id for configuration # EntityRef * **EntityRef** ## hasErrors * **hasErrors** ## undoChanges * **undoChanges** ## save * **save** ## reload * **reload** ## delete * **delete** ## isNew * **isNew** ## wasChanged * **wasChanged** ## wasDeleted * **wasDeleted** ## getId * **getId** ## getOriginalId * **getOriginalId** ## toApiJson * **toApiJson** ## validate * **validate** ## clone * **clone** ## subscribe * **subscribe** Arguments: * **listener** ## error * **error** ## repository * **repository** ## metadata * **metadata** ## apiUpdateAllowed * **apiUpdateAllowed** ## apiDeleteAllowed * **apiDeleteAllowed** ## apiInsertAllowed * **apiInsertAllowed** ## isLoading * **isLoading** ## fields * **fields** ## relations * **relations** # Field Decorates class members that should be used as entity fields. For more info see: FieldOptions can be set in two ways: #### example: #### example: ## valueType The value type for this field ## caption A human readable name for the field.
Can be used to achieve a consistent caption for a field throughout the app #### example: ## allowNull If it can store null in the database ## required If a value is required ## includeInApi Specifies whether this field should be included in the API. This can be configured based on access control levels. #### example: #### see: - - ## allowApiUpdate Determines whether this field can be updated via the API. This setting can also be controlled based on user roles or other access control checks. _This check runs after entity-level authorization, and only if that allows the update._ #### example: #### see: - - ## validate An arrow function used to perform validations on this field #### example: #### example: #### example: #### example: ## saving Will be fired before this field is saved to the server/database ## serverExpression An expression that will determine this field's value on the backend and be provided to the front end ## dbName The name of the column in the database that holds the data for this field. If no name is set, the key will be used instead. #### example: ## sqlExpression Used for fields that are based on an SQL expression, instead of a physical table column #### example: ## dbReadOnly For fields that shouldn't be part of an update or insert statement ## valueConverter The value converter to be used when loading and saving this field ## displayValue an arrow function that translates the value to a display value ## defaultValue an arrow function that determines the default value of the field, when the entity is created using the `repo.create` method ## inputType The html input type for this field ## lazy * **lazy** ## target The entity type to which this field belongs ## key The key to be used for this field # FieldMetadata Metadata for a `Field`, this metadata can be used in the user interface to provide a richer UI experience ## valueType The field's value type ## key The field's member name in an object. #### example: ## caption A human readable caption for the field.
Can be used to achieve a consistent caption for a field throughout the app #### example: #### see: FieldOptions#caption for configuration details ## dbName The name of the column in the database that holds the data for this field. If no name is set, the key will be used instead. #### example: #### see: FieldOptions#dbName for configuration details ## options The options sent to this field's decorator ## inputType The `inputType` relevant for this field, determined by the options sent to its decorator and the valueConverter in these options ## allowNull if null is allowed for this field #### see: FieldOptions#allowNull for configuration details ## target The class that contains this field #### example: ## getDbName * **getDbName** ## isServerExpression Indicates if this field is based on a server expression ## dbReadOnly indicates that this field should only be included in select statements, and excluded from update or insert. useful for db generated ids etc... #### see: FieldOptions#dbReadOnly for configuration details ## valueConverter the Value converter for this field ## displayValue Get the display value for a specific item #### see: FieldOptions#displayValue for configuration details #### example: Arguments: * **item** ## apiUpdateAllowed Determines if the current user is allowed to update a specific entity instance. #### example: #### see: FieldOptions#allowApiUpdate for configuration details #### returns: True if the update is allowed. Arguments: * **item** - Partial entity instance to check permissions against. ## includedInApi Determines if a specific entity field should be included in the API based on the current user's permissions. This method checks visibility permissions for a field within a partial entity instance. #### example: #### see: FieldOptions#includeInApi for configuration details #### returns: True if the field is included in the API. Arguments: * **item** - The partial entity instance used to evaluate field visibility.
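The `includedInApi` and `apiUpdateAllowed` checks above are handy for driving a form UI. The sketch below models just that subset of `FieldMetadata` - the `FieldLike` interface and `formFields` helper are illustrative, not part of Remult; with a real repository you would pass the entity's field metadata objects (e.g. via the fields collection's `toArray()` method) instead of a stubbed list:

```typescript
// Derive form-field visibility from FieldMetadata-style permission checks.
// FieldLike mirrors the subset of FieldMetadata used here (illustrative).
interface FieldLike<T> {
  key: string
  includedInApi(item: Partial<T>): boolean
  apiUpdateAllowed(item: Partial<T>): boolean
}

function formFields<T>(fields: FieldLike<T>[], item: Partial<T>) {
  return fields
    .filter((f) => f.includedInApi(item)) // hide fields not visible via the API
    .map((f) => ({ key: f.key, readonly: !f.apiUpdateAllowed(item) }))
}
```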
## toInput Adapts the value for usage with html input #### example: #### see: ValueConverter#toInput for configuration details Arguments: * **value** * **inputType** ## fromInput Adapts the value received from an html input #### example: #### see: ValueConverter#fromInput for configuration details Arguments: * **inputValue** * **inputType** # FieldRef * **FieldRef** ## subscribe * **subscribe** Arguments: * **listener** ## valueChanged * **valueChanged** ## load Loads the related value - returns null if the related value is not found ## valueIsNull * **valueIsNull** ## originalValueIsNull * **originalValueIsNull** ## validate * **validate** ## error * **error** ## displayValue * **displayValue** ## value * **value** ## originalValue * **originalValue** ## inputValue * **inputValue** ## entityRef * **entityRef** ## container * **container** ## metadata * **metadata** # Filter The `Filter` class is a helper class that focuses on filter-related concerns. It provides methods for creating and applying filters in queries. ## getPreciseValues Retrieves precise values for each property in a filter for an entity. #### returns: A promise that resolves to a FilterPreciseValues object containing the precise values for each property. #### example: Arguments: * **metadata** - The metadata of the entity being filtered. * **filter** - The filter to analyze. ## getPreciseValues Retrieves precise values for each property in a filter for an entity. #### returns: A promise that resolves to a FilterPreciseValues object containing the precise values for each property. #### example: ## createCustom Creates a custom filter. Custom filters are evaluated on the backend, ensuring security and efficiency. When the filter is used in the frontend, only its name is sent to the backend via the API, where the filter gets translated and applied in a safe manner. #### returns: A function that returns an `EntityFilter` of type `entityType`.
#### example: #### see: Arguments: * **translator** - A function that returns an `EntityFilter`. * **key** - An optional unique identifier for the custom filter. ## entityFilterToJson Translates an `EntityFilter` to a plain JSON object that can be stored or transported. #### returns: A plain JSON object representing the `EntityFilter`. #### example: Arguments: * **entityDefs** - The metadata of the entity associated with the filter. * **where** - The `EntityFilter` to be translated. ## entityFilterFromJson Translates a plain JSON object back into an `EntityFilter`. #### returns: The reconstructed `EntityFilter`. #### example: Arguments: * **entityDefs** - The metadata of the entity associated with the filter. * **packed** - The plain JSON object representing the `EntityFilter`. ## fromEntityFilter Converts an `EntityFilter` to a `Filter` that can be used by the `DataProvider`. This method is mainly used internally. #### returns: A `Filter` instance that can be used by the `DataProvider`. #### example: Arguments: * **entity** - The metadata of the entity associated with the filter. * **whereItem** - The `EntityFilter` to be converted. ## constructor * **new Filter** Arguments: * **apply** ## resolve Resolves an entity filter. This method takes a filter which can be either an instance of `EntityFilter` or a function that returns an instance of `EntityFilter` or a promise that resolves to an instance of `EntityFilter`. It then resolves the filter if it is a function and returns the resulting `EntityFilter`. #### returns: The resolved entity filter. Arguments: * **filter** - The filter to resolve. ## toJson * **toJson** # FilterPreciseValues A mapping of property names to arrays of precise values for those properties. #### example: # generateMigrations Generates migration scripts based on changes in entities. #### see: Arguments: * **options** - Configuration options for generating migrations. 
* **entities** - An array of entity classes whose changes will be included in the migration. * **dataProvider** - The data provider instance or a function returning a promise of the data provider. * **migrationsFolder** - The path to the folder where migration scripts will be stored. Default is 'src/migrations'. * **snapshotFile** - The path to the file where the snapshot of the last known state will be stored. Default is 'migrations-snapshot.json' in the `migrationsFolder`. * **migrationsTSFile** - The path to the TypeScript file where the generated migrations will be written. Default is 'migrations.ts' in the `migrationsFolder`. * **endConnection** - Determines whether to close the database connection after generating migrations. Default is false. # getEntityRef Retrieves the EntityRef object associated with the specified entity instance. The EntityRef provides methods for performing operations on the entity instance. #### returns: The EntityRef object associated with the specified entity instance. #### throws: If throwException is true and the EntityRef object cannot be retrieved. #### see: Arguments: * **entity** - The entity instance. * **throwException** - Indicates whether to throw an exception if the EntityRef object cannot be retrieved. # getFields * **getFields** Arguments: * **container** * **remult** # IdEntity * **IdEntity** ## constructor * **new IdEntity** ## id * **id** ## $ * **$** ## _ * **_** ## assign * **assign** Arguments: * **values** ## delete * **delete** ## isNew * **isNew** ## save * **save** # LiveQuery The `LiveQuery` interface represents a live query that allows subscribing to changes in the query results. ## subscribe Subscribes to changes in the live query results. #### returns: A function that can be used to unsubscribe from the live query. #### example: Arguments: * **next** - A function that will be called with information about changes in the query results. 
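To make the `subscribe` contract above concrete, here is a self-contained model of it. The `TinyLiveQuery` class is illustrative only; in a real app you would call the repository's `liveQuery(...).subscribe(...)` and use the returned function to unsubscribe:

```typescript
// A minimal model of the LiveQuery.subscribe contract: subscribers receive
// change notifications, and the returned function unsubscribes them.
type Listener<T> = (info: { items: T[] }) => void

class TinyLiveQuery<T> {
  private listeners = new Set<Listener<T>>()
  private items: T[] = []

  subscribe(next: Listener<T>): () => void {
    this.listeners.add(next)
    next({ items: this.items }) // emit the current state on subscribe
    return () => this.listeners.delete(next) // the unsubscribe function
  }

  publish(items: T[]) {
    this.items = items
    for (const l of this.listeners) l({ items })
  }
}
```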
# LiveQueryChangeInfo The `LiveQueryChangeInfo` interface represents information about changes in the results of a live query. ## items The updated array of result items. ## changes The changes received in the specific message. The change types can be "all", "add", "replace", or "remove". ## applyChanges Applies the changes received in the message to an existing array. This method is particularly useful with React to update the component's state based on the live query changes. #### returns: The updated array of result items after applying the changes. #### example: Arguments: * **prevState** - The previous state of the array of result items. # migrate Applies migration scripts to update the database schema. #### see: Arguments: * **options** - Configuration options for applying migrations. * **migrations** - An object containing the migration scripts, each keyed by a unique identifier. * **dataProvider** - The data provider instance or a function returning a promise of the data provider. * **migrationsTable** - The name of the table that tracks applied migrations. Default is '__remult_migrations_version'. * **endConnection** - Determines whether to close the database connection after applying migrations. Default is false. * **beforeMigration** - A callback function that is called before each migration is applied. Receives an object with the migration index. * **afterMigration** - A callback function that is called after each migration is applied. Receives an object with the migration index and the duration of the migration.
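The behavior `migrate` is described to have - running each keyed script once and recording it - can be modeled in a few lines. The `runMigrations` helper below is an illustrative simplification; the real `migrate` persists applied versions in the `migrationsTable` and manages the database connection:

```typescript
// Minimal model of what `migrate` does: run migration scripts in key order,
// skipping ones already recorded in a version-tracking store.
type Migrations = Record<number, () => Promise<void> | void>

async function runMigrations(migrations: Migrations, applied: Set<number>) {
  const keys = Object.keys(migrations).map(Number).sort((a, b) => a - b)
  for (const k of keys) {
    if (applied.has(k)) continue // already applied - skip
    await migrations[k]()
    applied.add(k) // record as applied (the real migrate stores this in a table)
  }
}
```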
# Paginator An interface used for paginating with the `query` method in the `Repository` object #### example: #### example: ## items the items in the current page ## hasNextPage True if next page exists ## count the count of the total items in the `query`'s result ## nextPage Gets the next page in the `query`'s result set # PreprocessFilterInfo Provides additional information and utilities for preprocessing filters in API and backend operations. ## metadata Metadata of the entity being filtered. ## getFilterPreciseValues Retrieves precise values for each property in a filter for an entity. #### returns: A promise that resolves to a FilterPreciseValues object containing the precise values for each property. Arguments: * **filter** - Optional filter to analyze. If not provided, the current filter being preprocessed is used. # QueryResult The result of a call to the `query` method in the `Repository` object. ## returns an iterator that iterates the rows in the result using a paging mechanism #### example: ## count returns the number of rows that match the query criteria ## getPage gets the items in a specific page Arguments: * **pageNumber** ## forEach Performs an operation on all the items matching the query criteria Arguments: * **what** ## paginator Returns a `Paginator` object that is used for efficient paging # RelationOptions Options for configuring a relation between entities. ## caption A human readable name for the field. Can be used to achieve a consistent caption for a field throughout the app #### example: ## fields An object specifying custom field names for the relation. Each key represents a field in the related entity, and its value is the corresponding field in the source entity. For example, `{ customerId: 'id' }` maps the 'customerId' field in the related entity to the 'id' field in the source entity. This is useful when you want to define custom field mappings for the relation. ## field The name of the field for this relation.
## findOptions Find options to apply to the relation when fetching related entities. You can specify a predefined set of find options or provide a function that takes the source entity and returns find options dynamically. These options allow you to customize how related entities are retrieved. ## defaultIncluded Determines whether the relation should be included by default when querying the source entity. When set to true, related entities will be automatically included when querying the source entity. If false or not specified, related entities will need to be explicitly included using the `include` option. # Relations * **Relations** ## constructor * **new Relations** ## toMany Define a toMany relation between entities, indicating a one-to-many relationship. This method allows you to establish a relationship where one entity can have multiple related entities. #### returns: A decorator function to apply the toMany relation to an entity field. Example usage: Arguments: * **toEntityType** * **fieldInToEntity** - The field in the target entity that represents the relation. Use this if you want to specify a custom field name for the relation. ## toOne Define a to-one relation between entities, indicating a one-to-one relationship. If no field or fields are provided, it will automatically create a field in the database to represent the relation. #### returns: A decorator function to apply the to-one relation to an entity field. Example usage: Arguments: * **toEntityType** * **options** - An object containing options for configuring the to-one relation. * **caption** - A human readable name for the field. Can be used to achieve a consistent caption for a field throughout the app #### example: * **fields** - An object specifying custom field names for the relation. Each key represents a field in the related entity, and its value is the corresponding field in the source entity.
For example, `{ customerId: 'id' }` maps the 'customerId' field in the related entity to the 'id' field in the source entity. This is useful when you want to define custom field mappings for the relation. * **field** - The name of the field for this relation. * **findOptions** - Find options to apply to the relation when fetching related entities. You can specify a predefined set of find options or provide a function that takes the source entity and returns find options dynamically. These options allow you to customize how related entities are retrieved. * **defaultIncluded** - Determines whether the relation should be included by default when querying the source entity. When set to true, related entities will be automatically included when querying the source entity. If false or not specified, related entities will need to be explicitly included using the `include` option. # Remult * **Remult** ## repo Returns a `Repository` of the specific entity type #### example: #### see: Arguments: * **entity** - the entity to use * **dataProvider** - an optional alternative data provider to use. Useful for writing to offline storage or an alternative data provider ## user Returns the current user's info ## initUser Fetches user information from the backend and updates the `remult.user` object. Typically used during application initialization and user authentication. #### returns: A promise that resolves to the user's information or `undefined` if unavailable. ## authenticated Checks if a user was authenticated ## isAllowed checks if the user has any of the roles specified in the parameters #### example: #### see: Arguments: * **roles** ## isAllowedForInstance checks if the user matches the allowedForInstance callback #### see: Arguments: * **instance** * **allowed** ## useFetch * **useFetch** Arguments: * **fetch** ## dataProvider The current data provider ## constructor Creates a new instance of the `remult` object.
Can receive either an HttpProvider or a DataProvider as a parameter - which will be used to fetch data from. If no provider is specified, `fetch` will be used as an http provider Arguments: * **http** ## call Used to call a `backendMethod` using a specific `remult` object #### example: Arguments: * **backendMethod** - the backend method to call * **classInstance** - the class instance of the backend method, for static backend methods use undefined * **args** - the arguments to send to the backend method ## onFind A helper callback that can be used to debug and trace all find operations. Useful in debugging scenarios Arguments: * **metadata** * **options** * **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows #### example: * **page** - Determines the page number that will be used to extract the data #### example: * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result. #### see: - Relations.toMany - Relations.toOne - RelationOptions * **where** - filters the data #### example: #### see: For more usage examples see * **orderBy** - Determines the order of items returned. #### example: #### example: ## clearAllCache * **clearAllCache** ## entityRefInit A helper callback that is called whenever an entity is created.
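The access checks above (`isAllowed`, `isAllowedForInstance`) evaluate an allowed value that may be a boolean, a role string, an array of roles, or a function. A simplified model of that evaluation follows - illustrative only; in Remult the function form receives the request context rather than the user object directly:

```typescript
// Simplified model of how an `Allowed` value is evaluated against the
// current user (mirrors the behavior described for remult.isAllowed).
type UserInfo = { id: string; roles?: string[] }
type Allowed = boolean | string | string[] | ((user?: UserInfo) => boolean)

function isAllowed(allowed: Allowed, user?: UserInfo): boolean {
  if (typeof allowed === 'boolean') return allowed
  if (typeof allowed === 'function') return allowed(user)
  const roles = Array.isArray(allowed) ? allowed : [allowed] // role name(s)
  return roles.some((r) => user?.roles?.includes(r) ?? false)
}
```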
## context Context information that can be used to store custom information that will be disposed as part of the `remult` object ## apiClient The api client that will be used by `remult` to perform calls to the `api` ## liveQueryStorage * **liveQueryStorage** ## subscriptionServer * **subscriptionServer** ## liveQueryPublisher * **liveQueryPublisher** ## liveQuerySubscriber * **liveQuerySubscriber** # RemultServerOptions * **RemultServerOptions** ## entities Entities to use for the api ## controllers Controllers to use for the api ## getUser Will be called to get the current user based on the current request ## initRequest Will be called for each request and can be used for configuration ## initApi Will be called once the server is loaded and the data provider is ready ## dataProvider Data Provider to use for the api. #### see: . ## ensureSchema Will create tables and columns in supporting databases. default: true #### description: when set to true, it'll create entities that do not exist, and add columns that are missing. ## rootPath The path to use for the api, default: `/api` #### description: If you want to use a different api path adjust this field ## defaultGetLimit The default limit to use for find requests that did not specify a limit ## logApiEndPoints When set to true it'll console log each api endpoint that is created ## subscriptionServer A subscription server to use for live query and message channels ## liveQueryStorage A storage to use to store live queries, relevant mostly for serverless scenarios or larger scales ## contextSerializer Used to store the context relevant info for re-running a live query ## admin When set to true, will display an admin ui in the `/api/admin` url. Can also be set to an arrow function for fine-grained control #### example: #### example: #### see: ## queueStorage Storage to use for backend methods that use queue ## error This method is called whenever there is an error in the API lifecycle.
#### example: # Repository Used to perform CRUD operations on an `entityType` ## find returns a result array based on the provided options Arguments: * **options** * **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows #### example: * **page** - Determines the page number that will be used to extract the data #### example: * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result. #### see: - Relations.toMany - Relations.toOne - RelationOptions * **where** - filters the data #### example: #### see: For more usage examples see * **orderBy** - Determines the order of items returned. #### example: #### example: ## liveQuery returns a live query that can be subscribed to, notifying on changes in the results that match the provided options Arguments: * **options** * **limit** - Determines the number of rows returned by the request, on the browser the default is 100 rows #### example: * **page** - Determines the page number that will be used to extract the data #### example: * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see: - Relations.toMany - Relations.toOne - RelationOptions * **where** - filters the data #### example: #### see: For more usage examples see * **orderBy** - Determines the order of items returned. #### example: #### example: ## findFirst returns the first item that matches the `where` condition #### example: #### example: Arguments: * **where** - filters the data #### see: * **options** * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result. #### see: - Relations.toMany - Relations.toOne - RelationOptions * **where** - filters the data #### example: #### see: For more usage examples see * **orderBy** - Determines the order of items returned. #### example: #### example: * **useCache** - determines whether to cache the result, and return the results from cache. * **createIfNotFound** - If set to true and an item is not found, it's created and returned ## findOne returns the first item that matches the `where` condition #### example: #### example: Arguments: * **options** * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result.
#### see: - Relations.toMany - Relations.toOne - RelationOptions * **where** - filters the data #### example: #### see: For more usage examples see * **orderBy** - Determines the order of items returned. #### example: #### example: * **useCache** - determines whether to cache the result and return the results from the cache. * **createIfNotFound** - If set to true and an item is not found, it's created and returned ## findId returns the item that matches the id. If the id is `undefined` or `null`, returns `null` Arguments: * **id** * **options** * **load** * **include** - An option used in the `find` and `findFirst` methods to specify which related entities should be included when querying the source entity. It allows you to eagerly load related data to avoid N+1 query problems. #### param: An object specifying the related entities to include, their options, and filtering criteria. Example usage: In this example, the `tags` relation for each customer will be loaded and included in the query result. #### see: - Relations.toMany - Relations.toOne - RelationOptions * **useCache** - determines whether to cache the result and return the results from the cache. * **createIfNotFound** - If set to true and an item is not found, it's created and returned ## groupBy Performs an aggregation on the repository's entity type based on the specified options. #### returns: The result of the aggregation. #### example: Arguments: * **options** - The options for the aggregation. * **group** - Fields to group by. The result will include one entry per unique combination of these fields. * **sum** - Fields to sum. The result will include the sum of these fields for each group. * **avg** - Fields to average. The result will include the average of these fields for each group. * **min** - Fields to find the minimum value. The result will include the minimum value of these fields for each group. * **max** - Fields to find the maximum value. The result will include the maximum value of these fields for each group. 
* **distinctCount** - Fields to count distinct values. The result will include the distinct count of these fields for each group. * **where** - Filters to apply to the query before aggregation. #### see: EntityFilter * **orderBy** - Fields and aggregates to order the results by. The result can be ordered by groupBy fields, sum fields, average fields, min fields, max fields, and distinctCount fields. ## aggregate Performs an aggregation on the repository's entity type based on the specified options. #### returns: The result of the aggregation. #### example: Arguments: * **options** - The options for the aggregation. ## query Fetches data from the repository in a way that is optimized for handling large sets of entity objects. Unlike the `find` method, which returns an array, the `query` method returns an iterable `QueryResult` object. This allows for more efficient data handling, particularly in scenarios that involve paging through large amounts of data. The method supports pagination and aggregation in a single request. When aggregation options are provided, the result will include both the items from the current page and the results of the requested aggregation. The `query` method is designed for asynchronous iteration using the `for await` statement. #### example: #### example: #### example: Arguments: * **options** ## count Returns a count of the items matching the criteria. #### see: #### example: Arguments: * **where** - filters the data #### see: ## validate Validates an item #### example: Arguments: * **item** * **fields** ## save saves an item or items to the data source. It assumes that if an `id` value exists, it's an existing row - otherwise it's a new row #### example: Arguments: * **item** ## insert Inserts an item or items into the data source #### example: #### example: Arguments: * **item** ## update Updates an item, based on its `id` #### example: Arguments: * **id** * **item** ## updateMany Updates all items that match the `where` condition. 
Arguments: * **options** * **where** - filters the data #### see: * **set** ## upsert Inserts a new entity or updates an existing entity based on the specified criteria. If an entity matching the `where` condition is found, it will be updated with the provided `set` values. If no matching entity is found, a new entity will be created with the given data. The `upsert` method ensures that a row exists based on the `where` condition: if no entity is found, a new one is created. It can handle both single and multiple upserts. #### returns: A promise that resolves with the inserted or updated entity, or an array of entities if multiple options were provided. #### example: #### example: #### example: Arguments: * **options** - The options that define the `where` condition and the `set` values. Can be a single object or an array of objects. ## delete Deletes an item Arguments: * **id** ## deleteMany Deletes all items that match the `where` condition. Arguments: * **options** * **where** - filters the data #### see: ## create Creates an instance of an item. It will not be saved to the data source until `save` or `insert` is called. It's useful to start or reset a form taking your entity default values into account. Arguments: * **item** ## toJson Translates an entity to a JSON object that is ready to be sent to the client, stripping out fields that are not allowed to be sent to the client. #### example: Arguments: * **item** - Can be an array or a single entity, awaitable or not ## fromJson Translates a JSON object to an item instance. #### example: Arguments: * **data** - Can be an array or a single element * **isNew** - To help the creation of the instance ## getEntityRef returns an `entityRef` for an item returned by `create`, `find` etc... 
Arguments: * **item** ## fields Provides information about the fields of the Repository's entity #### example: ## metadata The metadata for the `entity` #### see: ## addEventListener * **addEventListener** Arguments: * **listener** ## relations * **relations** Arguments: * **item** # Sort The `Sort` class is used to describe sorting criteria for queries. It is mainly used internally, but it provides a few useful functions for working with sorting. ## toEntityOrderBy Translates the current `Sort` instance into an `EntityOrderBy` object. #### returns: An `EntityOrderBy` object representing the sort criteria. ## constructor Constructs a `Sort` instance with the provided sort segments. Arguments: * **segments** - The sort segments to be included in the sort criteria. ## Segments The segments of the sort criteria. ## reverse Reverses the sort order of the current sort criteria. #### returns: A new `Sort` instance with the reversed sort order. ## compare Compares two objects based on the current sort criteria. #### returns: A negative value if `a` should come before `b`, a positive value if `a` should come after `b`, or zero if they are equal. Arguments: * **a** - The first object to compare. * **b** - The second object to compare. * **getFieldKey** - An optional function to get the field key for comparison. ## translateOrderByToSort Translates an `EntityOrderBy` to a `Sort` instance. #### returns: A `Sort` instance representing the translated order by. Arguments: * **entityDefs** - The metadata of the entity associated with the order by. * **orderBy** - The `EntityOrderBy` to be translated. ## createUniqueSort Creates a unique `Sort` instance based on the provided `Sort` and the entity metadata. This ensures that the sort criteria result in a unique ordering of entities. #### returns: A `Sort` instance representing the unique sort criteria. Arguments: * **entityMetadata** - The metadata of the entity associated with the sort. 
* **orderBy** - The `Sort` instance to be made unique. ## createUniqueEntityOrderBy Creates a unique `EntityOrderBy` based on the provided `EntityOrderBy` and the entity metadata. This ensures that the order by criteria result in a unique ordering of entities. #### returns: An `EntityOrderBy` representing the unique order by criteria. Arguments: * **entityMetadata** - The metadata of the entity associated with the order by. * **orderBy** - The `EntityOrderBy` to be made unique. # SqlDatabase A DataProvider for Sql Databases #### example: #### see: ## getDb Gets the SQL database from the data provider. #### returns: The SQL database. #### see: Arguments: * **dataProvider** - The data provider. ## createCommand Creates a new SQL command. #### returns: The SQL command. #### see: ## execute Executes a SQL command. #### returns: The SQL result. #### see: Arguments: * **sql** - The SQL command. ## wrapIdentifier Wraps an identifier with the database's identifier syntax. ## ensureSchema * **ensureSchema** Arguments: * **entities** ## getEntityDataProvider Gets the entity data provider. #### returns: The entity data provider. Arguments: * **entity** - The entity metadata. ## transaction Runs a transaction. Used internally by remult when transactions are required #### returns: The promise of the transaction. Arguments: * **action** - The action to run in the transaction. ## rawFilter Creates a raw filter for entity filtering. #### returns: - The entity filter with a custom SQL filter. #### example: #### see: Arguments: * **build** - The custom SQL filter builder function. ## filterToRaw Converts a filter to a raw SQL string. 
#### see: Arguments: * **repo** * **condition** * **sqlCommand** * **dbNames** * **wrapIdentifier** ## LogToConsole `false` - No logging. `true` - log all queries to the console. `oneLiner` - log all queries to the console as one line. A `function` - log all queries to the console in a custom format. #### example: ## durationThreshold Threshold in milliseconds for logging queries to the console. ## constructor Creates a new SQL database. #### example: Arguments: * **sql** - The SQL implementation. ## end # SubscriptionChannel The `SubscriptionChannel` class is used to send messages from the backend to the frontend, using the same mechanism used by live queries. #### example: ## constructor Constructs a new `SubscriptionChannel` instance. Arguments: * **channelKey** - The key that identifies the channel. ## channelKey The key that identifies the channel. ## publish Publishes a message to the channel. This method should only be used on the backend. Arguments: * **message** - The message to be published. * **remult** - An optional instance of Remult to use for publishing the message. ## subscribe Subscribes to messages from the channel. This method should only be used on the frontend. #### returns: A promise that resolves to a function that can be used to unsubscribe from the channel. Arguments: * **next** - A function that will be called with each message received. * **remult** - An optional instance of Remult to use for the subscription. # Validators Class containing various field validators. ## constructor * **new Validators** ## defaultMessage * **defaultMessage** ## email Validator to check if a value is a valid email address. ## enum Validator to check if a value exists in a given enum. ## in Validator to check if a value is one of the specified values. ## max Validator to check if a value is less than or equal to a maximum value. ## maxLength Validator to check if a string's length is less than or equal to a maximum length. 
## min Validator to check if a value is greater than or equal to a minimum value. ## minLength Validator to check if a string's length is greater than or equal to a minimum length. ## notNull Validator to check if a value is not null. ## range Validator to check if a value is within a specified range. ## regex Validator to check if a value matches a given regular expression. ## relationExists Validator to check if a related value exists in the database. ## required Validator to check if a value is required. ## unique Validator to ensure a value is unique in the database. ## uniqueOnBackend * **uniqueOnBackend** ## url Validator to check if a value is a valid URL. # ValueConverter Interface for converting values between different formats, such as in-memory objects, database storage, JSON data transfer objects, and HTML input elements. ## fromJson Converts a value from a JSON DTO to the valueType. This method is typically used when receiving data from a REST API call or deserializing a JSON payload. #### returns: The converted value. #### example: Arguments: * **val** - The value to convert. ## toJson Converts a value of valueType to a JSON DTO. This method is typically used when sending data to a REST API or serializing an object to a JSON payload. #### returns: The converted value. #### example: Arguments: * **val** - The value to convert. ## fromDb Converts a value from the database format to the valueType. #### returns: The converted value. #### example: Arguments: * **val** - The value to convert. ## toDb Converts a value of valueType to the database format. #### returns: The converted value. #### example: Arguments: * **val** - The value to convert. ## toInput Converts a value of valueType to a string suitable for an HTML input element. #### returns: The converted value as a string. #### example: Arguments: * **val** - The value to convert. * **inputType** - The type of the input element. 
## fromInput Converts a string from an HTML input element to the valueType. #### returns: The converted value. #### example: Arguments: * **val** - The value to convert. * **inputType** - The type of the input element. ## displayValue Returns a displayable string representation of a value of valueType. #### returns: The displayable string. #### example: Arguments: * **val** - The value to convert. ## fieldTypeInDb Specifies the storage type used in the database for this field. This can be used to explicitly define the data type and precision of the field in the database. #### example: ## inputType Specifies the type of HTML input element suitable for values of valueType. #### example: # Entity Rest Api Breakdown All entities automatically expose a rest API based on the parameters defined in its decorator. The API supports the following actions:

| Http Method | Description | example | requires |
| ----------- | ----------- | ------- | -------- |
| GET | returns an array of rows | /api/products | allowApiRead |
| GET | returns a single row based on its id | /api/products/7 | allowApiRead |
| POST | creates a new row based on the object sent in the body, and returns the new row | /api/products | allowApiInsert |
| PUT | updates an existing row based on the object sent in the body and returns the result | /api/products/7 | allowApiUpdate |
| DELETE | deletes an existing row | /api/products/7 | allowApiDelete |

## Sort Add \_sort and \_order ## Filter You can filter the rows using different operators ### Filter Operators

| operator | description | example |
| ------------ | --------------------- | ------------------------ |
| `none` | Equal To | price=10 |
| .ne | Not Equal | price.ne=10 |
| .in | is in json array | price.in=%5B10%2C20%5D |
| .contains | Contains a string | name.contains=ee |
| .notContains | Not contains a string | name.notContains=ee |
| .startsWith | Starts with a string | name.startsWith=ee |
| .endsWith | Ends with a string | name.endsWith=ee |
| .gt | Greater than | price.gt=10 |
| .gte | Greater than or equal | price.gte=10 |
| .lt | Lesser than | price.lt=10 |
| .lte | Lesser than or equal | price.lte=10 |
| .null | is or is not null | price.null=true |

- you can add several filter conditions using the `&` operator. ### Count returns: ## Paginate The default page size is 100 rows. :::tip You can use it all in conjunction: ::: # Accessing the Underlying Database in Remult While Remult provides a powerful abstraction for working with databases, there might be scenarios where you need to access the underlying database directly. This could be for performing complex queries, optimizations, or other database-specific operations that are not covered by Remult's API. :::warning Directly executing custom SQL can be dangerous and prone to SQL injection attacks. Always use parameterized queries and the `param` method provided by Remult to safely include user input in your queries. ::: ## Accessing SQL Databases For SQL-based databases, Remult provides the `SqlDatabase` class to interact directly with the database, allowing you to run raw SQL queries. This is useful for executing complex queries that involve operations like GROUP BY, bulk updates, and other advanced SQL features. ### Basic SQL Query This approach is straightforward but can lead to inconsistencies if the database schema changes. #### the `dbNamesOf` function: The `dbNamesOf` function dynamically retrieves the database table and column names based on your entity definitions, ensuring that your queries stay in sync with your data model. This enhances consistency, maintainability, and searchability in your code. ##### Create index example ### Using Bound Parameters The `param` method safely incorporates user input into the query, reducing the risk of SQL injection by using parameterized queries. 
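To illustrate the idea, here is a simplified sketch of how bound parameters work in general - this mimics the shape of a command's `param` method but is not remult's internal implementation:

```typescript
// Simplified sketch of the bound-parameter idea (not remult's internals):
// user input is never spliced into the SQL text; each value is collected
// into a parameter list and replaced by a numbered placeholder.
class CommandSketch {
  values: unknown[] = []

  // Registers a value and returns the placeholder to embed in the SQL text.
  param(value: unknown): string {
    this.values.push(value)
    return `$${this.values.length}`
  }
}

const command = new CommandSketch()
const userInput = "Robert'); DROP TABLE tasks;--"
const sql = `select * from tasks where title = ${command.param(userInput)}`

console.log(sql) // prints: select * from tasks where title = $1
// command.values carries the raw input separately, so the database driver
// receives it as data rather than as SQL text.
```

The real `param` method delegates to the underlying database driver, but the principle is the same: the SQL text and the values travel separately.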
When executed, this code will run the following SQL: ### Leveraging EntityFilter for SQL Databases The `filterToRaw` function converts Remult's `EntityFilter` objects into SQL where clauses, enabling you to incorporate complex filtering logic defined in your models into custom SQL queries. This allows for reusability and integration with backend filters. #### Benefits of filterToRaw - **Reusability**: Allows you to reuse complex filters defined in your Remult models in custom SQL queries. - **Integration**: Respects any **backendPrefilter** and **backendPreprocessFilter** applied to your entities, ensuring consistent access control and data manipulation rules. Resulting SQL: Using `customFilter`: Resulting SQL: ## Accessing Other Databases ## Knex ### Leveraging EntityFilter for Knex ## MongoDB ### Leveraging EntityFilter for MongoDb ## Native postgres ## Conclusion Accessing the underlying database directly in Remult provides the flexibility to handle complex use cases that might not be covered by the ORM layer. However, it's important to use this capability judiciously and securely, especially when dealing with user input, to avoid potential security vulnerabilities like SQL injection. By leveraging utilities like `dbNamesOf` and `filterToRaw`, you can keep your custom queries consistent with your entity definitions and access-control rules. # Using Remult in Non-Remult Routes When using the CRUD API or backend methods, `remult` is automatically available. Still, there are many use cases where you may want to use remult in your own routes or other code without using `BackendMethods` but would still want to take advantage of `Remult` as an ORM and use it to check for user validity, etc. 
If you tried to use the `remult` object, you may have gotten the error: ## Error: remult object was requested outside of a valid context, try running it within initApi or a remult request cycle Here's how you can use remult in this context, according to the server you're using: ::: tabs == Express ### withRemult middleware You can use remult as an express middleware for a specific route, using `api.withRemult` Or as an express middleware for multiple routes ### withRemultAsync promise wrapper Use the `api.withRemultAsync` method in promises You can also use it without sending the request object, for non-request-related code == Fastify == Hono == Next.js app router == Sveltekit You can use the `withRemult` method in specific routes You can also define `withRemult` as a hook, to make remult available throughout the application == SolidStart You can use the `withRemult` method in specific routes You can also use the same method for any "use server" function You can also define `withRemult` as a hook, to make remult available throughout the application == Hapi ::: # Backend only code One of the main advantages of remult is that you write code once, and it runs both on the server and in the browser. However, if you are using a library that only works on the server, the fact that the same code is bundled to the frontend can cause problems. For example, when you build an Angular project, you'll get `Module not found` errors. This article will walk through such a scenario and how it can be solved. For this example, our customer would like us to document each call to the `updatePriceOnBackend` method in a log file. 
Our first instinct would be to add in the `products.controller.ts` file an import of `fs` and write the following code: ::: danger Error As soon as we do that, we'll get the following errors on the `ng serve` terminal ::: We get this error because the `fs` module on which we rely here is only relevant in the context of a `Node JS` server and not in the context of the browser. There are two ways to handle this: ## Solution 1 - exclude from bundler ::: tabs == vite ### Exclude in `vite.config` Instruct vite to exclude the `server-only` packages from the bundle == Webpack and Angular version <=16 Instruct `webpack` not to include the `fs` package in the `frontend` bundle by adding the following JSON to the main section of the project's `package.json` file. _package.json_ - note that you'll need to restart the react/angular dev server. == Angular 17 1. You'll need to either remove the `types` entry in the `tsconfig.app.json` or add the types you need to that types array. 2. In `angular.json` you'll need to add an entry called `externalDependencies` to the `architect/build/options` key for your project ::: ## Solution 2 - abstract the call Abstract the call and separate it into backend-only files, and `inject` it only when we are running on the server. **Step 1**, abstract the call - We'll remove the import of `fs`, and instead of calling specific `fs` methods, we'll define and call a method `writeToLog` that describes what we are trying to do: The method `writeToLog` that we've defined serves as a placeholder which we'll assign to in the context of the server. It receives one parameter of type `string` and returns `void`. **Step 2**, implement the method: In the `/src/app/server` folder, we'll add a file called `log-writer.ts` with the following code: Here we set the implementation of the `writeToLog` method with the actual call to the `fs` module. This file is intended to only run on the server, so it'll not present us with any problem. 
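To make the pattern concrete, here is a sketch of the two pieces side by side. The file split follows the article; the exact log path and `fs` calls are assumptions for illustration (a temporary directory stands in for the project's `logs` folder):

```typescript
import { appendFileSync, mkdtempSync, readFileSync } from 'fs'
import { tmpdir } from 'os'
import { join } from 'path'

// products.controller.ts (shared code, also bundled to the browser):
// a reassignable placeholder, so this file never imports `fs` itself.
let writeToLog: (what: string) => void = () => {}

// server/log-writer.ts (imported only by server code):
// assigns the real implementation, which appends to a log file.
const logDir = mkdtempSync(join(tmpdir(), 'logs-')) // stand-in for the `logs` folder
writeToLog = (what) => appendFileSync(join(logDir, 'log.txt'), what + '\n')

// Shared code calls the placeholder; on the server it reaches `fs`,
// while in the browser bundle it would remain the harmless no-op.
writeToLog('updatePriceOnBackend was called')
console.log(readFileSync(join(logDir, 'log.txt'), 'utf8')) // prints the logged line
```

The key design point is that the shared file only knows the `(what: string) => void` signature; everything `fs`-specific lives in the server-only file.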
**Step 3**, load the `log-writer.ts` file: In the `/src/app/server/server-init.ts` file, load the `log-writer.ts` file using an `import` statement That's it - it'll work now. ::: tip If you're still getting an error - check that you have a `logs` folder on your project :) ::: ## Additional Resources Check out this video where I implemented a similar solution when running into the same problem using `bcrypt`: # Using Vue in Markdown ## Browser API Access Restrictions Because VuePress applications are server-rendered in Node.js when generating static builds, any Vue usage must conform to the . In short, make sure to only access Browser / DOM APIs in `beforeMount` or `mounted` hooks. If you are using or demoing components that are not SSR friendly, you can wrap them inside the built-in `` component: # Validation Validation is a key part of any application, and you will see that it's built into Remult! Let's dive into it... First of all, some props bring automatic validation, for example `required` and `minLength` for strings: You can establish your own validation rules by using the `validate` prop and run any custom code you want: Want to focus only on the value? The `validate` prop can also use built-in validators like this: It supports arrays of validators as well: Some validators, like `unique`, run on the backend; nothing changes, you just have to use them: Also, in a custom validator you can check whether you are on the backend or not: If you want to customize the error message, you can do it globally: # Working without decorators If you prefer to work without decorators, or to use `remult` in a JavaScript project, you can use the following: ## Entity ::: code-group ::: This is the same entity that is detailed in the ## Static BackendMethod This is the same backend method that is detailed in the
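The decorator-free approach boils down to attaching the same metadata to a class from the outside instead of via a decorator. The following is a conceptual sketch only - the names `describeEntity`, `EntityOptions`, and `FieldTypes` are illustrative, not remult's actual API:

```typescript
// Conceptual sketch of decorator-free entity metadata (illustrative names,
// not remult's actual API): the metadata a decorator would attach is stored
// in a registry keyed by the class itself.

type EntityOptions = { key: string; allowApiCrud?: boolean }
type FieldTypes = Record<string, 'string' | 'number' | 'boolean'>

const registry = new Map<Function, { options: EntityOptions; fields: FieldTypes }>()

// Plays the role a class decorator would otherwise play.
function describeEntity(cls: Function, options: EntityOptions, fields: FieldTypes) {
  registry.set(cls, { options, fields })
}

// A plain class - no decorators required, so it also works in plain JavaScript.
class Task {
  id = ''
  title = ''
  completed = false
}

describeEntity(
  Task,
  { key: 'tasks', allowApiCrud: true },
  { id: 'string', title: 'string', completed: 'boolean' },
)

console.log(registry.get(Task)?.options.key) // prints "tasks"
```

Because the metadata call is ordinary code rather than decorator syntax, no TypeScript compiler options are needed and the same pattern works in a JavaScript project.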