Compare commits


30 Commits
dev ... docs

Author SHA1 Message Date
Wayne
3a6800bc98 Merge branch 'main' into docs 2025-09-06 18:06:34 +03:00
Wayne
413188dc81 Code formatting 2025-09-06 18:04:53 +03:00
Wei S.
4b11cd931a Docs: update rate limiting docs (#91)
* Adding rate limiting docs

* update rate limiting docs

* Resolve conflict

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-06 17:56:34 +03:00
Wayne
a1239e6303 Resolve conflict 2025-09-06 17:56:09 +03:00
Wayne
adb548e184 Merge branch 'main' into docs 2025-09-06 17:55:35 +03:00
Wayne
f1c33b548e update rate limiting docs 2025-09-06 17:50:20 +03:00
scotscotmcc
0a21ad14cd Update README.md (#89)
fix folder in installation steps
2025-09-06 17:38:43 +03:00
Wei S.
63d3960f79 Adding rate limiting docs (#88)
Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-04 17:44:10 +03:00
Wayne
1b59af64c6 Adding rate limiting docs 2025-09-04 17:43:27 +03:00
Wei S.
85a526d1b6 User api key: JSON rate limiting message & status code (#87)
* feat(auth): Implement API key authentication

This commit enables API access with an API key system. This change provides a better experience for programmatic access and third-party integrations.

Key changes include:
- **API Key Management:** Users can now generate, manage, and revoke persistent API keys through a new "API Keys" section in the settings UI.
- **Authentication Middleware:** API requests are now authenticated via an `X-API-KEY` header instead of the previous `Authorization: Bearer` token.
- **Backend Implementation:** Adds a new `api_keys` database table, along with corresponding services, controllers, and routes to manage the key lifecycle securely.
- **Rate Limiting:** The API rate limiter now uses the API key to identify and track requests.
- **Documentation:** The API authentication documentation has been updated to reflect the new method.

* Add configurable API rate limiting

Two new variables are added to `.env.example`:
- `RATE_LIMIT_WINDOW_MS`: The time window in milliseconds for which requests are checked (defaults to 15 minutes).
- `RATE_LIMIT_MAX_REQUESTS`: The maximum number of requests allowed from an IP within the window (defaults to 100).

The installation documentation has been updated to reflect these new configuration options.

* Disable API operation in demo mode

* Exclude public API endpoints from rate limiting

* JSON rate limiting message & status code

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-04 17:32:43 +03:00
Wei S.
52a1a11973 User api key: Exclude public API endpoints from rate limiting (#86)
* feat(auth): Implement API key authentication

This commit enables API access with an API key system. This change provides a better experience for programmatic access and third-party integrations.

Key changes include:
- **API Key Management:** Users can now generate, manage, and revoke persistent API keys through a new "API Keys" section in the settings UI.
- **Authentication Middleware:** API requests are now authenticated via an `X-API-KEY` header instead of the previous `Authorization: Bearer` token.
- **Backend Implementation:** Adds a new `api_keys` database table, along with corresponding services, controllers, and routes to manage the key lifecycle securely.
- **Rate Limiting:** The API rate limiter now uses the API key to identify and track requests.
- **Documentation:** The API authentication documentation has been updated to reflect the new method.

* Add configurable API rate limiting

Two new variables are added to `.env.example`:
- `RATE_LIMIT_WINDOW_MS`: The time window in milliseconds for which requests are checked (defaults to 15 minutes).
- `RATE_LIMIT_MAX_REQUESTS`: The maximum number of requests allowed from an IP within the window (defaults to 100).

The installation documentation has been updated to reflect these new configuration options.

* Disable API operation in demo mode

* Exclude public API endpoints from rate limiting

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-04 17:27:57 +03:00
Wei S.
4048f47777 User api key: Disable API operation in demo mode (#85)
* feat(auth): Implement API key authentication

This commit enables API access with an API key system. This change provides a better experience for programmatic access and third-party integrations.

Key changes include:
- **API Key Management:** Users can now generate, manage, and revoke persistent API keys through a new "API Keys" section in the settings UI.
- **Authentication Middleware:** API requests are now authenticated via an `X-API-KEY` header instead of the previous `Authorization: Bearer` token.
- **Backend Implementation:** Adds a new `api_keys` database table, along with corresponding services, controllers, and routes to manage the key lifecycle securely.
- **Rate Limiting:** The API rate limiter now uses the API key to identify and track requests.
- **Documentation:** The API authentication documentation has been updated to reflect the new method.

* Add configurable API rate limiting

Two new variables are added to `.env.example`:
- `RATE_LIMIT_WINDOW_MS`: The time window in milliseconds for which requests are checked (defaults to 15 minutes).
- `RATE_LIMIT_MAX_REQUESTS`: The maximum number of requests allowed from an IP within the window (defaults to 100).

The installation documentation has been updated to reflect these new configuration options.

* Disable API operation in demo mode

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-04 16:56:45 +03:00
Wei S.
22b173cbe4 Feat: Implement API key authentication (#84)
* feat(auth): Implement API key authentication

This commit enables API access with an API key system. This change provides a better experience for programmatic access and third-party integrations.

Key changes include:
- **API Key Management:** Users can now generate, manage, and revoke persistent API keys through a new "API Keys" section in the settings UI.
- **Authentication Middleware:** API requests are now authenticated via an `X-API-KEY` header instead of the previous `Authorization: Bearer` token.
- **Backend Implementation:** Adds a new `api_keys` database table, along with corresponding services, controllers, and routes to manage the key lifecycle securely.
- **Rate Limiting:** The API rate limiter now uses the API key to identify and track requests.
- **Documentation:** The API authentication documentation has been updated to reflect the new method.

* Add configurable API rate limiting

Two new variables are added to `.env.example`:
- `RATE_LIMIT_WINDOW_MS`: The time window in milliseconds for which requests are checked (defaults to 15 minutes).
- `RATE_LIMIT_MAX_REQUESTS`: The maximum number of requests allowed from an IP within the window (defaults to 100).

The installation documentation has been updated to reflect these new configuration options.

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-04 15:07:53 +03:00
Wei S.
774b0d7a6b Bug fix: Status API response: needsSetup and Remove SUPER_API_KEY support (#83)
* Disable system settings for demo mode

* Status API response: needsSetup

* Remove SUPER_API_KEY support

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-03 16:30:06 +03:00
Wei S.
85607d2ab3 Disable system settings for demo mode (#78)
Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-01 13:29:45 +03:00
Wei S.
94021eab69 v0.3.0 release (#76)
* Remove extra ports in Docker Compose file

* Allow self-assigned cert

* Adding allow insecure cert option

* fix(IMAP): Share connections between each fetch email action

* Update docs: troubleshooting CORS error

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-09-01 12:44:22 +03:00
Wei S.
faefdac44a System settings: Copy locale files in backend build
Copy locale files in backend build
2025-08-31 15:10:40 +03:00
Wei S.
392f51dabc System settings: adding multi-language support for frontend (#72)
* System settings setup

* Multi-language support

* feat: Add internationalization (i18n) support to frontend

This commit introduces internationalization (i18n) to the frontend using the `sveltekit-i18n` library, allowing the user interface to be translated into multiple languages.

Key changes:
- Added translation files for 10 languages (en, de, es, fr, etc.).
- Replaced hardcoded text strings throughout the frontend components and pages with translation keys.
- Added a language selector to the system settings page, allowing administrators to set the default application language.
- Updated the backend settings API to store and expose the new language configuration.

* Adding greek translation

* feat(backend): Implement i18n for API responses

This commit introduces internationalization (i18n) to the backend API using the `i18next` library.

Hardcoded error and response messages in the API controllers have been replaced with translation keys, which are processed by the new i18next middleware. This allows for API responses to be translated into different languages.

The following dependencies were added:
- `i18next`
- `i18next-fs-backend`
- `i18next-http-middleware`

* Formatting code

* Translation revamp for frontend and backend, adding systems docs

* Docs site title

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-31 13:44:28 +03:00
Wei S.
baff1195c7 Feat: System settings (#66)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

* Remove inherent behavior, index userEmail, adding docs for IAM policies

* Format

* System settings setup

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-28 14:12:05 +03:00
Wei S.
f1da17e484 Fix: storage chart legend overflow (#70)
* Fix storage chart legend overflow

* fix storage legend overflow

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-24 16:10:24 +02:00
Wei S.
a2c55f36ee Cla v2 (#68)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

* Remove inherent behavior, index userEmail, adding docs for IAM policies

* Format

* CLA v2

* cla-v2

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-24 15:03:05 +02:00
Wei S.
9fdba4cd61 Role based access: Adding docs to docs site (#67)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

* Remove inherent behavior, index userEmail, adding docs for IAM policies

* Format

* Adding IAM policy documentation to Docs site

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-24 14:52:08 +02:00
Wei S.
108c646596 CLA-v2
CLA-v2: Clarifying LogicLabs OÜ is the entity contributors are signing the agreement with.
2025-08-24 15:05:15 +03:00
Wei S.
61e44c81f7 Role based access (#61)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

* Remove inherent behavior, index userEmail, adding docs for IAM policies

* Format

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-23 23:19:51 +03:00
Wei S.
f651aeab0e Role based access (#60)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-22 00:51:56 +03:00
Wei S.
3fb4290934 Role based access (#59)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-21 23:53:21 +03:00
Wei S.
8c33b63bdf feat: Role based access control (#58)
* Format checked, contributing.md update

* Middleware setup

* IAP API, create user/roles in frontend

* RBAC using CASL library

* Switch to CASL, secure search, resource-level access control

---------

Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-21 23:45:06 +03:00
David Girón
2b325f3461 feat: optimize Dockerfile (#47)
* define base image arg

* create base stage with common content

* chmod executable entrypoint file

this avoids re-copying the same file as it is modified in the Docker layer

* cache npm downloaded packages

avoids re-downloading deps if cache content is available
2025-08-19 12:17:32 +03:00
Til Wegener
4d3c164bc0 Fix UI size display and ingestion history graph (#50)
* fix: unify size display, improve graph interpolation & time readability

* fix display human-readable sizes in ingestion chart

* display human-readable sizes in ingestion chart

* fix: format code

* fix keep fallback for item.name
2025-08-19 11:06:31 +03:00
Wei S.
7288286fd9 Format checked, contributing.md update (#49)
Co-authored-by: Wayne <5291640+ringoinca@users.noreply.github.com>
2025-08-17 17:42:49 +03:00
152 changed files with 13590 additions and 1017 deletions

@@ -54,17 +54,19 @@ STORAGE_S3_FORCE_PATH_STYLE=false
# --- Security & Authentication ---
# Rate Limiting
# The window in milliseconds for which API requests are checked. Defaults to 60000 (1 minute).
RATE_LIMIT_WINDOW_MS=60000
# The maximum number of API requests allowed from an IP within the window. Defaults to 100.
RATE_LIMIT_MAX_REQUESTS=100
# JWT
# IMPORTANT: Change this to a long, random, and secret string in your .env file
JWT_SECRET=a-very-secret-key-that-you-should-change
JWT_EXPIRES_IN="7d"
# Set the credentials for the initial admin user.
SUPER_API_KEY=
# Master Encryption Key for sensitive data (Such as Ingestion source credentials and passwords)
# IMPORTANT: Generate a secure, random 32-byte hex string for this
# You can use `openssl rand -hex 32` to generate a key.
ENCRYPTION_KEY=
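The comments above point to `openssl rand -hex 32` for generating the encryption key. A minimal sketch (assuming `openssl` is on your PATH) that prints values you can paste into `.env`; the base64 length for `JWT_SECRET` is an illustrative choice, not a project requirement:

```shell
# Print a long random JWT secret and a 32-byte hex encryption key.
# Copy the output into your .env file.
printf 'JWT_SECRET=%s\n' "$(openssl rand -base64 48)"
printf 'ENCRYPTION_KEY=%s\n' "$(openssl rand -hex 32)"
```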

.github/CLA-v2.md vendored Normal file
@@ -0,0 +1,27 @@
# Contributor License Agreement (CLA)
Version: 2
This Agreement is for your protection as a Contributor as well as the protection of the maintainers of the Open Archiver software; it does not change your rights to use your own Contributions for any other purpose. Open Archiver is developed and maintained by LogicLabs OÜ, a private limited company established under the laws of the Republic of Estonia.
You accept and agree to the following terms and conditions for Your present and future Contributions submitted to LogicLabs OÜ. Except for the license granted herein to LogicLabs OÜ and recipients of software distributed by LogicLabs OÜ, You reserve all right, title, and interest in and to Your Contributions.
1. Definitions.
"You" (or "Your") shall mean the copyright owner or legal entity authorized by the copyright owner that is making this Agreement with LogicLabs OÜ. For legal entities, the entity making a Contribution and all other entities that control, are controlled by, or are under common control with that entity are considered to be a single Contributor.
"Contribution" shall mean any original work of authorship, including any modifications or additions to an existing work, that is intentionally submitted by You to LogicLabs OÜ for inclusion in, or documentation of, any of the products owned or managed by LogicLabs OÜ (the "Work"). For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to LogicLabs OÜ or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, LogicLabs OÜ for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by You as "Not a Contribution."
2. Grant of Copyright License. Subject to the terms and conditions of this Agreement, You grant to LogicLabs OÜ and to recipients of software distributed by LogicLabs OÜ a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare derivative works of, publicly display, publicly perform, sublicense, and distribute Your Contributions and such derivative works.
3. Grant of Patent License. Subject to the terms and conditions of this Agreement, You grant to LogicLabs OÜ and to recipients of software distributed by LogicLabs OÜ a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by You that are necessarily infringed by Your Contribution(s) alone or by combination of Your Contribution(s) with the Work to which such Contribution(s) was submitted. If any entity institutes patent litigation against You or any other entity (including a cross-claim or counterclaim in a lawsuit) alleging that your Contribution, or the Work to which you have contributed, constitutes direct or contributory patent infringement, then any patent licenses granted to that entity under this Agreement for that Contribution or Work shall terminate as of the date such litigation is filed.
4. You represent that you are legally entitled to grant the above license. If your employer(s) has rights to intellectual property that you create that includes your Contributions, you represent that you have received permission to make Contributions on behalf of that employer, that your employer has waived such rights for your Contributions to LogicLabs OÜ, or that your employer has executed a separate Contributor License Agreement with LogicLabs OÜ.
5. You represent that each of Your Contributions is Your original creation (see section 7 for submissions on behalf of others). You represent that Your Contribution submissions include complete details of any third-party license or other restriction (including, but not limited to, related patents and trademarks) of which you are personally aware and which are associated with any part of Your Contributions.
6. You are not expected to provide support for Your Contributions, except to the extent You desire to provide support. Unless required by applicable law or agreed to in writing, You provide Your Contributions on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE.
7. Should You wish to submit work that is not Your original creation, You may submit it to LogicLabs OÜ separately from any Contribution, identifying the complete details of its source and of any license or other restriction (including, but not limited to, related patents, trademarks, and license agreements) of which you are personally aware, and conspicuously marking the work as "Submitted on behalf of a third-party: [named here]".
8. You agree to notify LogicLabs OÜ of any facts or circumstances of which you become aware that would make these representations inaccurate in any respect.

@@ -23,8 +23,8 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PERSONAL_ACCESS_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
with:
path-to-signatures: 'signatures/version1/cla.json'
path-to-document: 'https://github.com/LogicLabs-OU/OpenArchiver/tree/main/.github/CLA.md'
path-to-signatures: 'signatures/version2/cla.json'
path-to-document: 'https://github.com/LogicLabs-OU/OpenArchiver/blob/main/.github/CLA-v2.md'
branch: 'main'
allowlist: 'wayneshn'

@@ -51,3 +51,13 @@ This project and everyone participating in it is governed by the [Open Archiver
- Follow the existing code style.
- Use TypeScript's strict mode.
- Avoid using `any` as a type. Define clear interfaces and types in the `packages/types` directory.
### Formatting
We use Prettier for code formatting. Before you commit new code, check the code format by running this command from the root folder:
`pnpm run lint`
If there are any formatting issues, fix them with:
`pnpm run format`

@@ -78,7 +78,7 @@ Open Archiver is built on a modern, scalable, and maintainable technology stack:
```bash
git clone https://github.com/LogicLabs-OU/OpenArchiver.git
cd open-archiver
cd OpenArchiver
```
2. **Configure your environment:**

@@ -6,7 +6,6 @@ services:
container_name: open-archiver
restart: unless-stopped
ports:
- '4000:4000' # Backend
- '3000:3000' # Frontend
env_file:
- .env
@@ -29,8 +28,6 @@ services:
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-password}
volumes:
- pgdata:/var/lib/postgresql/data
ports:
- '5432:5432'
networks:
- open-archiver-net
@@ -39,8 +36,6 @@ services:
container_name: valkey
restart: unless-stopped
command: valkey-server --requirepass ${REDIS_PASSWORD}
ports:
- '6379:6379'
volumes:
- valkeydata:/data
networks:
@@ -52,8 +47,6 @@ services:
restart: unless-stopped
environment:
MEILI_MASTER_KEY: ${MEILI_MASTER_KEY:-aSampleMasterKey}
ports:
- '7700:7700'
volumes:
- meilidata:/meili_data
networks:

@@ -1,21 +1,29 @@
# Dockerfile for Open Archiver
# 1. Build Stage: Install all dependencies and build the project
FROM node:22-alpine AS build
ARG BASE_IMAGE=node:22-alpine
# 0. Base Stage: Define all common dependencies and setup
FROM ${BASE_IMAGE} AS base
WORKDIR /app
# Install pnpm
RUN npm install -g pnpm
RUN --mount=type=cache,target=/root/.npm \
npm install -g pnpm
# Copy manifests and lockfile
COPY package.json pnpm-workspace.yaml pnpm-lock.yaml* ./
COPY packages/backend/package.json ./packages/backend/
COPY packages/frontend/package.json ./packages/frontend/
COPY packages/types/package.json ./packages/types/
# 1. Build Stage: Install all dependencies and build the project
FROM base AS build
COPY packages/frontend/svelte.config.js ./packages/frontend/
# Install all dependencies. Use --shamefully-hoist to create a flat node_modules structure
RUN pnpm install --shamefully-hoist --frozen-lockfile --prod=false
ENV PNPM_HOME="/pnpm"
RUN --mount=type=cache,id=pnpm,target=/pnpm/store \
pnpm install --shamefully-hoist --frozen-lockfile --prod=false
# Copy the rest of the source code
COPY . .
@@ -24,20 +32,8 @@ COPY . .
RUN pnpm build
# 2. Production Stage: Install only production dependencies and copy built artifacts
FROM node:22-alpine AS production
WORKDIR /app
FROM base AS production
# Install pnpm
RUN npm install -g pnpm
# Copy manifests and lockfile
COPY package.json pnpm-workspace.yaml pnpm-lock.yaml* ./
COPY packages/backend/package.json ./packages/backend/
COPY packages/frontend/package.json ./packages/frontend/
COPY packages/types/package.json ./packages/types/
# Install production dependencies
# RUN pnpm install --shamefully-hoist --frozen-lockfile --prod=true
# Copy built application from build stage
COPY --from=build /app/packages/backend/dist ./packages/backend/dist
@@ -48,7 +44,6 @@ COPY --from=build /app/packages/backend/src/database/migrations ./packages/backe
# Copy the entrypoint script and make it executable
COPY docker/docker-entrypoint.sh /usr/local/bin/
RUN chmod +x /usr/local/bin/docker-entrypoint.sh
# Expose the port the app runs on
EXPOSE 4000

docker/docker-entrypoint.sh Normal file → Executable file
@@ -10,8 +10,9 @@ export default defineConfig({
'data-website-id': '2c8b452e-eab5-4f82-8ead-902d8f8b976f',
},
],
['link', { rel: 'icon', href: '/logo-sq.svg' }],
],
title: 'Open Archiver',
title: 'Open Archiver Docs',
description: 'Official documentation for the Open Archiver project.',
themeConfig: {
search: {
@@ -53,6 +54,16 @@ export default defineConfig({
{ text: 'PST Import', link: '/user-guides/email-providers/pst' },
],
},
{
text: 'Settings',
collapsed: true,
items: [
{
text: 'System',
link: '/user-guides/settings/system',
},
],
},
],
},
{
@@ -60,6 +71,7 @@ export default defineConfig({
items: [
{ text: 'Overview', link: '/api/' },
{ text: 'Authentication', link: '/api/authentication' },
{ text: 'Rate Limiting', link: '/api/rate-limiting' },
{ text: 'Auth', link: '/api/auth' },
{ text: 'Archived Email', link: '/api/archived-email' },
{ text: 'Dashboard', link: '/api/dashboard' },
@@ -73,6 +85,10 @@ export default defineConfig({
items: [
{ text: 'Overview', link: '/services/' },
{ text: 'Storage Service', link: '/services/storage-service' },
{
text: 'IAM Service',
items: [{ text: 'IAM Policies', link: '/services/iam-service/iam-policy' }],
},
],
},
],

@@ -1,60 +1,25 @@
# API Authentication
To access protected API endpoints, you need to include a JSON Web Token (JWT) in the `Authorization` header of your requests.
To access protected API endpoints, you need to include a user-generated API key in the `X-API-KEY` header of your requests.
## Obtaining a JWT
## 1. Creating an API Key
First, you need to authenticate with the `/api/v1/auth/login` endpoint by providing your email and password. If the credentials are correct, the API will return an `accessToken`.
You can create, manage, and view your API keys through the application's user interface.
**Request:**
1. Navigate to **Settings > API Keys** in the dashboard.
2. Click the **"Generate API Key"** button.
3. Provide a descriptive name for your key and select an expiration period.
4. The new API key will be displayed. **Copy this key immediately and store it in a secure location. You will not be able to see it again.**
```http
POST /api/v1/auth/login
Content-Type: application/json
## 2. Making Authenticated Requests
{
"email": "user@example.com",
"password": "your-password"
}
```
**Successful Response:**
```json
{
"accessToken": "your.jwt.token",
"user": {
"id": "user-id",
"email": "user@example.com",
"role": "user"
}
}
```
## Making Authenticated Requests
Once you have the `accessToken`, you must include it in the `Authorization` header of all subsequent requests to protected endpoints, using the `Bearer` scheme.
Once you have your API key, you must include it in the `X-API-KEY` header of all subsequent requests to protected API endpoints.
**Example:**
```http
GET /api/v1/dashboard/stats
Authorization: Bearer your.jwt.token
X-API-KEY: a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2
```
If the token is missing, expired, or invalid, the API will respond with a `401 Unauthorized` status code.
## Using a Super API Key
Alternatively, for server-to-server communication or scripts, you can use a super API key. This key provides unrestricted access to the API and should be kept secret.
You can set the `SUPER_API_KEY` in your `.env` file.
To authenticate using the super API key, include it in the `Authorization` header as a Bearer token.
**Example:**
```http
GET /api/v1/dashboard/stats
Authorization: Bearer your-super-secret-api-key
```
If the API key is missing, expired, or invalid, the API will respond with a `401 Unauthorized` status code.
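A minimal client sketch of the flow described above, in TypeScript. This is a hypothetical helper for illustration, not project code; the base URL and key values are placeholders, while the `X-API-KEY` header name and the `/api/v1/dashboard/stats` path come from the examples in this document:

```typescript
// Build the auth headers for a protected endpoint; the header name
// comes from the documentation above.
function buildAuthHeaders(apiKey: string): Record<string, string> {
	return { 'X-API-KEY': apiKey };
}

// Sketch of a call to a protected endpoint, surfacing the documented
// 401 behavior for missing/expired/invalid keys.
async function getDashboardStats(baseUrl: string, apiKey: string): Promise<unknown> {
	const res = await fetch(`${baseUrl}/api/v1/dashboard/stats`, {
		headers: buildAuthHeaders(apiKey),
	});
	if (res.status === 401) {
		throw new Error('API key missing, expired, or invalid');
	}
	return res.json();
}
```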

docs/api/rate-limiting.md Normal file
@@ -0,0 +1,51 @@
# Rate Limiting
The API implements rate limiting to protect your instance from denial-of-service (DoS) and brute-force attacks and to keep the application stable under load.
## How It Works
The rate limiter restricts the number of requests an IP address can make within a specific time frame. These limits are configurable via environment variables to suit your security needs.
By default, the limits are:
- **100 requests** per **1 minute** per IP address.
If this limit is exceeded, the API will respond with an HTTP `429 Too Many Requests` status code.
### Response Body
When an IP address is rate-limited, the API will return a JSON response with the following format:
```json
{
"status": 429,
"message": "Too many requests from this IP, please try again after 15 minutes"
}
```
## Configuration
You can customize the rate-limiting settings by setting the following environment variables in your `.env` file:
- `RATE_LIMIT_WINDOW_MS`: The time window in milliseconds. Defaults to `60000` (1 minute).
- `RATE_LIMIT_MAX_REQUESTS`: The maximum number of requests allowed per IP address within the time window. Defaults to `100`.
## Handling Rate Limits
If you are developing a client that interacts with the API, you should handle rate limiting gracefully:
1. **Check the Status Code**: Monitor for a `429` HTTP status code in responses.
2. **Implement a Retry Mechanism**: When you receive a `429` response, it is best practice to wait before retrying the request. Implementing an exponential backoff strategy is recommended.
3. **Check Headers**: The response will include the following standard headers to help you manage your request rate:
- `RateLimit-Limit`: The maximum number of requests allowed in the current window.
- `RateLimit-Remaining`: The number of requests you have left in the current window.
- `RateLimit-Reset`: The time when the rate limit window will reset, in UTC epoch seconds.
## Excluded Endpoints
Certain essential endpoints are excluded from rate limiting to ensure the application's UI remains responsive. These are:
- `/auth/status`
- `/settings/system`
These endpoints can be called as needed without affecting your rate limit count.
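The client guidance above (watch for `429`, back off exponentially, consult the `RateLimit-*` headers) can be sketched as follows. This is a hypothetical helper, not part of the project; the base delay and cap are illustrative defaults, and the `RateLimit-Reset` handling assumes the UTC-epoch-seconds format documented above:

```typescript
// Exponential backoff delay for the nth retry (attempt starts at 0).
// baseMs and capMs are illustrative defaults, not values from the API.
function backoffMs(attempt: number, baseMs = 500, capMs = 30_000): number {
	return Math.min(capMs, baseMs * 2 ** attempt);
}

// Retry a request on HTTP 429, preferring the standard RateLimit-Reset
// header (UTC epoch seconds) when present, else exponential backoff.
async function fetchWithRetry(url: string, init: RequestInit, maxRetries = 3): Promise<Response> {
	for (let attempt = 0; ; attempt++) {
		const res = await fetch(url, init);
		if (res.status !== 429 || attempt >= maxRetries) return res;
		const reset = Number(res.headers.get('RateLimit-Reset'));
		const waitMs =
			Number.isFinite(reset) && reset > 0
				? Math.max(0, reset * 1000 - Date.now())
				: backoffMs(attempt);
		await new Promise((resolve) => setTimeout(resolve, waitMs));
	}
}
```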

@@ -1,141 +0,0 @@
# IAM Policies Guide
This document provides a comprehensive guide to the Identity and Access Management (IAM) policies in Open Archiver. Our policy structure is inspired by AWS IAM, providing a powerful and flexible way to manage permissions.
## 1. Policy Structure
A policy is a JSON object that consists of one or more statements. Each statement includes an `Effect`, `Action`, and `Resource`.
```json
{
"Effect": "Allow",
"Action": ["archive:read", "archive:search"],
"Resource": ["archive/all"]
}
```
- **`Effect`**: Specifies whether the statement results in an `Allow` or `Deny`. An explicit `Deny` always overrides an `Allow`.
- **`Action`**: A list of operations that the policy grants or denies permission to perform. Actions are formatted as `service:operation`.
- **`Resource`**: A list of resources to which the actions apply. Resources are specified in a hierarchical format. Wildcards (`*`) can be used.
## 2. Wildcard Support
Our IAM system supports wildcards (`*`) in both `Action` and `Resource` fields to provide flexible permission management, as defined in the `PolicyValidator`.
### Action Wildcards
You can use wildcards to grant broad permissions for actions:
- **Global Wildcard (`*`)**: A standalone `*` in the `Action` field grants permission for all possible actions across all services.
```json
"Action": ["*"]
```
- **Service-Level Wildcard (`service:*`)**: A wildcard at the end of an action string grants permission for all actions within that specific service.
```json
"Action": ["archive:*"]
```
### Resource Wildcards
Wildcards can also be used to specify resources:
- **Global Wildcard (`*`)**: A standalone `*` in the `Resource` field applies the policy to all resources in the system.
```json
"Resource": ["*"]
```
- **Partial Wildcards**: Some services allow wildcards at specific points in the resource path to refer to all resources of a certain type. For example, to target all ingestion sources:
```json
"Resource": ["ingestion-source/*"]
```
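A minimal sketch of how the wildcard rules above can be evaluated. This is illustrative only; the project's `PolicyValidator` is the authoritative implementation:

```typescript
// Match an action or resource string against a policy pattern:
// a bare '*' matches everything, and a trailing '*' matches any suffix
// (covering both 'service:*' actions and 'ingestion-source/*' resources).
function matchesPattern(pattern: string, value: string): boolean {
	if (pattern === '*') return true;
	if (pattern.endsWith('*')) return value.startsWith(pattern.slice(0, -1));
	return pattern === value;
}
```

For example, `matchesPattern('archive:*', 'archive:read')` and `matchesPattern('ingestion-source/*', 'ingestion-source/123')` both hold, while `matchesPattern('archive:*', 'ingestion:readSource')` does not.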
## 3. Actions and Resources by Service
The following sections define the available actions and resources, categorized by their respective services.
### Service: `archive`
The `archive` service pertains to all actions related to accessing and managing archived emails.
**Actions:**
| Action | Description |
| :--------------- | :--------------------------------------------------------------------- |
| `archive:read` | Grants permission to read the content and metadata of archived emails. |
| `archive:search` | Grants permission to perform search queries against the email archive. |
| `archive:export` | Grants permission to export search results or individual emails. |
**Resources:**
| Resource | Description |
| :------------------------------------ | :--------------------------------------------------------------------------------------- |
| `archive/all` | Represents the entire email archive. |
| `archive/ingestion-source/{sourceId}` | Scopes the action to emails from a specific ingestion source. |
| `archive/mailbox/{email}` | Scopes the action to a single, specific mailbox, usually identified by an email address. |
| `archive/custodian/{custodianId}` | Scopes the action to emails belonging to a specific custodian. |
---
### Service: `ingestion`
The `ingestion` service covers the management of email ingestion sources.
**Actions:**
| Action | Description |
| :----------------------- | :--------------------------------------------------------------------------- |
| `ingestion:createSource` | Grants permission to create a new ingestion source. |
| `ingestion:readSource` | Grants permission to view the details of ingestion sources. |
| `ingestion:updateSource` | Grants permission to modify the configuration of an ingestion source. |
| `ingestion:deleteSource` | Grants permission to delete an ingestion source. |
| `ingestion:manageSync` | Grants permission to trigger, pause, or force a sync on an ingestion source. |
**Resources:**
| Resource | Description |
| :---------------------------- | :-------------------------------------------------------- |
| `ingestion-source/*` | Represents all ingestion sources. |
| `ingestion-source/{sourceId}` | Scopes the action to a single, specific ingestion source. |
---
### Service: `system`
The `system` service is for managing system-level settings, users, and roles.
**Actions:**
| Action | Description |
| :---------------------- | :-------------------------------------------------- |
| `system:readSettings` | Grants permission to view system settings. |
| `system:updateSettings` | Grants permission to modify system settings. |
| `system:readUsers` | Grants permission to list and view user accounts. |
| `system:createUser` | Grants permission to create new user accounts. |
| `system:updateUser` | Grants permission to modify existing user accounts. |
| `system:deleteUser` | Grants permission to delete user accounts. |
| `system:assignRole` | Grants permission to assign roles to users. |
**Resources:**
| Resource | Description |
| :--------------------- | :---------------------------------------------------- |
| `system/settings` | Represents the system configuration. |
| `system/users` | Represents all user accounts within the system. |
| `system/user/{userId}` | Scopes the action to a single, specific user account. |
---
### Service: `dashboard`
The `dashboard` service relates to viewing analytics and overview information.
**Actions:**
| Action | Description |
| :--------------- | :-------------------------------------------------------------- |
| `dashboard:read` | Grants permission to view all dashboard widgets and statistics. |
**Resources:**
| Resource | Description |
| :------------ | :------------------------------------------ |
| `dashboard/*` | Represents all components of the dashboard. |


@@ -0,0 +1,289 @@
# IAM Policies
This document provides a guide to creating and managing IAM policies in Open Archiver. It is intended for developers and administrators who need to configure granular access control for users and roles.
## Policy Structure
IAM policies are defined as an array of JSON objects, where each object represents a single permission rule. The structure of a policy object is as follows:
```json
{
    "action": "read",           // or an array: ["read", "create"]
    "subject": "ingestion",     // or an array: ["ingestion", "dashboard"]
    "conditions": {
        "field_name": "value"
    },
    "inverted": false           // or true
}
```
- `action`: The action(s) to be performed on the subject. Can be a single string or an array of strings.
- `subject`: The resource(s) or entity on which the action is to be performed. Can be a single string or an array of strings.
- `conditions`: (Optional) A set of conditions that must be met for the permission to be granted.
- `inverted`: (Optional) When set to `true`, this inverts the rule, turning it from a "can" rule into a "cannot" rule. This is useful for creating exceptions to broader permissions.
## Actions
The following actions are available for use in IAM policies:
- `manage`: A wildcard action that grants all permissions on a subject (`create`, `read`, `update`, `delete`, `search`, `sync`).
- `create`: Allows the user to create a new resource.
- `read`: Allows the user to view a resource.
- `update`: Allows the user to modify an existing resource.
- `delete`: Allows the user to delete a resource.
- `search`: Allows the user to search for resources.
- `sync`: Allows the user to synchronize a resource.
## Subjects
The following subjects are available for use in IAM policies:
- `all`: A wildcard subject that represents all resources.
- `archive`: Represents archived emails.
- `ingestion`: Represents ingestion sources.
- `settings`: Represents system settings.
- `users`: Represents user accounts.
- `roles`: Represents user roles.
- `dashboard`: Represents the dashboard.
## Advanced Conditions with MongoDB-Style Queries
Conditions are the key to creating fine-grained access control rules. They are defined as a JSON object where each key represents a field on the subject, and the value defines the criteria for that field.
All conditions within a single rule are implicitly joined with **AND** logic: for a permission to be granted, the resource must satisfy _all_ of the specified conditions.
The power of this system comes from its use of a subset of [MongoDB's query language](https://www.mongodb.com/docs/manual/), which provides a flexible and expressive way to define complex rules. These rules are translated into native queries for both the PostgreSQL database (via Drizzle ORM) and the Meilisearch engine.
### Supported Operators and Examples
Here is a detailed breakdown of the supported operators with examples.
#### `$eq` (Equal)
This is the default operator. If you provide a simple key-value pair, it is treated as an equality check.
```json
// This rule...
{ "status": "active" }
// ...is equivalent to this:
{ "status": { "$eq": "active" } }
```
**Use Case**: Grant access to an ingestion source only if its status is `active`.
#### `$ne` (Not Equal)
Matches documents where the field value is not equal to the specified value.
```json
{ "provider": { "$ne": "pst_import" } }
```
**Use Case**: Allow a user to see all ingestion sources except for PST imports.
#### `$in` (In Array)
Matches documents where the field value is one of the values in the specified array.
```json
{
"id": {
"$in": ["INGESTION_ID_1", "INGESTION_ID_2"]
}
}
```
**Use Case**: Grant an auditor access to a specific list of ingestion sources.
#### `$nin` (Not In Array)
Matches documents where the field value is not one of the values in the specified array.
```json
{ "provider": { "$nin": ["pst_import", "eml_import"] } }
```
**Use Case**: Hide all manual import sources from a specific user role.
#### `$lt` / `$lte` (Less Than / Less Than or Equal)
Matches documents where the field value is less than (`$lt`) or less than or equal to (`$lte`) the specified value. This is useful for numeric or date-based comparisons.
```json
{ "sentAt": { "$lt": "2024-01-01T00:00:00.000Z" } }
```
#### `$gt` / `$gte` (Greater Than / Greater Than or Equal)
Matches documents where the field value is greater than (`$gt`) or greater than or equal to (`$gte`) the specified value.
```json
{ "sentAt": { "$lt": "2024-01-01T00:00:00.000Z" } }
```
#### `$exists`
Matches documents that have (or do not have) the specified field.
```json
// Grant access only if a 'lastSyncStatusMessage' exists
{ "lastSyncStatusMessage": { "$exists": true } }
```
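The operators above can be sketched as a small in-memory evaluator. This is an illustrative approximation only; the real system translates these conditions into native Drizzle and Meilisearch queries rather than evaluating them in application code. Note that ISO 8601 date strings compare correctly as plain strings, which is why the `sentAt` examples work.

```typescript
type Operators = {
    $eq?: unknown; $ne?: unknown;
    $in?: unknown[]; $nin?: unknown[];
    $lt?: string | number; $lte?: string | number;
    $gt?: string | number; $gte?: string | number;
    $exists?: boolean;
};
type Condition = Record<string, unknown | Operators>;

function matchesConditions(doc: Record<string, unknown>, conditions: Condition): boolean {
    // All fields are implicitly AND-ed together.
    return Object.entries(conditions).every(([field, criteria]) => {
        const value = doc[field];
        // A plain value is shorthand for { $eq: value }.
        if (typeof criteria !== "object" || criteria === null || Array.isArray(criteria)) {
            return value === criteria;
        }
        return Object.entries(criteria as Operators).every(([op, operand]) => {
            switch (op) {
                case "$eq": return value === operand;
                case "$ne": return value !== operand;
                case "$in": return (operand as unknown[]).includes(value);
                case "$nin": return !(operand as unknown[]).includes(value);
                case "$lt": return (value as any) < (operand as any);
                case "$lte": return (value as any) <= (operand as any);
                case "$gt": return (value as any) > (operand as any);
                case "$gte": return (value as any) >= (operand as any);
                case "$exists": return (field in doc) === operand;
                default: return false;
            }
        });
    });
}

const source = { status: "active", provider: "google", sentAt: "2023-06-01T00:00:00.000Z" };
console.log(matchesConditions(source, { status: "active" }));                            // true
console.log(matchesConditions(source, { provider: { $nin: ["pst_import"] } }));          // true
console.log(matchesConditions(source, { sentAt: { $lt: "2024-01-01T00:00:00.000Z" } })); // true
```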
## Inverted Rules: Creating Exceptions with `cannot`
By default, all rules are "can" rules, meaning they grant permissions. However, you can create a "cannot" rule by adding `"inverted": true` to a policy object. This is extremely useful for creating exceptions to broader permissions.
A common pattern is to grant broad access and then use an inverted rule to carve out a specific restriction.
**Use Case**: Grant a user access to all ingestion sources _except_ for one specific source.
This is achieved with two rules:
1. A "can" rule that grants `read` access to the `ingestion` subject.
2. An inverted "cannot" rule that denies `read` access for the specific ingestion `id`.
```json
[
{
"action": "read",
"subject": "ingestion"
},
{
"inverted": true,
"action": "read",
"subject": "ingestion",
"conditions": {
"id": "SPECIFIC_INGESTION_ID_TO_EXCLUDE"
}
}
]
```
## Policy Evaluation Logic
The system evaluates policies by combining all relevant rules for a user. The logic is simple:
- A user has permission if at least one `can` rule allows it.
- A permission is denied if a `cannot` (`"inverted": true`) rule explicitly forbids it, even if a `can` rule allows it. `cannot` rules always take precedence.
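The two rules above can be sketched as follows. This is an illustrative simplification (conditions are omitted, and `manage`/`all` are treated as plain wildcards); the real system delegates evaluation to a CASL ability built from the stored `CaslPolicy` rules.

```typescript
interface Rule {
    action: string | string[];
    subject: string | string[];
    inverted?: boolean;
}

const asArray = (v: string | string[]) => (Array.isArray(v) ? v : [v]);

// A rule matches a request when its action and subject cover it.
// "manage" and "all" act as wildcards, per the Actions and Subjects sections.
function ruleMatches(rule: Rule, action: string, subject: string): boolean {
    const actions = asArray(rule.action);
    const subjects = asArray(rule.subject);
    return (
        (actions.includes(action) || actions.includes("manage")) &&
        (subjects.includes(subject) || subjects.includes("all"))
    );
}

function can(rules: Rule[], action: string, subject: string): boolean {
    const matching = rules.filter((r) => ruleMatches(r, action, subject));
    // `cannot` (inverted) rules always win over `can` rules.
    if (matching.some((r) => r.inverted)) return false;
    return matching.some((r) => !r.inverted);
}

const rules: Rule[] = [
    { action: "read", subject: "ingestion" },
    { inverted: true, action: "read", subject: "ingestion" },
];
console.log(can(rules, "read", "ingestion")); // false: the cannot rule wins
```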
### Dynamic Policies with Placeholders
To create dynamic policies that are specific to the current user, you can use the `${user.id}` placeholder in the `conditions` object. This placeholder will be replaced with the ID of the current user at runtime.
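A hypothetical sketch of how such a placeholder might be resolved at runtime (the function name and shape are illustrative, not the actual implementation):

```typescript
// Recursively replaces the "${user.id}" placeholder in condition values
// with the ID of the current user.
function resolvePlaceholders(
    conditions: Record<string, unknown>,
    userId: string
): Record<string, unknown> {
    const resolved: Record<string, unknown> = {};
    for (const [key, value] of Object.entries(conditions)) {
        if (value === "${user.id}") {
            resolved[key] = userId;
        } else if (typeof value === "object" && value !== null && !Array.isArray(value)) {
            // Recurse into nested condition objects (e.g. operator objects).
            resolved[key] = resolvePlaceholders(value as Record<string, unknown>, userId);
        } else {
            resolved[key] = value;
        }
    }
    return resolved;
}

console.log(resolvePlaceholders({ userId: "${user.id}" }, "user-123")); // { userId: "user-123" }
```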
## Special Permissions for User and Role Management
It is important to note that while `read` access to `users` and `roles` can be granted granularly, any actions that modify these resources (`create`, `update`, `delete`) are restricted to Super Admins.
A user must have the `{ "action": "manage", "subject": "all" }` permission (typically held by a Super Admin role) to manage users and roles. This is a security measure to prevent unauthorized changes to user accounts and permissions.
## Policy Examples
Here are several examples based on the default roles in the system, demonstrating how to combine actions, subjects, and conditions to achieve specific access control scenarios.
### Administrator
This policy grants a user full access to all resources using wildcards.
```json
[
{
"action": "manage",
"subject": "all"
}
]
```
### End-User
This policy allows a user to view the dashboard, create new ingestion sources, and fully manage the ingestion sources they own.
```json
[
{
"action": "read",
"subject": "dashboard"
},
{
"action": "create",
"subject": "ingestion"
},
{
"action": "manage",
"subject": "ingestion",
"conditions": {
"userId": "${user.id}"
}
},
{
"action": "manage",
"subject": "archive",
"conditions": {
"ingestionSource.userId": "${user.id}" // also needs to give permission to archived emails created by the user
}
}
]
```
### Global Read-Only Auditor
This policy grants read and search access across most of the application's resources, making it suitable for an auditor who needs to view data without modifying it.
```json
[
{
"action": ["read", "search"],
"subject": ["ingestion", "archive", "dashboard", "users", "roles"]
}
]
```
### Ingestion Admin
This policy grants full control over all ingestion sources and archives, but no other resources.
```json
[
{
"action": "manage",
"subject": "ingestion"
}
]
```
### Auditor for Specific Ingestion Sources
This policy demonstrates how to grant access to a specific list of ingestion sources using the `$in` operator.
```json
[
{
"action": ["read", "search"],
"subject": "ingestion",
"conditions": {
"id": {
"$in": ["INGESTION_ID_1", "INGESTION_ID_2"]
}
}
}
]
```
### Limit Access to a Specific Mailbox
This policy grants a user access to a specific ingestion source, but only allows them to see emails belonging to a single user within that source.
This is achieved with a single `can` rule: it grants `read` and `search` access to the `archive` subject, but only where `userEmail` matches.
```json
[
{
"action": ["read", "search"],
"subject": "archive",
"conditions": {
"userEmail": "user1@example.com"
}
}
]
```


@@ -105,12 +105,14 @@ These variables are used by `docker-compose.yml` to configure the services.
#### Security & Authentication
| Variable | Description | Default Value |
| -------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------ |
| `JWT_SECRET` | A secret key for signing JWT tokens. | `a-very-secret-key-that-you-should-change` |
| `JWT_EXPIRES_IN` | The expiration time for JWT tokens. | `7d` |
| ~~`SUPER_API_KEY`~~ (Deprecated) | An API key with super admin privileges. Deprecated since v0.3.0, following the rollout of the role-based access control system. | |
| `RATE_LIMIT_WINDOW_MS` | The window in milliseconds for which API requests are checked. | `900000` (15 minutes) |
| `RATE_LIMIT_MAX_REQUESTS` | The maximum number of API requests allowed from an IP within the window. | `100` |
| `ENCRYPTION_KEY` | A 32-byte hex string for encrypting sensitive data in the database. | |
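As a rough sketch of the fixed-window behavior these two rate-limit variables control (illustrative only; the server's actual middleware may differ):

```typescript
// Minimal fixed-window rate limiter. The two knobs correspond to
// RATE_LIMIT_WINDOW_MS and RATE_LIMIT_MAX_REQUESTS; the defaults
// shown match the table above (15 minutes, 100 requests).
function createRateLimiter(windowMs = 900_000, maxRequests = 100) {
    const hits = new Map<string, { count: number; windowStart: number }>();
    return (clientKey: string, now: number = Date.now()): boolean => {
        const entry = hits.get(clientKey);
        if (!entry || now - entry.windowStart >= windowMs) {
            // First request of a fresh window for this client.
            hits.set(clientKey, { count: 1, windowStart: now });
            return true;
        }
        entry.count += 1;
        return entry.count <= maxRequests; // false -> respond with HTTP 429
    };
}

const allowRequest = createRateLimiter(); // uses the documented defaults
```

With the defaults, the 101st request from the same client within a 15-minute window would be rejected; once the window elapses, the count resets.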
## 3. Run the Application
@@ -297,3 +299,31 @@ After you've saved the changes, run the following command in your terminal to ap
```
After this, any new data will be saved directly into the `./data/open-archiver` folder in your project directory.
## Troubleshooting
### 403 Cross-Site POST Forbidden Error
If you are running the application behind a reverse proxy or have mapped the application to a different port (e.g., `3005:3000`), you may encounter a `403 Cross-site POST from submissions are forbidden` error when uploading files.
To resolve this, you must set the `ORIGIN` environment variable to the URL of your application. This ensures that the backend can verify the origin of requests and prevent cross-site request forgery (CSRF) attacks.
Add the following line to your `.env` file, replacing `<your_host>` and `<your_port>` with your specific values:
```bash
ORIGIN=http://<your_host>:<your_port>
```
For example, if your application is accessible at `http://localhost:3005`, you would set the variable as follows:
```bash
ORIGIN=http://localhost:3005
```
After adding the `ORIGIN` variable, restart your Docker containers for the changes to take effect:
```bash
docker-compose up -d --force-recreate
```
This will ensure that your file uploads are correctly authorized.


@@ -0,0 +1,32 @@
# System Settings
System settings allow administrators to configure the global look and theme of the application. These settings apply to all users.
## Configuration
### Language
This setting determines the default display language for the application UI. The selected language will be used for all interface elements, including menus, labels, and messages.
> **Important:** When the language is changed, the backend (API) language will only change after a restart of the server. The frontend will update immediately.
Supported languages:
- English
- German
- French
- Estonian
- Spanish
- Italian
- Portuguese
- Dutch
- Greek
- Japanese
### Default Theme
This setting controls the default color theme for the application. Users can choose between light, dark, or system default. The system default theme will sync with the user's operating system theme.
### Support Email
This setting allows administrators to provide a public-facing email address for user support inquiries. This email address may be displayed on error pages or in other areas where users may need to contact support.


@@ -5,7 +5,8 @@
"main": "dist/index.js",
"scripts": {
"dev": "ts-node-dev --respawn --transpile-only src/index.ts ",
"build": "tsc && pnpm copy-assets",
"copy-assets": "cp -r src/locales dist/locales",
"start": "node dist/index.js",
"start:ingestion-worker": "node dist/workers/ingestion.worker.js",
"start:indexing-worker": "node dist/workers/indexing.worker.js",
@@ -22,6 +23,7 @@
"@aws-sdk/client-s3": "^3.844.0",
"@aws-sdk/lib-storage": "^3.844.0",
"@azure/msal-node": "^3.6.3",
"@casl/ability": "^6.7.3",
"@microsoft/microsoft-graph-client": "^3.0.7",
"@open-archiver/types": "workspace:*",
"archiver": "^7.0.1",
@@ -39,6 +41,9 @@
"express-validator": "^7.2.1",
"google-auth-library": "^10.1.0",
"googleapis": "^152.0.0",
"i18next": "^25.4.2",
"i18next-fs-backend": "^2.6.0",
"i18next-http-middleware": "^3.8.0",
"imapflow": "^1.0.191",
"jose": "^6.0.11",
"mailparser": "^3.7.4",
@@ -55,7 +60,8 @@
"sqlite3": "^5.1.7",
"tsconfig-paths": "^4.2.0",
"xlsx": "^0.18.5",
"yauzl": "^3.2.0",
"zod": "^4.1.5"
},
"devDependencies": {
"@bull-board/api": "^6.11.0",


@@ -0,0 +1,66 @@
import { Request, Response } from 'express';
import { ApiKeyService } from '../../services/ApiKeyService';
import { z } from 'zod';
import { config } from '../../config';
const generateApiKeySchema = z.object({
name: z
.string()
.min(1, 'API key name must be at least 1 character')
.max(255, 'API key name must not be more than 255 characters'),
expiresInDays: z
.number()
.int()
.positive('Only positive number is allowed')
.max(730, 'The API key must expire within 2 years / 730 days.'),
});
export class ApiKeyController {
public async generateApiKey(req: Request, res: Response) {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { name, expiresInDays } = generateApiKeySchema.parse(req.body);
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const userId = req.user.sub;
const key = await ApiKeyService.generate(userId, name, expiresInDays);
res.status(201).json({ key });
} catch (error) {
if (error instanceof z.ZodError) {
return res
.status(400)
.json({ message: req.t('api.requestBodyInvalid'), errors: error.message });
}
res.status(500).json({ message: req.t('errors.internalServerError') });
}
}
public async getApiKeys(req: Request, res: Response) {
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const userId = req.user.sub;
const keys = await ApiKeyService.getKeys(userId);
res.status(200).json(keys);
}
public async deleteApiKey(req: Request, res: Response) {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { id } = req.params;
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const userId = req.user.sub;
await ApiKeyService.deleteKey(id, userId);
res.status(204).send(); // 204 No Content responses must not include a body
}
}


@@ -8,36 +8,48 @@ export class ArchivedEmailController {
const { ingestionSourceId } = req.params;
const page = parseInt(req.query.page as string, 10) || 1;
const limit = parseInt(req.query.limit as string, 10) || 10;
const userId = req.user?.sub;
if (!userId) {
return res.status(401).json({ message: req.t('errors.unauthorized') });
}
const result = await ArchivedEmailService.getArchivedEmails(
ingestionSourceId,
page,
limit,
userId
);
return res.status(200).json(result);
} catch (error) {
console.error('Get archived emails error:', error);
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public getArchivedEmailById = async (req: Request, res: Response): Promise<Response> => {
try {
const { id } = req.params;
const email = await ArchivedEmailService.getArchivedEmailById(id);
const userId = req.user?.sub;
if (!userId) {
return res.status(401).json({ message: req.t('errors.unauthorized') });
}
const email = await ArchivedEmailService.getArchivedEmailById(id, userId);
if (!email) {
return res.status(404).json({ message: req.t('archivedEmail.notFound') });
}
return res.status(200).json(email);
} catch (error) {
console.error(`Get archived email by id ${req.params.id} error:`, error);
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public deleteArchivedEmail = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
@@ -47,11 +59,11 @@ export class ArchivedEmailController {
console.error(`Delete archived email ${req.params.id} error:`, error);
if (error instanceof Error) {
if (error.message === 'Archived email not found') {
return res.status(404).json({ message: req.t('archivedEmail.notFound') });
}
return res.status(500).json({ message: error.message });
}
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
}


@@ -1,10 +1,13 @@
import type { Request, Response } from 'express';
import { AuthService } from '../../services/AuthService';
import { UserService } from '../../services/UserService';
import { IamService } from '../../services/IamService';
import { db } from '../../database';
import * as schema from '../../database/schema';
import { eq, sql } from 'drizzle-orm';
import 'dotenv/config';
import { AuthorizationService } from '../../services/AuthorizationService';
import { CaslPolicy } from '@open-archiver/types';
export class AuthController {
#authService: AuthService;
@@ -24,7 +27,7 @@ export class AuthController {
const { email, password, first_name, last_name } = req.body;
if (!email || !password || !first_name || !last_name) {
return res.status(400).json({ message: req.t('auth.setup.allFieldsRequired') });
}
try {
@@ -34,7 +37,7 @@ export class AuthController {
const userCount = Number(userCountResult[0].count);
if (userCount > 0) {
return res.status(403).json({ message: req.t('auth.setup.alreadyCompleted') });
}
const newUser = await this.#userService.createAdminUser(
@@ -45,7 +48,7 @@ export class AuthController {
return res.status(201).json(result);
} catch (error) {
console.error('Setup error:', error);
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
@@ -53,32 +56,60 @@ export class AuthController {
const { email, password } = req.body;
if (!email || !password) {
return res.status(400).json({ message: 'Email and password are required' });
return res.status(400).json({ message: req.t('auth.login.emailAndPasswordRequired') });
}
try {
const result = await this.#authService.login(email, password);
if (!result) {
return res.status(401).json({ message: 'Invalid credentials' });
return res.status(401).json({ message: req.t('auth.login.invalidCredentials') });
}
return res.status(200).json(result);
} catch (error) {
console.error('Login error:', error);
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public status = async (req: Request, res: Response): Promise<Response> => {
try {
const userCountResult = await db
.select({ count: sql<number>`count(*)` })
.from(schema.users);
const userCount = Number(userCountResult[0].count);
const needsSetup = userCount === 0;
const users = await db.select().from(schema.users);
/**
 * Handle the case where the only user holds the "Super Admin" role but lacks the actual
 * Super Admin permission because the role was created by an earlier version; upgrade that
 * "Super Admin" role to the definition used by the current version.
 */
if (users.length === 1) {
const iamService = new IamService();
const userRoles = await iamService.getRolesForUser(users[0].id);
if (userRoles.some((r) => r.name === 'Super Admin')) {
const authorizationService = new AuthorizationService();
const hasAdminPermission = await authorizationService.can(
users[0].id,
'manage',
'all'
);
if (!hasAdminPermission) {
const superAdminPolicies: CaslPolicy[] = [
{
action: 'manage',
subject: 'all',
},
];
await db
.update(schema.roles)
.set({
policies: superAdminPolicies,
slug: 'predefined_super_admin',
})
.where(eq(schema.roles.name, 'Super Admin'));
}
}
}
// In case the user is running an older version configured with admin user environment variables, create the admin user from those variables.
if (needsSetup && process.env.ADMIN_EMAIL && process.env.ADMIN_PASSWORD) {
const needsSetupUser = users.length === 0;
if (needsSetupUser && process.env.ADMIN_EMAIL && process.env.ADMIN_PASSWORD) {
await this.#userService.createAdminUser(
{
email: process.env.ADMIN_EMAIL,
@@ -90,10 +121,10 @@ export class AuthController {
);
return res.status(200).json({ needsSetup: false });
}
return res.status(200).json({ needsSetup });
return res.status(200).json({ needsSetup: needsSetupUser });
} catch (error) {
console.error('Status check error:', error);
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
}

View File

@@ -1,7 +1,9 @@
import { Request, Response } from 'express';
import { IamService } from '../../services/IamService';
import { PolicyValidator } from '../../iam-policy/policy-validator';
import type { PolicyStatement } from '@open-archiver/types';
import type { CaslPolicy } from '@open-archiver/types';
import { logger } from '../../config/logger';
import { config } from '../../config';
export class IamController {
#iamService: IamService;
@@ -12,10 +14,15 @@ export class IamController {
public getRoles = async (req: Request, res: Response): Promise<void> => {
try {
const roles = await this.#iamService.getRoles();
let roles = await this.#iamService.getRoles();
if (!roles.some((r) => r.slug?.includes('predefined_'))) {
// Create predefined roles
logger.info({}, 'Creating predefined roles');
await this.createDefaultRoles();
}
res.status(200).json(roles);
} catch (error) {
res.status(500).json({ error: 'Failed to get roles.' });
res.status(500).json({ message: req.t('iam.failedToGetRoles') });
}
};
@@ -27,45 +34,128 @@ export class IamController {
if (role) {
res.status(200).json(role);
} else {
res.status(404).json({ error: 'Role not found.' });
res.status(404).json({ message: req.t('iam.roleNotFound') });
}
} catch (error) {
res.status(500).json({ error: 'Failed to get role.' });
res.status(500).json({ message: req.t('iam.failedToGetRole') });
}
};
public createRole = async (req: Request, res: Response): Promise<void> => {
const { name, policy } = req.body;
public createRole = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { name, policies } = req.body;
if (!name || !policy) {
res.status(400).json({ error: 'Missing required fields: name and policy.' });
if (!name || !policies) {
res.status(400).json({ message: req.t('iam.missingRoleFields') });
return;
}
for (const statement of policy) {
const { valid, reason } = PolicyValidator.isValid(statement as PolicyStatement);
if (!valid) {
res.status(400).json({ error: `Invalid policy statement: ${reason}` });
return;
}
}
try {
const role = await this.#iamService.createRole(name, policy);
for (const statement of policies) {
const { valid, reason } = PolicyValidator.isValid(statement as CaslPolicy);
if (!valid) {
res.status(400).json({ message: `${req.t('iam.invalidPolicy')} ${reason}` });
return;
}
}
const role = await this.#iamService.createRole(name, policies);
res.status(201).json(role);
} catch (error) {
res.status(500).json({ error: 'Failed to create role.' });
logger.error({ err: error }, 'Failed to create role');
res.status(500).json({ message: req.t('iam.failedToCreateRole') });
}
};
public deleteRole = async (req: Request, res: Response): Promise<void> => {
public deleteRole = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { id } = req.params;
try {
await this.#iamService.deleteRole(id);
res.status(204).send();
} catch (error) {
res.status(500).json({ error: 'Failed to delete role.' });
res.status(500).json({ message: req.t('iam.failedToDeleteRole') });
}
};
public updateRole = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { id } = req.params;
const { name, policies } = req.body;
if (!name && !policies) {
res.status(400).json({ message: req.t('iam.missingUpdateFields') });
return;
}
if (policies) {
for (const statement of policies) {
const { valid, reason } = PolicyValidator.isValid(statement as CaslPolicy);
if (!valid) {
res.status(400).json({ message: `${req.t('iam.invalidPolicy')} ${reason}` });
return;
}
}
}
try {
const role = await this.#iamService.updateRole(id, { name, policies });
res.status(200).json(role);
} catch (error) {
res.status(500).json({ message: req.t('iam.failedToUpdateRole') });
}
};
private createDefaultRoles = async () => {
try {
// End user who can manage their own data and create new ingestions.
await this.#iamService.createRole(
'End user',
[
{
action: 'read',
subject: 'dashboard',
},
{
action: 'create',
subject: 'ingestion',
},
{
action: 'manage',
subject: 'ingestion',
conditions: {
userId: '${user.id}',
},
},
{
action: 'manage',
subject: 'archive',
conditions: {
'ingestionSource.userId': '${user.id}',
},
},
],
'predefined_end_user'
);
// Read-only role
await this.#iamService.createRole(
'Read only',
[
{
action: ['read', 'search'],
subject: ['ingestion', 'archive', 'dashboard', 'users', 'roles'],
},
],
'predefined_read_only_user'
);
} catch (error) {
logger.error({ err: error }, 'Failed to create default roles');
}
};
}
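The `'${user.id}'` strings in the predefined policies above are template placeholders, resolved against the requesting user when abilities are evaluated. A minimal sketch of how such resolution could work; the helper name and shape are assumptions, not part of this diff:

```typescript
// Hypothetical helper: substitute "${user.id}" placeholders in a policy's
// condition values with the id of the user being authorized.
function resolveConditions(
	conditions: Record<string, unknown>,
	user: { id: string }
): Record<string, unknown> {
	const resolved: Record<string, unknown> = {};
	for (const [key, value] of Object.entries(conditions)) {
		resolved[key] = value === '${user.id}' ? user.id : value;
	}
	return resolved;
}

// Example: the "End user" ingestion policy scoped to its owner.
const scoped = resolveConditions({ userId: '${user.id}' }, { id: 'user-123' });
```

This keeps the stored role rows static while still letting conditions like `'ingestionSource.userId': '${user.id}'` bind to whichever user is making the request.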

View File

@@ -23,34 +23,38 @@ export class IngestionController {
public create = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const dto: CreateIngestionSourceDto = req.body;
const newSource = await IngestionService.create(dto);
const userId = req.user?.sub;
if (!userId) {
return res.status(401).json({ message: req.t('errors.unauthorized') });
}
const newSource = await IngestionService.create(dto, userId);
const safeSource = this.toSafeIngestionSource(newSource);
return res.status(201).json(safeSource);
} catch (error: any) {
logger.error({ err: error }, 'Create ingestion source error');
// Return a 400 Bad Request for connection errors
return res
.status(400)
.json({
message:
error.message ||
'Failed to create ingestion source due to a connection error.',
});
return res.status(400).json({
message: error.message || req.t('ingestion.failedToCreate'),
});
}
};
public findAll = async (req: Request, res: Response): Promise<Response> => {
try {
const sources = await IngestionService.findAll();
const userId = req.user?.sub;
if (!userId) {
return res.status(401).json({ message: req.t('errors.unauthorized') });
}
const sources = await IngestionService.findAll(userId);
const safeSources = sources.map(this.toSafeIngestionSource);
return res.status(200).json(safeSources);
} catch (error) {
console.error('Find all ingestion sources error:', error);
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
@@ -63,15 +67,15 @@ export class IngestionController {
} catch (error) {
console.error(`Find ingestion source by id ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public update = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
@@ -82,15 +86,15 @@ export class IngestionController {
} catch (error) {
console.error(`Update ingestion source ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public delete = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
@@ -99,32 +103,32 @@ export class IngestionController {
} catch (error) {
console.error(`Delete ingestion source ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public triggerInitialImport = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
await IngestionService.triggerInitialImport(id);
return res.status(202).json({ message: 'Initial import triggered successfully.' });
return res.status(202).json({ message: req.t('ingestion.initialImportTriggered') });
} catch (error) {
console.error(`Trigger initial import for ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public pause = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
@@ -134,26 +138,26 @@ export class IngestionController {
} catch (error) {
console.error(`Pause ingestion source ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
public triggerForceSync = async (req: Request, res: Response): Promise<Response> => {
if (config.app.isDemo) {
return res.status(403).json({ message: 'This operation is not allowed in demo mode.' });
return res.status(403).json({ message: req.t('errors.demoMode') });
}
try {
const { id } = req.params;
await IngestionService.triggerForceSync(id);
return res.status(202).json({ message: 'Force sync triggered successfully.' });
return res.status(202).json({ message: req.t('ingestion.forceSyncTriggered') });
} catch (error) {
console.error(`Trigger force sync for ${req.params.id} error:`, error);
if (error instanceof Error && error.message === 'Ingestion source not found') {
return res.status(404).json({ message: error.message });
return res.status(404).json({ message: req.t('ingestion.notFound') });
}
return res.status(500).json({ message: 'An internal server error occurred' });
return res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
}

View File

@@ -12,22 +12,31 @@ export class SearchController {
public search = async (req: Request, res: Response): Promise<void> => {
try {
const { keywords, page, limit, matchingStrategy } = req.query;
const userId = req.user?.sub;
if (!keywords) {
res.status(400).json({ message: 'Keywords are required' });
if (!userId) {
res.status(401).json({ message: req.t('errors.unauthorized') });
return;
}
const results = await this.searchService.searchEmails({
query: keywords as string,
page: page ? parseInt(page as string) : 1,
limit: limit ? parseInt(limit as string) : 10,
matchingStrategy: matchingStrategy as MatchingStrategies,
});
if (!keywords) {
res.status(400).json({ message: req.t('search.keywordsRequired') });
return;
}
const results = await this.searchService.searchEmails(
{
query: keywords as string,
page: page ? parseInt(page as string) : 1,
limit: limit ? parseInt(limit as string) : 10,
matchingStrategy: matchingStrategy as MatchingStrategies,
},
userId
);
res.status(200).json(results);
} catch (error) {
const message = error instanceof Error ? error.message : 'An unknown error occurred';
const message = error instanceof Error ? error.message : req.t('errors.unknown');
res.status(500).json({ message });
}
};

View File

@@ -0,0 +1,29 @@
import type { Request, Response } from 'express';
import { SettingsService } from '../../services/SettingsService';
import { config } from '../../config';
const settingsService = new SettingsService();
export const getSystemSettings = async (req: Request, res: Response) => {
try {
const settings = await settingsService.getSystemSettings();
res.status(200).json(settings);
} catch (error) {
// A more specific error could be logged here
res.status(500).json({ message: req.t('settings.failedToRetrieve') });
}
};
export const updateSystemSettings = async (req: Request, res: Response) => {
try {
// Basic validation can be performed here if necessary
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const updatedSettings = await settingsService.updateSystemSettings(req.body);
res.status(200).json(updatedSettings);
} catch (error) {
// A more specific error could be logged here
res.status(500).json({ message: req.t('settings.failedToUpdate') });
}
};

View File

@@ -10,7 +10,7 @@ export class StorageController {
const unsafePath = req.query.path as string;
if (!unsafePath) {
res.status(400).send('File path is required');
res.status(400).send(req.t('storage.filePathRequired'));
return;
}
@@ -24,7 +24,7 @@ export class StorageController {
const fullPath = path.join(basePath, normalizedPath);
if (!fullPath.startsWith(basePath)) {
res.status(400).send('Invalid file path');
res.status(400).send(req.t('storage.invalidFilePath'));
return;
}
@@ -34,7 +34,7 @@ export class StorageController {
try {
const fileExists = await this.storageService.exists(safePath);
if (!fileExists) {
res.status(404).send('File not found');
res.status(404).send(req.t('storage.fileNotFound'));
return;
}
@@ -44,7 +44,7 @@ export class StorageController {
fileStream.pipe(res);
} catch (error) {
console.error('Error downloading file:', error);
res.status(500).send('Error downloading file');
res.status(500).send(req.t('storage.downloadError'));
}
};
}
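The download handler above normalizes the requested path, joins it onto the storage base path, and rejects anything that escapes that base. The check can be restated as a small pure function; this is a sketch of the same idea, not the controller's exact code:

```typescript
import * as path from 'path';

// Accept a requested path only if, once normalized and joined onto the
// storage base path, the result still lives under that base.
function isSafeStoragePath(basePath: string, requestedPath: string): boolean {
	const fullPath = path.posix.join(basePath, path.posix.normalize(requestedPath));
	return fullPath.startsWith(basePath);
}
```

`path.join` collapses `..` segments, so a traversal attempt like `../../etc/passwd` resolves outside the base directory and fails the `startsWith` check.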

View File

@@ -0,0 +1,66 @@
import { Request, Response } from 'express';
import { UserService } from '../../services/UserService';
import * as schema from '../../database/schema';
import { sql } from 'drizzle-orm';
import { db } from '../../database';
import { config } from '../../config';
const userService = new UserService();
export const getUsers = async (req: Request, res: Response) => {
const users = await userService.findAll();
res.json(users);
};
export const getUser = async (req: Request, res: Response) => {
const user = await userService.findById(req.params.id);
if (!user) {
return res.status(404).json({ message: req.t('user.notFound') });
}
res.json(user);
};
export const createUser = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { email, first_name, last_name, password, roleId } = req.body;
const newUser = await userService.createUser(
{ email, first_name, last_name, password },
roleId
);
res.status(201).json(newUser);
};
export const updateUser = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const { email, first_name, last_name, roleId } = req.body;
const updatedUser = await userService.updateUser(
req.params.id,
{ email, first_name, last_name },
roleId
);
if (!updatedUser) {
return res.status(404).json({ message: req.t('user.notFound') });
}
res.json(updatedUser);
};
export const deleteUser = async (req: Request, res: Response) => {
if (config.app.isDemo) {
return res.status(403).json({ message: req.t('errors.demoMode') });
}
const userCountResult = await db.select({ count: sql<number>`count(*)` }).from(schema.users);
const isOnlyUser = Number(userCountResult[0].count) === 1;
if (isOnlyUser) {
return res.status(400).json({
message: req.t('user.cannotDeleteOnlyUser'),
});
}
await userService.deleteUser(req.params.id);
res.status(204).send();
};

View File

@@ -1,10 +1,16 @@
import rateLimit from 'express-rate-limit';
import { config } from '../../config';
// Rate limiter to prevent brute-force attacks on the login endpoint
export const loginRateLimiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 10, // Limit each IP to 10 login requests per windowMs
message: 'Too many login attempts from this IP, please try again after 15 minutes',
standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
legacyHeaders: false, // Disable the `X-RateLimit-*` headers
const windowInMinutes = Math.ceil(config.api.rateLimit.windowMs / 60000);
export const rateLimiter = rateLimit({
windowMs: config.api.rateLimit.windowMs,
max: config.api.rateLimit.max,
message: {
status: 429,
message: `Too many requests from this IP, please try again after ${windowInMinutes} minutes`,
},
statusCode: 429,
standardHeaders: true,
legacyHeaders: false,
});
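The minute count in the 429 response body is derived from `windowMs` with `Math.ceil`, so partial minutes round up. For example, with illustrative values (the real limiter reads `config.api.rateLimit`, populated from the two new `.env` variables):

```typescript
// Illustrative values only; the actual limiter reads config.api.rateLimit.
const windowMs = 15 * 60 * 1000; // 900000 ms
const windowInMinutes = Math.ceil(windowMs / 60000); // 15

// A 90-second window is still reported as "2 minutes" because of the ceil.
const oddWindowInMinutes = Math.ceil(90_000 / 60000);
```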

View File

@@ -2,6 +2,9 @@ import type { Request, Response, NextFunction } from 'express';
import type { AuthService } from '../../services/AuthService';
import type { AuthTokenPayload } from '@open-archiver/types';
import 'dotenv/config';
import { ApiKeyService } from '../../services/ApiKeyService';
import { UserService } from '../../services/UserService';
// By using module augmentation, we can add our custom 'user' property
// to the Express Request interface in a type-safe way.
declare global {
@@ -15,16 +18,30 @@ declare global {
export const requireAuth = (authService: AuthService) => {
return async (req: Request, res: Response, next: NextFunction) => {
const authHeader = req.headers.authorization;
const apiKeyHeader = req.headers['x-api-key'];
if (apiKeyHeader) {
const userId = await ApiKeyService.validateKey(apiKeyHeader as string);
if (!userId) {
return res.status(401).json({ message: 'Unauthorized: Invalid API key' });
}
const user = await new UserService().findById(userId);
if (!user) {
return res.status(401).json({ message: 'Unauthorized: Invalid user' });
}
req.user = {
sub: user.id,
email: user.email,
roles: user.role ? [user.role.name] : [],
};
return next();
}
if (!authHeader || !authHeader.startsWith('Bearer ')) {
return res.status(401).json({ message: 'Unauthorized: No token provided' });
}
const token = authHeader.split(' ')[1];
try {
// Allow a SUPER_API_KEY to authenticate any request. Check that process.env.SUPER_API_KEY is set so this branch is skipped when no SUPER_API_KEY is configured.

if (process.env.SUPER_API_KEY && token === process.env.SUPER_API_KEY) {
next();
return;
}
const payload = await authService.verifyToken(token);
if (!payload) {
return res.status(401).json({ message: 'Unauthorized: Invalid token' });
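The middleware above gives an `X-API-KEY` header precedence over an `Authorization: Bearer` token: the API-key branch returns before the token is ever inspected. That ordering can be summarized as a small pure function; the names here are illustrative, not part of the middleware:

```typescript
type Credential =
	| { kind: 'api-key'; key: string }
	| { kind: 'bearer'; token: string }
	| null;

// Mirrors the branch order in requireAuth: API key first, then Bearer token.
function pickCredential(headers: Record<string, string | undefined>): Credential {
	const apiKey = headers['x-api-key'];
	if (apiKey) return { kind: 'api-key', key: apiKey };
	const auth = headers['authorization'];
	if (auth && auth.startsWith('Bearer ')) {
		return { kind: 'bearer', token: auth.slice('Bearer '.length) };
	}
	return null;
}
```

So a client that sends both headers is authenticated by its API key, and the JWT path (including the SUPER_API_KEY shortcut) only runs when no `X-API-KEY` header is present.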

View File

@@ -0,0 +1,38 @@
import { AuthorizationService } from '../../services/AuthorizationService';
import type { Request, Response, NextFunction } from 'express';
import { AppActions, AppSubjects } from '@open-archiver/types';
export const requirePermission = (
action: AppActions,
subjectName: AppSubjects,
rejectMessage?: string
) => {
return async (req: Request, res: Response, next: NextFunction) => {
const userId = req.user?.sub;
if (!userId) {
return res.status(401).json({ message: 'Unauthorized' });
}
let resourceObject = undefined;
// Logic to fetch resourceObject if needed for condition-based checks...
const authorizationService = new AuthorizationService();
const hasPermission = await authorizationService.can(
userId,
action,
subjectName,
resourceObject
);
if (!hasPermission) {
const message = rejectMessage
? req.t(rejectMessage)
: req.t('errors.noPermissionToAction');
return res.status(403).json({
message,
});
}
next();
};
};

View File

@@ -0,0 +1,15 @@
import { Router } from 'express';
import { ApiKeyController } from '../controllers/api-key.controller';
import { requireAuth } from '../middleware/requireAuth';
import { AuthService } from '../../services/AuthService';
export const apiKeyRoutes = (authService: AuthService) => {
const router = Router();
const controller = new ApiKeyController();
router.post('/', requireAuth(authService), controller.generateApiKey);
router.get('/', requireAuth(authService), controller.getApiKeys);
router.delete('/:id', requireAuth(authService), controller.deleteApiKey);
return router;
};

View File

@@ -1,6 +1,7 @@
import { Router } from 'express';
import { ArchivedEmailController } from '../controllers/archived-email.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createArchivedEmailRouter = (
@@ -12,11 +13,23 @@ export const createArchivedEmailRouter = (
// Secure all routes in this module
router.use(requireAuth(authService));
router.get('/ingestion-source/:ingestionSourceId', archivedEmailController.getArchivedEmails);
router.get(
'/ingestion-source/:ingestionSourceId',
requirePermission('read', 'archive'),
archivedEmailController.getArchivedEmails
);
router.get('/:id', archivedEmailController.getArchivedEmailById);
router.get(
'/:id',
requirePermission('read', 'archive'),
archivedEmailController.getArchivedEmailById
);
router.delete('/:id', archivedEmailController.deleteArchivedEmail);
router.delete(
'/:id',
requirePermission('delete', 'archive'),
archivedEmailController.deleteArchivedEmail
);
return router;
};

View File

@@ -1,5 +1,4 @@
import { Router } from 'express';
import { loginRateLimiter } from '../middleware/rateLimiter';
import type { AuthController } from '../controllers/auth.controller';
export const createAuthRouter = (authController: AuthController): Router => {
@@ -10,14 +9,14 @@ export const createAuthRouter = (authController: AuthController): Router => {
* @description Creates the initial administrator user.
* @access Public
*/
router.post('/setup', loginRateLimiter, authController.setup);
router.post('/setup', authController.setup);
/**
* @route POST /api/v1/auth/login
* @description Authenticates a user and returns a JWT.
* @access Public
*/
router.post('/login', loginRateLimiter, authController.login);
router.post('/login', authController.login);
/**
* @route GET /api/v1/auth/status

View File

@@ -1,6 +1,7 @@
import { Router } from 'express';
import { dashboardController } from '../controllers/dashboard.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createDashboardRouter = (authService: AuthService): Router => {
@@ -8,11 +9,31 @@ export const createDashboardRouter = (authService: AuthService): Router => {
router.use(requireAuth(authService));
router.get('/stats', dashboardController.getStats);
router.get('/ingestion-history', dashboardController.getIngestionHistory);
router.get('/ingestion-sources', dashboardController.getIngestionSources);
router.get('/recent-syncs', dashboardController.getRecentSyncs);
router.get('/indexed-insights', dashboardController.getIndexedInsights);
router.get(
'/stats',
requirePermission('read', 'dashboard', 'dashboard.permissionRequired'),
dashboardController.getStats
);
router.get(
'/ingestion-history',
requirePermission('read', 'dashboard', 'dashboard.permissionRequired'),
dashboardController.getIngestionHistory
);
router.get(
'/ingestion-sources',
requirePermission('read', 'dashboard', 'dashboard.permissionRequired'),
dashboardController.getIngestionSources
);
router.get(
'/recent-syncs',
requirePermission('read', 'dashboard', 'dashboard.permissionRequired'),
dashboardController.getRecentSyncs
);
router.get(
'/indexed-insights',
requirePermission('read', 'dashboard', 'dashboard.permissionRequired'),
dashboardController.getIndexedInsights
);
return router;
};

View File

@@ -1,36 +1,42 @@
import { Router } from 'express';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import type { IamController } from '../controllers/iam.controller';
import type { AuthService } from '../../services/AuthService';
export const createIamRouter = (iamController: IamController): Router => {
export const createIamRouter = (iamController: IamController, authService: AuthService): Router => {
const router = Router();
router.use(requireAuth(authService));
/**
* @route GET /api/v1/iam/roles
* @description Gets all roles.
* @access Private
*/
router.get('/roles', requireAuth, iamController.getRoles);
router.get('/roles', requirePermission('read', 'roles'), iamController.getRoles);
router.get('/roles/:id', requirePermission('read', 'roles'), iamController.getRoleById);
/**
* @route GET /api/v1/iam/roles/:id
* @description Gets a role by ID.
* @access Private
* Only super admin has the ability to modify existing roles or create new roles.
*/
router.get('/roles/:id', requireAuth, iamController.getRoleById);
router.post(
'/roles',
requirePermission('manage', 'all', 'iam.requiresSuperAdminRole'),
iamController.createRole
);
/**
* @route POST /api/v1/iam/roles
* @description Creates a new role.
* @access Private
*/
router.post('/roles', requireAuth, iamController.createRole);
router.delete(
'/roles/:id',
requirePermission('manage', 'all', 'iam.requiresSuperAdminRole'),
iamController.deleteRole
);
/**
* @route DELETE /api/v1/iam/roles/:id
* @description Deletes a role.
* @access Private
*/
router.delete('/roles/:id', requireAuth, iamController.deleteRole);
router.put(
'/roles/:id',
requirePermission('manage', 'all', 'iam.requiresSuperAdminRole'),
iamController.updateRole
);
return router;
};

View File

@@ -1,6 +1,7 @@
import { Router } from 'express';
import { IngestionController } from '../controllers/ingestion.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createIngestionRouter = (
@@ -12,21 +13,29 @@ export const createIngestionRouter = (
// Secure all routes in this module
router.use(requireAuth(authService));
router.post('/', ingestionController.create);
router.post('/', requirePermission('create', 'ingestion'), ingestionController.create);
router.get('/', ingestionController.findAll);
router.get('/', requirePermission('read', 'ingestion'), ingestionController.findAll);
router.get('/:id', ingestionController.findById);
router.get('/:id', requirePermission('read', 'ingestion'), ingestionController.findById);
router.put('/:id', ingestionController.update);
router.put('/:id', requirePermission('update', 'ingestion'), ingestionController.update);
router.delete('/:id', ingestionController.delete);
router.delete('/:id', requirePermission('delete', 'ingestion'), ingestionController.delete);
router.post('/:id/import', ingestionController.triggerInitialImport);
router.post(
'/:id/import',
requirePermission('create', 'ingestion'),
ingestionController.triggerInitialImport
);
router.post('/:id/pause', ingestionController.pause);
router.post('/:id/pause', requirePermission('update', 'ingestion'), ingestionController.pause);
router.post('/:id/sync', ingestionController.triggerForceSync);
router.post(
'/:id/sync',
requirePermission('sync', 'ingestion'),
ingestionController.triggerForceSync
);
return router;
};

View File

@@ -1,6 +1,7 @@
import { Router } from 'express';
import { SearchController } from '../controllers/search.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createSearchRouter = (
@@ -11,7 +12,7 @@ export const createSearchRouter = (
router.use(requireAuth(authService));
router.get('/', searchController.search);
router.get('/', requirePermission('search', 'archive'), searchController.search);
return router;
};

View File

@@ -0,0 +1,25 @@
import { Router } from 'express';
import * as settingsController from '../controllers/settings.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createSettingsRouter = (authService: AuthService): Router => {
const router = Router();
// Public route for non-sensitive settings. Reads are not permission-scoped because every end user needs the settings data in the frontend; sensitive settings would require a dedicated permission subject, so this route must expose only non-sensitive data.
/**
* @returns SystemSettings
*/
router.get('/system', settingsController.getSystemSettings);
// Protected route to update settings
router.put(
'/system',
requireAuth(authService),
requirePermission('manage', 'settings', 'settings.noPermissionToUpdate'),
settingsController.updateSystemSettings
);
return router;
};

View File

@@ -1,6 +1,7 @@
import { Router } from 'express';
import { StorageController } from '../controllers/storage.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createStorageRouter = (
@@ -12,7 +13,7 @@ export const createStorageRouter = (
// Secure all routes in this module
router.use(requireAuth(authService));
router.get('/download', storageController.downloadFile);
router.get('/download', requirePermission('read', 'archive'), storageController.downloadFile);
return router;
};

View File

@@ -1,6 +0,0 @@
import { Router } from 'express';
import { ingestionQueue } from '../../jobs/queues';
const router: Router = Router();
export default router;

View File

@@ -2,13 +2,14 @@ import { Router } from 'express';
import { uploadFile } from '../controllers/upload.controller';
import { requireAuth } from '../middleware/requireAuth';
import { AuthService } from '../../services/AuthService';
import { requirePermission } from '../middleware/requirePermission';
export const createUploadRouter = (authService: AuthService): Router => {
const router = Router();
router.use(requireAuth(authService));
router.post('/', uploadFile);
router.post('/', requirePermission('create', 'ingestion'), uploadFile);
return router;
};

View File

@@ -0,0 +1,38 @@
import { Router } from 'express';
import * as userController from '../controllers/user.controller';
import { requireAuth } from '../middleware/requireAuth';
import { requirePermission } from '../middleware/requirePermission';
import { AuthService } from '../../services/AuthService';
export const createUserRouter = (authService: AuthService): Router => {
const router = Router();
router.use(requireAuth(authService));
router.get('/', requirePermission('read', 'users'), userController.getUsers);
router.get('/:id', requirePermission('read', 'users'), userController.getUser);
/**
* Only super admin has the ability to modify existing users or create new users.
*/
router.post(
'/',
requirePermission('manage', 'all', 'user.requiresSuperAdminRole'),
userController.createUser
);
router.put(
'/:id',
requirePermission('manage', 'all', 'user.requiresSuperAdminRole'),
userController.updateUser
);
router.delete(
'/:id',
requirePermission('manage', 'all', 'user.requiresSuperAdminRole'),
userController.deleteUser
);
return router;
};

View File

@@ -0,0 +1,12 @@
import 'dotenv/config';
export const apiConfig = {
rateLimit: {
windowMs: process.env.RATE_LIMIT_WINDOW_MS
? parseInt(process.env.RATE_LIMIT_WINDOW_MS, 10)
: 1 * 60 * 1000, // 1 minute
max: process.env.RATE_LIMIT_MAX_REQUESTS
? parseInt(process.env.RATE_LIMIT_MAX_REQUESTS, 10)
: 100, // limit each client (IP or API key) to 100 requests per windowMs
},
};
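
The env-with-fallback pattern in `apiConfig` can be sketched standalone. Note one caveat the config above does not guard against: `parseInt` on a malformed value (e.g. `RATE_LIMIT_WINDOW_MS=abc`) yields `NaN`. The helper name below is hypothetical, not part of the codebase.

```typescript
// Hypothetical helper illustrating the fallback pattern used in apiConfig.
const intFromEnv = (value: string | undefined, fallback: number): number =>
	value ? parseInt(value, 10) : fallback;

console.log(intFromEnv(undefined, 60 * 1000)); // → 60000 (default window)
console.log(intFromEnv('250', 100)); // → 250 (env override)
console.log(intFromEnv('abc', 100)); // → NaN (malformed input is not guarded)
```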

View File

@@ -2,10 +2,12 @@ import { storage } from './storage';
import { app } from './app';
import { searchConfig } from './search';
import { connection as redisConfig } from './redis';
import { apiConfig } from './api';
export const config = {
storage,
app,
search: searchConfig,
redis: redisConfig,
api: apiConfig,
};

View File

@@ -0,0 +1,2 @@
ALTER TABLE "ingestion_sources" ADD COLUMN "user_id" uuid;--> statement-breakpoint
ALTER TABLE "ingestion_sources" ADD CONSTRAINT "ingestion_sources_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;

View File

@@ -0,0 +1,2 @@
ALTER TABLE "roles" ADD COLUMN "slug" text;--> statement-breakpoint
ALTER TABLE "roles" ADD CONSTRAINT "roles_slug_unique" UNIQUE("slug");

View File

@@ -0,0 +1,4 @@
CREATE TABLE "system_settings" (
"id" serial PRIMARY KEY NOT NULL,
"config" jsonb NOT NULL
);

View File

@@ -0,0 +1,11 @@
CREATE TABLE "api_keys" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"name" text NOT NULL,
"user_id" uuid NOT NULL,
"key" text NOT NULL,
"expires_at" timestamp with time zone NOT NULL,
"created_at" timestamp DEFAULT now() NOT NULL,
"updated_at" timestamp DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "api_keys" ADD CONSTRAINT "api_keys_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;

View File

@@ -0,0 +1 @@
ALTER TABLE "api_keys" ADD COLUMN "key_hash" text NOT NULL;

File diff suppressed because it is too large (5 files)

View File

@@ -106,6 +106,41 @@
"when": 1754831765718,
"tag": "0014_foamy_vapor",
"breakpoints": true
},
{
"idx": 15,
"version": "7",
"when": 1755443936046,
"tag": "0015_wakeful_norman_osborn",
"breakpoints": true
},
{
"idx": 16,
"version": "7",
"when": 1755780572342,
"tag": "0016_lonely_mariko_yashida",
"breakpoints": true
},
{
"idx": 17,
"version": "7",
"when": 1755961566627,
"tag": "0017_tranquil_shooting_star",
"breakpoints": true
},
{
"idx": 18,
"version": "7",
"when": 1756911118035,
"tag": "0018_flawless_owl",
"breakpoints": true
},
{
"idx": 19,
"version": "7",
"when": 1756937533843,
"tag": "0019_confused_scream",
"breakpoints": true
}
]
}

View File

@@ -5,3 +5,5 @@ export * from './schema/compliance';
export * from './schema/custodians';
export * from './schema/ingestion-sources';
export * from './schema/users';
export * from './schema/system-settings';
export * from './schema/api-keys';

View File

@@ -0,0 +1,15 @@
import { pgTable, text, timestamp, uuid } from 'drizzle-orm/pg-core';
import { users } from './users';
export const apiKeys = pgTable('api_keys', {
id: uuid('id').primaryKey().defaultRandom(),
name: text('name').notNull(),
userId: uuid('user_id')
.notNull()
.references(() => users.id, { onDelete: 'cascade' }),
key: text('key').notNull(), // Encrypted API key
keyHash: text('key_hash').notNull(),
expiresAt: timestamp('expires_at', { withTimezone: true, mode: 'date' }).notNull(),
createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at').defaultNow().notNull(),
});
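
Since the schema stores both an encrypted `key` and a separate `key_hash`, a plausible lookup-hash derivation would be the following. This is a hypothetical sketch only; the actual key format, encryption, and hashing scheme are not shown in this diff.

```typescript
import { createHash, randomBytes } from 'node:crypto';

// Hypothetical sketch: generate a random API key and derive a deterministic
// SHA-256 lookup hash of the kind that could be stored in the `key_hash` column.
const apiKey = `oa_${randomBytes(24).toString('hex')}`;
const keyHash = createHash('sha256').update(apiKey).digest('hex');

console.log(apiKey.length); // → 51 ("oa_" prefix + 48 hex chars)
console.log(keyHash.length); // → 64 (hex-encoded SHA-256)
```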

View File

@@ -1,4 +1,6 @@
import { jsonb, pgEnum, pgTable, text, timestamp, uuid } from 'drizzle-orm/pg-core';
import { users } from './users';
import { relations } from 'drizzle-orm';
export const ingestionProviderEnum = pgEnum('ingestion_provider', [
'google_workspace',
@@ -21,6 +23,7 @@ export const ingestionStatusEnum = pgEnum('ingestion_status', [
export const ingestionSources = pgTable('ingestion_sources', {
id: uuid('id').primaryKey().defaultRandom(),
userId: uuid('user_id').references(() => users.id, { onDelete: 'cascade' }),
name: text('name').notNull(),
provider: ingestionProviderEnum('provider').notNull(),
credentials: text('credentials'),
@@ -32,3 +35,10 @@ export const ingestionSources = pgTable('ingestion_sources', {
createdAt: timestamp('created_at', { withTimezone: true }).notNull().defaultNow(),
updatedAt: timestamp('updated_at', { withTimezone: true }).notNull().defaultNow(),
});
export const ingestionSourcesRelations = relations(ingestionSources, ({ one }) => ({
user: one(users, {
fields: [ingestionSources.userId],
references: [users.id],
}),
}));

View File

@@ -0,0 +1,7 @@
import { pgTable, serial, jsonb } from 'drizzle-orm/pg-core';
import type { SystemSettings } from '@open-archiver/types';
export const systemSettings = pgTable('system_settings', {
id: serial('id').primaryKey(),
config: jsonb('config').$type<SystemSettings>().notNull(),
});

View File

@@ -1,6 +1,6 @@
import { relations, sql } from 'drizzle-orm';
import { pgTable, text, timestamp, uuid, primaryKey, jsonb } from 'drizzle-orm/pg-core';
import type { PolicyStatement } from '@open-archiver/types';
import type { CaslPolicy } from '@open-archiver/types';
/**
* The `users` table stores the core user information for authentication and identification.
@@ -40,9 +40,10 @@ export const roles = pgTable('roles', {
id: uuid('id').primaryKey().defaultRandom(),
name: text('name').notNull().unique(),
policies: jsonb('policies')
.$type<PolicyStatement[]>()
.$type<CaslPolicy[]>()
.notNull()
.default(sql`'[]'::jsonb`),
slug: text('slug').unique(),
createdAt: timestamp('created_at').defaultNow().notNull(),
updatedAt: timestamp('updated_at').defaultNow().notNull(),
});

View File

@@ -0,0 +1,95 @@
import { SQL, and, or, not, eq, gt, gte, lt, lte, inArray, isNull, sql } from 'drizzle-orm';
const camelToSnakeCase = (str: string) =>
str.replace(/[A-Z]/g, (letter) => `_${letter.toLowerCase()}`);
const relationToTableMap: Record<string, string> = {
ingestionSource: 'ingestion_sources',
// TBD: Add other relations here as needed
};
function getDrizzleColumn(key: string): SQL {
const keyParts = key.split('.');
if (keyParts.length > 1) {
const relationName = keyParts[0];
const columnName = camelToSnakeCase(keyParts[1]);
const tableName = relationToTableMap[relationName];
if (tableName) {
return sql.raw(`"${tableName}"."${columnName}"`);
}
}
return sql`${sql.identifier(camelToSnakeCase(key))}`;
}
export function mongoToDrizzle(query: Record<string, any>): SQL | undefined {
const conditions: (SQL | undefined)[] = [];
for (const key in query) {
const value = query[key];
if (key === '$or') {
conditions.push(or(...(value as any[]).map(mongoToDrizzle).filter(Boolean)));
continue;
}
if (key === '$and') {
conditions.push(and(...(value as any[]).map(mongoToDrizzle).filter(Boolean)));
continue;
}
if (key === '$not') {
const subQuery = mongoToDrizzle(value);
if (subQuery) {
conditions.push(not(subQuery));
}
continue;
}
const column = getDrizzleColumn(key);
if (typeof value === 'object' && value !== null) {
const operator = Object.keys(value)[0];
const operand = value[operator];
switch (operator) {
case '$eq':
conditions.push(eq(column, operand));
break;
case '$ne':
conditions.push(not(eq(column, operand)));
break;
case '$gt':
conditions.push(gt(column, operand));
break;
case '$gte':
conditions.push(gte(column, operand));
break;
case '$lt':
conditions.push(lt(column, operand));
break;
case '$lte':
conditions.push(lte(column, operand));
break;
case '$in':
conditions.push(inArray(column, operand));
break;
case '$nin':
conditions.push(not(inArray(column, operand)));
break;
case '$exists':
conditions.push(operand ? not(isNull(column)) : isNull(column));
break;
default:
// Unsupported operator
}
} else {
conditions.push(eq(column, value));
}
}
if (conditions.length === 0) {
return undefined;
}
return and(...conditions.filter((c): c is SQL => c !== undefined));
}
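
As a standalone illustration of the column-name resolution performed by `getDrizzleColumn` above (dependency-free sketch; the real function returns Drizzle `SQL` fragments rather than plain strings):

```typescript
// Sketch of the dotted-key → qualified-column resolution in getDrizzleColumn.
const camelToSnake = (s: string): string => s.replace(/[A-Z]/g, (l) => `_${l.toLowerCase()}`);
const relationToTable: Record<string, string> = { ingestionSource: 'ingestion_sources' };

function resolveColumn(key: string): string {
	const [head, tail] = key.split('.');
	// Dotted keys reference a related table; plain keys map to the current table.
	if (tail && relationToTable[head]) return `"${relationToTable[head]}"."${camelToSnake(tail)}"`;
	return camelToSnake(key);
}

console.log(resolveColumn('ingestionSource.userId')); // → "ingestion_sources"."user_id"
console.log(resolveColumn('createdAt')); // → created_at
```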

View File

@@ -0,0 +1,100 @@
import { db } from '../database';
import { ingestionSources } from '../database/schema';
import { eq } from 'drizzle-orm';
const snakeToCamelCase = (str: string): string => {
return str.replace(/_([a-z])/g, (match, letter) => letter.toUpperCase());
};
function getMeliColumn(key: string): string {
const keyParts = key.split('.');
if (keyParts.length > 1) {
const relationName = keyParts[0];
const columnName = keyParts[1];
return `${relationName}.${columnName}`;
}
return snakeToCamelCase(key);
}
function quoteIfString(value: any): any {
if (typeof value === 'string') {
return `"${value}"`;
}
return value;
}
export async function mongoToMeli(query: Record<string, any>): Promise<string> {
const conditions: string[] = [];
for (const key of Object.keys(query)) {
const value = query[key];
if (key === '$or') {
const orConditions = await Promise.all(value.map(mongoToMeli));
conditions.push(`(${orConditions.join(' OR ')})`);
continue;
}
if (key === '$and') {
const andConditions = await Promise.all(value.map(mongoToMeli));
conditions.push(`(${andConditions.join(' AND ')})`);
continue;
}
if (key === '$not') {
conditions.push(`NOT (${await mongoToMeli(value)})`);
continue;
}
const column = getMeliColumn(key);
if (typeof value === 'object' && value !== null) {
const operator = Object.keys(value)[0];
const operand = value[operator];
switch (operator) {
case '$eq':
conditions.push(`${column} = ${quoteIfString(operand)}`);
break;
case '$ne':
conditions.push(`${column} != ${quoteIfString(operand)}`);
break;
case '$gt':
conditions.push(`${column} > ${operand}`);
break;
case '$gte':
conditions.push(`${column} >= ${operand}`);
break;
case '$lt':
conditions.push(`${column} < ${operand}`);
break;
case '$lte':
conditions.push(`${column} <= ${operand}`);
break;
case '$in':
conditions.push(`${column} IN [${operand.map(quoteIfString).join(', ')}]`);
break;
case '$nin':
conditions.push(`${column} NOT IN [${operand.map(quoteIfString).join(', ')}]`);
break;
case '$exists':
conditions.push(`${column} ${operand ? 'EXISTS' : 'NOT EXISTS'}`);
break;
default:
// Unsupported operator
}
} else {
if (column === 'ingestionSource.userId') {
// Special case: Meilisearch cannot join on ingestionSource.userId, so resolve the
// matching ingestion source IDs from Postgres first. (Awaiting a more elegant solution.)
const ingestionsIds = await db
.select({ id: ingestionSources.id })
.from(ingestionSources)
.where(eq(ingestionSources.userId, value));
conditions.push(
`ingestionSourceId IN [${ingestionsIds.map((i) => quoteIfString(i.id)).join(', ')}]`
);
} else {
conditions.push(`${column} = ${quoteIfString(value)}`);
}
}
}
return conditions.join(' AND ');
}
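
For illustration, a dependency-free sketch of the operator translation above. It covers only a few operators and uses a hypothetical helper name; the real `mongoToMeli` also resolves `ingestionSource.userId` against Postgres.

```typescript
type MongoQuery = Record<string, any>;

const quote = (v: any): string => (typeof v === 'string' ? `"${v}"` : String(v));

// Simplified Mongo-style query → Meilisearch filter-string translation.
function toMeiliFilter(query: MongoQuery): string {
	const parts: string[] = [];
	for (const [key, value] of Object.entries(query)) {
		if (key === '$or') {
			parts.push(`(${value.map(toMeiliFilter).join(' OR ')})`);
		} else if (typeof value === 'object' && value !== null) {
			const [op] = Object.keys(value);
			const operand = value[op];
			if (op === '$in') parts.push(`${key} IN [${operand.map(quote).join(', ')}]`);
			else if (op === '$gte') parts.push(`${key} >= ${operand}`);
			else parts.push(`${key} = ${quote(operand)}`);
		} else {
			parts.push(`${key} = ${quote(value)}`);
		}
	}
	return parts.join(' AND ');
}

console.log(toMeiliFilter({ provider: 'google_workspace', id: { $in: ['a', 'b'] } }));
// → provider = "google_workspace" AND id IN ["a", "b"]
```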

View File

@@ -0,0 +1,118 @@
// packages/backend/src/iam-policy/ability.ts
import { createMongoAbility, MongoAbility, RawRuleOf } from '@casl/ability';
import { CaslPolicy, AppActions, AppSubjects } from '@open-archiver/types';
import { ingestionSources, archivedEmails, users, roles } from '../database/schema';
import { InferSelectModel } from 'drizzle-orm';
// Define the application's ability type
export type AppAbility = MongoAbility<[AppActions, AppSubjects]>;
// Helper type for raw rules
export type AppRawRule = RawRuleOf<AppAbility>;
// Represents the possible object types that can be passed as subjects for permission checks.
export type SubjectObject =
| InferSelectModel<typeof ingestionSources>
| InferSelectModel<typeof archivedEmails>
| InferSelectModel<typeof users>
| InferSelectModel<typeof roles>
| AppSubjects;
// Function to create an ability instance from policies stored in the database
export function createAbilityFor(policies: CaslPolicy[]) {
// We do not expand policies: if a role needs access to ingestion X and its archived
// emails, the policy must also explicitly grant access to archives belonging to ingestion X.
// const allPolicies = expandPolicies(policies);
return createMongoAbility<AppAbility>(policies as AppRawRule[]);
}
/**
* @deprecated This function should not be used since we don't need the inheritable behavior anymore.
* Translates conditions on an 'ingestion' subject to equivalent conditions on an 'archive' subject.
* This is used to implement inherent permissions, where permission on an ingestion source
* implies permission on the emails it has ingested.
* @param conditions The original conditions object for the 'ingestion' subject.
* @returns A new conditions object for the 'archive' subject.
*/
function translateIngestionConditionsToArchive(
conditions: Record<string, any>
): Record<string, any> {
if (!conditions || typeof conditions !== 'object') {
return conditions;
}
const translated: Record<string, any> = {};
for (const key in conditions) {
const value = conditions[key];
// Handle logical operators recursively
if (['$or', '$and', '$nor'].includes(key) && Array.isArray(value)) {
translated[key] = value.map((v) => translateIngestionConditionsToArchive(v));
continue;
}
if (key === '$not' && typeof value === 'object' && value !== null) {
translated[key] = translateIngestionConditionsToArchive(value);
continue;
}
// Translate field names
let newKey = key;
if (key === 'id') {
newKey = 'ingestionSourceId';
} else if (['userId', 'name', 'provider', 'status'].includes(key)) {
newKey = `ingestionSource.${key}`;
}
translated[newKey] = value;
}
return translated;
}
/**
* @deprecated This function should not be used since we don't need the inheritable behavior anymore.
* Expands the given set of policies to include inherent permissions.
* For example, a permission on an 'ingestion' source is expanded to grant
* the same permission on 'archive' records related to that source.
* @param policies The original array of CASL policies.
* @returns A new array of policies including the expanded, inherent permissions.
*/
function expandPolicies(policies: CaslPolicy[]): CaslPolicy[] {
const expandedPolicies: CaslPolicy[] = JSON.parse(JSON.stringify(policies));
// Create a set of all actions that are already explicitly defined for the 'archive' subject.
const existingArchiveActions = new Set<string>();
policies.forEach((p) => {
if (p.subject === 'archive') {
const actions = Array.isArray(p.action) ? p.action : [p.action];
actions.forEach((a) => existingArchiveActions.add(a));
}
// Only expand `can` rules for the 'ingestion' subject.
if (p.subject === 'ingestion' && !p.inverted) {
const policyActions = Array.isArray(p.action) ? p.action : [p.action];
// Check if any action in the current ingestion policy already has an explicit archive policy.
const hasExplicitArchiveRule = policyActions.some(
(a) => existingArchiveActions.has(a) || existingArchiveActions.has('manage')
);
// If a more specific rule for 'archive' already exists, do not expand this ingestion rule,
// as it would create a conflicting, overly permissive rule.
if (hasExplicitArchiveRule) {
return;
}
const archivePolicy: CaslPolicy = {
...JSON.parse(JSON.stringify(p)),
subject: 'archive',
};
if (p.conditions) {
archivePolicy.conditions = translateIngestionConditionsToArchive(p.conditions);
}
expandedPolicies.push(archivePolicy);
}
});
return expandedPolicies;
}

View File

@@ -1,116 +0,0 @@
/**
* @file This file serves as the single source of truth for all Identity and Access Management (IAM)
* definitions within Open Archiver. Centralizing these definitions is an industry-standard practice
* that offers several key benefits:
*
* 1. **Prevents "Magic Strings"**: Avoids the use of hardcoded strings for actions and resources
* throughout the codebase, reducing the risk of typos and inconsistencies.
* 2. **Single Source of Truth**: Provides a clear, comprehensive, and maintainable list of all
* possible permissions in the system.
* 3. **Enables Validation**: Allows for the creation of a robust validation function that can
* programmatically check if a policy statement is valid before it is saved.
* 4. **Simplifies Auditing**: Makes it easy to audit and understand the scope of permissions
* that can be granted.
*
* The structure is inspired by AWS IAM, using a `service:operation` format for actions and a
* hierarchical, slash-separated path for resources.
*/
// ===================================================================================
// SERVICE: archive
// ===================================================================================
const ARCHIVE_ACTIONS = {
READ: 'archive:read',
SEARCH: 'archive:search',
EXPORT: 'archive:export',
} as const;
const ARCHIVE_RESOURCES = {
ALL: 'archive/all',
INGESTION_SOURCE: 'archive/ingestion-source/*',
MAILBOX: 'archive/mailbox/*',
CUSTODIAN: 'archive/custodian/*',
} as const;
// ===================================================================================
// SERVICE: ingestion
// ===================================================================================
const INGESTION_ACTIONS = {
CREATE_SOURCE: 'ingestion:createSource',
READ_SOURCE: 'ingestion:readSource',
UPDATE_SOURCE: 'ingestion:updateSource',
DELETE_SOURCE: 'ingestion:deleteSource',
MANAGE_SYNC: 'ingestion:manageSync', // Covers triggering, pausing, and forcing syncs
} as const;
const INGESTION_RESOURCES = {
ALL: 'ingestion-source/*',
SOURCE: 'ingestion-source/{sourceId}',
} as const;
// ===================================================================================
// SERVICE: system
// ===================================================================================
const SYSTEM_ACTIONS = {
READ_SETTINGS: 'system:readSettings',
UPDATE_SETTINGS: 'system:updateSettings',
READ_USERS: 'system:readUsers',
CREATE_USER: 'system:createUser',
UPDATE_USER: 'system:updateUser',
DELETE_USER: 'system:deleteUser',
ASSIGN_ROLE: 'system:assignRole',
} as const;
const SYSTEM_RESOURCES = {
SETTINGS: 'system/settings',
USERS: 'system/users',
USER: 'system/user/{userId}',
} as const;
// ===================================================================================
// SERVICE: dashboard
// ===================================================================================
const DASHBOARD_ACTIONS = {
READ: 'dashboard:read',
} as const;
const DASHBOARD_RESOURCES = {
ALL: 'dashboard/*',
} as const;
// ===================================================================================
// EXPORTED DEFINITIONS
// ===================================================================================
/**
* A comprehensive set of all valid IAM actions in the system.
* This is used by the policy validator to ensure that any action in a policy is recognized.
*/
export const ValidActions: Set<string> = new Set([
...Object.values(ARCHIVE_ACTIONS),
...Object.values(INGESTION_ACTIONS),
...Object.values(SYSTEM_ACTIONS),
...Object.values(DASHBOARD_ACTIONS),
]);
/**
* An object containing regular expressions for validating resource formats.
* The validator uses these patterns to ensure that resource strings in a policy
* conform to the expected structure.
*
* Logic:
* - The key represents the service (e.g., 'archive').
* - The value is a RegExp that matches all valid resource formats for that service.
* - This allows for flexible validation. For example, `archive/*` is a valid pattern,
* as is `archive/email/123-abc`.
*/
export const ValidResourcePatterns = {
archive: /^archive\/(all|ingestion-source\/[^\/]+|mailbox\/[^\/]+|custodian\/[^\/]+)$/,
ingestion: /^ingestion-source\/(\*|[^\/]+)$/,
system: /^system\/(settings|users|user\/[^\/]+)$/,
dashboard: /^dashboard\/\*$/,
};

View File

@@ -1,106 +1,99 @@
import type { PolicyStatement } from '@open-archiver/types';
import { ValidActions, ValidResourcePatterns } from './iam-definitions';
import type { CaslPolicy, AppActions, AppSubjects } from '@open-archiver/types';
// Create sets of valid actions and subjects for efficient validation
const validActions: Set<AppActions> = new Set([
'manage',
'create',
'read',
'update',
'delete',
'search',
'export',
'sync',
]);
const validSubjects: Set<AppSubjects> = new Set([
'archive',
'ingestion',
'settings',
'users',
'roles',
'dashboard',
'all',
]);
/**
* @class PolicyValidator
*
* This class provides a static method to validate an IAM policy statement.
* This class provides a static method to validate a CASL policy.
* It is designed to be used before a policy is saved to the database, ensuring that
* only valid and well-formed policies are stored.
*
* The verification logic is based on the centralized definitions in `iam-definitions.ts`.
* The verification logic is based on the centralized definitions in `packages/types/src/iam.types.ts`.
*/
export class PolicyValidator {
/**
* Validates a single policy statement to ensure its actions and resources are valid.
* Validates a single policy statement to ensure its actions and subjects are valid.
*
* @param {PolicyStatement} statement - The policy statement to validate.
* @param {CaslPolicy} policy - The policy to validate.
* @returns {{valid: boolean; reason?: string}} - An object containing a boolean `valid` property
* and an optional `reason` string if validation fails.
*/
public static isValid(statement: PolicyStatement): { valid: boolean; reason: string } {
if (!statement || !statement.Action || !statement.Resource || !statement.Effect) {
return { valid: false, reason: 'Policy statement is missing required fields.' };
public static isValid(policy: CaslPolicy): { valid: boolean; reason: string } {
if (!policy || !policy.action || !policy.subject) {
return {
valid: false,
reason: 'Policy is missing required fields "action" or "subject".',
};
}
// 1. Validate Actions
for (const action of statement.Action) {
const actions = Array.isArray(policy.action) ? policy.action : [policy.action];
for (const action of actions) {
const { valid, reason } = this.isActionValid(action);
if (!valid) {
return { valid: false, reason };
}
}
// 2. Validate Resources
for (const resource of statement.Resource) {
const { valid, reason } = this.isResourceValid(resource);
// 2. Validate Subjects
const subjects = Array.isArray(policy.subject) ? policy.subject : [policy.subject];
for (const subject of subjects) {
const { valid, reason } = this.isSubjectValid(subject);
if (!valid) {
return { valid: false, reason };
}
}
// 3. (Optional) Validate Conditions, Fields, etc. in the future if needed.
return { valid: true, reason: 'valid' };
}
/**
* Checks if a single action string is valid.
*
* Logic:
* - If the action contains a wildcard (e.g., 'archive:*'), it checks if the service part
* (e.g., 'archive') is a recognized service.
* - If there is no wildcard, it checks if the full action string (e.g., 'archive:read')
* exists in the `ValidActions` set.
* Checks if a single action string is a valid AppAction.
*
* @param {string} action - The action string to validate.
* @returns {{valid: boolean; reason?: string}} - An object indicating validity and a reason for failure.
*/
private static isActionValid(action: string): { valid: boolean; reason: string } {
if (action === '*') {
return { valid: true, reason: 'valid' };
}
if (action.endsWith(':*')) {
const service = action.split(':')[0];
if (service in ValidResourcePatterns) {
return { valid: true, reason: 'valid' };
}
return {
valid: false,
reason: `Invalid service '${service}' in action wildcard '${action}'.`,
};
}
if (ValidActions.has(action)) {
private static isActionValid(action: AppActions): { valid: boolean; reason: string } {
if (validActions.has(action)) {
return { valid: true, reason: 'valid' };
}
return { valid: false, reason: `Action '${action}' is not a valid action.` };
}
/**
* Checks if a single resource string has a valid format.
* Checks if a single subject string is a valid AppSubject.
*
* Logic:
* - It extracts the service name from the resource string (e.g., 'archive' from 'archive/all').
* - It looks up the corresponding regular expression for that service in `ValidResourcePatterns`.
* - It tests the resource string against the pattern. If the service does not exist or the
* pattern does not match, the resource is considered invalid.
*
* @param {string} resource - The resource string to validate.
* @param {string} subject - The subject string to validate.
* @returns {{valid: boolean; reason?: string}} - An object indicating validity and a reason for failure.
*/
private static isResourceValid(resource: string): { valid: boolean; reason: string } {
const service = resource.split('/')[0];
if (service === '*') {
private static isSubjectValid(subject: AppSubjects): { valid: boolean; reason: string } {
if (validSubjects.has(subject)) {
return { valid: true, reason: 'valid' };
}
if (service in ValidResourcePatterns) {
const pattern = ValidResourcePatterns[service as keyof typeof ValidResourcePatterns];
if (pattern.test(resource)) {
return { valid: true, reason: 'valid' };
}
return {
valid: false,
reason: `Resource '${resource}' does not match the expected format for the '${service}' service.`,
};
}
return { valid: false, reason: `Invalid service '${service}' in resource '${resource}'.` };
return { valid: false, reason: `Subject '${subject}' is not a valid subject.` };
}
}
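
The core of the new validator is a set-membership check over the CASL action and subject vocabularies, which can be sketched in isolation (hypothetical function name; the real `PolicyValidator` also returns failure reasons):

```typescript
// The valid vocabularies, mirroring the sets defined in policy-validator.ts.
const validActions = new Set(['manage', 'create', 'read', 'update', 'delete', 'search', 'export', 'sync']);
const validSubjects = new Set(['archive', 'ingestion', 'settings', 'users', 'roles', 'dashboard', 'all']);

function isValidPolicy(policy: { action: string | string[]; subject: string | string[] }): boolean {
	// Normalize single values to arrays, then require every entry to be recognized.
	const actions = Array.isArray(policy.action) ? policy.action : [policy.action];
	const subjects = Array.isArray(policy.subject) ? policy.subject : [policy.subject];
	return actions.every((a) => validActions.has(a)) && subjects.every((s) => validSubjects.has(s));
}

console.log(isValidPolicy({ action: ['read', 'search'], subject: 'ingestion' })); // → true
console.log(isValidPolicy({ action: 'fly', subject: 'ingestion' })); // → false
```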

View File

@@ -0,0 +1,6 @@
[
{
"action": "manage",
"subject": "all"
}
]

View File

@@ -0,0 +1,17 @@
[
{
"action": ["read", "search"],
"subject": "ingestion",
"conditions": {
"id": "f16b7ed2-4e54-4283-9556-c633726f9405"
}
},
{
"inverted": true,
"action": ["read", "search"],
"subject": "archive",
"conditions": {
"userEmail": "dev@openarchiver.com"
}
}
]

View File

@@ -0,0 +1,14 @@
[
{
"action": ["read", "search"],
"subject": "ingestion",
"conditions": {
"id": {
"$in": [
"aeafbe44-d41c-4015-ac27-504f6e0c511a",
"f16b7ed2-4e54-4283-9556-c633726f9405"
]
}
}
}
]

View File

@@ -0,0 +1,17 @@
[
{
"action": "create",
"subject": "ingestion"
},
{
"action": "read",
"subject": "dashboard"
},
{
"action": "manage",
"subject": "ingestion",
"conditions": {
"userId": "${user.id}"
}
}
]

View File

@@ -0,0 +1,6 @@
[
{
"action": "manage",
"subject": "ingestion"
}
]

View File

@@ -0,0 +1,6 @@
[
{
"action": ["read", "search"],
"subject": ["ingestion", "archive", "dashboard", "users", "roles"]
}
]

View File

@@ -0,0 +1,9 @@
[
{
"action": "manage",
"subject": "ingestion",
"conditions": {
"id": "f3d7c025-060f-4f1f-a0e6-cdd32e6e07af"
}
}
]

View File

@@ -0,0 +1,10 @@
[
{
"action": "manage",
"subject": "users"
},
{
"action": "read",
"subject": "roles"
}
]

View File

@@ -15,12 +15,21 @@ import { createStorageRouter } from './api/routes/storage.routes';
import { createSearchRouter } from './api/routes/search.routes';
import { createDashboardRouter } from './api/routes/dashboard.routes';
import { createUploadRouter } from './api/routes/upload.routes';
import testRouter from './api/routes/test.routes';
import { createUserRouter } from './api/routes/user.routes';
import { createSettingsRouter } from './api/routes/settings.routes';
import { apiKeyRoutes } from './api/routes/api-key.routes';
import { AuthService } from './services/AuthService';
import { UserService } from './services/UserService';
import { IamService } from './services/IamService';
import { StorageService } from './services/StorageService';
import { SearchService } from './services/SearchService';
import { SettingsService } from './services/SettingsService';
import i18next from 'i18next';
import FsBackend from 'i18next-fs-backend';
import i18nextMiddleware from 'i18next-http-middleware';
import path from 'path';
import { logger } from './config/logger';
import { rateLimiter } from './api/middleware/rateLimiter';
// Load environment variables
dotenv.config();
@@ -34,6 +43,22 @@ if (!PORT_BACKEND || !JWT_SECRET || !JWT_EXPIRES_IN) {
);
}
// --- i18next Initialization ---
const initializeI18next = async () => {
const systemSettings = await settingsService.getSystemSettings();
const defaultLanguage = systemSettings?.language || 'en';
logger.info({ language: defaultLanguage }, 'Default language');
await i18next.use(FsBackend).init({
lng: defaultLanguage,
fallbackLng: defaultLanguage,
ns: ['translation'],
defaultNS: 'translation',
backend: {
loadPath: path.resolve(__dirname, './locales/{{lng}}/{{ns}}.json'),
},
});
};
// --- Dependency Injection Setup ---
const userService = new UserService();
@@ -47,6 +72,7 @@ const searchService = new SearchService();
const searchController = new SearchController();
const iamService = new IamService();
const iamController = new IamController(iamService);
const settingsService = new SettingsService();
// --- Express App Initialization ---
const app = express();
@@ -58,15 +84,31 @@ const archivedEmailRouter = createArchivedEmailRouter(archivedEmailController, a
const storageRouter = createStorageRouter(storageController, authService);
const searchRouter = createSearchRouter(searchController, authService);
const dashboardRouter = createDashboardRouter(authService);
const iamRouter = createIamRouter(iamController);
const iamRouter = createIamRouter(iamController, authService);
const uploadRouter = createUploadRouter(authService);
const userRouter = createUserRouter(authService);
const settingsRouter = createSettingsRouter(authService);
const apiKeyRouter = apiKeyRoutes(authService);
// upload route is added before middleware because it doesn't use the json middleware.
app.use('/v1/upload', uploadRouter);
// Middleware for all other routes
app.use((req, res, next) => {
// exclude certain API endpoints from the rate limiter, for example status, system settings
const excludedPatterns = [/^\/v\d+\/auth\/status$/, /^\/v\d+\/settings\/system$/];
for (const pattern of excludedPatterns) {
if (pattern.test(req.path)) {
return next();
}
}
rateLimiter(req, res, next);
});
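The exclusion check above can be sketched in isolation. The `isExcluded` helper below is illustrative, not part of the codebase; the version-agnostic `/v\d+/` prefix matches `/v1`, `/v2`, and so on:

```typescript
// Hypothetical helper mirroring the middleware's exclusion check.
const excludedPatterns: RegExp[] = [/^\/v\d+\/auth\/status$/, /^\/v\d+\/settings\/system$/];

function isExcluded(path: string): boolean {
    // A request skips the rate limiter if any pattern matches its path exactly.
    return excludedPatterns.some((pattern) => pattern.test(path));
}

console.log(isExcluded('/v1/auth/status')); // true
console.log(isExcluded('/v1/search')); // false
```

Because the patterns are anchored with `^` and `$`, sub-paths such as `/v1/auth/status/extra` are still rate limited.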
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
// i18n middleware
app.use(i18nextMiddleware.handle(i18next));
app.use('/v1/auth', authRouter);
app.use('/v1/iam', iamRouter);
app.use('/v1/ingestion-sources', ingestionRouter);
@@ -74,7 +116,9 @@ app.use('/v1/archived-emails', archivedEmailRouter);
app.use('/v1/storage', storageRouter);
app.use('/v1/search', searchRouter);
app.use('/v1/dashboard', dashboardRouter);
app.use('/v1/test', testRouter);
app.use('/v1/users', userRouter);
app.use('/v1/settings', settingsRouter);
app.use('/v1/api-keys', apiKeyRouter);
// Example of a protected route
app.get('/v1/protected', requireAuth(authService), (req, res) => {
@@ -91,15 +135,19 @@ app.get('/', (req, res) => {
// --- Server Start ---
const startServer = async () => {
try {
// Initialize i18next
await initializeI18next();
logger.info({}, 'i18next initialized');
// Configure the Meilisearch index on startup
-console.log('Configuring email index...');
+logger.info({}, 'Configuring email index...');
await searchService.configureEmailIndex();
app.listen(PORT_BACKEND, () => {
-console.log(`Backend listening at http://localhost:${PORT_BACKEND}`);
+logger.info({}, `Backend listening at http://localhost:${PORT_BACKEND}`);
});
} catch (error) {
-console.error('Failed to start the server:', error);
+logger.error({ error }, 'Failed to start the server:', error);
process.exit(1);
}
};


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "E-Mail, Passwort und Name sind erforderlich",
"alreadyCompleted": "Die Einrichtung wurde bereits abgeschlossen."
},
"login": {
"emailAndPasswordRequired": "E-Mail und Passwort sind erforderlich",
"invalidCredentials": "Ungültige Anmeldeinformationen"
}
},
"errors": {
"internalServerError": "Ein interner Serverfehler ist aufgetreten",
"demoMode": "Dieser Vorgang ist im Demo-Modus nicht zulässig.",
"unauthorized": "Unbefugt",
"unknown": "Ein unbekannter Fehler ist aufgetreten",
"noPermissionToAction": "Sie haben keine Berechtigung, die aktuelle Aktion auszuführen."
},
"user": {
"notFound": "Benutzer nicht gefunden",
"cannotDeleteOnlyUser": "Sie versuchen, den einzigen Benutzer in der Datenbank zu löschen, dies ist nicht gestattet.",
"requiresSuperAdminRole": "Die Rolle des Super-Admins ist erforderlich, um Benutzer zu verwalten."
},
"iam": {
"failedToGetRoles": "Rollen konnten nicht abgerufen werden.",
"roleNotFound": "Rolle nicht gefunden.",
"failedToGetRole": "Rolle konnte nicht abgerufen werden.",
"missingRoleFields": "Fehlende erforderliche Felder: Name und Richtlinie.",
"invalidPolicy": "Ungültige Richtlinienanweisung:",
"failedToCreateRole": "Rolle konnte nicht erstellt werden.",
"failedToDeleteRole": "Rolle konnte nicht gelöscht werden.",
"missingUpdateFields": "Fehlende Felder zum Aktualisieren: Name oder Richtlinien.",
"failedToUpdateRole": "Rolle konnte nicht aktualisiert werden.",
"requiresSuperAdminRole": "Die Rolle des Super-Admins ist erforderlich, um Rollen zu verwalten."
},
"settings": {
"failedToRetrieve": "Einstellungen konnten nicht abgerufen werden",
"failedToUpdate": "Einstellungen konnten nicht aktualisiert werden",
"noPermissionToUpdate": "Sie haben keine Berechtigung, die Systemeinstellungen zu aktualisieren."
},
"dashboard": {
"permissionRequired": "Sie benötigen die Leseberechtigung für das Dashboard, um Dashboard-Daten anzuzeigen."
},
"ingestion": {
"failedToCreate": "Die Erfassungsquelle konnte aufgrund eines Verbindungsfehlers nicht erstellt werden.",
"notFound": "Erfassungsquelle nicht gefunden",
"initialImportTriggered": "Erstimport erfolgreich ausgelöst.",
"forceSyncTriggered": "Erzwungene Synchronisierung erfolgreich ausgelöst."
},
"archivedEmail": {
"notFound": "Archivierte E-Mail nicht gefunden"
},
"search": {
"keywordsRequired": "Schlüsselwörter sind erforderlich"
},
"storage": {
"filePathRequired": "Dateipfad ist erforderlich",
"invalidFilePath": "Ungültiger Dateipfad",
"fileNotFound": "Datei nicht gefunden",
"downloadError": "Fehler beim Herunterladen der Datei"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "Το email, ο κωδικός πρόσβασης και το όνομα είναι υποχρεωτικά",
"alreadyCompleted": "Η εγκατάσταση έχει ήδη ολοκληρωθεί."
},
"login": {
"emailAndPasswordRequired": "Το email και ο κωδικός πρόσβασης είναι υποχρεωτικά",
"invalidCredentials": "Μη έγκυρα διαπιστευτήρια"
}
},
"errors": {
"internalServerError": "Παρουσιάστηκε ένα εσωτερικό σφάλμα διακομιστή",
"demoMode": "Αυτή η λειτουργία δεν επιτρέπεται σε λειτουργία επίδειξης.",
"unauthorized": "Μη εξουσιοδοτημένο",
"unknown": "Παρουσιάστηκε ένα άγνωστο σφάλμα",
"noPermissionToAction": "Δεν έχετε την άδεια να εκτελέσετε την τρέχουσα ενέργεια."
},
"user": {
"notFound": "Ο χρήστης δεν βρέθηκε",
"cannotDeleteOnlyUser": "Προσπαθείτε να διαγράψετε τον μοναδικό χρήστη στη βάση δεδομένων, αυτό δεν επιτρέπεται.",
"requiresSuperAdminRole": "Απαιτείται ο ρόλος του Super Admin για τη διαχείριση των χρηστών."
},
"iam": {
"failedToGetRoles": "Η λήψη των ρόλων απέτυχε.",
"roleNotFound": "Ο ρόλος δεν βρέθηκε.",
"failedToGetRole": "Η λήψη του ρόλου απέτυχε.",
"missingRoleFields": "Λείπουν τα απαιτούμενα πεδία: όνομα και πολιτική.",
"invalidPolicy": "Μη έγκυρη δήλωση πολιτικής:",
"failedToCreateRole": "Η δημιουργία του ρόλου απέτυχε.",
"failedToDeleteRole": "Η διαγραφή του ρόλου απέτυχε.",
"missingUpdateFields": "Λείπουν πεδία για ενημέρωση: όνομα ή πολιτικές.",
"failedToUpdateRole": "Η ενημέρωση του ρόλου απέτυχε.",
"requiresSuperAdminRole": "Απαιτείται ο ρόλος του Super Admin για τη διαχείριση των ρόλων."
},
"settings": {
"failedToRetrieve": "Η ανάκτηση των ρυθμίσεων απέτυχε",
"failedToUpdate": "Η ενημέρωση των ρυθμίσεων απέτυχε",
"noPermissionToUpdate": "Δεν έχετε άδεια να ενημερώσετε τις ρυθμίσεις του συστήματος."
},
"dashboard": {
"permissionRequired": "Χρειάζεστε την άδεια ανάγνωσης του πίνακα ελέγχου για να δείτε τα δεδομένα του πίνακα ελέγχου."
},
"ingestion": {
"failedToCreate": "Η δημιουργία της πηγής πρόσληψης απέτυχε λόγω σφάλματος σύνδεσης.",
"notFound": "Η πηγή πρόσληψης δεν βρέθηκε",
"initialImportTriggered": "Η αρχική εισαγωγή ενεργοποιήθηκε με επιτυχία.",
"forceSyncTriggered": "Ο εξαναγκασμένος συγχρονισμός ενεργοποιήθηκε με επιτυχία."
},
"archivedEmail": {
"notFound": "Το αρχειοθετημένο email δεν βρέθηκε"
},
"search": {
"keywordsRequired": "Οι λέξεις-κλειδιά είναι υποχρεωτικές"
},
"storage": {
"filePathRequired": "Η διαδρομή του αρχείου είναι υποχρεωτική",
"invalidFilePath": "Μη έγκυρη διαδρομή αρχείου",
"fileNotFound": "Το αρχείο δεν βρέθηκε",
"downloadError": "Σφάλμα κατά τη λήψη του αρχείου"
}
}


@@ -0,0 +1,69 @@
{
"auth": {
"setup": {
"allFieldsRequired": "Email, password, and name are required",
"alreadyCompleted": "Setup has already been completed."
},
"login": {
"emailAndPasswordRequired": "Email and password are required",
"invalidCredentials": "Invalid credentials"
}
},
"errors": {
"internalServerError": "An internal server error occurred",
"demoMode": "This operation is not allowed in demo mode.",
"unauthorized": "Unauthorized",
"unknown": "An unknown error occurred",
"noPermissionToAction": "You don't have the permission to perform the current action."
},
"user": {
"notFound": "User not found",
"cannotDeleteOnlyUser": "You are trying to delete the only user in the database, this is not allowed.",
"requiresSuperAdminRole": "Super Admin role is required to manage users."
},
"iam": {
"failedToGetRoles": "Failed to get roles.",
"roleNotFound": "Role not found.",
"failedToGetRole": "Failed to get role.",
"missingRoleFields": "Missing required fields: name and policy.",
"invalidPolicy": "Invalid policy statement:",
"failedToCreateRole": "Failed to create role.",
"failedToDeleteRole": "Failed to delete role.",
"missingUpdateFields": "Missing fields to update: name or policies.",
"failedToUpdateRole": "Failed to update role.",
"requiresSuperAdminRole": "Super Admin role is required to manage roles."
},
"settings": {
"failedToRetrieve": "Failed to retrieve settings",
"failedToUpdate": "Failed to update settings",
"noPermissionToUpdate": "You do not have permission to update system settings."
},
"dashboard": {
"permissionRequired": "You need the dashboard read permission to view dashboard data."
},
"ingestion": {
"failedToCreate": "Failed to create ingestion source due to a connection error.",
"notFound": "Ingestion source not found",
"initialImportTriggered": "Initial import triggered successfully.",
"forceSyncTriggered": "Force sync triggered successfully."
},
"archivedEmail": {
"notFound": "Archived email not found"
},
"search": {
"keywordsRequired": "Keywords are required"
},
"storage": {
"filePathRequired": "File path is required",
"invalidFilePath": "Invalid file path",
"fileNotFound": "File not found",
"downloadError": "Error downloading file"
},
"apiKeys": {
"generateSuccess": "API key generated successfully.",
"deleteSuccess": "API key deleted successfully."
},
"api": {
"requestBodyInvalid": "Invalid request body."
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "Se requieren correo electrónico, contraseña y nombre",
"alreadyCompleted": "La configuración ya se ha completado."
},
"login": {
"emailAndPasswordRequired": "Se requieren correo electrónico y contraseña",
"invalidCredentials": "Credenciales no válidas"
}
},
"errors": {
"internalServerError": "Ocurrió un error interno del servidor",
"demoMode": "Esta operación no está permitida en modo de demostración.",
"unauthorized": "No autorizado",
"unknown": "Ocurrió un error desconocido",
"noPermissionToAction": "No tienes permiso para realizar la acción actual."
},
"user": {
"notFound": "Usuario no encontrado",
"cannotDeleteOnlyUser": "Estás intentando eliminar al único usuario de la base de datos, esto no está permitido.",
"requiresSuperAdminRole": "Se requiere el rol de Superadministrador para gestionar usuarios."
},
"iam": {
"failedToGetRoles": "Error al obtener los roles.",
"roleNotFound": "Rol no encontrado.",
"failedToGetRole": "Error al obtener el rol.",
"missingRoleFields": "Faltan campos obligatorios: nombre y política.",
"invalidPolicy": "Declaración de política no válida:",
"failedToCreateRole": "Error al crear el rol.",
"failedToDeleteRole": "Error al eliminar el rol.",
"missingUpdateFields": "Faltan campos para actualizar: nombre o políticas.",
"failedToUpdateRole": "Error al actualizar el rol.",
"requiresSuperAdminRole": "Se requiere el rol de Superadministrador para gestionar los roles."
},
"settings": {
"failedToRetrieve": "Error al recuperar la configuración",
"failedToUpdate": "Error al actualizar la configuración",
"noPermissionToUpdate": "No tienes permiso para actualizar la configuración del sistema."
},
"dashboard": {
"permissionRequired": "Necesitas el permiso de lectura del panel de control para ver los datos del panel."
},
"ingestion": {
"failedToCreate": "Error al crear la fuente de ingesta debido a un error de conexión.",
"notFound": "Fuente de ingesta no encontrada",
"initialImportTriggered": "Importación inicial activada correctamente.",
"forceSyncTriggered": "Sincronización forzada activada correctamente."
},
"archivedEmail": {
"notFound": "Correo electrónico archivado no encontrado"
},
"search": {
"keywordsRequired": "Se requieren palabras clave"
},
"storage": {
"filePathRequired": "Se requiere la ruta del archivo",
"invalidFilePath": "Ruta de archivo no válida",
"fileNotFound": "Archivo no encontrado",
"downloadError": "Error al descargar el archivo"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "E-post, parool ja nimi on kohustuslikud",
"alreadyCompleted": "Seadistamine on juba lõpule viidud."
},
"login": {
"emailAndPasswordRequired": "E-post ja parool on kohustuslikud",
"invalidCredentials": "Valed sisselogimisandmed"
}
},
"errors": {
"internalServerError": "Ilmnes sisemine serveriviga",
"demoMode": "See toiming pole demorežiimis lubatud.",
"unauthorized": "Volitamata",
"unknown": "Ilmnes tundmatu viga",
"noPermissionToAction": "Teil pole praeguse toimingu tegemiseks luba."
},
"user": {
"notFound": "Kasutajat ei leitud",
"cannotDeleteOnlyUser": "Püüate kustutada andmebaasi ainsat kasutajat, see pole lubatud.",
"requiresSuperAdminRole": "Kasutajate haldamiseks on vajalik superadministraatori roll."
},
"iam": {
"failedToGetRoles": "Rollide hankimine ebaõnnestus.",
"roleNotFound": "Rolli ei leitud.",
"failedToGetRole": "Rolli hankimine ebaõnnestus.",
"missingRoleFields": "Puuduvad kohustuslikud väljad: nimi ja poliitika.",
"invalidPolicy": "Kehtetu poliitika avaldus:",
"failedToCreateRole": "Rolli loomine ebaõnnestus.",
"failedToDeleteRole": "Rolli kustutamine ebaõnnestus.",
"missingUpdateFields": "Uuendamiseks puuduvad väljad: nimi või poliitikad.",
"failedToUpdateRole": "Rolli värskendamine ebaõnnestus.",
"requiresSuperAdminRole": "Rollide haldamiseks on vajalik superadministraatori roll."
},
"settings": {
"failedToRetrieve": "Seadete toomine ebaõnnestus",
"failedToUpdate": "Seadete värskendamine ebaõnnestus",
"noPermissionToUpdate": "Teil pole süsteemi seadete värskendamiseks luba."
},
"dashboard": {
"permissionRequired": "Armatuurlaua andmete vaatamiseks on teil vaja armatuurlaua lugemisluba."
},
"ingestion": {
"failedToCreate": "Söötmeallika loomine ebaõnnestus ühenduse vea tõttu.",
"notFound": "Söötmeallikat ei leitud",
"initialImportTriggered": "Esialgne import käivitati edukalt.",
"forceSyncTriggered": "Sundsünkroonimine käivitati edukalt."
},
"archivedEmail": {
"notFound": "Arhiveeritud e-kirja ei leitud"
},
"search": {
"keywordsRequired": "Märksõnad on kohustuslikud"
},
"storage": {
"filePathRequired": "Faili tee on kohustuslik",
"invalidFilePath": "Kehtetu faili tee",
"fileNotFound": "Faili ei leitud",
"downloadError": "Faili allalaadimisel ilmnes viga"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "L'e-mail, le mot de passe et le nom sont requis",
"alreadyCompleted": "La configuration est déjà terminée."
},
"login": {
"emailAndPasswordRequired": "L'e-mail et le mot de passe sont requis",
"invalidCredentials": "Identifiants invalides"
}
},
"errors": {
"internalServerError": "Une erreur interne du serveur s'est produite",
"demoMode": "Cette opération n'est pas autorisée en mode démo.",
"unauthorized": "Non autorisé",
"unknown": "Une erreur inconnue s'est produite",
"noPermissionToAction": "Vous n'avez pas la permission d'effectuer l'action en cours."
},
"user": {
"notFound": "Utilisateur non trouvé",
"cannotDeleteOnlyUser": "Vous essayez de supprimer le seul utilisateur de la base de données, ce n'est pas autorisé.",
"requiresSuperAdminRole": "Le rôle de Super Admin est requis pour gérer les utilisateurs."
},
"iam": {
"failedToGetRoles": "Échec de la récupération des rôles.",
"roleNotFound": "Rôle non trouvé.",
"failedToGetRole": "Échec de la récupération du rôle.",
"missingRoleFields": "Champs obligatoires manquants : nom et politique.",
"invalidPolicy": "Déclaration de politique invalide :",
"failedToCreateRole": "Échec de la création du rôle.",
"failedToDeleteRole": "Échec de la suppression du rôle.",
"missingUpdateFields": "Champs à mettre à jour manquants : nom ou politiques.",
"failedToUpdateRole": "Échec de la mise à jour du rôle.",
"requiresSuperAdminRole": "Le rôle de Super Admin est requis pour gérer les rôles."
},
"settings": {
"failedToRetrieve": "Échec de la récupération des paramètres",
"failedToUpdate": "Échec de la mise à jour des paramètres",
"noPermissionToUpdate": "Vous n'avez pas la permission de mettre à jour les paramètres système."
},
"dashboard": {
"permissionRequired": "Vous avez besoin de la permission de lecture du tableau de bord pour afficher les données du tableau de bord."
},
"ingestion": {
"failedToCreate": "Échec de la création de la source d'ingestion en raison d'une erreur de connexion.",
"notFound": "Source d'ingestion non trouvée",
"initialImportTriggered": "Importation initiale déclenchée avec succès.",
"forceSyncTriggered": "Synchronisation forcée déclenchée avec succès."
},
"archivedEmail": {
"notFound": "E-mail archivé non trouvé"
},
"search": {
"keywordsRequired": "Des mots-clés sont requis"
},
"storage": {
"filePathRequired": "Le chemin du fichier est requis",
"invalidFilePath": "Chemin de fichier invalide",
"fileNotFound": "Fichier non trouvé",
"downloadError": "Erreur lors du téléchargement du fichier"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "Email, password e nome sono obbligatori",
"alreadyCompleted": "La configurazione è già stata completata."
},
"login": {
"emailAndPasswordRequired": "Email e password sono obbligatorie",
"invalidCredentials": "Credenziali non valide"
}
},
"errors": {
"internalServerError": "Si è verificato un errore interno del server",
"demoMode": "Questa operazione non è consentita in modalità demo.",
"unauthorized": "Non autorizzato",
"unknown": "Si è verificato un errore sconosciuto",
"noPermissionToAction": "Non hai il permesso di eseguire l'azione corrente."
},
"user": {
"notFound": "Utente non trovato",
"cannotDeleteOnlyUser": "Stai tentando di eliminare l'unico utente nel database, ciò non è consentito.",
"requiresSuperAdminRole": "È richiesto il ruolo di Super Admin per gestire gli utenti."
},
"iam": {
"failedToGetRoles": "Impossibile ottenere i ruoli.",
"roleNotFound": "Ruolo non trovato.",
"failedToGetRole": "Impossibile ottenere il ruolo.",
"missingRoleFields": "Campi obbligatori mancanti: nome e policy.",
"invalidPolicy": "Dichiarazione di policy non valida:",
"failedToCreateRole": "Impossibile creare il ruolo.",
"failedToDeleteRole": "Impossibile eliminare il ruolo.",
"missingUpdateFields": "Campi da aggiornare mancanti: nome o policy.",
"failedToUpdateRole": "Impossibile aggiornare il ruolo.",
"requiresSuperAdminRole": "È richiesto il ruolo di Super Admin per gestire i ruoli."
},
"settings": {
"failedToRetrieve": "Impossibile recuperare le impostazioni",
"failedToUpdate": "Impossibile aggiornare le impostazioni",
"noPermissionToUpdate": "Non hai il permesso di aggiornare le impostazioni di sistema."
},
"dashboard": {
"permissionRequired": "È necessaria l'autorizzazione di lettura della dashboard per visualizzare i dati della dashboard."
},
"ingestion": {
"failedToCreate": "Impossibile creare l'origine di inserimento a causa di un errore di connessione.",
"notFound": "Origine di inserimento non trovata",
"initialImportTriggered": "Importazione iniziale attivata con successo.",
"forceSyncTriggered": "Sincronizzazione forzata attivata con successo."
},
"archivedEmail": {
"notFound": "Email archiviata non trovata"
},
"search": {
"keywordsRequired": "Le parole chiave sono obbligatorie"
},
"storage": {
"filePathRequired": "Il percorso del file è obbligatorio",
"invalidFilePath": "Percorso del file non valido",
"fileNotFound": "File non trovato",
"downloadError": "Errore durante il download del file"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "メールアドレス、パスワード、名前は必須です",
"alreadyCompleted": "セットアップはすでに完了しています。"
},
"login": {
"emailAndPasswordRequired": "メールアドレスとパスワードは必須です",
"invalidCredentials": "無効な認証情報"
}
},
"errors": {
"internalServerError": "内部サーバーエラーが発生しました",
"demoMode": "この操作はデモモードでは許可されていません。",
"unauthorized": "不正なアクセス",
"unknown": "不明なエラーが発生しました",
"noPermissionToAction": "現在の操作を実行する権限がありません。"
},
"user": {
"notFound": "ユーザーが見つかりません",
"cannotDeleteOnlyUser": "データベース内の唯一のユーザーを削除しようとしていますが、これは許可されていません。",
"requiresSuperAdminRole": "ユーザーを管理するには、スーパー管理者ロールが必要です。"
},
"iam": {
"failedToGetRoles": "役割の取得に失敗しました。",
"roleNotFound": "役割が見つかりません。",
"failedToGetRole": "役割の取得に失敗しました。",
"missingRoleFields": "必須フィールドがありません:名前とポリシー。",
"invalidPolicy": "無効なポリシーステートメント:",
"failedToCreateRole": "役割の作成に失敗しました。",
"failedToDeleteRole": "役割の削除に失敗しました。",
"missingUpdateFields": "更新するフィールドがありません:名前またはポリシー。",
"failedToUpdateRole": "役割の更新に失敗しました。",
"requiresSuperAdminRole": "役割を管理するには、スーパー管理者ロールが必要です。"
},
"settings": {
"failedToRetrieve": "設定の取得に失敗しました",
"failedToUpdate": "設定の更新に失敗しました",
"noPermissionToUpdate": "システム設定を更新する権限がありません。"
},
"dashboard": {
"permissionRequired": "ダッシュボードのデータを表示するには、ダッシュボードの読み取り権限が必要です。"
},
"ingestion": {
"failedToCreate": "接続エラーのため、取り込みソースの作成に失敗しました。",
"notFound": "取り込みソースが見つかりません",
"initialImportTriggered": "初期インポートが正常にトリガーされました。",
"forceSyncTriggered": "強制同期が正常にトリガーされました。"
},
"archivedEmail": {
"notFound": "アーカイブされたメールが見つかりません"
},
"search": {
"keywordsRequired": "キーワードは必須です"
},
"storage": {
"filePathRequired": "ファイルパスは必須です",
"invalidFilePath": "無効なファイルパス",
"fileNotFound": "ファイルが見つかりません",
"downloadError": "ファイルのダウンロード中にエラーが発生しました"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "E-mail, wachtwoord en naam zijn verplicht",
"alreadyCompleted": "De installatie is al voltooid."
},
"login": {
"emailAndPasswordRequired": "E-mail en wachtwoord zijn verplicht",
"invalidCredentials": "Ongeldige inloggegevens"
}
},
"errors": {
"internalServerError": "Er is een interne serverfout opgetreden",
"demoMode": "Deze bewerking is niet toegestaan in de demomodus.",
"unauthorized": "Ongeautoriseerd",
"unknown": "Er is een onbekende fout opgetreden",
"noPermissionToAction": "U heeft geen toestemming om de huidige actie uit te voeren."
},
"user": {
"notFound": "Gebruiker niet gevonden",
"cannotDeleteOnlyUser": "U probeert de enige gebruiker in de database te verwijderen, dit is niet toegestaan.",
"requiresSuperAdminRole": "De rol van Super Admin is vereist om gebruikers te beheren."
},
"iam": {
"failedToGetRoles": "Kan rollen niet ophalen.",
"roleNotFound": "Rol niet gevonden.",
"failedToGetRole": "Kan rol niet ophalen.",
"missingRoleFields": "Ontbrekende verplichte velden: naam en beleid.",
"invalidPolicy": "Ongeldige beleidsverklaring:",
"failedToCreateRole": "Kan rol niet aanmaken.",
"failedToDeleteRole": "Kan rol niet verwijderen.",
"missingUpdateFields": "Ontbrekende velden om bij te werken: naam of beleid.",
"failedToUpdateRole": "Kan rol niet bijwerken.",
"requiresSuperAdminRole": "De rol van Super Admin is vereist om rollen te beheren."
},
"settings": {
"failedToRetrieve": "Kan instellingen niet ophalen",
"failedToUpdate": "Kan instellingen niet bijwerken",
"noPermissionToUpdate": "U heeft geen toestemming om de systeeminstellingen bij te werken."
},
"dashboard": {
"permissionRequired": "U heeft de leesrechten voor het dashboard nodig om dashboardgegevens te bekijken."
},
"ingestion": {
"failedToCreate": "Kan de opnamebron niet aanmaken vanwege een verbindingsfout.",
"notFound": "Opnamebron niet gevonden",
"initialImportTriggered": "Initiële import succesvol geactiveerd.",
"forceSyncTriggered": "Geforceerde synchronisatie succesvol geactiveerd."
},
"archivedEmail": {
"notFound": "Gearchiveerde e-mail niet gevonden"
},
"search": {
"keywordsRequired": "Trefwoorden zijn verplicht"
},
"storage": {
"filePathRequired": "Bestandspad is verplicht",
"invalidFilePath": "Ongeldig bestandspad",
"fileNotFound": "Bestand niet gevonden",
"downloadError": "Fout bij het downloaden van het bestand"
}
}


@@ -0,0 +1,62 @@
{
"auth": {
"setup": {
"allFieldsRequired": "E-mail, senha e nome são obrigatórios",
"alreadyCompleted": "A configuração já foi concluída."
},
"login": {
"emailAndPasswordRequired": "E-mail e senha são obrigatórios",
"invalidCredentials": "Credenciais inválidas"
}
},
"errors": {
"internalServerError": "Ocorreu um erro interno do servidor",
"demoMode": "Esta operação não é permitida no modo de demonstração.",
"unauthorized": "Não autorizado",
"unknown": "Ocorreu um erro desconhecido",
"noPermissionToAction": "Você não tem permissão para executar a ação atual."
},
"user": {
"notFound": "Usuário não encontrado",
"cannotDeleteOnlyUser": "Você está tentando excluir o único usuário no banco de dados, isso não é permitido.",
"requiresSuperAdminRole": "A função de Super Admin é necessária para gerenciar usuários."
},
"iam": {
"failedToGetRoles": "Falha ao obter as funções.",
"roleNotFound": "Função não encontrada.",
"failedToGetRole": "Falha ao obter a função.",
"missingRoleFields": "Campos obrigatórios ausentes: nome e política.",
"invalidPolicy": "Declaração de política inválida:",
"failedToCreateRole": "Falha ao criar a função.",
"failedToDeleteRole": "Falha ao excluir a função.",
"missingUpdateFields": "Campos ausentes para atualização: nome ou políticas.",
"failedToUpdateRole": "Falha ao atualizar a função.",
"requiresSuperAdminRole": "A função de Super Admin é necessária para gerenciar as funções."
},
"settings": {
"failedToRetrieve": "Falha ao recuperar as configurações",
"failedToUpdate": "Falha ao atualizar as configurações",
"noPermissionToUpdate": "Você não tem permissão para atualizar as configurações do sistema."
},
"dashboard": {
"permissionRequired": "Você precisa da permissão de leitura do painel para visualizar os dados do painel."
},
"ingestion": {
"failedToCreate": "Falha ao criar a fonte de ingestão devido a um erro de conexão.",
"notFound": "Fonte de ingestão não encontrada",
"initialImportTriggered": "Importação inicial acionada com sucesso.",
"forceSyncTriggered": "Sincronização forçada acionada com sucesso."
},
"archivedEmail": {
"notFound": "E-mail arquivado não encontrado"
},
"search": {
"keywordsRequired": "Palavras-chave são obrigatórias"
},
"storage": {
"filePathRequired": "O caminho do arquivo é obrigatório",
"invalidFilePath": "Caminho de arquivo inválido",
"fileNotFound": "Arquivo não encontrado",
"downloadError": "Erro ao baixar o arquivo"
}
}


@@ -0,0 +1,72 @@
import { randomBytes, createHash } from 'crypto';
import { db } from '../database';
import { apiKeys } from '../database/schema/api-keys';
import { CryptoService } from './CryptoService';
import { and, eq } from 'drizzle-orm';
import { ApiKey } from '@open-archiver/types';
export class ApiKeyService {
public static async generate(
userId: string,
name: string,
expiresInDays: number
): Promise<string> {
const key = randomBytes(32).toString('hex');
const expiresAt = new Date();
expiresAt.setDate(expiresAt.getDate() + expiresInDays);
const keyHash = createHash('sha256').update(key).digest('hex');
await db.insert(apiKeys).values({
userId,
name,
key: CryptoService.encrypt(key),
keyHash,
expiresAt,
});
return key;
}
public static async getKeys(userId: string): Promise<ApiKey[]> {
const keys = await db.select().from(apiKeys).where(eq(apiKeys.userId, userId));
return keys
.map((apiKey) => {
const decryptedKey = CryptoService.decrypt(apiKey.key);
if (!decryptedKey) {
return null;
}
return {
...apiKey,
key: decryptedKey.slice(0, 5) + '*****',
expiresAt: apiKey.expiresAt.toISOString(),
createdAt: apiKey.createdAt.toISOString(),
};
})
.filter((k): k is NonNullable<typeof k> => k !== null);
}
public static async deleteKey(id: string, userId: string) {
await db.delete(apiKeys).where(and(eq(apiKeys.id, id), eq(apiKeys.userId, userId)));
}
/**
 * Validates an API key by comparing its SHA-256 hash against the stored hashes.
 * @param key The raw API key presented by the client.
 * @returns The owner's user ID, or null if the key is unknown or has expired.
 */
public static async validateKey(key: string): Promise<string | null> {
const keyHash = createHash('sha256').update(key).digest('hex');
const [apiKey] = await db.select().from(apiKeys).where(eq(apiKeys.keyHash, keyHash));
if (!apiKey || apiKey.expiresAt < new Date()) {
return null;
}
const decryptedKey = CryptoService.decrypt(apiKey.key);
if (decryptedKey !== key) {
// This should not happen if the hash matches, but as a security measure, we double-check.
return null;
}
return apiKey.userId;
}
}
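The hash-then-lookup pattern used by `ApiKeyService` can be illustrated standalone. The in-memory `store` below is a stand-in for the `api_keys` table, and the helper names are hypothetical; only the key's SHA-256 hash is used for lookup, as in the service:

```typescript
import { createHash, randomBytes } from 'node:crypto';

// Stand-in for the api_keys table: maps SHA-256 key hash -> owner and expiry.
const store = new Map<string, { userId: string; expiresAt: Date }>();

function sha256(key: string): string {
    return createHash('sha256').update(key).digest('hex');
}

// Mirrors ApiKeyService.generate: persist the hash, return the raw key once.
function generate(userId: string, expiresInDays: number): string {
    const key = randomBytes(32).toString('hex');
    const expiresAt = new Date();
    expiresAt.setDate(expiresAt.getDate() + expiresInDays);
    store.set(sha256(key), { userId, expiresAt });
    return key;
}

// Mirrors ApiKeyService.validateKey: hash the presented key and look it up,
// rejecting unknown or expired keys.
function validateKey(key: string): string | null {
    const entry = store.get(sha256(key));
    if (!entry || entry.expiresAt < new Date()) return null;
    return entry.userId;
}

const key = generate('user-1', 30);
console.log(validateKey(key)); // 'user-1'
console.log(validateKey('bogus')); // null
```

The real service additionally stores an encrypted copy of the key (for masked display in the UI) and double-checks it against the presented key on validation.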


@@ -1,6 +1,13 @@
import { count, desc, eq, asc, and } from 'drizzle-orm';
import { db } from '../database';
-import { archivedEmails, attachments, emailAttachments } from '../database/schema';
+import {
+    archivedEmails,
+    attachments,
+    emailAttachments,
+    ingestionSources,
+} from '../database/schema';
import { FilterBuilder } from './FilterBuilder';
import { AuthorizationService } from './AuthorizationService';
import type {
PaginatedArchivedEmails,
ArchivedEmail,
@@ -41,25 +48,41 @@ export class ArchivedEmailService {
public static async getArchivedEmails(
ingestionSourceId: string,
page: number,
-    limit: number
+    limit: number,
+    userId: string
): Promise<PaginatedArchivedEmails> {
const offset = (page - 1) * limit;
const { drizzleFilter } = await FilterBuilder.create(userId, 'archive', 'read');
const where = and(eq(archivedEmails.ingestionSourceId, ingestionSourceId), drizzleFilter);
-    const [total] = await db
+    const countQuery = db
.select({
count: count(archivedEmails.id),
})
.from(archivedEmails)
-        .where(eq(archivedEmails.ingestionSourceId, ingestionSourceId));
+        .leftJoin(ingestionSources, eq(archivedEmails.ingestionSourceId, ingestionSources.id));
-    const items = await db
+    if (where) {
+        countQuery.where(where);
+    }
+    const [total] = await countQuery;
+    const itemsQuery = db
.select()
.from(archivedEmails)
-        .where(eq(archivedEmails.ingestionSourceId, ingestionSourceId))
+        .leftJoin(ingestionSources, eq(archivedEmails.ingestionSourceId, ingestionSources.id))
.orderBy(desc(archivedEmails.sentAt))
.limit(limit)
.offset(offset);
if (where) {
itemsQuery.where(where);
}
const results = await itemsQuery;
const items = results.map((r) => r.archived_emails);
return {
items: items.map((item) => ({
...item,
@@ -73,16 +96,28 @@ export class ArchivedEmailService {
};
}
-    public static async getArchivedEmailById(emailId: string): Promise<ArchivedEmail | null> {
-        const [email] = await db
-            .select()
-            .from(archivedEmails)
-            .where(eq(archivedEmails.id, emailId));
+    public static async getArchivedEmailById(
+        emailId: string,
+        userId: string
+    ): Promise<ArchivedEmail | null> {
+        const email = await db.query.archivedEmails.findFirst({
+            where: eq(archivedEmails.id, emailId),
+            with: {
+                ingestionSource: true,
+            },
+        });
if (!email) {
return null;
}
const authorizationService = new AuthorizationService();
const canRead = await authorizationService.can(userId, 'read', 'archive', email);
if (!canRead) {
return null;
}
let threadEmails: ThreadEmail[] = [];
if (email.threadId) {


@@ -63,7 +63,13 @@ export class AuthService {
roles: roles,
});
-        return { accessToken, user: userWithoutPassword };
+        return {
+            accessToken,
+            user: {
+                ...userWithoutPassword,
+                role: null,
+            },
+        };
}
public async verifyToken(token: string): Promise<AuthTokenPayload | null> {


@@ -0,0 +1,25 @@
import { IamService } from './IamService';
import { createAbilityFor, SubjectObject } from '../iam-policy/ability';
import { subject, Subject } from '@casl/ability';
import { AppActions, AppSubjects } from '@open-archiver/types';
export class AuthorizationService {
private iamService: IamService;
constructor() {
this.iamService = new IamService();
}
public async can(
userId: string,
action: AppActions,
resource: AppSubjects,
resourceObject?: SubjectObject
): Promise<boolean> {
const ability = await this.iamService.getAbilityForUser(userId);
const subjectInstance = resourceObject
? subject(resource, resourceObject as Record<PropertyKey, any>)
: resource;
return ability.can(action, subjectInstance as AppSubjects);
}
}
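The real `AuthorizationService.can` delegates to `ability.can` from `@casl/ability`. A simplified stand-in for that check is sketched below; the `Rule` shape (action, subject, optional `inverted` flag for `cannot` rules) is a deliberate reduction of CASL's rule model and omits conditions, fields, and wildcards:

```typescript
// Simplified CASL-like rule: `inverted: true` corresponds to a `cannot` rule.
interface Rule {
    action: string;
    subject: string;
    inverted?: boolean;
}

// "Last matching rule wins", which is how CASL resolves overlapping rules.
// Real CASL also evaluates conditions against the subject instance.
function can(rules: Rule[], action: string, subject: string): boolean {
    let allowed = false;
    for (const rule of rules) {
        if (rule.action === action && rule.subject === subject) {
            allowed = !rule.inverted;
        }
    }
    return allowed;
}

const rules: Rule[] = [
    { action: 'read', subject: 'archive' },
    { action: 'delete', subject: 'archive', inverted: true },
];
console.log(can(rules, 'read', 'archive')); // true
console.log(can(rules, 'delete', 'archive')); // false
```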


@@ -0,0 +1,58 @@
import { SQL, sql } from 'drizzle-orm';
import { IamService } from './IamService';
import { rulesToQuery } from '@casl/ability/extra';
import { mongoToDrizzle } from '../helpers/mongoToDrizzle';
import { mongoToMeli } from '../helpers/mongoToMeli';
import { AppActions, AppSubjects } from '@open-archiver/types';
export class FilterBuilder {
public static async create(
userId: string,
resourceType: AppSubjects,
action: AppActions
): Promise<{
drizzleFilter: SQL | undefined;
searchFilter: string | undefined;
}> {
const iamService = new IamService();
const ability = await iamService.getAbilityForUser(userId);
// If the user has an unconditional `can` rule and no `cannot` rules,
// they have full access and we can skip building a complex query.
const rules = ability.rulesFor(action, resourceType);
const hasUnconditionalCan = rules.some(
(rule) => rule.inverted === false && !rule.conditions
);
const cannotConditions = rules
.filter((rule) => rule.inverted === true && rule.conditions)
.map((rule) => rule.conditions as object);
if (hasUnconditionalCan && cannotConditions.length === 0) {
return { drizzleFilter: undefined, searchFilter: undefined }; // Full access
}
let query = rulesToQuery(ability, action, resourceType, (rule) => rule.conditions);
if (hasUnconditionalCan && cannotConditions.length > 0) {
// If there's a broad `can` rule, the final query should be an AND of all
// the `cannot` conditions, effectively excluding them.
const andConditions = cannotConditions.map((condition) => {
const newCondition: Record<string, any> = {};
for (const key in condition) {
newCondition[key] = { $ne: (condition as any)[key] };
}
return newCondition;
});
query = { $and: andConditions };
}
if (query === null) {
return { drizzleFilter: undefined, searchFilter: undefined }; // Full access
}
if (Object.keys(query).length === 0) {
return { drizzleFilter: sql`1=0`, searchFilter: 'ingestionSourceId = "-1"' }; // No access
}
return { drizzleFilter: mongoToDrizzle(query), searchFilter: await mongoToMeli(query) };
}
}
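
The `cannot`-inversion step above maps each `cannot` condition to a `$ne` clause and ANDs the clauses together. As an illustrative standalone sketch (the `invertCannotConditions` helper below is hypothetical, not an export of `FilterBuilder`):

```typescript
// Sketch of FilterBuilder's `cannot` inversion: every key of every
// `cannot` condition becomes a `$ne` clause, and all clauses are
// combined under a single `$and`.
type MongoCondition = Record<string, unknown>;

function invertCannotConditions(cannotConditions: MongoCondition[]): MongoCondition {
    const andConditions = cannotConditions.map((condition) => {
        const inverted: MongoCondition = {};
        for (const key in condition) {
            // Negate each equality condition with $ne.
            inverted[key] = { $ne: condition[key] };
        }
        return inverted;
    });
    return { $and: andConditions };
}

const query = invertCannotConditions([{ ingestionSourceId: 'abc' }]);
console.log(JSON.stringify(query));
// {"$and":[{"ingestionSourceId":{"$ne":"abc"}}]}
```

Note that an empty input would yield `{ $and: [] }`; the service above never hits that case because it only takes this path when `cannotConditions.length > 0`.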

View File

@@ -1,9 +1,24 @@
import { db } from '../database';
import { roles } from '../database/schema/users';
import type { Role, PolicyStatement } from '@open-archiver/types';
import { roles, userRoles, users } from '../database/schema/users';
import type { Role, CaslPolicy, User } from '@open-archiver/types';
import { eq } from 'drizzle-orm';
import { createAbilityFor, AppAbility } from '../iam-policy/ability';
export class IamService {
/**
* Retrieves all roles associated with a given user.
* @param userId The ID of the user.
* @returns A promise that resolves to an array of Role objects.
*/
public async getRolesForUser(userId: string): Promise<Role[]> {
const userRolesResult = await db
.select()
.from(userRoles)
.where(eq(userRoles.userId, userId))
.leftJoin(roles, eq(userRoles.roleId, roles.id));
return userRolesResult.map((r) => r.roles).filter((r): r is Role => r !== null);
}
public async getRoles(): Promise<Role[]> {
return db.select().from(roles);
}
@@ -13,12 +28,57 @@ export class IamService {
return role;
}
public async createRole(name: string, policy: PolicyStatement[]): Promise<Role> {
const [role] = await db.insert(roles).values({ name, policies: policy }).returning();
public async createRole(name: string, policy: CaslPolicy[], slug?: string): Promise<Role> {
const [role] = await db
.insert(roles)
.values({
name: name,
slug: slug || name.toLocaleLowerCase().replaceAll(' ', '_'),
policies: policy,
})
.returning();
return role;
}
public async deleteRole(id: string): Promise<void> {
await db.delete(roles).where(eq(roles.id, id));
}
public async updateRole(
id: string,
{ name, policies }: Partial<Pick<Role, 'name' | 'policies'>>
): Promise<Role> {
const [role] = await db
.update(roles)
.set({ name, policies })
.where(eq(roles.id, id))
.returning();
return role;
}
public async getAbilityForUser(userId: string): Promise<AppAbility> {
const user = await db.query.users.findFirst({
where: eq(users.id, userId),
});
if (!user) {
// Alternatively, this could return an ability with no permissions instead of throwing.
throw new Error('User not found');
}
const userRoles = await this.getRolesForUser(userId);
const allPolicies = userRoles.flatMap((role) => role.policies || []);
// Interpolate policies
const interpolatedPolicies = this.interpolatePolicies(allPolicies, {
...user,
role: null,
} as User);
return createAbilityFor(interpolatedPolicies);
}
private interpolatePolicies(policies: CaslPolicy[], user: User): CaslPolicy[] {
const userPoliciesString = JSON.stringify(policies);
const interpolatedPoliciesString = userPoliciesString.replace(/\$\{user\.id\}/g, user.id);
return JSON.parse(interpolatedPoliciesString);
}
}
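
The `${user.id}` interpolation above is a plain string replace over the serialized policies. A minimal standalone sketch, with a simplified `CaslPolicyLike` shape assumed for illustration:

```typescript
// Sketch of IamService's policy interpolation: serialize the policies,
// replace every `${user.id}` placeholder, parse back.
interface CaslPolicyLike {
    action: string;
    subject: string;
    conditions?: Record<string, string>;
}

function interpolatePolicies(policies: CaslPolicyLike[], userId: string): CaslPolicyLike[] {
    const serialized = JSON.stringify(policies);
    return JSON.parse(serialized.replace(/\$\{user\.id\}/g, userId)) as CaslPolicyLike[];
}

const interpolated = interpolatePolicies(
    [{ action: 'read', subject: 'ingestion', conditions: { userId: '${user.id}' } }],
    'u-42'
);
console.log(interpolated[0].conditions);
// { userId: 'u-42' }
```

Because the replacement happens on the JSON string, placeholders are resolved at any nesting depth; the trade-off is that a user ID containing characters that break JSON would corrupt the parse, which is safe here since IDs are UUIDs.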

View File

@@ -66,7 +66,11 @@ export class IndexingService {
.where(eq(emailAttachments.emailId, emailId));
}
const document = await this.createEmailDocument(email, emailAttachmentsResult);
const document = await this.createEmailDocument(
email,
emailAttachmentsResult,
email.userEmail
);
await this.searchService.addDocuments('emails', [document], 'id');
}
@@ -92,8 +96,10 @@ export class IndexingService {
email,
attachments,
ingestionSourceId,
archivedEmailId
archivedEmailId,
email.userEmail || ''
);
// console.log(document);
await this.searchService.addDocuments('emails', [document], 'id');
}
@@ -104,7 +110,8 @@ export class IndexingService {
email: EmailObject,
attachments: AttachmentsType,
ingestionSourceId: string,
archivedEmailId: string
archivedEmailId: string,
userEmail: string // the owner of the email inbox
): Promise<EmailDocument> {
const extractedAttachments = [];
for (const attachment of attachments) {
@@ -122,8 +129,10 @@ export class IndexingService {
// skip attachment or fail the job
}
}
// console.log('email.userEmail', userEmail);
return {
id: archivedEmailId,
userEmail: userEmail,
from: email.from[0]?.address,
to: email.to.map((i: EmailAddress) => i.address) || [],
cc: email.cc?.map((i: EmailAddress) => i.address) || [],
@@ -141,7 +150,8 @@ export class IndexingService {
*/
private async createEmailDocument(
email: typeof archivedEmails.$inferSelect,
attachments: Attachment[]
attachments: Attachment[],
userEmail: string // the owner of the email inbox
): Promise<EmailDocument> {
const attachmentContents = await this.extractAttachmentContents(attachments);
@@ -155,9 +165,10 @@ export class IndexingService {
'';
const recipients = email.recipients as DbRecipients;
// console.log('email.userEmail', email.userEmail);
return {
id: email.id,
userEmail: userEmail,
from: email.senderEmail,
to: recipients.to?.map((r) => r.address) || [],
cc: recipients.cc?.map((r) => r.address) || [],

View File

@@ -25,6 +25,7 @@ import { IndexingService } from './IndexingService';
import { SearchService } from './SearchService';
import { DatabaseService } from './DatabaseService';
import { config } from '../config/index';
import { FilterBuilder } from './FilterBuilder';
export class IngestionService {
private static decryptSource(
@@ -49,11 +50,15 @@ export class IngestionService {
return ['pst_import', 'eml_import'];
}
public static async create(dto: CreateIngestionSourceDto): Promise<IngestionSource> {
public static async create(
dto: CreateIngestionSourceDto,
userId: string
): Promise<IngestionSource> {
const { providerConfig, ...rest } = dto;
const encryptedCredentials = CryptoService.encryptObject(providerConfig);
const valuesToInsert = {
userId,
...rest,
status: 'pending_auth' as const,
credentials: encryptedCredentials,
@@ -81,11 +86,15 @@ export class IngestionService {
}
}
public static async findAll(): Promise<IngestionSource[]> {
const sources = await db
.select()
.from(ingestionSources)
.orderBy(desc(ingestionSources.createdAt));
public static async findAll(userId: string): Promise<IngestionSource[]> {
const { drizzleFilter } = await FilterBuilder.create(userId, 'ingestion', 'read');
let query = db.select().from(ingestionSources).$dynamic();
if (drizzleFilter) {
query = query.where(drizzleFilter);
}
const sources = await query.orderBy(desc(ingestionSources.createdAt));
return sources.flatMap((source) => {
const decrypted = this.decryptSource(source);
return decrypted ? [decrypted] : [];
@@ -398,6 +407,8 @@ export class IngestionService {
searchService,
storageService
);
//assign userEmail
email.userEmail = userEmail;
await indexingService.indexByEmail(email, source.id, archivedEmail.id);
} catch (error) {
logger.error({

View File

@@ -1,6 +1,7 @@
import { Index, MeiliSearch, SearchParams } from 'meilisearch';
import { config } from '../config';
import type { SearchQuery, SearchResult, EmailDocument, TopSender } from '@open-archiver/types';
import { FilterBuilder } from './FilterBuilder';
export class SearchService {
private client: MeiliSearch;
@@ -47,7 +48,7 @@ export class SearchService {
return index.deleteDocuments({ filter });
}
public async searchEmails(dto: SearchQuery): Promise<SearchResult> {
public async searchEmails(dto: SearchQuery, userId: string): Promise<SearchResult> {
const { query, filters, page = 1, limit = 10, matchingStrategy = 'last' } = dto;
const index = await this.getIndex<EmailDocument>('emails');
@@ -70,6 +71,20 @@ export class SearchService {
searchParams.filter = filterStrings.join(' AND ');
}
// Create a filter based on the user's permissions.
// This ensures that the user can only search for emails they are allowed to see.
const { searchFilter } = await FilterBuilder.create(userId, 'archive', 'read');
if (searchFilter) {
// Convert the MongoDB-style filter from CASL to a MeiliSearch filter string.
if (searchParams.filter) {
// If there are existing filters, append the access control filter.
searchParams.filter = `${searchParams.filter} AND ${searchFilter}`;
} else {
// Otherwise, just use the access control filter.
searchParams.filter = searchFilter;
}
}
console.log('searchParams', searchParams);
const searchResults = await index.search(query, searchParams);
return {
@@ -116,8 +131,17 @@ export class SearchService {
'bcc',
'attachments.filename',
'attachments.content',
'userEmail',
],
filterableAttributes: [
'from',
'to',
'cc',
'bcc',
'timestamp',
'ingestionSourceId',
'userEmail',
],
filterableAttributes: ['from', 'to', 'cc', 'bcc', 'timestamp', 'ingestionSourceId'],
sortableAttributes: ['timestamp'],
});
}
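
The filter-combination branch above reduces to a small pure function. Sketched here with a hypothetical `combineFilters` helper to show the AND semantics:

```typescript
// Hedged sketch of SearchService's filter combination: append the
// access-control filter with AND when a user-supplied MeiliSearch
// filter string already exists, otherwise use it on its own.
function combineFilters(
    existing: string | undefined,
    accessFilter: string | undefined
): string | undefined {
    if (!accessFilter) return existing; // full access: keep user filter as-is
    return existing ? `${existing} AND ${accessFilter}` : accessFilter;
}

console.log(combineFilters('from = "a@b.co"', 'userEmail = "me@b.co"'));
// from = "a@b.co" AND userEmail = "me@b.co"
console.log(combineFilters(undefined, 'userEmail = "me@b.co"'));
// userEmail = "me@b.co"
```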

View File

@@ -0,0 +1,55 @@
import { db } from '../database';
import { systemSettings } from '../database/schema/system-settings';
import type { SystemSettings } from '@open-archiver/types';
import { eq } from 'drizzle-orm';
const DEFAULT_SETTINGS: SystemSettings = {
language: 'en',
theme: 'system',
supportEmail: null,
};
export class SettingsService {
/**
* Retrieves the current system settings.
* If no settings exist, it initializes and returns the default settings.
* @returns The system settings.
*/
public async getSystemSettings(): Promise<SystemSettings> {
const settings = await db.select().from(systemSettings).limit(1);
if (settings.length === 0) {
return this.createDefaultSystemSettings();
}
return settings[0].config;
}
/**
* Updates the system settings by merging the new configuration with the existing one.
* @param newConfig - A partial object of the new settings configuration.
* @returns The updated system settings.
*/
public async updateSystemSettings(newConfig: Partial<SystemSettings>): Promise<SystemSettings> {
const currentConfig = await this.getSystemSettings();
const mergedConfig = { ...currentConfig, ...newConfig };
// Since getSettings ensures a record always exists, we can directly update.
const [result] = await db.update(systemSettings).set({ config: mergedConfig }).returning();
return result.config;
}
/**
* Creates and saves the default system settings.
* This is called internally when no settings are found.
* @returns The newly created default settings.
*/
private async createDefaultSystemSettings(): Promise<SystemSettings> {
const [result] = await db
.insert(systemSettings)
.values({ config: DEFAULT_SETTINGS })
.returning();
return result.config;
}
}
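
The merge in `updateSystemSettings` is a shallow spread: keys present in the partial update win, everything else carries over unchanged. A minimal sketch:

```typescript
// Sketch of the shallow merge performed by updateSystemSettings.
interface SystemSettingsLike {
    language: string;
    theme: string;
    supportEmail: string | null;
}

function mergeSettings(
    current: SystemSettingsLike,
    update: Partial<SystemSettingsLike>
): SystemSettingsLike {
    // Later spread overrides earlier keys; untouched keys survive.
    return { ...current, ...update };
}

const merged = mergeSettings(
    { language: 'en', theme: 'system', supportEmail: null },
    { theme: 'dark' }
);
console.log(merged);
// { language: 'en', theme: 'dark', supportEmail: null }
```

Note the spread is shallow and does not recurse into nested objects, which matches the service's single-level settings shape.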

View File

@@ -1,9 +1,8 @@
import { db } from '../database';
import * as schema from '../database/schema';
import { and, eq, asc, sql } from 'drizzle-orm';
import { eq, sql } from 'drizzle-orm';
import { hash } from 'bcryptjs';
import type { PolicyStatement, User } from '@open-archiver/types';
import { PolicyValidator } from '../iam-policy/policy-validator';
import type { CaslPolicy, User } from '@open-archiver/types';
export class UserService {
/**
@@ -23,11 +22,91 @@ export class UserService {
* @param id The ID of the user to find.
* @returns The user object if found, otherwise null.
*/
public async findById(id: string): Promise<typeof schema.users.$inferSelect | null> {
public async findById(id: string): Promise<User | null> {
const user = await db.query.users.findFirst({
where: eq(schema.users.id, id),
with: {
userRoles: {
with: {
role: true,
},
},
},
});
return user || null;
if (!user) return null;
return {
...user,
role: user.userRoles[0]?.role || null,
};
}
public async findAll(): Promise<User[]> {
const users = await db.query.users.findMany({
with: {
userRoles: {
with: {
role: true,
},
},
},
});
return users.map((u) => ({
...u,
role: u.userRoles[0]?.role || null,
}));
}
public async createUser(
userDetails: Pick<User, 'email' | 'first_name' | 'last_name'> & { password?: string },
roleId: string
): Promise<typeof schema.users.$inferSelect> {
const { email, first_name, last_name, password } = userDetails;
const hashedPassword = password ? await hash(password, 10) : undefined;
const newUser = await db
.insert(schema.users)
.values({
email,
first_name,
last_name,
password: hashedPassword,
})
.returning();
await db.insert(schema.userRoles).values({
userId: newUser[0].id,
roleId: roleId,
});
return newUser[0];
}
public async updateUser(
id: string,
userDetails: Partial<Pick<User, 'email' | 'first_name' | 'last_name'>>,
roleId?: string
): Promise<typeof schema.users.$inferSelect | null> {
const updatedUser = await db
.update(schema.users)
.set(userDetails)
.where(eq(schema.users.id, id))
.returning();
if (roleId) {
await db.delete(schema.userRoles).where(eq(schema.userRoles.userId, id));
await db.insert(schema.userRoles).values({
userId: id,
roleId: roleId,
});
}
return updatedUser[0] || null;
}
public async deleteUser(id: string): Promise<void> {
await db.delete(schema.users).where(eq(schema.users.id, id));
}
/**
@@ -66,29 +145,7 @@ export class UserService {
})
.returning();
// find super admin role
let superAdminRole = await db.query.roles.findFirst({
where: eq(schema.roles.name, 'Super Admin'),
});
if (!superAdminRole) {
const suerAdminPolicies: PolicyStatement[] = [
{
Effect: 'Allow',
Action: ['*'],
Resource: ['*'],
},
];
superAdminRole = (
await db
.insert(schema.roles)
.values({
name: 'Super Admin',
policies: suerAdminPolicies,
})
.returning()
)[0];
}
const superAdminRole = await this.createAdminRole();
await db.insert(schema.userRoles).values({
userId: newUser[0].id,
@@ -97,4 +154,31 @@ export class UserService {
return newUser[0];
}
public async createAdminRole() {
// find super admin role
let superAdminRole = await db.query.roles.findFirst({
where: eq(schema.roles.name, 'Super Admin'),
});
if (!superAdminRole) {
const superAdminPolicies: CaslPolicy[] = [
{
action: 'manage',
subject: 'all',
},
];
superAdminRole = (
await db
.insert(schema.roles)
.values({
name: 'Super Admin',
slug: 'predefined_super_admin',
policies: superAdminPolicies,
})
})
.returning()
)[0];
}
return superAdminRole;
}
}

View File

@@ -26,6 +26,10 @@ export class ImapConnector implements IEmailConnector {
host: this.credentials.host,
port: this.credentials.port,
secure: this.credentials.secure,
tls: {
rejectUnauthorized: this.credentials.allowInsecureCert,
requestCert: true,
},
auth: {
user: this.credentials.username,
pass: this.credentials.password,
@@ -145,108 +149,112 @@ export class ImapConnector implements IEmailConnector {
userEmail: string,
syncState?: SyncState | null
): AsyncGenerator<EmailObject | null> {
// list all mailboxes first
const mailboxes = await this.withRetry(async () => await this.client.list());
await this.disconnect();
try {
// list all mailboxes first
const mailboxes = await this.withRetry(async () => await this.client.list());
const processableMailboxes = mailboxes.filter((mailbox) => {
// filter out trash and all mail emails
if (mailbox.specialUse) {
const specialUse = mailbox.specialUse.toLowerCase();
if (specialUse === '\\junk' || specialUse === '\\trash' || specialUse === '\\all') {
const processableMailboxes = mailboxes.filter((mailbox) => {
// filter out trash and all mail emails
if (mailbox.specialUse) {
const specialUse = mailbox.specialUse.toLowerCase();
if (
specialUse === '\\junk' ||
specialUse === '\\trash' ||
specialUse === '\\all'
) {
return false;
}
}
// Fallback to checking flags
if (
mailbox.flags.has('\\Noselect') ||
mailbox.flags.has('\\Trash') ||
mailbox.flags.has('\\Junk') ||
mailbox.flags.has('\\All')
) {
return false;
}
}
// Fallback to checking flags
if (
mailbox.flags.has('\\Noselect') ||
mailbox.flags.has('\\Trash') ||
mailbox.flags.has('\\Junk') ||
mailbox.flags.has('\\All')
) {
return false;
}
return true;
});
return true;
});
for (const mailboxInfo of processableMailboxes) {
const mailboxPath = mailboxInfo.path;
logger.info({ mailboxPath }, 'Processing mailbox');
for (const mailboxInfo of processableMailboxes) {
const mailboxPath = mailboxInfo.path;
logger.info({ mailboxPath }, 'Processing mailbox');
try {
const mailbox = await this.withRetry(
async () => await this.client.mailboxOpen(mailboxPath)
);
const lastUid = syncState?.imap?.[mailboxPath]?.maxUid;
let currentMaxUid = lastUid || 0;
try {
const mailbox = await this.withRetry(
async () => await this.client.mailboxOpen(mailboxPath)
);
const lastUid = syncState?.imap?.[mailboxPath]?.maxUid;
let currentMaxUid = lastUid || 0;
if (mailbox.exists > 0) {
const lastMessage = await this.client.fetchOne(String(mailbox.exists), {
uid: true,
});
if (lastMessage && lastMessage.uid > currentMaxUid) {
currentMaxUid = lastMessage.uid;
}
}
// Initialize with last synced UID, not the maximum UID in mailbox
this.newMaxUids[mailboxPath] = lastUid || 0;
// Only fetch if the mailbox has messages, to avoid errors on empty mailboxes with some IMAP servers.
if (mailbox.exists > 0) {
const BATCH_SIZE = 250; // A configurable batch size
let startUid = (lastUid || 0) + 1;
const maxUidToFetch = currentMaxUid;
while (startUid <= maxUidToFetch) {
const endUid = Math.min(startUid + BATCH_SIZE - 1, maxUidToFetch);
const searchCriteria = { uid: `${startUid}:${endUid}` };
for await (const msg of this.client.fetch(searchCriteria, {
envelope: true,
source: true,
bodyStructure: true,
if (mailbox.exists > 0) {
const lastMessage = await this.client.fetchOne(String(mailbox.exists), {
uid: true,
})) {
if (lastUid && msg.uid <= lastUid) {
continue;
}
});
if (lastMessage && lastMessage.uid > currentMaxUid) {
currentMaxUid = lastMessage.uid;
}
}
if (msg.uid > this.newMaxUids[mailboxPath]) {
this.newMaxUids[mailboxPath] = msg.uid;
}
// Initialize with last synced UID, not the maximum UID in mailbox
this.newMaxUids[mailboxPath] = lastUid || 0;
logger.debug({ mailboxPath, uid: msg.uid }, 'Processing message');
// Only fetch if the mailbox has messages, to avoid errors on empty mailboxes with some IMAP servers.
if (mailbox.exists > 0) {
const BATCH_SIZE = 250; // A configurable batch size
let startUid = (lastUid || 0) + 1;
const maxUidToFetch = currentMaxUid;
if (msg.envelope && msg.source) {
try {
yield await this.parseMessage(msg, mailboxPath);
} catch (err: any) {
logger.error(
{ err, mailboxPath, uid: msg.uid },
'Failed to parse message'
);
throw err;
while (startUid <= maxUidToFetch) {
const endUid = Math.min(startUid + BATCH_SIZE - 1, maxUidToFetch);
const searchCriteria = { uid: `${startUid}:${endUid}` };
for await (const msg of this.client.fetch(searchCriteria, {
envelope: true,
source: true,
bodyStructure: true,
uid: true,
})) {
if (lastUid && msg.uid <= lastUid) {
continue;
}
if (msg.uid > this.newMaxUids[mailboxPath]) {
this.newMaxUids[mailboxPath] = msg.uid;
}
logger.debug({ mailboxPath, uid: msg.uid }, 'Processing message');
if (msg.envelope && msg.source) {
try {
yield await this.parseMessage(msg, mailboxPath);
} catch (err: any) {
logger.error(
{ err, mailboxPath, uid: msg.uid },
'Failed to parse message'
);
throw err;
}
}
}
}
// Move to the next batch
startUid = endUid + 1;
// Move to the next batch
startUid = endUid + 1;
}
}
} catch (err: any) {
logger.error({ err, mailboxPath }, 'Failed to process mailbox');
// Check if the error indicates a persistent failure after retries
if (err.message.includes('IMAP operation failed after all retries')) {
this.statusMessage =
'Sync paused due to reaching the mail server rate limit. The process will automatically resume later.';
}
}
} catch (err: any) {
logger.error({ err, mailboxPath }, 'Failed to process mailbox');
// Check if the error indicates a persistent failure after retries
if (err.message.includes('IMAP operation failed after all retries')) {
this.statusMessage =
'Sync paused due to reaching the mail server rate limit. The process will automatically resume later.';
}
} finally {
await this.disconnect();
}
} finally {
await this.disconnect();
}
}
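
The batching loop above walks UID ranges of `BATCH_SIZE` messages from the last synced UID up to the mailbox maximum. The range arithmetic can be sketched on its own (the `uidBatches` helper is hypothetical, shown only to isolate the windowing logic):

```typescript
// Sketch of the IMAP UID windowing: produce `start:end` range strings
// of at most `batchSize` UIDs, resuming after the last synced UID.
function uidBatches(lastUid: number, maxUid: number, batchSize = 250): string[] {
    const ranges: string[] = [];
    let startUid = lastUid + 1;
    while (startUid <= maxUid) {
        // Clamp the window end to the mailbox's maximum UID.
        const endUid = Math.min(startUid + batchSize - 1, maxUid);
        ranges.push(`${startUid}:${endUid}`);
        startUid = endUid + 1; // move to the next batch
    }
    return ranges;
}

console.log(uidBatches(0, 600));
// [ '1:250', '251:500', '501:600' ]
```

A fully synced mailbox (`lastUid === maxUid`) yields no ranges, so the connector performs no fetches on an incremental run with nothing new.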

View File

@@ -23,6 +23,7 @@
"lucide-svelte": "^0.525.0",
"postal-mime": "^2.4.4",
"svelte-persisted-store": "^0.12.0",
"sveltekit-i18n": "^2.4.2",
"tailwind-merge": "^3.3.1",
"tailwind-variants": "^1.0.0"
},

View File

@@ -1,6 +1,7 @@
<script lang="ts">
import PostalMime, { type Email } from 'postal-mime';
import type { Buffer } from 'buffer';
import { t } from '$lib/translations';
let {
raw,
@@ -51,13 +52,16 @@
<div class="mt-2 rounded-md border bg-white p-4">
{#if isLoading}
<p>Loading email preview...</p>
<p>{$t('components.email_preview.loading')}</p>
{:else if emailHtml}
<iframe title="Email Preview" srcdoc={emailHtml()} class="h-[600px] w-full border-none"
<iframe
title={$t('archive.email_preview')}
srcdoc={emailHtml()}
class="h-[600px] w-full border-none"
></iframe>
{:else if raw}
<p>Could not render email preview.</p>
<p>{$t('components.email_preview.render_error')}</p>
{:else}
<p class="text-gray-500">Raw .eml file not available for this email.</p>
<p class="text-gray-500">{$t('components.email_preview.not_available')}</p>
{/if}
</div>

View File

@@ -2,6 +2,7 @@
import { goto } from '$app/navigation';
import type { ArchivedEmail } from '@open-archiver/types';
import { ScrollArea } from '$lib/components/ui/scroll-area/index.js';
import { t } from '$lib/translations';
let {
thread,
@@ -47,16 +48,16 @@
goto(`/dashboard/archived-emails/${item.id}`, {
invalidateAll: true,
});
}}>{item.subject || 'No Subject'}</a
}}>{item.subject || $t('app.archive.no_subject')}</a
>
{:else}
{item.subject || 'No Subject'}
{item.subject || $t('app.archive.no_subject')}
{/if}
</h4>
<div
class="flex flex-col space-y-2 text-sm font-normal leading-none text-gray-400"
>
<span>From: {item.senderEmail}</span>
<span>{$t('app.archive.from')}: {item.senderEmail}</span>
<time class="">{new Date(item.sentAt).toLocaleString()}</time>
</div>
</div>

View File

@@ -1,12 +1,17 @@
<footer class=" bg-muted py-6 md:py-0">
<script lang="ts">
import { t } from '$lib/translations';
</script>
<footer class="bg-muted py-6 md:py-0">
<div
class="container mx-auto flex flex-col items-center justify-center gap-4 md:h-24 md:flex-row"
>
<div class="flex flex-col items-center gap-2">
<p class=" text-balance text-center text-xs font-medium leading-loose">
<p class="text-balance text-center text-xs font-medium leading-loose">
© {new Date().getFullYear()}
<a href="https://openarchiver.com/" target="_blank">Open Archiver</a>. All rights
reserved.
<a href="https://openarchiver.com/" target="_blank">Open Archiver</a>. {$t(
'app.components.footer.all_rights_reserved'
)}
</p>
</div>
</div>

View File

@@ -11,6 +11,7 @@
import { setAlert } from '$lib/components/custom/alert/alert-state.svelte';
import { api } from '$lib/api.client';
import { Loader2 } from 'lucide-svelte';
import { t } from '$lib/translations';
let {
source = null,
onSubmit,
@@ -20,11 +21,26 @@
} = $props();
const providerOptions = [
{ value: 'generic_imap', label: 'Generic IMAP' },
{ value: 'google_workspace', label: 'Google Workspace' },
{ value: 'microsoft_365', label: 'Microsoft 365' },
{ value: 'pst_import', label: 'PST Import' },
{ value: 'eml_import', label: 'EML Import' },
{
value: 'generic_imap',
label: $t('app.components.ingestion_source_form.provider_generic_imap'),
},
{
value: 'google_workspace',
label: $t('app.components.ingestion_source_form.provider_google_workspace'),
},
{
value: 'microsoft_365',
label: $t('app.components.ingestion_source_form.provider_microsoft_365'),
},
{
value: 'pst_import',
label: $t('app.components.ingestion_source_form.provider_pst_import'),
},
{
value: 'eml_import',
label: $t('app.components.ingestion_source_form.provider_eml_import'),
},
];
let formData: CreateIngestionSourceDto = $state({
@@ -33,6 +49,7 @@
providerConfig: source?.credentials ?? {
type: source?.provider ?? 'generic_imap',
secure: true,
allowInsecureCert: false,
},
});
@@ -42,7 +59,8 @@
});
const triggerContent = $derived(
providerOptions.find((p) => p.value === formData.provider)?.label ?? 'Select a provider'
providerOptions.find((p) => p.value === formData.provider)?.label ??
$t('app.components.ingestion_source_form.select_provider')
);
let isSubmitting = $state(false);
@@ -89,7 +107,7 @@
fileUploading = false;
setAlert({
type: 'error',
title: 'Upload Failed, please try again',
title: $t('app.components.ingestion_source_form.upload_failed'),
message: JSON.stringify(error),
duration: 5000,
show: true,
@@ -100,11 +118,11 @@
<form onsubmit={handleSubmit} class="grid gap-4 py-4">
<div class="grid grid-cols-4 items-center gap-4">
<Label for="name" class="text-left">Name</Label>
<Label for="name" class="text-left">{$t('app.ingestions.name')}</Label>
<Input id="name" bind:value={formData.name} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="provider" class="text-left">Provider</Label>
<Label for="provider" class="text-left">{$t('app.ingestions.provider')}</Label>
<Select.Root name="provider" bind:value={formData.provider} type="single">
<Select.Trigger class="col-span-3">
{triggerContent}
@@ -119,16 +137,22 @@
{#if formData.provider === 'google_workspace'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="serviceAccountKeyJson" class="text-left">Service Account Key (JSON)</Label>
<Label for="serviceAccountKeyJson" class="text-left"
>{$t('app.components.ingestion_source_form.service_account_key')}</Label
>
<Textarea
placeholder="Paste your service account key JSON content"
placeholder={$t(
'app.components.ingestion_source_form.service_account_key_placeholder'
)}
id="serviceAccountKeyJson"
bind:value={formData.providerConfig.serviceAccountKeyJson}
class="col-span-3 max-h-32"
/>
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="impersonatedAdminEmail" class="text-left">Impersonated Admin Email</Label>
<Label for="impersonatedAdminEmail" class="text-left"
>{$t('app.components.ingestion_source_form.impersonated_admin_email')}</Label
>
<Input
id="impersonatedAdminEmail"
bind:value={formData.providerConfig.impersonatedAdminEmail}
@@ -137,30 +161,40 @@
</div>
{:else if formData.provider === 'microsoft_365'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="clientId" class="text-left">Application (Client) ID</Label>
<Label for="clientId" class="text-left"
>{$t('app.components.ingestion_source_form.client_id')}</Label
>
<Input id="clientId" bind:value={formData.providerConfig.clientId} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="clientSecret" class="text-left">Client Secret Value</Label>
<Label for="clientSecret" class="text-left"
>{$t('app.components.ingestion_source_form.client_secret')}</Label
>
<Input
id="clientSecret"
type="password"
placeholder="Enter the secret Value, not the Secret ID"
placeholder={$t('app.components.ingestion_source_form.client_secret_placeholder')}
bind:value={formData.providerConfig.clientSecret}
class="col-span-3"
/>
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="tenantId" class="text-left">Directory (Tenant) ID</Label>
<Label for="tenantId" class="text-left"
>{$t('app.components.ingestion_source_form.tenant_id')}</Label
>
<Input id="tenantId" bind:value={formData.providerConfig.tenantId} class="col-span-3" />
</div>
{:else if formData.provider === 'generic_imap'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="host" class="text-left">Host</Label>
<Label for="host" class="text-left"
>{$t('app.components.ingestion_source_form.host')}</Label
>
<Input id="host" bind:value={formData.providerConfig.host} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="port" class="text-left">Port</Label>
<Label for="port" class="text-left"
>{$t('app.components.ingestion_source_form.port')}</Label
>
<Input
id="port"
type="number"
@@ -169,11 +203,13 @@
/>
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="username" class="text-left">Username</Label>
<Label for="username" class="text-left"
>{$t('app.components.ingestion_source_form.username')}</Label
>
<Input id="username" bind:value={formData.providerConfig.username} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="password" class="text-left">Password</Label>
<Label for="password" class="text-left">{$t('auth.password')}</Label>
<Input
id="password"
type="password"
@@ -182,12 +218,22 @@
/>
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="secure" class="text-left">Use TLS</Label>
<Label for="secure" class="text-left"
>{$t('app.components.ingestion_source_form.use_tls')}</Label
>
<Checkbox id="secure" bind:checked={formData.providerConfig.secure} />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="secure" class="text-left"
>{$t('app.components.ingestion_source_form.allow_insecure_cert')}</Label
>
<Checkbox id="secure" bind:checked={formData.providerConfig.allowInsecureCert} />
</div>
{:else if formData.provider === 'pst_import'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="pst-file" class="text-left">PST File</Label>
<Label for="pst-file" class="text-left"
>{$t('app.components.ingestion_source_form.pst_file')}</Label
>
<div class="col-span-3 flex flex-row items-center space-x-2">
<Input
id="pst-file"
@@ -203,7 +249,9 @@
</div>
{:else if formData.provider === 'eml_import'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="eml-file" class="text-left">EML File</Label>
<Label for="eml-file" class="text-left"
>{$t('app.components.ingestion_source_form.eml_file')}</Label
>
<div class="col-span-3 flex flex-row items-center space-x-2">
<Input
id="eml-file"
@@ -220,12 +268,10 @@
{/if}
{#if formData.provider === 'google_workspace' || formData.provider === 'microsoft_365'}
<Alert.Root>
<Alert.Title>Heads up!</Alert.Title>
<Alert.Title>{$t('app.components.ingestion_source_form.heads_up')}</Alert.Title>
<Alert.Description>
<div class="my-1">
Please note that this is an organization-wide operation. This kind of ingestions
will import and index <b>all</b> email inboxes in your organization. If you want
to import only specific email inboxes, use the IMAP connector.
{@html $t('app.components.ingestion_source_form.org_wide_warning')}
</div>
</Alert.Description>
</Alert.Root>
@@ -233,9 +279,9 @@
<Dialog.Footer>
<Button type="submit" disabled={isSubmitting || fileUploading}>
{#if isSubmitting}
Submitting...
{$t('app.components.common.submitting')}
{:else}
Submit
{$t('app.components.common.submit')}
{/if}
</Button>
</Dialog.Footer>

Some files were not shown because too many files have changed in this diff.