Mirror of https://github.com/LogicLabs-OU/OpenArchiver.git, synced 2026-04-06 00:31:57 +02:00

Merge pull request #28 from LogicLabs-OU/dev

Migrate to DB users, implement IAM & add PST/EML importers

This commit is contained in:
@@ -4,6 +4,8 @@
NODE_ENV=development
PORT_BACKEND=4000
PORT_FRONTEND=3000
# The frequency of continuous email syncing. Default is every minute, but you can change it to another value based on your needs.
SYNC_FREQUENCY='* * * * *'
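`SYNC_FREQUENCY` uses standard five-field cron syntax (minute, hour, day of month, month, day of week); `* * * * *` runs the sync every minute. A few illustrative alternatives (not defaults shipped with the project):

```
# Every 5 minutes
SYNC_FREQUENCY='*/5 * * * *'
# Every hour, on the hour
SYNC_FREQUENCY='0 * * * *'
# Every day at 02:00
SYNC_FREQUENCY='0 2 * * *'
```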

# --- Docker Compose Service Configuration ---
# These variables are used by docker-compose.yml to configure the services. Leave them unchanged if you use Docker services for PostgreSQL, Valkey (Redis) and Meilisearch. If you decide to use your own instances of these services, you can substitute them with your own connection credentials.
@@ -20,7 +22,7 @@ MEILI_HOST=http://meilisearch:7700
# Valkey (Redis compatible)
# Redis (We use Valkey, which is Redis-compatible and open source)
REDIS_HOST=valkey
REDIS_PORT=6379
REDIS_PASSWORD=defaultredispassword
@@ -55,13 +57,12 @@ STORAGE_S3_FORCE_PATH_STYLE=false
JWT_SECRET=a-very-secret-key-that-you-should-change
JWT_EXPIRES_IN="7d"

# Admin User
# Set the credentials for the initial admin user.
ADMIN_EMAIL=admin@local.com
ADMIN_PASSWORD=a_strong_password_that_you_should_change
SUPER_API_KEY=

# Master Encryption Key for sensitive data (Such as Ingestion source credentials and passwords)
# IMPORTANT: Generate a secure, random 32-byte hex string for this
# You can use `openssl rand -hex 32` to generate a key.
ENCRYPTION_KEY=
README.md (43 lines changed)
@@ -1,14 +1,17 @@
# Open Archiver

[](https://www.docker.com)
[](https://www.postgresql.org/)
[](https://www.meilisearch.com/)
[](https://www.typescriptlang.org/)
[](https://redis.io)
[](https://svelte.dev/)

**A secure, sovereign, and affordable open-source platform for email archiving and eDiscovery.**
**A secure, sovereign, and open-source platform for email archiving and eDiscovery.**

Open Archiver provides a robust, self-hosted solution for archiving, storing, indexing, and searching emails from major platforms, including Google Workspace (Gmail), Microsoft 365, as well as generic IMAP-enabled email inboxes. Use Open Archiver to keep a permanent, tamper-proof record of your communication history, free from vendor lock-in.
Open Archiver provides a robust, self-hosted solution for archiving, storing, indexing, and searching emails from major platforms, including Google Workspace (Gmail), Microsoft 365, PST files, as well as generic IMAP-enabled email inboxes. Use Open Archiver to keep a permanent, tamper-proof record of your communication history, free from vendor lock-in.

## Screenshots
## 📸 Screenshots

_Dashboard_
@@ -19,7 +22,7 @@ _Archived emails_
_Full-text search across all your emails and attachments_

## Community
## 👨‍👩‍👧‍👦 Join our community!

We are committed to building an engaging community around Open Archiver, and we are inviting all of you to join our community on Discord to get real-time support and connect with the team.
@@ -27,7 +30,7 @@ We are committed to build an engaging community around Open Archiver, and we are
[](https://bsky.app/profile/openarchiver.bsky.social)

## Live demo
## 🚀 Live demo

Check out the live demo here: https://demo.openarchiver.com
@@ -35,16 +38,24 @@ Username: admin@local.com
Password: openarchiver_demo

## Key Features
## ✨ Key Features

- **Universal Ingestion**: Connect to any email provider to perform initial bulk imports and maintain continuous, real-time synchronization. Ingestion sources include:
    - IMAP connection
    - Google Workspace
    - Microsoft 365
    - PST files
    - Zipped .eml files
- **Universal Ingestion**: Connect to Google Workspace, Microsoft 365, and standard IMAP servers to perform initial bulk imports and maintain continuous, real-time synchronization.
- **Secure & Efficient Storage**: Emails are stored in the standard `.eml` format. The system uses deduplication and compression to minimize storage costs. All data is encrypted at rest.
- **Pluggable Storage Backends**: Supports both local filesystem storage and S3-compatible object storage (like AWS S3 or MinIO).
- **Powerful Search & eDiscovery**: A high-performance search engine indexes the full text of emails and attachments (PDF, DOCX, etc.).
- **Thread discovery**: The ability to discover if an email belongs to a thread/conversation and present the context.
- **Compliance & Retention**: Define granular retention policies to automatically manage the lifecycle of your data. Place legal holds on communications to prevent deletion during litigation (TBD).
- **Comprehensive Auditing**: An immutable audit trail logs all system activities, ensuring you have a clear record of who accessed what and when (TBD).

## Tech Stack
## 🛠️ Tech Stack

Open Archiver is built on a modern, scalable, and maintainable technology stack:
@@ -55,7 +66,7 @@ Open Archiver is built on a modern, scalable, and maintainable technology stack:
- **Database**: PostgreSQL for metadata, user management, and audit logs
- **Deployment**: Docker Compose deployment

## Deployment
## 📦 Deployment

### Prerequisites
@@ -91,7 +102,7 @@ Open Archiver is built on a modern, scalable, and maintainable technology stack:
4. **Access the application:**
   Once the services are running, you can access the Open Archiver web interface by navigating to `http://localhost:3000` in your web browser.

## Data Source Configuration
## ⚙️ Data Source Configuration

After deploying the application, you will need to configure one or more ingestion sources to begin archiving emails. Follow our detailed guides to connect to your email provider:
@@ -99,7 +110,7 @@ After deploying the application, you will need to configure one or more ingestio
- [Connecting to Microsoft 365](https://docs.openarchiver.com/user-guides/email-providers/microsoft-365.html)
- [Connecting to a Generic IMAP Server](https://docs.openarchiver.com/user-guides/email-providers/imap.html)

## Contributing
## 🤝 Contributing

We welcome contributions from the community!
@@ -109,4 +120,6 @@ We welcome contributions from the community!
Please read our `CONTRIBUTING.md` file for more details on our code of conduct and the process for submitting pull requests.

## Star History [](https://www.star-history.com/#LogicLabs-OU/OpenArchiver&Date)
## 📈 Star History

[](https://www.star-history.com/#LogicLabs-OU/OpenArchiver&Date)
@@ -37,9 +37,11 @@ export default defineConfig({
link: '/user-guides/email-providers/',
collapsed: true,
items: [
    { text: 'Google Workspace', link: '/user-guides/email-providers/google-workspace' },
    { text: 'Generic IMAP Server', link: '/user-guides/email-providers/imap' },
    { text: 'Microsoft 365', link: '/user-guides/email-providers/microsoft-365' }
    { text: 'Google Workspace', link: '/user-guides/email-providers/google-workspace' },
    { text: 'Microsoft 365', link: '/user-guides/email-providers/microsoft-365' },
    { text: 'EML Import', link: '/user-guides/email-providers/eml' },
    { text: 'PST Import', link: '/user-guides/email-providers/pst' }
]
}
]
@@ -6,7 +6,7 @@ The Ingestion Service manages ingestion sources, which are configurations for co
All endpoints in this service require authentication.

### POST /api/v1/ingestion
### POST /api/v1/ingestion-sources

Creates a new ingestion source.
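As a sketch of how a client might call this endpoint: the DTO fields shown below (`name`, `provider`, `credentials`) are illustrative assumptions for demonstration only; consult the real `CreateIngestionSourceDto` interface for the actual shape.

```typescript
// Illustrative sketch: building the request for POST /api/v1/ingestion-sources.
// The DTO fields here are assumptions, not the documented interface.
interface CreateIngestionSourceDto {
    name: string;
    provider: string;
    credentials: Record<string, string>;
}

function buildCreateSourceRequest(dto: CreateIngestionSourceDto, token: string) {
    return {
        method: 'POST' as const,
        headers: {
            'Content-Type': 'application/json',
            // All endpoints in this service require authentication.
            Authorization: `Bearer ${token}`,
        },
        // The DTO is sent as a JSON body.
        body: JSON.stringify(dto),
    };
}

// Usage: fetch('/api/v1/ingestion-sources', buildCreateSourceRequest(dto, token))
```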
@@ -29,7 +29,7 @@ interface CreateIngestionSourceDto {
- **201 Created:** The newly created ingestion source.
- **500 Internal Server Error:** An unexpected error occurred.

### GET /api/v1/ingestion
### GET /api/v1/ingestion-sources

Retrieves all ingestion sources.
@@ -40,7 +40,7 @@ Retrieves all ingestion sources.
- **200 OK:** An array of ingestion source objects.
- **500 Internal Server Error:** An unexpected error occurred.

### GET /api/v1/ingestion/:id
### GET /api/v1/ingestion-sources/:id

Retrieves a single ingestion source by its ID.
@@ -58,7 +58,7 @@ Retrieves a single ingestion source by its ID.
- **404 Not Found:** Ingestion source not found.
- **500 Internal Server Error:** An unexpected error occurred.

### PUT /api/v1/ingestion/:id
### PUT /api/v1/ingestion-sources/:id

Updates an existing ingestion source.
@@ -95,7 +95,7 @@ interface UpdateIngestionSourceDto {
- **404 Not Found:** Ingestion source not found.
- **500 Internal Server Error:** An unexpected error occurred.

### DELETE /api/v1/ingestion/:id
### DELETE /api/v1/ingestion-sources/:id

Deletes an ingestion source and all associated data.
@@ -113,7 +113,7 @@ Deletes an ingestion source and all associated data.
- **404 Not Found:** Ingestion source not found.
- **500 Internal Server Error:** An unexpected error occurred.

### POST /api/v1/ingestion/:id/import
### POST /api/v1/ingestion-sources/:id/import

Triggers the initial import process for an ingestion source.
@@ -131,7 +131,7 @@ Triggers the initial import process for an ingestion source.
- **404 Not Found:** Ingestion source not found.
- **500 Internal Server Error:** An unexpected error occurred.

### POST /api/v1/ingestion/:id/pause
### POST /api/v1/ingestion-sources/:id/pause

Pauses an active ingestion source.
@@ -149,7 +149,7 @@ Pauses an active ingestion source.
- **404 Not Found:** Ingestion source not found.
- **500 Internal Server Error:** An unexpected error occurred.

### POST /api/v1/ingestion/:id/sync
### POST /api/v1/ingestion-sources/:id/sync

Triggers a forced synchronization for an ingestion source.
docs/services/IAM-service/iam-policies.md (141 lines, new file)
@@ -0,0 +1,141 @@
# IAM Policies Guide

This document provides a comprehensive guide to the Identity and Access Management (IAM) policies in Open Archiver. Our policy structure is inspired by AWS IAM, providing a powerful and flexible way to manage permissions.

## 1. Policy Structure

A policy is a JSON object that consists of one or more statements. Each statement includes an `Effect`, `Action`, and `Resource`.

```json
{
    "Effect": "Allow",
    "Action": ["archive:read", "archive:search"],
    "Resource": ["archive/all"]
}
```

- **`Effect`**: Specifies whether the statement results in an `Allow` or `Deny`. An explicit `Deny` always overrides an `Allow`.
- **`Action`**: A list of operations that the policy grants or denies permission to perform. Actions are formatted as `service:operation`.
- **`Resource`**: A list of resources to which the actions apply. Resources are specified in a hierarchical format. Wildcards (`*`) can be used.

## 2. Wildcard Support

Our IAM system supports wildcards (`*`) in both `Action` and `Resource` fields to provide flexible permission management, as defined in the `PolicyValidator`.

### Action Wildcards

You can use wildcards to grant broad permissions for actions:

- **Global Wildcard (`*`)**: A standalone `*` in the `Action` field grants permission for all possible actions across all services.
    ```json
    "Action": ["*"]
    ```
- **Service-Level Wildcard (`service:*`)**: A wildcard at the end of an action string grants permission for all actions within that specific service.
    ```json
    "Action": ["archive:*"]
    ```

### Resource Wildcards

Wildcards can also be used to specify resources:

- **Global Wildcard (`*`)**: A standalone `*` in the `Resource` field applies the policy to all resources in the system.
    ```json
    "Resource": ["*"]
    ```
- **Partial Wildcards**: Some services allow wildcards at specific points in the resource path to refer to all resources of a certain type. For example, to target all ingestion sources:
    ```json
    "Resource": ["ingestion-source/*"]
    ```
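The rules above (prefix wildcards plus deny-overrides-allow) can be sketched as follows. This is a minimal illustration of the documented semantics, not the actual `PolicyValidator` implementation:

```typescript
// Minimal sketch of the evaluation semantics described above.
interface PolicyStatement {
    Effect: 'Allow' | 'Deny';
    Action: string[];
    Resource: string[];
}

// '*' matches everything; a trailing '*' (e.g. 'archive:*', 'ingestion-source/*')
// matches any value with that prefix.
function matches(pattern: string, value: string): boolean {
    if (pattern === '*') return true;
    if (pattern.endsWith('*')) return value.startsWith(pattern.slice(0, -1));
    return pattern === value;
}

function isAllowed(statements: PolicyStatement[], action: string, resource: string): boolean {
    let allowed = false;
    for (const s of statements) {
        const applies =
            s.Action.some((a) => matches(a, action)) &&
            s.Resource.some((r) => matches(r, resource));
        if (!applies) continue;
        if (s.Effect === 'Deny') return false; // an explicit Deny always wins
        allowed = true;
    }
    return allowed; // no matching Allow means access is denied by default
}
```

For example, a policy allowing `archive:*` on `archive/all` but denying `archive:export` on `*` would permit reads and searches while blocking exports.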
## 3. Actions and Resources by Service

The following sections define the available actions and resources, categorized by their respective services.

### Service: `archive`

The `archive` service pertains to all actions related to accessing and managing archived emails.

**Actions:**

| Action           | Description                                                            |
| :--------------- | :--------------------------------------------------------------------- |
| `archive:read`   | Grants permission to read the content and metadata of archived emails. |
| `archive:search` | Grants permission to perform search queries against the email archive. |
| `archive:export` | Grants permission to export search results or individual emails.       |

**Resources:**

| Resource                              | Description                                                                              |
| :------------------------------------ | :--------------------------------------------------------------------------------------- |
| `archive/all`                         | Represents the entire email archive.                                                     |
| `archive/ingestion-source/{sourceId}` | Scopes the action to emails from a specific ingestion source.                            |
| `archive/mailbox/{email}`             | Scopes the action to a single, specific mailbox, usually identified by an email address. |
| `archive/custodian/{custodianId}`     | Scopes the action to emails belonging to a specific custodian.                           |

---

### Service: `ingestion`

The `ingestion` service covers the management of email ingestion sources.

**Actions:**

| Action                   | Description                                                                  |
| :----------------------- | :--------------------------------------------------------------------------- |
| `ingestion:createSource` | Grants permission to create a new ingestion source.                          |
| `ingestion:readSource`   | Grants permission to view the details of ingestion sources.                  |
| `ingestion:updateSource` | Grants permission to modify the configuration of an ingestion source.        |
| `ingestion:deleteSource` | Grants permission to delete an ingestion source.                             |
| `ingestion:manageSync`   | Grants permission to trigger, pause, or force a sync on an ingestion source. |

**Resources:**

| Resource                      | Description                                               |
| :---------------------------- | :-------------------------------------------------------- |
| `ingestion-source/*`          | Represents all ingestion sources.                         |
| `ingestion-source/{sourceId}` | Scopes the action to a single, specific ingestion source. |

---

### Service: `system`

The `system` service is for managing system-level settings, users, and roles.

**Actions:**

| Action                  | Description                                         |
| :---------------------- | :-------------------------------------------------- |
| `system:readSettings`   | Grants permission to view system settings.          |
| `system:updateSettings` | Grants permission to modify system settings.        |
| `system:readUsers`      | Grants permission to list and view user accounts.   |
| `system:createUser`     | Grants permission to create new user accounts.      |
| `system:updateUser`     | Grants permission to modify existing user accounts. |
| `system:deleteUser`     | Grants permission to delete user accounts.          |
| `system:assignRole`     | Grants permission to assign roles to users.         |

**Resources:**

| Resource               | Description                                           |
| :--------------------- | :---------------------------------------------------- |
| `system/settings`      | Represents the system configuration.                  |
| `system/users`         | Represents all user accounts within the system.       |
| `system/user/{userId}` | Scopes the action to a single, specific user account. |

---

### Service: `dashboard`

The `dashboard` service relates to viewing analytics and overview information.

**Actions:**

| Action           | Description                                                     |
| :--------------- | :-------------------------------------------------------------- |
| `dashboard:read` | Grants permission to view all dashboard widgets and statistics. |

**Resources:**

| Resource      | Description                                 |
| :------------ | :------------------------------------------ |
| `dashboard/*` | Represents all components of the dashboard. |
docs/user-guides/email-providers/eml.md (36 lines, new file)
@@ -0,0 +1,36 @@
# EML Import

OpenArchiver allows you to import EML files from a zip archive. This is useful for importing emails from a variety of sources, including other email clients and services.

## Preparing the Zip File

To ensure a successful import, you should compress your .eml files into one zip file according to the following guidelines:

- **Structure:** The zip file can contain any number of `.eml` files, organized in any folder structure. The folder structure will be preserved in OpenArchiver, so you can use it to organize your emails.
- **Compression:** The zip file should be compressed using standard zip compression.

Here's an example of a valid folder structure:

```
archive.zip
├── inbox
│   ├── email-01.eml
│   └── email-02.eml
├── sent
│   └── email-03.eml
└── drafts
    ├── nested-folder
    │   └── email-04.eml
    └── email-05.eml
```
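The "folder structure is preserved" rule can be pictured as deriving each email's folder path from its zip entry name. The sketch below is an illustration of that rule, not OpenArchiver's actual importer code:

```typescript
// Illustrative sketch: derive the preserved folder path from a zip entry name.
interface EmlEntry {
    folderPath: string; // e.g. 'drafts/nested-folder'
    fileName: string; // e.g. 'email-04.eml'
}

function classifyZipEntry(entryName: string): EmlEntry | null {
    // Directory entries in a zip end with '/'; they carry no email content.
    if (entryName.endsWith('/')) return null;
    // Only .eml files are importable.
    if (!entryName.toLowerCase().endsWith('.eml')) return null;
    const parts = entryName.split('/');
    const fileName = parts.pop()!;
    return { folderPath: parts.join('/'), fileName };
}
```

So an entry named `drafts/nested-folder/email-04.eml` would land in the `drafts/nested-folder` folder in the archive.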
## Creating an EML Ingestion Source

1. Go to the **Ingestion Sources** page in the OpenArchiver dashboard.
2. Click the **Create New** button.
3. Select **EML Import** as the provider.
4. Enter a name for the ingestion source.
5. Click the **Choose File** button and select the zip archive containing your EML files.
6. Click the **Submit** button.

OpenArchiver will then start importing the EML files from the zip archive. The ingestion process may take some time, depending on the size of the archive.
@@ -7,3 +7,5 @@ Choose your provider from the list below to get started:
- [Google Workspace](./google-workspace.md)
- [Microsoft 365](./microsoft-365.md)
- [Generic IMAP Server](./imap.md)
- [EML Import](./eml.md)
- [PST Import](./pst.md)
docs/user-guides/email-providers/pst.md (21 lines, new file)
@@ -0,0 +1,21 @@
# PST Import

OpenArchiver allows you to import PST files. This is useful for importing emails from a variety of sources, including Microsoft Outlook.

## Preparing the PST File

To ensure a successful import, you should prepare your PST file according to the following guidelines:

- **Structure:** The PST file can contain any number of emails, organized in any folder structure. The folder structure will be preserved in OpenArchiver, so you can use it to organize your emails.
- **Password Protection:** OpenArchiver does not support password-protected PST files. Please remove the password from your PST file before importing it.
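One quick sanity check you can run before uploading: every PST/OST file begins with the 4-byte magic value `!BDN` (per the [MS-PST] file format specification). The sketch below is an illustration of such a check, not part of OpenArchiver's actual validation:

```typescript
// Illustrative sketch: verify a file header carries the PST magic bytes.
// '!BDN' = 0x21 0x42 0x44 0x4E, the dwMagic field of the PST header.
function looksLikePst(header: Uint8Array): boolean {
    const magic = [0x21, 0x42, 0x44, 0x4e];
    return header.length >= 4 && magic.every((b, i) => header[i] === b);
}
```

Passing the first few bytes of a zip archive or a renamed file through this check would fail, catching an accidental wrong upload early.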
## Creating a PST Ingestion Source

1. Go to the **Ingestion Sources** page in the OpenArchiver dashboard.
2. Click the **Create New** button.
3. Select **PST Import** as the provider.
4. Enter a name for the ingestion source.
5. Click the **Choose File** button and select the PST file.
6. Click the **Submit** button.

OpenArchiver will then start importing the emails from the PST file. The ingestion process may take some time, depending on the size of the file.
@@ -37,7 +37,6 @@ You must change the following placeholder values to secure your instance:
- `REDIS_PASSWORD`: A strong, unique password for the Valkey/Redis service.
- `MEILI_MASTER_KEY`: A complex key for Meilisearch.
- `JWT_SECRET`: A long, random string for signing authentication tokens.
- `ADMIN_PASSWORD`: A strong password for the initial admin user.
- `ENCRYPTION_KEY`: A 32-byte hex string for encrypting sensitive data in the database. You can generate one with the following command:
    ```bash
    openssl rand -hex 32
    ```
@@ -104,14 +103,12 @@ These variables are used by `docker-compose.yml` to configure the services.
#### Security & Authentication

| Variable         | Description                                         | Default Value                              |
| ---------------- | --------------------------------------------------- | ------------------------------------------ |
| `JWT_SECRET`     | A secret key for signing JWT tokens.                | `a-very-secret-key-that-you-should-change` |
| `JWT_EXPIRES_IN` | The expiration time for JWT tokens.                 | `7d`                                       |
| `ADMIN_EMAIL`    | The email for the initial admin user.               | `admin@local.com`                          |
| `ADMIN_PASSWORD` | The password for the initial admin user.            | `a_strong_password_that_you_should_change` |
| `SUPER_API_KEY`  | An API key with super admin privileges.             |                                            |
| `ENCRYPTION_KEY` | A 32-byte hex string for encrypting sensitive data. |                                            |

| Variable         | Description                                                         | Default Value                              |
| ---------------- | ------------------------------------------------------------------- | ------------------------------------------ |
| `JWT_SECRET`     | A secret key for signing JWT tokens.                                | `a-very-secret-key-that-you-should-change` |
| `JWT_EXPIRES_IN` | The expiration time for JWT tokens.                                 | `7d`                                       |
| `SUPER_API_KEY`  | An API key with super admin privileges.                             |                                            |
| `ENCRYPTION_KEY` | A 32-byte hex string for encrypting sensitive data in the database. |                                            |
## 3. Run the Application
@@ -161,3 +158,141 @@ docker compose pull
# Restart the services with the new images
docker compose up -d
```

## Deploying on Coolify

If you are deploying Open Archiver on [Coolify](https://coolify.io/), it is recommended to let Coolify manage the Docker networks for you. This can help avoid potential routing conflicts and simplify your setup.

To do this, you will need to make a small modification to your `docker-compose.yml` file.

### Modify `docker-compose.yml` for Coolify

1. **Open your `docker-compose.yml` file** in a text editor.

2. **Remove all `networks` sections** from the file. This includes the network configuration for each service and the top-level network definition.

    Specifically, you need to remove:

    - The `networks: - open-archiver-net` lines from the `open-archiver`, `postgres`, `valkey`, and `meilisearch` services.
    - The entire `networks:` block at the end of the file.

    Here is an example of what to remove from a service:

    ```diff
    services:
        open-archiver:
            image: logiclabshq/open-archiver:latest
            # ... other settings
    -       networks:
    -           - open-archiver-net
    ```

    And remove this entire block from the end of the file:

    ```diff
    -networks:
    -    open-archiver-net:
    -        driver: bridge
    ```

3. **Save the modified `docker-compose.yml` file.**

By removing these sections, you allow Coolify to automatically create and manage the necessary networks, ensuring that all services can communicate with each other and are correctly exposed through Coolify's reverse proxy.

After making these changes, you can proceed with deploying your application on Coolify as you normally would.

## Where is my data stored (when using local storage and Docker)?

If you are using local storage to store your emails, based on your `docker-compose.yml` file, your data is being stored in what's called a "named volume" (`archiver-data`). That's why you're not seeing the files in the `./data/open-archiver` directory you created.

1. **List all Docker volumes**:

    Run this command to see all the volumes on your system:

    ```bash
    docker volume ls
    ```

2. **Identify the correct volume**:

    Look through the list for a volume name that ends with `_archiver-data`. The part before that will be your project's directory name. For example, if your project is in a folder named `OpenArchiver`, the volume will be `openarchiver_archiver-data`. Note that it can also be a randomly generated hash.

3. **Inspect the correct volume**:

    Once you've identified the correct volume name, use it in the `inspect` command. For example:

    ```bash
    docker volume inspect <your_volume_name_here>
    ```

    This will give you the correct `Mountpoint` path where your data is being stored. It will look something like this (the exact path will vary depending on your system):

    ```json
    {
        "CreatedAt": "2025-07-25T11:22:19Z",
        "Driver": "local",
        "Labels": {
            "com.docker.compose.config-hash": "---",
            "com.docker.compose.project": "---",
            "com.docker.compose.version": "2.38.2",
            "com.docker.compose.volume": "us8wwos0o4ok4go4gc8cog84_archiver-data"
        },
        "Mountpoint": "/var/lib/docker/volumes/us8wwos0o4ok4go4gc8cog84_archiver-data/_data",
        "Name": "us8wwos0o4ok4go4gc8cog84_archiver-data",
        "Options": null,
        "Scope": "local"
    }
    ```

In this example, the data is located at `/var/lib/docker/volumes/us8wwos0o4ok4go4gc8cog84_archiver-data/_data`. You can then `cd` into that directory to see your files.

### To save data to a specific folder

To save the data to a specific folder on your machine, you'll need to make a change to your `docker-compose.yml`. You need to switch from a named volume to a "bind mount".

Here's how you can do it:

1. **Edit `docker-compose.yml`**:

    Open the `docker-compose.yml` file and find the `open-archiver` service. You're going to change the `volumes` section.

    **Change this:**

    ```yaml
    services:
        open-archiver:
            # ... other config
            volumes:
                - archiver-data:/var/data/open-archiver
    ```

    **To this:**

    ```yaml
    services:
        open-archiver:
            # ... other config
            volumes:
                - ./data/open-archiver:/var/data/open-archiver
    ```

    You'll also want to remove the `archiver-data` volume definition at the bottom of the file, since it's no longer needed.

    **Remove this whole block:**

    ```yaml
    volumes:
        # ... other volumes
        archiver-data:
            driver: local
    ```

2. **Restart your containers**:

    After you've saved the changes, run the following command in your terminal to apply them. The `--force-recreate` flag will ensure the container is recreated with the new volume settings.

    ```bash
    docker-compose up -d --force-recreate
    ```

After this, any new data will be saved directly into the `./data/open-archiver` folder in your project directory.
@@ -24,9 +24,11 @@
"@azure/msal-node": "^3.6.3",
"@microsoft/microsoft-graph-client": "^3.0.7",
"@open-archiver/types": "workspace:*",
"archiver": "^7.0.1",
"axios": "^1.10.0",
"bcryptjs": "^3.0.2",
"bullmq": "^5.56.3",
"busboy": "^1.6.0",
"cross-fetch": "^4.1.0",
"deepmerge-ts": "^7.1.5",
"dotenv": "^17.2.0",
@@ -42,23 +44,30 @@
"mailparser": "^3.7.4",
"mammoth": "^1.9.1",
"meilisearch": "^0.51.0",
"multer": "^2.0.2",
"pdf2json": "^3.1.6",
"pg": "^8.16.3",
"pino": "^9.7.0",
"pino-pretty": "^13.0.0",
"postgres": "^3.4.7",
"pst-extractor": "^1.11.0",
"reflect-metadata": "^0.2.2",
"sqlite3": "^5.1.7",
"tsconfig-paths": "^4.2.0",
"xlsx": "^0.18.5"
"xlsx": "^0.18.5",
"yauzl": "^3.2.0"
},
"devDependencies": {
"@bull-board/api": "^6.11.0",
"@bull-board/express": "^6.11.0",
"@types/archiver": "^6.0.3",
"@types/busboy": "^1.5.4",
"@types/express": "^5.0.3",
"@types/mailparser": "^3.4.6",
"@types/microsoft-graph": "^2.40.1",
"@types/multer": "^2.0.0",
"@types/node": "^24.0.12",
"@types/yauzl": "^2.10.3",
"bull-board": "^2.1.3",
"ts-node-dev": "^2.0.0",
"typescript": "^5.8.3"
@@ -1,12 +1,49 @@
import type { Request, Response } from 'express';
import type { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';
import { UserService } from '../../services/UserService';
import { db } from '../../database';
import * as schema from '../../database/schema';
import { sql } from 'drizzle-orm';
import 'dotenv/config';

export class AuthController {
    #authService: IAuthService;
    #authService: AuthService;
    #userService: UserService;

    constructor(authService: IAuthService) {
    constructor(authService: AuthService, userService: UserService) {
        this.#authService = authService;
        this.#userService = userService;
    }

    /**
     * Only used for setting up the instance, should only be displayed once upon instance set up.
     * @param req
     * @param res
     * @returns
     */
    public setup = async (req: Request, res: Response): Promise<Response> => {
        const { email, password, first_name, last_name } = req.body;

        if (!email || !password || !first_name || !last_name) {
            return res.status(400).json({ message: 'Email, password, and name are required' });
        }

        try {
            const userCountResult = await db.select({ count: sql<number>`count(*)` }).from(schema.users);
            const userCount = Number(userCountResult[0].count);

            if (userCount > 0) {
                return res.status(403).json({ message: 'Setup has already been completed.' });
            }

            const newUser = await this.#userService.createAdminUser({ email, password, first_name, last_name }, true);
            const result = await this.#authService.login(email, password);
            return res.status(201).json(result);
        } catch (error) {
            console.error('Setup error:', error);
            return res.status(500).json({ message: 'An internal server error occurred' });
        }
    };

    public login = async (req: Request, res: Response): Promise<Response> => {
        const { email, password } = req.body;
@@ -28,4 +65,29 @@ export class AuthController {
|
||||
return res.status(500).json({ message: 'An internal server error occurred' });
|
||||
}
|
||||
};
|
||||
|
||||
public status = async (req: Request, res: Response): Promise<Response> => {
|
||||
try {
|
||||
|
||||
|
||||
|
||||
const userCountResult = await db.select({ count: sql<number>`count(*)` }).from(schema.users);
|
||||
const userCount = Number(userCountResult[0].count);
|
||||
const needsSetup = userCount === 0;
|
||||
// in case user uses older version with admin user variables, we will create the admin user using those variables.
|
||||
if (needsSetup && process.env.ADMIN_EMAIL && process.env.ADMIN_PASSWORD) {
|
||||
await this.#userService.createAdminUser({
|
||||
email: process.env.ADMIN_EMAIL,
|
||||
password: process.env.ADMIN_PASSWORD,
|
||||
first_name: "Admin",
|
||||
last_name: "User"
|
||||
}, true);
|
||||
return res.status(200).json({ needsSetup: false });
|
||||
}
|
||||
return res.status(200).json({ needsSetup });
|
||||
} catch (error) {
|
||||
console.error('Status check error:', error);
|
||||
return res.status(500).json({ message: 'An internal server error occurred' });
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
packages/backend/src/api/controllers/iam.controller.ts (new file)
@@ -0,0 +1,71 @@
import { Request, Response } from 'express';
import { IamService } from '../../services/IamService';
import { PolicyValidator } from '../../iam-policy/policy-validator';
import type { PolicyStatement } from '@open-archiver/types';

export class IamController {
	#iamService: IamService;

	constructor(iamService: IamService) {
		this.#iamService = iamService;
	}

	public getRoles = async (req: Request, res: Response): Promise<void> => {
		try {
			const roles = await this.#iamService.getRoles();
			res.status(200).json(roles);
		} catch (error) {
			res.status(500).json({ error: 'Failed to get roles.' });
		}
	};

	public getRoleById = async (req: Request, res: Response): Promise<void> => {
		const { id } = req.params;

		try {
			const role = await this.#iamService.getRoleById(id);
			if (role) {
				res.status(200).json(role);
			} else {
				res.status(404).json({ error: 'Role not found.' });
			}
		} catch (error) {
			res.status(500).json({ error: 'Failed to get role.' });
		}
	};

	public createRole = async (req: Request, res: Response): Promise<void> => {
		const { name, policy } = req.body;

		if (!name || !policy) {
			res.status(400).json({ error: 'Missing required fields: name and policy.' });
			return;
		}

		for (const statement of policy) {
			const { valid, reason } = PolicyValidator.isValid(statement as PolicyStatement);
			if (!valid) {
				res.status(400).json({ error: `Invalid policy statement: ${reason}` });
				return;
			}
		}

		try {
			const role = await this.#iamService.createRole(name, policy);
			res.status(201).json(role);
		} catch (error) {
			res.status(500).json({ error: 'Failed to create role.' });
		}
	};

	public deleteRole = async (req: Request, res: Response): Promise<void> => {
		const { id } = req.params;

		try {
			await this.#iamService.deleteRole(id);
			res.status(204).send();
		} catch (error) {
			res.status(500).json({ error: 'Failed to delete role.' });
		}
	};
}
packages/backend/src/api/controllers/upload.controller.ts (new file)
@@ -0,0 +1,26 @@
import { Request, Response } from 'express';
import { StorageService } from '../../services/StorageService';
import { randomUUID } from 'crypto';
import busboy from 'busboy';
import { config } from '../../config/index';

export const uploadFile = async (req: Request, res: Response) => {
	const storage = new StorageService();
	const bb = busboy({ headers: req.headers });
	let filePath = '';
	let originalFilename = '';

	bb.on('file', (fieldname, file, filename) => {
		originalFilename = filename.filename;
		const uuid = randomUUID();
		filePath = `${config.storage.openArchiverFolderName}/tmp/${uuid}-${originalFilename}`;
		storage.put(filePath, file);
	});

	bb.on('finish', () => {
		res.json({ filePath });
	});

	req.pipe(bb);
};
@@ -1,5 +1,5 @@
import type { Request, Response, NextFunction } from 'express';
import type { IAuthService } from '../../services/AuthService';
import type { AuthService } from '../../services/AuthService';
import type { AuthTokenPayload } from '@open-archiver/types';
import 'dotenv/config';
// By using module augmentation, we can add our custom 'user' property
@@ -12,7 +12,7 @@ declare global {
	}
}

export const requireAuth = (authService: IAuthService) => {
export const requireAuth = (authService: AuthService) => {
	return async (req: Request, res: Response, next: NextFunction) => {
		const authHeader = req.headers.authorization;
		if (!authHeader || !authHeader.startsWith('Bearer ')) {
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { ArchivedEmailController } from '../controllers/archived-email.controller';
import { requireAuth } from '../middleware/requireAuth';
import { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';

export const createArchivedEmailRouter = (
	archivedEmailController: ArchivedEmailController,
	authService: IAuthService
	authService: AuthService
): Router => {
	const router = Router();
@@ -5,6 +5,13 @@ import type { AuthController } from '../controllers/auth.controller';
export const createAuthRouter = (authController: AuthController): Router => {
	const router = Router();

	/**
	 * @route POST /api/v1/auth/setup
	 * @description Creates the initial administrator user.
	 * @access Public
	 */
	router.post('/setup', loginRateLimiter, authController.setup);

	/**
	 * @route POST /api/v1/auth/login
	 * @description Authenticates a user and returns a JWT.
@@ -12,5 +19,12 @@ export const createAuthRouter = (authController: AuthController): Router => {
	 */
	router.post('/login', loginRateLimiter, authController.login);

	/**
	 * @route GET /api/v1/auth/status
	 * @description Checks if the application has been set up.
	 * @access Public
	 */
	router.get('/status', authController.status);

	return router;
};
@@ -1,9 +1,9 @@
import { Router } from 'express';
import { dashboardController } from '../controllers/dashboard.controller';
import { requireAuth } from '../middleware/requireAuth';
import { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';

export const createDashboardRouter = (authService: IAuthService): Router => {
export const createDashboardRouter = (authService: AuthService): Router => {
	const router = Router();

	router.use(requireAuth(authService));
packages/backend/src/api/routes/iam.routes.ts (new file)
@@ -0,0 +1,36 @@
import { Router } from 'express';
import { requireAuth } from '../middleware/requireAuth';
import type { IamController } from '../controllers/iam.controller';

export const createIamRouter = (iamController: IamController): Router => {
	const router = Router();

	/**
	 * @route GET /api/v1/iam/roles
	 * @description Gets all roles.
	 * @access Private
	 */
	router.get('/roles', requireAuth, iamController.getRoles);

	/**
	 * @route GET /api/v1/iam/roles/:id
	 * @description Gets a role by ID.
	 * @access Private
	 */
	router.get('/roles/:id', requireAuth, iamController.getRoleById);

	/**
	 * @route POST /api/v1/iam/roles
	 * @description Creates a new role.
	 * @access Private
	 */
	router.post('/roles', requireAuth, iamController.createRole);

	/**
	 * @route DELETE /api/v1/iam/roles/:id
	 * @description Deletes a role.
	 * @access Private
	 */
	router.delete('/roles/:id', requireAuth, iamController.deleteRole);
	return router;
};
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { IngestionController } from '../controllers/ingestion.controller';
import { requireAuth } from '../middleware/requireAuth';
import { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';

export const createIngestionRouter = (
	ingestionController: IngestionController,
	authService: IAuthService
	authService: AuthService
): Router => {
	const router = Router();
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { SearchController } from '../controllers/search.controller';
import { requireAuth } from '../middleware/requireAuth';
import { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';

export const createSearchRouter = (
	searchController: SearchController,
	authService: IAuthService
	authService: AuthService
): Router => {
	const router = Router();
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { StorageController } from '../controllers/storage.controller';
import { requireAuth } from '../middleware/requireAuth';
import { IAuthService } from '../../services/AuthService';
import { AuthService } from '../../services/AuthService';

export const createStorageRouter = (
	storageController: StorageController,
	authService: IAuthService
	authService: AuthService
): Router => {
	const router = Router();
packages/backend/src/api/routes/upload.routes.ts (new file)
@@ -0,0 +1,14 @@
import { Router } from 'express';
import { uploadFile } from '../controllers/upload.controller';
import { requireAuth } from '../middleware/requireAuth';
import { AuthService } from '../../services/AuthService';

export const createUploadRouter = (authService: AuthService): Router => {
	const router = Router();

	router.use(requireAuth(authService));

	router.post('/', uploadFile);

	return router;
};
@@ -5,4 +5,5 @@ export const app = {
	port: process.env.PORT_BACKEND ? parseInt(process.env.PORT_BACKEND, 10) : 4000,
	encryptionKey: process.env.ENCRYPTION_KEY,
	isDemo: process.env.IS_DEMO === 'true',
	syncFrequency: process.env.SYNC_FREQUENCY || '* * * * *' // defaults to every minute
};
@@ -2,7 +2,7 @@ import { StorageConfig } from '@open-archiver/types';
import 'dotenv/config';

const storageType = process.env.STORAGE_TYPE;

const openArchiverFolderName = 'open-archiver';
let storageConfig: StorageConfig;

if (storageType === 'local') {
@@ -12,6 +12,7 @@ if (storageType === 'local') {
	storageConfig = {
		type: 'local',
		rootPath: process.env.STORAGE_LOCAL_ROOT_PATH,
		openArchiverFolderName: openArchiverFolderName
	};
} else if (storageType === 's3') {
	if (
@@ -30,6 +31,7 @@ if (storageType === 'local') {
		secretAccessKey: process.env.STORAGE_S3_SECRET_ACCESS_KEY,
		region: process.env.STORAGE_S3_REGION,
		forcePathStyle: process.env.STORAGE_S3_FORCE_PATH_STYLE === 'true',
		openArchiverFolderName: openArchiverFolderName
	};
} else {
	throw new Error(`Invalid STORAGE_TYPE: ${storageType}`);
@@ -0,0 +1,36 @@
CREATE TABLE "roles" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"name" text NOT NULL,
	"policies" jsonb DEFAULT '[]'::jsonb NOT NULL,
	"created_at" timestamp DEFAULT now() NOT NULL,
	"updated_at" timestamp DEFAULT now() NOT NULL,
	CONSTRAINT "roles_name_unique" UNIQUE("name")
);
--> statement-breakpoint
CREATE TABLE "sessions" (
	"id" text PRIMARY KEY NOT NULL,
	"user_id" uuid NOT NULL,
	"expires_at" timestamp with time zone NOT NULL
);
--> statement-breakpoint
CREATE TABLE "user_roles" (
	"user_id" uuid NOT NULL,
	"role_id" uuid NOT NULL,
	CONSTRAINT "user_roles_user_id_role_id_pk" PRIMARY KEY("user_id","role_id")
);
--> statement-breakpoint
CREATE TABLE "users" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"email" text NOT NULL,
	"name" text,
	"password" text,
	"provider" text DEFAULT 'local',
	"provider_id" text,
	"created_at" timestamp DEFAULT now() NOT NULL,
	"updated_at" timestamp DEFAULT now() NOT NULL,
	CONSTRAINT "users_email_unique" UNIQUE("email")
);
--> statement-breakpoint
ALTER TABLE "sessions" ADD CONSTRAINT "sessions_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_roles" ADD CONSTRAINT "user_roles_user_id_users_id_fk" FOREIGN KEY ("user_id") REFERENCES "public"."users"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "user_roles" ADD CONSTRAINT "user_roles_role_id_roles_id_fk" FOREIGN KEY ("role_id") REFERENCES "public"."roles"("id") ON DELETE cascade ON UPDATE no action;
@@ -0,0 +1,2 @@
ALTER TABLE "users" RENAME COLUMN "name" TO "first_name";--> statement-breakpoint
ALTER TABLE "users" ADD COLUMN "last_name" text;
@@ -0,0 +1,2 @@
ALTER TYPE "public"."ingestion_provider" ADD VALUE 'pst_import';--> statement-breakpoint
ALTER TYPE "public"."ingestion_status" ADD VALUE 'imported';
@@ -0,0 +1,2 @@
ALTER TABLE "archived_emails" ADD COLUMN "path" text;--> statement-breakpoint
ALTER TABLE "archived_emails" ADD COLUMN "tags" jsonb;
@@ -0,0 +1 @@
ALTER TYPE "public"."ingestion_provider" ADD VALUE 'eml_import';
packages/backend/src/database/migrations/meta/0010_snapshot.json (new file, diff suppressed: too large)
packages/backend/src/database/migrations/meta/0011_snapshot.json (new file, diff suppressed: too large)
packages/backend/src/database/migrations/meta/0012_snapshot.json (new file, diff suppressed: too large)
packages/backend/src/database/migrations/meta/0013_snapshot.json (new file, diff suppressed: too large)
packages/backend/src/database/migrations/meta/0014_snapshot.json (new file, diff suppressed: too large)
@@ -71,6 +71,41 @@
			"when": 1754337938241,
			"tag": "0009_late_lenny_balinger",
			"breakpoints": true
		},
		{
			"idx": 10,
			"version": "7",
			"when": 1754420780849,
			"tag": "0010_perpetual_lightspeed",
			"breakpoints": true
		},
		{
			"idx": 11,
			"version": "7",
			"when": 1754422064158,
			"tag": "0011_tan_blackheart",
			"breakpoints": true
		},
		{
			"idx": 12,
			"version": "7",
			"when": 1754476962901,
			"tag": "0012_warm_the_stranger",
			"breakpoints": true
		},
		{
			"idx": 13,
			"version": "7",
			"when": 1754659373517,
			"tag": "0013_classy_talkback",
			"breakpoints": true
		},
		{
			"idx": 14,
			"version": "7",
			"when": 1754831765718,
			"tag": "0014_foamy_vapor",
			"breakpoints": true
		}
	]
}
@@ -4,3 +4,4 @@ export * from './schema/audit-logs';
export * from './schema/compliance';
export * from './schema/custodians';
export * from './schema/ingestion-sources';
export * from './schema/users';
@@ -24,12 +24,10 @@ export const archivedEmails = pgTable(
		hasAttachments: boolean('has_attachments').notNull().default(false),
		isOnLegalHold: boolean('is_on_legal_hold').notNull().default(false),
		archivedAt: timestamp('archived_at', { withTimezone: true }).notNull().defaultNow(),
		path: text('path'),
		tags: jsonb('tags'),
	},
	(table) => {
		return {
			threadIdIdx: index('thread_id_idx').on(table.threadId)
		};
	}
	(table) => [index('thread_id_idx').on(table.threadId)]
);

export const archivedEmailsRelations = relations(archivedEmails, ({ one }) => ({
@@ -3,7 +3,9 @@ import { jsonb, pgEnum, pgTable, text, timestamp, uuid } from 'drizzle-orm/pg-co
export const ingestionProviderEnum = pgEnum('ingestion_provider', [
	'google_workspace',
	'microsoft_365',
	'generic_imap'
	'generic_imap',
	'pst_import',
	'eml_import'
]);

export const ingestionStatusEnum = pgEnum('ingestion_status', [
@@ -13,7 +15,8 @@ export const ingestionStatusEnum = pgEnum('ingestion_status', [
	'pending_auth',
	'syncing',
	'importing',
	'auth_success'
	'auth_success',
	'imported'
]);

export const ingestionSources = pgTable('ingestion_sources', {
packages/backend/src/database/schema/users.ts (new file)
@@ -0,0 +1,89 @@
import { relations, sql } from 'drizzle-orm';
import {
	pgTable,
	text,
	timestamp,
	uuid,
	primaryKey,
	jsonb
} from 'drizzle-orm/pg-core';
import type { PolicyStatement } from '@open-archiver/types';

/**
 * The `users` table stores the core user information for authentication and identification.
 */
export const users = pgTable('users', {
	id: uuid('id').primaryKey().defaultRandom(),
	email: text('email').notNull().unique(),
	first_name: text('first_name'),
	last_name: text('last_name'),
	password: text('password'),
	provider: text('provider').default('local'),
	providerId: text('provider_id'),
	createdAt: timestamp('created_at').defaultNow().notNull(),
	updatedAt: timestamp('updated_at').defaultNow().notNull()
});

/**
 * The `sessions` table stores user session information for managing login state.
 * It links a session to a user and records its expiration time.
 */
export const sessions = pgTable('sessions', {
	id: text('id').primaryKey(),
	userId: uuid('user_id')
		.notNull()
		.references(() => users.id, { onDelete: 'cascade' }),
	expiresAt: timestamp('expires_at', {
		withTimezone: true,
		mode: 'date'
	}).notNull()
});

/**
 * The `roles` table defines the roles that can be assigned to users.
 * Each role has a name and a set of policies that define its permissions.
 */
export const roles = pgTable('roles', {
	id: uuid('id').primaryKey().defaultRandom(),
	name: text('name').notNull().unique(),
	policies: jsonb('policies').$type<PolicyStatement[]>().notNull().default(sql`'[]'::jsonb`),
	createdAt: timestamp('created_at').defaultNow().notNull(),
	updatedAt: timestamp('updated_at').defaultNow().notNull()
});

/**
 * The `user_roles` table is a join table that maps users to their assigned roles.
 * This many-to-many relationship allows a user to have multiple roles.
 */
export const userRoles = pgTable(
	'user_roles',
	{
		userId: uuid('user_id')
			.notNull()
			.references(() => users.id, { onDelete: 'cascade' }),
		roleId: uuid('role_id')
			.notNull()
			.references(() => roles.id, { onDelete: 'cascade' })
	},
	(t) => [primaryKey({ columns: [t.userId, t.roleId] })]
);

// Define relationships for Drizzle ORM
export const usersRelations = relations(users, ({ many }) => ({
	userRoles: many(userRoles)
}));

export const rolesRelations = relations(roles, ({ many }) => ({
	userRoles: many(userRoles)
}));

export const userRolesRelations = relations(userRoles, ({ one }) => ({
	role: one(roles, {
		fields: [userRoles.roleId],
		references: [roles.id]
	}),
	user: one(users, {
		fields: [userRoles.userId],
		references: [users.id]
	})
}));
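The `user_roles` join table gives each user a many-to-many mapping to roles, and each role carries a `policies` array. As a hedged, self-contained sketch (not code from this commit; the type shapes are assumptions based on the schema), a user's effective permissions can be derived by flattening the policies of all assigned roles:

```typescript
// Hypothetical sketch: a user's effective policy set is the union of the
// policies of all roles assigned via user_roles. Types mirror the schema.
type PolicyStatement = { Effect: 'Allow' | 'Deny'; Action: string[]; Resource: string[] };
type Role = { name: string; policies: PolicyStatement[] };

// Two example roles, as they might be stored in the `roles` table.
const assignedRoles: Role[] = [
	{ name: 'auditor', policies: [{ Effect: 'Allow', Action: ['archive:read', 'archive:search'], Resource: ['archive/all'] }] },
	{ name: 'operator', policies: [{ Effect: 'Allow', Action: ['ingestion:manageSync'], Resource: ['ingestion-source/*'] }] },
];

// Flatten all role policies into one effective list for the user.
const effectivePolicies: PolicyStatement[] = assignedRoles.flatMap((r) => r.policies);
console.log(effectivePolicies.length); // 2
```

How deny/allow conflicts between roles are resolved is up to the authorization layer; this sketch only shows the aggregation step.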
packages/backend/src/iam-policy/iam-definitions.ts (new file)
@@ -0,0 +1,120 @@
/**
 * @file This file serves as the single source of truth for all Identity and Access Management (IAM)
 * definitions within Open Archiver. Centralizing these definitions is an industry-standard practice
 * that offers several key benefits:
 *
 * 1. **Prevents "Magic Strings"**: Avoids the use of hardcoded strings for actions and resources
 *    throughout the codebase, reducing the risk of typos and inconsistencies.
 * 2. **Single Source of Truth**: Provides a clear, comprehensive, and maintainable list of all
 *    possible permissions in the system.
 * 3. **Enables Validation**: Allows for the creation of a robust validation function that can
 *    programmatically check if a policy statement is valid before it is saved.
 * 4. **Simplifies Auditing**: Makes it easy to audit and understand the scope of permissions
 *    that can be granted.
 *
 * The structure is inspired by AWS IAM, using a `service:operation` format for actions and a
 * hierarchical, slash-separated path for resources.
 */

// ===================================================================================
// SERVICE: archive
// ===================================================================================

const ARCHIVE_ACTIONS = {
	READ: 'archive:read',
	SEARCH: 'archive:search',
	EXPORT: 'archive:export',
} as const;

const ARCHIVE_RESOURCES = {
	ALL: 'archive/all',
	INGESTION_SOURCE: 'archive/ingestion-source/*',
	MAILBOX: 'archive/mailbox/*',
	CUSTODIAN: 'archive/custodian/*',
} as const;

// ===================================================================================
// SERVICE: ingestion
// ===================================================================================

const INGESTION_ACTIONS = {
	CREATE_SOURCE: 'ingestion:createSource',
	READ_SOURCE: 'ingestion:readSource',
	UPDATE_SOURCE: 'ingestion:updateSource',
	DELETE_SOURCE: 'ingestion:deleteSource',
	MANAGE_SYNC: 'ingestion:manageSync', // Covers triggering, pausing, and forcing syncs
} as const;

const INGESTION_RESOURCES = {
	ALL: 'ingestion-source/*',
	SOURCE: 'ingestion-source/{sourceId}',
} as const;

// ===================================================================================
// SERVICE: system
// ===================================================================================

const SYSTEM_ACTIONS = {
	READ_SETTINGS: 'system:readSettings',
	UPDATE_SETTINGS: 'system:updateSettings',
	READ_USERS: 'system:readUsers',
	CREATE_USER: 'system:createUser',
	UPDATE_USER: 'system:updateUser',
	DELETE_USER: 'system:deleteUser',
	ASSIGN_ROLE: 'system:assignRole',
} as const;

const SYSTEM_RESOURCES = {
	SETTINGS: 'system/settings',
	USERS: 'system/users',
	USER: 'system/user/{userId}',
} as const;

// ===================================================================================
// SERVICE: dashboard
// ===================================================================================

const DASHBOARD_ACTIONS = {
	READ: 'dashboard:read',
} as const;

const DASHBOARD_RESOURCES = {
	ALL: 'dashboard/*',
} as const;

// ===================================================================================
// EXPORTED DEFINITIONS
// ===================================================================================

/**
 * A comprehensive set of all valid IAM actions in the system.
 * This is used by the policy validator to ensure that any action in a policy is recognized.
 */
export const ValidActions: Set<string> = new Set([
	...Object.values(ARCHIVE_ACTIONS),
	...Object.values(INGESTION_ACTIONS),
	...Object.values(SYSTEM_ACTIONS),
	...Object.values(DASHBOARD_ACTIONS),
]);

/**
 * An object containing regular expressions for validating resource formats.
 * The validator uses these patterns to ensure that resource strings in a policy
 * conform to the expected structure.
 *
 * Logic:
 * - The key represents the service (e.g., 'archive').
 * - The value is a RegExp that matches all valid resource formats for that service.
 * - This allows for flexible validation. For example, `archive/*` is a valid pattern,
 *   as is `archive/email/123-abc`.
 */
export const ValidResourcePatterns = {
	archive: /^archive\/(all|ingestion-source\/[^\/]+|mailbox\/[^\/]+|custodian\/[^\/]+)$/,
	ingestion: /^ingestion-source\/(\*|[^\/]+)$/,
	system: /^system\/(settings|users|user\/[^\/]+)$/,
	dashboard: /^dashboard\/\*$/,
};
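To make the `service:operation` action format and slash-separated resource paths concrete, here is a hypothetical policy statement (not from this commit; the `PolicyStatement` shape is assumed from how it is used) built entirely from the action strings and resource patterns defined in `iam-definitions.ts`:

```typescript
// Hypothetical role policy granting read-only archive access plus dashboard
// access. Action strings and resource paths come from iam-definitions.ts.
type PolicyStatement = { Effect: 'Allow' | 'Deny'; Action: string[]; Resource: string[] };

const readOnlyPolicy: PolicyStatement = {
	Effect: 'Allow',
	Action: ['archive:read', 'archive:search', 'dashboard:read'],
	Resource: ['archive/all', 'dashboard/*'],
};

// Each resource must match its service's regex from ValidResourcePatterns
// (patterns copied verbatim from the commit):
const archivePattern = /^archive\/(all|ingestion-source\/[^\/]+|mailbox\/[^\/]+|custodian\/[^\/]+)$/;
const dashboardPattern = /^dashboard\/\*$/;
console.log(archivePattern.test('archive/all'), dashboardPattern.test('dashboard/*')); // true true
```

A statement mixing services (as above) is fine because validation runs per action and per resource, not per statement-wide service.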
100
packages/backend/src/iam-policy/policy-validator.ts
Normal file
100
packages/backend/src/iam-policy/policy-validator.ts
Normal file
@@ -0,0 +1,100 @@
|
||||
import type { PolicyStatement } from '@open-archiver/types';
|
||||
import { ValidActions, ValidResourcePatterns } from './iam-definitions';
|
||||
|
||||
/**
|
||||
* @class PolicyValidator
|
||||
*
|
||||
* This class provides a static method to validate an IAM policy statement.
|
||||
* It is designed to be used before a policy is saved to the database, ensuring that
|
||||
* only valid and well-formed policies are stored.
|
||||
*
|
||||
* The verification logic is based on the centralized definitions in `iam-definitions.ts`.
|
||||
*/
|
||||
export class PolicyValidator {
|
||||
/**
|
||||
* Validates a single policy statement to ensure its actions and resources are valid.
|
||||
*
|
||||
* @param {PolicyStatement} statement - The policy statement to validate.
|
||||
* @returns {{valid: boolean; reason?: string}} - An object containing a boolean `valid` property
|
||||
* and an optional `reason` string if validation fails.
|
||||
*/
|
||||
public static isValid(statement: PolicyStatement): { valid: boolean; reason: string; } {
|
||||
if (!statement || !statement.Action || !statement.Resource || !statement.Effect) {
|
||||
return { valid: false, reason: 'Policy statement is missing required fields.' };
|
||||
}
|
||||
|
||||
// 1. Validate Actions
|
||||
for (const action of statement.Action) {
|
||||
const { valid, reason } = this.isActionValid(action);
|
||||
if (!valid) {
|
||||
return { valid: false, reason };
|
||||
}
|
||||
}
|
||||
|
||||
// 2. Validate Resources
|
||||
for (const resource of statement.Resource) {
|
||||
const { valid, reason } = this.isResourceValid(resource);
|
||||
if (!valid) {
|
||||
return { valid: false, reason };
|
||||
}
|
||||
}
|
||||
|
||||
return { valid: true, reason: 'valid' };
|
||||
}
|
||||
|
||||
/**
|
||||
* Checks if a single action string is valid.
|
||||
*
|
||||
* Logic:
|
||||
* - If the action contains a wildcard (e.g., 'archive:*'), it checks if the service part
|
||||
* (e.g., 'archive') is a recognized service.
|
||||
* - If there is no wildcard, it checks if the full action string (e.g., 'archive:read')
|
||||
 * exists in the `ValidActions` set.
 *
 * @param {string} action - The action string to validate.
 * @returns {{valid: boolean; reason?: string}} - An object indicating validity and a reason for failure.
 */
private static isActionValid(action: string): { valid: boolean; reason: string; } {
    if (action === '*') {
        return { valid: true, reason: 'valid' };
    }
    if (action.endsWith(':*')) {
        const service = action.split(':')[0];
        if (service in ValidResourcePatterns) {
            return { valid: true, reason: 'valid' };
        }
        return { valid: false, reason: `Invalid service '${service}' in action wildcard '${action}'.` };
    }
    if (ValidActions.has(action)) {
        return { valid: true, reason: 'valid' };
    }
    return { valid: false, reason: `Action '${action}' is not a valid action.` };
}

/**
 * Checks if a single resource string has a valid format.
 *
 * Logic:
 * - It extracts the service name from the resource string (e.g., 'archive' from 'archive/all').
 * - It looks up the corresponding regular expression for that service in `ValidResourcePatterns`.
 * - It tests the resource string against the pattern. If the service does not exist or the
 *   pattern does not match, the resource is considered invalid.
 *
 * @param {string} resource - The resource string to validate.
 * @returns {{valid: boolean; reason?: string}} - An object indicating validity and a reason for failure.
 */
private static isResourceValid(resource: string): { valid: boolean; reason: string; } {
    const service = resource.split('/')[0];
    if (service === '*') {
        return { valid: true, reason: 'valid' };
    }
    if (service in ValidResourcePatterns) {
        const pattern = ValidResourcePatterns[service as keyof typeof ValidResourcePatterns];
        if (pattern.test(resource)) {
            return { valid: true, reason: 'valid' };
        }
        return { valid: false, reason: `Resource '${resource}' does not match the expected format for the '${service}' service.` };
    }
    return { valid: false, reason: `Invalid service '${service}' in resource '${resource}'.` };
}
}
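The two validators above can be exercised in isolation. A minimal, self-contained sketch — note that the `ValidActions` set and `ValidResourcePatterns` table below are stand-ins for illustration, not the repo's real definitions:

```typescript
// Hypothetical stand-ins for the module-level tables the validator consults.
const ValidActions = new Set(['archive:read', 'archive:delete']);
const ValidResourcePatterns = {
    archive: /^archive\/[\w-]+$/
};

// Same shape of logic as PolicyValidator.isActionValid in the diff above.
function isActionValid(action: string): { valid: boolean; reason: string } {
    if (action === '*') {
        return { valid: true, reason: 'valid' };
    }
    if (action.endsWith(':*')) {
        const service = action.split(':')[0];
        if (service in ValidResourcePatterns) {
            return { valid: true, reason: 'valid' };
        }
        return { valid: false, reason: `Invalid service '${service}' in action wildcard '${action}'.` };
    }
    if (ValidActions.has(action)) {
        return { valid: true, reason: 'valid' };
    }
    return { valid: false, reason: `Action '${action}' is not a valid action.` };
}

console.log(isActionValid('archive:read').valid); // true
console.log(isActionValid('archive:*').valid);    // true  (service-level wildcard)
console.log(isActionValid('billing:*').valid);    // false (unknown service)
```

Unknown services are rejected even in wildcard form, so a typo in a policy fails loudly instead of silently granting nothing.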
@@ -5,16 +5,20 @@ import { IngestionController } from './api/controllers/ingestion.controller';
import { ArchivedEmailController } from './api/controllers/archived-email.controller';
import { StorageController } from './api/controllers/storage.controller';
import { SearchController } from './api/controllers/search.controller';
import { IamController } from './api/controllers/iam.controller';
import { requireAuth } from './api/middleware/requireAuth';
import { createAuthRouter } from './api/routes/auth.routes';
import { createIamRouter } from './api/routes/iam.routes';
import { createIngestionRouter } from './api/routes/ingestion.routes';
import { createArchivedEmailRouter } from './api/routes/archived-email.routes';
import { createStorageRouter } from './api/routes/storage.routes';
import { createSearchRouter } from './api/routes/search.routes';
import { createDashboardRouter } from './api/routes/dashboard.routes';
import { createUploadRouter } from './api/routes/upload.routes';
import testRouter from './api/routes/test.routes';
import { AuthService } from './services/AuthService';
import { AdminUserService } from './services/UserService';
import { UserService } from './services/UserService';
import { IamService } from './services/IamService';
import { StorageService } from './services/StorageService';
import { SearchService } from './services/SearchService';

@@ -32,27 +36,26 @@ const {


if (!PORT_BACKEND || !JWT_SECRET || !JWT_EXPIRES_IN) {
    throw new Error('Missing required environment variables for the backend.');
    throw new Error('Missing required environment variables for the backend: PORT_BACKEND, JWT_SECRET, JWT_EXPIRES_IN.');
}

// --- Dependency Injection Setup ---

const userService = new AdminUserService();
const userService = new UserService();
const authService = new AuthService(userService, JWT_SECRET, JWT_EXPIRES_IN);
const authController = new AuthController(authService);
const authController = new AuthController(authService, userService);
const ingestionController = new IngestionController();
const archivedEmailController = new ArchivedEmailController();
const storageService = new StorageService();
const storageController = new StorageController(storageService);
const searchService = new SearchService();
const searchController = new SearchController();
const iamService = new IamService();
const iamController = new IamController(iamService);

// --- Express App Initialization ---
const app = express();

// Middleware
app.use(express.json()); // For parsing application/json

// --- Routes ---
const authRouter = createAuthRouter(authController);
const ingestionRouter = createIngestionRouter(ingestionController, authService);
@@ -60,7 +63,17 @@ const archivedEmailRouter = createArchivedEmailRouter(archivedEmailController, a
const storageRouter = createStorageRouter(storageController, authService);
const searchRouter = createSearchRouter(searchController, authService);
const dashboardRouter = createDashboardRouter(authService);
const iamRouter = createIamRouter(iamController);
const uploadRouter = createUploadRouter(authService);
// The upload route is added before the middleware because it doesn't use the JSON middleware.
app.use('/v1/upload', uploadRouter);

// Middleware for all other routes
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

app.use('/v1/auth', authRouter);
app.use('/v1/iam', iamRouter);
app.use('/v1/ingestion-sources', ingestionRouter);
app.use('/v1/archived-emails', archivedEmailRouter);
app.use('/v1/storage', storageRouter);

@@ -1,6 +1,6 @@
import { Job, FlowChildJob } from 'bullmq';
import { IngestionService } from '../../services/IngestionService';
import { IInitialImportJob } from '@open-archiver/types';
import { IInitialImportJob, IngestionProvider } from '@open-archiver/types';
import { EmailProviderFactory } from '../../services/EmailProviderFactory';
import { flowProducer } from '../queues';
import { logger } from '../../config/logger';
@@ -67,26 +67,15 @@ export default async (job: Job<IInitialImportJob>) => {
        }
    });
} else {
    const fileBasedIngestions = IngestionService.returnFileBasedIngestions();
    const finalStatus = fileBasedIngestions.includes(source.provider) ? 'imported' : 'active';
    // If there are no users, we can consider the import finished and set to active
    await IngestionService.update(ingestionSourceId, {
        status: 'active',
        status: finalStatus,
        lastSyncFinishedAt: new Date(),
        lastSyncStatusMessage: 'Initial import complete. No users found.'
    });
}
// } else {
//     // For other providers, we might trigger a simpler bulk import directly
//     await new IngestionService().performBulkImport(job.data);
//     await flowProducer.add({
//         name: 'sync-cycle-finished',
//         queueName: 'ingestion',
//         data: {
//             ingestionSourceId,
//             userCount: 1,
//             isInitialImport: true
//         }
//     });
// }

logger.info({ ingestionSourceId }, 'Finished initial import master job');
} catch (error) {

@@ -1,7 +1,7 @@
import { Job } from 'bullmq';
import { IngestionService } from '../../services/IngestionService';
import { logger } from '../../config/logger';
import { SyncState, ProcessMailboxError } from '@open-archiver/types';
import { SyncState, ProcessMailboxError, IngestionStatus, IngestionProvider } from '@open-archiver/types';
import { db } from '../../database';
import { ingestionSources } from '../../database/schema';
import { eq } from 'drizzle-orm';
@@ -41,15 +41,28 @@ export default async (job: Job<ISyncCycleFinishedJob, any, string>) => {

const finalSyncState = deepmerge(...successfulJobs.filter(s => s && Object.keys(s).length > 0));

let status: 'active' | 'error' = 'active';
const source = await IngestionService.findById(ingestionSourceId);
let status: IngestionStatus = 'active';
const fileBasedIngestions = IngestionService.returnFileBasedIngestions();

if (fileBasedIngestions.includes(source.provider)) {
    status = 'imported';
}
let message: string;

// Check for a specific rate-limit message from the successful jobs
const rateLimitMessage = successfulJobs.find(j => j.statusMessage)?.statusMessage;

if (failedJobs.length > 0) {
    status = 'error';
    const errorMessages = failedJobs.map(j => j.message).join('\n');
    message = `Sync cycle completed with ${failedJobs.length} error(s):\n${errorMessages}`;
    logger.error({ ingestionSourceId, errors: errorMessages }, 'Sync cycle finished with errors.');
} else {
} else if (rateLimitMessage) {
    message = rateLimitMessage;
    logger.warn({ ingestionSourceId, message }, 'Sync cycle paused due to rate limiting.');
}
else {
    message = 'Continuous sync cycle finished successfully.';
    if (isInitialImport) {
        message = `Initial import finished for ${userCount} mailboxes.`;

@@ -1,5 +1,7 @@
import { ingestionQueue } from '../queues';

import { config } from '../../config';

const scheduleContinuousSync = async () => {
    // This job will run every 15 minutes
    await ingestionQueue.add(
@@ -7,7 +9,7 @@ const scheduleContinuousSync = async () => {
        {},
        {
            repeat: {
                pattern: '* * * * *', // Every 1 minute
                pattern: config.app.syncFrequency
            },
        }
    );

@@ -1,4 +1,4 @@
import { count, desc, eq, asc } from 'drizzle-orm';
import { count, desc, eq, asc, and } from 'drizzle-orm';
import { db } from '../database';
import { archivedEmails, attachments, emailAttachments } from '../database/schema';
import type { PaginatedArchivedEmails, ArchivedEmail, Recipient, ThreadEmail } from '@open-archiver/types';
@@ -59,7 +59,9 @@ export class ArchivedEmailService {
return {
    items: items.map((item) => ({
        ...item,
        recipients: this.mapRecipients(item.recipients)
        recipients: this.mapRecipients(item.recipients),
        tags: (item.tags as string[] | null) || null,
        path: item.path || null
    })),
    total: total.count,
    page,
@@ -81,7 +83,10 @@ export class ArchivedEmailService {

if (email.threadId) {
    threadEmails = await db.query.archivedEmails.findMany({
        where: eq(archivedEmails.threadId, email.threadId),
        where: and(
            eq(archivedEmails.threadId, email.threadId),
            eq(archivedEmails.ingestionSourceId, email.ingestionSourceId)
        ),
        orderBy: [asc(archivedEmails.sentAt)],
        columns: {
            id: true,
@@ -100,7 +105,9 @@ export class ArchivedEmailService {
...email,
recipients: this.mapRecipients(email.recipients),
raw,
thread: threadEmails
thread: threadEmails,
tags: (email.tags as string[] | null) || null,
path: email.path || null
};

if (email.hasAttachments) {

@@ -1,38 +1,23 @@
import { compare, hash } from 'bcryptjs';
import type { SignJWT, jwtVerify } from 'jose';
import type { AuthTokenPayload, User, LoginResponse } from '@open-archiver/types';
import { compare } from 'bcryptjs';
import { SignJWT, jwtVerify } from 'jose';
import type { AuthTokenPayload, LoginResponse } from '@open-archiver/types';
import { UserService } from './UserService';
import { db } from '../database';
import * as schema from '../database/schema';
import { eq } from 'drizzle-orm';

// This interface defines the contract for a service that manages users.
// The AuthService will depend on this abstraction, not a concrete implementation.
export interface IUserService {
    findByEmail(email: string): Promise<User | null>;
}

// This interface defines the contract for our AuthService.
export interface IAuthService {
    verifyPassword(password: string, hash: string): Promise<boolean>;
    login(email: string, password: string): Promise<LoginResponse | null>;
    verifyToken(token: string): Promise<AuthTokenPayload | null>;
}

export class AuthService implements IAuthService {
    #userService: IUserService;
export class AuthService {
    #userService: UserService;
    #jwtSecret: Uint8Array;
    #jwtExpiresIn: string;
    #jose: Promise<{ SignJWT: typeof SignJWT; jwtVerify: typeof jwtVerify; }>;

    constructor(userService: IUserService, jwtSecret: string, jwtExpiresIn: string) {
    constructor(userService: UserService, jwtSecret: string, jwtExpiresIn: string) {
        this.#userService = userService;
        this.#jwtSecret = new TextEncoder().encode(jwtSecret);
        this.#jwtExpiresIn = jwtExpiresIn;
        this.#jose = import('jose');
    }

    #hashPassword(password: string): Promise<string> {
        return hash(password, 10);
    }

    public verifyPassword(password: string, hash: string): Promise<boolean> {
    public async verifyPassword(password: string, hash: string): Promise<boolean> {
        return compare(password, hash);
    }

@@ -40,7 +25,6 @@ export class AuthService implements IAuthService {
if (!payload.sub) {
    throw new Error('JWT payload must have a subject (sub) claim.');
}
const { SignJWT } = await this.#jose;
return new SignJWT(payload)
    .setProtectedHeader({ alg: 'HS256' })
    .setIssuedAt()
@@ -52,22 +36,31 @@ export class AuthService implements IAuthService {
public async login(email: string, password: string): Promise<LoginResponse | null> {
    const user = await this.#userService.findByEmail(email);

    if (!user) {
        return null; // User not found
    if (!user || !user.password) {
        return null; // User not found or password not set
    }

    const isPasswordValid = await this.verifyPassword(password, user.passwordHash);
    const isPasswordValid = await this.verifyPassword(password, user.password);

    if (!isPasswordValid) {
        return null; // Invalid password
    }

    const { passwordHash, ...userWithoutPassword } = user;
    const userRoles = await db.query.userRoles.findMany({
        where: eq(schema.userRoles.userId, user.id),
        with: {
            role: true
        }
    });

    const roles = userRoles.map(ur => ur.role.name);

    const { password: _, ...userWithoutPassword } = user;

    const accessToken = await this.#generateAccessToken({
        sub: user.id,
        email: user.email,
        role: user.role,
        roles: roles,
    });

    return { accessToken, user: userWithoutPassword };
@@ -75,7 +68,6 @@ export class AuthService implements IAuthService {

public async verifyToken(token: string): Promise<AuthTokenPayload | null> {
    try {
        const { jwtVerify } = await this.#jose;
        const { payload } = await jwtVerify<AuthTokenPayload>(token, this.#jwtSecret);
        return payload;
    } catch (error) {

@@ -3,6 +3,8 @@ import type {
    GoogleWorkspaceCredentials,
    Microsoft365Credentials,
    GenericImapCredentials,
    PSTImportCredentials,
    EMLImportCredentials,
    EmailObject,
    SyncState,
    MailboxUser
@@ -10,6 +12,8 @@ import type {
import { GoogleWorkspaceConnector } from './ingestion-connectors/GoogleWorkspaceConnector';
import { MicrosoftConnector } from './ingestion-connectors/MicrosoftConnector';
import { ImapConnector } from './ingestion-connectors/ImapConnector';
import { PSTConnector } from './ingestion-connectors/PSTConnector';
import { EMLConnector } from './ingestion-connectors/EMLConnector';

// Define a common interface for all connectors
export interface IEmailConnector {
@@ -32,6 +36,10 @@ export class EmailProviderFactory {
        return new MicrosoftConnector(credentials as Microsoft365Credentials);
    case 'generic_imap':
        return new ImapConnector(credentials as GenericImapCredentials);
    case 'pst_import':
        return new PSTConnector(credentials as PSTImportCredentials);
    case 'eml_import':
        return new EMLConnector(credentials as EMLImportCredentials);
    default:
        throw new Error(`Unsupported provider: ${source.provider}`);
}

24
packages/backend/src/services/IamService.ts
Normal file
@@ -0,0 +1,24 @@
import { db } from '../database';
import { roles } from '../database/schema/users';
import type { Role, PolicyStatement } from '@open-archiver/types';
import { eq } from 'drizzle-orm';

export class IamService {
    public async getRoles(): Promise<Role[]> {
        return db.select().from(roles);
    }

    public async getRoleById(id: string): Promise<Role | undefined> {
        const [role] = await db.select().from(roles).where(eq(roles.id, id));
        return role;
    }

    public async createRole(name: string, policy: PolicyStatement[]): Promise<Role> {
        const [role] = await db.insert(roles).values({ name, policies: policy }).returning();
        return role;
    }

    public async deleteRole(id: string): Promise<void> {
        await db.delete(roles).where(eq(roles.id, id));
    }
}
@@ -4,7 +4,8 @@ import type {
    CreateIngestionSourceDto,
    UpdateIngestionSourceDto,
    IngestionSource,
    IngestionCredentials
    IngestionCredentials,
    IngestionProvider
} from '@open-archiver/types';
import { and, desc, eq } from 'drizzle-orm';
import { CryptoService } from './CryptoService';
@@ -19,6 +20,7 @@ import { logger } from '../config/logger';
import { IndexingService } from './IndexingService';
import { SearchService } from './SearchService';
import { DatabaseService } from './DatabaseService';
import { config } from '../config/index';


export class IngestionService {
@@ -35,9 +37,12 @@ export class IngestionService {
    return { ...source, credentials: decryptedCredentials } as IngestionSource;
}

public static returnFileBasedIngestions(): IngestionProvider[] {
    return ['pst_import', 'eml_import'];
}

public static async create(dto: CreateIngestionSourceDto): Promise<IngestionSource> {
    const { providerConfig, ...rest } = dto;

    const encryptedCredentials = CryptoService.encryptObject(providerConfig);

    const valuesToInsert = {
@@ -136,9 +141,16 @@ export class IngestionService {

// Delete all emails and attachments from storage
const storage = new StorageService();
const emailPath = `open-archiver/${source.name.replaceAll(' ', '-')}-${source.id}/`;
const emailPath = `${config.storage.openArchiverFolderName}/${source.name.replaceAll(' ', '-')}-${source.id}/`;
await storage.delete(emailPath);

if (
    (source.credentials.type === 'pst_import' || source.credentials.type === 'eml_import') &&
    source.credentials.uploadedFilePath &&
    (await storage.exists(source.credentials.uploadedFilePath))
) {
    await storage.delete(source.credentials.uploadedFilePath);
}

// Delete all emails from the database
// NOTE: This is done by database CASCADE; change when the CASCADE relation no longer exists.
@@ -200,14 +212,13 @@ export class IngestionService {
}

public async performBulkImport(job: IInitialImportJob): Promise<void> {
    console.log('performing bulk import');
    const { ingestionSourceId } = job;
    const source = await IngestionService.findById(ingestionSourceId);
    if (!source) {
        throw new Error(`Ingestion source ${ingestionSourceId} not found.`);
    }

    console.log(`Starting bulk import for source: ${source.name} (${source.id})`);
    logger.info(`Starting bulk import for source: ${source.name} (${source.id})`);
    await IngestionService.update(ingestionSourceId, {
        status: 'importing',
        lastSyncStartedAt: new Date()
@@ -229,22 +240,13 @@ export class IngestionService {
        }
    } else {
        // For single-mailbox providers, dispatch a single job
        // console.log('source.credentials ', source.credentials);
        await ingestionQueue.add('process-mailbox', {
            ingestionSourceId: source.id,
            userEmail: source.credentials.type === 'generic_imap' ? source.credentials.username : 'Default'
        });
    }


    // await IngestionService.update(ingestionSourceId, {
    //     status: 'active',
    //     lastSyncFinishedAt: new Date(),
    //     lastSyncStatusMessage: 'Successfully initiated bulk import for all mailboxes.'
    // });
    // console.log(`Bulk import job dispatch finished for source: ${source.name} (${source.id})`);
} catch (error) {
    console.error(`Bulk import failed for source: ${source.name} (${source.id})`, error);
    logger.error(`Bulk import failed for source: ${source.name} (${source.id})`, error);
    await IngestionService.update(ingestionSourceId, {
        status: 'error',
        lastSyncFinishedAt: new Date(),
@@ -286,10 +288,10 @@ export class IngestionService {
    return;
}

console.log('processing email, ', email.id, email.subject);
const emlBuffer = email.eml ?? Buffer.from(email.body, 'utf-8');
const emailHash = createHash('sha256').update(emlBuffer).digest('hex');
const emailPath = `open-archiver/${source.name.replaceAll(' ', '-')}-${source.id}/emails/${email.id}.eml`;
const sanitizedPath = email.path ? email.path : '';
const emailPath = `${config.storage.openArchiverFolderName}/${source.name.replaceAll(' ', '-')}-${source.id}/emails/${sanitizedPath}${email.id}.eml`;
await storage.put(emailPath, emlBuffer);

const [archivedEmail] = await db
@@ -311,7 +313,9 @@ export class IngestionService {
    storagePath: emailPath,
    storageHashSha256: emailHash,
    sizeBytes: emlBuffer.length,
    hasAttachments: email.attachments.length > 0
    hasAttachments: email.attachments.length > 0,
    path: email.path,
    tags: email.tags
})
.returning();

@@ -319,7 +323,7 @@ export class IngestionService {
for (const attachment of email.attachments) {
    const attachmentBuffer = attachment.content;
    const attachmentHash = createHash('sha256').update(attachmentBuffer).digest('hex');
    const attachmentPath = `open-archiver/${source.name.replaceAll(' ', '-')}-${source.id}/attachments/${attachment.filename}`;
    const attachmentPath = `${config.storage.openArchiverFolderName}/${source.name.replaceAll(' ', '-')}-${source.id}/attachments/${attachment.filename}`;
    await storage.put(attachmentPath, attachmentBuffer);

    const [newAttachment] = await db
@@ -348,7 +352,7 @@ export class IngestionService {
}
// adding to indexing queue
// Instead: index by email (raw email object, ingestion id)
console.log('Indexing email: ', email.subject);
logger.info({ emailId: archivedEmail.id }, 'Indexing email');
// await indexingQueue.add('index-email', {
//     emailId: archivedEmail.id,
// });

@@ -1,31 +1,86 @@
import { db } from '../database';
import * as schema from '../database/schema';
import { and, eq, asc, sql } from 'drizzle-orm';
import { hash } from 'bcryptjs';
import type { User } from '@open-archiver/types';
import type { IUserService } from './AuthService';
import type { PolicyStatement, User } from '@open-archiver/types';
import { PolicyValidator } from '../iam-policy/policy-validator';

// This is a mock implementation of the IUserService.
// Later on, this service would interact with a database.
export class AdminUserService implements IUserService {
    #users: User[] = [];

    constructor() {
        // Immediately seed the user when the service is instantiated.
        this.seed();
    }

    // use .env admin user
    private async seed() {
        const passwordHash = await hash(process.env.ADMIN_PASSWORD as string, 10);
        this.#users.push({
            id: '1',
            email: process.env.ADMIN_EMAIL as string,
            role: 'Super Administrator',
            passwordHash: passwordHash,
export class UserService {
    /**
     * Finds a user by their email address.
     * @param email The email address of the user to find.
     * @returns The user object if found, otherwise null.
     */
    public async findByEmail(email: string): Promise<(typeof schema.users.$inferSelect) | null> {
        const user = await db.query.users.findFirst({
            where: eq(schema.users.email, email)
        });
    }

    public async findByEmail(email: string): Promise<User | null> {
        // once user service is ready, this would be a database query.
        const user = this.#users.find(u => u.email === email);
        return user || null;
    }

    /**
     * Finds a user by their ID.
     * @param id The ID of the user to find.
     * @returns The user object if found, otherwise null.
     */
    public async findById(id: string): Promise<(typeof schema.users.$inferSelect) | null> {
        const user = await db.query.users.findFirst({
            where: eq(schema.users.id, id)
        });
        return user || null;
    }

    /**
     * Creates an admin user in the database. The user created will be assigned the 'Super Admin' role.
     *
     * Caution ⚠️: This action is only allowed during the initial setup.
     *
     * @param userDetails The details of the user to create.
     * @param isSetup Is this an initial setup?
     * @returns The newly created user object.
     */
    public async createAdminUser(userDetails: Pick<User, 'email' | 'first_name' | 'last_name'> & { password?: string; }, isSetup: boolean): Promise<(typeof schema.users.$inferSelect)> {
        if (!isSetup) {
            throw Error('This operation is only allowed upon initial setup.');
        }
        const { email, first_name, last_name, password } = userDetails;
        const userCountResult = await db.select({ count: sql<number>`count(*)` }).from(schema.users);
        const isFirstUser = Number(userCountResult[0].count) === 0;
        if (!isFirstUser) {
            throw Error('This operation is only allowed upon initial setup.');
        }
        const hashedPassword = password ? await hash(password, 10) : undefined;

        const newUser = await db.insert(schema.users).values({
            email,
            first_name,
            last_name,
            password: hashedPassword,
        }).returning();

        // find super admin role
        let superAdminRole = await db.query.roles.findFirst({
            where: eq(schema.roles.name, 'Super Admin')
        });

        if (!superAdminRole) {
            const superAdminPolicies: PolicyStatement[] = [{
                Effect: 'Allow',
                Action: ['*'],
                Resource: ['*']
            }];
            superAdminRole = (await db.insert(schema.roles).values({
                name: 'Super Admin',
                policies: superAdminPolicies
            }).returning())[0];
        }

        await db.insert(schema.userRoles).values({
            userId: newUser[0].id,
            roleId: superAdminRole.id
        });


        return newUser[0];
    }
}

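The 'Super Admin' role seeded by `createAdminUser` carries an allow-everything policy document. A minimal sketch of that shape — the `PolicyStatement` type is re-declared locally here for illustration; the real one lives in `@open-archiver/types`:

```typescript
// Local re-declaration for illustration only.
type PolicyStatement = {
    Effect: 'Allow' | 'Deny';
    Action: string[];
    Resource: string[];
};

// The policy attached to the seeded 'Super Admin' role:
// allow every action on every resource.
const superAdminPolicies: PolicyStatement[] = [
    { Effect: 'Allow', Action: ['*'], Resource: ['*'] }
];

console.log(JSON.stringify(superAdminPolicies));
```

Both the action and resource lists use the `'*'` wildcard, which the policy validator earlier in this commit accepts unconditionally.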
@@ -0,0 +1,199 @@
import type { EMLImportCredentials, EmailObject, EmailAddress, SyncState, MailboxUser } from '@open-archiver/types';
import type { IEmailConnector } from '../EmailProviderFactory';
import { simpleParser, ParsedMail, Attachment, AddressObject } from 'mailparser';
import { logger } from '../../config/logger';
import { getThreadId } from './helpers/utils';
import { StorageService } from '../StorageService';
import { Readable } from 'stream';
import { createHash } from 'crypto';
import { join, dirname } from 'path';
import { createReadStream, promises as fs, createWriteStream } from 'fs';
import * as yauzl from 'yauzl';

const streamToBuffer = (stream: Readable): Promise<Buffer> => {
    return new Promise((resolve, reject) => {
        const chunks: Buffer[] = [];
        stream.on('data', (chunk) => chunks.push(chunk));
        stream.on('error', reject);
        stream.on('end', () => resolve(Buffer.concat(chunks)));
    });
};

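The `streamToBuffer` helper above can be checked on its own with a synthetic stream built from `Readable.from`:

```typescript
import { Readable } from 'stream';

// Same helper as in the connector: drain a Readable into a single Buffer.
const streamToBuffer = (stream: Readable): Promise<Buffer> => {
    return new Promise((resolve, reject) => {
        const chunks: Buffer[] = [];
        stream.on('data', (chunk) => chunks.push(chunk));
        stream.on('error', reject);
        stream.on('end', () => resolve(Buffer.concat(chunks)));
    });
};

async function demo() {
    const buf = await streamToBuffer(
        Readable.from([Buffer.from('hello '), Buffer.from('world')])
    );
    console.log(buf.toString()); // hello world
}
demo();
```

Buffering the whole stream keeps parsing simple at the cost of holding each `.eml` file in memory, which is fine for individual messages.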
export class EMLConnector implements IEmailConnector {
    private storage: StorageService;

    constructor(private credentials: EMLImportCredentials) {
        this.storage = new StorageService();
    }

    public async testConnection(): Promise<boolean> {
        try {
            if (!this.credentials.uploadedFilePath) {
                throw Error("EML file path not provided.");
            }
            if (!this.credentials.uploadedFilePath.includes('.zip')) {
                throw Error("Provided file is not in the ZIP format.");
            }
            const fileExist = await this.storage.exists(this.credentials.uploadedFilePath);
            if (!fileExist) {
                throw Error("EML file upload not finished yet, please wait.");
            }

            return true;
        } catch (error) {
            logger.error({ error, credentials: this.credentials }, 'EML file validation failed.');
            throw error;
        }
    }

    public async *listAllUsers(): AsyncGenerator<MailboxUser> {
        const displayName = this.credentials.uploadedFileName || `eml-import-${new Date().getTime()}`;
        logger.info(`Found potential mailbox: ${displayName}`);
        const constructedPrimaryEmail = `${displayName.replace(/ /g, '.').toLowerCase()}@eml.local`;
        yield {
            id: constructedPrimaryEmail,
            primaryEmail: constructedPrimaryEmail,
            displayName: displayName,
        };
    }

    public async *fetchEmails(userEmail: string, syncState?: SyncState | null): AsyncGenerator<EmailObject | null> {
        const fileStream = await this.storage.get(this.credentials.uploadedFilePath);
        const tempDir = await fs.mkdtemp(join('/tmp', 'eml-import-'));
        const unzippedPath = join(tempDir, 'unzipped');
        await fs.mkdir(unzippedPath);
        const zipFilePath = join(tempDir, 'eml.zip');

        try {
            await new Promise<void>((resolve, reject) => {
                const dest = createWriteStream(zipFilePath);
                (fileStream as Readable).pipe(dest);
                dest.on('finish', () => resolve());
                dest.on('error', reject);
            });

            await this.extract(zipFilePath, unzippedPath);

            const files = await this.getAllFiles(unzippedPath);

            for (const file of files) {
                if (file.endsWith('.eml')) {
                    try {
                        // logger.info({ file }, 'Processing EML file.');
                        const stream = createReadStream(file);
                        const content = await streamToBuffer(stream);
                        // logger.info({ file, size: content.length }, 'Read file to buffer.');
                        let relativePath = file.substring(unzippedPath.length + 1);
                        if (dirname(relativePath) === '.') {
                            relativePath = '';
                        } else {
                            relativePath = dirname(relativePath);
                        }
                        const emailObject = await this.parseMessage(content, relativePath);
                        // logger.info({ file, messageId: emailObject.id }, 'Parsed email message.');
                        yield emailObject;
                    } catch (error) {
                        logger.error({ error, file }, 'Failed to process a single EML file. Skipping.');
                    }
                }
            }
        } catch (error) {
            logger.error({ error }, 'Failed to fetch email.');
            throw error;
        } finally {
            await fs.rm(tempDir, { recursive: true, force: true });
        }
    }

    private extract(zipFilePath: string, dest: string): Promise<void> {
        return new Promise((resolve, reject) => {
            yauzl.open(zipFilePath, { lazyEntries: true, decodeStrings: false }, (err, zipfile) => {
                if (err) reject(err);
                zipfile.on('error', reject);
                zipfile.readEntry();
                zipfile.on('entry', (entry) => {
                    const fileName = entry.fileName.toString('utf8');
                    // Ignore macOS-specific metadata files.
                    if (fileName.startsWith('__MACOSX/')) {
                        zipfile.readEntry();
                        return;
                    }
                    const entryPath = join(dest, fileName);
                    if (/\/$/.test(fileName)) {
                        fs.mkdir(entryPath, { recursive: true }).then(() => zipfile.readEntry()).catch(reject);
                    } else {
                        zipfile.openReadStream(entry, (err, readStream) => {
                            if (err) reject(err);
                            const writeStream = createWriteStream(entryPath);
                            readStream.pipe(writeStream);
                            writeStream.on('finish', () => zipfile.readEntry());
                            writeStream.on('error', reject);
                        });
                    }
                });
                zipfile.on('end', () => resolve());
            });
        });
    }

    private async getAllFiles(dirPath: string, arrayOfFiles: string[] = []): Promise<string[]> {
        const files = await fs.readdir(dirPath);

        for (const file of files) {
            const fullPath = join(dirPath, file);
            if ((await fs.stat(fullPath)).isDirectory()) {
                await this.getAllFiles(fullPath, arrayOfFiles);
            } else {
                arrayOfFiles.push(fullPath);
            }
        }

        return arrayOfFiles;
    }

    private async parseMessage(emlBuffer: Buffer, path: string): Promise<EmailObject> {
        const parsedEmail: ParsedMail = await simpleParser(emlBuffer);

        const attachments = parsedEmail.attachments.map((attachment: Attachment) => ({
            filename: attachment.filename || 'untitled',
            contentType: attachment.contentType,
            size: attachment.size,
            content: attachment.content as Buffer
        }));

        const mapAddresses = (addresses: AddressObject | AddressObject[] | undefined): EmailAddress[] => {
            if (!addresses) return [];
            const addressArray = Array.isArray(addresses) ? addresses : [addresses];
            return addressArray.flatMap(a => a.value.map(v => ({ name: v.name, address: v.address?.replaceAll(`'`, '') || '' })));
        };

        const threadId = getThreadId(parsedEmail.headers);
        let messageId = parsedEmail.messageId;

        if (!messageId) {
            messageId = `generated-${createHash('sha256').update(emlBuffer).digest('hex')}`;
        }

        return {
            id: messageId,
            threadId: threadId,
from: mapAddresses(parsedEmail.from),
|
||||
to: mapAddresses(parsedEmail.to),
|
||||
cc: mapAddresses(parsedEmail.cc),
|
||||
bcc: mapAddresses(parsedEmail.bcc),
|
||||
subject: parsedEmail.subject || '',
|
||||
body: parsedEmail.text || '',
|
||||
html: parsedEmail.html || '',
|
||||
headers: parsedEmail.headers,
|
||||
attachments,
|
||||
receivedAt: parsedEmail.date || new Date(),
|
||||
eml: emlBuffer,
|
||||
path
|
||||
};
|
||||
}
|
||||
|
||||
public getUpdatedSyncState(): SyncState {
|
||||
return {};
|
||||
}
|
||||
}
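The relative-path derivation in the importer above (strip the unzip root, then keep only the directory part) can be sketched as a standalone helper. `relativeFolder` is a hypothetical name for illustration, not part of the codebase:

```typescript
import { dirname } from 'path';

// Hypothetical helper mirroring the importer's path logic: strip the
// unzip root from an extracted file path, then keep only its folder part.
// Files at the root of the archive map to an empty folder path.
function relativeFolder(file: string, unzippedPath: string): string {
	const relativePath = file.substring(unzippedPath.length + 1);
	return dirname(relativePath) === '.' ? '' : dirname(relativePath);
}
```

This is how an EML extracted to `<unzip root>/inbox/a.eml` ends up archived under the `inbox` folder, while files at the archive root get no folder.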
@@ -10,7 +10,7 @@ import type {
import type { IEmailConnector } from '../EmailProviderFactory';
import { logger } from '../../config/logger';
import { simpleParser, ParsedMail, Attachment, AddressObject, Headers } from 'mailparser';
import { getThreadId } from './helpers/utils';

/**
 * A connector for Google Workspace that uses a service account with domain-wide delegation
@@ -168,9 +168,18 @@ export class GoogleWorkspaceConnector implements IEmailConnector {
			for (const messageAdded of historyRecord.messagesAdded) {
				if (messageAdded.message?.id) {
					try {
						const messageId = messageAdded.message.id;
						const metadataResponse = await gmail.users.messages.get({
							userId: userEmail,
							id: messageId,
							format: 'METADATA',
							fields: 'labelIds'
						});
						const labels = await this.getLabelDetails(gmail, userEmail, metadataResponse.data.labelIds || []);

						const msgResponse = await gmail.users.messages.get({
							userId: userEmail,
							id: messageId,
							format: 'RAW'
						});
@@ -205,6 +214,8 @@ export class GoogleWorkspaceConnector implements IEmailConnector {
							headers: parsedEmail.headers,
							attachments,
							receivedAt: parsedEmail.date || new Date(),
							path: labels.path,
							tags: labels.tags
						};
					}
				} catch (error: any) {
@@ -243,9 +254,18 @@ export class GoogleWorkspaceConnector implements IEmailConnector {
			for (const message of messages) {
				if (message.id) {
					try {
						const messageId = message.id;
						const metadataResponse = await gmail.users.messages.get({
							userId: userEmail,
							id: messageId,
							format: 'METADATA',
							fields: 'labelIds'
						});
						const labels = await this.getLabelDetails(gmail, userEmail, metadataResponse.data.labelIds || []);

						const msgResponse = await gmail.users.messages.get({
							userId: userEmail,
							id: messageId,
							format: 'RAW'
						});
@@ -280,6 +300,8 @@ export class GoogleWorkspaceConnector implements IEmailConnector {
							headers: parsedEmail.headers,
							attachments,
							receivedAt: parsedEmail.date || new Date(),
							path: labels.path,
							tags: labels.tags
						};
					}
				} catch (error: any) {
@@ -313,4 +335,29 @@ export class GoogleWorkspaceConnector implements IEmailConnector {
			}
		};
	}

	private labelCache: Map<string, gmail_v1.Schema$Label> = new Map();

	private async getLabelDetails(gmail: gmail_v1.Gmail, userEmail: string, labelIds: string[]): Promise<{ path: string, tags: string[]; }> {
		const tags: string[] = [];
		let path = '';

		for (const labelId of labelIds) {
			let label = this.labelCache.get(labelId);
			if (!label) {
				const res = await gmail.users.labels.get({ userId: userEmail, id: labelId });
				label = res.data;
				this.labelCache.set(labelId, label);
			}

			if (label.name) {
				tags.push(label.name);
				if (label.type === 'user') {
					path = path ? `${path}/${label.name}` : label.name;
				}
			}
		}

		return { path, tags };
	}
}
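The mapping implemented by `getLabelDetails` above can be sketched in isolation: every label name becomes a tag, while only user-created labels (not system ones like INBOX) contribute to the folder-like path. `LabelLike` and `labelsToPathAndTags` are hypothetical names standing in for `gmail_v1.Schema$Label` and the connector method:

```typescript
// Minimal sketch of the path/tag mapping in getLabelDetails above.
// LabelLike is a hypothetical stand-in for gmail_v1.Schema$Label.
interface LabelLike {
	name: string;
	type: 'system' | 'user';
}

function labelsToPathAndTags(labels: LabelLike[]): { path: string; tags: string[] } {
	const tags: string[] = [];
	let path = '';
	for (const label of labels) {
		// Every label becomes a searchable tag.
		tags.push(label.name);
		// Only user-created labels contribute to the folder-like path.
		if (label.type === 'user') {
			path = path ? `${path}/${label.name}` : label.name;
		}
	}
	return { path, tags };
}
```

So a message labeled `INBOX`, `Clients`, and `Clients/Acme`-style nested user labels keeps all three as tags, but only the user labels form the archive path.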
@@ -3,15 +3,20 @@ import type { IEmailConnector } from '../EmailProviderFactory';
import { ImapFlow } from 'imapflow';
import { simpleParser, ParsedMail, Attachment, AddressObject, Headers } from 'mailparser';
import { logger } from '../../config/logger';
import { getThreadId } from './helpers/utils';

export class ImapConnector implements IEmailConnector {
	private client: ImapFlow;
	private newMaxUids: { [mailboxPath: string]: number; } = {};
	private isConnected = false;
	private statusMessage: string | undefined;

	constructor(private credentials: GenericImapCredentials) {
		this.client = this.createClient();
	}

	private createClient(): ImapFlow {
		const client = new ImapFlow({
			host: this.credentials.host,
			port: this.credentials.port,
			secure: this.credentials.secure,
@@ -23,10 +28,12 @@ export class ImapConnector implements IEmailConnector {
		});

		// Handles client-level errors, like unexpected disconnects, to prevent crashes.
		client.on('error', (err) => {
			logger.error({ err }, 'IMAP client error');
			this.isConnected = false;
		});

		return client;
	}

	/**
@@ -36,6 +43,12 @@ export class ImapConnector implements IEmailConnector {
		if (this.isConnected && this.client.usable) {
			return;
		}

		// If the client is not usable (e.g., after a logout), create a new one.
		if (!this.client.usable) {
			this.client = this.createClient();
		}

		try {
			await this.client.connect();
			this.isConnected = true;
@@ -100,7 +113,7 @@ export class ImapConnector implements IEmailConnector {
	 * @param maxRetries The maximum number of retries.
	 * @returns The result of the action.
	 */
	private async withRetry<T>(action: () => Promise<T>, maxRetries = 5): Promise<T> {
		for (let attempt = 1; attempt <= maxRetries; attempt++) {
			try {
				await this.connect();
@@ -113,7 +126,10 @@ export class ImapConnector implements IEmailConnector {
				throw err;
			}
			const delay = Math.pow(2, attempt) * 1000;
			const jitter = Math.random() * 1000;
			logger.info(`Retrying in ${Math.round((delay + jitter) / 1000)}s`);
			await new Promise(resolve => setTimeout(resolve, delay + jitter));
		}
	}
	// This line should be unreachable
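The new retry delay replaces the old fixed `1000 * attempt` wait with exponential backoff plus up to one second of random jitter. A minimal sketch of just the delay calculation (`backoffDelayMs` is a hypothetical name, not part of the connector):

```typescript
// Sketch of the backoff used in withRetry above: 2^attempt seconds,
// plus up to one second of random jitter to avoid synchronized retries.
function backoffDelayMs(attempt: number, jitter: number = Math.random() * 1000): number {
	return Math.pow(2, attempt) * 1000 + jitter;
}
```

With `maxRetries = 5`, the waits before each retry grow roughly 2s, 4s, 8s, 16s, 32s plus jitter, which spaces out reconnection attempts against rate-limited IMAP servers.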
@@ -121,28 +137,32 @@ export class ImapConnector implements IEmailConnector {
	}

	public async *fetchEmails(userEmail: string, syncState?: SyncState | null): AsyncGenerator<EmailObject | null> {
		try {
			// list all mailboxes first
			const mailboxes = await this.withRetry(async () => await this.client.list());
			await this.disconnect();

			const processableMailboxes = mailboxes.filter(mailbox => {
				// filter out trash and all mail emails
				if (mailbox.specialUse) {
					const specialUse = mailbox.specialUse.toLowerCase();
					if (specialUse === '\\junk' || specialUse === '\\trash' || specialUse === '\\all') {
						return false;
					}
				}
				// Fallback to checking flags
				if (mailbox.flags.has('\\Noselect') || mailbox.flags.has('\\Trash') || mailbox.flags.has('\\Junk') || mailbox.flags.has('\\All')) {
					return false;
				}

				return true;
			});

			for (const mailboxInfo of processableMailboxes) {
				const mailboxPath = mailboxInfo.path;
				logger.info({ mailboxPath }, 'Processing mailbox');

				try {
					const mailbox = await this.withRetry(async () => await this.client.mailboxOpen(mailboxPath));
					const lastUid = syncState?.imap?.[mailboxPath]?.maxUid;
					let currentMaxUid = lastUid || 0;

@@ -154,31 +174,55 @@ export class ImapConnector implements IEmailConnector {
					}
					this.newMaxUids[mailboxPath] = currentMaxUid;

					const BATCH_SIZE = 250; // A configurable batch size
					let startUid = (lastUid || 0) + 1;

					while (true) {
						const endUid = startUid + BATCH_SIZE - 1;
						const searchCriteria = { uid: `${startUid}:${endUid}` };
						let messagesInBatch = 0;

						for await (const msg of this.client.fetch(searchCriteria, { envelope: true, source: true, bodyStructure: true, uid: true })) {
							messagesInBatch++;

							if (lastUid && msg.uid <= lastUid) {
								continue;
							}

							if (msg.uid > this.newMaxUids[mailboxPath]) {
								this.newMaxUids[mailboxPath] = msg.uid;
							}

							if (msg.envelope && msg.source) {
								yield await this.parseMessage(msg, mailboxPath);
							}
						}

						// If this batch was smaller than the batch size, we've reached the end
						if (messagesInBatch < BATCH_SIZE) {
							break;
						}

						// Move to the next batch
						startUid = endUid + 1;
					}
				} catch (err: any) {
					logger.error({ err, mailboxPath }, 'Failed to process mailbox');
					// Check if the error indicates a persistent failure after retries
					if (err.message.includes('IMAP operation failed after all retries')) {
						this.statusMessage = 'Sync paused due to reaching the mail server rate limit. The process will automatically resume later.';
					}
				} finally {
					await this.disconnect();
				}
			}
		} finally {
			await this.disconnect();
		}
	}
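Instead of one open-ended `lastUid + 1:*` fetch, the rewritten loop above walks each mailbox in fixed UID windows and stops once a window yields fewer than `BATCH_SIZE` messages. The window arithmetic can be sketched as a small helper (`uidWindow` is a hypothetical name):

```typescript
// Sketch of the UID windows used by the batched fetch above: each window
// covers batchSize UIDs, and the next window starts right after it ends.
function uidWindow(startUid: number, batchSize: number): { range: string; nextStart: number } {
	const endUid = startUid + batchSize - 1;
	return { range: `${startUid}:${endUid}`, nextStart: endUid + 1 };
}
```

For a fresh mailbox (no sync state) the windows are `1:250`, then `251:500`, and so on; after a partial sync the first window starts at `maxUid + 1` from the stored state.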

	private async parseMessage(msg: any, mailboxPath: string): Promise<EmailObject> {
		const parsedEmail: ParsedMail = await simpleParser(msg.source);
		const attachments = parsedEmail.attachments.map((attachment: Attachment) => ({
			filename: attachment.filename || 'untitled',
@@ -196,7 +240,7 @@ export class ImapConnector implements IEmailConnector {
		const threadId = getThreadId(parsedEmail.headers);

		return {
			id: parsedEmail.messageId || msg.uid.toString(),
			threadId: threadId,
			from: mapAddresses(parsedEmail.from),
			to: mapAddresses(parsedEmail.to),
@@ -208,7 +252,8 @@ export class ImapConnector implements IEmailConnector {
			headers: parsedEmail.headers,
			attachments,
			receivedAt: parsedEmail.date || new Date(),
			eml: msg.source,
			path: mailboxPath
		};
	}

@@ -217,8 +262,14 @@ export class ImapConnector implements IEmailConnector {
		for (const [path, uid] of Object.entries(this.newMaxUids)) {
			imapSyncState[path] = { maxUid: uid };
		}
		const syncState: SyncState = {
			imap: imapSyncState
		};

		if (this.statusMessage) {
			syncState.statusMessage = this.statusMessage;
		}

		return syncState;
	}
}

@@ -143,9 +143,9 @@ export class MicrosoftConnector implements IEmailConnector {
		try {
			const folders = this.listAllFolders(userEmail);
			for await (const folder of folders) {
				if (folder.id && folder.path) {
					logger.info({ userEmail, folderId: folder.id, folderName: folder.displayName }, 'Syncing folder');
					yield* this.syncFolder(userEmail, folder.id, folder.path, this.newDeltaTokens[folder.id]);
				}
			}
		} catch (error) {
@@ -159,20 +159,33 @@ export class MicrosoftConnector implements IEmailConnector {
	 * @param userEmail The user principal name or ID.
	 * @returns An async generator that yields each mail folder.
	 */
	private async *listAllFolders(userEmail: string, parentFolderId?: string, currentPath = ''): AsyncGenerator<MailFolder & { path: string; }> {
		const requestUrl = parentFolderId
			? `/users/${userEmail}/mailFolders/${parentFolderId}/childFolders`
			: `/users/${userEmail}/mailFolders`;

		try {
			let response = await this.graphClient.api(requestUrl).get();

			while (response) {
				for (const folder of response.value as MailFolder[]) {
					const newPath = currentPath ? `${currentPath}/${folder.displayName || ''}` : folder.displayName || '';
					yield { ...folder, path: newPath || '' };

					if (folder.childFolderCount && folder.childFolderCount > 0) {
						yield* this.listAllFolders(userEmail, folder.id, newPath);
					}
				}

				if (response['@odata.nextLink']) {
					response = await this.graphClient.api(response['@odata.nextLink']).get();
				} else {
					break;
				}
			}
		} catch (error) {
			logger.error({ err: error, userEmail }, 'Failed to list mail folders');
			throw error; // Stop if we can't list folders
		}
	}

@@ -186,6 +199,7 @@ export class MicrosoftConnector implements IEmailConnector {
	private async *syncFolder(
		userEmail: string,
		folderId: string,
		path: string,
		deltaToken?: string
	): AsyncGenerator<EmailObject> {
		let requestUrl: string | undefined;
@@ -208,7 +222,7 @@ export class MicrosoftConnector implements IEmailConnector {
				if (message.id && !(message)['@removed']) {
					const rawEmail = await this.getRawEmail(userEmail, message.id);
					if (rawEmail) {
						const emailObject = await this.parseEmail(rawEmail, message.id, userEmail, path);
						emailObject.threadId = message.conversationId; // Add conversationId as threadId
						yield emailObject;
					}
@@ -242,7 +256,7 @@ export class MicrosoftConnector implements IEmailConnector {
		}
	}

	private async parseEmail(rawEmail: Buffer, messageId: string, userEmail: string, path: string): Promise<EmailObject> {
		const parsedEmail: ParsedMail = await simpleParser(rawEmail);
		const attachments = parsedEmail.attachments.map((attachment: Attachment) => ({
			filename: attachment.filename || 'untitled',
@@ -270,6 +284,7 @@ export class MicrosoftConnector implements IEmailConnector {
			headers: parsedEmail.headers,
			attachments,
			receivedAt: parsedEmail.date || new Date(),
			path
		};
	}
@@ -0,0 +1,337 @@
import type { PSTImportCredentials, EmailObject, EmailAddress, SyncState, MailboxUser } from '@open-archiver/types';
import type { IEmailConnector } from '../EmailProviderFactory';
import { PSTFile, PSTFolder, PSTMessage } from 'pst-extractor';
import { simpleParser, ParsedMail, Attachment, AddressObject } from 'mailparser';
import { logger } from '../../config/logger';
import { getThreadId } from './helpers/utils';
import { StorageService } from '../StorageService';
import { Readable } from 'stream';
import { createHash } from 'crypto';

const streamToBuffer = (stream: Readable): Promise<Buffer> => {
	return new Promise((resolve, reject) => {
		const chunks: Buffer[] = [];
		stream.on('data', (chunk) => chunks.push(chunk));
		stream.on('error', reject);
		stream.on('end', () => resolve(Buffer.concat(chunks)));
	});
};

// We have to hardcode names for deleted and trash folders here, as the current library doesn't support reading them from PST properties.
const DELETED_FOLDERS = new Set([
	// English
	'deleted items', 'trash',
	// Spanish
	'elementos eliminados', 'papelera',
	// French
	'éléments supprimés', 'corbeille',
	// German
	'gelöschte elemente', 'papierkorb',
	// Italian
	'posta eliminata', 'cestino',
	// Portuguese
	'itens excluídos', 'lixo',
	// Dutch
	'verwijderde items', 'prullenbak',
	// Russian
	'удаленные', 'корзина',
	// Polish
	'usunięte elementy', 'kosz',
	// Japanese
	'削除済みアイテム',
	// Czech
	'odstraněná pošta', 'koš',
	// Estonian
	'kustutatud kirjad', 'prügikast',
	// Swedish
	'borttagna objekt', 'skräp',
	// Danish
	'slettet post', 'papirkurv',
	// Norwegian
	'slettede elementer',
	// Finnish
	'poistetut', 'roskakori'
]);

const JUNK_FOLDERS = new Set([
	// English
	'junk email', 'spam',
	// Spanish
	'correo no deseado',
	// French
	'courrier indésirable',
	// German
	'junk-e-mail',
	// Italian
	'posta indesiderata',
	// Portuguese
	'lixo eletrônico',
	// Dutch
	'ongewenste e-mail',
	// Russian
	'нежелательная почта', 'спам',
	// Polish
	'wiadomości-śmieci',
	// Japanese
	'迷惑メール', 'スパム',
	// Czech
	'nevyžádaná pošta',
	// Estonian
	'rämpspost',
	// Swedish
	'skräppost',
	// Danish
	'uønsket post',
	// Norwegian
	'søppelpost',
	// Finnish
	'roskaposti'
]);

export class PSTConnector implements IEmailConnector {
	private storage: StorageService;
	private pstFile: PSTFile | null = null;

	constructor(private credentials: PSTImportCredentials) {
		this.storage = new StorageService();
	}

	private async loadPstFile(): Promise<PSTFile> {
		if (this.pstFile) {
			return this.pstFile;
		}
		const fileStream = await this.storage.get(this.credentials.uploadedFilePath);
		const buffer = await streamToBuffer(fileStream as Readable);
		this.pstFile = new PSTFile(buffer);
		return this.pstFile;
	}

	public async testConnection(): Promise<boolean> {
		try {
			if (!this.credentials.uploadedFilePath) {
				throw Error("PST file path not provided.");
			}
			if (!this.credentials.uploadedFilePath.includes('.pst')) {
				throw Error("Provided file is not in the PST format.");
			}
			const fileExist = await this.storage.exists(this.credentials.uploadedFilePath);
			if (!fileExist) {
				throw Error("PST file upload not finished yet, please wait.");
			}

			return true;
		} catch (error) {
			logger.error({ error, credentials: this.credentials }, 'PST file validation failed.');
			throw error;
		}
	}

	/**
	 * Lists mailboxes within the PST. It treats each top-level folder
	 * as a distinct mailbox, allowing it to handle PSTs that have been
	 * consolidated from multiple sources.
	 */
	public async *listAllUsers(): AsyncGenerator<MailboxUser> {
		let pstFile: PSTFile | null = null;
		try {
			pstFile = await this.loadPstFile();
			const root = pstFile.getRootFolder();
			const displayName: string = root.displayName || pstFile.pstFilename || String(new Date().getTime());
			logger.info(`Found potential mailbox: ${displayName}`);
			const constructedPrimaryEmail = `${displayName.replace(/ /g, '.').toLowerCase()}@pst.local`;
			yield {
				id: constructedPrimaryEmail,
				// We will address the primaryEmail problem in the next section.
				primaryEmail: constructedPrimaryEmail,
				displayName: displayName,
			};
		} catch (error) {
			logger.error({ error }, 'Failed to list users from PST file.');
			pstFile?.close();
			throw error;
		} finally {
			pstFile?.close();
		}
	}

	public async *fetchEmails(userEmail: string, syncState?: SyncState | null): AsyncGenerator<EmailObject | null> {
		let pstFile: PSTFile | null = null;
		try {
			pstFile = await this.loadPstFile();
			const root = pstFile.getRootFolder();
			yield* this.processFolder(root, '');
		} catch (error) {
			logger.error({ error }, 'Failed to fetch email.');
			pstFile?.close();
			throw error;
		} finally {
			pstFile?.close();
		}
	}

	private async *processFolder(folder: PSTFolder, currentPath: string): AsyncGenerator<EmailObject | null> {
		const folderName = folder.displayName.toLowerCase();
		if (DELETED_FOLDERS.has(folderName) || JUNK_FOLDERS.has(folderName)) {
			logger.info(`Skipping folder: ${folder.displayName}`);
			return;
		}

		const newPath = currentPath ? `${currentPath}/${folder.displayName}` : folder.displayName;

		if (folder.contentCount > 0) {
			let email: PSTMessage | null = folder.getNextChild();
			while (email != null) {
				yield await this.parseMessage(email, newPath);
				try {
					email = folder.getNextChild();
				} catch (error) {
					console.warn("Folder doesn't have child");
					email = null;
				}
			}
		}

		if (folder.hasSubfolders) {
			for (const subFolder of folder.getSubFolders()) {
				yield* this.processFolder(subFolder, newPath);
			}
		}
	}

	private async parseMessage(msg: PSTMessage, path: string): Promise<EmailObject> {
		const emlContent = await this.constructEml(msg);
		const emlBuffer = Buffer.from(emlContent, 'utf-8');
		const parsedEmail: ParsedMail = await simpleParser(emlBuffer);

		const attachments = parsedEmail.attachments.map((attachment: Attachment) => ({
			filename: attachment.filename || 'untitled',
			contentType: attachment.contentType,
			size: attachment.size,
			content: attachment.content as Buffer
		}));

		const mapAddresses = (addresses: AddressObject | AddressObject[] | undefined): EmailAddress[] => {
			if (!addresses) return [];
			const addressArray = Array.isArray(addresses) ? addresses : [addresses];
			return addressArray.flatMap(a => a.value.map(v => ({ name: v.name, address: v.address?.replaceAll(`'`, '') || '' })));
		};

		const threadId = getThreadId(parsedEmail.headers);
		let messageId = msg.internetMessageId;
		// generate a unique ID for this message if it is missing
		if (!messageId) {
			messageId = `generated-${createHash('sha256').update(emlBuffer ?? Buffer.from(parsedEmail.text || parsedEmail.html || '', 'utf-8')).digest('hex')}-${createHash('sha256').update(emlBuffer ?? Buffer.from(msg.subject || '', 'utf-8')).digest('hex')}-${msg.clientSubmitTime?.getTime()}`;
		}
		return {
			id: messageId,
			threadId: threadId,
			from: mapAddresses(parsedEmail.from),
			to: mapAddresses(parsedEmail.to),
			cc: mapAddresses(parsedEmail.cc),
			bcc: mapAddresses(parsedEmail.bcc),
			subject: parsedEmail.subject || '',
			body: parsedEmail.text || '',
			html: parsedEmail.html || '',
			headers: parsedEmail.headers,
			attachments,
			receivedAt: parsedEmail.date || new Date(),
			eml: emlBuffer,
			path
		};
	}
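When a PST message carries no Internet `Message-ID`, `parseMessage` above falls back to a deterministic ID derived from hashing the message content, so re-importing the same file yields the same IDs instead of duplicates. A stripped-down sketch of the scheme (`fallbackMessageId` is a hypothetical name; the real code also mixes in the subject hash and submit time):

```typescript
import { createHash } from 'crypto';

// Hypothetical, simplified version of the fallback above: derive a
// stable ID from the reconstructed EML bytes when Message-ID is absent.
function fallbackMessageId(emlBuffer: Buffer): string {
	return `generated-${createHash('sha256').update(emlBuffer).digest('hex')}`;
}
```

Because the ID is a pure function of the bytes, it doubles as a natural deduplication key across repeated imports.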

	private async constructEml(msg: PSTMessage): Promise<string> {
		let eml = '';
		const boundary = '----boundary-openarchiver';
		const altBoundary = '----boundary-openarchiver_alt';

		let headers = '';

		if (msg.senderName || msg.senderEmailAddress) {
			headers += `From: ${msg.senderName} <${msg.senderEmailAddress}>\n`;
		}
		if (msg.displayTo) {
			headers += `To: ${msg.displayTo}\n`;
		}
		if (msg.displayCC) {
			headers += `Cc: ${msg.displayCC}\n`;
		}
		if (msg.displayBCC) {
			headers += `Bcc: ${msg.displayBCC}\n`;
		}
		if (msg.subject) {
			headers += `Subject: ${msg.subject}\n`;
		}
		if (msg.clientSubmitTime) {
			headers += `Date: ${new Date(msg.clientSubmitTime).toUTCString()}\n`;
		}
		if (msg.internetMessageId) {
			headers += `Message-ID: <${msg.internetMessageId}>\n`;
		}
		if (msg.inReplyToId) {
			headers += `In-Reply-To: ${msg.inReplyToId}`;
		}
		if (msg.conversationId) {
			headers += `Conversation-Id: ${msg.conversationId}`;
		}
		headers += 'MIME-Version: 1.0\n';

		// add new headers
		if (!/Content-Type:/i.test(headers)) {
			if (msg.hasAttachments) {
				headers += `Content-Type: multipart/mixed; boundary="${boundary}"\n`;
				headers += `Content-Type: multipart/alternative; boundary="${altBoundary}"\n\n`;
				eml += headers;
				eml += `--${boundary}\n\n`;
			} else {
				eml += headers;
				eml += `Content-Type: multipart/alternative; boundary="${altBoundary}"\n\n`;
			}
		}
		// Body
		const hasBody = !!msg.body;
		const hasHtml = !!msg.bodyHTML;

		if (hasBody) {
			eml += `--${altBoundary}\n`;
			eml += 'Content-Type: text/plain; charset="utf-8"\n\n';
			eml += `${msg.body}\n\n`;
		}

		if (hasHtml) {
			eml += `--${altBoundary}\n`;
			eml += 'Content-Type: text/html; charset="utf-8"\n\n';
			eml += `${msg.bodyHTML}\n\n`;
		}

		if (hasBody || hasHtml) {
			eml += `--${altBoundary}--\n`;
		}

		if (msg.hasAttachments) {
			for (let i = 0; i < msg.numberOfAttachments; i++) {
				const attachment = msg.getAttachment(i);
				const attachmentStream = attachment.fileInputStream;
				if (attachmentStream) {
					const attachmentBuffer = Buffer.alloc(attachment.filesize);
					attachmentStream.readCompletely(attachmentBuffer);
					eml += `\n--${boundary}\n`;
					eml += `Content-Type: ${attachment.mimeTag}; name="${attachment.longFilename}"\n`;
					eml += `Content-Disposition: attachment; filename="${attachment.longFilename}"\n`;
					eml += 'Content-Transfer-Encoding: base64\n\n';
					eml += `${attachmentBuffer.toString('base64')}\n`;
				}
			}
			eml += `\n--${boundary}--`;
		}

		return eml;
	}

	public getUpdatedSyncState(): SyncState {
		return {};
	}
}

@@ -34,6 +34,15 @@ export function getThreadId(headers: Headers): string | undefined {
		}
	}

	const conversationIdHeader = headers.get('conversation-id');

	if (conversationIdHeader) {
		const conversationId = getHeaderValue(conversationIdHeader);
		if (conversationId) {
			return conversationId.trim();
		}
	}

	const messageIdHeader = headers.get('message-id');

	if (messageIdHeader) {
@@ -111,6 +111,10 @@
	--color-sidebar-ring: var(--sidebar-ring);
}

.link {
	@apply hover:text-primary font-medium hover:underline hover:underline-offset-2;
}

@layer base {
	* {
		@apply border-border outline-ring/50;
@@ -14,9 +14,11 @@ export const api = async (
	options: RequestInit = {}
): Promise<Response> => {
	const { accessToken } = get(authStore);
	const defaultHeaders: HeadersInit = {};

	if (!(options.body instanceof FormData)) {
		defaultHeaders['Content-Type'] = 'application/json';
	}

	if (accessToken) {
		defaultHeaders['Authorization'] = `Bearer ${accessToken}`;
@@ -6,6 +6,7 @@
		raw,
		rawHtml
	}: { raw?: Buffer | { type: 'Buffer'; data: number[] } | undefined; rawHtml?: string } = $props();

	let parsedEmail: Email | null = $state(null);
	let isLoading = $state(true);
@@ -1,6 +1,7 @@
<script lang="ts">
import { goto } from '$app/navigation';
import type { ArchivedEmail } from '@open-archiver/types';
import { ScrollArea } from '$lib/components/ui/scroll-area/index.js';

let {
thread,
@@ -12,51 +13,53 @@
</script>

<div>
<div class="relative border-l-2 border-gray-200 pl-6">
{#if thread}
{#each thread as item, i (item.id)}
<div class="mb-8">
<span
class="absolute -left-3 flex h-6 w-6 items-center justify-center rounded-full bg-gray-200 ring-8 ring-white"
>
<svg
class="h-3 w-3 text-gray-600"
fill="currentColor"
viewBox="0 0 20 20"
xmlns="http://www.w3.org/2000/svg"
><path
fill-rule="evenodd"
d="M6 2a1 1 0 00-1 1v1H4a2 2 0 00-2 2v10a2 2 0 002 2h12a2 2 0 002-2V6a2 2 0 00-2-2h-1V3a1 1 0 10-2 0v1H7V3a1 1 0 00-1-1zm0 5a1 1 0 000 2h8a1 1 0 100-2H6z"
clip-rule="evenodd"
></path></svg
<ScrollArea class="max-h-120 -ml-3 overflow-y-scroll">
<div class="relative ml-3 border-l-2 border-gray-200 pl-6">
{#if thread}
{#each thread as item, i (item.id)}
<div class="mb-8">
<span
class=" ring-sidebar absolute -left-3 flex h-6 w-6 items-center justify-center rounded-full bg-gray-200 ring-8"
>
</span>
<h4
class:font-bold={item.id === currentEmailId}
class="text-md mb-2 {item.id !== currentEmailId
? 'text-blue-500 hover:underline'
: 'text-gray-900'}"
>
{#if item.id !== currentEmailId}
<a
href="/dashboard/archived-emails/{item.id}"
onclick={(e) => {
e.preventDefault();
goto(`/dashboard/archived-emails/${item.id}`, {
invalidateAll: true
});
}}>{item.subject || 'No Subject'}</a
<svg
class="h-3 w-3 text-gray-600"
fill="currentColor"
viewBox="0 0 20 20"
xmlns="http://www.w3.org/2000/svg"
><path
fill-rule="evenodd"
d="M6 2a1 1 0 00-1 1v1H4a2 2 0 00-2 2v10a2 2 0 002 2h12a2 2 0 002-2V6a2 2 0 00-2-2h-1V3a1 1 0 10-2 0v1H7V3a1 1 0 00-1-1zm0 5a1 1 0 000 2h8a1 1 0 100-2H6z"
clip-rule="evenodd"
></path></svg
>
{:else}
{item.subject || 'No Subject'}
{/if}
</h4>
<div class="flex flex-col space-y-2 text-sm font-normal leading-none text-gray-400">
<span>From: {item.senderEmail}</span>
<time class="">{new Date(item.sentAt).toLocaleString()}</time>
</span>
<h4
class:font-bold={item.id === currentEmailId}
class="text-md mb-2 {item.id !== currentEmailId
? 'text-blue-500 hover:underline'
: 'text-gray-900'}"
>
{#if item.id !== currentEmailId}
<a
href="/dashboard/archived-emails/{item.id}"
onclick={(e) => {
e.preventDefault();
goto(`/dashboard/archived-emails/${item.id}`, {
invalidateAll: true
});
}}>{item.subject || 'No Subject'}</a
>
{:else}
{item.subject || 'No Subject'}
{/if}
</h4>
<div class="flex flex-col space-y-2 text-sm font-normal leading-none text-gray-400">
<span>From: {item.senderEmail}</span>
<time class="">{new Date(item.sentAt).toLocaleString()}</time>
</div>
</div>
</div>
{/each}
{/if}
</div>
{/each}
{/if}
</div>
</ScrollArea>
</div>

@@ -8,7 +8,9 @@
import * as Select from '$lib/components/ui/select';
import * as Alert from '$lib/components/ui/alert/index.js';
import { Textarea } from '$lib/components/ui/textarea/index.js';

import { setAlert } from '$lib/components/custom/alert/alert-state.svelte';
import { api } from '$lib/api.client';
import { Loader2 } from 'lucide-svelte';
let {
source = null,
onSubmit
@@ -20,7 +22,9 @@
const providerOptions = [
{ value: 'generic_imap', label: 'Generic IMAP' },
{ value: 'google_workspace', label: 'Google Workspace' },
{ value: 'microsoft_365', label: 'Microsoft 365' }
{ value: 'microsoft_365', label: 'Microsoft 365' },
{ value: 'pst_import', label: 'PST Import' },
{ value: 'eml_import', label: 'EML Import' }
];

let formData: CreateIngestionSourceDto = $state({
@@ -43,6 +47,8 @@

let isSubmitting = $state(false);

let fileUploading = $state(false);

const handleSubmit = async (event: Event) => {
event.preventDefault();
isSubmitting = true;
@@ -52,6 +58,45 @@
isSubmitting = false;
}
};

const handleFileChange = async (event: Event) => {
const target = event.target as HTMLInputElement;
const file = target.files?.[0];
fileUploading = true;
if (!file) {
fileUploading = false;
return;
}

const uploadFormData = new FormData();
uploadFormData.append('file', file);

try {
const response = await api('/upload', {
method: 'POST',
body: uploadFormData
});

if (!response.ok) {
throw new Error('File upload failed');
}

const result = await response.json();
formData.providerConfig.uploadedFilePath = result.filePath;
formData.providerConfig.uploadedFileName = file.name;
console.log(formData.providerConfig.uploadedFilePath);
fileUploading = false;
} catch (error) {
fileUploading = false;
setAlert({
type: 'error',
title: 'Upload Failed',
message: 'PST file upload failed. Please try again.',
duration: 5000,
show: true
});
}
};
</script>

<form onsubmit={handleSubmit} class="grid gap-4 py-4">
@@ -136,6 +181,26 @@
<Label for="secure" class="text-left">Use TLS</Label>
<Checkbox id="secure" bind:checked={formData.providerConfig.secure} />
</div>
{:else if formData.provider === 'pst_import'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="pst-file" class="text-left">PST File</Label>
<div class="col-span-3 flex flex-row items-center space-x-2">
<Input id="pst-file" type="file" class="" accept=".pst" onchange={handleFileChange} />
{#if fileUploading}
<span class=" text-primary animate-spin"><Loader2 /></span>
{/if}
</div>
</div>
{:else if formData.provider === 'eml_import'}
<div class="grid grid-cols-4 items-center gap-4">
<Label for="eml-file" class="text-left">EML File</Label>
<div class="col-span-3 flex flex-row items-center space-x-2">
<Input id="eml-file" type="file" class="" accept=".zip" onchange={handleFileChange} />
{#if fileUploading}
<span class=" text-primary animate-spin"><Loader2 /></span>
{/if}
</div>
</div>
{/if}
{#if formData.provider === 'google_workspace' || formData.provider === 'microsoft_365'}
<Alert.Root>
@@ -150,7 +215,7 @@
</Alert.Root>
{/if}
<Dialog.Footer>
<Button type="submit" disabled={isSubmitting}>
<Button type="submit" disabled={isSubmitting || fileUploading}>
{#if isSubmitting}
Submitting...
{:else}

10
packages/frontend/src/lib/components/ui/scroll-area/index.ts
Normal file
@@ -0,0 +1,10 @@
import Scrollbar from "./scroll-area-scrollbar.svelte";
import Root from "./scroll-area.svelte";

export {
Root,
Scrollbar,
//,
Root as ScrollArea,
Scrollbar as ScrollAreaScrollbar,
};
@@ -0,0 +1,31 @@
<script lang="ts">
import { ScrollArea as ScrollAreaPrimitive } from "bits-ui";
import { cn, type WithoutChild } from "$lib/utils.js";

let {
ref = $bindable(null),
class: className,
orientation = "vertical",
children,
...restProps
}: WithoutChild<ScrollAreaPrimitive.ScrollbarProps> = $props();
</script>

<ScrollAreaPrimitive.Scrollbar
bind:ref
data-slot="scroll-area-scrollbar"
{orientation}
class={cn(
"flex touch-none select-none p-px transition-colors",
orientation === "vertical" && "h-full w-2.5 border-l border-l-transparent",
orientation === "horizontal" && "h-2.5 flex-col border-t border-t-transparent",
className
)}
{...restProps}
>
{@render children?.()}
<ScrollAreaPrimitive.Thumb
data-slot="scroll-area-thumb"
class="bg-border relative flex-1 rounded-full"
/>
</ScrollAreaPrimitive.Scrollbar>
@@ -0,0 +1,40 @@
<script lang="ts">
import { ScrollArea as ScrollAreaPrimitive } from "bits-ui";
import { Scrollbar } from "./index.js";
import { cn, type WithoutChild } from "$lib/utils.js";

let {
ref = $bindable(null),
class: className,
orientation = "vertical",
scrollbarXClasses = "",
scrollbarYClasses = "",
children,
...restProps
}: WithoutChild<ScrollAreaPrimitive.RootProps> & {
orientation?: "vertical" | "horizontal" | "both" | undefined;
scrollbarXClasses?: string | undefined;
scrollbarYClasses?: string | undefined;
} = $props();
</script>

<ScrollAreaPrimitive.Root
bind:ref
data-slot="scroll-area"
class={cn("relative", className)}
{...restProps}
>
<ScrollAreaPrimitive.Viewport
data-slot="scroll-area-viewport"
class="ring-ring/10 dark:ring-ring/20 dark:outline-ring/40 outline-ring/50 size-full rounded-[inherit] transition-[color,box-shadow] focus-visible:outline-1 focus-visible:ring-4"
>
{@render children?.()}
</ScrollAreaPrimitive.Viewport>
{#if orientation === "vertical" || orientation === "both"}
<Scrollbar orientation="vertical" class={scrollbarYClasses} />
{/if}
{#if orientation === "horizontal" || orientation === "both"}
<Scrollbar orientation="horizontal" class={scrollbarXClasses} />
{/if}
<ScrollAreaPrimitive.Corner />
</ScrollAreaPrimitive.Root>
@@ -1,9 +1,27 @@
import { redirect } from '@sveltejs/kit';
import type { LayoutServerLoad } from './$types';
import 'dotenv/config';
import { api } from '$lib/server/api';

export const load: LayoutServerLoad = async (event) => {
const { locals, url } = event;
try {
const response = await api('/auth/status', event);
const { needsSetup } = await response.json();

if (needsSetup && url.pathname !== '/setup') {
throw redirect(307, '/setup');
}

if (!needsSetup && url.pathname === '/setup') {
throw redirect(307, '/signin');
}
} catch (error) {
throw error;

}


export const load: LayoutServerLoad = async ({ locals }) => {
return {
user: locals.user,
accessToken: locals.accessToken,

@@ -70,8 +70,9 @@
</Card.Header>
<Card.Content>
<div
class=" text-2xl font-bold text-green-500"
class=" text-2xl font-bold"
class:text-destructive={data.stats.failedIngestionsLast7Days > 0}
class:text-green-600={data.stats.failedIngestionsLast7Days <= 0}
>
{data.stats.failedIngestionsLast7Days}
</div>

@@ -99,28 +99,35 @@
<Table.Header>
<Table.Row>
<Table.Head>Date</Table.Head>
<Table.Head>Inbox</Table.Head>
<Table.Head>Subject</Table.Head>
<Table.Head>Sender</Table.Head>
<Table.Head>Attachments</Table.Head>
<Table.Head>Inbox</Table.Head>
<Table.Head>Path</Table.Head>
<Table.Head class="text-right">Actions</Table.Head>
</Table.Row>
</Table.Header>
<Table.Body>
<Table.Body class="text-sm">
{#if archivedEmails.items.length > 0}
{#each archivedEmails.items as email (email.id)}
<Table.Row>
<Table.Cell>{new Date(email.sentAt).toLocaleString()}</Table.Cell>
<Table.Cell>{email.userEmail}</Table.Cell>

<Table.Cell>
<div class="max-w-100 truncate">
<a href={`/dashboard/archived-emails/${email.id}`}>
<a class="link" href={`/dashboard/archived-emails/${email.id}`}>
{email.subject}
</a>
</div>
</Table.Cell>
<Table.Cell>{email.senderEmail}</Table.Cell>
<Table.Cell>{email.hasAttachments ? 'Yes' : 'No'}</Table.Cell>
<Table.Cell>
{email.senderEmail || email.senderName}
</Table.Cell>
<Table.Cell>{email.userEmail}</Table.Cell>
<Table.Cell>
{#if email.path}
<span class=" bg-muted truncate rounded p-1.5 text-xs">{email.path} </span>
{/if}
</Table.Cell>
<Table.Cell class="text-right">
<a href={`/dashboard/archived-emails/${email.id}`}>
<Button variant="outline">View</Button>

@@ -6,6 +6,7 @@
import EmailThread from '$lib/components/custom/EmailThread.svelte';
import { api } from '$lib/api.client';
import { browser } from '$app/environment';
import { formatBytes } from '$lib/utils';

let { data }: { data: PageData } = $props();
let email = $derived(data.email);
@@ -50,9 +51,38 @@
</Card.Header>
<Card.Content>
<div class="space-y-4">
<div>
<div class="space-y-1">
<h3 class="font-semibold">Recipients</h3>
<p>To: {email.recipients.map((r) => r.email).join(', ')}</p>
<Card.Description>
<p>To: {email.recipients.map((r) => r.email || r.name).join(', ')}</p>
</Card.Description>
</div>
<div class=" space-y-1">
<h3 class="font-semibold">Meta data</h3>
<Card.Description class="space-y-2">
{#if email.path}
<div class="flex flex-wrap items-center gap-2">
<span>Folder:</span>
<span class=" bg-muted truncate rounded p-1.5 text-xs"
>{email.path || '/'}</span
>
</div>
{/if}
{#if email.tags && email.tags.length > 0}
<div class="flex flex-wrap items-center gap-2">
<span> Tags: </span>
{#each email.tags as tag}
<span class=" bg-muted truncate rounded p-1.5 text-xs">{tag}</span>
{/each}
</div>
{/if}
<div class="flex flex-wrap items-center gap-2">
<span>size:</span>
<span class=" bg-muted truncate rounded p-1.5 text-xs"
>{formatBytes(email.sizeBytes)}</span
>
</div>
</Card.Description>
</div>
<div>
<h3 class="font-semibold">Email Preview</h3>

@@ -3,9 +3,10 @@
import * as Table from '$lib/components/ui/table';
import { Button } from '$lib/components/ui/button';
import * as DropdownMenu from '$lib/components/ui/dropdown-menu';
import { MoreHorizontal } from 'lucide-svelte';
import { MoreHorizontal, Trash, RefreshCw } from 'lucide-svelte';
import * as Dialog from '$lib/components/ui/dialog';
import { Switch } from '$lib/components/ui/switch';
import { Checkbox } from '$lib/components/ui/checkbox';
import IngestionSourceForm from '$lib/components/custom/IngestionSourceForm.svelte';
import { api } from '$lib/api.client';
import type { IngestionSource, CreateIngestionSourceDto } from '@open-archiver/types';
@@ -20,6 +21,8 @@
let selectedSource = $state<IngestionSource | null>(null);
let sourceToDelete = $state<IngestionSource | null>(null);
let isDeleting = $state(false);
let selectedIds = $state<string[]>([]);
let isBulkDeleteDialogOpen = $state(false);

const openCreateDialog = () => {
selectedSource = null;
@@ -125,6 +128,64 @@
}
};

const handleBulkDelete = async () => {
isDeleting = true;
try {
for (const id of selectedIds) {
const res = await api(`/ingestion-sources/${id}`, { method: 'DELETE' });
if (!res.ok) {
const errorBody = await res.json();
setAlert({
type: 'error',
title: `Failed to delete ingestion ${id}`,
message: errorBody.message || JSON.stringify(errorBody),
duration: 5000,
show: true
});
}
}
ingestionSources = ingestionSources.filter((s) => !selectedIds.includes(s.id));
selectedIds = [];
isBulkDeleteDialogOpen = false;
} finally {
isDeleting = false;
}
};

const handleBulkForceSync = async () => {
try {
for (const id of selectedIds) {
const res = await api(`/ingestion-sources/${id}/sync`, { method: 'POST' });
if (!res.ok) {
const errorBody = await res.json();
setAlert({
type: 'error',
title: `Failed to trigger force sync for ingestion ${id}`,
message: errorBody.message || JSON.stringify(errorBody),
duration: 5000,
show: true
});
}
}
const updatedSources = ingestionSources.map((s) => {
if (selectedIds.includes(s.id)) {
return { ...s, status: 'syncing' as const };
}
return s;
});
ingestionSources = updatedSources;
selectedIds = [];
} catch (e) {
setAlert({
type: 'error',
title: 'Failed to trigger force sync',
message: e instanceof Error ? e.message : JSON.stringify(e),
duration: 5000,
show: true
});
}
};

const handleFormSubmit = async (formData: CreateIngestionSourceDto) => {
try {
if (selectedSource) {
@@ -174,6 +235,8 @@
switch (status) {
case 'active':
return 'bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300';
case 'imported':
return 'bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300';
case 'paused':
return 'bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-300';
case 'error':
@@ -198,7 +261,29 @@

<div class="">
<div class="mb-4 flex items-center justify-between">
<h1 class="text-2xl font-bold">Ingestion Sources</h1>
<div class="flex items-center gap-4">
<h1 class="text-2xl font-bold">Ingestion Sources</h1>
{#if selectedIds.length > 0}
<DropdownMenu.Root>
<DropdownMenu.Trigger>
<Button variant="outline">
Bulk Actions ({selectedIds.length})
<MoreHorizontal class="ml-2 h-4 w-4" />
</Button>
</DropdownMenu.Trigger>
<DropdownMenu.Content>
<DropdownMenu.Item onclick={handleBulkForceSync}>
<RefreshCw class="mr-2 h-4 w-4" />
Force Sync
</DropdownMenu.Item>
<DropdownMenu.Item class="text-red-600" onclick={() => (isBulkDeleteDialogOpen = true)}>
<Trash class="mr-2 h-4 w-4" />
Delete
</DropdownMenu.Item>
</DropdownMenu.Content>
</DropdownMenu.Root>
{/if}
</div>
<Button onclick={openCreateDialog} disabled={data.isDemo}>Create New</Button>
</div>

@@ -206,6 +291,20 @@
<Table.Root>
<Table.Header>
<Table.Row>
<Table.Head class="w-12">
<Checkbox
onCheckedChange={(checked) => {
if (checked) {
selectedIds = ingestionSources.map((s) => s.id);
} else {
selectedIds = [];
}
}}
checked={ingestionSources.length > 0 && selectedIds.length === ingestionSources.length
? true
: ((selectedIds.length > 0 ? 'indeterminate' : false) as any)}
/>
</Table.Head>
<Table.Head>Name</Table.Head>
<Table.Head>Provider</Table.Head>
<Table.Head>Status</Table.Head>
@@ -219,7 +318,21 @@
{#each ingestionSources as source (source.id)}
<Table.Row>
<Table.Cell>
<a href="/dashboard/archived-emails?ingestionSourceId={source.id}">{source.name}</a>
<Checkbox
checked={selectedIds.includes(source.id)}
onCheckedChange={() => {
if (selectedIds.includes(source.id)) {
selectedIds = selectedIds.filter((id) => id !== source.id);
} else {
selectedIds = [...selectedIds, source.id];
}
}}
/>
</Table.Cell>
<Table.Cell>
<a class="link" href="/dashboard/archived-emails?ingestionSourceId={source.id}"
>{source.name}</a
>
</Table.Cell>
<Table.Cell class="capitalize">{source.provider.split('_').join(' ')}</Table.Cell>
<Table.Cell class="min-w-24">
@@ -276,7 +389,7 @@
{/each}
{:else}
<Table.Row>
<Table.Cell colspan={5} class="text-center">No ingestion sources found.</Table.Cell>
<Table.Cell class="h-8 text-center"></Table.Cell>
</Table.Row>
{/if}
</Table.Body>
@@ -324,3 +437,26 @@
</Dialog.Footer>
</Dialog.Content>
</Dialog.Root>

<Dialog.Root bind:open={isBulkDeleteDialogOpen}>
<Dialog.Content class="sm:max-w-lg">
<Dialog.Header>
<Dialog.Title
>Are you sure you want to delete {selectedIds.length} selected ingestions?</Dialog.Title
>
<Dialog.Description>
This will delete all archived emails, attachments, indexing, and files associated with these
ingestions. If you only want to stop syncing new emails, you can pause the ingestions
instead.
</Dialog.Description>
</Dialog.Header>
<Dialog.Footer class="sm:justify-start">
<Button type="button" variant="destructive" onclick={handleBulkDelete} disabled={isDeleting}
>{#if isDeleting}Deleting...{:else}Confirm{/if}</Button
>
<Dialog.Close>
<Button type="button" variant="secondary">Cancel</Button>
</Dialog.Close>
</Dialog.Footer>
</Dialog.Content>
</Dialog.Root>

5
packages/frontend/src/routes/setup/+page.server.ts
Normal file
@@ -0,0 +1,5 @@
import { redirect } from '@sveltejs/kit';
import type { PageServerLoad } from "./$types";


export const load = (async (event) => { }) satisfies PageServerLoad;
111
packages/frontend/src/routes/setup/+page.svelte
Normal file
@@ -0,0 +1,111 @@
<script lang="ts">
import { goto } from '$app/navigation';
import { Button } from '$lib/components/ui/button/index.js';
import * as Card from '$lib/components/ui/card';
import { Input } from '$lib/components/ui/input/index.js';
import { Label } from '$lib/components/ui/label/index.js';
import { api } from '$lib/api.client';
import { authStore } from '$lib/stores/auth.store';
import { setAlert } from '$lib/components/custom/alert/alert-state.svelte';

let first_name = '';
let last_name = '';
let email = '';
let password = '';
let isLoading = false;

async function handleSubmit() {
isLoading = true;
try {
const response = await api('/auth/setup', {
method: 'POST',
body: JSON.stringify({ first_name, last_name, email, password })
});

if (!response.ok) {
const errorData = await response.json();
throw new Error(errorData.message || 'An unknown error occurred.');
}

const { accessToken, user } = await response.json();
authStore.login(accessToken, user);
goto('/dashboard');
} catch (err: any) {
setAlert({
type: 'error',
title: 'Setup Failed',
message: err.message,
duration: 5000,
show: true
});
} finally {
isLoading = false;
}
}
</script>

<svelte:head>
<title>Setup - Open Archiver</title>
<meta name="description" content="Set up the initial administrator account for Open Archiver." />
</svelte:head>

<div
class="flex min-h-screen flex-col items-center justify-center space-y-16 bg-gray-100 dark:bg-gray-900"
>
<div>
<a
href="https://openarchiver.com/"
target="_blank"
class="flex flex-row items-center gap-2 font-bold"
>
<img src="/logos/logo-sq.svg" alt="OpenArchiver Logo" class="h-16 w-16" />
<span class="text-2xl">Open Archiver</span>
</a>
</div>
<Card.Root class="w-full max-w-md">
<Card.Header class="space-y-1">
<Card.Title class="text-2xl">Welcome</Card.Title>
<Card.Description>Create the first administrator account to get started.</Card.Description>
</Card.Header>
<Card.Content class="grid gap-4">
<form on:submit|preventDefault={handleSubmit} class="grid gap-4">
<div class="grid gap-2">
<Label for="first_name">First name</Label>
<Input
id="first_name"
type="text"
placeholder="First name"
bind:value={first_name}
required
/>
</div>
<div class="grid gap-2">
<Label for="last_name">Last name</Label>
<Input
id="last_name"
type="text"
placeholder="Last name"
bind:value={last_name}
required
/>
</div>
<div class="grid gap-2">
<Label for="email">Email</Label>
<Input id="email" type="email" placeholder="m@example.com" bind:value={email} required />
</div>
<div class="grid gap-2">
<Label for="password">Password</Label>
<Input id="password" type="password" bind:value={password} required />
</div>

<Button type="submit" class="w-full" disabled={isLoading}>
{#if isLoading}
<span>Creating Account...</span>
{:else}
<span>Create Account</span>
{/if}
</Button>
</form>
</Card.Content>
</Card.Root>
</div>
11
packages/frontend/src/routes/signin/+page.server.ts
Normal file
@@ -0,0 +1,11 @@
import { redirect } from '@sveltejs/kit';
import type { PageServerLoad } from "./$types";


export const load = (async (event) => {
const { locals } = event;
if (locals.user) {
throw redirect(307, '/dashboard');
}

}) satisfies PageServerLoad;
@@ -48,6 +48,8 @@ export interface ArchivedEmail {
attachments?: Attachment[];
raw?: Buffer;
thread?: ThreadEmail[];
path: string | null;
tags: string[] | null;
}

/**

@@ -11,9 +11,9 @@ export interface AuthTokenPayload extends JWTPayload {
*/
email: string;
/**
* The user's role, used for authorization.
* The user's assigned roles, which determines their permissions.
*/
role: User['role'];
roles: string[];
}

/**
@@ -27,5 +27,5 @@ export interface LoginResponse {
/**
* The authenticated user's information.
*/
user: Omit<User, 'passwordHash'>;
user: Omit<User, 'password'>;
}

@@ -49,6 +49,10 @@ export interface EmailObject {
eml?: Buffer;
/** The email address of the user whose mailbox this email belongs to. */
userEmail?: string;
/** The folder path of the email in the source mailbox. */
path?: string;
/** An array of tags or labels associated with the email. */
tags?: string[];
}

// Define the structure of the document to be indexed in Meilisearch

9
packages/types/src/iam.types.ts
Normal file
@@ -0,0 +1,9 @@
export type Action = string;

export type Resource = string;

export interface PolicyStatement {
	Effect: 'Allow' | 'Deny';
	Action: Action[];
	Resource: Resource[];
}
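The new `PolicyStatement` shape above is the core of the IAM feature this PR introduces. A hedged sketch of how such statements could be evaluated (the PR's actual IAM service may differ; the `evaluate` helper, the deny-overrides rule, and `'*'` wildcard handling here are assumptions for illustration):

```typescript
interface PolicyStatement {
	Effect: 'Allow' | 'Deny';
	Action: string[];
	Resource: string[];
}

// Deny-overrides evaluation: any matching Deny statement wins outright;
// otherwise at least one matching Allow is required. '*' matches anything.
function evaluate(policies: PolicyStatement[], action: string, resource: string): boolean {
	const matches = (patterns: string[], value: string) =>
		patterns.some((p) => p === '*' || p === value);
	let allowed = false;
	for (const s of policies) {
		if (matches(s.Action, action) && matches(s.Resource, resource)) {
			if (s.Effect === 'Deny') return false;
			allowed = true;
		}
	}
	return allowed;
}
```

The deny-overrides ordering means a broad `Allow *` role can still be safely restricted by a single targeted `Deny` statement.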
@@ -6,3 +6,4 @@ export * from './email.types';
export * from './archived-emails.types';
export * from './search.types';
export * from './dashboard.types';
export * from './iam.types';

@@ -15,9 +15,10 @@ export type SyncState = {
};
};
lastSyncTimestamp?: string;
statusMessage?: string;
};

export type IngestionProvider = 'google_workspace' | 'microsoft_365' | 'generic_imap';
export type IngestionProvider = 'google_workspace' | 'microsoft_365' | 'generic_imap' | 'pst_import' | 'eml_import';

export type IngestionStatus =
| 'active'
@@ -26,7 +27,8 @@ export type IngestionStatus =
| 'pending_auth'
| 'syncing'
| 'importing'
| 'auth_success';
| 'auth_success'
| 'imported';

export interface BaseIngestionCredentials {
type: IngestionProvider;
@@ -61,11 +63,25 @@ export interface Microsoft365Credentials extends BaseIngestionCredentials {
tenantId: string;
}

export interface PSTImportCredentials extends BaseIngestionCredentials {
type: 'pst_import';
uploadedFileName: string;
uploadedFilePath: string;
}

export interface EMLImportCredentials extends BaseIngestionCredentials {
type: 'eml_import';
uploadedFileName: string;
uploadedFilePath: string;
}

// Discriminated union for all possible credential types
export type IngestionCredentials =
| GenericImapCredentials
| GoogleWorkspaceCredentials
| Microsoft365Credentials;
| Microsoft365Credentials
| PSTImportCredentials
| EMLImportCredentials;
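The hunk above extends the `IngestionCredentials` discriminated union with the two new importer variants. A minimal sketch (member types abridged; the `host` field on the IMAP variant is an assumption) of how the `type` field lets TypeScript narrow the union, which is what keeps the connector dispatch in this PR type-safe:

```typescript
// Abridged stand-ins for the credential union members above.
type Creds =
	| { type: 'pst_import'; uploadedFilePath: string }
	| { type: 'eml_import'; uploadedFilePath: string }
	| { type: 'generic_imap'; host: string };

// Switching on the `type` discriminant narrows each branch, so the
// importer-only field uploadedFilePath is safely accessible there.
function describe(c: Creds): string {
	switch (c.type) {
		case 'pst_import':
		case 'eml_import':
			return `import from ${c.uploadedFilePath}`;
		case 'generic_imap':
			return `sync from ${c.host}`;
	}
}
```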

export interface IngestionSource {
id: string;
@@ -118,6 +134,12 @@ export interface IProcessMailboxJob {
userEmail: string;
}

export interface IPstProcessingJob {
ingestionSourceId: string;
filePath: string;
originalFilename: string;
}

export type MailboxUser = {
id: string;
primaryEmail: string;

@@ -44,6 +44,7 @@ export interface LocalStorageConfig {
type: 'local';
// The absolute root path on the server where the archive will be stored.
rootPath: string;
openArchiverFolderName: string;
}

/**
@@ -64,6 +65,7 @@ export interface S3StorageConfig {
region?: string;
// Force path-style addressing, required for MinIO.
forcePathStyle?: boolean;
openArchiverFolderName: string;
}

export type StorageConfig = LocalStorageConfig | S3StorageConfig;

@@ -1,26 +1,34 @@
/**
* Defines the possible roles a user can have within the system.
*/
export type UserRole = 'Super Administrator' | 'Auditor/Compliance Officer' | 'End User';
import { PolicyStatement } from './iam.types';

/**
* Represents a user account in the system.
* This is the core user object that will be stored in the database.
*/
export interface User {
/**
* The unique identifier for the user.
*/
id: string;
/**
* The user's email address, used for login.
*/
first_name: string | null;
last_name: string | null;
email: string;
/**
* The user's assigned role, which determines their permissions.
*/
role: UserRole;
/**
* The hashed password for the user. This should never be exposed to the client.
*/
passwordHash: string;
}

/**
* Represents a user's session.
* This is used to track a user's login status.
*/
export interface Session {
id: string;
userId: string;
expiresAt: Date;
}

/**
* Defines a role that can be assigned to users.
* Roles are used to group a set of permissions together.
*/
export interface Role {
id: string;
name: string;
policies: PolicyStatement[];
createdAt: Date;
updatedAt: Date;
}

466
pnpm-lock.yaml
generated
File diff suppressed because it is too large