mirror of
https://github.com/LogicLabs-OU/OpenArchiver.git
synced 2026-04-06 00:31:57 +02:00
PST ingestion
@@ -57,10 +57,7 @@ STORAGE_S3_FORCE_PATH_STYLE=false
JWT_SECRET=a-very-secret-key-that-you-should-change
JWT_EXPIRES_IN="7d"

# Admin User
# Set the credentials for the initial admin user.
ADMIN_EMAIL=admin@local.com
ADMIN_PASSWORD=a_strong_password_that_you_should_change
SUPER_API_KEY=

# Master Encryption Key for sensitive data (such as ingestion source credentials and passwords)
@@ -37,7 +37,6 @@ You must change the following placeholder values to secure your instance:

- `REDIS_PASSWORD`: A strong, unique password for the Valkey/Redis service.
- `MEILI_MASTER_KEY`: A complex key for Meilisearch.
- `JWT_SECRET`: A long, random string for signing authentication tokens.
- `ADMIN_PASSWORD`: A strong password for the initial admin user.
- `ENCRYPTION_KEY`: A 32-byte hex string for encrypting sensitive data in the database. You can generate one with the following command:

```bash
openssl rand -hex 32
```
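If you want to sanity-check a generated key, note that 32 random bytes hex-encode to exactly 64 characters (a quick check of ours, assuming `openssl` is installed):

```shell
KEY=$(openssl rand -hex 32)
# 32 bytes hex-encoded yield 64 characters, so this prints 64
echo "${#KEY}"
```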
@@ -104,14 +103,12 @@ These variables are used by `docker-compose.yml` to configure the services.

#### Security & Authentication

-| Variable         | Description                                         | Default Value                              |
-| ---------------- | --------------------------------------------------- | ------------------------------------------ |
-| `JWT_SECRET`     | A secret key for signing JWT tokens.                | `a-very-secret-key-that-you-should-change` |
-| `JWT_EXPIRES_IN` | The expiration time for JWT tokens.                 | `7d`                                       |
-| `ADMIN_EMAIL`    | The email for the initial admin user.               | `admin@local.com`                          |
-| `ADMIN_PASSWORD` | The password for the initial admin user.            | `a_strong_password_that_you_should_change` |
-| `SUPER_API_KEY`  | An API key with super admin privileges.             |                                            |
-| `ENCRYPTION_KEY` | A 32-byte hex string for encrypting sensitive data. |                                            |
+| Variable         | Description                                                         | Default Value                              |
+| ---------------- | ------------------------------------------------------------------- | ------------------------------------------ |
+| `JWT_SECRET`     | A secret key for signing JWT tokens.                                | `a-very-secret-key-that-you-should-change` |
+| `JWT_EXPIRES_IN` | The expiration time for JWT tokens.                                 | `7d`                                       |
+| `SUPER_API_KEY`  | An API key with super admin privileges.                             |                                            |
+| `ENCRYPTION_KEY` | A 32-byte hex string for encrypting sensitive data in the database. |                                            |

## 3. Run the Application
@@ -203,3 +200,99 @@ To do this, you will need to make a small modification to your `docker-compose.yml` file.

By removing these sections, you allow Coolify to automatically create and manage the necessary networks, ensuring that all services can communicate with each other and are correctly exposed through Coolify's reverse proxy.

After making these changes, you can proceed with deploying your application on Coolify as you normally would.
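For reference, the kind of `networks` configuration the passage asks you to remove typically looks like the sketch below (the service and network names here are illustrative assumptions, not copied from the project's actual `docker-compose.yml`):

```yaml
services:
  open-archiver:
    # ... other config
    networks: # per-service attachment: remove entries like this
      - app-network

networks: # top-level definition: remove blocks like this
  app-network:
    driver: bridge
```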

## Where is my data stored (when using local storage and Docker)?

If you are using local storage for your emails, then based on your `docker-compose.yml` file, your data is being stored in what's called a "named volume" (`archiver-data`). That's why you're not seeing the files in the `./data/open-archiver` directory you created.

1. **List all Docker volumes**:

    Run this command to see all the volumes on your system:

    ```bash
    docker volume ls
    ```

2. **Identify the correct volume**:

    Look through the list for a volume name that ends with `_archiver-data`. The part before that will be your project's directory name. For example, if your project is in a folder named `OpenArchiver`, the volume will be `openarchiver_archiver-data`. However, it can also be a randomly generated hash.
3. **Inspect the correct volume**:

    Once you've identified the correct volume name, use it in the `inspect` command. For example:

    ```bash
    docker volume inspect <your_volume_name_here>
    ```

    This will give you the correct `Mountpoint` path where your data is being stored. It will look something like this (the exact path will vary depending on your system):

    ```json
    {
        "CreatedAt": "2025-07-25T11:22:19Z",
        "Driver": "local",
        "Labels": {
            "com.docker.compose.config-hash": "---",
            "com.docker.compose.project": "---",
            "com.docker.compose.version": "2.38.2",
            "com.docker.compose.volume": "us8wwos0o4ok4go4gc8cog84_archiver-data"
        },
        "Mountpoint": "/var/lib/docker/volumes/us8wwos0o4ok4go4gc8cog84_archiver-data/_data",
        "Name": "us8wwos0o4ok4go4gc8cog84_archiver-data",
        "Options": null,
        "Scope": "local"
    }
    ```

    In this example, the data is located at `/var/lib/docker/volumes/us8wwos0o4ok4go4gc8cog84_archiver-data/_data`. You can then `cd` into that directory to see your files.

### To save data to a specific folder

To save the data to a specific folder on your machine, you'll need to make a change to your `docker-compose.yml`: switch from a named volume to a "bind mount".

Here's how you can do it:

1. **Edit `docker-compose.yml`**:

    Open the `docker-compose.yml` file and find the `open-archiver` service. You're going to change the `volumes` section.

    **Change this:**

    ```yaml
    services:
      open-archiver:
        # ... other config
        volumes:
          - archiver-data:/var/data/open-archiver
    ```

    **To this:**

    ```yaml
    services:
      open-archiver:
        # ... other config
        volumes:
          - ./data/open-archiver:/var/data/open-archiver
    ```

    You'll also want to remove the `archiver-data` volume definition at the bottom of the file, since it's no longer needed.
    **Remove this whole block:**

    ```yaml
    volumes:
      # ... other volumes
      archiver-data:
        driver: local
    ```

2. **Restart your containers**:

    After you've saved the changes, run the following command in your terminal to apply them. The `--force-recreate` flag ensures the container is recreated with the new volume settings.

    ```bash
    docker-compose up -d --force-recreate
    ```

    After this, any new data will be saved directly into the `./data/open-archiver` folder in your project directory.
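One extra precaution worth taking before the restart (our addition, not a step from the original guide): create the host directory up front, because a bind-mount directory that Docker has to create on your behalf ends up owned by root.

```shell
# Pre-create the bind-mount target so it is owned by your user rather than root
mkdir -p ./data/open-archiver
test -d ./data/open-archiver && echo "bind mount target ready"
```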
@@ -27,6 +27,7 @@
"axios": "^1.10.0",
"bcryptjs": "^3.0.2",
"bullmq": "^5.56.3",
"busboy": "^1.6.0",
"cross-fetch": "^4.1.0",
"deepmerge-ts": "^7.1.5",
"dotenv": "^17.2.0",
@@ -42,11 +43,13 @@
"mailparser": "^3.7.4",
"mammoth": "^1.9.1",
"meilisearch": "^0.51.0",
"multer": "^2.0.2",
"pdf2json": "^3.1.6",
"pg": "^8.16.3",
"pino": "^9.7.0",
"pino-pretty": "^13.0.0",
"postgres": "^3.4.7",
"pst-extractor": "^1.11.0",
"reflect-metadata": "^0.2.2",
"sqlite3": "^5.1.7",
"tsconfig-paths": "^4.2.0",
@@ -55,9 +58,11 @@
"devDependencies": {
"@bull-board/api": "^6.11.0",
"@bull-board/express": "^6.11.0",
"@types/busboy": "^1.5.4",
"@types/express": "^5.0.3",
"@types/mailparser": "^3.4.6",
"@types/microsoft-graph": "^2.40.1",
"@types/multer": "^2.0.0",
"@types/node": "^24.0.12",
"bull-board": "^2.1.3",
"ts-node-dev": "^2.0.0",
@@ -4,6 +4,8 @@ import { UserService } from '../../services/UserService';
import { db } from '../../database';
import * as schema from '../../database/schema';
import { sql } from 'drizzle-orm';
import 'dotenv/config';

export class AuthController {
    #authService: AuthService;
@@ -66,9 +68,22 @@ export class AuthController {

    public status = async (req: Request, res: Response): Promise<Response> => {
        try {
            const userCountResult = await db.select({ count: sql<number>`count(*)` }).from(schema.users);
            const userCount = Number(userCountResult[0].count);
            const needsSetup = userCount === 0;
            // In case the user runs an older version with admin user variables, create the admin user from those variables.
            if (needsSetup && process.env.ADMIN_EMAIL && process.env.ADMIN_PASSWORD) {
                await this.#userService.createAdminUser({
                    email: process.env.ADMIN_EMAIL,
                    password: process.env.ADMIN_PASSWORD,
                    first_name: "Admin",
                    last_name: "User"
                }, true);
                return res.status(200).json({ needsSetup: false });
            }
            return res.status(200).json({ needsSetup });
        } catch (error) {
            console.error('Status check error:', error);
24
packages/backend/src/api/controllers/upload.controller.ts
Normal file
@@ -0,0 +1,24 @@
import { Request, Response } from 'express';
import { StorageService } from '../../services/StorageService';
import { randomUUID } from 'crypto';
import busboy from 'busboy';

export const uploadFile = async (req: Request, res: Response) => {
    const storage = new StorageService();
    const bb = busboy({ headers: req.headers });
    let filePath = '';
    let originalFilename = '';

    bb.on('file', (fieldname, file, filename) => {
        originalFilename = filename.filename;
        const uuid = randomUUID();
        filePath = `temp/${uuid}-${originalFilename}`;
        storage.put(filePath, file);
    });

    bb.on('finish', () => {
        res.json({ filePath });
    });

    req.pipe(bb);
};
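The `temp/<uuid>-<filename>` path scheme used by the controller is easy to verify in isolation. A minimal sketch (the `buildTempPath` helper is our own naming; the controller inlines this logic):

```typescript
import { randomUUID } from 'crypto';

// Hypothetical helper mirroring the controller's `temp/<uuid>-<original name>` scheme.
function buildTempPath(originalFilename: string, uuid: string = randomUUID()): string {
    return `temp/${uuid}-${originalFilename}`;
}

// With a fixed UUID the result is deterministic:
console.log(buildTempPath('mailbox.pst', '123e4567-e89b-42d3-a456-426614174000'));
// temp/123e4567-e89b-42d3-a456-426614174000-mailbox.pst
```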
@@ -1,5 +1,5 @@
import type { Request, Response, NextFunction } from 'express';
-import type { IAuthService } from '../../services/AuthService';
+import type { AuthService } from '../../services/AuthService';
import type { AuthTokenPayload } from '@open-archiver/types';
import 'dotenv/config';
// By using module augmentation, we can add our custom 'user' property
@@ -12,7 +12,7 @@ declare global {
    }
}

-export const requireAuth = (authService: IAuthService) => {
+export const requireAuth = (authService: AuthService) => {
    return async (req: Request, res: Response, next: NextFunction) => {
        const authHeader = req.headers.authorization;
        if (!authHeader || !authHeader.startsWith('Bearer ')) {
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { ArchivedEmailController } from '../controllers/archived-email.controller';
import { requireAuth } from '../middleware/requireAuth';
-import { IAuthService } from '../../services/AuthService';
+import { AuthService } from '../../services/AuthService';

export const createArchivedEmailRouter = (
    archivedEmailController: ArchivedEmailController,
-    authService: IAuthService
+    authService: AuthService
): Router => {
    const router = Router();
@@ -1,9 +1,9 @@
import { Router } from 'express';
import { dashboardController } from '../controllers/dashboard.controller';
import { requireAuth } from '../middleware/requireAuth';
-import { IAuthService } from '../../services/AuthService';
+import { AuthService } from '../../services/AuthService';

-export const createDashboardRouter = (authService: IAuthService): Router => {
+export const createDashboardRouter = (authService: AuthService): Router => {
    const router = Router();

    router.use(requireAuth(authService));
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { IngestionController } from '../controllers/ingestion.controller';
import { requireAuth } from '../middleware/requireAuth';
-import { IAuthService } from '../../services/AuthService';
+import { AuthService } from '../../services/AuthService';

export const createIngestionRouter = (
    ingestionController: IngestionController,
-    authService: IAuthService
+    authService: AuthService
): Router => {
    const router = Router();
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { SearchController } from '../controllers/search.controller';
import { requireAuth } from '../middleware/requireAuth';
-import { IAuthService } from '../../services/AuthService';
+import { AuthService } from '../../services/AuthService';

export const createSearchRouter = (
    searchController: SearchController,
-    authService: IAuthService
+    authService: AuthService
): Router => {
    const router = Router();
@@ -1,11 +1,11 @@
import { Router } from 'express';
import { StorageController } from '../controllers/storage.controller';
import { requireAuth } from '../middleware/requireAuth';
-import { IAuthService } from '../../services/AuthService';
+import { AuthService } from '../../services/AuthService';

export const createStorageRouter = (
    storageController: StorageController,
-    authService: IAuthService
+    authService: AuthService
): Router => {
    const router = Router();
14
packages/backend/src/api/routes/upload.routes.ts
Normal file
@@ -0,0 +1,14 @@
import { Router } from 'express';
import { uploadFile } from '../controllers/upload.controller';
import { requireAuth } from '../middleware/requireAuth';
import { AuthService } from '../../services/AuthService';

export const createUploadRouter = (authService: AuthService): Router => {
    const router = Router();

    router.use(requireAuth(authService));

    router.post('/', uploadFile);

    return router;
};
@@ -0,0 +1,2 @@
ALTER TYPE "public"."ingestion_provider" ADD VALUE 'pst_import';--> statement-breakpoint
ALTER TYPE "public"."ingestion_status" ADD VALUE 'imported';
1095
packages/backend/src/database/migrations/meta/0012_snapshot.json
Normal file
File diff suppressed because it is too large.
@@ -85,6 +85,13 @@
        "when": 1754422064158,
        "tag": "0011_tan_blackheart",
        "breakpoints": true
    },
    {
        "idx": 12,
        "version": "7",
        "when": 1754476962901,
        "tag": "0012_warm_the_stranger",
        "breakpoints": true
    }
]
}
@@ -3,7 +3,8 @@ import { jsonb, pgEnum, pgTable, text, timestamp, uuid } from 'drizzle-orm/pg-core';

export const ingestionProviderEnum = pgEnum('ingestion_provider', [
    'google_workspace',
    'microsoft_365',
-    'generic_imap'
+    'generic_imap',
+    'pst_import'
]);

export const ingestionStatusEnum = pgEnum('ingestion_status', [
@@ -13,7 +14,8 @@ export const ingestionStatusEnum = pgEnum('ingestion_status', [
    'pending_auth',
    'syncing',
    'importing',
-    'auth_success'
+    'auth_success',
+    'imported'
]);

export const ingestionSources = pgTable('ingestion_sources', {
@@ -14,6 +14,7 @@ import { createArchivedEmailRouter } from './api/routes/archived-email.routes';
import { createStorageRouter } from './api/routes/storage.routes';
import { createSearchRouter } from './api/routes/search.routes';
import { createDashboardRouter } from './api/routes/dashboard.routes';
+import { createUploadRouter } from './api/routes/upload.routes';
import testRouter from './api/routes/test.routes';
import { AuthService } from './services/AuthService';
import { UserService } from './services/UserService';
@@ -55,9 +56,6 @@ const iamController = new IamController(iamService);

// --- Express App Initialization ---
const app = express();

-// Middleware
-app.use(express.json()); // For parsing application/json
-
// --- Routes ---
const authRouter = createAuthRouter(authController);
const ingestionRouter = createIngestionRouter(ingestionController, authService);
@@ -66,6 +64,14 @@ const storageRouter = createStorageRouter(storageController, authService);
const searchRouter = createSearchRouter(searchController, authService);
const dashboardRouter = createDashboardRouter(authService);
const iamRouter = createIamRouter(iamController);
+const uploadRouter = createUploadRouter(authService);
+// The upload route is registered before the middleware because it doesn't use the JSON middleware.
+app.use('/v1/upload', uploadRouter);
+
+// Middleware for all other routes
+app.use(express.json());
+app.use(express.urlencoded({ extended: true }));
+
app.use('/v1/auth', authRouter);
app.use('/v1/iam', iamRouter);
app.use('/v1/ingestion-sources', ingestionRouter);
@@ -67,9 +67,10 @@ export default async (job: Job<IInitialImportJob>) => {
            }
        });
    } else {
+       const finalStatus = source.provider === 'pst_import' ? 'imported' : 'active';
        // If there are no users, we can consider the import finished and set to active
        await IngestionService.update(ingestionSourceId, {
-           status: 'active',
+           status: finalStatus,
            lastSyncFinishedAt: new Date(),
            lastSyncStatusMessage: 'Initial import complete. No users found.'
        });
@@ -1,7 +1,7 @@
import { Job } from 'bullmq';
import { IngestionService } from '../../services/IngestionService';
import { logger } from '../../config/logger';
-import { SyncState, ProcessMailboxError } from '@open-archiver/types';
+import { SyncState, ProcessMailboxError, IngestionStatus } from '@open-archiver/types';
import { db } from '../../database';
import { ingestionSources } from '../../database/schema';
import { eq } from 'drizzle-orm';
@@ -41,7 +41,11 @@ export default async (job: Job<ISyncCycleFinishedJob, any, string>) => {

    const finalSyncState = deepmerge(...successfulJobs.filter(s => s && Object.keys(s).length > 0));

-   let status: 'active' | 'error' = 'active';
+   const source = await IngestionService.findById(ingestionSourceId);
+   let status: IngestionStatus = 'active';
+   if (source.provider === 'pst_import') {
+       status = 'imported';
+   }
    let message: string;

    if (failedJobs.length > 0) {
@@ -1,4 +1,4 @@
-import { count, desc, eq, asc } from 'drizzle-orm';
+import { count, desc, eq, asc, and } from 'drizzle-orm';
import { db } from '../database';
import { archivedEmails, attachments, emailAttachments } from '../database/schema';
import type { PaginatedArchivedEmails, ArchivedEmail, Recipient, ThreadEmail } from '@open-archiver/types';
@@ -81,7 +81,10 @@ export class ArchivedEmailService {

    if (email.threadId) {
        threadEmails = await db.query.archivedEmails.findMany({
-           where: eq(archivedEmails.threadId, email.threadId),
+           where: and(
+               eq(archivedEmails.threadId, email.threadId),
+               eq(archivedEmails.ingestionSourceId, email.ingestionSourceId)
+           ),
            orderBy: [asc(archivedEmails.sentAt)],
            columns: {
                id: true,
@@ -3,6 +3,7 @@ import type {
    GoogleWorkspaceCredentials,
    Microsoft365Credentials,
    GenericImapCredentials,
+   PSTImportCredentials,
    EmailObject,
    SyncState,
    MailboxUser
@@ -10,6 +11,7 @@ import type {
import { GoogleWorkspaceConnector } from './ingestion-connectors/GoogleWorkspaceConnector';
import { MicrosoftConnector } from './ingestion-connectors/MicrosoftConnector';
import { ImapConnector } from './ingestion-connectors/ImapConnector';
+import { PSTConnector } from './ingestion-connectors/PSTConnector';

// Define a common interface for all connectors
export interface IEmailConnector {
@@ -32,6 +34,8 @@ export class EmailProviderFactory {
            return new MicrosoftConnector(credentials as Microsoft365Credentials);
        case 'generic_imap':
            return new ImapConnector(credentials as GenericImapCredentials);
+       case 'pst_import':
+           return new PSTConnector(credentials as PSTImportCredentials);
        default:
            throw new Error(`Unsupported provider: ${source.provider}`);
    }
@@ -37,7 +37,7 @@ export class IngestionService {

    public static async create(dto: CreateIngestionSourceDto): Promise<IngestionSource> {
        const { providerConfig, ...rest } = dto;
        console.log(providerConfig);
        const encryptedCredentials = CryptoService.encryptObject(providerConfig);

        const valuesToInsert = {
@@ -31,9 +31,10 @@ export class UserService {
    }

    /**
-    * Creates an admin user in the database.
-    * The user created will be assigned the 'Super Admin' role.
+    * Creates an admin user in the database. The user created will be assigned the 'Super Admin' role.
     *
+    * Caution ⚠️: This action is only allowed during the initial setup.
     *
     * @param userDetails The details of the user to create.
     * @param isSetup Is this an initial setup?
     * @returns The newly created user object.
||||
@@ -10,7 +10,7 @@ import type {
|
||||
import type { IEmailConnector } from '../EmailProviderFactory';
|
||||
import { logger } from '../../config/logger';
|
||||
import { simpleParser, ParsedMail, Attachment, AddressObject, Headers } from 'mailparser';
|
||||
import { getThreadId } from './utils';
|
||||
import { getThreadId } from './helpers/utils';
|
||||
|
||||
/**
|
||||
* A connector for Google Workspace that uses a service account with domain-wide delegation
|
||||
|
||||
@@ -3,7 +3,7 @@ import type { IEmailConnector } from '../EmailProviderFactory';
import { ImapFlow } from 'imapflow';
import { simpleParser, ParsedMail, Attachment, AddressObject, Headers } from 'mailparser';
import { logger } from '../../config/logger';
-import { getThreadId } from './utils';
+import { getThreadId } from './helpers/utils';

export class ImapConnector implements IEmailConnector {
    private client: ImapFlow;
@@ -0,0 +1,330 @@
import type { PSTImportCredentials, EmailObject, EmailAddress, SyncState, MailboxUser } from '@open-archiver/types';
import type { IEmailConnector } from '../EmailProviderFactory';
import { PSTFile, PSTFolder, PSTMessage } from 'pst-extractor';
import { simpleParser, ParsedMail, Attachment, AddressObject } from 'mailparser';
import { logger } from '../../config/logger';
import { getThreadId } from './helpers/utils';
import { StorageService } from '../StorageService';
import { Readable } from 'stream';
import { createHash } from 'crypto';

const streamToBuffer = (stream: Readable): Promise<Buffer> => {
    return new Promise((resolve, reject) => {
        const chunks: Buffer[] = [];
        stream.on('data', (chunk) => chunks.push(chunk));
        stream.on('error', reject);
        stream.on('end', () => resolve(Buffer.concat(chunks)));
    });
};

// We have to hard-code names for deleted and trash folders here, as the current library doesn't support looking into PST properties.
const DELETED_FOLDERS = new Set([
    // English
    'deleted items', 'trash',
    // Spanish
    'elementos eliminados', 'papelera',
    // French
    'éléments supprimés', 'corbeille',
    // German
    'gelöschte elemente', 'papierkorb',
    // Italian
    'posta eliminata', 'cestino',
    // Portuguese
    'itens excluídos', 'lixo',
    // Dutch
    'verwijderde items', 'prullenbak',
    // Russian
    'удаленные', 'корзина',
    // Polish
    'usunięte elementy', 'kosz',
    // Japanese
    '削除済みアイテム',
    // Czech
    'odstraněná pošta', 'koš',
    // Estonian
    'kustutatud kirjad', 'prügikast',
    // Swedish
    'borttagna objekt', 'skräp',
    // Danish
    'slettet post', 'papirkurv',
    // Norwegian
    'slettede elementer',
    // Finnish
    'poistetut', 'roskakori'
]);

const JUNK_FOLDERS = new Set([
    // English
    'junk email', 'spam',
    // Spanish
    'correo no deseado',
    // French
    'courrier indésirable',
    // German
    'junk-e-mail',
    // Italian
    'posta indesiderata',
    // Portuguese
    'lixo eletrônico',
    // Dutch
    'ongewenste e-mail',
    // Russian
    'нежелательная почта', 'спам',
    // Polish
    'wiadomości-śmieci',
    // Japanese
    '迷惑メール', 'スパム',
    // Czech
    'nevyžádaná pošta',
    // Estonian
    'rämpspost',
    // Swedish
    'skräppost',
    // Danish
    'uønsket post',
    // Norwegian
    'søppelpost',
    // Finnish
    'roskaposti'
]);

export class PSTConnector implements IEmailConnector {
    private storage: StorageService;
    private pstFile: PSTFile | null = null;

    constructor(private credentials: PSTImportCredentials) {
        this.storage = new StorageService();
    }

    private async loadPstFile(): Promise<PSTFile> {
        if (this.pstFile) {
            return this.pstFile;
        }
        const fileStream = await this.storage.get(this.credentials.uploadedFilePath);
        const buffer = await streamToBuffer(fileStream as Readable);
        this.pstFile = new PSTFile(buffer);
        return this.pstFile;
    }

    public async testConnection(): Promise<boolean> {
        try {
            if (!this.credentials.uploadedFilePath) {
                throw Error("PST file path not provided.");
            }
            if (!this.credentials.uploadedFilePath.includes('.pst')) {
                throw Error("Provided file is not in the PST format.");
            }
            return true;
        } catch (error) {
            logger.error({ error, credentials: this.credentials }, 'PST file validation failed.');
            throw error;
        }
    }

    /**
     * Lists mailboxes within the PST. It treats each top-level folder
     * as a distinct mailbox, allowing it to handle PSTs that have been
     * consolidated from multiple sources.
     */
    public async *listAllUsers(): AsyncGenerator<MailboxUser> {
        let pstFile: PSTFile | null = null;
        try {
            pstFile = await this.loadPstFile();
            const root = pstFile.getRootFolder();
            const displayName = root.displayName || pstFile.pstFilename;
            logger.info(`Found potential mailbox: ${displayName}`);
            const constructedPrimaryEmail = `${displayName.replace(/ /g, '.').toLowerCase()}@pst.local`;
            yield {
                id: constructedPrimaryEmail,
                // We will address the primaryEmail problem in the next section.
                primaryEmail: constructedPrimaryEmail,
                displayName: displayName,
            };
        } catch (error) {
            logger.error({ error }, 'Failed to list users from PST file using top-level folder strategy.');
            throw error;
        } finally {
            pstFile?.close();
        }
    }

    public async *fetchEmails(userEmail: string, syncState?: SyncState | null): AsyncGenerator<EmailObject | null> {
        let pstFile: PSTFile | null = null;
        try {
            pstFile = await this.loadPstFile();
            const root = pstFile.getRootFolder();
            yield* this.processFolder(root);
        } catch (error) {
            logger.error({ error }, 'Failed to fetch emails from PST file.');
            throw error;
        } finally {
            pstFile?.close();
        }
    }

    private async *processFolder(folder: PSTFolder): AsyncGenerator<EmailObject | null> {
        const folderName = folder.displayName.toLowerCase();
        if (DELETED_FOLDERS.has(folderName) || JUNK_FOLDERS.has(folderName)) {
            logger.info(`Skipping folder: ${folder.displayName}`);
            return;
        }

        if (folder.contentCount > 0) {
            let email: PSTMessage | null = folder.getNextChild();
            while (email != null) {
                yield await this.parseMessage(email);
                try {
                    email = folder.getNextChild();
                } catch (error) {
                    console.warn("Folder doesn't have a next child");
                    email = null;
                }
            }
        }

        if (folder.hasSubfolders) {
            for (const subFolder of folder.getSubFolders()) {
                yield* this.processFolder(subFolder);
            }
        }
    }

    private async parseMessage(msg: PSTMessage): Promise<EmailObject> {
        const emlContent = await this.constructEml(msg);
        const emlBuffer = Buffer.from(emlContent, 'utf-8');
        const parsedEmail: ParsedMail = await simpleParser(emlBuffer);

        const attachments = parsedEmail.attachments.map((attachment: Attachment) => ({
            filename: attachment.filename || 'untitled',
            contentType: attachment.contentType,
            size: attachment.size,
            content: attachment.content as Buffer
        }));

        const mapAddresses = (addresses: AddressObject | AddressObject[] | undefined): EmailAddress[] => {
            if (!addresses) return [];
            const addressArray = Array.isArray(addresses) ? addresses : [addresses];
            return addressArray.flatMap(a => a.value.map(v => ({ name: v.name, address: v.address?.replaceAll(`'`, '') || '' })));
        };

        const threadId = getThreadId(parsedEmail.headers);
        let messageId = msg.internetMessageId;
        // If the message has no Internet Message-ID, generate a unique ID for it.
        if (!messageId) {
            messageId = `generated-${createHash('sha256').update(emlBuffer ?? Buffer.from(parsedEmail.text || parsedEmail.html || '', 'utf-8')).digest('hex')}-${createHash('sha256').update(emlBuffer ?? Buffer.from(msg.subject || '', 'utf-8')).digest('hex')}-${msg.clientSubmitTime?.getTime()}`;
        }
        return {
            id: messageId,
            threadId: threadId,
            from: mapAddresses(parsedEmail.from),
            to: mapAddresses(parsedEmail.to),
            cc: mapAddresses(parsedEmail.cc),
            bcc: mapAddresses(parsedEmail.bcc),
            subject: parsedEmail.subject || '',
            body: parsedEmail.text || '',
            html: parsedEmail.html || '',
            headers: parsedEmail.headers,
            attachments,
            receivedAt: parsedEmail.date || new Date(),
            eml: emlBuffer
        };
    }

    private async constructEml(msg: PSTMessage): Promise<string> {
        let eml = '';
        const boundary = '----boundary-openarchiver';
        const altBoundary = '----boundary-openarchiver_alt';

        let headers = '';

        if (msg.senderName || msg.senderEmailAddress) {
            headers += `From: ${msg.senderName} <${msg.senderEmailAddress}>\n`;
        }
        if (msg.displayTo) {
            headers += `To: ${msg.displayTo}\n`;
        }
        if (msg.displayCC) {
            headers += `Cc: ${msg.displayCC}\n`;
        }
        if (msg.displayBCC) {
            headers += `Bcc: ${msg.displayBCC}\n`;
        }
        if (msg.subject) {
            headers += `Subject: ${msg.subject}\n`;
        }
        if (msg.clientSubmitTime) {
            headers += `Date: ${new Date(msg.clientSubmitTime).toUTCString()}\n`;
        }
        if (msg.internetMessageId) {
            headers += `Message-ID: <${msg.internetMessageId}>\n`;
        }
        if (msg.inReplyToId) {
            headers += `In-Reply-To: ${msg.inReplyToId}\n`;
        }
        if (msg.conversationId) {
            headers += `Conversation-Id: ${msg.conversationId}\n`;
        }
        headers += 'MIME-Version: 1.0\n';

        console.log("headers", headers);
        // Add content-type headers if none are present yet.
        if (!/Content-Type:/i.test(headers)) {
            if (msg.hasAttachments) {
                headers += `Content-Type: multipart/mixed; boundary="${boundary}"\n`;
                headers += `Content-Type: multipart/alternative; boundary="${altBoundary}"\n\n`;
                eml += headers;
                eml += `--${boundary}\n\n`;
            } else {
                eml += headers;
                eml += `Content-Type: multipart/alternative; boundary="${altBoundary}"\n\n`;
            }
        }
        // Body
        const hasBody = !!msg.body;
        const hasHtml = !!msg.bodyHTML;

        if (hasBody) {
            eml += `--${altBoundary}\n`;
            eml += 'Content-Type: text/plain; charset="utf-8"\n\n';
            eml += `${msg.body}\n\n`;
        }

        if (hasHtml) {
            eml += `--${altBoundary}\n`;
            eml += 'Content-Type: text/html; charset="utf-8"\n\n';
            eml += `${msg.bodyHTML}\n\n`;
        }

        if (hasBody || hasHtml) {
            eml += `--${altBoundary}--\n`;
        }

        if (msg.hasAttachments) {
            for (let i = 0; i < msg.numberOfAttachments; i++) {
                const attachment = msg.getAttachment(i);
                const attachmentStream = attachment.fileInputStream;
                if (attachmentStream) {
                    const attachmentBuffer = Buffer.alloc(attachment.filesize);
|
||||
attachmentStream.readCompletely(attachmentBuffer);
|
||||
eml += `\n--${boundary}\n`;
|
||||
eml += `Content-Type: ${attachment.mimeTag}; name="${attachment.longFilename}"\n`;
|
||||
eml += `Content-Disposition: attachment; filename="${attachment.longFilename}"\n`;
|
||||
eml += 'Content-Transfer-Encoding: base64\n\n';
|
||||
eml += `${attachmentBuffer.toString('base64')}\n`;
|
||||
}
|
||||
}
|
||||
eml += `\n--${boundary}--`;
|
||||
}
|
||||
|
||||
return eml;
|
||||
}
|
||||
|
||||
public getUpdatedSyncState(): SyncState {
|
||||
return {};
|
||||
}
|
||||
}
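For reference, a hedged sketch of the nested multipart layout a well-formed EML with a text body, an HTML body, and one attachment uses. The boundary names match the constants above; the addresses and payloads are invented for illustration and are not part of the connector.

```typescript
// Hypothetical example: multipart/mixed wraps a multipart/alternative part
// (text + HTML renditions) followed by a base64-encoded attachment part.
const boundary = '----boundary-openarchiver';
const altBoundary = '----boundary-openarchiver_alt';

const eml = [
	'From: Alice <alice@example.com>',
	'MIME-Version: 1.0',
	`Content-Type: multipart/mixed; boundary="${boundary}"`,
	'',
	`--${boundary}`,
	`Content-Type: multipart/alternative; boundary="${altBoundary}"`,
	'',
	`--${altBoundary}`,
	'Content-Type: text/plain; charset="utf-8"',
	'',
	'Hello',
	'',
	`--${altBoundary}`,
	'Content-Type: text/html; charset="utf-8"',
	'',
	'<p>Hello</p>',
	'',
	`--${altBoundary}--`,
	'',
	`--${boundary}`,
	'Content-Type: application/pdf; name="report.pdf"',
	'Content-Disposition: attachment; filename="report.pdf"',
	'Content-Transfer-Encoding: base64',
	'',
	Buffer.from('%PDF-1.4 ...').toString('base64'),
	'',
	`--${boundary}--`
].join('\n');

// The alternative part closes before the attachment part opens.
console.log(eml.indexOf(`--${altBoundary}--`) < eml.indexOf('Content-Disposition')); // → true
```

Note the closing `--${altBoundary}--` delimiter: parsers such as mailparser treat the alternative section as one child of the mixed container only when it is terminated before the next top-level boundary.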
|
||||
@@ -34,6 +34,15 @@ export function getThreadId(headers: Headers): string | undefined {
 		}
 	}
 
+	const conversationIdHeader = headers.get('conversation-id');
+
+	if (conversationIdHeader) {
+		const conversationId = getHeaderValue(conversationIdHeader);
+		if (conversationId) {
+			return conversationId.trim();
+		}
+	}
+
 	const messageIdHeader = headers.get('message-id');
 
 	if (messageIdHeader) {
@@ -14,9 +14,11 @@ export const api = async (
 	options: RequestInit = {}
 ): Promise<Response> => {
 	const { accessToken } = get(authStore);
-	const defaultHeaders: HeadersInit = {
-		'Content-Type': 'application/json'
-	};
+	const defaultHeaders: HeadersInit = {};
+
+	if (!(options.body instanceof FormData)) {
+		defaultHeaders['Content-Type'] = 'application/json';
+	}
 
 	if (accessToken) {
 		defaultHeaders['Authorization'] = `Bearer ${accessToken}`;
@@ -6,6 +6,7 @@
	raw,
	rawHtml
}: { raw?: Buffer | { type: 'Buffer'; data: number[] } | undefined; rawHtml?: string } = $props();

let parsedEmail: Email | null = $state(null);
let isLoading = $state(true);
@@ -19,7 +19,7 @@
 {#each thread as item, i (item.id)}
 	<div class="mb-8">
 		<span
-			class="absolute -left-3 flex h-6 w-6 items-center justify-center rounded-full bg-gray-200 ring-8 ring-white"
+			class=" ring-sidebar absolute -left-3 flex h-6 w-6 items-center justify-center rounded-full bg-gray-200 ring-8"
 		>
 			<svg
 				class="h-3 w-3 text-gray-600"
@@ -8,7 +8,9 @@
 	import * as Select from '$lib/components/ui/select';
 	import * as Alert from '$lib/components/ui/alert/index.js';
+	import { Textarea } from '$lib/components/ui/textarea/index.js';
 	import { setAlert } from '$lib/components/custom/alert/alert-state.svelte';
 	import { api } from '$lib/api.client';
+	import { Loader2 } from 'lucide-svelte';
 	let {
 		source = null,
 		onSubmit
@@ -20,7 +22,8 @@
 	const providerOptions = [
 		{ value: 'generic_imap', label: 'Generic IMAP' },
 		{ value: 'google_workspace', label: 'Google Workspace' },
-		{ value: 'microsoft_365', label: 'Microsoft 365' }
+		{ value: 'microsoft_365', label: 'Microsoft 365' },
+		{ value: 'pst_import', label: 'PST Import' }
 	];
 
 	let formData: CreateIngestionSourceDto = $state({
@@ -43,6 +46,8 @@
 
 	let isSubmitting = $state(false);
 
+	let fileUploading = $state(false);
+
 	const handleSubmit = async (event: Event) => {
 		event.preventDefault();
 		isSubmitting = true;
@@ -52,6 +57,45 @@
 			isSubmitting = false;
 		}
 	};
+
+	const handleFileChange = async (event: Event) => {
+		const target = event.target as HTMLInputElement;
+		const file = target.files?.[0];
+		fileUploading = true;
+		if (!file) {
+			fileUploading = false;
+			return;
+		}
+
+		const uploadFormData = new FormData();
+		uploadFormData.append('file', file);
+
+		try {
+			const response = await api('/upload', {
+				method: 'POST',
+				body: uploadFormData
+			});
+
+			if (!response.ok) {
+				throw new Error('File upload failed');
+			}
+
+			const result = await response.json();
+			formData.providerConfig.uploadedFilePath = result.filePath;
+			formData.providerConfig.uploadedFileName = file.name;
+			console.log(formData.providerConfig.uploadedFilePath);
+			fileUploading = false;
+		} catch (error) {
+			fileUploading = false;
+			setAlert({
+				type: 'error',
+				title: 'Upload Failed',
+				message: 'PST file upload failed. Please try again.',
+				duration: 5000,
+				show: true
+			});
+		}
+	};
 </script>
 
 <form onsubmit={handleSubmit} class="grid gap-4 py-4">
@@ -136,6 +180,16 @@
 			<Label for="secure" class="text-left">Use TLS</Label>
 			<Checkbox id="secure" bind:checked={formData.providerConfig.secure} />
 		</div>
+	{:else if formData.provider === 'pst_import'}
+		<div class="grid grid-cols-4 items-center gap-4">
+			<Label for="pst-file" class="text-left">PST File</Label>
+			<div class="col-span-3 flex flex-row items-center space-x-2">
+				<Input id="pst-file" type="file" class="" accept=".pst" onchange={handleFileChange} />
+				{#if fileUploading}
+					<span class=" text-primary animate-spin"><Loader2 /></span>
+				{/if}
+			</div>
+		</div>
 	{/if}
 	{#if formData.provider === 'google_workspace' || formData.provider === 'microsoft_365'}
 		<Alert.Root>
@@ -150,7 +204,7 @@
 	</Alert.Root>
 {/if}
 <Dialog.Footer>
-	<Button type="submit" disabled={isSubmitting}>
+	<Button type="submit" disabled={isSubmitting || fileUploading}>
 		{#if isSubmitting}
 			Submitting...
 		{:else}
@@ -70,8 +70,9 @@
 	</Card.Header>
 	<Card.Content>
 		<div
-			class=" text-2xl font-bold text-green-500"
+			class=" text-2xl font-bold"
+			class:text-destructive={data.stats.failedIngestionsLast7Days > 0}
+			class:text-green-600={data.stats.failedIngestionsLast7Days <= 0}
 		>
 			{data.stats.failedIngestionsLast7Days}
 		</div>
 
@@ -3,9 +3,10 @@
 	import * as Table from '$lib/components/ui/table';
 	import { Button } from '$lib/components/ui/button';
 	import * as DropdownMenu from '$lib/components/ui/dropdown-menu';
-	import { MoreHorizontal } from 'lucide-svelte';
+	import { MoreHorizontal, Trash, RefreshCw } from 'lucide-svelte';
 	import * as Dialog from '$lib/components/ui/dialog';
 	import { Switch } from '$lib/components/ui/switch';
+	import { Checkbox } from '$lib/components/ui/checkbox';
 	import IngestionSourceForm from '$lib/components/custom/IngestionSourceForm.svelte';
 	import { api } from '$lib/api.client';
 	import type { IngestionSource, CreateIngestionSourceDto } from '@open-archiver/types';
@@ -20,6 +21,8 @@
 	let selectedSource = $state<IngestionSource | null>(null);
 	let sourceToDelete = $state<IngestionSource | null>(null);
 	let isDeleting = $state(false);
+	let selectedIds = $state<string[]>([]);
+	let isBulkDeleteDialogOpen = $state(false);
 
 	const openCreateDialog = () => {
 		selectedSource = null;
@@ -125,6 +128,64 @@
 		}
 	};
 
+	const handleBulkDelete = async () => {
+		isDeleting = true;
+		try {
+			for (const id of selectedIds) {
+				const res = await api(`/ingestion-sources/${id}`, { method: 'DELETE' });
+				if (!res.ok) {
+					const errorBody = await res.json();
+					setAlert({
+						type: 'error',
+						title: `Failed to delete ingestion ${id}`,
+						message: errorBody.message || JSON.stringify(errorBody),
+						duration: 5000,
+						show: true
+					});
+				}
+			}
+			ingestionSources = ingestionSources.filter((s) => !selectedIds.includes(s.id));
+			selectedIds = [];
+			isBulkDeleteDialogOpen = false;
+		} finally {
+			isDeleting = false;
+		}
+	};
+
+	const handleBulkForceSync = async () => {
+		try {
+			for (const id of selectedIds) {
+				const res = await api(`/ingestion-sources/${id}/sync`, { method: 'POST' });
+				if (!res.ok) {
+					const errorBody = await res.json();
+					setAlert({
+						type: 'error',
+						title: `Failed to trigger force sync for ingestion ${id}`,
+						message: errorBody.message || JSON.stringify(errorBody),
+						duration: 5000,
+						show: true
+					});
+				}
+			}
+			const updatedSources = ingestionSources.map((s) => {
+				if (selectedIds.includes(s.id)) {
+					return { ...s, status: 'syncing' as const };
+				}
+				return s;
+			});
+			ingestionSources = updatedSources;
+			selectedIds = [];
+		} catch (e) {
+			setAlert({
+				type: 'error',
+				title: 'Failed to trigger force sync',
+				message: e instanceof Error ? e.message : JSON.stringify(e),
+				duration: 5000,
+				show: true
+			});
+		}
+	};
+
 	const handleFormSubmit = async (formData: CreateIngestionSourceDto) => {
 		try {
 			if (selectedSource) {
@@ -174,6 +235,8 @@
 	switch (status) {
 		case 'active':
 			return 'bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300';
+		case 'imported':
+			return 'bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-300';
 		case 'paused':
 			return 'bg-gray-100 text-gray-800 dark:bg-gray-700 dark:text-gray-300';
 		case 'error':
@@ -198,7 +261,29 @@
 
 <div class="">
 	<div class="mb-4 flex items-center justify-between">
-		<h1 class="text-2xl font-bold">Ingestion Sources</h1>
+		<div class="flex items-center gap-4">
+			<h1 class="text-2xl font-bold">Ingestion Sources</h1>
+			{#if selectedIds.length > 0}
+				<DropdownMenu.Root>
+					<DropdownMenu.Trigger>
+						<Button variant="outline">
+							Bulk Actions ({selectedIds.length})
+							<MoreHorizontal class="ml-2 h-4 w-4" />
+						</Button>
+					</DropdownMenu.Trigger>
+					<DropdownMenu.Content>
+						<DropdownMenu.Item onclick={handleBulkForceSync}>
+							<RefreshCw class="mr-2 h-4 w-4" />
+							Force Sync
+						</DropdownMenu.Item>
+						<DropdownMenu.Item class="text-red-600" onclick={() => (isBulkDeleteDialogOpen = true)}>
+							<Trash class="mr-2 h-4 w-4" />
+							Delete
+						</DropdownMenu.Item>
+					</DropdownMenu.Content>
+				</DropdownMenu.Root>
+			{/if}
+		</div>
 		<Button onclick={openCreateDialog} disabled={data.isDemo}>Create New</Button>
 	</div>
 
@@ -206,6 +291,20 @@
 	<Table.Root>
 		<Table.Header>
 			<Table.Row>
+				<Table.Head class="w-12">
+					<Checkbox
+						onCheckedChange={(checked) => {
+							if (checked) {
+								selectedIds = ingestionSources.map((s) => s.id);
+							} else {
+								selectedIds = [];
+							}
+						}}
+						checked={ingestionSources.length > 0 && selectedIds.length === ingestionSources.length
+							? true
+							: ((selectedIds.length > 0 ? 'indeterminate' : false) as any)}
+					/>
+				</Table.Head>
 				<Table.Head>Name</Table.Head>
 				<Table.Head>Provider</Table.Head>
 				<Table.Head>Status</Table.Head>
@@ -218,6 +317,18 @@
 			{#if ingestionSources.length > 0}
 				{#each ingestionSources as source (source.id)}
 					<Table.Row>
+						<Table.Cell>
+							<Checkbox
+								checked={selectedIds.includes(source.id)}
+								onCheckedChange={() => {
+									if (selectedIds.includes(source.id)) {
+										selectedIds = selectedIds.filter((id) => id !== source.id);
+									} else {
+										selectedIds = [...selectedIds, source.id];
+									}
+								}}
+							/>
+						</Table.Cell>
 						<Table.Cell>
 							<a href="/dashboard/archived-emails?ingestionSourceId={source.id}">{source.name}</a>
 						</Table.Cell>
@@ -324,3 +435,26 @@
 		</Dialog.Footer>
 	</Dialog.Content>
 </Dialog.Root>
+
+<Dialog.Root bind:open={isBulkDeleteDialogOpen}>
+	<Dialog.Content class="sm:max-w-lg">
+		<Dialog.Header>
+			<Dialog.Title
+				>Are you sure you want to delete {selectedIds.length} selected ingestions?</Dialog.Title
+			>
+			<Dialog.Description>
+				This will delete all archived emails, attachments, indexing, and files associated with these
+				ingestions. If you only want to stop syncing new emails, you can pause the ingestions
+				instead.
+			</Dialog.Description>
+		</Dialog.Header>
+		<Dialog.Footer class="sm:justify-start">
+			<Button type="button" variant="destructive" onclick={handleBulkDelete} disabled={isDeleting}
+				>{#if isDeleting}Deleting...{:else}Confirm{/if}</Button
+			>
+			<Dialog.Close>
+				<Button type="button" variant="secondary">Cancel</Button>
+			</Dialog.Close>
+		</Dialog.Footer>
+	</Dialog.Content>
+</Dialog.Root>
@@ -17,7 +17,7 @@ export type SyncState = {
 	lastSyncTimestamp?: string;
 };
 
-export type IngestionProvider = 'google_workspace' | 'microsoft_365' | 'generic_imap';
+export type IngestionProvider = 'google_workspace' | 'microsoft_365' | 'generic_imap' | 'pst_import';
 
 export type IngestionStatus =
 	| 'active'
@@ -26,7 +26,8 @@ export type IngestionStatus =
 	| 'pending_auth'
 	| 'syncing'
 	| 'importing'
-	| 'auth_success';
+	| 'auth_success'
+	| 'imported';
 
 export interface BaseIngestionCredentials {
 	type: IngestionProvider;
@@ -61,11 +62,18 @@ export interface Microsoft365Credentials extends BaseIngestionCredentials {
 	tenantId: string;
 }
 
+export interface PSTImportCredentials extends BaseIngestionCredentials {
+	type: 'pst_import';
+	uploadedFileName: string;
+	uploadedFilePath: string;
+}
+
 // Discriminated union for all possible credential types
 export type IngestionCredentials =
 	| GenericImapCredentials
 	| GoogleWorkspaceCredentials
-	| Microsoft365Credentials;
+	| Microsoft365Credentials
+	| PSTImportCredentials;
 
 export interface IngestionSource {
 	id: string;
@@ -118,6 +126,12 @@ export interface IProcessMailboxJob {
 	userEmail: string;
 }
 
+export interface IPstProcessingJob {
+	ingestionSourceId: string;
+	filePath: string;
+	originalFilename: string;
+}
+
 export type MailboxUser = {
 	id: string;
 	primaryEmail: string;
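The credential types added above form a discriminated union keyed on `type`. A minimal self-contained sketch of how TypeScript narrows it (the interfaces are re-declared locally, and the IMAP variant's `host` field is invented here for illustration):

```typescript
// Hypothetical example of narrowing an IngestionCredentials-style union.
interface BaseIngestionCredentials { type: string }
interface PSTImportCredentials extends BaseIngestionCredentials {
	type: 'pst_import';
	uploadedFileName: string;
	uploadedFilePath: string;
}
interface GenericImapCredentials extends BaseIngestionCredentials {
	type: 'generic_imap';
	host: string; // illustrative field, not taken from the real type
}
type IngestionCredentials = PSTImportCredentials | GenericImapCredentials;

function describe(creds: IngestionCredentials): string {
	switch (creds.type) {
		case 'pst_import':
			// Narrowed to PSTImportCredentials: upload fields are accessible.
			return `PST file ${creds.uploadedFileName} at ${creds.uploadedFilePath}`;
		case 'generic_imap':
			return `IMAP host ${creds.host}`;
	}
}

console.log(describe({ type: 'pst_import', uploadedFileName: 'a.pst', uploadedFilePath: '/tmp/a.pst' }));
// → PST file a.pst at /tmp/a.pst
```

Because the switch is exhaustive over the literal `type` values, the compiler both narrows each branch and verifies that every variant is handled.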
112	pnpm-lock.yaml (generated)
@@ -48,6 +48,9 @@ importers:
       bullmq:
         specifier: ^5.56.3
         version: 5.56.3
+      busboy:
+        specifier: ^1.6.0
+        version: 1.6.0
       cross-fetch:
         specifier: ^4.1.0
         version: 4.1.0(encoding@0.1.13)
@@ -93,6 +96,9 @@ importers:
       meilisearch:
         specifier: ^0.51.0
         version: 0.51.0
+      multer:
+        specifier: ^2.0.2
+        version: 2.0.2
       pdf2json:
         specifier: ^3.1.6
         version: 3.1.6
@@ -108,6 +114,9 @@ importers:
       postgres:
         specifier: ^3.4.7
         version: 3.4.7
+      pst-extractor:
+        specifier: ^1.11.0
+        version: 1.11.0
       reflect-metadata:
         specifier: ^0.2.2
         version: 0.2.2
@@ -127,6 +136,9 @@ importers:
       '@bull-board/express':
         specifier: ^6.11.0
         version: 6.11.0
+      '@types/busboy':
+        specifier: ^1.5.4
+        version: 1.5.4
       '@types/express':
         specifier: ^5.0.3
         version: 5.0.3
@@ -136,6 +148,9 @@ importers:
       '@types/microsoft-graph':
         specifier: ^2.40.1
         version: 2.40.1
+      '@types/multer':
+        specifier: ^2.0.0
+        version: 2.0.0
       '@types/node':
         specifier: ^24.0.12
         version: 24.0.13
@@ -1660,6 +1675,9 @@ packages:
   '@types/body-parser@1.19.6':
     resolution: {integrity: sha512-HLFeCYgz89uk22N5Qg3dvGvsv46B8GLvKKo1zKG4NybA8U2DiEO3w9lqGg29t/tfLRJpJ6iQxnVw4OnB7MoM9g==}
 
+  '@types/busboy@1.5.4':
+    resolution: {integrity: sha512-kG7WrUuAKK0NoyxfQHsVE6j1m01s6kMma64E+OZenQABMQyTJop1DumUWcLwAQ2JzpefU7PDYoRDKl8uZosFjw==}
+
   '@types/connect@3.4.38':
     resolution: {integrity: sha512-K6uROf1LD88uDQqJCktA4yzL1YYAK6NgfsI0v/mTgyPKWsX1CnJ0XPSDhViejru1GcRkLWb8RlzFYJRqGUbaug==}
@@ -1714,6 +1732,9 @@ packages:
   '@types/mime@1.3.5':
     resolution: {integrity: sha512-/pyBZWSLD2n0dcHE3hq8s8ZvcETHtEuF+3E7XVt0Ig2nvsVQXdghHVcEkIWjy9A0wKfTn97a/PSDYohKIlnP/w==}
 
+  '@types/multer@2.0.0':
+    resolution: {integrity: sha512-C3Z9v9Evij2yST3RSBktxP9STm6OdMc5uR1xF1SGr98uv8dUlAL2hqwrZ3GVB3uyMyiegnscEK6PGtYvNrjTjw==}
+
   '@types/node@24.0.13':
     resolution: {integrity: sha512-Qm9OYVOFHFYg3wJoTSrz80hoec5Lia/dPp84do3X7dZvLikQvM1YpmvTBEdIr/e+U8HTkFjLHLnl78K/qjf+jQ==}
@@ -1905,6 +1926,9 @@ packages:
     resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==}
     engines: {node: '>= 8'}
 
+  append-field@1.0.0:
+    resolution: {integrity: sha512-klpgFSWLW1ZEs8svjfb7g4qWY0YS5imI82dTg+QahUvJ8YqAY0P10Uk8tTyh9ZGuYEZEMaeJYCF5BFuX552hsw==}
+
   aproba@2.0.0:
     resolution: {integrity: sha512-lYe4Gx7QT+MKGbDsA+Z+he/Wtef0BiwDOlK/XkBrdfsh9J/jPPXbX0tE9x9cl27Tmu5gg3QUbUrQYa/y+KOHPQ==}
@@ -2019,6 +2043,10 @@ packages:
   bullmq@5.56.3:
     resolution: {integrity: sha512-03szheVTKfLsCm5EwzOjSSUTI0UIGJjTUgX91W4+a0pj6SSfiuuNzB29QJh+T3bcgUZUHuTp01Jyxa101sv0Lg==}
 
+  busboy@1.6.0:
+    resolution: {integrity: sha512-8SFQbg/0hQ9xy3UNTB0YEnsNBbWfhf7RtnzpL7TkBiTBRfrQ9Fxcnz7VJsleJpyp6rVLvXiuORqjlHi5q+PYuA==}
+    engines: {node: '>=10.16.0'}
+
   bytes@3.1.0:
     resolution: {integrity: sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==}
     engines: {node: '>= 0.8'}
@@ -2126,6 +2154,10 @@ packages:
   concat-map@0.0.1:
     resolution: {integrity: sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==}
 
+  concat-stream@2.0.0:
+    resolution: {integrity: sha512-MWufYdFw53ccGjCA+Ol7XJYpAlW6/prSMzuPOTRnJGcGzuhLn4Scrz7qf6o8bROZ514ltazcIFJZevcfbo0x7A==}
+    engines: {'0': node >= 6.0}
+
   concurrently@9.2.0:
     resolution: {integrity: sha512-IsB/fiXTupmagMW4MNp2lx2cdSN2FfZq78vF90LBB+zZHArbIQZjQtzXCiXnvTxCZSvXanTqFLWBjw2UkLx1SQ==}
     engines: {node: '>=18'}
@@ -3186,6 +3218,9 @@ packages:
   lodash@4.17.21:
     resolution: {integrity: sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==}
 
+  long@5.3.2:
+    resolution: {integrity: sha512-mNAgZ1GmyNhD7AuqnTG3/VQ26o760+ZYBPKjPvugO8+nLbYfX6TVpJPseBvopbdY+qpZ/lKUnmEc1LeZYS3QAA==}
+
   lop@0.4.2:
     resolution: {integrity: sha512-RefILVDQ4DKoRZsJ4Pj22TxE3omDO47yFpkIBoDKzkqPRISs5U1cnAdg/5583YPkWPaLIYHOKRMQSvjFsO26cw==}
@@ -3362,6 +3397,10 @@ packages:
   mkdirp-classic@0.5.3:
     resolution: {integrity: sha512-gKLcREMhtuZRwRAfqP3RFW+TK4JqApVBtOIftVgjuABpAtpxhPGaDcfvbhNvD0B8iD1oUr/txX35NjcaY6Ns/A==}
 
+  mkdirp@0.5.6:
+    resolution: {integrity: sha512-FP+p8RB8OWpF3YZBCrP5gtADmtXApB5AMLn+vdyA+PyxCjrCs00mjyUozssO33cwDeT3wNGdLxJ5M//YqtHAJw==}
+    hasBin: true
+
   mkdirp@1.0.4:
     resolution: {integrity: sha512-vVqVZQyf3WLx2Shd0qJ9xuvqgAyKPLAiqITEtqW0oIUjzo3PePDd6fW9iFz30ef7Ysp/oiWqbhszeGWW2T6Gzw==}
     engines: {node: '>=10'}
@@ -3401,6 +3440,10 @@ packages:
   msgpackr@1.11.4:
     resolution: {integrity: sha512-uaff7RG9VIC4jacFW9xzL3jc0iM32DNHe4jYVycBcjUePT/Klnfj7pqtWJt9khvDFizmjN2TlYniYmSS2LIaZg==}
 
+  multer@2.0.2:
+    resolution: {integrity: sha512-u7f2xaZ/UG8oLXHvtF/oWTRvT44p9ecwBBqTwgJVq0+4BW1g8OW01TyMEGWBHbyMOYVHXslaut7qEQ1meATXgw==}
+    engines: {node: '>= 10.16.0'}
+
   nanoid@3.3.11:
     resolution: {integrity: sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==}
     engines: {node: ^10 || ^12 || ^13.7 || ^14 || >=15.0.1}
@@ -3480,6 +3523,10 @@ packages:
     engines: {node: ^12.13.0 || ^14.15.0 || >=16.0.0}
     deprecated: This package is no longer supported.
 
+  object-assign@4.1.1:
+    resolution: {integrity: sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==}
+    engines: {node: '>=0.10.0'}
+
   object-inspect@1.13.4:
     resolution: {integrity: sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==}
     engines: {node: '>= 0.4'}
@@ -3742,6 +3789,10 @@ packages:
   proxy-from-env@1.1.0:
     resolution: {integrity: sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==}
 
+  pst-extractor@1.11.0:
+    resolution: {integrity: sha512-y4IzdvKlXabFrbIqQiehkBok/F1+YNoNl9R4o0phamzO13g79HSLzjs/Nctz8YxHlHQ1490WP1YIlHSLtuVa/w==}
+    engines: {node: '>=10'}
+
   pump@3.0.3:
     resolution: {integrity: sha512-todwxLMY7/heScKmntwQG8CXVkWUOdYxIvY2s0VWAAMh/nd8SoYiRaKjlr7+iCs984f2P8zvrfWcDDYVb73NfA==}
@@ -4063,6 +4114,10 @@ packages:
   stream-browserify@3.0.0:
     resolution: {integrity: sha512-H73RAHsVBapbim0tU2JwwOiXUj+fikfiaoYAKHF3VJfA0pe2BCzkhAHBlLG6REzE+2WNZcxOXjK7lkso+9euLA==}
 
+  streamsearch@1.1.0:
+    resolution: {integrity: sha512-Mcc5wHehp9aXz1ax6bZUyY5afg9u2rv5cqQI3mRrYkGC8rW2hM02jWuwjtL++LS5qinSyhj2QfLyNsuc+VsExg==}
+    engines: {node: '>=10.0.0'}
+
   string-width@4.2.3:
     resolution: {integrity: sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==}
     engines: {node: '>=8'}
@@ -4272,6 +4327,9 @@ packages:
     resolution: {integrity: sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==}
     engines: {node: '>= 0.6'}
 
+  typedarray@0.0.6:
+    resolution: {integrity: sha512-/aCDEGatGvZ2BIk+HmLf4ifCJFwvKFNb9/JeZPMulfgFracn9QFcAf5GO8B/mweUjSoblS5In0cWhqpfs/5PQA==}
+
   typescript@5.8.3:
     resolution: {integrity: sha512-p1diW6TqL9L07nNxvRMM7hMMw4c5XOo/1ibL4aAIGmSAt9slTE1Xgw5KWuof2uTOvCg9BY7ZRi+GaF+7sfgPeQ==}
     engines: {node: '>=14.17'}
@@ -4321,6 +4379,9 @@ packages:
     resolution: {integrity: sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA==}
     engines: {node: '>= 0.4.0'}
 
+  uuid-parse@1.1.0:
+    resolution: {integrity: sha512-OdmXxA8rDsQ7YpNVbKSJkNzTw2I+S5WsbMDnCtIWSQaosNAcWtFuI/YK1TjzUI6nbkgiqEyh8gWngfcv8Asd9A==}
+
   uuid@8.3.2:
     resolution: {integrity: sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==}
     hasBin: true
@@ -6150,6 +6211,10 @@ snapshots:
       '@types/connect': 3.4.38
       '@types/node': 24.0.13
 
+  '@types/busboy@1.5.4':
+    dependencies:
+      '@types/node': 24.0.13
+
   '@types/connect@3.4.38':
     dependencies:
       '@types/node': 24.0.13
@@ -6219,6 +6284,10 @@ snapshots:
 
   '@types/mime@1.3.5': {}
 
+  '@types/multer@2.0.0':
+    dependencies:
+      '@types/express': 5.0.3
+
   '@types/node@24.0.13':
     dependencies:
       undici-types: 7.8.0
@@ -6427,6 +6496,8 @@ snapshots:
       normalize-path: 3.0.0
       picomatch: 2.3.1
 
+  append-field@1.0.0: {}
+
   aproba@2.0.0:
     optional: true
@@ -6577,6 +6648,10 @@ snapshots:
     transitivePeerDependencies:
       - supports-color
 
+  busboy@1.6.0:
+    dependencies:
+      streamsearch: 1.1.0
+
   bytes@3.1.0: {}
 
   bytes@3.1.2: {}
@@ -6691,6 +6766,13 @@ snapshots:
 
   concat-map@0.0.1: {}
 
+  concat-stream@2.0.0:
+    dependencies:
+      buffer-from: 1.1.2
+      inherits: 2.0.4
+      readable-stream: 3.6.2
+      typedarray: 0.0.6
+
   concurrently@9.2.0:
     dependencies:
       chalk: 4.1.2
@@ -7820,6 +7902,8 @@ snapshots:
 
   lodash@4.17.21: {}
 
+  long@5.3.2: {}
+
   lop@0.4.2:
     dependencies:
       duck: 0.1.12
@@ -8027,6 +8111,10 @@ snapshots:
 
   mkdirp-classic@0.5.3: {}
 
+  mkdirp@0.5.6:
+    dependencies:
+      minimist: 1.2.8
+
   mkdirp@1.0.4: {}
 
   mkdirp@3.0.1: {}
@@ -8063,6 +8151,16 @@ snapshots:
     optionalDependencies:
       msgpackr-extract: 3.0.3
 
+  multer@2.0.2:
+    dependencies:
+      append-field: 1.0.0
+      busboy: 1.6.0
+      concat-stream: 2.0.0
+      mkdirp: 0.5.6
+      object-assign: 4.1.1
+      type-is: 1.6.18
+      xtend: 4.0.2
+
   nanoid@3.3.11: {}
 
   napi-build-utils@2.0.0: {}
@@ -8137,6 +8235,8 @@ snapshots:
       set-blocking: 2.0.0
     optional: true
 
+  object-assign@4.1.1: {}
+
   object-inspect@1.13.4: {}
 
   on-exit-leak-free@2.1.2: {}
@@ -8340,6 +8440,12 @@ snapshots:
 
   proxy-from-env@1.1.0: {}
 
+  pst-extractor@1.11.0:
+    dependencies:
+      iconv-lite: 0.6.3
+      long: 5.3.2
+      uuid-parse: 1.1.0
+
   pump@3.0.3:
     dependencies:
       end-of-stream: 1.4.5
@@ -8734,6 +8840,8 @@ snapshots:
       inherits: 2.0.4
       readable-stream: 3.6.2
 
+  streamsearch@1.1.0: {}
+
   string-width@4.2.3:
     dependencies:
       emoji-regex: 8.0.0
@@ -8978,6 +9086,8 @@ snapshots:
       media-typer: 1.1.0
       mime-types: 3.0.1
 
+  typedarray@0.0.6: {}
+
   typescript@5.8.3: {}
 
   uc.micro@2.1.0: {}
@@ -9027,6 +9137,8 @@ snapshots:
 
   utils-merge@1.0.1: {}
 
+  uuid-parse@1.1.0: {}
+
   uuid@8.3.2: {}
 
   uuid@9.0.1: {}