Compare commits

..

23 Commits

Author SHA1 Message Date
Wayne
4200db69aa Search page responsive fix 2025-10-24 17:03:25 +02:00
Wayne
d57d674a0b Fix package.json in packages 2025-10-23 17:35:34 +02:00
Wayne
092f1e943c Remove enterprise packages 2025-10-23 17:33:26 +02:00
Wayne
8ff772fba2 Formatting code 2025-10-23 17:01:41 +02:00
Wayne
b7799f749d Remove demoMode logic 2025-10-23 16:23:18 +02:00
Wayne
e0e7f4cab1 License service/module 2025-10-22 00:04:38 +02:00
Wayne
874fafd0f3 frontend: Responsive design for menu bar, pagination 2025-10-18 18:30:27 +02:00
Wayne
b1576eb152 Using BSL license 2025-10-18 17:10:36 +02:00
Wayne
8200d1e478 Add ALL_INCLUSIVE_ARCHIVE environment variable to disable junk filtering 2025-10-17 23:25:12 +02:00
Wayne
1c9cecab47 feat(backend): Add BullMQ dashboard for job monitoring
This commit introduces a web-based UI for monitoring and managing background jobs using BullMQ.

Key changes:
- A new `/api/v1/jobs` endpoint is created, serving the Bull Board dashboard. Access is restricted to authenticated administrators.
- All BullMQ queue definitions (`ingestion`, `indexing`, `sync-scheduler`) have been centralized into a new `packages/backend/src/jobs/queues.ts` file.
- Workers and services now import queue instances from this central file, improving code organization and removing redundant queue instantiations.
2025-10-17 17:03:25 +02:00
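The queue centralization described in that commit can be sketched as a small registry module. The `Queue` class below is a stub standing in for BullMQ's `Queue` (so the sketch runs without Redis), and `getQueue` is an illustrative name, not the real export:

```typescript
// Stub standing in for BullMQ's Queue; the real packages/backend/src/jobs/queues.ts
// would construct bullmq Queue instances sharing one Redis connection.
class Queue {
    constructor(public readonly name: string) {}
}

type QueueName = 'ingestion' | 'indexing' | 'sync-scheduler';

// One module owns every queue instance; workers and services look them up
// here instead of instantiating their own copies.
const registry = new Map<QueueName, Queue>();

function getQueue(name: QueueName): Queue {
    let queue = registry.get(name);
    if (!queue) {
        queue = new Queue(name);
        registry.set(name, queue);
    }
    return queue; // always the same instance for a given name
}
```

Returning the cached instance is what removes the redundant instantiations the commit mentions: every caller shares one connection-backed queue per name.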
Wayne
9d408129c9 Not filtering out Trash folder 2025-10-14 00:11:13 +02:00
Wayne
150a9b15c9 feat(attachments): De-duplicate attachment content by content hash
This commit refactors attachment handling to allow multiple emails within the same ingestion source to reference attachments with identical content (same hash).

Changes:
- The unique index on the `attachments` table has been changed to a non-unique index to permit duplicate hash/source pairs.
- The ingestion logic is updated to first check for an existing attachment with the same hash and source. If found, it reuses the existing record; otherwise, it creates a new one. This maintains storage de-duplication.
- The email deletion logic is improved to be more robust. It now correctly removes the email-attachment link before checking if the attachment record and its corresponding file can be safely deleted.
2025-10-13 15:25:46 +02:00
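The check-then-reuse flow from that commit can be sketched with an in-memory map standing in for the `attachments` table. The record shape and `findOrCreateAttachment` are illustrative, not the real `IngestionService` API:

```typescript
import { createHash } from 'node:crypto';

interface AttachmentRecord {
    id: number;
    sourceId: string;
    contentHashSha256: string;
}

// In-memory stand-in for the attachments table, keyed by (source, hash).
const attachmentsBySourceAndHash = new Map<string, AttachmentRecord>();
let nextId = 1;

function findOrCreateAttachment(sourceId: string, content: Buffer): AttachmentRecord {
    const contentHashSha256 = createHash('sha256').update(content).digest('hex');
    const key = `${sourceId}:${contentHashSha256}`;
    const existing = attachmentsBySourceAndHash.get(key);
    if (existing) {
        return existing; // reuse the record: the file content is stored only once
    }
    const record: AttachmentRecord = { id: nextId++, sourceId, contentHashSha256 };
    attachmentsBySourceAndHash.set(key, record);
    return record;
}
```

Two emails in the same source carrying the same attachment now resolve to one record, while the same content arriving via a different source still gets its own record.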
Wayne
eefe21c4cd feat(docker): Fix CORS errors
This commit fixes CORS errors when running the app in Docker by introducing the `APP_URL` environment variable. A CORS policy is configured for the backend that only allows the origin specified by `APP_URL`.

Key changes include:
- New `APP_URL` and `ORIGIN` environment variables have been added to properly configure CORS and the SvelteKit adapter, making the application's public URL easily configurable.
- Dockerfiles are updated to copy the entrypoint script, Drizzle config, and migration files into the final image.
- Documentation and example files (`.env.example`, `docker-compose.yml`) have been updated to reflect these changes.
2025-10-13 01:28:23 +02:00
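The origin check behind such a policy reduces to comparing normalized origins. A hedged sketch follows; `isAllowedOrigin` is an illustrative name and the actual middleware wiring in the backend is not shown:

```typescript
// Allow a request only when its Origin header matches the configured APP_URL.
// Requests without an Origin header (curl, server-to-server) pass through,
// matching the usual CORS-middleware default.
function isAllowedOrigin(origin: string | undefined, appUrl: string): boolean {
    if (!origin) {
        return true;
    }
    try {
        // URL#origin normalizes scheme, host, and default port for comparison.
        return new URL(origin).origin === new URL(appUrl).origin;
    } catch {
        return false; // malformed Origin header or APP_URL
    }
}
```

Comparing `URL#origin` values rather than raw strings means a trailing slash or default port in `APP_URL` does not break the match.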
Wayne
29ac26e488 Adding position for menu items 2025-10-09 23:39:42 +02:00
Wayne
6b15dcdd89 Add option to disable deletions
This commit introduces a new feature that allows admins to disable the deletion of emails and ingestion sources for the entire instance. This is a critical feature for compliance and data retention, as it prevents accidental or unauthorized deletions.

Changes:
-   **Configuration**: Added an `ENABLE_DELETION` environment variable. If this variable is not set to `true`, all deletion operations will be disabled.
-   **Deletion Guard**: A centralized `checkDeletionEnabled` guard has been implemented to enforce this setting at both the controller and service levels, ensuring a robust and secure implementation.
-   **Documentation**: The installation guide has been updated to include the new `ENABLE_DELETION` environment variable and its behavior.
-   **Refactor**: The `IngestionService`'s `create` method was refactored to remove unnecessary calls to the `delete` method, simplifying the code and improving its robustness.
2025-10-06 00:58:41 +02:00
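A guard like the one described can be a single function consulted at both the controller and service level. The function name comes from the commit message; the error message and env handling here are assumptions:

```typescript
// Deletions are opt-in: anything other than the exact string 'true'
// (unset, 'false', '1', ...) keeps them disabled.
function checkDeletionEnabled(env: Record<string, string | undefined> = process.env): void {
    if (env.ENABLE_DELETION !== 'true') {
        throw new Error('Deletion is disabled on this instance. Set ENABLE_DELETION=true to enable it.');
    }
}
```

Enforcing the check in the service as well as the controller means a future route that forgets the controller-level guard still cannot delete data.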
Wayne
659d130f3b Scope attachment deduplication to ingestion source
Previously, attachment deduplication was handled globally by enforcing a unique constraint on the content hash (contentHashSha256) in the `attachments` table. This caused an issue where an attachment from one ingestion source would be incorrectly linked if the same attachment was processed by a different source.

This commit refactors the deduplication logic to be scoped on a per-ingestion-source basis.

Changes:
-   **Schema:** The `attachments` table schema has been updated to include a nullable `ingestionSourceId` column. A composite unique index has been added on `(ingestionSourceId, contentHashSha256)` to enforce per-source uniqueness. The `ingestionSourceId` is nullable to ensure backward compatibility with existing databases.
-   **Ingestion Logic:** The `IngestionService` has been updated to provide the `ingestionSourceId` when inserting attachment records. The `onConflictDoUpdate` clause now targets the new composite key, ensuring that attachments are only considered duplicates if they have the same hash and originate from the same ingestion source.
2025-10-06 00:04:34 +02:00
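In Drizzle terms, the schema change described above might look like the following sketch (column and index names are assumptions; what matters is the nullable `ingestionSourceId` plus the composite unique index):

```typescript
import { pgTable, uuid, text, uniqueIndex } from 'drizzle-orm/pg-core';

export const attachments = pgTable(
    'attachments',
    {
        id: uuid('id').primaryKey().defaultRandom(),
        // Nullable so rows archived before this change remain valid.
        ingestionSourceId: uuid('ingestion_source_id'),
        contentHashSha256: text('content_hash_sha256').notNull(),
    },
    (table) => ({
        // Duplicates are only rejected within one ingestion source.
        sourceHashIdx: uniqueIndex('attachments_source_hash_idx').on(
            table.ingestionSourceId,
            table.contentHashSha256
        ),
    })
);
```

With this shape, the insert's `onConflictDoUpdate` can target `(ingestionSourceId, contentHashSha256)` directly, so the same hash arriving from another source creates a fresh row instead of colliding.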
Wayne
2a3d6846d8 Scope attachment deduplication to ingestion source
Previously, attachment deduplication was handled globally by enforcing a unique constraint on the content hash (contentHashSha256) in the `attachments` table. This caused an issue where an attachment from one ingestion source would be incorrectly linked if the same attachment was processed by a different source.

This commit refactors the deduplication logic to be scoped on a per-ingestion-source basis.

Changes:
-   **Schema:** The `attachments` table schema has been updated to include a nullable `ingestionSourceId` column. A composite unique index has been added on `(ingestionSourceId, contentHashSha256)` to enforce per-source uniqueness. The `ingestionSourceId` is nullable to ensure backward compatibility with existing databases.
-   **Ingestion Logic:** The `IngestionService` has been updated to provide the `ingestionSourceId` when inserting attachment records. The `onConflictDoUpdate` clause now targets the new composite key, ensuring that attachments are only considered duplicates if they have the same hash and originate from the same ingestion source.
2025-10-06 00:04:06 +02:00
Wayne
826fd6f965 File encryption support 2025-10-04 00:45:33 +02:00
Wayne
f4dce6f1e9 Update docker-compose.yml to use a bind mount for Open Archiver data.
Fix API rate-limiter warning about trust proxy
2025-10-03 17:46:27 +02:00
Wayne
1a2aec3cf4 feat: Integrity report, allowing users to verify the integrity of archived emails and their attachments.
- When an email is archived, Open Archiver calculates a unique cryptographic signature (a SHA256 hash) for the email's raw `.eml` file and for each of its attachments. These signatures are stored in the database alongside the email's metadata.
- The integrity check feature recalculates these signatures for the stored files and compares them to the original signatures stored in the database. This process allows you to verify that the content of your archived emails has not been altered, corrupted, or tampered with since the moment they were archived.
- Add docs of Integrity report
2025-10-03 16:09:52 +02:00
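The verification step reduces to recomputing the SHA-256 of the stored bytes and comparing it with the recorded signature; a minimal sketch, with illustrative function names (the real implementation would stream large files rather than buffer them):

```typescript
import { createHash } from 'node:crypto';

// Signature computed at archive time and stored with the email's metadata.
function sha256Hex(content: Buffer): string {
    return createHash('sha256').update(content).digest('hex');
}

// Integrity check: recompute the hash of the stored .eml (or attachment)
// bytes and compare it with the recorded signature.
function verifyIntegrity(storedContent: Buffer, recordedHash: string): boolean {
    return sha256Hex(storedContent) === recordedHash;
}
```

Any change to the stored file, even a single byte, produces a different digest, so a mismatch is conclusive evidence the archived content was altered after ingestion.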
Wayne
2030264838 Audit-log docs 2025-10-03 10:58:22 +02:00
Wayne
d99fcfcc27 enterprise: Audit log API, UI 2025-10-03 01:11:32 +02:00
Wayne
d20fe8badb open-core setup, adding enterprise package 2025-09-28 23:29:46 +02:00
26 changed files with 133 additions and 647 deletions

View File

@@ -7,7 +7,7 @@
[![Redis](https://img.shields.io/badge/Redis-DC382D?style=for-the-badge&logo=redis&logoColor=white)](https://redis.io)
[![SvelteKit](https://img.shields.io/badge/SvelteKit-FF3E00?style=for-the-badge&logo=svelte&logoColor=white)](https://svelte.dev/)
**A secure, sovereign, and open-source platform for email archiving.**
**A secure, sovereign, and open-source platform for email archiving and eDiscovery.**
Open Archiver provides a robust, self-hosted solution for archiving, storing, indexing, and searching emails from major platforms, including Google Workspace (Gmail), Microsoft 365, PST files, as well as generic IMAP-enabled email inboxes. Use Open Archiver to keep a permanent, tamper-proof record of your communication history, free from vendor lock-in.
@@ -48,14 +48,13 @@ Password: openarchiver_demo
- Zipped .eml files
- Mbox files
- **Secure & Efficient Storage**: Emails are stored in the standard `.eml` format. The system uses deduplication and compression to minimize storage costs. All files are encrypted at rest.
- **Secure & Efficient Storage**: Emails are stored in the standard `.eml` format. The system uses deduplication and compression to minimize storage costs. All data is encrypted at rest.
- **Pluggable Storage Backends**: Support both local filesystem storage and S3-compatible object storage (like AWS S3 or MinIO).
- **Powerful Search & eDiscovery**: A high-performance search engine indexes the full text of emails and attachments (PDF, DOCX, etc.).
- **Thread discovery**: The ability to discover if an email belongs to a thread/conversation and present the context.
- **Compliance & Retention**: Define granular retention policies to automatically manage the lifecycle of your data. Place legal holds on communications to prevent deletion during litigation (TBD).
- **File Hash and Encryption**: Email and attachment file hash values are stored in the meta database upon ingestion, meaning any attempt to alter the file content will be identified, ensuring legal and regulatory compliance.
- - Each archived email comes with an "Integrity Report" feature that indicates if the files are original.
- **Comprehensive Auditing**: An immutable audit trail logs all system activities, ensuring you have a clear record of who accessed what and when.
- **Comprehensive Auditing**: An immutable audit trail logs all system activities, ensuring you have a clear record of who accessed what and when (TBD).
## 🛠️ Tech Stack

Binary file not shown. (Before: 259 KiB)

View File

@@ -19,7 +19,7 @@ The request body should be a `CreateIngestionSourceDto` object.
```typescript
interface CreateIngestionSourceDto {
name: string;
provider: 'google_workspace' | 'microsoft_365' | 'generic_imap' | 'pst_import' | 'eml_import' | 'mbox_import';
provider: 'google' | 'microsoft' | 'generic_imap';
providerConfig: IngestionCredentials;
}
```

View File

@@ -1,4 +1,4 @@
# Integrity Check
# Email Integrity Check
Open Archiver allows you to verify the integrity of your archived emails and their attachments. This guide explains how the integrity check works and what the results mean.

View File

@@ -1,6 +1,6 @@
{
"name": "open-archiver",
"version": "0.4.1",
"version": "0.4.0",
"private": true,
"license": "SEE LICENSE IN LICENSE file",
"scripts": {

View File

@@ -16,7 +16,7 @@ const generateApiKeySchema = z.object({
});
export class ApiKeyController {
private userService = new UserService();
public generateApiKey = async (req: Request, res: Response) => {
public async generateApiKey(req: Request, res: Response) {
try {
const { name, expiresInDays } = generateApiKeySchema.parse(req.body);
if (!req.user || !req.user.sub) {
@@ -45,9 +45,9 @@ export class ApiKeyController {
}
res.status(500).json({ message: req.t('errors.internalServerError') });
}
};
}
public getApiKeys = async (req: Request, res: Response) => {
public async getApiKeys(req: Request, res: Response) {
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
@@ -55,9 +55,9 @@ export class ApiKeyController {
const keys = await ApiKeyService.getKeys(userId);
res.status(200).json(keys);
};
}
public deleteApiKey = async (req: Request, res: Response) => {
public async deleteApiKey(req: Request, res: Response) {
const { id } = req.params;
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
@@ -70,5 +70,5 @@ export class ApiKeyController {
await ApiKeyService.deleteKey(id, userId, actor, req.ip || 'unknown');
res.status(204).send({ message: req.t('apiKeys.deleteSuccess') });
};
}
}

View File

@@ -79,60 +79,3 @@ export const deleteUser = async (req: Request, res: Response) => {
await userService.deleteUser(req.params.id, actor, req.ip || 'unknown');
res.status(204).send();
};
export const getProfile = async (req: Request, res: Response) => {
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const user = await userService.findById(req.user.sub);
if (!user) {
return res.status(404).json({ message: req.t('user.notFound') });
}
res.json(user);
};
export const updateProfile = async (req: Request, res: Response) => {
const { email, first_name, last_name } = req.body;
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const actor = await userService.findById(req.user.sub);
if (!actor) {
return res.status(401).json({ message: 'Unauthorized' });
}
const updatedUser = await userService.updateUser(
req.user.sub,
{ email, first_name, last_name },
undefined,
actor,
req.ip || 'unknown'
);
res.json(updatedUser);
};
export const updatePassword = async (req: Request, res: Response) => {
const { currentPassword, newPassword } = req.body;
if (!req.user || !req.user.sub) {
return res.status(401).json({ message: 'Unauthorized' });
}
const actor = await userService.findById(req.user.sub);
if (!actor) {
return res.status(401).json({ message: 'Unauthorized' });
}
try {
await userService.updatePassword(
req.user.sub,
currentPassword,
newPassword,
actor,
req.ip || 'unknown'
);
res.status(200).json({ message: 'Password updated successfully' });
} catch (e: any) {
if (e.message === 'Invalid current password') {
return res.status(400).json({ message: e.message });
}
throw e;
}
};

View File

@@ -11,10 +11,6 @@ export const createUserRouter = (authService: AuthService): Router => {
router.get('/', requirePermission('read', 'users'), userController.getUsers);
router.get('/profile', userController.getProfile);
router.patch('/profile', userController.updateProfile);
router.post('/profile/password', userController.updatePassword);
router.get('/:id', requirePermission('read', 'users'), userController.getUser);
/**

View File

@@ -47,10 +47,10 @@ function extractTextFromPdf(buffer: Buffer): Promise<string> {
}
// reduced Timeout for better performance
// setTimeout(() => {
// logger.warn('PDF parsing timed out');
// finish('');
// }, 5000);
setTimeout(() => {
logger.warn('PDF parsing timed out');
finish('');
}, 5000);
});
}

View File

@@ -33,6 +33,7 @@ export const processMailboxProcessor = async (job: Job<IProcessMailboxJob, SyncS
const searchService = new SearchService();
const storageService = new StorageService();
const databaseService = new DatabaseService();
const indexingService = new IndexingService(databaseService, searchService, storageService);
try {
const source = await IngestionService.findById(ingestionSourceId);
@@ -71,8 +72,7 @@ export const processMailboxProcessor = async (job: Job<IProcessMailboxJob, SyncS
return newSyncState;
} catch (error) {
if (emailBatch.length > 0) {
await indexingQueue.add('index-email-batch', { emails: emailBatch });
emailBatch = [];
await indexingService.indexEmailBatch(emailBatch);
}
logger.error({ err: error, ingestionSourceId, userEmail }, 'Error processing mailbox');

View File

@@ -51,7 +51,7 @@ export default async (job: Job<ISyncCycleFinishedJob, any, string>) => {
const finalSyncState = deepmerge(
...successfulJobs.filter((s) => s && Object.keys(s).length > 0)
) as SyncState;
);
const source = await IngestionService.findById(ingestionSourceId);
let status: IngestionStatus = 'active';
@@ -63,9 +63,7 @@ export default async (job: Job<ISyncCycleFinishedJob, any, string>) => {
let message: string;
// Check for a specific rate-limit message from the successful jobs
const rateLimitMessage = successfulJobs.find(
(j) => j.statusMessage && j.statusMessage.includes('rate limit')
)?.statusMessage;
const rateLimitMessage = successfulJobs.find((j) => j.statusMessage)?.statusMessage;
if (failedJobs.length > 0) {
status = 'error';

View File

@@ -20,17 +20,16 @@ export class ApiKeyService {
expiresAt.setDate(expiresAt.getDate() + expiresInDays);
const keyHash = createHash('sha256').update(key).digest('hex');
try {
await db.insert(apiKeys).values({
userId,
name,
key: CryptoService.encrypt(key),
keyHash,
expiresAt,
});
await db.insert(apiKeys).values({
userId,
name,
key: CryptoService.encrypt(key),
keyHash,
expiresAt,
});
await this.auditService.createAuditLog({
actorIdentifier: actor.id,
await this.auditService.createAuditLog({
actorIdentifier: actor.id,
actionType: 'GENERATE',
targetType: 'ApiKey',
targetId: name,
@@ -41,9 +40,6 @@ export class ApiKeyService {
});
return key;
} catch (error) {
throw error;
}
}
public static async getKeys(userId: string): Promise<ApiKey[]> {

View File

@@ -93,19 +93,21 @@ export class IndexingService {
const batch = emails.slice(i, i + CONCURRENCY_LIMIT);
const batchDocuments = await Promise.allSettled(
batch.map(async (pendingEmail) => {
batch.map(async ({ email, sourceId, archivedId }) => {
try {
const document = await this.indexEmailById(
pendingEmail.archivedEmailId
return await this.createEmailDocumentFromRawForBatch(
email,
sourceId,
archivedId,
email.userEmail || ''
);
if (document) {
return document;
}
return null;
} catch (error) {
logger.error(
{
emailId: pendingEmail.archivedEmailId,
emailId: archivedId,
sourceId,
userEmail: email.userEmail || '',
rawEmailData: JSON.stringify(email, null, 2),
error: error instanceof Error ? error.message : String(error),
},
'Failed to create document for email in batch'
@@ -116,15 +118,10 @@ export class IndexingService {
);
for (const result of batchDocuments) {
if (result.status === 'fulfilled' && result.value) {
if (result.status === 'fulfilled') {
rawDocuments.push(result.value);
} else if (result.status === 'rejected') {
logger.error({ error: result.reason }, 'Failed to process email in batch');
} else {
logger.error(
{ result: result },
'Failed to process email in batch, reason unknown.'
);
logger.error({ error: result.reason }, 'Failed to process email in batch');
}
}
}
@@ -198,7 +195,10 @@ export class IndexingService {
}
}
private async indexEmailById(emailId: string): Promise<EmailDocument | null> {
/**
* @deprecated
*/
private async indexEmailById(emailId: string): Promise<void> {
const email = await this.dbService.db.query.archivedEmails.findFirst({
where: eq(archivedEmails.id, emailId),
});
@@ -228,13 +228,13 @@ export class IndexingService {
emailAttachmentsResult,
email.userEmail
);
return document;
await this.searchService.addDocuments('emails', [document], 'id');
}
/**
* @deprecated
*/
/* private async indexByEmail(pendingEmail: PendingEmail): Promise<void> {
private async indexByEmail(pendingEmail: PendingEmail): Promise<void> {
const attachments: AttachmentsType = [];
if (pendingEmail.email.attachments && pendingEmail.email.attachments.length > 0) {
for (const attachment of pendingEmail.email.attachments) {
@@ -254,12 +254,12 @@ export class IndexingService {
);
// console.log(document);
await this.searchService.addDocuments('emails', [document], 'id');
} */
}
/**
* Creates a search document from a raw email object and its attachments.
*/
/* private async createEmailDocumentFromRawForBatch(
private async createEmailDocumentFromRawForBatch(
email: EmailObject,
ingestionSourceId: string,
archivedEmailId: string,
@@ -333,7 +333,7 @@ export class IndexingService {
timestamp: new Date(email.receivedAt).getTime(),
ingestionSourceId: ingestionSourceId,
};
} */
}
private async createEmailDocumentFromRaw(
email: EmailObject,

View File

@@ -518,8 +518,12 @@ export class IngestionService {
}
}
email.userEmail = userEmail;
return {
archivedEmailId: archivedEmail.id,
email,
sourceId: source.id,
archivedId: archivedEmail.id,
};
} catch (error) {
logger.error({

View File

@@ -81,79 +81,6 @@ export class StorageService implements IStorageProvider {
return Readable.from(decryptedContent);
}
public async getStream(path: string): Promise<NodeJS.ReadableStream> {
const stream = await this.provider.get(path);
if (!this.encryptionKey) {
return stream;
}
// For encrypted files, we need to read the prefix and IV first.
// This part still buffers a small, fixed amount of data, which is acceptable.
const prefixAndIvBuffer = await new Promise<Buffer>((resolve, reject) => {
const chunks: Buffer[] = [];
let totalLength = 0;
const targetLength = ENCRYPTION_PREFIX.length + 16;
const onData = (chunk: Buffer) => {
chunks.push(chunk);
totalLength += chunk.length;
if (totalLength >= targetLength) {
stream.removeListener('data', onData);
resolve(Buffer.concat(chunks));
}
};
stream.on('data', onData);
stream.on('error', reject);
stream.on('end', () => {
// Handle cases where the file is smaller than the prefix + IV
if (totalLength < targetLength) {
resolve(Buffer.concat(chunks));
}
});
});
const prefix = prefixAndIvBuffer.subarray(0, ENCRYPTION_PREFIX.length);
if (!prefix.equals(ENCRYPTION_PREFIX)) {
// File is not encrypted, return a new stream containing the buffered prefix and the rest of the original stream
const combinedStream = new Readable({
read() {},
});
combinedStream.push(prefixAndIvBuffer);
stream.on('data', (chunk) => {
combinedStream.push(chunk);
});
stream.on('end', () => {
combinedStream.push(null); // No more data
});
stream.on('error', (err) => {
combinedStream.emit('error', err);
});
return combinedStream;
}
try {
const iv = prefixAndIvBuffer.subarray(
ENCRYPTION_PREFIX.length,
ENCRYPTION_PREFIX.length + 16
);
const decipher = createDecipheriv(this.algorithm, this.encryptionKey, iv);
// Push the remaining part of the initial buffer to the decipher
const remainingBuffer = prefixAndIvBuffer.subarray(ENCRYPTION_PREFIX.length + 16);
if (remainingBuffer.length > 0) {
decipher.write(remainingBuffer);
}
// Pipe the rest of the stream
stream.pipe(decipher);
return decipher;
} catch (error) {
throw new Error('Failed to decrypt file. It may be corrupted or the key is incorrect.');
}
}
delete(path: string): Promise<void> {
return this.provider.delete(path);
}

View File

@@ -1,7 +1,7 @@
import { db } from '../database';
import * as schema from '../database/schema';
import { eq, sql } from 'drizzle-orm';
import { hash, compare } from 'bcryptjs';
import { hash } from 'bcryptjs';
import type { CaslPolicy, User } from '@open-archiver/types';
import { AuditService } from './AuditService';
@@ -152,46 +152,6 @@ export class UserService {
});
}
public async updatePassword(
id: string,
currentPassword: string,
newPassword: string,
actor: User,
actorIp: string
): Promise<void> {
const user = await db.query.users.findFirst({
where: eq(schema.users.id, id),
});
if (!user || !user.password) {
throw new Error('User not found');
}
const isPasswordValid = await compare(currentPassword, user.password);
if (!isPasswordValid) {
throw new Error('Invalid current password');
}
const hashedPassword = await hash(newPassword, 10);
await db
.update(schema.users)
.set({ password: hashedPassword })
.where(eq(schema.users.id, id));
await UserService.auditService.createAuditLog({
actorIdentifier: actor.id,
actionType: 'UPDATE',
targetType: 'User',
targetId: id,
actorIp,
details: {
field: 'password',
},
});
}
/**
* Creates an admin user in the database. The user created will be assigned the 'Super Admin' role.
*

View File

@@ -15,6 +15,7 @@ import { getThreadId } from './helpers/utils';
export class ImapConnector implements IEmailConnector {
private client: ImapFlow;
private newMaxUids: { [mailboxPath: string]: number } = {};
private isConnected = false;
private statusMessage: string | undefined;
constructor(private credentials: GenericImapCredentials) {
@@ -40,6 +41,7 @@ export class ImapConnector implements IEmailConnector {
// Handles client-level errors, like unexpected disconnects, to prevent crashes.
client.on('error', (err) => {
logger.error({ err }, 'IMAP client error');
this.isConnected = false;
});
return client;
@@ -49,17 +51,20 @@ export class ImapConnector implements IEmailConnector {
* Establishes a connection to the IMAP server if not already connected.
*/
private async connect(): Promise<void> {
// If the client is already connected and usable, do nothing.
if (this.client.usable) {
if (this.isConnected && this.client.usable) {
return;
}
// If the client is not usable (e.g., after a logout or an error), create a new one.
this.client = this.createClient();
// If the client is not usable (e.g., after a logout), create a new one.
if (!this.client.usable) {
this.client = this.createClient();
}
try {
await this.client.connect();
this.isConnected = true;
} catch (err: any) {
this.isConnected = false;
logger.error({ err }, 'IMAP connection failed');
if (err.responseText) {
throw new Error(`IMAP Connection Error: ${err.responseText}`);
@@ -72,8 +77,9 @@ export class ImapConnector implements IEmailConnector {
* Disconnects from the IMAP server if the connection is active.
*/
private async disconnect(): Promise<void> {
if (this.client.usable) {
if (this.isConnected && this.client.usable) {
await this.client.logout();
this.isConnected = false;
}
}
@@ -124,7 +130,7 @@ export class ImapConnector implements IEmailConnector {
return await action();
} catch (err: any) {
logger.error({ err, attempt }, `IMAP operation failed on attempt ${attempt}`);
// The client is no longer usable, a new one will be created on the next attempt.
this.isConnected = false; // Force reconnect on next attempt
if (attempt === maxRetries) {
logger.error({ err }, 'IMAP operation failed after all retries.');
throw err;
@@ -149,10 +155,6 @@ export class ImapConnector implements IEmailConnector {
const mailboxes = await this.withRetry(async () => await this.client.list());
const processableMailboxes = mailboxes.filter((mailbox) => {
// Exclude mailboxes that cannot be selected.
if (mailbox.flags.has('\\Noselect')) {
return false;
}
if (config.app.allInclusiveArchive) {
return true;
}

View File

@@ -10,46 +10,9 @@ import { simpleParser, ParsedMail, Attachment, AddressObject } from 'mailparser'
import { logger } from '../../config/logger';
import { getThreadId } from './helpers/utils';
import { StorageService } from '../StorageService';
import { Readable, Transform } from 'stream';
import { Readable } from 'stream';
import { createHash } from 'crypto';
class MboxSplitter extends Transform {
private buffer: Buffer = Buffer.alloc(0);
private delimiter: Buffer = Buffer.from('\nFrom ');
private firstChunk: boolean = true;
_transform(chunk: Buffer, encoding: string, callback: Function) {
if (this.firstChunk) {
// Check if the file starts with "From ". If not, prepend it to the first email.
if (chunk.subarray(0, 5).toString() !== 'From ') {
this.push(Buffer.from('From '));
}
this.firstChunk = false;
}
let currentBuffer = Buffer.concat([this.buffer, chunk]);
let position;
while ((position = currentBuffer.indexOf(this.delimiter)) > -1) {
const email = currentBuffer.subarray(0, position);
if (email.length > 0) {
this.push(email);
}
// The next email starts with "From ", which is what the parser expects.
currentBuffer = currentBuffer.subarray(position + 1);
}
this.buffer = currentBuffer;
callback();
}
_flush(callback: Function) {
if (this.buffer.length > 0) {
this.push(this.buffer);
}
callback();
}
}
import { streamToBuffer } from '../../helpers/streamToBuffer';
export class MboxConnector implements IEmailConnector {
private storage: StorageService;
@@ -94,33 +57,48 @@ export class MboxConnector implements IEmailConnector {
userEmail: string,
syncState?: SyncState | null
): AsyncGenerator<EmailObject | null> {
const fileStream = await this.storage.getStream(this.credentials.uploadedFilePath);
const mboxSplitter = new MboxSplitter();
const emailStream = fileStream.pipe(mboxSplitter);
try {
const fileStream = await this.storage.get(this.credentials.uploadedFilePath);
const fileBuffer = await streamToBuffer(fileStream as Readable);
const mboxContent = fileBuffer.toString('utf-8');
const emailDelimiter = '\nFrom ';
const emails = mboxContent.split(emailDelimiter);
for await (const emailBuffer of emailStream) {
// The first split part might be empty or part of the first email's header, so we adjust.
if (emails.length > 0 && !mboxContent.startsWith('From ')) {
emails.shift(); // Adjust if the file doesn't start with "From "
}
logger.info(`Found ${emails.length} potential emails in the mbox file.`);
let emailCount = 0;
for (const email of emails) {
try {
// Re-add the "From " delimiter for the parser, except for the very first email
const emailWithDelimiter =
emailCount > 0 || mboxContent.startsWith('From ') ? `From ${email}` : email;
const emailBuffer = Buffer.from(emailWithDelimiter, 'utf-8');
const emailObject = await this.parseMessage(emailBuffer, '');
yield emailObject;
emailCount++;
} catch (error) {
logger.error(
{ error, file: this.credentials.uploadedFilePath },
'Failed to process a single message from mbox file. Skipping.'
);
}
}
logger.info(`Finished processing mbox file. Total emails processed: ${emailCount}`);
} finally {
try {
const emailObject = await this.parseMessage(emailBuffer as Buffer, '');
yield emailObject;
await this.storage.delete(this.credentials.uploadedFilePath);
} catch (error) {
logger.error(
{ error, file: this.credentials.uploadedFilePath },
'Failed to process a single message from mbox file. Skipping.'
'Failed to delete mbox file after processing.'
);
}
}
// After the stream is fully consumed, delete the file.
// The `for await...of` loop ensures streams are properly closed on completion,
// so we can safely delete the file here without causing a hang.
try {
await this.storage.delete(this.credentials.uploadedFilePath);
} catch (error) {
logger.error(
{ error, file: this.credentials.uploadedFilePath },
'Failed to delete mbox file after processing.'
);
}
}
private async parseMessage(emlBuffer: Buffer, path: string): Promise<EmailObject> {

View File

@@ -13,8 +13,15 @@ import { getThreadId } from './helpers/utils';
import { StorageService } from '../StorageService';
import { Readable } from 'stream';
import { createHash } from 'crypto';
import { join } from 'path';
import { createWriteStream, promises as fs } from 'fs';
const streamToBuffer = (stream: Readable): Promise<Buffer> => {
return new Promise((resolve, reject) => {
const chunks: Buffer[] = [];
stream.on('data', (chunk) => chunks.push(chunk));
stream.on('error', reject);
stream.on('end', () => resolve(Buffer.concat(chunks)));
});
};
// We have to hardcode names for deleted and trash folders here as current lib doesn't support looking into PST properties.
const DELETED_FOLDERS = new Set([
@@ -106,25 +113,20 @@ const JUNK_FOLDERS = new Set([
export class PSTConnector implements IEmailConnector {
private storage: StorageService;
private pstFile: PSTFile | null = null;
constructor(private credentials: PSTImportCredentials) {
this.storage = new StorageService();
}
private async loadPstFile(): Promise<{ pstFile: PSTFile; tempDir: string }> {
const fileStream = await this.storage.getStream(this.credentials.uploadedFilePath);
const tempDir = await fs.mkdtemp(join('/tmp', `pst-import-${new Date().getTime()}`));
const tempFilePath = join(tempDir, 'temp.pst');
await new Promise<void>((resolve, reject) => {
const dest = createWriteStream(tempFilePath);
fileStream.pipe(dest);
dest.on('finish', resolve);
dest.on('error', reject);
});
const pstFile = new PSTFile(tempFilePath);
return { pstFile, tempDir };
private async loadPstFile(): Promise<PSTFile> {
if (this.pstFile) {
return this.pstFile;
}
const fileStream = await this.storage.get(this.credentials.uploadedFilePath);
const buffer = await streamToBuffer(fileStream as Readable);
this.pstFile = new PSTFile(buffer);
return this.pstFile;
}
public async testConnection(): Promise<boolean> {
@@ -139,6 +141,7 @@ export class PSTConnector implements IEmailConnector {
if (!fileExist) {
throw new Error('PST file upload has not finished yet, please wait.');
}
return true;
} catch (error) {
logger.error({ error, credentials: this.credentials }, 'PST file validation failed.');
@@ -153,11 +156,8 @@ export class PSTConnector implements IEmailConnector {
*/
public async *listAllUsers(): AsyncGenerator<MailboxUser> {
let pstFile: PSTFile | null = null;
let tempDir: string | null = null;
try {
const loadResult = await this.loadPstFile();
pstFile = loadResult.pstFile;
tempDir = loadResult.tempDir;
pstFile = await this.loadPstFile();
const root = pstFile.getRootFolder();
const displayName: string =
root.displayName || pstFile.pstFilename || String(new Date().getTime());
@@ -171,12 +171,10 @@ export class PSTConnector implements IEmailConnector {
};
} catch (error) {
logger.error({ error }, 'Failed to list users from PST file.');
pstFile?.close();
throw error;
} finally {
pstFile?.close();
if (tempDir) {
await fs.rm(tempDir, { recursive: true, force: true });
}
}
}
@@ -185,21 +183,16 @@ export class PSTConnector implements IEmailConnector {
syncState?: SyncState | null
): AsyncGenerator<EmailObject | null> {
let pstFile: PSTFile | null = null;
let tempDir: string | null = null;
try {
const loadResult = await this.loadPstFile();
pstFile = loadResult.pstFile;
tempDir = loadResult.tempDir;
pstFile = await this.loadPstFile();
const root = pstFile.getRootFolder();
yield* this.processFolder(root, '', userEmail);
} catch (error) {
logger.error({ error }, 'Failed to fetch email.');
pstFile?.close();
throw error;
} finally {
pstFile?.close();
if (tempDir) {
await fs.rm(tempDir, { recursive: true, force: true });
}
try {
await this.storage.delete(this.credentials.uploadedFilePath);
} catch (error) {

View File

@@ -118,23 +118,6 @@
"confirm": "Confirm",
"cancel": "Cancel"
},
"account": {
"title": "Account Settings",
"description": "Manage your profile and security settings.",
"personal_info": "Personal Information",
"personal_info_desc": "Update your personal details.",
"security": "Security",
"security_desc": "Manage your password and security preferences.",
"edit_profile": "Edit Profile",
"change_password": "Change Password",
"edit_profile_desc": "Make changes to your profile here.",
"change_password_desc": "Change your password. You will need to enter your current password.",
"current_password": "Current Password",
"new_password": "New Password",
"confirm_new_password": "Confirm New Password",
"operation_successful": "Operation successful",
"passwords_do_not_match": "Passwords do not match"
},
"system_settings": {
"title": "System Settings",
"system_settings": "System Settings",
@@ -251,7 +234,6 @@
"users": "Users",
"roles": "Roles",
"api_keys": "API Keys",
"account": "Account",
"logout": "Logout",
"admin": "Admin"
},

View File

@@ -10,20 +10,10 @@ const handleRequest: RequestHandler = async ({ request, params, fetch }) => {
const targetUrl = `${BACKEND_URL}/${slug}${url.search}`;
try {
let body: ArrayBuffer | null = null;
const headers = new Headers(request.headers);
if (request.method !== 'GET' && request.method !== 'HEAD') {
body = await request.arrayBuffer();
if (body.byteLength > 0) {
headers.set('Content-Length', String(body.byteLength));
}
}
const proxyRequest = new Request(targetUrl, {
method: request.method,
headers: headers,
body: body,
headers: request.headers,
body: request.body,
duplex: 'half',
} as RequestInit);
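The `duplex: 'half'` option above is required by fetch-spec runtimes (e.g. Node 18+ via undici) whenever a `Request` is constructed with a `ReadableStream` body. The streaming pass-through can be sketched as (assuming a global WHATWG `Request`; `forwardRequest` is a hypothetical helper):

```typescript
// Sketch: forward an incoming request's body as a stream instead of
// buffering it. duplex: 'half' is mandatory for stream bodies but is
// not yet in TypeScript's RequestInit type, hence the cast.
const forwardRequest = (incoming: Request, targetUrl: string): Request => {
	return new Request(targetUrl, {
		method: incoming.method,
		headers: incoming.headers,
		body: incoming.body, // stream passes through without buffering
		duplex: 'half',
	} as RequestInit);
};
```

Streaming the body avoids holding large uploads in memory, which the removed `ArrayBuffer`-based path did on every non-GET request.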

View File

@@ -64,10 +64,6 @@
href: '/dashboard/settings/api-keys',
label: $t('app.layout.api_keys'),
},
{
href: '/dashboard/settings/account',
label: $t('app.layout.account'),
},
],
position: 5,
},

View File

@@ -58,7 +58,7 @@
<Card.Root>
<Card.Header>
<Card.Title>{$t('app.jobs.jobs')}</Card.Title>
<div class="flex flex-wrap space-x-2 space-y-2">
<div class="flex space-x-2">
{#each jobStatuses as status}
<Button
variant={selectedStatus === status ? 'default' : 'outline'}

View File

@@ -1,58 +0,0 @@
import type { PageServerLoad, Actions } from './$types';
import { api } from '$lib/server/api';
import { fail } from '@sveltejs/kit';
import type { User } from '@open-archiver/types';
export const load: PageServerLoad = async (event) => {
const response = await api('/users/profile', event);
if (!response.ok) {
const error = await response.json();
console.error('Failed to fetch profile:', error);
// Return null user if failed, handle in UI
return { user: null };
}
const user: User = await response.json();
return { user };
};
export const actions: Actions = {
updateProfile: async (event) => {
const data = await event.request.formData();
const first_name = data.get('first_name');
const last_name = data.get('last_name');
const email = data.get('email');
const response = await api('/users/profile', event, {
method: 'PATCH',
body: JSON.stringify({ first_name, last_name, email }),
});
if (!response.ok) {
const error = await response.json();
return fail(response.status, {
profileError: true,
message: error.message || 'Failed to update profile',
});
}
return { success: true };
},
updatePassword: async (event) => {
const data = await event.request.formData();
const currentPassword = data.get('currentPassword');
const newPassword = data.get('newPassword');
const response = await api('/users/profile/password', event, {
method: 'POST',
body: JSON.stringify({ currentPassword, newPassword }),
});
if (!response.ok) {
const error = await response.json();
return fail(response.status, {
passwordError: true,
message: error.message || 'Failed to update password',
});
}
return { success: true };
},
};

View File

@@ -1,218 +0,0 @@
<script lang="ts">
import { enhance } from '$app/forms';
import { t } from '$lib/translations';
import { Button } from '$lib/components/ui/button';
import * as Card from '$lib/components/ui/card';
import { Input } from '$lib/components/ui/input';
import { Label } from '$lib/components/ui/label';
import * as Dialog from '$lib/components/ui/dialog';
import { setAlert } from '$lib/components/custom/alert/alert-state.svelte';
let { data, form } = $props();
let user = $derived(data.user);
let isProfileDialogOpen = $state(false);
let isPasswordDialogOpen = $state(false);
let isSubmitting = $state(false);
// Profile form state
let profileFirstName = $state('');
let profileLastName = $state('');
let profileEmail = $state('');
// Password form state
let currentPassword = $state('');
let newPassword = $state('');
let confirmNewPassword = $state('');
// Preload profile form
$effect(() => {
if (user && isProfileDialogOpen) {
profileFirstName = user.first_name || '';
profileLastName = user.last_name || '';
profileEmail = user.email || '';
}
});
// Handle form actions result
$effect(() => {
if (form) {
isSubmitting = false;
if (form.success) {
isProfileDialogOpen = false;
isPasswordDialogOpen = false;
setAlert({
type: 'success',
title: $t('app.account.operation_successful'),
message: $t('app.account.operation_successful'),
duration: 3000,
show: true
});
} else if (form.profileError || form.passwordError) {
setAlert({
type: 'error',
title: $t('app.search.error'),
message: form.message,
duration: 3000,
show: true
});
}
}
});
function openProfileDialog() {
isProfileDialogOpen = true;
}
function openPasswordDialog() {
currentPassword = '';
newPassword = '';
confirmNewPassword = '';
isPasswordDialogOpen = true;
}
</script>
<svelte:head>
<title>{$t('app.account.title')} - OpenArchiver</title>
</svelte:head>
<div class="space-y-6">
<div>
<h1 class="text-2xl font-bold">{$t('app.account.title')}</h1>
<p class="text-muted-foreground">{$t('app.account.description')}</p>
</div>
<!-- Personal Information -->
<Card.Root>
<Card.Header>
<Card.Title>{$t('app.account.personal_info')}</Card.Title>
</Card.Header>
<Card.Content class="space-y-4">
<div class="grid grid-cols-2 gap-4">
<div>
<Label class="text-muted-foreground">{$t('app.users.name')}</Label>
<p class="text-sm font-medium">{user?.first_name} {user?.last_name}</p>
</div>
<div>
<Label class="text-muted-foreground">{$t('app.users.email')}</Label>
<p class="text-sm font-medium">{user?.email}</p>
</div>
<div>
<Label class="text-muted-foreground">{$t('app.users.role')}</Label>
<p class="text-sm font-medium">{user?.role?.name || '-'}</p>
</div>
</div>
</Card.Content>
<Card.Footer>
<Button variant="outline" onclick={openProfileDialog}>{$t('app.account.edit_profile')}</Button>
</Card.Footer>
</Card.Root>
<!-- Security -->
<Card.Root>
<Card.Header>
<Card.Title>{$t('app.account.security')}</Card.Title>
</Card.Header>
<Card.Content>
<div class="flex items-center justify-between">
<div>
<Label class="text-muted-foreground">{$t('app.auth.password')}</Label>
<p class="text-sm">*************</p>
</div>
</div>
</Card.Content>
<Card.Footer>
<Button variant="outline" onclick={openPasswordDialog}>{$t('app.account.change_password')}</Button>
</Card.Footer>
</Card.Root>
</div>
<!-- Profile Edit Dialog -->
<Dialog.Root bind:open={isProfileDialogOpen}>
<Dialog.Content class="sm:max-w-[425px]">
<Dialog.Header>
<Dialog.Title>{$t('app.account.edit_profile')}</Dialog.Title>
<Dialog.Description>{$t('app.account.edit_profile_desc')}</Dialog.Description>
</Dialog.Header>
<form method="POST" action="?/updateProfile" use:enhance={() => {
isSubmitting = true;
return async ({ update }) => {
await update();
isSubmitting = false;
};
}} class="grid gap-4 py-4">
<div class="grid grid-cols-4 items-center gap-4">
<Label for="first_name" class="text-right">{$t('app.setup.first_name')}</Label>
<Input id="first_name" name="first_name" bind:value={profileFirstName} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="last_name" class="text-right">{$t('app.setup.last_name')}</Label>
<Input id="last_name" name="last_name" bind:value={profileLastName} class="col-span-3" />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="email" class="text-right">{$t('app.users.email')}</Label>
<Input id="email" name="email" type="email" bind:value={profileEmail} class="col-span-3" />
</div>
<Dialog.Footer>
<Button type="submit" disabled={isSubmitting}>
{#if isSubmitting}
{$t('app.components.common.submitting')}
{:else}
{$t('app.components.common.save')}
{/if}
</Button>
</Dialog.Footer>
</form>
</Dialog.Content>
</Dialog.Root>
<!-- Change Password Dialog -->
<Dialog.Root bind:open={isPasswordDialogOpen}>
<Dialog.Content class="sm:max-w-[425px]">
<Dialog.Header>
<Dialog.Title>{$t('app.account.change_password')}</Dialog.Title>
<Dialog.Description>{$t('app.account.change_password_desc')}</Dialog.Description>
</Dialog.Header>
<form method="POST" action="?/updatePassword" use:enhance={({ cancel }) => {
if (newPassword !== confirmNewPassword) {
setAlert({
type: 'error',
title: $t('app.search.error'),
message: $t('app.account.passwords_do_not_match'),
duration: 3000,
show: true
});
cancel();
return;
}
isSubmitting = true;
return async ({ update }) => {
await update();
isSubmitting = false;
};
}} class="grid gap-4 py-4">
<div class="grid grid-cols-4 items-center gap-4">
<Label for="currentPassword" class="text-right">{$t('app.account.current_password')}</Label>
<Input id="currentPassword" name="currentPassword" type="password" bind:value={currentPassword} class="col-span-3" required />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="newPassword" class="text-right">{$t('app.account.new_password')}</Label>
<Input id="newPassword" name="newPassword" type="password" bind:value={newPassword} class="col-span-3" required />
</div>
<div class="grid grid-cols-4 items-center gap-4">
<Label for="confirmNewPassword" class="text-right">{$t('app.account.confirm_new_password')}</Label>
<Input id="confirmNewPassword" type="password" bind:value={confirmNewPassword} class="col-span-3" required />
</div>
<Dialog.Footer>
<Button type="submit" disabled={isSubmitting}>
{#if isSubmitting}
{$t('app.components.common.submitting')}
{:else}
{$t('app.components.common.save')}
{/if}
</Button>
</Dialog.Footer>
</form>
</Dialog.Content>
</Dialog.Root>

View File

@@ -56,15 +56,13 @@ export interface EmailObject {
}
/**
* Represents an email that has been processed and is ready for indexing.
* This interface defines the shape of the data that is passed to the batch indexing function.
*/
export interface PendingEmail {
/** The unique identifier of the archived email record in the database.
* This ID is used to retrieve the full email data from the database and storage for indexing.
*/
archivedEmailId: string;
email: EmailObject;
sourceId: string;
archivedId: string;
}
// Define the structure of the document to be indexed in Meilisearch