Compare commits

...

43 Commits

Author SHA1 Message Date
Simon Larsen
c75e038685 Add use case pages for Customer Trust, Distributed Operations, Platform Engineering, Regulatory Communications, and Release Guardrails
- Implemented HTML structure and content for each use case, enhancing the website's offerings.
- Included meta descriptions and titles for SEO optimization.
- Integrated navigation, logo roll, call-to-action sections, and footer for consistent layout.
- Highlighted key benefits and features relevant to each use case with engaging visuals and statistics.
2025-12-02 13:30:25 +00:00
Nawaz Dhandala
57abffa113 Merge branch 'master' of https://github.com/OneUptime/oneuptime 2025-12-02 13:06:53 +00:00
Nawaz Dhandala
e8e493ee5a Refactor code structure for improved readability and maintainability 2025-12-02 13:06:50 +00:00
Simon Larsen
e065ebdddc Merge branch 'copilot-v2' 2025-12-02 13:02:32 +00:00
Simon Larsen
39da442892 style: update blog post first paragraph styling for improved readability 2025-12-02 13:02:18 +00:00
Simon Larsen
45b02b30e3 Merge pull request #2152 from OneUptime/chore/npm-audit-fix
chore: npm audit fix
2025-12-02 11:46:04 +00:00
Nawaz Dhandala
30414327f9 feat: add Dockerfile for OneUptime-copilot setup 2025-12-02 11:05:16 +00:00
simlarsen
b99a20a588 chore: npm audit fix 2025-12-02 01:50:48 +00:00
Nawaz Dhandala
22178c282d fix: format command descriptions for consistency in MicrosoftTeamsAPI 2025-12-01 17:13:11 +00:00
Nawaz Dhandala
30389a8d49 feat: add command lists for improved interaction with OneUptime bot in Microsoft Teams 2025-12-01 17:11:37 +00:00
Nawaz Dhandala
7b73cc2ea7 fix: remove trailing spaces in action type definitions 2025-12-01 17:05:30 +00:00
Nawaz Dhandala
6d2c331216 feat: update command triggers for incident and maintenance actions 2025-12-01 17:05:05 +00:00
Nawaz Dhandala
624e4c2296 chore: update version to 9.1.2 2025-12-01 16:38:51 +00:00
Simon Larsen
5e901ee973 Merge pull request #2151 from OneUptime/copilot-v2
Copilot v2
2025-12-01 16:25:22 +00:00
Simon Larsen
a103abc7a9 fix: simplify boolean expression for hasProgressedBeyondScheduledState 2025-12-01 15:45:35 +00:00
Simon Larsen
a7dda0bd53 feat: add logic to update nextSubscriberNotificationBeforeTheEventAt for progressed scheduled maintenance events 2025-12-01 15:45:15 +00:00
Simon Larsen
6948754c86 Merge pull request #2147 from OneUptime/copilot-v2
Copilot v2
2025-12-01 15:21:05 +00:00
Simon Larsen
cc5731bb6d feat: add error handling and logging for missing tool calls and directory entries 2025-12-01 15:20:44 +00:00
Simon Larsen
6761a8a686 Merge pull request #2148 from OneUptime/snyk-upgrade-240d43adaab510cce84165a4f1ccf9b5
[Snyk] Upgrade mailparser from 3.7.5 to 3.9.0
2025-12-01 13:42:14 +00:00
Simon Larsen
6e487199aa refactor: add type annotations and improve type safety across multiple files 2025-12-01 13:41:34 +00:00
snyk-bot
cda5de92ec fix: upgrade mailparser from 3.7.5 to 3.9.0
Snyk has created this PR to upgrade mailparser from 3.7.5 to 3.9.0.

See this package in npm:
mailparser

See this project in Snyk:
https://app.snyk.io/org/oneuptime-RsC2nshvQ2Vnr35jHvMnMP/project/c3622982-05c8-495c-809c-20f301c75f92?utm_source=github&utm_medium=referral&page=upgrade-pr
2025-11-29 12:10:48 +00:00
Simon Larsen
33349341a9 refactor: improve code formatting and readability across multiple files 2025-11-28 21:53:50 +00:00
Simon Larsen
db81fdd3e7 feat: enhance logging throughout the Copilot agent and tools for better traceability 2025-11-28 21:52:33 +00:00
Simon Larsen
d71eba91dd chore: remove vscode-copilot-chat subproject reference 2025-11-28 21:45:07 +00:00
Simon Larsen
682bb805f3 feat: implement AgentLogger for file-based logging with exit handlers 2025-11-28 21:43:15 +00:00
Simon Larsen
7f38e3d417 docs: add usage example for running the agent in development mode 2025-11-28 21:31:08 +00:00
Simon Larsen
559985e93b feat: add tsconfig-paths for improved module resolution in development 2025-11-28 21:28:19 +00:00
Simon Larsen
43588cbe5a refactor: update optional properties to include 'undefined' type in various interfaces 2025-11-28 20:57:17 +00:00
Simon Larsen
0772fce477 refactor: update Telemetry class to use type assertions for loggerProviderConfig and nodeSdkConfiguration
chore: remove unused common type definitions and clean up tsconfig.json
2025-11-28 20:20:09 +00:00
Simon Larsen
78107d8b1c chore: remove unused type definitions and clean up tsconfig.json 2025-11-28 20:06:43 +00:00
Simon Larsen
078af43b0c chore: remove tsconfig.json for OneUptime Copilot Agent 2025-11-28 19:58:11 +00:00
Simon Larsen
9b9aeb2f40 feat: Implement OneUptime Copilot Agent with workspace tools
- Added SystemPrompt for guiding the agent's behavior.
- Created WorkspaceContextBuilder to gather workspace information.
- Developed main entry point in index.ts for agent execution.
- Implemented LMStudioClient for interacting with the LM Studio API.
- Added ApplyPatchTool for applying code changes via patches.
- Created ListDirectoryTool for listing files and directories.
- Implemented ReadFileTool for reading file contents.
- Developed RunCommandTool for executing shell commands.
- Added SearchWorkspaceTool for searching files in the workspace.
- Created WriteFileTool for writing content to files.
- Established ToolRegistry for managing and executing tools.
- Defined types for chat messages and tool calls.
- Added utility classes for logging and executing commands.
- Implemented WorkspacePaths for managing file paths within the workspace.
- Configured TypeScript settings in tsconfig.json.
2025-11-28 19:57:52 +00:00
Nawaz Dhandala
67577f5a2b refactor: improve formatting and readability in Incident migration and MonitorService 2025-11-28 17:42:22 +00:00
Nawaz Dhandala
4e808cf382 feat: enhance monitor deletion process to include MetricService cleanup 2025-11-28 17:40:31 +00:00
Nawaz Dhandala
c993b33dab feat: add projectId to MetricService deletion query in incident handling 2025-11-28 17:35:23 +00:00
Nawaz Dhandala
3c5a64024b feat: include projectId in MetricService deletion query for incidents 2025-11-28 17:34:30 +00:00
Nawaz Dhandala
86efe54a29 refactor: remove unused favicon handling from DashboardMasterPage 2025-11-28 17:29:43 +00:00
Simon Larsen
17bf568428 feat: Implement OneUptime Copilot Agent with core functionalities
- Add SystemPrompt to define agent behavior and principles.
- Create WorkspaceContextBuilder for workspace snapshot and Git status.
- Initialize main entry point with command-line options for agent configuration.
- Develop LMStudioClient for chat completion requests to LM Studio.
- Implement tools for file operations: ApplyPatchTool, ListDirectoryTool, ReadFileTool, RunCommandTool, SearchWorkspaceTool, WriteFileTool.
- Establish ToolRegistry for managing and executing tools.
- Define types for chat messages, tool calls, and execution results.
- Set up workspace path utilities for file management and validation.
- Configure TypeScript settings for the project.
2025-11-28 16:49:46 +00:00
Simon Larsen
26ac698cc7 Remove Copilot package configuration files 2025-11-28 15:43:36 +00:00
Simon Larsen
72bb25e036 chore: migrate VERSION_PREFIX to VERSION and update related workflows 2025-11-28 15:40:24 +00:00
Nawaz Dhandala
1f23742c1f chore: remove vscode-copilot-chat subproject 2025-11-28 14:12:12 +00:00
Nawaz Dhandala
ac66cee4aa feat: add declaredAt field to Incident model with migration and default value 2025-11-28 10:12:43 +00:00
Nawaz Dhandala
66efe2d2fa feat: add declaredAt field to Incident model and update related services and components 2025-11-28 10:10:05 +00:00
130 changed files with 5285 additions and 27931 deletions

View File

@@ -42,15 +42,15 @@ jobs:
run: |
set -euo pipefail
- VERSION_PREFIX_RAW="$(tr -d ' \n' < VERSION_PREFIX)"
- if [[ -z "$VERSION_PREFIX_RAW" ]]; then
- echo "VERSION_PREFIX is empty" >&2
+ VERSION_RAW="$(tr -d ' \n' < VERSION)"
+ if [[ -z "$VERSION_RAW" ]]; then
+ echo "VERSION is empty" >&2
exit 1
fi
- IFS='.' read -r major minor patch <<< "$VERSION_PREFIX_RAW"
+ IFS='.' read -r major minor patch <<< "$VERSION_RAW"
if [[ -z "$minor" ]]; then
- echo "VERSION_PREFIX must contain major and minor components" >&2
+ echo "VERSION must contain major and minor components" >&2
exit 1
fi
patch="${patch:-0}"
@@ -58,7 +58,7 @@ jobs:
for part_name in major minor patch; do
part="${!part_name}"
if ! [[ "$part" =~ ^[0-9]+$ ]]; then
- echo "Invalid ${part_name} component '$part' in VERSION_PREFIX" >&2
+ echo "Invalid ${part_name} component '$part' in VERSION" >&2
exit 1
fi
done
@@ -82,18 +82,18 @@ jobs:
fi
new_version="${major}.${minor}.${target_patch}"
- if [[ "$new_version" != "$VERSION_PREFIX_RAW" ]]; then
- echo "$new_version" > VERSION_PREFIX
+ if [[ "$new_version" != "$VERSION_RAW" ]]; then
+ echo "$new_version" > VERSION
git config --global user.name "github-actions[bot]"
git config --global user.email "github-actions[bot]@users.noreply.github.com"
- git add VERSION_PREFIX
+ git add VERSION
if ! git diff --cached --quiet; then
- branch_name="chore/bump-version-prefix-${new_version}-$(date +%s)"
+ branch_name="chore/bump-version-${new_version}-$(date +%s)"
git checkout -b "$branch_name"
- git commit -m "chore: bump version prefix to ${new_version} [skip ci]"
+ git commit -m "chore: bump version to ${new_version} [skip ci]"
git push origin "$branch_name"
- pr_title="chore: bump version prefix to ${new_version}"
- pr_body=$'Automated change to VERSION_PREFIX to align release workflow with master.\n\nCreated by GitHub Actions release workflow.'
+ pr_title="chore: bump version to ${new_version}"
+ pr_body=$'Automated change to VERSION to align release workflow with master.\n\nCreated by GitHub Actions release workflow.'
gh pr create --repo "$REPOSITORY" --base master --head "$branch_name" --title "$pr_title" --body "$pr_body"
pr_url="$(gh pr view "$branch_name" --repo "$REPOSITORY" --json url --jq '.url' 2>/dev/null || true)"
updated="true"
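The parsing rules the workflow enforces in bash (major and minor required, patch optional and defaulting to 0, every component strictly numeric) can be sketched standalone. `parseVersion` below is a hypothetical TypeScript helper for illustration, not part of the repo, and it ignores any components beyond the third:

```typescript
// Hypothetical helper mirroring the VERSION parsing rules in the
// workflow above: major and minor are required, patch defaults to 0,
// and every component must be strictly numeric.
function parseVersion(raw: string): string {
  const parts: Array<string> = raw.trim().split(".");
  const major: string = parts[0] ?? "";
  const minor: string = parts[1] ?? "";
  const patch: string = parts[2] ?? "0";
  if (major === "" || minor === "") {
    throw new Error("VERSION must contain major and minor components");
  }
  const components: Record<string, string> = { major, minor, patch };
  for (const name of Object.keys(components)) {
    const part: string = components[name] as string;
    if (!/^[0-9]+$/.test(part)) {
      throw new Error(`Invalid ${name} component '${part}' in VERSION`);
    }
  }
  return `${major}.${minor}.${patch}`;
}

// parseVersion("9.1") yields "9.1.0"; parseVersion("9.x.1") throws.
```

Normalizing the result back to `major.minor.patch` matches how the workflow writes `new_version` to the `VERSION` file.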

View File

@@ -41,15 +41,15 @@ jobs:
run: |
set -euo pipefail
- VERSION_PREFIX_RAW="$(tr -d ' \n' < VERSION_PREFIX)"
- if [[ -z "$VERSION_PREFIX_RAW" ]]; then
- echo "VERSION_PREFIX is empty" >&2
+ VERSION_RAW="$(tr -d ' \n' < VERSION)"
+ if [[ -z "$VERSION_RAW" ]]; then
+ echo "VERSION is empty" >&2
exit 1
fi
- IFS='.' read -r major minor patch <<< "$VERSION_PREFIX_RAW"
+ IFS='.' read -r major minor patch <<< "$VERSION_RAW"
if [[ -z "$minor" ]]; then
- echo "VERSION_PREFIX must contain major and minor components" >&2
+ echo "VERSION must contain major and minor components" >&2
exit 1
fi
patch="${patch:-0}"
@@ -57,7 +57,7 @@ jobs:
for part_name in major minor patch; do
part="${!part_name}"
if ! [[ "$part" =~ ^[0-9]+$ ]]; then
- echo "Invalid ${part_name} component '$part' in VERSION_PREFIX" >&2
+ echo "Invalid ${part_name} component '$part' in VERSION" >&2
exit 1
fi
done

View File

@@ -228,6 +228,43 @@ export default class Incident extends BaseModel {
})
public description?: string = undefined;
@ColumnAccessControl({
create: [
Permission.ProjectOwner,
Permission.ProjectAdmin,
Permission.ProjectMember,
Permission.CreateProjectIncident,
],
read: [
Permission.ProjectOwner,
Permission.ProjectAdmin,
Permission.ProjectMember,
Permission.ReadProjectIncident,
],
update: [
Permission.ProjectOwner,
Permission.ProjectAdmin,
Permission.ProjectMember,
Permission.EditProjectIncident,
],
})
@Index()
@TableColumn({
required: true,
type: TableColumnType.Date,
title: "Declared At",
description: "Date and time when this incident was declared.",
isDefaultValueColumn: true,
})
@Column({
type: ColumnType.Date,
nullable: false,
default: () => {
return "now()";
},
})
public declaredAt?: Date = undefined;
@Index()
@ColumnAccessControl({
create: [],

View File

@@ -101,7 +101,48 @@ export default class MicrosoftTeamsAPI {
supportsCalling: false,
supportsVideo: false,
// Provide basic command lists to improve client compatibility (esp. mobile)
- commandLists: [],
+ commandLists: [
{
scopes: ["team", "groupChat", "personal"],
commands: [
{
title: "help",
description:
"Show instructions for interacting with the OneUptime bot.",
},
{
title: "create incident",
description:
"Launch the adaptive card to declare a new incident in OneUptime.",
},
{
title: "create maintenance",
description:
"Open the workflow to schedule maintenance directly from Teams.",
},
{
title: "show active incidents",
description:
"List all ongoing incidents with severity and state context.",
},
{
title: "show scheduled maintenance",
description:
"Display upcoming scheduled maintenance events for the workspace.",
},
{
title: "show ongoing maintenance",
description:
"Surface maintenance windows that are currently in progress.",
},
{
title: "show active alerts",
description:
"Provide a summary of alerts that still require attention.",
},
],
},
],
},
],
permissions: ["identity", "messageTeamMembers"],

View File

@@ -1421,6 +1421,7 @@ export default class StatusPageAPI extends BaseAPI<
if (monitorsOnStatusPage.length > 0) {
let select: Select<Incident> = {
createdAt: true,
declaredAt: true,
title: true,
description: true,
_id: true,
@@ -1474,6 +1475,7 @@ export default class StatusPageAPI extends BaseAPI<
},
select: select,
sort: {
declaredAt: SortOrder.Descending,
createdAt: SortOrder.Descending,
},
@@ -3303,6 +3305,7 @@ export default class StatusPageAPI extends BaseAPI<
let selectIncidents: Select<Incident> = {
createdAt: true,
declaredAt: true,
title: true,
description: true,
_id: true,
@@ -3336,6 +3339,7 @@ export default class StatusPageAPI extends BaseAPI<
query: incidentQuery,
select: selectIncidents,
sort: {
declaredAt: SortOrder.Descending,
createdAt: SortOrder.Descending,
},
skip: 0,
@@ -3373,6 +3377,7 @@ export default class StatusPageAPI extends BaseAPI<
},
select: selectIncidents,
sort: {
declaredAt: SortOrder.Descending,
createdAt: SortOrder.Descending,
},

View File

@@ -0,0 +1,30 @@
import { MigrationInterface, QueryRunner } from "typeorm";
export class MigrationName1764324618043 implements MigrationInterface {
public name = "MigrationName1764324618043";
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
`ALTER TABLE "Incident" ADD "declaredAt" TIMESTAMP WITH TIME ZONE`,
);
await queryRunner.query(
`UPDATE "Incident" SET "declaredAt" = "createdAt" WHERE "declaredAt" IS NULL`,
);
await queryRunner.query(
`ALTER TABLE "Incident" ALTER COLUMN "declaredAt" SET DEFAULT now()`,
);
await queryRunner.query(
`ALTER TABLE "Incident" ALTER COLUMN "declaredAt" SET NOT NULL`,
);
await queryRunner.query(
`CREATE INDEX "IDX_b26979b9f119310661734465a4" ON "Incident" ("declaredAt") `,
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(
`DROP INDEX "public"."IDX_b26979b9f119310661734465a4"`,
);
await queryRunner.query(`ALTER TABLE "Incident" DROP COLUMN "declaredAt"`);
}
}

View File

@@ -186,6 +186,7 @@ import { MigrationName1763471659817 } from "./1763471659817-MigrationName";
import { MigrationName1763477560906 } from "./1763477560906-MigrationName";
import { MigrationName1763480947474 } from "./1763480947474-MigrationName";
import { MigrationName1763643080445 } from "./1763643080445-MigrationName";
import { MigrationName1764324618043 } from "./1764324618043-MigrationName";
export default [
InitialMigration,
@@ -376,4 +377,5 @@ export default [
MigrationName1763477560906,
MigrationName1763480947474,
MigrationName1763643080445,
MigrationName1764324618043,
];

View File

@@ -480,6 +480,14 @@ export class Service extends DatabaseService<Model> {
const projectId: ObjectID =
createBy.props.tenantId || createBy.data.projectId!;
if (!createBy.data.declaredAt) {
createBy.data.declaredAt = OneUptimeDate.getCurrentDate();
} else {
createBy.data.declaredAt = OneUptimeDate.fromString(
createBy.data.declaredAt as Date,
);
}
// Determine the initial incident state
let initialIncidentStateId: ObjectID | undefined = undefined;
@@ -975,6 +983,7 @@ ${incident.remediationNotes || "No remediation notes provided."}
notifyOwners: false,
rootCause: createdItem.rootCause,
stateChangeLog: createdItem.createdStateLog,
timelineStartsAt: createdItem.declaredAt,
props: {
isRoot: true,
},
@@ -1790,6 +1799,7 @@ ${incidentSeverity.name}
limit: LIMIT_MAX,
skip: 0,
select: {
_id: true,
projectId: true,
monitors: {
_id: true,
@@ -1821,6 +1831,18 @@ ${incidentSeverity.name}
incident.monitors,
);
}
if (incident.projectId && incident.id) {
await MetricService.deleteBy({
query: {
projectId: incident.projectId,
serviceId: incident.id,
},
props: {
isRoot: true,
},
});
}
}
}
@@ -1838,6 +1860,7 @@ ${incidentSeverity.name}
rootCause: string | undefined;
stateChangeLog: JSONObject | undefined;
props: DatabaseCommonInteractionProps | undefined;
timelineStartsAt?: Date | string | undefined;
}): Promise<void> {
const {
projectId,
@@ -1849,8 +1872,13 @@ ${incidentSeverity.name}
rootCause,
stateChangeLog,
props,
timelineStartsAt,
} = data;
const declaredTimelineStart: Date | undefined = timelineStartsAt
? OneUptimeDate.fromString(timelineStartsAt as Date)
: undefined;
// get last monitor status timeline.
const lastIncidentStatusTimeline: IncidentStateTimeline | null =
await IncidentStateTimelineService.findOneBy({
@@ -1888,6 +1916,10 @@ ${incidentSeverity.name}
statusTimeline.shouldStatusPageSubscribersBeNotified =
shouldNotifyStatusPageSubscribers;
if (!lastIncidentStatusTimeline && declaredTimelineStart) {
statusTimeline.startsAt = declaredTimelineStart;
}
// Map boolean to enum value
statusTimeline.subscriberNotificationStatus = isSubscribersNotified
? StatusPageSubscriberNotificationStatus.Success
@@ -1914,6 +1946,7 @@ ${incidentSeverity.name}
id: data.incidentId,
select: {
projectId: true,
declaredAt: true,
monitors: {
_id: true,
name: true,
@@ -1970,6 +2003,7 @@ ${incidentSeverity.name}
await MetricService.deleteBy({
query: {
projectId: incident.projectId,
serviceId: data.incidentId,
},
props: {
@@ -1983,6 +2017,7 @@ ${incidentSeverity.name}
const incidentStartsAt: Date =
firstIncidentStateTimeline?.startsAt ||
incident.declaredAt ||
incident.createdAt ||
OneUptimeDate.getCurrentDate();
@@ -2075,6 +2110,7 @@ ${incidentSeverity.name}
timeToAcknowledgeMetric.time =
ackIncidentStateTimeline?.startsAt ||
incident.declaredAt ||
incident.createdAt ||
OneUptimeDate.getCurrentDate();
timeToAcknowledgeMetric.timeUnixNano = OneUptimeDate.toUnixNano(
@@ -2140,6 +2176,7 @@ ${incidentSeverity.name}
timeToResolveMetric.time =
resolvedIncidentStateTimeline?.startsAt ||
incident.declaredAt ||
incident.createdAt ||
OneUptimeDate.getCurrentDate();
timeToResolveMetric.timeUnixNano = OneUptimeDate.toUnixNano(
@@ -2200,6 +2237,7 @@ ${incidentSeverity.name}
incidentDurationMetric.time =
lastIncidentStateTimeline?.startsAt ||
incident.declaredAt ||
incident.createdAt ||
OneUptimeDate.getCurrentDate();
incidentDurationMetric.timeUnixNano = OneUptimeDate.toUnixNano(
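The metric timestamps above repeatedly fall back from a state-timeline `startsAt` to `declaredAt`, then `createdAt`, then the current time. A minimal sketch of that ordering (the helper name is made up for illustration):

```typescript
// Sketch of the timestamp fallback order used for incident metrics
// above: explicit timeline start, then declaredAt, then createdAt,
// then "now". resolveIncidentStartTime is a hypothetical name.
function resolveIncidentStartTime(
  timelineStartsAt: Date | undefined,
  declaredAt: Date | undefined,
  createdAt: Date | undefined,
): Date {
  return timelineStartsAt || declaredAt || createdAt || new Date();
}
```

Putting `declaredAt` ahead of `createdAt` is the point of the change: incidents declared retroactively get metrics anchored to when they were declared, not when the row was inserted.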

View File

@@ -63,14 +63,13 @@ import MonitorFeedService from "./MonitorFeedService";
import { MonitorFeedEventType } from "../../Models/DatabaseModels/MonitorFeed";
import { Gray500, Green500 } from "../../Types/BrandColors";
import LabelService from "./LabelService";
import QueryOperator from "../../Types/BaseDatabase/QueryOperator";
import { FindWhere } from "../../Types/BaseDatabase/Query";
import logger from "../Utils/Logger";
import PushNotificationUtil from "../Utils/PushNotificationUtil";
import ExceptionMessages from "../../Types/Exception/ExceptionMessages";
import Project from "../../Models/DatabaseModels/Project";
import { createWhatsAppMessageFromTemplate } from "../Utils/WhatsAppTemplateUtil";
import { WhatsAppMessagePayload } from "../../Types/WhatsApp/WhatsAppMessage";
import MetricService from "./MetricService";
export class Service extends DatabaseService<Model> {
public constructor() {
@@ -136,12 +135,26 @@ export class Service extends DatabaseService<Model> {
protected override async onBeforeDelete(
deleteBy: DeleteBy<Model>,
): Promise<OnDelete<Model>> {
- if (deleteBy.query._id) {
- // delete all the status page resource for this monitor.
+ const monitorsPendingDeletion: Array<Model> = await this.findBy({
+ query: deleteBy.query,
+ limit: LIMIT_MAX,
+ skip: 0,
+ select: {
+ _id: true,
+ projectId: true,
+ },
+ props: deleteBy.props,
+ });
+ for (const monitor of monitorsPendingDeletion) {
+ if (!monitor.id) {
+ continue;
+ }
+ // delete all the status page resources for this monitor.
await StatusPageResourceService.deleteBy({
query: {
- monitorId: new ObjectID(deleteBy.query._id as string),
+ monitorId: monitor.id,
},
limit: LIMIT_MAX,
skip: 0,
@@ -150,37 +163,19 @@ export class Service extends DatabaseService<Model> {
},
});
- let projectId: FindWhere<ObjectID> | QueryOperator<ObjectID> | undefined =
- deleteBy.query.projectId || deleteBy.props.tenantId;
+ const projectId: ObjectID | undefined = monitor.projectId as
+ | ObjectID
+ | undefined;
if (!projectId) {
- // fetch this monitor from the database to get the projectId.
- const monitor: Model | null = await this.findOneById({
- id: new ObjectID(deleteBy.query._id as string) as ObjectID,
- select: {
- projectId: true,
- },
- props: {
- isRoot: true,
- },
- });
- if (!monitor) {
- throw new BadDataException(ExceptionMessages.MonitorNotFound);
- }
- if (!monitor.id) {
- throw new BadDataException(ExceptionMessages.MonitorNotFound);
- }
- projectId = monitor.projectId!;
+ continue;
}
try {
await WorkspaceNotificationRuleService.archiveWorkspaceChannels({
- projectId: projectId as ObjectID,
+ projectId: projectId,
notificationFor: {
- monitorId: new ObjectID(deleteBy.query._id as string) as ObjectID,
+ monitorId: monitor.id,
},
sendMessageBeforeArchiving: {
_type: "WorkspacePayloadMarkdown",
@@ -189,12 +184,17 @@ export class Service extends DatabaseService<Model> {
});
} catch (error) {
logger.error(
- `Error while archiving workspace channels for monitor ${deleteBy.query._id}: ${error}`,
+ `Error while archiving workspace channels for monitor ${monitor.id?.toString()}: ${error}`,
);
}
}
- return { deleteBy, carryForward: null };
+ return {
+ deleteBy,
+ carryForward: {
+ monitors: monitorsPendingDeletion,
+ },
+ };
}
@CaptureSpan()
@@ -208,6 +208,24 @@ export class Service extends DatabaseService<Model> {
);
}
if (onDelete.carryForward && onDelete.carryForward.monitors) {
for (const monitor of onDelete.carryForward.monitors as Array<Model>) {
if (!monitor.projectId || !monitor.id) {
continue;
}
await MetricService.deleteBy({
query: {
projectId: monitor.projectId,
serviceId: monitor.id,
},
props: {
isRoot: true,
},
});
}
}
return onDelete;
}
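The refactor above switches monitor deletion to a collect-then-clean-up pattern: `onBeforeDelete` snapshots the matching monitors into `carryForward`, and `onDelete` uses that snapshot to remove related metrics after the rows themselves are gone. A minimal in-memory sketch of the same pattern, with hypothetical names and a plain object standing in for the database:

```typescript
// In-memory sketch of collect-then-clean-up deletion: snapshot the
// matching rows first, delete them, then run per-row cleanup from the
// snapshot (as the MetricService cleanup does via carryForward above).
interface Monitor {
  id: string;
  projectId: string;
}

function deleteMonitors(
  store: Record<string, Monitor>,
  matches: (monitor: Monitor) => boolean,
  cleanup: (monitor: Monitor) => void,
): number {
  // "onBeforeDelete": snapshot what is about to be removed.
  const pending: Array<Monitor> = Object.keys(store)
    .map((id: string) => store[id] as Monitor)
    .filter(matches);
  for (const monitor of pending) {
    delete store[monitor.id]; // the actual delete
  }
  // "onDelete": the rows are gone, but the snapshot keeps their ids.
  for (const monitor of pending) {
    cleanup(monitor);
  }
  return pending.length;
}
```

The snapshot is what makes per-monitor cleanup possible after deletion, which the old single-id lookup could not do for bulk queries.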

View File

@@ -510,12 +510,30 @@ export class Service extends DatabaseService<ScheduledMaintenanceStateTimeline>
monitors: {
_id: true,
},
nextSubscriberNotificationBeforeTheEventAt: true,
},
props: {
isRoot: true,
},
});
const hasProgressedBeyondScheduledState: boolean = Boolean(
scheduledMaintenanceState && !scheduledMaintenanceState.isScheduledState,
);
if (
hasProgressedBeyondScheduledState &&
scheduledMaintenanceEvent?.nextSubscriberNotificationBeforeTheEventAt
) {
await ScheduledMaintenanceService.updateOneById({
id: createdItem.scheduledMaintenanceId!,
data: {
nextSubscriberNotificationBeforeTheEventAt: null,
},
props: onCreate.createBy.props,
});
}
if (isOngoingState) {
if (
scheduledMaintenanceEvent &&

View File

@@ -226,7 +226,11 @@ export default class Telemetry {
};
if (logRecordProcessors.length > 0) {
- loggerProviderConfig.processors = logRecordProcessors;
+ (
+ loggerProviderConfig as LoggerProviderConfig & {
+ processors?: Array<LogRecordProcessor>;
+ }
+ ).processors = logRecordProcessors;
}
this.loggerProvider = new LoggerProvider(loggerProviderConfig);
@@ -254,7 +258,11 @@ export default class Telemetry {
*/
if (logRecordProcessors.length > 0) {
- nodeSdkConfiguration.logRecordProcessors = logRecordProcessors;
+ (
+ nodeSdkConfiguration as opentelemetry.NodeSDKConfiguration & {
+ logRecordProcessors?: Array<LogRecordProcessor>;
+ }
+ ).logRecordProcessors = logRecordProcessors;
}
const sdk: opentelemetry.NodeSDK = new opentelemetry.NodeSDK(
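The Telemetry change above assigns through an inline intersection assertion, widening the declared config type just long enough to set an optional property it does not itself expose. A self-contained sketch of that pattern with made-up types:

```typescript
// Made-up types illustrating the inline intersection assertion used in
// the Telemetry change above: widen the declared type only at the
// assignment site to set an optional property it does not declare.
interface BaseConfig {
  name: string;
}
interface Processor {
  kind: string;
}

const config: BaseConfig = { name: "telemetry" };
const processors: Array<Processor> = [{ kind: "log" }];

if (processors.length > 0) {
  (config as BaseConfig & { processors?: Array<Processor> }).processors =
    processors;
}
```

Compared with a blanket `as any`, the intersection keeps the rest of the object fully typed and documents exactly which extra property is being attached.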

View File

@@ -16,7 +16,7 @@ export enum MicrosoftTeamsIncidentActionType {
SubmitExecuteIncidentOnCallPolicy = "SubmitExecuteIncidentOnCallPolicy",
ViewChangeIncidentState = "ViewChangeIncidentState",
SubmitChangeIncidentState = "SubmitChangeIncidentState",
- NewIncident = "/incident", // new incident slash command
+ NewIncident = "CreateIncident",
SubmitNewIncident = "SubmitNewIncident",
}
@@ -57,7 +57,7 @@ export enum MicrosoftTeamsScheduledMaintenanceActionType {
SubmitScheduledMaintenanceNote = "SubmitScheduledMaintenanceNote",
ViewChangeScheduledMaintenanceState = "ViewChangeScheduledMaintenanceState",
SubmitChangeScheduledMaintenanceState = "SubmitChangeScheduledMaintenanceState",
- NewScheduledMaintenance = "/maintenance", // new scheduled maintenance slash command
+ NewScheduledMaintenance = "CreateMaintenance",
SubmitNewScheduledMaintenance = "SubmitNewScheduledMaintenance",
}

View File

@@ -320,6 +320,7 @@ export default class MicrosoftTeamsIncidentActions {
name: true,
},
createdAt: true,
declaredAt: true,
},
props: {
isRoot: true,
@@ -331,7 +332,9 @@ export default class MicrosoftTeamsIncidentActions {
return;
}
- const message: string = `**Incident Details**\n\n**Title:** ${incident.title}\n**Description:** ${incident.description || "No description"}\n**State:** ${incident.currentIncidentState?.name || "Unknown"}\n**Severity:** ${incident.incidentSeverity?.name || "Unknown"}\n**Created At:** ${incident.createdAt ? new Date(incident.createdAt).toLocaleString() : "Unknown"}`;
+ const declaredAt: Date | undefined =
+ incident.declaredAt || incident.createdAt || undefined;
+ const message: string = `**Incident Details**\n\n**Title:** ${incident.title}\n**Description:** ${incident.description || "No description"}\n**State:** ${incident.currentIncidentState?.name || "Unknown"}\n**Severity:** ${incident.incidentSeverity?.name || "Unknown"}\n**Declared At:** ${declaredAt ? new Date(declaredAt).toLocaleString() : "Unknown"}`;
await turnContext.sendActivity(message);
return;

View File

@@ -1798,14 +1798,19 @@ export default class MicrosoftTeamsUtil extends WorkspaceBase {
let responseText: string = "";
try {
const isCreateIncidentCommand: boolean =
cleanText === "create incident" ||
cleanText.startsWith("create incident ");
const isCreateMaintenanceCommand: boolean =
cleanText === "create maintenance" ||
cleanText.startsWith("create maintenance ");
if (cleanText.includes("help") || cleanText === "") {
responseText = this.getHelpMessage();
- } else if (
- cleanText === "/incident" ||
- cleanText.startsWith("/incident ")
- ) {
- // Handle /incident slash command
- logger.debug("Processing /incident command");
+ } else if (isCreateIncidentCommand) {
+ // Handle create incident command (legacy slash command supported)
+ logger.debug("Processing create incident command");
const card: JSONObject =
await MicrosoftTeamsIncidentActions.buildNewIncidentCard(projectId);
await data.turnContext.sendActivity({
@@ -1818,12 +1823,9 @@ export default class MicrosoftTeamsUtil extends WorkspaceBase {
});
logger.debug("New incident card sent successfully");
return;
- } else if (
- cleanText === "/maintenance" ||
- cleanText.startsWith("/maintenance ")
- ) {
- // Handle /maintenance slash command
- logger.debug("Processing /maintenance command");
+ } else if (isCreateMaintenanceCommand) {
+ // Handle create maintenance command (legacy slash command supported)
+ logger.debug("Processing create maintenance command");
const card: JSONObject =
await MicrosoftTeamsScheduledMaintenanceActions.buildNewScheduledMaintenanceCard(
projectId,
@@ -1880,8 +1882,8 @@ export default class MicrosoftTeamsUtil extends WorkspaceBase {
**Available Commands:**
- **help** — Show this help message
- - **/incident** — Create a new incident
- - **/maintenance** — Create a new scheduled maintenance event
+ - **create incident** — Create a new incident
+ - **create maintenance** — Create a new scheduled maintenance event
- **show active incidents** — Display all currently active incidents
- **show scheduled maintenance** — Show upcoming scheduled maintenance events
- **show ongoing maintenance** — Display currently ongoing maintenance events
@@ -1929,11 +1931,13 @@ Just type any of these commands to get the information you need!`;
color: true,
},
createdAt: true,
declaredAt: true,
monitors: {
name: true,
},
},
sort: {
declaredAt: SortOrder.Descending,
createdAt: SortOrder.Descending,
},
limit: 10,
@@ -1958,8 +1962,10 @@ If you need to report an incident or check historical incidents, please visit th
for (const incident of activeIncidents) {
const severity: string = incident.incidentSeverity?.name || "Unknown";
const state: string = incident.currentIncidentState?.name || "Unknown";
- const createdAt: string = incident.createdAt
- ? OneUptimeDate.getDateAsFormattedString(incident.createdAt)
+ const declaredAt: Date | undefined =
+ incident.declaredAt || incident.createdAt;
+ const declaredAtText: string = declaredAt
+ ? OneUptimeDate.getDateAsFormattedString(declaredAt)
: "Unknown";
const severityIcon: string = ["Critical", "Major"].includes(severity)
@@ -1977,7 +1983,7 @@ If you need to report an incident or check historical incidents, please visit th
message += `${severityIcon} **[Incident #${incident.incidentNumber}: ${incident.title}](${incidentUrl.toString()})**
• **Severity:** ${severity}
• **Status:** ${state}
- • **Created:** ${createdAt}
+ • **Declared:** ${declaredAtText}
`;
if (incident.monitors && incident.monitors.length > 0) {
@@ -2716,11 +2722,11 @@ All monitoring checks are passing normally.`;
value: "Show quick help and useful links",
},
{
- title: "/incident",
+ title: "create incident",
value: "Create a new incident without leaving Teams",
},
{
- title: "/maintenance",
+ title: "create maintenance",
value: "Schedule or review maintenance windows",
},
{
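The handler above treats a command as matching when the text equals it exactly, or starts with it followed by a space. That rule can be factored into a small predicate (a hypothetical helper, for illustration only):

```typescript
// The exact-or-prefix rule used for "create incident" and
// "create maintenance" above: a command matches on exact equality or
// when it is a prefix followed by a space, so "create incidents" does
// not match "create incident".
function matchesCommand(cleanText: string, command: string): boolean {
  // indexOf(...) === 0 is equivalent to startsWith(command + " ").
  return cleanText === command || cleanText.indexOf(command + " ") === 0;
}
```

The space boundary is what lets `create incident sev1 outage` carry arguments while keeping unrelated longer words from triggering the command.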

Common/package-lock.json generated
View File

@@ -8412,39 +8412,39 @@
}
},
"node_modules/express": {
- "version": "4.21.2",
- "resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz",
- "integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==",
+ "version": "4.22.1",
+ "resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz",
+ "integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==",
"license": "MIT",
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
- "body-parser": "1.20.3",
- "content-disposition": "0.5.4",
+ "body-parser": "~1.20.3",
+ "content-disposition": "~0.5.4",
"content-type": "~1.0.4",
- "cookie": "0.7.1",
- "cookie-signature": "1.0.6",
+ "cookie": "~0.7.1",
+ "cookie-signature": "~1.0.6",
"debug": "2.6.9",
"depd": "2.0.0",
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"etag": "~1.8.1",
- "finalhandler": "1.3.1",
- "fresh": "0.5.2",
- "http-errors": "2.0.0",
+ "finalhandler": "~1.3.1",
+ "fresh": "~0.5.2",
+ "http-errors": "~2.0.0",
"merge-descriptors": "1.0.3",
"methods": "~1.1.2",
- "on-finished": "2.4.1",
+ "on-finished": "~2.4.1",
"parseurl": "~1.3.3",
- "path-to-regexp": "0.1.12",
+ "path-to-regexp": "~0.1.12",
"proxy-addr": "~2.0.7",
- "qs": "6.13.0",
+ "qs": "~6.14.0",
"range-parser": "~1.2.1",
"safe-buffer": "5.2.1",
- "send": "0.19.0",
- "serve-static": "1.16.2",
+ "send": "~0.19.0",
+ "serve-static": "~1.16.2",
"setprototypeof": "1.2.0",
- "statuses": "2.0.1",
+ "statuses": "~2.0.1",
"type-is": "~1.6.18",
"utils-merge": "1.0.1",
"vary": "~1.1.2"
@@ -8457,15 +8457,6 @@
"url": "https://opencollective.com/express"
}
},
- "node_modules/express/node_modules/cookie": {
- "version": "0.7.1",
- "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
- "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.6"
- }
- },
"node_modules/express/node_modules/debug": {
"version": "2.6.9",
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
@@ -8481,6 +8472,21 @@
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
"license": "MIT"
},
"node_modules/express/node_modules/qs": {
"version": "6.14.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.0.tgz",
"integrity": "sha512-YWWTjgABSKcvs/nWBi9PycY/JiPJqOD4JA6o9Sej2AtvSGarXxKC3OQSk4pAarbdQlKAh5D4FCQkJNkW+GAn3w==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.1.0"
},
"engines": {
"node": ">=0.6"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/extend": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
@@ -12986,9 +12992,9 @@
"license": "MIT"
},
"node_modules/nodemailer": {
"version": "7.0.7",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.7.tgz",
"integrity": "sha512-jGOaRznodf62TVzdyhKt/f1Q/c3kYynk8629sgJHpRzGZj01ezbgMMWJSAjHADcwTKxco3B68/R+KHJY2T5BaA==",
"version": "7.0.11",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.11.tgz",
"integrity": "sha512-gnXhNRE0FNhD7wPSCGhdNh46Hs6nm+uTyg+Kq0cZukNQiYdnCsoQjodNP9BQVG9XrcK/v6/MgpAPBUFyzh9pvw==",
"license": "MIT-0",
"engines": {
"node": ">=6.0.0"
@@ -15267,15 +15273,69 @@
}
},
"node_modules/side-channel": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.0.6.tgz",
"integrity": "sha512-fDW/EZ6Q9RiO8eFG8Hj+7u/oW+XrPTIChwCOM2+th2A6OblDtYYIpve9m+KvI9Z4C9qSEXlaGR6bTEYHReuglA==",
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.1.0.tgz",
"integrity": "sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==",
"license": "MIT",
"dependencies": {
"call-bind": "^1.0.7",
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.4",
"object-inspect": "^1.13.1"
"object-inspect": "^1.13.3",
"side-channel-list": "^1.0.0",
"side-channel-map": "^1.0.1",
"side-channel-weakmap": "^1.0.2"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/side-channel-list": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/side-channel-list/-/side-channel-list-1.0.0.tgz",
"integrity": "sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0",
"object-inspect": "^1.13.3"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/side-channel-map": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/side-channel-map/-/side-channel-map-1.0.1.tgz",
"integrity": "sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==",
"license": "MIT",
"dependencies": {
"call-bound": "^1.0.2",
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.5",
"object-inspect": "^1.13.3"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/side-channel-weakmap": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz",
"integrity": "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==",
"license": "MIT",
"dependencies": {
"call-bound": "^1.0.2",
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.5",
"object-inspect": "^1.13.3",
"side-channel-map": "^1.0.1"
},
"engines": {
"node": ">= 0.4"


@@ -47,8 +47,8 @@
"@asteasolutions/zod-to-openapi": "^7.3.2",
"@bull-board/express": "^5.21.4",
"@clickhouse/client": "^1.10.1",
"@hcaptcha/react-hcaptcha": "^1.14.0",
"@elastic/elasticsearch": "^8.12.1",
"@hcaptcha/react-hcaptcha": "^1.14.0",
"@monaco-editor/react": "^4.4.6",
"@opentelemetry/api": "^1.9.0",
"@opentelemetry/api-logs": "^0.206.0",


@@ -1,56 +0,0 @@
.git
node_modules
# See https://help.github.com/ignore-files/ for more about ignoring files.
# dependencies
/node_modules
node_modules
.idea
# testing
/coverage
# production
/build
# misc
.DS_Store
env.js
npm-debug.log*
yarn-debug.log*
yarn-error.log*
yarn.lock
Untitled-1
*.local.sh
*.local.yaml
run
stop
nohup.out*
encrypted-credentials.tar
encrypted-credentials/
_README.md
# Important Add production values to gitignore.
values-saas-production.yaml
kubernetes/values-saas-production.yaml
/private
/tls_cert.pem
/tls_key.pem
/keys
temp_readme.md
tests/coverage
settings.json
GoSDK/tester/


@@ -1,6 +0,0 @@
ONEUPTIME_URL=https://oneuptime.com
ONEUPTIME_REPOSITORY_SECRET_KEY=your-repository-secret-key
CODE_REPOSITORY_PASSWORD=
CODE_REPOSITORY_USERNAME=
# Optional. If this is left blank then this url will be ONEUPTIME_URL/llama
ONEUPTIME_LLM_SERVER_URL=


@@ -1 +0,0 @@
*.js text eol=lf

Copilot/.gitignore vendored

@@ -1,16 +1,4 @@
# See https://help.github.com/ignore-files/ for more about ignoring files.
# dependencies
#/backend/node_modules
/kubernetes
/node_modules
.idea
# misc
node_modules
build
*.log
.DS_Store
npm-debug.log*
yarn-debug.log*
yarn-error.log*
yarn.lock


@@ -1,76 +0,0 @@
import URL from "Common/Types/API/URL";
import LlmType from "./Types/LlmType";
import BadDataException from "Common/Types/Exception/BadDataException";
type GetStringFunction = () => string;
type GetStringOrNullFunction = () => string | null;
type GetURLFunction = () => URL;
export const MIN_ITEMS_IN_QUEUE_PER_SERVICE_CATALOG: number = 10;
export const GetIsCopilotDisabled: () => boolean = () => {
return process.env["DISABLE_COPILOT"] === "true";
};
export const GetOneUptimeURL: GetURLFunction = () => {
return URL.fromString(
process.env["ONEUPTIME_URL"] || "https://oneuptime.com",
);
};
export const GetRepositorySecretKey: GetStringOrNullFunction = ():
| string
| null => {
return process.env["ONEUPTIME_REPOSITORY_SECRET_KEY"] || null;
};
export const GetLocalRepositoryPath: GetStringFunction = (): string => {
return "/repository";
};
export const GetCodeRepositoryPassword: GetStringOrNullFunction = ():
| string
| null => {
const token: string | null = process.env["CODE_REPOSITORY_PASSWORD"] || null;
return token;
};
export const GetCodeRepositoryUsername: GetStringOrNullFunction = ():
| string
| null => {
const username: string | null =
process.env["CODE_REPOSITORY_USERNAME"] || null;
return username;
};
export const GetLlmServerUrl: GetURLFunction = () => {
if (!process.env["ONEUPTIME_LLM_SERVER_URL"]) {
throw new BadDataException("ONEUPTIME_LLM_SERVER_URL is not set");
}
return URL.fromString(process.env["ONEUPTIME_LLM_SERVER_URL"]);
};
export const GetOpenAIAPIKey: GetStringOrNullFunction = (): string | null => {
return process.env["OPENAI_API_KEY"] || null;
};
export const GetOpenAIModel: GetStringOrNullFunction = (): string | null => {
return process.env["OPENAI_MODEL"] || "gpt-4o";
};
type GetLlmTypeFunction = () => LlmType;
export const GetLlmType: GetLlmTypeFunction = (): LlmType => {
if (GetOpenAIAPIKey() && GetOpenAIModel()) {
return LlmType.OpenAI;
}
if (GetLlmServerUrl()) {
return LlmType.ONEUPTIME_LLM;
}
return LlmType.ONEUPTIME_LLM;
};
export const FixNumberOfCodeEventsInEachRun: number = 5;


@@ -25,6 +25,8 @@ ENV PLAYWRIGHT_SKIP_BROWSER_DOWNLOAD=1
RUN if [ -z "$APP_VERSION" ]; then export APP_VERSION=1.0.0; fi
RUN apt-get update
# Install bash.
RUN apt-get install bash -y && apt-get install curl -y
@@ -46,13 +48,6 @@ COPY ./Common /usr/src/Common
ENV PRODUCTION=true
WORKDIR /usr/src/app
@@ -61,12 +56,11 @@ WORKDIR /usr/src/app
COPY ./Copilot/package*.json /usr/src/app/
RUN npm install
# Create /repository/ directory where the app will store the repository
RUN mkdir -p /repository
# Set the stack trace limit to 0 to show full stack traces
ENV NODE_OPTIONS='--stack-trace-limit=30'
# Set the stack trace limit to 30 to show longer stack traces
ENV NODE_OPTIONS="--stack-trace-limit=30"
{{ if eq .Env.ENVIRONMENT "development" }}
#Run the app
@@ -75,9 +69,9 @@ CMD [ "npm", "run", "dev" ]
# Copy app source
COPY ./Copilot /usr/src/app
# Bundle app source
RUN npm run compile
RUN npm run build
# Set permission to write logs and cache in case container run as non root
RUN chown -R 1000:1000 "/tmp/npm" && chmod -R 2777 "/tmp/npm"
#Run the app
CMD [ "npm", "start" ]
{{ end }}
{{ end }}


@@ -1,8 +0,0 @@
import Exception from "Common/Types/Exception/Exception";
import ExceptionCode from "Common/Types/Exception/ExceptionCode";
export default class CopilotActionException extends Exception {
public constructor(code: ExceptionCode, message: string) {
super(code, message);
}
}


@@ -1,8 +0,0 @@
import Exception from "Common/Types/Exception/Exception";
import ExceptionCode from "Common/Types/Exception/ExceptionCode";
export default class CopilotActionProcessingException extends Exception {
public constructor(code: ExceptionCode, message: string) {
super(code, message);
}
}


@@ -1,8 +0,0 @@
import ExceptionCode from "Common/Types/Exception/ExceptionCode";
import CopilotActionProcessingException from "./CopilotActionProcessingException";
export default class ErrorGettingResponseFromLLM extends CopilotActionProcessingException {
public constructor(message: string) {
super(ExceptionCode.BadDataException, message);
}
}


@@ -1,8 +0,0 @@
import ExceptionCode from "Common/Types/Exception/ExceptionCode";
import CopilotActionProcessingException from "./CopilotActionProcessingException";
export default class LLMTimeoutException extends CopilotActionProcessingException {
public constructor(message: string) {
super(ExceptionCode.BadDataException, message);
}
}


@@ -1,8 +0,0 @@
import ExceptionCode from "Common/Types/Exception/ExceptionCode";
import CopilotActionProcessingException from "./CopilotActionProcessingException";
export default class NotAcceptedFileExtentionForCopilotAction extends CopilotActionProcessingException {
public constructor(message: string) {
super(ExceptionCode.BadDataException, message);
}
}


@@ -1,46 +0,0 @@
import CodeRepositoryUtil from "./Utils/CodeRepository";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import logger from "Common/Server/Utils/Logger";
import dotenv from "dotenv";
import Init from "./Init";
import Telemetry from "Common/Server/Utils/Telemetry";
const APP_NAME: string = "copilot";
dotenv.config();
logger.info("OneUptime Copilot is starting...");
// Initialize telemetry
Telemetry.init({
serviceName: APP_NAME,
});
Init()
.then(() => {
process.exit(0);
})
.catch(async (error: Error | HTTPErrorResponse) => {
try {
logger.error(error);
await CodeRepositoryUtil.discardChanges();
// change back to main branch.
await CodeRepositoryUtil.checkoutMainBranch();
} catch (e) {
logger.error(e);
// do nothing.
}
logger.error("Error in starting OneUptime Copilot: ");
if (error instanceof HTTPErrorResponse) {
logger.error(error.message);
} else if (error instanceof Error) {
logger.error(error.message);
} else {
logger.error(error);
}
process.exit(1);
});


@@ -1,231 +0,0 @@
import CodeRepositoryUtil, {
CodeRepositoryResult,
RepoScriptType,
} from "./Utils/CodeRepository";
import InitUtil from "./Utils/Init";
import ServiceRepositoryUtil from "./Utils/ServiceRepository";
import { PromiseVoidFunction } from "Common/Types/FunctionTypes";
import logger from "Common/Server/Utils/Logger";
import CopilotActionUtil from "./Utils/CopilotAction";
import CopilotAction from "Common/Models/DatabaseModels/CopilotAction";
import {
FixNumberOfCodeEventsInEachRun,
GetIsCopilotDisabled,
GetLlmType,
} from "./Config";
import CopilotActionService, {
CopilotExecutionResult,
} from "./Service/CopilotActions/Index";
import CopilotActionStatus from "Common/Types/Copilot/CopilotActionStatus";
import PullRequest from "Common/Types/CodeRepository/PullRequest";
import ServiceCopilotCodeRepository from "Common/Models/DatabaseModels/ServiceCopilotCodeRepository";
import CopilotActionProcessingException from "./Exceptions/CopilotActionProcessingException";
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
import ProcessUtil from "./Utils/Process";
let currentFixCount: number = 1;
const init: PromiseVoidFunction = async (): Promise<void> => {
// check if copilot is disabled.
if (GetIsCopilotDisabled()) {
logger.info("Copilot is disabled. Exiting.");
ProcessUtil.haltProcessWithSuccess();
}
logger.info(`Using ${GetLlmType()} as the AI model.`);
await CodeRepositoryUtil.setAuthorIdentity({
email: "copilot@oneuptime.com",
name: "OneUptime Copilot",
});
const codeRepositoryResult: CodeRepositoryResult = await InitUtil.init();
// before cloning the repo, check if there are any services to improve.
ServiceRepositoryUtil.setCodeRepositoryResult({
codeRepositoryResult,
});
const servicesToImprove: ServiceCopilotCodeRepository[] =
await ServiceRepositoryUtil.getServicesToImprove();
logger.debug(`Found ${servicesToImprove.length} services to improve.`);
// if no services to improve, then exit.
if (servicesToImprove.length === 0) {
logger.info("No services to improve. Exiting.");
ProcessUtil.haltProcessWithSuccess();
}
for (const serviceToImprove of servicesToImprove) {
logger.debug(`- ${serviceToImprove.serviceCatalog!.name}`);
}
await cloneRepository({
codeRepositoryResult,
});
await setUpRepository();
for (const serviceRepository of servicesToImprove) {
checkIfCurrentFixCountIsLessThanFixNumberOfCodeEventsInEachRun();
const actionsToWorkOn: Array<CopilotAction> =
await CopilotActionUtil.getActionsToWorkOn({
serviceCatalogId: serviceRepository.serviceCatalog!.id!,
serviceRepositoryId: serviceRepository.id!,
});
for (const actionToWorkOn of actionsToWorkOn) {
checkIfCurrentFixCountIsLessThanFixNumberOfCodeEventsInEachRun();
// check copilot events for this file.
let executionResult: CopilotExecutionResult | null = null;
let currentRetryCount: number = 0;
const maxRetryCount: number = 3;
while (currentRetryCount < maxRetryCount) {
try {
executionResult = await executeAction({
serviceRepository,
copilotAction: actionToWorkOn,
});
break;
} catch (e) {
logger.error(e);
currentRetryCount++;
await CodeRepositoryUtil.discardAllChangesOnCurrentBranch();
}
}
if (
executionResult &&
executionResult.status === CopilotActionStatus.PR_CREATED
) {
currentFixCount++;
}
}
}
};
interface ExecuteActionData {
serviceRepository: ServiceCopilotCodeRepository;
copilotAction: CopilotAction;
}
type ExecutionActionFunction = (
data: ExecuteActionData,
) => Promise<CopilotExecutionResult | null>;
const executeAction: ExecutionActionFunction = async (
data: ExecuteActionData,
): Promise<CopilotExecutionResult | null> => {
const { serviceRepository, copilotAction } = data;
try {
return await CopilotActionService.executeAction({
serviceRepository: serviceRepository,
copilotAction: copilotAction,
});
} catch (e) {
if (e instanceof CopilotActionProcessingException) {
// This is not a serious exception, so we just move on to the next action.
logger.info(e.message);
return null;
}
throw e;
}
};
type CloneRepositoryFunction = (data: {
codeRepositoryResult: CodeRepositoryResult;
}) => Promise<void>;
const cloneRepository: CloneRepositoryFunction = async (data: {
codeRepositoryResult: CodeRepositoryResult;
}): Promise<void> => {
const { codeRepositoryResult } = data;
logger.info(
`Cloning the repository ${codeRepositoryResult.codeRepository.name} to a temporary directory.`,
);
// now clone this repository to a temporary directory - /repository
await CodeRepositoryUtil.cloneRepository({
codeRepository: codeRepositoryResult.codeRepository,
});
// Check if OneUptime Copilot has setup properly.
const onAfterCloneScript: string | null =
await CodeRepositoryUtil.getRepoScript({
scriptType: RepoScriptType.OnAfterClone,
});
if (!onAfterCloneScript) {
logger.debug("No on-after-clone script found for this repository.");
}
if (onAfterCloneScript) {
logger.info("Executing on-after-clone script.");
await CodeRepositoryUtil.executeScript({
script: onAfterCloneScript,
});
logger.info("on-after-clone script executed successfully.");
}
logger.info(
`Repository ${codeRepositoryResult.codeRepository.name} cloned successfully.`,
);
};
const checkIfCurrentFixCountIsLessThanFixNumberOfCodeEventsInEachRun: VoidFunction =
(): void => {
if (currentFixCount <= FixNumberOfCodeEventsInEachRun) {
return;
}
logger.info(
`Copilot has fixed ${FixNumberOfCodeEventsInEachRun} code events. Thank you for using Copilot. If you wish to fix more code events, please run Copilot again.`,
);
ProcessUtil.haltProcessWithSuccess();
};
const setUpRepository: PromiseVoidFunction = async (): Promise<void> => {
const isSetupProperly: boolean =
await CodeRepositoryUtil.isRepoSetupProperly();
if (isSetupProperly) {
return;
}
// if the repo is not set up properly, then check if there's an outstanding setup PR for this repo.
logger.info("Setting up the repository.");
// check if there's an outstanding setup PR for this repo.
const setupPullRequest: CopilotPullRequest | null =
await CodeRepositoryUtil.getOpenSetupPullRequest();
if (setupPullRequest) {
logger.info(
`There's an open setup PR for this repository: ${setupPullRequest.pullRequestId}. Please merge this PR to continue using Copilot. Exiting...`,
);
ProcessUtil.haltProcessWithSuccess();
return;
}
// if there's no setup PR, then create a new setup PR.
const pullRequest: PullRequest = await CodeRepositoryUtil.setUpRepo();
logger.info(
"Repository setup PR created - #" +
pullRequest.pullRequestNumber +
". Please merge this PR to continue using Copilot. Exiting...",
);
ProcessUtil.haltProcessWithSuccess();
};
export default init;


@@ -1,6 +1,68 @@
# OneUptime Copilot
# OneUptime Copilot Agent
Copilot is a tool that helps you improve your codebase automatically.
A standalone CLI coding agent that mirrors the autonomous workflows we use inside VS Code Copilot Chat. It connects to an LM Studio-hosted OpenAI-compatible model, inspects a workspace, reasons about the task, and uses a toolbox (file/patch editing, search, terminal commands) to complete coding requests.
Please refer to the [official documentation](/Docs/Content/copilot) for more information.
## Prerequisites
- Node.js 18+
- An LM Studio instance exposing a chat completions endpoint (for example `http://localhost:1234/v1/chat/completions`).
- The workspace you want the agent to modify must already exist locally.
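The chat completions endpoint follows the OpenAI wire format. As a minimal sketch (the model name and endpoint URL here are illustrative placeholders, not values the agent hardcodes), the request body the agent POSTs looks like this:

```typescript
// Minimal sketch of an OpenAI-compatible chat completions request body,
// as LM Studio expects it. Model name and endpoint are placeholders.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  temperature: number = 0.1,
): string {
  // The server expects a JSON body with model, messages, and sampling options.
  return JSON.stringify({ model, messages, temperature });
}

const body: string = buildChatRequest("lmstudio", [
  { role: "user", content: "Say hello" },
]);
// POST `body` to e.g. http://localhost:1234/v1/chat/completions
// with header Content-Type: application/json.
```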
## Installation
```bash
cd Copilot/oneuptime-copilot-agent
npm install
npm run build
npm link # optional, provides the global oneuptime-copilot-agent command
```
## Usage
```bash
oneuptime-copilot-agent \
--prompt "Refactor auth middleware and add unit tests" \
--model http://localhost:1234/v1/chat/completions \
--model-name Meta-Llama-3-8B-Instruct \
--workspace-path /path/to/oneuptime
```
### CLI options
| Flag | Description |
| ---- | ----------- |
| `--prompt` | Required. Natural language description of the task. |
| `--model` | Required. Full LM Studio chat completions endpoint URL. |
| `--workspace-path` | Required. Absolute or relative path to the repo the agent should use. |
| `--model-name` | Optional model identifier that LM Studio expects (default `lmstudio`). |
| `--temperature` | Sampling temperature (default `0.1`). |
| `--max-iterations` | Maximum agent/tool-call loops before stopping (default `12`). |
| `--timeout` | LLM HTTP timeout per request in milliseconds (default `120000`). |
| `--api-key` | Optional bearer token if the endpoint is secured. |
| `--log-level` | `debug`, `info`, `warn`, or `error` (default `info`). |
| `--log-file` | Optional file path. When provided, all logs are appended to this file in addition to stdout. |
## Architecture snapshot
- `src/agent`: Orchestrates the conversation loop, builds the system prompt (inspired by the VS Code Copilot agent), snapshots the workspace, and streams messages to the LM Studio endpoint.
- `src/tools`: Implements the toolbelt (`list_directory`, `read_file`, `search_workspace`, `apply_patch`, `write_file`, `run_command`). These wrap `Common` utilities (`Execute`, `LocalFile`, `Logger`) to stay consistent with other OneUptime services.
- `src/llm`: Thin LM Studio/OpenAI-compatible client using `undici` with timeout and error handling.
- `src/@types/Common`: Lightweight shim typings so TypeScript consumers get the pieces of `Common` they need without re-compiling that entire package.
## Development scripts
```bash
npm run build # Compile TypeScript -> build/dist
npm run dev # Run with ts-node for quick experiments
```
For example:
```bash
npm run dev -- --prompt "Write tests for this project" \
--model http://localhost:1234/v1/chat/completions \
--model-name openai/gpt-oss-20b \
--workspace-path ./
```
The agent intentionally mirrors Copilot's workflow: it iteratively plans, reads files, edits them through patches or full rewrites, and executes commands/tests via the terminal tool. Logs stream to stdout so you can follow each tool invocation in real time.
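That iterative loop can be sketched as follows. This is a simplified model, not the actual implementation: `callModel` and `runTool` stand in for the real LLM client and toolbelt, and the iteration cap mirrors the `--max-iterations` flag.

```typescript
// Simplified sketch of the plan -> tool-call -> observe loop described above.
// callModel and runTool are stand-ins for the real LLM client and toolbelt.
type ToolCall = { tool: string; args: string };
type ModelReply = { done: boolean; toolCall?: ToolCall; answer?: string };

function runAgentLoop(
  callModel: (history: string[]) => ModelReply,
  runTool: (call: ToolCall) => string,
  maxIterations: number = 12,
): string {
  const history: string[] = [];
  for (let i = 0; i < maxIterations; i++) {
    const reply: ModelReply = callModel(history);
    if (reply.done) {
      return reply.answer ?? "";
    }
    if (reply.toolCall) {
      // Feed the tool result back so the model can observe and re-plan.
      history.push(runTool(reply.toolCall));
    }
  }
  return "stopped: max iterations reached";
}
```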


@@ -1,427 +0,0 @@
import CopilotActionType from "Common/Types/Copilot/CopilotActionType";
import CopilotActionBase from "./CopilotActionsBase";
import CodeRepositoryUtil from "../../Utils/CodeRepository";
import TechStack from "Common/Types/ServiceCatalog/TechStack";
import { CopilotPromptResult } from "../LLM/LLMBase";
import Text from "Common/Types/Text";
import { CopilotActionPrompt, CopilotProcess } from "./Types";
import { PromptRole } from "../LLM/Prompt";
import logger from "Common/Server/Utils/Logger";
import FileActionProp from "Common/Types/Copilot/CopilotActionProps/FileActionProp";
import CodeRepositoryFile from "Common/Server/Utils/CodeRepository/CodeRepositoryFile";
import CopilotActionUtil from "../../Utils/CopilotAction";
import ObjectID from "Common/Types/ObjectID";
import CopilotAction from "Common/Models/DatabaseModels/CopilotAction";
import ServiceRepositoryUtil from "../../Utils/ServiceRepository";
import Dictionary from "Common/Types/Dictionary";
import ArrayUtil from "Common/Utils/Array";
import CopilotActionProp from "Common/Types/Copilot/CopilotActionProps/Index";
import BadDataException from "Common/Types/Exception/BadDataException";
import LocalFile from "Common/Server/Utils/LocalFile";
export default class AddSpans extends CopilotActionBase {
public isRequirementsMet: boolean = false;
public constructor() {
super();
this.copilotActionType = CopilotActionType.ADD_SPANS;
this.acceptFileExtentions = CodeRepositoryUtil.getCodeFileExtentions();
}
protected override async isActionRequired(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
copilotActionProp: FileActionProp;
}): Promise<boolean> {
// check if the action has already been processed for this file.
const existingAction: CopilotAction | null =
await CopilotActionUtil.getExistingAction({
serviceCatalogId: data.serviceCatalogId,
actionType: this.copilotActionType,
actionProps: {
filePath: data.copilotActionProp.filePath, // has this action run on this file before?
},
});
if (!existingAction) {
return true;
}
return false;
}
public override async getActionPropsToQueue(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
maxActionsToQueue: number;
}): Promise<Array<CopilotActionProp>> {
// get files in the repo.
logger.debug(
`${this.copilotActionType} - Getting files to queue for improve comments.`,
);
let totalActionsToQueue: number = 0;
logger.debug(`${this.copilotActionType} - Reading files in the service.`);
const files: Dictionary<CodeRepositoryFile> =
await ServiceRepositoryUtil.getFilesByServiceCatalogId({
serviceCatalogId: data.serviceCatalogId,
});
logger.debug(
`${this.copilotActionType} - Files read. ${Object.keys(files).length} files found.`,
);
// get keys in random order.
let fileKeys: string[] = Object.keys(files);
//randomize the order of the files.
fileKeys = ArrayUtil.shuffle(fileKeys);
const actionsPropsQueued: Array<CopilotActionProp> = [];
for (const fileKey of fileKeys) {
// check if the file is in accepted file extentions.
const fileExtention: string = LocalFile.getFileExtension(
files[fileKey]!.filePath,
);
if (!this.acceptFileExtentions.includes(fileExtention)) {
continue;
}
const file: CodeRepositoryFile = files[fileKey]!;
logger.debug(
`${this.copilotActionType} - Checking file: ${file.filePath}`,
);
if (
await this.isActionRequired({
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
copilotActionProp: {
filePath: file.filePath,
},
})
) {
actionsPropsQueued.push({
filePath: file.filePath,
});
totalActionsToQueue++;
}
if (totalActionsToQueue >= data.maxActionsToQueue) {
break;
}
}
return actionsPropsQueued;
}
public override async getCommitMessage(
data: CopilotProcess,
): Promise<string> {
return "Add Spans in " + (data.actionProp as FileActionProp).filePath;
}
public override async getPullRequestTitle(
data: CopilotProcess,
): Promise<string> {
return "Add spans in " + (data.actionProp as FileActionProp).filePath;
}
public override async getPullRequestBody(
data: CopilotProcess,
): Promise<string> {
return `Add spans in ${(data.actionProp as FileActionProp).filePath}
${await this.getDefaultPullRequestBody()}
`;
}
public override isActionComplete(_data: CopilotProcess): Promise<boolean> {
return Promise.resolve(this.isRequirementsMet);
}
public override async onExecutionStep(
data: CopilotProcess,
): Promise<CopilotProcess> {
const filePath: string = (data.actionProp as FileActionProp).filePath;
if (!filePath) {
throw new BadDataException("File Path is not set in the action prop.");
}
const fileContent: string = await ServiceRepositoryUtil.getFileContent({
filePath: filePath,
});
const codeParts: string[] = await this.splitInputCode({
code: fileContent,
itemSize: 500,
});
let newContent: string = "";
let hasSpansBeenAdded: boolean = true;
for (const codePart of codeParts) {
const codePartResult: {
newCode: string;
hasSpansBeenAdded: boolean;
} = await this.addSpansInCode({
data: data,
codePart: codePart,
currentRetryCount: 0,
maxRetryCount: 3,
});
if (!codePartResult.hasSpansBeenAdded) {
hasSpansBeenAdded = false;
newContent += codePartResult.newCode + "\n";
} else {
newContent += codePart + "\n";
}
}
if (hasSpansBeenAdded) {
this.isRequirementsMet = true;
return data;
}
newContent = newContent.trim();
logger.debug("New Content:");
logger.debug(newContent);
const fileActionProps: FileActionProp = data.actionProp as FileActionProp;
// add to result.
data.result.files[fileActionProps.filePath] = {
fileContent: newContent,
} as CodeRepositoryFile;
this.isRequirementsMet = true;
return data;
}
private async didPassValidation(data: CopilotPromptResult): Promise<boolean> {
const validationResponse: string = data.output as string;
if (validationResponse === "--no--") {
return true;
}
return false;
}
private async hasSpansBeenAddedAlready(content: string): Promise<boolean> {
if (content.includes("--all-good--")) {
return true;
}
return false;
}
private async addSpansInCode(options: {
data: CopilotProcess;
codePart: string;
currentRetryCount: number;
maxRetryCount: number;
}): Promise<{
newCode: string;
hasSpansBeenAdded: boolean;
}> {
let hasSpansBeenAdded: boolean = true;
const codePart: string = options.codePart;
const data: CopilotProcess = options.data;
const actionPrompt: CopilotActionPrompt = await this.getPrompt(
data,
codePart,
);
const copilotResult: CopilotPromptResult =
await this.askCopilot(actionPrompt);
const newCodePart: string = await this.cleanupCode({
inputCode: codePart,
outputCode: copilotResult.output as string,
});
if (!(await this.hasSpansBeenAddedAlready(newCodePart))) {
hasSpansBeenAdded = false;
}
const validationPrompt: CopilotActionPrompt =
await this.getValidationPrompt({
oldCode: codePart,
newCode: newCodePart,
});
const validationResponse: CopilotPromptResult =
await this.askCopilot(validationPrompt);
const didPassValidation: boolean =
await this.didPassValidation(validationResponse);
if (
!didPassValidation &&
options.currentRetryCount < options.maxRetryCount
) {
return await this.addSpansInCode({
data: data,
codePart: codePart,
currentRetryCount: options.currentRetryCount + 1,
maxRetryCount: options.maxRetryCount,
});
}
if (!didPassValidation) {
return {
newCode: codePart,
hasSpansBeenAdded: false,
};
}
return {
newCode: newCodePart,
hasSpansBeenAdded: hasSpansBeenAdded,
};
}
private async getValidationPrompt(data: {
oldCode: string;
newCode: string;
}): Promise<CopilotActionPrompt> {
const oldCode: string = data.oldCode;
const newCode: string = data.newCode;
const prompt: string = `
I've asked to add open telemetry spans in the code.
This is the old code:
${oldCode}
----
This is the new code:
${newCode}
Was anything changed in the code except adding spans? If yes, please reply with the following text:
--yes--
If the code was NOT changed EXCEPT adding spans, please reply with the following text:
--no--
`;
const systemPrompt: string = await this.getSystemPrompt();
return {
messages: [
{
content: systemPrompt,
role: PromptRole.System,
},
{
content: prompt,
role: PromptRole.User,
},
],
};
}
public override async getPrompt(
_data: CopilotProcess,
inputCode: string,
): Promise<CopilotActionPrompt> {
/*
* const fileLanguage: TechStack = data.input.files[data.input.currentFilePath]
* ?.fileLanguage as TechStack;
*/
const fileLanguage: TechStack = TechStack.TypeScript;
const prompt: string = `Please add OpenTelemetry spans in the code to functions and methods. If spans are already added, do not modify them.
If you think functions in the code already have spans, please reply with the following text:
--all-good--
Here is the code. This is in ${fileLanguage}:
${inputCode}
`;
const systemPrompt: string = await this.getSystemPrompt();
return {
messages: [
{
content: systemPrompt,
role: PromptRole.System,
},
{
content: prompt,
role: PromptRole.User,
},
],
};
}
public async getSystemPrompt(): Promise<string> {
const systemPrompt: string = `You are an expert programmer. Here are your instructions:
- You will follow the instructions given by the user strictly.
- You will not deviate from the instructions given by the user.
- You will not only add OpenTelemetry Spans in this code. You will not do anything else.`;
return systemPrompt;
}
public async cleanupCode(data: {
inputCode: string;
outputCode: string;
}): Promise<string> {
/*
* this code contains text as well. The code is in between ```<type> and ```. Please extract the code and return it.
* for example code can be in the format of
* ```python
* print("Hello World")
* ```
*/
// so the code to be extracted is print("Hello World")
// the code can be in multiple lines as well.
let extractedCode: string = data.outputCode; // this is the code in the file
if (extractedCode.includes("```")) {
extractedCode = extractedCode.match(/```.*\n([\s\S]*?)```/)?.[1] ?? "";
}
// get first line of input code.
const firstWordOfInputCode: string = Text.getFirstWord(data.inputCode);
extractedCode = Text.trimStartUntilThisWord(
extractedCode,
firstWordOfInputCode,
);
const lastWordOfInputCode: string = Text.getLastWord(data.inputCode);
extractedCode = Text.trimEndUntilThisWord(
extractedCode,
lastWordOfInputCode,
);
extractedCode = Text.trimUpQuotesFromStartAndEnd(extractedCode);
// check for quotes.
return extractedCode;
}
}


@@ -1,299 +0,0 @@
import NotImplementedException from "Common/Types/Exception/NotImplementedException";
import LlmType from "../../Types/LlmType";
import CopilotActionType from "Common/Types/Copilot/CopilotActionType";
import LLM from "../LLM/LLM";
import { GetLlmType } from "../../Config";
import Text from "Common/Types/Text";
import { CopilotPromptResult } from "../LLM/LLMBase";
import BadDataException from "Common/Types/Exception/BadDataException";
import logger from "Common/Server/Utils/Logger";
import CodeRepositoryUtil, { RepoScriptType } from "../../Utils/CodeRepository";
import CopilotActionProp from "Common/Types/Copilot/CopilotActionProps/Index";
import ObjectID from "Common/Types/ObjectID";
import {
CopilotActionPrompt,
CopilotProcess,
CopilotProcessStart,
} from "./Types";
export default class CopilotActionBase {
public llmType: LlmType = LlmType.ONEUPTIME_LLM; // temp value which will be overridden in the constructor
public copilotActionType: CopilotActionType =
CopilotActionType.IMPROVE_COMMENTS; // temp value which will be overridden in the constructor
public acceptFileExtentions: string[] = [];
public constructor() {
this.llmType = GetLlmType();
}
protected async isActionRequired(_data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
copilotActionProp: CopilotActionProp;
}): Promise<boolean> {
throw new NotImplementedException();
}
public async getActionPropsToQueue(_data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
maxActionsToQueue: number;
}): Promise<Array<CopilotActionProp>> {
throw new NotImplementedException();
}
protected async validateExecutionStep(
_data: CopilotProcess,
): Promise<boolean> {
if (!this.copilotActionType) {
throw new BadDataException("Copilot Action Type is not set");
}
// validate by default.
return true;
}
protected async onAfterExecute(
data: CopilotProcess,
): Promise<CopilotProcess> {
// do nothing
return data;
}
protected async onBeforeExecute(
data: CopilotProcess,
): Promise<CopilotProcess> {
// do nothing
return data;
}
public async getBranchName(): Promise<string> {
const randomText: string = Text.generateRandomText(5);
const branchName: string = `${Text.pascalCaseToDashes(this.copilotActionType).toLowerCase()}-${randomText}`;
// replace -- with - in the branch name
return Text.replaceAll(branchName, "--", "-");
}
public async getPullRequestTitle(_data: CopilotProcess): Promise<string> {
throw new NotImplementedException();
}
public async getPullRequestBody(_data: CopilotProcess): Promise<string> {
throw new NotImplementedException();
}
protected async getDefaultPullRequestBody(): Promise<string> {
return `
#### Warning
This PR is generated by OneUptime Copilot. OneUptime Copilot is an AI tool that improves your code. Please do not rely on it completely. Always review the changes before merging.
#### Feedback
If you have any feedback or suggestions, please let us know. We would love to hear from you. Please contact us at copilot@oneuptime.com.
`;
}
public async getCommitMessage(_data: CopilotProcess): Promise<string> {
throw new NotImplementedException();
}
protected async onExecutionStep(
data: CopilotProcess,
): Promise<CopilotProcess> {
return Promise.resolve(data);
}
protected async isActionComplete(_data: CopilotProcess): Promise<boolean> {
return true; // by default the action is completed
}
protected async getNextFilePath(
_data: CopilotProcess,
): Promise<string | null> {
return null;
}
public async execute(
data: CopilotProcessStart,
): Promise<CopilotProcess | null> {
logger.info(
"Executing Copilot Action (this will take several minutes to complete): " +
this.copilotActionType,
);
logger.info(data.actionProp);
const onBeforeExecuteActionScript: string | null =
await CodeRepositoryUtil.getRepoScript({
scriptType: RepoScriptType.OnBeforeCodeChange,
});
if (!onBeforeExecuteActionScript) {
logger.debug(
"No on-before-copilot-action script found for this repository.",
);
} else {
logger.info("Executing on-before-copilot-action script.");
await CodeRepositoryUtil.executeScript({
script: onBeforeExecuteActionScript,
});
logger.info("on-before-copilot-action script executed successfully");
}
let processData: CopilotProcess = await this.onBeforeExecute({
...data,
result: {
files: {},
statusMessage: "",
logs: [],
},
});
if (!processData.result) {
processData.result = {
files: {},
statusMessage: "",
logs: [],
};
}
if (!processData.result.files) {
processData.result.files = {};
}
let isActionComplete: boolean = false;
while (!isActionComplete) {
if (!(await this.validateExecutionStep(processData))) {
/*
 * execution step is not valid,
 * so return the data as it is.
 */
return processData;
}
// carry each step's result forward so changes made by subclass hooks are not lost.
processData = await this.onExecutionStep(processData);
isActionComplete = await this.isActionComplete(processData);
}
processData = await this.onAfterExecute(processData);
// write to disk.
await this.writeToDisk({ data: processData });
const onAfterExecuteActionScript: string | null =
await CodeRepositoryUtil.getRepoScript({
scriptType: RepoScriptType.OnAfterCodeChange,
});
if (!onAfterExecuteActionScript) {
logger.debug(
"No on-after-copilot-action script found for this repository.",
);
} else {
logger.info("Executing on-after-copilot-action script.");
await CodeRepositoryUtil.executeScript({
script: onAfterExecuteActionScript,
});
logger.info("on-after-copilot-action script executed successfully");
}
return processData;
}
protected async _getPrompt(
data: CopilotProcess,
inputCode: string,
): Promise<CopilotActionPrompt | null> {
// delegate to the subclass-implemented getPrompt hook.
const prompt: CopilotActionPrompt | null = await this.getPrompt(
data,
inputCode,
);
return prompt;
}
protected async getPrompt(
_data: CopilotProcess,
_inputCode: string,
): Promise<CopilotActionPrompt | null> {
throw new NotImplementedException();
}
protected async askCopilot(
prompt: CopilotActionPrompt,
): Promise<CopilotPromptResult> {
return await LLM.getResponse(prompt);
}
protected async writeToDisk(data: { data: CopilotProcess }): Promise<void> {
// write all the modified files.
const processResult: CopilotProcess = data.data;
for (const filePath in processResult.result.files) {
logger.info(`Writing file: ${filePath}`);
logger.info(`File content: `);
logger.info(`${processResult.result.files[filePath]!.fileContent}`);
const code: string = processResult.result.files[filePath]!.fileContent;
await CodeRepositoryUtil.writeToFile({
filePath: filePath,
content: code,
});
}
}
protected async discardAllChanges(): Promise<void> {
await CodeRepositoryUtil.discardAllChangesOnCurrentBranch();
}
protected async splitInputCode(data: {
code: string;
itemSize: number;
}): Promise<string[]> {
const inputCode: string = data.code;
const items: Array<string> = [];
const linesInInputCode: Array<string> = inputCode.split("\n");
let currentItemSize: number = 0;
const maxItemSize: number = data.itemSize;
let currentItem: string = "";
for (const line of linesInInputCode) {
const words: Array<string> = line.split(" ");
// check if the current item size is less than the max item size
if (currentItemSize + words.length < maxItemSize) {
currentItem += line + "\n";
currentItemSize += words.length;
} else {
// start a new item
items.push(currentItem);
currentItem = line + "\n";
currentItemSize = words.length;
}
}
if (currentItem) {
items.push(currentItem);
}
return items;
}
}
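The `splitInputCode` method above chunks a file by a rough word budget before sending it to the LLM. A minimal standalone sketch of the same algorithm, with the `Common/*` dependencies removed (the function and variable names here are illustrative, not from the repo):

```typescript
// Sketch of the word-budget chunking used by splitInputCode above.
// Mirrors the original logic, including pushing the in-progress chunk
// whenever the next line would exceed the budget.
function splitByWordBudget(code: string, maxWords: number): string[] {
  const items: string[] = [];
  let current: string = "";
  let currentSize: number = 0;
  for (const line of code.split("\n")) {
    const words: number = line.split(" ").length;
    if (currentSize + words < maxWords) {
      // still within budget: append the line to the current chunk
      current += line + "\n";
      currentSize += words;
    } else {
      // budget exceeded: close the current chunk and start a new one
      items.push(current);
      current = line + "\n";
      currentSize = words;
    }
  }
  if (current) {
    items.push(current);
  }
  return items;
}
```

Note that, like the original, this splits on line boundaries only, so a single line longer than the budget still becomes its own chunk.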


@@ -1,446 +0,0 @@
import CopilotActionType from "Common/Types/Copilot/CopilotActionType";
import CopilotActionBase from "./CopilotActionsBase";
import CodeRepositoryUtil from "../../Utils/CodeRepository";
import TechStack from "Common/Types/ServiceCatalog/TechStack";
import { CopilotPromptResult } from "../LLM/LLMBase";
import Text from "Common/Types/Text";
import { CopilotActionPrompt, CopilotProcess } from "./Types";
import { PromptRole } from "../LLM/Prompt";
import logger from "Common/Server/Utils/Logger";
import FileActionProp from "Common/Types/Copilot/CopilotActionProps/FileActionProp";
import CodeRepositoryFile from "Common/Server/Utils/CodeRepository/CodeRepositoryFile";
import CopilotActionUtil from "../../Utils/CopilotAction";
import ObjectID from "Common/Types/ObjectID";
import CopilotAction from "Common/Models/DatabaseModels/CopilotAction";
import ServiceRepositoryUtil from "../../Utils/ServiceRepository";
import Dictionary from "Common/Types/Dictionary";
import ArrayUtil from "Common/Utils/Array";
import CopilotActionProp from "Common/Types/Copilot/CopilotActionProps/Index";
import BadDataException from "Common/Types/Exception/BadDataException";
import LocalFile from "Common/Server/Utils/LocalFile";
export default class ImproveComments extends CopilotActionBase {
public isRequirementsMet: boolean = false;
public constructor() {
super();
this.copilotActionType = CopilotActionType.IMPROVE_COMMENTS;
this.acceptFileExtentions = CodeRepositoryUtil.getCodeFileExtentions();
}
protected override async isActionRequired(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
copilotActionProp: FileActionProp;
}): Promise<boolean> {
// check if the action has already been processed for this file.
const existingAction: CopilotAction | null =
await CopilotActionUtil.getExistingAction({
serviceCatalogId: data.serviceCatalogId,
actionType: this.copilotActionType,
actionProps: {
filePath: data.copilotActionProp.filePath, // has this action run on this file before?
},
});
if (!existingAction) {
return true;
}
return false;
}
public override async getActionPropsToQueue(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
maxActionsToQueue: number;
}): Promise<Array<CopilotActionProp>> {
// get files in the repo.
logger.debug(
`${this.copilotActionType} - Getting files to queue for improve comments.`,
);
let totalActionsToQueue: number = 0;
logger.debug(`${this.copilotActionType} - Reading files in the service.`);
const files: Dictionary<CodeRepositoryFile> =
await ServiceRepositoryUtil.getFilesByServiceCatalogId({
serviceCatalogId: data.serviceCatalogId,
});
logger.debug(
`${this.copilotActionType} - Files read. ${Object.keys(files).length} files found.`,
);
// get keys in random order.
let fileKeys: string[] = Object.keys(files);
//randomize the order of the files.
fileKeys = ArrayUtil.shuffle(fileKeys);
const actionsPropsQueued: Array<CopilotActionProp> = [];
logger.debug(
`${this.copilotActionType} - Accepted File Extentions: ${this.acceptFileExtentions}`,
);
for (const fileKey of fileKeys) {
logger.debug(
`${this.copilotActionType} - Checking file: ${files[fileKey]!.filePath}`,
);
// check if the file is in accepted file extentions.
const fileExtention: string = LocalFile.getFileExtension(
files[fileKey]!.filePath,
);
logger.debug(
`${this.copilotActionType} - File Extention: ${fileExtention}`,
);
if (!this.acceptFileExtentions.includes(fileExtention)) {
logger.debug(
`${this.copilotActionType} - File is not in accepted file extentions. Skipping.`,
);
continue;
}
const file: CodeRepositoryFile = files[fileKey]!;
logger.debug(
`${this.copilotActionType} - Checking file: ${file.filePath}`,
);
if (
await this.isActionRequired({
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
copilotActionProp: {
filePath: file.filePath,
},
})
) {
actionsPropsQueued.push({
filePath: file.filePath,
});
totalActionsToQueue++;
}
if (totalActionsToQueue >= data.maxActionsToQueue) {
break;
}
}
return actionsPropsQueued;
}
public override async getCommitMessage(
data: CopilotProcess,
): Promise<string> {
return (
"Improved comments on " + (data.actionProp as FileActionProp).filePath
);
}
public override async getPullRequestTitle(
data: CopilotProcess,
): Promise<string> {
return (
"Improved comments on " + (data.actionProp as FileActionProp).filePath
);
}
public override async getPullRequestBody(
data: CopilotProcess,
): Promise<string> {
return `Improved comments on ${(data.actionProp as FileActionProp).filePath}
${await this.getDefaultPullRequestBody()}
`;
}
public override isActionComplete(_data: CopilotProcess): Promise<boolean> {
return Promise.resolve(this.isRequirementsMet);
}
public override async onExecutionStep(
data: CopilotProcess,
): Promise<CopilotProcess> {
const filePath: string = (data.actionProp as FileActionProp).filePath;
if (!filePath) {
throw new BadDataException("File Path is not set in the action prop.");
}
const fileContent: string = await ServiceRepositoryUtil.getFileContent({
filePath: filePath,
});
const codeParts: string[] = await this.splitInputCode({
code: fileContent,
itemSize: 500,
});
let newContent: string = "";
let isWellCommented: boolean = true;
for (const codePart of codeParts) {
const codePartResult: {
newCode: string;
isWellCommented: boolean;
} = await this.commentCodePart({
data: data,
codePart: codePart,
currentRetryCount: 0,
maxRetryCount: 3,
});
if (!codePartResult.isWellCommented) {
isWellCommented = false;
newContent += codePartResult.newCode + "\n";
} else {
newContent += codePart + "\n";
}
}
if (isWellCommented) {
this.isRequirementsMet = true;
return data;
}
newContent = newContent.trim();
logger.debug("New Content:");
logger.debug(newContent);
const fileActionProps: FileActionProp = data.actionProp as FileActionProp;
// add to result.
data.result.files[fileActionProps.filePath] = {
fileContent: newContent,
} as CodeRepositoryFile;
this.isRequirementsMet = true;
return data;
}
private async didPassValidation(data: CopilotPromptResult): Promise<boolean> {
const validationResponse: string = data.output as string;
if (validationResponse === "--no--") {
return true;
}
return false;
}
private async isFileAlreadyWellCommented(content: string): Promise<boolean> {
if (content.includes("--all-good--")) {
return true;
}
return false;
}
private async commentCodePart(options: {
data: CopilotProcess;
codePart: string;
currentRetryCount: number;
maxRetryCount: number;
}): Promise<{
newCode: string;
isWellCommented: boolean;
}> {
let isWellCommented: boolean = true;
const codePart: string = options.codePart;
const data: CopilotProcess = options.data;
const actionPrompt: CopilotActionPrompt = await this.getPrompt(
data,
codePart,
);
const copilotResult: CopilotPromptResult =
await this.askCopilot(actionPrompt);
const newCodePart: string = await this.cleanupCode({
inputCode: codePart,
outputCode: copilotResult.output as string,
});
if (!(await this.isFileAlreadyWellCommented(newCodePart))) {
isWellCommented = false;
}
const validationPrompt: CopilotActionPrompt =
await this.getValidationPrompt({
oldCode: codePart,
newCode: newCodePart,
});
const validationResponse: CopilotPromptResult =
await this.askCopilot(validationPrompt);
const didPassValidation: boolean =
await this.didPassValidation(validationResponse);
if (
!didPassValidation &&
options.currentRetryCount < options.maxRetryCount
) {
return await this.commentCodePart({
data: data,
codePart: codePart,
currentRetryCount: options.currentRetryCount + 1,
maxRetryCount: options.maxRetryCount,
});
}
if (!didPassValidation) {
return {
newCode: codePart,
isWellCommented: false,
};
}
return {
newCode: newCodePart,
isWellCommented: isWellCommented,
};
}
private async getValidationPrompt(data: {
oldCode: string;
newCode: string;
}): Promise<CopilotActionPrompt> {
const oldCode: string = data.oldCode;
const newCode: string = data.newCode;
const prompt: string = `
I've asked to improve comments in the code.
This is the old code:
${oldCode}
----
This is the new code:
${newCode}
Was anything other than comments changed in the code? If yes, please reply with the following text:
--yes--
If nothing other than comments was changed, please reply with the following text:
--no--
`;
const systemPrompt: string = await this.getSystemPrompt();
return {
messages: [
{
content: systemPrompt,
role: PromptRole.System,
},
{
content: prompt,
role: PromptRole.User,
},
],
};
}
public override async getPrompt(
_data: CopilotProcess,
inputCode: string,
): Promise<CopilotActionPrompt> {
/*
* const fileLanguage: TechStack = data.input.files[data.input.currentFilePath]
* ?.fileLanguage as TechStack;
*/
const fileLanguage: TechStack = TechStack.TypeScript;
const prompt: string = `Please improve the comments in this code. Please add only minimal comments, and comment code which is hard to understand. Please add comments on a new line and do not add inline comments.
If you think the code is already well commented, please reply with the following text:
--all-good--
Here is the code. This is in ${fileLanguage}:
${inputCode}
`;
const systemPrompt: string = await this.getSystemPrompt();
return {
messages: [
{
content: systemPrompt,
role: PromptRole.System,
},
{
content: prompt,
role: PromptRole.User,
},
],
};
}
public async getSystemPrompt(): Promise<string> {
const systemPrompt: string = `You are an expert programmer. Here are your instructions:
- You will follow the instructions given by the user strictly.
- You will not deviate from the instructions given by the user.
- You will not change the code. You will only improve the comments.`;
return systemPrompt;
}
public async cleanupCode(data: {
inputCode: string;
outputCode: string;
}): Promise<string> {
/*
 * The LLM output may contain prose as well as code. The code sits between
 * ```<type> and ``` fences, so extract it and return it. For example, the
 * output can be in the format of:
 * ```python
 * print("Hello World")
 * ```
 * where the code to be extracted is: print("Hello World")
 * The code can span multiple lines as well.
 */
let extractedCode: string = data.outputCode; // the raw LLM output, possibly with prose around the code
if (extractedCode.includes("```")) {
extractedCode = extractedCode.match(/```.*\n([\s\S]*?)```/)?.[1] ?? "";
}
// get first line of input code.
const firstWordOfInputCode: string = Text.getFirstWord(data.inputCode);
extractedCode = Text.trimStartUntilThisWord(
extractedCode,
firstWordOfInputCode,
);
const lastWordOfInputCode: string = Text.getLastWord(data.inputCode);
extractedCode = Text.trimEndUntilThisWord(
extractedCode,
lastWordOfInputCode,
);
// trim stray quotes from the start and end.
extractedCode = Text.trimUpQuotesFromStartAndEnd(extractedCode);
return extractedCode;
}
}
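The `cleanupCode` method above relies on a single regex to pull the code body out of a fenced block in the LLM reply. A minimal standalone sketch of just that extraction step (the function name is illustrative; the trim-until-first/last-word steps are omitted):

```typescript
// Sketch of the fence-extraction step in cleanupCode above:
// pull the body out of the first ```<lang> ... ``` block, if any.
function extractFencedCode(llmOutput: string): string {
  if (!llmOutput.includes("```")) {
    // no fences: treat the whole output as code
    return llmOutput;
  }
  // match "```", an optional language tag up to the newline,
  // then lazily capture everything up to the closing fence
  return llmOutput.match(/```.*\n([\s\S]*?)```/)?.[1] ?? "";
}
```

The lazy `[\s\S]*?` matters: a greedy match would swallow everything up to the last fence when the reply contains more than one code block.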


@@ -1,227 +0,0 @@
import CopilotActionType from "Common/Types/Copilot/CopilotActionType";
import ImproveComments from "./ImproveComments";
import Dictionary from "Common/Types/Dictionary";
import CopilotActionBase from "./CopilotActionsBase";
import BadDataException from "Common/Types/Exception/BadDataException";
import CodeRepositoryUtil, { RepoScriptType } from "../../Utils/CodeRepository";
import ServiceCopilotCodeRepository from "Common/Models/DatabaseModels/ServiceCopilotCodeRepository";
import PullRequest from "Common/Types/CodeRepository/PullRequest";
import CopilotAction from "Common/Models/DatabaseModels/CopilotAction";
import ObjectID from "Common/Types/ObjectID";
import CopilotActionStatus from "Common/Types/Copilot/CopilotActionStatus";
import logger from "Common/Server/Utils/Logger";
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
import CopilotPullRequestService from "../CopilotPullRequest";
import CopilotActionUtil from "../../Utils/CopilotAction";
import { CopilotProcess } from "./Types";
// import AddSpans from "./AddSpan";
export const ActionDictionary: Dictionary<typeof CopilotActionBase> = {
[CopilotActionType.IMPROVE_COMMENTS]: ImproveComments,
// [CopilotActionType.ADD_SPANS]: AddSpans,
};
export interface CopilotExecutionResult {
status: CopilotActionStatus;
pullRequest: PullRequest | null;
}
export default class CopilotActionService {
public static async executeAction(data: {
serviceRepository: ServiceCopilotCodeRepository;
copilotAction: CopilotAction;
}): Promise<CopilotExecutionResult> {
await CodeRepositoryUtil.discardAllChangesOnCurrentBranch();
await CodeRepositoryUtil.switchToMainBranch();
await CodeRepositoryUtil.pullChanges();
const ActionType: typeof CopilotActionBase | undefined =
ActionDictionary[data.copilotAction.copilotActionType!];
if (!ActionType) {
throw new BadDataException("Invalid CopilotActionType");
}
const action: CopilotActionBase = new ActionType() as CopilotActionBase;
// mark this action as processing.
await CopilotActionUtil.updateCopilotAction({
actionStatus: CopilotActionStatus.PROCESSING,
actionId: data.copilotAction.id!,
});
const processResult: CopilotProcess | null = await action.execute({
actionProp: data.copilotAction.copilotActionProp!,
});
let executionResult: CopilotExecutionResult = {
status: CopilotActionStatus.NO_ACTION_REQUIRED,
pullRequest: null,
};
let pullRequest: PullRequest | null = null;
if (
processResult &&
processResult.result &&
processResult.result.files &&
Object.keys(processResult.result.files).length > 0
) {
logger.info("Obtained result from Copilot Action");
logger.info("Committing the changes to the repository and creating a PR");
const branchName: string = CodeRepositoryUtil.getBranchName({
branchName: await action.getBranchName(),
});
// create a branch
await CodeRepositoryUtil.createBranch({
branchName: branchName,
});
// write all the modified files.
const filePaths: string[] = Object.keys(processResult.result.files);
// run on before commit script. This is the place where we can run tests.
const onBeforeCommitScript: string | null =
await CodeRepositoryUtil.getRepoScript({
scriptType: RepoScriptType.OnBeforeCommit,
});
if (!onBeforeCommitScript) {
logger.debug("No on-before-commit script found for this repository.");
} else {
logger.info("Executing on-before-commit script.");
await CodeRepositoryUtil.executeScript({
script: onBeforeCommitScript,
});
logger.info("on-before-commit script executed successfully.");
}
const commitMessage: string =
await action.getCommitMessage(processResult);
const onAfterCommitScript: string | null =
await CodeRepositoryUtil.getRepoScript({
scriptType: RepoScriptType.OnAfterCommit,
});
if (!onAfterCommitScript) {
logger.debug("No on-after-commit script found for this repository.");
} else {
logger.info("Executing on-after-commit script.");
await CodeRepositoryUtil.executeScript({
script: onAfterCommitScript,
});
logger.info("on-after-commit script executed successfully.");
}
// add files to stage
logger.info("Adding files to stage: ");
for (const filePath of filePaths) {
logger.info(`- ${filePath}`);
}
await CodeRepositoryUtil.addFilesToGit({
filePaths: filePaths,
});
// commit changes
logger.info("Committing changes");
await CodeRepositoryUtil.commitChanges({
message: commitMessage,
});
// push changes
logger.info("Pushing changes");
await CodeRepositoryUtil.pushChanges({
branchName: branchName,
});
// create a PR
logger.info("Creating a PR");
pullRequest = await CodeRepositoryUtil.createPullRequest({
branchName: branchName,
title: await action.getPullRequestTitle(processResult),
body: await action.getPullRequestBody(processResult),
});
// switch to main branch.
logger.info("Switching to main branch");
await CodeRepositoryUtil.switchToMainBranch();
//save the result to the database.
logger.info("Saving the result to the database");
executionResult = {
status: CopilotActionStatus.PR_CREATED,
pullRequest: pullRequest,
};
}
if (
!processResult ||
!processResult.result ||
!processResult.result.files ||
Object.keys(processResult.result.files).length === 0
) {
logger.info("No result obtained from Copilot Action");
}
const getCurrentCommitHash: string =
await CodeRepositoryUtil.getCurrentCommitHash();
await CopilotActionService.updateCopilotAction({
serviceCatalogId: data.serviceRepository.serviceCatalog!.id!,
serviceRepositoryId: data.serviceRepository.id!,
commitHash: getCurrentCommitHash,
pullRequest: pullRequest,
copilotActionStatus: executionResult.status,
copilotActionId: data.copilotAction.id!,
statusMessage: processResult?.result.statusMessage || "",
logs: processResult?.result.logs || [],
});
return executionResult;
}
private static async updateCopilotAction(data: {
copilotActionId: ObjectID;
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
commitHash: string;
pullRequest: PullRequest | null;
statusMessage: string;
logs: Array<string>;
copilotActionStatus: CopilotActionStatus;
}): Promise<void> {
// add the copilot action result to the database.
let copilotPullRequest: CopilotPullRequest | null = null;
if (data.pullRequest) {
copilotPullRequest =
await CopilotPullRequestService.addPullRequestToDatabase({
pullRequest: data.pullRequest,
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
});
}
await CopilotActionUtil.updateCopilotAction({
actionStatus: data.copilotActionStatus,
pullRequestId: copilotPullRequest ? copilotPullRequest.id! : undefined,
commitHash: data.commitHash,
statusMessage: data.statusMessage,
logs: data.logs,
actionId: data.copilotActionId,
});
}
}


@@ -1,28 +0,0 @@
import CodeRepositoryFile from "Common/Server/Utils/CodeRepository/CodeRepositoryFile";
import Dictionary from "Common/Types/Dictionary";
import { Prompt } from "../LLM/Prompt";
import CopilotActionProp from "Common/Types/Copilot/CopilotActionProps/Index";
export interface CopilotActionRunResult {
files: Dictionary<CodeRepositoryFile>;
statusMessage: string;
logs: Array<string>;
}
export interface CopilotActionPrompt {
messages: Array<Prompt>;
timeoutInMinutes?: number | undefined;
}
export interface CopilotActionVars {
currentFilePath: string;
files: Dictionary<CodeRepositoryFile>;
}
export interface CopilotProcessStart {
actionProp: CopilotActionProp;
}
export interface CopilotProcess extends CopilotProcessStart {
result: CopilotActionRunResult;
}


@@ -1,146 +0,0 @@
import BadDataException from "Common/Types/Exception/BadDataException";
import PullRequest from "Common/Types/CodeRepository/PullRequest";
import ObjectID from "Common/Types/ObjectID";
import URL from "Common/Types/API/URL";
import { GetOneUptimeURL, GetRepositorySecretKey } from "../Config";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import HTTPResponse from "Common/Types/API/HTTPResponse";
import { JSONObject } from "Common/Types/JSON";
import API from "Common/Utils/API";
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
import CodeRepositoryUtil from "../Utils/CodeRepository";
import PullRequestState from "Common/Types/CodeRepository/PullRequestState";
export default class CopilotPullRequestService {
public static async refreshPullRequestStatus(data: {
copilotPullRequest: CopilotPullRequest;
}): Promise<PullRequestState> {
if (!data.copilotPullRequest.pullRequestId) {
throw new BadDataException("Pull Request ID not found");
}
if (!data.copilotPullRequest.id) {
throw new BadDataException("Copilot Pull Request ID not found");
}
const currentState: PullRequestState =
await CodeRepositoryUtil.getPullRequestState({
pullRequestId: data.copilotPullRequest.pullRequestId,
});
// update the status of the pull request in the database.
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotPullRequest()
.getCrudApiPath()
?.toString()}/update-pull-request-status/${GetRepositorySecretKey()}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post({
url: url,
data: {
copilotPullRequestId: data.copilotPullRequest.id?.toString(),
copilotPullRequestStatus: currentState,
},
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
return currentState;
}
public static async getOpenPullRequestsFromDatabase(): Promise<
Array<CopilotPullRequest>
> {
// send this to the API.
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotPullRequest()
.getCrudApiPath()
?.toString()}/get-pending-pull-requests/${GetRepositorySecretKey()}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: url,
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
const copilotPullRequestsJsonArray: Array<JSONObject> = codeRepositoryResult
.data["copilotPullRequests"] as Array<JSONObject>;
return CopilotPullRequest.fromJSONArray(
copilotPullRequestsJsonArray,
CopilotPullRequest,
) as Array<CopilotPullRequest>;
}
public static async addPullRequestToDatabase(data: {
pullRequest: PullRequest;
serviceCatalogId?: ObjectID | undefined;
serviceRepositoryId?: ObjectID | undefined;
isSetupPullRequest?: boolean | undefined;
}): Promise<CopilotPullRequest> {
let copilotPullRequest: CopilotPullRequest | null = null;
if (data.pullRequest && data.pullRequest.pullRequestNumber) {
copilotPullRequest = new CopilotPullRequest();
copilotPullRequest.pullRequestId =
data.pullRequest.pullRequestNumber.toString();
copilotPullRequest.copilotPullRequestStatus = PullRequestState.Open;
if (data.serviceCatalogId) {
copilotPullRequest.serviceCatalogId = data.serviceCatalogId;
}
if (data.isSetupPullRequest) {
copilotPullRequest.isSetupPullRequest = data.isSetupPullRequest;
}
if (data.serviceRepositoryId) {
copilotPullRequest.serviceRepositoryId = data.serviceRepositoryId;
}
// send this to the API.
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotPullRequest()
.getCrudApiPath()
?.toString()}/add-pull-request/${GetRepositorySecretKey()}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post({
url: url,
data: {
copilotPullRequest: CopilotPullRequest.toJSON(
copilotPullRequest,
CopilotPullRequest,
),
},
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
copilotPullRequest = CopilotPullRequest.fromJSON(
codeRepositoryResult.data,
CopilotPullRequest,
) as CopilotPullRequest;
return copilotPullRequest;
}
throw new BadDataException("Pull Request Number not found");
}
}


@@ -1,24 +0,0 @@
import BadDataException from "Common/Types/Exception/BadDataException";
import { GetLlmType } from "../../Config";
import LlmType from "../../Types/LlmType";
import LlmBase, { CopilotPromptResult } from "./LLMBase";
import LLMServer from "./LLMServer";
import OpenAI from "./OpenAI";
import { CopilotActionPrompt } from "../CopilotActions/Types";
export default class LLM extends LlmBase {
public static override async getResponse(
data: CopilotActionPrompt,
): Promise<CopilotPromptResult> {
if (GetLlmType() === LlmType.ONEUPTIME_LLM) {
return await LLMServer.getResponse(data);
}
if (GetLlmType() === LlmType.OpenAI) {
return await OpenAI.getResponse(data);
}
throw new BadDataException("Invalid LLM type");
}
}


@@ -1,15 +0,0 @@
import NotImplementedException from "Common/Types/Exception/NotImplementedException";
import { JSONValue } from "Common/Types/JSON";
import { CopilotActionPrompt } from "../CopilotActions/Types";
export interface CopilotPromptResult {
output: JSONValue;
}
export default class LlmBase {
public static async getResponse(
_data: CopilotActionPrompt,
): Promise<CopilotPromptResult> {
throw new NotImplementedException();
}
}


@@ -1,152 +0,0 @@
import URL from "Common/Types/API/URL";
import { GetLlmServerUrl } from "../../Config";
import LlmBase, { CopilotPromptResult } from "./LLMBase";
import API from "Common/Utils/API";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import HTTPResponse from "Common/Types/API/HTTPResponse";
import { JSONArray, JSONObject } from "Common/Types/JSON";
import BadRequestException from "Common/Types/Exception/BadRequestException";
import Sleep from "Common/Types/Sleep";
import logger from "Common/Server/Utils/Logger";
import ErrorGettingResponseFromLLM from "../../Exceptions/ErrorGettingResponseFromLLM";
import BadOperationException from "Common/Types/Exception/BadOperationException";
import OneUptimeDate from "Common/Types/Date";
import LLMTimeoutException from "../../Exceptions/LLMTimeoutException";
import { CopilotActionPrompt } from "../CopilotActions/Types";
import { Prompt } from "./Prompt";
enum LlamaPromptStatus {
Processed = "processed",
NotFound = "not found",
Pending = "pending",
}
export default class Llama extends LlmBase {
public static override async getResponse(
data: CopilotActionPrompt,
): Promise<CopilotPromptResult> {
const serverUrl: URL = GetLlmServerUrl();
const response: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post<JSONObject>({
url: URL.fromString(serverUrl.toString()).addRoute("/prompt/"),
data: {
messages: data.messages.map((message: Prompt) => {
return {
content: message.content,
role: message.role,
};
}),
// secretkey: GetRepositorySecretKey(),
},
headers: {},
options: {
retries: 3,
exponentialBackoff: true,
},
});
if (response instanceof HTTPErrorResponse) {
throw response;
}
const result: JSONObject = response.data;
const idOfPrompt: string = result["id"] as string;
if (result["error"] && typeof result["error"] === "string") {
throw new BadOperationException(result["error"]);
}
// now check this prompt status.
let promptStatus: LlamaPromptStatus = LlamaPromptStatus.Pending;
let promptResult: JSONObject | null = null;
const currentDate: Date = OneUptimeDate.getCurrentDate();
const timeoutInMinutes: number = data.timeoutInMinutes || 5;
while (promptStatus === LlamaPromptStatus.Pending) {
const timeNow: Date = OneUptimeDate.getCurrentDate();
if (
OneUptimeDate.getDifferenceInMinutes(timeNow, currentDate) >
timeoutInMinutes
) {
throw new LLMTimeoutException(
`Timeout of ${timeoutInMinutes} minutes exceeded. Skipping the prompt.`,
);
}
const statusResponse: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post<JSONObject>({
url: URL.fromString(serverUrl.toString()).addRoute(`/prompt-result/`),
data: {
id: idOfPrompt,
// secretkey: GetRepositorySecretKey(),
},
headers: {},
options: {
retries: 3,
exponentialBackoff: true,
},
});
if (statusResponse instanceof HTTPErrorResponse) {
throw statusResponse;
}
if (
statusResponse.data["error"] &&
typeof statusResponse.data["error"] === "string"
) {
throw new BadOperationException(statusResponse.data["error"]);
}
const statusResult: JSONObject = statusResponse.data;
promptStatus = statusResult["status"] as LlamaPromptStatus;
if (promptStatus === LlamaPromptStatus.Processed) {
logger.debug("Prompt is processed");
promptResult = statusResult;
} else if (promptStatus === LlamaPromptStatus.NotFound) {
throw new ErrorGettingResponseFromLLM("Prompt not found on the LLM server");
} else if (promptStatus === LlamaPromptStatus.Pending) {
logger.debug("Prompt is still pending. Waiting for 1 second");
await Sleep.sleep(1000);
}
}
if (!promptResult) {
throw new BadRequestException("Failed to get response from Llama server");
}
if (
promptResult["output"] &&
(promptResult["output"] as JSONArray).length > 0
) {
promptResult = (promptResult["output"] as JSONArray)[0] as JSONObject;
}
if (promptResult && (promptResult as JSONObject)["generated_text"]) {
const arrayOfGeneratedText: JSONArray = (promptResult as JSONObject)[
"generated_text"
] as JSONArray;
// get last item
const lastItem: JSONObject = arrayOfGeneratedText[
arrayOfGeneratedText.length - 1
] as JSONObject;
if (lastItem["content"]) {
return {
output: lastItem["content"] as string,
};
}
}
throw new BadRequestException("Failed to get response from Llama server");
}
}
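The submit-then-poll loop above follows a common pattern: fire the request, then repeatedly check a status endpoint until the work is done or a deadline passes. A minimal standalone sketch of that pattern (the `pollUntil` helper and its types are illustrative, not part of the OneUptime codebase):

```typescript
// Shape returned by each status check: done with a value, or not done yet.
type PollResult<T> = { done: boolean; value?: T };

// Poll `check` until it reports done, sleeping between attempts,
// and throw once the overall timeout is exceeded.
async function pollUntil<T>(
  check: () => Promise<PollResult<T>>,
  options: { timeoutMs: number; intervalMs: number },
): Promise<T> {
  const start: number = Date.now();
  for (;;) {
    const result: PollResult<T> = await check();
    if (result.done) {
      return result.value as T;
    }
    if (Date.now() - start > options.timeoutMs) {
      throw new Error(`Timeout of ${options.timeoutMs} ms exceeded.`);
    }
    await new Promise((resolve) => setTimeout(resolve, options.intervalMs));
  }
}
```

Note the ordering choice, which mirrors the class above: the status check runs at least once before the timeout is evaluated, so even a very small timeout still attempts one poll.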


@@ -1,49 +0,0 @@
import OpenAI from "openai";
import { GetOpenAIAPIKey, GetOpenAIModel } from "../../Config";
import LlmBase, { CopilotPromptResult } from "./LLMBase";
import BadRequestException from "Common/Types/Exception/BadRequestException";
import { CopilotActionPrompt } from "../CopilotActions/Types";
import logger from "Common/Server/Utils/Logger";
export default class OpenAiLlm extends LlmBase {
public static openai: OpenAI | null = null;
public static override async getResponse(
data: CopilotActionPrompt,
): Promise<CopilotPromptResult> {
if (!GetOpenAIAPIKey() || !GetOpenAIModel()) {
throw new BadRequestException("OpenAI API Key or Model is not set");
}
if (!this.openai) {
this.openai = new OpenAI({
apiKey: GetOpenAIAPIKey() as string,
});
}
logger.debug("Getting response from OpenAI");
const chatCompletion: OpenAI.Chat.Completions.ChatCompletion =
await this.openai.chat.completions.create({
messages: data.messages,
model: GetOpenAIModel()!,
});
logger.debug("Got response from OpenAI");
if (
chatCompletion.choices.length > 0 &&
chatCompletion.choices[0]?.message?.content
) {
const response: string = chatCompletion.choices[0]!.message.content;
logger.debug(`Response from OpenAI: ${response}`);
return {
output: response,
};
}
throw new BadRequestException("Failed to get response from OpenAI server");
}
}


@@ -1,10 +0,0 @@
export enum PromptRole {
System = "system",
User = "user",
Assistant = "assistant",
}
export interface Prompt {
content: string;
role: PromptRole;
}
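For illustration, a self-contained sketch (re-declaring the types above) of how a conversation is assembled from `Prompt` objects and mapped to the plain `{ content, role }` payload that the Llama backend sends to its server:

```typescript
// Re-declared here so the sketch stands alone.
enum PromptRole {
  System = "system",
  User = "user",
  Assistant = "assistant",
}

interface Prompt {
  content: string;
  role: PromptRole;
}

// A two-message conversation: a system instruction and a user request.
const messages: Array<Prompt> = [
  { role: PromptRole.System, content: "You are a code reviewer." },
  { role: PromptRole.User, content: "Review this diff." },
];

// Map to the wire format; string enum members serialize to their values.
const payload: Array<{ content: string; role: string }> = messages.map(
  (message: Prompt) => {
    return { content: message.content, role: message.role };
  },
);
```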


@@ -1,15 +0,0 @@
## OneUptime Copilot
This folder contains the configuration files for the OneUptime Copilot. The Copilot is a tool that automatically improves your code. It can fix issues, improve code quality, and help you ship faster.
This folder has the following structure:
- `config.js`: The configuration file for the Copilot. You can customize the Copilot's behavior by changing this file.
- `scripts`: A folder containing scripts that the Copilot runs. These are hooks that run at different stages of the Copilot's process.
- `on-after-clone.sh`: A script that runs after the Copilot clones your repository.
- `on-before-code-change.sh`: A script that runs before the Copilot makes changes to your code.
- `on-after-code-change.sh`: A script that runs after the Copilot makes changes to your code.
- `on-before-commit.sh`: A script that runs before the Copilot commits changes to your repository.
- `on-after-commit.sh`: A script that runs after the Copilot commits changes to your repository.


@@ -1,10 +0,0 @@
// This is the configuration file for the oneuptime copilot.
const getCopilotConfig = () => {
return {
// The version of the schema for this configuration file.
schemaVersion: '1.0',
}
}
export default getCopilotConfig;


@@ -1,13 +0,0 @@
# Description: Copilot clones your repository in order to improve your code.
# This script runs after the clone process is completed.
# Some of the common tasks you can do here are:
# 1. Install dependencies
# 2. Run linting
# 3. Run tests
# 4. Run build
# 5. Run any other command that you want to run after the clone process is completed.
# If this script fails, copilot will not proceed with the next steps to improve your code.
# This step is to ensure that the code is in a good state before we start improving it.
# If you want to skip this script, you can keep this file empty.
# It's highly recommended to run linting and tests in this script to ensure the code is in a good state.
# This script will run on an Ubuntu machine, so make sure the commands you run are compatible with Ubuntu.


@@ -1,11 +0,0 @@
# Description: Copilot will run this script after we make improvements to your code and write it to disk.
# Some of the common tasks you can do here are:
# 1. Run linting
# 2. Run tests
# 3. Run build
# 4. Run any other command that you want to run after the code is changed.
# If this script fails, copilot will not commit the changes to your repository.
# This step is to ensure that the code is in a good state before we commit the changes.
# If you want to skip this script, you can keep this file empty.
# It's highly recommended to run linting and tests in this script to ensure the code is in a good state.
# This script will run on an Ubuntu machine, so make sure the commands you run are compatible with Ubuntu.


@@ -1 +0,0 @@
# Description: Copilot will run this script after the commit process is completed.


@@ -1,9 +0,0 @@
# Description: Copilot will run this script before we make changes to your code.
# Some of the common tasks you can do here are:
# 1. Install dependencies
# 2. Run any other command that you want to run before the code is changed.
# If this script fails, copilot will not make any changes to the code.
# This step is to ensure that the code is in a good state before we start making changes.
# If you want to skip this script, you can keep this file empty.
# It's highly recommended to run things like installing dependencies in this script.
# This script will run on an Ubuntu machine, so make sure the commands you run are compatible with Ubuntu.


@@ -1 +0,0 @@
# Description: Copilot will run this script before we commit the changes to your repository.


@@ -1,6 +0,0 @@
enum LlmType {
ONEUPTIME_LLM = "OneUptime LLM Server", // OneUptime custom LLM Server
OpenAI = "OpenAI",
}
export default LlmType;


@@ -1,822 +0,0 @@
import {
GetCodeRepositoryPassword,
GetCodeRepositoryUsername,
GetLocalRepositoryPath,
GetOneUptimeURL,
GetRepositorySecretKey,
} from "../Config";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import HTTPResponse from "Common/Types/API/HTTPResponse";
import URL from "Common/Types/API/URL";
import CodeRepositoryType from "Common/Types/CodeRepository/CodeRepositoryType";
import PullRequest from "Common/Types/CodeRepository/PullRequest";
import PullRequestState from "Common/Types/CodeRepository/PullRequestState";
import BadDataException from "Common/Types/Exception/BadDataException";
import { JSONArray, JSONObject } from "Common/Types/JSON";
import API from "Common/Utils/API";
import CodeRepositoryServerUtil from "Common/Server/Utils/CodeRepository/CodeRepository";
import GitHubUtil from "Common/Server/Utils/CodeRepository/GitHub/GitHub";
import LocalFile from "Common/Server/Utils/LocalFile";
import logger from "Common/Server/Utils/Logger";
import CopilotCodeRepository from "Common/Models/DatabaseModels/CopilotCodeRepository";
import ServiceCopilotCodeRepository from "Common/Models/DatabaseModels/ServiceCopilotCodeRepository";
import Text from "Common/Types/Text";
import Execute from "Common/Server/Utils/Execute";
import CopilotPullRequestService from "../Service/CopilotPullRequest";
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
export interface CodeRepositoryResult {
codeRepository: CopilotCodeRepository;
serviceRepositories: Array<ServiceCopilotCodeRepository>;
}
export interface ServiceToImproveResult {
serviceRepository: ServiceCopilotCodeRepository;
numberOfOpenPullRequests: number;
pullRequests: Array<CopilotPullRequest>;
}
export enum RepoScriptType {
OnAfterClone = "onAfterClone",
OnBeforeCommit = "onBeforeCommit",
OnAfterCommit = "onAfterCommit",
OnBeforeCodeChange = "onBeforeCodeChange",
OnAfterCodeChange = "onAfterCodeChange",
}
export default class CodeRepositoryUtil {
public static codeRepositoryResult: CodeRepositoryResult | null = null;
public static gitHubUtil: GitHubUtil | null = null;
public static folderNameOfClonedRepository: string | null = null;
public static async getCurrentCommitHash(): Promise<string> {
return await CodeRepositoryServerUtil.getCurrentCommitHash({
repoPath: this.getLocalRepositoryPath(),
});
}
public static isRepoCloned(): boolean {
return Boolean(this.folderNameOfClonedRepository);
}
public static async getOpenSetupPullRequest(): Promise<CopilotPullRequest | null> {
const openPullRequests: Array<CopilotPullRequest> =
await CopilotPullRequestService.getOpenPullRequestsFromDatabase();
for (const pullRequest of openPullRequests) {
if (pullRequest.isSetupPullRequest) {
return pullRequest;
}
}
return null;
}
public static getLocalRepositoryPath(): string {
if (this.folderNameOfClonedRepository) {
return LocalFile.sanitizeFilePath(
GetLocalRepositoryPath() + "/" + this.folderNameOfClonedRepository,
);
}
return GetLocalRepositoryPath();
}
public static async discardAllChangesOnCurrentBranch(): Promise<void> {
await CodeRepositoryServerUtil.discardAllChangesOnCurrentBranch({
repoPath: this.getLocalRepositoryPath(),
});
}
public static async setAuthorIdentity(data: {
name: string;
email: string;
}): Promise<void> {
await CodeRepositoryServerUtil.setAuthorIdentity({
repoPath: this.getLocalRepositoryPath(),
authorName: data.name,
authorEmail: data.email,
});
}
public static async getPullRequestState(data: {
pullRequestId: string;
}): Promise<PullRequestState> {
// check if org name and repo name is present.
if (!this.codeRepositoryResult?.codeRepository.organizationName) {
throw new BadDataException("Organization Name is required");
}
if (!this.codeRepositoryResult?.codeRepository.repositoryName) {
throw new BadDataException("Repository Name is required");
}
const githubUtil: GitHubUtil = this.getGitHubUtil();
if (!githubUtil) {
throw new BadDataException("GitHub Util is required");
}
const pullRequest: PullRequest | undefined =
await githubUtil.getPullRequestByNumber({
organizationName:
this.codeRepositoryResult.codeRepository.organizationName,
repositoryName: this.codeRepositoryResult.codeRepository.repositoryName,
pullRequestId: data.pullRequestId,
});
if (!pullRequest) {
throw new BadDataException("Pull Request not found");
}
return pullRequest.state;
}
public static async setUpRepo(): Promise<PullRequest> {
// check if the repository is setup properly.
const isRepoSetupProperly: boolean = await this.isRepoSetupProperly();
if (isRepoSetupProperly) {
throw new BadDataException("Repository is already setup properly.");
}
// otherwise, we copy the folder /usr/src/app/Templates/.oneuptime to the repository folder.
const templateFolderPath: string = LocalFile.sanitizeFilePath(
"/usr/src/app/Templates/.oneuptime",
);
const oneUptimeConfigPath: string = LocalFile.sanitizeFilePath(
this.getLocalRepositoryPath() + "/.oneuptime",
);
// create a new branch called oneuptime-copilot-setup
const branchName: string = "setup-" + Text.generateRandomText(5);
await this.createBranch({
branchName: branchName,
});
await LocalFile.makeDirectory(oneUptimeConfigPath);
await LocalFile.copyDirectory({
source: templateFolderPath,
destination: oneUptimeConfigPath,
});
// add all the files to the git.
await this.addAllChangedFilesToGit();
// commit the changes.
await this.commitChanges({
message: "OneUptime Copilot Setup",
});
// push changes to the repo.
await this.pushChanges({
branchName: branchName,
});
// create a pull request.
const pullRequest: PullRequest = await this.createPullRequest({
branchName: branchName,
title: "OneUptime Copilot Setup",
body: "This pull request is created by OneUptime Copilot to setup the repository.",
});
// save this to the database.
await CopilotPullRequestService.addPullRequestToDatabase({
pullRequest: pullRequest,
isSetupPullRequest: true,
});
return pullRequest;
}
public static async isRepoSetupProperly(): Promise<boolean> {
// check if .oneuptime folder exists.
const repoPath: string = this.getLocalRepositoryPath();
const oneUptimeFolderPath: string = LocalFile.sanitizeFilePath(
`${repoPath}/.oneuptime`,
);
const doesDirectoryExist: boolean =
await LocalFile.doesDirectoryExist(oneUptimeFolderPath);
if (!doesDirectoryExist) {
return false;
}
// check if .oneuptime/scripts folder exists.
const oneuptimeScriptsPath: string = LocalFile.sanitizeFilePath(
`${oneUptimeFolderPath}/scripts`,
);
const doesScriptsDirectoryExist: boolean =
await LocalFile.doesDirectoryExist(oneuptimeScriptsPath);
if (!doesScriptsDirectoryExist) {
return false;
}
return true; // return true if all checks pass.
}
public static addAllChangedFilesToGit(): Promise<void> {
return CodeRepositoryServerUtil.addAllChangedFilesToGit({
repoPath: this.getLocalRepositoryPath(),
});
}
// returns the folder name of the cloned repository.
public static async cloneRepository(data: {
codeRepository: CopilotCodeRepository;
}): Promise<void> {
// make sure this.getLocalRepositoryPath() is empty.
const repoLocalPath: string = this.getLocalRepositoryPath();
await LocalFile.deleteAllDataInDirectory(repoLocalPath);
await LocalFile.makeDirectory(repoLocalPath);
// validate that all required repository fields are present before cloning.
if (!data.codeRepository.repositoryHostedAt) {
throw new BadDataException("Repository Hosted At is required");
}
if (!data.codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
if (!data.codeRepository.organizationName) {
throw new BadDataException("Organization Name is required");
}
if (!data.codeRepository.repositoryName) {
throw new BadDataException("Repository Name is required");
}
const CodeRepositoryUsername: string | null = GetCodeRepositoryUsername();
if (!CodeRepositoryUsername) {
throw new BadDataException("Code Repository Username is required");
}
const CodeRepositoryPassword: string | null = GetCodeRepositoryPassword();
if (!CodeRepositoryPassword) {
throw new BadDataException("Code Repository Password is required");
}
const repoUrl: string = `https://${CodeRepositoryUsername}:${CodeRepositoryPassword}@${
data.codeRepository.repositoryHostedAt === CodeRepositoryType.GitHub
? "github.com"
: ""
}/${data.codeRepository.organizationName}/${data.codeRepository.repositoryName}.git`;
const folderName: string = await CodeRepositoryServerUtil.cloneRepository({
repoUrl: repoUrl,
repoPath: repoLocalPath,
});
this.folderNameOfClonedRepository = folderName;
logger.debug(`Repository cloned to ${repoLocalPath}/${folderName}`);
}
public static async executeScript(data: { script: string }): Promise<string> {
const commands: Array<string> = data.script
.split("\n")
.filter((command: string) => {
return command.trim() !== "" && !command.startsWith("#");
});
const results: Array<string> = [];
for (const command of commands) {
logger.info(`Executing command: ${command}`);
const commandResult: string = await Execute.executeCommand(command, {
cwd: this.getLocalRepositoryPath(),
});
if (commandResult) {
logger.info(`Command result: ${commandResult}`);
results.push(commandResult);
}
}
return results.join("\n");
}
public static async getRepoScript(data: {
scriptType: RepoScriptType;
}): Promise<string | null> {
const repoPath: string = this.getLocalRepositoryPath();
const oneUptimeFolderPath: string = LocalFile.sanitizeFilePath(
`${repoPath}/.oneuptime`,
);
const doesDirectoryExist: boolean =
await LocalFile.doesDirectoryExist(oneUptimeFolderPath);
if (!doesDirectoryExist) {
return null;
}
const oneuptimeScriptsPath: string = LocalFile.sanitizeFilePath(
`${oneUptimeFolderPath}/scripts`,
);
const doesScriptsDirectoryExist: boolean =
await LocalFile.doesDirectoryExist(oneuptimeScriptsPath);
if (!doesScriptsDirectoryExist) {
return null;
}
const scriptPath: string = LocalFile.sanitizeFilePath(
`${oneuptimeScriptsPath}/${Text.fromPascalCaseToDashes(data.scriptType)}.sh`,
);
const doesScriptExist: boolean = await LocalFile.doesFileExist(scriptPath);
if (!doesScriptExist) {
return null;
}
const scriptContent: string = await LocalFile.read(scriptPath);
return scriptContent.trim() || null;
}
public static hasOpenPRForFile(data: {
filePath: string;
pullRequests: Array<PullRequest>;
}): boolean {
const pullRequests: Array<PullRequest> = this.getOpenPRForFile(data);
return pullRequests.length > 0;
}
public static getOpenPRForFile(data: {
filePath: string;
pullRequests: Array<PullRequest>;
}): Array<PullRequest> {
const pullRequests: Array<PullRequest> = [];
for (const pullRequest of data.pullRequests) {
if (pullRequest.title.includes(data.filePath)) {
pullRequests.push(pullRequest);
}
}
return pullRequests;
}
public static async listFilesInDirectory(data: {
directoryPath: string;
}): Promise<Array<string>> {
return await CodeRepositoryServerUtil.listFilesInDirectory({
repoPath: this.getLocalRepositoryPath(),
directoryPath: data.directoryPath,
});
}
public static getGitHubUtil(): GitHubUtil {
if (!this.gitHubUtil) {
const gitHubToken: string | null = GetCodeRepositoryPassword();
const gitHubUsername: string | null = GetCodeRepositoryUsername();
if (!gitHubUsername) {
throw new BadDataException("GitHub Username is required");
}
if (!gitHubToken) {
throw new BadDataException("GitHub Token is required");
}
this.gitHubUtil = new GitHubUtil({
authToken: gitHubToken,
username: gitHubUsername!,
});
}
return this.gitHubUtil;
}
public static async pullChanges(): Promise<void> {
await CodeRepositoryServerUtil.pullChanges({
repoPath: this.getLocalRepositoryPath(),
});
}
public static getBranchName(data: { branchName: string }): string {
return "oneuptime-copilot-" + data.branchName;
}
public static async createBranch(data: {
branchName: string;
}): Promise<void> {
const branchName: string = this.getBranchName({
branchName: data.branchName,
});
await CodeRepositoryServerUtil.createBranch({
repoPath: this.getLocalRepositoryPath(),
branchName: branchName,
});
}
public static async createOrCheckoutBranch(data: {
branchName: string;
}): Promise<void> {
const branchName: string = this.getBranchName({
branchName: data.branchName,
});
await CodeRepositoryServerUtil.createOrCheckoutBranch({
repoPath: this.getLocalRepositoryPath(),
branchName: branchName,
});
}
public static async writeToFile(data: {
filePath: string;
content: string;
}): Promise<void> {
await CodeRepositoryServerUtil.writeToFile({
repoPath: this.getLocalRepositoryPath(),
filePath: data.filePath,
content: data.content,
});
}
public static async createDirectory(data: {
directoryPath: string;
}): Promise<void> {
await CodeRepositoryServerUtil.createDirectory({
repoPath: this.getLocalRepositoryPath(),
directoryPath: data.directoryPath,
});
}
public static async deleteFile(data: { filePath: string }): Promise<void> {
await CodeRepositoryServerUtil.deleteFile({
repoPath: this.getLocalRepositoryPath(),
filePath: data.filePath,
});
}
public static async deleteDirectory(data: {
directoryPath: string;
}): Promise<void> {
await CodeRepositoryServerUtil.deleteDirectory({
repoPath: this.getLocalRepositoryPath(),
directoryPath: data.directoryPath,
});
}
public static async discardChanges(): Promise<void> {
if (this.isRepoCloned()) {
await CodeRepositoryServerUtil.discardChanges({
repoPath: this.getLocalRepositoryPath(),
});
}
}
public static async checkoutBranch(data: {
branchName: string;
}): Promise<void> {
if (this.isRepoCloned()) {
await CodeRepositoryServerUtil.checkoutBranch({
repoPath: this.getLocalRepositoryPath(),
branchName: data.branchName,
});
}
}
public static async checkoutMainBranch(): Promise<void> {
if (!this.isRepoCloned()) {
return;
}
const codeRepository: CopilotCodeRepository =
await this.getCodeRepository();
if (!codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
await this.checkoutBranch({
branchName: codeRepository.mainBranchName!,
});
}
public static async addFilesToGit(data: {
filePaths: Array<string>;
}): Promise<void> {
await CodeRepositoryServerUtil.addFilesToGit({
repoPath: this.getLocalRepositoryPath(),
filePaths: data.filePaths,
});
}
public static async commitChanges(data: { message: string }): Promise<void> {
let username: string | null = null;
if (
this.codeRepositoryResult?.codeRepository.repositoryHostedAt ===
CodeRepositoryType.GitHub
) {
username = GetCodeRepositoryUsername();
}
if (!username) {
throw new BadDataException("Username is required");
}
await CodeRepositoryServerUtil.commitChanges({
repoPath: this.getLocalRepositoryPath(),
message: data.message,
});
}
public static async pushChanges(data: { branchName: string }): Promise<void> {
const branchName: string = this.getBranchName({
branchName: data.branchName,
});
const codeRepository: CopilotCodeRepository =
await this.getCodeRepository();
if (!codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
if (!codeRepository.organizationName) {
throw new BadDataException("Organization Name is required");
}
if (!codeRepository.repositoryName) {
throw new BadDataException("Repository Name is required");
}
if (codeRepository.repositoryHostedAt === CodeRepositoryType.GitHub) {
return await this.getGitHubUtil().pushChanges({
repoPath: this.getLocalRepositoryPath(),
branchName: branchName,
organizationName: codeRepository.organizationName,
repositoryName: codeRepository.repositoryName,
});
}
}
public static async switchToMainBranch(): Promise<void> {
const codeRepository: CopilotCodeRepository =
await this.getCodeRepository();
if (!codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
await this.checkoutBranch({
branchName: codeRepository.mainBranchName!,
});
}
public static async createPullRequest(data: {
branchName: string;
title: string;
body: string;
}): Promise<PullRequest> {
const branchName: string = this.getBranchName({
branchName: data.branchName,
});
const codeRepository: CopilotCodeRepository =
await this.getCodeRepository();
if (!codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
if (!codeRepository.organizationName) {
throw new BadDataException("Organization Name is required");
}
if (!codeRepository.repositoryName) {
throw new BadDataException("Repository Name is required");
}
if (codeRepository.repositoryHostedAt === CodeRepositoryType.GitHub) {
return await this.getGitHubUtil().createPullRequest({
headBranchName: branchName,
baseBranchName: codeRepository.mainBranchName,
organizationName: codeRepository.organizationName,
repositoryName: codeRepository.repositoryName,
title: data.title,
body: data.body,
});
}
throw new BadDataException("Code Repository type not supported");
}
public static async getServicesToImproveCode(data: {
codeRepository: CopilotCodeRepository;
serviceRepositories: Array<ServiceCopilotCodeRepository>;
openPullRequests: Array<CopilotPullRequest>;
}): Promise<Array<ServiceToImproveResult>> {
const servicesToImproveCode: Array<ServiceToImproveResult> = [];
for (const service of data.serviceRepositories) {
if (!data.codeRepository.mainBranchName) {
throw new BadDataException("Main Branch Name is required");
}
if (!data.codeRepository.organizationName) {
throw new BadDataException("Organization Name is required");
}
if (!data.codeRepository.repositoryName) {
throw new BadDataException("Repository Name is required");
}
if (!service.limitNumberOfOpenPullRequestsCount) {
throw new BadDataException(
"Limit Number Of Open Pull Requests Count is required",
);
}
if (
data.codeRepository.repositoryHostedAt === CodeRepositoryType.GitHub
) {
const gitHubToken: string | null = GetCodeRepositoryPassword();
if (!gitHubToken) {
throw new BadDataException("GitHub Token is required");
}
const pullRequestByService: Array<CopilotPullRequest> =
data.openPullRequests.filter((pullRequest: CopilotPullRequest) => {
return (
pullRequest.serviceRepositoryId?.toString() ===
service.id?.toString()
);
});
const numberOfPullRequestForThisService: number =
pullRequestByService.length;
if (
numberOfPullRequestForThisService <
service.limitNumberOfOpenPullRequestsCount
) {
servicesToImproveCode.push({
serviceRepository: service,
numberOfOpenPullRequests: numberOfPullRequestForThisService,
pullRequests: pullRequestByService,
});
logger.info(
`Service ${service.serviceCatalog?.name} has ${numberOfPullRequestForThisService} open pull requests. Limit is ${service.limitNumberOfOpenPullRequestsCount}. Adding to the list to improve code...`,
);
} else {
logger.warn(
`Service ${service.serviceCatalog?.name} has ${numberOfPullRequestForThisService} open pull requests. Limit is ${service.limitNumberOfOpenPullRequestsCount}. Skipping...`,
);
}
}
}
return servicesToImproveCode;
}
public static async getCodeRepositoryResult(): Promise<CodeRepositoryResult> {
if (this.codeRepositoryResult) {
return this.codeRepositoryResult;
}
logger.info("Fetching Code Repository...");
const repositorySecretKey: string | null = GetRepositorySecretKey();
if (!repositorySecretKey) {
throw new BadDataException("Repository Secret Key is required");
}
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotCodeRepository()
.getCrudApiPath()
?.toString()}/get-code-repository/${repositorySecretKey}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: url,
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
const codeRepository: CopilotCodeRepository =
CopilotCodeRepository.fromJSON(
codeRepositoryResult.data["codeRepository"] as JSONObject,
CopilotCodeRepository,
) as CopilotCodeRepository;
const servicesRepository: Array<ServiceCopilotCodeRepository> = (
codeRepositoryResult.data["servicesRepository"] as JSONArray
).map((serviceRepository: JSONObject) => {
return ServiceCopilotCodeRepository.fromJSON(
serviceRepository,
ServiceCopilotCodeRepository,
) as ServiceCopilotCodeRepository;
});
if (!codeRepository) {
throw new BadDataException(
"Code Repository not found with the secret key provided.",
);
}
if (!servicesRepository || servicesRepository.length === 0) {
throw new BadDataException(
"No services attached to this repository. Please attach services to this repository on OneUptime Dashboard.",
);
}
logger.info(`Code Repository found: ${codeRepository.name}`);
logger.info("Services found in the repository:");
servicesRepository.forEach(
(serviceRepository: ServiceCopilotCodeRepository) => {
logger.info(`- ${serviceRepository.serviceCatalog?.name}`);
},
);
this.codeRepositoryResult = {
codeRepository,
serviceRepositories: servicesRepository,
};
return this.codeRepositoryResult;
}
public static async getCodeRepository(): Promise<CopilotCodeRepository> {
if (!this.codeRepositoryResult) {
const result: CodeRepositoryResult = await this.getCodeRepositoryResult();
return result.codeRepository;
}
return this.codeRepositoryResult.codeRepository;
}
public static getCodeFileExtentions(): Array<string> {
const extensions: Array<string> = [
"ts",
"js",
"tsx",
"jsx",
"py",
"go",
"java",
"c",
"cpp",
"cs",
"swift",
"php",
"rb",
"rs",
"kt",
"dart",
"sh",
"pl",
"lua",
"r",
"scala",
];
return extensions;
}
public static getReadmeFileExtentions(): Array<string> {
return ["md"];
}
}
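The `executeScript` method above runs hook scripts line by line: it splits on newlines, then drops blank lines and full-line comments before executing each remaining command. A standalone sketch of that filtering step (the `extractCommands` name is illustrative, not part of the codebase):

```typescript
// Keep only lines that are non-blank and do not begin with "#".
// Note: startsWith is applied to the untrimmed line, so a comment
// with leading whitespace would still pass this filter.
function extractCommands(script: string): Array<string> {
  return script.split("\n").filter((command: string) => {
    return command.trim() !== "" && !command.startsWith("#");
  });
}
```

For example, `extractCommands("# install deps\nnpm ci\n\nnpm test\n")` keeps only `npm ci` and `npm test`, while an indented `  # comment` line would survive the filter because of the untrimmed `startsWith` check.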


@@ -1,373 +0,0 @@
import BadDataException from "Common/Types/Exception/BadDataException";
import CopilotAction from "Common/Models/DatabaseModels/CopilotAction";
import {
GetOneUptimeURL,
GetRepositorySecretKey,
MIN_ITEMS_IN_QUEUE_PER_SERVICE_CATALOG,
} from "../Config";
import URL from "Common/Types/API/URL";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import { JSONArray, JSONObject } from "Common/Types/JSON";
import HTTPResponse from "Common/Types/API/HTTPResponse";
import API from "Common/Utils/API";
import ObjectID from "Common/Types/ObjectID";
import logger from "Common/Server/Utils/Logger";
import CopilotActionTypePriority from "Common/Models/DatabaseModels/CopilotActionTypePriority";
import CopilotActionTypeUtil from "./CopilotActionTypes";
import CopilotActionType from "Common/Types/Copilot/CopilotActionType";
import { ActionDictionary } from "../Service/CopilotActions/Index";
import CopilotActionBase from "../Service/CopilotActions/CopilotActionsBase";
import CopilotActionStatus from "Common/Types/Copilot/CopilotActionStatus";
import CopilotActionProp from "Common/Types/Copilot/CopilotActionProps/Index";
import CodeRepositoryUtil from "./CodeRepository";
export default class CopilotActionUtil {
public static async getExistingAction(data: {
serviceCatalogId: ObjectID;
actionType: CopilotActionType;
actionProps: JSONObject;
}): Promise<CopilotAction | null> {
if (!data.serviceCatalogId) {
throw new BadDataException("Service Catalog ID is required");
}
if (!data.actionType) {
throw new BadDataException("Action Type is required");
}
const repositorySecretKey: string | null = GetRepositorySecretKey();
if (!repositorySecretKey) {
throw new BadDataException("Repository Secret Key is required");
}
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotAction()
.getCrudApiPath()
?.toString()}/get-copilot-action/${repositorySecretKey}`,
);
const copilotActionResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: url,
params: {
serviceCatalogId: data.serviceCatalogId.toString(),
actionType: data.actionType,
actionProps: JSON.stringify(data.actionProps),
},
});
if (copilotActionResult instanceof HTTPErrorResponse) {
throw copilotActionResult;
}
if (!copilotActionResult.data["copilotAction"]) {
return null;
}
return CopilotAction.fromJSONObject(
copilotActionResult.data["copilotAction"] as JSONObject,
CopilotAction,
);
}
public static async getActionTypesBasedOnPriority(): Promise<
Array<CopilotActionTypePriority>
> {
const repositorySecretKey: string | null = GetRepositorySecretKey();
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotAction().getCrudApiPath()?.toString()}/copilot-action-types-by-priority/${repositorySecretKey}`,
);
const actionTypesResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: url,
});
if (actionTypesResult instanceof HTTPErrorResponse) {
throw actionTypesResult;
}
const actionTypes: Array<CopilotActionTypePriority> =
CopilotActionTypePriority.fromJSONArray(
actionTypesResult.data["actionTypes"] as JSONArray,
CopilotActionTypePriority,
) || [];
logger.debug(
`Copilot action types based on priority: ${JSON.stringify(actionTypes, null, 2)}`,
);
return actionTypes;
}
public static async getActionsToWorkOn(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
}): Promise<Array<CopilotAction>> {
logger.debug("Getting actions to work on");
if (!data.serviceCatalogId) {
throw new BadDataException("Service Catalog ID is required");
}
const repositorySecretKey: string | null = GetRepositorySecretKey();
if (!repositorySecretKey) {
throw new BadDataException("Repository Secret Key is required");
}
// check actions in queue
const actionsInQueue: Array<CopilotAction> =
await CopilotActionUtil.getInQueueActions({
serviceCatalogId: data.serviceCatalogId,
});
if (actionsInQueue.length >= MIN_ITEMS_IN_QUEUE_PER_SERVICE_CATALOG) {
logger.debug(
`Actions in queue: ${JSON.stringify(actionsInQueue, null, 2)}`,
);
return actionsInQueue;
}
const actionTypePriorities: Array<CopilotActionTypePriority> =
await CopilotActionTypeUtil.getEnabledActionTypesBasedOnPriority();
logger.debug(
"Action type priorities: " +
actionTypePriorities.map(
(actionTypePriority: CopilotActionTypePriority) => {
return actionTypePriority.actionType;
},
),
);
for (const actionTypePriority of actionTypePriorities) {
logger.debug(
`Getting actions for action type: ${actionTypePriority.actionType}`,
);
// get items in queue based on priority
const itemsInQueue: number =
CopilotActionTypeUtil.getItemsInQueueByPriority(
actionTypePriority.priority || 1,
);
// get actions based on priority
const actions: Array<CopilotAction> = await CopilotActionUtil.getActions({
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
actionType: actionTypePriority.actionType!,
itemsInQueue,
});
// add these actions to the queue
actionsInQueue.push(...actions);
}
return actionsInQueue;
}
public static async getActions(data: {
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
actionType: CopilotActionType;
itemsInQueue: number;
}): Promise<Array<CopilotAction>> {
logger.debug(`Getting actions for action type: ${data.actionType}`);
if (!data.serviceCatalogId) {
throw new BadDataException("Service Catalog ID is required");
}
if (!data.actionType) {
throw new BadDataException("Action Type is required");
}
const CopilotActionBaseType: typeof CopilotActionBase =
ActionDictionary[data.actionType]!;
const ActionBase: CopilotActionBase = new CopilotActionBaseType();
logger.debug(`Getting action props for action type: ${data.actionType}`);
const actionProps: Array<CopilotActionProp> =
await ActionBase.getActionPropsToQueue({
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
maxActionsToQueue: data.itemsInQueue,
});
logger.debug(`Action props for action type: ${data.actionType}`);
const savedActions: Array<CopilotAction> = [];
// now these actions need to be saved.
for (const actionProp of actionProps) {
try {
logger.debug(
`Creating copilot action for action type: ${data.actionType}`,
);
const savedAction: CopilotAction =
await CopilotActionUtil.createCopilotAction({
actionType: data.actionType,
serviceCatalogId: data.serviceCatalogId,
serviceRepositoryId: data.serviceRepositoryId,
actionProps: actionProp,
});
logger.debug(
`Copilot action created for action type: ${data.actionType}`,
);
logger.debug(savedAction);
savedActions.push(savedAction);
} catch (error) {
logger.error(`Error while adding copilot action: ${error}`);
}
}
return savedActions;
}
public static async updateCopilotAction(data: {
actionId: ObjectID;
actionStatus: CopilotActionStatus;
pullRequestId?: ObjectID | undefined;
commitHash?: string | undefined;
statusMessage?: string | undefined;
logs?: Array<string> | undefined;
}): Promise<void> {
// send this to the API.
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotAction()
.getCrudApiPath()
?.toString()}/update-copilot-action/${GetRepositorySecretKey()}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post({
url: url,
data: {
...data,
},
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
}
public static async createCopilotAction(data: {
actionType: CopilotActionType;
serviceCatalogId: ObjectID;
serviceRepositoryId: ObjectID;
actionProps: CopilotActionProp;
actionStatus?: CopilotActionStatus;
}): Promise<CopilotAction> {
const action: CopilotAction = new CopilotAction();
action.copilotActionType = data.actionType;
action.serviceCatalogId = data.serviceCatalogId;
action.serviceRepositoryId = data.serviceRepositoryId;
action.copilotActionProp = data.actionProps;
action.commitHash = await CodeRepositoryUtil.getCurrentCommitHash();
if (data.actionStatus) {
action.copilotActionStatus = data.actionStatus;
} else {
action.copilotActionStatus = CopilotActionStatus.IN_QUEUE;
}
// send this to the API.
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotAction()
.getCrudApiPath()
?.toString()}/create-copilot-action/${GetRepositorySecretKey()}`,
);
const codeRepositoryResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.post({
url: url,
data: {
copilotAction: CopilotAction.toJSON(action, CopilotAction),
},
});
if (codeRepositoryResult instanceof HTTPErrorResponse) {
throw codeRepositoryResult;
}
const copilotAction: CopilotAction = CopilotAction.fromJSONObject(
codeRepositoryResult.data as JSONObject,
CopilotAction,
);
if (!copilotAction) {
throw new BadDataException("Copilot action not created");
}
if (!copilotAction._id) {
throw new BadDataException("Copilot action ID not created");
}
return copilotAction;
}
public static async getInQueueActions(data: {
serviceCatalogId: ObjectID;
}): Promise<Array<CopilotAction>> {
if (!data.serviceCatalogId) {
throw new BadDataException("Service Catalog ID is required");
}
const repositorySecretKey: string | null = GetRepositorySecretKey();
if (!repositorySecretKey) {
throw new BadDataException("Repository Secret Key is required");
}
const url: URL = URL.fromString(
GetOneUptimeURL().toString() + "/api",
).addRoute(
`${new CopilotAction()
.getCrudApiPath()
?.toString()}/copilot-actions-in-queue/${repositorySecretKey}`,
);
const copilotActionsResult: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: url,
params: {
serviceCatalogId: data.serviceCatalogId.toString(),
},
});
if (copilotActionsResult instanceof HTTPErrorResponse) {
throw copilotActionsResult;
}
const copilotActions: Array<CopilotAction> =
CopilotAction.fromJSONArray(
copilotActionsResult.data["copilotActions"] as JSONArray,
CopilotAction,
) || [];
logger.debug(
`Copilot actions in queue for service catalog id: ${data.serviceCatalogId}`,
);
logger.debug(`Copilot events: ${JSON.stringify(copilotActions, null, 2)}`);
return copilotActions;
}
}


@@ -1,71 +0,0 @@
import CopilotActionTypePriority from "Common/Models/DatabaseModels/CopilotActionTypePriority";
import CopilotActionType, {
CopilotActionTypeUtil as ActionTypeUtil,
CopilotActionTypeData,
} from "Common/Types/Copilot/CopilotActionType";
import CopilotActionUtil from "./CopilotAction";
import { ActionDictionary } from "../Service/CopilotActions/Index";
import logger from "Common/Server/Utils/Logger";
export default class CopilotActionTypeUtil {
private static isActionEnabled(actionType: CopilotActionType): boolean {
return Boolean(ActionDictionary[actionType]); // if action is not in dictionary then it is not enabled
}
public static async getEnabledActionTypesBasedOnPriority(): Promise<
Array<CopilotActionTypePriority>
> {
// if there are no actions then, get actions based on priority
const actionTypes: Array<CopilotActionTypePriority> =
await CopilotActionUtil.getActionTypesBasedOnPriority();
const enabledActions: Array<CopilotActionTypePriority> = [];
for (const actionType of actionTypes) {
if (this.isActionEnabled(actionType.actionType!)) {
enabledActions.push(actionType);
}
}
return enabledActions;
}
public static getItemsInQueueByPriority(priority: number): number {
// if the priority is 1, there will be 5 items in the queue; if the priority is 5, there will be 1.
const itemsInQueue: number = 6;
return itemsInQueue - priority;
}
public static printEnabledAndDisabledActionTypes(): void {
const allActionTypes: Array<CopilotActionTypeData> =
ActionTypeUtil.getAllCopilotActionTypes();
// log all the actions from these actions that are in Action dictionary
const enabledActionTypesData: Array<CopilotActionTypeData> =
allActionTypes.filter((actionTypeData: CopilotActionTypeData) => {
return this.isActionEnabled(actionTypeData.type);
});
const disabledActionTypesData: Array<CopilotActionTypeData> =
allActionTypes.filter((actionTypeData: CopilotActionTypeData) => {
return !this.isActionEnabled(actionTypeData.type);
});
logger.info("--------------------");
logger.info("Copilot will fix the following issues:");
for (const actionTypeData of enabledActionTypesData) {
logger.info(`- ${actionTypeData.type}`);
}
logger.info("--------------------");
logger.info(
"Copilot will not fix the following issues at this time (support for these is planned for an upcoming release):",
);
for (const disabledTypesData of disabledActionTypesData) {
logger.info(`- ${disabledTypesData.type}`);
}
logger.info("--------------------");
}
}
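The inverse relationship between action priority and queue depth in `getItemsInQueueByPriority` can be sketched in isolation (function name hypothetical; assumes priorities run 1 through 5):

```typescript
// Priority 1 (highest) fills 5 queue slots; priority 5 (lowest) fills 1.
function itemsInQueueByPriority(priority: number): number {
  const maxItemsPlusOne: number = 6;
  return maxItemsPlusOne - priority;
}
```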


@@ -1,191 +0,0 @@
import TechStack from "Common/Types/ServiceCatalog/TechStack";
export default class ServiceFileTypesUtil {
private static getCommonDirectoriesToIgnore(): string[] {
return [
"node_modules",
".git",
"build",
"dist",
"coverage",
"logs",
"tmp",
"temp",
"temporal",
"tempfiles",
];
}
private static getCommonFilesToIgnore(): string[] {
return [".DS_Store", "Thumbs.db", ".gitignore", ".gitattributes"];
}
public static getCommonFilesToIgnoreByTechStackItem(
techStack: TechStack,
): string[] {
let filesToIgnore: string[] = [];
switch (techStack) {
case TechStack.NodeJS:
filesToIgnore = ["package-lock.json"];
break;
case TechStack.Python:
filesToIgnore = ["__pycache__"];
break;
case TechStack.Ruby:
filesToIgnore = ["Gemfile.lock"];
break;
case TechStack.Go:
filesToIgnore = ["go.sum", "go.mod"];
break;
case TechStack.Java:
filesToIgnore = ["pom.xml"];
break;
case TechStack.PHP:
filesToIgnore = ["composer.lock"];
break;
case TechStack.CSharp:
filesToIgnore = ["packages", "bin", "obj"];
break;
case TechStack.CPlusPlus:
filesToIgnore = ["build", "CMakeFiles", "CMakeCache.txt", "Makefile"];
break;
case TechStack.Rust:
filesToIgnore = ["Cargo.lock"];
break;
case TechStack.Swift:
filesToIgnore = ["Podfile.lock"];
break;
case TechStack.Kotlin:
filesToIgnore = [
"gradle",
"build",
"gradlew",
"gradlew.bat",
"gradle.properties",
];
break;
case TechStack.TypeScript:
filesToIgnore = ["node_modules", "package-lock.json"];
break;
case TechStack.JavaScript:
filesToIgnore = ["node_modules", "package-lock.json"];
break;
case TechStack.Shell:
filesToIgnore = [];
break;
case TechStack.React:
filesToIgnore = ["node_modules", "package-lock.json"];
break;
case TechStack.Other:
filesToIgnore = [];
break;
default:
filesToIgnore = [];
}
return filesToIgnore;
}
public static getCommonFilesToIgnoreByTechStack(
techStack: Array<TechStack>,
): string[] {
let filesToIgnore: string[] = [];
for (const stack of techStack) {
filesToIgnore = filesToIgnore.concat(
this.getCommonFilesToIgnoreByTechStackItem(stack),
);
}
return filesToIgnore
.concat(this.getCommonFilesToIgnore())
.concat(this.getCommonDirectoriesToIgnore());
}
private static getCommonFilesExtentions(): string[] {
// return markdown, dockerfile, etc.
return [".md", "dockerfile", ".yml", ".yaml", ".sh", ".gitignore"];
}
public static getFileExtentionsByTechStackItem(
techStack: TechStack,
): string[] {
let fileExtentions: Array<string> = [];
switch (techStack) {
case TechStack.NodeJS:
fileExtentions = [".js", ".ts", ".json", ".mjs"];
break;
case TechStack.Python:
fileExtentions = [".py"];
break;
case TechStack.Ruby:
fileExtentions = [".rb"];
break;
case TechStack.Go:
fileExtentions = [".go"];
break;
case TechStack.Java:
fileExtentions = [".java"];
break;
case TechStack.PHP:
fileExtentions = [".php"];
break;
case TechStack.CSharp:
fileExtentions = [".cs"];
break;
case TechStack.CPlusPlus:
fileExtentions = [".cpp", ".c"];
break;
case TechStack.Rust:
fileExtentions = [".rs"];
break;
case TechStack.Swift:
fileExtentions = [".swift"];
break;
case TechStack.Kotlin:
fileExtentions = [".kt", ".kts"];
break;
case TechStack.TypeScript:
fileExtentions = [".ts", ".tsx"];
break;
case TechStack.JavaScript:
fileExtentions = [".js", ".jsx"];
break;
case TechStack.Shell:
fileExtentions = [".sh"];
break;
case TechStack.React:
fileExtentions = [".js", ".ts", ".jsx", ".tsx"];
break;
case TechStack.Other:
fileExtentions = [];
break;
default:
fileExtentions = [];
}
return fileExtentions;
}
public static getFileExtentionsByTechStack(
techStack: Array<TechStack>,
): string[] {
let fileExtentions: Array<string> = [];
for (let i: number = 0; i < techStack.length; i++) {
if (!techStack[i]) {
continue;
}
fileExtentions = fileExtentions.concat(
this.getFileExtentionsByTechStackItem(techStack[i]!),
);
}
// add common file extensions
return fileExtentions.concat(this.getCommonFilesExtentions());
}
}
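Because overlapping stacks contribute the same entries (JavaScript and React both add `.js`, for example), the concatenated lists above contain duplicates. A Set-based merge is one way to deduplicate; this is a sketch under that assumption, not how the code above behaves:

```typescript
// Merge per-stack extension lists and drop duplicates with a Set.
function mergeExtensions(lists: Array<Array<string>>): Array<string> {
  return Array.from(new Set(lists.flat()));
}
```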


@@ -1,83 +0,0 @@
import {
GetCodeRepositoryPassword,
GetLlmServerUrl,
GetLlmType,
GetOneUptimeURL,
GetRepositorySecretKey,
} from "../Config";
import CodeRepositoryUtil, { CodeRepositoryResult } from "./CodeRepository";
import CodeRepositoryType from "Common/Types/CodeRepository/CodeRepositoryType";
import BadDataException from "Common/Types/Exception/BadDataException";
import URL from "Common/Types/API/URL";
import LlmType from "../Types/LlmType";
import API from "Common/Utils/API";
import HTTPErrorResponse from "Common/Types/API/HTTPErrorResponse";
import HTTPResponse from "Common/Types/API/HTTPResponse";
import { JSONObject } from "Common/Types/JSON";
import logger from "Common/Server/Utils/Logger";
import CopilotActionTypeUtil from "./CopilotActionTypes";
export default class InitUtil {
public static async init(): Promise<CodeRepositoryResult> {
if (GetLlmType() === LlmType.ONEUPTIME_LLM) {
const llmServerUrl: URL = GetLlmServerUrl();
// check status of the LLM server
const result: HTTPErrorResponse | HTTPResponse<JSONObject> =
await API.get({
url: URL.fromString(llmServerUrl.toString()),
});
if (result instanceof HTTPErrorResponse) {
throw new BadDataException(
"OneUptime LLM server is not reachable. Please check the server URL in the environment variables.",
);
}
}
// check if the OneUptime server is up.
const oneuptimeServerUrl: URL = GetOneUptimeURL();
const result: HTTPErrorResponse | HTTPResponse<JSONObject> = await API.get({
url: URL.fromString(oneuptimeServerUrl.toString() + "/status"),
});
if (result instanceof HTTPErrorResponse) {
throw new BadDataException(
`OneUptime ${GetOneUptimeURL().toString()} is not reachable. Please check the server URL in the environment variables.`,
);
}
if (!GetRepositorySecretKey()) {
throw new BadDataException("Repository Secret Key is required");
}
const codeRepositoryResult: CodeRepositoryResult =
await CodeRepositoryUtil.getCodeRepositoryResult();
// Check if the repository type is GitHub and the GitHub token is provided
if (codeRepositoryResult.serviceRepositories.length === 0) {
logger.error(
"No services found in the repository. Please add services to the repository in OneUptime Dashboard.",
);
throw new BadDataException(
"No services found in the repository. Please add services to the repository in OneUptime Dashboard.",
);
}
if (
codeRepositoryResult.codeRepository.repositoryHostedAt ===
CodeRepositoryType.GitHub &&
!GetCodeRepositoryPassword()
) {
throw new BadDataException(
"GitHub token is required for this repository. Please provide the GitHub token in the environment variables.",
);
}
// check copilot action types enabled and print it out for user.
CopilotActionTypeUtil.printEnabledAndDisabledActionTypes();
return codeRepositoryResult;
}
}
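The startup checks in `InitUtil.init` follow a fail-fast pattern: each unmet precondition throws immediately with an actionable message. A minimal pure sketch (names and shape hypothetical, with the network calls replaced by booleans):

```typescript
// Fail fast: each unmet precondition throws with an actionable message.
function validateStartup(config: {
  serverReachable: boolean;
  repositorySecretKey: string | null;
  serviceCount: number;
}): void {
  if (!config.serverReachable) {
    throw new Error("OneUptime server is not reachable.");
  }
  if (!config.repositorySecretKey) {
    throw new Error("Repository Secret Key is required");
  }
  if (config.serviceCount === 0) {
    throw new Error("No services found in the repository.");
  }
}
```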


@@ -1,5 +0,0 @@
export default class ProcessUtil {
public static haltProcessWithSuccess(): void {
process.exit(0);
}
}


@@ -1,32 +0,0 @@
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
import CopilotPullRequestService from "../Service/CopilotPullRequest";
import PullRequestState from "Common/Types/CodeRepository/PullRequestState";
export default class PullRequestUtil {
public static async getOpenPRs(): Promise<Array<CopilotPullRequest>> {
const openPRs: Array<CopilotPullRequest> = [];
// get all open pull requests.
const openPullRequests: Array<CopilotPullRequest> =
await CopilotPullRequestService.getOpenPullRequestsFromDatabase();
for (const openPullRequest of openPullRequests) {
// refresh status of this PR.
if (!openPullRequest.pullRequestId) {
continue;
}
const pullRequestState: PullRequestState =
await CopilotPullRequestService.refreshPullRequestStatus({
copilotPullRequest: openPullRequest,
});
if (pullRequestState === PullRequestState.Open) {
openPRs.push(openPullRequest);
}
}
return openPRs;
}
}
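The loop in `getOpenPRs` reduces to a filter over freshly refreshed states. A sketch with hypothetical types (the real code refreshes each PR's status via the service before filtering):

```typescript
// Keep only pull requests whose freshly refreshed state is still "Open".
type PullRequestLike = { id: string; refreshState: () => string };

function filterOpenPullRequests(
  pullRequests: Array<PullRequestLike>,
): Array<PullRequestLike> {
  return pullRequests.filter((pr: PullRequestLike) => {
    return pr.refreshState() === "Open";
  });
}
```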


@@ -1,154 +0,0 @@
import ServiceFileTypesUtil from "./FileTypes";
import Dictionary from "Common/Types/Dictionary";
import BadDataException from "Common/Types/Exception/BadDataException";
import TechStack from "Common/Types/ServiceCatalog/TechStack";
import CodeRepositoryCommonServerUtil from "Common/Server/Utils/CodeRepository/CodeRepository";
import CodeRepositoryFile from "Common/Server/Utils/CodeRepository/CodeRepositoryFile";
import LocalFile from "Common/Server/Utils/LocalFile";
import ServiceCopilotCodeRepository from "Common/Models/DatabaseModels/ServiceCopilotCodeRepository";
import ServiceLanguageUtil from "Common/Utils/TechStack";
import CodeRepositoryUtil, {
CodeRepositoryResult,
ServiceToImproveResult,
} from "./CodeRepository";
import PullRequestUtil from "./PullRequest";
import CopilotPullRequest from "Common/Models/DatabaseModels/CopilotPullRequest";
import logger from "Common/Server/Utils/Logger";
import ProcessUtil from "./Process";
import ObjectID from "Common/Types/ObjectID";
export default class ServiceRepositoryUtil {
public static codeRepositoryResult: CodeRepositoryResult | null = null;
public static servicesToImprove: Array<ServiceCopilotCodeRepository> | null =
null;
public static setCodeRepositoryResult(data: {
codeRepositoryResult: CodeRepositoryResult;
}): void {
ServiceRepositoryUtil.codeRepositoryResult = data.codeRepositoryResult;
}
public static async getServicesToImprove(): Promise<
Array<ServiceCopilotCodeRepository>
> {
if (this.servicesToImprove) {
return this.servicesToImprove;
}
const codeRepositoryResult: CodeRepositoryResult =
ServiceRepositoryUtil.codeRepositoryResult!;
if (!codeRepositoryResult) {
throw new BadDataException("Code repository result is not set");
}
// before cloning the repo, check if there are any services to improve.
const openPullRequests: Array<CopilotPullRequest> =
await PullRequestUtil.getOpenPRs();
const servicesToImproveResult: Array<ServiceToImproveResult> =
await CodeRepositoryUtil.getServicesToImproveCode({
codeRepository: codeRepositoryResult.codeRepository,
serviceRepositories: codeRepositoryResult.serviceRepositories,
openPullRequests: openPullRequests,
});
const servicesToImprove: Array<ServiceCopilotCodeRepository> =
servicesToImproveResult.map(
(serviceToImproveResult: ServiceToImproveResult) => {
return serviceToImproveResult.serviceRepository;
},
);
if (servicesToImprove.length === 0) {
logger.info("No services to improve. Exiting.");
ProcessUtil.haltProcessWithSuccess();
}
this.servicesToImprove = servicesToImprove;
return servicesToImprove;
}
public static async getFileLanguage(data: {
filePath: string;
}): Promise<TechStack> {
const fileExtention: string = LocalFile.getFileExtension(data.filePath);
const techStack: TechStack = ServiceLanguageUtil.getLanguageByFileExtension(
{
fileExtension: fileExtention,
},
);
return techStack;
}
public static async getFileContent(data: {
filePath: string;
}): Promise<string> {
const { filePath } = data;
const fileContent: string =
await CodeRepositoryCommonServerUtil.getFileContent({
repoPath: CodeRepositoryUtil.getLocalRepositoryPath(),
filePath: filePath,
});
return fileContent;
}
public static async getFilesByServiceCatalogId(data: {
serviceCatalogId: ObjectID;
}): Promise<Dictionary<CodeRepositoryFile>> {
const { serviceCatalogId } = data;
const serviceRepository: ServiceCopilotCodeRepository | undefined = (
await ServiceRepositoryUtil.getServicesToImprove()
).find((serviceRepository: ServiceCopilotCodeRepository) => {
return (
serviceRepository.serviceCatalog!.id?.toString() ===
serviceCatalogId.toString()
);
});
if (!serviceRepository) {
throw new BadDataException("Service repository not found");
}
const allFiles: Dictionary<CodeRepositoryFile> =
await ServiceRepositoryUtil.getFilesInServiceDirectory({
serviceRepository,
});
return allFiles;
}
public static async getFilesInServiceDirectory(data: {
serviceRepository: ServiceCopilotCodeRepository;
}): Promise<Dictionary<CodeRepositoryFile>> {
const { serviceRepository } = data;
if (!serviceRepository.serviceCatalog?.techStack) {
throw new BadDataException(
"Service language is not defined in the service catalog",
);
}
const allFiles: Dictionary<CodeRepositoryFile> =
await CodeRepositoryCommonServerUtil.getFilesInDirectoryRecursive({
repoPath: CodeRepositoryUtil.getLocalRepositoryPath(),
directoryPath: serviceRepository.servicePathInRepository || ".",
acceptedFileExtensions:
ServiceFileTypesUtil.getFileExtentionsByTechStack(
serviceRepository.serviceCatalog!.techStack!,
),
ignoreFilesOrDirectories:
ServiceFileTypesUtil.getCommonFilesToIgnoreByTechStack(
serviceRepository.serviceCatalog!.techStack!,
),
});
return allFiles;
}
}
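Caching via the static `servicesToImprove` field above is a form of lazy memoization: compute once, serve the cached value afterwards. A generic sketch of the same idea (hypothetical helper, assumes the computed value is never null):

```typescript
// Compute once, then serve the cached value on every later call.
function memoizeOnce<T>(compute: () => T): () => T {
  let cached: T | null = null;
  return (): T => {
    if (cached === null) {
      cached = compute();
    }
    return cached;
  };
}
```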


@@ -1,32 +0,0 @@
{
"preset": "ts-jest",
"testPathIgnorePatterns": [
"node_modules",
"dist"
],
"verbose": true,
"globals": {
"ts-jest": {
"tsconfig": "tsconfig.json",
"babelConfig": false
}
},
"moduleFileExtensions": ["ts", "js", "json"],
"transform": {
".(ts|tsx)": "ts-jest"
},
"testEnvironment": "node",
"collectCoverage": false,
"coverageReporters": ["text", "lcov"],
"testRegex": "./Tests/(.*).test.ts",
"collectCoverageFrom": ["./**/*.(tsx||ts)"],
"coverageThreshold": {
"global": {
"lines": 0,
"functions": 0,
"branches": 0,
"statements": 0
}
}
}


@@ -1,11 +0,0 @@
{
"watch": [
"./",
"../Common"
],
"ext": "ts,tsx",
"ignore": ["./node_modules/**", "./public/**", "./bin/**", "./build/**"],
"watchOptions": {"useFsEvents": false, "interval": 500},
"env": {"TS_NODE_TRANSPILE_ONLY": "1", "TS_NODE_FILES": "false"},
"exec": "node -r ts-node/register/transpile-only Index.ts"
}

Copilot/package-lock.json (generated, 23909 lines changed): diff suppressed because it is too large.


@@ -1,31 +1,31 @@
{
"name": "@oneuptime/copilot",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"start": "export NODE_OPTIONS='--max-old-space-size=8096' && node --require ts-node/register Index.ts",
"compile": "tsc",
"clear-modules": "rm -rf node_modules && rm package-lock.json && npm install",
"dev": "npx nodemon",
"audit": "npm audit --audit-level=low",
"dep-check": "npm install -g depcheck && depcheck ./ --skip-missing=true",
"test": "jest --passWithNoTests"
},
"author": "OneUptime <hello@oneuptime.com> (https://oneuptime.com/)",
"license": "Apache-2.0",
"dependencies": {
"Common": "file:../Common",
"dotenv": "^16.4.5",
"openai": "^4.52.5",
"ts-node": "^10.9.1"
},
"devDependencies": {
"@types/jest": "^27.5.0",
"@types/node": "^17.0.31",
"jest": "^28.1.0",
"nodemon": "^2.0.20",
"ts-jest": "^28.0.2"
}
"name": "@oneuptime/copilot-agent",
"version": "0.1.0",
"description": "Standalone OneUptime Copilot coding agent CLI",
"private": true,
"bin": {
"oneuptime-copilot-agent": "./build/dist/index.js"
},
"scripts": {
"build": "tsc",
"compile": "tsc",
"dev": "ts-node --transpile-only -r tsconfig-paths/register src/index.ts",
"start": "node --enable-source-maps ./build/dist/index.js",
"clear-modules": "rm -rf node_modules && rm -f package-lock.json && npm install"
},
"dependencies": {
"Common": "file:../../Common",
"commander": "^12.1.0",
"undici": "^6.19.8",
"zod": "^3.23.8"
},
"devDependencies": {
"@types/node": "^17.0.45",
"tsconfig-paths": "^4.2.0",
"ts-node": "^10.9.2",
"typescript": "^5.6.3"
},
"engines": {
"node": ">=18"
}
}


@@ -0,0 +1,201 @@
import path from "node:path";
import LocalFile from "Common/Server/Utils/LocalFile";
import { LMStudioClient } from "../llm/LMStudioClient";
import { buildSystemPrompt } from "./SystemPrompt";
import { WorkspaceContextBuilder } from "./WorkspaceContext";
import { ToolRegistry } from "../tools/ToolRegistry";
import { ChatMessage, OpenAIToolCall, ToolExecutionResult } from "../types";
import AgentLogger from "../utils/AgentLogger";
export interface CopilotAgentOptions {
prompt: string;
modelUrl: string;
modelName: string;
workspacePath: string;
temperature: number;
maxIterations: number;
requestTimeoutMs: number;
apiKey?: string | undefined;
}
export class CopilotAgent {
private readonly options: CopilotAgentOptions;
private readonly workspaceRoot: string;
private readonly client: LMStudioClient;
private readonly registry: ToolRegistry;
public constructor(options: CopilotAgentOptions) {
this.options = options;
this.workspaceRoot = path.resolve(options.workspacePath);
this.client = new LMStudioClient({
endpoint: options.modelUrl,
model: options.modelName,
temperature: options.temperature,
timeoutMs: options.requestTimeoutMs,
apiKey: options.apiKey,
});
this.registry = new ToolRegistry(this.workspaceRoot);
AgentLogger.debug("CopilotAgent initialized", {
workspaceRoot: this.workspaceRoot,
modelUrl: options.modelUrl,
modelName: options.modelName,
temperature: options.temperature,
maxIterations: options.maxIterations,
timeoutMs: options.requestTimeoutMs,
hasApiKey: Boolean(options.apiKey),
});
}
public async run(): Promise<void> {
AgentLogger.debug("Ensuring workspace exists", {
workspaceRoot: this.workspaceRoot,
});
await this.ensureWorkspace();
AgentLogger.debug("Workspace verified", {
workspaceRoot: this.workspaceRoot,
});
const contextSnapshot: string = await WorkspaceContextBuilder.buildSnapshot(
this.workspaceRoot,
);
AgentLogger.debug("Workspace snapshot built", {
snapshotLength: contextSnapshot.length,
});
const messages: Array<ChatMessage> = [
{ role: "system", content: buildSystemPrompt() },
{
role: "user",
content: this.composeUserPrompt(this.options.prompt, contextSnapshot),
},
];
AgentLogger.debug("Initial conversation seeded", {
messageCount: messages.length,
});
for (
let iteration: number = 0;
iteration < this.options.maxIterations;
iteration += 1
) {
AgentLogger.info(`Starting iteration ${iteration + 1}`);
AgentLogger.debug("Sending messages to LLM", {
iteration: iteration + 1,
messageCount: messages.length,
});
const response: ChatMessage = await this.client.createChatCompletion({
messages,
tools: this.registry.getToolDefinitions(),
});
AgentLogger.debug("LLM response received", {
iteration: iteration + 1,
hasToolCalls: Boolean(response.tool_calls?.length),
contentPreview: response.content?.slice(0, 200) ?? null,
});
if (response.tool_calls?.length) {
AgentLogger.info(
`Model requested tools: ${response.tool_calls
.map((call: OpenAIToolCall) => {
return call.function.name;
})
.join(", ")}`,
);
messages.push(response);
await this.handleToolCalls(response.tool_calls, messages);
continue;
}
const finalMessage: string =
response.content?.trim() ||
"Model ended the conversation without a reply.";
// eslint-disable-next-line no-console
console.log(`\n${finalMessage}`);
AgentLogger.debug("Conversation completed", {
iterationsUsed: iteration + 1,
finalMessagePreview: finalMessage.slice(0, 500),
});
return;
}
AgentLogger.error("Iteration limit reached", {
maxIterations: this.options.maxIterations,
prompt: this.options.prompt,
});
throw new Error(
`Reached the iteration limit (${this.options.maxIterations}) without a final response.`,
);
}
private async handleToolCalls(
calls: Array<{
id: string;
type: "function";
function: { name: string; arguments: string };
}>,
messages: Array<ChatMessage>,
): Promise<void> {
for (let index: number = 0; index < calls.length; index += 1) {
const call:
| {
id: string;
type: "function";
function: { name: string; arguments: string };
}
| undefined = calls[index];
if (call === undefined) {
AgentLogger.warn("Missing tool call entry", {
requestedIndex: index,
totalCalls: calls.length,
});
continue;
}
AgentLogger.debug("Executing tool", {
toolName: call.function.name,
callId: call.id,
});
const result: ToolExecutionResult = await this.registry.execute(call);
// eslint-disable-next-line no-console
console.log(`\n# Tool: ${call.function.name}\n${result.output}\n`);
AgentLogger.debug("Tool execution completed", {
toolName: call.function.name,
callId: call.id,
isError: result.output.startsWith("ERROR"),
outputLength: result.output.length,
});
messages.push({
role: "tool",
content: result.output,
tool_call_id: result.toolCallId,
});
AgentLogger.debug("Tool result appended to conversation", {
totalMessages: messages.length,
});
}
}
private async ensureWorkspace(): Promise<void> {
AgentLogger.debug("Validating workspace directory", {
workspaceRoot: this.workspaceRoot,
});
if (!(await LocalFile.doesDirectoryExist(this.workspaceRoot))) {
throw new Error(
`Workspace path ${this.workspaceRoot} does not exist or is not a directory.`,
);
}
AgentLogger.debug("Workspace exists", {
workspaceRoot: this.workspaceRoot,
});
}
private composeUserPrompt(task: string, snapshot: string): string {
const prompt: string = `# Task\n${task.trim()}\n\n# Workspace snapshot\n${snapshot}\n\nPlease reason step-by-step, gather any missing context with the tools, and keep iterating until the task is complete.`;
AgentLogger.debug("Composed user prompt", {
taskLength: task.length,
snapshotLength: snapshot.length,
promptLength: prompt.length,
});
return prompt;
}
}
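The `run()` method above follows the standard agent loop: call the model, execute any requested tools, append their results to the conversation, and repeat until the model returns a plain reply or the iteration cap is hit. A self-contained sketch with a stubbed model (all names hypothetical):

```typescript
// Iterate: tool calls keep the conversation going; a plain reply ends it.
type AgentMessage = { role: string; content: string; toolCall?: string };

function runAgentLoop(
  respond: (history: Array<AgentMessage>) => AgentMessage,
  executeTool: (name: string) => string,
  maxIterations: number,
): string {
  const messages: Array<AgentMessage> = [
    { role: "system", content: "system prompt" },
  ];
  for (let i: number = 0; i < maxIterations; i += 1) {
    const reply: AgentMessage = respond(messages);
    if (reply.toolCall) {
      // Record the tool request, run the tool, and feed the output back in.
      messages.push(reply);
      messages.push({ role: "tool", content: executeTool(reply.toolCall) });
      continue;
    }
    return reply.content; // A plain reply is the final answer.
  }
  throw new Error(`Reached the iteration limit (${maxIterations}).`);
}
```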


@@ -0,0 +1,15 @@
export function buildSystemPrompt(): string {
return `You are the OneUptime Copilot Agent, a fully autonomous senior engineer that works inside a local workspace. Your job is to understand the user's request, gather the context you need, modify files with precision, run checks, and stop only when the request is satisfied or truly blocked.
Core principles:
1. Stay focused on the workspace. Read files and inspect folders before editing. Never guess when you can verify.
2. Use the provided tools instead of printing raw code or shell commands. read_file/list_directory/search_workspace help you understand; apply_patch/write_file/run_command let you change or validate.
3. Break work into short iterations. Form a plan, call tools, review the output, and keep going until the plan is complete.
4. Prefer targeted edits (apply_patch) over rewriting entire files. If you must create or replace a whole file, describe why.
5. When running commands, capture real output and summarize failures honestly. Do not invent results.
6. Reference workspace paths or symbols using Markdown backticks (\`path/to/file.ts\`).
7. Keep responses concise and outcome-oriented. Explain what you inspected, what you changed, how you verified it, and what remains.
8. If you hit a blocker (missing dependency, failing command, lacking permission), describe the issue and what you tried before asking for help.
Always think before acting, gather enough evidence, and prefer high-quality, minimal diffs. The user expects you to proactively explore, implement, and validate fixes without further guidance.`;
}


@@ -0,0 +1,101 @@
import fs from "node:fs/promises";
import type { Dirent } from "node:fs";
import path from "node:path";
import Execute from "Common/Server/Utils/Execute";
import AgentLogger from "../utils/AgentLogger";
export class WorkspaceContextBuilder {
public static async buildSnapshot(workspaceRoot: string): Promise<string> {
const absoluteRoot: string = path.resolve(workspaceRoot);
const sections: Array<string> = [`Workspace root: ${absoluteRoot}`];
AgentLogger.debug("Building workspace snapshot", {
workspaceRoot: absoluteRoot,
});
const branch: string | null = await this.tryGitCommand(
["rev-parse", "--abbrev-ref", "HEAD"],
absoluteRoot,
);
if (branch) {
sections.push(`Git branch: ${branch.trim()}`);
AgentLogger.debug("Detected git branch", { branch: branch.trim() });
}
const status: string | null = await this.tryGitCommand(
["status", "-sb"],
absoluteRoot,
);
if (status) {
sections.push(`Git status:\n${status.trim()}`);
AgentLogger.debug("Captured git status", {
statusLength: status.length,
});
}
const entries: Array<string> = await this.listTopLevelEntries(absoluteRoot);
sections.push(
`Top-level entries (${entries.length}): ${entries.join(", ")}`,
);
AgentLogger.debug("Listed top-level entries", {
entryCount: entries.length,
});
const snapshot: string = sections.join("\n");
AgentLogger.debug("Workspace snapshot complete", {
sectionCount: sections.length,
snapshotLength: snapshot.length,
});
return snapshot;
}
private static async listTopLevelEntries(
root: string,
): Promise<Array<string>> {
try {
const dirEntries: Array<Dirent> = await fs.readdir(root, {
withFileTypes: true,
});
return dirEntries
.filter((entry: Dirent) => {
return !entry.name.startsWith(".") && entry.name !== "node_modules";
})
.slice(0, 25)
.map((entry: Dirent) => {
return entry.isDirectory() ? `${entry.name}/` : entry.name;
});
} catch (error) {
AgentLogger.error("Unable to list workspace entries", error as Error);
return [];
} finally {
AgentLogger.debug("listTopLevelEntries completed", {
root,
});
}
}
private static async tryGitCommand(
args: Array<string>,
cwd: string,
): Promise<string | null> {
try {
const output: string = await Execute.executeCommandFile({
command: "git",
args,
cwd,
});
AgentLogger.debug("Git command succeeded", {
args,
cwd,
outputLength: output.length,
});
return output;
} catch (error) {
AgentLogger.debug("Git command failed", {
cwd,
args,
error: (error as Error).message,
});
return null;
}
}
}

Copilot/src/index.ts Normal file

@@ -0,0 +1,119 @@
#!/usr/bin/env node
import path from "node:path";
import { Command } from "commander";
import { CopilotAgent, CopilotAgentOptions } from "./agent/CopilotAgent";
import AgentLogger from "./utils/AgentLogger";
const program: Command = new Command();
program
.name("oneuptime-copilot-agent")
.description("Autonomous OneUptime coding agent for LM Studio hosted models")
.requiredOption(
"--prompt <text>",
"Problem statement or set of tasks for the agent",
)
.requiredOption(
"--model <url>",
"Full LM Studio chat-completions endpoint (for example http://localhost:1234/v1/chat/completions)",
)
.requiredOption(
"--workspace-path <path>",
"Path to the repository or folder the agent should work inside",
)
.option(
"--model-name <name>",
"Model identifier expected by the LM Studio endpoint",
"lmstudio",
)
.option(
"--temperature <value>",
"Sampling temperature passed to the model (default 0.1)",
"0.1",
)
.option(
"--max-iterations <count>",
"Maximum number of tool-calling rounds (default 12)",
"12",
)
.option(
"--timeout <ms>",
"HTTP timeout for each LLM request in milliseconds (default 120000)",
"120000",
)
.option(
"--api-key <token>",
"API key if the endpoint requires authentication",
)
.option(
"--log-level <level>",
"debug | info | warn | error (default info)",
process.env["LOG_LEVEL"] ?? "info",
)
.option(
"--log-file <path>",
"Optional file path to append all agent logs for auditing",
)
.parse(process.argv);
(async () => {
const opts: {
prompt: string;
model: string;
workspacePath: string;
modelName?: string;
temperature: string;
maxIterations: string;
timeout: string;
apiKey?: string;
logLevel?: string;
logFile?: string;
} = program.opts<{
prompt: string;
model: string;
workspacePath: string;
modelName?: string;
temperature: string;
maxIterations: string;
timeout: string;
apiKey?: string;
logLevel?: string;
logFile?: string;
}>();
process.env["LOG_LEVEL"] = opts.logLevel?.toUpperCase() ?? "INFO";
await AgentLogger.configure({ logFilePath: opts.logFile });
AgentLogger.debug("CLI options parsed", {
workspacePath: opts.workspacePath,
model: opts.model,
modelName: opts.modelName,
temperature: opts.temperature,
maxIterations: opts.maxIterations,
timeout: opts.timeout,
hasApiKey: Boolean(opts.apiKey),
logLevel: process.env["LOG_LEVEL"],
logFile: opts.logFile,
});
const config: CopilotAgentOptions = {
prompt: opts.prompt,
modelUrl: opts.model,
modelName: opts.modelName || "lmstudio",
workspacePath: path.resolve(opts.workspacePath),
    temperature: Number.isFinite(Number(opts.temperature))
      ? Number(opts.temperature)
      : 0.1,
maxIterations: Number(opts.maxIterations) || 12,
requestTimeoutMs: Number(opts.timeout) || 120000,
apiKey: opts.apiKey,
};
try {
const agent: CopilotAgent = new CopilotAgent(config);
await agent.run();
} catch (error) {
AgentLogger.error("Agent run failed", error as Error);
// eslint-disable-next-line no-console
console.error("Agent failed", error);
process.exit(1);
}
})();


@@ -0,0 +1,208 @@
import { fetch, Response } from "undici";
import { ChatMessage, ToolDefinition } from "../types";
import AgentLogger from "../utils/AgentLogger";
type SerializableMessage = Omit<ChatMessage, "tool_calls"> & {
tool_calls?: Array<{
id: string;
type: "function";
function: {
name: string;
arguments: string;
};
}>;
};
interface ChatCompletionRequestPayload {
model: string;
messages: Array<SerializableMessage>;
temperature: number;
tool_choice: "auto";
tools?: Array<ToolDefinition>;
}
interface OpenAIChatCompletionResponse {
choices: Array<{
index: number;
finish_reason: string;
message: {
role: "assistant";
content: unknown;
tool_calls?: Array<{
id: string;
type: "function";
function: {
name: string;
arguments: string;
};
}>;
};
}>;
usage?: {
prompt_tokens?: number;
completion_tokens?: number;
total_tokens?: number;
};
}
export interface LMStudioClientOptions {
endpoint: string;
model: string;
temperature: number;
timeoutMs: number;
apiKey?: string | undefined;
}
export class LMStudioClient {
public constructor(private readonly options: LMStudioClientOptions) {}
public async createChatCompletion(data: {
messages: Array<ChatMessage>;
tools?: Array<ToolDefinition>;
}): Promise<ChatMessage> {
const controller: AbortController = new AbortController();
const timeout: NodeJS.Timeout = setTimeout(() => {
controller.abort();
}, this.options.timeoutMs);
try {
AgentLogger.debug("Dispatching LLM request", {
endpoint: this.options.endpoint,
model: this.options.model,
messageCount: data.messages.length,
toolCount: data.tools?.length ?? 0,
temperature: this.options.temperature,
});
const payload: ChatCompletionRequestPayload = {
model: this.options.model,
messages: data.messages.map((message: ChatMessage) => {
const serialized: SerializableMessage = {
role: message.role,
content: message.content,
};
if (message.name !== undefined) {
serialized.name = message.name;
}
if (message.tool_call_id !== undefined) {
serialized.tool_call_id = message.tool_call_id;
}
if (message.tool_calls !== undefined) {
serialized.tool_calls = message.tool_calls;
}
return serialized;
}),
temperature: this.options.temperature,
tool_choice: "auto",
...(data.tools !== undefined ? { tools: data.tools } : {}),
};
AgentLogger.debug("LLM payload prepared", {
messageRoles: data.messages.map((message: ChatMessage) => {
return message.role;
}),
toolNames: data.tools?.map((tool: ToolDefinition) => {
return tool.function.name;
}),
});
const headers: Record<string, string> = {
"Content-Type": "application/json",
};
if (this.options.apiKey) {
headers["Authorization"] = `Bearer ${this.options.apiKey}`;
}
const response: Response = await fetch(this.options.endpoint, {
method: "POST",
headers,
body: JSON.stringify(payload),
signal: controller.signal,
});
if (!response.ok) {
const errorBody: string = await response.text();
AgentLogger.error("LLM request failed", {
status: response.status,
bodyPreview: errorBody.slice(0, 500),
});
throw new Error(
`LLM request failed (${response.status}): ${errorBody}`,
);
}
const body: OpenAIChatCompletionResponse =
(await response.json()) as OpenAIChatCompletionResponse;
AgentLogger.debug("LLM request succeeded", {
tokenUsage: body.usage,
choiceCount: body.choices?.length ?? 0,
});
if (!body.choices?.length) {
throw new Error("LLM returned no choices");
}
const assistantMessage:
| OpenAIChatCompletionResponse["choices"][number]["message"]
| undefined = body.choices[0]?.message;
if (!assistantMessage) {
throw new Error("LLM response missing assistant message");
}
const assistantResponse: ChatMessage = {
role: "assistant",
content: this.normalizeContent(assistantMessage.content),
};
if (assistantMessage.tool_calls !== undefined) {
assistantResponse.tool_calls = assistantMessage.tool_calls;
}
return assistantResponse;
} catch (error) {
AgentLogger.error("LLM request error", error as Error);
throw error;
} finally {
clearTimeout(timeout);
AgentLogger.debug("LLM request finalized");
}
}
private normalizeContent(content: unknown): string | null {
if (typeof content === "string" || content === null) {
return content;
}
if (!content) {
return null;
}
if (Array.isArray(content)) {
return content
.map((item: unknown) => {
if (typeof item === "string") {
return item;
}
if (
typeof item === "object" &&
item !== null &&
"text" in item &&
typeof (item as { text?: unknown }).text === "string"
) {
return (item as { text: string }).text;
}
return JSON.stringify(item);
})
.join("\n");
}
if (typeof content === "object") {
return JSON.stringify(content);
}
return String(content);
}
}


@@ -0,0 +1,200 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";
import { z } from "zod";
import Execute from "Common/Server/Utils/Execute";
import LocalFile from "Common/Server/Utils/LocalFile";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface ApplyPatchArgs {
patch: string;
note?: string | undefined;
}
export class ApplyPatchTool extends StructuredTool<ApplyPatchArgs> {
public readonly name: string = "apply_patch";
public readonly description: string =
"Applies a unified diff (or the *** Begin Patch format) to modify existing files precisely.";
public readonly parameters: JSONObject = {
type: "object",
required: ["patch"],
properties: {
patch: {
type: "string",
description:
"Unified diff or *** Begin Patch instructions describing the edits to apply.",
},
note: {
type: "string",
description:
"Optional short description of why this patch is being applied.",
},
},
};
protected schema = z
.object({
patch: z.string().min(10),
note: z.string().max(2000).optional(),
})
.strict();
public async execute(
args: ApplyPatchArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
AgentLogger.debug("ApplyPatchTool invoked", {
note: args.note,
});
const normalizedPatch: string = this.normalizePatchFormat(
args.patch,
runtime,
);
AgentLogger.debug("Patch normalized", {
originalLength: args.patch.length,
normalizedLength: normalizedPatch.length,
});
if (!normalizedPatch.trim()) {
return {
content: "Patch payload was empty. Nothing was applied.",
isError: true,
};
}
const tempDir: string = await fs.mkdtemp(
path.join(os.tmpdir(), "oneuptime-patch-"),
);
const patchFile: string = path.join(tempDir, "patch.diff");
await fs.writeFile(patchFile, normalizedPatch, { encoding: "utf8" });
try {
await Execute.executeCommandFile({
command: "git",
args: [
"apply",
"--whitespace=nowarn",
"--reject",
"--unidiff-zero",
patchFile,
],
cwd: runtime.workspaceRoot,
});
return {
content: `Patch applied successfully${args.note ? `: ${args.note}` : "."}`,
};
} catch (error) {
AgentLogger.error("Patch application failed", error as Error);
const filePreview: string = await LocalFile.read(patchFile);
return {
content: `Failed to apply patch. Please review the diff and adjust.\n${filePreview}`,
isError: true,
};
} finally {
await fs.rm(tempDir, { recursive: true, force: true });
}
}
private normalizePatchFormat(input: string, runtime: ToolRuntime): string {
if (input.includes("diff --git")) {
AgentLogger.debug("Patch already in diff format", {
length: input.length,
});
return input;
}
const matches: RegExpMatchArray[] = Array.from(
input.matchAll(/\*\*\* Begin Patch([\s\S]*?)\*\*\* End Patch/g),
);
if (!matches.length) {
AgentLogger.debug("No patch sections detected", {
inputLength: input.length,
});
return input;
}
const sections: Array<string> = [];
AgentLogger.debug("Processing patch sections", {
sectionCount: matches.length,
});
for (const match of matches) {
const body: string | undefined = match[1];
if (!body) {
continue;
}
const fileBlocks: RegExpMatchArray[] = Array.from(
body.matchAll(
/\*\*\* (Update|Add|Delete) File: (.+)\n([\s\S]*?)(?=\*\*\* (Update|Add|Delete) File: |$)/g,
),
);
for (const block of fileBlocks) {
const action: string | undefined = block[1];
const filePathRaw: string | undefined = block[2];
const diffBodyRaw: string | undefined = block[3];
if (!action || !filePathRaw || !diffBodyRaw) {
continue;
}
const filePath: string = filePathRaw.trim();
const diffBody: string = diffBodyRaw.trim();
if (!diffBody) {
AgentLogger.debug("Skipping empty patch block", {
action,
filePath,
});
continue;
}
const absolute: string = runtime.workspacePaths.resolve(filePath);
const relative: string = runtime.workspacePaths.relative(absolute);
AgentLogger.debug("Patch block resolved", {
action,
relative,
});
if (action === "Add") {
sections.push(
[
`diff --git a/${relative} b/${relative}`,
`--- /dev/null`,
`+++ b/${relative}`,
diffBody,
].join("\n"),
);
} else if (action === "Delete") {
sections.push(
[
`diff --git a/${relative} b/${relative}`,
`--- a/${relative}`,
`+++ /dev/null`,
diffBody,
].join("\n"),
);
} else {
sections.push(
[
`diff --git a/${relative} b/${relative}`,
`--- a/${relative}`,
`+++ b/${relative}`,
diffBody,
].join("\n"),
);
}
}
}
const finalPatch: string = sections.join("\n");
AgentLogger.debug("Patch sections assembled", {
sectionCount: sections.length,
finalLength: finalPatch.length,
});
return finalPatch;
}
}
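The header rewrite performed by `normalizePatchFormat` can be sketched in miniature. The helper below handles a single `*** Add File` section only (the real tool also covers Update/Delete blocks and multiple sections), and the `docs/notes.md` path in the example is made up.

```typescript
// Reduced sketch of the "*** Begin Patch" to "diff --git" rewrite for one
// Add File block; paths and patch contents are placeholders.
function normalizeAddPatch(input: string): string | null {
  const section: RegExpMatchArray | null = input.match(
    /\*\*\* Begin Patch([\s\S]*?)\*\*\* End Patch/,
  );
  const body: string | undefined = section?.[1];
  if (!body) {
    return null;
  }
  const block: RegExpMatchArray | null = body.match(
    /\*\*\* Add File: (.+)\n([\s\S]*?)$/,
  );
  const filePathRaw: string | undefined = block?.[1];
  const diffBodyRaw: string | undefined = block?.[2];
  if (!filePathRaw || !diffBodyRaw) {
    return null;
  }
  const filePath: string = filePathRaw.trim();
  return [
    `diff --git a/${filePath} b/${filePath}`,
    "--- /dev/null",
    `+++ b/${filePath}`,
    diffBodyRaw.trim(),
  ].join("\n");
}

const examplePatch: string = [
  "*** Begin Patch",
  "*** Add File: docs/notes.md",
  "@@ -0,0 +1,1 @@",
  "+hello",
  "*** End Patch",
].join("\n");

console.log(normalizeAddPatch(examplePatch));
```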


@@ -0,0 +1,176 @@
import fs from "node:fs/promises";
import type { Dirent } from "node:fs";
import path from "node:path";
import { z } from "zod";
import LocalFile from "Common/Server/Utils/LocalFile";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface ListDirectoryArgs {
path?: string | undefined;
depth?: number | undefined;
includeFiles?: boolean | undefined;
limit?: number | undefined;
}
export class ListDirectoryTool extends StructuredTool<ListDirectoryArgs> {
public readonly name: string = "list_directory";
public readonly description: string =
"Lists files and folders inside the workspace to help you locate relevant code.";
public readonly parameters: JSONObject = {
type: "object",
properties: {
path: {
type: "string",
description: "Directory to inspect. Defaults to the workspace root.",
},
depth: {
type: "integer",
minimum: 1,
maximum: 5,
description: "How deep to recurse into subdirectories (default 2).",
},
includeFiles: {
type: "boolean",
description: "Include file entries as well as directories.",
},
limit: {
type: "integer",
minimum: 1,
maximum: 400,
description: "Maximum number of entries to return (default 80).",
},
},
};
protected schema = z
.object({
path: z.string().trim().optional(),
depth: z.number().int().min(1).max(5).optional().default(2),
includeFiles: z.boolean().optional().default(true),
limit: z.number().int().min(1).max(400).optional().default(80),
})
.strict();
public async execute(
args: ListDirectoryArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
AgentLogger.debug("ListDirectoryTool executing", {
path: args.path,
depth: args.depth,
includeFiles: args.includeFiles,
limit: args.limit,
});
const targetPath: string = runtime.workspacePaths.resolve(args.path ?? ".");
if (!(await LocalFile.doesDirectoryExist(targetPath))) {
AgentLogger.warn("ListDirectoryTool target missing", {
targetPath,
});
return {
content: `Directory ${args.path ?? "."} does not exist in the workspace`,
isError: true,
};
}
const rows: Array<string> = [];
await this.walkDirectory({
current: targetPath,
currentDepth: 0,
maxDepth: args.depth ?? 2,
includeFiles: args.includeFiles ?? true,
limit: args.limit ?? 80,
output: rows,
runtime,
});
const relativeRoot: string = runtime.workspacePaths.relative(targetPath);
const header: string = `Listing ${rows.length} item(s) under ${relativeRoot || "."}`;
AgentLogger.debug("ListDirectoryTool completed", {
relativeRoot,
rowCount: rows.length,
});
return {
content: [header, ...rows].join("\n"),
};
}
private async walkDirectory(data: {
current: string;
currentDepth: number;
maxDepth: number;
includeFiles: boolean;
limit: number;
output: Array<string>;
runtime: ToolRuntime;
}): Promise<void> {
if (data.output.length >= data.limit) {
return;
}
const entries: Array<Dirent> = await fs.readdir(data.current, {
withFileTypes: true,
});
entries.sort((a: Dirent, b: Dirent) => {
return a.name.localeCompare(b.name);
});
AgentLogger.debug("Listing directory entries", {
current: data.current,
depth: data.currentDepth,
entryCount: entries.length,
});
for (let index: number = 0; index < entries.length; index += 1) {
const entry: Dirent | undefined = entries[index];
if (entry === undefined) {
AgentLogger.warn("Missing directory entry during traversal", {
directory: data.current,
requestedIndex: index,
});
continue;
}
if (data.output.length >= data.limit) {
break;
}
if (this.shouldSkip(entry.name)) {
AgentLogger.debug("Skipping directory entry", {
entry: entry.name,
});
continue;
}
const absoluteEntry: string = path.join(data.current, entry.name);
const prefix: string = " ".repeat(data.currentDepth);
if (entry.isDirectory()) {
data.output.push(`${prefix}${entry.name}/`);
if (data.currentDepth + 1 < data.maxDepth) {
await this.walkDirectory({
...data,
current: absoluteEntry,
currentDepth: data.currentDepth + 1,
});
}
} else if (data.includeFiles) {
data.output.push(`${prefix}${entry.name}`);
}
}
}
private shouldSkip(entryName: string): boolean {
const blocked: Array<string> = [
".git",
"node_modules",
".turbo",
"dist",
"build",
".next",
];
return blocked.includes(entryName);
}
}
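The depth- and count-capped traversal above can be shown in miniature against an in-memory tree instead of the filesystem; the tree contents below are invented for illustration.

```typescript
// Miniature of the depth/limit-capped walk in ListDirectoryTool; the tree
// structure is a made-up example, not real workspace contents.
interface TreeNode {
  name: string;
  children?: Array<TreeNode>;
}

function walk(
  node: TreeNode,
  depth: number,
  maxDepth: number,
  limit: number,
  output: Array<string>,
): void {
  for (const child of node.children ?? []) {
    if (output.length >= limit) {
      return;
    }
    const prefix: string = " ".repeat(depth);
    output.push(`${prefix}${child.name}${child.children ? "/" : ""}`);
    if (child.children && depth + 1 < maxDepth) {
      walk(child, depth + 1, maxDepth, limit, output);
    }
  }
}

const tree: TreeNode = {
  name: ".",
  children: [
    {
      name: "src",
      children: [
        { name: "index.ts" },
        { name: "utils", children: [{ name: "log.ts" }] },
      ],
    },
    { name: "README.md" },
  ],
};

const rows: Array<string> = [];
walk(tree, 0, 2, 80, rows);
console.log(rows.join("\n"));
// src/ and README.md at depth 0; utils/ is listed but not entered (maxDepth 2)
```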


@@ -0,0 +1,113 @@
import { z } from "zod";
import LocalFile from "Common/Server/Utils/LocalFile";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface ReadFileArgs {
path: string;
startLine?: number | undefined;
endLine?: number | undefined;
limit?: number | undefined;
}
export class ReadFileTool extends StructuredTool<ReadFileArgs> {
public readonly name: string = "read_file";
public readonly description: string =
"Reads a file from the workspace so you can inspect existing code before editing.";
public readonly parameters: JSONObject = {
type: "object",
required: ["path"],
properties: {
path: {
type: "string",
description: "File path relative to the workspace root.",
},
startLine: {
type: "integer",
minimum: 1,
description: "Optional starting line (1-indexed).",
},
endLine: {
type: "integer",
minimum: 1,
description: "Optional ending line (inclusive).",
},
limit: {
type: "integer",
minimum: 100,
maximum: 20000,
description: "Maximum number of characters to return (default 6000).",
},
},
};
protected schema = z
.object({
path: z.string().min(1),
startLine: z.number().int().min(1).optional(),
endLine: z.number().int().min(1).optional(),
limit: z.number().int().min(100).max(20000).optional().default(6000),
})
.strict()
.refine((data: ReadFileArgs) => {
if (data.startLine && data.endLine) {
return data.endLine >= data.startLine;
}
return true;
}, "endLine must be greater than startLine");
public async execute(
args: ReadFileArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
AgentLogger.debug("ReadFileTool executing", {
path: args.path,
startLine: args.startLine,
endLine: args.endLine,
limit: args.limit,
});
const absolutePath: string = runtime.workspacePaths.resolve(args.path);
if (!(await LocalFile.doesFileExist(absolutePath))) {
AgentLogger.warn("ReadFileTool missing file", {
absolutePath,
});
return {
content: `File ${args.path} does not exist in the workspace`,
isError: true,
};
}
const rawContent: string = await LocalFile.read(absolutePath);
const lines: Array<string> = rawContent.split(/\r?\n/);
const start: number = (args.startLine ?? 1) - 1;
const end: number = args.endLine ? args.endLine : lines.length;
const slice: Array<string> = lines.slice(start, end);
let text: string = slice.join("\n");
let truncated: boolean = false;
const limit: number = args.limit ?? 6000;
if (text.length > limit) {
text = text.substring(0, limit);
truncated = true;
AgentLogger.debug("ReadFileTool output truncated", {
limit,
});
}
const relative: string = runtime.workspacePaths.relative(absolutePath);
const header: string = `Contents of ${relative} (lines ${start + 1}-${Math.min(end, lines.length)})`;
AgentLogger.debug("ReadFileTool completed", {
relative,
truncated,
returnedChars: text.length,
});
return {
content: truncated
? `${header}\n${text}\n... [truncated]`
: `${header}\n${text}`,
};
}
}
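The windowing arithmetic in `execute` is easy to get wrong (1-indexed input, end-inclusive, zero-indexed `slice`), so here it is extracted into a standalone helper for illustration:

```typescript
// The 1-indexed, end-inclusive line window used by ReadFileTool.execute,
// pulled out into a helper for clarity.
function sliceLines(
  lines: Array<string>,
  startLine?: number,
  endLine?: number,
): Array<string> {
  const start: number = (startLine ?? 1) - 1;
  const end: number = endLine ? endLine : lines.length;
  return lines.slice(start, end);
}

console.log(sliceLines(["a", "b", "c", "d"], 2, 3)); // [ 'b', 'c' ]
```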


@@ -0,0 +1,93 @@
import { ExecOptions } from "node:child_process";
import { z } from "zod";
import Execute from "Common/Server/Utils/Execute";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface RunCommandArgs {
command: string;
path?: string | undefined;
timeoutMs?: number | undefined;
}
export class RunCommandTool extends StructuredTool<RunCommandArgs> {
public readonly name: string = "run_command";
public readonly description: string =
"Runs a shell command inside the workspace (for unit tests, linters, or project-specific scripts).";
public readonly parameters: JSONObject = {
type: "object",
required: ["command"],
properties: {
command: {
type: "string",
description:
"Shell command to execute. Prefer running package scripts instead of raw binaries when possible.",
},
path: {
type: "string",
description: "Optional subdirectory to run the command from.",
},
timeoutMs: {
type: "integer",
minimum: 1000,
maximum: 1800000,
description: "Timeout in milliseconds (default 10 minutes).",
},
},
};
protected schema = z
.object({
command: z.string().min(1),
path: z.string().trim().optional(),
timeoutMs: z.number().int().min(1000).max(1800000).optional(),
})
.strict();
public async execute(
args: RunCommandArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
const cwd: string = args.path
? runtime.workspacePaths.resolve(args.path)
: runtime.workspaceRoot;
AgentLogger.debug("RunCommandTool executing", {
command: args.command,
cwd,
timeoutMs: args.timeoutMs,
});
const options: ExecOptions = {
cwd,
timeout: args.timeoutMs ?? 10 * 60 * 1000,
maxBuffer: 8 * 1024 * 1024,
};
AgentLogger.debug("RunCommandTool options prepared", {
cwd,
timeout: options.timeout,
maxBuffer: options.maxBuffer,
});
try {
const output: string = await Execute.executeCommand(
args.command,
options,
);
AgentLogger.debug("RunCommandTool succeeded", {
command: args.command,
cwd,
outputPreview: output.slice(0, 500),
});
return {
content: `Command executed in ${runtime.workspacePaths.relative(cwd) || "."}\n$ ${args.command}\n${output.trim()}`,
};
} catch (error) {
AgentLogger.error("RunCommandTool failed", error as Error);
return {
content: `Command failed: ${args.command}\n${(error as Error).message}`,
isError: true,
};
}
}
}


@@ -0,0 +1,166 @@
import { z } from "zod";
import Execute from "Common/Server/Utils/Execute";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface SearchArgs {
query: string;
path?: string | undefined;
useRegex?: boolean | undefined;
maxResults?: number | undefined;
}
export class SearchWorkspaceTool extends StructuredTool<SearchArgs> {
public readonly name: string = "search_workspace";
public readonly description: string =
"Searches the workspace for a literal string or regular expression to quickly find relevant files.";
public readonly parameters: JSONObject = {
type: "object",
required: ["query"],
properties: {
query: {
type: "string",
description: "String or regex to search for.",
},
path: {
type: "string",
description:
"Optional folder to scope the search to. Defaults to the workspace root.",
},
useRegex: {
type: "boolean",
description: "Set true to treat query as a regular expression.",
},
maxResults: {
type: "integer",
minimum: 1,
maximum: 200,
description: "Maximum number of matches to return (default 40).",
},
},
};
protected schema = z
.object({
query: z.string().min(2),
path: z.string().trim().optional(),
useRegex: z.boolean().optional().default(false),
maxResults: z.number().int().min(1).max(200).optional().default(40),
})
.strict();
public async execute(
args: SearchArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
const cwd: string = args.path
? runtime.workspacePaths.resolve(args.path)
: runtime.workspaceRoot;
const relativeScope: string = runtime.workspacePaths.relative(cwd);
AgentLogger.debug("SearchWorkspaceTool executing", {
query: args.query,
path: args.path,
useRegex: args.useRegex,
maxResults: args.maxResults,
});
try {
const rgOutput: string = await this.runRipgrep(args, cwd);
AgentLogger.debug("SearchWorkspaceTool ripgrep success", {
scope: relativeScope,
});
return {
content: this.decorateSearchResult({
engine: "ripgrep",
scope: relativeScope,
body: rgOutput,
}),
};
} catch (rgError) {
AgentLogger.debug(
"SearchWorkspaceTool ripgrep failed, falling back to grep",
{
error: (rgError as Error).message,
},
);
const fallbackOutput: string = await this.runGrep(args, cwd);
AgentLogger.debug("SearchWorkspaceTool grep success", {
scope: relativeScope,
});
return {
content: this.decorateSearchResult({
engine: "grep",
scope: relativeScope,
body: fallbackOutput,
}),
isError: false,
};
}
}
private async runRipgrep(args: SearchArgs, cwd: string): Promise<string> {
const cliArgs: Array<string> = [
"--line-number",
"--color",
"never",
"--no-heading",
"--context",
"2",
"--max-filesize",
"200K",
"--max-columns",
"200",
"--max-count",
String(args.maxResults ?? 40),
"--glob",
"!*node_modules/*",
"--glob",
"!*.lock",
"--glob",
"!.git/*",
];
if (!args.useRegex) {
cliArgs.push("--fixed-strings");
}
cliArgs.push(args.query);
cliArgs.push(".");
return Execute.executeCommandFile({
command: "rg",
args: cliArgs,
cwd,
});
}
private async runGrep(args: SearchArgs, cwd: string): Promise<string> {
const finalArgs: Array<string> = [
"-R",
"-n",
"-C",
"2",
args.useRegex ? "-E" : "-F",
args.query,
".",
];
return Execute.executeCommandFile({
command: "grep",
args: finalArgs,
cwd,
});
}
private decorateSearchResult(data: {
engine: string;
scope: string;
body: string;
}): string {
const scope: string = data.scope || ".";
const trimmedBody: string = data.body.trim() || "No matches found";
return `Search (${data.engine}) under ${scope}\n${trimmedBody}`;
}
}
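The grep fallback used by `runGrep` can be reproduced directly in a shell; the scratch directory and file contents below are throwaway examples.

```shell
# Reproduce the grep fallback flags from SearchWorkspaceTool.runGrep
# (-R recursive, -n line numbers, -C 2 context, -F literal string)
# against a temporary file; the path is just an example.
demo_dir="$(mktemp -d)"
printf 'alpha\nneedle\nomega\n' > "$demo_dir/sample.txt"
grep -R -n -C 2 -F "needle" "$demo_dir"
```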

Copilot/src/tools/Tool.ts Normal file

@@ -0,0 +1,59 @@
import { z } from "zod";
import { JSONObject } from "Common/Types/JSON";
import { ToolDefinition } from "../types";
import { WorkspacePaths } from "../utils/WorkspacePaths";
import AgentLogger from "../utils/AgentLogger";
export interface ToolRuntime {
workspacePaths: WorkspacePaths;
workspaceRoot: string;
}
export interface ToolResponse {
content: string;
isError?: boolean;
}
export interface AgentTool<TArgs> {
readonly name: string;
readonly description: string;
readonly parameters: JSONObject;
getDefinition(): ToolDefinition;
parse(input: unknown): TArgs;
execute(args: TArgs, runtime: ToolRuntime): Promise<ToolResponse>;
}
export abstract class StructuredTool<TArgs> implements AgentTool<TArgs> {
public abstract readonly name: string;
public abstract readonly description: string;
public abstract readonly parameters: JSONObject;
protected abstract schema: z.ZodType<TArgs>;
public getDefinition(): ToolDefinition {
return {
type: "function",
function: {
name: this.name,
description: this.description,
parameters: this.parameters,
},
};
}
public parse(input: unknown): TArgs {
AgentLogger.debug("Parsing tool arguments", {
tool: this.name,
inputType: typeof input,
});
const parsed: TArgs = this.schema.parse(input ?? {});
AgentLogger.debug("Parsed tool arguments", {
tool: this.name,
});
return parsed;
}
public abstract execute(
args: TArgs,
runtime: ToolRuntime,
): Promise<ToolResponse>;
}
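A tool satisfying this contract can be sketched without zod by validating arguments by hand. The `echo` tool below is purely hypothetical and is not part of the registry; it only shows the shape that `getDefinition`, `parse`, and `execute` must take.

```typescript
// Hand-rolled stand-in for a StructuredTool subclass (no zod dependency);
// "echo" is a made-up tool used only to illustrate the contract.
interface EchoArgs {
  message: string;
}

const echoTool = {
  name: "echo",
  description: "Echoes a message back (illustrative only).",
  parameters: {
    type: "object",
    required: ["message"],
    properties: {
      message: { type: "string", description: "Text to echo." },
    },
  },
  getDefinition() {
    return {
      type: "function" as const,
      function: {
        name: this.name,
        description: this.description,
        parameters: this.parameters,
      },
    };
  },
  parse(input: unknown): EchoArgs {
    const candidate: { message?: unknown } = (input ?? {}) as {
      message?: unknown;
    };
    if (typeof candidate.message !== "string") {
      throw new Error("echo requires a string `message` argument");
    }
    return { message: candidate.message };
  },
  async execute(args: EchoArgs): Promise<{ content: string }> {
    return { content: `echo: ${args.message}` };
  },
};

console.log(echoTool.getDefinition().function.name); // echo
```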


@@ -0,0 +1,122 @@
import { OpenAIToolCall, ToolDefinition, ToolExecutionResult } from "../types";
import { WorkspacePaths } from "../utils/WorkspacePaths";
import { ApplyPatchTool } from "./ApplyPatchTool";
import { ListDirectoryTool } from "./ListDirectoryTool";
import { ReadFileTool } from "./ReadFileTool";
import { RunCommandTool } from "./RunCommandTool";
import { SearchWorkspaceTool } from "./SearchWorkspaceTool";
import { AgentTool, ToolResponse, ToolRuntime } from "./Tool";
import { WriteFileTool } from "./WriteFileTool";
import AgentLogger from "../utils/AgentLogger";
export class ToolRegistry {
private readonly tools: Map<string, AgentTool<unknown>>;
private readonly runtime: ToolRuntime;
public constructor(workspaceRoot: string) {
const workspacePaths: WorkspacePaths = new WorkspacePaths(workspaceRoot);
this.runtime = {
workspacePaths,
workspaceRoot: workspacePaths.getRoot(),
};
AgentLogger.debug("Tool registry initialized", {
workspaceRoot: workspacePaths.getRoot(),
});
const toolInstances: Array<AgentTool<unknown>> = [
new ListDirectoryTool(),
new ReadFileTool(),
new SearchWorkspaceTool(),
new ApplyPatchTool(),
new WriteFileTool(),
new RunCommandTool(),
];
this.tools = new Map(
toolInstances.map((tool: AgentTool<unknown>) => {
return [tool.name, tool];
}),
);
}
public getToolDefinitions(): Array<ToolDefinition> {
const definitions: Array<ToolDefinition> = Array.from(
this.tools.values(),
).map((tool: AgentTool<unknown>) => {
return tool.getDefinition();
});
AgentLogger.debug("Tool definitions requested", {
count: definitions.length,
toolNames: definitions.map((definition: ToolDefinition) => {
return definition.function.name;
}),
});
return definitions;
}
public async execute(call: OpenAIToolCall): Promise<ToolExecutionResult> {
const tool: AgentTool<unknown> | undefined = this.tools.get(
call.function.name,
);
if (!tool) {
const message: string = `Tool ${call.function.name} is not available.`;
AgentLogger.error(message);
return {
toolCallId: call.id,
output: message,
};
}
let parsedArgs: unknown;
try {
parsedArgs = call.function.arguments
? JSON.parse(call.function.arguments)
: {};
AgentLogger.debug("Tool arguments parsed", {
toolName: call.function.name,
argumentKeys:
typeof parsedArgs === "object" && parsedArgs !== null
? Object.keys(parsedArgs as Record<string, unknown>)
: [],
});
} catch (error) {
const message: string = `Unable to parse tool arguments for ${call.function.name}: ${(error as Error).message}`;
AgentLogger.error(message);
return {
toolCallId: call.id,
output: message,
};
}
try {
AgentLogger.debug("Executing tool via registry", {
toolName: call.function.name,
});
const typedArgs: unknown = tool.parse(parsedArgs);
AgentLogger.debug("Tool arguments validated", {
toolName: call.function.name,
});
const response: ToolResponse = await tool.execute(
typedArgs,
this.runtime,
);
const prefix: string = response.isError ? "ERROR: " : "";
AgentLogger.debug("Tool execution result", {
toolName: call.function.name,
isError: response.isError ?? false,
});
return {
toolCallId: call.id,
output: `${prefix}${response.content}`,
};
} catch (error) {
const message: string = `Tool ${call.function.name} failed: ${(error as Error).message}`;
AgentLogger.error(message, error as Error);
return {
toolCallId: call.id,
output: message,
};
}
}
}


@@ -0,0 +1,77 @@
import fs from "node:fs/promises";
import { z } from "zod";
import LocalFile from "Common/Server/Utils/LocalFile";
import { JSONObject } from "Common/Types/JSON";
import { StructuredTool, ToolResponse, ToolRuntime } from "./Tool";
import AgentLogger from "../utils/AgentLogger";
interface WriteFileArgs {
path: string;
content: string;
mode?: "overwrite" | "append" | undefined;
}
export class WriteFileTool extends StructuredTool<WriteFileArgs> {
public readonly name: string = "write_file";
public readonly description: string =
"Creates a new file or replaces an existing file with the provided content. Use this for docs, configs, or single-file outputs.";
public readonly parameters: JSONObject = {
type: "object",
required: ["path", "content"],
properties: {
path: {
type: "string",
description: "File path relative to the workspace root.",
},
content: {
type: "string",
description: "Entire file content to be written.",
},
mode: {
type: "string",
enum: ["overwrite", "append"],
description:
"Overwrite replaces the file (default). Append adds content to the end of the file.",
},
},
};
protected schema = z
.object({
path: z.string().min(1),
content: z.string().min(1),
mode: z.enum(["overwrite", "append"]).optional().default("overwrite"),
})
.strict();
public async execute(
args: WriteFileArgs,
runtime: ToolRuntime,
): Promise<ToolResponse> {
const absolutePath: string = runtime.workspacePaths.resolve(args.path);
AgentLogger.debug("WriteFileTool executing", {
path: args.path,
mode: args.mode,
contentLength: args.content.length,
});
await runtime.workspacePaths.ensureParentDirectory(absolutePath);
if (
args.mode === "append" &&
(await LocalFile.doesFileExist(absolutePath))
) {
await fs.appendFile(absolutePath, args.content);
} else {
await LocalFile.write(absolutePath, args.content);
}
const relative: string = runtime.workspacePaths.relative(absolutePath);
AgentLogger.debug("WriteFileTool completed", {
relative,
mode: args.mode ?? "overwrite",
});
return {
content: `${args.mode === "append" ? "Appended to" : "Wrote"} ${relative} (${args.content.length} characters).`,
};
}
}
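The `execute` method only appends when the mode is `"append"` and the file already exists; otherwise it falls through to a plain write, so appending to a missing file just creates it. A standalone sketch of that branch, using plain `node:fs` in place of the repository's `LocalFile` helper (an assumption for illustration):

```typescript
import fs from "node:fs/promises";
import fsSync from "node:fs";
import os from "node:os";
import path from "node:path";

// Sketch of the overwrite/append branch in WriteFileTool.execute().
async function writeWithMode(
  absolutePath: string,
  content: string,
  mode: "overwrite" | "append" = "overwrite",
): Promise<void> {
  // Mirrors ensureParentDirectory(): create missing parent folders first.
  await fs.mkdir(path.dirname(absolutePath), { recursive: true });
  if (mode === "append" && fsSync.existsSync(absolutePath)) {
    await fs.appendFile(absolutePath, content);
  } else {
    // "overwrite", or "append" on a file that does not exist yet.
    await fs.writeFile(absolutePath, content);
  }
}

const dir = await fs.mkdtemp(path.join(os.tmpdir(), "wf-"));
const file = path.join(dir, "notes.txt");
await writeWithMode(file, "one\n"); // creates the file
await writeWithMode(file, "two\n", "append"); // appends to it
```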

Copilot/src/types.ts

@@ -0,0 +1,34 @@
import { JSONObject } from "Common/Types/JSON";
export type ChatRole = "system" | "user" | "assistant" | "tool";
export interface ChatMessage {
role: ChatRole;
content: string | null;
name?: string | undefined;
tool_call_id?: string | undefined;
tool_calls?: Array<OpenAIToolCall> | undefined;
}
export interface OpenAIToolCall {
id: string;
type: "function";
function: {
name: string;
arguments: string;
};
}
export interface ToolDefinition {
type: "function";
function: {
name: string;
description: string;
parameters: JSONObject;
};
}
export interface ToolExecutionResult {
toolCallId: string;
output: string;
}
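These shapes mirror the OpenAI chat-completions tool-calling wire format: an assistant turn requesting a tool has `content: null` plus a `tool_calls` array, and each tool result comes back as a `role: "tool"` message linked by `tool_call_id`. An illustrative round trip (types re-declared here so the snippet stands alone; the `write_file` payload is an example, not a recorded exchange):

```typescript
type ChatRole = "system" | "user" | "assistant" | "tool";

interface OpenAIToolCall {
  id: string;
  type: "function";
  function: { name: string; arguments: string };
}

interface ChatMessage {
  role: ChatRole;
  content: string | null;
  name?: string;
  tool_call_id?: string;
  tool_calls?: Array<OpenAIToolCall>;
}

// 1. The assistant requests a tool: content is null, tool_calls carries
//    the function name and a JSON-encoded argument string.
const assistantTurn: ChatMessage = {
  role: "assistant",
  content: null,
  tool_calls: [
    {
      id: "call_1",
      type: "function",
      function: {
        name: "write_file",
        arguments: '{"path":"a.txt","content":"hi"}',
      },
    },
  ],
};

// 2. The tool result is echoed back as a role:"tool" message whose
//    tool_call_id ties it to the originating call.
const toolTurn: ChatMessage = {
  role: "tool",
  content: "Wrote a.txt (2 characters).",
  tool_call_id: "call_1",
};
```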


@@ -0,0 +1,133 @@
import fs from "node:fs";
import path from "node:path";
import logger, { LogBody } from "Common/Server/Utils/Logger";
export type AgentLogLevel = "DEBUG" | "INFO" | "WARN" | "ERROR";
export class AgentLogger {
private static logStream: fs.WriteStream | null = null;
private static logFilePath: string | null = null;
private static exitHandlersRegistered: boolean = false;
private static fileWriteFailed: boolean = false;
public static async configure(options: {
logFilePath?: string | undefined;
}): Promise<void> {
const targetPath: string | undefined = options.logFilePath?.trim()
? path.resolve(options.logFilePath)
: undefined;
if (!targetPath) {
await this.closeStream();
this.logFilePath = null;
logger.debug("File logging disabled");
return;
}
if (this.logFilePath === targetPath && this.logStream) {
return;
}
await this.closeStream();
await fs.promises.mkdir(path.dirname(targetPath), { recursive: true });
this.logStream = fs.createWriteStream(targetPath, { flags: "a" });
this.logFilePath = targetPath;
this.fileWriteFailed = false;
this.registerExitHandlers();
this.info(`File logging enabled at ${targetPath}`);
}
public static debug(message: LogBody, meta?: unknown): void {
logger.debug(message);
this.writeToFile("DEBUG", message, meta);
}
public static info(message: LogBody, meta?: unknown): void {
logger.info(message);
this.writeToFile("INFO", message, meta);
}
public static warn(message: LogBody, meta?: unknown): void {
logger.warn(message);
this.writeToFile("WARN", message, meta);
}
public static error(message: LogBody, meta?: unknown): void {
logger.error(message);
this.writeToFile("ERROR", message, meta);
}
private static async closeStream(): Promise<void> {
if (!this.logStream) {
return;
}
await new Promise<void>((resolve: () => void) => {
this.logStream?.end(resolve);
});
this.logStream = null;
logger.debug("File logging stream closed");
}
private static writeToFile(
level: AgentLogLevel,
message: LogBody,
meta?: unknown,
): void {
if (!this.logStream) {
return;
}
const timestamp: string = new Date().toISOString();
const serializedMessage: string = logger.serializeLogBody(message);
const serializedMeta: string | null = this.serializeMeta(meta);
const line: string = serializedMeta
? `${timestamp} [${level}] ${serializedMessage} ${serializedMeta}`
: `${timestamp} [${level}] ${serializedMessage}`;
try {
this.logStream.write(line + "\n");
} catch (error) {
if (!this.fileWriteFailed) {
this.fileWriteFailed = true;
logger.error(
`Failed to write logs to ${this.logFilePath ?? "<unknown>"}: ${(error as Error).message}`,
);
}
}
}
private static serializeMeta(meta?: unknown): string | null {
if (meta === undefined || meta === null) {
return null;
}
if (typeof meta === "string") {
return meta;
}
try {
return JSON.stringify(meta);
} catch (error) {
return `"<unserializable meta: ${(error as Error).message}>"`;
}
}
private static registerExitHandlers(): void {
if (this.exitHandlersRegistered) {
return;
}
const cleanup: () => void = () => {
void this.closeStream();
};
process.once("exit", cleanup);
process.once("SIGINT", cleanup);
process.once("SIGTERM", cleanup);
this.exitHandlersRegistered = true;
}
}
export default AgentLogger;
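Each file line follows the format `<ISO timestamp> [<LEVEL>] <message> <serialized meta>`, with the meta suffix omitted when there is nothing to serialize. A minimal standalone sketch of that formatting step (the `formatLogLine` helper is illustrative, not a method on the class):

```typescript
// Sketch of the line AgentLogger.writeToFile() emits.
function formatLogLine(
  level: string,
  message: string,
  meta?: unknown,
): string {
  const timestamp: string = new Date().toISOString();
  let serializedMeta: string | null = null;
  if (meta !== undefined && meta !== null) {
    // Strings pass through as-is; objects are JSON-stringified.
    serializedMeta = typeof meta === "string" ? meta : JSON.stringify(meta);
  }
  return serializedMeta
    ? `${timestamp} [${level}] ${message} ${serializedMeta}`
    : `${timestamp} [${level}] ${message}`;
}

const line = formatLogLine("INFO", "Tool executed", {
  toolName: "write_file",
});
// e.g. 2025-12-02T13:30:25.000Z [INFO] Tool executed {"toolName":"write_file"}
```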


@@ -0,0 +1,69 @@
import path from "node:path";
import fs from "node:fs/promises";
import LocalFile from "Common/Server/Utils/LocalFile";
import BadDataException from "Common/Types/Exception/BadDataException";
import AgentLogger from "./AgentLogger";
export class WorkspacePaths {
private readonly root: string;
public constructor(workspaceRoot: string) {
this.root = path.resolve(workspaceRoot);
}
public resolve(candidate: string): string {
const sanitizedCandidate: string = candidate.trim() || ".";
const absolutePath: string = path.resolve(this.root, sanitizedCandidate);
AgentLogger.debug("Resolving workspace path", {
candidate,
sanitizedCandidate,
absolutePath,
});
if (!this.isInsideWorkspace(absolutePath)) {
AgentLogger.error("Path outside workspace", {
candidate,
absolutePath,
workspaceRoot: this.root,
});
throw new BadDataException(
`Path ${candidate} is outside the workspace root ${this.root}`,
);
}
return absolutePath;
}
public relative(target: string): string {
const absolute: string = path.resolve(target);
const relativePath: string = path.relative(this.root, absolute) || ".";
AgentLogger.debug("Computed relative path", {
target,
absolute,
relativePath,
});
return relativePath;
}
public getRoot(): string {
return this.root;
}
public async ensureParentDirectory(targetFile: string): Promise<void> {
const parentDir: string = path.dirname(targetFile);
if (!(await LocalFile.doesDirectoryExist(parentDir))) {
AgentLogger.debug("Creating parent directory", {
parentDir,
});
await fs.mkdir(parentDir, { recursive: true });
}
}
private isInsideWorkspace(target: string): boolean {
const normalizedTarget: string = path.resolve(target);
return (
normalizedTarget === this.root ||
normalizedTarget.startsWith(this.root + path.sep)
);
}
}
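The containment check is the security-relevant part: a resolved path counts as inside the workspace only if it equals the root exactly or starts with the root plus a path separator. Appending the separator before the prefix comparison is what stops a sibling directory like `/workspace-evil` from passing a naive `startsWith("/workspace")` test. A standalone sketch (the free function is illustrative; the class above does the same check internally):

```typescript
import path from "node:path";

// Sketch of the WorkspacePaths containment check.
function isInsideWorkspace(root: string, candidate: string): boolean {
  const resolvedRoot: string = path.resolve(root);
  // path.resolve collapses ".." segments, so traversal attempts end up
  // as absolute paths that simply fail the prefix test below.
  const target: string = path.resolve(resolvedRoot, candidate.trim() || ".");
  return (
    target === resolvedRoot || target.startsWith(resolvedRoot + path.sep)
  );
}

console.log(isInsideWorkspace("/workspace", "docs/readme.md")); // true
console.log(isInsideWorkspace("/workspace", "../etc/passwd")); // false
console.log(isInsideWorkspace("/workspace", "a/../../b")); // false
```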


@@ -22,8 +22,8 @@
"target": "es2017" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */,
// "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
"jsx": "react" /* Specify what JSX code is generated. */,
"experimentalDecorators": true /* Enable experimental support for TC39 stage 2 draft decorators. */,
"emitDecoratorMetadata": true /* Emit design-type metadata for decorated declarations in source files. */,
"experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
"emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
// "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h' */
// "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
// "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using `jsx: react-jsx*`.` */
@@ -33,18 +33,17 @@
/* Modules */
// "module": "es2022" /* Specify what module code is generated. */,
"rootDir": "" /* Specify the root folder within your source files. */,
"moduleResolution": "node" /* Specify how TypeScript looks up a file from a given module specifier. */,
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
// "rootDir": "./", /* Specify the root folder within your source files. */
"moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
"baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
"paths": {
"Common/*": ["../Common/*"]
}, /* Specify a set of entries that re-map imports to additional lookup locations. */
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
"typeRoots": [
"./node_modules/@types"
] /* Specify multiple folders that act like `./node_modules/@types`. */,
"types": [
"node",
"jest"
] /* Specify type package names to be included without being referenced in a source file. */,
], /* Specify multiple folders that act like `./node_modules/@types`. */
"types": ["node"], /* Specify type package names to be included without being referenced in a source file. */
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
// "resolveJsonModule": true, /* Enable importing .json files */
// "noResolve": true, /* Disallow `import`s, `require`s or `<reference>`s from expanding the number of files TypeScript should add to a project. */
@@ -58,9 +57,9 @@
// "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
// "declarationMap": true, /* Create sourcemaps for d.ts files. */
// "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */
"sourceMap": true /* Create source map files for emitted JavaScript files. */,
"sourceMap": true, /* Create source map files for emitted JavaScript files. */
// "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If `declaration` is true, also designates a file that bundles all .d.ts output. */
"outDir": "build/dist" /* Specify an output folder for all emitted files. */,
"outDir": "./build/dist", /* Specify an output folder for all emitted files. */
// "removeComments": true, /* Disable emitting comments. */
// "noEmit": true, /* Disable emitting files from a compilation. */
// "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
@@ -88,22 +87,22 @@
/* Type Checking */
"strict": true /* Enable all strict type-checking options. */,
"noImplicitAny": true /* Enable error reporting for expressions and declarations with an implied `any` type.. */,
"strictNullChecks": true /* When type checking, take into account `null` and `undefined`. */,
"strictFunctionTypes": true /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */,
"strictBindCallApply": true /* Check that the arguments for `bind`, `call`, and `apply` methods match the original function. */,
"strictPropertyInitialization": true /* Check for class properties that are declared but not set in the constructor. */,
"noImplicitThis": true /* Enable error reporting when `this` is given the type `any`. */,
"useUnknownInCatchVariables": true /* Type catch clause variables as 'unknown' instead of 'any'. */,
"alwaysStrict": true /* Ensure 'use strict' is always emitted. */,
"noUnusedLocals": true /* Enable error reporting when a local variables aren't read. */,
"noUnusedParameters": true /* Raise an error when a function parameter isn't read */,
"exactOptionalPropertyTypes": true /* Interpret optional property types as written, rather than adding 'undefined'. */,
"noImplicitReturns": true /* Enable error reporting for codepaths that do not explicitly return in a function. */,
"noFallthroughCasesInSwitch": true /* Enable error reporting for fallthrough cases in switch statements. */,
"noUncheckedIndexedAccess": true /* Include 'undefined' in index signature results */,
"noImplicitOverride": true /* Ensure overriding members in derived classes are marked with an override modifier. */,
"noPropertyAccessFromIndexSignature": true /* Enforces using indexed accessors for keys declared using an indexed type */,
"noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied `any` type.. */
"strictNullChecks": true, /* When type checking, take into account `null` and `undefined`. */
"strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
"strictBindCallApply": true, /* Check that the arguments for `bind`, `call`, and `apply` methods match the original function. */
"strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */
"noImplicitThis": true, /* Enable error reporting when `this` is given the type `any`. */
"useUnknownInCatchVariables": true, /* Type catch clause variables as 'unknown' instead of 'any'. */
"alwaysStrict": true, /* Ensure 'use strict' is always emitted. */
"noUnusedLocals": true, /* Enable error reporting when a local variables aren't read. */
"noUnusedParameters": true, /* Raise an error when a function parameter isn't read */
"exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */
"noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */
"noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */
"noUncheckedIndexedAccess": true, /* Include 'undefined' in index signature results */
"noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */
"noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type */
// "allowUnusedLabels": true, /* Disable error reporting for unused labels. */
// "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */


@@ -2273,6 +2273,26 @@
"node": ">=6"
}
},
"../Common/node_modules/@hcaptcha/loader": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/@hcaptcha/loader/-/loader-2.3.0.tgz",
"integrity": "sha512-i4lnNxKBe+COf3R1nFZEWaZoHIoJjvDgWqvcNrdZq8ehoSNMN6KVZ56dcQ02qKie2h3+BkbkwlJA9DOIuLlK/g==",
"license": "MIT"
},
"../Common/node_modules/@hcaptcha/react-hcaptcha": {
"version": "1.16.0",
"resolved": "https://registry.npmjs.org/@hcaptcha/react-hcaptcha/-/react-hcaptcha-1.16.0.tgz",
"integrity": "sha512-ALRsAwQDCtGeR9QXrLjf7IQsWlaIg1dQWpPYdBR03KHkUrLgRjmqLKkOXaq8L+0qcicXRasHRuvXp2Vinun9aw==",
"license": "MIT",
"dependencies": {
"@babel/runtime": "^7.17.9",
"@hcaptcha/loader": "^2.3.0"
},
"peerDependencies": {
"react": ">= 16.3.0",
"react-dom": ">= 16.3.0"
}
},
"../Common/node_modules/@hexagon/base64": {
"version": "1.1.28",
"resolved": "https://registry.npmjs.org/@hexagon/base64/-/base64-1.1.28.tgz",
@@ -9287,39 +9307,39 @@
}
},
"../Common/node_modules/express": {
"version": "4.21.2",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz",
"integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==",
"version": "4.22.1",
"resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz",
"integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==",
"license": "MIT",
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
"body-parser": "1.20.3",
"content-disposition": "0.5.4",
"body-parser": "~1.20.3",
"content-disposition": "~0.5.4",
"content-type": "~1.0.4",
"cookie": "0.7.1",
"cookie-signature": "1.0.6",
"cookie": "~0.7.1",
"cookie-signature": "~1.0.6",
"debug": "2.6.9",
"depd": "2.0.0",
"encodeurl": "~2.0.0",
"escape-html": "~1.0.3",
"etag": "~1.8.1",
"finalhandler": "1.3.1",
"fresh": "0.5.2",
"http-errors": "2.0.0",
"finalhandler": "~1.3.1",
"fresh": "~0.5.2",
"http-errors": "~2.0.0",
"merge-descriptors": "1.0.3",
"methods": "~1.1.2",
"on-finished": "2.4.1",
"on-finished": "~2.4.1",
"parseurl": "~1.3.3",
"path-to-regexp": "0.1.12",
"path-to-regexp": "~0.1.12",
"proxy-addr": "~2.0.7",
"qs": "6.13.0",
"qs": "~6.14.0",
"range-parser": "~1.2.1",
"safe-buffer": "5.2.1",
"send": "0.19.0",
"serve-static": "1.16.2",
"send": "~0.19.0",
"serve-static": "~1.16.2",
"setprototypeof": "1.2.0",
"statuses": "2.0.1",
"statuses": "~2.0.1",
"type-is": "~1.6.18",
"utils-merge": "1.0.1",
"vary": "~1.1.2"
@@ -9332,15 +9352,6 @@
"url": "https://opencollective.com/express"
}
},
"../Common/node_modules/express/node_modules/cookie": {
"version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
},
"../Common/node_modules/express/node_modules/debug": {
"version": "2.6.9",
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
@@ -9356,6 +9367,21 @@
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
"license": "MIT"
},
"../Common/node_modules/express/node_modules/qs": {
"version": "6.14.0",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.0.tgz",
"integrity": "sha512-YWWTjgABSKcvs/nWBi9PycY/JiPJqOD4JA6o9Sej2AtvSGarXxKC3OQSk4pAarbdQlKAh5D4FCQkJNkW+GAn3w==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.1.0"
},
"engines": {
"node": ">=0.6"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"../Common/node_modules/extend": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
@@ -14023,9 +14049,9 @@
"license": "MIT"
},
"../Common/node_modules/nodemailer": {
"version": "7.0.10",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.10.tgz",
"integrity": "sha512-Us/Se1WtT0ylXgNFfyFSx4LElllVLJXQjWi2Xz17xWw7amDKO2MLtFnVp1WACy7GkVGs+oBlRopVNUzlrGSw1w==",
"version": "7.0.11",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.11.tgz",
"integrity": "sha512-gnXhNRE0FNhD7wPSCGhdNh46Hs6nm+uTyg+Kq0cZukNQiYdnCsoQjodNP9BQVG9XrcK/v6/MgpAPBUFyzh9pvw==",
"license": "MIT-0",
"engines": {
"node": ">=6.0.0"

(20 binary image files changed, 11–42 KiB each; previews not shown.)

@@ -113,7 +113,7 @@
"display": "standalone",
"orientation": "portrait-primary",
"theme_color": "#000000",
"background_color": "#ffffff",
"background_color": "#000000",
"categories": ["productivity", "business", "utilities"],
"lang": "en",
"related_applications": [],

Some files were not shown because too many files have changed in this diff.