40 Commits

Author SHA1 Message Date
0880298cf5 refactor(opendock): refactor how releases are posted; this was possibly a bug fix, maybe just a better refactor
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-08 15:57:20 -05:00
34b0abac36 feat(purchase history): purchase history changed to long running, no notification 2026-04-08 15:55:25 -05:00
28c226ddbc build(agent): added westbend into the flow 2026-04-07 22:33:38 -05:00
42861cc69e feat(purchase): historical data capture for alpla purchase 2026-04-07 22:33:11 -05:00
5f3d683a13 refactor(notification): reprint - removed a console log as it shouldn't be there 2026-04-06 16:41:39 -05:00
a17787e852 feat(notification): reprint added
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m6s
2026-04-06 16:01:06 -05:00
5865ac3b99 feat(notification): base notification sub and admin completed
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m59s
users can now subscribe to a notification and remove themselves, plus an admin can remove them; updates to add
new emails work as well
2026-04-06 12:59:30 -05:00
637de857f9 feat(user notifications): added the ability for users to subscribe to notifications and add multiple emails 2026-04-06 09:29:46 -05:00
3ecf5fb916 refactor(userprofile): changes to have the table be blank and say nothing subscribed
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m32s
later we will leave this off the profile and add it once at least one notification is subscribed
2026-04-05 20:50:27 -05:00
92ba3ef512 docs(readme): updated progress data
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m18s
2026-04-05 20:44:49 -05:00
7d6c2db89c style(notification): style changes to the notification card and started the table
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m49s
2026-04-03 17:16:58 -05:00
74262beb65 refactor(notification): select menu looks proper now 2026-04-03 17:16:31 -05:00
f3b8dd94e5 refactor(queries): changed dev version to be 1500ms vs 5000ms 2026-04-03 17:16:02 -05:00
0059b9b850 build(changelog): reset the changelog after all the throwaway testing 2026-04-03 17:15:22 -05:00
1ad789b2b9 chore(release): 0.1.0-alpha.12
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m45s
Release and Build Image / release (push) Successful in 10s
2026-04-03 16:54:44 -05:00
079478f932 fix(typo): more damn typos 2026-04-03 16:54:29 -05:00
d6d5b451cd chore(release): 0.1.0-alpha.11
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m45s
Release and Build Image / release (push) Successful in 10s
2026-04-03 16:49:20 -05:00
76747cf917 fix(release): typo that caused errors 2026-04-03 16:49:12 -05:00
6e85991062 refactor(release): changes to only have the changelog in the release 2026-04-03 16:43:17 -05:00
98e408cb85 chore(release): 0.1.0-alpha.10
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m48s
Release and Build Image / release (push) Successful in 1m22s
2026-04-03 15:30:02 -05:00
ed052dff3c refactor(changelog): reverted back to commit-changelog; prefer it over changesets as a solo dev 2026-04-03 15:29:49 -05:00
8f59bba614 chore(release): 0.1.0-alpha.9
All checks were successful
Release and Build Image / release (push) Successful in 1m52s
2026-04-03 15:22:26 -05:00
fb2c5609aa chore(release): version packages
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m46s
Release and Build Image / release (push) Successful in 1m20s
2026-04-03 13:06:52 -05:00
17aed6cb89 fix(lala): something here 2026-04-03 13:06:14 -05:00
b02b93b83f chore(release): version packages
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m50s
Release and Build Image / release (push) Successful in 1m26s
2026-04-03 12:51:52 -05:00
9ceba8b5bb fix(i suck): more learning experience 2026-04-03 12:51:11 -05:00
2c0dbf95c7 chore(release): version packages
Some checks failed
Build and Push LST Docker Image / docker (push) Successful in 1m50s
Release and Build Image / release (push) Failing after 1m22s
2026-04-03 12:44:43 -05:00
860207a60b fix(build): typo 2026-04-03 12:44:16 -05:00
5c6460012a chore(release): version packages
Some checks failed
Build and Push LST Docker Image / docker (push) Successful in 1m54s
Release and Build Image / release (push) Failing after 1m43s
2026-04-03 12:37:54 -05:00
be1d4081e0 docs(sop): added more info 2026-04-03 12:37:13 -05:00
83a94cacf3 fix(build): typo in how we pushed the header over
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m20s
2026-04-03 12:33:20 -05:00
0ce3790675 chore(release): version packages
Some checks failed
Build and Push LST Docker Image / docker (push) Successful in 1m51s
Release and Build Image / release (push) Failing after 1m23s
2026-04-03 12:23:13 -05:00
5854889eb5 refactor(build): added more info to the release section 2026-04-03 12:22:26 -05:00
4caaf74569 chore(release): version packages
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m49s
Release and Build Image / release (push) Successful in 1m22s
2026-04-03 12:09:59 -05:00
fe889ca757 fix(build): issue with how I wrote the release token
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-03 12:08:57 -05:00
699c124b0e chore(release): version packages
Some checks failed
Build and Push LST Docker Image / docker (push) Successful in 1m42s
Release and Build Image / release (push) Failing after 6s
2026-04-03 11:56:40 -05:00
7d55c5f431 refactor(build): changes to the way we do release so it builds as well
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m21s
2026-04-03 11:54:41 -05:00
c4fd74fc93 chore(release): version packages
Some checks failed
Build and Push LST Docker Image / docker (push) Successful in 1m44s
Create Gitea Release / release (push) Failing after 17s
2026-04-03 11:42:52 -05:00
3775760734 fix(release): forgot to save
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m18s
2026-04-03 11:41:27 -05:00
643d12ff18 refactor(build): changes to auto-release when we change the version
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-03 11:40:09 -05:00
62 changed files with 10983 additions and 1060 deletions

View File

@@ -1,8 +0,0 @@
# Changesets
Hello and welcome! This folder has been automatically generated by `@changesets/cli`, a build tool that works
with multi-package repos, or single-package repos to help you version and publish your code. You can
find the full documentation for it [in our repository](https://github.com/changesets/changesets)
We have a quick list of common questions to get you started engaging with this project in
[our documentation](https://github.com/changesets/changesets/blob/main/docs/common-questions.md)

View File

@@ -1,11 +0,0 @@
{
"$schema": "https://unpkg.com/@changesets/config/schema.json",
"changelog": "@changesets/cli/changelog",
"commit": false,
"fixed": [],
"linked": [],
"access": "restricted",
"baseBranch": "main",
"updateInternalDependencies": "patch",
"ignore": []
}

View File

@@ -1,5 +0,0 @@
---
"lst_v3": patch
---
build stuff

View File

@@ -1,11 +0,0 @@
{
"mode": "pre",
"tag": "alpha",
"initialVersions": {
"lst_v3": "1.0.1"
},
"changesets": [
"neat-years-unite",
"soft-onions-appear"
]
}

View File

@@ -1,5 +0,0 @@
---
"lst_v3": patch
---
external url added for docker

View File

@@ -0,0 +1,157 @@
name: Release and Build Image
on:
push:
tags:
- "v*"
jobs:
release:
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v4
- name: Prepare release metadata
shell: bash
run: |
TAG="${GITHUB_REF_NAME:-${GITHUB_REF##refs/tags/}}"
VERSION="${TAG#v}"
IMAGE_REGISTRY="${{ gitea.server_url }}"
IMAGE_REGISTRY="${IMAGE_REGISTRY#http://}"
IMAGE_REGISTRY="${IMAGE_REGISTRY#https://}"
IMAGE_NAME="${IMAGE_REGISTRY}/${{ gitea.repository }}"
echo "TAG=$TAG" >> "$GITHUB_ENV"
echo "VERSION=$VERSION" >> "$GITHUB_ENV"
echo "IMAGE_NAME=$IMAGE_NAME" >> "$GITHUB_ENV"
if [[ "$TAG" == *-* ]]; then
echo "PRERELEASE=true" >> "$GITHUB_ENV"
else
echo "PRERELEASE=false" >> "$GITHUB_ENV"
fi
- name: Log in to Gitea container registry
shell: bash
env:
REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
REGISTRY_TOKEN: ${{ secrets.RELEASE_TOKEN }}
run: |
echo "$REGISTRY_TOKEN" | docker login "${IMAGE_NAME%%/*}" -u "$REGISTRY_USERNAME" --password-stdin
- name: Build Docker image
shell: bash
run: |
docker build \
-t "$IMAGE_NAME:$TAG" \
-t "$IMAGE_NAME:latest" \
.
- name: Push version tag
shell: bash
run: |
docker push "$IMAGE_NAME:$TAG"
- name: Push latest tag
if: ${{ !contains(env.TAG, '-') }}
shell: bash
run: |
docker push "$IMAGE_NAME:latest"
- name: Push prerelease channel tag
if: ${{ contains(env.TAG, '-') }}
shell: bash
run: |
CHANNEL="${TAG#*-}"
CHANNEL="${CHANNEL%%.*}"
docker tag "$IMAGE_NAME:$TAG" "$IMAGE_NAME:$CHANNEL"
docker push "$IMAGE_NAME:$CHANNEL"
- name: Extract matching CHANGELOG section
shell: bash
run: |
python3 - <<'PY'
import os
import re
from pathlib import Path
version = os.environ["VERSION"]
changelog_path = Path("CHANGELOG.md")
if not changelog_path.exists():
Path("release_body.md").write_text(f"Release {version}\n", encoding="utf-8")
raise SystemExit(0)
text = changelog_path.read_text(encoding="utf-8")
pattern = re.compile(
rf"^##\s+\[?{re.escape(version)}\]?[^\n]*\n(.*?)(?=^##\s+\[?[0-9]|\Z)",
re.MULTILINE | re.DOTALL,
)
match = pattern.search(text)
if match:
body = match.group(1).strip()
else:
body = f"Release {version}"
if not body:
body = f"Release {version}"
Path("release_body.md").write_text(body + "\n", encoding="utf-8")
print(body)
PY
- name: Create Gitea release
env:
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
GITEA_SERVER_URL: ${{ gitea.server_url }}
GITEA_REPOSITORY: ${{ gitea.repository }}
shell: bash
run: |
python3 - <<'PY'
import json
import os
import urllib.request
import urllib.error
from pathlib import Path
tag = os.environ["TAG"]
prerelease = os.environ["PRERELEASE"].lower() == "true"
server_url = os.environ["GITEA_SERVER_URL"].rstrip("/")
repo = os.environ["GITEA_REPOSITORY"]
token = os.environ["RELEASE_TOKEN"]
body = Path("release_body.md").read_text(encoding="utf-8").strip()
url = f"{server_url}/api/v1/repos/{repo}/releases"
payload = {
"tag_name": tag,
"name": tag,
"body": body,
"draft": False,
"prerelease": prerelease,
}
data = json.dumps(payload).encode("utf-8")
req = urllib.request.Request(
url,
data=data,
method="POST",
headers={
"Authorization": f"token {token}",
"Content-Type": "application/json",
"Accept": "application/json",
},
)
try:
with urllib.request.urlopen(req) as resp:
print(resp.read().decode("utf-8"))
except urllib.error.HTTPError as e:
details = e.read().decode("utf-8", errors="replace")
print(details)
raise
PY

.gitignore vendored
View File

@@ -4,6 +4,7 @@ builds
.includes
.buildNumber
temp
brunoApi
.scriptCreds
node-v24.14.0-x64.msi
postgresql-17.9-2-windows-x64.exe

View File

@@ -11,7 +11,7 @@
{ "type": "ci", "hidden": false, "section": "📈 Project changes" },
{ "type": "build", "hidden": false, "section": "📈 Project Builds" }
],
"commitUrlFormat": "https://git.tuffraid.net/cowch/lst/commits/{{hash}}",
"compareUrlFormat": "https://git.tuffraid.net/cowch/lst/compare/{{previousTag}}...{{currentTag}}",
"commitUrlFormat": "https://git.tuffraid.net/cowch/lst_v3/commits/{{hash}}",
"compareUrlFormat": "https://git.tuffraid.net/cowch/lst_v3/compare/{{previousTag}}...{{currentTag}}",
"header": "# All Changes to LST can be found below.\n"
}

View File

@@ -54,8 +54,10 @@
"alpla",
"alplamart",
"alplaprod",
"alplapurchase",
"bookin",
"Datamart",
"dotenvx",
"dyco",
"intiallally",
"manadatory",

View File

@@ -1,14 +0,0 @@
# lst_v3
## 1.0.2-alpha.0
### Patch Changes
- build stuff
- external url added for docker
## 1.0.1
### Patch Changes
- cf18e94: core stuff

View File

@@ -7,7 +7,7 @@
Quick summary of current rewrite/migration goal.
- **Phase:** Backend rewrite
- **Last updated:** 2024-05-01
- **Last updated:** 2026-04-06
---
@@ -16,9 +16,9 @@ Quick summary of current rewrite/migration goal.
| Feature | Description | Status |
|----------|--------------|--------|
| User Authentication | ~~Login~~, ~~Signup~~, API Key | 🟨 In Progress |
| User Profile | Edit profile, upload avatar | ⏳ Not Started |
| User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
| User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
| Notifications | Subscribe, Create, Update, Remove, Manual Trigger | ⏳ Not Started |
| Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
| Frontend | Analytics and charts | ⏳ Not Started |
| Docs | Instructions and troubleshooting | ⏳ Not Started |
@@ -44,7 +44,7 @@ _Status legend:_
How to run the current version of the app.
```bash
git clone https://github.com/youruser/yourrepo.git
cd yourrepo
git clone https://git.tuffraid.net/cowch/lst_v3.git
cd lst_v3
npm install
npm run dev

View File

@@ -26,7 +26,7 @@ const createApp = async () => {
const __dirname = dirname(__filename);
// we'll leave this active so we can monitor it to validate
app.use(morgan("tiny"));
app.use(morgan("dev"));
app.set("trust proxy", true);
app.use(lstCors());
app.all(`${baseUrl}/api/auth/*splat`, toNodeHandler(auth));
@@ -34,11 +34,11 @@ const createApp = async () => {
setupRoutes(baseUrl, app);
app.use(
baseUrl + "/app",
`${baseUrl}/app`,
express.static(join(__dirname, "../frontend/dist")),
);
app.get(baseUrl + "/app/*splat", (_, res) => {
app.get(`${baseUrl}/app/*splat`, (_, res) => {
res.sendFile(join(__dirname, "../frontend/dist/index.html"));
});

View File

@@ -0,0 +1,38 @@
import {
integer,
jsonb,
pgTable,
text,
timestamp,
uuid,
} from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type { z } from "zod";
export const alplaPurchaseHistory = pgTable("alpla_purchase_history", {
id: uuid("id").defaultRandom().primaryKey(),
apo: integer("apo"),
revision: integer("revision"),
confirmed: integer("confirmed"),
status: integer("status"),
statusText: text("status_text"),
journalNum: integer("journal_num"),
add_date: timestamp("add_date").defaultNow(),
add_user: text("add_user"),
upd_user: text("upd_user"),
upd_date: timestamp("upd_date").defaultNow(),
remark: text("remark"),
approvedStatus: text("approved_status").default("pending"),
position: jsonb("position").default([]),
createdAt: timestamp("created_at").defaultNow(),
});
export const alplaPurchaseHistorySchema =
createSelectSchema(alplaPurchaseHistory);
export const newAlplaPurchaseHistorySchema =
createInsertSchema(alplaPurchaseHistory);
export type AlplaPurchaseHistory = z.infer<typeof alplaPurchaseHistorySchema>;
export type NewAlplaPurchaseHistory = z.infer<
typeof newAlplaPurchaseHistorySchema
>;

View File

@@ -1,4 +1,5 @@
import {
index,
integer,
jsonb,
pgTable,
@@ -9,14 +10,23 @@ import {
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type { z } from "zod";
export const opendockApt = pgTable("opendock_apt", {
id: uuid("id").defaultRandom().primaryKey(),
release: integer("release").unique(),
openDockAptId: text("open_dock_apt_id").notNull(),
appointment: jsonb("appointment").default([]),
upd_date: timestamp("upd_date").defaultNow(),
createdAt: timestamp("created_at").defaultNow(),
});
export const opendockApt = pgTable(
"opendock_apt",
{
id: uuid("id").defaultRandom().primaryKey(),
release: integer("release").notNull().unique(),
openDockAptId: text("open_dock_apt_id").notNull(),
appointment: jsonb("appointment").notNull().default([]),
upd_date: timestamp("upd_date").notNull().defaultNow(),
createdAt: timestamp("created_at").notNull().defaultNow(),
},
(table) => ({
releaseIdx: index("opendock_apt_release_idx").on(table.release),
openDockAptIdIdx: index("opendock_apt_opendock_id_idx").on(
table.openDockAptId,
),
}),
);
export const opendockAptSchema = createSelectSchema(opendockApt);
export const newOpendockAptSchema = createInsertSchema(opendockApt);

View File

@@ -0,0 +1,113 @@
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { notifications } from "../db/schema/notifications.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { sendEmail } from "../utils/sendEmail.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
/**
 * Long-running reprint notification check: loads the notification row,
 * runs its audit query, records the latest audit id, and emails subscribers.
 */
const func = async (data: any, emails: string) => {
// get the actual notification as items will be updated between intervals if no one touches
const { data: l, error: le } = (await tryCatch(
db.select().from(notifications).where(eq(notifications.id, data.id)),
)) as any;
if (le) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `${data.name} encountered an error while trying to get initial info`,
data: [le],
notify: true,
});
}
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
// create the ignore audit logs ids
const ignoreIds = l[0].options[0]?.auditId
? `${l[0].options[0]?.auditId}`
: "0";
// run the check
const { data: queryRun, error } = await tryCatch(
prodQuery(
sqlQuery.query
.replace("[intervalCheck]", l[0].interval)
.replace("[ignoreList]", ignoreIds),
`Running notification query: ${l[0].name}`,
),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [error],
notify: true,
});
}
if (queryRun.data.length > 0) {
// update the latest audit id
const { error: dbe } = await tryCatch(
db
.update(notifications)
.set({ options: [{ auditId: `${queryRun.data[0].id}` }] })
.where(eq(notifications.id, data.id)),
);
if (dbe) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [dbe],
notify: true,
});
}
// send the email
const sentEmail = await sendEmail({
email: emails,
subject: "Alert! Label Reprinted",
template: "reprintLabels",
context: {
items: queryRun.data,
},
});
if (!sentEmail?.success) {
return returnFunc({
success: false,
level: "error",
module: "email",
subModule: "notification",
message: `${l[0].name} failed to send the email`,
data: [sentEmail],
notify: true,
});
}
} else {
console.log("doing nothing as there is nothing to do.");
}
// TODO send the error to systemAdmin users so they do not always need to be on the notifications.
// these errors are defined per notification.
};
export default func;
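The runner above fills the `[intervalCheck]` and `[ignoreList]` placeholders with plain string replacement before handing the query to `prodQuery`. A minimal sketch of that substitution; the SQL template here is illustrative, not the project's actual query:

```python
def fill_placeholders(template: str, interval: str, ignore_ids: str) -> str:
    """Substitute the two placeholders the notification runner uses before executing."""
    return (
        template
        .replace("[intervalCheck]", interval)
        .replace("[ignoreList]", ignore_ids)
    )

sql = (
    "SELECT * FROM audit "
    "WHERE ts > DATEADD(minute, -[intervalCheck], GETDATE()) "
    "AND id NOT IN ([ignoreList])"
)
print(fill_placeholders(sql, "10", "101,102"))
```

Plain replace is fine while the templates are internally defined and trusted; anything user-influenced should go through parameterized queries instead.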

View File

@@ -1,10 +1,106 @@
const reprint = (data: any, emails: string) => {
// TODO: do the actual logic for the notification.
console.log(data);
console.log(emails);
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { notifications } from "../db/schema/notifications.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { sendEmail } from "../utils/sendEmail.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
// TODO send the error to systemAdmin users so they do not always need to be on the notifications.
// these errors are defined per notification.
const reprint = async (data: any, emails: string) => {
// TODO: do the actual logic for the notification.
const { data: l, error: le } = (await tryCatch(
db.select().from(notifications).where(eq(notifications.id, data.id)),
)) as any;
if (le) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `${data.name} encountered an error while trying to get initial info`,
data: [le],
notify: true,
});
}
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
// create the ignore audit logs ids
const ignoreIds = l[0].options[0]?.auditId
? `${l[0].options[0]?.auditId}`
: "0";
// run the check
const { data: queryRun, error } = await tryCatch(
prodQuery(
sqlQuery.query
.replace("[intervalCheck]", l[0].interval)
.replace("[ignoreList]", ignoreIds),
`Running notification query: ${l[0].name}`,
),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [error],
notify: true,
});
}
if (queryRun.data.length > 0) {
// update the latest audit id
const { error: dbe } = await tryCatch(
db
.update(notifications)
.set({ options: [{ auditId: `${queryRun.data[0].id}` }] })
.where(eq(notifications.id, data.id)),
);
if (dbe) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [dbe],
notify: true,
});
}
// send the email
const sentEmail = await sendEmail({
email: emails,
subject: "Alert! Label Reprinted",
template: "reprintLabels",
context: {
items: queryRun.data,
},
});
if (!sentEmail?.success) {
return returnFunc({
success: false,
level: "error",
module: "email",
subModule: "notification",
message: `${l[0].name} failed to send the email`,
data: [sentEmail],
notify: true,
});
}
}
};
export default reprint;

View File

@@ -3,12 +3,12 @@ import { type Response, Router } from "express";
import z from "zod";
import { db } from "../db/db.controller.js";
import { notificationSub } from "../db/schema/notifications.sub.schema.js";
import { auth } from "../utils/auth.utils.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { modifiedNotification } from "./notification.controller.js";
const newSubscribe = z.object({
emails: z.email().array().describe("An array of emails"),
userId: z.string().describe("User id."),
notificationId: z.string().describe("Notification id"),
});
@@ -16,14 +16,29 @@ const newSubscribe = z.object({
const r = Router();
r.delete("/", async (req, res: Response) => {
const hasPermissions = await auth.api.userHasPermission({
body: {
//userId: req?.user?.id,
role: req.user?.roles as any,
permissions: {
notifications: ["readAll"], // This must match the structure in your access control
},
},
});
try {
const validated = newSubscribe.parse(req.body);
const { data, error } = await tryCatch(
db
.delete(notificationSub)
.where(
and(
eq(notificationSub.userId, validated.userId),
eq(
notificationSub.userId,
hasPermissions ? validated.userId : (req?.user?.id ?? ""),
), // allows the admin to delete this
//eq(notificationSub.userId, req?.user?.id ?? ""),
eq(notificationSub.notificationId, validated.notificationId),
),
)
@@ -44,6 +59,18 @@ r.delete("/", async (req, res: Response) => {
});
}
if (data.length <= 0) {
return apiReturn(res, {
success: false,
level: "info",
module: "notification",
subModule: "post",
message: `Subscription was not deleted invalid data sent over`,
data: data ?? [],
status: 200,
});
}
return apiReturn(res, {
success: true,
level: "info",

View File

@@ -21,12 +21,16 @@ r.get("/", async (req, res: Response) => {
},
});
if (userId) {
hasPermissions.success = false;
}
const { data, error } = await tryCatch(
db
.select()
.from(notificationSub)
.where(
userId || !hasPermissions.success
!hasPermissions.success
? eq(notificationSub.userId, `${req?.user?.id ?? ""}`)
: undefined,
),

View File

@@ -25,8 +25,25 @@ r.post("/", async (req, res: Response) => {
try {
const validated = newSubscribe.parse(req.body);
const emails = validated.emails
.map((e) => e.trim().toLowerCase())
.filter(Boolean);
const uniqueEmails = [...new Set(emails)];
const { data, error } = await tryCatch(
db.insert(notificationSub).values(validated).returning(),
db
.insert(notificationSub)
.values({
userId: req?.user?.id ?? "",
notificationId: validated.notificationId,
emails: uniqueEmails,
})
.onConflictDoUpdate({
target: [notificationSub.userId, notificationSub.notificationId],
set: { emails: uniqueEmails },
})
.returning(),
);
await modifiedNotification(validated.notificationId);
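The subscribe handler's trim/lowercase/dedupe step can be sketched standalone (the helper name is hypothetical; `[...new Set(emails)]` in the TypeScript keeps first-seen order, which `dict.fromkeys` matches):

```python
def normalize_emails(emails: list[str]) -> list[str]:
    """Trim, lowercase, drop empties, then dedupe while keeping first-seen order."""
    cleaned = [e.strip().lower() for e in emails]
    cleaned = [e for e in cleaned if e]       # filter(Boolean)
    return list(dict.fromkeys(cleaned))       # ordered dedupe

print(normalize_emails(["  Ops@Example.com", "ops@example.com", "", "dev@example.com "]))
# -> ['ops@example.com', 'dev@example.com']
```

Normalizing before the `onConflictDoUpdate` upsert means re-subscribing with the same addresses in different casing won't grow the stored list.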

View File

@@ -14,7 +14,27 @@ const note: NewNotification[] = [
"Monitors the labels that are printed and returns a there data, if one falls withing the time frame.",
active: false,
interval: "10",
options: [{ prodID: 1 }],
options: [{ auditId: [0] }],
},
{
name: "qualityBlocking",
description:
"Checks for new blocking orders that have been entered, recommend to get the most recent order in here before activating.",
active: false,
interval: "10",
options: [{ sentBlockingOrders: [{ timeStamp: "0", blockingOrder: 1 }] }],
},
{
name: "alplaPurchaseHistory",
description:
"Will check the alpla purchase data for any changes, if the req has not been sent already then we will send this, for a po or fresh order we will ignore. ",
active: false,
interval: "5",
options: [
{ sentReqs: [{ timeStamp: "0", req: 1, approved: false }] },
{ sentAPOs: [{ timeStamp: "0", apo: 1 }] },
{ sentRCT: [{ timeStamp: "0", rct: 1 }] },
],
},
];

View File

@@ -17,15 +17,6 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { getToken, odToken } from "./opendock.utils.js";
let lastCheck = formatInTimeZone(
new Date().toISOString(),
"America/New_York",
"yyyy-MM-dd HH:mm:ss",
);
//const queue: unknown[] = [];
//const isProcessing: boolean = false;
type Releases = {
ReleaseNumber: number;
DeliveryState: number;
@@ -37,10 +28,38 @@ type Releases = {
LineItemArticleWeight: number;
CustomerReleaseNumber: string;
};
const timeZone = process.env.TIMEZONE as string;
const TWENTY_FOUR_HOURS = 24 * 60 * 60 * 1000;
const log = createLogger({ module: "opendock", subModule: "releaseMonitor" });
// make the cron safer against overlapping runs
let opendockSyncRunning = false;
let lastCheck = formatInTimeZone(
new Date().toISOString(),
timeZone,
"yyyy-MM-dd HH:mm:ss",
);
// const lastCheck = formatInTimeZone(
// new Date().toISOString(),
// `America/New_York`, //TODO: Pull timezone from the .env last as process.env.TIME_ZONE is not working so need to figure itout
// "yyyy-MM-dd HH:mm:ss",
// );
//const queue: unknown[] = [];
//const isProcessing: boolean = false;
// const parseDbDate = (value: string | Date) => {
// if (value instanceof Date) return value;
// // normalize "2026-04-08 13:10:43.280" -> "2026-04-08T13:10:43.280"
// const normalized = value.replace(" ", "T");
// // interpret that wall-clock time as America/New_York
// return fromZonedTime(normalized, timeZone);
// };
const postRelease = async (release: Releases) => {
if (!odToken.odToken) {
log.info({}, "Getting Auth Token");
@@ -152,22 +171,25 @@ const postRelease = async (release: Releases) => {
};
// TODO: pull the current added releases from the db and if one matches then we want to get its id and run the update vs create
const { data: apt, error: aptError } = await tryCatch(
db.select().from(opendockApt),
const { data: existingApt, error: aptError } = await tryCatch(
db
.select()
.from(opendockApt)
.where(eq(opendockApt.release, release.ReleaseNumber))
.limit(1),
);
if (aptError) {
log.error({ error: aptError }, "Error getting apt data");
// TODO: send an error email on this one as it will cause issues
return;
}
const releaseCheck = apt.filter((r) => r.release === release.ReleaseNumber);
const existing = existingApt[0];
//console.log(releaseCheck);
if (releaseCheck.length > 0) {
const id = releaseCheck[0]?.openDockAptId;
if (existing) {
const id = existing.openDockAptId;
try {
const response = await axios.patch(
`${process.env.OPENDOCK_URL}/appointment/${id}`,
@@ -196,7 +218,11 @@ const postRelease = async (release: Releases) => {
})
.onConflictDoUpdate({
target: opendockApt.release,
set: { appointment: response.data.data, upd_date: sql`NOW()` },
set: {
openDockAptId: response.data.data.id,
appointment: response.data.data,
upd_date: sql`NOW()`,
},
})
.returning();
@@ -250,8 +276,12 @@ const postRelease = async (release: Releases) => {
appointment: response.data.data,
})
.onConflictDoUpdate({
target: opendockApt.id,
set: { appointment: response.data.data, upd_date: sql`NOW()` },
target: opendockApt.release,
set: {
openDockAptId: response.data.data.id,
appointment: response.data.data,
upd_date: sql`NOW()`,
},
})
.returning();
@@ -270,7 +300,7 @@ const postRelease = async (release: Releases) => {
}
}
await delay(500); // rate limit protection
await delay(750); // rate limit protection
};
export const monitorReleaseChanges = async () => {
@@ -298,184 +328,66 @@ export const monitorReleaseChanges = async () => {
}
if (openDockMonitor[0]?.active) {
createCronJob("opendock_sync", "*/15 * * * * *", async () => {
try {
const result = await prodQuery(
sqlQuery.query.replace("[dateCheck]", `'${lastCheck}'`),
"Get release info",
);
// const BUFFER_MS =
// Math.floor(parseInt(openDockMonitor[0]?.value, 10) || 30) * 1.5 * 1000; // this should be >= to the interval we set in the cron TODO: should pull the buffer from the setting and give it an extra 10% then round to nearest int.
if (result.data.length) {
for (const release of result.data) {
await postRelease(release);
lastCheck = formatInTimeZone(
new Date(release.Upd_Date).toISOString(),
"UTC",
"yyyy-MM-dd HH:mm:ss",
);
await delay(500);
}
createCronJob(
"opendock_sync",
`*/${parseInt(openDockMonitor[0]?.value, 10) || 30} * * * * *`,
async () => {
if (opendockSyncRunning) {
log.warn(
{},
"Skipping opendock_sync because previous run is still active",
);
return;
}
} catch (e) {
console.error(
{ error: e },
"Error occurred while running the monitor job",
);
log.error({ error: e }, "Error occurred while running the monitor job");
}
});
opendockSyncRunning = true;
try {
// set this to the latest time.
const result = await prodQuery(
sqlQuery.query.replace("[dateCheck]", `'${lastCheck}'`),
"Get release info",
);
log.debug(
{ lastCheck },
`${result.data.length} Changes to a release have been made`,
);
if (result.data.length) {
for (const release of result.data) {
await postRelease(release);
// add 2 seconds to account for a massive influx of orders; when we don't finish in one go it won't try to grab the same rows again
const nDate = new Date(release.Upd_Date);
nDate.setSeconds(nDate.getSeconds() + 2);
lastCheck = formatInTimeZone(
nDate.toISOString(),
"UTC",
"yyyy-MM-dd HH:mm:ss",
);
log.debug({ lastCheck }, "Changes to a release have been made");
await delay(500);
}
}
} catch (e) {
console.error(
{ error: e },
"Error occurred while running the monitor job",
);
log.error(
{ error: e },
"Error occurred while running the monitor job",
);
} finally {
opendockSyncRunning = false;
}
},
"monitorReleaseChanges",
);
}
// run the main game loop
// while (openDockSetting) {
// try {
// const result = await prodQuery(
// sqlQuery.query.replace("[dateCheck]", `'${lastCheck}'`),
// "Get release info",
// );
// if (result.data.length) {
// for (const release of result.data) {
// // potentially move this to a buffer table to easy up on memory
// await postRelease(release);
// // Move checkpoint AFTER successful post
// lastCheck = formatInTimeZone(
// new Date(release.Upd_Date).toISOString(),
// "UTC",
// "yyyy-MM-dd HH:mm:ss",
// );
// await delay(500);
// }
// }
// } catch (e) {
// console.error("Monitor error:", e);
// }
// await delay(15 * 1000); // making this 15 seconds as we would really only see issues if we have a mass burst.
// }
};
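The live loop above advances `lastCheck` two seconds past each posted row's `Upd_Date`, so a burst of orders that spans multiple query passes is never re-fetched. A minimal standalone sketch of that watermark step (the `ReleaseRow` type and helper name are illustrative, not from the codebase):

```typescript
// Standalone sketch of the checkpoint pattern: after posting a row, advance
// the watermark 2 seconds past its Upd_Date so the next query cannot return
// the same row again.
interface ReleaseRow {
  Upd_Date: string; // timestamp returned by the SQL query
}

const advanceCheckpoint = (row: ReleaseRow): string => {
  const next = new Date(row.Upd_Date);
  next.setSeconds(next.getSeconds() + 2);
  // "yyyy-MM-dd HH:mm:ss" in UTC, mirroring the formatInTimeZone call above
  return next.toISOString().slice(0, 19).replace("T", " ");
};
```

The trade-off is the one the comment hints at: any second row updated inside that 2-second window after the last processed row would be skipped, which the code accepts in exchange for never double-posting.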
// export const monitorReleaseChanges = async () => {
// console.log("Starting release monitor", lastCheck);
// setInterval(async () => {
// try {
// const result = await prodQuery(
// releaseQuery.replace("[dateCheck]", `'${lastCheck}'`),
// "get last release change",
// );
// //console.log(releaseQuery.replace("[dateCheck]", `'${lastCheck}'`));
// if (result.data.length > 0) {
// console.log(
// formatInTimeZone(
// result.data[result.data.length - 1].Upd_Date,
// "UTC",
// "yyyy-MM-dd HH:mm:ss",
// ),
// lastCheck,
// );
// lastCheck = formatInTimeZone(
// result.data[result.data.length - 1].Upd_Date,
// "UTC",
// "yyyy-MM-dd HH:mm:ss",
// );
// const releases = result.data;
// for (let i = 0; i < releases.length; i++) {
// const newDockApt = {
// status: "Scheduled",
// userId: "ee956455-e193-47fc-b53b-dff30fabdf4b", // this should be the carrierid
// loadTypeId: "0aa7988e-b17b-4f10-acdd-3d029b44a773", // well get this and make it a default one
// dockId: "00ba4386-ce5a-4dd1-9356-6e6d10a24609", // this the warehouse we want it in to start out
// refNumbers: [releases[i].ReleaseNumber],
// refNumber: releases[i].ReleaseNumber,
// start: releases[i].DeliveryDate,
// end: addHours(releases[i].DeliveryDate, 1),
// notes: "",
// ccEmails: [""],
// muteNotifications: true,
// metadata: {
// externalValidationFailed: false,
// externalValidationErrorMessage: null,
// },
// units: null,
// customFields: [
// {
// name: "strArticle",
// type: "str",
// label: "Article",
// value: `${releases[i].LineItemHumanReadableId} - ${releases[i].ArticleAlias}`,
// description: "What bottle are we sending ",
// placeholder: "",
// dropDownValues: [],
// minLengthOrValue: 1,
// hiddenFromCarrier: false,
// requiredForCarrier: false,
// requiredForWarehouse: false,
// },
// {
// name: "intPallet Count",
// type: "int",
// label: "Pallet Count",
// value: parseInt(releases[i].LoadingUnits, 10),
// description: "How many pallets",
// placeholder: "22",
// dropDownValues: [],
// minLengthOrValue: 1,
// hiddenFromCarrier: false,
// requiredForCarrier: false,
// requiredForWarehouse: false,
// },
// {
// name: "strTotal Weight",
// type: "str",
// label: "Total Weight",
// value: `${(((releases[i].Quantity * releases[i].LineItemArticleWeight) / 1000) * 2.20462).toFixed(2)}`,
// description: "What is the total weight of the load",
// placeholder: "",
// dropDownValues: [],
// minLengthOrValue: 1,
// hiddenFromCarrier: false,
// requiredForCarrier: false,
// requiredForWarehouse: false,
// },
// {
// name: "strCustomer ReleaseNumber",
// type: "str",
// label: "Customer Release Number",
// value: `${releases[i].CustomerReleaseNumber}`,
// description: "What is the customer release number",
// placeholder: "",
// dropDownValues: [],
// minLengthOrValue: 1,
// hiddenFromCarrier: false,
// requiredForCarrier: false,
// requiredForWarehouse: false,
// },
// ],
// };
// //console.log(newDockApt);
// const newDockResult = await axios.post(
// "https://neutron.staging.opendock.com/appointment",
// newDockApt,
// {
// headers: {
// "content-type": "application/json; charset=utf-8",
// },
// },
// );
// console.log(newDockResult.statusText);
// await delay(500);
// }
// }
// } catch (e) {
// console.log(e);
// }
// }, 5 * 1000);
// };

View File

@@ -28,7 +28,7 @@ export const getToken = async () => {
}
odToken = { odToken: data.access_token, tokenDate: new Date() };
log.info({}, "Token added");
log.info({ odToken }, "Token added");
} catch (e) {
log.error({ error: e }, "Error getting/refreshing token");
}

View File

@@ -36,12 +36,12 @@ export const opendockSocketMonitor = async () => {
// console.log(data);
// });
socket.on("create-Appointment", (data) => {
console.log("appt create:", data);
socket.on("create-Appointment", () => {
//console.log("appt create:", data);
});
socket.on("update-Appointment", (data) => {
console.log("appt update:", data);
socket.on("update-Appointment", () => {
//console.log("appt update:", data);
});
socket.on("error", (data) => {

View File

@@ -0,0 +1,63 @@
use AlplaPROD_test1
declare @intervalCheck as int = '[interval]'
/*
Monitors alpla purchases for new changes. A row will not appear here unless the order status is updated:
if a user just reopens the order it will update, but changes to the positions will not be captured until the user reorders or cancels the PO.
*/
select
IdBestellung as apo
,po.revision as revision
,po.Bestaetigt as confirmed
,po.status
,case po.Status
when 1 then 'Created'
when 2 then 'Ordered'
when 22 then 'Reopened'
when 11 then 'Reopened'
when 4 then 'Planned'
when 5 then 'Partly Delivered'
when 6 then 'Delivered'
when 7 then 'Canceled'
when 8 then 'Closed'
else 'Unknown' end as statusText
,po.IdJournal as journalNum -- use this to validate if we used it already.
,po.Add_User as add_user
,po.Add_Date as add_date
,po.Upd_User as upd_user
,po.Upd_Date as upd_Date
,po.Bemerkung as remark
,po.IdJournal as journal -- use this to validate if we used it already.
,isnull((
select
o.IdArtikelVarianten as av
,a.Bezeichnung as alias
,Lieferdatum as deliveryDate
,cast(BestellMenge as decimal(18,2)) as qty
,cast(BestellMengeVPK as decimal(18,0)) as pkg
,cast(PreisProEinheit as decimal(18,0)) as price
,PositionsStatus
,case PositionsStatus
when 1 then 'Created'
when 2 then 'Ordered'
when 22 then 'Reopened'
when 4 then 'Planned'
when 5 then 'Partly Delivered'
when 6 then 'Delivered'
when 7 then 'Canceled'
when 8 then 'Closed'
else 'Unknown' end as statusText
,o.upd_user
,o.upd_date
from T_Bestellpositionen (nolock) as o
left join
T_Artikelvarianten as a on
a.IdArtikelvarianten = o.IdArtikelVarianten
where o.IdBestellung = po.IdBestellung
for json path
), '[]') as position
--,*
from T_Bestellungen (nolock) as po
where po.Upd_Date > dateadd(MINUTE, -@intervalCheck, getdate())

View File

@@ -0,0 +1,28 @@
use [test1_AlplaPROD2.0_Read]
SELECT
--JSON_VALUE(content, '$.EntityId') as labelId
a.id
,ActorName
,FORMAT(PrintDate, 'yyyy-MM-dd HH:mm') as printDate
,FORMAT(CreatedDateTime, 'yyyy-MM-dd HH:mm') createdDateTime
,l.ArticleHumanReadableId as av
,l.ArticleDescription as alias
,PrintedCopies
,p.name as printerName
,RunningNumber
--,*
FROM [support].[AuditLog] (nolock) as a
left join
[labelling].[InternalLabel] (nolock) as l on
l.id = JSON_VALUE(content, '$.EntityId')
left join
[masterData].[printer] (nolock) as p on
p.id = l.PrinterId
where message like '%reprint%'
and CreatedDateTime > DATEADD(minute, -[intervalCheck], SYSDATETIMEOFFSET())
and a.id > [ignoreList]
order by CreatedDateTime desc

View File

@@ -0,0 +1,97 @@
/**
* This monitors alpla purchase orders for changes.
*/
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import {
alplaPurchaseHistory,
type NewAlplaPurchaseHistory,
} from "../db/schema/alplapurchase.schema.js";
import { settings } from "../db/schema/settings.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
const log = createLogger({ module: "purchase", subModule: "purchaseMonitor" });
export const monitorAlplaPurchase = async () => {
const purchaseMonitor = await db
.select()
.from(settings)
.where(eq(settings.name, "purchaseMonitor"));
const sqlQuery = sqlQuerySelector(`alplapurchase`) as SqlQuery;
if (!sqlQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "purchase",
subModule: "query",
message: `Error getting alpla purchase info`,
data: [sqlQuery.message],
notify: false,
});
}
if (purchaseMonitor[0]?.active) {
createCronJob("purchaseMonitor", "0 */5 * * * *", async () => {
try {
const result = await prodQuery(
sqlQuery.query.replace(
"[interval]",
`${purchaseMonitor[0]?.value || "5"}`,
),
"Get purchase info",
);
log.debug(
{},
`There are ${result.data.length} rows pending update from the last ${purchaseMonitor[0]?.value} minutes`,
);
if (result.data.length) {
const convertedData = result.data.map((i) => ({
...i,
position: JSON.parse(i.position),
})) as NewAlplaPurchaseHistory[];
const { data, error } = await tryCatch(
db.insert(alplaPurchaseHistory).values(convertedData).returning(),
);
if (data) {
log.debug(
{ data },
"New data was just added to alpla purchase history",
);
}
if (error) {
log.error(
{ error },
"There was an error adding alpla purchase history",
);
}
await delay(500);
}
} catch (e) {
console.error(
{ error: e },
"Error occurred while running the monitor job",
);
log.error({ error: e }, "Error occurred while running the monitor job");
return;
}
});
}
};
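Because the SQL above emits the nested `position` rows via `for json path`, each row arrives with `position` as a JSON string (or `'[]'` thanks to the `isnull` wrapper), and the controller parses it before the Drizzle insert. A simplified sketch of that reshape (the row type is illustrative):

```typescript
// Illustrative shape of a row returned by the alpla purchase query; the real
// rows carry many more columns.
interface RawPurchaseRow {
  apo: number;
  position: string; // JSON array emitted by FOR JSON PATH, '[]' when empty
}

// Parse the nested JSON column so each row is ready for a typed insert.
const reshapeRows = (rows: RawPurchaseRow[]) =>
  rows.map((r) => ({ ...r, position: JSON.parse(r.position) as unknown[] }));
```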

View File

@@ -10,6 +10,7 @@ import { createNotifications } from "./notification/notifications.master.js";
import { monitorReleaseChanges } from "./opendock/openDockRreleaseMonitor.utils.js";
import { opendockSocketMonitor } from "./opendock/opendockSocketMonitor.utils.js";
import { connectProdSql } from "./prodSql/prodSqlConnection.controller.js";
import { monitorAlplaPurchase } from "./purchase/purchase.controller.js";
import { setupSocketIORoutes } from "./socket.io/serverSetup.js";
import { baseSettingValidationCheck } from "./system/settingsBase.controller.js";
import { createCronJob } from "./utils/croner.utils.js";
@@ -36,7 +37,7 @@ const start = async () => {
// also we always want to have long lived processes inside a setting check.
setTimeout(() => {
if (systemSettings.filter((n) => n.name === "opendock_sync")[0]?.active) {
log.info({}, "Opendock is not active");
log.info({}, "Opendock is active");
monitorReleaseChanges(); // this is od monitoring the db for all new releases
opendockSocketMonitor();
createCronJob("opendockAptCleanup", "0 30 5 * * *", () =>
@@ -44,6 +45,10 @@ const start = async () => {
);
}
if (systemSettings.filter((n) => n.name === "purchaseMonitor")[0]?.active) {
monitorAlplaPurchase();
}
// these jobs below are system jobs and should run no matter what.
createCronJob("JobAuditLogCleanUp", "0 0 5 * * *", () =>
dbCleanup("jobs", 30),

View File

@@ -8,7 +8,7 @@ const newSettings: NewSetting[] = [
// feature settings
{
name: "opendock_sync",
value: "0",
value: "15",
active: false,
description: "Dock Scheduling system",
moduleName: "opendock",
@@ -66,6 +66,16 @@ const newSettings: NewSetting[] = [
roles: ["admin"],
seedVersion: 1,
},
{
name: "purchaseMonitor",
value: "5",
active: true,
description: "Monitors alpla purchase for all changes",
moduleName: "purchase",
settingType: "feature",
roles: ["admin"],
seedVersion: 1,
},
// standard settings
{

View File

@@ -10,6 +10,7 @@ import {
killOpendockSocket,
opendockSocketMonitor,
} from "../opendock/opendockSocketMonitor.utils.js";
import { monitorAlplaPurchase } from "../purchase/purchase.controller.js";
import {
createCronJob,
resumeCronJob,
@@ -31,8 +32,24 @@ export const featureControl = async (data: Setting) => {
createCronJob("opendockAptCleanup", "0 30 5 * * *", () =>
dbCleanup("opendockApt", 90),
);
} else {
}
if (data.name === "opendock_sync" && !data.active) {
killOpendockSocket();
stopCronJob("opendockAptCleanup");
}
// purchase stuff
if (data.name === "purchaseMonitor" && data.active) {
monitorAlplaPurchase();
}
if (data.name === "purchaseMonitor" && !data.active) {
stopCronJob("purchaseMonitor");
}
// this means the interval value has changed
if (data.name === "purchaseMonitor" && data.value) {
monitorAlplaPurchase();
}
};

View File

@@ -18,7 +18,9 @@ export interface JobInfo {
// Store running cronjobs
export const runningCrons: Record<string, Cron> = {};
const activeRuns = new Set<string>();
const log = createLogger({ module: "system", subModule: "croner" });
const cronStats: Record<string, { created: number; replaced: number }> = {};
// how to set the times
// * ┌──────────────── (optional) second (0 - 59)
@@ -38,17 +40,36 @@ const log = createLogger({ module: "system", subModule: "croner" });
* @param name Name of the job we want to run
* @param schedule Cron expression (example: `*\/5 * * * * *`)
* @param task Async function that will run
* @param source we can add where it came from to assist in getting this tracked down, more for debugging
*/
export const createCronJob = async (
name: string,
schedule: string, // cron string with a seconds field, IE: */5 * * * * * runs every 5th second
task: () => Promise<void>, // what function are we passing over
source = "unknown",
) => {
// get the timezone based on the os timezone set
const timeZone = Intl.DateTimeFormat().resolvedOptions().timeZone;
// first time seeing this job, so initialize its stats; this is mostly for debugging repeated replacements
if (!cronStats[name]) {
cronStats[name] = { created: 0, replaced: 0 };
}
// Destroy the existing job if it exists
if (runningCrons[name]) {
cronStats[name].replaced += 1;
log.warn(
{
job: name,
source,
oldSchedule: runningCrons[name].getPattern?.(),
newSchedule: schedule,
replaceCount: cronStats[name].replaced,
},
`Cron job "${name}" already existed and is being replaced`,
);
runningCrons[name].stop();
}
@@ -61,6 +82,13 @@ export const createCronJob = async (
name: name,
},
async () => {
if (activeRuns.has(name)) {
log.warn({ jobName: name }, "Skipping overlapping cron execution");
return;
}
activeRuns.add(name);
const startedAt = new Date();
const start = Date.now();
@@ -91,14 +119,19 @@ export const createCronJob = async (
.where(eq(jobAuditLog.id, executionId));
} catch (e: any) {
if (executionId) {
await db.update(jobAuditLog).set({
finishedAt: new Date(),
durationMs: Date.now() - start,
status: "error",
errorMessage: e.message,
errorStack: e.stack,
});
await db
.update(jobAuditLog)
.set({
finishedAt: new Date(),
durationMs: Date.now() - start,
status: "error",
errorMessage: e.message,
errorStack: e.stack,
})
.where(eq(jobAuditLog.id, executionId));
}
} finally {
activeRuns.delete(name);
}
},
);
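The `activeRuns` set added above amounts to a per-job-name re-entrancy guard: a tick that fires while the previous tick for the same name is still running is skipped rather than queued, and the name is always released in `finally`. Sketched on its own (names are illustrative):

```typescript
// Per-name re-entrancy guard, mirroring the activeRuns pattern above.
const activeRuns = new Set<string>();

// Returns false when a run with the same name is still in flight.
const runExclusive = async (
  name: string,
  task: () => Promise<void>,
): Promise<boolean> => {
  if (activeRuns.has(name)) return false; // skip overlapping execution
  activeRuns.add(name);
  try {
    await task();
  } finally {
    activeRuns.delete(name); // always release, even when the task throws
  }
  return true;
};
```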

View File

@@ -0,0 +1,47 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
{{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
{{> styles}}
</head>
<body>
<p>All,</p>
<p>The below labels have been reprinted.</p>
<table >
<thead>
<tr>
<th>AV</th>
<th>Description</th>
<th>Label Number</th>
<th>Date Added</th>
<th>Date Reprinted</th>
<th>Who printed/Updated</th>
<th>What printer it came from</th>
</tr>
</thead>
<tbody>
{{#each items}}
<tr>
<td>{{av}}</td>
<td>{{alias}}</td>
<td>{{RunningNumber}}</td>
<td>{{printDate}}</td>
<td>{{createdDateTime}}</td>
<td>{{ActorName}}</td>
<td>{{printerName}}</td>
</tr>
{{/each}}
</tbody>
</table>
<div>
<p>Thank you,</p>
<p>LST Team</p>
</div>
</body>
</html>

View File

@@ -10,7 +10,9 @@ interface Data<T = unknown[]> {
| "datamart"
| "utils"
| "opendock"
| "notification";
| "notification"
| "email"
| "purchase";
subModule:
| "db"
| "labeling"

View File

@@ -1,5 +1,5 @@
vars {
url: http://localhost:3000/lst
url: http://localhost:3600/lst
readerIp: 10.44.14.215
}
vars:secret [

View File

@@ -14,7 +14,7 @@ services:
environment:
- NODE_ENV=production
- LOG_LEVEL=info
- EXTERNAL_URL=192.168.8.222:3600
- EXTERNAL_URL=http://192.168.8.222:3600
- DATABASE_HOST=host.docker.internal # if running on the same docker then do this
- DATABASE_PORT=5433
- DATABASE_USER=${DATABASE_USER}

View File

@@ -1,5 +1,5 @@
import { Link } from "@tanstack/react-router";
import { Logs } from "lucide-react";
import { Bell, Logs, Settings } from "lucide-react";
import {
SidebarGroup,
@@ -24,10 +24,18 @@ import {
export default function AdminSidebar({ session }: any) {
const { setOpen } = useSidebar();
const items = [
{
title: "Notifications",
url: "/admin/notifications",
icon: Bell,
role: ["systemAdmin", "admin"],
module: "admin",
active: true,
},
{
title: "Settings",
url: "/admin/settings",
icon: Logs,
icon: Settings,
role: ["systemAdmin"],
module: "admin",
active: true,

View File

@@ -42,7 +42,7 @@ export const SelectField = ({
>
<SelectValue placeholder={placeholder} />
</SelectTrigger>
<SelectContent>
<SelectContent position={"popper"}>
{options.map((option) => (
<SelectItem key={option.value} value={option.value}>
{option.label}

View File

@@ -13,7 +13,7 @@ export function getSettings() {
const fetch = async () => {
if (window.location.hostname === "localhost") {
await new Promise((res) => setTimeout(res, 5000));
await new Promise((res) => setTimeout(res, 1500));
}
const { data } = await axios.get("/lst/api/settings");

View File

@@ -13,7 +13,7 @@ export function notificationSubs(userId?: string) {
const fetch = async (userId?: string) => {
if (window.location.hostname === "localhost") {
await new Promise((res) => setTimeout(res, 5000));
await new Promise((res) => setTimeout(res, 1500));
}
const { data } = await axios.get(

View File

@@ -13,7 +13,7 @@ export function notifications() {
const fetch = async () => {
if (window.location.hostname === "localhost") {
await new Promise((res) => setTimeout(res, 5000));
await new Promise((res) => setTimeout(res, 1500));
}
const { data } = await axios.get("/lst/api/notification");

View File

@@ -12,6 +12,7 @@ import { Route as rootRouteImport } from './routes/__root'
import { Route as AboutRouteImport } from './routes/about'
import { Route as IndexRouteImport } from './routes/index'
import { Route as AdminSettingsRouteImport } from './routes/admin/settings'
import { Route as AdminNotificationsRouteImport } from './routes/admin/notifications'
import { Route as AdminLogsRouteImport } from './routes/admin/logs'
import { Route as authLoginRouteImport } from './routes/(auth)/login'
import { Route as authUserSignupRouteImport } from './routes/(auth)/user.signup'
@@ -33,6 +34,11 @@ const AdminSettingsRoute = AdminSettingsRouteImport.update({
path: '/admin/settings',
getParentRoute: () => rootRouteImport,
} as any)
const AdminNotificationsRoute = AdminNotificationsRouteImport.update({
id: '/admin/notifications',
path: '/admin/notifications',
getParentRoute: () => rootRouteImport,
} as any)
const AdminLogsRoute = AdminLogsRouteImport.update({
id: '/admin/logs',
path: '/admin/logs',
@@ -64,6 +70,7 @@ export interface FileRoutesByFullPath {
'/about': typeof AboutRoute
'/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/user/profile': typeof authUserProfileRoute
'/user/resetpassword': typeof authUserResetpasswordRoute
@@ -74,6 +81,7 @@ export interface FileRoutesByTo {
'/about': typeof AboutRoute
'/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/user/profile': typeof authUserProfileRoute
'/user/resetpassword': typeof authUserResetpasswordRoute
@@ -85,6 +93,7 @@ export interface FileRoutesById {
'/about': typeof AboutRoute
'/(auth)/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/(auth)/user/profile': typeof authUserProfileRoute
'/(auth)/user/resetpassword': typeof authUserResetpasswordRoute
@@ -97,6 +106,7 @@ export interface FileRouteTypes {
| '/about'
| '/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/user/profile'
| '/user/resetpassword'
@@ -107,6 +117,7 @@ export interface FileRouteTypes {
| '/about'
| '/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/user/profile'
| '/user/resetpassword'
@@ -117,6 +128,7 @@ export interface FileRouteTypes {
| '/about'
| '/(auth)/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/(auth)/user/profile'
| '/(auth)/user/resetpassword'
@@ -128,6 +140,7 @@ export interface RootRouteChildren {
AboutRoute: typeof AboutRoute
authLoginRoute: typeof authLoginRoute
AdminLogsRoute: typeof AdminLogsRoute
AdminNotificationsRoute: typeof AdminNotificationsRoute
AdminSettingsRoute: typeof AdminSettingsRoute
authUserProfileRoute: typeof authUserProfileRoute
authUserResetpasswordRoute: typeof authUserResetpasswordRoute
@@ -157,6 +170,13 @@ declare module '@tanstack/react-router' {
preLoaderRoute: typeof AdminSettingsRouteImport
parentRoute: typeof rootRouteImport
}
'/admin/notifications': {
id: '/admin/notifications'
path: '/admin/notifications'
fullPath: '/admin/notifications'
preLoaderRoute: typeof AdminNotificationsRouteImport
parentRoute: typeof rootRouteImport
}
'/admin/logs': {
id: '/admin/logs'
path: '/admin/logs'
@@ -200,6 +220,7 @@ const rootRouteChildren: RootRouteChildren = {
AboutRoute: AboutRoute,
authLoginRoute: authLoginRoute,
AdminLogsRoute: AdminLogsRoute,
AdminNotificationsRoute: AdminNotificationsRoute,
AdminSettingsRoute: AdminSettingsRoute,
authUserProfileRoute: authUserProfileRoute,
authUserResetpasswordRoute: authUserResetpasswordRoute,

View File

@@ -1,4 +1,6 @@
import { useSuspenseQuery } from "@tanstack/react-query";
import axios from "axios";
import { toast } from "sonner";
import {
Card,
CardContent,
@@ -12,15 +14,32 @@ import { notifications } from "../../../lib/queries/notifications";
export default function NotificationsSubCard({ user }: any) {
const { data } = useSuspenseQuery(notifications());
const { data: ns } = useSuspenseQuery(notificationSubs(user.id));
const { refetch } = useSuspenseQuery(notificationSubs(user.id));
const form = useAppForm({
defaultValues: {
notificationId: "",
emails: [user.email],
},
onSubmit: async ({ value }) => {
if (value.notificationId === "") {
toast.error("Please select a notification before trying to subscribe.");
return;
}
const postD = { ...value, userId: user.id };
console.log(postD);
try {
const res = await axios.post("/lst/api/notification/sub", postD, {
withCredentials: true,
});
if (res.status === 200) {
toast.success("Notification Subbed");
refetch();
form.reset();
}
} catch (error) {
console.error(error);
}
},
});
@@ -32,11 +51,9 @@ export default function NotificationsSubCard({ user }: any) {
}));
}
console.log(ns);
return (
<div>
<Card className="p-3 w-128">
<Card className="p-3 w-lg">
<CardHeader>
<CardTitle>Notifications</CardTitle>
<CardDescription>

View File

@@ -0,0 +1,114 @@
import { useSuspenseQuery } from "@tanstack/react-query";
import { createColumnHelper } from "@tanstack/react-table";
import axios from "axios";
import { Trash } from "lucide-react";
import { toast } from "sonner";
import type { Notifications } from "../../../../types/notifications";
import { Button } from "../../../components/ui/button";
import { Card, CardContent, CardHeader } from "../../../components/ui/card";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "../../../components/ui/tooltip";
import { notificationSubs } from "../../../lib/queries/notificationSubs";
import { notifications } from "../../../lib/queries/notifications";
import LstTable from "../../../lib/tableStuff/LstTable";
import SearchableHeader from "../../../lib/tableStuff/SearchableHeader";
export default function NotificationsTable({ userId }: any) {
const { data: subs, refetch } = useSuspenseQuery(notificationSubs(userId));
const { data: note } = useSuspenseQuery(notifications());
const columnHelper = createColumnHelper<Notifications>();
// map notification ids to their records for the join below
const notificationMap = Object.fromEntries(note.map((n: any) => [n.id, n]));
const data = subs.map((sub: any) => ({
...sub,
name: notificationMap[sub.notificationId].name || null,
description: notificationMap[sub.notificationId].description || null,
emails: sub.emails ? sub.emails.join(",") : null,
}));
const removeNotification = async (ns: any) => {
try {
const res = await axios.delete(`/lst/api/notification/sub`, {
withCredentials: true,
data: {
userId: ns.userId,
notificationId: ns.notificationId,
},
});
if (res.data.success) {
toast.success(`Subscription removed`);
refetch();
} else {
console.info(res);
toast.error(res.data.message);
}
} catch {
toast.error(`There was an error removing subscription.`);
}
};
const column = [
columnHelper.accessor("name", {
header: ({ column }) => (
<SearchableHeader column={column} title="Name" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("description", {
header: ({ column }) => (
<SearchableHeader column={column} title="Description" />
),
cell: (i) => (
<Tooltip>
<TooltipTrigger>
{i.getValue()?.length > 25 ? (
<span>{i.getValue().slice(0, 25)}...</span>
) : (
<span>{i.getValue()}</span>
)}
</TooltipTrigger>
<TooltipContent>{i.getValue()}</TooltipContent>
</Tooltip>
),
}),
columnHelper.accessor("emails", {
header: ({ column }) => (
<SearchableHeader column={column} title="Emails" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("remove", {
header: ({ column }) => (
<SearchableHeader column={column} title="Remove" searchable={false} />
),
filterFn: "includesString",
cell: (i) => {
return (
<Button
size="icon"
variant={"destructive"}
onClick={() => removeNotification(i.row.original)}
>
<Trash />
</Button>
);
},
}),
];
return (
<Card>
<CardHeader className="text-center">Subscriptions</CardHeader>
<CardContent>
<LstTable data={data} columns={column} />
</CardContent>
</Card>
);
}
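The `notificationMap` built with `Object.fromEntries` is an id-to-record lookup used to join subscription rows with their notification metadata. A standalone sketch (types are hypothetical; optional chaining is added here to survive a missing notification, where indexing directly as the component does would throw):

```typescript
// Illustrative shapes; the real records carry more fields.
interface Note { id: string; name: string }
interface Sub { notificationId: string; emails: string[] | null }

// Join subs to their notification via an id lookup built once up front.
const joinSubs = (subs: Sub[], notes: Note[]) => {
  const byId = Object.fromEntries(notes.map((n) => [n.id, n]));
  return subs.map((s) => ({
    ...s,
    name: byId[s.notificationId]?.name ?? null, // tolerate a missing note
    emails: s.emails ? s.emails.join(",") : null,
  }));
};
```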

View File

@@ -13,6 +13,7 @@ import { useAppForm } from "@/lib/formSutff";
import { Spinner } from "../../components/ui/spinner";
import ChangePassword from "./-components/ChangePassword";
import NotificationsSubCard from "./-components/NotificationsSubCard";
import NotificationsTable from "./-components/NotificationsTable";
export const Route = createFileRoute("/(auth)/user/profile")({
beforeLoad: async () => {
@@ -57,51 +58,73 @@ function RouteComponent() {
},
});
return (
<div className="flex justify-center flex-col pt-4 gap-2 lg:flex-row">
<div>
<Card className="p-6 w-96">
<CardHeader>
<CardTitle>Profile</CardTitle>
<CardDescription>
Change your profile and password below
</CardDescription>
</CardHeader>
<div className="flex justify-center flex-col pt-4 gap-2">
<div className="flex justify-center flex-col pt-4 gap-2 lg:flex-row">
<div>
<Card className="p-6 w-96">
<CardHeader>
<CardTitle>Profile</CardTitle>
<CardDescription>
Change your profile and password below
</CardDescription>
</CardHeader>
<CardContent>
<form
onSubmit={(e) => {
e.preventDefault();
form.handleSubmit();
}}
>
<form.AppField name="name">
{(field) => (
<field.InputField
label="Name"
inputType="string"
required={true}
/>
)}
</form.AppField>
<CardContent>
<form
onSubmit={(e) => {
e.preventDefault();
form.handleSubmit();
}}
>
<form.AppField name="name">
{(field) => (
<field.InputField
label="Name"
inputType="string"
required={true}
/>
)}
</form.AppField>
<div className="flex justify-end mt-6">
<form.AppForm>
<form.SubmitButton>Update Profile</form.SubmitButton>
</form.AppForm>
</div>
</form>
</CardContent>
</Card>
<div className="flex justify-end mt-6">
<form.AppForm>
<form.SubmitButton>Update Profile</form.SubmitButton>
</form.AppForm>
</div>
</form>
</CardContent>
</Card>
</div>
<div>
<ChangePassword />
</div>
<div>
<Suspense
fallback={
<Card className="p-3 w-lg">
<CardHeader>
<CardTitle>Notifications</CardTitle>
</CardHeader>
<CardContent>
<div className="flex justify-center m-auto">
<div>
<Spinner className="size-32" />
</div>
</div>
</CardContent>
</Card>
}
>
{session && <NotificationsSubCard user={session.user} />}
</Suspense>
</div>
</div>
<div>
<ChangePassword />
</div>
<div>
<div className="w-fill">
<Suspense
fallback={
<Card className="p-3 w-96">
<Card className="p-3">
<CardHeader>
<CardTitle>Notifications</CardTitle>
<CardTitle className="text-center">Subscriptions</CardTitle>
</CardHeader>
<CardContent>
<div className="flex justify-center m-auto">
@@ -113,7 +136,7 @@ function RouteComponent() {
</Card>
}
>
{session && <NotificationsSubCard user={session.user} />}
{session && <NotificationsTable userId={`${session.user.id}`} />}
</Suspense>
</div>
</div>

View File

@@ -0,0 +1,316 @@
import { useMutation, useSuspenseQuery } from "@tanstack/react-query";
import { createFileRoute, redirect } from "@tanstack/react-router";
import { createColumnHelper } from "@tanstack/react-table";
import axios from "axios";
import { Trash } from "lucide-react";
import { Suspense, useState } from "react";
import { toast } from "sonner";
import type { Notifications } from "../../../types/notifications";
import { Button } from "../../components/ui/button";
import { Card, CardContent } from "../../components/ui/card";
import { Label } from "../../components/ui/label";
import { Switch } from "../../components/ui/switch";
import {
Tabs,
TabsContent,
TabsList,
TabsTrigger,
} from "../../components/ui/tabs";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "../../components/ui/tooltip";
import { authClient } from "../../lib/auth-client";
import { notificationSubs } from "../../lib/queries/notificationSubs";
import { notifications } from "../../lib/queries/notifications";
import EditableCellInput from "../../lib/tableStuff/EditableCellInput";
import LstTable from "../../lib/tableStuff/LstTable";
import SearchableHeader from "../../lib/tableStuff/SearchableHeader";
import SkellyTable from "../../lib/tableStuff/SkellyTable";
const updateNotifications = async (
id: string,
data: Record<string, string | number | boolean | null>,
) => {
//console.log(id, data);
try {
const res = await axios.patch(
`/lst/api/notification/${id}`,
{ interval: data.interval },
{
withCredentials: true,
},
);
toast.success(`Notification was just updated`);
return res;
} catch (err) {
toast.error("Error in updating the settings");
return err;
}
};
export const Route = createFileRoute("/admin/notifications")({
beforeLoad: async ({ location }) => {
const { data: session } = await authClient.getSession();
const allowedRole = ["systemAdmin", "admin"];
if (!session?.user) {
throw redirect({
to: "/",
search: {
redirect: location.href,
},
});
}
if (!allowedRole.includes(session.user.role as string)) {
throw redirect({
to: "/",
});
}
return { user: session.user };
},
component: RouteComponent,
});
const NotificationTable = () => {
const { data, refetch } = useSuspenseQuery(notifications());
const { data: subs, refetch: subRefetch } = useSuspenseQuery(
notificationSubs(),
);
const columnHelper = createColumnHelper<Notifications>();
const notificationMap = Object.fromEntries(data.map((n: any) => [n.id, n]));
const subData = subs.map((sub: any) => ({
...sub,
name: notificationMap[sub.notificationId].name || null,
description: notificationMap[sub.notificationId].description || null,
emails: sub.emails ? sub.emails.join(",") : null,
}));
const updateNotification = useMutation({
mutationFn: ({
id,
field,
value,
}: {
id: string;
field: string;
value: string | number | boolean | null;
}) => updateNotifications(id, { [field]: value }),
onSuccess: () => {
// refetch or update cache
refetch();
},
});
const removeNotification = async (ns: any) => {
try {
const res = await axios.delete(`/lst/api/notification/sub`, {
withCredentials: true,
data: {
userId: ns.userId,
notificationId: ns.notificationId,
},
});
if (res.data.success) {
toast.success(`Subscription removed`);
subRefetch();
} else {
console.info(res);
toast.error(res.data.message);
}
} catch {
toast.error(`There was an error removing subscription.`);
}
};
const column = [
columnHelper.accessor("name", {
header: ({ column }) => (
<SearchableHeader column={column} title="Name" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("description", {
header: ({ column }) => (
<SearchableHeader column={column} title="Description" />
),
cell: (i) => (
<Tooltip>
<TooltipTrigger>
{i.getValue()?.length > 25 ? (
<span>{i.getValue().slice(0, 25)}...</span>
) : (
<span>{i.getValue()}</span>
)}
</TooltipTrigger>
<TooltipContent>{i.getValue()}</TooltipContent>
</Tooltip>
),
}),
columnHelper.accessor("active", {
header: ({ column }) => (
<SearchableHeader column={column} title="Active" searchable={false} />
),
filterFn: "includesString",
cell: (i) => {
// biome-ignore lint: useState inside a cell renderer violates the hooks rules; suppressed for now, may restructure later
const [activeToggle, setActiveToggle] = useState(i.getValue());
const onToggle = async (e: boolean) => {
setActiveToggle(e);
try {
const res = await axios.patch(
`/lst/api/notification/${i.row.original.id}`,
{
active: e,
},
{ withCredentials: true },
);
if (res.data.success) {
toast.success(
`${i.row.original.name} was set to ${e ? "Active" : "Inactive"}`,
);
refetch();
}
} catch (error) {
console.error(error);
// roll back the optimistic toggle if the request failed
setActiveToggle(!e);
}
};
return (
<div className="w-48">
<div className="flex items-center space-x-2">
<Switch
id={i.row.original.id}
checked={activeToggle}
onCheckedChange={(e) => onToggle(e)}
/>
<Label htmlFor={i.row.original.id}>
{activeToggle ? "Active" : "Deactivated"}
</Label>
</div>
</div>
);
},
}),
columnHelper.accessor("interval", {
header: ({ column }) => (
<SearchableHeader column={column} title="Interval" />
),
filterFn: "includesString",
cell: ({ row, getValue }) => {
return (
<EditableCellInput
value={getValue()}
id={row.original.id}
field="interval"
onSubmit={({ id, field, value }) => {
updateNotification.mutate({ id, field, value });
}}
/>
);
},
}),
];
const subsColumn = [
columnHelper.accessor("name", {
header: ({ column }) => (
<SearchableHeader column={column} title="Name" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("description", {
header: ({ column }) => (
<SearchableHeader column={column} title="Description" />
),
cell: (i) => (
<Tooltip>
<TooltipTrigger>
{(i.getValue()?.length ?? 0) > 25 ? (
<span>{i.getValue()?.slice(0, 25)}...</span>
) : (
<span>{i.getValue()}</span>
)}
</TooltipTrigger>
<TooltipContent>{i.getValue()}</TooltipContent>
</Tooltip>
),
}),
columnHelper.accessor("emails", {
header: ({ column }) => (
<SearchableHeader column={column} title="Emails" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("remove", {
header: ({ column }) => (
<SearchableHeader column={column} title="Remove" searchable={false} />
),
filterFn: "includesString",
cell: (i) => {
return (
<Button
size="icon"
variant={"destructive"}
onClick={() => removeNotification(i.row.original)}
>
<Trash />
</Button>
);
},
}),
];
return (
<>
<TabsContent value="notifications">
<LstTable data={data} columns={column} />
</TabsContent>
<TabsContent value="subscriptions">
<LstTable data={subData} columns={subsColumn} />
</TabsContent>
</>
);
};
function RouteComponent() {
return (
<div className="space-y-6">
<div className="space-y-2">
<h1 className="text-2xl font-semibold">Notifications</h1>
<p className="text-sm text-muted-foreground">
Manage all notification settings and user subscriptions.
</p>
</div>
<Card>
<CardContent>
<Tabs defaultValue="notifications" className="w-full">
<TabsList>
<TabsTrigger value="notifications">Notifications</TabsTrigger>
<TabsTrigger value="subscriptions">Subscriptions</TabsTrigger>
</TabsList>
<Suspense fallback={<SkellyTable />}>
<NotificationTable />
</Suspense>
</Tabs>
</CardContent>
</Card>
</div>
);
}


@@ -46,7 +46,7 @@ const updateSettings = async (
id: string,
data: Record<string, string | number | boolean | null>,
) => {
-console.log(id, data);
+//console.log(id, data);
try {
const res = await axios.patch(`/lst/api/settings/${id}`, data, {
withCredentials: true,


@@ -34,7 +34,7 @@ function Index() {
<p>
This is active in your plant today due to having warehousing activated
and new functions needed to be introduced, you should be still using LST
-as you were before
+as you were before.
</p>
<br></br>
<p>


@@ -0,0 +1,10 @@
export type Notifications = {
id: string;
name: string;
emails: string;
description: string;
remove?: unknown;
active?: boolean;
interval: number;
options: unknown[];
};


@@ -0,0 +1,15 @@
CREATE TABLE "alpla_purchase_history" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"apo" integer,
"revision" integer,
"confirmed" integer,
"status" integer,
"status_text" integer,
"add_date" timestamp DEFAULT now(),
"upd_date" timestamp DEFAULT now(),
"add_user" text,
"upd_user" text,
"remark" text,
"position" jsonb DEFAULT '[]'::jsonb,
"created_at" timestamp DEFAULT now()
);


@@ -0,0 +1,2 @@
ALTER TABLE "alpla_purchase_history" ADD COLUMN "journal_num" integer;--> statement-breakpoint
ALTER TABLE "alpla_purchase_history" ADD COLUMN "approved_status" text;


@@ -0,0 +1 @@
ALTER TABLE "alpla_purchase_history" ALTER COLUMN "approved_status" SET DEFAULT 'pending';


@@ -0,0 +1 @@
ALTER TABLE "alpla_purchase_history" ALTER COLUMN "status_text" SET DATA TYPE text;


@@ -0,0 +1,6 @@
ALTER TABLE "opendock_apt" ALTER COLUMN "release" SET NOT NULL;--> statement-breakpoint
ALTER TABLE "opendock_apt" ALTER COLUMN "appointment" SET NOT NULL;--> statement-breakpoint
ALTER TABLE "opendock_apt" ALTER COLUMN "upd_date" SET NOT NULL;--> statement-breakpoint
ALTER TABLE "opendock_apt" ALTER COLUMN "created_at" SET NOT NULL;--> statement-breakpoint
CREATE INDEX "opendock_apt_release_idx" ON "opendock_apt" USING btree ("release");--> statement-breakpoint
CREATE INDEX "opendock_apt_opendock_id_idx" ON "opendock_apt" USING btree ("open_dock_apt_id");

File diff suppressed because it is too large (5 files)


@@ -141,6 +141,41 @@
"when": 1775159956510,
"tag": "0019_large_thunderbird",
"breakpoints": true
-}
+},
+{
+"idx": 20,
+"version": "7",
+"when": 1775566910220,
+"tag": "0020_stale_ma_gnuci",
+"breakpoints": true
+},
+{
+"idx": 21,
+"version": "7",
+"when": 1775647109925,
+"tag": "0021_slimy_master_mold",
+"breakpoints": true
+},
+{
+"idx": 22,
+"version": "7",
+"when": 1775649219780,
+"tag": "0022_large_sumo",
+"breakpoints": true
+},
+{
+"idx": 23,
+"version": "7",
+"when": 1775650901523,
+"tag": "0023_normal_hellion",
+"breakpoints": true
+},
+{
+"idx": 24,
+"version": "7",
+"when": 1775661516749,
+"tag": "0024_absent_barracuda",
+"breakpoints": true
+}
]
}

package-lock.json (generated): file diff suppressed because it is too large


@@ -1,6 +1,6 @@
{
"name": "lst_v3",
-"version": "1.0.2-alpha.0",
+"version": "0.0.1-alpha.0",
"description": "The tool that supports us in our everyday alplaprod",
"main": "index.js",
"scripts": {
@@ -20,13 +20,9 @@
"start:server": "dotenvx run -f .env -- node dist/server.js",
"start:docker": "node dist/server.js",
"version": "changeset version",
-"release": "dotenvx run -f .env -- npm run version && git push --follow-tags && node scripts/create-release.js",
"specCheck": "node scripts/check-route-specs.mjs",
"commit": "cz",
-"changeset": "changeset",
-"changeset:add": "changeset",
-"changeset:version": "changeset version",
-"changeset:status": "changeset status --verbose"
+"release": "commit-and-tag-version"
},
"repository": {
"type": "git",
@@ -38,7 +34,6 @@
"type": "module",
"devDependencies": {
"@biomejs/biome": "2.4.8",
-"@changesets/cli": "^2.30.0",
"@commitlint/cli": "^20.5.0",
"@commitlint/config-conventional": "^20.5.0",
"@types/cors": "^2.8.19",
@@ -54,6 +49,7 @@
"@types/supertest": "^7.2.0",
"@types/swagger-jsdoc": "^6.0.4",
"@types/swagger-ui-express": "^4.1.8",
+"commit-and-tag-version": "^12.7.1",
"commitizen": "^4.3.1",
"cpy-cli": "^7.0.0",
"cz-conventional-changelog": "^3.3.0",


@@ -37,12 +37,15 @@ $Servers = @(
token = "test2"
loc = "E$\LST_V3"
}
#@{ server = "usmcd1vms036"; token = "test1"; loc = "E$\LST\lst_backend"; }
#@{ server = "usiow1vms036"; token = "test1"; loc = "E$\LST\lst_backend"; }
+,
+[PSCustomObject]@{
+server = "usweb1vms006"
+token = "usweb1"
+loc = "D$\LST_V3"
+}
#@{ server = "usbet1vms006"; token = "usbet1";loc = "C$\Users\adm_matthes01\Desktop\lst_backend"; }
#@{ server = "usbow1vms006"; token = "usbow1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ; }
#@{ server = "usbow2vms006"; token = "usbow2"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ; }
#@{ server = "usday1vms006"; token = "usday1"; loc = "E$\LST\lst_backend" ; }
#@{ server = "usflo1vms006"; token = "usflo1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ; }
#@{ server = "ushou1vms006"; token = "ushou1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ;}
#@{ server = "usiow1vms006"; token = "usiow1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ; }
@@ -53,7 +56,6 @@ $Servers = @(
#@{ server = "usshe1vms006"; token = "usshe1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ;}
#@{ server = "usslc1vms006"; token = "usslc1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ;}
#@{ server = "usstp1vms006"; token = "usstp1"; loc = "E$\LST\lst_backend" ; }
-#@{ server = "usweb1vms006"; token = "usweb1"; loc = "C$\Users\adm_matthes01\Desktop\lst_backend" ;}
#@{ server = "usmar1vms006"; token = "test1"; loc = "E$\LST\lst_backend"; }
)
@@ -339,7 +341,7 @@ function Update-Server {
# do the install/update
Push-Location $LocalPath
Write-Host "Running install/update in: $LocalPath"
-npm install
+npm install --omit=dev
Start-Sleep -Seconds 3
Write-Host "Install/update completed."
# do the migrations


@@ -74,7 +74,7 @@ stage the change log file
git commit -m "chore(release): version packages"
-git tag v1.0.1 this will be the new version
+git tag v0.0.1-alpha.0 change this to the same version thats in the pkg.json
then push it
@@ -110,4 +110,23 @@ Release flow
6. git commit -m "chore(release): version packages"
7. git tag vX.X.X
8. git push
-9. git push --tags
+9. git push --tags
+# normal work
+stage files
+npm run commit
+# if releasing
+npm run commit
+npm run release -- --prerelease alpha
+git push
+git push --tags
+git add .
+git commit -m "chore(release): version packages"
+git tag v0.0.1-alpha.0
+git push
+git push --tags