Compare commits: 4b6061c478...main (21 commits)

| SHA1 |
|---|
| d6328ab764 |
| a6d53f0266 |
| 7962463927 |
| f716de1a58 |
| 88cef2a56c |
| cb00addee9 |
| b832d7aa1e |
| 32517d0c98 |
| 82f8369640 |
| 3734d9daac |
| a1eeadeec4 |
| 3639c1b77c |
| cfbc156517 |
| fb3cd85b41 |
| 5b1c88546f |
| ba3227545d |
| 84909bfcf8 |
| e0d0ac2077 |
| 52a6c821f4 |
| eccaf17332 |
| 6307037985 |
.gitignore (vendored): 2 changes

```diff
@@ -5,6 +5,7 @@ builds
 .buildNumber
 temp
 brunoApi
+downloads
 .scriptCreds
 node-v24.14.0-x64.msi
 postgresql-17.9-2-windows-x64.exe
@@ -148,3 +149,4 @@ dist
 .yarn/install-state.gz
 .pnp.*
 
+frontend/.tanstack/tmp/2249110e-da91fb0b1b87b6c4cc3e2c2cd25037fd
```
.vscode/settings.json (vendored): 5 changes

```diff
@@ -1,6 +1,6 @@
 {
   "editor.defaultFormatter": "biomejs.biome",
-  "workbench.colorTheme": "Default Dark+",
+  "workbench.colorTheme": "Dark+",
   "terminal.integrated.env.windows": {},
   "editor.formatOnSave": true,
   "typescript.preferences.importModuleSpecifier": "relative",
@@ -71,7 +71,8 @@
     "prodlabels",
     "prolink",
     "Skelly",
-    "trycatch"
+    "trycatch",
+    "whse"
   ],
   "gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
   "gitea.instanceURL": "https://git.tuffraid.net",
```
CHANGELOG.md: 44 changes

```diff
@@ -1,5 +1,49 @@
 # All Changes to LST can be found below.
 
+## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)
+
+
+### 🌟 Enhancements
+
+* **datamart:** migrations completed remaining is the deactivation that will be ran by anylitics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
+* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
+* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
+* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
+* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))
+
+
+### 🐛 Bug fixes
+
+* **datamart:** error when running build and crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))
+
+
+### 📚 Documentation
+
+* **docs:** removed docusorus as all docs will be inside lst now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
+* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
+* **notifcations:** docs for intro, notifcations, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))
+
+
+### 🛠️ Code Refactor
+
+* **agent:** changed to have the test servers on there own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
+* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
+* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
+* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
+* **puchase:** changes how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
+* **reprint:** new query added to deactivate the old notifcation so no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
+* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
+* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
+* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
+* **templates:** corrections for new notify process on critcal errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))
+
+
+### 📈 Project changes
+
+* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
+* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
+* **service:** changes to the script to allow running the powershell on execution palicy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))
+
+
 ## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)
 
 
```
```diff
@@ -19,7 +19,7 @@ Quick summary of current rewrite/migration goal.
 | User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
 | User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
 | Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
-| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
+| Datamart | ~~Create~~, ~~Update~~, ~~Run~~, Deactivate | 🟨 In Progress |
 | Frontend | Analytics and charts | ⏳ Not Started |
 | Docs | Instructions and trouble shooting | ⏳ Not Started |
 | One Click Print | Get printers, monitor printers, label process, material process, Special processes | ⏳ Not Started |
```
backend/admin/admin.build.ts (new file): 38 lines

```diff
@@ -0,0 +1,38 @@
+/**
+ * To be able to run this we need to set our dev pc in the .env.
+ * if its empty just ignore it. this will just be the double catch
+ */
+
+import { Router } from "express";
+import { build, building } from "../utils/build.utils.js";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+
+const router = Router();
+
+router.post("/release", async (_, res) => {
+  if (!building) {
+    build();
+
+    return apiReturn(res, {
+      success: true,
+      level: "info",
+      module: "admin",
+      subModule: "build",
+      message: `The build has been triggered see logs for progress of the current build.`,
+      data: [],
+      status: 200,
+    });
+  } else {
+    return apiReturn(res, {
+      success: false,
+      level: "error",
+      module: "admin",
+      subModule: "build",
+      message: `There is a build in progress already please check the logs for on going progress.`,
+      data: [],
+      status: 200,
+    });
+  }
+});
+
+export default router;
```
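The `/release` route above gates work behind the shared `building` flag so only one build can run at a time. A minimal sketch of that single-flight guard, assuming a local flag and a hypothetical `triggerBuild` helper standing in for the real `build`/`apiReturn` plumbing:

```typescript
// Hypothetical stand-in for the shared flag exported by build.utils.
let building = false;

// Sketch of the single-flight guard: the first caller flips the flag and
// starts the work; later callers are rejected. In the real module, build()
// clears `building` asynchronously when the build finishes.
function triggerBuild(run: () => void): { success: boolean; message: string } {
  if (building) {
    return { success: false, message: "There is a build in progress already." };
  }
  building = true;
  run();
  return { success: true, message: "The build has been triggered." };
}
```

Because the flag stays set until the build completes, a second POST while a build is running takes the rejection branch.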
backend/admin/admin.routes.ts (new file): 12 lines

```diff
@@ -0,0 +1,12 @@
+import type { Express } from "express";
+import { requireAuth } from "../middleware/auth.middleware.js";
+import build from "./admin.build.js";
+import update from "./admin.updateServer.js";
+
+export const setupAdminRoutes = (baseUrl: string, app: Express) => {
+  //stats will be like this as we dont need to change this
+  app.use(`${baseUrl}/api/admin/build`, requireAuth, build);
+  app.use(`${baseUrl}/api/admin/build`, requireAuth, update);
+
+  // all other system should be under /api/system/*
+};
```
backend/admin/admin.updateServer.ts (new file): 86 lines

```diff
@@ -0,0 +1,86 @@
+/**
+ * To be able to run this we need to set our dev pc in the .env.
+ * if its empty just ignore it. this will just be the double catch
+ */
+
+import { Router } from "express";
+import z from "zod";
+import { building } from "../utils/build.utils.js";
+import { runUpdate, updating } from "../utils/deployApp.js";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+
+const updateServer = z.object({
+  server: z.string(),
+  destination: z.string(),
+  token: z.string().min(5, "Plant tokens should be at least 5 characters long"),
+});
+
+const router = Router();
+
+type Update = {
+  success: boolean;
+  message: string;
+};
+
+router.post("/updateServer", async (req, res) => {
+  try {
+    const validated = updateServer.parse(req.body);
+
+    if (!updating && !building) {
+      const update = (await runUpdate({
+        server: validated.server,
+        destination: validated.destination,
+        token: validated.token,
+      })) as Update;
+      return apiReturn(res, {
+        success: update.success,
+        level: update.success ? "info" : "error",
+        module: "admin",
+        subModule: "update",
+        message: update.message,
+        data: [],
+        status: 200,
+      });
+    } else {
+      return apiReturn(res, {
+        success: false,
+        level: "error",
+        module: "admin",
+        subModule: "update",
+        message: `${validated.server}: ${validated.token} is already being updated, or is currently building the app.`,
+        data: [],
+        status: 200,
+      });
+    }
+  } catch (err) {
+    if (err instanceof z.ZodError) {
+      const flattened = z.flattenError(err);
+      // return res.status(400).json({
+      //   error: "Validation failed",
+      //   details: flattened,
+      // });
+
+      return apiReturn(res, {
+        success: false,
+        level: "error", //connect.success ? "info" : "error",
+        module: "routes",
+        subModule: "auth",
+        message: "Validation failed",
+        data: [flattened.fieldErrors],
+        status: 400, //connect.success ? 200 : 400,
+      });
+    }
+
+    return apiReturn(res, {
+      success: false,
+      level: "error", //connect.success ? "info" : "error",
+      module: "routes",
+      subModule: "auth",
+      message: "Internal Server Error creating user",
+      data: [err],
+      status: 400, //connect.success ? 200 : 400,
+    });
+  }
+});
+
+export default router;
```
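The `updateServer` schema above rejects plant tokens shorter than five characters via `z.string().min(5, …)`. The same rule in plain TypeScript, with `validateToken` as a made-up helper name used only for this sketch:

```typescript
// Sketch of the token rule from the updateServer schema, without zod.
// Carries the same error message the schema attaches to a short token.
function validateToken(token: string): { ok: boolean; error?: string } {
  if (token.length < 5) {
    return {
      ok: false,
      error: "Plant tokens should be at least 5 characters long",
    };
  }
  return { ok: true };
}
```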
```diff
@@ -3,8 +3,13 @@ import type sql from "mssql";
 const username = "gpviewer";
 const password = "gp$$ViewOnly!";
+
+const port = process.env.SQL_PORT
+  ? Number.parseInt(process.env.SQL_PORT, 10)
+  : undefined;
+
 export const gpSqlConfig: sql.config = {
-  server: `USMCD1VMS011`,
+  server: `${process.env.GP_SERVER ?? "USMCD1VMS011"}`,
+  port: port,
   database: `ALPLA`,
   user: username,
   password: password,
```
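Both SQL configs in this compare derive `port` from the `SQL_PORT` environment variable, falling back to `undefined` so the mssql driver can use its own default. The same expression extracted into a helper for illustration (`parsePort` is not a name from the diff):

```typescript
// Mirrors the diff's `process.env.SQL_PORT ? Number.parseInt(..., 10) : undefined`.
// An unset or empty SQL_PORT yields undefined, leaving the driver default in place.
function parsePort(raw: string | undefined): number | undefined {
  return raw ? Number.parseInt(raw, 10) : undefined;
}
```

Note the truthiness check also treats an empty string as "unset", which avoids passing `NaN` to the driver.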
```diff
@@ -1,7 +1,13 @@
 import type sql from "mssql";
+
+const port = process.env.SQL_PORT
+  ? Number.parseInt(process.env.SQL_PORT, 10)
+  : undefined;
+
 export const prodSqlConfig: sql.config = {
   server: `${process.env.PROD_SERVER}`,
   database: `AlplaPROD_${process.env.PROD_PLANT_TOKEN}_cus`,
+  port: port,
   user: process.env.PROD_USER,
   password: process.env.PROD_PASSWORD,
   options: {
```
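The prod config builds its database name from the plant token. The naming convention made explicit as a tiny helper (`prodDbName` is hypothetical; the diff inlines the template string):

```typescript
// The diff inlines `AlplaPROD_${process.env.PROD_PLANT_TOKEN}_cus`;
// extracted here so the per-plant naming convention is visible at a glance.
function prodDbName(plantToken: string): string {
  return `AlplaPROD_${plantToken}_cus`;
}
```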
```diff
@@ -13,6 +13,10 @@
 *
 * when a criteria is password over we will handle it by counting how many were passed up to 3 then deal with each one respectively
 */
+
+import { and, between, inArray, notInArray } from "drizzle-orm";
+import { db } from "../db/db.controller.js";
+import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
 import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
 import {
   type SqlQuery,
```
```diff
@@ -22,37 +26,125 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
 import { tryCatch } from "../utils/trycatch.utils.js";
 import { datamartData } from "./datamartData.utlis.js";
 
-type Options = {
-  name: string;
-  value: string;
-};
 type Data = {
   name: string;
-  options: Options;
+  options: any;
   optionsRequired?: boolean;
   howManyOptionsRequired?: number;
 };
+
+const lstDbRun = async (data: Data) => {
+  if (data.options) {
+    if (data.name === "psiInventory") {
+      const ids = data.options.articles.split(",").map((id: any) => id.trim());
+      const whse = data.options.whseToInclude
+        ? data.options.whseToInclude
+            .split(",")
+            .map((w: any) => w.trim())
+            .filter(Boolean)
+        : [];
+
+      const locations = data.options.exludeLanes
+        ? data.options.exludeLanes
+            .split(",")
+            .map((l: any) => l.trim())
+            .filter(Boolean)
+        : [];
+
+      const conditions = [
+        inArray(invHistoricalData.article, ids),
+        between(
+          invHistoricalData.histDate,
+          data.options.startDate,
+          data.options.endDate,
+        ),
+      ];
+
+      // only add the warehouse condition if there are any whse values
+      if (whse.length > 0) {
+        conditions.push(inArray(invHistoricalData.whseId, whse));
+      }
+
+      // locations we dont want in the system
+      if (locations.length > 0) {
+        conditions.push(notInArray(invHistoricalData.location, locations));
+      }
+
+      return await db
+        .select()
+        .from(invHistoricalData)
+        .where(and(...conditions));
+    }
+  }
+  return [];
+};
 export const runDatamartQuery = async (data: Data) => {
   // search the query db for the query by name
-  const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
+  const considerLstDBRuns = ["psiInventory"];
+
+  if (considerLstDBRuns.includes(data.name)) {
+    const lstDB = await lstDbRun(data);
+
+    return returnFunc({
+      success: true,
+      level: "info",
+      module: "datamart",
+      subModule: "lstDBrn",
+      message: `Data for: ${data.name}`,
+      data: lstDB,
+      notify: false,
+    });
+  }
+
+  const featureQ = sqlQuerySelector(`featureCheck`) as SqlQuery;
+
+  const { data: fd, error: fe } = await tryCatch(
+    prodQuery(featureQ.query, `Running feature check`),
+  );
+
+  if (fe) {
+    return returnFunc({
+      success: false,
+      level: "error",
+      module: "datamart",
+      subModule: "query",
+      message: `feature check failed`,
+      data: fe as any,
+      notify: false,
+    });
+  }
+
+  // for queries that will need to be ran on legacy until we get the plant updated need to go in here
+  const doubleQueries = ["inventory"];
+  let queryFile = "";
+
+  if (doubleQueries.includes(data.name)) {
+    queryFile = `datamart.${
+      fd.data[0].activated > 0 ? data.name : `legacy.${data.name}`
+    }`;
+  } else {
+    queryFile = `datamart.${data.name}`;
+  }
+
+  const sqlQuery = sqlQuerySelector(queryFile) as SqlQuery;
+  // checking if warehousing is as it will start to effect a lot of queries for plants that are not on 2.
+
   const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);
 
   // const optionsMissing =
   //   !data.options || Object.keys(data.options).length === 0;
 
-  const optionCount =
-    Object.keys(data.options).length ===
-    getDataMartInfo[0]?.howManyOptionsRequired;
+  const isValid =
+    Object.keys(data.options ?? {}).length >=
+    (getDataMartInfo[0]?.howManyOptionsRequired ?? 0);
 
-  if (getDataMartInfo[0]?.optionsRequired && !optionCount) {
+  if (getDataMartInfo[0]?.optionsRequired && !isValid) {
     return returnFunc({
       success: false,
       level: "error",
       module: "datamart",
       subModule: "query",
-      message: `This query is required to have the ${getDataMartInfo[0]?.howManyOptionsRequired} options set in order use it.`,
+      message: `This query is required to have ${getDataMartInfo[0]?.howManyOptionsRequired} option(s) set in order use it, please add in your option(s) data and try again.`,
       data: [getDataMartInfo[0].options],
       notify: false,
     });
```
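`lstDbRun` above turns comma-separated option strings (`whseToInclude`, `exludeLanes`) into trimmed, empty-free arrays before building the drizzle conditions; an absent option yields an empty array, which means the corresponding where-condition is simply skipped. That split/trim/filter step in isolation (`splitCsvOption` is a name made up for this sketch):

```typescript
// The same .split(",").map(trim).filter(Boolean) chain lstDbRun applies to
// whseToInclude / exludeLanes. An undefined option yields an empty array,
// i.e. "don't add that condition"; filter(Boolean) drops blank segments.
function splitCsvOption(raw: string | undefined): string[] {
  return raw
    ? raw
        .split(",")
        .map((part) => part.trim())
        .filter(Boolean)
    : [];
}
```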
```diff
@@ -75,11 +167,131 @@ export const runDatamartQuery = async (data: Data) => {
 
   // split the criteria by "," then and then update the query
   if (data.options) {
-    Object.entries(data.options ?? {}).forEach(([key, value]) => {
-      const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
-      datamartQuery = datamartQuery.replace(pattern, String(value).trim());
+    switch (data.name) {
+      case "activeArticles":
+        break;
+      case "deliveryByDateRange":
+        datamartQuery = datamartQuery
+          .replace("[startDate]", `${data.options.startDate}`)
+          .replace("[endDate]", `${data.options.endDate}`)
+          .replace(
+            "--and r.ArticleHumanReadableId in ([articles]) ",
+            data.options.articles
+              ? `and r.ArticleHumanReadableId in (${data.options.articles})`
+              : "--and r.ArticleHumanReadableId in ([articles]) ",
+          )
+          .replace(
+            "and DeliveredQuantity > 0",
+            data.options.all
+              ? "--and DeliveredQuantity > 0"
+              : "and DeliveredQuantity > 0",
+          );
+
+        break;
+      case "customerInventory":
+        datamartQuery = datamartQuery
+          .replace(
+            "--and IdAdressen",
+            `and IdAdressen in (${data.options.customer})`,
+          )
+          .replace(
+            "--and x.IdWarenlager in (0)",
+            `${data.options.whseToInclude ? `and x.IdWarenlager in (${data.options.whseToInclude})` : `--and x.IdWarenlager in (0)`}`,
+          );
+        break;
+      case "openOrders":
+        datamartQuery = datamartQuery
+          .replace("[startDay]", `${data.options.startDay}`)
+          .replace("[endDay]", `${data.options.endDay}`);
+        break;
+      case "inventory":
+        datamartQuery = datamartQuery
+          .replaceAll(
+            "--,l.RunningNumber",
+            `${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
+          )
+          .replaceAll(
+            "--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
+            `${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
+          )
+          .replaceAll(
+            "--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber",
+            `${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber`}`,
+          )
+          .replaceAll(
+            "--,l.WarehouseDescription,l.LaneDescription",
+            `${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
+          );
+
+        break;
+      case "fakeEDIUpdate":
+        datamartQuery = datamartQuery.replace(
+          "--AND h.CustomerHumanReadableId in (0)",
+          `${data.options.address ? `AND h.CustomerHumanReadableId in (${data.options.address})` : `--AND h.CustomerHumanReadableId in (0)`}`,
+        );
+
+        break;
+      case "forecast":
+        datamartQuery = datamartQuery.replace(
+          "where DeliveryAddressHumanReadableId in ([customers])",
+          data.options.customers
+            ? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
+            : "--where DeliveryAddressHumanReadableId in ([customers])",
+        );
+
+        break;
+      case "activeArticles2":
+        datamartQuery = datamartQuery.replace(
+          "and a.HumanReadableId in ([articles])",
+          data.options.articles
+            ? `and a.HumanReadableId in (${data.options.articles})`
+            : "--and a.HumanReadableId in ([articles])",
+        );
+
+        break;
+      case "psiDeliveryData":
+        datamartQuery = datamartQuery
+          .replace("[startDate]", `${data.options.startDate}`)
+          .replace("[endDate]", `${data.options.endDate}`)
+          .replace(
+            "[articles]",
+            data.options.articles ? `${data.options.articles}` : "[articles]",
+          );
+        break;
+      case "productionData":
+        datamartQuery = datamartQuery
+          .replace("[startDate]", `${data.options.startDate}`)
+          .replace("[endDate]", `${data.options.endDate}`)
+          .replace(
+            "and ArticleHumanReadableId in ([articles])",
+            data.options.articles
+              ? `and ArticleHumanReadableId in (${data.options.articles})`
+              : "--and ArticleHumanReadableId in ([articles])",
+          );
+        break;
+      case "psiPlanningData":
+        datamartQuery = datamartQuery
+          .replace("[startDate]", `${data.options.startDate}`)
+          .replace("[endDate]", `${data.options.endDate}`)
+          .replace(
+            "and p.IdArtikelvarianten in ([articles])",
+            data.options.articles
+              ? `and p.IdArtikelvarianten in (${data.options.articles})`
+              : "--and p.IdArtikelvarianten in ([articles])",
+          );
+        break;
+      default:
+        return returnFunc({
+          success: false,
+          level: "error",
+          module: "datamart",
+          subModule: "query",
+          message: `${data.name} encountered an error as it might not exist in LST please contact support if this continues to happen`,
+          data: [sqlQuery.message],
+          notify: true,
         });
     }
+  }
 
   const { data: queryRun, error } = await tryCatch(
     prodQuery(datamartQuery, `Running datamart query: ${data.name}`),
```
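The switch above customizes each query in two ways: bracket placeholders like `[startDate]` are substituted directly, and optional SQL filters ship commented out with a `--` prefix and are swapped for the live clause only when the caller supplied a value. A reduced sketch of that comment-toggle pattern on a made-up template string (the helper name and query are ours, not from the diff):

```typescript
// Sketch of the comment-toggle technique from the switch: the template keeps
// the optional filter commented out with SQL's `--`, and we only substitute
// the live clause when the caller supplied a value. With no value, the
// replacement is the original commented line, so the query is unchanged.
function applyArticleFilter(query: string, articles?: string): string {
  return query.replace(
    "--and ArticleHumanReadableId in ([articles])",
    articles
      ? `and ArticleHumanReadableId in (${articles})`
      : "--and ArticleHumanReadableId in ([articles])",
  );
}

// Hypothetical template in the same shape the datamart .sql files use.
const template =
  "select * from Deliveries where 1=1 --and ArticleHumanReadableId in ([articles])";
```

This keeps the SQL file valid on its own (the filter is a comment) while letting the controller enable it without any SQL parsing.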
```diff
@@ -10,14 +10,50 @@ export const datamartData = [
     name: "Active articles",
     endpoint: "activeArticles",
     description: "returns all active articles for the server with custom data",
-    options: "", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
+    options: "",
     optionsRequired: false,
   },
   {
     name: "Delivery by date range",
     endpoint: "deliveryByDateRange",
-    description: `Returns all Deliverys in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
-    options: "startDate,endDate", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
+    description: `Returns all Deliveries in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
+    options: "startDate,endDate",
+    optionsRequired: true,
+    howManyOptionsRequired: 2,
+  },
+  {
+    name: "Get Customer Inventory",
+    endpoint: "customerInventory",
+    description: `Returns specific customer inventory based on there address ID, IE: 8,12,145. \nWith option to include specific warehousesIds, IE 36,41,5. \nNOTES: *leaving warehouse blank will just pull everything for the customer, Inventory dose not include PPOO or INV`,
+    options: "customer,whseToInclude",
+    optionsRequired: true,
+    howManyOptionsRequired: 1,
+  },
+  {
+    name: "Get open order",
+    endpoint: "openOrders",
+    description: `Returns open orders based on day count sent over, IE: startDay 15 days in the past endDay 5 days in the future, can be left empty for this default days`,
+    options: "startDay,endDay",
+    optionsRequired: true,
+    howManyOptionsRequired: 2,
+  },
+  {
+    name: "Get inventory",
+    endpoint: "inventory",
+    description: `Returns all inventory, excludes inv location. adding an x in one of the options will enable it.`,
+    options: "includeRunningNumbers,locations,lots",
+  },
+  {
+    name: "Fake EDI Update",
+    endpoint: "fakeEDIUpdate",
+    description: `Returns all open orders to correct and resubmit via lst demand mgt, leaving blank will get everything putting an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
+    options: "address",
+  },
+  {
+    name: "Production Data",
+    endpoint: "productionData",
+    description: `Returns all production data from the date range with the option to have 1 to many avs to search by.`,
+    options: "startDate,endDate,articles",
     optionsRequired: true,
     howManyOptionsRequired: 2,
   },
```
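Each registry entry above drives the option check in `runDatamartQuery`: when `optionsRequired` is set, the caller must supply at least `howManyOptionsRequired` option keys. The comparison in isolation, with `hasEnoughOptions` as a name invented for this sketch:

```typescript
// Mirrors runDatamartQuery's isValid expression:
//   Object.keys(data.options ?? {}).length >= (howManyOptionsRequired ?? 0)
// Missing options count as zero keys; a missing requirement counts as zero,
// so entries without howManyOptionsRequired always pass.
function hasEnoughOptions(
  options: Record<string, unknown> | undefined,
  required: number | undefined,
): boolean {
  return Object.keys(options ?? {}).length >= (required ?? 0);
}
```

Note the check is `>=` on the key count only; it does not verify that the keys are the ones named in the entry's `options` string.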
backend/db/schema/buildHistory.schema.ts (new file): 10 lines

```diff
@@ -0,0 +1,10 @@
+import { integer, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
+
+export const deploymentHistory = pgTable("deployment_history", {
+  id: uuid("id").defaultRandom().primaryKey(),
+  serverId: uuid("server_id"),
+  buildNumber: integer("build_number").notNull(),
+  status: text("status").notNull(), // started, success, failed
+  message: text("message"),
+  createdAt: timestamp("created_at").defaultNow(),
+});
```
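The `status` column above is free text whose allowed values live only in the inline comment (`started, success, failed`). A hedged sketch of a guard that would keep writes inside that set; this helper is not part of the migration, just an illustration of enforcing the comment's contract in code:

```typescript
// The schema comment documents the only expected status values; this guard
// narrows an arbitrary string to that set before it is written to the column.
const DEPLOY_STATUSES = ["started", "success", "failed"] as const;
type DeployStatus = (typeof DEPLOY_STATUSES)[number];

function isDeployStatus(value: string): value is DeployStatus {
  return (DEPLOY_STATUSES as readonly string[]).includes(value);
}
```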
backend/db/schema/historicalInv.schema.ts (new file, 30 lines)
@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";

export const invHistoricalData = pgTable("inv_historical_data", {
	inv: uuid("id").defaultRandom().primaryKey(),
	histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
	plantToken: text("plant_token"),
	article: text("article").notNull(),
	articleDescription: text("article_description").notNull(),
	materialType: text("material_type"),
	total_QTY: text("total_QTY"),
	available_QTY: text("available_QTY"),
	coa_QTY: text("coa_QTY"),
	held_QTY: text("held_QTY"),
	consignment_QTY: text("consignment_qty"),
	lot_Number: text("lot_number"),
	locationId: text("location_id"),
	location: text("location"),
	whseId: text("whse_id").default(""),
	whseName: text("whse_name").default("missing whseName"),
	upd_user: text("upd_user").default("lst-system"),
	upd_date: timestamp("upd_date").defaultNow(),
});

export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);

export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;
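Editor's sketch: the `*_QTY` columns in the schema above are stored as text, and the import code elsewhere in this diff falls back to `"0.00"` strings. A small normalizer for that convention might look like the following — the helper name and coercion rules are assumptions for illustration, not code from this repo.

```typescript
// Normalize a quantity value to the "0.00"-style text format the *_QTY
// columns use; null/undefined/unparseable input falls back to "0.00".
const toQtyText = (value: number | string | null | undefined): string => {
	const n = typeof value === "string" ? Number.parseFloat(value) : value;
	if (n == null || Number.isNaN(n)) return "0.00";
	return n.toFixed(2);
};

console.log(toQtyText(12)); // "12.00"
console.log(toQtyText("3.5")); // "3.50"
console.log(toQtyText(undefined)); // "0.00"
```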
backend/db/schema/serverData.schema.ts (new file, 40 lines)
@@ -0,0 +1,40 @@
import {
	boolean,
	integer,
	pgTable,
	text,
	timestamp,
	uuid,
} from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";

export const serverData = pgTable(
	"server_data",
	{
		server_id: uuid("id").defaultRandom().primaryKey(),
		name: text("name").notNull(),
		server: text("server"),
		plantToken: text("plant_token").notNull().unique(),
		idAddress: text("id_address"),
		greatPlainsPlantCode: text("great_plains_plant_code"),
		contactEmail: text("contact_email"),
		contactPhone: text("contact_phone"),
		active: boolean("active").default(true),
		serverLoc: text("server_loc"),
		lastUpdated: timestamp("last_updated").defaultNow(),
		buildNumber: integer("build_number"),
		isUpgrading: boolean("is_upgrading").default(false),
	},

	// (table) => [
	// 	// uniqueIndex('emailUniqueIndex').on(sql`lower(${table.email})`),
	// 	uniqueIndex("plant_token").on(table.plantToken),
	// ],
);

export const serverDataSchema = createSelectSchema(serverData);
export const newServerDataSchema = createInsertSchema(serverData);

export type ServerDataSchema = z.infer<typeof serverDataSchema>;
export type NewServerData = z.infer<typeof newServerDataSchema>;
@@ -1,10 +1,27 @@
-import type { InferSelectModel } from "drizzle-orm";
-import { integer, pgTable, text, timestamp } from "drizzle-orm/pg-core";
+import {
+	boolean,
+	integer,
+	jsonb,
+	pgTable,
+	text,
+	timestamp,
+} from "drizzle-orm/pg-core";
+import { createInsertSchema, createSelectSchema } from "drizzle-zod";
+import type z from "zod";
 
-export const serverStats = pgTable("stats", {
-	id: text("id").primaryKey().default("serverStats"),
-	build: integer("build").notNull().default(1),
-	lastUpdate: timestamp("last_update").defaultNow(),
+export const appStats = pgTable("app_stats", {
+	id: text("id").primaryKey().default("primary"),
+	currentBuild: integer("current_build").notNull().default(1),
+	lastBuildAt: timestamp("last_build_at"),
+	lastDeployAt: timestamp("last_deploy_at"),
+	building: boolean("building").notNull().default(false),
+	updating: boolean("updating").notNull().default(false),
+	lastUpdated: timestamp("last_updated").defaultNow(),
+	meta: jsonb("meta").$type<Record<string, unknown>>().default({}),
 });
 
-export type ServerStats = InferSelectModel<typeof serverStats>;
+export const appStatsSchema = createSelectSchema(appStats);
+export const newAppStatsSchema = createInsertSchema(appStats, {});
+
+export type AppStats = z.infer<typeof appStatsSchema>;
+export type NewAppStats = z.infer<typeof newAppStatsSchema>;
@@ -53,13 +53,14 @@ export const connectGPSql = async () => {
 			notify: false,
 		});
 	} catch (error) {
+		console.log(error);
 		reconnectToSql;
 		return returnFunc({
 			success: false,
 			level: "error",
 			module: "system",
 			subModule: "db",
-			message: "Failed to connect to the prod sql server.",
+			message: "Failed to connect to the gp sql server.",
 			data: [error],
 			notify: false,
 		});
backend/logistics/logistics.historicalInv.ts (new file, 223 lines)
@@ -0,0 +1,223 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
	type SqlQuery,
	sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";

type Inventory = {
	article: string;
	alias: string;
	materialType: string;
	total_palletQTY: string;
	available_QTY: string;
	coa_QTY: string;
	held_QTY: string;
	consignment_qty: string;
	lot: string;
	locationId: string;
	laneDescription: string;
	warehouseId: string;
	warehouseDescription: string;
};

const historicalInvImport = async () => {
	const today = new Date();
	const { data, error } = await tryCatch(
		db
			.select()
			.from(invHistoricalData)
			.where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
	);

	if (error) {
		return returnFunc({
			success: false,
			level: "error",
			module: "system",
			subModule: "query",
			message: `Error getting historical inv info`,
			data: error as any,
			notify: false,
		});
	}

	if (data?.length === 0) {
		const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;

		if (!avSQLQuery.success) {
			return returnFunc({
				success: false,
				level: "error",
				module: "logistics",
				subModule: "inv",
				message: `Error getting Article info`,
				data: [avSQLQuery.message],
				notify: true,
			});
		}

		const { data: inv, error: invError } = await tryCatch(
			//prodQuery(sqlQuery.query, "Inventory data"),
			runDatamartQuery({
				name: "inventory",
				options: { lots: "x", locations: "x" },
			}),
		);

		const { data: av, error: avError } = (await tryCatch(
			runDatamartQuery({ name: "activeArticles", options: {} }),
		)) as any;

		if (invError) {
			return returnFunc({
				success: false,
				level: "error",
				module: "logistics",
				subModule: "inv",
				message: `Error getting inventory info from prod query`,
				data: invError as any,
				notify: false,
			});
		}

		if (avError) {
			return returnFunc({
				success: false,
				level: "error",
				module: "logistics",
				subModule: "inv",
				message: `Error getting article info from prod query`,
				data: avError as any,
				notify: false,
			});
		}

		// shape the data to go into our table
		const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
		const importInv = (inv.data ? inv.data : []) as Inventory[];
		const importData = importInv.map((i) => {
			return {
				histDate: sql`(NOW())::date`,
				plantToken: plantToken,
				article: i.article,
				articleDescription: i.alias,
				materialType:
					av.data.filter((a: any) => a.article === i.article).length > 0
						? av.data.filter((a: any) => a.article === i.article)[0]
								?.TypeOfMaterial
						: "Item not defined",
				total_QTY: i.total_palletQTY ?? "0.00",
				available_QTY: i.available_QTY ?? "0.00",
				coa_QTY: i.coa_QTY ?? "0.00",
				held_QTY: i.held_QTY ?? "0.00",
				consignment_QTY: i.consignment_qty ?? "0.00",
				lot_Number: i.lot ?? "0",
				locationId: i.locationId ?? "0",
				location: i.laneDescription ?? "Missing lane",
				whseId: i.warehouseId ?? "0",
				whseName: i.warehouseDescription ?? "Missing warehouse",
			};
		});

		const { data: dataImport, error: errorImport } = await tryCatch(
			db.insert(invHistoricalData).values(importData),
		);

		if (errorImport) {
			return returnFunc({
				success: false,
				level: "error",
				module: "logistics",
				subModule: "inv",
				message: `Error adding historical data to lst db`,
				data: errorImport as any,
				notify: true,
			});
		}

		if (dataImport) {
			return returnFunc({
				success: true,
				level: "info",
				module: "logistics",
				subModule: "inv",
				message: `Historical data was added to lst :D`,
				data: [],
				notify: false,
			});
		}
	} else {
		return returnFunc({
			success: false,
			level: "info",
			module: "logistics",
			subModule: "inv",
			message: `Historical Data for: ${format(today, "yyyy-MM-dd")}, is already added and nothing to do.`,
			data: [],
			notify: false,
		});
	}

	return returnFunc({
		success: false,
		level: "info",
		module: "logistics",
		subModule: "inv",
		message: `An unexpected error happened and didn't get captured during the historical inv check.`,
		data: [],
		notify: true,
	});
};

export const historicalSchedule = async () => {
	// run the history now in case an update happens around the shift change time; this prevents losing data. It might be off a little, but that's fine.
	historicalInvImport();

	const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;

	if (!sqlQuery.success) {
		return returnFunc({
			success: false,
			level: "error",
			module: "logistics",
			subModule: "query",
			message: `Error getting shiftChange sql file`,
			data: [sqlQuery.message],
			notify: false,
		});
	}

	const { data, error } = await tryCatch(
		prodQuery(sqlQuery.query, "Shift Change data"),
	);

	if (error) {
		return returnFunc({
			success: false,
			level: "error",
			module: "logistics",
			subModule: "query",
			message: `Error getting shiftChange info`,
			data: error as any,
			notify: false,
		});
	}

	// shift split
	const shiftTimeSplit = data?.data[0]?.shiftChange.split(":");

	const cronSetup = `0 ${
		shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[1])}` : "0"
	} ${
		shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
	} * * *`;

	createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
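Editor's sketch: the inline `cronSetup` template string in `historicalSchedule` is dense to read. The same construction as a standalone helper — the function name is hypothetical, and it assumes the six-field, seconds-first cron pattern (`sec min hour day month weekday`) that Croner accepts, with the same 07:00 fallback as the code above:

```typescript
// Build a "sec min hour day month weekday" cron string from an "HH:mm:ss"
// shift-change time; fall back to 07:00 when no time is available.
const shiftCron = (shiftChange?: string): string => {
	const parts = shiftChange?.split(":") ?? [];
	const hour = parts.length > 0 ? `${parseInt(parts[0], 10)}` : "7";
	const minute = parts.length > 1 ? `${parseInt(parts[1], 10)}` : "0";
	return `0 ${minute} ${hour} * * *`;
};

console.log(shiftCron("06:30:00")); // "0 30 6 * * *"
console.log(shiftCron(undefined)); // "0 0 7 * * *"
```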
@@ -62,7 +62,7 @@ export const printerSync = async () => {
 		});
 	}
 
-	if (printers?.success) {
+	if (printers?.success && Array.isArray(printers.data)) {
 		const ignorePrinters = ["pdf24", "standard"];
 
 		const validPrinters =
@@ -1,6 +1,6 @@
 use AlplaPROD_test1
 
-SELECT V_Artikel.IdArtikelvarianten,
+SELECT V_Artikel.IdArtikelvarianten as article,
 	V_Artikel.Bezeichnung,
 	V_Artikel.ArtikelvariantenTypBez,
 	V_Artikel.PreisEinheitBez,
backend/prodSql/queries/datamart.activeArticles2.sql (new file, 43 lines)
@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
Make a note in the docs that activeArticles will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]

select a.Id,
	a.HumanReadableId as av,
	a.Alias as alias,
	p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
	p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
	p.LoadingUnitPieces,
	case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
	case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a

/* sales price */
left join
	(select *
	from (select
		id,
		PackagingId,
		ArticleId,
		DefaultCustomer,
		ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
	from masterData.SalesPrice (nolock)
	where DefaultCustomer = 1) as x
	where RowNum = 1
	) as s
	on a.id = s.ArticleId

/* pkg instructions */
left join
	masterData.PackagingInstruction (nolock) as p
	on s.PackagingId = p.id

/* stock limits */
left join
	masterData.StockLimit (nolock) as i
	on a.id = i.ArticleId

where a.active = 1
	and a.HumanReadableId in ([articles])
backend/prodSql/queries/datamart.customerInventory.sql (new file, 45 lines)
@@ -0,0 +1,45 @@
select x.idartikelVarianten as av
	,ArtikelVariantenAlias as Alias
	--x.Lfdnr as RunningNumber,
	--,round(sum(EinlagerungsMengeVPKSum),0) as Total_Pallets
	--,sum(EinlagerungsMengeSum) as Total_PalletQTY
	,round(sum(VerfuegbareMengeVPKSum),0) as Avalible_Pallets
	,sum(VerfuegbareMengeSum) as Avaliable_PalletQTY
	,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as COA_Pallets
	,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as COA_QTY
	--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as Held_Pallets
	--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeSum else 0 end) as Held_QTY
	,IdProdPlanung as Lot
	--,IdAdressen
	--,x.AdressBez
	--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x

left join
	[AlplaPROD_test1].dbo.T_EtikettenGedruckt (nolock) on
	x.Lfdnr = T_EtikettenGedruckt.Lfdnr AND T_EtikettenGedruckt.Lfdnr > 1

left join
	(SELECT *
	FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) where Active = 1) as c
	on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over.
IdAdressen = 3
*/
where
	--IdArtikelTyp = 1
	x.IdWarenlager not in (6, 1)
	--and IdAdressen
	--and x.IdWarenlager in (0)

group by x.IdArtikelVarianten
	,ArtikelVariantenAlias
	,IdProdPlanung
	--,c.Description
	,IdAdressen
	,x.AdressBez
	--, x.Lfdnr
order by x.IdArtikelVarianten
backend/prodSql/queries/datamart.deliveryByDateRange.sql (new file, 74 lines)
@@ -0,0 +1,74 @@
use [test1_AlplaPROD2.0_Read]

DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
SELECT
	r.[ArticleHumanReadableId]
	,[ReleaseNumber]
	,h.CustomerOrderNumber
	,x.CustomerLineItemNumber
	,[CustomerReleaseNumber]
	,[ReleaseState]
	,[DeliveryState]
	,ea.JournalNummer as BOL_Number
	,[ReleaseConfirmationState]
	,[PlanningState]
	,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
	--,r.[OrderDate]
	,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
	--,r.[DeliveryDate]
	,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
	--,r.[LoadingDate]
	,[Quantity]
	,[DeliveredQuantity]
	,r.[AdditionalInformation1]
	,r.[AdditionalInformation2]
	,[TradeUnits]
	,[LoadingUnits]
	,[Trucks]
	,[LoadingToleranceType]
	,[SalesPrice]
	,[Currency]
	,[QuantityUnit]
	,[SalesPriceRemark]
	,r.[Remark]
	,[Irradiated]
	,r.[CreatedByEdi]
	,[DeliveryAddressHumanReadableId]
	,DeliveryAddressDescription
	,[CustomerArtNo]
	,[TotalPrice]
	,r.[ArticleAlias]

FROM [order].[Release] (nolock) as r

left join
	[order].LineItem as x on
	r.LineItemId = x.id

left join
	[order].Header as h on
	x.HeaderId = h.id

--bol stuff
left join
	AlplaPROD_test1.dbo.V_LadePlanungenLadeAuftragAbruf (nolock) as zz
	on zz.AbrufIdAuftragsAbruf = r.ReleaseNumber

left join
	(select * from (SELECT
	ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
	,*
	FROM [AlplaPROD_test1].[dbo].[T_Lieferungen] (nolock)) x

	where RowNum = 1) as ea on
	zz.IdLieferschein = ea.IdJournal

where
	--r.ReleaseNumber = 1452

	r.DeliveryDate between @StartDate AND @EndDate
	and DeliveredQuantity > 0
	--and r.ArticleHumanReadableId in ([articles])
	--and Journalnummer = 169386
backend/prodSql/queries/datamart.fakeEDIUpdate.sql (new file, 29 lines)
@@ -0,0 +1,29 @@
use [test1_AlplaPROD2.0_Read]

select
	customerartno as CustomerArticleNumber
	,h.CustomerOrderNumber as CustomerOrderNumber
	,l.CustomerLineItemNumber as CustomerLineNumber
	,r.CustomerReleaseNumber as CustomerRealeaseNumber
	,r.Quantity
	,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as DeliveryDate
	,h.CustomerHumanReadableId as CustomerID
	,r.Remark
	--,*
from [order].[Release] as r (nolock)

left join
	[order].LineItem as l (nolock) on
	l.id = r.LineItemId

left join
	[order].Header as h (nolock) on
	h.id = l.HeaderId

WHERE releaseState not in (1, 2, 3, 4)
	AND h.CreatedByEdi = 1
	AND r.deliveryDate < getdate() + 1
	--AND h.CustomerHumanReadableId in (0)

order by r.deliveryDate
backend/prodSql/queries/datamart.forecast.sql (new file, 8 lines)
@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
	,ArticleHumanReadableId
	,CustomerArticleNumber
	,ArticleDescription
	,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate
backend/prodSql/queries/datamart.inventory.sql (new file, 58 lines)
@@ -0,0 +1,58 @@
use [test1_AlplaPROD2.0_Read]

select
	ArticleHumanReadableId as article
	,ArticleAlias as alias
	,round(sum(QuantityLoadingUnits),2) total_pallets
	,round(sum(Quantity),2) as total_palletQTY
	,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
	,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
	,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
	,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
	,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
	,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
	,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
	,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
	--,l.RunningNumber

	/** datamart include lot number **/
	--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot

	/** data mart include location data **/
	--,l.WarehouseDescription,l.LaneDescription

	,articleTypeName

FROM [warehousing].[WarehouseUnit] as l (nolock)
left join
	(
	SELECT [Id]
		,[HumanReadableId]
		,d.[Description]
		,[DefectGroupId]
		,[IsActive]
	FROM [blocking].[BlockingDefect] as g (nolock)

	left join
		[AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
		d.IdGlobalBlockingDefect = g.HumanReadableId
	) as b on
	b.id = l.MainDefectId

left join
	[warehousing].[warehouse] as w (nolock) on
	w.id = l.warehouseid

where LaneHumanReadableId not in (20000,21000)
group by ArticleHumanReadableId,
	ArticleAlias,
	ArticleTypeName
	--,l.RunningNumber

	/** datamart include lot number **/
	--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber

	/** data mart include location data **/
	--,l.WarehouseDescription,l.LaneDescription

order by ArticleHumanReadableId
backend/prodSql/queries/datamart.legacy.inventory.sql (new file, 48 lines)
@@ -0,0 +1,48 @@
select
	x.idartikelVarianten as article,
	x.ArtikelVariantenAlias as alias
	--x.Lfdnr as RunningNumber,
	,round(sum(EinlagerungsMengeVPKSum),2) as total_pallets
	,sum(EinlagerungsMengeSum) as total_palletQTY
	,round(sum(VerfuegbareMengeVPKSum),0) as available_Pallets
	,sum(VerfuegbareMengeSum) as available_QTY
	,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as coa_Pallets
	,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as coa_QTY
	,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeVPKSum else 0 end) as held_Pallets
	,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeSum else 0 end) as held_QTY
	,sum(case when x.WarenLagerLagerTyp = 8 then VerfuegbareMengeSum else 0 end) as consignment_qty
	,IdProdPlanung as lot
	----,IdAdressen,
	,x.AdressBez
	,x.IdLagerAbteilung as locationId
	,x.LagerAbteilungKurzBez as laneDescription
	,x.IdWarenlager as warehouseId
	,x.WarenLagerKurzBez as warehouseDescription
	--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x

left join
	[AlplaPROD_test1].dbo.T_EtikettenGedruckt as l (nolock) on
	x.Lfdnr = l.Lfdnr AND l.Lfdnr > 1

left join
	(SELECT *
	FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] where Active = 1) as c
	on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over.
IdAdressen = 3
*/
where /*IdArtikelTyp = 1 and */x.IdWarenlager not in (6, 1)

group by x.idartikelVarianten, ArtikelVariantenAlias, c.Description
	--,IdAdressen
	,x.AdressBez
	,IdProdPlanung
	,x.IdLagerAbteilung
	,x.LagerAbteilungKurzBez
	,x.IdWarenlager
	,x.WarenLagerKurzBez
	--, x.Lfdnr
order by x.IdArtikelVarianten
backend/prodSql/queries/datamart.openOrders.sql (new file, 33 lines)
@@ -0,0 +1,33 @@
use [test1_AlplaPROD2.0_Read]

select
	customerartno
	,r.ArticleHumanReadableId as article
	,r.ArticleAlias as articleAlias
	,ReleaseNumber
	,h.CustomerOrderNumber as header
	,l.CustomerLineItemNumber as lineItem
	,r.CustomerReleaseNumber as releaseNumber
	,r.LoadingUnits
	,r.Quantity
	,r.TradeUnits
	,h.CustomerHumanReadableId
	,r.DeliveryAddressDescription
	,format(r.LoadingDate, 'MM/dd/yyyy HH:mm') as loadingDate
	,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as deliveryDate
	,r.Remark
	--,*
from [order].[Release] as r (nolock)

left join
	[order].LineItem as l (nolock) on
	l.id = r.LineItemId

left join
	[order].Header as h (nolock) on
	h.id = l.HeaderId

WHERE releasestate not in (1, 2, 4)
	AND r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]

order by r.deliverydate
backend/prodSql/queries/datamart.productionData.sql (new file, 19 lines)
@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]

declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'

select MachineLocation,
	ArticleHumanReadableId as article,
	sum(Quantity) as Produced,
	count(Quantity) as palletsProdued,
	FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
	ProductionLotHumanReadableId as productionLot

from [reporting_productionControlling].[ScannedUnit] (nolock)

where convert(date, ProductionDay) between @startDate and @endDate
	and ArticleHumanReadableId in ([articles])
	and BookedOut is null

group by MachineLocation, ArticleHumanReadableId,ProductionDay, ProductionLotHumanReadableId
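Editor's sketch: the datamart queries above use bracketed placeholders such as `[startDate]`, `[endDate]`, and `[articles]`, and the backend fills them with `query.replace` before running them. A generic substitution helper could look like this — the function is illustrative, not the repo's actual implementation.

```typescript
// Replace [key] placeholders with values from an options map; unknown
// placeholders are left untouched so a missing option stays visible.
const fillPlaceholders = (
	query: string,
	options: Record<string, string>,
): string =>
	query.replace(/\[(\w+)\]/g, (match: string, key: string) => options[key] ?? match);

const sqlText = "where d between '[startDate]' and '[endDate]'";
console.log(fillPlaceholders(sqlText, { startDate: "2025-01-01", endDate: "2025-01-31" }));
// where d between '2025-01-01' and '2025-01-31'
```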
@@ -1,5 +1,10 @@
-use [test1_AlplaPROD2.0_Read]
-
+use AlplaPROD_test1
+/**
+
+move this over to the delivery date range query once we have the shift data mapped over correctly.
+
+update the psi stuff on this as well.
+**/
 DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
 DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
 SELECT
@@ -66,9 +71,9 @@ ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
 	zz.IdLieferschein = ea.IdJournal
 
 where
-	--r.ArticleHumanReadableId in ([articles])
+	r.ArticleHumanReadableId in ([articles])
 	--r.ReleaseNumber = 1452
 
-	r.DeliveryDate between @StartDate AND @EndDate
-	and DeliveredQuantity > 0
+	and r.DeliveryDate between @StartDate AND @EndDate
+	--and DeliveredQuantity > 0
 	--and Journalnummer = 169386
backend/prodSql/queries/datamart.psiPlanningData.sql (new file, 32 lines)
@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over, as well as the date structure we want to see
*/

select x.IdArtikelvarianten As Article,
	ProduktionAlias as Description,
	standort as MachineId,
	MaschinenBezeichnung as MachineName,
	--MaschZyklus as PlanningCycleTime,
	x.IdProdPlanung as LotNumber,
	FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
	x.planMenge as TotalPlanned,
	ProduktionMenge as QTYPerDay,
	round(ProduktionMengeVPK, 2) PalDay,
	Status as finished
	--MaschStdAuslastung as nee

from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x

left join
	dbo.V_ProdPlanung (nolock) as p on
	x.IdProdPlanung = p.IdProdPlanung

where ProdTag between @start_date and @end_date
	and p.IdArtikelvarianten in ([articles])
	--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
	--and IdProdPlanung = 18442

order by ProdTag desc
backend/prodSql/queries/featureCheck.sql (new file, 11 lines)
@@ -0,0 +1,11 @@
SELECT count(*) as activated
FROM [test1_AlplaPROD2.0_Read].[support].[FeatureActivation]

where feature in (108,7)


/*
As more features get activated and new endpoints need this check, add their ids here.
108 = waste
7 = warehousing
*/
4  backend/prodSql/queries/shiftChange.sql  Normal file
@@ -0,0 +1,4 @@
+select top(1) convert(varchar(8) ,
+convert(time,startdate), 108) as shiftChange
+from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
+where teamNumber = 1
@@ -45,7 +45,7 @@ export const monitorAlplaPurchase = async () => {
   }
 
   if (purchaseMonitor[0]?.active) {
-    createCronJob("purchaseMonitor", "0 */5 * * * *", async () => {
+    createCronJob("purchaseMonitor", "0 5 * * * *", async () => {
       try {
         const result = await prodQuery(
           sqlQuery.query.replace(
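The schedule change above is easy to misread: with six-field cron strings (seconds first, as this codebase's `createCronJob` comments describe), `0 */5 * * * *` fires every five minutes while the new `0 5 * * * *` fires once an hour at minute 5. A minimal sketch of the minute-field semantics; `matchesMinute` and `firingsPerHour` are illustrative helpers, not Croner's actual parser:

```typescript
// Minimal sketch of cron minute-field matching (illustrative only):
// "*/5" matches every 5th minute, "5" matches only minute 5.
const matchesMinute = (field: string, minute: number): boolean => {
  if (field === "*") return true;
  if (field.startsWith("*/")) {
    const step = Number(field.slice(2));
    return minute % step === 0;
  }
  return Number(field) === minute;
};

// Count how many minutes in an hour the field matches.
const firingsPerHour = (field: string): number =>
  Array.from({ length: 60 }, (_, m) => m).filter((m) =>
    matchesMinute(field, m),
  ).length;

console.log(firingsPerHour("*/5")); // 12 runs per hour
console.log(firingsPerHour("5")); // 1 run per hour
```

So this commit cuts the purchaseMonitor query from 12 runs per hour to one.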
@@ -1,5 +1,5 @@
 import type { Express } from "express";
+import { setupAdminRoutes } from "./admin/admin.routes.js";
 import { setupAuthRoutes } from "./auth/auth.routes.js";
 // import the routes and route setups
 import { setupApiDocsRoutes } from "./configs/scaler.config.js";
@@ -10,11 +10,13 @@ import { setupOCPRoutes } from "./ocp/ocp.routes.js";
 import { setupOpendockRoutes } from "./opendock/opendock.routes.js";
 import { setupProdSqlRoutes } from "./prodSql/prodSql.routes.js";
 import { setupSystemRoutes } from "./system/system.routes.js";
+import { setupTCPRoutes } from "./tcpServer/tcp.routes.js";
 import { setupUtilsRoutes } from "./utils/utils.routes.js";
 
 export const setupRoutes = (baseUrl: string, app: Express) => {
   //routes that are on by default
   setupSystemRoutes(baseUrl, app);
+  setupAdminRoutes(baseUrl, app);
   setupApiDocsRoutes(baseUrl, app);
   setupProdSqlRoutes(baseUrl, app);
   setupGPSqlRoutes(baseUrl, app);
@@ -24,4 +26,5 @@ export const setupRoutes = (baseUrl: string, app: Express) => {
   setupOpendockRoutes(baseUrl, app);
   setupNotificationRoutes(baseUrl, app);
   setupOCPRoutes(baseUrl, app);
+  setupTCPRoutes(baseUrl, app);
 };
@@ -6,6 +6,7 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
 import { type Setting, settings } from "./db/schema/settings.schema.js";
 import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
 import { createLogger } from "./logger/logger.controller.js";
+import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
 import { startNotifications } from "./notification/notification.controller.js";
 import { createNotifications } from "./notification/notifications.master.js";
 import { printerSync } from "./ocp/ocp.printer.manage.js";
@@ -14,6 +15,7 @@ import { opendockSocketMonitor } from "./opendock/opendockSocketMonitor.utils.js
 import { connectProdSql } from "./prodSql/prodSqlConnection.controller.js";
 import { monitorAlplaPurchase } from "./purchase/purchase.controller.js";
 import { setupSocketIORoutes } from "./socket.io/serverSetup.js";
+import { serversChecks } from "./system/serverData.controller.js";
 import { baseSettingValidationCheck } from "./system/settingsBase.controller.js";
 import { startTCPServer } from "./tcpServer/tcp.server.js";
 import { createCronJob } from "./utils/croner.utils.js";
@@ -64,10 +66,12 @@ const start = async () => {
       dbCleanup("jobs", 30),
     );
     createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
+    historicalSchedule();
 
     // one shots only needed to run on server startups
     createNotifications();
     startNotifications();
+    serversChecks();
   }, 5 * 1000);
 
 process.on("uncaughtException", async (err) => {
@@ -9,7 +9,7 @@ type RoomDefinition<T = unknown> = {
 
 export const protectedRooms: any = {
   logs: { requiresAuth: true, role: ["admin", "systemAdmin"] },
-  admin: { requiresAuth: true, role: ["admin", "systemAdmin"] },
+  //admin: { requiresAuth: false, role: ["admin", "systemAdmin"] },
 };
 
 export const roomDefinition: Record<RoomId, RoomDefinition> = {
@@ -36,4 +36,16 @@ export const roomDefinition: Record<RoomId, RoomDefinition> = {
       return [];
     },
   },
+  admin: {
+    seed: async (limit) => {
+      console.info(limit);
+      return [];
+    },
+  },
+  "admin:build": {
+    seed: async (limit) => {
+      console.info(limit);
+      return [];
+    },
+  },
 };
@@ -88,14 +88,12 @@ export const setupSocketIORoutes = (baseUrl: string, server: HttpServer) => {
         });
       }
 
-      const roles = Array.isArray(config.role) ? config.role : [config.role];
-
-      console.log(roles, s.user.role);
+      const roles = Array.isArray(config?.role) ? config?.role : [config?.role];
 
       //if (config?.role && s.user?.role !== config.role) {
       if (config?.role && !roles.includes(s.user?.role)) {
         return s.emit("room-error", {
-          room: rn,
+          roomId: rn,
           message: `Not authorized to be in room: ${rn}`,
         });
       }
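The `roles` normalization in the hunk above lets a room config declare either a single role string or an array of roles; wrapping the scalar in an array makes one `includes` check cover both shapes. A small sketch of the pattern (the `RoleConfig` type and `isAuthorized` helper are illustrative, not identifiers from this codebase):

```typescript
// Sketch of the role-normalization pattern: accept a single role string
// or an array of roles, and run one membership check either way.
type RoleConfig = { role?: string | string[] };

const isAuthorized = (config: RoleConfig, userRole?: string): boolean => {
  // No role requirement configured -> allow (mirrors the `config?.role &&` guard).
  if (!config.role) return true;
  // Wrap a single role string so one `includes` check covers both shapes.
  const roles = Array.isArray(config.role) ? config.role : [config.role];
  return userRole !== undefined && roles.includes(userRole);
};

console.log(isAuthorized({ role: ["admin", "systemAdmin"] }, "admin")); // true
console.log(isAuthorized({ role: "admin" }, "viewer")); // false
console.log(isAuthorized({}, undefined)); // true (no restriction)
```

Note that without the `config?.role &&` guard, a missing role would normalize to `[undefined]` and every check would fail, which is why the guard stays in place.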
@@ -1 +1 @@
-export type RoomId = "logs" | "labels"; //| "alerts" | "metrics";
+export type RoomId = "logs" | "labels" | "admin" | "admin:build"; //| "alerts" | "metrics";
154  backend/system/serverData.controller.ts  Normal file
@@ -0,0 +1,154 @@
+import { sql } from "drizzle-orm";
+import { db } from "../db/db.controller.js";
+import {
+  type NewServerData,
+  serverData,
+} from "../db/schema/serverData.schema.js";
+import { createLogger } from "../logger/logger.controller.js";
+import { tryCatch } from "../utils/trycatch.utils.js";
+
+const servers: NewServerData[] = [
+  {
+    name: "Test server 1",
+    server: "USMCD1VMS036",
+    plantToken: "test3",
+    idAddress: "10.193.0.56",
+    greatPlainsPlantCode: "00",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Test server 2",
+    server: "USIOW1VMS036",
+    plantToken: "test2",
+    idAddress: "10.75.0.56",
+    greatPlainsPlantCode: "00",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Lima",
+    server: "USLIM1VMS006",
+    plantToken: "uslim1",
+    idAddress: "10.53.0.26",
+    greatPlainsPlantCode: "50",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Houston",
+    server: "ushou1VMS006",
+    plantToken: "ushou1",
+    idAddress: "10.195.0.26",
+    greatPlainsPlantCode: "20",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Dayton",
+    server: "usday1VMS006",
+    plantToken: "usday1",
+    idAddress: "10.44.0.56",
+    greatPlainsPlantCode: "80",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "West Bend",
+    server: "usweb1VMS006",
+    plantToken: "usweb1",
+    idAddress: "10.80.0.26",
+    greatPlainsPlantCode: "65",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Jeff City",
+    server: "usjci1VMS006",
+    plantToken: "usjci",
+    idAddress: "10.167.0.26",
+    greatPlainsPlantCode: "40",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "Sherman",
+    server: "usshe1vms006",
+    plantToken: "usshe1",
+    idAddress: "10.205.0.26",
+    greatPlainsPlantCode: "21",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+  {
+    name: "McDonough",
+    server: "USMCD1VMS006",
+    plantToken: "usmcd1",
+    idAddress: "10.193.0.26",
+    greatPlainsPlantCode: "10",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 82,
+  },
+  {
+    name: "St. Peters",
+    server: "USTP1VMS006",
+    plantToken: "usstp1",
+    idAddress: "10.37.0.26",
+    greatPlainsPlantCode: "45",
+    contactEmail: "",
+    contactPhone: "",
+    serverLoc: "D$\\LST_V3",
+    buildNumber: 1,
+  },
+];
+
+export const serversChecks = async () => {
+  const log = createLogger({ module: "system", subModule: "serverData" });
+  const { data, error } = await tryCatch(
+    db
+      .insert(serverData)
+      .values(servers)
+      .onConflictDoUpdate({
+        target: serverData.plantToken,
+        set: {
+          server: sql`excluded.server`,
+          name: sql`excluded.name`,
+          idAddress: sql`excluded."id_address"`,
+          greatPlainsPlantCode: sql`excluded.great_plains_plant_code`,
+          contactEmail: sql`excluded."contact_email"`,
+          contactPhone: sql`excluded.contact_phone`,
+          serverLoc: sql`excluded.server_loc`,
+        },
+      })
+      .returning(),
+  );
+
+  if (error) {
+    log.error(
+      { error: error },
+      "There was an error when adding or updating the servers.",
+    );
+  }
+
+  if (data) {
+    log.info({}, "All Servers were added/updated");
+  }
+};
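`onConflictDoUpdate` with `target: serverData.plantToken` makes `serversChecks` an upsert: a new plant token inserts, an existing one updates only the columns listed in `set`, so `buildNumber` (absent from `set`) keeps whatever value is already stored. An in-memory analogue of that behavior, for intuition only (not drizzle's API; the `ServerRow` type and `upsertByPlantToken` helper are hypothetical):

```typescript
// In-memory analogue of the onConflictDoUpdate upsert keyed on plantToken.
type ServerRow = { plantToken: string; name: string; buildNumber: number };

const upsertByPlantToken = (
  table: Map<string, ServerRow>,
  rows: ServerRow[],
): void => {
  for (const row of rows) {
    const existing = table.get(row.plantToken);
    if (existing) {
      // Conflict: take the incoming ("excluded") values for updatable columns,
      // but keep columns not in the `set` list (buildNumber here).
      table.set(row.plantToken, { ...row, buildNumber: existing.buildNumber });
    } else {
      table.set(row.plantToken, row);
    }
  }
};

const table = new Map<string, ServerRow>();
upsertByPlantToken(table, [{ plantToken: "usmcd1", name: "McDonough", buildNumber: 82 }]);
upsertByPlantToken(table, [{ plantToken: "usmcd1", name: "McDonough GA", buildNumber: 1 }]);
console.log(table.get("usmcd1")); // name updated, buildNumber stays 82
```

This is why re-running `serversChecks` on every startup is safe: it refreshes the static fields without clobbering each server's tracked build number.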
43  backend/system/serverData.route.ts  Normal file
@@ -0,0 +1,43 @@
+import { type Response, Router } from "express";
+import { db } from "../db/db.controller.js";
+import { serverData } from "../db/schema/serverData.schema.js";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+import { tryCatch } from "../utils/trycatch.utils.js";
+
+// export const updateSetting = async (setting: Setting) => {
+//   // TODO: when the setting is a feature setting we will need to have it run each kill switch on the crons well just stop them and during a reset it just wont start them
+//   // TODO: when the setting is a system we will need to force an app restart
+//   // TODO: when the setting is standard we don't do anything.
+// };
+
+const r = Router();
+
+r.get("/", async (_, res: Response) => {
+  const { data: sName, error: sError } = await tryCatch(
+    db.select().from(serverData).orderBy(serverData.name),
+  );
+
+  if (sError) {
+    return apiReturn(res, {
+      success: false,
+      level: "error",
+      module: "system",
+      subModule: "serverData",
+      message: `There was an error getting the servers `,
+      data: [sError],
+      status: 400,
+    });
+  }
+
+  return apiReturn(res, {
+    success: true,
+    level: "info",
+    module: "system",
+    subModule: "serverData",
+    message: `All current servers`,
+    data: sName ?? [],
+    status: 200,
+  });
+});
+
+export default r;
@@ -1,9 +1,12 @@
 import { Router } from "express";
+import { connected as gpSql } from "../gpSql/gpSqlConnection.controller.js";
+import { connected as prodSql } from "../prodSql/prodSqlConnection.controller.js";
 import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
 import {
   type SqlQuery,
   sqlQuerySelector,
 } from "../prodSql/prodSqlQuerySelector.utils.js";
+import { isServerRunning } from "../tcpServer/tcp.server.js";
 
 const router = Router();
 
@@ -24,7 +27,10 @@ router.get("/", async (_, res) => {
       ? sqlServerStats?.data[0].UptimeSeconds
       : [],
     eomFGPkgSheetVersion: 1, // this is the excel file version when we have a change to the macro we want to grab this
-    masterMacroFile: 1,
+    masterMacroFile: 1.1,
+    tcpServerOnline: isServerRunning,
+    sqlServerConnected: prodSql,
+    gpServerConnected: gpSql,
   });
 });
49  backend/system/system.mobileApp.ts  Normal file
@@ -0,0 +1,49 @@
+import fs from "node:fs";
+import { Router } from "express";
+import path from "path";
+import { fileURLToPath } from "url";
+
+const router = Router();
+
+const __filename = fileURLToPath(import.meta.url);
+const __dirname = path.dirname(__filename);
+
+const downloadDir = path.resolve(__dirname, "../../downloads/mobile");
+
+const currentApk = {
+  packageName: "net.alpla.lst.mobile",
+  versionName: "0.0.1-alpha",
+  versionCode: 1,
+  minSupportedVersionCode: 1,
+  fileName: "lst-mobile.apk",
+};
+
+router.get("/version", async (req, res) => {
+  const baseUrl = `${req.protocol}://${req.get("host")}`;
+
+  res.json({
+    packageName: currentApk.packageName,
+    versionName: currentApk.versionName,
+    versionCode: currentApk.versionCode,
+    minSupportedVersionCode: currentApk.minSupportedVersionCode,
+    downloadUrl: `${baseUrl}/lst/api/mobile/apk/latest`,
+  });
+});
+
+router.get("/apk/latest", (_, res) => {
+  const apkPath = path.join(downloadDir, currentApk.fileName);
+
+  if (!fs.existsSync(apkPath)) {
+    return res.status(404).json({ success: false, message: "APK not found" });
+  }
+
+  res.setHeader("Content-Type", "application/vnd.android.package-archive");
+  res.setHeader(
+    "Content-Disposition",
+    `attachment; filename="${currentApk.fileName}"`,
+  );
+
+  return res.sendFile(apkPath);
+});
+
+export default router;
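The `/version` payload above carries both `versionCode` and `minSupportedVersionCode`, which is enough for a client to tell an optional update apart from a forced one. A hypothetical client-side check (the `VersionInfo` type and `updateState` helper are not part of this repo):

```typescript
// Hypothetical consumer of the /version payload: an update is available when
// the server's versionCode is newer, and forced when the installed build
// falls below minSupportedVersionCode.
type VersionInfo = { versionCode: number; minSupportedVersionCode: number };

const updateState = (
  installed: number,
  info: VersionInfo,
): "current" | "optional" | "forced" => {
  if (installed < info.minSupportedVersionCode) return "forced";
  if (installed < info.versionCode) return "optional";
  return "current";
};

const info = { versionCode: 3, minSupportedVersionCode: 2 };
console.log(updateState(1, info)); // "forced"
console.log(updateState(2, info)); // "optional"
console.log(updateState(3, info)); // "current"
```

When the client decides to update, it follows the `downloadUrl` to `/apk/latest`, which streams the file with the `application/vnd.android.package-archive` content type set above.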
@@ -1,13 +1,17 @@
 import type { Express } from "express";
 import { requireAuth } from "../middleware/auth.middleware.js";
+import getServers from "./serverData.route.js";
 import getSettings from "./settings.route.js";
 import updSetting from "./settingsUpdate.route.js";
 import stats from "./stats.route.js";
+import mobile from "./system.mobileApp.js";
 
 export const setupSystemRoutes = (baseUrl: string, app: Express) => {
   //stats will be like this as we dont need to change this
   app.use(`${baseUrl}/api/stats`, stats);
+  app.use(`${baseUrl}/api/mobile`, mobile);
   app.use(`${baseUrl}/api/settings`, getSettings);
+  app.use(`${baseUrl}/api/servers`, getServers);
   app.use(`${baseUrl}/api/settings`, requireAuth, updSetting);
 
   // all other system should be under /api/system/*
14  backend/tcpServer/tcp.routes.ts  Normal file
@@ -0,0 +1,14 @@
+import type { Express } from "express";
+import { requireAuth } from "../middleware/auth.middleware.js";
+import restart from "./tcpRestart.route.js";
+import start from "./tcpStart.route.js";
+import stop from "./tcpStop.route.js";
+
+export const setupTCPRoutes = (baseUrl: string, app: Express) => {
+  //stats will be like this as we dont need to change this
+  app.use(`${baseUrl}/api/tcp/start`, requireAuth, start);
+  app.use(`${baseUrl}/api/tcp/stop`, requireAuth, stop);
+  app.use(`${baseUrl}/api/tcp/restart`, requireAuth, restart);
+
+  // all other system should be under /api/system/*
+};
@@ -3,13 +3,14 @@ import { eq } from "drizzle-orm";
 import { db } from "../db/db.controller.js";
 import { printerData } from "../db/schema/printers.schema.js";
 import { createLogger } from "../logger/logger.controller.js";
+import { delay } from "../utils/delay.utils.js";
 import { returnFunc } from "../utils/returnHelper.utils.js";
 import { tryCatch } from "../utils/trycatch.utils.js";
 import { type PrinterData, printerListen } from "./tcp.printerListener.js";
 
 let tcpServer: net.Server;
 const tcpSockets: Set<net.Socket> = new Set();
-//let isServerRunning = false;
+export let isServerRunning = false;
 
 const port = parseInt(process.env.TCP_PORT ?? "2222", 10);
 
@@ -39,9 +40,8 @@ const parseTcpAlert = (input: string) => {
     name,
   };
 };
-export const startTCPServer = () => {
-  const log = createLogger({ module: "tcp", submodule: "create_server" });
+const log = createLogger({ module: "tcp", submodule: "create_server" });
+export const startTCPServer = async () => {
   tcpServer = net.createServer(async (socket) => {
     tcpSockets.add(socket);
     socket.on("data", async (data: Buffer) => {
@@ -103,7 +103,7 @@ export const startTCPServer = () => {
     log.info({}, `TCP Server listening on port ${port}`);
   });
 
-  //isServerRunning = true;
+  isServerRunning = true;
   return returnFunc({
     success: true,
     level: "info",
@@ -115,3 +115,66 @@ export const startTCPServer = () => {
     room: "",
   });
 };
+
+export const stopTCPServer = async () => {
+  if (!isServerRunning)
+    return { success: false, message: "Server is not running" };
+  for (const socket of tcpSockets) {
+    socket.destroy();
+  }
+  tcpSockets.clear();
+  tcpServer.close(() => {
+    log.info({}, "TCP Server stopped");
+  });
+  isServerRunning = false;
+  return returnFunc({
+    success: true,
+    level: "info",
+    module: "tcp",
+    subModule: "create_server",
+    message: "TCP server stopped.",
+    data: [],
+    notify: false,
+    room: "",
+  });
+};
+
+export const restartTCPServer = async () => {
+  if (!isServerRunning) {
+    startTCPServer();
+    return returnFunc({
+      success: false,
+      level: "warn",
+      module: "tcp",
+      subModule: "create_server",
+      message: "Server is not running will try to start it",
+      data: [],
+      notify: false,
+      room: "",
+    });
+  } else {
+    for (const socket of tcpSockets) {
+      socket.destroy();
+    }
+    tcpSockets.clear();
+    tcpServer.close(() => {
+      log.info({}, "TCP Server stopped");
+    });
+    isServerRunning = false;
+
+    await delay(1500);
+
+    startTCPServer();
+  }
+
+  return returnFunc({
+    success: true,
+    level: "info",
+    module: "tcp",
+    subModule: "create_server",
+    message: "TCP server has been restarted.",
+    data: [],
+    notify: false,
+    room: "",
+  });
+};
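Both `stopTCPServer` and the restart path destroy every tracked socket before calling `tcpServer.close()`. That ordering matters: `net.Server.close()` only stops the listener from accepting new connections, and its callback does not fire until all existing connections end, so live sockets must be force-closed first. A minimal sketch of the socket-tracking pattern with stand-in sockets (illustrative; `FakeSocket` is not the real `net.Socket`):

```typescript
// Sketch of the track-then-destroy pattern: connections are held in a Set so
// a stop can force-close them before shutting the listener down.
type FakeSocket = { destroyed: boolean; destroy: () => void };

const makeSocket = (): FakeSocket => {
  const s: FakeSocket = {
    destroyed: false,
    destroy: () => {
      s.destroyed = true;
    },
  };
  return s;
};

const sockets = new Set<FakeSocket>();

const stop = (): void => {
  for (const socket of sockets) {
    socket.destroy(); // force-close live connections first
  }
  sockets.clear(); // then the real code calls tcpServer.close()
};

const a = makeSocket();
const b = makeSocket();
sockets.add(a);
sockets.add(b);
stop();
console.log(a.destroyed, b.destroyed, sockets.size); // true true 0
```

The `await delay(1500)` in the restart path presumably gives the closed listener time to release the port before `startTCPServer()` binds it again.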
19  backend/tcpServer/tcpRestart.route.ts  Normal file
@@ -0,0 +1,19 @@
+import { Router } from "express";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+import { restartTCPServer } from "./tcp.server.js";
+
+const r = Router();
+
+r.post("/restart", async (_, res) => {
+  const connect = await restartTCPServer();
+  apiReturn(res, {
+    success: connect.success,
+    level: connect.success ? "info" : "error",
+    module: "tcp",
+    subModule: "post",
+    message: "TCP Server has been restarted",
+    data: connect.data,
+    status: connect.success ? 200 : 400,
+  });
+});
+export default r;
20  backend/tcpServer/tcpStart.route.ts  Normal file
@@ -0,0 +1,20 @@
+import { Router } from "express";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+import { startTCPServer } from "./tcp.server.js";
+
+const r = Router();
+
+r.post("/start", async (_, res) => {
+  const connect = await startTCPServer();
+  apiReturn(res, {
+    success: connect.success,
+    level: connect.success ? "info" : "error",
+    module: "routes",
+    subModule: "prodSql",
+    message: connect.message,
+    data: connect.data,
+    status: connect.success ? 200 : 400,
+  });
+});
+
+export default r;
20  backend/tcpServer/tcpStop.route.ts  Normal file
@@ -0,0 +1,20 @@
+import { Router } from "express";
+import { apiReturn } from "../utils/returnHelper.utils.js";
+import { stopTCPServer } from "./tcp.server.js";
+
+const r = Router();
+
+r.post("/stop", async (_, res) => {
+  const connect = await stopTCPServer();
+  apiReturn(res, {
+    success: connect.success,
+    level: connect.success ? "info" : "error",
+    module: "routes",
+    subModule: "prodSql",
+    message: connect.message,
+    data: [],
+    status: connect.success ? 200 : 400,
+  });
+});
+
+export default r;
91  backend/utils/build.utils.ts  Normal file
@@ -0,0 +1,91 @@
+import { spawn } from "node:child_process";
+import { createLogger } from "../logger/logger.controller.js";
+import { emitToRoom } from "../socket.io/roomEmitter.socket.js";
+import { updateAppStats } from "./updateAppStats.utils.js";
+import { zipBuild } from "./zipper.utils.js";
+
+export const emitBuildLog = (message: string, level = "info") => {
+  const payload = {
+    type: "build",
+    level,
+    message,
+    timestamp: new Date().toISOString(),
+  };
+
+  //console.log(`[BUILD][${level.toUpperCase()}] ${message}`);
+
+  emitToRoom("admin:build", payload as any);
+  if (payload.level === "info") {
+    log.info({ stack: payload }, payload.message);
+  }
+
+  // if (log) {
+  //   log(payload);
+  // }
+};
+
+export let building = false;
+const log = createLogger({ module: "utils", subModule: "builds" });
+export const build = async () => {
+  const appDir = process.env.DEV_DIR ?? "";
+  return new Promise((resolve) => {
+    building = true;
+
+    updateAppStats({
+      lastUpdated: new Date(),
+      building: true,
+    });
+
+    emitBuildLog(`Starting build in: ${appDir}`);
+
+    const child = spawn("npm", ["run", "build"], {
+      cwd: appDir,
+      shell: true,
+    });
+
+    child.stdout.on("data", (data) => {
+      const lines = data.toString().split(/\r?\n/);
+      for (const line of lines) {
+        if (line.trim() !== "") {
+          emitBuildLog(line, "info");
+        }
+      }
+    });
+
+    child.stderr.on("data", (data) => {
+      const lines = data.toString().split(/\r?\n/);
+      for (const line of lines) {
+        if (line.trim() !== "") {
+          emitBuildLog(line, "error");
+        }
+      }
+    });
+
+    child.on("close", (code) => {
+      if (code === 0) {
+        emitBuildLog("Build completed successfully.", "info");
+        building = false;
+        zipBuild();
+        resolve(true);
+      } else {
+        building = false;
+        updateAppStats({
+          lastUpdated: new Date(),
+          building: false,
+        });
+        emitBuildLog(`Build failed with code ${code}`, "error");
+        //reject(new Error(`Build failed with code ${code}`));
+      }
+    });
+
+    child.on("error", (err) => {
+      building = false;
+      updateAppStats({
+        lastUpdated: new Date(),
+        building: false,
+      });
+      emitBuildLog(`Process error: ${err.message}`, "error");
+      // reject(err);
+    });
+  });
+};
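One caveat with the stdout/stderr handlers in `build.utils.ts`: each `data` chunk is split on `/\r?\n/` independently, so a chunk boundary that lands mid-line emits a partial line as its own log entry. A buffered splitter (illustrative only, not part of the file; `makeLineSplitter` is a made-up helper) carries the unterminated tail into the next chunk:

```typescript
// Buffered line splitter: keeps the trailing partial line from one chunk and
// prepends it to the next, so only complete lines are emitted.
const makeLineSplitter = (onLine: (line: string) => void) => {
  let tail = "";
  return (chunk: string): void => {
    const parts = (tail + chunk).split(/\r?\n/);
    tail = parts.pop() ?? ""; // last element is an unterminated partial line
    for (const line of parts) {
      if (line.trim() !== "") onLine(line);
    }
  };
};

const lines: string[] = [];
const feed = makeLineSplitter((l) => lines.push(l));
feed("Build star"); // chunk boundary lands mid-line
feed("ted\ndone\n");
console.log(lines); // ["Build started", "done"]
```

For short build logs the naive per-chunk split is usually fine; the buffered form only matters when lines routinely straddle chunk boundaries.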
@@ -9,6 +9,7 @@ export const allowedOrigins = [
   "http://localhost:4000",
   "http://localhost:4001",
   "http://localhost:5500",
+  "http://localhost:8081",
   "https://admin.socket.io",
   "https://electron-socket-io-playground.vercel.app",
   `${process.env.URL}`,
@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
 import { db } from "../db/db.controller.js";
 import { jobAuditLog } from "../db/schema/auditLog.schema.js";
 import { createLogger } from "../logger/logger.controller.js";
+import type { ReturnHelper } from "./returnHelper.utils.js";
 
 // example createJob
 // createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
 export const createCronJob = async (
   name: string,
   schedule: string, // cron string with a seconds field, IE: */5 * * * * * = every 5th second
-  task: () => Promise<void>, // what function are we passing over
+  task: () => Promise<void | ReturnHelper>, // what function are we passing over
   source = "unknown",
 ) => {
   // get the timezone based on the os timezone set
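Aside (not in the commit): the schedule string above is the 6-field variant with a leading seconds field. Real matching is done by the cron library; a tiny sketch of how a `*/n` seconds step behaves (helper name is hypothetical):

```typescript
// Hypothetical: does a "*" or "*/n" seconds field match a given second (0-59)?
const secondsFieldMatches = (field: string, second: number): boolean => {
  if (field === "*") return true;
  // "*/n" means every n-th second, i.e. seconds divisible by n.
  const step = field.startsWith("*/") ? Number(field.slice(2)) : Number.NaN;
  if (Number.isNaN(step) || step <= 0) return false;
  return second % step === 0;
};
```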
backend/utils/deployApp.ts (new file, 123 lines)
@@ -0,0 +1,123 @@
+import { spawn } from "node:child_process";
+import { eq, sql } from "drizzle-orm";
+import { db } from "../db/db.controller.js";
+import { serverData } from "../db/schema/serverData.schema.js";
+import { appStats } from "../db/schema/stats.schema.js";
+//import { createLogger } from "../logger/logger.controller.js";
+import { emitBuildLog } from "./build.utils.js";
+import { returnFunc } from "./returnHelper.utils.js";
+
+// const log = createLogger({ module: "utils", subModule: "deploy" });
+export let updating = false;
+
+const updateServerBuildNumber = async (token: string) => {
+  // get the current build
+  const buildNum = await db.select().from(appStats);
+
+  // update the build now
+  await db
+    .update(serverData)
+    .set({ buildNumber: buildNum[0]?.currentBuild, lastUpdated: sql`NOW()` })
+    .where(eq(serverData.plantToken, token));
+};
+
+export const runUpdate = ({
+  server,
+  destination,
+  token,
+}: {
+  server: string;
+  destination: string;
+  token: string;
+}) => {
+  return new Promise((resolve, reject) => {
+    updating = true;
+    const scriptPath = process.env.UPDATE_SCRIPT_PATH;
+    if (!scriptPath) {
+      return returnFunc({
+        success: false,
+        level: "error",
+        module: "utils",
+        subModule: "deploy",
+        message: "UPDATE_SCRIPT_PATH is not set; please make sure you have this set.",
+        data: [],
+        notify: true,
+        room: "admin",
+      });
+    }
+
+    const args = [
+      "-ExecutionPolicy",
+      "Bypass",
+      "-File",
+      scriptPath,
+      "-Server",
+      server,
+      "-Destination",
+      destination,
+      "-Token",
+      token,
+      "-ADM_USER",
+      process.env.DEV_USER ?? "",
+      "-ADM_PASSWORD",
+      process.env.DEV_PASSWORD ?? "",
+      "-AppDir",
+      process.env.DEV_DIR ?? "",
+    ];
+
+    emitBuildLog(`Starting update for ${server}`);
+
+    const child = spawn("powershell.exe", args, {
+      shell: false,
+    });
+
+    child.stdout.on("data", (data) => {
+      const lines = data.toString().split(/\r?\n/);
+      for (const line of lines) {
+        if (line.trim()) {
+          emitBuildLog(line);
+        }
+      }
+    });
+
+    child.stderr.on("data", (data) => {
+      const lines = data.toString().split(/\r?\n/);
+      for (const line of lines) {
+        if (line.trim()) {
+          emitBuildLog(line, "error");
+        }
+      }
+    });
+
+    child.on("close", (code) => {
+      if (code === 0) {
+        emitBuildLog(`Update completed for ${server}`);
+        updating = false;
+        updateServerBuildNumber(token);
+        resolve({
+          success: true,
+          message: `Update completed for ${server}`,
+          data: [],
+        });
+      } else {
+        emitBuildLog(`Update failed for ${server} (code ${code})`, "error");
+        updating = false;
+        reject({
+          success: false,
+          message: `Update failed for ${server} (code ${code})`,
+          data: [],
+        });
+      }
+    });
+
+    child.on("error", (err) => {
+      emitBuildLog(`Process error: ${err.message}`, "error");
+      updating = false;
+      reject({
+        success: false,
+        message: `${server}: Encountered an error while processing: ${err.message}`,
+        data: err,
+      });
+    });
+  });
+};
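The stdout and stderr handlers in deployApp.ts share one chunk-to-lines pattern. Sketched as a standalone pure function (the `splitLogLines` name is hypothetical; the split-and-trim logic mirrors the handlers):

```typescript
// Mirrors the spawn handlers: split a chunk on \r\n or \n and drop blank lines
// so each non-empty line can be emitted to the build log individually.
const splitLogLines = (chunk: string): string[] =>
  chunk.split(/\r?\n/).filter((line) => line.trim().length > 0);
```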
@@ -1,7 +1,7 @@
 import type { Response } from "express";
 import { createLogger } from "../logger/logger.controller.js";
 
-interface Data<T = unknown[]> {
+export interface ReturnHelper<T = unknown[]> {
   success: boolean;
   module:
     | "system"
@@ -13,32 +13,12 @@ interface Data<T = unknown[]> {
     | "notification"
     | "email"
     | "purchase"
-    | "tcp";
-  subModule:
-    | "logistics"
-    | "db"
-    | "labeling"
-    | "printer"
-    | "prodSql"
-    | "query"
-    | "sendmail"
-    | "auth"
-    | "datamart"
-    | "jobs"
-    | "apt"
-    | "settings"
-    | "get"
-    | "update"
-    | "delete"
-    | "post"
-    | "notification"
-    | "delete"
-    | "printing"
-    | "gpSql"
-    | "email"
-    | "gpChecks"
-    | "prodEndpoint"
-    | "create_server";
-  level: "info" | "error" | "debug" | "fatal";
+    | "tcp"
+    | "logistics"
+    | "admin";
+  subModule: string;
+
+  level: "info" | "error" | "debug" | "fatal" | "warn";
   message: string;
   room?: string;
   data?: T;
@@ -59,7 +39,7 @@ interface Data<T = unknown[]> {
  * data: [] the data that will be passed back
  * notify: false by default this is to send a notification to a users email to alert them of an issue.
  */
-export const returnFunc = (data: Data) => {
+export const returnFunc = (data: ReturnHelper) => {
   const notify = data.notify ? data.notify : false;
   const room = data.room ?? data.room;
   const log = createLogger({ module: data.module, subModule: data.subModule });
@@ -92,7 +72,7 @@ export const returnFunc = (data: Data) => {
 
 export function apiReturn(
   res: Response,
-  opts: Data & { status?: number },
+  opts: ReturnHelper & { status?: number },
   optional?: unknown, // leave this as unknown so we can pass an object or an array over.
 ): Response {
   const result = returnFunc(opts);
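Reviewer note on the unchanged context line `const room = data.room ?? data.room;`: `x ?? x` is a no-op, so this is equivalent to plain `data.room`; an actual fallback needs a different right-hand side. A small sketch of how `??` behaves with a default (names hypothetical):

```typescript
// Hypothetical: fall back to a default room only when none was supplied.
// ?? triggers only on null/undefined, not on other falsy values like "".
const resolveRoom = (room: string | undefined, fallback = "general"): string =>
  room ?? fallback;
```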
backend/utils/updateAppStats.utils.ts (new file, 17 lines)
@@ -0,0 +1,17 @@
+import { db } from "../db/db.controller.js";
+import { appStats } from "../db/schema/stats.schema.js";
+
+export const updateAppStats = async (
+  data: Partial<typeof appStats.$inferInsert>,
+) => {
+  await db
+    .insert(appStats)
+    .values({
+      id: "primary",
+      ...data,
+    })
+    .onConflictDoUpdate({
+      target: appStats.id,
+      set: data,
+    });
+};
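Not part of the diff: the insert-or-update (`onConflictDoUpdate`) behavior of `updateAppStats` against its single `primary` row can be sketched in memory with a Map standing in for the table (all names here are illustrative, not the real drizzle API):

```typescript
type Stats = { currentBuild?: number; building?: boolean };

// Hypothetical in-memory analogue of the single-row stats table.
const table = new Map<string, Stats>();

// Insert the "primary" row if missing, otherwise merge the partial update
// into the existing row - the same effect as the upsert in the new file.
const upsertStats = (data: Partial<Stats>): Stats => {
  const existing = table.get("primary") ?? {};
  const next = { ...existing, ...data };
  table.set("primary", next);
  return next;
};
```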
backend/utils/zipper.utils.ts (new file, 177 lines)
@@ -0,0 +1,177 @@
+import fs from "node:fs";
+import fsp from "node:fs/promises";
+import path from "node:path";
+import archiver from "archiver";
+import { createLogger } from "../logger/logger.controller.js";
+import { emitBuildLog } from "./build.utils.js";
+import { updateAppStats } from "./updateAppStats.utils.js";
+
+const log = createLogger({ module: "utils", subModule: "zip" });
+
+const exists = async (target: string) => {
+  try {
+    await fsp.access(target);
+    return true;
+  } catch {
+    return false;
+  }
+};
+
+const getNextBuildNumber = async (buildNumberFile: string) => {
+  if (!(await exists(buildNumberFile))) {
+    await fsp.writeFile(buildNumberFile, "1", "utf8");
+    return 1;
+  }
+
+  const raw = await fsp.readFile(buildNumberFile, "utf8");
+  const current = Number.parseInt(raw.trim(), 10);
+
+  if (Number.isNaN(current) || current < 1) {
+    await fsp.writeFile(buildNumberFile, "1", "utf8");
+    return 1;
+  }
+
+  const next = current + 1;
+
+  await fsp.writeFile(buildNumberFile, String(next), "utf8");
+
+  // update the server with the next build number
+  await updateAppStats({
+    currentBuild: next,
+    lastBuildAt: new Date(),
+    building: true,
+  });
+
+  return next;
+};
+
+const cleanupOldBuilds = async (buildFolder: string, maxBuilds: number) => {
+  const entries = await fsp.readdir(buildFolder, { withFileTypes: true });
+
+  const zipFiles: { fullPath: string; name: string; mtimeMs: number }[] = [];
+
+  for (const entry of entries) {
+    if (!entry.isFile()) continue;
+    if (!/^LSTV3-\d+\.zip$/i.test(entry.name)) continue;
+
+    const fullPath = path.join(buildFolder, entry.name);
+    const stat = await fsp.stat(fullPath);
+
+    zipFiles.push({
+      fullPath,
+      name: entry.name,
+      mtimeMs: stat.mtimeMs,
+    });
+  }
+
+  zipFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
+
+  const toRemove = zipFiles.slice(maxBuilds);
+
+  for (const file of toRemove) {
+    await fsp.rm(file.fullPath, { force: true });
+    emitBuildLog(`Removed old build: ${file.name}`);
+  }
+};
+
+export const zipBuild = async () => {
+  const appDir = process.env.DEV_DIR ?? "";
+  const maxBuilds = Number(process.env.MAX_BUILDS ?? 5);
+
+  if (!appDir) {
+    log.error({ notify: true }, "Forgot to add in the dev dir into the env");
+    return;
+  }
+
+  const includesFile = path.join(appDir, ".includes");
+  const buildNumberFile = path.join(appDir, ".buildNumber");
+  const buildFolder = path.join(appDir, "builds");
+  const tempFolder = path.join(appDir, "temp", "zip-temp");
+
+  if (!(await exists(includesFile))) {
+    log.error({ notify: true }, "Missing .includes file common");
+    return;
+  }
+
+  await fsp.mkdir(buildFolder, { recursive: true });
+
+  const buildNumber = await getNextBuildNumber(buildNumberFile);
+  const zipFileName = `LSTV3-${buildNumber}.zip`;
+  const zipFile = path.join(buildFolder, zipFileName);
+  // make the folders in case they are not created already
+  emitBuildLog(`Using build number: ${buildNumber}`);
+
+  if (await exists(tempFolder)) {
+    await fsp.rm(tempFolder, { recursive: true, force: true });
+  }
+
+  await fsp.mkdir(tempFolder, { recursive: true });
+
+  const includes = (await fsp.readFile(includesFile, "utf8"))
+    .split(/\r?\n/)
+    .map((line) => line.trim())
+    .filter(Boolean);
+
+  emitBuildLog(`Preparing zip from ${includes.length} include entries`);
+
+  for (const relPath of includes) {
+    const source = path.join(appDir, relPath);
+    const dest = path.join(tempFolder, relPath);
+
+    if (!(await exists(source))) {
+      emitBuildLog(`Skipping missing path: ${relPath}`, "error");
+      continue;
+    }
+
+    const stat = await fsp.stat(source);
+    await fsp.mkdir(path.dirname(dest), { recursive: true });
+
+    if (stat.isDirectory()) {
+      emitBuildLog(`Copying folder: ${relPath}`);
+      await fsp.cp(source, dest, { recursive: true });
+    } else {
+      emitBuildLog(`Copying file: ${relPath}`);
+      await fsp.copyFile(source, dest);
+    }
+  }
+
+  // if something crazy happens and we get the same build lets just reuse it
+  // if (await exists(zipFile)) {
+  //   await fsp.rm(zipFile, { force: true });
+  // }
+
+  emitBuildLog(`Creating zip: ${zipFile}`);
+
+  await new Promise<void>((resolve, reject) => {
+    const output = fs.createWriteStream(zipFile);
+    const archive = archiver("zip", { zlib: { level: 9 } });
+
+    output.on("close", () => resolve());
+    output.on("error", reject);
+    archive.on("error", reject);
+
+    archive.pipe(output);
+
+    // zip contents of temp folder, not temp folder itself
+    archive.directory(tempFolder, false);
+    archive.finalize();
+  });
+
+  await fsp.rm(tempFolder, { recursive: true, force: true });
+
+  emitBuildLog(`Zip completed successfully: ${zipFile}`);
+
+  await cleanupOldBuilds(buildFolder, maxBuilds);
+
+  await updateAppStats({
+    lastUpdated: new Date(),
+    building: false,
+  });
+
+  return {
+    success: true,
+    buildNumber,
+    zipFile,
+    zipFileName,
+  };
+};
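The retention rule in `cleanupOldBuilds` (sort newest-first by mtime, delete everything past `maxBuilds`) can be sketched as a pure function over plain data (the `pickZipsToRemove` name is hypothetical):

```typescript
type ZipInfo = { name: string; mtimeMs: number };

// Newest-first sort, then everything after the first maxBuilds entries
// is what cleanupOldBuilds would remove.
const pickZipsToRemove = (zips: ZipInfo[], maxBuilds: number): ZipInfo[] =>
  [...zips].sort((a, b) => b.mtimeMs - a.mtimeMs).slice(maxBuilds);
```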
@@ -1,37 +0,0 @@
-meta {
-  name: Login
-  type: http
-  seq: 1
-}
-
-post {
-  url: {{url}}/api/auth/sign-in/email
-  body: json
-  auth: inherit
-}
-
-headers {
-  Origin: http://localhost:3000
-}
-
-body:json {
-  {
-    "email": "blake.matthes@alpla.com",
-    "password": "nova0511"
-  }
-}
-
-script:post-response {
-  // // grab the raw Set-Cookie header
-  // const cookies = res.headers["set-cookie"];
-
-  // const sessionCookie = cookies[0].split(";")[0];
-
-  // // Save it as an environment variable
-  // bru.setEnvVar("session_cookie", sessionCookie);
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,35 +0,0 @@
-meta {
-  name: Register
-  type: http
-  seq: 2
-}
-
-post {
-  url: {{url}}/api/authentication/register
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "name":"Blake", // option when in the frontend as we will pass over as username if not added
-    "username": "matthes01",
-    "email": "blake.matthes@alpla.com",
-    "password": "nova0511"
-  }
-}
-
-script:post-response {
-  // // grab the raw Set-Cookie header
-  // const cookies = res.headers["set-cookie"];
-
-  // const sessionCookie = cookies[0].split(";")[0];
-
-  // // Save it as an environment variable
-  // bru.setEnvVar("session_cookie", sessionCookie);
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,8 +0,0 @@
-meta {
-  name: auth
-  seq: 5
-}
-
-auth {
-  mode: inherit
-}
@@ -1,16 +0,0 @@
-meta {
-  name: getSession
-  type: http
-  seq: 3
-}
-
-get {
-  url: {{url}}/api/auth/get-session
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,9 +0,0 @@
-{
-  "version": "1",
-  "name": "lst_v3",
-  "type": "collection",
-  "ignore": [
-    "node_modules",
-    ".git"
-  ]
-}
@@ -1,3 +0,0 @@
-docs {
-  All Api endpoints to the logistics support tool
-}
@@ -1,16 +0,0 @@
-meta {
-  name: Get queries
-  type: http
-  seq: 1
-}
-
-get {
-  url: {{url}}/api/datamart
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,20 +0,0 @@
-meta {
-  name: Run Query
-  type: http
-  seq: 2
-}
-
-get {
-  url: {{url}}/api/datamart/:name
-  body: none
-  auth: inherit
-}
-
-params:path {
-  name: activeArticles
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,8 +0,0 @@
-meta {
-  name: datamart
-  seq: 2
-}
-
-auth {
-  mode: inherit
-}
@@ -1,7 +0,0 @@
-vars {
-  url: http://uslim1vms006:3100/lst
-  readerIp: 10.44.14.215
-}
-vars:secret [
-  token
-]
@@ -1,20 +0,0 @@
-meta {
-  name: Get All notifications.
-  type: http
-  seq: 1
-}
-
-get {
-  url: {{url}}/api/notification
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
-
-docs {
-  Passing all as a query param will return all queries active and none active
-}
@@ -1,24 +0,0 @@
-meta {
-  name: Subscribe to notification
-  type: http
-  seq: 2
-}
-
-post {
-  url: {{url}}/api/notification/sub
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "userId":"m6AbQXFwOXoX3YKLfwWgq2LIdDqS5jqv",
-    "notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
-    "emails": ["blake.matthes@alpla.com","blake.matthes@alpla.com"]
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,8 +0,0 @@
-meta {
-  name: notifications
-  seq: 7
-}
-
-auth {
-  mode: inherit
-}
@@ -1,24 +0,0 @@
-meta {
-  name: remove sub notification
-  type: http
-  seq: 4
-}
-
-delete {
-  url: {{url}}/api/notification/sub
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "userId":"0kHd6Kkdub4GW6rK1qa1yjWwqXtvykqT",
-    "notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
-    "emails": ["blake.mattes@alpla.com"]
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,16 +0,0 @@
-meta {
-  name: subscriptions
-  type: http
-  seq: 5
-}
-
-get {
-  url: {{url}}/api/notification/sub
-  body: json
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,31 +0,0 @@
-meta {
-  name: update notification
-  type: http
-  seq: 6
-}
-
-patch {
-  url: {{url}}/api/notification/:id
-  body: json
-  auth: inherit
-}
-
-params:path {
-  id: 0399eb2a-39df-48b7-9f1c-d233cec94d2e
-}
-
-body:json {
-  {
-    "active" : true,
-    "options": []
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
-
-docs {
-  Passing all as a query param will return all queries active and none active
-}
@@ -1,24 +0,0 @@
-meta {
-  name: update sub notification
-  type: http
-  seq: 3
-}
-
-patch {
-  url: {{url}}/api/notification/sub
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "userId":"m6AbQXFwOXoX3YKLfwWgq2LIdDqS5jqv",
-    "notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
-    "emails": ["cowchmonkey@gmail.com"]
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,22 +0,0 @@
-meta {
-  name: Printer Listenter
-  type: http
-  seq: 1
-}
-
-post {
-  url: {{url}}/api/ocp/printer/listener/line_1
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "message":"xnvjdhhgsdfr"
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,8 +0,0 @@
-meta {
-  name: ocp
-  seq: 9
-}
-
-auth {
-  mode: inherit
-}
@@ -1,16 +0,0 @@
-meta {
-  name: GetApt
-  type: http
-  seq: 1
-}
-
-get {
-  url: {{url}}/api/opendock
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,16 +0,0 @@
-meta {
-  name: Sql Start
-  type: http
-  seq: 4
-}
-
-post {
-  url: {{url}}/api/system/prodsql/start
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,16 +0,0 @@
-meta {
-  name: Sql restart
-  type: http
-  seq: 4
-}
-
-post {
-  url: {{url}}/api/system/prodsql/restart
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,16 +0,0 @@
-meta {
-  name: Sql stop
-  type: http
-  seq: 4
-}
-
-post {
-  url: {{url}}/api/system/prodsql/stop
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,8 +0,0 @@
-meta {
-  name: prodSql
-  seq: 6
-}
-
-auth {
-  mode: inherit
-}
@@ -1,8 +0,0 @@
-meta {
-  name: rfidReaders
-  seq: 8
-}
-
-auth {
-  mode: inherit
-}
@@ -1,20 +0,0 @@
-meta {
-  name: reader
-  type: http
-  seq: 2
-}
-
-post {
-  url: https://usday1prod.alpla.net/lst/old/api/rfid/mgtevents/line3.1
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {}
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,20 +0,0 @@
-meta {
-  name: Config
-  type: http
-  seq: 2
-}
-
-get {
-  url: https://{{readerIp}}/cloud/config
-  body: none
-  auth: bearer
-}
-
-auth:bearer {
-  token: {{token}}
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,32 +0,0 @@
-meta {
-  name: Login
-  type: http
-  seq: 1
-}
-
-get {
-  url: https://{{readerIp}}/cloud/localRestLogin
-  body: none
-  auth: basic
-}
-
-auth:basic {
-  username: admin
-  password: Zebra123!
-}
-
-script:post-response {
-  const body = res.getBody();
-
-  if (body.message) {
-    bru.setEnvVar("token", body.message);
-  } else {
-    bru.setEnvVar("token", "error");
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
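The deleted reader Login request's post-response script captures the reader token with a simple fallback. That logic, sketched as a standalone function (the `extractToken` name is hypothetical; the Bruno script does the same thing via `bru.setEnvVar`):

```typescript
// Mirrors the deleted post-response script: use body.message as the token,
// otherwise store the sentinel string "error".
const extractToken = (body: { message?: string }): string =>
  body.message ? body.message : "error";
```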
@@ -1,237 +0,0 @@
-meta {
-  name: Update Config
-  type: http
-  seq: 3
-}
-
-put {
-  url: https://{{readerIp}}/cloud/config
-  body: json
-  auth: bearer
-}
-
-headers {
-  Content-Type: application/json
-}
-
-auth:bearer {
-  token: {{token}}
-}
-
-body:json {
-  {
-    "GPIO-LED": {
-      "GPODefaults": {
-        "1": "HIGH",
-        "2": "HIGH",
-        "3": "HIGH",
-        "4": "HIGH"
-      },
-      "LEDDefaults": {
-        "3": "GREEN"
-      },
-      "TAG_READ": [
-        {
-          "pin": 1,
-          "state": "HIGH",
-          "type": "GPO"
-        }
-      ]
-    },
-    "READER-GATEWAY": {
-      "batching": [
-        {
-          "maxPayloadSizePerReport": 256000,
-          "reportingInterval": 2000
-        },
-        {
-          "maxPayloadSizePerReport": 256000,
-          "reportingInterval": 2000
-        }
-      ],
-      "endpointConfig": {
-        "data": {
-          "event": {
-            "connections": [
-              {
-                "additionalOptions": {
-                  "retention": {
-                    "maxEventRetentionTimeInMin": 500,
-                    "maxNumEvents": 150000,
-                    "throttle": 100
-                  }
-                },
-                "description": "",
-                "name": "LST",
-                "options": {
-                  "URL": "https://usday1prod.alpla.net/lst/old/api/rfid/taginfo/line3.4",
-                  "security": {
-                    "CACertificateFileLocation": "",
-                    "authenticationOptions": {},
-                    "authenticationType": "NONE",
-                    "verifyHost": false,
-                    "verifyPeer": false
-                  }
-                },
-                "type": "httpPost"
-              },
-              {
-                "additionalOptions": {
-                  "retention": {
-                    "maxEventRetentionTimeInMin": 500,
-                    "maxNumEvents": 150000,
-                    "throttle": 100
-                  }
-                },
-                "description": "",
-                "name": "mgt",
-                "options": {
-                  "URL": "https://usday1prod.alpla.net/lst/old/api/rfid/mgtevents/line3.4",
-                  "security": {
-                    "CACertificateFileLocation": "",
-                    "authenticationOptions": {},
-                    "authenticationType": "NONE",
-                    "verifyHost": false,
-                    "verifyPeer": false
-                  }
-                },
-                "type": "httpPost"
-              }
-            ]
-          }
-        }
-      },
-      "managementEventConfig": {
-        "errors": {
-          "antenna": false,
-          "cpu": {
-            "reportIntervalInSec": 1800,
-            "threshold": 90
-          },
-          "database": true,
-          "flash": {
-            "reportIntervalInSec": 1800,
-            "threshold": 90
-          },
-          "ntp": true,
-          "radio": true,
-          "radio_control": true,
-          "ram": {
-            "reportIntervalInSec": 1800,
-            "threshold": 90
-          },
-          "reader_gateway": true,
-          "userApp": {
-            "reportIntervalInSec": 1800,
-            "threshold": 120
-          }
-        },
-        "gpiEvents": true,
-        "gpoEvents": true,
-        "heartbeat": {
-          "fields": {
-            "radio_control": [
-              "ANTENNAS",
-              "RADIO_ACTIVITY",
-              "RADIO_CONNECTION",
-              "CPU",
-              "RAM",
-              "UPTIME",
-              "NUM_ERRORS",
-              "NUM_WARNINGS",
-              "NUM_TAG_READS",
-              "NUM_TAG_READS_PER_ANTENNA",
-              "NUM_DATA_MESSAGES_TXED",
-              "NUM_RADIO_PACKETS_RXED"
-            ],
-            "reader_gateway": [
-              "NUM_DATA_MESSAGES_RXED",
-              "NUM_MANAGEMENT_EVENTS_TXED",
-              "NUM_DATA_MESSAGES_TXED",
-              "NUM_DATA_MESSAGES_RETAINED",
-              "NUM_DATA_MESSAGES_DROPPED",
-              "CPU",
-              "RAM",
-              "UPTIME",
-              "NUM_ERRORS",
-              "NUM_WARNINGS",
-              "INTERFACE_CONNECTION_STATUS",
-              "NOLOCKQ_DEPTH"
-            ],
-            "system": [
-              "CPU",
-              "FLASH",
-              "NTP",
-              "RAM",
-              "SYSTEMTIME",
-              "TEMPERATURE",
-              "UPTIME",
-              "GPO",
-              "GPI",
-              "POWER_NEGOTIATION",
-              "POWER_SOURCE",
-              "MAC_ADDRESS",
-              "HOSTNAME"
-            ],
-            "userapps": [
-              "STATUS",
-              "CPU",
-              "RAM",
-              "UPTIME",
-              "NUM_DATA_MESSAGES_RXED",
-              "NUM_DATA_MESSAGES_TXED",
-              "INCOMING_DATA_BUFFER_PERCENTAGE_REMAINING",
-              "OUTGOING_DATA_BUFFER_PERCENTAGE_REMAINING"
-            ]
-          },
-          "interval": 60
-        },
-        "userappEvents": true,
-        "warnings": {
-          "cpu": {
-            "reportIntervalInSec": 1800,
-            "threshold": 80
-          },
-          "database": true,
-          "flash": {
-            "reportIntervalInSec": 1800,
-            "threshold": 80
-          },
-          "ntp": true,
-          "radio_api": true,
-          "radio_control": true,
-          "ram": {
-            "reportIntervalInSec": 1800,
-            "threshold": 80
-          },
-          "reader_gateway": true,
-          "temperature": {
-            "ambient": 75,
-            "pa": 105
-          },
-          "userApp": {
-            "reportIntervalInSec": 1800,
-            "threshold": 60
-          }
-        }
-      },
-      "retention": [
-        {
-          "maxEventRetentionTimeInMin": 500,
-          "maxNumEvents": 150000,
-          "throttle": 100
-        },
-        {
-          "maxEventRetentionTimeInMin": 500,
-          "maxNumEvents": 150000,
-          "throttle": 100
-        }
-      ]
-    }
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,12 +0,0 @@
-meta {
-  name: readerSpecific
-}
-
-auth {
-  mode: basic
-}
-
-auth:basic {
-  username: admin
-  password: Zebra123!
-}
@@ -1,20 +0,0 @@
-meta {
-  name: Get Settings
-  type: http
-  seq: 3
-}
-
-get {
-  url: {{url}}/api/settings
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
-
-docs {
-  returns all settings
-}
@@ -1,16 +0,0 @@
-meta {
-  name: Status
-  type: http
-  seq: 1
-}
-
-get {
-  url: {{url}}/api/stats
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,33 +0,0 @@
-meta {
-  name: updateSetting
-  type: http
-  seq: 2
-}
-
-patch {
-  url: {{url}}/api/settings/opendock_sync
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "value" : "1",
-    "active": "true"
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
-
-docs {
-  Allows changing a setting identified by the URL parameter.
-
-  * when the setting being changed is a feature, background logic stops that feature's processes so it no longer runs.
-
-  * when the setting being changed is a system setting, the entire app does a full restart.
-
-  * when the setting being changed is standard, nothing happens until the next action runs. Example: you change the default from 90 seconds to 120 seconds, and the new value applies the next time someone prints a label.
-}
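The three setting tiers described in the docs above (feature, system, standard) imply three different apply-time behaviors. A minimal sketch of that policy, assuming hypothetical names (`SettingTier`, `applyPolicy` are illustrative, not from the repo):

```typescript
// Hypothetical sketch of the three setting tiers described above.
type SettingTier = "feature" | "system" | "standard";

function applyPolicy(tier: SettingTier): string {
  switch (tier) {
    case "feature":
      // background logic stops the feature's processes
      return "stop-feature-processes";
    case "system":
      // the entire app performs a full restart
      return "full-restart";
    case "standard":
      // new value is picked up the next time the action runs
      return "apply-on-next-action";
  }
}
```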
@@ -1,16 +0,0 @@
-meta {
-  name: Active Jobs
-  type: http
-  seq: 5
-}
-
-get {
-  url: {{url}}/api/utils/croner
-  body: none
-  auth: inherit
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -1,22 +0,0 @@
-meta {
-  name: Change job status
-  type: http
-  seq: 2
-}
-
-patch {
-  url: {{url}}/api/utils/croner/stop
-  body: json
-  auth: inherit
-}
-
-body:json {
-  {
-    "name": "open-dock-monitor"
-  }
-}
-
-settings {
-  encodeUrl: true
-  timeout: 0
-}
@@ -12,48 +12,36 @@ services:
       #- "${VITE_PORT:-4200}:4200"
       - "3600:3000"
     dns:
-      - 10.193.9.250
-      - 10.193.9.251 # your internal DNS server
-    dns_search:
-      - alpla.net # or your internal search suffix
+      - 10.44.9.250
+      - 10.44.9.251 # your internal DNS server
+      - 1.1.1.1
     environment:
       - NODE_ENV=production
       - LOG_LEVEL=info
       - EXTERNAL_URL=http://192.168.8.222:3600
-      - DATABASE_HOST=host.docker.internal # if running on the same docker then do this
-      - DATABASE_PORT=5433
+      - DATABASE_HOST=postgres # if running on the same docker then do this
+      - DATABASE_PORT=5432
       - DATABASE_USER=${DATABASE_USER}
       - DATABASE_PASSWORD=${DATABASE_PASSWORD}
       - DATABASE_DB=${DATABASE_DB}
-      - PROD_SERVER=${PROD_SERVER}
+      - PROD_SERVER=10.75.9.56 #${PROD_SERVER}
       - PROD_PLANT_TOKEN=${PROD_PLANT_TOKEN}
       - PROD_USER=${PROD_USER}
       - PROD_PASSWORD=${PROD_PASSWORD}
+      - GP_SERVER=10.193.9.31
+      - SQL_PORT=1433
       - BETTER_AUTH_SECRET=${BETTER_AUTH_SECRET}
       - BETTER_AUTH_URL=${URL}
-      # for all host including prod servers, plc's, printers, or other de
-      # extra_hosts:
-      #   - "${PROD_SERVER}:${PROD_IP}"
+      - OPENDOCK_URL=${OPENDOCK_URL}
+      - OPENDOCK_PASSWORD=${OPENDOCK_PASSWORD}
+      - DEFAULT_DOCK=${DEFAULT_DOCK}
+      - DEFAULT_LOAD_TYPE=${DEFAULT_LOAD_TYPE}
+      - DEFAULT_CARRIER=${DEFAULT_CARRIER}

-    # networks:
-    #   - default
-    #   - logisticsNetwork
-    #   #- mlan1
+    #for all host including prod servers, plc's, printers, or other de
+    networks:
+      - docker-network

-# networks:
-#   logisticsNetwork:
-#     driver: macvlan
-#     driver_opts:
-#       parent: eth0
-#     ipam:
-#       config:
-#         - subnet: ${LOGISTICS_NETWORK}
-#           gateway: ${LOGISTICS_GATEWAY}
-
-#   mlan1:
-#     driver: macvlan
-#     driver_opts:
-#       parent: eth0
-#     ipam:
-#       config:
-#         - subnet: ${MLAN1_NETWORK}
-#           gateway: ${MLAN1_GATEWAY}
+networks:
+  docker-network:
+    external: true
21 frontend/package-lock.json generated
@@ -19,6 +19,8 @@
         "better-auth": "^1.5.5",
         "class-variance-authority": "^0.7.1",
         "clsx": "^2.1.1",
+        "date-fns": "^4.1.0",
+        "date-fns-tz": "^3.2.0",
         "lucide-react": "^0.577.0",
         "next-themes": "^0.4.6",
         "radix-ui": "^1.4.3",
@@ -6016,6 +6018,25 @@
         "node": ">= 12"
       }
     },
+    "node_modules/date-fns": {
+      "version": "4.1.0",
+      "resolved": "https://registry.npmjs.org/date-fns/-/date-fns-4.1.0.tgz",
+      "integrity": "sha512-Ukq0owbQXxa/U3EGtsdVBkR1w7KOQ5gIBqdH2hkvknzZPYvBxb/aa6E8L7tmjFtkwZBu3UXBbjIgPo/Ez4xaNg==",
+      "license": "MIT",
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/kossnocorp"
+      }
+    },
+    "node_modules/date-fns-tz": {
+      "version": "3.2.0",
+      "resolved": "https://registry.npmjs.org/date-fns-tz/-/date-fns-tz-3.2.0.tgz",
+      "integrity": "sha512-sg8HqoTEulcbbbVXeg84u5UnlsQa8GS5QXMqjjYIhS4abEVVKIUwe0/l/UhrZdKaL/W5eWZNlbTeEIiOXTcsBQ==",
+      "license": "MIT",
+      "peerDependencies": {
+        "date-fns": "^3.0.0 || ^4.0.0"
+      }
+    },
     "node_modules/debug": {
       "version": "4.4.3",
       "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",
@@ -34,7 +34,9 @@
         "tailwind-merge": "^3.5.0",
         "tailwindcss": "^4.2.1",
         "tw-animate-css": "^1.4.0",
-        "zod": "^4.3.6"
+        "zod": "^4.3.6",
+        "date-fns": "^4.1.0",
+        "date-fns-tz": "^3.2.0"
       },
       "devDependencies": {
         "@eslint/js": "^9.36.0",
@@ -1,5 +1,5 @@
 import { Link } from "@tanstack/react-router";
-import { Bell, Logs, Settings } from "lucide-react";
+import { Bell, Logs, Server, Settings } from "lucide-react";

 import {
   SidebarGroup,
@@ -40,6 +40,14 @@ export default function AdminSidebar({ session }: any) {
       module: "admin",
       active: true,
     },
+    {
+      title: "Servers",
+      url: "/admin/servers",
+      icon: Server,
+      role: ["systemAdmin", "admin"],
+      module: "admin",
+      active: true,
+    },
     {
       title: "Logs",
       url: "/admin/logs",
@@ -1,22 +1,55 @@
-import { useEffect, useState } from "react";
+import { useCallback, useEffect, useState } from "react";
 import socket from "@/lib/socket.io";

-export function useSocketRoom<T>(roomId: string) {
+type RoomUpdatePayload<T> = {
+  roomId: string;
+  payloads: T[];
+};
+
+type RoomErrorPayload = {
+  roomId?: string;
+  message?: string;
+};
+
+export function useSocketRoom<T>(
+  roomId: string,
+  getKey?: (item: T) => string | number,
+) {
   const [data, setData] = useState<T[]>([]);
   const [info, setInfo] = useState(
     "No data yet — join the room to start receiving",
   );

+  const clearRoom = useCallback(
+    (id?: string | number) => {
+      if (id !== undefined && getKey) {
+        setData((prev) => prev.filter((item) => getKey(item) !== id));
+        setInfo(`Removed item ${id}`);
+        return;
+      }
+
+      setData([]);
+      setInfo("Room data cleared");
+    },
+    [getKey],
+  );
+
   useEffect(() => {
     function handleConnect() {
       socket.emit("join-room", roomId);
+      setInfo(`Joined room: ${roomId}`);
     }

-    function handleUpdate(payload: any) {
+    function handleUpdate(payload: RoomUpdatePayload<T>) {
+      // protects against other room updates hitting this hook
+      if (payload.roomId !== roomId) return;
+
       setData((prev) => [...payload.payloads, ...prev]);
+      setInfo("");
     }

-    function handleError(err: any) {
+    function handleError(err: RoomErrorPayload) {
+      if (err.roomId && err.roomId !== roomId) return;
       setInfo(err.message ?? "Room error");
     }

@@ -31,6 +64,7 @@ export function useSocketRoom<T>(roomId: string) {
     // If already connected, join immediately
     if (socket.connected) {
       socket.emit("join-room", roomId);
+      setInfo(`Joined room: ${roomId}`);
     }

     return () => {
@@ -42,5 +76,5 @@ export function useSocketRoom<T>(roomId: string) {
     };
   }, [roomId]);

-  return { data, info };
+  return { data, info, clearRoom };
 }
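The `clearRoom` change to the hook above removes a single item when a key extractor and an id are supplied, and wipes the whole list otherwise. A plain-TS sketch of that filtering logic, detached from React state (the `clearItems` name is illustrative, not from the repo):

```typescript
// Sketch of the keyed-removal behavior clearRoom adds: with a getKey
// and an id, one matching item is filtered out; with no id, all go.
function clearItems<T>(
  items: T[],
  getKey?: (item: T) => string | number,
  id?: string | number,
): T[] {
  if (id !== undefined && getKey) {
    return items.filter((item) => getKey(item) !== id);
  }
  return [];
}

const logs = [
  { timestamp: "10:00", message: "build started" },
  { timestamp: "10:05", message: "build finished" },
];

// Remove one entry by its timestamp key, or clear everything.
const oneRemoved = clearItems(logs, (l) => l.timestamp, "10:00");
const allCleared = clearItems(logs);
```

In the hook itself the same filter runs inside `setData`, so component renders stay in sync with the socket room's retained events.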
22 frontend/src/lib/queries/servers.ts Normal file
@@ -0,0 +1,22 @@
+import { keepPreviousData, queryOptions } from "@tanstack/react-query";
+import axios from "axios";
+
+export function servers() {
+  return queryOptions({
+    queryKey: ["servers"],
+    queryFn: () => fetch(),
+    staleTime: 5000,
+    refetchOnWindowFocus: true,
+    placeholderData: keepPreviousData,
+  });
+}
+
+const fetch = async () => {
+  if (window.location.hostname === "localhost") {
+    await new Promise((res) => setTimeout(res, 1500));
+  }
+
+  const { data } = await axios.get("/lst/api/servers");
+
+  return data.data;
+};
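The fetcher above deliberately sleeps for 1.5 s on localhost before hitting the API so loading skeletons stay visible during development. That pattern can be factored into a small helper; this is a sketch under the assumption of a hypothetical `withDevDelay` name, not code from the repo:

```typescript
// Sketch of the dev-only artificial delay used by the query's fetcher:
// on localhost, wait 1.5 s before fetching; elsewhere, fetch immediately.
async function withDevDelay<T>(
  hostname: string,
  fetcher: () => Promise<T>,
): Promise<T> {
  if (hostname === "localhost") {
    await new Promise((res) => setTimeout(res, 1500));
  }
  return fetcher();
}

// e.g. withDevDelay(window.location.hostname, () => axios.get("/lst/api/servers"))
```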
@@ -105,6 +105,7 @@ export default function LstTable({
         </TableBody>
       </Table>
       <ScrollBar orientation="horizontal" />
+      <ScrollBar orientation="vertical" />
     </ScrollArea>
     <div className="flex items-center justify-end space-x-2 py-4">
       <Button
@@ -14,6 +14,7 @@ import { Route as IndexRouteImport } from './routes/index'
 import { Route as DocsIndexRouteImport } from './routes/docs/index'
 import { Route as DocsSplatRouteImport } from './routes/docs/$'
 import { Route as AdminSettingsRouteImport } from './routes/admin/settings'
+import { Route as AdminServersRouteImport } from './routes/admin/servers'
 import { Route as AdminNotificationsRouteImport } from './routes/admin/notifications'
 import { Route as AdminLogsRouteImport } from './routes/admin/logs'
 import { Route as authLoginRouteImport } from './routes/(auth)/login'
@@ -46,6 +47,11 @@ const AdminSettingsRoute = AdminSettingsRouteImport.update({
   path: '/admin/settings',
   getParentRoute: () => rootRouteImport,
 } as any)
+const AdminServersRoute = AdminServersRouteImport.update({
+  id: '/admin/servers',
+  path: '/admin/servers',
+  getParentRoute: () => rootRouteImport,
+} as any)
 const AdminNotificationsRoute = AdminNotificationsRouteImport.update({
   id: '/admin/notifications',
   path: '/admin/notifications',
@@ -83,6 +89,7 @@ export interface FileRoutesByFullPath {
   '/login': typeof authLoginRoute
   '/admin/logs': typeof AdminLogsRoute
   '/admin/notifications': typeof AdminNotificationsRoute
+  '/admin/servers': typeof AdminServersRoute
   '/admin/settings': typeof AdminSettingsRoute
   '/docs/$': typeof DocsSplatRoute
   '/docs/': typeof DocsIndexRoute
@@ -96,6 +103,7 @@ export interface FileRoutesByTo {
   '/login': typeof authLoginRoute
   '/admin/logs': typeof AdminLogsRoute
   '/admin/notifications': typeof AdminNotificationsRoute
+  '/admin/servers': typeof AdminServersRoute
   '/admin/settings': typeof AdminSettingsRoute
   '/docs/$': typeof DocsSplatRoute
   '/docs': typeof DocsIndexRoute
@@ -110,6 +118,7 @@ export interface FileRoutesById {
   '/(auth)/login': typeof authLoginRoute
   '/admin/logs': typeof AdminLogsRoute
   '/admin/notifications': typeof AdminNotificationsRoute
+  '/admin/servers': typeof AdminServersRoute
   '/admin/settings': typeof AdminSettingsRoute
   '/docs/$': typeof DocsSplatRoute
   '/docs/': typeof DocsIndexRoute
@@ -125,6 +134,7 @@ export interface FileRouteTypes {
   | '/login'
   | '/admin/logs'
   | '/admin/notifications'
+  | '/admin/servers'
   | '/admin/settings'
   | '/docs/$'
   | '/docs/'
@@ -138,6 +148,7 @@ export interface FileRouteTypes {
   | '/login'
   | '/admin/logs'
   | '/admin/notifications'
+  | '/admin/servers'
   | '/admin/settings'
   | '/docs/$'
   | '/docs'
@@ -151,6 +162,7 @@ export interface FileRouteTypes {
   | '/(auth)/login'
   | '/admin/logs'
   | '/admin/notifications'
+  | '/admin/servers'
   | '/admin/settings'
   | '/docs/$'
   | '/docs/'
@@ -165,6 +177,7 @@ export interface RootRouteChildren {
   authLoginRoute: typeof authLoginRoute
   AdminLogsRoute: typeof AdminLogsRoute
   AdminNotificationsRoute: typeof AdminNotificationsRoute
+  AdminServersRoute: typeof AdminServersRoute
   AdminSettingsRoute: typeof AdminSettingsRoute
   DocsSplatRoute: typeof DocsSplatRoute
   DocsIndexRoute: typeof DocsIndexRoute
@@ -210,6 +223,13 @@ declare module '@tanstack/react-router' {
       preLoaderRoute: typeof AdminSettingsRouteImport
       parentRoute: typeof rootRouteImport
     }
+    '/admin/servers': {
+      id: '/admin/servers'
+      path: '/admin/servers'
+      fullPath: '/admin/servers'
+      preLoaderRoute: typeof AdminServersRouteImport
+      parentRoute: typeof rootRouteImport
+    }
     '/admin/notifications': {
       id: '/admin/notifications'
       path: '/admin/notifications'
@@ -261,6 +281,7 @@ const rootRouteChildren: RootRouteChildren = {
   authLoginRoute: authLoginRoute,
   AdminLogsRoute: AdminLogsRoute,
   AdminNotificationsRoute: AdminNotificationsRoute,
+  AdminServersRoute: AdminServersRoute,
   AdminSettingsRoute: AdminSettingsRoute,
   DocsSplatRoute: DocsSplatRoute,
   DocsIndexRoute: DocsIndexRoute,
251 frontend/src/routes/admin/servers.tsx Normal file
@@ -0,0 +1,251 @@
+import { useSuspenseQuery } from "@tanstack/react-query";
+import { createFileRoute, redirect } from "@tanstack/react-router";
+import { createColumnHelper } from "@tanstack/react-table";
+import axios from "axios";
+import { format } from "date-fns-tz";
+import { CircleFadingArrowUp, Trash } from "lucide-react";
+import { Suspense, useState } from "react";
+import { toast } from "sonner";
+import { Button } from "../../components/ui/button";
+import { Spinner } from "../../components/ui/spinner";
+import {
+  Tooltip,
+  TooltipContent,
+  TooltipTrigger,
+} from "../../components/ui/tooltip";
+import { useSocketRoom } from "../../hooks/socket.io.hook";
+import { authClient } from "../../lib/auth-client";
+import { servers } from "../../lib/queries/servers";
+import LstTable from "../../lib/tableStuff/LstTable";
+import SearchableHeader from "../../lib/tableStuff/SearchableHeader";
+import SkellyTable from "../../lib/tableStuff/SkellyTable";
+
+export const Route = createFileRoute("/admin/servers")({
+  beforeLoad: async ({ location }) => {
+    const { data: session } = await authClient.getSession();
+    const allowedRole = ["systemAdmin", "admin"];
+
+    if (!session?.user) {
+      throw redirect({
+        to: "/",
+        search: {
+          redirect: location.href,
+        },
+      });
+    }
+
+    if (!allowedRole.includes(session.user.role as string)) {
+      throw redirect({
+        to: "/",
+      });
+    }
+
+    return { user: session.user };
+  },
+  component: RouteComponent,
+});
+
+const ServerTable = () => {
+  const { data, refetch } = useSuspenseQuery(servers());
+  const columnHelper = createColumnHelper<any>();
+  const okToUpdate = ["localhost", "usmcd1olp082"];
+  const columns = [
+    columnHelper.accessor("name", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="Name" searchable={true} />
+      ),
+      filterFn: "includesString",
+      cell: (i) => i.getValue(),
+    }),
+    columnHelper.accessor("greatPlainsPlantCode", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="GP Code" />
+      ),
+      cell: (i) => <span>{i.getValue().toUpperCase()}</span>,
+    }),
+    columnHelper.accessor("server", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="server" />
+      ),
+      cell: (i) => <span>{i.getValue().toUpperCase()}</span>,
+    }),
+    columnHelper.accessor("idAddress", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="IP Address" />
+      ),
+      cell: (i) => <span>{i.getValue()}</span>,
+    }),
+  ];
+
+  if (okToUpdate.includes(window.location.hostname)) {
+    columns.push(
+      columnHelper.accessor("lastUpdated", {
+        header: ({ column }) => (
+          <SearchableHeader column={column} title="Last Update" />
+        ),
+        cell: (i) => <span>{format(i.getValue(), "M/d/yyyy HH:mm")}</span>,
+      }),
+      columnHelper.accessor("buildNumber", {
+        header: ({ column }) => (
+          <SearchableHeader column={column} title="Build" />
+        ),
+        cell: (i) => <span>{i.getValue()}</span>,
+      }),
+      columnHelper.accessor("update", {
+        header: ({ column }) => (
+          <SearchableHeader column={column} title="Update" searchable={false} />
+        ),
+        filterFn: "includesString",
+        cell: (i) => {
+          // biome-ignore lint: just removing the lint for now to get this going will maybe fix later
+          const [activeToggle, setActiveToggle] = useState(false);
+
+          const onToggle = async () => {
+            setActiveToggle(true);
+            toast.success(
+              `${i.row.original.name} just started the upgrade monitor logs for errors.`,
+            );
+            try {
+              const res = await axios.post(
+                `/lst/api/admin/build/updateServer`,
+                {
+                  server: i.row.original.server,
+                  destination: i.row.original.serverLoc,
+                  token: i.row.original.plantToken,
+                },
+                { withCredentials: true },
+              );
+
+              if (res.data.success) {
+                toast.success(
+                  `${i.row.original.name} has completed its upgrade.`,
+                );
+                refetch();
+                setActiveToggle(false);
+              }
+            } catch (error) {
+              setActiveToggle(false);
+              console.error(error);
+            }
+          };
+
+          return (
+            <div>
+              <div className="flex items-center space-x-2">
+                <Button
+                  variant="ghost"
+                  disabled={activeToggle}
+                  onClick={() => onToggle()}
+                >
+                  {activeToggle ? (
+                    <span>
+                      <Spinner />
+                    </span>
+                  ) : (
+                    <span>
+                      <CircleFadingArrowUp />
+                    </span>
+                  )}
+                </Button>
+              </div>
+            </div>
+          );
+        },
+      }),
+    );
+  }
+
+  return <LstTable data={data} columns={columns} />;
+};
+
+function RouteComponent() {
+  const { data: logs = [], clearRoom } = useSocketRoom<any>("admin:build");
+
+  const columnHelper = createColumnHelper<any>();
+
+  console.log(window.location);
+  const logColumns = [
+    columnHelper.accessor("timestamp", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="Time" searchable={false} />
+      ),
+      filterFn: "includesString",
+      cell: (i) => format(i.getValue(), "M/d/yyyy HH:mm"),
+    }),
+    columnHelper.accessor("message", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="Message" />
+      ),
+      cell: (i) => (
+        <Tooltip>
+          <TooltipTrigger>
+            {i.getValue()?.length > 250 ? (
+              <span>{i.getValue().slice(0, 250)}...</span>
+            ) : (
+              <span>{i.getValue()}</span>
+            )}
+          </TooltipTrigger>
+          <TooltipContent>{i.getValue()}</TooltipContent>
+        </Tooltip>
+      ),
+    }),
+    columnHelper.accessor("clearLog", {
+      header: ({ column }) => (
+        <SearchableHeader column={column} title="Clear" />
+      ),
+      cell: ({ row }) => {
+        const x = row.original;
+        return (
+          <Button
+            size="icon"
+            variant={"destructive"}
+            onClick={() => clearRoom(x.timestamp)}
+          >
+            <Trash />
+          </Button>
+        );
+      },
+    }),
+  ];
+  const triggerBuild = async () => {
+    try {
+      const res = await axios.post(
+        `/lst/api/admin/build/release`,
+
+        {
+          withCredentials: true,
+        },
+      );
+
+      if (res.data.success) {
+        toast.success(res.data.message);
+      }
+
+      if (!res.data.success) {
+        toast.error(res.data.message);
+      }
+    } catch (err) {
+      console.log(err);
+      //toast.error(err?.message);
+    }
+  };
+  //console.log(logs);
+  return (
+    <div className="flex flex-col gap-1">
+      <div className="flex gap-1 justify-end">
+        <Button onClick={triggerBuild}>Trigger Build</Button>
+        <Button onClick={() => clearRoom()}>Clear Logs</Button>
+      </div>
+      <div className="flex gap-1 w-full">
+        <div className="w-full">
+          <Suspense fallback={<SkellyTable />}>
+            <ServerTable />
+          </Suspense>
+        </div>
+        <div className="w-1/2">
+          <LstTable data={logs} columns={logColumns} />
+        </div>
+      </div>
+    </div>
+  );
+}
43 lstMobile/.gitignore vendored Normal file
@@ -0,0 +1,43 @@
+# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files
+
+# dependencies
+node_modules/
+
+# Expo
+.expo/
+dist/
+web-build/
+expo-env.d.ts
+
+# Native
+.kotlin/
+*.orig.*
+*.jks
+*.p8
+*.p12
+*.key
+*.mobileprovision
+
+# Metro
+.metro-health-check*
+
+# debug
+npm-debug.*
+yarn-debug.*
+yarn-error.*
+
+# macOS
+.DS_Store
+*.pem
+
+# local env files
+.env*.local
+
+# typescript
+*.tsbuildinfo
+
+app-example
+
+# generated native folders
+/ios
+/android
Some files were not shown because too many files have changed in this diff.