16 Commits

Author SHA1 Message Date
7962463927 refactor(server): server updates can now only be done from a dev pc
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m54s
2026-04-21 19:01:52 -05:00
f716de1a58 chore(clean): removed bruno api; a proper api doc will be added to lst later 2026-04-21 19:01:21 -05:00
88cef2a56c refactor(servers): added mcd and stp1 2026-04-21 19:00:30 -05:00
cb00addee9 feat(admin): moved server build/update to full app
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m27s
2026-04-21 07:36:04 -05:00
b832d7aa1e fix(datamart): fixes to correct how we handle activations of new features and legacy queries 2026-04-20 08:49:24 -05:00
32517d0c98 fix(inventory): changes to accurately adjust the query and check the feature set 2026-04-20 07:25:33 -05:00
82f8369640 refactor(scanner): more basic work to get the scanner just running
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m33s
2026-04-19 17:20:57 -05:00
3734d9daac feat(lstmobile): initial scanner setup kinda working
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m7s
2026-04-17 16:47:09 -05:00
a1eeadeec4 fix(psi): refactor psi queries 2026-04-17 16:46:44 -05:00
3639c1b77c fix(logistics): purchasing monitoring was going off every 5th min instead of every 5 min 2026-04-17 14:47:23 -05:00
cfbc156517 fix(logistics): historical issue where it was being really weird 2026-04-17 08:02:44 -05:00
fb3cd85b41 fix(ocp): fixes to make sure we always have printer.data as an array or don't do anything 2026-04-15 09:20:08 -05:00
5b1c88546f fix(datamart): if we do not have 2.0 warehousing activate we need to use legacy 2026-04-15 08:45:48 -05:00
ba3227545d chore(release): 0.0.1-alpha.4
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m4s
Release and Build Image / release (push) Successful in 12s
2026-04-15 07:31:49 -05:00
84909bfcf8 ci(service): changes to the script to allow running the powershell on execution policy restrictions
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-15 07:31:06 -05:00
e0d0ac2077 feat(datamart): psi data has been added :D 2026-04-15 07:29:35 -05:00
154 changed files with 34876 additions and 950 deletions

.gitignore vendored

@@ -5,6 +5,7 @@ builds
.buildNumber
temp
brunoApi
+downloads
.scriptCreds
node-v24.14.0-x64.msi
postgresql-17.9-2-windows-x64.exe
@@ -148,3 +149,4 @@ dist
.yarn/install-state.gz
.pnp.*
+frontend/.tanstack/tmp/2249110e-da91fb0b1b87b6c4cc3e2c2cd25037fd


@@ -1,6 +1,6 @@
{
"editor.defaultFormatter": "biomejs.biome",
-"workbench.colorTheme": "Default Dark+",
+"workbench.colorTheme": "Dark+",
"terminal.integrated.env.windows": {},
"editor.formatOnSave": true,
"typescript.preferences.importModuleSpecifier": "relative",
@@ -71,7 +71,8 @@
"prodlabels",
"prolink",
"Skelly",
-"trycatch"
+"trycatch",
+"whse"
],
"gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
"gitea.instanceURL": "https://git.tuffraid.net",


@@ -1,5 +1,49 @@
# All Changes to LST can be found below.
## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)
### 🌟 Enhancements
* **datamart:** migrations completed; remaining is the deactivation that will be run by analytics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))
### 🐛 Bug fixes
* **datamart:** error when running build that crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))
### 📚 Documentation
* **docs:** removed Docusaurus as all docs will be inside lst now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
* **notifications:** docs for intro, notifications, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))
### 🛠️ Code Refactor
* **agent:** changed to have the test servers on their own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
* **purchase:** changes how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
* **reprint:** new query added to deactivate the old notification so no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
* **templates:** corrections for the new notify process on critical errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))
### 📈 Project changes
* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
* **service:** changes to the script to allow running the powershell on execution policy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))
## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)


@@ -0,0 +1,38 @@
/**
* To be able to run this we need to set our dev pc in the .env.
* If it's empty just ignore it; this will just be the double catch.
*/
import { Router } from "express";
import { build, building } from "../utils/build.utils.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
const router = Router();
router.post("/release", async (_, res) => {
if (!building) {
build();
return apiReturn(res, {
success: true,
level: "info",
module: "admin",
subModule: "build",
message: `The build has been triggered; see the logs for the progress of the current build.`,
data: [],
status: 200,
});
} else {
return apiReturn(res, {
success: false,
level: "error",
module: "admin",
subModule: "build",
message: `There is already a build in progress; please check the logs for ongoing progress.`,
data: [],
status: 200,
});
}
});
export default router;
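The /release route above gates on a shared `building` flag from `build.utils`. A minimal sketch of that single-flight guard, with illustrative names (the real flag lives in `build.utils.js` and is not reset synchronously like this):

```typescript
// Sketch of the single-flight guard behind the /release route:
// only one build may be in flight at a time (illustrative names).
let building = false;

const triggerBuild = (run: () => void): boolean => {
  if (building) return false; // a build is already in progress
  building = true;
  try {
    run(); // kick off the build work
  } finally {
    building = false; // release the flag once the build returns
  }
  return true;
};
```

The route maps the boolean result onto the two `apiReturn` payloads shown above: `true` becomes the "build has been triggered" response, `false` the "already in progress" one.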


@@ -0,0 +1,12 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import build from "./admin.build.js";
import update from "./admin.updateServer.js";
export const setupAdminRoutes = (baseUrl: string, app: Express) => {
// stats will be like this as we don't need to change this
app.use(`${baseUrl}/api/admin/build`, requireAuth, build);
app.use(`${baseUrl}/api/admin/update`, requireAuth, update);
// all other system routes should be under /api/system/*
};


@@ -0,0 +1,86 @@
/**
* To be able to run this we need to set our dev pc in the .env.
* If it's empty just ignore it; this will just be the double catch.
*/
import { Router } from "express";
import z from "zod";
import { building } from "../utils/build.utils.js";
import { runUpdate, updating } from "../utils/deployApp.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
const updateServer = z.object({
server: z.string(),
destination: z.string(),
token: z.string().min(5, "Plant tokens should be at least 5 characters long"),
});
const router = Router();
type Update = {
success: boolean;
message: string;
};
router.post("/updateServer", async (req, res) => {
try {
const validated = updateServer.parse(req.body);
if (!updating && !building) {
const update = (await runUpdate({
server: validated.server,
destination: validated.destination,
token: validated.token,
})) as Update;
return apiReturn(res, {
success: update.success,
level: update.success ? "info" : "error",
module: "admin",
subModule: "update",
message: update.message,
data: [],
status: 200,
});
} else {
return apiReturn(res, {
success: false,
level: "error",
module: "admin",
subModule: "update",
message: `${validated.server}: ${validated.token} is already being updated, or is currently building the app.`,
data: [],
status: 200,
});
}
} catch (err) {
if (err instanceof z.ZodError) {
const flattened = z.flattenError(err);
return apiReturn(res, {
success: false,
level: "error",
module: "admin",
subModule: "update",
message: "Validation failed",
data: [flattened.fieldErrors],
status: 400,
});
}
return apiReturn(res, {
success: false,
level: "error",
module: "admin",
subModule: "update",
message: "Internal Server Error updating the server",
data: [err],
status: 400,
});
}
});
export default router;


@@ -13,6 +13,10 @@
*
* when a criteria is passed over we will handle it by counting how many were passed, up to 3, then deal with each one respectively
*/
import { and, between, inArray, notInArray } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
@@ -29,9 +33,101 @@ type Data = {
howManyOptionsRequired?: number;
};
const lstDbRun = async (data: Data) => {
if (data.options) {
if (data.name === "psiInventory") {
const ids = data.options.articles.split(",").map((id: any) => id.trim());
const whse = data.options.whseToInclude
? data.options.whseToInclude
.split(",")
.map((w: any) => w.trim())
.filter(Boolean)
: [];
const locations = data.options.exludeLanes
? data.options.exludeLanes
.split(",")
.map((l: any) => l.trim())
.filter(Boolean)
: [];
const conditions = [
inArray(invHistoricalData.article, ids),
between(
invHistoricalData.histDate,
data.options.startDate,
data.options.endDate,
),
];
// only add the warehouse condition if there are any whse values
if (whse.length > 0) {
conditions.push(inArray(invHistoricalData.whseId, whse));
}
// locations we don't want in the system
if (locations.length > 0) {
conditions.push(notInArray(invHistoricalData.location, locations));
}
return await db
.select()
.from(invHistoricalData)
.where(and(...conditions));
}
}
return [];
};
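`lstDbRun` above repeats the same split/trim/filter normalization for the `articles`, `whseToInclude`, and `exludeLanes` option strings. A hedged sketch of that step (the helper name is mine, not from the repo):

```typescript
// Normalize a comma-separated option string into a clean list:
// trim each entry and drop empties, as lstDbRun does for whse/lanes.
const parseCsvOption = (raw?: string): string[] =>
  raw
    ? raw
        .split(",")
        .map((part) => part.trim())
        .filter(Boolean)
    : [];
```

Empty or missing input yields an empty array, which is what lets the code above skip the `inArray`/`notInArray` conditions entirely when no values were supplied.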
export const runDatamartQuery = async (data: Data) => {
// search the query db for the query by name
-const sqlQuery = sqlQuerySelector(`datamart.${data.name}`) as SqlQuery;
+const considerLstDBRuns = ["psiInventory"];
if (considerLstDBRuns.includes(data.name)) {
const lstDB = await lstDbRun(data);
return returnFunc({
success: true,
level: "info",
module: "datamart",
subModule: "lstDBrn",
message: `Data for: ${data.name}`,
data: lstDB,
notify: false,
});
}
const featureQ = sqlQuerySelector(`featureCheck`) as SqlQuery;
const { data: fd, error: fe } = await tryCatch(
prodQuery(featureQ.query, `Running feature check`),
);
if (fe) {
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `feature check failed`,
data: fe as any,
notify: false,
});
}
// queries that will need to be run on legacy until we get the plant updated need to go in here
const doubleQueries = ["inventory"];
let queryFile = "";
if (doubleQueries.includes(data.name)) {
queryFile = `datamart.${
fd.data[0].activated > 0 ? data.name : `legacy.${data.name}`
}`;
} else {
queryFile = `datamart.${data.name}`;
}
const sqlQuery = sqlQuerySelector(queryFile) as SqlQuery;
// checking if 2.0 warehousing is active, as it will start to affect a lot of queries for plants that are not on 2.0
const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);
@@ -77,7 +173,19 @@ export const runDatamartQuery = async (data: Data) => {
case "deliveryByDateRange":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
-.replace("[endDate]", `${data.options.endDate}`);
+.replace("[endDate]", `${data.options.endDate}`)
.replace(
"--and r.ArticleHumanReadableId in ([articles]) ",
data.options.articles
? `and r.ArticleHumanReadableId in (${data.options.articles})`
: "--and r.ArticleHumanReadableId in ([articles]) ",
)
.replace(
"and DeliveredQuantity > 0",
data.options.all
? "--and DeliveredQuantity > 0"
: "and DeliveredQuantity > 0",
);
break;
case "customerInventory":
@@ -102,6 +210,10 @@ export const runDatamartQuery = async (data: Data) => {
"--,l.RunningNumber",
`${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
)
.replaceAll(
"--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
`${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
)
.replaceAll(
"--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber",
`${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber`}`,
@@ -110,6 +222,7 @@ export const runDatamartQuery = async (data: Data) => {
"--,l.WarehouseDescription,l.LaneDescription",
`${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
);
break;
case "fakeEDIUpdate":
datamartQuery = datamartQuery.replace(
@@ -118,6 +231,55 @@ export const runDatamartQuery = async (data: Data) => {
);
break;
case "forecast":
datamartQuery = datamartQuery.replace(
"where DeliveryAddressHumanReadableId in ([customers])",
data.options.customers
? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
: "--where DeliveryAddressHumanReadableId in ([customers])",
);
break;
case "activeArticles2":
datamartQuery = datamartQuery.replace(
"and a.HumanReadableId in ([articles])",
data.options.articles
? `and a.HumanReadableId in (${data.options.articles})`
: "--and a.HumanReadableId in ([articles])",
);
break;
case "psiDeliveryData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"[articles]",
data.options.articles ? `${data.options.articles}` : "[articles]",
);
break;
case "productionData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and ArticleHumanReadableId in ([articles])",
data.options.articles
? `and ArticleHumanReadableId in (${data.options.articles})`
: "--and ArticleHumanReadableId in ([articles])",
);
break;
case "psiPlanningData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and p.IdArtikelvarianten in ([articles])",
data.options.articles
? `and p.IdArtikelvarianten in (${data.options.articles})`
: "--and p.IdArtikelvarianten in ([articles])",
);
break;
default:
return returnFunc({
success: false,

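Several of the switch cases above share one technique: optional SQL filters ship commented out in the .sql files and are switched on by replacing the commented marker with a live clause, or left commented when the option is absent. A minimal sketch of that toggle, using the `--and ... in ([articles])` marker shown above (the function name is illustrative):

```typescript
// Toggle an optional SQL filter: if articles are supplied, swap the
// commented marker for a live IN clause; otherwise leave it commented.
const marker = "--and r.ArticleHumanReadableId in ([articles])";

const applyArticleFilter = (sql: string, articles?: string): string =>
  sql.replace(
    marker,
    articles ? `and r.ArticleHumanReadableId in (${articles})` : marker,
  );
```

Note the pattern interpolates the option string straight into the SQL text, so it trusts whatever the caller passes in.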

@@ -49,4 +49,12 @@ export const datamartData = [
description: `Returns all open orders to correct and resubmit via lst demand mgt; leaving it blank will get everything, putting an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
options: "address",
},
{
name: "Production Data",
endpoint: "productionData",
description: `Returns all production data from the date range with the option to have 1 to many avs to search by.`,
options: "startDate,endDate,articles",
optionsRequired: true,
howManyOptionsRequired: 2,
},
];


@@ -0,0 +1,10 @@
import { integer, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
export const deploymentHistory = pgTable("deployment_history", {
id: uuid("id").defaultRandom().primaryKey(),
serverId: uuid("server_id"),
buildNumber: integer("build_number").notNull(),
status: text("status").notNull(), // started, success, failed
message: text("message"),
createdAt: timestamp("created_at").defaultNow(),
});


@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const invHistoricalData = pgTable("inv_historical_data", {
inv: uuid("id").defaultRandom().primaryKey(),
histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
plantToken: text("plant_token"),
article: text("article").notNull(),
articleDescription: text("article_description").notNull(),
materialType: text("material_type"),
total_QTY: text("total_QTY"),
available_QTY: text("available_QTY"),
coa_QTY: text("coa_QTY"),
held_QTY: text("held_QTY"),
consignment_QTY: text("consignment_qty"),
lot_Number: text("lot_number"),
locationId: text("location_id"),
location: text("location"),
whseId: text("whse_id").default(""),
whseName: text("whse_name").default("missing whseName"),
upd_user: text("upd_user").default("lst-system"),
upd_date: timestamp("upd_date").defaultNow(),
});
export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);
export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;


@@ -0,0 +1,40 @@
import {
boolean,
integer,
pgTable,
text,
timestamp,
uuid,
} from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const serverData = pgTable(
"server_data",
{
server_id: uuid("id").defaultRandom().primaryKey(),
name: text("name").notNull(),
server: text("server"),
plantToken: text("plant_token").notNull().unique(),
idAddress: text("id_address"),
greatPlainsPlantCode: text("great_plains_plant_code"),
contactEmail: text("contact_email"),
contactPhone: text("contact_phone"),
active: boolean("active").default(true),
serverLoc: text("server_loc"),
lastUpdated: timestamp("last_updated").defaultNow(),
buildNumber: integer("build_number"),
isUpgrading: boolean("is_upgrading").default(false),
},
// (table) => [
// // uniqueIndex('emailUniqueIndex').on(sql`lower(${table.email})`),
// uniqueIndex("plant_token").on(table.plantToken),
// ],
);
export const serverDataSchema = createSelectSchema(serverData);
export const newServerDataSchema = createInsertSchema(serverData);
export type ServerDataSchema = z.infer<typeof serverDataSchema>;
export type NewServerData = z.infer<typeof newServerDataSchema>;


@@ -1,10 +1,27 @@
-import type { InferSelectModel } from "drizzle-orm";
-import { integer, pgTable, text, timestamp } from "drizzle-orm/pg-core";
+import {
+boolean,
+integer,
+jsonb,
+pgTable,
+text,
+timestamp,
+} from "drizzle-orm/pg-core";
+import { createInsertSchema, createSelectSchema } from "drizzle-zod";
+import type z from "zod";
-export const serverStats = pgTable("stats", {
+export const appStats = pgTable("app_stats", {
-id: text("id").primaryKey().default("serverStats"),
+id: text("id").primaryKey().default("primary"),
-build: integer("build").notNull().default(1),
+currentBuild: integer("current_build").notNull().default(1),
-lastUpdate: timestamp("last_update").defaultNow(),
+lastBuildAt: timestamp("last_build_at"),
+lastDeployAt: timestamp("last_deploy_at"),
+building: boolean("building").notNull().default(false),
+updating: boolean("updating").notNull().default(false),
+lastUpdated: timestamp("last_updated").defaultNow(),
+meta: jsonb("meta").$type<Record<string, unknown>>().default({}),
});
-export type ServerStats = InferSelectModel<typeof serverStats>;
+export const appStatsSchema = createSelectSchema(appStats);
+export const newAppStatsSchema = createInsertSchema(appStats, {});
+export type AppStats = z.infer<typeof appStatsSchema>;
+export type NewAppStats = z.infer<typeof newAppStatsSchema>;


@@ -0,0 +1,223 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
type Inventory = {
article: string;
alias: string;
materialType: string;
total_palletQTY: string;
available_QTY: string;
coa_QTY: string;
held_QTY: string;
consignment_qty: string;
lot: string;
locationId: string;
laneDescription: string;
warehouseId: string;
warehouseDescription: string;
};
const historicalInvImport = async () => {
const today = new Date();
const { data, error } = await tryCatch(
db
.select()
.from(invHistoricalData)
.where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "query",
message: `Error getting historical inv info`,
data: error as any,
notify: false,
});
}
if (data?.length === 0) {
const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;
if (!avSQLQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting Article info`,
data: [avSQLQuery.message],
notify: true,
});
}
const { data: inv, error: invError } = await tryCatch(
//prodQuery(sqlQuery.query, "Inventory data"),
runDatamartQuery({
name: "inventory",
options: { lots: "x", locations: "x" },
}),
);
const { data: av, error: avError } = (await tryCatch(
runDatamartQuery({ name: "activeArticles", options: {} }),
)) as any;
if (invError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting inventory info from prod query`,
data: invError as any,
notify: false,
});
}
if (avError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting article info from prod query`,
data: avError as any,
notify: false,
});
}
// shape the data to go into our table
const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
const importInv = (inv.data ? inv.data : []) as Inventory[];
const importData = importInv.map((i) => {
return {
histDate: sql`(NOW())::date`,
plantToken: plantToken,
article: i.article,
articleDescription: i.alias,
materialType:
av.data.filter((a: any) => a.article === i.article).length > 0
? av.data.filter((a: any) => a.article === i.article)[0]
?.TypeOfMaterial
: "Item not defined",
total_QTY: i.total_palletQTY ?? "0.00",
available_QTY: i.available_QTY ?? "0.00",
coa_QTY: i.coa_QTY ?? "0.00",
held_QTY: i.held_QTY ?? "0.00",
consignment_QTY: i.consignment_qty ?? "0.00",
lot_Number: i.lot ?? "0",
locationId: i.locationId ?? "0",
location: i.laneDescription ?? "Missing lane",
whseId: i.warehouseId ?? "0",
whseName: i.warehouseDescription ?? "Missing warehouse",
};
});
const { data: dataImport, error: errorImport } = await tryCatch(
db.insert(invHistoricalData).values(importData),
);
if (errorImport) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error adding historical data to lst db`,
data: errorImport as any,
notify: true,
});
}
if (dataImport) {
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data was added to lst :D`,
data: [],
notify: false,
});
}
} else {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical Data for: ${format(today, "yyyy-MM-dd")}, is already added and nothing to do.`,
data: [],
notify: false,
});
}
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Some weird crazy error just happened and didn't get captured during the historical inv check.`,
data: [],
notify: true,
});
};
export const historicalSchedule = async () => {
// running the history in case my silly ass does an update around the shift change time lol, this will prevent data loss. it might be off a little but no one cares
historicalInvImport();
const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;
if (!sqlQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange sql file`,
data: [sqlQuery.message],
notify: false,
});
}
const { data, error } = await tryCatch(
prodQuery(sqlQuery.query, "Shift Change data"),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange info`,
data: error as any,
notify: false,
});
}
// shift split
const shiftTimeSplit = data?.data[0]?.shiftChange.split(":");
const cronSetup = `0 ${
shiftTimeSplit?.length > 1 ? `${parseInt(shiftTimeSplit[1])}` : "0"
} ${
shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
} * * *`;
createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
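`historicalSchedule` turns the plant's shift-change time (an "HH:mm" string from the `shiftChange` SQL query) into a six-field cron expression for croner, falling back to 07:00 when the value is missing. A sketch of that mapping with the same defaults (helper name is mine):

```typescript
// Build the six-field cron string "sec min hour * * *" from an
// "HH:mm" shift-change time, defaulting to 07:00 when it is missing.
const cronFromShiftChange = (shiftChange?: string): string => {
  const parts = shiftChange ? shiftChange.split(":") : [];
  const minute = parts.length > 1 ? `${parseInt(parts[1], 10)}` : "0";
  const hour =
    parts.length > 0 && parts[0] !== "" ? `${parseInt(parts[0], 10)}` : "7";
  return `0 ${minute} ${hour} * * *`;
};
```

So a shift change of "06:30" schedules the historical import daily at 06:30:00, which is when `createCronJob("historicalInv", ...)` fires `historicalInvImport`.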


@@ -62,7 +62,7 @@ export const printerSync = async () => {
});
}
-if (printers?.success) {
+if (printers?.success && Array.isArray(printers.data)) {
const ignorePrinters = ["pdf24", "standard"];
const validPrinters =


@@ -1,6 +1,6 @@
use AlplaPROD_test1
-SELECT V_Artikel.IdArtikelvarianten,
+SELECT V_Artikel.IdArtikelvarianten as article,
V_Artikel.Bezeichnung,
V_Artikel.ArtikelvariantenTypBez,
V_Artikel.PreisEinheitBez,


@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
Make a note in the docs that activeArticles will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]
select a.Id,
a.HumanReadableId as av,
a.Alias as alias,
p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
p.LoadingUnitPieces,
case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a
/* sales price */
left join
(select *
from (select
id,
PackagingId,
ArticleId,
DefaultCustomer,
ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
from masterData.SalesPrice (nolock)
where DefaultCustomer = 1) as x
where RowNum = 1
) as s
on a.id = s.ArticleId
/* pkg instructions */
left join
masterData.PackagingInstruction (nolock) as p
on s.PackagingId = p.id
/* stock limits */
left join
masterData.StockLimit (nolock) as i
on a.id = i.ArticleId
where a.active = 1
and a.HumanReadableId in ([articles])


@@ -13,12 +13,12 @@ r.[ArticleHumanReadableId]
,ea.JournalNummer as BOL_Number
,[ReleaseConfirmationState]
,[PlanningState]
---,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
+,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
-,r.[OrderDate]
+--,r.[OrderDate]
---,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
+,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
-,r.[DeliveryDate]
+--,r.[DeliveryDate]
---,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
+,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
-,r.[LoadingDate]
+--,r.[LoadingDate]
,[Quantity]
,[DeliveredQuantity]
,r.[AdditionalInformation1]
@@ -66,9 +66,9 @@ ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
zz.IdLieferschein = ea.IdJournal
where
---r.ArticleHumanReadableId in ([articles])
--r.ReleaseNumber = 1452
r.DeliveryDate between @StartDate AND @EndDate
and DeliveredQuantity > 0
+--and r.ArticleHumanReadableId in ([articles])
--and Journalnummer = 169386

@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
,ArticleHumanReadableId
,CustomerArticleNumber
,ArticleDescription
,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate

@@ -1,22 +1,29 @@
+use [test1_AlplaPROD2.0_Read]
 select
-ArticleHumanReadableId as av
+ArticleHumanReadableId as article
 ,ArticleAlias as alias
-,round(sum(QuantityLoadingUnits),0) total_pallets
-,round(sum(Quantity),0) as total_palletQTY
-,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),0) avalible_Pallets
-,round(sum(case when State = 0 then Quantity else 0 end),0) available_QTY
-,sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end) as coa_Pallets
-,sum(case when b.HumanReadableId = 864 then Quantity else 0 end) as coa_QTY
-,sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end) as held_Pallets
-,sum(case when b.HumanReadableId <> 864 then Quantity else 0 end) as held_QTY
+,round(sum(QuantityLoadingUnits),2) total_pallets
+,round(sum(Quantity),2) as total_palletQTY
+,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
+,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
+,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
+,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
+,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
+,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
+,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
+,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
 --,l.RunningNumber
-/** should be in line **/
---,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber
-/** should be in line **/
+/** datamart include lot number **/
+--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot
+/** data mart include location data **/
 --,l.WarehouseDescription,l.LaneDescription
 ,articleTypeName
-FROM [test1_AlplaPROD2.0_Read].[warehousing].[WarehouseUnit] as l
+FROM [warehousing].[WarehouseUnit] as l (nolock)
 left join
 (
 SELECT [Id]
@@ -24,21 +31,28 @@ SELECT [Id]
 ,d.[Description]
 ,[DefectGroupId]
 ,[IsActive]
-FROM [test1_AlplaPROD2.0_Read].[blocking].[BlockingDefect] as g
+FROM [blocking].[BlockingDefect] as g (nolock)
 left join
 [AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
 d.IdGlobalBlockingDefect = g.HumanReadableId
 ) as b on
 b.id = l.MainDefectId
+left join
+[warehousing].[warehouse] as w (nolock) on
+w.id = l.warehouseid
 where LaneHumanReadableId not in (20000,21000)
 group by ArticleHumanReadableId,
 ArticleAlias,
 ArticleTypeName
 --,l.RunningNumber
-/** should be in line **/
+/** datamart include lot number **/
 --,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber
-/** should be in line **/
+/** data mart include location data **/
 --,l.WarehouseDescription,l.LaneDescription
 order by ArticleHumanReadableId

@@ -0,0 +1,48 @@
select
x.idartikelVarianten as article,
x.ArtikelVariantenAlias as alias
--x.Lfdnr as RunningNumber,
,round(sum(EinlagerungsMengeVPKSum),2) as total_pallets
,sum(EinlagerungsMengeSum) as total_palletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as available_Pallets
,sum(VerfuegbareMengeSum) as available_QTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as coa_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as coa_QTY
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeVPKSum else 0 end) as held_Pallets
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeSum else 0 end) as held_QTY
,sum(case when x.WarenLagerLagerTyp = 8 then VerfuegbareMengeSum else 0 end) as consignment_qty
,IdProdPlanung as lot
----,IdAdressen,
,x.AdressBez
,x.IdLagerAbteilung as locationId
,x.LagerAbteilungKurzBez as laneDescription
,x.IdWarenlager as warehouseId
,x.WarenLagerKurzBez as warehouseDescription
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x
left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt as l(nolock) on
x.Lfdnr = l.Lfdnr AND l.Lfdnr > 1
left join
(SELECT *
FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over.
IdAdressen = 3
*/
where /*IdArtikelTyp = 1 and */x.IdWarenlager not in (6, 1)
group by x.idartikelVarianten, ArtikelVariantenAlias, c.Description
--,IdAdressen
,x.AdressBez
,IdProdPlanung
,x.IdLagerAbteilung
,x.LagerAbteilungKurzBez
,x.IdWarenlager
,x.WarenLagerKurzBez
--, x.Lfdnr
order by x.IdArtikelVarianten

@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]
declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'
select MachineLocation,
ArticleHumanReadableId as article,
sum(Quantity) as Produced,
count(Quantity) as palletsProdued,
FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
ProductionLotHumanReadableId as productionLot
from [reporting_productionControlling].[ScannedUnit] (nolock)
where convert(date, ProductionDay) between @startDate and @endDate
and ArticleHumanReadableId in ([articles])
and BookedOut is null
group by MachineLocation, ArticleHumanReadableId,ProductionDay, ProductionLotHumanReadableId

@@ -0,0 +1,79 @@
use AlplaPROD_test1
/**
Move this over to the delivery date range query once we have the shift data mapped over correctly.
Update the PSI references in this query as well.
**/
DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
SELECT
r.[ArticleHumanReadableId]
,[ReleaseNumber]
,h.CustomerOrderNumber
,x.CustomerLineItemNumber
,[CustomerReleaseNumber]
,[ReleaseState]
,[DeliveryState]
,ea.JournalNummer as BOL_Number
,[ReleaseConfirmationState]
,[PlanningState]
--,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
,r.[OrderDate]
--,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
,r.[DeliveryDate]
--,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
,r.[LoadingDate]
,[Quantity]
,[DeliveredQuantity]
,r.[AdditionalInformation1]
,r.[AdditionalInformation2]
,[TradeUnits]
,[LoadingUnits]
,[Trucks]
,[LoadingToleranceType]
,[SalesPrice]
,[Currency]
,[QuantityUnit]
,[SalesPriceRemark]
,r.[Remark]
,[Irradiated]
,r.[CreatedByEdi]
,[DeliveryAddressHumanReadableId]
,DeliveryAddressDescription
,[CustomerArtNo]
,[TotalPrice]
,r.[ArticleAlias]
FROM [order].[Release] (nolock) as r
left join
[order].LineItem as x on
r.LineItemId = x.id
left join
[order].Header as h on
x.HeaderId = h.id
--bol stuff
left join
AlplaPROD_test1.dbo.V_LadePlanungenLadeAuftragAbruf (nolock) as zz
on zz.AbrufIdAuftragsAbruf = r.ReleaseNumber
left join
(select * from (SELECT
ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
,*
FROM [AlplaPROD_test1].[dbo].[T_Lieferungen] (nolock)) x
where RowNum = 1) as ea on
zz.IdLieferschein = ea.IdJournal
where
r.ArticleHumanReadableId in ([articles])
--r.ReleaseNumber = 1452
and r.DeliveryDate between @StartDate AND @EndDate
--and DeliveredQuantity > 0
--and Journalnummer = 169386

@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over as well as the date structure we want to see
*/
select x.IdArtikelvarianten As Article,
ProduktionAlias as Description,
standort as MachineId,
MaschinenBezeichnung as MachineName,
--MaschZyklus as PlanningCycleTime,
x.IdProdPlanung as LotNumber,
FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
x.planMenge as TotalPlanned,
ProduktionMenge as QTYPerDay,
round(ProduktionMengeVPK, 2) PalDay,
Status as finished
--MaschStdAuslastung as nee
from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x
left join
dbo.V_ProdPlanung (nolock) as p on
x.IdProdPlanung = p.IdProdPlanung
where ProdTag between @start_date and @end_date
and p.IdArtikelvarianten in ([articles])
--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
--and IdProdPlanung = 18442
order by ProdTag desc

@@ -0,0 +1,11 @@
SELECT count(*) as activated
FROM [test1_AlplaPROD2.0_Read].[support].[FeatureActivation]
where feature in (108,7)
/*
As more features are activated and need this check before their endpoints are enabled, add them here so we can verify activation.
108 = waste
7 = warehousing
*/

@@ -0,0 +1,4 @@
select top(1) convert(varchar(8) ,
convert(time,startdate), 108) as shiftChange
from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
where teamNumber = 1

@@ -45,7 +45,7 @@ export const monitorAlplaPurchase = async () => {
 }
 if (purchaseMonitor[0]?.active) {
-createCronJob("purchaseMonitor", "0 */5 * * * *", async () => {
+createCronJob("purchaseMonitor", "0 5 * * * *", async () => {
 try {
 const result = await prodQuery(
 sqlQuery.query.replace(
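The two schedules in this hunk differ only in the minute field of a six-field (seconds-first) cron expression: `0 */5 * * * *` fires at second 0 of every 5th minute, while `0 5 * * * *` fires once an hour at minute 5. A minimal, illustrative matcher for just the minute field (this is not the croner library, only a sketch of the semantics):

```typescript
// Minimal, illustrative matcher for a single cron minute field.
// Supports only "*", "N", and "*/N" - enough to contrast the two schedules.
function minuteMatches(field: string, minute: number): boolean {
  if (field === "*") return true;
  if (field.startsWith("*/")) {
    const step = Number(field.slice(2));
    return minute % step === 0; // step pattern: every Nth minute
  }
  return minute === Number(field); // literal pattern: one specific minute
}

// Which minutes of an hour would fire for a given field.
const firingMinutes = (field: string) =>
  Array.from({ length: 60 }, (_, m) => m).filter((m) => minuteMatches(field, m));
```

Here `firingMinutes("*/5")` yields 12 minutes per hour (0, 5, ..., 55), while `firingMinutes("5")` yields only minute 5.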

@@ -1,5 +1,5 @@
import type { Express } from "express"; import type { Express } from "express";
import { setupAdminRoutes } from "./admin/admin.routes.js";
import { setupAuthRoutes } from "./auth/auth.routes.js"; import { setupAuthRoutes } from "./auth/auth.routes.js";
// import the routes and route setups // import the routes and route setups
import { setupApiDocsRoutes } from "./configs/scaler.config.js"; import { setupApiDocsRoutes } from "./configs/scaler.config.js";
@@ -16,6 +16,7 @@ import { setupUtilsRoutes } from "./utils/utils.routes.js";
export const setupRoutes = (baseUrl: string, app: Express) => { export const setupRoutes = (baseUrl: string, app: Express) => {
//routes that are on by default //routes that are on by default
setupSystemRoutes(baseUrl, app); setupSystemRoutes(baseUrl, app);
setupAdminRoutes(baseUrl, app);
setupApiDocsRoutes(baseUrl, app); setupApiDocsRoutes(baseUrl, app);
setupProdSqlRoutes(baseUrl, app); setupProdSqlRoutes(baseUrl, app);
setupGPSqlRoutes(baseUrl, app); setupGPSqlRoutes(baseUrl, app);

@@ -6,6 +6,7 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
 import { type Setting, settings } from "./db/schema/settings.schema.js";
 import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
 import { createLogger } from "./logger/logger.controller.js";
+import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
 import { startNotifications } from "./notification/notification.controller.js";
 import { createNotifications } from "./notification/notifications.master.js";
 import { printerSync } from "./ocp/ocp.printer.manage.js";
@@ -14,6 +15,7 @@ import { opendockSocketMonitor } from "./opendock/opendockSocketMonitor.utils.js
 import { connectProdSql } from "./prodSql/prodSqlConnection.controller.js";
 import { monitorAlplaPurchase } from "./purchase/purchase.controller.js";
 import { setupSocketIORoutes } from "./socket.io/serverSetup.js";
+import { serversChecks } from "./system/serverData.controller.js";
 import { baseSettingValidationCheck } from "./system/settingsBase.controller.js";
 import { startTCPServer } from "./tcpServer/tcp.server.js";
 import { createCronJob } from "./utils/croner.utils.js";
@@ -64,10 +66,12 @@ const start = async () => {
 dbCleanup("jobs", 30),
 );
 createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
+historicalSchedule();
 // one shots only needed to run on server startups
 createNotifications();
 startNotifications();
+serversChecks();
 }, 5 * 1000);
 process.on("uncaughtException", async (err) => {

@@ -9,7 +9,7 @@ type RoomDefinition<T = unknown> = {
 export const protectedRooms: any = {
 logs: { requiresAuth: true, role: ["admin", "systemAdmin"] },
-admin: { requiresAuth: true, role: ["admin", "systemAdmin"] },
+//admin: { requiresAuth: false, role: ["admin", "systemAdmin"] },
 };
 export const roomDefinition: Record<RoomId, RoomDefinition> = {
@@ -36,4 +36,16 @@ export const roomDefinition: Record<RoomId, RoomDefinition> = {
 return [];
 },
 },
+admin: {
+seed: async (limit) => {
+console.info(limit);
+return [];
+},
+},
+"admin:build": {
+seed: async (limit) => {
+console.info(limit);
+return [];
+},
+},
 };

@@ -88,14 +88,12 @@ export const setupSocketIORoutes = (baseUrl: string, server: HttpServer) => {
 });
 }
-const roles = Array.isArray(config.role) ? config.role : [config.role];
-console.log(roles, s.user.role);
+const roles = Array.isArray(config?.role) ? config?.role : [config?.role];
 //if (config?.role && s.user?.role !== config.role) {
 if (config?.role && !roles.includes(s.user?.role)) {
 return s.emit("room-error", {
-room: rn,
+roomId: rn,
 message: `Not authorized to be in room: ${rn}`,
 });
 }

@@ -1 +1 @@
-export type RoomId = "logs" | "labels"; //| "alerts" | "metrics";
+export type RoomId = "logs" | "labels" | "admin" | "admin:build"; //| "alerts" | "metrics";

@@ -0,0 +1,154 @@
import { sql } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import {
type NewServerData,
serverData,
} from "../db/schema/serverData.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { tryCatch } from "../utils/trycatch.utils.js";
const servers: NewServerData[] = [
{
name: "Test server 1",
server: "USMCD1VMS036",
plantToken: "test3",
idAddress: "10.193.0.56",
greatPlainsPlantCode: "00",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Test server 2",
server: "USIOW1VMS036",
plantToken: "test2",
idAddress: "10.75.0.56",
greatPlainsPlantCode: "00",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Lima",
server: "USLIM1VMS006",
plantToken: "uslim1",
idAddress: "10.53.0.26",
greatPlainsPlantCode: "50",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Houston",
server: "ushou1VMS006",
plantToken: "ushou1",
idAddress: "10.195.0.26",
greatPlainsPlantCode: "20",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Dayton",
server: "usday1VMS006",
plantToken: "usday1",
idAddress: "10.44.0.56",
greatPlainsPlantCode: "80",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "West Bend",
server: "usweb1VMS006",
plantToken: "usweb1",
idAddress: "10.80.0.26",
greatPlainsPlantCode: "65",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Jeff City",
server: "usjci1VMS006",
plantToken: "usjci",
idAddress: "10.167.0.26",
greatPlainsPlantCode: "40",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "Sherman",
server: "usshe1vms006",
plantToken: "usshe1",
idAddress: "10.205.0.26",
greatPlainsPlantCode: "21",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
{
name: "McDonough",
server: "USMCD1VMS006",
plantToken: "usmcd1",
idAddress: "10.193.0.26",
greatPlainsPlantCode: "10",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 82,
},
{
name: "St. Peters",
server: "USTP1VMS006",
plantToken: "usstp1",
idAddress: "10.37.0.26",
greatPlainsPlantCode: "45",
contactEmail: "",
contactPhone: "",
serverLoc: "D$\\LST_V3",
buildNumber: 1,
},
];
export const serversChecks = async () => {
const log = createLogger({ module: "system", subModule: "serverData" });
const { data, error } = await tryCatch(
db
.insert(serverData)
.values(servers)
.onConflictDoUpdate({
target: serverData.plantToken,
set: {
server: sql`excluded.server`,
name: sql`excluded.name`,
idAddress: sql`excluded."id_address"`,
greatPlainsPlantCode: sql`excluded.great_plains_plant_code`,
contactEmail: sql`excluded."contact_email"`,
contactPhone: sql`excluded.contact_phone`,
serverLoc: sql`excluded.server_loc`,
},
})
.returning(),
);
if (error) {
log.error(
{ error: error },
"There was an error when adding or updating the servers.",
);
}
if (data) {
log.info({}, "All Servers were added/updated");
}
};
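`serversChecks` seeds the table with drizzle's `onConflictDoUpdate`, where `excluded.*` refers to the incoming row's values. A drizzle-free sketch of that merge behavior (types and the helper name are hypothetical, not from the repo):

```typescript
// Illustrative model of the upsert in serversChecks: on a plantToken conflict,
// the incoming ("excluded") row's fields overwrite the stored row.
type ServerRow = { plantToken: string; server: string; buildNumber: number };

function upsertByPlantToken(table: ServerRow[], incoming: ServerRow[]): ServerRow[] {
  const byToken = new Map(table.map((r) => [r.plantToken, r]));
  for (const row of incoming) {
    const existing = byToken.get(row.plantToken);
    // Note: the real query does NOT list buildNumber in its conflict set,
    // so a stored build number survives a reseed.
    byToken.set(
      row.plantToken,
      existing ? { ...row, buildNumber: existing.buildNumber } : row,
    );
  }
  return [...byToken.values()];
}
```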

@@ -0,0 +1,43 @@
import { type Response, Router } from "express";
import { db } from "../db/db.controller.js";
import { serverData } from "../db/schema/serverData.schema.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
// export const updateSetting = async (setting: Setting) => {
// // TODO: when the setting is a feature setting, run each kill switch on the crons (we'll just stop them); during a reset they just won't start.
// // TODO: when the setting is a system we will need to force an app restart
// // TODO: when the setting is standard we don't do anything.
// };
const r = Router();
r.get("/", async (_, res: Response) => {
const { data: sName, error: sError } = await tryCatch(
db.select().from(serverData).orderBy(serverData.name),
);
if (sError) {
return apiReturn(res, {
success: false,
level: "error",
module: "system",
subModule: "serverData",
message: `There was an error getting the servers `,
data: [sError],
status: 400,
});
}
return apiReturn(res, {
success: true,
level: "info",
module: "system",
subModule: "serverData",
message: `All current servers`,
data: sName ?? [],
status: 200,
});
});
export default r;

@@ -27,7 +27,7 @@ router.get("/", async (_, res) => {
 ? sqlServerStats?.data[0].UptimeSeconds
 : [],
 eomFGPkgSheetVersion: 1, // this is the excel file version when we have a change to the macro we want to grab this
-masterMacroFile: 1,
+masterMacroFile: 1.1,
 tcpServerOnline: isServerRunning,
 sqlServerConnected: prodSql,
 gpServerConnected: gpSql,

@@ -0,0 +1,49 @@
import fs from "node:fs";
import { Router } from "express";
import path from "path";
import { fileURLToPath } from "url";
const router = Router();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const downloadDir = path.resolve(__dirname, "../../downloads/mobile");
const currentApk = {
packageName: "net.alpla.lst.mobile",
versionName: "0.0.1-alpha",
versionCode: 1,
minSupportedVersionCode: 1,
fileName: "lst-mobile.apk",
};
router.get("/version", async (req, res) => {
const baseUrl = `${req.protocol}://${req.get("host")}`;
res.json({
packageName: currentApk.packageName,
versionName: currentApk.versionName,
versionCode: currentApk.versionCode,
minSupportedVersionCode: currentApk.minSupportedVersionCode,
downloadUrl: `${baseUrl}/lst/api/mobile/apk/latest`,
});
});
router.get("/apk/latest", (_, res) => {
const apkPath = path.join(downloadDir, currentApk.fileName);
if (!fs.existsSync(apkPath)) {
return res.status(404).json({ success: false, message: "APK not found" });
}
res.setHeader("Content-Type", "application/vnd.android.package-archive");
res.setHeader(
"Content-Disposition",
`attachment; filename="${currentApk.fileName}"`,
);
return res.sendFile(apkPath);
});
export default router;
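The `/version` endpoint above gives a mobile client everything it needs to choose between a forced and an optional update. A hedged sketch of that client-side decision (field names follow the endpoint's JSON payload; the function itself is hypothetical):

```typescript
// Hypothetical client-side decision against the /version payload above.
type VersionInfo = { versionCode: number; minSupportedVersionCode: number };

function updateAction(
  installedCode: number,
  info: VersionInfo,
): "force" | "optional" | "none" {
  if (installedCode < info.minSupportedVersionCode) return "force"; // below the supported floor
  if (installedCode < info.versionCode) return "optional"; // a newer APK is available
  return "none"; // already up to date
}
```

A "force" result would send the client straight to `downloadUrl`; "optional" can just surface a prompt.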

@@ -1,13 +1,17 @@
import type { Express } from "express"; import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js"; import { requireAuth } from "../middleware/auth.middleware.js";
import getServers from "./serverData.route.js";
import getSettings from "./settings.route.js"; import getSettings from "./settings.route.js";
import updSetting from "./settingsUpdate.route.js"; import updSetting from "./settingsUpdate.route.js";
import stats from "./stats.route.js"; import stats from "./stats.route.js";
import mobile from "./system.mobileApp.js";
export const setupSystemRoutes = (baseUrl: string, app: Express) => { export const setupSystemRoutes = (baseUrl: string, app: Express) => {
//stats will be like this as we dont need to change this //stats will be like this as we dont need to change this
app.use(`${baseUrl}/api/stats`, stats); app.use(`${baseUrl}/api/stats`, stats);
app.use(`${baseUrl}/api/mobile`, mobile);
app.use(`${baseUrl}/api/settings`, getSettings); app.use(`${baseUrl}/api/settings`, getSettings);
app.use(`${baseUrl}/api/servers`, getServers);
app.use(`${baseUrl}/api/settings`, requireAuth, updSetting); app.use(`${baseUrl}/api/settings`, requireAuth, updSetting);
// all other system should be under /api/system/* // all other system should be under /api/system/*

@@ -0,0 +1,91 @@
import { spawn } from "node:child_process";
import { createLogger } from "../logger/logger.controller.js";
import { emitToRoom } from "../socket.io/roomEmitter.socket.js";
import { updateAppStats } from "./updateAppStats.utils.js";
import { zipBuild } from "./zipper.utils.js";
export const emitBuildLog = (message: string, level = "info") => {
const payload = {
type: "build",
level,
message,
timestamp: new Date().toISOString(),
};
//console.log(`[BUILD][${level.toUpperCase()}] ${message}`);
emitToRoom("admin:build", payload as any);
if (payload.level === "info") {
log.info({ stack: payload }, payload.message);
}
// if (log) {
// log(payload);
// }
};
export let building = false;
const log = createLogger({ module: "utils", subModule: "builds" });
export const build = async () => {
const appDir = process.env.DEV_DIR ?? "";
return new Promise((resolve) => {
building = true;
updateAppStats({
lastUpdated: new Date(),
building: true,
});
emitBuildLog(`Starting build in: ${appDir}`);
const child = spawn("npm", ["run", "build"], {
cwd: appDir,
shell: true,
});
child.stdout.on("data", (data) => {
const lines = data.toString().split(/\r?\n/);
for (const line of lines) {
if (line.trim() !== "") {
emitBuildLog(line, "info");
}
}
});
child.stderr.on("data", (data) => {
const lines = data.toString().split(/\r?\n/);
for (const line of lines) {
if (line.trim() !== "") {
emitBuildLog(line, "error");
}
}
});
child.on("close", (code) => {
if (code === 0) {
emitBuildLog("Build completed successfully.", "info");
building = false;
zipBuild();
resolve(true);
} else {
building = false;
updateAppStats({
lastUpdated: new Date(),
building: false,
});
emitBuildLog(`Build failed with code ${code}`, "error");
//reject(new Error(`Build failed with code ${code}`));
}
});
child.on("error", (err) => {
building = false;
updateAppStats({
lastUpdated: new Date(),
building: false,
});
emitBuildLog(`Process error: ${err.message}`, "error");
// reject(err);
});
});
};

@@ -9,6 +9,7 @@ export const allowedOrigins = [
 "http://localhost:4000",
 "http://localhost:4001",
 "http://localhost:5500",
+"http://localhost:8081",
 "https://admin.socket.io",
 "https://electron-socket-io-playground.vercel.app",
 `${process.env.URL}`,

@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
 import { db } from "../db/db.controller.js";
 import { jobAuditLog } from "../db/schema/auditLog.schema.js";
 import { createLogger } from "../logger/logger.controller.js";
+import type { ReturnHelper } from "./returnHelper.utils.js";
 // example createJob
 // createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
 export const createCronJob = async (
 name: string,
 schedule: string, // cron string with 8 8 IE: */5 * * * * * every 5th second
-task: () => Promise<void>, // what function are we passing over
+task: () => Promise<void | ReturnHelper>, // what function are we passing over
 source = "unknown",
 ) => {
 // get the timezone based on the os timezone set

backend/utils/deployApp.ts Normal file
@@ -0,0 +1,123 @@
import { spawn } from "node:child_process";
import { eq, sql } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { serverData } from "../db/schema/serverData.schema.js";
import { appStats } from "../db/schema/stats.schema.js";
//import { createLogger } from "../logger/logger.controller.js";
import { emitBuildLog } from "./build.utils.js";
import { returnFunc } from "./returnHelper.utils.js";
// const log = createLogger({ module: "utils", subModule: "deploy" });
export let updating = false;
const updateServerBuildNumber = async (token: string) => {
// get the current build
const buildNum = await db.select().from(appStats);
// update the build now
await db
.update(serverData)
.set({ buildNumber: buildNum[0]?.currentBuild, lastUpdated: sql`NOW()` })
.where(eq(serverData.plantToken, token));
};
export const runUpdate = ({
server,
destination,
token,
}: {
server: string;
destination: string;
token: string;
}) => {
return new Promise((resolve, reject) => {
updating = true;
const scriptPath = process.env.UPDATE_SCRIPT_PATH;
if (!scriptPath) {
updating = false;
returnFunc({
success: false,
level: "error",
module: "utils",
subModule: "deploy",
message: "UPDATE_SCRIPT_PATH is not set; please make sure it is set.",
data: [],
notify: true,
room: "admin",
});
return reject(new Error("UPDATE_SCRIPT_PATH is not set."));
}
const args = [
"-ExecutionPolicy",
"Bypass",
"-File",
scriptPath,
"-Server",
server,
"-Destination",
destination,
"-Token",
token,
"-ADM_USER",
process.env.DEV_USER ?? "",
"-ADM_PASSWORD",
process.env.DEV_PASSWORD ?? "",
"-AppDir",
process.env.DEV_DIR ?? "",
];
emitBuildLog(`Starting update for ${server}`);
const child = spawn("powershell.exe", args, {
shell: false,
});
child.stdout.on("data", (data) => {
const lines = data.toString().split(/\r?\n/);
for (const line of lines) {
if (line.trim()) {
emitBuildLog(line);
}
}
});
child.stderr.on("data", (data) => {
const lines = data.toString().split(/\r?\n/);
for (const line of lines) {
if (line.trim()) {
emitBuildLog(line, "error");
}
}
});
child.on("close", (code) => {
if (code === 0) {
emitBuildLog(`Update completed for ${server}`);
updating = false;
updateServerBuildNumber(token);
resolve({
success: true,
message: `Update completed for ${server}`,
data: [],
});
} else {
emitBuildLog(`Update failed for ${server} (code ${code})`, "error");
updating = false;
reject({
success: false,
message: `Update failed for ${server} (code ${code})`,
data: [],
});
}
});
child.on("error", (err) => {
emitBuildLog(`Process error: ${err.message}`, "error");
updating = false;
reject({
success: false,
message: `${server}: Encountered an error while processing: ${err.message} `,
data: err,
});
});
});
};

@@ -1,7 +1,7 @@
import type { Response } from "express"; import type { Response } from "express";
import { createLogger } from "../logger/logger.controller.js"; import { createLogger } from "../logger/logger.controller.js";
interface Data<T = unknown[]> { export interface ReturnHelper<T = unknown[]> {
success: boolean; success: boolean;
module: module:
| "system" | "system"
@@ -13,31 +13,11 @@ interface Data<T = unknown[]> {
| "notification" | "notification"
| "email" | "email"
| "purchase" | "purchase"
| "tcp"; | "tcp"
subModule: | "logistics"
| "db" | "admin";
| "labeling" subModule: string;
| "printer"
| "prodSql"
| "query"
| "sendmail"
| "auth"
| "datamart"
| "jobs"
| "apt"
| "settings"
| "get"
| "update"
| "delete"
| "post"
| "notification"
| "delete"
| "printing"
| "gpSql"
| "email"
| "gpChecks"
| "prodEndpoint"
| "create_server";
level: "info" | "error" | "debug" | "fatal" | "warn"; level: "info" | "error" | "debug" | "fatal" | "warn";
message: string; message: string;
room?: string; room?: string;
@@ -59,7 +39,7 @@ interface Data<T = unknown[]> {
* data: [] the data that will be passed back * data: [] the data that will be passed back
* notify: false by default this is to send a notification to a users email to alert them of an issue. * notify: false by default this is to send a notification to a users email to alert them of an issue.
*/ */
export const returnFunc = (data: Data) => { export const returnFunc = (data: ReturnHelper) => {
const notify = data.notify ? data.notify : false; const notify = data.notify ? data.notify : false;
const room = data.room ?? data.room; const room = data.room ?? data.room;
const log = createLogger({ module: data.module, subModule: data.subModule }); const log = createLogger({ module: data.module, subModule: data.subModule });
@@ -92,7 +72,7 @@ export const returnFunc = (data: Data) => {
export function apiReturn( export function apiReturn(
res: Response, res: Response,
opts: Data & { status?: number }, opts: ReturnHelper & { status?: number },
optional?: unknown, // leave this as unknown so we can pass an object or an array over. optional?: unknown, // leave this as unknown so we can pass an object or an array over.
): Response { ): Response {
const result = returnFunc(opts); const result = returnFunc(opts);

@@ -0,0 +1,17 @@
import { db } from "../db/db.controller.js";
import { appStats } from "../db/schema/stats.schema.js";
export const updateAppStats = async (
data: Partial<typeof appStats.$inferInsert>,
) => {
await db
.insert(appStats)
.values({
id: "primary",
...data,
})
.onConflictDoUpdate({
target: appStats.id,
set: data,
});
};

@@ -0,0 +1,177 @@
import fs from "node:fs";
import fsp from "node:fs/promises";
import path from "node:path";
import archiver from "archiver";
import { createLogger } from "../logger/logger.controller.js";
import { emitBuildLog } from "./build.utils.js";
import { updateAppStats } from "./updateAppStats.utils.js";
const log = createLogger({ module: "utils", subModule: "zip" });
const exists = async (target: string) => {
try {
await fsp.access(target);
return true;
} catch {
return false;
}
};
const getNextBuildNumber = async (buildNumberFile: string) => {
if (!(await exists(buildNumberFile))) {
await fsp.writeFile(buildNumberFile, "1", "utf8");
return 1;
}
const raw = await fsp.readFile(buildNumberFile, "utf8");
const current = Number.parseInt(raw.trim(), 10);
if (Number.isNaN(current) || current < 1) {
await fsp.writeFile(buildNumberFile, "1", "utf8");
return 1;
}
const next = current + 1;
await fsp.writeFile(buildNumberFile, String(next), "utf8");
// update the server with the next build number
await updateAppStats({
currentBuild: next,
lastBuildAt: new Date(),
building: true,
});
return next;
};
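The bump logic in getNextBuildNumber reduces to a small pure rule: a missing, unparsable, or sub-1 value resets the counter to 1; a valid value is incremented. A sketch of just that rule, with the file I/O and stats update omitted (`undefined` models a missing `.buildNumber` file):

```typescript
// Pure core of getNextBuildNumber: undefined models a missing file.
function nextBuildNumber(raw: string | undefined): number {
  if (raw === undefined) return 1;
  const current = Number.parseInt(raw.trim(), 10);
  if (Number.isNaN(current) || current < 1) return 1;
  return current + 1;
}
```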
const cleanupOldBuilds = async (buildFolder: string, maxBuilds: number) => {
const entries = await fsp.readdir(buildFolder, { withFileTypes: true });
const zipFiles: { fullPath: string; name: string; mtimeMs: number }[] = [];
for (const entry of entries) {
if (!entry.isFile()) continue;
if (!/^LSTV3-\d+\.zip$/i.test(entry.name)) continue;
const fullPath = path.join(buildFolder, entry.name);
const stat = await fsp.stat(fullPath);
zipFiles.push({
fullPath,
name: entry.name,
mtimeMs: stat.mtimeMs,
});
}
zipFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
const toRemove = zipFiles.slice(maxBuilds);
for (const file of toRemove) {
await fsp.rm(file.fullPath, { force: true });
emitBuildLog(`Removed old build: ${file.name}`);
}
};
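The retention rule above (keep only the newest maxBuilds archives named `LSTV3-<n>.zip`, judged by mtime) can be expressed as a pure function over a directory listing; the actual deletion and build-log emits are side effects layered on top. The function and type names here are illustrative:

```typescript
type ZipEntry = { name: string; mtimeMs: number };

// Returns the names cleanupOldBuilds would delete: everything past the
// newest `maxBuilds` files whose name matches LSTV3-<number>.zip.
function buildsToRemove(files: ZipEntry[], maxBuilds: number): string[] {
  return files
    .filter((f) => /^LSTV3-\d+\.zip$/i.test(f.name))
    .sort((a, b) => b.mtimeMs - a.mtimeMs)
    .slice(maxBuilds)
    .map((f) => f.name);
}
```

Non-matching files (logs, temp folders) are never candidates for removal, which is what the name regex buys over a plain mtime sort.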
export const zipBuild = async () => {
const appDir = process.env.DEV_DIR ?? "";
const maxBuilds = Number(process.env.MAX_BUILDS ?? 5);
if (!appDir) {
log.error({ notify: true }, "Forgot to add in the dev dir into the env");
return;
}
const includesFile = path.join(appDir, ".includes");
const buildNumberFile = path.join(appDir, ".buildNumber");
const buildFolder = path.join(appDir, "builds");
const tempFolder = path.join(appDir, "temp", "zip-temp");
if (!(await exists(includesFile))) {
log.error({ notify: true }, "Missing .includes file common");
return;
}
await fsp.mkdir(buildFolder, { recursive: true });
const buildNumber = await getNextBuildNumber(buildNumberFile);
const zipFileName = `LSTV3-${buildNumber}.zip`;
const zipFile = path.join(buildFolder, zipFileName);
// make the folders in case they are not created already
emitBuildLog(`Using build number: ${buildNumber}`);
if (await exists(tempFolder)) {
await fsp.rm(tempFolder, { recursive: true, force: true });
}
await fsp.mkdir(tempFolder, { recursive: true });
const includes = (await fsp.readFile(includesFile, "utf8"))
.split(/\r?\n/)
.map((line) => line.trim())
.filter(Boolean);
emitBuildLog(`Preparing zip from ${includes.length} include entries`);
for (const relPath of includes) {
const source = path.join(appDir, relPath);
const dest = path.join(tempFolder, relPath);
if (!(await exists(source))) {
emitBuildLog(`Skipping missing path: ${relPath}`, "error");
continue;
}
const stat = await fsp.stat(source);
await fsp.mkdir(path.dirname(dest), { recursive: true });
if (stat.isDirectory()) {
emitBuildLog(`Copying folder: ${relPath}`);
await fsp.cp(source, dest, { recursive: true });
} else {
emitBuildLog(`Copying file: ${relPath}`);
await fsp.copyFile(source, dest);
}
}
// if something crazy happens and we get the same build lets just reuse it
// if (await exists(zipFile)) {
// await fsp.rm(zipFile, { force: true });
// }
emitBuildLog(`Creating zip: ${zipFile}`);
await new Promise<void>((resolve, reject) => {
const output = fs.createWriteStream(zipFile);
const archive = archiver("zip", { zlib: { level: 9 } });
output.on("close", () => resolve());
output.on("error", reject);
archive.on("error", reject);
archive.pipe(output);
// zip contents of temp folder, not temp folder itself
archive.directory(tempFolder, false);
archive.finalize();
});
await fsp.rm(tempFolder, { recursive: true, force: true });
emitBuildLog(`Zip completed successfully: ${zipFile}`);
await cleanupOldBuilds(buildFolder, maxBuilds);
await updateAppStats({
lastUpdated: new Date(),
building: false,
});
return {
success: true,
buildNumber,
zipFile,
zipFileName,
};
};
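zipBuild treats `.includes` as one relative path per line, and the parsing step has to tolerate Windows line endings, surrounding whitespace, and blank lines. A minimal equivalent of that split/trim/filter chain, pulled out for clarity:

```typescript
// Matches zipBuild's .includes parsing: split on \n or \r\n,
// trim each entry, and drop blanks.
function parseIncludes(contents: string): string[] {
  return contents
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter(Boolean);
}
```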

View File

@@ -1,37 +0,0 @@
meta {
name: Login
type: http
seq: 1
}
post {
url: {{url}}/api/auth/sign-in/email
body: json
auth: inherit
}
headers {
Origin: http://localhost:3000
}
body:json {
{
"email": "blake.matthes@alpla.com",
"password": "nova0511"
}
}
script:post-response {
// // grab the raw Set-Cookie header
// const cookies = res.headers["set-cookie"];
// const sessionCookie = cookies[0].split(";")[0];
// // Save it as an environment variable
// bru.setEnvVar("session_cookie", sessionCookie);
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,35 +0,0 @@
meta {
name: Register
type: http
seq: 2
}
post {
url: {{url}}/api/authentication/register
body: json
auth: inherit
}
body:json {
{
"name":"Blake", // option when in the frontend as we will pass over as username if not added
"username": "matthes01",
"email": "blake.matthes@alpla.com",
"password": "nova0511"
}
}
script:post-response {
// // grab the raw Set-Cookie header
// const cookies = res.headers["set-cookie"];
// const sessionCookie = cookies[0].split(";")[0];
// // Save it as an environment variable
// bru.setEnvVar("session_cookie", sessionCookie);
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,8 +0,0 @@
meta {
name: auth
seq: 5
}
auth {
mode: inherit
}

View File

@@ -1,16 +0,0 @@
meta {
name: getSession
type: http
seq: 3
}
get {
url: {{url}}/api/auth/get-session
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,9 +0,0 @@
{
"version": "1",
"name": "lst_v3",
"type": "collection",
"ignore": [
"node_modules",
".git"
]
}

View File

@@ -1,3 +0,0 @@
docs {
All Api endpoints to the logistics support tool
}

View File

@@ -1,16 +0,0 @@
meta {
name: Get queries
type: http
seq: 1
}
get {
url: {{url}}/api/datamart
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,20 +0,0 @@
meta {
name: Run Query
type: http
seq: 2
}
get {
url: {{url}}/api/datamart/:name
body: none
auth: inherit
}
params:path {
name: activeArticles
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,8 +0,0 @@
meta {
name: datamart
seq: 2
}
auth {
mode: inherit
}

View File

@@ -1,7 +0,0 @@
vars {
url: http://localhost:3000/lst
readerIp: 10.44.14.215
}
vars:secret [
token
]

View File

@@ -1,20 +0,0 @@
meta {
name: Get All notifications.
type: http
seq: 1
}
get {
url: {{url}}/api/notification
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}
docs {
Passing all as a query param will return all notifications, active and inactive
}

View File

@@ -1,24 +0,0 @@
meta {
name: Subscribe to notification
type: http
seq: 2
}
post {
url: {{url}}/api/notification/sub
body: json
auth: inherit
}
body:json {
{
"userId":"m6AbQXFwOXoX3YKLfwWgq2LIdDqS5jqv",
"notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
"emails": ["blake.matthes@alpla.com","blake.matthes@alpla.com"]
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,8 +0,0 @@
meta {
name: notifications
seq: 7
}
auth {
mode: inherit
}

View File

@@ -1,24 +0,0 @@
meta {
name: remove sub notification
type: http
seq: 4
}
delete {
url: {{url}}/api/notification/sub
body: json
auth: inherit
}
body:json {
{
"userId":"0kHd6Kkdub4GW6rK1qa1yjWwqXtvykqT",
"notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
"emails": ["blake.mattes@alpla.com"]
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,16 +0,0 @@
meta {
name: subscriptions
type: http
seq: 5
}
get {
url: {{url}}/api/notification/sub
body: json
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,31 +0,0 @@
meta {
name: update notification
type: http
seq: 6
}
patch {
url: {{url}}/api/notification/:id
body: json
auth: inherit
}
params:path {
id: 0399eb2a-39df-48b7-9f1c-d233cec94d2e
}
body:json {
{
"active" : true,
"options": []
}
}
settings {
encodeUrl: true
timeout: 0
}
docs {
Passing all as a query param will return all notifications, active and inactive
}

View File

@@ -1,24 +0,0 @@
meta {
name: update sub notification
type: http
seq: 3
}
patch {
url: {{url}}/api/notification/sub
body: json
auth: inherit
}
body:json {
{
"userId":"m6AbQXFwOXoX3YKLfwWgq2LIdDqS5jqv",
"notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
"emails": ["cowchmonkey@gmail.com"]
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,22 +0,0 @@
meta {
name: Printer Listenter
type: http
seq: 1
}
post {
url: {{url}}/api/ocp/printer/listener/line_1
body: json
auth: inherit
}
body:json {
{
"message":"xnvjdhhgsdfr"
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,8 +0,0 @@
meta {
name: ocp
seq: 9
}
auth {
mode: inherit
}

View File

@@ -1,16 +0,0 @@
meta {
name: GetApt
type: http
seq: 1
}
get {
url: {{url}}/api/opendock
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,16 +0,0 @@
meta {
name: Sql Start
type: http
seq: 4
}
post {
url: {{url}}/api/system/prodsql/start
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,16 +0,0 @@
meta {
name: Sql restart
type: http
seq: 4
}
post {
url: {{url}}/api/system/prodsql/restart
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,16 +0,0 @@
meta {
name: Sql stop
type: http
seq: 4
}
post {
url: {{url}}/api/system/prodsql/stop
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,8 +0,0 @@
meta {
name: prodSql
seq: 6
}
auth {
mode: inherit
}

View File

@@ -1,8 +0,0 @@
meta {
name: rfidReaders
seq: 8
}
auth {
mode: inherit
}

View File

@@ -1,20 +0,0 @@
meta {
name: reader
type: http
seq: 2
}
post {
url: https://usday1prod.alpla.net/lst/old/api/rfid/mgtevents/line3.1
body: json
auth: inherit
}
body:json {
{}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,20 +0,0 @@
meta {
name: Config
type: http
seq: 2
}
get {
url: https://{{readerIp}}/cloud/config
body: none
auth: bearer
}
auth:bearer {
token: {{token}}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,32 +0,0 @@
meta {
name: Login
type: http
seq: 1
}
get {
url: https://{{readerIp}}/cloud/localRestLogin
body: none
auth: basic
}
auth:basic {
username: admin
password: Zebra123!
}
script:post-response {
const body = res.getBody();
if (body.message) {
bru.setEnvVar("token", body.message);
} else {
bru.setEnvVar("token", "error");
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,237 +0,0 @@
meta {
name: Update Config
type: http
seq: 3
}
put {
url: https://{{readerIp}}/cloud/config
body: json
auth: bearer
}
headers {
Content-Type: application/json
}
auth:bearer {
token: {{token}}
}
body:json {
{
"GPIO-LED": {
"GPODefaults": {
"1": "HIGH",
"2": "HIGH",
"3": "HIGH",
"4": "HIGH"
},
"LEDDefaults": {
"3": "GREEN"
},
"TAG_READ": [
{
"pin": 1,
"state": "HIGH",
"type": "GPO"
}
]
},
"READER-GATEWAY": {
"batching": [
{
"maxPayloadSizePerReport": 256000,
"reportingInterval": 2000
},
{
"maxPayloadSizePerReport": 256000,
"reportingInterval": 2000
}
],
"endpointConfig": {
"data": {
"event": {
"connections": [
{
"additionalOptions": {
"retention": {
"maxEventRetentionTimeInMin": 500,
"maxNumEvents": 150000,
"throttle": 100
}
},
"description": "",
"name": "LST",
"options": {
"URL": "https://usday1prod.alpla.net/lst/old/api/rfid/taginfo/line3.4",
"security": {
"CACertificateFileLocation": "",
"authenticationOptions": {},
"authenticationType": "NONE",
"verifyHost": false,
"verifyPeer": false
}
},
"type": "httpPost"
},
{
"additionalOptions": {
"retention": {
"maxEventRetentionTimeInMin": 500,
"maxNumEvents": 150000,
"throttle": 100
}
},
"description": "",
"name": "mgt",
"options": {
"URL": "https://usday1prod.alpla.net/lst/old/api/rfid/mgtevents/line3.4",
"security": {
"CACertificateFileLocation": "",
"authenticationOptions": {},
"authenticationType": "NONE",
"verifyHost": false,
"verifyPeer": false
}
},
"type": "httpPost"
}
]
}
}
},
"managementEventConfig": {
"errors": {
"antenna": false,
"cpu": {
"reportIntervalInSec": 1800,
"threshold": 90
},
"database": true,
"flash": {
"reportIntervalInSec": 1800,
"threshold": 90
},
"ntp": true,
"radio": true,
"radio_control": true,
"ram": {
"reportIntervalInSec": 1800,
"threshold": 90
},
"reader_gateway": true,
"userApp": {
"reportIntervalInSec": 1800,
"threshold": 120
}
},
"gpiEvents": true,
"gpoEvents": true,
"heartbeat": {
"fields": {
"radio_control": [
"ANTENNAS",
"RADIO_ACTIVITY",
"RADIO_CONNECTION",
"CPU",
"RAM",
"UPTIME",
"NUM_ERRORS",
"NUM_WARNINGS",
"NUM_TAG_READS",
"NUM_TAG_READS_PER_ANTENNA",
"NUM_DATA_MESSAGES_TXED",
"NUM_RADIO_PACKETS_RXED"
],
"reader_gateway": [
"NUM_DATA_MESSAGES_RXED",
"NUM_MANAGEMENT_EVENTS_TXED",
"NUM_DATA_MESSAGES_TXED",
"NUM_DATA_MESSAGES_RETAINED",
"NUM_DATA_MESSAGES_DROPPED",
"CPU",
"RAM",
"UPTIME",
"NUM_ERRORS",
"NUM_WARNINGS",
"INTERFACE_CONNECTION_STATUS",
"NOLOCKQ_DEPTH"
],
"system": [
"CPU",
"FLASH",
"NTP",
"RAM",
"SYSTEMTIME",
"TEMPERATURE",
"UPTIME",
"GPO",
"GPI",
"POWER_NEGOTIATION",
"POWER_SOURCE",
"MAC_ADDRESS",
"HOSTNAME"
],
"userapps": [
"STATUS",
"CPU",
"RAM",
"UPTIME",
"NUM_DATA_MESSAGES_RXED",
"NUM_DATA_MESSAGES_TXED",
"INCOMING_DATA_BUFFER_PERCENTAGE_REMAINING",
"OUTGOING_DATA_BUFFER_PERCENTAGE_REMAINING"
]
},
"interval": 60
},
"userappEvents": true,
"warnings": {
"cpu": {
"reportIntervalInSec": 1800,
"threshold": 80
},
"database": true,
"flash": {
"reportIntervalInSec": 1800,
"threshold": 80
},
"ntp": true,
"radio_api": true,
"radio_control": true,
"ram": {
"reportIntervalInSec": 1800,
"threshold": 80
},
"reader_gateway": true,
"temperature": {
"ambient": 75,
"pa": 105
},
"userApp": {
"reportIntervalInSec": 1800,
"threshold": 60
}
}
},
"retention": [
{
"maxEventRetentionTimeInMin": 500,
"maxNumEvents": 150000,
"throttle": 100
},
{
"maxEventRetentionTimeInMin": 500,
"maxNumEvents": 150000,
"throttle": 100
}
]
}
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,12 +0,0 @@
meta {
name: readerSpecific
}
auth {
mode: basic
}
auth:basic {
username: admin
password: Zebra123!
}

View File

@@ -1,20 +0,0 @@
meta {
name: Get Settings
type: http
seq: 3
}
get {
url: {{url}}/api/settings
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}
docs {
returns all settings
}

View File

@@ -1,16 +0,0 @@
meta {
name: Status
type: http
seq: 1
}
get {
url: {{url}}/api/stats
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,33 +0,0 @@
meta {
name: updateSetting
type: http
seq: 2
}
patch {
url: {{url}}/api/settings/opendock_sync
body: json
auth: inherit
}
body:json {
{
"value" : "1",
"active": "true"
}
}
settings {
encodeUrl: true
timeout: 0
}
docs {
Allows changing a setting identified by the path parameter.
* when the setting being changed is a feature, background logic will stop that feature's processes so it no longer runs.
* when the setting being changed is a system setting, the entire app will do a full restart
* when the setting being changed is standard, nothing happens until the next action completes. Example: someone prints a label after you change the default from 90 seconds to 120 seconds.
}

View File

@@ -1,16 +0,0 @@
meta {
name: Active Jobs
type: http
seq: 5
}
get {
url: {{url}}/api/utils/croner
body: none
auth: inherit
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -1,22 +0,0 @@
meta {
name: Change job status
type: http
seq: 2
}
patch {
url: {{url}}/api/utils/croner/stop
body: json
auth: inherit
}
body:json {
{
"name": "open-dock-monitor"
}
}
settings {
encodeUrl: true
timeout: 0
}

View File

@@ -19,6 +19,8 @@
"better-auth": "^1.5.5",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"date-fns": "^4.1.0",
"date-fns-tz": "^3.2.0",
"lucide-react": "^0.577.0",
"next-themes": "^0.4.6",
"radix-ui": "^1.4.3",
@@ -6016,6 +6018,25 @@
"node": ">= 12"
}
},
"node_modules/date-fns": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/date-fns/-/date-fns-4.1.0.tgz",
"integrity": "sha512-Ukq0owbQXxa/U3EGtsdVBkR1w7KOQ5gIBqdH2hkvknzZPYvBxb/aa6E8L7tmjFtkwZBu3UXBbjIgPo/Ez4xaNg==",
"license": "MIT",
"funding": {
"type": "github",
"url": "https://github.com/sponsors/kossnocorp"
}
},
"node_modules/date-fns-tz": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/date-fns-tz/-/date-fns-tz-3.2.0.tgz",
"integrity": "sha512-sg8HqoTEulcbbbVXeg84u5UnlsQa8GS5QXMqjjYIhS4abEVVKIUwe0/l/UhrZdKaL/W5eWZNlbTeEIiOXTcsBQ==",
"license": "MIT",
"peerDependencies": {
"date-fns": "^3.0.0 || ^4.0.0"
}
},
"node_modules/debug": {
"version": "4.4.3",
"resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",

View File

@@ -34,7 +34,9 @@
"tailwind-merge": "^3.5.0",
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0",
"zod": "^4.3.6",
"date-fns": "^4.1.0",
"date-fns-tz": "^3.2.0"
},
"devDependencies": {
"@eslint/js": "^9.36.0",

View File

@@ -1,5 +1,5 @@
import { Link } from "@tanstack/react-router";
import { Bell, Logs, Server, Settings } from "lucide-react";
import {
  SidebarGroup,
@@ -40,6 +40,14 @@ export default function AdminSidebar({ session }: any) {
module: "admin",
active: true,
},
{
title: "Servers",
url: "/admin/servers",
icon: Server,
role: ["systemAdmin", "admin"],
module: "admin",
active: true,
},
{
title: "Logs",
url: "/admin/logs",

View File

@@ -1,22 +1,55 @@
import { useCallback, useEffect, useState } from "react";
import socket from "@/lib/socket.io";
type RoomUpdatePayload<T> = {
  roomId: string;
  payloads: T[];
};
type RoomErrorPayload = {
  roomId?: string;
  message?: string;
};
export function useSocketRoom<T>(
  roomId: string,
  getKey?: (item: T) => string | number,
) {
  const [data, setData] = useState<T[]>([]);
  const [info, setInfo] = useState(
    "No data yet — join the room to start receiving",
  );
  const clearRoom = useCallback(
    (id?: string | number) => {
      if (id !== undefined && getKey) {
        setData((prev) => prev.filter((item) => getKey(item) !== id));
        setInfo(`Removed item ${id}`);
        return;
      }
      setData([]);
      setInfo("Room data cleared");
    },
    [getKey],
  );
  useEffect(() => {
    function handleConnect() {
      socket.emit("join-room", roomId);
      setInfo(`Joined room: ${roomId}`);
    }
    function handleUpdate(payload: RoomUpdatePayload<T>) {
      // protects against other room updates hitting this hook
      if (payload.roomId !== roomId) return;
      setData((prev) => [...payload.payloads, ...prev]);
      setInfo("");
    }
    function handleError(err: RoomErrorPayload) {
      if (err.roomId && err.roomId !== roomId) return;
      setInfo(err.message ?? "Room error");
    }
@@ -31,6 +64,7 @@ export function useSocketRoom<T>(roomId: string) {
    // If already connected, join immediately
    if (socket.connected) {
      socket.emit("join-room", roomId);
      setInfo(`Joined room: ${roomId}`);
    }
    return () => {
@@ -42,5 +76,5 @@ export function useSocketRoom<T>(roomId: string) {
    };
  }, [roomId]);
  return { data, info, clearRoom };
}
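The new clearRoom has two behaviors worth pinning down: given an id plus a getKey extractor it removes only the matching item, and with no id it empties the room. That state transition, extracted as a pure function (the names here are illustrative, not part of the hook's API):

```typescript
// Pure version of clearRoom's setData transition.
function clearItems<T>(
  items: T[],
  getKey?: (item: T) => string | number,
  id?: string | number,
): T[] {
  if (id !== undefined && getKey) {
    return items.filter((item) => getKey(item) !== id);
  }
  return [];
}
```

Note that an id without a getKey falls through to the clear-everything branch, which matches the hook's guard.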

View File

@@ -0,0 +1,22 @@
import { keepPreviousData, queryOptions } from "@tanstack/react-query";
import axios from "axios";
export function servers() {
return queryOptions({
queryKey: ["servers"],
queryFn: () => fetch(),
staleTime: 5000,
refetchOnWindowFocus: true,
placeholderData: keepPreviousData,
});
}
const fetch = async () => {
if (window.location.hostname === "localhost") {
await new Promise((res) => setTimeout(res, 1500));
}
const { data } = await axios.get("/lst/api/servers");
return data.data;
};

View File

@@ -105,6 +105,7 @@ export default function LstTable({
</TableBody>
</Table>
<ScrollBar orientation="horizontal" />
<ScrollBar orientation="vertical" />
</ScrollArea> </ScrollArea>
<div className="flex items-center justify-end space-x-2 py-4">
<Button

View File

@@ -14,6 +14,7 @@ import { Route as IndexRouteImport } from './routes/index'
import { Route as DocsIndexRouteImport } from './routes/docs/index'
import { Route as DocsSplatRouteImport } from './routes/docs/$'
import { Route as AdminSettingsRouteImport } from './routes/admin/settings'
import { Route as AdminServersRouteImport } from './routes/admin/servers'
import { Route as AdminNotificationsRouteImport } from './routes/admin/notifications'
import { Route as AdminLogsRouteImport } from './routes/admin/logs'
import { Route as authLoginRouteImport } from './routes/(auth)/login'
@@ -46,6 +47,11 @@ const AdminSettingsRoute = AdminSettingsRouteImport.update({
path: '/admin/settings',
getParentRoute: () => rootRouteImport,
} as any)
const AdminServersRoute = AdminServersRouteImport.update({
id: '/admin/servers',
path: '/admin/servers',
getParentRoute: () => rootRouteImport,
} as any)
const AdminNotificationsRoute = AdminNotificationsRouteImport.update({
id: '/admin/notifications',
path: '/admin/notifications',
@@ -83,6 +89,7 @@ export interface FileRoutesByFullPath {
'/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/servers': typeof AdminServersRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs/': typeof DocsIndexRoute
@@ -96,6 +103,7 @@ export interface FileRoutesByTo {
'/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/servers': typeof AdminServersRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs': typeof DocsIndexRoute
@@ -110,6 +118,7 @@ export interface FileRoutesById {
'/(auth)/login': typeof authLoginRoute
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/servers': typeof AdminServersRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs/': typeof DocsIndexRoute
@@ -125,6 +134,7 @@ export interface FileRouteTypes {
| '/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/servers'
| '/admin/settings'
| '/docs/$'
| '/docs/'
@@ -138,6 +148,7 @@ export interface FileRouteTypes {
| '/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/servers'
| '/admin/settings'
| '/docs/$'
| '/docs'
@@ -151,6 +162,7 @@ export interface FileRouteTypes {
| '/(auth)/login'
| '/admin/logs'
| '/admin/notifications'
| '/admin/servers'
| '/admin/settings'
| '/docs/$'
| '/docs/'
@@ -165,6 +177,7 @@ export interface RootRouteChildren {
authLoginRoute: typeof authLoginRoute
AdminLogsRoute: typeof AdminLogsRoute
AdminNotificationsRoute: typeof AdminNotificationsRoute
AdminServersRoute: typeof AdminServersRoute
AdminSettingsRoute: typeof AdminSettingsRoute
DocsSplatRoute: typeof DocsSplatRoute
DocsIndexRoute: typeof DocsIndexRoute
@@ -210,6 +223,13 @@ declare module '@tanstack/react-router' {
preLoaderRoute: typeof AdminSettingsRouteImport
parentRoute: typeof rootRouteImport
}
'/admin/servers': {
id: '/admin/servers'
path: '/admin/servers'
fullPath: '/admin/servers'
preLoaderRoute: typeof AdminServersRouteImport
parentRoute: typeof rootRouteImport
}
'/admin/notifications': {
id: '/admin/notifications'
path: '/admin/notifications'
@@ -261,6 +281,7 @@ const rootRouteChildren: RootRouteChildren = {
authLoginRoute: authLoginRoute,
AdminLogsRoute: AdminLogsRoute,
AdminNotificationsRoute: AdminNotificationsRoute,
AdminServersRoute: AdminServersRoute,
AdminSettingsRoute: AdminSettingsRoute,
DocsSplatRoute: DocsSplatRoute,
DocsIndexRoute: DocsIndexRoute,

View File

@@ -0,0 +1,251 @@
import { useSuspenseQuery } from "@tanstack/react-query";
import { createFileRoute, redirect } from "@tanstack/react-router";
import { createColumnHelper } from "@tanstack/react-table";
import axios from "axios";
import { format } from "date-fns-tz";
import { CircleFadingArrowUp, Trash } from "lucide-react";
import { Suspense, useState } from "react";
import { toast } from "sonner";
import { Button } from "../../components/ui/button";
import { Spinner } from "../../components/ui/spinner";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "../../components/ui/tooltip";
import { useSocketRoom } from "../../hooks/socket.io.hook";
import { authClient } from "../../lib/auth-client";
import { servers } from "../../lib/queries/servers";
import LstTable from "../../lib/tableStuff/LstTable";
import SearchableHeader from "../../lib/tableStuff/SearchableHeader";
import SkellyTable from "../../lib/tableStuff/SkellyTable";
export const Route = createFileRoute("/admin/servers")({
beforeLoad: async ({ location }) => {
const { data: session } = await authClient.getSession();
const allowedRole = ["systemAdmin", "admin"];
if (!session?.user) {
throw redirect({
to: "/",
search: {
redirect: location.href,
},
});
}
if (!allowedRole.includes(session.user.role as string)) {
throw redirect({
to: "/",
});
}
return { user: session.user };
},
component: RouteComponent,
});
const ServerTable = () => {
const { data, refetch } = useSuspenseQuery(servers());
const columnHelper = createColumnHelper<any>();
const okToUpdate = ["localhost", "usmcd1olp082"];
const columns = [
columnHelper.accessor("name", {
header: ({ column }) => (
<SearchableHeader column={column} title="Name" searchable={true} />
),
filterFn: "includesString",
cell: (i) => i.getValue(),
}),
columnHelper.accessor("greatPlainsPlantCode", {
header: ({ column }) => (
<SearchableHeader column={column} title="GP Code" />
),
cell: (i) => <span>{i.getValue().toUpperCase()}</span>,
}),
columnHelper.accessor("server", {
header: ({ column }) => (
<SearchableHeader column={column} title="server" />
),
cell: (i) => <span>{i.getValue().toUpperCase()}</span>,
}),
columnHelper.accessor("idAddress", {
header: ({ column }) => (
<SearchableHeader column={column} title="IP Address" />
),
cell: (i) => <span>{i.getValue()}</span>,
}),
];
if (okToUpdate.includes(window.location.hostname)) {
columns.push(
columnHelper.accessor("lastUpdated", {
header: ({ column }) => (
<SearchableHeader column={column} title="Last Update" />
),
cell: (i) => <span>{format(i.getValue(), "M/d/yyyy HH:mm")}</span>,
}),
columnHelper.accessor("buildNumber", {
header: ({ column }) => (
<SearchableHeader column={column} title="Build" />
),
cell: (i) => <span>{i.getValue()}</span>,
}),
columnHelper.accessor("update", {
header: ({ column }) => (
<SearchableHeader column={column} title="Update" searchable={false} />
),
filterFn: "includesString",
cell: (i) => {
// biome-ignore lint: just removing the lint for now to get this going will maybe fix later
const [activeToggle, setActiveToggle] = useState(false);
const onToggle = async () => {
setActiveToggle(true);
toast.success(
`${i.row.original.name} just started the upgrade monitor logs for errors.`,
);
try {
const res = await axios.post(
`/lst/api/admin/build/updateServer`,
{
server: i.row.original.server,
destination: i.row.original.serverLoc,
token: i.row.original.plantToken,
},
{ withCredentials: true },
);
if (res.data.success) {
toast.success(
`${i.row.original.name} has completed its upgrade.`,
);
refetch();
setActiveToggle(false);
}
} catch (error) {
setActiveToggle(false);
console.error(error);
}
};
return (
<div>
<div className="flex items-center space-x-2">
<Button
variant="ghost"
disabled={activeToggle}
onClick={() => onToggle()}
>
{activeToggle ? (
<span>
<Spinner />
</span>
) : (
<span>
<CircleFadingArrowUp />
</span>
)}
</Button>
</div>
</div>
);
},
}),
);
}
return <LstTable data={data} columns={columns} />;
};
function RouteComponent() {
const { data: logs = [], clearRoom } = useSocketRoom<any>("admin:build");
const columnHelper = createColumnHelper<any>();
console.log(window.location);
const logColumns = [
columnHelper.accessor("timestamp", {
header: ({ column }) => (
<SearchableHeader column={column} title="Time" searchable={false} />
),
filterFn: "includesString",
cell: (i) => format(i.getValue(), "M/d/yyyy HH:mm"),
}),
columnHelper.accessor("message", {
header: ({ column }) => (
<SearchableHeader column={column} title="Message" />
),
cell: (i) => (
<Tooltip>
<TooltipTrigger>
{i.getValue()?.length > 250 ? (
<span>{i.getValue().slice(0, 250)}...</span>
) : (
<span>{i.getValue()}</span>
)}
</TooltipTrigger>
<TooltipContent>{i.getValue()}</TooltipContent>
</Tooltip>
),
}),
columnHelper.accessor("clearLog", {
header: ({ column }) => (
<SearchableHeader column={column} title="Clear" />
),
cell: ({ row }) => {
const x = row.original;
return (
<Button
size="icon"
variant="destructive"
onClick={() => clearRoom(x.timestamp)}
>
<Trash />
</Button>
);
},
}),
];
const triggerBuild = async () => {
try {
const res = await axios.post(
`/lst/api/admin/build/release`,
{},
// withCredentials belongs in the axios config argument, not the request body
{ withCredentials: true },
);
if (res.data.success) {
toast.success(res.data.message);
} else {
toast.error(res.data.message);
}
} catch (err) {
console.error(err);
//toast.error(err?.message);
};
return (
<div className="flex flex-col gap-1">
<div className="flex gap-1 justify-end">
<Button onClick={triggerBuild}>Trigger Build</Button>
<Button onClick={() => clearRoom()}>Clear Logs</Button>
</div>
<div className="flex gap-1 w-full">
<div className="w-full">
<Suspense fallback={<SkellyTable />}>
<ServerTable />
</Suspense>
</div>
<div className="w-1/2">
<LstTable data={logs} columns={logColumns} />
</div>
</div>
</div>
);
}

43
lstMobile/.gitignore vendored Normal file

@@ -0,0 +1,43 @@
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files
# dependencies
node_modules/
# Expo
.expo/
dist/
web-build/
expo-env.d.ts
# Native
.kotlin/
*.orig.*
*.jks
*.p8
*.p12
*.key
*.mobileprovision
# Metro
.metro-health-check*
# debug
npm-debug.*
yarn-debug.*
yarn-error.*
# macOS
.DS_Store
*.pem
# local env files
.env*.local
# typescript
*.tsbuildinfo
app-example
# generated native folders
/ios
/android

1
lstMobile/.vscode/extensions.json vendored Normal file

@@ -0,0 +1 @@
{ "recommendations": ["expo.vscode-expo-tools"] }

7
lstMobile/.vscode/settings.json vendored Normal file

@@ -0,0 +1,7 @@
{
"editor.codeActionsOnSave": {
"source.fixAll": "explicit",
"source.organizeImports": "explicit",
"source.sortMembers": "explicit"
}
}

56
lstMobile/README.md Normal file

@@ -0,0 +1,56 @@
# Welcome to your Expo app 👋
This is an [Expo](https://expo.dev) project created with [`create-expo-app`](https://www.npmjs.com/package/create-expo-app).
## Get started
1. Install dependencies
```bash
npm install
```
2. Start the app
```bash
npx expo start
```
In the output, you'll find options to open the app in a
- [development build](https://docs.expo.dev/develop/development-builds/introduction/)
- [Android emulator](https://docs.expo.dev/workflow/android-studio-emulator/)
- [iOS simulator](https://docs.expo.dev/workflow/ios-simulator/)
- [Expo Go](https://expo.dev/go), a limited sandbox for trying out app development with Expo
You can start developing by editing the files inside the **app** directory. This project uses [file-based routing](https://docs.expo.dev/router/introduction).
## Get a fresh project
When you're ready, run:
```bash
npm run reset-project
```
This command will move the starter code to the **app-example** directory and create a blank **app** directory where you can start developing.
### Other setup steps
- To set up ESLint for linting, run `npx expo lint`, or follow our guide on ["Using ESLint and Prettier"](https://docs.expo.dev/guides/using-eslint/)
- If you'd like to set up unit testing, follow our guide on ["Unit Testing with Jest"](https://docs.expo.dev/develop/unit-testing/)
- Learn more about the TypeScript setup in this template in our guide on ["Using TypeScript"](https://docs.expo.dev/guides/typescript/)
## Learn more
To learn more about developing your project with Expo, look at the following resources:
- [Expo documentation](https://docs.expo.dev/): Learn fundamentals, or go into advanced topics with our [guides](https://docs.expo.dev/guides).
- [Learn Expo tutorial](https://docs.expo.dev/tutorial/introduction/): Follow a step-by-step tutorial where you'll create a project that runs on Android, iOS, and the web.
## Join the community
Join our community of developers creating universal apps.
- [Expo on GitHub](https://github.com/expo/expo): View our open source platform and contribute.
- [Discord community](https://chat.expo.dev): Chat with Expo users and ask questions.

47
lstMobile/app.json Normal file

@@ -0,0 +1,47 @@
{
"expo": {
"name": "LST mobile",
"slug": "lst-mobile",
"version": "0.0.1-alpha",
"orientation": "portrait",
"icon": "./assets/images/icon.png",
"scheme": "lstmobile",
"userInterfaceStyle": "automatic",
"ios": {
"icon": "./assets/expo.icon"
},
"android": {
"adaptiveIcon": {
"backgroundColor": "#E6F4FE",
"foregroundImage": "./assets/images/android-icon-foreground.png",
"backgroundImage": "./assets/images/android-icon-background.png",
"monochromeImage": "./assets/images/android-icon-monochrome.png",
"package": "net.alpla.lst.mobile",
"versionCode": 1
},
"predictiveBackGestureEnabled": false,
"package": "com.anonymous.lstMobile"
},
"web": {
"output": "static",
"favicon": "./assets/images/favicon.png"
},
"plugins": [
"expo-router",
[
"expo-splash-screen",
{
"backgroundColor": "#208AEF",
"android": {
"image": "./assets/images/splash-icon.png",
"imageWidth": 76
}
}
]
],
"experiments": {
"typedRoutes": true,
"reactCompiler": true
}
}
}


@@ -0,0 +1,3 @@
<svg width="652" height="606" viewBox="0 0 652 606" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M353.554 0H298.446C273.006 0 249.684 14.6347 237.962 37.9539L4.37994 502.646C-1.04325 513.435 -1.45067 526.178 3.2716 537.313L22.6123 582.918C34.6475 611.297 72.5404 614.156 88.4414 587.885L309.863 222.063C313.34 216.317 319.439 212.826 326 212.826C332.561 212.826 338.659 216.317 342.137 222.063L563.559 587.885C579.46 614.156 617.352 611.297 629.388 582.918L648.728 537.313C653.451 526.178 653.043 513.435 647.62 502.646L414.038 37.9539C402.316 14.6347 378.994 0 353.554 0Z" fill="white"/>
</svg>


Binary file not shown (52 KiB).


@@ -0,0 +1,40 @@
{
"fill" : {
"automatic-gradient" : "extended-srgb:0.00000,0.47843,1.00000,1.00000"
},
"groups" : [
{
"layers" : [
{
"image-name" : "expo-symbol 2.svg",
"name" : "expo-symbol 2",
"position" : {
"scale" : 1,
"translation-in-points" : [
1.1008400065293245e-05,
-16.046875
]
}
},
{
"image-name" : "grid.png",
"name" : "grid"
}
],
"shadow" : {
"kind" : "neutral",
"opacity" : 0.5
},
"translucency" : {
"enabled" : true,
"value" : 0.5
}
}
],
"supported-platforms" : {
"circles" : [
"watchOS"
],
"squares" : "shared"
}
}

Binary file not shown (17 KiB).

Binary file not shown (77 KiB).

Binary file not shown (4.0 KiB).

Binary file not shown (4.0 KiB).

Binary file not shown (4.0 KiB).

Binary file not shown (3.2 KiB).

Some files were not shown because too many files have changed in this diff.