6 Commits

Author SHA1 Message Date
ba3227545d chore(release): 0.0.1-alpha.4
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m4s
Release and Build Image / release (push) Successful in 12s
2026-04-15 07:31:49 -05:00
84909bfcf8 ci(service): changes to the script to allow running the powershell under execution policy restrictions
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-15 07:31:06 -05:00
e0d0ac2077 feat(datamart): psi data has been added :D 2026-04-15 07:29:35 -05:00
52a6c821f4 fix(datamart): error when running build that crashed everything
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m34s
2026-04-14 20:30:34 -05:00
eccaf17332 feat(datamart): migrations completed; remaining is the deactivation that will be run by analytics
Some checks failed
Build and Push LST Docker Image / docker (push) Failing after 39s
2026-04-14 20:25:20 -05:00
6307037985 feat(tcp crud): tcp server start, stop, restart endpoints + status check
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m30s
2026-04-13 17:30:47 -05:00
39 changed files with 4583 additions and 64 deletions

View File

@@ -71,7 +71,8 @@
"prodlabels",
"prolink",
"Skelly",
"trycatch"
"trycatch",
"whse"
],
"gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
"gitea.instanceURL": "https://git.tuffraid.net",

View File

@@ -1,5 +1,49 @@
# All Changes to LST can be found below.
## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)
### 🌟 Enhancements
* **datamart:** migrations completed; remaining is the deactivation that will be run by analytics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))
### 🐛 Bug fixes
* **datamart:** error when running build that crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))
### 📚 Documentation
* **docs:** removed Docusaurus as all docs will be inside LST now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
* **notifications:** docs for intro, notifications, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))
### 🛠️ Code Refactor
* **agent:** changed to have the test servers on their own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
* **purchase:** changes how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
* **reprint:** new query added to deactivate the old notification so there is no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
* **templates:** corrections for new notify process on critical errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))
### 📈 Project changes
* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
* **service:** changes to the script to allow running the powershell under execution policy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))
## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)

View File

@@ -19,7 +19,7 @@ Quick summary of current rewrite/migration goal.
| User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
| User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
| Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
| Datamart | ~~Create~~, ~~Update~~, ~~Run~~, Deactivate | 🟨 In Progress |
| Frontend | Analytics and charts | ⏳ Not Started |
| Docs | Instructions and troubleshooting | ⏳ Not Started |
| One Click Print | Get printers, monitor printers, label process, material process, Special processes | ⏳ Not Started |

View File

@@ -13,6 +13,10 @@
*
* when a criterion is passed over we will handle it by counting how many were passed (up to 3) and then deal with each one respectively
*/
import { and, between, inArray, notInArray } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
@@ -22,37 +26,93 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { datamartData } from "./datamartData.utlis.js";
type Options = {
name: string;
value: string;
};
type Data = {
name: string;
options: Options;
options: any;
optionsRequired?: boolean;
howManyOptionsRequired?: number;
};
const lstDbRun = async (data: Data) => {
if (data.options) {
if (data.name === "psiInventory") {
const ids = data.options.articles.split(",").map((id: any) => id.trim());
const whse = data.options.whseToInclude
? data.options.whseToInclude
.split(",")
.map((w: any) => w.trim())
.filter(Boolean)
: [];
const locations = data.options.exludeLanes
? data.options.exludeLanes
.split(",")
.map((l: any) => l.trim())
.filter(Boolean)
: [];
const conditions = [
inArray(invHistoricalData.article, ids),
between(
invHistoricalData.histDate,
data.options.startDate,
data.options.endDate,
),
];
// only add the warehouse condition if there are any whse values
if (whse.length > 0) {
conditions.push(inArray(invHistoricalData.whseId, whse));
}
// locations we don't want in the system
if (locations.length > 0) {
conditions.push(notInArray(invHistoricalData.location, locations));
}
return await db
.select()
.from(invHistoricalData)
.where(and(...conditions));
}
}
return [];
};
export const runDatamartQuery = async (data: Data) => {
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
const considerLstDBRuns = ["psiInventory"];
if (considerLstDBRuns.includes(data.name)) {
const lstDB = await lstDbRun(data);
return returnFunc({
success: true,
level: "info",
module: "datamart",
subModule: "lstDBrn",
message: `Data for: ${data.name}`,
data: lstDB,
notify: false,
});
}
const sqlQuery = sqlQuerySelector(`datamart.${data.name}`) as SqlQuery;
const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);
// const optionsMissing =
// !data.options || Object.keys(data.options).length === 0;
const optionCount =
Object.keys(data.options).length ===
getDataMartInfo[0]?.howManyOptionsRequired;
const isValid =
Object.keys(data.options ?? {}).length >=
(getDataMartInfo[0]?.howManyOptionsRequired ?? 0);
if (getDataMartInfo[0]?.optionsRequired && !optionCount) {
if (getDataMartInfo[0]?.optionsRequired && !isValid) {
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `This query is required to have the ${getDataMartInfo[0]?.howManyOptionsRequired} options set in order use it.`,
message: `This query is required to have ${getDataMartInfo[0]?.howManyOptionsRequired} option(s) set in order to use it; please add your option(s) data and try again.`,
data: [getDataMartInfo[0].options],
notify: false,
});
@@ -75,10 +135,129 @@ export const runDatamartQuery = async (data: Data) => {
// split the criteria by "," and then update the query
if (data.options) {
Object.entries(data.options ?? {}).forEach(([key, value]) => {
const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
datamartQuery = datamartQuery.replace(pattern, String(value).trim());
});
switch (data.name) {
case "activeArticles":
break;
case "deliveryByDateRange":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`);
break;
case "customerInventory":
datamartQuery = datamartQuery
.replace(
"--and IdAdressen",
`and IdAdressen in (${data.options.customer})`,
)
.replace(
"--and x.IdWarenlager in (0)",
`${data.options.whseToInclude ? `and x.IdWarenlager in (${data.options.whseToInclude})` : `--and x.IdWarenlager in (0)`}`,
);
break;
case "openOrders":
datamartQuery = datamartQuery
.replace("[startDay]", `${data.options.startDay}`)
.replace("[endDay]", `${data.options.endDay}`);
break;
case "inventory":
datamartQuery = datamartQuery
.replaceAll(
"--,l.RunningNumber",
`${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
)
.replaceAll(
"--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
`${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
)
.replaceAll(
"--,l.WarehouseDescription,l.LaneDescription",
`${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
);
// adding in a test for historical check.
if (data.options.historical) {
datamartQuery = datamartQuery
.replace(
"--,l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription",
",l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription",
)
.replace(
"--,l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription",
",l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription",
);
}
break;
case "fakeEDIUpdate":
datamartQuery = datamartQuery.replace(
"--AND h.CustomerHumanReadableId in (0)",
`${data.options.address ? `AND h.CustomerHumanReadableId in (${data.options.address})` : `--AND h.CustomerHumanReadableId in (0)`}`,
);
break;
case "forecast":
datamartQuery = datamartQuery.replace(
"where DeliveryAddressHumanReadableId in ([customers])",
data.options.customers
? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
: "--where DeliveryAddressHumanReadableId in ([customers])",
);
break;
case "activeArticles2":
datamartQuery = datamartQuery.replace(
"and a.HumanReadableId in ([articles])",
data.options.articles
? `and a.HumanReadableId in (${data.options.articles})`
: "--and a.HumanReadableId in ([articles])",
);
break;
case "psiDeliveryData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and IdArtikelVarianten in ([articles])",
data.options.articles
? `and IdArtikelVarianten in (${data.options.articles})`
: "--and IdArtikelVarianten in ([articles])",
);
break;
case "productionData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and ArticleHumanReadableId in ([articles])",
data.options.articles
? `and ArticleHumanReadableId in (${data.options.articles})`
: "--and ArticleHumanReadableId in ([articles])",
);
break;
case "psiPlanningData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and p.IdArtikelvarianten in ([articles])",
data.options.articles
? `and p.IdArtikelvarianten in (${data.options.articles})`
: "--and p.IdArtikelvarianten in ([articles])",
);
break;
default:
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `${data.name} encountered an error as it might not exist in LST; please contact support if this continues to happen`,
data: [sqlQuery.message],
notify: true,
});
}
}
const { data: queryRun, error } = await tryCatch(
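
The psiInventory branch above parses each comma-separated option string the same way before building the Drizzle conditions; a minimal standalone sketch of that parsing (the helper name is hypothetical, not part of the codebase):

// Hypothetical helper mirroring the whseToInclude / exludeLanes handling above.
const parseCsvOption = (raw?: string): string[] =>
  raw
    ? raw.split(",").map((v) => v.trim()).filter(Boolean)
    : [];

console.log(parseCsvOption("36, 41,5")); // ["36", "41", "5"]
console.log(parseCsvOption(undefined)); // []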

View File

@@ -10,14 +10,50 @@ export const datamartData = [
name: "Active articles",
endpoint: "activeArticles",
description: "returns all active articles for the server with custom data",
options: "", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
options: "",
optionsRequired: false,
},
{
name: "Delivery by date range",
endpoint: "deliveryByDateRange",
description: `Returns all Deliverys in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
description: `Returns all Deliveries in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get Customer Inventory",
endpoint: "customerInventory",
description: `Returns specific customer inventory based on their address ID, IE: 8,12,145. \nWith the option to include specific warehouse IDs, IE 36,41,5. \nNOTES: *leaving warehouse blank will just pull everything for the customer; inventory does not include PPOO or INV`,
options: "customer,whseToInclude",
optionsRequired: true,
howManyOptionsRequired: 1,
},
{
name: "Get open order",
endpoint: "openOrders",
description: `Returns open orders based on the day counts sent over, IE: startDay 15 days in the past, endDay 5 days in the future; can be left empty to use the default days`,
options: "startDay,endDay",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get inventory",
endpoint: "inventory",
description: `Returns all inventory, excluding the inv location. Adding an x in one of the options will enable it.`,
options: "includeRunningNumbers,locations,lots",
},
{
name: "Fake EDI Update",
endpoint: "fakeEDIUpdate",
description: `Returns all open orders to correct and resubmit via lst demand mgt; leaving it blank will get everything, while putting in an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
options: "address",
},
{
name: "Production Data",
endpoint: "productionData",
description: `Returns all production data from the date range with the option to have 1 to many AVs to search by.`,
options: "startDate,endDate,articles",
optionsRequired: true,
howManyOptionsRequired: 2,
},
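
Each entry's optionsRequired / howManyOptionsRequired pair feeds the isValid check in datamart.controller.ts above; a self-contained sketch of that rule, with hypothetical sample values:

type DatamartEntry = {
  endpoint: string;
  optionsRequired?: boolean;
  howManyOptionsRequired?: number;
};

// Mirrors the controller check: a required query is usable once the caller
// supplies at least howManyOptionsRequired options.
const hasEnoughOptions = (
  entry: DatamartEntry,
  options: Record<string, string>,
): boolean =>
  !entry.optionsRequired ||
  Object.keys(options ?? {}).length >= (entry.howManyOptionsRequired ?? 0);

const delivery: DatamartEntry = {
  endpoint: "deliveryByDateRange",
  optionsRequired: true,
  howManyOptionsRequired: 2,
};
console.log(hasEnoughOptions(delivery, { startDate: "1/1/2026", endDate: "1/31/2026" })); // true
console.log(hasEnoughOptions(delivery, { startDate: "1/1/2026" })); // false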

View File

@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const invHistoricalData = pgTable("inv_historical_data", {
inv: uuid("id").defaultRandom().primaryKey(),
histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
plantToken: text("plant_token"),
article: text("article").notNull(),
articleDescription: text("article_description").notNull(),
materialType: text("material_type"),
total_QTY: text("total_QTY"),
available_QTY: text("available_QTY"),
coa_QTY: text("coa_QTY"),
held_QTY: text("held_QTY"),
consignment_QTY: text("consignment_qty"),
lot_Number: text("lot_number"),
locationId: text("location_id"),
location: text("location"),
whseId: text("whse_id").default(""),
whseName: text("whse_name").default("missing whseName"),
upd_user: text("upd_user").default("lst-system"),
upd_date: timestamp("upd_date").defaultNow(),
});
export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);
export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;
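
A minimal sketch of writing a row through this schema; the values are hypothetical, and only the NOT NULL columns (histDate, article, articleDescription) have to be supplied since everything else is nullable or defaulted:

import { db } from "../db/db.controller.js";
import {
  invHistoricalData,
  newInvHistoricalDataSchema,
} from "../db/schema/historicalInv.schema.js";

// Validate the payload with the drizzle-zod insert schema, then insert it.
const row = newInvHistoricalDataSchema.parse({
  histDate: "2026-04-14", // always yesterday when posted, per the comment above
  article: "12345",
  articleDescription: "Sample article",
});
await db.insert(invHistoricalData).values(row);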

View File

@@ -0,0 +1,220 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
type Inventory = {
article: string;
alias: string;
materialType: string;
total_palletQTY: string;
available_QTY: string;
coa_QTY: string;
held_QTY: string;
consignment_qty: string;
lot: string;
locationId: string;
laneDescription: string;
warehouseId: string;
warehouseDescription: string;
};
const historicalInvImport = async () => {
const today = new Date();
const { data, error } = await tryCatch(
db
.select()
.from(invHistoricalData)
.where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "query",
message: `Error getting historical inv info`,
data: error as any,
notify: false,
});
}
if (data?.length === 0) {
const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;
if (!avSQLQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting Article info`,
data: [avSQLQuery.message],
notify: true,
});
}
const { data: inv, error: invError } = await tryCatch(
//prodQuery(sqlQuery.query, "Inventory data"),
runDatamartQuery({ name: "inventory", options: { historical: "x" } }),
);
const { data: av, error: avError } = (await tryCatch(
runDatamartQuery({ name: "activeArticles", options: {} }),
)) as any;
if (invError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting inventory info from prod query`,
data: invError as any,
notify: false,
});
}
if (avError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting article info from prod query`,
data: avError as any,
notify: false,
});
}
// shape the data to go into our table
const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
const importInv = (inv.data ? inv.data : []) as Inventory[];
const importData = importInv.map((i) => {
return {
histDate: sql`(NOW())::date`,
plantToken: plantToken,
article: i.article,
articleDescription: i.alias,
materialType:
av.data.filter((a: any) => a.article === i.article).length > 0
? av.data.filter((a: any) => a.article === i.article)[0]
?.TypeOfMaterial
: "Item not defined",
total_QTY: i.total_palletQTY ?? "0.00",
available_QTY: i.available_QTY ?? "0.00",
coa_QTY: i.coa_QTY ?? "0.00",
held_QTY: i.held_QTY ?? "0.00",
consignment_QTY: i.consignment_qty ?? "0.00",
lot_Number: i.lot ?? "0",
locationId: i.locationId ?? "0",
location: i.laneDescription ?? "Missing lane",
whseId: i.warehouseId ?? "0",
whseName: i.warehouseDescription ?? "Missing warehouse",
};
});
const { data: dataImport, error: errorImport } = await tryCatch(
db.insert(invHistoricalData).values(importData),
);
if (errorImport) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error adding historical data to lst db`,
data: errorImport as any,
notify: true,
});
}
if (dataImport) {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data was added to lst :D`,
data: [],
notify: false,
});
}
} else {
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data for ${format(today, "yyyy-MM-dd")} has already been added; nothing to do.`,
data: [],
notify: false,
});
}
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Some weird crazy error just happened and didn't get captured during the historical inv check.`,
data: [],
notify: true,
});
};
export const historicalSchedule = async () => {
// running the history in case my silly ass does an update around the shift change time lol, this will prevent data loss. it might be off a little but no one cares
historicalInvImport();
const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;
if (!sqlQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange sql file`,
data: [sqlQuery.message],
notify: false,
});
}
const { data, error } = await tryCatch(
prodQuery(sqlQuery.query, "Shift Change data"),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange info`,
data: error as any,
notify: false,
});
}
// shift split
const shiftTimeSplit = data?.data[0]?.shiftChange.split(":");
const cronSetup = `0 ${
shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[1])}` : "0"
} ${
shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
} * * *`;
createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
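
The shiftChange query (shown later in this diff) returns an hh:mm:ss string, so the cron string built above resolves as in this worked example, assuming a 19:30 shift change:

// Worked example of the cron construction in historicalSchedule above,
// assuming shiftChange comes back as "19:30:00" (varchar(8), SQL style 108).
const shiftChange = "19:30:00";
const [hour, minute] = shiftChange.split(":");
const cronSetup = `0 ${parseInt(minute ?? "0", 10)} ${parseInt(hour ?? "7", 10)} * * *`;
console.log(cronSetup); // "0 30 19 * * *" -> seconds, minutes, hours: fires daily at 19:30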

View File

@@ -1,6 +1,6 @@
use AlplaPROD_test1
SELECT V_Artikel.IdArtikelvarianten,
SELECT V_Artikel.IdArtikelvarianten as article,
V_Artikel.Bezeichnung,
V_Artikel.ArtikelvariantenTypBez,
V_Artikel.PreisEinheitBez,

View File

@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
make a note in the docs that this activeArticles query will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]
select a.Id,
a.HumanReadableId as av,
a.Alias as alias,
p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
p.LoadingUnitPieces,
case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a
/* sales price */
left join
(select *
from (select
id,
PackagingId,
ArticleId,
DefaultCustomer,
ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
from masterData.SalesPrice (nolock)
where DefaultCustomer = 1) as x
where RowNum = 1
) as s
on a.id = s.ArticleId
/* pkg instructions */
left join
masterData.PackagingInstruction (nolock) as p
on s.PackagingId = p.id
/* stock limits */
left join
masterData.StockLimit (nolock) as i
on a.id = i.ArticleId
where a.active = 1
and a.HumanReadableId in ([articles])

View File

@@ -0,0 +1,45 @@
select x.idartikelVarianten as av
,ArtikelVariantenAlias as Alias
--x.Lfdnr as RunningNumber,
--,round(sum(EinlagerungsMengeVPKSum),0) as Total_Pallets
--,sum(EinlagerungsMengeSum) as Total_PalletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as Avalible_Pallets
,sum(VerfuegbareMengeSum) as Avaliable_PalletQTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as COA_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as COA_QTY
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as Held_Pallets
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeSum else 0 end) as Held_QTY
,IdProdPlanung as Lot
--,IdAdressen
--,x.AdressBez
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x
left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt (nolock) on
x.Lfdnr = T_EtikettenGedruckt.Lfdnr AND T_EtikettenGedruckt.Lfdnr > 1
left join
(SELECT *
FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over
IdAdressen = 3
*/
where
--IdArtikelTyp = 1
x.IdWarenlager not in (6, 1)
--and IdAdressen
--and x.IdWarenlager in (0)
group by x.IdArtikelVarianten
,ArtikelVariantenAlias
,IdProdPlanung
--,c.Description
,IdAdressen
,x.AdressBez
--, x.Lfdnr
order by x.IdArtikelVarianten

View File

@@ -0,0 +1,29 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno as CustomerArticleNumber
,h.CustomerOrderNumber as CustomerOrderNumber
,l.CustomerLineItemNumber as CustomerLineNumber
,r.CustomerReleaseNumber as CustomerRealeaseNumber
,r.Quantity
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as DeliveryDate
,h.CustomerHumanReadableId as CustomerID
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releaseState not in (1, 2, 3, 4)
AND h.CreatedByEdi = 1
AND r.deliveryDate < getdate() + 1
--AND h.CustomerHumanReadableId in (0)
order by r.deliveryDate

View File

@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
,ArticleHumanReadableId
,CustomerArticleNumber
,ArticleDescription
,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate

View File

@@ -0,0 +1,64 @@
use [test1_AlplaPROD2.0_Read]
select
ArticleHumanReadableId as article
,ArticleAlias as alias
,round(sum(QuantityLoadingUnits),2) total_pallets
,round(sum(Quantity),2) as total_palletQTY
,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
/** historical section **/
--,l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription
,articleTypeName
FROM [warehousing].[WarehouseUnit] as l (nolock)
left join
(
SELECT [Id]
,[HumanReadableId]
,d.[Description]
,[DefectGroupId]
,[IsActive]
FROM [blocking].[BlockingDefect] as g (nolock)
left join
[AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
d.IdGlobalBlockingDefect = g.HumanReadableId
) as b on
b.id = l.MainDefectId
left join
[warehousing].[warehouse] as w (nolock) on
w.id = l.warehouseid
where LaneHumanReadableId not in (20000,21000)
group by ArticleHumanReadableId,
ArticleAlias,
ArticleTypeName
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
/** historical section **/
--,l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription
order by ArticleHumanReadableId
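
The commented-out column blocks above are the hooks that datamart.controller.ts flips on with replaceAll; a stripped-down sketch of that toggle pattern (function name hypothetical):

// The controller enables an optional column block by swapping the
// "--"-prefixed marker line in the SQL for its active form.
const toggleSqlBlock = (query: string, marker: string, enable: boolean): string =>
  enable ? query.replaceAll(`--${marker}`, marker) : query;

const sql = "select a, b\n--,l.RunningNumber\nfrom t";
console.log(toggleSqlBlock(sql, ",l.RunningNumber", true));
// select a, b
// ,l.RunningNumber
// from t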

View File

@@ -0,0 +1,33 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno
,r.ArticleHumanReadableId as article
,r.ArticleAlias as articleAlias
,ReleaseNumber
,h.CustomerOrderNumber as header
,l.CustomerLineItemNumber as lineItem
,r.CustomerReleaseNumber as releaseNumber
,r.LoadingUnits
,r.Quantity
,r.TradeUnits
,h.CustomerHumanReadableId
,r.DeliveryAddressDescription
,format(r.LoadingDate, 'MM/dd/yyyy HH:mm') as loadingDate
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as deliveryDate
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releasestate not in (1, 2, 4)
AND r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]
order by r.deliverydate

View File

@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]
declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'
select MachineLocation,
ArticleHumanReadableId as article,
sum(Quantity) as Produced,
count(Quantity) as palletsProdued,
FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
ProductionLotHumanReadableId as productionLot
from [reporting_productionControlling].[ScannedUnit] (nolock)
where convert(date, ProductionDay) between @startDate and @endDate
and ArticleHumanReadableId in ([articles])
and BookedOut is null
group by MachineLocation, ArticleHumanReadableId,ProductionDay, ProductionLotHumanReadableId

View File

@@ -0,0 +1,23 @@
use AlplaPROD_test1
/**
move this over to the delivery date range query once we have the shift data mapped over correctly.
update the psi stuff on this as well.
**/
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
select IdArtikelVarianten,
ArtikelVariantenBez,
sum(Menge) totalDelivered,
case when convert(time, upd_date) between '00:00' and '07:00' then convert(date, upd_date - 1) else convert(date, upd_date) end as ShippingDate
from dbo.V_LadePlanungenLadeAuftragAbruf (nolock)
where upd_date between CONVERT(datetime, @start_date + ' 7:00') and CONVERT(datetime, @end_date + ' 7:00')
and IdArtikelVarianten in ([articles])
group by IdArtikelVarianten, upd_date,
ArtikelVariantenBez
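
The ShippingDate case above folds anything stamped between midnight and 07:00 into the previous day; the same cutoff expressed in TypeScript for illustration only:

// Illustration of the 7:00 shift-day cutoff in the query above: timestamps
// from 00:00 through 07:00 count toward the previous calendar day.
const shiftDay = (ts: Date): string => {
  const d = new Date(ts);
  const beforeCutoff =
    d.getHours() < 7 ||
    (d.getHours() === 7 && d.getMinutes() === 0 && d.getSeconds() === 0);
  if (beforeCutoff) d.setDate(d.getDate() - 1);
  const pad = (n: number) => String(n).padStart(2, "0");
  return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
};

console.log(shiftDay(new Date(2026, 3, 15, 3, 30))); // "2026-04-14"
console.log(shiftDay(new Date(2026, 3, 15, 9, 0))); // "2026-04-15"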

View File

@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over as well as the date structure we want to see
*/
select x.IdArtikelvarianten As Article,
ProduktionAlias as Description,
standort as MachineId,
MaschinenBezeichnung as MachineName,
--MaschZyklus as PlanningCycleTime,
x.IdProdPlanung as LotNumber,
FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
x.planMenge as TotalPlanned,
ProduktionMenge as QTYPerDay,
round(ProduktionMengeVPK, 2) PalDay,
Status as finished
--MaschStdAuslastung as nee
from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x
left join
dbo.V_ProdPlanung (nolock) as p on
x.IdProdPlanung = p.IdProdPlanung
where ProdTag between @start_date and @end_date
and p.IdArtikelvarianten in ([articles])
--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
--and IdProdPlanung = 18442
order by ProdTag desc

View File

@@ -0,0 +1,4 @@
select top(1) convert(varchar(8) ,
convert(time,startdate), 108) as shiftChange
from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
where teamNumber = 1

View File

@@ -10,6 +10,7 @@ import { setupOCPRoutes } from "./ocp/ocp.routes.js";
import { setupOpendockRoutes } from "./opendock/opendock.routes.js";
import { setupProdSqlRoutes } from "./prodSql/prodSql.routes.js";
import { setupSystemRoutes } from "./system/system.routes.js";
import { setupTCPRoutes } from "./tcpServer/tcp.routes.js";
import { setupUtilsRoutes } from "./utils/utils.routes.js";
export const setupRoutes = (baseUrl: string, app: Express) => {
@@ -24,4 +25,5 @@ export const setupRoutes = (baseUrl: string, app: Express) => {
setupOpendockRoutes(baseUrl, app);
setupNotificationRoutes(baseUrl, app);
setupOCPRoutes(baseUrl, app);
setupTCPRoutes(baseUrl, app);
};

View File

@@ -6,6 +6,7 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
import { type Setting, settings } from "./db/schema/settings.schema.js";
import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
import { createLogger } from "./logger/logger.controller.js";
import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
import { startNotifications } from "./notification/notification.controller.js";
import { createNotifications } from "./notification/notifications.master.js";
import { printerSync } from "./ocp/ocp.printer.manage.js";
@@ -64,6 +65,7 @@ const start = async () => {
dbCleanup("jobs", 30),
);
createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
historicalSchedule();
// one shots only needed to run on server startups
createNotifications();

View File

@@ -1,9 +1,12 @@
import { Router } from "express";
import { connected as gpSql } from "../gpSql/gpSqlConnection.controller.js";
import { connected as prodSql } from "../prodSql/prodSqlConnection.controller.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { isServerRunning } from "../tcpServer/tcp.server.js";
const router = Router();
@@ -25,6 +28,9 @@ router.get("/", async (_, res) => {
: [],
eomFGPkgSheetVersion: 1, // this is the excel file version when we have a change to the macro we want to grab this
masterMacroFile: 1,
tcpServerOnline: isServerRunning,
sqlServerConnected: prodSql,
gpServerConnected: gpSql,
});
});

View File

@@ -0,0 +1,14 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import restart from "./tcpRestart.route.js";
import start from "./tcpStart.route.js";
import stop from "./tcpStop.route.js";
export const setupTCPRoutes = (baseUrl: string, app: Express) => {
// stats will be like this as we don't need to change this
app.use(`${baseUrl}/api/tcp/start`, requireAuth, start);
app.use(`${baseUrl}/api/tcp/stop`, requireAuth, stop);
app.use(`${baseUrl}/api/tcp/restart`, requireAuth, restart);
// all other system routes should be under /api/system/*
};

View File

@@ -3,13 +3,14 @@ import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { type PrinterData, printerListen } from "./tcp.printerListener.js";
let tcpServer: net.Server;
const tcpSockets: Set<net.Socket> = new Set();
//let isServerRunning = false;
export let isServerRunning = false;
const port = parseInt(process.env.TCP_PORT ?? "2222", 10);
@@ -39,9 +40,8 @@ const parseTcpAlert = (input: string) => {
name,
};
};
export const startTCPServer = () => {
const log = createLogger({ module: "tcp", submodule: "create_server" });
const log = createLogger({ module: "tcp", submodule: "create_server" });
export const startTCPServer = async () => {
tcpServer = net.createServer(async (socket) => {
tcpSockets.add(socket);
socket.on("data", async (data: Buffer) => {
@@ -103,7 +103,7 @@ export const startTCPServer = () => {
log.info({}, `TCP Server listening on port ${port}`);
});
//isServerRunning = true;
isServerRunning = true;
return returnFunc({
success: true,
level: "info",
@@ -115,3 +115,66 @@ export const startTCPServer = () => {
room: "",
});
};
export const stopTCPServer = async () => {
if (!isServerRunning)
return { success: false, message: "Server is not running" };
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server stopped.",
data: [],
notify: false,
room: "",
});
};
export const restartTCPServer = async () => {
if (!isServerRunning) {
await startTCPServer();
return returnFunc({
success: false,
level: "warn",
module: "tcp",
subModule: "create_server",
message: "Server is not running will try to start it",
data: [],
notify: false,
room: "",
});
} else {
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
await delay(1500);
await startTCPServer();
}
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server has been restarted.",
data: [],
notify: false,
room: "",
});
};

View File

@@ -0,0 +1,19 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { restartTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/restart", async (_, res) => {
const connect = await restartTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "tcp",
subModule: "post",
message: "TCP Server has been restarted",
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;

View File

@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { startTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/start", async (_, res) => {
const connect = await startTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "routes",
subModule: "prodSql",
message: connect.message,
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;

View File

@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { stopTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/stop", async (_, res) => {
const connect = await stopTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "routes",
subModule: "prodSql",
message: connect.message,
data: [],
status: connect.success ? 200 : 400,
});
});
export default r;
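
Note that the mounts in tcp.routes.ts (baseUrl + "/api/tcp/start" and so on) combine with the r.post("/start"), r.post("/stop"), and r.post("/restart") declarations above, so the effective paths as written carry a doubled segment (e.g. POST {{url}}/api/tcp/start/start). A minimal client sketch under that reading, assuming the Bruno localhost url and a placeholder for whatever credentials requireAuth expects:

// Hedged sketch only: the path follows from app.use(`${baseUrl}/api/tcp/restart`, ...)
// plus r.post("/restart", ...); the Authorization header is a placeholder.
const base = "http://localhost:3000/lst/api/tcp";
const res = await fetch(`${base}/restart/restart`, {
  method: "POST",
  headers: { Authorization: `Bearer ${process.env.LST_TOKEN ?? ""}` },
});
console.log(res.status, await res.json());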

View File

@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { jobAuditLog } from "../db/schema/auditLog.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import type { ReturnHelper } from "./returnHelper.utils.js";
// example createJob
// createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
export const createCronJob = async (
name: string,
schedule: string, // six-field cron string (seconds first), IE: */5 * * * * * = every 5th second
task: () => Promise<void>, // what function are we passing over
task: () => Promise<void | ReturnHelper>, // what function are we passing over
source = "unknown",
) => {
// get the timezone based on the os timezone set

View File

@@ -1,7 +1,7 @@
import type { Response } from "express";
import { createLogger } from "../logger/logger.controller.js";
interface Data<T = unknown[]> {
export interface ReturnHelper<T = unknown[]> {
success: boolean;
module:
| "system"
@@ -13,32 +13,11 @@ interface Data<T = unknown[]> {
| "notification"
| "email"
| "purchase"
| "tcp";
subModule:
| "db"
| "labeling"
| "printer"
| "prodSql"
| "query"
| "sendmail"
| "auth"
| "datamart"
| "jobs"
| "apt"
| "settings"
| "get"
| "update"
| "delete"
| "post"
| "notification"
| "delete"
| "printing"
| "gpSql"
| "email"
| "gpChecks"
| "prodEndpoint"
| "create_server";
level: "info" | "error" | "debug" | "fatal";
| "tcp"
| "logistics";
subModule: string;
level: "info" | "error" | "debug" | "fatal" | "warn";
message: string;
room?: string;
data?: T;
@@ -59,7 +38,7 @@ interface Data<T = unknown[]> {
* data: [] the data that will be passed back
* notify: false by default this is to send a notification to a users email to alert them of an issue.
*/
export const returnFunc = (data: Data) => {
export const returnFunc = (data: ReturnHelper) => {
const notify = data.notify ? data.notify : false;
const room = data.room;
const log = createLogger({ module: data.module, subModule: data.subModule });
@@ -92,7 +71,7 @@ export const returnFunc = (data: Data) => {
export function apiReturn(
res: Response,
opts: Data & { status?: number },
opts: ReturnHelper & { status?: number },
optional?: unknown, // leave this as unknown so we can pass an object or an array over.
): Response {
const result = returnFunc(opts);

View File

@@ -5,13 +5,17 @@ meta {
}
get {
url: {{url}}/api/datamart/:name
url: {{url}}/api/datamart/:name?historical=x
body: none
auth: inherit
}
params:query {
historical: x
}
params:path {
name: activeArticles
name: inventory
}
settings {

View File

@@ -1,5 +1,5 @@
vars {
url: http://uslim1vms006:3100/lst
url: http://localhost:3000/lst
readerIp: 10.44.14.215
}
vars:secret [

View File

@@ -0,0 +1,20 @@
CREATE TABLE "inv_historical_data" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"hist_date" date NOT NULL,
"plant_token" text,
"article" text NOT NULL,
"article_description" text NOT NULL,
"material_type" text,
"total_QTY" text,
"available_QTY" text,
"coa_QTY" text,
"held_QTY" text,
"consignment_qty" text,
"lot_number" text,
"location_id" text,
"location" text,
"whse_id" text DEFAULT '',
"whse_name" text DEFAULT 'missing whseName',
"upd_user" text DEFAULT 'lst',
"upd_date" timestamp DEFAULT now()
);

View File

@@ -0,0 +1 @@
ALTER TABLE "inv_historical_data" ALTER COLUMN "upd_user" SET DEFAULT 'lst-system';

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -225,6 +225,20 @@
"when": 1776098377074,
"tag": "0031_numerous_the_phantom",
"breakpoints": true
},
{
"idx": 32,
"version": "7",
"when": 1776245938243,
"tag": "0032_tranquil_onslaught",
"breakpoints": true
},
{
"idx": 33,
"version": "7",
"when": 1776256060808,
"tag": "0033_elite_adam_warlock",
"breakpoints": true
}
]
}

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"license": "ISC",
"dependencies": {
"@dotenvx/dotenvx": "^1.57.0",

View File

@@ -1,6 +1,6 @@
{
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"description": "The tool that supports us in our everyday alplaprod",
"main": "index.js",
"scripts": {

View File

@@ -14,8 +14,8 @@ param (
# .\scripts\services.ps1 -serviceName "LSTV3_app" -option "install" -appPath "D:\LST_V3" -description "Logistics Support Tool" -command "run start"
# server migrations get - reminder to add to old version in pkg "start:lst": "cd lstV2 && npm start",
# .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start"
# .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start:lst"
# powershell.exe -ExecutionPolicy Bypass -File .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start"
# powershell.exe -ExecutionPolicy Bypass -File .\scripts\services.ps1 -serviceName "LSTV2" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start:lst"
$nssmPath = $AppPath + "\nssm.exe"
$npmPath = "C:\Program Files\nodejs\npm.cmd" # Path to npm.cmd