12 Commits

Author SHA1 Message Date
3734d9daac feat(lstmobile): initial scanner setup kinda working
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m7s
2026-04-17 16:47:09 -05:00
a1eeadeec4 fix(psi): refactor psi queries 2026-04-17 16:46:44 -05:00
3639c1b77c fix(logistics): purchasing monitoring was firing on the 5th minute instead of every 5 minutes 2026-04-17 14:47:23 -05:00
cfbc156517 fix(logistics): historical issue where it was being really weird 2026-04-17 08:02:44 -05:00
fb3cd85b41 fix(ocp): fixes to make sure we always have printer.data as an array or don't do anything 2026-04-15 09:20:08 -05:00
5b1c88546f fix(datamart): if we do not have 2.0 warehousing activated we need to use legacy 2026-04-15 08:45:48 -05:00
ba3227545d chore(release): 0.0.1-alpha.4
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m4s
Release and Build Image / release (push) Successful in 12s
2026-04-15 07:31:49 -05:00
84909bfcf8 ci(service): changes to the script to allow running the PowerShell under execution policy restrictions
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-15 07:31:06 -05:00
e0d0ac2077 feat(datamart): psi data has been added :D 2026-04-15 07:29:35 -05:00
52a6c821f4 fix(datamart): error when running build that crashed everything
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m34s
2026-04-14 20:30:34 -05:00
eccaf17332 feat(datamart): migrations completed; remaining is the deactivation that will be run by analytics
Some checks failed
Build and Push LST Docker Image / docker (push) Failing after 39s
2026-04-14 20:25:20 -05:00
6307037985 feat(tcp crud): tcp server start, stop, restart endpoints + status check
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m30s
2026-04-13 17:30:47 -05:00
86 changed files with 19348 additions and 72 deletions

.gitignore vendored

@@ -5,6 +5,7 @@ builds
.buildNumber
temp
brunoApi
downloads
.scriptCreds
node-v24.14.0-x64.msi
postgresql-17.9-2-windows-x64.exe


@@ -71,7 +71,8 @@
"prodlabels",
"prolink",
"Skelly",
"trycatch"
"trycatch",
"whse"
],
"gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
"gitea.instanceURL": "https://git.tuffraid.net",


@@ -1,5 +1,49 @@
# All Changes to LST can be found below.
## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)
### 🌟 Enhancements
* **datamart:** migrations completed; remaining is the deactivation that will be run by analytics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))
### 🐛 Bug fixes
* **datamart:** error when running build that crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))
### 📚 Documentation
* **docs:** removed Docusaurus as all docs will be inside lst now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
* **notifications:** docs for intro, notifications, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))
### 🛠️ Code Refactor
* **agent:** changed to have the test servers on their own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
* **purchase:** changed how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
* **reprint:** new query added to deactivate the old notification so no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
* **templates:** corrections for new notify process on critical errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))
### 📈 Project changes
* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
* **service:** changes to the script to allow running the PowerShell under execution policy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))
## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)


@@ -19,7 +19,7 @@ Quick summary of current rewrite/migration goal.
| User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
| User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
| Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
| Datamart | ~~Create~~, ~~Update~~, ~~Run~~, Deactivate | 🟨 In Progress |
| Frontend | Analytics and charts | ⏳ Not Started |
| Docs | Instructions and troubleshooting | ⏳ Not Started |
| One Click Print | Get printers, monitor printers, label process, material process, Special processes | ⏳ Not Started |


@@ -13,6 +13,10 @@
*
* when criteria are passed over we will handle them by counting how many were passed, up to 3, then deal with each one respectively
*/
import { and, between, inArray, notInArray } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
@@ -22,37 +26,118 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { datamartData } from "./datamartData.utlis.js";
type Options = {
name: string;
value: string;
};
type Data = {
name: string;
options: Options;
options: any;
optionsRequired?: boolean;
howManyOptionsRequired?: number;
};
const lstDbRun = async (data: Data) => {
if (data.options) {
if (data.name === "psiInventory") {
const ids = data.options.articles.split(",").map((id: any) => id.trim());
const whse = data.options.whseToInclude
? data.options.whseToInclude
.split(",")
.map((w: any) => w.trim())
.filter(Boolean)
: [];
const locations = data.options.exludeLanes
? data.options.exludeLanes
.split(",")
.map((l: any) => l.trim())
.filter(Boolean)
: [];
const conditions = [
inArray(invHistoricalData.article, ids),
between(
invHistoricalData.histDate,
data.options.startDate,
data.options.endDate,
),
];
// only add the warehouse condition if there are any whse values
if (whse.length > 0) {
conditions.push(inArray(invHistoricalData.whseId, whse));
}
// locations we don't want in the system
if (locations.length > 0) {
conditions.push(notInArray(invHistoricalData.location, locations));
}
return await db
.select()
.from(invHistoricalData)
.where(and(...conditions));
}
}
return [];
};
export const runDatamartQuery = async (data: Data) => {
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
const considerLstDBRuns = ["psiInventory"];
if (considerLstDBRuns.includes(data.name)) {
const lstDB = await lstDbRun(data);
return returnFunc({
success: true,
level: "info",
module: "datamart",
subModule: "lstDBrn",
message: `Data for: ${data.name}`,
data: lstDB,
notify: false,
});
}
const featureQ = sqlQuerySelector(`featureCheck`) as SqlQuery;
const { data: fd, error: fe } = await tryCatch(
prodQuery(featureQ.query, `Running feature check`),
);
if (fe) {
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `feature check failed`,
data: fe as any,
notify: false,
});
}
// queries that need to be run on legacy until the plant is updated go in here
const doubleQueries = ["inventory"];
const sqlQuery = sqlQuerySelector(
`datamart.${fd.data[0].activated > 0 && !doubleQueries.includes(data.name) ? data.name : `legacy.${data.name}`}`,
) as SqlQuery;
// checking if warehousing 2.0 is active, as it will start to affect a lot of queries for plants that are not on 2.0
const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);
// const optionsMissing =
// !data.options || Object.keys(data.options).length === 0;
const optionCount =
Object.keys(data.options).length ===
getDataMartInfo[0]?.howManyOptionsRequired;
const isValid =
Object.keys(data.options ?? {}).length >=
(getDataMartInfo[0]?.howManyOptionsRequired ?? 0);
if (getDataMartInfo[0]?.optionsRequired && !optionCount) {
if (getDataMartInfo[0]?.optionsRequired && !isValid) {
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `This query is required to have the ${getDataMartInfo[0]?.howManyOptionsRequired} options set in order use it.`,
message: `This query is required to have ${getDataMartInfo[0]?.howManyOptionsRequired} option(s) set in order to use it; please add your option(s) data and try again.`,
data: [getDataMartInfo[0].options],
notify: false,
});
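A minimal sketch of the relaxed option check introduced in this hunk (hypothetical helper name, not part of the codebase): a request passes when it supplies at least the required number of options, rather than exactly that many.

```typescript
// Relaxed validation: count the supplied options and compare against the
// required minimum; a missing options object counts as zero options.
const hasRequiredOptions = (
  options: Record<string, unknown> | undefined,
  required = 0,
): boolean => Object.keys(options ?? {}).length >= required;
```

Extra options no longer reject the request, which matches the `>=` comparison used above.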
@@ -75,10 +160,120 @@ export const runDatamartQuery = async (data: Data) => {
// split the criteria by "," then and then update the query
if (data.options) {
Object.entries(data.options ?? {}).forEach(([key, value]) => {
const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
datamartQuery = datamartQuery.replace(pattern, String(value).trim());
});
switch (data.name) {
case "activeArticles":
break;
case "deliveryByDateRange":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"--and r.ArticleHumanReadableId in ([articles]) ",
data.options.articles
? `and r.ArticleHumanReadableId in (${data.options.articles})`
: "--and r.ArticleHumanReadableId in ([articles]) ",
);
break;
case "customerInventory":
datamartQuery = datamartQuery
.replace(
"--and IdAdressen",
`and IdAdressen in (${data.options.customer})`,
)
.replace(
"--and x.IdWarenlager in (0)",
`${data.options.whseToInclude ? `and x.IdWarenlager in (${data.options.whseToInclude})` : `--and x.IdWarenlager in (0)`}`,
);
break;
case "openOrders":
datamartQuery = datamartQuery
.replace("[startDay]", `${data.options.startDay}`)
.replace("[endDay]", `${data.options.endDay}`);
break;
case "inventory":
datamartQuery = datamartQuery
.replaceAll(
"--,l.RunningNumber",
`${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
)
.replaceAll(
"--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
`${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
)
.replaceAll(
"--,l.WarehouseDescription,l.LaneDescription",
`${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
);
break;
case "fakeEDIUpdate":
datamartQuery = datamartQuery.replace(
"--AND h.CustomerHumanReadableId in (0)",
`${data.options.address ? `AND h.CustomerHumanReadableId in (${data.options.address})` : `--AND h.CustomerHumanReadableId in (0)`}`,
);
break;
case "forecast":
datamartQuery = datamartQuery.replace(
"where DeliveryAddressHumanReadableId in ([customers])",
data.options.customers
? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
: "--where DeliveryAddressHumanReadableId in ([customers])",
);
break;
case "activeArticles2":
datamartQuery = datamartQuery.replace(
"and a.HumanReadableId in ([articles])",
data.options.articles
? `and a.HumanReadableId in (${data.options.articles})`
: "--and a.HumanReadableId in ([articles])",
);
break;
case "psiDeliveryData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"[articles]",
data.options.articles ? `${data.options.articles}` : "[articles]",
);
break;
case "productionData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and ArticleHumanReadableId in ([articles])",
data.options.articles
? `and ArticleHumanReadableId in (${data.options.articles})`
: "--and ArticleHumanReadableId in ([articles])",
);
break;
case "psiPlanningData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and p.IdArtikelvarianten in ([articles])",
data.options.articles
? `and p.IdArtikelvarianten in (${data.options.articles})`
: "--and p.IdArtikelvarianten in ([articles])",
);
break;
default:
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `${data.name} encountered an error as it might not exist in LST; please contact support if this continues to happen`,
data: [sqlQuery.message],
notify: true,
});
}
}
const { data: queryRun, error } = await tryCatch(

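The generic `[key]` substitution loop near the top of this hunk can be sketched in isolation (hypothetical helper name; assumes option keys contain no regex metacharacters — a hardened version would escape them first):

```typescript
// Replace every "[key]" token in the query text with the trimmed option
// value; the "g" flag covers queries that reuse the same placeholder.
const fillPlaceholders = (
  query: string,
  options: Record<string, string>,
): string =>
  Object.entries(options).reduce(
    (q, [key, value]) =>
      q.replace(new RegExp(`\\[${key.trim()}\\]`, "g"), String(value).trim()),
    query,
  );
```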

@@ -10,14 +10,50 @@ export const datamartData = [
name: "Active articles",
endpoint: "activeArticles",
description: "returns all active articles for the server with custom data",
options: "", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
options: "",
optionsRequired: false,
},
{
name: "Delivery by date range",
endpoint: "deliveryByDateRange",
description: `Returns all Deliverys in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
description: `Returns all Deliveries in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get Customer Inventory",
endpoint: "customerInventory",
description: `Returns specific customer inventory based on their address ID, IE: 8,12,145. \nWith option to include specific warehouse IDs, IE 36,41,5. \nNOTES: *leaving warehouse blank will just pull everything for the customer, Inventory does not include PPOO or INV`,
options: "customer,whseToInclude",
optionsRequired: true,
howManyOptionsRequired: 1,
},
{
name: "Get open order",
endpoint: "openOrders",
description: `Returns open orders based on day count sent over, IE: startDay 15 days in the past, endDay 5 days in the future; can be left empty to use these defaults`,
options: "startDay,endDay",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get inventory",
endpoint: "inventory",
description: `Returns all inventory, excludes the inv location. Adding an x in one of the options will enable it.`,
options: "includeRunningNumbers,locations,lots",
},
{
name: "Fake EDI Update",
endpoint: "fakeEDIUpdate",
description: `Returns all open orders to correct and resubmit via lst demand mgt; leaving it blank will get everything, putting an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
options: "address",
},
{
name: "Production Data",
endpoint: "productionData",
description: `Returns all production data from the date range with the option to have 1 to many avs to search by.`,
options: "startDate,endDate,articles",
optionsRequired: true,
howManyOptionsRequired: 2,
},


@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const invHistoricalData = pgTable("inv_historical_data", {
inv: uuid("id").defaultRandom().primaryKey(),
histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
plantToken: text("plant_token"),
article: text("article").notNull(),
articleDescription: text("article_description").notNull(),
materialType: text("material_type"),
total_QTY: text("total_QTY"),
available_QTY: text("available_QTY"),
coa_QTY: text("coa_QTY"),
held_QTY: text("held_QTY"),
consignment_QTY: text("consignment_qty"),
lot_Number: text("lot_number"),
locationId: text("location_id"),
location: text("location"),
whseId: text("whse_id").default(""),
whseName: text("whse_name").default("missing whseName"),
upd_user: text("upd_user").default("lst-system"),
upd_date: timestamp("upd_date").defaultNow(),
});
export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);
export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;


@@ -0,0 +1,223 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
type Inventory = {
article: string;
alias: string;
materialType: string;
total_palletQTY: string;
available_QTY: string;
coa_QTY: string;
held_QTY: string;
consignment_qty: string;
lot: string;
locationId: string;
laneDescription: string;
warehouseId: string;
warehouseDescription: string;
};
const historicalInvImport = async () => {
const today = new Date();
const { data, error } = await tryCatch(
db
.select()
.from(invHistoricalData)
.where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "query",
message: `Error getting historical inv info`,
data: error as any,
notify: false,
});
}
if (data?.length === 0) {
const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;
if (!avSQLQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting Article info`,
data: [avSQLQuery.message],
notify: true,
});
}
const { data: inv, error: invError } = await tryCatch(
//prodQuery(sqlQuery.query, "Inventory data"),
runDatamartQuery({
name: "inventory",
options: { lots: "x", locations: "x" },
}),
);
const { data: av, error: avError } = (await tryCatch(
runDatamartQuery({ name: "activeArticles", options: {} }),
)) as any;
if (invError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting inventory info from prod query`,
data: invError as any,
notify: false,
});
}
if (avError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting article info from prod query`,
data: avError as any,
notify: false,
});
}
// shape the data to go into our table
const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
const importInv = (inv.data ? inv.data : []) as Inventory[];
const importData = importInv.map((i) => {
return {
histDate: sql`(NOW())::date`,
plantToken: plantToken,
article: i.article,
articleDescription: i.alias,
materialType:
av.data.filter((a: any) => a.article === i.article).length > 0
? av.data.filter((a: any) => a.article === i.article)[0]
?.TypeOfMaterial
: "Item not defined",
total_QTY: i.total_palletQTY ?? "0.00",
available_QTY: i.available_QTY ?? "0.00",
coa_QTY: i.coa_QTY ?? "0.00",
held_QTY: i.held_QTY ?? "0.00",
consignment_QTY: i.consignment_qty ?? "0.00",
lot_Number: i.lot ?? "0",
locationId: i.locationId ?? "0",
location: i.laneDescription ?? "Missing lane",
whseId: i.warehouseId ?? "0",
whseName: i.warehouseDescription ?? "Missing warehouse",
};
});
const { data: dataImport, error: errorImport } = await tryCatch(
db.insert(invHistoricalData).values(importData),
);
if (errorImport) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error adding historical data to lst db`,
data: errorImport as any,
notify: true,
});
}
if (dataImport) {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data was added to lst :D`,
data: [],
notify: false,
});
}
} else {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical Data for: ${format(today, "yyyy-MM-dd")}, is already added and nothing to do.`,
data: [],
notify: false,
});
}
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Some weird crazy error just happened and didn't get captured during the historical inv check.`,
data: [],
notify: true,
});
};
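The dedup at the top of historicalInvImport keys on today's date formatted as yyyy-MM-dd (via date-fns format). A dependency-free sketch of that key (hypothetical helper, local-time semantics assumed):

```typescript
// Build the zero-padded "yyyy-MM-dd" key used to decide whether today's
// snapshot already exists in inv_historical_data.
const histDateKey = (d: Date): string =>
  `${d.getFullYear()}-${String(d.getMonth() + 1).padStart(2, "0")}-${String(
    d.getDate(),
  ).padStart(2, "0")}`;
```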
export const historicalSchedule = async () => {
// running the history in case my silly ass does an update around the shift change time lol, this will prevent data loss. it might be off a little but no one cares
historicalInvImport();
const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;
if (!sqlQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange sql file`,
data: [sqlQuery.message],
notify: false,
});
}
const { data, error } = await tryCatch(
prodQuery(sqlQuery.query, "Shift Change data"),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange info`,
data: error as any,
notify: false,
});
}
// shift split
const shiftTimeSplit = data?.data[0]?.shiftChange?.split(":");
const cronSetup = `0 ${
shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[1])}` : "0"
} ${
shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
} * * *`;
createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
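The cron pattern built above can be sketched as a small pure function (hypothetical name), assuming shiftChange arrives as "HH:mm" and falling back to 07:00 when it is missing, matching the defaults in the template literal:

```typescript
// Turn a "HH:mm" shift-change time into the 6-field cron pattern handed to
// createCronJob, defaulting to 07:00 when the time is absent or malformed.
const shiftToCron = (shiftChange?: string): string => {
  const parts = (shiftChange ?? "").split(":");
  const hour = parts.length === 2 ? parseInt(parts[0], 10) : 7;
  const minute = parts.length === 2 ? parseInt(parts[1], 10) : 0;
  return `0 ${minute} ${hour} * * *`;
};
```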


@@ -62,7 +62,7 @@ export const printerSync = async () => {
});
}
if (printers?.success) {
if (printers?.success && Array.isArray(printers.data)) {
const ignorePrinters = ["pdf24", "standard"];
const validPrinters =

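The Array.isArray guard in this hunk can be sketched as a tiny normalizer (hypothetical helper): any payload that is not an array is treated as an empty printer list instead of crashing the sync.

```typescript
// Defensive normalization: only iterate printer.data when it really is an
// array; any other shape (undefined, object, string) yields an empty list.
const toPrinterList = (data: unknown): unknown[] =>
  Array.isArray(data) ? data : [];
```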

@@ -1,6 +1,6 @@
use AlplaPROD_test1
SELECT V_Artikel.IdArtikelvarianten,
SELECT V_Artikel.IdArtikelvarianten as article,
V_Artikel.Bezeichnung,
V_Artikel.ArtikelvariantenTypBez,
V_Artikel.PreisEinheitBez,


@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
make a note in the docs that this activeArticles will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]
select a.Id,
a.HumanReadableId as av,
a.Alias as alias,
p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
p.LoadingUnitPieces,
case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a
/* sales price */
left join
(select *
from (select
id,
PackagingId,
ArticleId,
DefaultCustomer,
ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
from masterData.SalesPrice (nolock)
where DefaultCustomer = 1) as x
where RowNum = 1
) as s
on a.id = s.ArticleId
/* pkg instructions */
left join
masterData.PackagingInstruction (nolock) as p
on s.PackagingId = p.id
/* stock limits */
left join
masterData.StockLimit (nolock) as i
on a.id = i.ArticleId
where a.active = 1
and a.HumanReadableId in ([articles])


@@ -0,0 +1,45 @@
select x.idartikelVarianten as av
,ArtikelVariantenAlias as Alias
--x.Lfdnr as RunningNumber,
--,round(sum(EinlagerungsMengeVPKSum),0) as Total_Pallets
--,sum(EinlagerungsMengeSum) as Total_PalletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as Avalible_Pallets
,sum(VerfuegbareMengeSum) as Avaliable_PalletQTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as COA_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as COA_QTY
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as Held_Pallets
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeSum else 0 end) as Held_QTY
,IdProdPlanung as Lot
--,IdAdressen
--,x.AdressBez
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x
left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt (nolock) on
x.Lfdnr = T_EtikettenGedruckt.Lfdnr AND T_EtikettenGedruckt.Lfdnr > 1
left join
(SELECT *
FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over
IdAdressen = 3
*/
where
--IdArtikelTyp = 1
x.IdWarenlager not in (6, 1)
--and IdAdressen
--and x.IdWarenlager in (0)
group by x.IdArtikelVarianten
,ArtikelVariantenAlias
,IdProdPlanung
--,c.Description
,IdAdressen
,x.AdressBez
--, x.Lfdnr
order by x.IdArtikelVarianten


@@ -0,0 +1,74 @@
use [test1_AlplaPROD2.0_Read]
DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
SELECT
r.[ArticleHumanReadableId]
,[ReleaseNumber]
,h.CustomerOrderNumber
,x.CustomerLineItemNumber
,[CustomerReleaseNumber]
,[ReleaseState]
,[DeliveryState]
,ea.JournalNummer as BOL_Number
,[ReleaseConfirmationState]
,[PlanningState]
,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
--,r.[OrderDate]
,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
--,r.[DeliveryDate]
,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
--,r.[LoadingDate]
,[Quantity]
,[DeliveredQuantity]
,r.[AdditionalInformation1]
,r.[AdditionalInformation2]
,[TradeUnits]
,[LoadingUnits]
,[Trucks]
,[LoadingToleranceType]
,[SalesPrice]
,[Currency]
,[QuantityUnit]
,[SalesPriceRemark]
,r.[Remark]
,[Irradiated]
,r.[CreatedByEdi]
,[DeliveryAddressHumanReadableId]
,DeliveryAddressDescription
,[CustomerArtNo]
,[TotalPrice]
,r.[ArticleAlias]
FROM [order].[Release] (nolock) as r
left join
[order].LineItem as x on
r.LineItemId = x.id
left join
[order].Header as h on
x.HeaderId = h.id
--bol stuff
left join
AlplaPROD_test1.dbo.V_LadePlanungenLadeAuftragAbruf (nolock) as zz
on zz.AbrufIdAuftragsAbruf = r.ReleaseNumber
left join
(select * from (SELECT
ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
,*
FROM [AlplaPROD_test1].[dbo].[T_Lieferungen] (nolock)) x
where RowNum = 1) as ea on
zz.IdLieferschein = ea.IdJournal
where
--r.ReleaseNumber = 1452
r.DeliveryDate between @StartDate AND @EndDate
and DeliveredQuantity > 0
--and r.ArticleHumanReadableId in ([articles])
--and Journalnummer = 169386


@@ -0,0 +1,29 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno as CustomerArticleNumber
,h.CustomerOrderNumber as CustomerOrderNumber
,l.CustomerLineItemNumber as CustomerLineNumber
,r.CustomerReleaseNumber as CustomerRealeaseNumber
,r.Quantity
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as DeliveryDate
,h.CustomerHumanReadableId as CustomerID
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releaseState not in (1, 2, 3, 4)
AND h.CreatedByEdi = 1
AND r.deliveryDate < getdate() + 1
--AND h.CustomerHumanReadableId in (0)
order by r.deliveryDate


@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
,ArticleHumanReadableId
,CustomerArticleNumber
,ArticleDescription
,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate


@@ -0,0 +1,58 @@
use [test1_AlplaPROD2.0_Read]
select
ArticleHumanReadableId as article
,ArticleAlias as alias
,round(sum(QuantityLoadingUnits),2) total_pallets
,round(sum(Quantity),2) as total_palletQTY
,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
,articleTypeName
FROM [warehousing].[WarehouseUnit] as l (nolock)
left join
(
SELECT [Id]
,[HumanReadableId]
,d.[Description]
,[DefectGroupId]
,[IsActive]
FROM [blocking].[BlockingDefect] as g (nolock)
left join
[AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
d.IdGlobalBlockingDefect = g.HumanReadableId
) as b on
b.id = l.MainDefectId
left join
[warehousing].[warehouse] as w (nolock) on
w.id = l.warehouseid
where LaneHumanReadableId not in (20000,21000)
group by ArticleHumanReadableId,
ArticleAlias,
ArticleTypeName
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
order by ArticleHumanReadableId


@@ -0,0 +1,48 @@
select
x.idartikelVarianten as article,
x.ArtikelVariantenAlias as alias
--x.Lfdnr as RunningNumber,
,round(sum(EinlagerungsMengeVPKSum),2) as total_pallets
,sum(EinlagerungsMengeSum) as total_palletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as available_Pallets
,sum(VerfuegbareMengeSum) as available_QTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as coa_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as coa_QTY
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeVPKSum else 0 end) as held_Pallets
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeSum else 0 end) as held_QTY
,sum(case when x.WarenLagerLagerTyp = 8 then VerfuegbareMengeSum else 0 end) as consignment_qty
,IdProdPlanung as lot
----,IdAdressen,
,x.AdressBez
,x.IdLagerAbteilung as locationId
,x.LagerAbteilungKurzBez as laneDescription
,x.IdWarenlager as warehouseId
,x.WarenLagerKurzBez as warehouseDescription
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x
left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt as l(nolock) on
x.Lfdnr = l.Lfdnr AND l.Lfdnr > 1
left join
(SELECT *
FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in excell by default everything will be passed over
IdAdressen = 3
*/
where /*IdArtikelTyp = 1 and */x.IdWarenlager not in (6, 1)
group by x.idartikelVarianten, ArtikelVariantenAlias, c.Description
--,IdAdressen
,x.AdressBez
,IdProdPlanung
,x.IdLagerAbteilung
,x.LagerAbteilungKurzBez
,x.IdWarenlager
,x.WarenLagerKurzBez
--, x.Lfdnr
order by x.IdArtikelVarianten


@@ -0,0 +1,33 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno
,r.ArticleHumanReadableId as article
,r.ArticleAlias as articleAlias
,ReleaseNumber
,h.CustomerOrderNumber as header
,l.CustomerLineItemNumber as lineItem
,r.CustomerReleaseNumber as releaseNumber
,r.LoadingUnits
,r.Quantity
,r.TradeUnits
,h.CustomerHumanReadableId
,r.DeliveryAddressDescription
,format(r.LoadingDate, 'MM/dd/yyyy HH:mm') as loadingDate
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as deliveryDate
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releasestate not in (1, 2, 4)
AND r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]
order by r.deliverydate
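The bracketed tokens in this query ([startDay], [endDay], [articles]) are placeholders the service substitutes before execution, as seen in the `sqlQuery.query.replace(...)` calls elsewhere in the diff. A minimal sketch of that substitution, with the helper name and shape assumed rather than taken from the repo:

```typescript
// Hypothetical helper (name assumed): replaces every [token] in a SQL
// template with the corresponding parameter value before execution.
function fillTemplate(
  sql: string,
  params: Record<string, string | number>,
): string {
  return Object.entries(params).reduce(
    (acc, [key, value]) => acc.split(`[${key}]`).join(String(value)),
    sql,
  );
}

const sql =
  "r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]";
console.log(fillTemplate(sql, { startDay: 2, endDay: 5 }));
// "r.deliverydate between getDate() + -2 and getdate() + 5"
```

Note this is plain string substitution, not parameter binding, so the values must come from trusted configuration rather than user input.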


@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]
declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'
select MachineLocation,
ArticleHumanReadableId as article,
sum(Quantity) as Produced,
count(Quantity) as palletsProduced,
FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
ProductionLotHumanReadableId as productionLot
from [reporting_productionControlling].[ScannedUnit] (nolock)
where convert(date, ProductionDay) between @startDate and @endDate
and ArticleHumanReadableId in ([articles])
and BookedOut is null
group by MachineLocation, ArticleHumanReadableId,ProductionDay, ProductionLotHumanReadableId


@@ -1,5 +1,10 @@
use [test1_AlplaPROD2.0_Read]
use AlplaPROD_test1
/**
Move this over to the delivery date range query once we have the shift data mapped over correctly.
Update the PSI queries on this as well.
**/
DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
SELECT
@@ -66,9 +71,9 @@ ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
zz.IdLieferschein = ea.IdJournal
where
--r.ArticleHumanReadableId in ([articles])
r.ArticleHumanReadableId in ([articles])
--r.ReleaseNumber = 1452
r.DeliveryDate between @StartDate AND @EndDate
and DeliveredQuantity > 0
--and Journalnummer = 169386
and r.DeliveryDate between @StartDate AND @EndDate
--and DeliveredQuantity > 0
--and Journalnummer = 169386


@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over as well as the date structure we want to see
*/
select x.IdArtikelvarianten As Article,
ProduktionAlias as Description,
standort as MachineId,
MaschinenBezeichnung as MachineName,
--MaschZyklus as PlanningCycleTime,
x.IdProdPlanung as LotNumber,
FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
x.planMenge as TotalPlanned,
ProduktionMenge as QTYPerDay,
round(ProduktionMengeVPK, 2) PalDay,
Status as finished
--MaschStdAuslastung as nee
from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x
left join
dbo.V_ProdPlanung (nolock) as p on
x.IdProdPlanung = p.IdProdPlanung
where ProdTag between @start_date and @end_date
and p.IdArtikelvarianten in ([articles])
--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
--and IdProdPlanung = 18442
order by ProdTag desc


@@ -0,0 +1,11 @@
SELECT count(*) as activated
FROM [test1_AlplaPROD2.0_Read].[support].[FeatureActivation]
where feature in (108,7)
/*
as more features get activated and need this check, add them here so the new endpoints can be included.
108 = waste
7 = warehousing
*/
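For context, a minimal sketch of how the count returned by this query might gate the 2.0 endpoints (the type and function names here are assumptions, not from the repo): both feature 108 (waste) and feature 7 (warehousing) must be active, so a count below 2 would mean falling back to the legacy path.

```typescript
// Assumed row shape for the activation-count query above.
type ActivationRow = { activated: number };

// The query counts rows matching features (108, 7), so a result of 2
// means both are active; anything less falls back to legacy behavior.
function useWarehousing2(rows: ActivationRow[]): boolean {
  return (rows[0]?.activated ?? 0) >= 2;
}

console.log(useWarehousing2([{ activated: 2 }])); // true  -> 2.0 endpoints
console.log(useWarehousing2([{ activated: 1 }])); // false -> legacy path
```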


@@ -0,0 +1,4 @@
select top(1) convert(varchar(8),
convert(time, startdate), 108) as shiftChange -- style 108 = hh:mi:ss
from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
where teamNumber = 1


@@ -45,7 +45,7 @@ export const monitorAlplaPurchase = async () => {
}
if (purchaseMonitor[0]?.active) {
createCronJob("purchaseMonitor", "0 */5 * * * *", async () => {
createCronJob("purchaseMonitor", "0 5 * * * *", async () => {
try {
const result = await prodQuery(
sqlQuery.query.replace(
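The two schedules in this hunk differ only in the minute field, which is what the commit message about "every 5th min instead of every 5 min" refers to. A small illustrative helper (not from the repo) showing how that field is interpreted in a 6-field cron string:

```typescript
// Illustrative only: in a 6-field cron string the order is
// second minute hour day month weekday, so "0 */5 * * * *" fires
// every 5 minutes while "0 5 * * * *" fires once an hour at :05.
function minuteFieldMatches(minute: number, field: string): boolean {
  if (field === "*") return true;
  if (field.startsWith("*/")) {
    const step = parseInt(field.slice(2), 10);
    return minute % step === 0; // "*/5" -> 0, 5, 10, ...
  }
  return minute === parseInt(field, 10); // "5" -> only minute 5
}

console.log(minuteFieldMatches(10, "*/5")); // true
console.log(minuteFieldMatches(10, "5"));   // false
```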


@@ -10,6 +10,7 @@ import { setupOCPRoutes } from "./ocp/ocp.routes.js";
import { setupOpendockRoutes } from "./opendock/opendock.routes.js";
import { setupProdSqlRoutes } from "./prodSql/prodSql.routes.js";
import { setupSystemRoutes } from "./system/system.routes.js";
import { setupTCPRoutes } from "./tcpServer/tcp.routes.js";
import { setupUtilsRoutes } from "./utils/utils.routes.js";
export const setupRoutes = (baseUrl: string, app: Express) => {
@@ -24,4 +25,5 @@ export const setupRoutes = (baseUrl: string, app: Express) => {
setupOpendockRoutes(baseUrl, app);
setupNotificationRoutes(baseUrl, app);
setupOCPRoutes(baseUrl, app);
setupTCPRoutes(baseUrl, app);
};


@@ -6,6 +6,7 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
import { type Setting, settings } from "./db/schema/settings.schema.js";
import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
import { createLogger } from "./logger/logger.controller.js";
import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
import { startNotifications } from "./notification/notification.controller.js";
import { createNotifications } from "./notification/notifications.master.js";
import { printerSync } from "./ocp/ocp.printer.manage.js";
@@ -64,6 +65,7 @@ const start = async () => {
dbCleanup("jobs", 30),
);
createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
historicalSchedule();
// one shots only needed to run on server startups
createNotifications();


@@ -1,9 +1,12 @@
import { Router } from "express";
import { connected as gpSql } from "../gpSql/gpSqlConnection.controller.js";
import { connected as prodSql } from "../prodSql/prodSqlConnection.controller.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { isServerRunning } from "../tcpServer/tcp.server.js";
const router = Router();
@@ -25,6 +28,9 @@ router.get("/", async (_, res) => {
: [],
eomFGPkgSheetVersion: 1, // this is the excel file version when we have a change to the macro we want to grab this
masterMacroFile: 1,
tcpServerOnline: isServerRunning,
sqlServerConnected: prodSql,
gpServerConnected: gpSql,
});
});


@@ -0,0 +1,49 @@
import fs from "node:fs";
import { Router } from "express";
import path from "path";
import { fileURLToPath } from "url";
const router = Router();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const downloadDir = path.resolve(__dirname, "../../downloads/mobile");
const currentApk = {
packageName: "net.alpla.lst.mobile",
versionName: "0.0.1-alpha",
versionCode: 1,
minSupportedVersionCode: 1,
fileName: "lst-mobile.apk",
};
router.get("/version", async (req, res) => {
const baseUrl = `${req.protocol}://${req.get("host")}`;
res.json({
packageName: currentApk.packageName,
versionName: currentApk.versionName,
versionCode: currentApk.versionCode,
minSupportedVersionCode: currentApk.minSupportedVersionCode,
downloadUrl: `${baseUrl}/lst/api/mobile/apk/latest`,
});
});
router.get("/apk/latest", (_, res) => {
const apkPath = path.join(downloadDir, currentApk.fileName);
if (!fs.existsSync(apkPath)) {
return res.status(404).json({ success: false, message: "APK not found" });
}
res.setHeader("Content-Type", "application/vnd.android.package-archive");
res.setHeader(
"Content-Disposition",
`attachment; filename="${currentApk.fileName}"`,
);
return res.sendFile(apkPath);
});
export default router;


@@ -3,10 +3,12 @@ import { requireAuth } from "../middleware/auth.middleware.js";
import getSettings from "./settings.route.js";
import updSetting from "./settingsUpdate.route.js";
import stats from "./stats.route.js";
import mobile from "./system.mobileApp.js";
export const setupSystemRoutes = (baseUrl: string, app: Express) => {
//stats will be like this as we don't need to change this
app.use(`${baseUrl}/api/stats`, stats);
app.use(`${baseUrl}/api/mobile`, mobile);
app.use(`${baseUrl}/api/settings`, getSettings);
app.use(`${baseUrl}/api/settings`, requireAuth, updSetting);


@@ -0,0 +1,14 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import restart from "./tcpRestart.route.js";
import start from "./tcpStart.route.js";
import stop from "./tcpStop.route.js";
export const setupTCPRoutes = (baseUrl: string, app: Express) => {
//stats will be like this as we don't need to change this
app.use(`${baseUrl}/api/tcp/start`, requireAuth, start);
app.use(`${baseUrl}/api/tcp/stop`, requireAuth, stop);
app.use(`${baseUrl}/api/tcp/restart`, requireAuth, restart);
// all other system should be under /api/system/*
};


@@ -3,13 +3,14 @@ import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { type PrinterData, printerListen } from "./tcp.printerListener.js";
let tcpServer: net.Server;
const tcpSockets: Set<net.Socket> = new Set();
//let isServerRunning = false;
export let isServerRunning = false;
const port = parseInt(process.env.TCP_PORT ?? "2222", 10);
@@ -39,9 +40,8 @@ const parseTcpAlert = (input: string) => {
name,
};
};
export const startTCPServer = () => {
const log = createLogger({ module: "tcp", submodule: "create_server" });
const log = createLogger({ module: "tcp", submodule: "create_server" });
export const startTCPServer = async () => {
tcpServer = net.createServer(async (socket) => {
tcpSockets.add(socket);
socket.on("data", async (data: Buffer) => {
@@ -103,7 +103,7 @@ export const startTCPServer = () => {
log.info({}, `TCP Server listening on port ${port}`);
});
//isServerRunning = true;
isServerRunning = true;
return returnFunc({
success: true,
level: "info",
@@ -115,3 +115,66 @@ export const startTCPServer = () => {
room: "",
});
};
export const stopTCPServer = async () => {
if (!isServerRunning)
return { success: false, message: "Server is not running" };
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server stopped.",
data: [],
notify: false,
room: "",
});
};
export const restartTCPServer = async () => {
if (!isServerRunning) {
startTCPServer();
return returnFunc({
success: false,
level: "warn",
module: "tcp",
subModule: "create_server",
message: "Server is not running will try to start it",
data: [],
notify: false,
room: "",
});
} else {
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
await delay(1500);
await startTCPServer();
}
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server has been restarted.",
data: [],
notify: false,
room: "",
});
};


@@ -0,0 +1,19 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { restartTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/restart", async (_, res) => {
const connect = await restartTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "tcp",
subModule: "post",
message: "TCP Server has been restarted",
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { startTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/start", async (_, res) => {
const connect = await startTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "routes",
subModule: "prodSql",
message: connect.message,
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { stopTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/stop", async (_, res) => {
const connect = await stopTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "routes",
subModule: "prodSql",
message: connect.message,
data: [],
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -9,6 +9,7 @@ export const allowedOrigins = [
"http://localhost:4000",
"http://localhost:4001",
"http://localhost:5500",
"http://localhost:8081",
"https://admin.socket.io",
"https://electron-socket-io-playground.vercel.app",
`${process.env.URL}`,


@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { jobAuditLog } from "../db/schema/auditLog.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import type { ReturnHelper } from "./returnHelper.utils.js";
// example createJob
// createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
export const createCronJob = async (
name: string,
schedule: string, // 6-field cron string (seconds first), e.g. */5 * * * * * = every 5th second
task: () => Promise<void>, // what function are we passing over
task: () => Promise<void | ReturnHelper>, // what function are we passing over
source = "unknown",
) => {
// get the timezone based on the os timezone set


@@ -1,7 +1,7 @@
import type { Response } from "express";
import { createLogger } from "../logger/logger.controller.js";
interface Data<T = unknown[]> {
export interface ReturnHelper<T = unknown[]> {
success: boolean;
module:
| "system"
@@ -13,32 +13,11 @@ interface Data<T = unknown[]> {
| "notification"
| "email"
| "purchase"
| "tcp";
subModule:
| "db"
| "labeling"
| "printer"
| "prodSql"
| "query"
| "sendmail"
| "auth"
| "datamart"
| "jobs"
| "apt"
| "settings"
| "get"
| "update"
| "delete"
| "post"
| "notification"
| "delete"
| "printing"
| "gpSql"
| "email"
| "gpChecks"
| "prodEndpoint"
| "create_server";
level: "info" | "error" | "debug" | "fatal";
| "tcp"
| "logistics";
subModule: string;
level: "info" | "error" | "debug" | "fatal" | "warn";
message: string;
room?: string;
data?: T;
@@ -59,7 +38,7 @@ interface Data<T = unknown[]> {
* data: [] the data that will be passed back
* notify: false by default this is to send a notification to a users email to alert them of an issue.
*/
export const returnFunc = (data: Data) => {
export const returnFunc = (data: ReturnHelper) => {
const notify = data.notify ? data.notify : false;
const room = data.room ?? data.room;
const log = createLogger({ module: data.module, subModule: data.subModule });
@@ -92,7 +71,7 @@ export const returnFunc = (data: Data) => {
export function apiReturn(
res: Response,
opts: Data & { status?: number },
opts: ReturnHelper & { status?: number },
optional?: unknown, // leave this as unknown so we can pass an object or an array over.
): Response {
const result = returnFunc(opts);


@@ -5,13 +5,17 @@ meta {
}
get {
url: {{url}}/api/datamart/:name
url: {{url}}/api/datamart/:name?historical=x
body: none
auth: inherit
}
params:query {
historical: x
}
params:path {
name: activeArticles
name: inventory
}
settings {


@@ -1,5 +1,5 @@
vars {
url: http://uslim1vms006:3100/lst
url: http://localhost:3000/lst
readerIp: 10.44.14.215
}
vars:secret [

lstMobile/.gitignore vendored Normal file

@@ -0,0 +1,43 @@
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files
# dependencies
node_modules/
# Expo
.expo/
dist/
web-build/
expo-env.d.ts
# Native
.kotlin/
*.orig.*
*.jks
*.p8
*.p12
*.key
*.mobileprovision
# Metro
.metro-health-check*
# debug
npm-debug.*
yarn-debug.*
yarn-error.*
# macOS
.DS_Store
*.pem
# local env files
.env*.local
# typescript
*.tsbuildinfo
app-example
# generated native folders
/ios
/android

lstMobile/.vscode/extensions.json vendored Normal file

@@ -0,0 +1 @@
{ "recommendations": ["expo.vscode-expo-tools"] }

lstMobile/.vscode/settings.json vendored Normal file

@@ -0,0 +1,7 @@
{
"editor.codeActionsOnSave": {
"source.fixAll": "explicit",
"source.organizeImports": "explicit",
"source.sortMembers": "explicit"
}
}

lstMobile/README.md Normal file

@@ -0,0 +1,56 @@
# Welcome to your Expo app 👋
This is an [Expo](https://expo.dev) project created with [`create-expo-app`](https://www.npmjs.com/package/create-expo-app).
## Get started
1. Install dependencies
```bash
npm install
```
2. Start the app
```bash
npx expo start
```
In the output, you'll find options to open the app in a
- [development build](https://docs.expo.dev/develop/development-builds/introduction/)
- [Android emulator](https://docs.expo.dev/workflow/android-studio-emulator/)
- [iOS simulator](https://docs.expo.dev/workflow/ios-simulator/)
- [Expo Go](https://expo.dev/go), a limited sandbox for trying out app development with Expo
You can start developing by editing the files inside the **app** directory. This project uses [file-based routing](https://docs.expo.dev/router/introduction).
## Get a fresh project
When you're ready, run:
```bash
npm run reset-project
```
This command will move the starter code to the **app-example** directory and create a blank **app** directory where you can start developing.
### Other setup steps
- To set up ESLint for linting, run `npx expo lint`, or follow our guide on ["Using ESLint and Prettier"](https://docs.expo.dev/guides/using-eslint/)
- If you'd like to set up unit testing, follow our guide on ["Unit Testing with Jest"](https://docs.expo.dev/develop/unit-testing/)
- Learn more about the TypeScript setup in this template in our guide on ["Using TypeScript"](https://docs.expo.dev/guides/typescript/)
## Learn more
To learn more about developing your project with Expo, look at the following resources:
- [Expo documentation](https://docs.expo.dev/): Learn fundamentals, or go into advanced topics with our [guides](https://docs.expo.dev/guides).
- [Learn Expo tutorial](https://docs.expo.dev/tutorial/introduction/): Follow a step-by-step tutorial where you'll create a project that runs on Android, iOS, and the web.
## Join the community
Join our community of developers creating universal apps.
- [Expo on GitHub](https://github.com/expo/expo): View our open source platform and contribute.
- [Discord community](https://chat.expo.dev): Chat with Expo users and ask questions.

lstMobile/app.json Normal file

@@ -0,0 +1,47 @@
{
"expo": {
"name": "LST mobile",
"slug": "lst-mobile",
"version": "0.0.1-alpha",
"orientation": "portrait",
"icon": "./assets/images/icon.png",
"scheme": "lstmobile",
"userInterfaceStyle": "automatic",
"ios": {
"icon": "./assets/expo.icon"
},
"android": {
"adaptiveIcon": {
"backgroundColor": "#E6F4FE",
"foregroundImage": "./assets/images/android-icon-foreground.png",
"backgroundImage": "./assets/images/android-icon-background.png",
"monochromeImage": "./assets/images/android-icon-monochrome.png",
"package": "net.alpla.lst.mobile",
"versionCode": 1
},
"predictiveBackGestureEnabled": false,
"package": "com.anonymous.lstMobile"
},
"web": {
"output": "static",
"favicon": "./assets/images/favicon.png"
},
"plugins": [
"expo-router",
[
"expo-splash-screen",
{
"backgroundColor": "#208AEF",
"android": {
"image": "./assets/images/splash-icon.png",
"imageWidth": 76
}
}
]
],
"experiments": {
"typedRoutes": true,
"reactCompiler": true
}
}
}


@@ -0,0 +1,3 @@
<svg width="652" height="606" viewBox="0 0 652 606" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M353.554 0H298.446C273.006 0 249.684 14.6347 237.962 37.9539L4.37994 502.646C-1.04325 513.435 -1.45067 526.178 3.2716 537.313L22.6123 582.918C34.6475 611.297 72.5404 614.156 88.4414 587.885L309.863 222.063C313.34 216.317 319.439 212.826 326 212.826C332.561 212.826 338.659 216.317 342.137 222.063L563.559 587.885C579.46 614.156 617.352 611.297 629.388 582.918L648.728 537.313C653.451 526.178 653.043 513.435 647.62 502.646L414.038 37.9539C402.316 14.6347 378.994 0 353.554 0Z" fill="white"/>
</svg>


@@ -0,0 +1,40 @@
{
"fill" : {
"automatic-gradient" : "extended-srgb:0.00000,0.47843,1.00000,1.00000"
},
"groups" : [
{
"layers" : [
{
"image-name" : "expo-symbol 2.svg",
"name" : "expo-symbol 2",
"position" : {
"scale" : 1,
"translation-in-points" : [
1.1008400065293245e-05,
-16.046875
]
}
},
{
"image-name" : "grid.png",
"name" : "grid"
}
],
"shadow" : {
"kind" : "neutral",
"opacity" : 0.5
},
"translucency" : {
"enabled" : true,
"value" : 0.5
}
}
],
"supported-platforms" : {
"circles" : [
"watchOS"
],
"squares" : "shared"
}
}

lstMobile/package-lock.json generated Normal file

File diff suppressed because it is too large

lstMobile/package.json Normal file

@@ -0,0 +1,55 @@
{
"name": "lstmobile",
"main": "expo-router/entry",
"version": "0.0.1-alpha",
"scripts": {
"start": "expo start",
"reset-project": "node ./scripts/reset-project.js",
"android": "expo run:android",
"ios": "expo run:ios",
"web": "expo start --web",
"lint": "expo lint",
"build:apk": "expo prebuild --clean && cd android && gradlew.bat assembleRelease ",
"update": "adb install android/app/build/outputs/apk/release/app-release.apk"
},
"dependencies": {
"@react-native-async-storage/async-storage": "2.2.0",
"@react-navigation/bottom-tabs": "^7.15.5",
"@react-navigation/elements": "^2.9.10",
"@react-navigation/native": "^7.1.33",
"@tanstack/react-query": "^5.99.0",
"axios": "^1.15.0",
"expo": "~55.0.15",
"expo-application": "~55.0.14",
"expo-constants": "~55.0.14",
"expo-device": "~55.0.15",
"expo-font": "~55.0.6",
"expo-glass-effect": "~55.0.10",
"expo-image": "~55.0.8",
"expo-linking": "~55.0.13",
"expo-router": "~55.0.12",
"expo-splash-screen": "~55.0.18",
"expo-status-bar": "~55.0.5",
"expo-symbols": "~55.0.7",
"expo-system-ui": "~55.0.15",
"expo-web-browser": "~55.0.14",
"lucide-react-native": "^1.8.0",
"react": "19.2.0",
"react-dom": "19.2.0",
"react-native": "0.83.4",
"react-native-gesture-handler": "~2.30.0",
"react-native-reanimated": "4.2.1",
"react-native-safe-area-context": "~5.6.2",
"react-native-screens": "~4.23.0",
"react-native-web": "~0.21.0",
"react-native-worklets": "0.7.2",
"socket.io-client": "^4.8.3",
"zod": "^4.3.6"
},
"devDependencies": {
"@types/react": "~19.2.2",
"eas-cli": "^18.7.0",
"typescript": "~5.9.2"
},
"private": true
}


@@ -0,0 +1,38 @@
import { Tabs } from 'expo-router'
import React from 'react'
import { colors } from '../../stlyes/global'
import { Home, Settings } from 'lucide-react-native'
export default function TabLayout() {
return (
<Tabs
screenOptions={{
headerShown: false,
tabBarStyle:{
},
tabBarActiveTintColor: 'black',
tabBarInactiveTintColor: colors.textSecondary,
}}
>
<Tabs.Screen
name='index'
options={{
title:'Home',
tabBarIcon: ({color, size})=>(
<Home color={color} size={size}/>
)
}}
/>
<Tabs.Screen
name='config'
options={{
title: 'Config',
tabBarIcon: ({color, size})=> (
<Settings size={size} color={color}/>
)
}}
/>
</Tabs>
)
}


@@ -0,0 +1,92 @@
// app/config.tsx
import { useEffect, useState } from "react";
import { View, Text, TextInput, Button, Alert } from "react-native";
import { useRouter } from "expo-router";
import { AppConfig, getConfig, saveConfig } from "../../lib/storage";
import Constants from "expo-constants";
export default function Config() {
const [serverUrl, setServerUrl] = useState("");
const [scannerId, setScannerId] = useState("");
const [config, setConfig] = useState<AppConfig | null>(null)
const [loading, setLoading] = useState(true);
const router = useRouter()
const version = Constants.expoConfig?.version;
const build = Constants.expoConfig?.android?.versionCode ?? 1;
useEffect(() => {
const loadConfig = async () => {
const existing = await getConfig();
if (existing) {
setServerUrl(existing.serverUrl);
setScannerId(existing.scannerId);
setConfig(existing)
}
setLoading(false);
};
loadConfig();
}, []);
const handleSave = async () => {
if (!serverUrl.trim() || !scannerId.trim()) {
Alert.alert("Missing info", "Please fill in both fields.");
return;
}
await saveConfig({
serverUrl: serverUrl.trim(),
scannerId: scannerId.trim(),
});
Alert.alert("Saved", "Config saved to device.");
//router.replace("/");
};
if (loading) {
return <Text>Loading config...</Text>;
}
return (
<View style={{ flex: 1, padding: 16, gap: 12 }}>
<View style={{alignItems: "center", margin: 10}}>
<Text style={{ fontSize: 20, fontWeight: "600"}}>LST Scanner Config</Text>
</View>
<Text>Server IP</Text>
<TextInput
value={serverUrl}
onChangeText={setServerUrl}
placeholder="192.168.1.1"
autoCapitalize="none"
keyboardType="numeric"
style={{ borderWidth: 1, padding: 10, borderRadius: 8 }}
/>
<Text>Server port</Text>
<TextInput
value={scannerId}
onChangeText={setScannerId}
placeholder="3000"
autoCapitalize="characters"
keyboardType="numeric"
style={{ borderWidth: 1, padding: 10, borderRadius: 8, }}
/>
<View style={{flexDirection: 'row',justifyContent: 'center', padding: 3}}>
<Button title="Save Config" onPress={handleSave} />
</View>
<View style={{ marginTop: "auto", alignItems: "center", padding: 10 }}>
<Text style={{ fontSize: 12, color: "#666" }}>
LST Scanner v{version}-{build}
</Text>
</View>
</View>
);
}


@@ -0,0 +1,134 @@
import * as Application from "expo-application";
import * as Device from "expo-device";
import { useRouter } from "expo-router";
import { useEffect, useState } from "react";
import {
Alert,
Platform,
ScrollView,
Text,
View,
} from "react-native";
import HomeHeader from "../../components/HomeHeader";
import { type AppConfig, getConfig, hasValidConfig } from "../../lib/storage";
import {
evaluateVersion,
type ServerVersionInfo,
type StartupStatus,
} from "../../lib/versionValidation";
import { globalStyles } from "../../stlyes/global";
import axios from 'axios'
export default function Index() {
const [config, setConfig] = useState<AppConfig | null>(null);
const [loading, setLoading] = useState(true);
const [startupStatus, setStartupStatus] = useState<StartupStatus>({state: "checking"});
const [serverInfo, setServerInfo] = useState<ServerVersionInfo>()
const router = useRouter();
const versionName = Application.nativeApplicationVersion ?? "unknown";
const versionCode = Number(Application.nativeBuildVersion ?? "0");
useEffect(() => {
let isMounted = true;
const startUp = async () => {
try {
const savedConfig = await getConfig();
if (!hasValidConfig(savedConfig)) {
router.replace("/config");
return;
}
if (!isMounted) return;
setConfig(savedConfig);
// temp while testing
const appBuildCode = 1;
try {
const res = await axios.get(`http://${savedConfig?.serverUrl}:${savedConfig?.scannerId}/lst/api/mobile/version`);
console.log(res)
const server = (await res.data) as ServerVersionInfo;
if (!isMounted) return;
const result = evaluateVersion(appBuildCode, server);
setStartupStatus(result);
setServerInfo(server)
if (result.state === "warning") {
Alert.alert("Update available", result.message);
}
} catch {
if (!isMounted) return;
setStartupStatus({ state: "offline" });
}
} finally {
if (isMounted) {
setLoading(false);
}
}
};
startUp();
return () => {
isMounted = false;
};
}, [router]);
if (loading) {
return <Text>Validating Configs.</Text>;
}
if (startupStatus.state === "checking") {
return <Text>Checking device and server status...</Text>;
}
if (startupStatus.state === "blocked") {
return (
<View>
<Text>Update Required</Text>
<Text>This scanner must be updated before it can be used.</Text>
<Text>Scan the update code to continue.</Text>
</View>
);
}
if (startupStatus.state === "offline") {
// app still renders, but show disconnected state
}
return (
<ScrollView >
<View style={globalStyles.container}>
<HomeHeader />
<Text>
Welcome.{versionName} - {versionCode}
</Text>
<Text>Running on: {Platform.OS}</Text>
<Text>Device model: {Device.modelName}</Text>
<Text>Device Brand: {Device.brand}</Text>
<Text> OS Version: {Device.osVersion}</Text>
<View style={{ flex: 1, padding: 16, gap: 12 }}>
<Text style={{ fontSize: 22, fontWeight: "600" }}>Welcome</Text>
{config ? (
<>
<Text>Server: {config.serverUrl}</Text>
<Text>Scanner: {config.scannerId}</Text>
<Text>Server: v{serverInfo?.versionName}</Text>
</>
) : (
<Text>No config found yet.</Text>
)}
</View></View>
</ScrollView>
);
}


@@ -0,0 +1,12 @@
import { Stack } from "expo-router";
import {StatusBar} from 'expo-status-bar'
import { colors } from "../stlyes/global";
export default function RootLayout() {
return <>
<StatusBar style="dark" />
<Stack screenOptions={{ headerShown: false }}>
<Stack.Screen name='(tabs)' />
</Stack>
</>;
}


@@ -0,0 +1,24 @@
import React from "react";
import { StyleSheet, Text, View } from "react-native";
import { colors } from "../stlyes/global";

export default function HomeHeader() {
  const currentDate = new Date().toLocaleDateString("en-US", {
    weekday: "long",
    month: "long",
    day: "numeric",
  });
  return (
    <View>
      <Text style={styles.date}>{currentDate}</Text>
    </View>
  );
}

const styles = StyleSheet.create({
  date: {
    fontSize: 14,
    color: colors.textSecondary,
    marginTop: 4,
    marginBottom: 30,
  },
});

View File

@@ -0,0 +1,36 @@
import AsyncStorage from "@react-native-async-storage/async-storage";

export type AppConfig = {
  serverUrl: string;
  scannerId: string;
};

const CONFIG_KEY = "scanner_app_config";

export async function saveConfig(config: AppConfig) {
  await AsyncStorage.setItem(CONFIG_KEY, JSON.stringify(config));
}

export async function getConfig(): Promise<AppConfig | null> {
  const raw = await AsyncStorage.getItem(CONFIG_KEY);
  if (!raw) return null;
  try {
    return JSON.parse(raw) as AppConfig;
  } catch (error) {
    console.log("Failed to parse stored config:", error);
    return null;
  }
}

export function hasValidConfig(config: AppConfig | null): boolean {
  if (!config) return false;
  return Boolean(config.serverUrl?.trim() && config.scannerId?.trim());
}

View File

@@ -0,0 +1,43 @@
export type ServerVersionInfo = {
  packageName: string;
  versionName: string;
  versionCode: number;
  minSupportedVersionCode: number;
  fileName: string;
};

export type StartupStatus =
  | { state: "checking" }
  | { state: "needs-config" }
  | { state: "offline" }
  | { state: "blocked"; reason: string; server: ServerVersionInfo }
  | { state: "warning"; message: string; server: ServerVersionInfo }
  | { state: "ready"; server: ServerVersionInfo | null };

export function evaluateVersion(
  appBuildCode: number,
  server: ServerVersionInfo
): StartupStatus {
  // Hard block: the installed build is older than the minimum the server supports.
  if (appBuildCode < server.minSupportedVersionCode) {
    return {
      state: "blocked",
      reason: "This scanner app is too old and must be updated before use.",
      server,
    };
  }
  // Soft warning: supported, but not the latest published build.
  if (appBuildCode !== server.versionCode) {
    return {
      state: "warning",
      message: `A newer version is available. Installed build: ${appBuildCode}, latest build: ${server.versionCode}.`,
      server,
    };
  }
  return {
    state: "ready",
    server,
  };
}
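
The version gate above has three outcomes, which can be exercised standalone. A minimal sketch — the function and the fields it reads are copied here so the snippet runs on its own, and the sample server payload is invented for illustration:

```typescript
// Re-declared locally so the sketch is self-contained; only the fields
// evaluateVersion actually reads are required.
type ServerVersionInfo = {
  versionCode: number;
  minSupportedVersionCode: number;
};

type StartupStatus =
  | { state: "blocked"; reason: string }
  | { state: "warning"; message: string }
  | { state: "ready" };

function evaluateVersion(appBuildCode: number, server: ServerVersionInfo): StartupStatus {
  if (appBuildCode < server.minSupportedVersionCode) {
    return { state: "blocked", reason: "This scanner app is too old and must be updated before use." };
  }
  if (appBuildCode !== server.versionCode) {
    return {
      state: "warning",
      message: `A newer version is available. Installed build: ${appBuildCode}, latest build: ${server.versionCode}.`,
    };
  }
  return { state: "ready" };
}

// Hypothetical server payload: latest build is 12, oldest supported build is 10.
const server: ServerVersionInfo = { versionCode: 12, minSupportedVersionCode: 10 };

console.log(evaluateVersion(8, server).state);  // "blocked" - below the supported minimum
console.log(evaluateVersion(11, server).state); // "warning" - supported but behind latest
console.log(evaluateVersion(12, server).state); // "ready"   - up to date
```

Note that a build *newer* than `versionCode` also lands in `warning`, since the check is an inequality rather than `<` — that is what the home screen relies on when it shows the non-blocking update alert.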

View File

@@ -0,0 +1,21 @@
import { StyleSheet } from "react-native";

export const colors = {
  background: "white",
  header: "white",
  primary: "blue",
  textSecondary: "blue",
};

export const globalStyles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: colors.background,
    justifyContent: "center",
    alignItems: "center",
    paddingTop: 60,
  },
  header: {
    padding: 4,
  },
});

20
lstMobile/tsconfig.json Normal file
View File

@@ -0,0 +1,20 @@
{
  "extends": "expo/tsconfig.base",
  "compilerOptions": {
    "strict": true,
    "paths": {
      "@/*": ["./src/*"],
      "@/assets/*": ["./assets/*"]
    }
  },
  "include": [
    "**/*.ts",
    "**/*.tsx",
    ".expo/types/**/*.ts",
    "expo-env.d.ts"
  ]
}

View File

@@ -0,0 +1,20 @@
CREATE TABLE "inv_historical_data" (
  "id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
  "hist_date" date NOT NULL,
  "plant_token" text,
  "article" text NOT NULL,
  "article_description" text NOT NULL,
  "material_type" text,
  "total_QTY" text,
  "available_QTY" text,
  "coa_QTY" text,
  "held_QTY" text,
  "consignment_qty" text,
  "lot_number" text,
  "location_id" text,
  "location" text,
  "whse_id" text DEFAULT '',
  "whse_name" text DEFAULT 'missing whseName',
  "upd_user" text DEFAULT 'lst',
  "upd_date" timestamp DEFAULT now()
);

View File

@@ -0,0 +1 @@
ALTER TABLE "inv_historical_data" ALTER COLUMN "upd_user" SET DEFAULT 'lst-system';

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@@ -225,6 +225,20 @@
"when": 1776098377074,
"tag": "0031_numerous_the_phantom",
"breakpoints": true
},
{
"idx": 32,
"version": "7",
"when": 1776245938243,
"tag": "0032_tranquil_onslaught",
"breakpoints": true
},
{
"idx": 33,
"version": "7",
"when": 1776256060808,
"tag": "0033_elite_adam_warlock",
"breakpoints": true
}
]
}

4
package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"license": "ISC",
"dependencies": {
"@dotenvx/dotenvx": "^1.57.0",

View File

@@ -1,6 +1,6 @@
{
"name": "lst_v3",
"version": "0.0.1-alpha.3",
"version": "0.0.1-alpha.4",
"description": "The tool that supports us in our everyday alplaprod",
"main": "index.js",
"scripts": {
@@ -24,7 +24,8 @@
"version": "changeset version",
"specCheck": "node scripts/check-route-specs.mjs",
"commit": "cz",
"release": "commit-and-tag-version"
"release": "commit-and-tag-version",
"build:apk": "cd lstMobile && expo prebuild --clean && cd android && gradlew.bat assembleRelease "
},
"repository": {
"type": "git",

View File

@@ -14,8 +14,8 @@ param (
# .\scripts\services.ps1 -serviceName "LSTV3_app" -option "install" -appPath "D:\LST_V3" -description "Logistics Support Tool" -command "run start"
# server migrations get - reminder to add to old version in pkg "start:lst": "cd lstV2 && npm start",
# .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start"
# .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start:lst"
# powershell.exe -ExecutionPolicy Bypass -File .\scripts\services.ps1 -serviceName "LST_app" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start"
# powershell.exe -ExecutionPolicy Bypass -File .\scripts\services.ps1 -serviceName "LSTV2" -option "install" -appPath "D:\LST" -description "Logistics Support Tool" -command "run start:lst"
$nssmPath = $AppPath + "\nssm.exe"
$npmPath = "C:\Program Files\nodejs\npm.cmd" # Path to npm.cmd