Compare commits: 87f738702a...main (17 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 3734d9daac | |
| | a1eeadeec4 | |
| | 3639c1b77c | |
| | cfbc156517 | |
| | fb3cd85b41 | |
| | 5b1c88546f | |
| | ba3227545d | |
| | 84909bfcf8 | |
| | e0d0ac2077 | |
| | 52a6c821f4 | |
| | eccaf17332 | |
| | 6307037985 | |
| | 4b6061c478 | |
| | fc6dc82d84 | |
| | 6ba905a887 | |
| | f33587a3d9 | |
| | 80189baf90 | |
@@ -2,6 +2,7 @@ NODE_ENV=development
 # Server
 PORT=3000
 URL=http://localhost:3000
 SERVER_IP=10.75.2.38
 TIMEZONE=America/New_York
 TCP_PORT=2222

@@ -22,7 +23,7 @@ DEFAULT_DOCK=
 DEFAULT_LOAD_TYPE=
 DEFAULT_CARRIER=

-# prodServer when runing on an actual prod server use localhost this way we dont go out and back in.
+# prodServer when ruining on an actual prod server use localhost this way we don't go out and back in.
 PROD_SERVER=
 PROD_PLANT_TOKEN=
 PROD_USER=
.gitignore (vendored, 1 change)
@@ -5,6 +5,7 @@ builds
 .buildNumber
 temp
 brunoApi
 downloads
 .scriptCreds
 node-v24.14.0-x64.msi
 postgresql-17.9-2-windows-x64.exe
.vscode/settings.json (vendored, 3 changes)
@@ -71,7 +71,8 @@
 "prodlabels",
 "prolink",
 "Skelly",
-"trycatch"
+"trycatch",
+"whse"
 ],
 "gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
 "gitea.instanceURL": "https://git.tuffraid.net",
CHANGELOG.md (44 changes)
@@ -1,5 +1,49 @@
# All Changes to LST can be found below.

## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)

### 🌟 Enhancements

* **datamart:** migrations completed remaining is the deactivation that will be ran by anylitics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))

### 🐛 Bug fixes

* **datamart:** error when running build and crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))

### 📚 Documentation

* **docs:** removed docusorus as all docs will be inside lst now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
* **notifcations:** docs for intro, notifcations, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))

### 🛠️ Code Refactor

* **agent:** changed to have the test servers on there own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
* **puchase:** changes how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
* **reprint:** new query added to deactivate the old notifcation so no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
* **templates:** corrections for new notify process on critcal errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))

### 📈 Project changes

* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
* **service:** changes to the script to allow running the powershell on execution palicy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))

## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)
@@ -19,7 +19,7 @@ Quick summary of current rewrite/migration goal.
 | User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
 | User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
 | Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
-| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
+| Datamart | ~~Create~~, ~~Update~~, ~~Run~~, Deactivate | 🟨 In Progress |
 | Frontend | Analytics and charts | ⏳ Not Started |
 | Docs | Instructions and trouble shooting | ⏳ Not Started |
 | One Click Print | Get printers, monitor printers, label process, material process, Special processes | ⏳ Not Started |
@@ -13,6 +13,10 @@
 *
 * when a criteria is password over we will handle it by counting how many were passed up to 3 then deal with each one respectively
 */

import { and, between, inArray, notInArray } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
  type SqlQuery,

@@ -22,37 +26,118 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { datamartData } from "./datamartData.utlis.js";

type Options = {
  name: string;
  value: string;
};
type Data = {
  name: string;
-  options: Options;
+  options: any;
  optionsRequired?: boolean;
  howManyOptionsRequired?: number;
};

const lstDbRun = async (data: Data) => {
  if (data.options) {
    if (data.name === "psiInventory") {
      const ids = data.options.articles.split(",").map((id: any) => id.trim());
      const whse = data.options.whseToInclude
        ? data.options.whseToInclude
            .split(",")
            .map((w: any) => w.trim())
            .filter(Boolean)
        : [];

      const locations = data.options.exludeLanes
        ? data.options.exludeLanes
            .split(",")
            .map((l: any) => l.trim())
            .filter(Boolean)
        : [];

      const conditions = [
        inArray(invHistoricalData.article, ids),
        between(
          invHistoricalData.histDate,
          data.options.startDate,
          data.options.endDate,
        ),
      ];

      // only add the warehouse condition if there are any whse values
      if (whse.length > 0) {
        conditions.push(inArray(invHistoricalData.whseId, whse));
      }

      // locations we dont want in the system
      if (locations.length > 0) {
        conditions.push(notInArray(invHistoricalData.location, locations));
      }

      return await db
        .select()
        .from(invHistoricalData)
        .where(and(...conditions));
    }
  }
  return [];
};
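A quick sketch of how `lstDbRun` would be called for the one name it currently handles, `psiInventory`. The option names come straight from the code above; the values are invented for illustration:

```ts
// Hypothetical call; option values are illustrative only.
const rows = await lstDbRun({
  name: "psiInventory",
  options: {
    articles: "10045, 10046, 10091", // split on "," -> inArray() on article
    whseToInclude: "36,41",          // optional: inArray() on whseId
    exludeLanes: "DOCK1, STAGE2",    // optional: notInArray() on location
    startDate: "2026-04-01",         // between() bounds on histDate
    endDate: "2026-04-15",
  },
});
```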
export const runDatamartQuery = async (data: Data) => {
  // search the query db for the query by name
-  const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
  const considerLstDBRuns = ["psiInventory"];

  if (considerLstDBRuns.includes(data.name)) {
    const lstDB = await lstDbRun(data);

    return returnFunc({
      success: true,
      level: "info",
      module: "datamart",
      subModule: "lstDBrn",
      message: `Data for: ${data.name}`,
      data: lstDB,
      notify: false,
    });
  }

  const featureQ = sqlQuerySelector(`featureCheck`) as SqlQuery;

  const { data: fd, error: fe } = await tryCatch(
    prodQuery(featureQ.query, `Running feature check`),
  );

  if (fe) {
    return returnFunc({
      success: false,
      level: "error",
      module: "datamart",
      subModule: "query",
      message: `feature check failed`,
      data: fe as any,
      notify: false,
    });
  }

  // for queries that will need to be ran on legacy until we get the plant updated need to go in here
  const doubleQueries = ["inventory"];
+  const sqlQuery = sqlQuerySelector(
+    `datamart.${fd.data[0].activated > 0 && !doubleQueries.includes(data.name) ? data.name : `legacy.${data.name}`}`,
+  ) as SqlQuery;
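The nested template literal above is dense. A hedged restatement of the name resolution it performs, assuming `fd.data[0].activated` is the feature-check flag:

```ts
// Sketch only; mirrors the ternary above, not the module's actual API.
const resolveQueryName = (
  name: string,
  activated: number,
  doubleQueries: string[],
): string => {
  // New-schema plants (activated > 0) run the direct datamart query,
  // unless the query still has to run on legacy ("doubleQueries").
  const useLegacy = activated <= 0 || doubleQueries.includes(name);
  return `datamart.${useLegacy ? `legacy.${name}` : name}`;
};

resolveQueryName("inventory", 1, ["inventory"]); // "datamart.legacy.inventory"
resolveQueryName("forecast", 1, ["inventory"]);  // "datamart.forecast"
```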
  // checking if warehousing is as it will start to effect a lot of queries for plants that are not on 2.

  const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);

  // const optionsMissing =
  //   !data.options || Object.keys(data.options).length === 0;

-  const optionCount =
-    Object.keys(data.options).length ===
-    getDataMartInfo[0]?.howManyOptionsRequired;
+  const isValid =
+    Object.keys(data.options ?? {}).length >=
+    (getDataMartInfo[0]?.howManyOptionsRequired ?? 0);

-  if (getDataMartInfo[0]?.optionsRequired && !optionCount) {
+  if (getDataMartInfo[0]?.optionsRequired && !isValid) {
    return returnFunc({
      success: false,
      level: "error",
      module: "datamart",
      subModule: "query",
-      message: `This query is required to have the ${getDataMartInfo[0]?.howManyOptionsRequired} options set in order use it.`,
+      message: `This query is required to have ${getDataMartInfo[0]?.howManyOptionsRequired} option(s) set in order use it, please add in your option(s) data and try again.`,
      data: [getDataMartInfo[0].options],
      notify: false,
    });
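The switch from `optionCount` to `isValid` turns an exact-count comparison into an at-least check and guards against `data.options` being undefined. A small before/after sketch with invented values:

```ts
const required = 2; // e.g. getDataMartInfo[0]?.howManyOptionsRequired

// Old check: strict equality; extra options fail it, undefined options throw.
const oldCheck = (options: Record<string, string>) =>
  Object.keys(options).length === required;

// New check: ">=" with nullish fallbacks; extras are fine, undefined is safe.
const newCheck = (options?: Record<string, string>) =>
  Object.keys(options ?? {}).length >= required;

newCheck({ startDate: "2026-04-01", endDate: "2026-04-15", articles: "10045" }); // true
newCheck(undefined); // false
```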
@@ -75,10 +160,120 @@ export const runDatamartQuery = async (data: Data) => {

  // split the criteria by "," then and then update the query
  if (data.options) {
    Object.entries(data.options ?? {}).forEach(([key, value]) => {
      const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
      datamartQuery = datamartQuery.replace(pattern, String(value).trim());
    });
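This loop rewrites every `[key]` placeholder in the SQL text with the matching option value, before the per-query switch below applies its special cases. The same pattern in isolation (the query string is made up):

```ts
// Sketch of the generic [placeholder] substitution above.
const substitute = (sqlText: string, options: Record<string, unknown>) => {
  let out = sqlText;
  for (const [key, value] of Object.entries(options)) {
    // "g" flag replaces every occurrence of [key], not just the first.
    out = out.replace(new RegExp(`\\[${key.trim()}\\]`, "g"), String(value).trim());
  }
  return out;
};

substitute(
  "select * from t where d between '[startDate]' and '[endDate]'",
  { startDate: "2026-04-01", endDate: "2026-04-15" },
);
// "select * from t where d between '2026-04-01' and '2026-04-15'"
```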
    switch (data.name) {
      case "activeArticles":
        break;
      case "deliveryByDateRange":
        datamartQuery = datamartQuery
          .replace("[startDate]", `${data.options.startDate}`)
          .replace("[endDate]", `${data.options.endDate}`)
          .replace(
            "--and r.ArticleHumanReadableId in ([articles]) ",
            data.options.articles
              ? `and r.ArticleHumanReadableId in (${data.options.articles})`
              : "--and r.ArticleHumanReadableId in ([articles]) ",
          );
        break;
      case "customerInventory":
        datamartQuery = datamartQuery
          .replace(
            "--and IdAdressen",
            `and IdAdressen in (${data.options.customer})`,
          )
          .replace(
            "--and x.IdWarenlager in (0)",
            `${data.options.whseToInclude ? `and x.IdWarenlager in (${data.options.whseToInclude})` : `--and x.IdWarenlager in (0)`}`,
          );
        break;
      case "openOrders":
        datamartQuery = datamartQuery
          .replace("[startDay]", `${data.options.startDay}`)
          .replace("[endDay]", `${data.options.endDay}`);
        break;
      case "inventory":
        datamartQuery = datamartQuery
          .replaceAll(
            "--,l.RunningNumber",
            `${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
          )
          .replaceAll(
            "--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
            `${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
          )
          .replaceAll(
            "--,l.WarehouseDescription,l.LaneDescription",
            `${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
          );
        break;
      case "fakeEDIUpdate":
        datamartQuery = datamartQuery.replace(
          "--AND h.CustomerHumanReadableId in (0)",
          `${data.options.address ? `AND h.CustomerHumanReadableId in (${data.options.address})` : `--AND h.CustomerHumanReadableId in (0)`}`,
        );
        break;
      case "forecast":
        datamartQuery = datamartQuery.replace(
          "where DeliveryAddressHumanReadableId in ([customers])",
          data.options.customers
            ? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
            : "--where DeliveryAddressHumanReadableId in ([customers])",
        );
        break;
      case "activeArticles2":
        datamartQuery = datamartQuery.replace(
          "and a.HumanReadableId in ([articles])",
          data.options.articles
            ? `and a.HumanReadableId in (${data.options.articles})`
            : "--and a.HumanReadableId in ([articles])",
        );
        break;
      case "psiDeliveryData":
        datamartQuery = datamartQuery
          .replace("[startDate]", `${data.options.startDate}`)
          .replace("[endDate]", `${data.options.endDate}`)
          .replace(
            "[articles]",
            data.options.articles ? `${data.options.articles}` : "[articles]",
          );
        break;
      case "productionData":
        datamartQuery = datamartQuery
          .replace("[startDate]", `${data.options.startDate}`)
          .replace("[endDate]", `${data.options.endDate}`)
          .replace(
            "and ArticleHumanReadableId in ([articles])",
            data.options.articles
              ? `and ArticleHumanReadableId in (${data.options.articles})`
              : "--and ArticleHumanReadableId in ([articles])",
          );
        break;
      case "psiPlanningData":
        datamartQuery = datamartQuery
          .replace("[startDate]", `${data.options.startDate}`)
          .replace("[endDate]", `${data.options.endDate}`)
          .replace(
            "and p.IdArtikelvarianten in ([articles])",
            data.options.articles
              ? `and p.IdArtikelvarianten in (${data.options.articles})`
              : "--and p.IdArtikelvarianten in ([articles])",
          );
        break;
      default:
        return returnFunc({
          success: false,
          level: "error",
          module: "datamart",
          subModule: "query",
          message: `${data.name} encountered an error as it might not exist in LST please contact support if this continues to happen`,
          data: [sqlQuery.message],
          notify: true,
        });
    }
  }

  const { data: queryRun, error } = await tryCatch(
@@ -10,14 +10,50 @@ export const datamartData = [
    name: "Active articles",
    endpoint: "activeArticles",
    description: "returns all active articles for the server with custom data",
-    options: "", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
+    options: "",
    optionsRequired: false,
  },
  {
    name: "Delivery by date range",
    endpoint: "deliveryByDateRange",
-    description: `Returns all Deliverys in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
-    options: "startDate,endDate", // set as a string and each item will be seperated by a , this way we can split it later in the excel file.
+    description: `Returns all Deliveries in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
+    options: "startDate,endDate",
    optionsRequired: true,
    howManyOptionsRequired: 2,
  },
  {
    name: "Get Customer Inventory",
    endpoint: "customerInventory",
    description: `Returns specific customer inventory based on there address ID, IE: 8,12,145. \nWith option to include specific warehousesIds, IE 36,41,5. \nNOTES: *leaving warehouse blank will just pull everything for the customer, Inventory dose not include PPOO or INV`,
    options: "customer,whseToInclude",
    optionsRequired: true,
    howManyOptionsRequired: 1,
  },
  {
    name: "Get open order",
    endpoint: "openOrders",
    description: `Returns open orders based on day count sent over, IE: startDay 15 days in the past endDay 5 days in the future, can be left empty for this default days`,
    options: "startDay,endDay",
    optionsRequired: true,
    howManyOptionsRequired: 2,
  },
  {
    name: "Get inventory",
    endpoint: "inventory",
    description: `Returns all inventory, excludes inv location. adding an x in one of the options will enable it.`,
    options: "includeRunningNumbers,locations,lots",
  },
  {
    name: "Fake EDI Update",
    endpoint: "fakeEDIUpdate",
    description: `Returns all open orders to correct and resubmit via lst demand mgt, leaving blank will get everything putting an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
    options: "address",
  },
  {
    name: "Production Data",
    endpoint: "productionData",
    description: `Returns all production data from the date range with the option to have 1 to many avs to search by.`,
    options: "startDate,endDate,articles",
    optionsRequired: true,
    howManyOptionsRequired: 2,
  },
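Each entry's `options` field is a comma-separated list of option names (per the comment this change removes, it is split later on the Excel side). A hedged sketch of expanding an entry into a request body; the helper is hypothetical:

```ts
// Hypothetical helper: map an entry's options string onto supplied values.
type DatamartEntry = { endpoint: string; options: string };

const buildRequest = (entry: DatamartEntry, values: string[]) => {
  const keys = entry.options.split(",").filter(Boolean);
  const options = Object.fromEntries(keys.map((k, i) => [k, values[i] ?? ""]));
  return { name: entry.endpoint, options };
};

buildRequest(
  { endpoint: "deliveryByDateRange", options: "startDate,endDate" },
  ["2026-01-01", "2026-01-31"],
);
// { name: "deliveryByDateRange", options: { startDate: "2026-01-01", endDate: "2026-01-31" } }
```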
backend/db/schema/historicalInv.schema.ts (new file, 30 lines)
@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";

export const invHistoricalData = pgTable("inv_historical_data", {
  inv: uuid("id").defaultRandom().primaryKey(),
  histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
  plantToken: text("plant_token"),
  article: text("article").notNull(),
  articleDescription: text("article_description").notNull(),
  materialType: text("material_type"),
  total_QTY: text("total_QTY"),
  available_QTY: text("available_QTY"),
  coa_QTY: text("coa_QTY"),
  held_QTY: text("held_QTY"),
  consignment_QTY: text("consignment_qty"),
  lot_Number: text("lot_number"),
  locationId: text("location_id"),
  location: text("location"),
  whseId: text("whse_id").default(""),
  whseName: text("whse_name").default("missing whseName"),
  upd_user: text("upd_user").default("lst-system"),
  upd_date: timestamp("upd_date").defaultNow(),
});

export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);

export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;
@@ -1,6 +1,11 @@
-import { integer, pgTable, text } from "drizzle-orm/pg-core";
+import { integer, pgTable, text, timestamp } from "drizzle-orm/pg-core";

-export const opendockApt = pgTable("printer_log", {
+export const printerLog = pgTable("printer_log", {
  id: integer().primaryKey().generatedAlwaysAsIdentity(),
-  name: text("name").notNull(),
+  name: text("name"),
+  ip: text("ip"),
+  printerSN: text("printer_sn"),
  condition: text("condition").notNull(),
  message: text("message"),
+  createdAt: timestamp("created_at").defaultNow(),
});
backend/db/schema/printers.schema.ts (new file, 44 lines)
@@ -0,0 +1,44 @@
import {
  boolean,
  integer,
  jsonb,
  pgTable,
  text,
  timestamp,
  uniqueIndex,
  uuid,
} from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";

export const printerData = pgTable(
  "printer_data",
  {
    id: uuid("id").defaultRandom().primaryKey(),
    humanReadableId: text("humanReadable_id").unique().notNull(),
    name: text("name").notNull(),
    ipAddress: text("ipAddress"),
    port: integer("port"),
    status: text("status"),
    statusText: text("statusText"),
    printerSN: text("printer_sn"),
    lastTimePrinted: timestamp("last_time_printed").notNull().defaultNow(),
    assigned: boolean("assigned").default(false),
    remark: text("remark"),
    printDelay: integer("printDelay").default(90),
    processes: jsonb("processes").default([]),
    printDelayOverride: boolean("print_delay_override").default(false), // this will be more for if we have the lot time active but want to over ride this single line for some reason
    add_Date: timestamp("add_Date").defaultNow(),
    upd_date: timestamp("upd_date").defaultNow(),
  },
  (table) => [
    //uniqueIndex("emailUniqueIndex").on(sql`lower(${table.email})`),
    uniqueIndex("printer_id").on(table.humanReadableId),
  ],
);

export const printerSchema = createSelectSchema(printerData);
export const newPrinterSchema = createInsertSchema(printerData);

export type Printer = z.infer<typeof printerSchema>;
export type NewPrinter = z.infer<typeof newPrinterSchema>;
@@ -7,12 +7,17 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
 export let pool2: sql.ConnectionPool;
 export let connected: boolean = false;
 export let reconnecting = false;
+// start the delay out as 2 seconds
+let delayStart = 2000;
+let attempt = 0;
+const maxAttempts = 10;

 export const connectGPSql = async () => {
   const serverUp = await checkHostnamePort(`USMCD1VMS011:1433`);
   if (!serverUp) {
     // we will try to reconnect
     connected = false;
+    reconnectToSql;
     return returnFunc({
       success: false,
       level: "error",

@@ -48,6 +53,7 @@ export const connectGPSql = async () => {
       notify: false,
     });
   } catch (error) {
+    reconnectToSql;
     return returnFunc({
       success: false,
       level: "error",

@@ -104,11 +110,6 @@ export const reconnectToSql = async () => {
   //set reconnecting to true while we try to reconnect
   reconnecting = true;

-  // start the delay out as 2 seconds
-  let delayStart = 2000;
-  let attempt = 0;
-  const maxAttempts = 10;

   while (!connected && attempt < maxAttempts) {
     attempt++;
     log.info(

@@ -121,7 +122,7 @@ export const reconnectToSql = async () => {
     if (!serverUp) {
       delayStart = Math.min(delayStart * 2, 30000); // exponential backoff until up to 30000
-      return;
+      continue;
     }

     try {

@@ -131,19 +132,11 @@ export const reconnectToSql = async () => {
       log.info(`${gpSqlConfig.server} is connected to ${gpSqlConfig.database}`);
     } catch (error) {
       delayStart = Math.min(delayStart * 2, 30000);
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "db",
-        message: "Failed to reconnect to the prod sql server.",
-        data: [error],
-        notify: false,
-      });
+      log.error({ error }, "Failed to reconnect to the prod sql server.");
     }
   }

-  if (!connected) {
+  if (!connected && attempt >= maxAttempts) {
     log.error(
       { notify: true },
       "Max reconnect attempts reached on the prodSql server. Stopping retries.",
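The reconnect loop doubles its delay from 2 s up to a 30 s cap, and this change hoists the counters to module scope so attempts persist across calls (note that `reconnectToSql;` above is missing its call parentheses, unlike the prodSql twin further down). A standalone sketch of the retry shape; the names here are illustrative, not the module's API:

```ts
// Sketch of capped exponential backoff with module-scoped counters.
let delayStart = 2000; // start the delay at 2 seconds
let attempt = 0;
const maxAttempts = 10;

const wait = (ms: number) => new Promise((r) => setTimeout(r, ms));

export const retryConnect = async (tryOnce: () => Promise<boolean>) => {
  while (attempt < maxAttempts) {
    attempt++;
    if (await tryOnce()) return true;
    delayStart = Math.min(delayStart * 2, 30000); // double, capped at 30s
    await wait(delayStart); // "continue" keeps looping instead of bailing out
  }
  return false; // caller can raise the max-attempts alert
};
```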
@@ -1,10 +1,5 @@
 import { returnFunc } from "../utils/returnHelper.utils.js";
-import {
-  connected,
-  pool2,
-  reconnecting,
-  reconnectToSql,
-} from "./gpSqlConnection.controller.js";
+import { connected, pool2 } from "./gpSqlConnection.controller.js";

 interface SqlError extends Error {
   code?: string;

@@ -22,29 +17,15 @@ interface SqlError extends Error {
 */
 export const gpQuery = async (queryToRun: string, name: string) => {
   if (!connected) {
-    reconnectToSql();
-
-    if (reconnecting) {
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "gpSql",
-        message: `The sql ${process.env.PROD_PLANT_TOKEN} is trying to reconnect already`,
-        data: [],
-        notify: false,
-      });
-    } else {
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "gpSql",
-        message: `${process.env.PROD_PLANT_TOKEN} is not connected, and failed to connect.`,
-        data: [],
-        notify: true,
-      });
-    }
+    return returnFunc({
+      success: false,
+      level: "error",
+      module: "system",
+      subModule: "gpSql",
+      message: `${process.env.PROD_PLANT_TOKEN} is offline or attempting to reconnect`,
+      data: [],
+      notify: false,
+    });
   }

   //change to the correct server
backend/logistics/logistics.historicalInv.ts (new file, 223 lines)
@@ -0,0 +1,223 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
  type SqlQuery,
  sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";

type Inventory = {
  article: string;
  alias: string;
  materialType: string;
  total_palletQTY: string;
  available_QTY: string;
  coa_QTY: string;
  held_QTY: string;
  consignment_qty: string;
  lot: string;
  locationId: string;
  laneDescription: string;
  warehouseId: string;
  warehouseDescription: string;
};

const historicalInvImport = async () => {
  const today = new Date();
  const { data, error } = await tryCatch(
    db
      .select()
      .from(invHistoricalData)
      .where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
  );

  if (error) {
    return returnFunc({
      success: false,
      level: "error",
      module: "system",
      subModule: "query",
      message: `Error getting historical inv info`,
      data: error as any,
      notify: false,
    });
  }

  if (data?.length === 0) {
    const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;

    if (!avSQLQuery.success) {
      return returnFunc({
        success: false,
        level: "error",
        module: "logistics",
        subModule: "inv",
        message: `Error getting Article info`,
        data: [avSQLQuery.message],
        notify: true,
      });
    }

    const { data: inv, error: invError } = await tryCatch(
      //prodQuery(sqlQuery.query, "Inventory data"),
      runDatamartQuery({
        name: "inventory",
        options: { lots: "x", locations: "x" },
      }),
    );

    const { data: av, error: avError } = (await tryCatch(
      runDatamartQuery({ name: "activeArticles", options: {} }),
    )) as any;

    if (invError) {
      return returnFunc({
        success: false,
        level: "error",
        module: "logistics",
        subModule: "inv",
        message: `Error getting inventory info from prod query`,
        data: invError as any,
        notify: false,
      });
    }

    if (avError) {
      return returnFunc({
        success: false,
        level: "error",
        module: "logistics",
        subModule: "inv",
        message: `Error getting article info from prod query`,
        data: invError as any,
        notify: false,
      });
    }

    // shape the data to go into our table
    const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
    const importInv = (inv.data ? inv.data : []) as Inventory[];
    const importData = importInv.map((i) => {
      return {
        histDate: sql`(NOW())::date`,
        plantToken: plantToken,
        article: i.article,
        articleDescription: i.alias,
        materialType:
          av.data.filter((a: any) => a.article === i.article).length > 0
            ? av.data.filter((a: any) => a.article === i.article)[0]
                ?.TypeOfMaterial
            : "Item not defined",
        total_QTY: i.total_palletQTY ?? "0.00",
        available_QTY: i.available_QTY ?? "0.00",
        coa_QTY: i.coa_QTY ?? "0.00",
        held_QTY: i.held_QTY ?? "0.00",
        consignment_QTY: i.consignment_qty ?? "0.00",
        lot_Number: i.lot ?? "0",
        locationId: i.locationId ?? "0",
        location: i.laneDescription ?? "Missing lane",
        whseId: i.warehouseId ?? "0",
        whseName: i.warehouseDescription ?? "Missing warehouse",
      };
    });

    const { data: dataImport, error: errorImport } = await tryCatch(
      db.insert(invHistoricalData).values(importData),
    );

    if (errorImport) {
      return returnFunc({
        success: false,
        level: "error",
        module: "logistics",
        subModule: "inv",
        message: `Error adding historical data to lst db`,
        data: errorImport as any,
        notify: true,
      });
    }

    if (dataImport) {
      return returnFunc({
        success: false,
        level: "info",
        module: "logistics",
        subModule: "inv",
        message: `Historical data was added to lst :D`,
        data: [],
        notify: false,
      });
    }
  } else {
    return returnFunc({
      success: false,
      level: "info",
      module: "logistics",
      subModule: "inv",
      message: `Historical Data for: ${format(today, "yyyy-MM-dd")}, is already added and nothing to do.`,
      data: [],
      notify: false,
    });
  }

  return returnFunc({
    success: false,
    level: "info",
    module: "logistics",
    subModule: "inv",
    message: `Some weird crazy error just happened and didnt get captured during the historical inv check.`,
    data: [],
    notify: true,
  });
};

export const historicalSchedule = async () => {
  // running the history in case my silly ass dose an update around the shift change time lol, this will prevent loss data. it might be off a little but no one cares
  historicalInvImport();

  const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;

  if (!sqlQuery.success) {
    return returnFunc({
      success: false,
      level: "error",
      module: "logistics",
      subModule: "query",
      message: `Error getting shiftChange sql file`,
      data: [sqlQuery.message],
      notify: false,
    });
  }

  const { data, error } = await tryCatch(
    prodQuery(sqlQuery.query, "Shift Change data"),
  );

  if (error) {
    return returnFunc({
      success: false,
      level: "error",
      module: "logistics",
      subModule: "query",
      message: `Error getting shiftChange info`,
      data: error as any,
      notify: false,
    });
  }
  // shift split
  const shiftTimeSplit = data?.data[0]?.shiftChange.split(":");

  const cronSetup = `0 ${
    shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[1])}` : "0"
  } ${
    shiftTimeSplit?.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
  } * * *`;

  createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
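`cronSetup` is derived from the plant's shift-change time, defaulting to 07:00 when the query comes back empty. A worked example of the strings it produces, assuming croner's six-field, seconds-first format:

```ts
// Illustrative only: what the cronSetup expression evaluates to.
const buildCron = (shiftChange?: string) => {
  const parts = shiftChange?.split(":");
  const minute = parts?.length ? `${parseInt(parts[1] ?? "0", 10)}` : "0";
  const hour = parts?.length ? `${parseInt(parts[0] ?? "7", 10)}` : "7";
  return `0 ${minute} ${hour} * * *`; // sec min hour day month weekday
};

buildCron("06:30"); // "0 30 6 * * *" -> fires daily at 06:30:00
buildCron();        // "0 0 7 * * *"  -> default 07:00:00
```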
@@ -14,20 +14,82 @@
 */

 import { Router } from "express";
+import multer from "multer";
 import { db } from "../db/db.controller.js";
 import { printerLog } from "../db/schema/printerLogs.schema.js";
 import { apiReturn } from "../utils/returnHelper.utils.js";
 import { tryCatch } from "../utils/trycatch.utils.js";

 type PrinterEvent = {
   name: string;
   condition: string;
   message: string;
 };
 const r = Router();
+const upload = multer();

-r.post("/printer/listener/:printer", async (req, res) => {
+const parseZebraAlert = (body: any): PrinterEvent => {
+  const name = body.uniqueId || "unknown";
+  const decoded = decodeURIComponent(body.alertMsg || "");
+
+  const [conditionRaw, ...rest] = decoded.split(":");
+  const condition = conditionRaw?.toLowerCase()?.trim() || "unknown";
+  const message = rest.join(":").trim();
+
+  return {
+    name,
+    condition,
+    message,
+  };
+};
+
+r.post("/printer/listener/:printer", upload.any(), async (req, res) => {
   const { printer: printerName } = req.params;
   console.log(req.body);
   const event: PrinterEvent = parseZebraAlert(req.body);

   const rawIp =
     req.headers["x-forwarded-for"]?.toString().split(",")[0]?.trim() ||
     req.socket.remoteAddress ||
     req.ip;

   const ip = rawIp?.replace("::ffff:", "");

   // post the new message
   const { data, error } = await tryCatch(
     db
       .insert(printerLog)
       .values({
         ip: ip?.replace("::ffff:", ""),
         name: printerName,
         printerSN: event.name,
         condition: event.condition,
         message: event.message,
       })
       .returning(),
   );

   if (error) {
     return apiReturn(res, {
       success: false,
       level: "info",
       module: "ocp",
       subModule: "printing",
       message: `${printerName} encountered an error posting the log`,
       data: error as any,
       status: 400,
     });
   }

   if (data) {
     // TODO: send message over to the controller to decide what to do next with it
   }

   return apiReturn(res, {
     success: true,
     level: "info",
     module: "ocp",
     subModule: "printing",
-    message: `${printerName} just passed over a message`,
+    message: `${printerName} just sent a message`,
     data: req.body ?? [],
     status: 200,
   });
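Zebra HTTP-POST alerts arrive URL-encoded, roughly `CONDITION: detail`, which is why `parseZebraAlert` splits on the first colon. An example of what it yields for an invented payload:

```ts
// Illustrative payload resembling a Zebra HTTP-POST alert body.
const body = {
  uniqueId: "XXZKJ123456789",                // printer serial -> stored as printerSN
  alertMsg: "PAPER%20OUT%3A%20load%20media", // URL-encoded "PAPER OUT: load media"
};

parseZebraAlert(body);
// -> { name: "XXZKJ123456789", condition: "paper out", message: "load media" }
```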
@@ -10,10 +10,323 @@
 * printer status will live here this will be how we manage all the levels of status like 3 paused, 1 printing, 8 error, 10 power up, etc...
 */

import { eq } from "drizzle-orm";
import net from "net";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { runProdApi } from "../utils/prodEndpoint.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";

type Printer = {
  name: string;
  humanReadableId: string;
  type: number;
  ipAddress: string;
  port: number;
  default: boolean;
  labelInstanceIpAddress: string;
  labelInstancePort: number;
  active: boolean;
  remark: string;
  processes: number[];
};

const log = createLogger({ module: "ocp", subModule: "printers" });

export const printerManager = async () => {};

export const printerHeartBeat = async () => {
-  // heat heats will be defaulted to 60 seconds no reason to allow anything else
+  // heat heats will be defaulted to 60 seconds no reason to allow anything else, and heart beats will only go to assigned printers no need to be monitoring non labeling printers
};

//export const printerStatus = async (statusNr: number, printerId: number) => {};
export const printerSync = async () => {
  // pull the printers from alpla prod and update them in lst

  const printers = await runProdApi({
    method: "get",
    endpoint: "/public/v1.0/Administration/Printers",
  });

  if (!printers?.success) {
    return returnFunc({
      success: false,
      level: "error",
      module: "ocp",
      subModule: "printer",
      message: printers?.message ?? "",
      data: printers?.data ?? [],
      notify: false,
    });
  }

  if (printers?.success && Array.isArray(printers.data)) {
    const ignorePrinters = ["pdf24", "standard"];

    const validPrinters =
      printers.data.filter(
        (n: any) =>
          !ignorePrinters.includes(n.name.toLowerCase()) && n.ipAddress,
      ) ?? [];
    if (validPrinters.length) {
      for (const printer of validPrinters as Printer[]) {
        // run an update for each printer, do on conflicts based on the printer id
        log.debug({}, `Add/Updating ${printer.name}`);

        if (printer.active) {
          await db
            .insert(printerData)
            .values({
              name: printer.name,
              humanReadableId: printer.humanReadableId,
              ipAddress: printer.ipAddress,
              port: printer.port,
              remark: printer.remark,
              processes: printer.processes,
            })
            .onConflictDoUpdate({
              target: printerData.humanReadableId,
              set: {
                name: printer.name,
                humanReadableId: printer.humanReadableId,
                ipAddress: printer.ipAddress,
                port: printer.port,
                remark: printer.remark,
                processes: printer.processes,
              },
            })
            .returning();
          await tcpPrinter(printer);
        }

        if (!printer.active) {
          log.warn({}, `${printer.name} is not active so removing from lst.`);
          await db
            .delete(printerData)
            .where(eq(printerData.humanReadableId, printer.humanReadableId));
        }
      }
      return returnFunc({
        success: true,
        level: "info",
        module: "ocp",
        subModule: "printer",
        message: `${printers.data.length} printers were just synced, this includes new and old printers`,
        data: [],
        notify: false,
      });
    }
  }

  return returnFunc({
    success: true,
    level: "info",
    module: "ocp",
    subModule: "printer",
    message: `No printers to update`,
    data: [],
    notify: false,
  });
};

const tcpPrinter = (printer: Printer) => {
  return new Promise<void>((resolve) => {
    const socket = new net.Socket();
    const timeoutMs = 15 * 1000;

    const commands = [
      {
        key: "clearAlerts",
        command: '! U1 setvar "alerts.configured" ""\r\n',
      },
      {
        key: "addAlert",
        command: `! U1 setvar "alerts.add" "ALL MESSAGES,HTTP-POST,Y,Y,http://${process.env.SERVER_IP}:${process.env.PORT}/lst/api/ocp/printer/listener/${printer.name},0,N,printer"\r\n`,
      },
      {
        key: "setFriendlyName",
        command: `! U1 setvar "device.friendly_name" "${printer.name}"\r\n`,
      },
      {
        key: "getUniqueId",
        command: '! U1 getvar "device.unique_id"\r\n',
      },
    ] as const;

    let currentCommandIndex = 0;
    let awaitingSerial = false;
    let settled = false;

    const cleanup = () => {
      socket.removeAllListeners();
      socket.destroy();
    };

    const finish = (err?: unknown) => {
      if (settled) return;
      settled = true;
      clearTimeout(timeout);
      cleanup();

      if (err) {
        log.error(
          { err, printer: printer.name },
          `Printer update failed for ${printer.name}: doing the name and alert add directly on the printer.`,
        );
      }

      resolve();
    };

    const timeout = setTimeout(() => {
      finish(`${printer.name} timed out while updating printer config`);
    }, timeoutMs);

    const sendNext = async () => {
      if (currentCommandIndex >= commands.length) {
        socket.end();
        return;
      }

      const current = commands[currentCommandIndex];

      if (!current) {
        socket.end();
        return;
      }

      awaitingSerial = current.key === "getUniqueId";

      log.info(
        { printer: printer.name, command: current.key },
        `Sending command to ${printer.name}`,
      );

      socket.write(current.command);

      currentCommandIndex++;

      // Small pause between commands so the printer has breathing room
      if (currentCommandIndex < commands.length) {
        await delay(1500);
        await sendNext();
      } else {
        // last command was sent, now wait for final data/close
        await delay(1500);
        socket.end();
      }
    };

    socket.connect(printer.port, printer.ipAddress, async () => {
      log.info({}, `Connected to ${printer.name}`);

      try {
        await sendNext();
      } catch (error) {
        finish(
          error instanceof Error
            ? error
            : new Error(
                `Unknown error while sending commands to ${printer.name}`,
              ),
        );
      }
    });

    socket.on("data", async (data) => {
      const response = data.toString().trim().replaceAll('"', "");

      log.info(
        { printer: printer.name, response },
        `Received printer response from ${printer.name}`,
      );

      if (!awaitingSerial) return;

      awaitingSerial = false;

      try {
        await db
          .update(printerData)
          .set({ printerSN: response })
          .where(eq(printerData.humanReadableId, printer.humanReadableId));
      } catch (error) {
        finish(
          error instanceof Error
            ? error
            : new Error(`Failed to update printer SN for ${printer.name}`),
        );
      }
    });

    socket.on("close", () => {
      log.info({}, `Closed connection to ${printer.name}`);
      finish();
    });

    socket.on("error", (err) => {
      finish(err);
    });
  });
};

// const tcpPrinter = async (printer: Printer) => {
//   const p = new net.Socket();
//   const commands = [
//     '! U1 setvar "alerts.configured" ""\r\n', // clean install just remove all alerts
//     `! U1 setvar "alerts.add" "ALL MESSAGES,HTTP-POST,Y,Y,http://${process.env.SERVER_IP}:${process.env.PORT}/lst/api/ocp/printer/listener/${printer.name},0,N,printer"\r\n`, // add in the all alert
//     `! U1 setvar "device.friendly_name" "${printer.name}"\r\n`, // change the name to match the alplaprod name
//     `! U1 getvar "device.unique_id"\r\n`, // this will get mapped into the printer as this is the one we will link to in the db.
//     //'! U1 getvar "alerts.configured" ""\r\n',
//   ];

//   let index = 0;
//   const sendNext = async () => {
//     if (index >= commands.length) {
//       p.end();
//       return;
//     }

//     const cmd = commands[index] as string;
//     p.write(cmd);
//     return;
//   };

//   p.connect(printer.port, printer.ipAddress, async () => {
//     log.info({}, `Connected to ${printer.name}`);
//     while (index < commands.length) {
//       await sendNext();
//       await delay(2000);
//       index++;
//     }
//   });

//   p.on("data", async (data) => {
//     // this is just the sn that comes over so we will update this printer.
//     await db
//       .update(printerData)
//       .set({ printerSN: data.toString().trim().replaceAll('"', "") })
//       .where(eq(printerData.humanReadableId, printer.humanReadableId));

//     // get the name
//     // p.write('! U1 getvar "device.friendly_name"\r\n');
//     // p.write('! U1 getvar "device.unique_id"\r\n');
//     // p.write('! U1 getvar "alerts.configured"\r\n');
//   });

//   p.on("close", () => {
//     log.info({}, `Closed connection to ${printer.name}`);
//     p.destroy();
//     return;
//   });

//   p.on("error", (err) => {
//     log.info(
//       { stack: err },
//       `${printer.name} encountered an error while trying to update`,
//     );
//     return;
//   });
// };
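The `commands` array speaks Zebra's SGD syntax (`! U1 setvar/getvar ...`). A sketch of the `alerts.add` string the sync pushes, with invented env values; the comma-separated fields appear to be condition, destination, on-set, on-clear, address, port, quelling, and destination name:

```ts
// Illustrative only: example env values and a hypothetical printer name.
const SERVER_IP = "10.75.2.38";
const PORT = "3000";
const printerName = "LBL-LINE-01";

const alertAdd =
  `! U1 setvar "alerts.add" ` +
  `"ALL MESSAGES,HTTP-POST,Y,Y,` +
  `http://${SERVER_IP}:${PORT}/lst/api/ocp/printer/listener/${printerName},0,N,printer"\r\n`;
// Every printer message then POSTs back to the listener route shown earlier.
```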
backend/ocp/ocp.printer.update.ts (new file, 38 lines)
@@ -0,0 +1,38 @@
/**
 * the route that listens for the printers post.
 *
 * and http-post alert should be setup on each printer pointing to at min you will want to make the alert for
 * pause printer, you can have all on here as it will also monitor and do things on all messages
 *
 * http://{serverIP}:2222/lst/api/ocp/printer/listener/{printerName}
 *
 * the messages will be sent over to the db for logging as well as specific ones will do something
 *
 * pause will validate if can print
 * close head will repause the printer so it wont print a label
 * power up will just repause the printer so it wont print a label
 */

import { Router } from "express";

import { apiReturn } from "../utils/returnHelper.utils.js";
//import { tryCatch } from "../utils/trycatch.utils.js";
import { printerSync } from "./ocp.printer.manage.js";

const r = Router();

r.post("/printer/update", async (_, res) => {
  printerSync();
  return apiReturn(res, {
    success: true,
    level: "info",
    module: "ocp",
    subModule: "printing",
    message:
      "Printer update has been triggered to monitor progress please head to the logs.",
    data: [],
    status: 200,
  });
});

export default r;
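To exercise the pipeline end to end without a physical printer, one could imitate the firmware's multipart POST against the listener (multer's `upload.any()` parses the fields into `req.body`). The host, port, and values below are placeholders:

```ts
// Hypothetical smoke test: pretend to be a printer posting an alert.
const simulatePrinterAlert = async () => {
  const form = new FormData(); // fetch sends this as multipart/form-data
  form.append("uniqueId", "XXZKJ123456789");
  form.append("alertMsg", encodeURIComponent("PAPER OUT: load media"));

  const res = await fetch(
    "http://localhost:3000/lst/api/ocp/printer/listener/LBL-LINE-01",
    { method: "POST", body: form },
  );
  console.log(res.status, await res.json());
};
```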
@@ -2,6 +2,7 @@ import { type Express, Router } from "express";
 import { requireAuth } from "../middleware/auth.middleware.js";
 import { featureCheck } from "../middleware/featureActive.middleware.js";
 import listener from "./ocp.printer.listener.js";
+import update from "./ocp.printer.update.js";

 export const setupOCPRoutes = (baseUrl: string, app: Express) => {
   //setup all the routes

@@ -16,6 +17,7 @@ export const setupOCPRoutes = (baseUrl: string, app: Express) => {
   // auth routes below here
   router.use(requireAuth);

+  router.use(update);
   //router.use("");

   app.use(`${baseUrl}/api/ocp`, router);
@@ -7,12 +7,17 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
 export let pool: sql.ConnectionPool;
 export let connected: boolean = false;
 export let reconnecting = false;
+// start the delay out as 2 seconds
+let delayStart = 2000;
+let attempt = 0;
+const maxAttempts = 10;

 export const connectProdSql = async () => {
   const serverUp = await checkHostnamePort(`${process.env.PROD_SERVER}:1433`);
   if (!serverUp) {
     // we will try to reconnect
     connected = false;
+    reconnectToSql();
     return returnFunc({
       success: false,
       level: "error",

@@ -48,6 +53,7 @@ export const connectProdSql = async () => {
       notify: false,
     });
   } catch (error) {
+    reconnectToSql();
     return returnFunc({
       success: false,
       level: "error",

@@ -104,11 +110,6 @@ export const reconnectToSql = async () => {
   //set reconnecting to true while we try to reconnect
   reconnecting = true;

-  // start the delay out as 2 seconds
-  let delayStart = 2000;
-  let attempt = 0;
-  const maxAttempts = 10;

   while (!connected && attempt < maxAttempts) {
     attempt++;
     log.info(

@@ -121,7 +122,7 @@ export const reconnectToSql = async () => {
     if (!serverUp) {
       delayStart = Math.min(delayStart * 2, 30000); // exponential backoff until up to 30000
-      return;
+      continue;
     }

     try {

@@ -133,19 +134,12 @@ export const reconnectToSql = async () => {
       );
     } catch (error) {
-      delayStart = Math.min(delayStart * 2, 30000);
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "db",
-        message: "Failed to reconnect to the prod sql server.",
-        data: [error],
-        notify: false,
-      });
+      delayStart = Math.min(delayStart * 2, 30000);
+      log.error({ error }, "Failed to reconnect to the prod sql server.");
     }
   }

-  if (!connected) {
+  if (!connected && attempt >= maxAttempts) {
     log.error(
       { notify: true },
       "Max reconnect attempts reached on the prodSql server. Stopping retries.",
@@ -1,10 +1,5 @@
 import { returnFunc } from "../utils/returnHelper.utils.js";
-import {
-  connected,
-  pool,
-  reconnecting,
-  reconnectToSql,
-} from "./prodSqlConnection.controller.js";
+import { connected, pool } from "./prodSqlConnection.controller.js";

 interface SqlError extends Error {
   code?: string;

@@ -22,29 +17,15 @@ interface SqlError extends Error {
 */
 export const prodQuery = async (queryToRun: string, name: string) => {
   if (!connected) {
-    reconnectToSql();
-
-    if (reconnecting) {
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "prodSql",
-        message: `The sql ${process.env.PROD_PLANT_TOKEN} is trying to reconnect already`,
-        data: [],
-        notify: false,
-      });
-    } else {
-      return returnFunc({
-        success: false,
-        level: "error",
-        module: "system",
-        subModule: "prodSql",
-        message: `${process.env.PROD_PLANT_TOKEN} is not connected, and failed to connect.`,
-        data: [],
-        notify: true,
-      });
-    }
+    return returnFunc({
+      success: false,
+      level: "error",
+      module: "system",
+      subModule: "prodSql",
+      message: `${process.env.PROD_PLANT_TOKEN} is offline or attempting to reconnect`,
+      data: [],
+      notify: false,
+    });
   }

   //change to the correct server
@@ -1,6 +1,6 @@
 use AlplaPROD_test1

-SELECT V_Artikel.IdArtikelvarianten,
+SELECT V_Artikel.IdArtikelvarianten as article,
        V_Artikel.Bezeichnung,
        V_Artikel.ArtikelvariantenTypBez,
        V_Artikel.PreisEinheitBez,
backend/prodSql/queries/datamart.activeArticles2.sql (new file, 43 lines)
@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
make a note in the docs this activeArticles will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]

select a.Id,
       a.HumanReadableId as av,
       a.Alias as alias,
       p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
       p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
       p.LoadingUnitPieces,
       case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
       case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a

/* sales price */
left join
    (select *
     from (select
               id,
               PackagingId,
               ArticleId,
               DefaultCustomer,
               ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
           from masterData.SalesPrice (nolock)
           where DefaultCustomer = 1) as x
     where RowNum = 1
    ) as s
    on a.id = s.ArticleId

/* pkg instructions */
left join
    masterData.PackagingInstruction (nolock) as p
    on s.PackagingId = p.id

/* stock limits */
left join
    masterData.StockLimit (nolock) as i
    on a.id = i.ArticleId

where a.active = 1
and a.HumanReadableId in ([articles])
backend/prodSql/queries/datamart.customerInventory.sql (new file, 45 lines)
@@ -0,0 +1,45 @@
select x.idartikelVarianten as av
      ,ArtikelVariantenAlias as Alias
      --x.Lfdnr as RunningNumber,
      --,round(sum(EinlagerungsMengeVPKSum),0) as Total_Pallets
      --,sum(EinlagerungsMengeSum) as Total_PalletQTY
      ,round(sum(VerfuegbareMengeVPKSum),0) as Avalible_Pallets
      ,sum(VerfuegbareMengeSum) as Avaliable_PalletQTY
      ,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as COA_Pallets
      ,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as COA_QTY
      --,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as Held_Pallets
      --,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeSum else 0 end) as Held_QTY
      ,IdProdPlanung as Lot
      --,IdAdressen
      --,x.AdressBez
      --,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x

left join
    [AlplaPROD_test1].dbo.T_EtikettenGedruckt (nolock) on
    x.Lfdnr = T_EtikettenGedruckt.Lfdnr AND T_EtikettenGedruckt.Lfdnr > 1

left join
    (SELECT *
     FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) where Active = 1) as c
    on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in excell by default everything will be passed over
IdAdressen = 3
*/
where
    --IdArtikelTyp = 1
    x.IdWarenlager not in (6, 1)
    --and IdAdressen
    --and x.IdWarenlager in (0)

group by x.IdArtikelVarianten
        ,ArtikelVariantenAlias
        ,IdProdPlanung
        --,c.Description
        ,IdAdressen
        ,x.AdressBez
        --, x.Lfdnr
order by x.IdArtikelVarianten
backend/prodSql/queries/datamart.deliveryByDateRange.sql (new file, 74 lines)
@@ -0,0 +1,74 @@
use [test1_AlplaPROD2.0_Read]

DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
SELECT
    r.[ArticleHumanReadableId]
   ,[ReleaseNumber]
   ,h.CustomerOrderNumber
   ,x.CustomerLineItemNumber
   ,[CustomerReleaseNumber]
   ,[ReleaseState]
   ,[DeliveryState]
   ,ea.JournalNummer as BOL_Number
   ,[ReleaseConfirmationState]
   ,[PlanningState]
   ,format(r.[OrderDate], 'yyyy-MM-dd HH:mm') as OrderDate
   --,r.[OrderDate]
   ,FORMAT(r.[DeliveryDate], 'yyyy-MM-dd HH:mm') as DeliveryDate
   --,r.[DeliveryDate]
   ,FORMAT(r.[LoadingDate], 'yyyy-MM-dd HH:mm') as LoadingDate
   --,r.[LoadingDate]
   ,[Quantity]
   ,[DeliveredQuantity]
   ,r.[AdditionalInformation1]
   ,r.[AdditionalInformation2]
   ,[TradeUnits]
   ,[LoadingUnits]
   ,[Trucks]
   ,[LoadingToleranceType]
   ,[SalesPrice]
   ,[Currency]
   ,[QuantityUnit]
   ,[SalesPriceRemark]
   ,r.[Remark]
   ,[Irradiated]
   ,r.[CreatedByEdi]
   ,[DeliveryAddressHumanReadableId]
   ,DeliveryAddressDescription
   ,[CustomerArtNo]
   ,[TotalPrice]
   ,r.[ArticleAlias]

FROM [order].[Release] (nolock) as r

left join
    [order].LineItem as x on
    r.LineItemId = x.id

left join
    [order].Header as h on
    x.HeaderId = h.id

--bol stuff
left join
    AlplaPROD_test1.dbo.V_LadePlanungenLadeAuftragAbruf (nolock) as zz
    on zz.AbrufIdAuftragsAbruf = r.ReleaseNumber

left join
    (select * from (SELECT
        ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
        ,*
     FROM [AlplaPROD_test1].[dbo].[T_Lieferungen] (nolock)) x

     where RowNum = 1) as ea on
    zz.IdLieferschein = ea.IdJournal

where
    --r.ReleaseNumber = 1452
    r.DeliveryDate between @StartDate AND @EndDate
    and DeliveredQuantity > 0
    --and r.ArticleHumanReadableId in ([articles])
    --and Journalnummer = 169386
29
backend/prodSql/queries/datamart.fakeEDIUpdate.sql
Normal file
@@ -0,0 +1,29 @@
use [test1_AlplaPROD2.0_Read]

select
customerartno as CustomerArticleNumber
,h.CustomerOrderNumber as CustomerOrderNumber
,l.CustomerLineItemNumber as CustomerLineNumber
,r.CustomerReleaseNumber as CustomerReleaseNumber
,r.Quantity
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as DeliveryDate
,h.CustomerHumanReadableId as CustomerID
,r.Remark
--,*
from [order].[Release] as r (nolock)

left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId

left join
[order].Header as h (nolock) on
h.id = l.HeaderId

WHERE releaseState not in (1, 2, 3, 4)
AND h.CreatedByEdi = 1
AND r.deliveryDate < getdate() + 1
--AND h.CustomerHumanReadableId in (0)

order by r.deliveryDate
8
backend/prodSql/queries/datamart.forecast.sql
Normal file
@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
,ArticleHumanReadableId
,CustomerArticleNumber
,ArticleDescription
,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate
58
backend/prodSql/queries/datamart.inventory.sql
Normal file
@@ -0,0 +1,58 @@
use [test1_AlplaPROD2.0_Read]

select
ArticleHumanReadableId as article
,ArticleAlias as alias
,round(sum(QuantityLoadingUnits),2) total_pallets
,round(sum(Quantity),2) as total_palletQTY
,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
--,l.RunningNumber

/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot

/** datamart include location data **/
--,l.WarehouseDescription,l.LaneDescription

,articleTypeName

FROM [warehousing].[WarehouseUnit] as l (nolock)
left join
(
 SELECT [Id]
 ,[HumanReadableId]
 ,d.[Description]
 ,[DefectGroupId]
 ,[IsActive]
 FROM [blocking].[BlockingDefect] as g (nolock)

 left join
 [AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
 d.IdGlobalBlockingDefect = g.HumanReadableId
) as b on
b.id = l.MainDefectId

left join
[warehousing].[warehouse] as w (nolock) on
w.id = l.warehouseid

where LaneHumanReadableId not in (20000,21000)
group by ArticleHumanReadableId,
ArticleAlias,
ArticleTypeName
--,l.RunningNumber

/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber

/** datamart include location data **/
--,l.WarehouseDescription,l.LaneDescription

order by ArticleHumanReadableId
48
backend/prodSql/queries/datamart.legacy.inventory.sql
Normal file
@@ -0,0 +1,48 @@
select
x.idartikelVarianten as article,
x.ArtikelVariantenAlias as alias
--x.Lfdnr as RunningNumber,
,round(sum(EinlagerungsMengeVPKSum),2) as total_pallets
,sum(EinlagerungsMengeSum) as total_palletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as available_Pallets
,sum(VerfuegbareMengeSum) as available_QTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as coa_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as coa_QTY
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeVPKSum else 0 end) as held_Pallets
,sum(case when c.Description NOT LIKE '%COA%' or x.IdMainDefect = -1 then GesperrteMengeSum else 0 end) as held_QTY
,sum(case when x.WarenLagerLagerTyp = 8 then VerfuegbareMengeSum else 0 end) as consignment_qty
,IdProdPlanung as lot
----,IdAdressen,
,x.AdressBez
,x.IdLagerAbteilung as locationId
,x.LagerAbteilungKurzBez as laneDescription
,x.IdWarenlager as warehouseId
,x.WarenLagerKurzBez as warehouseDescription
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x

left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt as l (nolock) on
x.Lfdnr = l.Lfdnr AND l.Lfdnr > 1

left join
(SELECT *
 FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything is passed over.
IdAdressen = 3
*/
where /*IdArtikelTyp = 1 and */x.IdWarenlager not in (6, 1)

group by x.idartikelVarianten, ArtikelVariantenAlias, c.Description
--,IdAdressen
,x.AdressBez
,IdProdPlanung
,x.IdLagerAbteilung
,x.LagerAbteilungKurzBez
,x.IdWarenlager
,x.WarenLagerKurzBez
--, x.Lfdnr
order by x.IdArtikelVarianten
33
backend/prodSql/queries/datamart.openOrders.sql
Normal file
@@ -0,0 +1,33 @@
use [test1_AlplaPROD2.0_Read]

select
customerartno
,r.ArticleHumanReadableId as article
,r.ArticleAlias as articleAlias
,ReleaseNumber
,h.CustomerOrderNumber as header
,l.CustomerLineItemNumber as lineItem
,r.CustomerReleaseNumber as releaseNumber
,r.LoadingUnits
,r.Quantity
,r.TradeUnits
,h.CustomerHumanReadableId
,r.DeliveryAddressDescription
,format(r.LoadingDate, 'MM/dd/yyyy HH:mm') as loadingDate
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as deliveryDate
,r.Remark
--,*
from [order].[Release] as r (nolock)

left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId

left join
[order].Header as h (nolock) on
h.id = l.HeaderId

WHERE releasestate not in (1, 2, 4)
AND r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]

order by r.deliverydate
19
backend/prodSql/queries/datamart.productionData.sql
Normal file
@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]

declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'

select MachineLocation,
ArticleHumanReadableId as article,
sum(Quantity) as Produced,
count(Quantity) as palletsProduced,
FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
ProductionLotHumanReadableId as productionLot

from [reporting_productionControlling].[ScannedUnit] (nolock)

where convert(date, ProductionDay) between @startDate and @endDate
and ArticleHumanReadableId in ([articles])
and BookedOut is null

group by MachineLocation, ArticleHumanReadableId, ProductionDay, ProductionLotHumanReadableId
@@ -1,5 +1,10 @@
-use [test1_AlplaPROD2.0_Read]
+use AlplaPROD_test1
+/**
+move this over to the delivery date range query once we have the shift data mapped over correctly.
+update the psi stuff on this as well.
+**/
 DECLARE @StartDate DATE = '[startDate]' -- 2025-1-1
 DECLARE @EndDate DATE = '[endDate]' -- 2025-1-31
 SELECT
@@ -66,9 +71,9 @@ ROW_NUMBER() OVER (PARTITION BY IdJournal ORDER BY add_date DESC) AS RowNum
 zz.IdLieferschein = ea.IdJournal

 where
---r.ArticleHumanReadableId in ([articles])
+r.ArticleHumanReadableId in ([articles])
 --r.ReleaseNumber = 1452

-r.DeliveryDate between @StartDate AND @EndDate
-and DeliveredQuantity > 0
---and Journalnummer = 169386
+and r.DeliveryDate between @StartDate AND @EndDate
+--and DeliveredQuantity > 0
+--and Journalnummer = 169386
32
backend/prodSql/queries/datamart.psiPlanningData.sql
Normal file
@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over, as well as the date structure we want to see
*/

select x.IdArtikelvarianten As Article,
ProduktionAlias as Description,
standort as MachineId,
MaschinenBezeichnung as MachineName,
--MaschZyklus as PlanningCycleTime,
x.IdProdPlanung as LotNumber,
FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
x.planMenge as TotalPlanned,
ProduktionMenge as QTYPerDay,
round(ProduktionMengeVPK, 2) PalDay,
Status as finished
--MaschStdAuslastung as nee

from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x

left join
dbo.V_ProdPlanung (nolock) as p on
x.IdProdPlanung = p.IdProdPlanung

where ProdTag between @start_date and @end_date
and p.IdArtikelvarianten in ([articles])
--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
--and IdProdPlanung = 18442

order by ProdTag desc
11
backend/prodSql/queries/featureCheck.sql
Normal file
@@ -0,0 +1,11 @@
SELECT count(*) as activated
FROM [test1_AlplaPROD2.0_Read].[support].[FeatureActivation]

where feature in (108,7)

/*
As more features get activated and their new endpoints need this check, add the feature IDs here.
108 = waste
7 = warehousing
*/
4
backend/prodSql/queries/shiftChange.sql
Normal file
@@ -0,0 +1,4 @@
select top(1) convert(varchar(8), convert(time, startdate), 108) as shiftChange
from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
where teamNumber = 1
@@ -45,7 +45,7 @@ export const monitorAlplaPurchase = async () => {
   }

   if (purchaseMonitor[0]?.active) {
-    createCronJob("purchaseMonitor", "0 */5 * * * *", async () => {
+    createCronJob("purchaseMonitor", "0 5 * * * *", async () => {
       try {
         const result = await prodQuery(
           sqlQuery.query.replace(
@@ -10,6 +10,7 @@ import { setupOCPRoutes } from "./ocp/ocp.routes.js";
 import { setupOpendockRoutes } from "./opendock/opendock.routes.js";
 import { setupProdSqlRoutes } from "./prodSql/prodSql.routes.js";
 import { setupSystemRoutes } from "./system/system.routes.js";
+import { setupTCPRoutes } from "./tcpServer/tcp.routes.js";
 import { setupUtilsRoutes } from "./utils/utils.routes.js";

 export const setupRoutes = (baseUrl: string, app: Express) => {
@@ -24,4 +25,5 @@ export const setupRoutes = (baseUrl: string, app: Express) => {
   setupOpendockRoutes(baseUrl, app);
   setupNotificationRoutes(baseUrl, app);
   setupOCPRoutes(baseUrl, app);
+  setupTCPRoutes(baseUrl, app);
 };
@@ -6,14 +6,17 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
 import { type Setting, settings } from "./db/schema/settings.schema.js";
 import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
 import { createLogger } from "./logger/logger.controller.js";
+import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
 import { startNotifications } from "./notification/notification.controller.js";
 import { createNotifications } from "./notification/notifications.master.js";
+import { printerSync } from "./ocp/ocp.printer.manage.js";
 import { monitorReleaseChanges } from "./opendock/openDockRreleaseMonitor.utils.js";
 import { opendockSocketMonitor } from "./opendock/opendockSocketMonitor.utils.js";
 import { connectProdSql } from "./prodSql/prodSqlConnection.controller.js";
 import { monitorAlplaPurchase } from "./purchase/purchase.controller.js";
 import { setupSocketIORoutes } from "./socket.io/serverSetup.js";
 import { baseSettingValidationCheck } from "./system/settingsBase.controller.js";
+import { startTCPServer } from "./tcpServer/tcp.server.js";
 import { createCronJob } from "./utils/croner.utils.js";
 import { sendEmail } from "./utils/sendEmail.utils.js";

@@ -29,6 +32,7 @@ const start = async () => {
   const log = createLogger({ module: "system", subModule: "main start" });

   // triggering long lived processes
+  startTCPServer();
   connectProdSql();
   connectGPSql();

@@ -52,11 +56,16 @@ const start = async () => {
     monitorAlplaPurchase();
   }

+  if (systemSettings.filter((n) => n.name === "ocp")[0]?.active) {
+    printerSync();
+  }
+
   // these jobs below are system jobs and should run no matter what.
   createCronJob("JobAuditLogCleanUp", "0 0 5 * * *", () =>
     dbCleanup("jobs", 30),
   );
   createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
+  historicalSchedule();

   // one shots only needed to run on server startups
   createNotifications();
@@ -1,9 +1,12 @@
 import { Router } from "express";
+import { connected as gpSql } from "../gpSql/gpSqlConnection.controller.js";
+import { connected as prodSql } from "../prodSql/prodSqlConnection.controller.js";
 import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
 import {
   type SqlQuery,
   sqlQuerySelector,
 } from "../prodSql/prodSqlQuerySelector.utils.js";
+import { isServerRunning } from "../tcpServer/tcp.server.js";

 const router = Router();

@@ -25,6 +28,9 @@ router.get("/", async (_, res) => {
       : [],
     eomFGPkgSheetVersion: 1, // this is the excel file version; when we have a change to the macro we want to grab this
     masterMacroFile: 1,
+    tcpServerOnline: isServerRunning,
+    sqlServerConnected: prodSql,
+    gpServerConnected: gpSql,
   });
 });
49
backend/system/system.mobileApp.ts
Normal file
@@ -0,0 +1,49 @@
import fs from "node:fs";
import path from "node:path";
import { fileURLToPath } from "node:url";
import { Router } from "express";

const router = Router();

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

const downloadDir = path.resolve(__dirname, "../../downloads/mobile");

const currentApk = {
  packageName: "net.alpla.lst.mobile",
  versionName: "0.0.1-alpha",
  versionCode: 1,
  minSupportedVersionCode: 1,
  fileName: "lst-mobile.apk",
};

router.get("/version", async (req, res) => {
  const baseUrl = `${req.protocol}://${req.get("host")}`;

  res.json({
    packageName: currentApk.packageName,
    versionName: currentApk.versionName,
    versionCode: currentApk.versionCode,
    minSupportedVersionCode: currentApk.minSupportedVersionCode,
    downloadUrl: `${baseUrl}/lst/api/mobile/apk/latest`,
  });
});

router.get("/apk/latest", (_, res) => {
  const apkPath = path.join(downloadDir, currentApk.fileName);

  if (!fs.existsSync(apkPath)) {
    return res.status(404).json({ success: false, message: "APK not found" });
  }

  res.setHeader("Content-Type", "application/vnd.android.package-archive");
  res.setHeader(
    "Content-Disposition",
    `attachment; filename="${currentApk.fileName}"`,
  );

  return res.sendFile(apkPath);
});

export default router;
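A sketch of how a scanner could consume these two endpoints on startup; the `host` value (ip:port) and the `checkForUpdate` name are placeholders, while the response shape mirrors the /version JSON above:

```ts
import axios from "axios";

// Sketch, assuming the /version response shape defined above.
// "host" and checkForUpdate are illustrative placeholders.
const checkForUpdate = async (host: string, installedVersionCode: number) => {
  const { data } = await axios.get(`http://${host}/lst/api/mobile/version`);

  if (installedVersionCode < data.minSupportedVersionCode) {
    // hard block: the APK at data.downloadUrl must be installed first
    return { mustUpdate: true, downloadUrl: data.downloadUrl };
  }

  // soft prompt when a newer build exists
  return {
    mustUpdate: false,
    updateAvailable: installedVersionCode < data.versionCode,
    downloadUrl: data.downloadUrl,
  };
};
```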
@@ -3,10 +3,12 @@ import { requireAuth } from "../middleware/auth.middleware.js";
 import getSettings from "./settings.route.js";
 import updSetting from "./settingsUpdate.route.js";
 import stats from "./stats.route.js";
+import mobile from "./system.mobileApp.js";

 export const setupSystemRoutes = (baseUrl: string, app: Express) => {
   // stats stays like this as we don't need to change it
   app.use(`${baseUrl}/api/stats`, stats);
+  app.use(`${baseUrl}/api/mobile`, mobile);
   app.use(`${baseUrl}/api/settings`, getSettings);
   app.use(`${baseUrl}/api/settings`, requireAuth, updSetting);
51
backend/tcpServer/tcp.printerListener.ts
Normal file
@@ -0,0 +1,51 @@
import { db } from "../db/db.controller.js";
import { printerLog } from "../db/schema/printerLogs.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";

export type PrinterData = {
  ip: string;
  name: string;
  condition: string;
  message: string;
  date?: string;
  printerSN: string;
};

const log = createLogger({ module: "tcp", submodule: "create_server" });

export const printerListen = async (tcpData: PrinterData) => {
  // strip the IPv4-mapped-IPv6 prefix so the address matches what we store
  const ip = tcpData.ip?.replace("::ffff:", "");

  // post the new message
  const { data, error } = await tryCatch(
    db
      .insert(printerLog)
      .values({
        ip,
        name: tcpData.name,
        condition: tcpData.condition,
        message: tcpData.message,
        printerSN: tcpData.printerSN,
      })
      .returning(),
  );

  if (error) {
    return returnFunc({
      success: false,
      level: "error",
      module: "tcp",
      subModule: "post",
      message: "Failed to post tcp printer data.",
      data: [],
      notify: false,
    });
  }

  if (data) {
    log.info({}, `${tcpData.name} sent a message over`);
    // TODO: send message over to the controller to decide what to do next with it
  }
};
14
backend/tcpServer/tcp.routes.ts
Normal file
@@ -0,0 +1,14 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import restart from "./tcpRestart.route.js";
import start from "./tcpStart.route.js";
import stop from "./tcpStop.route.js";

export const setupTCPRoutes = (baseUrl: string, app: Express) => {
  // TCP server control endpoints; all of them require auth
  app.use(`${baseUrl}/api/tcp/start`, requireAuth, start);
  app.use(`${baseUrl}/api/tcp/stop`, requireAuth, stop);
  app.use(`${baseUrl}/api/tcp/restart`, requireAuth, restart);

  // all other system routes should be under /api/system/*
};
180
backend/tcpServer/tcp.server.ts
Normal file
@@ -0,0 +1,180 @@
import net from "node:net";
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { type PrinterData, printerListen } from "./tcp.printerListener.js";

let tcpServer: net.Server;
const tcpSockets: Set<net.Socket> = new Set();
export let isServerRunning = false;

const port = parseInt(process.env.TCP_PORT ?? "2222", 10);

const parseTcpAlert = (input: string) => {
  // guard: every alert is shaped as "CONDITION: message [date] [name]"
  const colonIndex = input.indexOf(":");
  if (colonIndex === -1) return null;

  const condition = input.slice(0, colonIndex).trim();
  const rest = input.slice(colonIndex + 1).trim();

  // extract all [ ... ] blocks from rest
  const matches = [...rest.matchAll(/\[(.*?)\]/g)];

  const date = matches[0]?.[1] ?? "";
  const name = matches[1]?.[1] ?? "";

  // message = everything before the first "["
  const bracketIndex = rest.indexOf("[");
  const message =
    bracketIndex !== -1 ? rest.slice(0, bracketIndex).trim() : rest;

  return {
    condition,
    message,
    date,
    name,
  };
};

const log = createLogger({ module: "tcp", submodule: "create_server" });

export const startTCPServer = async () => {
  tcpServer = net.createServer(async (socket) => {
    tcpSockets.add(socket);
    socket.on("data", async (data: Buffer) => {
      const parseData = data.toString("utf-8").trimEnd();

      // check where the data came from, then we do something.
      const ip = socket.remoteAddress ?? "127.0.0.1";
      const { data: printer, error: pError } = await tryCatch(
        db
          .select()
          .from(printerData)
          .where(eq(printerData.ipAddress, ip.replace("::ffff:", ""))),
      );
      if (pError) {
        log.error(
          { stack: pError },
          "There was an error getting printer data for tcp check",
        );
        return;
      }

      if (printer?.length) {
        // named payload so it does not shadow the printerData schema import
        const payload = {
          ...parseTcpAlert(parseData),
          ip,
          printerSN: printer[0]?.printerSN,
          name: printer[0]?.name,
        };

        printerListen(payload as PrinterData);
      }
    });

    socket.on("end", () => {
      log.debug({}, "Client disconnected");
      // just in case we don't fully disconnect
      setTimeout(() => {
        if (!socket.destroyed) {
          socket.destroy();
        }
      }, 1000);
      tcpSockets.delete(socket);
    });

    socket.on("error", (err: Error) => {
      log.error({ stack: err }, `Socket error: ${err}`);
      // just in case we don't fully disconnect
      setTimeout(() => {
        if (!socket.destroyed) {
          socket.destroy();
        }
      }, 1000);
      tcpSockets.delete(socket);
    });
  });

  tcpServer.listen(port, () => {
    log.info({}, `TCP Server listening on port ${port}`);
  });

  isServerRunning = true;
  return returnFunc({
    success: true,
    level: "info",
    module: "tcp",
    subModule: "create_server",
    message: "TCP server started.",
    data: [],
    notify: false,
    room: "",
  });
};

export const stopTCPServer = async () => {
  if (!isServerRunning)
    return { success: false, message: "Server is not running" };
  for (const socket of tcpSockets) {
    socket.destroy();
  }
  tcpSockets.clear();
  tcpServer.close(() => {
    log.info({}, "TCP Server stopped");
  });
  isServerRunning = false;
  return returnFunc({
    success: true,
    level: "info",
    module: "tcp",
    subModule: "create_server",
    message: "TCP server stopped.",
    data: [],
    notify: false,
    room: "",
  });
};

export const restartTCPServer = async () => {
  if (!isServerRunning) {
    startTCPServer();
    return returnFunc({
      success: false,
      level: "warn",
      module: "tcp",
      subModule: "create_server",
      message: "Server is not running; trying to start it.",
      data: [],
      notify: false,
      room: "",
    });
  } else {
    for (const socket of tcpSockets) {
      socket.destroy();
    }
    tcpSockets.clear();
    tcpServer.close(() => {
      log.info({}, "TCP Server stopped");
    });
    isServerRunning = false;

    await delay(1500);

    startTCPServer();
  }

  return returnFunc({
    success: true,
    level: "info",
    module: "tcp",
    subModule: "create_server",
    message: "TCP server has been restarted.",
    data: [],
    notify: false,
    room: "",
  });
};
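For reference, parseTcpAlert above expects payloads shaped as `CONDITION: message [date] [printer name]`. A minimal sketch of pushing one alert at the listener; the sample payload and host are invented to match the parser, and real printers may format fields differently:

```ts
import net from "node:net";

// Invented sample that satisfies parseTcpAlert:
// condition before ":", free-text message, then [date] and [name] blocks.
const sample = "ERROR: Ribbon out [2026-04-15 08:30:00] [Dock 2 label printer]";

const client = net.createConnection({ host: "127.0.0.1", port: 2222 }, () => {
  // parseTcpAlert(sample) -> { condition: "ERROR", message: "Ribbon out",
  //   date: "2026-04-15 08:30:00", name: "Dock 2 label printer" }
  client.end(sample);
});
```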
19
backend/tcpServer/tcpRestart.route.ts
Normal file
@@ -0,0 +1,19 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { restartTCPServer } from "./tcp.server.js";

const r = Router();

r.post("/restart", async (_, res) => {
  const connect = await restartTCPServer();
  apiReturn(res, {
    success: connect.success,
    level: connect.success ? "info" : "error",
    module: "tcp",
    subModule: "post",
    message: connect.message,
    data: connect.data,
    status: connect.success ? 200 : 400,
  });
});
export default r;
20
backend/tcpServer/tcpStart.route.ts
Normal file
@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { startTCPServer } from "./tcp.server.js";

const r = Router();

r.post("/start", async (_, res) => {
  const connect = await startTCPServer();
  apiReturn(res, {
    success: connect.success,
    level: connect.success ? "info" : "error",
    module: "tcp",
    subModule: "post",
    message: connect.message,
    data: connect.data,
    status: connect.success ? 200 : 400,
  });
});

export default r;
20
backend/tcpServer/tcpStop.route.ts
Normal file
@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { stopTCPServer } from "./tcp.server.js";

const r = Router();

r.post("/stop", async (_, res) => {
  const connect = await stopTCPServer();
  apiReturn(res, {
    success: connect.success,
    level: connect.success ? "info" : "error",
    module: "tcp",
    subModule: "post",
    message: connect.message,
    data: [],
    status: connect.success ? 200 : 400,
  });
});

export default r;
@@ -9,6 +9,7 @@ export const allowedOrigins = [
   "http://localhost:4000",
   "http://localhost:4001",
   "http://localhost:5500",
+  "http://localhost:8081",
   "https://admin.socket.io",
   "https://electron-socket-io-playground.vercel.app",
   `${process.env.URL}`,
@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
 import { db } from "../db/db.controller.js";
 import { jobAuditLog } from "../db/schema/auditLog.schema.js";
 import { createLogger } from "../logger/logger.controller.js";
+import type { ReturnHelper } from "./returnHelper.utils.js";

 // example createJob
 // createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
 export const createCronJob = async (
   name: string,
   schedule: string, // six-field cron string, e.g. */5 * * * * * = every 5th second
-  task: () => Promise<void>, // what function are we passing over
+  task: () => Promise<void | ReturnHelper>, // what function are we passing over
   source = "unknown",
 ) => {
   // get the timezone based on the os timezone set
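A quick sketch of registering a job against this signature; the job name and body are hypothetical:

```ts
// Hypothetical job; only the createCronJob(name, schedule, task, source?)
// signature is taken from croner.utils.ts above.
createCronJob("printerHealthCheck", "0 */5 * * * *", async () => {
  // fires at second 0 of every 5th minute (six-field, seconds-first format)
});
```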
124
backend/utils/prodEndpoint.utils.ts
Normal file
@@ -0,0 +1,124 @@
import https from "node:https";
import axios from "axios";
import { returnFunc } from "./returnHelper.utils.js";
import { tryCatch } from "./trycatch.utils.js";

type bodyData = unknown;

type Data = {
  endpoint: string;
  data?: bodyData[];
  method: "post" | "get" | "delete" | "patch";
};

// type ApiResponse<T = unknown> = {
//   status: number;
//   statusText: string;
//   data: T;
// };

// create the test server stuff
const testServers = [
  { token: "test1", port: 8940 },
  { token: "test2", port: 8941 },
  { token: "test3", port: 8942 },
];

const agent = new https.Agent({
  rejectUnauthorized: false,
});

export const prodEndpointCreation = async (endpoint: string) => {
  let url = "";
  // get the plant token
  const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";

  // check if we are a test server
  const testServer = testServers.some((server) => server.token === plantToken);

  // await db
  //   .select()
  //   .from(settings)
  //   .where(eq(settings.name, "dbServer"));

  if (testServer) {
    // filter out which test server we are
    const test = testServers.filter((t) => t.token === plantToken);
    // "https://usmcd1vms036.alpla.net:8942/application/public/v1.0/DemandManagement/ORDERS"
    url = `https://${process.env.PROD_SERVER}.alpla.net:${test[0]?.port}/application${endpoint}`;
    return url;
  } else {
    url = `https://${plantToken}prod.alpla.net/application${endpoint}`;
    return url;
  }
};

/**
 * Runs a request against the prod API URL built by prodEndpointCreation.
 * @param data endpoint, method, and optional body (the first element is sent)
 * @returns a returnFunc result keyed off the HTTP status
 */
export const runProdApi = async (data: Data) => {
  const url = await prodEndpointCreation(data.endpoint);

  const { data: d, error } = await tryCatch(
    axios({
      method: data.method as string,
      url,
      data: data.data ? data.data[0] : undefined,
      headers: {
        "X-API-Key": process.env.TEC_API_KEY || "",
        "Content-Type": "application/json",
      },
      validateStatus: () => true,
      httpsAgent: agent,
    }),
  );

  switch (d?.status) {
    case 200:
      return returnFunc({
        success: true,
        level: "info",
        module: "utils",
        subModule: "prodEndpoint",
        message: "Data from prod endpoint",
        data: d.data,
        notify: false,
      });

    case 401:
      return returnFunc({
        success: false,
        level: "error",
        module: "utils",
        subModule: "prodEndpoint",
        message: "Unauthorized response from prod endpoint",
        data: d.data,
        notify: false,
      });
    case 400:
      return returnFunc({
        success: false,
        level: "error",
        module: "utils",
        subModule: "prodEndpoint",
        message: "Bad request sent to prod endpoint",
        data: d.data,
        notify: false,
      });
  }

  if (error) {
    return returnFunc({
      success: false,
      level: "error",
      module: "utils",
      subModule: "prodEndpoint",
      message: "Failed to get data from the prod endpoint",
      data: error as any,
      notify: true,
    });
  }

  // any other status falls through to a generic failure
  return returnFunc({
    success: false,
    level: "error",
    module: "utils",
    subModule: "prodEndpoint",
    message: `Unexpected status ${d?.status} from prod endpoint`,
    data: [],
    notify: false,
  });
};
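A sketch of calling runProdApi; the ORDERS path is lifted from the example URL comment inside prodEndpointCreation, and the wrapper function is illustrative:

```ts
// Illustrative wrapper; the ORDERS path comes from the example URL comment
// in prodEndpointCreation, everything else is an assumption.
const fetchOrders = async () => {
  const orders = await runProdApi({
    endpoint: "/public/v1.0/DemandManagement/ORDERS",
    method: "get",
  });

  if (orders?.success) {
    console.log(orders.data); // payload returned by the prod endpoint
  }
};
```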
@@ -1,7 +1,7 @@
|
||||
import type { Response } from "express";
|
||||
import { createLogger } from "../logger/logger.controller.js";
|
||||
|
||||
interface Data<T = unknown[]> {
|
||||
export interface ReturnHelper<T = unknown[]> {
|
||||
success: boolean;
|
||||
module:
|
||||
| "system"
|
||||
@@ -12,30 +12,12 @@ interface Data<T = unknown[]> {
|
||||
| "opendock"
|
||||
| "notification"
|
||||
| "email"
|
||||
| "purchase";
|
||||
subModule:
|
||||
| "db"
|
||||
| "labeling"
|
||||
| "printer"
|
||||
| "prodSql"
|
||||
| "query"
|
||||
| "sendmail"
|
||||
| "auth"
|
||||
| "datamart"
|
||||
| "jobs"
|
||||
| "apt"
|
||||
| "settings"
|
||||
| "get"
|
||||
| "update"
|
||||
| "delete"
|
||||
| "post"
|
||||
| "notification"
|
||||
| "delete"
|
||||
| "printing"
|
||||
| "gpSql"
|
||||
| "email"
|
||||
| "gpChecks";
|
||||
level: "info" | "error" | "debug" | "fatal";
|
||||
| "purchase"
|
||||
| "tcp"
|
||||
| "logistics";
|
||||
subModule: string;
|
||||
|
||||
level: "info" | "error" | "debug" | "fatal" | "warn";
|
||||
message: string;
|
||||
room?: string;
|
||||
data?: T;
|
||||
@@ -56,7 +38,7 @@ interface Data<T = unknown[]> {
|
||||
* data: [] the data that will be passed back
|
||||
* notify: false by default this is to send a notification to a users email to alert them of an issue.
|
||||
*/
|
||||
export const returnFunc = (data: Data) => {
|
||||
export const returnFunc = (data: ReturnHelper) => {
|
||||
const notify = data.notify ? data.notify : false;
|
||||
const room = data.room ?? data.room;
|
||||
const log = createLogger({ module: data.module, subModule: data.subModule });
|
||||
@@ -89,7 +71,7 @@ export const returnFunc = (data: Data) => {
|
||||
|
||||
export function apiReturn(
|
||||
res: Response,
|
||||
opts: Data & { status?: number },
|
||||
opts: ReturnHelper & { status?: number },
|
||||
optional?: unknown, // leave this as unknown so we can pass an object or an array over.
|
||||
): Response {
|
||||
const result = returnFunc(opts);
|
||||
|
||||
@@ -5,13 +5,17 @@ meta {
|
||||
}
|
||||
|
||||
get {
|
||||
url: {{url}}/api/datamart/:name
|
||||
url: {{url}}/api/datamart/:name?historical=x
|
||||
body: none
|
||||
auth: inherit
|
||||
}
|
||||
|
||||
params:query {
|
||||
historical: x
|
||||
}
|
||||
|
||||
params:path {
|
||||
name: activeArticles
|
||||
name: inventory
|
||||
}
|
||||
|
||||
settings {
|
||||
|
||||
43
lstMobile/.gitignore
vendored
Normal file
@@ -0,0 +1,43 @@
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files

# dependencies
node_modules/

# Expo
.expo/
dist/
web-build/
expo-env.d.ts

# Native
.kotlin/
*.orig.*
*.jks
*.p8
*.p12
*.key
*.mobileprovision

# Metro
.metro-health-check*

# debug
npm-debug.*
yarn-debug.*
yarn-error.*

# macOS
.DS_Store
*.pem

# local env files
.env*.local

# typescript
*.tsbuildinfo

app-example

# generated native folders
/ios
/android
1
lstMobile/.vscode/extensions.json
vendored
Normal file
@@ -0,0 +1 @@
{ "recommendations": ["expo.vscode-expo-tools"] }
7
lstMobile/.vscode/settings.json
vendored
Normal file
@@ -0,0 +1,7 @@
{
  "editor.codeActionsOnSave": {
    "source.fixAll": "explicit",
    "source.organizeImports": "explicit",
    "source.sortMembers": "explicit"
  }
}
56
lstMobile/README.md
Normal file
@@ -0,0 +1,56 @@
# Welcome to your Expo app 👋

This is an [Expo](https://expo.dev) project created with [`create-expo-app`](https://www.npmjs.com/package/create-expo-app).

## Get started

1. Install dependencies

   ```bash
   npm install
   ```

2. Start the app

   ```bash
   npx expo start
   ```

In the output, you'll find options to open the app in a

- [development build](https://docs.expo.dev/develop/development-builds/introduction/)
- [Android emulator](https://docs.expo.dev/workflow/android-studio-emulator/)
- [iOS simulator](https://docs.expo.dev/workflow/ios-simulator/)
- [Expo Go](https://expo.dev/go), a limited sandbox for trying out app development with Expo

You can start developing by editing the files inside the **app** directory. This project uses [file-based routing](https://docs.expo.dev/router/introduction).

## Get a fresh project

When you're ready, run:

```bash
npm run reset-project
```

This command will move the starter code to the **app-example** directory and create a blank **app** directory where you can start developing.

### Other setup steps

- To set up ESLint for linting, run `npx expo lint`, or follow our guide on ["Using ESLint and Prettier"](https://docs.expo.dev/guides/using-eslint/)
- If you'd like to set up unit testing, follow our guide on ["Unit Testing with Jest"](https://docs.expo.dev/develop/unit-testing/)
- Learn more about the TypeScript setup in this template in our guide on ["Using TypeScript"](https://docs.expo.dev/guides/typescript/)

## Learn more

To learn more about developing your project with Expo, look at the following resources:

- [Expo documentation](https://docs.expo.dev/): Learn fundamentals, or go into advanced topics with our [guides](https://docs.expo.dev/guides).
- [Learn Expo tutorial](https://docs.expo.dev/tutorial/introduction/): Follow a step-by-step tutorial where you'll create a project that runs on Android, iOS, and the web.

## Join the community

Join our community of developers creating universal apps.

- [Expo on GitHub](https://github.com/expo/expo): View our open source platform and contribute.
- [Discord community](https://chat.expo.dev): Chat with Expo users and ask questions.
47
lstMobile/app.json
Normal file
@@ -0,0 +1,47 @@
{
  "expo": {
    "name": "LST mobile",
    "slug": "lst-mobile",
    "version": "0.0.1-alpha",
    "orientation": "portrait",
    "icon": "./assets/images/icon.png",
    "scheme": "lstmobile",
    "userInterfaceStyle": "automatic",
    "ios": {
      "icon": "./assets/expo.icon"
    },
    "android": {
      "adaptiveIcon": {
        "backgroundColor": "#E6F4FE",
        "foregroundImage": "./assets/images/android-icon-foreground.png",
        "backgroundImage": "./assets/images/android-icon-background.png",
        "monochromeImage": "./assets/images/android-icon-monochrome.png"
      },
      "predictiveBackGestureEnabled": false,
      "package": "net.alpla.lst.mobile",
      "versionCode": 1
    },
    "web": {
      "output": "static",
      "favicon": "./assets/images/favicon.png"
    },
    "plugins": [
      "expo-router",
      [
        "expo-splash-screen",
        {
          "backgroundColor": "#208AEF",
          "android": {
            "image": "./assets/images/splash-icon.png",
            "imageWidth": 76
          }
        }
      ]
    ],
    "experiments": {
      "typedRoutes": true,
      "reactCompiler": true
    }
  }
}
3
lstMobile/assets/expo.icon/Assets/expo-symbol 2.svg
Normal file
@@ -0,0 +1,3 @@
<svg width="652" height="606" viewBox="0 0 652 606" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M353.554 0H298.446C273.006 0 249.684 14.6347 237.962 37.9539L4.37994 502.646C-1.04325 513.435 -1.45067 526.178 3.2716 537.313L22.6123 582.918C34.6475 611.297 72.5404 614.156 88.4414 587.885L309.863 222.063C313.34 216.317 319.439 212.826 326 212.826C332.561 212.826 338.659 216.317 342.137 222.063L563.559 587.885C579.46 614.156 617.352 611.297 629.388 582.918L648.728 537.313C653.451 526.178 653.043 513.435 647.62 502.646L414.038 37.9539C402.316 14.6347 378.994 0 353.554 0Z" fill="white"/>
</svg>
(After: 608 B)
BIN  lstMobile/assets/expo.icon/Assets/grid.png  Normal file  (After: 52 KiB)
40
lstMobile/assets/expo.icon/icon.json
Normal file
@@ -0,0 +1,40 @@
{
  "fill" : {
    "automatic-gradient" : "extended-srgb:0.00000,0.47843,1.00000,1.00000"
  },
  "groups" : [
    {
      "layers" : [
        {
          "image-name" : "expo-symbol 2.svg",
          "name" : "expo-symbol 2",
          "position" : {
            "scale" : 1,
            "translation-in-points" : [
              1.1008400065293245e-05,
              -16.046875
            ]
          }
        },
        {
          "image-name" : "grid.png",
          "name" : "grid"
        }
      ],
      "shadow" : {
        "kind" : "neutral",
        "opacity" : 0.5
      },
      "translucency" : {
        "enabled" : true,
        "value" : 0.5
      }
    }
  ],
  "supported-platforms" : {
    "circles" : [
      "watchOS"
    ],
    "squares" : "shared"
  }
}
BIN  lstMobile/assets/images/android-icon-background.png  Normal file  (After: 17 KiB)
BIN  lstMobile/assets/images/android-icon-foreground.png  Normal file  (After: 77 KiB)
BIN  lstMobile/assets/images/android-icon-monochrome.png  Normal file  (After: 4.0 KiB)
BIN  lstMobile/assets/images/expo-badge-white.png  Normal file  (After: 4.0 KiB)
BIN  lstMobile/assets/images/expo-badge.png  Normal file  (After: 4.0 KiB)
BIN  lstMobile/assets/images/expo-logo.png  Normal file  (After: 3.2 KiB)
BIN  lstMobile/assets/images/favicon.png  Normal file  (After: 1.1 KiB)
BIN  lstMobile/assets/images/icon.png  Normal file  (After: 780 KiB)
BIN  lstMobile/assets/images/logo-glow.png  Normal file  (After: 324 KiB)
BIN  lstMobile/assets/images/react-logo.png  Normal file  (After: 6.2 KiB)
BIN  lstMobile/assets/images/react-logo@2x.png  Normal file  (After: 14 KiB)
BIN  lstMobile/assets/images/react-logo@3x.png  Normal file  (After: 21 KiB)
BIN  lstMobile/assets/images/splash-icon.png  Normal file  (After: 3.2 KiB)
BIN  lstMobile/assets/images/tabIcons/explore.png  Normal file  (After: 215 B)
BIN  lstMobile/assets/images/tabIcons/explore@2x.png  Normal file  (After: 347 B)
BIN  lstMobile/assets/images/tabIcons/explore@3x.png  Normal file  (After: 468 B)
BIN  lstMobile/assets/images/tabIcons/home.png  Normal file  (After: 253 B)
BIN  lstMobile/assets/images/tabIcons/home@2x.png  Normal file  (After: 343 B)
BIN  lstMobile/assets/images/tabIcons/home@3x.png  Normal file  (After: 479 B)
BIN  lstMobile/assets/images/tutorial-web.png  Normal file  (After: 58 KiB)
13903
lstMobile/package-lock.json
generated
Normal file
55
lstMobile/package.json
Normal file
@@ -0,0 +1,55 @@
{
  "name": "lstmobile",
  "main": "expo-router/entry",
  "version": "0.0.1-alpha",
  "scripts": {
    "start": "expo start",
    "reset-project": "node ./scripts/reset-project.js",
    "android": "expo run:android",
    "ios": "expo run:ios",
    "web": "expo start --web",
    "lint": "expo lint",
    "build:apk": "expo prebuild --clean && cd android && gradlew.bat assembleRelease",
    "update": "adb install android/app/build/outputs/apk/release/app-release.apk"
  },
  "dependencies": {
    "@react-native-async-storage/async-storage": "2.2.0",
    "@react-navigation/bottom-tabs": "^7.15.5",
    "@react-navigation/elements": "^2.9.10",
    "@react-navigation/native": "^7.1.33",
    "@tanstack/react-query": "^5.99.0",
    "axios": "^1.15.0",
    "expo": "~55.0.15",
    "expo-application": "~55.0.14",
    "expo-constants": "~55.0.14",
    "expo-device": "~55.0.15",
    "expo-font": "~55.0.6",
    "expo-glass-effect": "~55.0.10",
    "expo-image": "~55.0.8",
    "expo-linking": "~55.0.13",
    "expo-router": "~55.0.12",
    "expo-splash-screen": "~55.0.18",
    "expo-status-bar": "~55.0.5",
    "expo-symbols": "~55.0.7",
    "expo-system-ui": "~55.0.15",
    "expo-web-browser": "~55.0.14",
    "lucide-react-native": "^1.8.0",
    "react": "19.2.0",
    "react-dom": "19.2.0",
    "react-native": "0.83.4",
    "react-native-gesture-handler": "~2.30.0",
    "react-native-reanimated": "4.2.1",
    "react-native-safe-area-context": "~5.6.2",
    "react-native-screens": "~4.23.0",
    "react-native-web": "~0.21.0",
    "react-native-worklets": "0.7.2",
    "socket.io-client": "^4.8.3",
    "zod": "^4.3.6"
  },
  "devDependencies": {
    "@types/react": "~19.2.2",
    "eas-cli": "^18.7.0",
    "typescript": "~5.9.2"
  },
  "private": true
}
38
lstMobile/src/app/(tabs)/_layout.tsx
Normal file
@@ -0,0 +1,38 @@
import { Tabs } from "expo-router";
import { Home, Settings } from "lucide-react-native";
import React from "react";
import { colors } from "../../stlyes/global";

export default function TabLayout() {
  return (
    <Tabs
      screenOptions={{
        headerShown: false,
        tabBarActiveTintColor: "black",
        tabBarInactiveTintColor: colors.textSecondary,
      }}
    >
      <Tabs.Screen
        name="index"
        options={{
          title: "Home",
          tabBarIcon: ({ color, size }) => <Home color={color} size={size} />,
        }}
      />
      <Tabs.Screen
        name="config"
        options={{
          title: "Config",
          tabBarIcon: ({ color, size }) => (
            <Settings size={size} color={color} />
          ),
        }}
      />
    </Tabs>
  );
}
92
lstMobile/src/app/(tabs)/config.tsx
Normal file
@@ -0,0 +1,92 @@
// app/config.tsx
import Constants from "expo-constants";
import { useRouter } from "expo-router";
import { useEffect, useState } from "react";
import { Alert, Button, Text, TextInput, View } from "react-native";
import { AppConfig, getConfig, saveConfig } from "../../lib/storage";

export default function Config() {
  const [serverUrl, setServerUrl] = useState("");
  // note: scannerId currently stores the server port (see index.tsx)
  const [scannerId, setScannerId] = useState("");
  const [config, setConfig] = useState<AppConfig | null>(null);
  const [loading, setLoading] = useState(true);
  const router = useRouter();

  const version = Constants.expoConfig?.version;
  const build = Constants.expoConfig?.android?.versionCode ?? 1;

  useEffect(() => {
    const loadConfig = async () => {
      const existing = await getConfig();

      if (existing) {
        setServerUrl(existing.serverUrl);
        setScannerId(existing.scannerId);
        setConfig(existing);
      }

      setLoading(false);
    };

    loadConfig();
  }, []);

  const handleSave = async () => {
    if (!serverUrl.trim() || !scannerId.trim()) {
      Alert.alert("Missing info", "Please fill in both fields.");
      return;
    }

    await saveConfig({
      serverUrl: serverUrl.trim(),
      scannerId: scannerId.trim(),
    });

    Alert.alert("Saved", "Config saved to device.");
    //router.replace("/");
  };

  if (loading) {
    return <Text>Loading config...</Text>;
  }

  return (
    <View style={{ flex: 1, padding: 16, gap: 12 }}>
      <View style={{ alignItems: "center", margin: 10 }}>
        <Text style={{ fontSize: 20, fontWeight: "600" }}>LST Scanner Config</Text>
      </View>

      <Text>Server IP</Text>
      <TextInput
        value={serverUrl}
        onChangeText={setServerUrl}
        placeholder="192.168.1.1"
        autoCapitalize="none"
        keyboardType="numeric"
        style={{ borderWidth: 1, padding: 10, borderRadius: 8 }}
      />

      <Text>Server port</Text>
      <TextInput
        value={scannerId}
        onChangeText={setScannerId}
        placeholder="3000"
        autoCapitalize="characters"
        keyboardType="numeric"
        style={{ borderWidth: 1, padding: 10, borderRadius: 8 }}
      />

      <View style={{ flexDirection: "row", justifyContent: "center", padding: 3 }}>
        <Button title="Save Config" onPress={handleSave} />
      </View>

      <View style={{ marginTop: "auto", alignItems: "center", padding: 10 }}>
        <Text style={{ fontSize: 12, color: "#666" }}>
          LST Scanner v{version}-{build}
        </Text>
      </View>
    </View>
  );
}
134
lstMobile/src/app/(tabs)/index.tsx
Normal file
@@ -0,0 +1,134 @@
import * as Application from "expo-application";
import axios from "axios";
import * as Device from "expo-device";
import { useRouter } from "expo-router";
import { useEffect, useState } from "react";
import { Alert, Platform, ScrollView, Text, View } from "react-native";
import HomeHeader from "../../components/HomeHeader";
import { type AppConfig, getConfig, hasValidConfig } from "../../lib/storage";
import {
  evaluateVersion,
  type ServerVersionInfo,
  type StartupStatus,
} from "../../lib/versionValidation";
import { globalStyles } from "../../stlyes/global";

export default function Index() {
  const [config, setConfig] = useState<AppConfig | null>(null);
  const [loading, setLoading] = useState(true);
  const [startupStatus, setStartupStatus] = useState<StartupStatus>({
    state: "checking",
  });
  const [serverInfo, setServerInfo] = useState<ServerVersionInfo>();

  const router = useRouter();

  const versionName = Application.nativeApplicationVersion ?? "unknown";
  const versionCode = Number(Application.nativeBuildVersion ?? "0");

  useEffect(() => {
    let isMounted = true;

    const startUp = async () => {
      try {
        const savedConfig = await getConfig();

        if (!hasValidConfig(savedConfig)) {
          router.replace("/config");
          return;
        }

        if (!isMounted) return;
        setConfig(savedConfig);

        // temp while testing
        const appBuildCode = 1;

        try {
          // scannerId holds the server port; see config.tsx
          const res = await axios.get(
            `http://${savedConfig?.serverUrl}:${savedConfig?.scannerId}/lst/api/mobile/version`,
          );
          const server = res.data as ServerVersionInfo;

          if (!isMounted) return;

          const result = evaluateVersion(appBuildCode, server);
          setStartupStatus(result);
          setServerInfo(server);

          if (result.state === "warning") {
            Alert.alert("Update available", result.message);
          }
        } catch {
          if (!isMounted) return;
          setStartupStatus({ state: "offline" });
        }
      } finally {
        if (isMounted) {
          setLoading(false);
        }
      }
    };

    startUp();

    return () => {
      isMounted = false;
    };
  }, [router]);

  if (loading) {
    return <Text>Validating Configs.</Text>;
  }

  if (startupStatus.state === "checking") {
    return <Text>Checking device and server status...</Text>;
  }

  if (startupStatus.state === "blocked") {
    return (
      <View>
        <Text>Update Required</Text>
        <Text>This scanner must be updated before it can be used.</Text>
        <Text>Scan the update code to continue.</Text>
      </View>
    );
  }

  if (startupStatus.state === "offline") {
    // app still renders, but shows the disconnected state below
  }

  return (
    <ScrollView>
      <View style={globalStyles.container}>
        <HomeHeader />

        <Text>
          Welcome. {versionName} - {versionCode}
        </Text>
        <Text>Running on: {Platform.OS}</Text>
        <Text>Device model: {Device.modelName}</Text>
        <Text>Device Brand: {Device.brand}</Text>
        <Text>OS Version: {Device.osVersion}</Text>
        <View style={{ flex: 1, padding: 16, gap: 12 }}>
          <Text style={{ fontSize: 22, fontWeight: "600" }}>Welcome</Text>

          {config ? (
            <>
              <Text>Server: {config.serverUrl}</Text>
              <Text>Scanner: {config.scannerId}</Text>
              <Text>Server: v{serverInfo?.versionName}</Text>
            </>
          ) : (
            <Text>No config found yet.</Text>
          )}
        </View>
      </View>
    </ScrollView>
  );
}
12
lstMobile/src/app/_layout.tsx
Normal file
@@ -0,0 +1,12 @@
import { Stack } from "expo-router";
import { StatusBar } from "expo-status-bar";

export default function RootLayout() {
  return (
    <>
      <StatusBar style="dark" />
      <Stack screenOptions={{ headerShown: false }}>
        <Stack.Screen name="(tabs)" />
      </Stack>
    </>
  );
}
24
lstMobile/src/components/HomeHeader.tsx
Normal file
@@ -0,0 +1,24 @@
import React from "react";
import { StyleSheet, Text, View } from "react-native";
import { colors } from "../stlyes/global";

export default function HomeHeader() {
  const currentDate = new Date().toLocaleDateString("en-US", {
    weekday: "long",
    month: "long",
    day: "numeric",
  });
  return (
    <View>
      <Text style={styles.date}>{currentDate}</Text>
    </View>
  );
}

const styles = StyleSheet.create({
  date: {
    fontSize: 14,
    color: colors.textSecondary,
    marginTop: 4,
    marginBottom: 30,
  },
});
36
lstMobile/src/lib/storage.ts
Normal file
@@ -0,0 +1,36 @@
import AsyncStorage from "@react-native-async-storage/async-storage";

export type AppConfig = {
  serverUrl: string;
  scannerId: string;
};

const CONFIG_KEY = "scanner_app_config";

export async function saveConfig(config: AppConfig) {
  await AsyncStorage.setItem(CONFIG_KEY, JSON.stringify(config));
}

export async function getConfig(): Promise<AppConfig | null> {
  const raw = await AsyncStorage.getItem(CONFIG_KEY);

  if (!raw) return null;

  try {
    return JSON.parse(raw) as AppConfig;
  } catch (error) {
    console.log("Error", error);
    return null;
  }
}

export function hasValidConfig(config: AppConfig | null) {
  if (!config) return false;

  return Boolean(config.serverUrl?.trim() && config.scannerId?.trim());
}
43
lstMobile/src/lib/versionValidation.ts
Normal file
@@ -0,0 +1,43 @@
export type ServerVersionInfo = {
  packageName: string;
  versionName: string;
  versionCode: number;
  minSupportedVersionCode: number;
  fileName: string;
};

export type StartupStatus =
  | { state: "checking" }
  | { state: "needs-config" }
  | { state: "offline" }
  | { state: "blocked"; reason: string; server: ServerVersionInfo }
  | { state: "warning"; message: string; server: ServerVersionInfo }
  | { state: "ready"; server: ServerVersionInfo | null };

export function evaluateVersion(
  appBuildCode: number,
  server: ServerVersionInfo
): StartupStatus {
  if (appBuildCode < server.minSupportedVersionCode) {
    return {
      state: "blocked",
      reason: "This scanner app is too old and must be updated before use.",
      server,
    };
  }

  if (appBuildCode !== server.versionCode) {
    return {
      state: "warning",
      message: `A newer version is available. Installed build: ${appBuildCode}, latest build: ${server.versionCode}.`,
      server,
    };
  }

  return {
    state: "ready",
    server,
  };
}
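A sketch of how the startup flow might feed `evaluateVersion`; the `/api/version` endpoint name and the fetch wiring are assumptions, only `evaluateVersion` itself is in the commit:

```ts
import { evaluateVersion, ServerVersionInfo, StartupStatus } from "./versionValidation";

// Hypothetical startup check: fetch the server's version manifest and
// compare it against the installed build code. Network failure maps to
// "offline", which the app renders in a disconnected state.
export async function checkStartup(
  serverUrl: string,
  appBuildCode: number
): Promise<StartupStatus> {
  try {
    const res = await fetch(`${serverUrl}/api/version`); // endpoint name assumed
    if (!res.ok) return { state: "offline" };
    const server = (await res.json()) as ServerVersionInfo;
    return evaluateVersion(appBuildCode, server);
  } catch {
    return { state: "offline" };
  }
}
```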
21
lstMobile/src/stlyes/global.ts
Normal file
@@ -0,0 +1,21 @@
import { StyleSheet } from "react-native";

export const colors = {
  background: "white",
  header: "white",
  primary: "blue",
  textSecondary: "blue",
};

export const globalStyles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: colors.background,
    justifyContent: "center",
    alignItems: "center",
    paddingTop: 60,
  },
  header: {
    padding: 4,
  },
});
20
lstMobile/tsconfig.json
Normal file
@@ -0,0 +1,20 @@
{
  "extends": "expo/tsconfig.base",
  "compilerOptions": {
    "strict": true,
    "paths": {
      "@/*": ["./src/*"],
      "@/assets/*": ["./assets/*"]
    }
  },
  "include": [
    "**/*.ts",
    "**/*.tsx",
    ".expo/types/**/*.ts",
    "expo-env.d.ts"
  ]
}
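With these `paths` entries, imports can reference `src` directly instead of climbing with `../`; an illustrative example (the target module comes from this same diff):

```ts
// Resolves to lstMobile/src/lib/storage.ts via the "@/*" alias.
import { getConfig } from "@/lib/storage";
```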
20
lst_docs/.gitignore
vendored
@@ -1,20 +0,0 @@
# Dependencies
/node_modules

# Production
/build

# Generated files
.docusaurus
.cache-loader

# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
@@ -1,41 +0,0 @@
# Website

This website is built using [Docusaurus](https://docusaurus.io/), a modern static website generator.

## Installation

```bash
yarn
```

## Local Development

```bash
yarn start
```

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

## Build

```bash
yarn build
```

This command generates static content into the `build` directory and can be served using any static content hosting service.

## Deployment

Using SSH:

```bash
USE_SSH=true yarn deploy
```

Not using SSH:

```bash
GIT_USER=<Your GitHub username> yarn deploy
```

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.
@@ -1,12 +0,0 @@
---
slug: first-blog-post
title: First Blog Post
authors: [slorber, yangshun]
tags: [hola, docusaurus]
---

Lorem ipsum dolor sit amet...

<!-- truncate -->

...consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
@@ -1,44 +0,0 @@
---
slug: long-blog-post
title: Long Blog Post
authors: yangshun
tags: [hello, docusaurus]
---

This is the summary of a very long blog post,

Use a `<!--` `truncate` `-->` comment to limit blog post size in the list view.

<!-- truncate -->

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
@@ -1,24 +0,0 @@
---
slug: mdx-blog-post
title: MDX Blog Post
authors: [slorber]
tags: [docusaurus]
---

Blog posts support [Docusaurus Markdown features](https://docusaurus.io/docs/markdown-features), such as [MDX](https://mdxjs.com/).

:::tip

Use the power of React to create interactive blog posts.

:::

{/* truncate */}

For example, use JSX to create an interactive button:

```js
<button onClick={() => alert('button clicked!')}>Click me!</button>
```

<button onClick={() => alert('button clicked!')}>Click me!</button>
(deleted image: 94 KiB)
@@ -1,29 +0,0 @@
---
slug: welcome
title: Welcome
authors: [slorber, yangshun]
tags: [facebook, hello, docusaurus]
---

[Docusaurus blogging features](https://docusaurus.io/docs/blog) are powered by the [blog plugin](https://docusaurus.io/docs/api/plugins/@docusaurus/plugin-content-blog).

Here are a few tips you might find useful.

<!-- truncate -->

Simply add Markdown files (or folders) to the `blog` directory.

Regular blog authors can be added to `authors.yml`.

The blog post date can be extracted from filenames, such as:

- `2019-05-30-welcome.md`
- `2019-05-30-welcome/index.md`

A blog post folder can be convenient to co-locate blog post images:

![Docusaurus Plushie](./docusaurus-plushie-banner.jpeg)

The blog supports tags as well!

**And if you don't want a blog**: just delete this directory, and use `blog: false` in your Docusaurus config.
@@ -1,25 +0,0 @@
yangshun:
  name: Yangshun Tay
  title: Ex-Meta Staff Engineer, Co-founder GreatFrontEnd
  url: https://linkedin.com/in/yangshun
  image_url: https://github.com/yangshun.png
  page: true
  socials:
    x: yangshunz
    linkedin: yangshun
    github: yangshun
    newsletter: https://www.greatfrontend.com

slorber:
  name: Sébastien Lorber
  title: Docusaurus maintainer
  url: https://sebastienlorber.com
  image_url: https://github.com/slorber.png
  page:
    # customize the url of the author page at /blog/authors/<permalink>
    permalink: '/all-sebastien-lorber-articles'
  socials:
    x: sebastienlorber
    linkedin: sebastienlorber
    github: slorber
    newsletter: https://thisweekinreact.com
@@ -1,19 +0,0 @@
facebook:
  label: Facebook
  permalink: /facebook
  description: Facebook tag description

hello:
  label: Hello
  permalink: /hello
  description: Hello tag description

docusaurus:
  label: Docusaurus
  permalink: /docusaurus
  description: Docusaurus tag description

hola:
  label: Hola
  permalink: /hola
  description: Hola tag description
@@ -1,47 +0,0 @@
---
sidebar_position: 1
---

# Tutorial Intro

Let's discover **Docusaurus in less than 5 minutes**.

## Getting Started

Get started by **creating a new site**.

Or **try Docusaurus immediately** with **[docusaurus.new](https://docusaurus.new)**.

### What you'll need

- [Node.js](https://nodejs.org/en/download/) version 20.0 or above:
  - When installing Node.js, we recommend checking all checkboxes related to dependencies.

## Generate a new site

Generate a new Docusaurus site using the **classic template**.

The classic template will automatically be added to your project after you run the command:

```bash
npm init docusaurus@latest my-website classic
```

You can type this command into Command Prompt, PowerShell, Terminal, or any other integrated terminal of your code editor.

The command also installs all necessary dependencies you need to run Docusaurus.

## Start your site

Run the development server:

```bash
cd my-website
npm run start
```

The `cd` command changes the directory you're working with. In order to work with your newly created Docusaurus site, you'll need to navigate the terminal there.

The `npm run start` command builds your website locally and serves it through a development server, ready for you to view at http://localhost:3000/.

Open `docs/intro.md` (this page) and edit some lines: the site **reloads automatically** and displays your changes.
@@ -1,8 +0,0 @@
{
  "label": "Tutorial - Basics",
  "position": 2,
  "link": {
    "type": "generated-index",
    "description": "5 minutes to learn the most important Docusaurus concepts."
  }
}