24 Commits

Author SHA1 Message Date
ba3227545d chore(release): 0.0.1-alpha.4
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m4s
Release and Build Image / release (push) Successful in 12s
2026-04-15 07:31:49 -05:00
84909bfcf8 ci(service): changes to the script to allow running the powershell under execution policy restrictions
Some checks failed
Build and Push LST Docker Image / docker (push) Has been cancelled
2026-04-15 07:31:06 -05:00
e0d0ac2077 feat(datamart): psi data has been added :D 2026-04-15 07:29:35 -05:00
52a6c821f4 fix(datamart): error when running build and crashed everything
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m34s
2026-04-14 20:30:34 -05:00
eccaf17332 feat(datamart): migrations completed; remaining is the deactivation that will be run by analytics
Some checks failed
Build and Push LST Docker Image / docker (push) Failing after 39s
2026-04-14 20:25:20 -05:00
6307037985 feat(tcp crud): tcp server start, stop, restart endpoints + status check
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m30s
2026-04-13 17:30:47 -05:00
4b6061c478 ci(agent): added in sherman
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 1m36s
2026-04-13 15:36:50 -05:00
fc6dc82d84 refactor(services): added in examples for migration stuff 2026-04-13 15:36:29 -05:00
6ba905a887 docs(docs): removed Docusaurus as all docs will be inside lst now to better assist users 2026-04-13 15:36:02 -05:00
f33587a3d9 refactor(sql): corrections to the way we reconnect so the app can error out and be reactivated later 2026-04-13 15:35:12 -05:00
80189baf90 feat(ocp): printer sync and logging logic added 2026-04-13 15:34:18 -05:00
87f738702a docs(notifications): docs for intro, notifications, reprint added
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m25s
2026-04-10 21:35:12 -05:00
38a0b65e94 refactor(connection): corrected the connection to the old system 2026-04-10 21:33:55 -05:00
9a0ef8e51a refactor(notification): blocking added 2026-04-10 21:33:26 -05:00
dcb3f2dd13 refactor(server): added in serverCrash email 2026-04-10 21:32:25 -05:00
e47ea9ec52 ci(agent): added in jeff city 2026-04-10 21:31:57 -05:00
ca3425d327 docs(env example): updated the file 2026-04-10 21:30:46 -05:00
3bf024cfc9 refactor(agent): changed to have the test servers on their own push for better testing
production servers will soon pull a build from git rather than push the zip, so splitting things up now
2026-04-10 14:12:02 -05:00
9d39c13510 refactor(purchase): changes how the error handling works so a better email can be sent 2026-04-10 13:58:30 -05:00
c9eb59e2ad refactor(reprint): new query added to deactivate the old notification so no chance of duplicates 2026-04-10 13:57:52 -05:00
b0e5fd7999 feat(migrate): quality alert migrated 2026-04-10 13:57:15 -05:00
07ebf88806 refactor(templates): corrections for new notify process on critical errors 2026-04-10 10:33:01 -05:00
79e653efa3 refactor(logging): when notify is true send the error to systemAdmins 2026-04-10 10:32:20 -05:00
d05a0ce930 chore(release): 0.0.1-alpha.3
All checks were successful
Build and Push LST Docker Image / docker (push) Successful in 2m1s
Release and Build Image / release (push) Successful in 11s
2026-04-10 08:22:16 -05:00
139 changed files with 16297 additions and 20239 deletions

View File

@@ -1,32 +1,52 @@
NODE_ENV=development
# Server
PORT=3000
URL=http://localhost:3000
SERVER_IP=10.75.2.38
TIMEZONE=America/New_York
TCP_PORT=2222
# authentication
BETTER_AUTH_SECRET=""
# Better auth Secret
BETTER_AUTH_SECRET=
RESET_EXPIRY_SECONDS=3600
# logging
LOG_LEVEL=debug
LOG_LEVEL=
# prodServer
PROD_SERVER=usmcd1vms036
PROD_PLANT_TOKEN=test3
PROD_USER=alplaprod
PROD_PASSWORD=password
# SMTP password
SMTP_PASSWORD=
# opendock
OPENDOCK_URL=https://neutron.opendock.com
OPENDOCK_PASSWORD=
DEFAULT_DOCK=
DEFAULT_LOAD_TYPE=
DEFAULT_CARRIER=
# prodServer: when running on an actual prod server use localhost; this way we don't go out and back in.
PROD_SERVER=
PROD_PLANT_TOKEN=
PROD_USER=
PROD_PASSWORD=
# Tech user for alplaprod api
TEC_API_KEY=
# AD STUFF
# this is mainly used for purchase stuff to reference reqs
LDAP_URL=
# postgres connection
DATABASE_HOST=localhost
DATABASE_PORT=5433
DATABASE_USER=user
DATABASE_PASSWORD=password
DATABASE_DB=lst_dev
DATABASE_PORT=5432
DATABASE_USER=
DATABASE_PASSWORD=
DATABASE_DB=
# how is the app running: server or client; when in client mode you must provide the server
APP_RUNNING_IN=server
SERVER_NAME=localhost
# Gp connection
GP_USER=
GP_PASSWORD=
#dev stuff
GITEA_TOKEN=""
EMAIL_USER=""
EMAIL_PASSWORD=""
# how often to check for new/updated queries in min
QUERY_TIME_TYPE=m #valid options are m, h
QUERY_CHECK=1
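The `QUERY_TIME_TYPE` / `QUERY_CHECK` pair above configures how often the app polls for new or updated queries. A minimal sketch of how those two values could be combined into a millisecond interval for the scheduler (the helper name is hypothetical, not part of the repo):

```typescript
// Hypothetical helper: convert QUERY_TIME_TYPE ("m" | "h") and QUERY_CHECK
// into a millisecond polling interval.
const queryIntervalMs = (type: string, count: number): number => {
  const perUnit: Record<string, number> = {
    m: 60_000, // minutes
    h: 3_600_000, // hours
  };
  const unit = perUnit[type];
  if (unit === undefined) {
    throw new Error(`QUERY_TIME_TYPE must be "m" or "h", got "${type}"`);
  }
  return unit * count;
};

// Example wiring against the env, using the documented defaults:
// const interval = queryIntervalMs(process.env.QUERY_TIME_TYPE ?? "m",
//                                  Number(process.env.QUERY_CHECK ?? "1"));
```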

View File

@@ -65,12 +65,14 @@
"onnotice",
"opendock",
"opendocks",
"palletizer",
"ppoo",
"preseed",
"prodlabels",
"prolink",
"Skelly",
"trycatch"
"trycatch",
"whse"
],
"gitea.token": "8456def90e1c651a761a8711763d6ef225d6b2db",
"gitea.instanceURL": "https://git.tuffraid.net",

View File

@@ -1,5 +1,63 @@
# All Changes to LST can be found below.
## [0.0.1-alpha.4](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.3...v0.0.1-alpha.4) (2026-04-15)
### 🌟 Enhancements
* **datamart:** migrations completed; remaining is the deactivation that will be run by analytics ([eccaf17](https://git.tuffraid.net/cowch/lst_v3/commits/eccaf17332fb1c63b8d6bbea6f668c3bb42d44b7))
* **datamart:** psi data has been added :D ([e0d0ac2](https://git.tuffraid.net/cowch/lst_v3/commits/e0d0ac20773159373495d65023587b76b47df34f))
* **migrate:** quality alert migrated ([b0e5fd7](https://git.tuffraid.net/cowch/lst_v3/commits/b0e5fd79998d551d4f155d58416157a324498fbd))
* **ocp:** printer sync and logging logic added ([80189ba](https://git.tuffraid.net/cowch/lst_v3/commits/80189baf906224da43ec1b9b7521153d2a49e059))
* **tcp crud:** tcp server start, stop, restart endpoints + status check ([6307037](https://git.tuffraid.net/cowch/lst_v3/commits/6307037985162bc6b49f9f711132853296f43eee))
### 🐛 Bug fixes
* **datamart:** error when running build and crashed everything ([52a6c82](https://git.tuffraid.net/cowch/lst_v3/commits/52a6c821f4632e4b5b51e0528a0d620e2e0deffc))
### 📚 Documentation
* **docs:** removed Docusaurus as all docs will be inside lst now to better assist users ([6ba905a](https://git.tuffraid.net/cowch/lst_v3/commits/6ba905a887dbd8f306d71fed75bb34c71fee74c9))
* **env example:** updated the file ([ca3425d](https://git.tuffraid.net/cowch/lst_v3/commits/ca3425d327757120c2cc876fff28e8668c76838d))
* **notifications:** docs for intro, notifications, reprint added ([87f7387](https://git.tuffraid.net/cowch/lst_v3/commits/87f738702a935279a248d471541cdd9d49330565))
### 🛠️ Code Refactor
* **agent:** changed to have the test servers on their own push for better testing ([3bf024c](https://git.tuffraid.net/cowch/lst_v3/commits/3bf024cfc97d2841130d54d1a7c5cb5f09f0f598))
* **connection:** corrected the connection to the old system ([38a0b65](https://git.tuffraid.net/cowch/lst_v3/commits/38a0b65e9450c65b8300a10058a8f0357400f4e6))
* **logging:** when notify is true send the error to systemAdmins ([79e653e](https://git.tuffraid.net/cowch/lst_v3/commits/79e653efa3bcb2941ccee06b28378e709e085ec0))
* **notification:** blocking added ([9a0ef8e](https://git.tuffraid.net/cowch/lst_v3/commits/9a0ef8e51a36e3ab45b601b977f1b5cf35d56947))
* **purchase:** changes how the error handling works so a better email can be sent ([9d39c13](https://git.tuffraid.net/cowch/lst_v3/commits/9d39c13510974b5ada2a6f6c2448da3f1b755a5c))
* **reprint:** new query added to deactivate the old notification so no chance of duplicates ([c9eb59e](https://git.tuffraid.net/cowch/lst_v3/commits/c9eb59e2ad9847418ac55cb8a4a91c013f6c97bb))
* **server:** added in serverCrash email ([dcb3f2d](https://git.tuffraid.net/cowch/lst_v3/commits/dcb3f2dd1382986639b722778fad113392533b28))
* **services:** added in examples for migration stuff ([fc6dc82](https://git.tuffraid.net/cowch/lst_v3/commits/fc6dc82d8458a9928050dd3770778d6a6e1eea7f))
* **sql:** corrections to the way we reconnect so the app can error out and be reactivated later ([f33587a](https://git.tuffraid.net/cowch/lst_v3/commits/f33587a3d9a72ca72806635fac9d1214bb1452f1))
* **templates:** corrections for new notify process on critical errors ([07ebf88](https://git.tuffraid.net/cowch/lst_v3/commits/07ebf88806b93b9320f8f9d36b867572dd9a9580))
### 📈 Project changes
* **agent:** added in jeff city ([e47ea9e](https://git.tuffraid.net/cowch/lst_v3/commits/e47ea9ec52a6ebaf5a8f67a7e8bd2c73da6186fb))
* **agent:** added in sherman ([4b6061c](https://git.tuffraid.net/cowch/lst_v3/commits/4b6061c478cbeba7c845dc1c8a015b9998721456))
* **service:** changes to the script to allow running the powershell under execution policy restrictions ([84909bf](https://git.tuffraid.net/cowch/lst_v3/commits/84909bfcf85b91d085ea9dca78be00482b7fd231))
## [0.0.1-alpha.3](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.2...v0.0.1-alpha.3) (2026-04-10)
### 🌟 Enhancements
* **purchase hist:** finished up purchase historical / gp updates ([a691dc2](https://git.tuffraid.net/cowch/lst_v3/commits/a691dc276e8650c669409241f73d7b2d7a1f9176))
### 🛠️ Code Refactor
* **gp connect:** gp connect as was added to long live services ([635635b](https://git.tuffraid.net/cowch/lst_v3/commits/635635b356e1262e1c0b063408fe2209e6a8d4ec))
* **reprints:** changes the module and submodule around to be more accurate ([97f93a1](https://git.tuffraid.net/cowch/lst_v3/commits/97f93a1830761437118863372108df810ce9977a))
* **send email:** changes the error message to show the true message in the error ([995b1dd](https://git.tuffraid.net/cowch/lst_v3/commits/995b1dda7cdfebf4367d301ccac38fd339fab6dd))
## [0.0.1-alpha.2](https://git.tuffraid.net/cowch/lst_v3/compare/v0.0.1-alpha.1...v0.0.1-alpha.2) (2026-04-08)

View File

@@ -19,7 +19,7 @@ Quick summary of current rewrite/migration goal.
| User Profile | ~~Edit profile~~, upload avatar | 🟨 In Progress |
| User Admin | Edit user, create user, remove user, alplaprod user integration | ⏳ Not Started |
| Notifications | ~~Subscribe~~, ~~Create~~, ~~Update~~, ~~Remove~~, Manual Trigger | 🟨 In Progress |
| Datamart | Create, Update, Run, Deactivate | 🔧 In Progress |
| Datamart | ~~Create~~, ~~Update~~, ~~Run~~, Deactivate | 🟨 In Progress |
| Frontend | Analytics and charts | ⏳ Not Started |
| Docs | Instructions and troubleshooting | ⏳ Not Started |
| One Click Print | Get printers, monitor printers, label process, material process, Special processes | ⏳ Not Started |

View File

@@ -13,6 +13,10 @@
*
* when a criterion is passed over we will handle it by counting how many were passed (up to 3), then deal with each one respectively
*/
import { and, between, inArray, notInArray } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
@@ -22,37 +26,93 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { datamartData } from "./datamartData.utlis.js";
type Options = {
name: string;
value: string;
};
type Data = {
name: string;
options: Options;
options: any;
optionsRequired?: boolean;
howManyOptionsRequired?: number;
};
const lstDbRun = async (data: Data) => {
if (data.options) {
if (data.name === "psiInventory") {
const ids = data.options.articles.split(",").map((id: any) => id.trim());
const whse = data.options.whseToInclude
? data.options.whseToInclude
.split(",")
.map((w: any) => w.trim())
.filter(Boolean)
: [];
const locations = data.options.exludeLanes
? data.options.exludeLanes
.split(",")
.map((l: any) => l.trim())
.filter(Boolean)
: [];
const conditions = [
inArray(invHistoricalData.article, ids),
between(
invHistoricalData.histDate,
data.options.startDate,
data.options.endDate,
),
];
// only add the warehouse condition if there are any whse values
if (whse.length > 0) {
conditions.push(inArray(invHistoricalData.whseId, whse));
}
// locations we don't want in the system
if (locations.length > 0) {
conditions.push(notInArray(invHistoricalData.location, locations));
}
return await db
.select()
.from(invHistoricalData)
.where(and(...conditions));
}
}
return [];
};
export const runDatamartQuery = async (data: Data) => {
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
const considerLstDBRuns = ["psiInventory"];
if (considerLstDBRuns.includes(data.name)) {
const lstDB = await lstDbRun(data);
return returnFunc({
success: true,
level: "info",
module: "datamart",
subModule: "lstDBrn",
message: `Data for: ${data.name}`,
data: lstDB,
notify: false,
});
}
const sqlQuery = sqlQuerySelector(`datamart.${data.name}`) as SqlQuery;
const getDataMartInfo = datamartData.filter((x) => x.endpoint === data.name);
// const optionsMissing =
// !data.options || Object.keys(data.options).length === 0;
const optionCount =
Object.keys(data.options).length ===
getDataMartInfo[0]?.howManyOptionsRequired;
const isValid =
Object.keys(data.options ?? {}).length >=
(getDataMartInfo[0]?.howManyOptionsRequired ?? 0);
if (getDataMartInfo[0]?.optionsRequired && !optionCount) {
if (getDataMartInfo[0]?.optionsRequired && !isValid) {
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `This query is required to have the ${getDataMartInfo[0]?.howManyOptionsRequired} options set in order use it.`,
message: `This query is required to have ${getDataMartInfo[0]?.howManyOptionsRequired} option(s) set in order to use it; please add your option(s) data and try again.`,
data: [getDataMartInfo[0].options],
notify: false,
});
@@ -75,10 +135,129 @@ export const runDatamartQuery = async (data: Data) => {
// split the criteria by "," then and then update the query
if (data.options) {
Object.entries(data.options ?? {}).forEach(([key, value]) => {
const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
datamartQuery = datamartQuery.replace(pattern, String(value).trim());
});
switch (data.name) {
case "activeArticles":
break;
case "deliveryByDateRange":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`);
break;
case "customerInventory":
datamartQuery = datamartQuery
.replace(
"--and IdAdressen",
`and IdAdressen in (${data.options.customer})`,
)
.replace(
"--and x.IdWarenlager in (0)",
`${data.options.whseToInclude ? `and x.IdWarenlager in (${data.options.whseToInclude})` : `--and x.IdWarenlager in (0)`}`,
);
break;
case "openOrders":
datamartQuery = datamartQuery
.replace("[startDay]", `${data.options.startDay}`)
.replace("[endDay]", `${data.options.endDay}`);
break;
case "inventory":
datamartQuery = datamartQuery
.replaceAll(
"--,l.RunningNumber",
`${data.options.includeRunningNumbers ? `,l.RunningNumber` : `--,l.RunningNumber`}`,
)
.replaceAll(
"--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot",
`${data.options.lots ? `,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot` : `--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot`}`,
)
.replaceAll(
"--,l.WarehouseDescription,l.LaneDescription",
`${data.options.locations ? `,l.WarehouseDescription,l.LaneDescription` : `--,l.WarehouseDescription,l.LaneDescription`}`,
);
// adding in a test for historical check.
if (data.options.historical) {
datamartQuery = datamartQuery
.replace(
"--,l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription",
",l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription",
)
.replace(
"--,l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription",
",l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription",
);
}
break;
case "fakeEDIUpdate":
datamartQuery = datamartQuery.replace(
"--AND h.CustomerHumanReadableId in (0)",
`${data.options.address ? `AND h.CustomerHumanReadableId in (${data.options.address})` : `--AND h.CustomerHumanReadableId in (0)`}`,
);
break;
case "forecast":
datamartQuery = datamartQuery.replace(
"where DeliveryAddressHumanReadableId in ([customers])",
data.options.customers
? `where DeliveryAddressHumanReadableId in (${data.options.customers})`
: "--where DeliveryAddressHumanReadableId in ([customers])",
);
break;
case "activeArticles2":
datamartQuery = datamartQuery.replace(
"and a.HumanReadableId in ([articles])",
data.options.articles
? `and a.HumanReadableId in (${data.options.articles})`
: "--and a.HumanReadableId in ([articles])",
);
break;
case "psiDeliveryData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and IdArtikelVarianten in ([articles])",
data.options.articles
? `and IdArtikelVarianten in (${data.options.articles})`
: "--and IdArtikelVarianten in ([articles])",
);
break;
case "productionData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and ArticleHumanReadableId in ([articles])",
data.options.articles
? `and ArticleHumanReadableId in (${data.options.articles})`
: "--and ArticleHumanReadableId in ([articles])",
);
break;
case "psiPlanningData":
datamartQuery = datamartQuery
.replace("[startDate]", `${data.options.startDate}`)
.replace("[endDate]", `${data.options.endDate}`)
.replace(
"and p.IdArtikelvarianten in ([articles])",
data.options.articles
? `and p.IdArtikelvarianten in (${data.options.articles})`
: "--and p.IdArtikelvarianten in ([articles])",
);
break;
default:
return returnFunc({
success: false,
level: "error",
module: "datamart",
subModule: "query",
message: `${data.name} encountered an error as it might not exist in LST; please contact support if this continues to happen`,
data: [sqlQuery.message],
notify: true,
});
}
}
const { data: queryRun, error } = await tryCatch(
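The per-query `switch` above hand-edits each SQL template, while the short loop near the top of the hunk shows the generic `[placeholder]` substitution. A self-contained sketch of that generic pattern (function name is illustrative):

```typescript
// Replace every [key] placeholder in a SQL template with the trimmed option
// value, mirroring the RegExp-based loop shown in the hunk above. Keys are
// assumed to be plain identifiers like startDate, so no regex escaping is
// needed beyond the square brackets.
const fillPlaceholders = (
  template: string,
  options: Record<string, string>,
): string => {
  let query = template;
  for (const [key, value] of Object.entries(options)) {
    const pattern = new RegExp(`\\[${key.trim()}\\]`, "g");
    query = query.replace(pattern, String(value).trim());
  }
  return query;
};
```

For example, `fillPlaceholders("between '[startDate]' and '[endDate]'", { startDate: "2026-04-01", endDate: "2026-04-15" })` fills both date placeholders in one pass.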

View File

@@ -10,14 +10,50 @@ export const datamartData = [
name: "Active articles",
endpoint: "activeArticles",
description: "returns all active articles for the server with custom data",
options: "", // set as a string and each item will be separated by a , this way we can split it later in the excel file.
options: "",
optionsRequired: false,
},
{
name: "Delivery by date range",
endpoint: "deliveryByDateRange",
description: `Returns all Deliverys in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate", // set as a string and each item will be separated by a , this way we can split it later in the excel file.
description: `Returns all Deliveries in selected date range IE: 1/1/${new Date(Date.now()).getFullYear()} to 1/31/${new Date(Date.now()).getFullYear()}`,
options: "startDate,endDate",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get Customer Inventory",
endpoint: "customerInventory",
description: `Returns specific customer inventory based on their address ID, IE: 8,12,145. \nWith the option to include specific warehouse IDs, IE 36,41,5. \nNOTES: *leaving warehouse blank will just pull everything for the customer; Inventory does not include PPOO or INV`,
options: "customer,whseToInclude",
optionsRequired: true,
howManyOptionsRequired: 1,
},
{
name: "Get open order",
endpoint: "openOrders",
description: `Returns open orders based on the day count sent over, IE: startDay 15 days in the past, endDay 5 days in the future; can be left empty for these default days`,
options: "startDay,endDay",
optionsRequired: true,
howManyOptionsRequired: 2,
},
{
name: "Get inventory",
endpoint: "inventory",
description: `Returns all inventory, excluding the inv location. Adding an x in one of the options will enable it.`,
options: "includeRunningNumbers,locations,lots",
},
{
name: "Fake EDI Update",
endpoint: "fakeEDIUpdate",
description: `Returns all open orders to correct and resubmit via lst demand mgt; leaving it blank will get everything, while putting an address only returns the specified address. \nNOTE: only orders that were created via edi will populate here.`,
options: "address",
},
{
name: "Production Data",
endpoint: "productionData",
description: `Returns all production data from the date range with the option to have 1 to many avs to search by.`,
options: "startDate,endDate,articles",
optionsRequired: true,
howManyOptionsRequired: 2,
},
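Each entry above pairs `optionsRequired` with `howManyOptionsRequired`, which `runDatamartQuery` checks before running the query. A simplified sketch of that guard (type and function names are illustrative, matching the `isValid` check in the controller):

```typescript
// A request is valid when the caller supplied at least as many option keys
// as the datamart entry requires; entries without optionsRequired always pass.
type DatamartEntry = {
  endpoint: string;
  optionsRequired?: boolean;
  howManyOptionsRequired?: number;
};

const hasRequiredOptions = (
  entry: DatamartEntry | undefined,
  options: Record<string, unknown> | undefined,
): boolean => {
  if (!entry?.optionsRequired) return true; // nothing required
  const supplied = Object.keys(options ?? {}).length;
  return supplied >= (entry.howManyOptionsRequired ?? 0);
};
```

Note this counts keys rather than validating specific names, so `customerInventory` (which requires 1 of its 2 listed options) passes with either key.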

View File

@@ -0,0 +1,30 @@
import { date, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const invHistoricalData = pgTable("inv_historical_data", {
inv: uuid("id").defaultRandom().primaryKey(),
histDate: date("hist_date").notNull(), // this date should always be yesterday when we post it.
plantToken: text("plant_token"),
article: text("article").notNull(),
articleDescription: text("article_description").notNull(),
materialType: text("material_type"),
total_QTY: text("total_QTY"),
available_QTY: text("available_QTY"),
coa_QTY: text("coa_QTY"),
held_QTY: text("held_QTY"),
consignment_QTY: text("consignment_qty"),
lot_Number: text("lot_number"),
locationId: text("location_id"),
location: text("location"),
whseId: text("whse_id").default(""),
whseName: text("whse_name").default("missing whseName"),
upd_user: text("upd_user").default("lst-system"),
upd_date: timestamp("upd_date").defaultNow(),
});
export const invHistoricalDataSchema = createSelectSchema(invHistoricalData);
export const newInvHistoricalDataSchema = createInsertSchema(invHistoricalData);
export type InvHistoricalData = z.infer<typeof invHistoricalDataSchema>;
export type NewInvHistoricalData = z.infer<typeof newInvHistoricalDataSchema>;

View File

@@ -1,6 +1,11 @@
import { integer, pgTable, text } from "drizzle-orm/pg-core";
import { integer, pgTable, text, timestamp } from "drizzle-orm/pg-core";
export const opendockApt = pgTable("printer_log", {
export const printerLog = pgTable("printer_log", {
id: integer().primaryKey().generatedAlwaysAsIdentity(),
name: text("name").notNull(),
name: text("name"),
ip: text("ip"),
printerSN: text("printer_sn"),
condition: text("condition").notNull(),
message: text("message"),
createdAt: timestamp("created_at").defaultNow(),
});

View File

@@ -0,0 +1,44 @@
import {
boolean,
integer,
jsonb,
pgTable,
text,
timestamp,
uniqueIndex,
uuid,
} from "drizzle-orm/pg-core";
import { createInsertSchema, createSelectSchema } from "drizzle-zod";
import type z from "zod";
export const printerData = pgTable(
"printer_data",
{
id: uuid("id").defaultRandom().primaryKey(),
humanReadableId: text("humanReadable_id").unique().notNull(),
name: text("name").notNull(),
ipAddress: text("ipAddress"),
port: integer("port"),
status: text("status"),
statusText: text("statusText"),
printerSN: text("printer_sn"),
lastTimePrinted: timestamp("last_time_printed").notNull().defaultNow(),
assigned: boolean("assigned").default(false),
remark: text("remark"),
printDelay: integer("printDelay").default(90),
processes: jsonb("processes").default([]),
printDelayOverride: boolean("print_delay_override").default(false), // this will be more for if we have the lot time active but want to over ride this single line for some reason
add_Date: timestamp("add_Date").defaultNow(),
upd_date: timestamp("upd_date").defaultNow(),
},
(table) => [
//uniqueIndex("emailUniqueIndex").on(sql`lower(${table.email})`),
uniqueIndex("printer_id").on(table.humanReadableId),
],
);
export const printerSchema = createSelectSchema(printerData);
export const newPrinterSchema = createInsertSchema(printerData);
export type Printer = z.infer<typeof printerSchema>;
export type NewPrinter = z.infer<typeof newPrinterSchema>;

View File

@@ -7,12 +7,17 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
export let pool2: sql.ConnectionPool;
export let connected: boolean = false;
export let reconnecting = false;
// start the delay out as 2 seconds
let delayStart = 2000;
let attempt = 0;
const maxAttempts = 10;
export const connectGPSql = async () => {
const serverUp = await checkHostnamePort(`USMCD1VMS011:1433`);
if (!serverUp) {
// we will try to reconnect
connected = false;
reconnectToSql();
return returnFunc({
success: false,
level: "error",
@@ -48,6 +53,7 @@ export const connectGPSql = async () => {
notify: false,
});
} catch (error) {
reconnectToSql();
return returnFunc({
success: false,
level: "error",
@@ -104,11 +110,6 @@ export const reconnectToSql = async () => {
//set reconnecting to true while we try to reconnect
reconnecting = true;
// start the delay out as 2 seconds
let delayStart = 2000;
let attempt = 0;
const maxAttempts = 10;
while (!connected && attempt < maxAttempts) {
attempt++;
log.info(
@@ -121,7 +122,7 @@ export const reconnectToSql = async () => {
if (!serverUp) {
delayStart = Math.min(delayStart * 2, 30000); // exponential backoff until up to 30000
return;
continue;
}
try {
@@ -131,19 +132,11 @@ export const reconnectToSql = async () => {
log.info(`${gpSqlConfig.server} is connected to ${gpSqlConfig.database}`);
} catch (error) {
delayStart = Math.min(delayStart * 2, 30000);
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "db",
message: "Failed to reconnect to the prod sql server.",
data: [error],
notify: false,
});
log.error({ error }, "Failed to reconnect to the prod sql server.");
}
}
if (!connected) {
if (!connected && attempt >= maxAttempts) {
log.error(
{ notify: true },
"Max reconnect attempts reached on the prodSql server. Stopping retries.",
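The reconnect loop above doubles `delayStart` on every failed attempt and caps it at 30 s. The resulting delay sequence can be sketched as (helper is illustrative, not part of the repo):

```typescript
// Capped exponential backoff as used by reconnectToSql: start at 2 s,
// double after each failed attempt, never exceed the 30 s cap.
const backoffDelays = (
  start: number,
  cap: number,
  attempts: number,
): number[] => {
  const delays: number[] = [];
  let delay = start;
  for (let i = 0; i < attempts; i++) {
    delay = Math.min(delay * 2, cap); // doubled before waiting, matching the loop
    delays.push(delay);
  }
  return delays;
};
```

With the file's constants (`delayStart = 2000`, cap 30000, `maxAttempts = 10`) the waits run 4 s, 8 s, 16 s, then hold at 30 s.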

View File

@@ -1,10 +1,5 @@
import { returnFunc } from "../utils/returnHelper.utils.js";
import {
connected,
pool2,
reconnecting,
reconnectToSql,
} from "./gpSqlConnection.controller.js";
import { connected, pool2 } from "./gpSqlConnection.controller.js";
interface SqlError extends Error {
code?: string;
@@ -22,29 +17,15 @@ interface SqlError extends Error {
*/
export const gpQuery = async (queryToRun: string, name: string) => {
if (!connected) {
reconnectToSql();
if (reconnecting) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "gpSql",
message: `The sql ${process.env.PROD_PLANT_TOKEN} is trying to reconnect already`,
data: [],
notify: false,
});
} else {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "gpSql",
message: `${process.env.PROD_PLANT_TOKEN} is not connected, and failed to connect.`,
data: [],
notify: true,
});
}
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "gpSql",
message: `${process.env.PROD_PLANT_TOKEN} is offline or attempting to reconnect`,
data: [],
notify: false,
});
}
//change to the correct server

View File

@@ -5,6 +5,7 @@ import { db } from "../db/db.controller.js";
import { logs } from "../db/schema/logs.schema.js";
import { emitToRoom } from "../socket.io/roomEmitter.socket.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { notifySystemIssue } from "./logger.notify.js";
//import build from "pino-abstract-transport";
export const logLevel = process.env.LOG_LEVEL || "info";
@@ -45,6 +46,10 @@ const dbStream = new Writable({
console.error(res.error);
}
if (obj.notify) {
notifySystemIssue(obj);
}
if (obj.room) {
emitToRoom(obj.room, res.data ? res.data[0] : obj);
}

View File

@@ -0,0 +1,44 @@
/**
 * For all logging that has notify set to true we'll send an email to the system admins; if we have a discord webhook set we'll send it there as well
*/
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { user } from "../db/schema/auth.schema.js";
import { sendEmail } from "../utils/sendEmail.utils.js";
type NotifyData = {
module: string;
submodule: string;
hostname: string;
msg: string;
stack: unknown[];
};
export const notifySystemIssue = async (data: NotifyData) => {
// build the email out
const formattedError = Array.isArray(data.stack)
? data.stack.map((e: any) => e.error || e)
: data.stack;
const sysAdmin = await db
.select()
.from(user)
.where(eq(user.role, "systemAdmin"));
await sendEmail({
email: sysAdmin.map((r) => r.email).join("; ") ?? "cowchmonkey@gmail.com", // change to pull in system admin emails
subject: `${data.hostname} has encountered a critical issue.`,
template: "serverCritialIssue",
context: {
plant: data.hostname,
module: data.module,
subModule: data.submodule,
message: data.msg,
error: JSON.stringify(formattedError, null, 2),
},
});
// TODO: add discord
};

View File

@@ -0,0 +1,220 @@
import { format } from "date-fns";
import { eq, sql } from "drizzle-orm";
import { runDatamartQuery } from "../datamart/datamart.controller.js";
import { db } from "../db/db.controller.js";
import { invHistoricalData } from "../db/schema/historicalInv.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { createCronJob } from "../utils/croner.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
type Inventory = {
article: string;
alias: string;
materialType: string;
total_palletQTY: string;
available_QTY: string;
coa_QTY: string;
held_QTY: string;
consignment_qty: string;
lot: string;
locationId: string;
laneDescription: string;
warehouseId: string;
warehouseDescription: string;
};
const historicalInvImport = async () => {
const today = new Date();
const { data, error } = await tryCatch(
db
.select()
.from(invHistoricalData)
.where(eq(invHistoricalData.histDate, format(today, "yyyy-MM-dd"))),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "query",
message: `Error getting historical inv info`,
data: error as any,
notify: false,
});
}
if (data?.length === 0) {
const avSQLQuery = sqlQuerySelector(`datamart.activeArticles`) as SqlQuery;
if (!avSQLQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting Article info`,
data: [avSQLQuery.message],
notify: true,
});
}
const { data: inv, error: invError } = await tryCatch(
//prodQuery(sqlQuery.query, "Inventory data"),
runDatamartQuery({ name: "inventory", options: { historical: "x" } }),
);
const { data: av, error: avError } = (await tryCatch(
runDatamartQuery({ name: "activeArticles", options: {} }),
)) as any;
if (invError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting inventory info from prod query`,
data: invError as any,
notify: false,
});
}
if (avError) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error getting article info from prod query`,
data: avError as any,
notify: false,
});
}
// shape the data to go into our table
const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
const importInv = (inv.data ? inv.data : []) as Inventory[];
const importData = importInv.map((i) => {
return {
histDate: sql`(NOW())::date`,
plantToken: plantToken,
article: i.article,
articleDescription: i.alias,
materialType:
av.data.filter((a: any) => a.article === i.article).length > 0
? av.data.filter((a: any) => a.article === i.article)[0]
?.TypeOfMaterial
: "Item not defined",
total_QTY: i.total_palletQTY ?? "0.00",
available_QTY: i.available_QTY ?? "0.00",
coa_QTY: i.coa_QTY ?? "0.00",
held_QTY: i.held_QTY ?? "0.00",
consignment_QTY: i.consignment_qty ?? "0.00",
lot_Number: i.lot ?? "0",
locationId: i.locationId ?? "0",
location: i.laneDescription ?? "Missing lane",
whseId: i.warehouseId ?? "0",
whseName: i.warehouseDescription ?? "Missing warehouse",
};
});
const { data: dataImport, error: errorImport } = await tryCatch(
db.insert(invHistoricalData).values(importData),
);
if (errorImport) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "inv",
message: `Error adding historical data to lst db`,
data: errorImport as any,
notify: true,
});
}
if (dataImport) {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data was added to lst :D`,
data: [],
notify: false,
});
}
} else {
return returnFunc({
success: true,
level: "info",
module: "logistics",
subModule: "inv",
message: `Historical data for ${format(today, "yyyy-MM-dd")} is already added; nothing to do.`,
data: [],
notify: false,
});
}
return returnFunc({
success: false,
level: "info",
module: "logistics",
subModule: "inv",
message: `Some weird crazy error just happened and didn't get captured during the historical inv check.`,
data: [],
notify: true,
});
};
export const historicalSchedule = async () => {
// running the history in case my silly ass does an update around the shift change time lol, this will prevent data loss. it might be off a little but no one cares
historicalInvImport();
const sqlQuery = sqlQuerySelector(`shiftChange`) as SqlQuery;
if (!sqlQuery.success) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange sql file`,
data: [sqlQuery.message],
notify: false,
});
}
const { data, error } = await tryCatch(
prodQuery(sqlQuery.query, "Shift Change data"),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "logistics",
subModule: "query",
message: `Error getting shiftChange info`,
data: error as any,
notify: false,
});
}
// shift split
const shiftTimeSplit = data?.data?.[0]?.shiftChange?.split(":");
const cronSetup = `0 ${
shiftTimeSplit && shiftTimeSplit.length > 1 ? `${parseInt(shiftTimeSplit[1])}` : "0"
} ${
shiftTimeSplit && shiftTimeSplit.length > 0 ? `${parseInt(shiftTimeSplit[0])}` : "7"
} * * *`;
createCronJob("historicalInv", cronSetup, () => historicalInvImport());
};
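The cron derivation inside `historicalSchedule` above can be exercised standalone. This is a minimal sketch under the same assumptions (an "HH:mm" shift-change string, falling back to 07:00 when none is available); `buildShiftCron` is a hypothetical helper name, not part of the codebase.

```typescript
// Turn an optional "HH:mm" shift-change time into a daily 6-field cron
// expression; missing data falls back to the 07:00 default.
const buildShiftCron = (shiftChange?: string): string => {
  const [h, m] = shiftChange?.split(":") ?? [];
  const minute = m !== undefined ? `${parseInt(m, 10)}` : "0";
  const hour = h !== undefined ? `${parseInt(h, 10)}` : "7";
  return `0 ${minute} ${hour} * * *`;
};
```

For example, a shift change of "19:05" yields the expression `0 5 19 * * *`.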

View File

@@ -0,0 +1,96 @@
import { eq } from "drizzle-orm";
import { type Response, Router } from "express";
import { db } from "../db/db.controller.js";
import { notifications } from "../db/schema/notifications.schema.js";
import { auth } from "../utils/auth.utils.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
const r = Router();
r.post("/", async (req, res: Response) => {
const hasPermissions = await auth.api.userHasPermission({
body: {
//userId: req?.user?.id,
role: req.user?.roles as any,
permissions: {
notifications: ["readAll"], // This must match the structure in your access control
},
},
});
if (!hasPermissions) {
return apiReturn(res, {
success: false,
level: "error",
module: "notification",
subModule: "post",
message: `You do not have permissions to be here`,
data: [],
status: 400,
});
}
const { data: nName, error: nError } = await tryCatch(
db
.select()
.from(notifications)
.where(eq(notifications.name, req.body.name)),
);
if (nError) {
return apiReturn(res, {
success: false,
level: "error",
module: "notification",
subModule: "get",
message: `There was an error getting the notifications`,
data: [nError],
status: 400,
});
}
// NOTE: this re-queries the notifications table; it likely should read
// from the subscriptions table instead so the email list is correct.
const { data: sub, error: sError } = await tryCatch(
db
.select()
.from(notifications)
.where(eq(notifications.name, req.body.name)),
);
if (sError) {
return apiReturn(res, {
success: false,
level: "error",
module: "notification",
subModule: "get",
message: `There was an error getting the subs`,
data: [sError],
status: 400,
});
}
const emailString = [
...new Set(
sub.flatMap((e: any) =>
e.emails?.map((email: any) => email.trim().toLowerCase()) ?? [],
),
),
].join(";");
console.log(emailString);
const { default: runFun } = await import(
`./notification.${req.body.name.trim()}.js`
);
const manual = await runFun(nName[0], "blake.matthes@alpla.com");
return apiReturn(res, {
success: true,
level: "info",
module: "notification",
subModule: "post",
message: `Manual Trigger ran`,
data: manual ?? [],
status: 200,
});
});
export default r;
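The recipient de-dup in the manual-trigger route above can be sketched as a small pure helper: flatten the subscription rows, normalize each address, de-duplicate via a Set, and join with ";" as the mailer expects. `buildRecipientList` and the `SubRow` shape are hypothetical names for illustration; the guard for rows with a missing `emails` field is an addition over the original.

```typescript
type SubRow = { emails?: string[] };

// Collapse subscriber emails across rows into one semicolon-separated,
// lowercase, de-duplicated string.
const buildRecipientList = (rows: SubRow[]): string =>
  [
    ...new Set(
      rows.flatMap((r) => r.emails?.map((e) => e.trim().toLowerCase()) ?? []),
    ),
  ].join(";");
```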

View File

@@ -0,0 +1,114 @@
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { notifications } from "../db/schema/notifications.schema.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { sendEmail } from "../utils/sendEmail.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { v2QueryRun } from "../utils/pgConnectToLst.utils.js";
let shutoffv1 = false;
const func = async (data: any, emails: string) => {
// TODO: remove this disable once all 17 plants are on this new lst
if (!shutoffv1) {
v2QueryRun(
`update public.notifications set active = false where name = '${data.name}'`,
);
shutoffv1 = true;
}
const { data: l, error: le } = (await tryCatch(
db.select().from(notifications).where(eq(notifications.id, data.id)),
)) as any;
if (le) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `${data.name} encountered an error while trying to get initial info`,
data: le as any,
notify: true,
});
}
// search the query db for the query by name
const sqlQuery = sqlQuerySelector(`${data.name}`) as SqlQuery;
// get the latest blocking order id that was sent
const blockingOrderId = l[0].options[0].lastBlockingOrderIdSent ?? 69;
// run the check
const { data: queryRun, error } = await tryCatch(
prodQuery(
sqlQuery.query.replace("[lastBlocking]", String(blockingOrderId)),
`Running notification query: ${l[0].name}`,
),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: error as any,
notify: true,
});
}
if (queryRun.data.length > 0) {
for (const bo of queryRun.data) {
const sentEmail = await sendEmail({
email: emails,
subject: bo.subject,
template: "qualityBlocking",
context: {
items: bo,
},
});
if (!sentEmail?.success) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "email",
message: `${l[0].name} failed to send the email`,
data: sentEmail?.data as any,
notify: true,
});
}
await delay(1500);
const { error: dbe } = await tryCatch(
db
.update(notifications)
.set({ options: [{ lastBlockingOrderIdSent: bo.blockingNumber }] })
.where(eq(notifications.id, data.id)),
);
if (dbe) {
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while updating the last sent blocking order id`,
data: dbe as any,
notify: true,
});
}
}
}
};
export default func;
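The notification above walks an incremental cursor: only blocking orders past `lastBlockingOrderIdSent` are emailed, and the cursor advances after each successful send so a failure can resume later. A standalone sketch of that pattern (the `processNewBlockingOrders` helper and `BlockingRow` type are hypothetical, not part of the codebase):

```typescript
type BlockingRow = { blockingNumber: number; subject: string };

// Process only rows newer than the last-sent id; advance the cursor per
// successful send, and stop on a failure so the cursor stays resumable.
const processNewBlockingOrders = (
  rows: BlockingRow[],
  lastSent: number,
  send: (row: BlockingRow) => boolean,
): number => {
  let cursor = lastSent;
  for (const row of rows.filter((r) => r.blockingNumber > lastSent)) {
    if (!send(row)) break; // failed send: cursor stays put for a retry
    cursor = row.blockingNumber;
  }
  return cursor;
};
```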

View File

@@ -9,9 +9,16 @@ import {
import { returnFunc } from "../utils/returnHelper.utils.js";
import { sendEmail } from "../utils/sendEmail.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { v2QueryRun } from "../utils/pgConnectToLst.utils.js";
let shutoffv1 = false;
const func = async (data: any, emails: string) => {
// TODO: remove this disable once all 17 plants are on this new lst
if (!shutoffv1) {
v2QueryRun(
`update public.notifications set active = false where name = '${data.name}'`,
);
shutoffv1 = true;
}
const reprint = async (data: any, emails: string) => {
// TODO: do the actual logic for the notification.
const { data: l, error: le } = (await tryCatch(
db.select().from(notifications).where(eq(notifications.id, data.id)),
)) as any;
@@ -23,7 +30,7 @@ const reprint = async (data: any, emails: string) => {
module: "notification",
subModule: "query",
message: `${data.name} encountered an error while trying to get initial info`,
data: [le],
data: le as any,
notify: true,
});
}
@@ -52,7 +59,7 @@ const reprint = async (data: any, emails: string) => {
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [error],
data: error as any,
notify: true,
});
}
@@ -73,7 +80,7 @@ const reprint = async (data: any, emails: string) => {
module: "notification",
subModule: "query",
message: `Data for: ${l[0].name} encountered an error while trying to get it`,
data: [dbe],
data: dbe as any,
notify: true,
});
}
@@ -90,26 +97,17 @@ const reprint = async (data: any, emails: string) => {
});
if (!sentEmail?.success) {
// sendEmail({
// email: "Blake.matths@alpla.com",
// subject: `${os.hostname()} failed to run ${data[0]?.name}.`,
// template: "serverCrash",
// context: {
// error: sentEmail?.data,
// plant: `${os.hostname()}`,
// },
// });
return returnFunc({
success: false,
level: "error",
module: "notification",
subModule: "email",
message: `${l[0].name} failed to send the email`,
data: [sentEmail?.data],
data: sentEmail?.data as any,
notify: true,
});
}
}
};
export default reprint;
export default func;

View File

@@ -1,5 +1,6 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import manual from "./notification.manualTrigger.js";
import getNotifications from "./notification.route.js";
import updateNote from "./notification.update.route.js";
import deleteSub from "./notificationSub.delete.route.js";
@@ -11,6 +12,7 @@ export const setupNotificationRoutes = (baseUrl: string, app: Express) => {
//stats will be like this as we dont need to change this
app.use(`${baseUrl}/api/notification`, requireAuth, getNotifications);
app.use(`${baseUrl}/api/notification`, requireAuth, updateNote);
app.use(`${baseUrl}/api/notification/manual`, requireAuth, manual);
app.use(`${baseUrl}/api/notification/sub`, requireAuth, subs);
app.use(`${baseUrl}/api/notification/sub`, requireAuth, newSub);
app.use(`${baseUrl}/api/notification/sub`, requireAuth, updateSub);

View File

@@ -22,7 +22,7 @@ const note: NewNotification[] = [
"Checks for new blocking orders that have been entered, recommend to get the most recent order in here before activating.",
active: false,
interval: "10",
options: [{ sentBlockingOrders: [{ timeStamp: "0", blockingOrder: 1 }] }],
options: [{ lastBlockingOrderIdSent: 1 }],
},
{
name: "alplaPurchaseHistory",

View File

@@ -14,20 +14,82 @@
*/
import { Router } from "express";
import multer from "multer";
import { db } from "../db/db.controller.js";
import { printerLog } from "../db/schema/printerLogs.schema.js";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
type PrinterEvent = {
name: string;
condition: string;
message: string;
};
const r = Router();
const upload = multer();
r.post("/printer/listener/:printer", async (req, res) => {
const parseZebraAlert = (body: any): PrinterEvent => {
const name = body.uniqueId || "unknown";
const decoded = decodeURIComponent(body.alertMsg || "");
const [conditionRaw, ...rest] = decoded.split(":");
const condition = conditionRaw?.toLowerCase()?.trim() || "unknown";
const message = rest.join(":").trim();
return {
name,
condition,
message,
};
};
r.post("/printer/listener/:printer", upload.any(), async (req, res) => {
const { printer: printerName } = req.params;
console.log(req.body);
const event: PrinterEvent = parseZebraAlert(req.body);
const rawIp =
req.headers["x-forwarded-for"]?.toString().split(",")[0]?.trim() ||
req.socket.remoteAddress ||
req.ip;
const ip = rawIp?.replace("::ffff:", "");
// post the new message
const { data, error } = await tryCatch(
db
.insert(printerLog)
.values({
ip,
name: printerName,
printerSN: event.name,
condition: event.condition,
message: event.message,
})
.returning(),
);
if (error) {
return apiReturn(res, {
success: false,
level: "error",
module: "ocp",
subModule: "printing",
message: `${printerName} encountered an error posting the log`,
data: error as any,
status: 400,
});
}
if (data) {
// TODO: send message over to the controller to decide what to do next with it
}
return apiReturn(res, {
success: true,
level: "info",
module: "ocp",
subModule: "printing",
message: `${printerName} just passed over a message`,
message: `${printerName} just sent a message`,
data: req.body ?? [],
status: 200,
});
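The `parseZebraAlert` helper in the listener above can be exercised on its own, assuming the printer posts `uniqueId` plus a URL-encoded `alertMsg` of the form "CONDITION: free-form detail" (the sample payload below is illustrative, not a captured alert):

```typescript
type PrinterEvent = { name: string; condition: string; message: string };

// Decode a Zebra HTTP-POST alert body: uniqueId is the printer serial,
// alertMsg splits on the first ":" into a condition and a detail message.
const parseZebraAlert = (body: {
  uniqueId?: string;
  alertMsg?: string;
}): PrinterEvent => {
  const decoded = decodeURIComponent(body.alertMsg || "");
  const [conditionRaw, ...rest] = decoded.split(":");
  return {
    name: body.uniqueId || "unknown",
    condition: conditionRaw?.toLowerCase().trim() || "unknown",
    message: rest.join(":").trim(),
  };
};
```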

View File

@@ -10,10 +10,323 @@
* printer status will live here this will be how we manage all the levels of status like 3 paused, 1 printing, 8 error, 10 power up, etc...
*/
import { eq } from "drizzle-orm";
import net from "net";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { runProdApi } from "../utils/prodEndpoint.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
type Printer = {
name: string;
humanReadableId: string;
type: number;
ipAddress: string;
port: number;
default: boolean;
labelInstanceIpAddress: string;
labelInstancePort: number;
active: boolean;
remark: string;
processes: number[];
};
const log = createLogger({ module: "ocp", subModule: "printers" });
export const printerManager = async () => {};
export const printerHeartBeat = async () => {
// heat heats will be defaulted to 60 seconds no reason to allow anything else
// heart beats will be defaulted to 60 seconds, no reason to allow anything else, and heart beats will only go to assigned printers; no need to be monitoring non-labeling printers
};
//export const printerStatus = async (statusNr: number, printerId: number) => {};
export const printerSync = async () => {
// pull the printers from alpla prod and update them in lst
const printers = await runProdApi({
method: "get",
endpoint: "/public/v1.0/Administration/Printers",
});
if (!printers?.success) {
return returnFunc({
success: false,
level: "error",
module: "ocp",
subModule: "printer",
message: printers?.message ?? "",
data: printers?.data ?? [],
notify: false,
});
}
if (printers?.success) {
const ignorePrinters = ["pdf24", "standard"];
const validPrinters =
printers.data.filter(
(n: any) =>
!ignorePrinters.includes(n.name.toLowerCase()) && n.ipAddress,
) ?? [];
if (validPrinters.length) {
for (const printer of validPrinters as Printer[]) {
// run an update for each printer, do on conflicts based on the printer id
log.debug({}, `Add/Updating ${printer.name}`);
if (printer.active) {
await db
.insert(printerData)
.values({
name: printer.name,
humanReadableId: printer.humanReadableId,
ipAddress: printer.ipAddress,
port: printer.port,
remark: printer.remark,
processes: printer.processes,
})
.onConflictDoUpdate({
target: printerData.humanReadableId,
set: {
name: printer.name,
humanReadableId: printer.humanReadableId,
ipAddress: printer.ipAddress,
port: printer.port,
remark: printer.remark,
processes: printer.processes,
},
})
.returning();
await tcpPrinter(printer);
}
if (!printer.active) {
log.warn({}, `${printer.name} is not active so removing from lst.`);
await db
.delete(printerData)
.where(eq(printerData.humanReadableId, printer.humanReadableId));
}
}
return returnFunc({
success: true,
level: "info",
module: "ocp",
subModule: "printer",
message: `${validPrinters.length} printers were just synced, this includes new and old printers`,
data: [],
notify: false,
});
}
}
return returnFunc({
success: true,
level: "info",
module: "ocp",
subModule: "printer",
message: `No printers to update`,
data: [],
notify: false,
});
};
const tcpPrinter = (printer: Printer) => {
return new Promise<void>((resolve) => {
const socket = new net.Socket();
const timeoutMs = 15 * 1000;
const commands = [
{
key: "clearAlerts",
command: '! U1 setvar "alerts.configured" ""\r\n',
},
{
key: "addAlert",
command: `! U1 setvar "alerts.add" "ALL MESSAGES,HTTP-POST,Y,Y,http://${process.env.SERVER_IP}:${process.env.PORT}/lst/api/ocp/printer/listener/${printer.name},0,N,printer"\r\n`,
},
{
key: "setFriendlyName",
command: `! U1 setvar "device.friendly_name" "${printer.name}"\r\n`,
},
{
key: "getUniqueId",
command: '! U1 getvar "device.unique_id"\r\n',
},
] as const;
let currentCommandIndex = 0;
let awaitingSerial = false;
let settled = false;
const cleanup = () => {
socket.removeAllListeners();
socket.destroy();
};
const finish = (err?: unknown) => {
if (settled) return;
settled = true;
clearTimeout(timeout);
cleanup();
if (err) {
log.error(
{ err, printer: printer.name },
`Printer update failed for ${printer.name}: set the friendly name and alert directly on the printer instead.`,
);
}
resolve();
};
const timeout = setTimeout(() => {
finish(`${printer.name} timed out while updating printer config`);
}, timeoutMs);
const sendNext = async () => {
if (currentCommandIndex >= commands.length) {
socket.end();
return;
}
const current = commands[currentCommandIndex];
if (!current) {
socket.end();
return;
}
awaitingSerial = current.key === "getUniqueId";
log.info(
{ printer: printer.name, command: current.key },
`Sending command to ${printer.name}`,
);
socket.write(current.command);
currentCommandIndex++;
// Small pause between commands so the printer has breathing room
if (currentCommandIndex < commands.length) {
await delay(1500);
await sendNext();
} else {
// last command was sent, now wait for final data/close
await delay(1500);
socket.end();
}
};
socket.connect(printer.port, printer.ipAddress, async () => {
log.info({}, `Connected to ${printer.name}`);
try {
await sendNext();
} catch (error) {
finish(
error instanceof Error
? error
: new Error(
`Unknown error while sending commands to ${printer.name}`,
),
);
}
});
socket.on("data", async (data) => {
const response = data.toString().trim().replaceAll('"', "");
log.info(
{ printer: printer.name, response },
`Received printer response from ${printer.name}`,
);
if (!awaitingSerial) return;
awaitingSerial = false;
try {
await db
.update(printerData)
.set({ printerSN: response })
.where(eq(printerData.humanReadableId, printer.humanReadableId));
} catch (error) {
finish(
error instanceof Error
? error
: new Error(`Failed to update printer SN for ${printer.name}`),
);
}
});
socket.on("close", () => {
log.info({}, `Closed connection to ${printer.name}`);
finish();
});
socket.on("error", (err) => {
finish(err);
});
});
};
// const tcpPrinter = async (printer: Printer) => {
// const p = new net.Socket();
// const commands = [
// '! U1 setvar "alerts.configured" ""\r\n', // clean install just remove all alerts
// `! U1 setvar "alerts.add" "ALL MESSAGES,HTTP-POST,Y,Y,http://${process.env.SERVER_IP}:${process.env.PORT}/lst/api/ocp/printer/listener/${printer.name},0,N,printer"\r\n`, // add in the all alert
// `! U1 setvar "device.friendly_name" "${printer.name}"\r\n`, // change the name to match the alplaprod name
// `! U1 getvar "device.unique_id"\r\n`, // this will get mapped into the printer as this is the one we will link to in the db.
// //'! U1 getvar "alerts.configured" ""\r\n',
// ];
// let index = 0;
// const sendNext = async () => {
// if (index >= commands.length) {
// p.end();
// return;
// }
// const cmd = commands[index] as string;
// p.write(cmd);
// return;
// };
// p.connect(printer.port, printer.ipAddress, async () => {
// log.info({}, `Connected to ${printer.name}`);
// while (index < commands.length) {
// await sendNext();
// await delay(2000);
// index++;
// }
// });
// p.on("data", async (data) => {
// // this is just the sn that comes over so we will update this printer.
// await db
// .update(printerData)
// .set({ printerSN: data.toString().trim().replaceAll('"', "") })
// .where(eq(printerData.humanReadableId, printer.humanReadableId));
// // get the name
// // p.write('! U1 getvar "device.friendly_name"\r\n');
// // p.write('! U1 getvar "device.unique_id"\r\n');
// // p.write('! U1 getvar "alerts.configured"\r\n');
// });
// p.on("close", () => {
// log.info({}, `Closed connection to ${printer.name}`);
// p.destroy();
// return;
// });
// p.on("error", (err) => {
// log.info(
// { stack: err },
// `${printer.name} encountered an error while trying to update`,
// );
// return;
// });
// };

View File

@@ -0,0 +1,38 @@
/**
* the route that listens for the printer's POST.
*
* an HTTP-POST alert should be set up on each printer pointing to the URL below; at a minimum you will want an alert for
* printer pause, though you can send all messages here since it will also monitor and act on all of them
*
* http://{serverIP}:2222/lst/api/ocp/printer/listener/{printerName}
*
* the messages will be sent over to the db for logging, and specific ones will trigger actions:
*
* pause will validate whether printing can continue
* close head will re-pause the printer so it won't print a label
* power up will just re-pause the printer so it won't print a label
*/
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
//import { tryCatch } from "../utils/trycatch.utils.js";
import { printerSync } from "./ocp.printer.manage.js";
const r = Router();
r.post("/printer/update", async (_, res) => {
// fire-and-forget: progress is reported through the logs
printerSync();
return apiReturn(res, {
success: true,
level: "info",
module: "ocp",
subModule: "printing",
message:
"Printer update has been triggered; to monitor progress please head to the logs.",
data: [],
status: 200,
});
});
export default r;

View File

@@ -2,6 +2,7 @@ import { type Express, Router } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import { featureCheck } from "../middleware/featureActive.middleware.js";
import listener from "./ocp.printer.listener.js";
import update from "./ocp.printer.update.js";
export const setupOCPRoutes = (baseUrl: string, app: Express) => {
//setup all the routes
@@ -16,6 +17,7 @@ export const setupOCPRoutes = (baseUrl: string, app: Express) => {
// auth routes below here
router.use(requireAuth);
router.use(update);
//router.use("");
app.use(`${baseUrl}/api/ocp`, router);

View File

@@ -7,12 +7,17 @@ import { returnFunc } from "../utils/returnHelper.utils.js";
export let pool: sql.ConnectionPool;
export let connected: boolean = false;
export let reconnecting = false;
// start the delay out as 2 seconds
let delayStart = 2000;
let attempt = 0;
const maxAttempts = 10;
export const connectProdSql = async () => {
const serverUp = await checkHostnamePort(`${process.env.PROD_SERVER}:1433`);
if (!serverUp) {
// we will try to reconnect
connected = false;
reconnectToSql();
return returnFunc({
success: false,
level: "error",
@@ -48,6 +53,7 @@ export const connectProdSql = async () => {
notify: false,
});
} catch (error) {
reconnectToSql();
return returnFunc({
success: false,
level: "error",
@@ -104,11 +110,6 @@ export const reconnectToSql = async () => {
//set reconnecting to true while we try to reconnect
reconnecting = true;
// start the delay out as 2 seconds
let delayStart = 2000;
let attempt = 0;
const maxAttempts = 10;
while (!connected && attempt < maxAttempts) {
attempt++;
log.info(
@@ -121,7 +122,7 @@ export const reconnectToSql = async () => {
if (!serverUp) {
delayStart = Math.min(delayStart * 2, 30000); // exponential backoff until up to 30000
return;
continue;
}
try {
@@ -133,19 +134,12 @@ export const reconnectToSql = async () => {
);
} catch (error) {
delayStart = Math.min(delayStart * 2, 30000);
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "db",
message: "Failed to reconnect to the prod sql server.",
data: [error],
notify: false,
});
delayStart = Math.min(delayStart * 2, 30000);
log.error({ error }, "Failed to reconnect to the prod sql server.");
}
}
if (!connected) {
if (!connected && attempt >= maxAttempts) {
log.error(
{ notify: true },
"Max reconnect attempts reached on the prodSql server. Stopping retries.",

View File

@@ -1,10 +1,5 @@
import { returnFunc } from "../utils/returnHelper.utils.js";
import {
connected,
pool,
reconnecting,
reconnectToSql,
} from "./prodSqlConnection.controller.js";
import { connected, pool } from "./prodSqlConnection.controller.js";
interface SqlError extends Error {
code?: string;
@@ -22,29 +17,15 @@ interface SqlError extends Error {
*/
export const prodQuery = async (queryToRun: string, name: string) => {
if (!connected) {
reconnectToSql();
if (reconnecting) {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "prodSql",
message: `The sql ${process.env.PROD_PLANT_TOKEN} is trying to reconnect already`,
data: [],
notify: false,
});
} else {
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "prodSql",
message: `${process.env.PROD_PLANT_TOKEN} is not connected, and failed to connect.`,
data: [],
notify: true,
});
}
return returnFunc({
success: false,
level: "error",
module: "system",
subModule: "prodSql",
message: `${process.env.PROD_PLANT_TOKEN} is offline or attempting to reconnect`,
data: [],
notify: false,
});
}
//change to the correct server

View File

@@ -1,6 +1,6 @@
use AlplaPROD_test1
SELECT V_Artikel.IdArtikelvarianten,
SELECT V_Artikel.IdArtikelvarianten as article,
V_Artikel.Bezeichnung,
V_Artikel.ArtikelvariantenTypBez,
V_Artikel.PreisEinheitBez,

View File

@@ -0,0 +1,43 @@
/**
This will be replacing activeArticles once all data is remapped into this query.
Make a note in the docs that activeArticles will go stale sooner or later.
**/
use [test1_AlplaPROD2.0_Read]
select a.Id,
a.HumanReadableId as av,
a.Alias as alias,
p.LoadingUnitsPerTruck as loadingUnitsPerTruck,
p.LoadingUnitsPerTruck * p.LoadingUnitPieces as qtyPerTruck,
p.LoadingUnitPieces,
case when i.MinQuantity IS NOT NULL then round(cast(i.MinQuantity as float), 2) else 0 end as min,
case when i.MaxQuantity IS NOT NULL then round(cast(i.MaxQuantity as float),2) else 0 end as max
from masterData.Article (nolock) as a
/* sales price */
left join
(select *
from (select
id,
PackagingId,
ArticleId,
DefaultCustomer,
ROW_NUMBER() OVER (PARTITION BY ArticleId ORDER BY ValidAfter DESC) AS RowNum
from masterData.SalesPrice (nolock)
where DefaultCustomer = 1) as x
where RowNum = 1
) as s
on a.id = s.ArticleId
/* pkg instructions */
left join
masterData.PackagingInstruction (nolock) as p
on s.PackagingId = p.id
/* stock limits */
left join
masterData.StockLimit (nolock) as i
on a.id = i.ArticleId
where a.active = 1
and a.HumanReadableId in ([articles])

View File

@@ -0,0 +1,45 @@
select x.idartikelVarianten as av
,ArtikelVariantenAlias as Alias
--x.Lfdnr as RunningNumber,
--,round(sum(EinlagerungsMengeVPKSum),0) as Total_Pallets
--,sum(EinlagerungsMengeSum) as Total_PalletQTY
,round(sum(VerfuegbareMengeVPKSum),0) as Available_Pallets
,sum(VerfuegbareMengeSum) as Available_PalletQTY
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as COA_Pallets
,sum(case when c.Description LIKE '%COA%' then GesperrteMengeSum else 0 end) as COA_QTY
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeVPKSum else 0 end) as Held_Pallets
--,sum(case when c.Description NOT LIKE '%COA%' then GesperrteMengeSum else 0 end) as Held_QTY
,IdProdPlanung as Lot
--,IdAdressen
--,x.AdressBez
--,*
from [AlplaPROD_test1].dbo.[V_LagerPositionenBarcodes] (nolock) x
left join
[AlplaPROD_test1].dbo.T_EtikettenGedruckt (nolock) on
x.Lfdnr = T_EtikettenGedruckt.Lfdnr AND T_EtikettenGedruckt.Lfdnr > 1
left join
(SELECT *
FROM [AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) where Active = 1) as c
on x.IdMainDefect = c.IdBlockingDefect
/*
The data below will be controlled by the user in Excel; by default everything will be passed over
IdAdressen = 3
*/
where
--IdArtikelTyp = 1
x.IdWarenlager not in (6, 1)
--and IdAdressen
--and x.IdWarenlager in (0)
group by x.IdArtikelVarianten
,ArtikelVariantenAlias
,IdProdPlanung
--,c.Description
,IdAdressen
,x.AdressBez
--, x.Lfdnr
order by x.IdArtikelVarianten

View File

@@ -0,0 +1,29 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno as CustomerArticleNumber
,h.CustomerOrderNumber as CustomerOrderNumber
,l.CustomerLineItemNumber as CustomerLineNumber
,r.CustomerReleaseNumber as CustomerReleaseNumber
,r.Quantity
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as DeliveryDate
,h.CustomerHumanReadableId as CustomerID
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releaseState not in (1, 2, 3, 4)
AND h.CreatedByEdi = 1
AND r.deliveryDate < getdate() + 1
--AND h.CustomerHumanReadableId in (0)
order by r.deliveryDate

View File

@@ -0,0 +1,8 @@
SELECT format(RequirementDate, 'yyyy-MM-dd') as requirementDate
,ArticleHumanReadableId
,CustomerArticleNumber
,ArticleDescription
,Quantity
FROM [test1_AlplaPROD2.0_Read].[forecast].[Forecast]
where DeliveryAddressHumanReadableId in ([customers])
order by RequirementDate

View File

@@ -0,0 +1,64 @@
use [test1_AlplaPROD2.0_Read]
select
ArticleHumanReadableId as article
,ArticleAlias as alias
,round(sum(QuantityLoadingUnits),2) total_pallets
,round(sum(Quantity),2) as total_palletQTY
,round(sum(case when State = 0 then QuantityLoadingUnits else 0 end),2) available_Pallets
,round(sum(case when State = 0 then Quantity else 0 end),2) available_QTY
,round(sum(case when b.HumanReadableId = 864 then QuantityLoadingUnits else 0 end),2) as coa_Pallets
,round(sum(case when b.HumanReadableId = 864 then Quantity else 0 end),2) as coa_QTY
,round(sum(case when b.HumanReadableId <> 864 then QuantityLoadingUnits else 0 end),2) as held_Pallets
,round(sum(case when b.HumanReadableId <> 864 then Quantity else 0 end),2) as held_QTY
,round(sum(case when w.type = 7 then QuantityLoadingUnits else 0 end),2) as consignment_Pallets
,round(sum(case when w.type = 7 then Quantity else 0 end),2) as consignment_qty
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber as lot
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
/** historical section **/
--,l.ProductionLotRunningNumber as lot,l.warehousehumanreadableid as warehouseId,l.WarehouseDescription as warehouseDescription,l.lanehumanreadableid as locationId,l.lanedescription as laneDescription
,articleTypeName
FROM [warehousing].[WarehouseUnit] as l (nolock)
left join
(
SELECT [Id]
,[HumanReadableId]
,d.[Description]
,[DefectGroupId]
,[IsActive]
FROM [blocking].[BlockingDefect] as g (nolock)
left join
[AlplaPROD_test1].dbo.[T_BlockingDefects] as d (nolock) on
d.IdGlobalBlockingDefect = g.HumanReadableId
) as b on
b.id = l.MainDefectId
left join
[warehousing].[warehouse] as w (nolock) on
w.id = l.warehouseid
where LaneHumanReadableId not in (20000,21000)
group by ArticleHumanReadableId,
ArticleAlias,
ArticleTypeName
--,l.RunningNumber
/** datamart include lot number **/
--,l.MachineLocation,l.MachineName,l.ProductionLotRunningNumber
/** data mart include location data **/
--,l.WarehouseDescription,l.LaneDescription
/** historical section **/
--,l.ProductionLotRunningNumber,l.warehousehumanreadableid,l.WarehouseDescription,l.lanehumanreadableid,l.lanedescription
order by ArticleHumanReadableId

View File

@@ -0,0 +1,33 @@
use [test1_AlplaPROD2.0_Read]
select
customerartno
,r.ArticleHumanReadableId as article
,r.ArticleAlias as articleAlias
,ReleaseNumber
,h.CustomerOrderNumber as header
,l.CustomerLineItemNumber as lineItem
,r.CustomerReleaseNumber as releaseNumber
,r.LoadingUnits
,r.Quantity
,r.TradeUnits
,h.CustomerHumanReadableId
,r.DeliveryAddressDescription
,format(r.LoadingDate, 'MM/dd/yyyy HH:mm') as loadingDate
,format(r.DeliveryDate, 'MM/dd/yyyy HH:mm') as deliveryDate
,r.Remark
--,*
from [order].[Release] as r (nolock)
left join
[order].LineItem as l (nolock) on
l.id = r.LineItemId
left join
[order].Header as h (nolock) on
h.id = l.HeaderId
WHERE releasestate not in (1, 2, 4)
AND r.deliverydate between getDate() + -[startDay] and getdate() + [endDay]
order by r.deliverydate

View File

@@ -0,0 +1,19 @@
use [test1_AlplaPROD2.0_Reporting]
declare @startDate nvarchar(30) = '[startDate]' --'2024-12-30'
declare @endDate nvarchar(30) = '[endDate]' --'2025-08-09'
select MachineLocation,
ArticleHumanReadableId as article,
sum(Quantity) as Produced,
count(Quantity) as palletsProduced,
FORMAT(convert(date, ProductionDay), 'M/d/yyyy') as ProductionDay,
ProductionLotHumanReadableId as productionLot
from [reporting_productionControlling].[ScannedUnit] (nolock)
where convert(date, ProductionDay) between @startDate and @endDate
and ArticleHumanReadableId in ([articles])
and BookedOut is null
group by MachineLocation, ArticleHumanReadableId,ProductionDay, ProductionLotHumanReadableId


@@ -0,0 +1,23 @@
use AlplaPROD_test1
/**
Move this over to the delivery date range query once we have the shift data mapped over correctly.
Update the PSI stuff on this as well.
**/
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
select IdArtikelVarianten,
ArtikelVariantenBez,
sum(Menge) totalDelivered,
case when convert(time, upd_date) between '00:00' and '07:00' then convert(date, upd_date - 1) else convert(date, upd_date) end as ShippingDate
from dbo.V_LadePlanungenLadeAuftragAbruf (nolock)
where upd_date between CONVERT(datetime, @start_date + ' 7:00') and CONVERT(datetime, @end_date + ' 7:00')
and IdArtikelVarianten in ([articles])
group by IdArtikelVarianten, upd_date,
ArtikelVariantenBez
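The `case when` on `upd_date` above folds anything up to 07:00 into the previous calendar day, so pallets shipped by the overnight shift report under the day that shift started. A minimal TypeScript sketch of the same rollover rule (the helper name is ours; note the SQL's `BETWEEN '00:00' AND '07:00'` includes 07:00:00 exactly, while this sketch uses a strictly-before-07:00 cutoff):

```typescript
// Roll timestamps before the 07:00 shift change back to the previous
// calendar day, mirroring the CASE expression in the SQL above.
const pad = (n: number): string => String(n).padStart(2, "0");

export const shippingDay = (ts: Date): string => {
  const d = new Date(ts.getTime());
  if (d.getHours() < 7) d.setDate(d.getDate() - 1); // overnight shift -> yesterday
  return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
};
```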


@@ -0,0 +1,32 @@
use AlplaPROD_test1
declare @start_date nvarchar(30) = '[startDate]' --'2025-01-01'
declare @end_date nvarchar(30) = '[endDate]' --'2025-08-09'
/*
articles will need to be passed over as well as the date structure we want to see
*/
select x.IdArtikelvarianten As Article,
ProduktionAlias as Description,
standort as MachineId,
MaschinenBezeichnung as MachineName,
--MaschZyklus as PlanningCycleTime,
x.IdProdPlanung as LotNumber,
FORMAT(ProdTag, 'MM/dd/yyyy') as ProductionDay,
x.planMenge as TotalPlanned,
ProduktionMenge as QTYPerDay,
round(ProduktionMengeVPK, 2) PalDay,
Status as finished
--MaschStdAuslastung as nee
from dbo.V_ProdLosProduktionJeProdTag_PLANNING (nolock) as x
left join
dbo.V_ProdPlanung (nolock) as p on
x.IdProdPlanung = p.IdProdPlanung
where ProdTag between @start_date and @end_date
and p.IdArtikelvarianten in ([articles])
--and V_ProdLosProduktionJeProdTag_PLANNING.IdKunde = 10
--and IdProdPlanung = 18442
order by ProdTag desc


@@ -0,0 +1,44 @@
use [test1_AlplaPROD2.0_Read]
SELECT
'Alert! new blocking order: #' + cast(bo.HumanReadableId as varchar) + ' - ' + bo.ArticleVariantDescription as subject
,cast(bo.[HumanReadableId] as varchar) as blockingNumber
,bo.[ArticleVariantDescription] as article
,cast(bo.[CustomerHumanReadableId] as varchar) + ' - ' + bo.[CustomerDescription] as customer
,convert(varchar(10), bo.[BlockingDate], 101) + ' ' + convert(varchar(5), bo.[BlockingDate], 108) as blockingDate
,cast(ArticleVariantHumanReadableId as varchar) + ' - ' + ArticleVariantDescription as av
,case when bo.Remark = '' or bo.Remark is NULL then 'Please reach out to quality for the reason this was placed on hold as a remark was not entered during the blocking process' else bo.Remark end as remark
,cast(FORMAT(TotalAmountOfPieces, '###,###') as varchar) + ' / ' + cast(LoadingUnit as varchar) as peicesAndLoadingUnits
,bo.ProductionLotHumanReadableId as lotNumber
,cast(osd.IdBlockingDefectsGroup as varchar) + ' - ' + osd.Description as mainDefectGroup
,cast(df.HumanReadableId as varchar) + ' - ' + os.Description as mainDefect
,lot.MachineLocation as line
--,*
FROM [blocking].[BlockingOrder] (nolock) as bo
/*** get the defect details ***/
join
[blocking].[BlockingDefect] (nolock) AS df
on df.id = bo.MainDefectId
/*** pull description from 1.0 ***/
left join
[AlplaPROD_test1].[dbo].[T_BlockingDefects] (nolock) as os
on os.IdGlobalBlockingDefect = df.HumanReadableId
/*** join in 1.0 defect group ***/
left join
[AlplaPROD_test1].[dbo].[T_BlockingDefectsGroups] (nolock) as osd
on osd.IdBlockingDefectsGroup = os.IdBlockingDefectsGroup
left join
[productionControlling].[ProducedLot] (nolock) as lot
on lot.id = bo.ProductionLotId
where
bo.[BlockingDate] between getdate() - 2 and getdate() + 3 and
bo.BlockingTrigger = 1 -- so we only get the IR blocking and not COA
--and HumanReadableId NOT IN ([sentBlockingOrders])
and bo.HumanReadableId > [lastBlocking]


@@ -0,0 +1,4 @@
select top(1) convert(varchar(8), convert(time, startdate), 108) as shiftChange
from [test1_AlplaPROD2.0_Read].[masterData].[ShiftDefinition]
where teamNumber = 1


@@ -20,8 +20,8 @@ export const gpReqCheck = async (data: GpStatus[]) => {
module: "purchase",
subModule: "query",
message: `Error getting alpla purchase info`,
data: [gpReqCheck.message],
notify: false,
data: gpReqCheck.message as any,
notify: true,
});
}
@@ -30,7 +30,7 @@ export const gpReqCheck = async (data: GpStatus[]) => {
const result = await gpQuery(
gpReqCheck.query.replace(
"[reqsToCheck]",
data.map((r) => `'${r.req}'`).join(", ") ?? "",
data.map((r) => `'${r.req}'`).join(", ") ?? "xo",
),
"Get req info",
);
@@ -55,7 +55,7 @@ export const gpReqCheck = async (data: GpStatus[]) => {
[Requisition Number] as req
,case when [Workflow Status] = 'recall' then 'returned' else [Workflow Status] end as approvedStatus
--,*
from [dbo].[PurchaseRequisitions] where [Requisition Number] in (${missing1Reqs.map((r) => `'${r}'`).join(", ")})`,
from [dbo].[PurchaseRequisitions] where [Requisition Number] in (${missing1Reqs.map((r) => `'${r}'`).join(", ") ?? "xo"})`,
"validate req is not in recall",
);
@@ -76,7 +76,7 @@ export const gpReqCheck = async (data: GpStatus[]) => {
,PONUMBER
,reqStatus='converted'
,*
from alpla.dbo.sop60100 (nolock) where sopnumbe in (${missing2Reqs.map((r) => `'${r}'`).join(", ")})`,
from alpla.dbo.sop60100 (nolock) where sopnumbe in (${missing2Reqs.map((r) => `'${r}'`).join(", ") ?? "xo"})`,
"Get release info",
);
@@ -111,7 +111,15 @@ export const gpReqCheck = async (data: GpStatus[]) => {
}));
return updateData;
} catch (error) {
log.error({ stack: error });
} catch (error: any) {
return returnFunc({
success: false,
level: "error",
module: "purchase",
subModule: "gpChecks",
message: error.message,
data: error.stack as any,
notify: true,
});
}
};
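Worth noting in the hunks above: `Array.prototype.join` always returns a string, so the `?? "xo"` fallback after `.join(", ")` can never fire; an empty input array still produces an empty IN list and therefore invalid SQL. A sketch of an explicit guard (the helper name and the quote-escaping are ours, not from the source):

```typescript
// Build a quoted SQL IN-list; returns null for an empty input so the
// caller can skip the query instead of sending `IN ()`.
export const sqlInList = (values: string[]): string | null => {
  if (values.length === 0) return null;
  // double up single quotes so values cannot break out of the literal
  return values.map((v) => `'${v.replace(/'/g, "''")}'`).join(", ");
};
```

Callers would then branch on `null` before interpolating the list into the query text.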


@@ -39,8 +39,8 @@ export const monitorAlplaPurchase = async () => {
module: "purchase",
subModule: "query",
message: `Error getting alpla purchase info`,
data: [sqlQuery.message],
notify: false,
data: sqlQuery.message as any,
notify: true,
});
}
@@ -78,7 +78,7 @@ export const monitorAlplaPurchase = async () => {
if (error) {
log.error(
{ error },
{ error, notify: true },
"There was an error adding alpla purchase history",
);
}
@@ -86,8 +86,10 @@ export const monitorAlplaPurchase = async () => {
await delay(500);
}
} catch (e) {
log.error({ error: e }, "Error occurred while running the monitor job");
log.error(
{ error: e, notify: true },
"Error occurred while running the monitor job",
);
return;
}
@@ -104,7 +106,7 @@ export const monitorAlplaPurchase = async () => {
// if there's no reqs just end meow
if (errorReq) {
log.error(
{ stack: errorReq },
{ stack: errorReq, notify: true },
"There was an error getting history data",
);
return;


@@ -10,6 +10,7 @@ import { setupOCPRoutes } from "./ocp/ocp.routes.js";
import { setupOpendockRoutes } from "./opendock/opendock.routes.js";
import { setupProdSqlRoutes } from "./prodSql/prodSql.routes.js";
import { setupSystemRoutes } from "./system/system.routes.js";
import { setupTCPRoutes } from "./tcpServer/tcp.routes.js";
import { setupUtilsRoutes } from "./utils/utils.routes.js";
export const setupRoutes = (baseUrl: string, app: Express) => {
@@ -24,4 +25,5 @@ export const setupRoutes = (baseUrl: string, app: Express) => {
setupOpendockRoutes(baseUrl, app);
setupNotificationRoutes(baseUrl, app);
setupOCPRoutes(baseUrl, app);
setupTCPRoutes(baseUrl, app);
};


@@ -6,15 +6,19 @@ import { dbCleanup } from "./db/dbCleanup.controller.js";
import { type Setting, settings } from "./db/schema/settings.schema.js";
import { connectGPSql } from "./gpSql/gpSqlConnection.controller.js";
import { createLogger } from "./logger/logger.controller.js";
import { historicalSchedule } from "./logistics/logistics.historicalInv.js";
import { startNotifications } from "./notification/notification.controller.js";
import { createNotifications } from "./notification/notifications.master.js";
import { printerSync } from "./ocp/ocp.printer.manage.js";
import { monitorReleaseChanges } from "./opendock/openDockRreleaseMonitor.utils.js";
import { opendockSocketMonitor } from "./opendock/opendockSocketMonitor.utils.js";
import { connectProdSql } from "./prodSql/prodSqlConnection.controller.js";
import { monitorAlplaPurchase } from "./purchase/purchase.controller.js";
import { setupSocketIORoutes } from "./socket.io/serverSetup.js";
import { baseSettingValidationCheck } from "./system/settingsBase.controller.js";
import { startTCPServer } from "./tcpServer/tcp.server.js";
import { createCronJob } from "./utils/croner.utils.js";
import { sendEmail } from "./utils/sendEmail.utils.js";
const port = Number(process.env.PORT) || 3000;
export let systemSettings: Setting[] = [];
@@ -28,6 +32,7 @@ const start = async () => {
const log = createLogger({ module: "system", subModule: "main start" });
// triggering long lived processes
startTCPServer();
connectProdSql();
connectGPSql();
@@ -51,17 +56,39 @@ const start = async () => {
monitorAlplaPurchase();
}
if (systemSettings.filter((n) => n.name === "ocp")[0]?.active) {
printerSync();
}
// these jobs below are system jobs and should run no matter what.
createCronJob("JobAuditLogCleanUp", "0 0 5 * * *", () =>
dbCleanup("jobs", 30),
);
createCronJob("logsCleanup", "0 15 5 * * *", () => dbCleanup("logs", 120));
historicalSchedule();
// one shots only needed to run on server startups
createNotifications();
startNotifications();
}, 5 * 1000);
process.on("uncaughtException", async (err) => {
console.error("Uncaught Exception:", err);
//await closePool();
const emailData = {
email: "blake.matthes@alpla.com", // should be moved to the db so it can be reused.
subject: `${os.hostname()} has just encountered a crash.`,
template: "serverCrash",
context: {
error: err,
plant: `${os.hostname()}`,
},
};
await sendEmail(emailData);
//process.exit(1);
});
server.listen(port, async () => {
log.info(
`Listening on http://${os.hostname()}:${port}${baseUrl}, logging in ${process.env.LOG_LEVEL}, current ENV ${process.env.NODE_ENV ? process.env.NODE_ENV : "development"}`,


@@ -1,9 +1,12 @@
import { Router } from "express";
import { connected as gpSql } from "../gpSql/gpSqlConnection.controller.js";
import { connected as prodSql } from "../prodSql/prodSqlConnection.controller.js";
import { prodQuery } from "../prodSql/prodSqlQuery.controller.js";
import {
type SqlQuery,
sqlQuerySelector,
} from "../prodSql/prodSqlQuerySelector.utils.js";
import { isServerRunning } from "../tcpServer/tcp.server.js";
const router = Router();
@@ -25,6 +28,9 @@ router.get("/", async (_, res) => {
: [],
eomFGPkgSheetVersion: 1, // the excel file version; when we have a change to the macro, bump this so we grab the new file
masterMacroFile: 1,
tcpServerOnline: isServerRunning,
sqlServerConnected: prodSql,
gpServerConnected: gpSql,
});
});


@@ -0,0 +1,51 @@
import { db } from "../db/db.controller.js";
import { printerLog } from "../db/schema/printerLogs.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
export type PrinterData = {
ip: string;
name: string;
condition: string;
message: string;
date?: string;
printerSN: string;
};
const log = createLogger({ module: "tcp", subModule: "create_server" });
export const printerListen = async (tcpData: PrinterData) => {
const ip = tcpData.ip?.replace("::ffff:", "");
// post the new message
const { data, error } = await tryCatch(
db
.insert(printerLog)
.values({
ip,
name: tcpData.name,
condition: tcpData.condition,
message: tcpData.message,
printerSN: tcpData.printerSN,
})
.returning(),
);
if (error) {
return returnFunc({
success: false,
level: "error",
module: "tcp",
subModule: "post",
message: "Failed to post tcp printer data.",
data: [],
notify: false,
});
}
if (data) {
log.info({}, `${tcpData.name} sent a message over`);
// TODO: send message over to the controller to decide what to do next with it
}
};


@@ -0,0 +1,14 @@
import type { Express } from "express";
import { requireAuth } from "../middleware/auth.middleware.js";
import restart from "./tcpRestart.route.js";
import start from "./tcpStart.route.js";
import stop from "./tcpStop.route.js";
export const setupTCPRoutes = (baseUrl: string, app: Express) => {
// stats will stay like this as we don't need to change them
app.use(`${baseUrl}/api/tcp/start`, requireAuth, start);
app.use(`${baseUrl}/api/tcp/stop`, requireAuth, stop);
app.use(`${baseUrl}/api/tcp/restart`, requireAuth, restart);
// all other system routes should be under /api/system/*
};


@@ -0,0 +1,180 @@
import net from "node:net";
import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { printerData } from "../db/schema/printers.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import { delay } from "../utils/delay.utils.js";
import { returnFunc } from "../utils/returnHelper.utils.js";
import { tryCatch } from "../utils/trycatch.utils.js";
import { type PrinterData, printerListen } from "./tcp.printerListener.js";
let tcpServer: net.Server;
const tcpSockets: Set<net.Socket> = new Set();
export let isServerRunning = false;
const port = parseInt(process.env.TCP_PORT ?? "2222", 10);
const parseTcpAlert = (input: string) => {
// guard
const colonIndex = input.indexOf(":");
if (colonIndex === -1) return null;
const condition = input.slice(0, colonIndex).trim();
const rest = input.slice(colonIndex + 1).trim();
// extract all [ ... ] blocks from rest
const matches = [...rest.matchAll(/\[(.*?)\]/g)];
const date = matches[0]?.[1] ?? "";
const name = matches[1]?.[1] ?? "";
// message = everything before first "["
const bracketIndex = rest.indexOf("[");
const message =
bracketIndex !== -1 ? rest.slice(0, bracketIndex).trim() : rest;
return {
condition,
message,
date,
name,
};
};
const log = createLogger({ module: "tcp", subModule: "create_server" });
export const startTCPServer = async () => {
tcpServer = net.createServer(async (socket) => {
tcpSockets.add(socket);
socket.on("data", async (data: Buffer) => {
const parseData = data.toString("utf-8").trimEnd();
// check where the data came from then we do something.
const ip = socket.remoteAddress ?? "127.0.0.1";
const { data: printer, error: pError } = await tryCatch(
db
.select()
.from(printerData)
.where(eq(printerData.ipAddress, ip.replace("::ffff:", ""))),
);
if (pError) {
log.error(
{ stack: pError },
"There was an error getting printer data for tcp check",
);
return;
}
if (printer?.length) {
const printerData = {
...parseTcpAlert(parseData),
ip,
printerSN: printer[0]?.printerSN,
name: printer[0]?.name,
};
printerListen(printerData as PrinterData);
}
});
socket.on("end", () => {
log.debug({}, "Client disconnected");
// just in case we don't fully disconnect
setTimeout(() => {
if (!socket.destroyed) {
socket.destroy();
}
}, 1000);
tcpSockets.delete(socket);
});
socket.on("error", (err: Error) => {
log.error({ stack: err }, `Socket error: ${err.message}`);
// just in case we don't fully disconnect
setTimeout(() => {
if (!socket.destroyed) {
socket.destroy();
}
}, 1000);
tcpSockets.delete(socket);
});
});
tcpServer.on("error", (err: Error) => {
// without this listener an EADDRINUSE on startup would crash the process
isServerRunning = false;
log.error({ stack: err }, `TCP server error: ${err.message}`);
});
tcpServer.listen(port, () => {
log.info({}, `TCP Server listening on port ${port}`);
});
isServerRunning = true;
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server started.",
data: [],
notify: false,
room: "",
});
};
export const stopTCPServer = async () => {
if (!isServerRunning)
return { success: false, message: "Server is not running" };
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server stopped.",
data: [],
notify: false,
room: "",
});
};
export const restartTCPServer = async () => {
if (!isServerRunning) {
startTCPServer();
return returnFunc({
success: false,
level: "warn",
module: "tcp",
subModule: "create_server",
message: "Server is not running; will try to start it",
data: [],
notify: false,
room: "",
});
} else {
for (const socket of tcpSockets) {
socket.destroy();
}
tcpSockets.clear();
tcpServer.close(() => {
log.info({}, "TCP Server stopped");
});
isServerRunning = false;
await delay(1500);
startTCPServer();
}
return returnFunc({
success: true,
level: "info",
module: "tcp",
subModule: "create_server",
message: "TCP server has been restarted.",
data: [],
notify: false,
room: "",
});
};
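For reference, given the `CONDITION: message [date] [name]` frame format that `parseTcpAlert` assumes, a sample payload parses as below (the payload values are made up; the parsing logic is copied verbatim so the sketch is self-contained):

```typescript
// Self-contained copy of the parser for illustration.
const parseTcpAlert = (input: string) => {
  const colonIndex = input.indexOf(":");
  if (colonIndex === -1) return null; // no condition prefix -> not a frame
  const condition = input.slice(0, colonIndex).trim();
  const rest = input.slice(colonIndex + 1).trim();
  // extract all [ ... ] blocks: first is the date, second the printer name
  const matches = [...rest.matchAll(/\[(.*?)\]/g)];
  const date = matches[0]?.[1] ?? "";
  const name = matches[1]?.[1] ?? "";
  // message = everything before the first "["
  const bracketIndex = rest.indexOf("[");
  const message =
    bracketIndex !== -1 ? rest.slice(0, bracketIndex).trim() : rest;
  return { condition, message, date, name };
};

// A hypothetical warning frame from a line printer
const parsed = parseTcpAlert("WARNING: ink low [2026-04-15 07:00] [LINE1-PRN]");
```

Note the parser only splits on the first colon, so colons inside the bracketed timestamp are safe.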


@@ -0,0 +1,19 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { restartTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/restart", async (_, res) => {
const connect = await restartTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "tcp",
subModule: "post",
message: connect.message,
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { startTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/start", async (_, res) => {
const connect = await startTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "tcp",
subModule: "post",
message: connect.message,
data: connect.data,
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -0,0 +1,20 @@
import { Router } from "express";
import { apiReturn } from "../utils/returnHelper.utils.js";
import { stopTCPServer } from "./tcp.server.js";
const r = Router();
r.post("/stop", async (_, res) => {
const connect = await stopTCPServer();
apiReturn(res, {
success: connect.success,
level: connect.success ? "info" : "error",
module: "tcp",
subModule: "post",
message: connect.message,
data: [],
status: connect.success ? 200 : 400,
});
});
export default r;


@@ -3,6 +3,7 @@ import { eq } from "drizzle-orm";
import { db } from "../db/db.controller.js";
import { jobAuditLog } from "../db/schema/auditLog.schema.js";
import { createLogger } from "../logger/logger.controller.js";
import type { ReturnHelper } from "./returnHelper.utils.js";
// example createJob
// createCronJob("test Cron", "*/5 * * * * *", async () => {
@@ -45,7 +46,7 @@ const cronStats: Record<string, { created: number; replaced: number }> = {};
export const createCronJob = async (
name: string,
schedule: string, // cron string with 8 8 IE: */5 * * * * * every 5th second
task: () => Promise<void>, // what function are we passing over
task: () => Promise<void | ReturnHelper>, // what function are we passing over
source = "unknown",
) => {
// get the timezone based on the os timezone set


@@ -0,0 +1,73 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
{{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
<style>
.email-wrapper {
max-width: 80%; /* Limit width to 80% of the window */
margin: 0 auto; /* Center the content horizontally */
}
.email-table {
width: 100%;
border-collapse: collapse;
}
.email-table td {
vertical-align: top;
padding: 10px;
border: 1px solid #000;
border-radius: 25px; /* Rounded corners */
background-color: #f0f0f0; /* Optional: Add a background color */
}
.email-table h2 {
margin: 0;
}
.remarks {
border: 1px solid black;
padding: 10px;
background-color: #f0f0f0;
border-radius: 25px;
}
</style>
</head>
<body>
<div class="email-wrapper">
<p>All,</p>
<p>Please see the new blocking order that was created.</p>
<div>
<div class="email-table">
<table>
<tr>
<td>
<p><strong>Blocking number: </strong>{{items.blockingNumber}}</p>
<p><strong>Blocking Date: </strong>{{items.blockingDate}}</p>
<p><strong>Article: </strong>{{items.av}}</p>
<p><strong>Production Lot: </strong>{{items.lotNumber}}</p>
<p><strong>Line: </strong>{{items.line}}</p>
</td>
<td>
<p><strong>Customer: </strong>{{items.customer}}</p>
<p><strong>Blocked pieces /LUs: </strong>{{items.peicesAndLoadingUnits}}</p>
<p><strong>Main defect group: </strong>{{items.mainDefectGroup}}</p>
<p><strong>Main defect: </strong>{{items.mainDefect}}</p>
</td>
</tr>
</table>
</div>
</div>
<div class="remarks">
<h4>Remarks:</h4>
<p>{{items.remark}}</p>
</div>
</div>
<br>
<p>For further questions please reach out to quality.</p>
<p>Thank you,</p>
<p>Quality Department</p>
</div>
</body>
</html>


@@ -0,0 +1,35 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
{{!--<title>Order Summary</title> --}}
{{> styles}}
<style>
pre {
background-color: #f8f9fa;
color: #d63384;
padding: 10px;
border-radius: 5px;
white-space: pre-wrap;
font-family: monospace;
}
</style>
{{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
</head>
<body>
<h3>{{plant}},<br/> Has encountered an unexpected error.</h3>
<p>
Please see below the stack error from the crash.
</p>
<hr/>
<div>
<h3>Error Message: </h3>
<p>{{error.message}}</p>
</div>
<hr/>
<div>
<h3>Stack trace</h3>
<pre>{{{error.stack}}}</pre>
</div>
</body>
</html>


@@ -0,0 +1,36 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
{{!--<title>Order Summary</title> --}}
{{> styles}}
<style>
pre {
background-color: #f8f9fa;
color: #d63384;
padding: 10px;
border-radius: 5px;
white-space: pre-wrap;
font-family: monospace;
}
</style>
{{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
</head>
<body>
<h3>{{plant}},<br/> Has encountered an error.</h3>
<p>
The below error came from Module: {{module}}, Submodule: {{submodule}}.
</p>
<p>The error below is considered to be critical and should be addressed</p>
<hr/>
<div>
<h3>Error Message: </h3>
<p>{{message}}</p>
</div>
<hr/>
<div>
<h3>Stack trace</h3>
<pre>{{{error}}}</pre>
</div>
</body>
</html>


@@ -0,0 +1,41 @@
import pkg from "pg";
const { Pool } = pkg;
const baseConfig = {
host: process.env.DATABASE_HOST ?? "localhost",
port: parseInt(process.env.DATABASE_PORT ?? "5433", 10),
user: process.env.DATABASE_USER,
password: process.env.DATABASE_PASSWORD,
};
// Pools (one per DB)
const v1Pool = new Pool({
...baseConfig,
database: "lst",
});
const v2Pool = new Pool({
...baseConfig,
database: "lst_db",
});
// Query helpers
export const v1QueryRun = async (query: string, params?: any[]) => {
try {
const res = await v1Pool.query(query, params);
return res;
} catch (err) {
console.error("V1 query error:", err);
throw err;
}
};
export const v2QueryRun = async (query: string, params?: any[]) => {
try {
const res = await v2Pool.query(query, params);
return res;
} catch (err) {
console.error("V2 query error:", err);
throw err;
}
};
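The two pools above share a base config that falls back to `localhost:5433` when the env vars are unset. As a sketch, that defaulting can be factored into a pure helper so it is testable without a live database (the helper itself is ours, not part of the file):

```typescript
// Mirror of the pool defaulting above: env values win, otherwise
// fall back to localhost:5433 (the defaults hard-coded in the file).
export const poolConfig = (env: Record<string, string | undefined>) => ({
  host: env.DATABASE_HOST ?? "localhost",
  port: parseInt(env.DATABASE_PORT ?? "5433", 10),
  user: env.DATABASE_USER,
  password: env.DATABASE_PASSWORD,
});
```

Each `Pool` then spreads this object and adds only its `database` name, which keeps the v1/v2 configs from drifting apart.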


@@ -0,0 +1,124 @@
import https from "node:https";
import axios from "axios";
import { returnFunc } from "./returnHelper.utils.js";
import { tryCatch } from "./trycatch.utils.js";
type bodyData = any;
type Data = {
endpoint: string;
data?: bodyData[];
method: "post" | "get" | "delete" | "patch";
};
// type ApiResponse<T = unknown> = {
// status: number;
// statusText: string;
// data: T;
// };
// create the test server stuff
const testServers = [
{ token: "test1", port: 8940 },
{ token: "test2", port: 8941 },
{ token: "test3", port: 8942 },
];
const agent = new https.Agent({
rejectUnauthorized: false,
});
export const prodEndpointCreation = async (endpoint: string) => {
let url = "";
//get the plant token
const plantToken = process.env.PROD_PLANT_TOKEN ?? "test1";
// check if we are a test server
const testServer = testServers.some((server) => server.token === plantToken);
// await db
// .select()
// .from(settings)
// .where(eq(settings.name, "dbServer"));
if (testServer) {
//filter out what testserver we are
const test = testServers.filter((t) => t.token === plantToken);
// "https://usmcd1vms036.alpla.net:8942/application/public/v1.0/DemandManagement/ORDERS"
url = `https://${process.env.PROD_SERVER}.alpla.net:${test[0]?.port}/application${endpoint}`;
return url;
} else {
url = `https://${plantToken}prod.alpla.net/application${endpoint}`;
return url;
}
};
/**
 * Run a request against the prod API.
 * @param data endpoint, HTTP method, and optional body payload
 * @returns a returnFunc result wrapping the response, or the error details
 */
export const runProdApi = async (data: Data) => {
const url = await prodEndpointCreation(data.endpoint);
const { data: d, error } = await tryCatch(
axios({
method: data.method as string,
url,
data: data.data ? data.data[0] : undefined,
headers: {
"X-API-Key": process.env.TEC_API_KEY || "",
"Content-Type": "application/json",
},
validateStatus: () => true,
httpsAgent: agent,
}),
);
switch (d?.status) {
case 200:
return returnFunc({
success: true,
level: "info",
module: "utils",
subModule: "prodEndpoint",
message: "Data from prod endpoint",
data: d.data,
notify: false,
});
case 401:
return returnFunc({
success: false,
level: "error",
module: "utils",
subModule: "prodEndpoint",
message: "Unauthorized response from prod endpoint",
data: d.data,
notify: false,
});
case 400:
return returnFunc({
success: false,
level: "error",
module: "utils",
subModule: "prodEndpoint",
message: "Bad request rejected by prod endpoint",
data: d.data,
notify: false,
});
}
if (error) {
return returnFunc({
success: false,
level: "error",
module: "utils",
subModule: "prodEndpoint",
message: "Failed to get data from the prod endpoint",
data: error as any,
notify: true,
});
}
};
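Given the token table above, `prodEndpointCreation` routes the three test tokens to the shared test host on a per-token port and every other token to `<token>prod.alpla.net`. A self-contained sketch of that branching (the host name `usmcd1vms036` comes from the commented example URL in the file; the function name is ours):

```typescript
const testServers = [
  { token: "test1", port: 8940 },
  { token: "test2", port: 8941 },
  { token: "test3", port: 8942 },
];

// Mirror of the URL branching: test tokens hit the shared test host on
// their assigned port; anything else maps to the per-plant prod host.
export const prodUrl = (
  plantToken: string,
  prodServer: string,
  endpoint: string,
): string => {
  const test = testServers.find((t) => t.token === plantToken);
  return test
    ? `https://${prodServer}.alpla.net:${test.port}/application${endpoint}`
    : `https://${plantToken}prod.alpla.net/application${endpoint}`;
};
```

Using `find` instead of `some` + `filter` collapses the two lookups in the original into one.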


@@ -1,7 +1,7 @@
import type { Response } from "express";
import { createLogger } from "../logger/logger.controller.js";
interface Data<T = unknown[]> {
export interface ReturnHelper<T = unknown[]> {
success: boolean;
module:
| "system"
@@ -12,29 +12,12 @@ interface Data<T = unknown[]> {
| "opendock"
| "notification"
| "email"
| "purchase";
subModule:
| "db"
| "labeling"
| "printer"
| "prodSql"
| "query"
| "sendmail"
| "auth"
| "datamart"
| "jobs"
| "apt"
| "settings"
| "get"
| "update"
| "delete"
| "post"
| "notification"
| "delete"
| "printing"
| "gpSql"
| "email";
level: "info" | "error" | "debug" | "fatal";
| "purchase"
| "tcp"
| "logistics";
subModule: string;
level: "info" | "error" | "debug" | "fatal" | "warn";
message: string;
room?: string;
data?: T;
@@ -55,7 +38,7 @@ interface Data<T = unknown[]> {
* data: [] the data that will be passed back
* notify: false by default this is to send a notification to a users email to alert them of an issue.
*/
export const returnFunc = (data: Data) => {
export const returnFunc = (data: ReturnHelper) => {
const notify = data.notify ? data.notify : false;
const room = data.room;
const log = createLogger({ module: data.module, subModule: data.subModule });
@@ -88,7 +71,7 @@ export const returnFunc = (data: Data) => {
export function apiReturn(
res: Response,
opts: Data & { status?: number },
opts: ReturnHelper & { status?: number },
optional?: unknown, // leave this as unknown so we can pass an object or an array over.
): Response {
const result = returnFunc(opts);


@@ -5,13 +5,17 @@ meta {
}
get {
url: {{url}}/api/datamart/:name
url: {{url}}/api/datamart/:name?historical=x
body: none
auth: inherit
}
params:query {
historical: x
}
params:path {
name: activeArticles
name: inventory
}
settings {


@@ -1,5 +1,5 @@
vars {
url: http://localhost:3600/lst
url: http://localhost:3000/lst
readerIp: 10.44.14.215
}
vars:secret [


@@ -14,7 +14,7 @@ body:json {
{
"userId":"m6AbQXFwOXoX3YKLfwWgq2LIdDqS5jqv",
"notificationId": "0399eb2a-39df-48b7-9f1c-d233cec94d2e",
"emails": ["blake.mattes@alpla.com","cowchmonkey@gmail.com"]
"emails": ["blake.matthes@alpla.com","blake.matthes@alpla.com"]
}
}

File diff suppressed because it is too large


@@ -26,6 +26,8 @@
"radix-ui": "^1.4.3",
"react": "^19.1.1",
"react-dom": "^19.1.1",
"react-markdown": "^10.1.0",
"remark-gfm": "^4.0.1",
"shadcn": "^4.0.8",
"socket.io-client": "^4.8.3",
"sonner": "^2.0.7",
@@ -36,6 +38,7 @@
},
"devDependencies": {
"@eslint/js": "^9.36.0",
"@tailwindcss/typography": "^0.5.19",
"@tanstack/router-plugin": "^1.166.7",
"@types/react": "^19.1.13",
"@types/react-dom": "^19.1.9",

Six binary image files added (5.8 KiB, 27 KiB, 20 KiB, 5.9 KiB, 31 KiB, and 21 KiB); contents not shown.


@@ -0,0 +1,105 @@
import { Link, useRouterState } from "@tanstack/react-router";
import { ChevronRight } from "lucide-react";
import {
Collapsible,
CollapsibleContent,
CollapsibleTrigger,
} from "../ui/collapsible";
import {
SidebarGroup,
SidebarGroupContent,
SidebarGroupLabel,
SidebarMenu,
SidebarMenuButton,
SidebarMenuItem,
SidebarMenuSub,
SidebarMenuSubButton,
SidebarMenuSubItem,
useSidebar,
} from "../ui/sidebar";
const docs = [
{
title: "Notifications",
url: "/intro",
//icon,
isActive: window.location.pathname.includes("notifications"),
items: [
{
title: "Reprints",
url: "/reprints",
},
{
title: "New Blocking order",
url: "/qualityBlocking",
},
],
},
];
export default function DocBar() {
const { setOpen } = useSidebar();
const pathname = useRouterState({
select: (s) => s.location.pathname,
});
const isNotifications = pathname.includes("notifications");
return (
<SidebarGroup>
<SidebarGroupLabel>Docs</SidebarGroupLabel>
<SidebarGroupContent>
<SidebarMenu>
<SidebarMenuItem key={"docs"}>
<SidebarMenuButton asChild>
<Link to={"/docs"} onClick={() => setOpen(false)}>
{/* <item.icon /> */}
<span>{"Intro"}</span>
</Link>
</SidebarMenuButton>
</SidebarMenuItem>
</SidebarMenu>
<SidebarMenu>
{docs.map((item) => (
<Collapsible
key={item.title}
asChild
defaultOpen={isNotifications}
className="group/collapsible"
>
<SidebarMenuItem>
<CollapsibleTrigger asChild>
<SidebarMenuButton tooltip={item.title}>
<Link
to={"/docs/$"}
params={{ _splat: `notifications${item.url}` }}
>
{item.title}
</Link>
<ChevronRight className="ml-auto transition-transform duration-200 group-data-[state=open]/collapsible:rotate-90" />
</SidebarMenuButton>
</CollapsibleTrigger>
<CollapsibleContent>
<SidebarMenuSub>
{item.items?.map((subItem) => (
<SidebarMenuSubItem key={subItem.title}>
<SidebarMenuSubButton asChild>
<Link
to={"/docs/$"}
params={{ _splat: `notifications${subItem.url}` }}
>
{subItem.title}
</Link>
</SidebarMenuSubButton>
</SidebarMenuSubItem>
))}
</SidebarMenuSub>
</CollapsibleContent>
</SidebarMenuItem>
</Collapsible>
))}
</SidebarMenu>
</SidebarGroupContent>
</SidebarGroup>
);
}


@@ -7,6 +7,7 @@ import {
} from "@/components/ui/sidebar";
import { useSession } from "@/lib/auth-client";
import AdminSidebar from "./AdminBar";
import DocBar from "./DocBar";
export function AppSidebar() {
const { data: session } = useSession();
@@ -21,6 +22,7 @@ export function AppSidebar() {
<SidebarMenu>
<SidebarMenuItem>
<SidebarContent>
<DocBar/>
{session &&
(session.user.role === "admin" ||
session.user.role === "systemAdmin") && (


@@ -0,0 +1,76 @@
import * as React from "react"
import { cva, type VariantProps } from "class-variance-authority"
import { cn } from "@/lib/utils"
const alertVariants = cva(
"group/alert relative grid w-full gap-0.5 rounded-lg border px-2.5 py-2 text-left text-sm has-data-[slot=alert-action]:relative has-data-[slot=alert-action]:pr-18 has-[>svg]:grid-cols-[auto_1fr] has-[>svg]:gap-x-2 *:[svg]:row-span-2 *:[svg]:translate-y-0.5 *:[svg]:text-current *:[svg:not([class*='size-'])]:size-4",
{
variants: {
variant: {
default: "bg-card text-card-foreground",
destructive:
"bg-card text-destructive *:data-[slot=alert-description]:text-destructive/90 *:[svg]:text-current",
},
},
defaultVariants: {
variant: "default",
},
}
)
function Alert({
className,
variant,
...props
}: React.ComponentProps<"div"> & VariantProps<typeof alertVariants>) {
return (
<div
data-slot="alert"
role="alert"
className={cn(alertVariants({ variant }), className)}
{...props}
/>
)
}
function AlertTitle({ className, ...props }: React.ComponentProps<"div">) {
return (
<div
data-slot="alert-title"
className={cn(
"font-medium group-has-[>svg]/alert:col-start-2 [&_a]:underline [&_a]:underline-offset-3 [&_a]:hover:text-foreground",
className
)}
{...props}
/>
)
}
function AlertDescription({
className,
...props
}: React.ComponentProps<"div">) {
return (
<div
data-slot="alert-description"
className={cn(
"text-sm text-balance text-muted-foreground md:text-pretty [&_a]:underline [&_a]:underline-offset-3 [&_a]:hover:text-foreground [&_p:not(:last-child)]:mb-4",
className
)}
{...props}
/>
)
}
function AlertAction({ className, ...props }: React.ComponentProps<"div">) {
return (
<div
data-slot="alert-action"
className={cn("absolute top-2 right-2", className)}
{...props}
/>
)
}
export { Alert, AlertTitle, AlertDescription, AlertAction }

View File

@@ -0,0 +1,31 @@
import * as React from "react"

import { Collapsible as CollapsiblePrimitive } from "radix-ui"
function Collapsible({
...props
}: React.ComponentProps<typeof CollapsiblePrimitive.Root>) {
return <CollapsiblePrimitive.Root data-slot="collapsible" {...props} />
}
function CollapsibleTrigger({
...props
}: React.ComponentProps<typeof CollapsiblePrimitive.CollapsibleTrigger>) {
return (
<CollapsiblePrimitive.CollapsibleTrigger
data-slot="collapsible-trigger"
{...props}
/>
)
}
function CollapsibleContent({
...props
}: React.ComponentProps<typeof CollapsiblePrimitive.CollapsibleContent>) {
return (
<CollapsiblePrimitive.CollapsibleContent
data-slot="collapsible-content"
{...props}
/>
)
}
export { Collapsible, CollapsibleTrigger, CollapsibleContent }

View File

@@ -0,0 +1,62 @@
export default function Intro() {
return (
<div className="mx-auto w-full max-w-4xl px-6 py-8">
<h1 className="text-3xl underline p-2">Notifications</h1>
<p className="p-2">
All notifications are subscription based. Please open the menu and
select the notification you would like to learn more about.
</p>
<hr />
<p>To subscribe to a notification:</p>
<ol className="list-decimal list-inside">
<li>Click on your profile</li>
<img
src="/lst/app/imgs/docs/notifications/lt_profile.png"
alt="Reprint notification example"
className="m-2 rounded-lg border-2"
/>
<li>Click account</li>
<li>Select the notification you would like to subscribe to.</li>
<img
src="/lst/app/imgs/docs/notifications/lt_notification_select.png"
alt="Reprint notification example"
className="m-2 rounded-lg border-2"
/>
<li>
If you want more people on the notification, you can add more
emails by clicking the add email button.{" "}
<p className="text-sm underline">
Please note that each user can subscribe on their own, so you do
not need to add others unless you want to.
</p>
</li>
<li>When you are ready, click Subscribe</li>
</ol>
<br />
<p className="">
NOTE: You can select the same notification again to add more people
or just yourself. Doing so overrides your current subscription,
adding or removing emails accordingly.
</p>
<hr className="m-2" />
<div>
<p>
The table at the bottom of your profile lists all of your current
subscriptions.
</p>
<p>
Clicking the trash can stops that notification from emailing you.
</p>
<img
src="/lst/app/imgs/docs/notifications/lt_notification_table.png"
alt="Reprint notification example"
className="m-2 rounded-lg border-2"
/>
</div>
</div>
);
}

View File

@@ -0,0 +1,19 @@
export default function QualityBlocking() {
return (
<div className="mx-auto w-full max-w-4xl px-6 py-8">
<h1 className="text-3xl underline p-2">Quality Blocking</h1>
<p className="p-2">
When a new blocking order is created, an alert is sent out to all
subscribed users. If multiple blocking orders are created between
checks, you can expect multiple emails. Below is an example of a
blocking email:
</p>
<img
src="/lst/app/imgs/docs/notifications/lt_qualityBlocking.png"
alt="Quality blocking notification example"
className="m-2 rounded-lg border-2"
/>
</div>
);
}

View File

@@ -0,0 +1,18 @@
export default function Reprints() {
return (
<div className="mx-auto w-full max-w-4xl px-6 py-8">
<h1 className="text-3xl underline p-2">Reprints</h1>
<p className="p-2">
The reprint alert monitors for labels that have been printed within
a defined time window. When a label is printed within that window,
an email similar to the one below is sent out:
</p>
<img
src="/lst/app/imgs/docs/notifications/lt_reprints.png"
alt="Reprint notification example"
className="m-2 rounded-lg border-2"
/>
</div>
);
}

26
frontend/src/lib/docs.ts Normal file
View File

@@ -0,0 +1,26 @@
import type { ComponentType } from "react";
const modules = import.meta.glob("../docs/**/*.tsx", {
eager: true,
});
type DocModule = {
default: ComponentType;
};
const docsMap: Record<string, ComponentType> = {};
for (const path in modules) {
const mod = modules[path] as DocModule;
const slug = path
.replace("../docs/", "")
.replace(/\.tsx$/, "");
// "notifications/intro"
docsMap[slug] = mod.default;
}
export function getDoc(slug: string) {
return docsMap[slug];
}
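The slug derivation in the loop above can be isolated as a pure helper (`toSlug` is a hypothetical name): strip the `../docs/` prefix and the trailing `.tsx` extension, using an anchored regex so only the file extension is removed.

```typescript
// Equivalent of the replace() chain in docs.ts: turn a Vite glob
// key like "../docs/notifications/intro.tsx" into a doc slug.
function toSlug(path: string): string {
  return path.replace("../docs/", "").replace(/\.tsx$/, "");
}

// "../docs/notifications/intro.tsx" -> "notifications/intro"
```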

View File

@@ -11,6 +11,8 @@
import { Route as rootRouteImport } from './routes/__root'
import { Route as AboutRouteImport } from './routes/about'
import { Route as IndexRouteImport } from './routes/index'
import { Route as DocsIndexRouteImport } from './routes/docs/index'
import { Route as DocsSplatRouteImport } from './routes/docs/$'
import { Route as AdminSettingsRouteImport } from './routes/admin/settings'
import { Route as AdminNotificationsRouteImport } from './routes/admin/notifications'
import { Route as AdminLogsRouteImport } from './routes/admin/logs'
@@ -29,6 +31,16 @@ const IndexRoute = IndexRouteImport.update({
path: '/',
getParentRoute: () => rootRouteImport,
} as any)
const DocsIndexRoute = DocsIndexRouteImport.update({
id: '/docs/',
path: '/docs/',
getParentRoute: () => rootRouteImport,
} as any)
const DocsSplatRoute = DocsSplatRouteImport.update({
id: '/docs/$',
path: '/docs/$',
getParentRoute: () => rootRouteImport,
} as any)
const AdminSettingsRoute = AdminSettingsRouteImport.update({
id: '/admin/settings',
path: '/admin/settings',
@@ -72,6 +84,8 @@ export interface FileRoutesByFullPath {
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs/': typeof DocsIndexRoute
'/user/profile': typeof authUserProfileRoute
'/user/resetpassword': typeof authUserResetpasswordRoute
'/user/signup': typeof authUserSignupRoute
@@ -83,6 +97,8 @@ export interface FileRoutesByTo {
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs': typeof DocsIndexRoute
'/user/profile': typeof authUserProfileRoute
'/user/resetpassword': typeof authUserResetpasswordRoute
'/user/signup': typeof authUserSignupRoute
@@ -95,6 +111,8 @@ export interface FileRoutesById {
'/admin/logs': typeof AdminLogsRoute
'/admin/notifications': typeof AdminNotificationsRoute
'/admin/settings': typeof AdminSettingsRoute
'/docs/$': typeof DocsSplatRoute
'/docs/': typeof DocsIndexRoute
'/(auth)/user/profile': typeof authUserProfileRoute
'/(auth)/user/resetpassword': typeof authUserResetpasswordRoute
'/(auth)/user/signup': typeof authUserSignupRoute
@@ -108,6 +126,8 @@ export interface FileRouteTypes {
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/docs/$'
| '/docs/'
| '/user/profile'
| '/user/resetpassword'
| '/user/signup'
@@ -119,6 +139,8 @@ export interface FileRouteTypes {
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/docs/$'
| '/docs'
| '/user/profile'
| '/user/resetpassword'
| '/user/signup'
@@ -130,6 +152,8 @@ export interface FileRouteTypes {
| '/admin/logs'
| '/admin/notifications'
| '/admin/settings'
| '/docs/$'
| '/docs/'
| '/(auth)/user/profile'
| '/(auth)/user/resetpassword'
| '/(auth)/user/signup'
@@ -142,6 +166,8 @@ export interface RootRouteChildren {
AdminLogsRoute: typeof AdminLogsRoute
AdminNotificationsRoute: typeof AdminNotificationsRoute
AdminSettingsRoute: typeof AdminSettingsRoute
DocsSplatRoute: typeof DocsSplatRoute
DocsIndexRoute: typeof DocsIndexRoute
authUserProfileRoute: typeof authUserProfileRoute
authUserResetpasswordRoute: typeof authUserResetpasswordRoute
authUserSignupRoute: typeof authUserSignupRoute
@@ -163,6 +189,20 @@ declare module '@tanstack/react-router' {
preLoaderRoute: typeof IndexRouteImport
parentRoute: typeof rootRouteImport
}
'/docs/': {
id: '/docs/'
path: '/docs'
fullPath: '/docs/'
preLoaderRoute: typeof DocsIndexRouteImport
parentRoute: typeof rootRouteImport
}
'/docs/$': {
id: '/docs/$'
path: '/docs/$'
fullPath: '/docs/$'
preLoaderRoute: typeof DocsSplatRouteImport
parentRoute: typeof rootRouteImport
}
'/admin/settings': {
id: '/admin/settings'
path: '/admin/settings'
@@ -222,6 +262,8 @@ const rootRouteChildren: RootRouteChildren = {
AdminLogsRoute: AdminLogsRoute,
AdminNotificationsRoute: AdminNotificationsRoute,
AdminSettingsRoute: AdminSettingsRoute,
DocsSplatRoute: DocsSplatRoute,
DocsIndexRoute: DocsIndexRoute,
authUserProfileRoute: authUserProfileRoute,
authUserResetpasswordRoute: authUserResetpasswordRoute,
authUserSignupRoute: authUserSignupRoute,

View File

@@ -0,0 +1,31 @@
import { createFileRoute, Link } from "@tanstack/react-router";
import { getDoc } from "../../lib/docs";
export const Route = createFileRoute("/docs/$")({
component: RouteComponent,
});
function RouteComponent() {
const { _splat } = Route.useParams();
const slug = _splat || "";
const Doc = getDoc(slug);
if (!Doc) {
return (
<div>
<p>
You have reached a doc page that does not seem to exist. Please
check the URL and try again.
</p>
<Link to="/docs">Docs Home</Link>
</div>
);
}
return (
<div className="mx-auto w-full max-w-4xl px-6 py-8">
<Doc />
</div>
);
}

View File

@@ -0,0 +1,100 @@
import { createFileRoute, Link } from "@tanstack/react-router";
export const Route = createFileRoute("/docs/")({
component: RouteComponent,
});
function RouteComponent() {
return (
<div className="mx-auto w-full max-w-4xl px-6 py-8">
<h1 className="text-3xl underline p-2">Logistics Support Tool Intro</h1>
<h2 className="text-2xl shadow-2xl p-2">What is LST</h2>
<p className="p-2">
LST is a logistics support tool and an aid to ALPLAprod. All data in
here should be treated as an aid only; everything can still be
completed manually in ALPLAprod. These docs show what LST has to
offer as well as the manual process via ALPLAprod.
</p>
<hr />
<h2 className="text-2xl shadow-2xl p-2">What does LST offer</h2>
<ul className="list-disc list-inside">
<li>One click print</li>
<ul className="list-disc list-inside indent-8">
<li>Controls printing of labels</li>
<li>Devices that can be used</li>
<ul className="list-disc list-inside indent-16">
<li>Printer control</li>
<li>PLC control</li>
<li>AME palletizer control</li>
</ul>
<li>Considers more business logic than ALPLAprod</li>
<ul className="list-disc list-inside indent-16">
<li>
Enough material must be in the system to create the next pallet
</li>
<li>The same applies to packaging.</li>
</ul>
<li>Special processes</li>
<ul className="list-disc list-inside indent-16">
<li>In-house delivery triggered once booked in</li>
<li>Stop gap on printing labels at specific times</li>
<li>Per-line delay in printing</li>
</ul>
</ul>
<li>Silos Management</li>
<ul className="list-disc list-inside indent-8">
<li>Silo adjustments per location</li>
<ul className="list-disc list-inside indent-16">
<li>Charts for the last 10 adjustments</li>
<li>Historical data</li>
<li>Comments per adjustment</li>
<li>Automatic email for more than 5% deviation</li>
</ul>
<li>Attach silo</li>
<ul className="list-disc list-inside indent-16">
<li>Only shows machines not attached to this silo</li>
</ul>
<li>Detach silo</li>
<ul className="list-disc list-inside indent-16">
<li>Only shows machines that are attached to the silo.</li>
</ul>
</ul>
<li>TMS integration</li>
<ul className="list-disc list-inside indent-8">
<li>Integration with TI to automatically add orders</li>
<ul className="list-disc list-inside indent-16">
<li>Orders are based on a time defined per plant.</li>
<li>Carriers can be set automatically.</li>
</ul>
</ul>
<li>
<Link
to={"/docs/$"}
params={{ _splat: "notifications/intro" }}
className="underline"
>
Notifications
</Link>
</li>
<ul className="list-disc list-inside indent-8">
<li>Automated alerts</li>
<li>Subscription based</li>
<li>Processes notifications</li>
</ul>
<li>Datamart</li>
<ul className="list-disc list-inside indent-8">
<li>Queries that can be pulled via Excel</li>
<li>Queries created to give the plants better views of their data</li>
<li>Faster customer reports</li>
</ul>
<li>Fake EDI (Demand Management)</li>
<ul className="list-disc list-inside indent-8">
<li>Orders in (standard template)</li>
<li>Customer-specific order templates per plant</li>
<li>Forecast (standard template)</li>
<li>Customer-specific forecast per plant</li>
</ul>
</ul>
</div>
);
}

View File

@@ -4,6 +4,7 @@ import { tanstackRouter } from "@tanstack/router-plugin/vite";
import react from "@vitejs/plugin-react-swc";
import { defineConfig } from "vite";
// https://vite.dev/config/
export default defineConfig({
plugins: [

20
lst_docs/.gitignore vendored
View File

@@ -1,20 +0,0 @@
# Dependencies
/node_modules
# Production
/build
# Generated files
.docusaurus
.cache-loader
# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local
npm-debug.log*
yarn-debug.log*
yarn-error.log*

View File

@@ -1,41 +0,0 @@
# Website
This website is built using [Docusaurus](https://docusaurus.io/), a modern static website generator.
## Installation
```bash
yarn
```
## Local Development
```bash
yarn start
```
This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
## Build
```bash
yarn build
```
This command generates static content into the `build` directory and can be served using any static contents hosting service.
## Deployment
Using SSH:
```bash
USE_SSH=true yarn deploy
```
Not using SSH:
```bash
GIT_USER=<Your GitHub username> yarn deploy
```
If you are using GitHub pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.

View File

@@ -1,12 +0,0 @@
---
slug: first-blog-post
title: First Blog Post
authors: [slorber, yangshun]
tags: [hola, docusaurus]
---
Lorem ipsum dolor sit amet...
<!-- truncate -->
...consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet

View File

@@ -1,44 +0,0 @@
---
slug: long-blog-post
title: Long Blog Post
authors: yangshun
tags: [hello, docusaurus]
---
This is the summary of a very long blog post,
Use a `<!--` `truncate` `-->` comment to limit blog post size in the list view.
<!-- truncate -->
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Pellentesque elementum dignissim ultricies. Fusce rhoncus ipsum tempor eros aliquam consequat. Lorem ipsum dolor sit amet

View File

@@ -1,24 +0,0 @@
---
slug: mdx-blog-post
title: MDX Blog Post
authors: [slorber]
tags: [docusaurus]
---
Blog posts support [Docusaurus Markdown features](https://docusaurus.io/docs/markdown-features), such as [MDX](https://mdxjs.com/).
:::tip
Use the power of React to create interactive blog posts.
:::
{/* truncate */}
For example, use JSX to create an interactive button:
```js
<button onClick={() => alert('button clicked!')}>Click me!</button>
```
<button onClick={() => alert('button clicked!')}>Click me!</button>

Binary file not shown.

Before

Width:  |  Height:  |  Size: 94 KiB

View File

@@ -1,29 +0,0 @@
---
slug: welcome
title: Welcome
authors: [slorber, yangshun]
tags: [facebook, hello, docusaurus]
---
[Docusaurus blogging features](https://docusaurus.io/docs/blog) are powered by the [blog plugin](https://docusaurus.io/docs/api/plugins/@docusaurus/plugin-content-blog).
Here are a few tips you might find useful.
<!-- truncate -->
Simply add Markdown files (or folders) to the `blog` directory.
Regular blog authors can be added to `authors.yml`.
The blog post date can be extracted from filenames, such as:
- `2019-05-30-welcome.md`
- `2019-05-30-welcome/index.md`
A blog post folder can be convenient to co-locate blog post images:
![Docusaurus Plushie](./docusaurus-plushie-banner.jpeg)
The blog supports tags as well!
**And if you don't want a blog**: just delete this directory, and use `blog: false` in your Docusaurus config.

View File

@@ -1,25 +0,0 @@
yangshun:
name: Yangshun Tay
title: Ex-Meta Staff Engineer, Co-founder GreatFrontEnd
url: https://linkedin.com/in/yangshun
image_url: https://github.com/yangshun.png
page: true
socials:
x: yangshunz
linkedin: yangshun
github: yangshun
newsletter: https://www.greatfrontend.com
slorber:
name: Sébastien Lorber
title: Docusaurus maintainer
url: https://sebastienlorber.com
image_url: https://github.com/slorber.png
page:
# customize the url of the author page at /blog/authors/<permalink>
permalink: '/all-sebastien-lorber-articles'
socials:
x: sebastienlorber
linkedin: sebastienlorber
github: slorber
newsletter: https://thisweekinreact.com

View File

@@ -1,19 +0,0 @@
facebook:
label: Facebook
permalink: /facebook
description: Facebook tag description
hello:
label: Hello
permalink: /hello
description: Hello tag description
docusaurus:
label: Docusaurus
permalink: /docusaurus
description: Docusaurus tag description
hola:
label: Hola
permalink: /hola
description: Hola tag description

View File

@@ -1,47 +0,0 @@
---
sidebar_position: 1
---
# Tutorial Intro
Let's discover **Docusaurus in less than 5 minutes**.
## Getting Started
Get started by **creating a new site**.
Or **try Docusaurus immediately** with **[docusaurus.new](https://docusaurus.new)**.
### What you'll need
- [Node.js](https://nodejs.org/en/download/) version 20.0 or above:
- When installing Node.js, you are recommended to check all checkboxes related to dependencies.
## Generate a new site
Generate a new Docusaurus site using the **classic template**.
The classic template will automatically be added to your project after you run the command:
```bash
npm init docusaurus@latest my-website classic
```
You can type this command into Command Prompt, Powershell, Terminal, or any other integrated terminal of your code editor.
The command also installs all necessary dependencies you need to run Docusaurus.
## Start your site
Run the development server:
```bash
cd my-website
npm run start
```
The `cd` command changes the directory you're working with. In order to work with your newly created Docusaurus site, you'll need to navigate the terminal there.
The `npm run start` command builds your website locally and serves it through a development server, ready for you to view at http://localhost:3000/.
Open `docs/intro.md` (this page) and edit some lines: the site **reloads automatically** and displays your changes.

View File

@@ -1,8 +0,0 @@
{
"label": "Tutorial - Basics",
"position": 2,
"link": {
"type": "generated-index",
"description": "5 minutes to learn the most important Docusaurus concepts."
}
}

View File

@@ -1,23 +0,0 @@
---
sidebar_position: 6
---
# Congratulations!
You have just learned the **basics of Docusaurus** and made some changes to the **initial template**.
Docusaurus has **much more to offer**!
Have **5 more minutes**? Take a look at **[versioning](../tutorial-extras/manage-docs-versions.md)** and **[i18n](../tutorial-extras/translate-your-site.md)**.
Anything **unclear** or **buggy** in this tutorial? [Please report it!](https://github.com/facebook/docusaurus/discussions/4610)
## What's next?
- Read the [official documentation](https://docusaurus.io/)
- Modify your site configuration with [`docusaurus.config.js`](https://docusaurus.io/docs/api/docusaurus-config)
- Add navbar and footer items with [`themeConfig`](https://docusaurus.io/docs/api/themes/configuration)
- Add a custom [Design and Layout](https://docusaurus.io/docs/styling-layout)
- Add a [search bar](https://docusaurus.io/docs/search)
- Find inspirations in the [Docusaurus showcase](https://docusaurus.io/showcase)
- Get involved in the [Docusaurus Community](https://docusaurus.io/community/support)

View File

@@ -1,34 +0,0 @@
---
sidebar_position: 3
---
# Create a Blog Post
Docusaurus creates a **page for each blog post**, but also a **blog index page**, a **tag system**, an **RSS** feed...
## Create your first Post
Create a file at `blog/2021-02-28-greetings.md`:
```md title="blog/2021-02-28-greetings.md"
---
slug: greetings
title: Greetings!
authors:
- name: Joel Marcey
title: Co-creator of Docusaurus 1
url: https://github.com/JoelMarcey
image_url: https://github.com/JoelMarcey.png
- name: Sébastien Lorber
title: Docusaurus maintainer
url: https://sebastienlorber.com
image_url: https://github.com/slorber.png
tags: [greetings]
---
Congratulations, you have made your first post!
Feel free to play around and edit this post as much as you like.
```
A new blog post is now available at [http://localhost:3000/blog/greetings](http://localhost:3000/blog/greetings).

View File

@@ -1,57 +0,0 @@
---
sidebar_position: 2
---
# Create a Document
Documents are **groups of pages** connected through:
- a **sidebar**
- **previous/next navigation**
- **versioning**
## Create your first Doc
Create a Markdown file at `docs/hello.md`:
```md title="docs/hello.md"
# Hello
This is my **first Docusaurus document**!
```
A new document is now available at [http://localhost:3000/docs/hello](http://localhost:3000/docs/hello).
## Configure the Sidebar
Docusaurus automatically **creates a sidebar** from the `docs` folder.
Add metadata to customize the sidebar label and position:
```md title="docs/hello.md" {1-4}
---
sidebar_label: 'Hi!'
sidebar_position: 3
---
# Hello
This is my **first Docusaurus document**!
```
It is also possible to create your sidebar explicitly in `sidebars.js`:
```js title="sidebars.js"
export default {
tutorialSidebar: [
'intro',
// highlight-next-line
'hello',
{
type: 'category',
label: 'Tutorial',
items: ['tutorial-basics/create-a-document'],
},
],
};
```

View File

@@ -1,43 +0,0 @@
---
sidebar_position: 1
---
# Create a Page
Add **Markdown or React** files to `src/pages` to create a **standalone page**:
- `src/pages/index.js``localhost:3000/`
- `src/pages/foo.md``localhost:3000/foo`
- `src/pages/foo/bar.js``localhost:3000/foo/bar`
## Create your first React Page
Create a file at `src/pages/my-react-page.js`:
```jsx title="src/pages/my-react-page.js"
import React from 'react';
import Layout from '@theme/Layout';
export default function MyReactPage() {
return (
<Layout>
<h1>My React page</h1>
<p>This is a React page</p>
</Layout>
);
}
```
A new page is now available at [http://localhost:3000/my-react-page](http://localhost:3000/my-react-page).
## Create your first Markdown Page
Create a file at `src/pages/my-markdown-page.md`:
```mdx title="src/pages/my-markdown-page.md"
# My Markdown page
This is a Markdown page
```
A new page is now available at [http://localhost:3000/my-markdown-page](http://localhost:3000/my-markdown-page).

View File

@@ -1,31 +0,0 @@
---
sidebar_position: 5
---
# Deploy your site
Docusaurus is a **static-site-generator** (also called **[Jamstack](https://jamstack.org/)**).
It builds your site as simple **static HTML, JavaScript and CSS files**.
## Build your site
Build your site **for production**:
```bash
npm run build
```
The static files are generated in the `build` folder.
## Deploy your site
Test your production build locally:
```bash
npm run serve
```
The `build` folder is now served at [http://localhost:3000/](http://localhost:3000/).
You can now deploy the `build` folder **almost anywhere** easily, **for free** or very small cost (read the **[Deployment Guide](https://docusaurus.io/docs/deployment)**).

View File

@@ -1,152 +0,0 @@
---
sidebar_position: 4
---
# Markdown Features
Docusaurus supports **[Markdown](https://daringfireball.net/projects/markdown/syntax)** and a few **additional features**.
## Front Matter
Markdown documents have metadata at the top called [Front Matter](https://jekyllrb.com/docs/front-matter/):
```text title="my-doc.md"
// highlight-start
---
id: my-doc-id
title: My document title
description: My document description
slug: /my-custom-url
---
// highlight-end
## Markdown heading
Markdown text with [links](./hello.md)
```
## Links
Regular Markdown links are supported, using url paths or relative file paths.
```md
Let's see how to [Create a page](/create-a-page).
```
```md
Let's see how to [Create a page](./create-a-page.md).
```
**Result:** Let's see how to [Create a page](./create-a-page.md).
## Images
Regular Markdown images are supported.
You can use absolute paths to reference images in the static directory (`static/img/docusaurus.png`):
```md
![Docusaurus logo](/img/docusaurus.png)
```
![Docusaurus logo](/img/docusaurus.png)
You can reference images relative to the current file as well. This is particularly useful to colocate images close to the Markdown files using them:
```md
![Docusaurus logo](./img/docusaurus.png)
```
## Code Blocks
Markdown code blocks are supported with Syntax highlighting.
````md
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return <h1>Hello, Docusaurus!</h1>;
}
```
````
```jsx title="src/components/HelloDocusaurus.js"
function HelloDocusaurus() {
return <h1>Hello, Docusaurus!</h1>;
}
```
## Admonitions
Docusaurus has a special syntax to create admonitions and callouts:
```md
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
```
:::tip My tip
Use this awesome feature option
:::
:::danger Take care
This action is dangerous
:::
## MDX and React Components
[MDX](https://mdxjs.com/) can make your documentation more **interactive** and allows using any **React components inside Markdown**:
```jsx
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`)
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !
```
export const Highlight = ({children, color}) => (
<span
style={{
backgroundColor: color,
borderRadius: '20px',
color: '#fff',
padding: '10px',
cursor: 'pointer',
}}
onClick={() => {
alert(`You clicked the color ${color} with label ${children}`);
}}>
{children}
</span>
);
This is <Highlight color="#25c2a0">Docusaurus green</Highlight> !
This is <Highlight color="#1877F2">Facebook blue</Highlight> !

View File

@@ -1,7 +0,0 @@
{
"label": "Tutorial - Extras",
"position": 3,
"link": {
"type": "generated-index"
}
}

Binary file not shown.

Before

Width:  |  Height:  |  Size: 25 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 27 KiB

View File

@@ -1,55 +0,0 @@
---
sidebar_position: 1
---
# Manage Docs Versions
Docusaurus can manage multiple versions of your docs.
## Create a docs version
Release a version 1.0 of your project:
```bash
npm run docusaurus docs:version 1.0
```
The `docs` folder is copied into `versioned_docs/version-1.0` and `versions.json` is created.
Your docs now have 2 versions:
- `1.0` at `http://localhost:3000/docs/` for the version 1.0 docs
- `current` at `http://localhost:3000/docs/next/` for the **upcoming, unreleased docs**
## Add a Version Dropdown
To navigate seamlessly across versions, add a version dropdown.
Modify the `docusaurus.config.js` file:
```js title="docusaurus.config.js"
export default {
themeConfig: {
navbar: {
items: [
// highlight-start
{
type: 'docsVersionDropdown',
},
// highlight-end
],
},
},
};
```
The docs version dropdown appears in your navbar:
![Docs Version Dropdown](./img/docsVersionDropdown.png)
## Update an existing version
It is possible to edit versioned docs in their respective folder:
- `versioned_docs/version-1.0/hello.md` updates `http://localhost:3000/docs/hello`
- `docs/hello.md` updates `http://localhost:3000/docs/next/hello`

Some files were not shown because too many files have changed in this diff.