Compare commits

..

16 Commits

SHA1 Message Date
26ceb95dae refactor(finance query): changes to include article type 2025-06-20 11:19:15 -05:00
bd5bad3cba refactor(server data): changes to bet and stp1 2025-06-20 11:18:55 -05:00
c5bd5a7c0a feat(notify): shortage bookings based on time and article type 2025-06-20 11:18:37 -05:00
095d724e65 feat(notifcations): pallets booked as waste brought back in via cycle count 2025-06-20 11:18:13 -05:00
7df512acaa feat(logistics): get sscc based on runnning number (returns the sscc based on the running number, more for support and testing the endpoints) 2025-06-20 11:17:39 -05:00
3073df342f refactor(datamart): changes to customer inventory to include specific whse 2025-06-20 11:16:19 -05:00
b9ff0a4138 feat(default accounts): added in a default account run this will create a bunch of system admins 2025-06-20 11:15:39 -05:00
c555172d68 ci(release): bump build number to 426 2025-06-20 08:21:14 -05:00
e432f0b3ae ci(release): bump build number to 425 2025-06-19 16:36:04 -05:00
bc25e835b4 ci(release): bump build number to 424 2025-06-19 07:17:13 -05:00
a71167e598 ci(release): bump build number to 423 2025-06-18 12:16:05 -05:00
53945402ce ci(release): bump build number to 422 2025-06-17 09:01:52 -05:00
aa42819cc1 ci(release): bump build number to 421 2025-06-17 08:57:57 -05:00
c36e4e66b3 ci(release): bump build number to 420 2025-06-17 07:24:11 -05:00
11ae3cc0bc ci(release): bump build number to 419 2025-06-16 19:06:11 -05:00
f80e742e27 chore(release): 2.23.0 2025-06-16 19:04:57 -05:00
20 changed files with 612 additions and 13 deletions

View File

@@ -1,5 +1,49 @@
# All CHanges to LST can be found below.
## [2.23.0](https://git.tuffraid.net/cowch/lstV2/compare/v2.22.0...v2.23.0) (2025-06-17)
### 📝 Testing Code
* **added in commands:** relocate and updated to remove ([e865c1d](https://git.tuffraid.net/cowch/lstV2/commits/e865c1dcaf8e9ee3710f8d9f65e95118bb3fa3e7))
* **return preforms:** start to preform return ([3283972](https://git.tuffraid.net/cowch/lstV2/commits/328397280954fbf6668e2014b0ce28c885ee417d))
### 🐛 Bug fixes
* **datamart:** fix for delivery date by range error for hardcoded date ([7b60ddc](https://git.tuffraid.net/cowch/lstV2/commits/7b60ddcadf0b6b6e9c9d4e7efd928cf48bcfee0f))
* **forecast:** changes to png to allow different address id for the portal ([e1332e7](https://git.tuffraid.net/cowch/lstV2/commits/e1332e754a163bfc882ffb54e6e2bd741579fad1)), closes [#21](https://git.tuffraid.net/cowch/lstV2/issues/21)
* **lots:** corrections to role issue where user logged in but no role set yet ([68d7527](https://git.tuffraid.net/cowch/lstV2/commits/68d75277c2490cd25c6f52dd18663fba0c5cd932))
* **notification template:** forgot to remove a end time field data ([12d0c69](https://git.tuffraid.net/cowch/lstV2/commits/12d0c6923d1b177d60bf34b5c70168d0e7eb982d))
* **produser:** changes to get correct response from errors ([3c45010](https://git.tuffraid.net/cowch/lstV2/commits/3c45010b268ba9aaa0b0b165a8ac06d7d58de9ad))
* **removed roles:** changes to remove the roles and use userRoles instead ([826c44c](https://git.tuffraid.net/cowch/lstV2/commits/826c44c9967130f26ddbcfc2e8f4d43673819a71))
### 🌟 Enhancements
* **command log:** added new log for all commands used ([aadf255](https://git.tuffraid.net/cowch/lstV2/commits/aadf255e343bb10bed399f6db20671b15e88ce8a))
* **common commands:** added in a common commands just the barecode ([6156a1a](https://git.tuffraid.net/cowch/lstV2/commits/6156a1a5bb6ce2dfd160a7877c4cb1d8c91fac3f))
* **datamart:** finance inventory audit added ([90be86d](https://git.tuffraid.net/cowch/lstV2/commits/90be86d972d4660d87e0b8f41e7f2b1a22838e9c))
* **dotnet:** added in wrapper so we could run in iis for ssl :D ([84aacd5](https://git.tuffraid.net/cowch/lstV2/commits/84aacd5b71b0efab36eb91c0ba7f2d362dd6b736))
* **helpercommands:** added in helpercommnands link ([078c2ec](https://git.tuffraid.net/cowch/lstV2/commits/078c2ec12f3a08094d661d1839373b9adf00f07d))
* **new command:** helper command to remove as non reusable pallets ([353960b](https://git.tuffraid.net/cowch/lstV2/commits/353960bd267e55b24c59a4a654cc37411c9986c7))
* **nofitication:** bow2 henkel orders ([ab23dcd](https://git.tuffraid.net/cowch/lstV2/commits/ab23dcdfb81fddf6881ddaa17111c8115c7002b0))
* **prod roles:** added in quality tech and plant manager ([292eb32](https://git.tuffraid.net/cowch/lstV2/commits/292eb324c549c6febc58af8b0a587d241d458c9e))
* **produser:** added in prodSupervisor to the list ([136bf98](https://git.tuffraid.net/cowch/lstV2/commits/136bf9820d6df837c1db37b4e38bcdd9375730d3))
* **produser:** added in the function to create a standard user based on there username ([99ad79c](https://git.tuffraid.net/cowch/lstV2/commits/99ad79c662cfeddd78bc86792852e8e97b14e287))
### 🛠️ Code Refactor
* **bookin card:** changes to move the button to the right side ([ed77743](https://git.tuffraid.net/cowch/lstV2/commits/ed777437eb6b5a9d8b179c8182883dac3c731683))
* **command log:** added in the command log tracking into the 3 we currently have ([0caf809](https://git.tuffraid.net/cowch/lstV2/commits/0caf8094de588e64b3dcda99d023d39e62bbbf5a))
* **forcast:** changes to consider a plants different addresses ([92c8fc2](https://git.tuffraid.net/cowch/lstV2/commits/92c8fc25544f4b56fde1348d7040498bd650dcb1))
* **ocme:** changes in the pickup to no longer handle all but specific area ([d32289c](https://git.tuffraid.net/cowch/lstV2/commits/d32289c8337eed71d5abb29d0dd8c8741af52ac9))
* **ocme:** picked up per location now vs picked up all ([37f2518](https://git.tuffraid.net/cowch/lstV2/commits/37f2518589935788c142965dff1ff5b9fb6ab902))
* **picked up pallets:** changes to start picking up only area from spot ([f79ff26](https://git.tuffraid.net/cowch/lstV2/commits/f79ff26958c7f3e4560da73d2a5fb0fd9a6443e6))
* **printers:** added in processes for the upcoming reprint function ([1cb285b](https://git.tuffraid.net/cowch/lstV2/commits/1cb285bea86828d0f3fa7cc42a25495681b0e336))
* **serverdata:** changes to limas tms data and activation ([51e6864](https://git.tuffraid.net/cowch/lstV2/commits/51e68648681981aee393ee9c1a729534b9a3983a))
## [2.22.0](https://git.tuffraid.net/cowch/lstV2/compare/v2.21.0...v2.22.0) (2025-06-10)

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
  "name": "lstv2",
-  "version": "2.22.0",
+  "version": "2.23.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "lstv2",
-      "version": "2.22.0",
+      "version": "2.23.0",
      "dependencies": {
        "@dotenvx/dotenvx": "^1.39.0",
        "@hono/node-server": "^1.14.0",

View File

@@ -1,6 +1,6 @@
{
  "name": "lstv2",
-  "version": "2.22.0",
+  "version": "2.23.0",
  "type": "module",
  "scripts": {
    "dev": "concurrently -n \"server,frontend\" -c \"#007755,#2f6da3\" \"npm run dev:server\" \"cd frontend && npm run dev\"",
@@ -36,7 +36,7 @@
    }
  },
  "admConfig": {
-    "build": 418,
+    "build": 426,
    "oldBuild": "backend-0.1.3.zip"
  },
  "devDependencies": {

View File

@@ -10,6 +10,7 @@ import createUser from "./routes/userAdmin/createUser.js";
import allUsers from "./routes/userAdmin/getUsers.js";
import updateUser from "./routes/userAdmin/updateUser.js";
import allUserRoles from "./routes/userAdmin/getAllUserRoles.js";
+import { massAccountCreation } from "./utils/DefaultAccountCreation.js";

const app = new OpenAPIHono();

@@ -36,4 +37,8 @@ const appRoutes = routes.forEach((route) => {
  app.route("/auth", route);
});

+// setTimeout(() => {
+//   massAccountCreation();
+// }, 1000 * 60);

export default app;

View File

@@ -0,0 +1,56 @@
import { db } from "../../../../database/dbclient.js";
import { users } from "../../../../database/schema/users.js";
import { tryCatch } from "../../../globalUtils/tryCatch.js";
import { createLog } from "../../logger/logger.js";
import { setSysAdmin } from "../controllers/userRoles/setSysAdmin.js";
import { createPassword } from "./createPassword.js";
export const massAccountCreation = async () => {
  /**
   * This will create a new account for all users before if they are already in there it will update just there password.
   *
   */
  const user: any = [
    // {
    //   username: "landa002",
    //   email: "Oscar.Landa@alpla.com",
    //   password: "Frostlike-Petri5-Ungreased!",
    // },
  ];
  for (let i = 0; i < user.length; i++) {
    const updatedUser = {
      username: user[i].username,
      email: user[i].email,
      password: await createPassword(user[i].password),
    };
    const { data, error } = await tryCatch(
      db
        .insert(users)
        .values(updatedUser)
        .onConflictDoUpdate({
          target: users.username,
          set: {
            password: updatedUser.password,
            email: updatedUser.email,
          },
        })
        .returning({
          user_id: users.user_id,
          username: users.username,
        })
    );
    await setSysAdmin(data, "systemAdmin");
    if (error) {
      createLog(
        "error",
        "lst",
        "auth",
        `There was an error creating ${user[i].username}`
      );
    }
  }
};
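The `onConflictDoUpdate` call above implements an upsert keyed on the unique username: insert the row, or, on conflict, update only the password and email. A minimal dependency-free sketch of those semantics (plain `Map`, no database; `upsertUser` is a hypothetical stand-in, not part of the diff):

```typescript
// Upsert-by-username semantics, modeled with an in-memory table.
type User = { username: string; email: string; password: string };

const table = new Map<string, User>(); // keyed by the unique username

function upsertUser(u: User): User {
  const existing = table.get(u.username);
  if (existing) {
    // conflict on username -> update just the password and email
    existing.password = u.password;
    existing.email = u.email;
    return existing;
  }
  table.set(u.username, { ...u });
  return table.get(u.username)!;
}
```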

View File

@@ -1,14 +1,23 @@
import { query } from "../../sqlServer/prodSqlServer.js";
import { customerInvNoHold } from "../../sqlServer/querys/dataMart/customerInventoryQuerys.js";

-export const getCurrentCustomerInv = async (customer: any | null) => {
+export const getCurrentCustomerInv = async (data: any | null) => {
+  //console.log(data.customer[0]);
  let updatedQuery = customerInvNoHold;
-  if (customer) {
+  if (data.customer) {
    //console.log(data.customer);
    updatedQuery = customerInvNoHold.replaceAll(
      "--and IdAdressen",
-      `and IdAdressen = ${customer}`
+      `and IdAdressen = ${data.customer[0]}`
+    );
+  }
+  if (data.whseToInclude) {
+    updatedQuery = updatedQuery.replaceAll(
+      "--and x.IdWarenlager in (14,15)",
+      `and x.IdWarenlager in (${data.whseToInclude[0]})`
    );
  }
  try {

View File

@@ -28,8 +28,8 @@ const current: any = [
    name: "getCustomerInventory",
    endpoint: "/api/datamart/getcustomerinventory",
    description:
-      "Returns specific customer inventory based on there address ID.",
+      "Returns specific customer inventory based on there address ID, with optional to include warehouses, IE 36,41,5. leaving warehouse blank will just pull everything",
-    criteria: "customer",
+    criteria: "customer,whseToInclude",
  },
  // {
  //   name: "getPalletLabels",

View File

@@ -25,12 +25,12 @@ app.openapi(
    responses: responses(),
  }),
  async (c) => {
-    const customer: string = c.req.query("customer") ?? "";
+    const customerData: any = c.req.queries();
    // make sure we have a vaid user being accessed thats really logged in
    apiHit(c, { endpoint: "/getcustomerinventory" });
    const { data, error } = await tryCatch(
-      getCurrentCustomerInv(customer ? customer : null)
+      getCurrentCustomerInv(customerData ? customerData : null)
    );
    if (error) {
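The route switches from `c.req.query("customer")` (one string) to `c.req.queries()` (all parameters). In Hono, `queries()` returns every query parameter as a string array, which is why the controller indexes `data.customer[0]` and `data.whseToInclude[0]`. An illustration of that shape with a plain object (no Hono import; the request URL is hypothetical):

```typescript
// Shape Hono's c.req.queries() would return for
// GET /getcustomerinventory?customer=123&whseToInclude=36,41,5
const customerData: Record<string, string[]> = {
  customer: ["123"],
  whseToInclude: ["36,41,5"],
};

// The controller reads the first value of each parameter:
const customerId = customerData.customer?.[0] ?? null;
const whse = customerData.whseToInclude?.[0] ?? null;
```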

View File

@@ -20,6 +20,7 @@ import { runHistoricalData } from "./controller/eom/historicalInv.js";
import intervalChecks from "./route/getActiveLogistics.js";
import getActiveLanes from "./route/getActiveLanes.js";
import removeAsNonReable from "./route/removeAsNonReusable.js";
+import getSSCC from "./route/getSSCCNumber.js";

const app = new OpenAPIHono();

@@ -49,6 +50,7 @@ const routes = [
  // logisitcs
  removeAsNonReable,
+  getSSCC,
] as const;

// app.route("/server", modules);

View File

@@ -0,0 +1,76 @@
import { createRoute, OpenAPIHono, z } from "@hono/zod-openapi";
import { responses } from "../../../globalUtils/routeDefs/responses.js";
import { tryCatch } from "../../../globalUtils/tryCatch.js";
import { getCycleCountCheck } from "../controller/warehouse/cycleCountChecks/getCycleCountCheck.js";
import { getPPOO } from "../controller/warehouse/ppoo/getPPOO.js";
import { apiHit } from "../../../globalUtils/apiHits.js";
import { createSSCC } from "../../../globalUtils/createSSCC.js";
const app = new OpenAPIHono();
// const Body = z
// .object({
// age: z.number().optional().openapi({ example: 90 }),
// //email: z.string().optional().openapi({example: "s.smith@example.com"}),
// type: z.string().optional().openapi({ example: "fg" }),
// })
// .openapi("User");
app.openapi(
  createRoute({
    tags: ["logistics"],
    summary: "Returns returns sscc based on running number",
    method: "post",
    path: "/getsscc",
    // request: {
    //   body: {
    //     content: {
    //       "application/json": { schema: Body },
    //     },
    //   },
    // },
    // description:
    //   "Provided a running number and lot number you can consume material.",
    responses: responses(),
  }),
  async (c: any) => {
    apiHit(c, { endpoint: "/getsscc" });
    const { data, error: bodyError } = (await tryCatch(
      c.req.json()
    )) as any;
    if (bodyError) {
      return c.json({
        success: false,
        message: "Missing critical data.",
        data: bodyError,
      });
    }
    if (!data.runningNr) {
      return c.json({
        success: false,
        message: "Missing critical data.",
        data: [],
      });
    }
    const { data: sscc, error } = await tryCatch(
      createSSCC(data.runningNr)
    );
    if (error) {
      return c.json({
        success: false,
        message: "Error creating sscc.",
        data: error,
      });
    }
    return c.json({
      success: true,
      message: "SSCC",
      data: sscc,
    });
  }
);
export default app;
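`createSSCC` itself is not part of this diff, so its exact logic is unknown. For reference, a GS1 SSCC-18 ends in a mod-10 check digit computed over the preceding 17 digits; a sketch of that standard calculation (an assumption about what the helper may involve, not the repo's implementation):

```typescript
// GS1 mod-10 check digit: weight the 17 data digits 3,1,3,... from the right,
// then take the amount needed to reach the next multiple of 10.
function gs1CheckDigit(data17: string): number {
  if (!/^\d{17}$/.test(data17)) throw new Error("expected 17 digits");
  let sum = 0;
  const digits = data17.split("").reverse();
  for (let i = 0; i < digits.length; i++) {
    const weight = i % 2 === 0 ? 3 : 1; // rightmost data digit gets weight 3
    sum += Number(digits[i]) * weight;
  }
  return (10 - (sum % 10)) % 10;
}
```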

View File

@@ -0,0 +1,107 @@
import { eq, sql } from "drizzle-orm";
import { db } from "../../../../../database/dbclient.js";
import { notifications } from "../../../../../database/schema/notifications.js";
import { tryCatch } from "../../../../globalUtils/tryCatch.js";
import { createLog } from "../../../logger/logger.js";
import { query } from "../../../sqlServer/prodSqlServer.js";
import { sendEmail } from "../sendMail.js";
import { palletsRemovedAswaste } from "../../../sqlServer/querys/notifications/palletsRemovedAsWaste.js";
import { format } from "date-fns-tz";
export interface Labels {
  IdEtikettenHistorie?: number;
}

const notification = async (notifyData: any) => {
  /**
   * Pass the entire notification over
   */
  createLog(
    "info",
    "wastebooking",
    "notify",
    `monitoring ${notifyData.name}`
  );
  // validate if there are any emails.
  if (notifyData.emails === "") {
    createLog(
      "error",
      "reprinting",
      "notify",
      `There are no emails set for ${notifyData.name}`
    );
    return;
  }
  const { data: l, error: palletError } = await tryCatch(
    query(palletsRemovedAswaste, "Removed as waste check")
  );
  const pallets: any = l?.data as any;
  if (palletError) {
    createLog(
      "error",
      "reprinting",
      "notify",
      `Failed to get the labels: ${palletError}`
    );
    return;
  }
  console.log(pallets);
  if (pallets.length > 0) {
    //send the email :D
    const emailSetup = {
      email: notifyData.emails,
      subject: `Alert! ${
        pallets.length > 1 ? "Some pallets were" : "A pallet was "
      } brought back in`,
      template: "palletBookedAsWaste",
      context: {
        items: pallets.map((i: any) => {
          return {
            ...i,
            lastMovingDate: format(i.lastMovingDate, "M/d/yyyy"),
          };
        }),
      },
    };
    const sentEmail = await sendEmail(emailSetup);
    if (!sentEmail.success) {
      createLog(
        "error",
        "reprinting",
        "notify",
        "Failed to send email, will try again on next interval"
      );
      return;
    }
    // // update the last time we ran and the prod id
    // const notifUpdate = {
    //   prodID: labels[0].IdEtikettenHistorie,
    //   lastRan: nowDate(),
    // };
    // update the last time ran
    const { data, error } = await tryCatch(
      db
        .update(notifications)
        .set({
          lastRan: sql`NOW()`,
          notifiySettings: {
            ...notifyData.notifiySettings,
            prodID: pallets[0].runningnumber,
          },
        })
        .where(eq(notifications.name, notifyData.name))
    );
  } else {
    return;
  }
};
export default notification;

View File

@@ -0,0 +1,109 @@
import { eq, sql } from "drizzle-orm";
import { db } from "../../../../../database/dbclient.js";
import { notifications } from "../../../../../database/schema/notifications.js";
import { tryCatch } from "../../../../globalUtils/tryCatch.js";
import { createLog } from "../../../logger/logger.js";
import { query } from "../../../sqlServer/prodSqlServer.js";
import { sendEmail } from "../sendMail.js";
import { format } from "date-fns-tz";
import { shortageBookings } from "../../../sqlServer/querys/notifications/shortageBookings.js";
export interface Labels {
  IdEtikettenHistorie?: number;
}

const notification = async (notifyData: any) => {
  /**
   * Pass the entire notification over
   */
  createLog(
    "info",
    "wastebooking",
    "notify",
    `monitoring ${notifyData.name}`
  );
  // validate if there are any emails.
  if (notifyData.emails === "") {
    createLog(
      "error",
      "reprinting",
      "notify",
      `There are no emails set for ${notifyData.name}`
    );
    return;
  }
  //console.log(notifyData);
  // update the settings so we have everything we need
  let updatedQuery = shortageBookings
    .replace("[time]", notifyData?.notifiySettings.time)
    .replace("[type]", notifyData?.notifiySettings.type)
    .replace("[avType]", notifyData?.notifiySettings.avType);
  const { data: l, error: shortageError } = await tryCatch(
    query(updatedQuery, "Removed as waste check")
  );
  const pallets: any = l?.data as any;
  //console.log(updatedQuery);
  //console.log(pallets);
  if (shortageError) {
    createLog(
      "error",
      "reprinting",
      "notify",
      `Failed to get the labels: ${shortageError}`
    );
    return;
  }
  if (pallets.length > 0) {
    //send the email :D
    const emailSetup = {
      email: notifyData.emails,
      subject: `Alert! New shortage booking as been completed in the last ${notifyData?.notifiySettings.time} min`,
      template: "shortageBookings",
      context: {
        items: pallets.map((i: any) => {
          return {
            ...i,
            bookingDate: format(i.bookingDate, "M/d/yyyy"),
          };
        }),
      },
    };
    const sentEmail = await sendEmail(emailSetup);
    if (!sentEmail.success) {
      createLog(
        "error",
        "reprinting",
        "notify",
        "Failed to send email, will try again on next interval"
      );
      return;
    }
    // // update the last time we ran and the prod id
    // const notifUpdate = {
    //   prodID: labels[0].IdEtikettenHistorie,
    //   lastRan: nowDate(),
    // };
    // update the last time ran
    const { data, error } = await tryCatch(
      db
        .update(notifications)
        .set({
          lastRan: sql`NOW()`,
        })
        .where(eq(notifications.name, notifyData.name))
    );
  } else {
    return;
  }
};
export default notification;

View File

@@ -117,6 +117,30 @@ export const note: any = [
    active: false,
    notifiySettings: { processTime: 15 },
  },
+  {
+    name: "palletsRemovedAsWaste",
+    description:
+      "Validates stock to make sure, there are no pallets released that have been removed as waste already ",
+    checkInterval: 15,
+    timeType: "min",
+    emails: "blake.matthes@alpla.com",
+    active: false,
+    notifiySettings: { prodID: 1 },
+  },
+  {
+    name: "shortageBookings",
+    description:
+      "Checks for material shortage bookings by single av type or all types ",
+    checkInterval: 15,
+    timeType: "min",
+    emails: "blake.matthes@alpla.com",
+    active: false,
+    notifiySettings: {
+      time: 15,
+      type: "all", // change this to something else or leave blank to use the av type
+      avType: 1,
+    },
+  },
];

export const notificationCreate = async () => {

View File

@@ -0,0 +1,44 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    {{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
    {{> styles}}
  </head>
  <body>
    <p>All,</p>
    <p>The below labels have been brought back into the system either by relocate or by inventory taking order, please validate these labels and reblock them or reremove them.</p>
    <table>
      <thead>
        <tr>
          <th>AV</th>
          <th>Desciption</th>
          <th>Label Number</th>
          <th>Last Moving Date</th>
        </tr>
      </thead>
      <tbody>
        {{#each items}}
        <tr>
          <td>{{av}}</td>
          <td>{{alias}}</td>
          <td>{{runningnumber}}</td>
          <td>{{lastMovingDate}}</td>
        </tr>
        {{/each}}
      </tbody>
    </table>
    <div>
      <p>For a removal process logistcs will need to do this in lst so a reason for the removal can be added.</p>
    </div>
    <div>
      <p>Thank you,</p>
      <p>LST Team</p>
    </div>
  </body>
</html>

View File

@@ -0,0 +1,60 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    {{!-- <link rel="stylesheet" href="styles/styles.css" /> --}}
    {{> styles}}
  </head>
  <body>
    <p>All,</p>
    <p>$shortage bookings were just done on the below pallet(s). </p>
    <table>
      <thead>
        <tr>
          <th>Material AV</th>
          <th>Material Alias</th>
          <th>Production Lot</th>
          <th>Production Pallet Running number</th>
          <th>Machine</th>
          <th>Machine Name</th>
          <th>Quantity Shorted</th>
          <th>Shortage Date</th>
        </tr>
      </thead>
      <tbody>
        {{#each items}}
        <tr>
          <td>{{materialAV}}</td>
          <td>{{materialAlias}}</td>
          <td>{{productionlot}}</td>
          <td>{{palletWithShortBookings}}</td>
          <td>{{machineNumber}}</td>
          <td>{{machineAlias}}</td>
          <td>{{qtyShortpcs}}</td>
          <td>{{bookingDate}}</td>
        </tr>
        {{/each}}
      </tbody>
    </table>
    <div>
      <p>This can be corrected by following the below simple instructions.</p>
      <ol type="1">
        <li>Bring the pallet back to PPOO</li>
        <li>Book out the pallet</li>
        <li>Make the corrections to stock for the above materials/packaging missing</li>
        <li>Book the pallet back in.</li>
      </ol>
      <br/>
      <p>For further instructions please reach out to regional support via helpdesk ticket</p>
    </div>
    <div>
      <p>Thank you,</p>
      <p>LST Team</p>
    </div>
  </body>
</html>

View File

@@ -38,7 +38,11 @@
    "oldVersion": "E:\\LST\\lst_backend",
    "shippingHours": "[{\"early\": \"06:30\", \"late\": \"23:00\"}]",
    "tiPostTime": "[{\"from\": \"24\", \"to\": \"24\"}]",
-    "otherSettings": [{ "specialInstructions": "" }]
+    "otherSettings": [
+      {
+        "specialInstructions": "PLEASE CONTACT ShippingReceivingBethlehem@groups.alpla.com WITH ANY QUESTIONS"
+      }
+    ]
  },
  {
    "sName": "Huston",

@@ -345,7 +349,7 @@
    "active": true,
    "serverLoc": "E:\\LST\\lstv2",
    "oldVersion": "E:\\LST\\lst_backend",
-    "shippingHours": "[{\"early\": \"06:30\", \"late\": \"23:00\"}]",
+    "shippingHours": "[{\"early\": \"00:01\", \"late\": \"23:59\"}]",
    "tiPostTime": "[{\"from\": \"24\", \"to\": \"24\"}]",
    "otherSettings": [
      { "specialInstructions": "Loadbars/Straps required." }

View File

@@ -32,6 +32,7 @@ The data below will be controlled by the user in excell by default everything wi
where IdArtikelTyp = 1
and x.IdWarenlager not in (6, 1)
--and IdAdressen
+--and x.IdWarenlager in (14,15)
group by x.IdArtikelVarianten

View File

@@ -6,6 +6,7 @@ select
b.IdArtikelVarianten
,ArtikelVariantenAlias
,ArtikelVariantenBez
+,a.Bezeichnung as articleType
,sum(EinlagerungsMengeVPKSum) totalPal
,sum(EinlagerungsMengeSum) totalPieces
--,ProduktionsDatumMin

@@ -41,6 +42,13 @@ from T_HistoryEK (nolock) )x
where rn = 1) sp on
sp.IdArtikelvarianten = b.IdArtikelVarianten
+/* article type */
+left join
+T_Artikeltyp (nolock) a
+on a.IdArtikelTyp = b.IdArtikelTyp
where IdWarenlager not in (1,5,6)
and ProduktionsDatumMin < '[date]' -- '2025-05-31'

@@ -50,6 +58,8 @@ group by b.IdArtikelVarianten
,convert(date, ProduktionsDatumMin, 111)
,pp.VKPreis
,sp.EKPreis
+,a.Bezeichnung
order by IdArtikelVarianten
`;

View File

@@ -0,0 +1,17 @@
export const palletsRemovedAswaste = `
select * from (select IdArtikelVarianten as av
,ArtikelVariantenAlias as alias
,Lfdnr as runningnumber
,case when GesperrtAktivSum = 1 then 'Blocked' else 'Released' end as palletStatus
,BewegungsDatumMax as lastMovingDate
--,*
from AlplaPROD_test1.dbo.V_LagerPositionenBarcodes (nolock) )x
where runningnumber in (
SELECT
[HumanReadableId]
FROM [test1_AlplaPROD2.0_Reporting].[reporting_blocking].[BlockedItem] (nolock)
where state = 4
) and palletStatus = 'Released'
`;

View File

@@ -0,0 +1,31 @@
export const shortageBookings = `
use AlplaPROD_test1
Declare @range int = [time] -- change this to be range in minutues you want to monitor, this shouldnt be more than the interval check so we do not see duplicates
declare @avType nvarchar(3) = '[type]' --change to blank or single to have specific ones if all the type is ignored
declare @avTypeID NVARCHAR(MAX) = '[avType]' -- this can only be 1 article now.
select
IdArtikelVarianten as materialAV
,IdArtikelTyp
,ArtikelTypBez
,ArtikelVariantenAlias as materialAlias
,CAST(Menge as varchar) as qtyShortpcs
,ProduktionsLos as productionlot
,LEFT(PARSE(Right(barcode, 39) as int), LEN(PARSE(Right(barcode, 39)as int)) - 1) as palletWithShortBookings
,m.Standort as machineNumber
,m.Bezeichnung as machineAlias
,Buchungsdatum as bookingDate
from [dbo].[V_LagerBuchungen] (nolock) s
left join
dbo.T_Maschine (nolock) as m
on m.IdMaschine = s.IdMaschine
where beleg like '%$Sho%' and s.Add_Date > DATEADD(MINUTE, -@range, getdate())
and (@avType = 'all' or IdArtikelTyp in (@avTypeID))
order by ProduktionsLos
`;
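The notify job fills this template by string substitution: `notifiySettings` carries the monitoring window and av-type filter, and `.replace()` swaps each bracketed placeholder (each appears exactly once in the template). A self-contained sketch of that step, with the template abbreviated to just the three declare lines:

```typescript
// Abbreviated stand-in for the shortageBookings template above.
const shortageBookingsTemplate =
  "Declare @range int = [time]\n" +
  "declare @avType nvarchar(3) = '[type]'\n" +
  "declare @avTypeID NVARCHAR(MAX) = '[avType]'";

// Settings as seeded in the note config: monitor the last 15 minutes,
// across all article types ("all" makes the av-type filter a no-op).
const notifiySettings = { time: 15, type: "all", avType: 1 };

const updatedQuery = shortageBookingsTemplate
  .replace("[time]", String(notifiySettings.time))
  .replace("[type]", notifiySettings.type)
  .replace("[avType]", String(notifiySettings.avType));
```

Keeping `time` at or below `checkInterval` matters here, since the SQL comment warns that a window longer than the interval would report the same bookings twice.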