How to perform database queries for automation purposes

To increase automation, you can query the PPS database for order distribution, capacity load, processing state, piece counts, and setup state using the corresponding criteria.
To do so, send SELECT queries to the PPS database.
In the following examples, the value assigned to d.uid is the UID of the machine.

To determine the UID from the machine network name, you can use conversion functions or the skrpps_02.skr_get_mc_list function.
See: Different help functions for further performance-optimized options.

Example:

SELECT "BO_GUID" from skrpps_02.skr_get_mc_list('skr') where "MC_NAME"='McNetworkName';

The query returns the UID of the machine if you substitute your machine's network name for McNetworkName.

From the PPS database you can determine, for example:

Query via a batch file when a machine completes all its tickets

The following example queries the time at which the machine with a specific UID will be done with all its tickets.

Copy the code into a batch file, set the Machine_UID variable to the UID of one of your machines, and execute the code with administrator rights. The DONE_TIME variable receives the result of the query.

@echo off
setlocal enabledelayedexpansion

set "PgLatestRootPath=C:\Program Files\Stoll\skr3\PostgreSql\9.5.5-1\bin"
set "PgRootPath=!PgLatestRootPath!"
if not exist "!PgRootPath!" Set "PgRootPath=C:\Program Files\Stoll\skr3\PostgreSql\9.5.3-1\bin"

set "Machine_UID=56665454"
call :GetDoneTime !Machine_UID! DONE_TIME
echo DONE_TIME=!DONE_TIME!, ErrorLevel=!ErrorLevel!
goto :EOF

:GetDoneTime
    call :UndefErrorLevel
    set "MC_UID=%~1"
    set "ReturnDoneTimeValueRef=%~2"
    set "DataVal=NULL"
    set "PGPASSWORD=ppsfrontend"
    set "PGCLIENTENCODING=utf-8"

    set pd_SelCmd=SELECT max(t.endDate) FROM Ticket t JOIN Device d on t.device_id = d.id WHERE t.state in ('APPROVED','IN_PROGRESS', 'IN_ACTIVITY') AND d.uid='!MC_UID!';
    set pg_SelCall="!PgRootPath!\psql" -U ppsfrontend -h localhost -d pps
    set pg_cmd="echo !pd_SelCmd! | !pg_SelCall! || echo/ & echo/ & call echo NULL NULL %%^^^errorlevel%%"
    set "pg_cmd=!pg_cmd:)=^)!"
   
    REM Execute PG command and split up table at | and space. Resulting DataValue obtained from second column.
    REM Check for errors of call
    for /f "skip=2 tokens=1,2,3" %%i in ('!pg_cmd!') do (      
        REM Get value in first and second parameter from split - which is from third row
        set "DataVal=%%i %%j"
        REM If an error happened, report it. The error code is obtained in the 3rd parameter.
        if "!DataVal!"=="NULL NULL" (
            echo ## Postgres DB operation failed with ERROR: %%~k
            set "DataVal=NULL"
        ) else (
            REM Check if not valid
            if "!DataVal:~0,1!"=="(" set "DataVal=NULL"
        )
        goto GotDoneTime
    )
:GotDoneTime
    if not '!ReturnDoneTimeValueRef!'=='' set "!ReturnDoneTimeValueRef!=!DataVal!"
    if "!DataVal!"=="NULL" set "ERRORLEVEL=1" & exit /b 1
exit /b 0

:UndefErrorLevel
REM Undefine ERRORLEVEL, otherwise !ERRORLEVEL! keeps last set value. The ERRORLEVEL system value is not the same as !ERRORLEVEL!
set "ERRORLEVEL="
exit /b 0

Output example:

D:\PPS\PpsJBossServer\PpsServerInstallScripts>GetDoneTimeByUID.bat
DONE_TIME=2017-07-18 13:57:33.818, ErrorLevel=0

Query specific standstill times:

  1. Standstill times with the following data:

    - Machines in case of a failure

    - Other standstills

    - Total running time

    - Machines with SKR-IDs 1530603103 and 1530627696

    - Time period 2018-08-26 12:00 to 2018-09-01 12:00

    This information can be determined via individualized stop statistics:
    generate a Secondary DB and trigger an individualized report.
    For testing purposes, you can do this with the pgAdmin tool. In this example, static tables are used, which are not deleted automatically. The mydb table prefix is used; the tables in the report_01 schema should be deleted after finishing the test.
    Advantage of this method: you can enter the two SELECTs in different SQL windows and view the generated tables in the report_01 schema.
    1
    Create Secondary DB tables.

    SELECT * FROM skrpps_02.skr_create_secundary_db_tables_ex('report_01'::text ,'mydb'::text,'eUnlogged'::skrpps_02.e_sec_tabletype ,'skr'::text,ARRAY[1457359086, 1457359088],'2018-08-26 12:00'::timestamp,'2018-09-01 12:00'::timestamp,'2000','2:00:00'::interval);
    2
    Generate individual stop statistics.

    - Add or subtract columns with arithmetic operations to obtain the desired result.

    - Additionally, the result columns are given their own names (ERROR_STOP_DURATION and OTHER_STOP_DURATION).

    - The table is sorted in ascending order by the first column (name of the machine).

    SELECT skrpps_02.skr_mcid2mcname( "AGG_MC_ID",'skr') AS "AGG_MC_NAME"
    , "AGG_MC_ID",
    ("STOP_DURATION" - "ENGAGING_DURATION" - "PIECE_COUNTER_ZERO_DURATION" - "PROGRAMMING_DURATION" - "OFF_DURATION" - "BOOT_DURATION" - "NO_DATA_DURATION" - "MCSTATE_UNKNOWN_DURATION"
    ) AS "ERROR_STOP_DURATION",
    ("ENGAGING_DURATION" + "PIECE_COUNTER_ZERO_DURATION" + "PROGRAMMING_DURATION" + "OFF_DURATION" + "BOOT_DURATION" + "NO_DATA_DURATION" + "MCSTATE_UNKNOWN_DURATION") AS "OTHER_STOP_DURATION",
    "RUN_DURATION"
    FROM  skrpps_02.eval_stop_statistic
    (
    'report_01' -- __sSchema text
    ,'mydb' -- _sTable_Prefix text
    ,Array['AGG_MC_ID'] -- _aSortorder text[]
     
    ,true -- _bGroupByMachine boolean
    ,false -- _bGroupByShift boolean
    ,false -- _bGroupByUserName boolean
    ,false -- _bGroupByPattern boolean
    ,false -- _bGroupBySeq boolean
    ,false -- _bGroupByTicketUid boolean
    ,false -- _bGroupByTicketProductionId boolean
    ,false -- _bGroupByTicketProductionSub1Id boolean
    ,false -- _bGroupByTicketProductionSub2Id boolean
    ,false -- _bGroupByTicketCustomerId boolean
    ,false -- _bGroupByTicketArticleId boolean
    ,NULL -- _aFilterShift integer[]
    ,NULL -- _aFilterUsername text[]
    ,NULL -- _aFilterPattern text[]
    ,NULL -- _aFilterSeq text[]
    ,NULL -- _aFilterTicketUid bigint[]
    ,NULL -- _aFilterTicketProductionId text[]
    ,NULL -- _aFilterTicketProductionsub1Id text[]
    ,NULL -- _aFilterTicketProductionsub2Id text[]
    ,NULL -- _aFilterTicketCustomerId text[]
    ,NULL -- _aFilterTicketArticleid text[]
    ,NULL -- _aFilterUserState Smallint[]
    ,NULL -- _aFilterSintralState Smallint[]
    ) where "OVERALL_DURATION" <> "NO_DATA_DURATION" order by 1 asc

    Note: With the delivered Excel program (StollReportFromSkrToExcel.xlsb), you can easily configure the basic filter settings and copy the select query.
    See: How to Access the Database via ODBC
    Example result of the query

How to add an order directly to the database without generating an XML file

Instead of writing the order data into an XML file and saving it in the D:\ERP\Input folder, you can insert the order directly into the ticket_queue table using a database connection to the pps database.

The following INSERT adds an order with a minimal data record for auto production to the ticket_queue. It is intended for reloading a jacquard in the case of an individual production with lot size 1. Other order variants are possible as well.

Using the example Excel file D:\PPS\ODBC_Examples\Excel\StollPpsTicketExportViaODBC.xlsm, you can develop further applications.

This requires an ODBC connection to the pps and the skr database.
The knitting time (estimated_order_duration) is transferred in milliseconds.
Therefore, 20 min must be entered as 20*60*1000=1200000.

INSERT INTO "ticket_queue" (
    ticket_number,
    ticket_type,
    pattern_subdirectory,
    pattern_file_name,
    total_num_of_executions,
    estimated_order_duration,
    custom_xml, machine_list_xml,
    order_description,
    pattern_preload_condition_xml
)
Values (
    '145609_bandage_2154879',
    'AUTO_PRODUCTION',
    'production\Spring2019',
    'CMSADF-3 KW.bandage_2154879.zip',
    1,
    1200000,
    '<STOLL:CUSTOM xmlns="http://schemas.custom.com/ticketing/ticket/v0.3" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://schemas.custom.com/ticketing/ticket/v0.3 CustomTicket.xsd"/>',
    '<STOLL:MACHINE_LIST>
        <STOLL:MACHINE Name="312_ADF-3KW" BoIgnore="1" Pieces="1"/>
    </STOLL:MACHINE_LIST>',
    'Ticket created by Sapas',
    '<STOLL:PATTERN_PRELOAD_CONDITION>
        <STOLL:ERASE_ALL State="false"/>
        <STOLL:ERASE_ALL_YARNCARRIERPOS State="false"/>
        <STOLL:ERASE_ALL_SEQYARNCARRIERPOS State="false"/>
        <STOLL:LOAD_PAT_CONTAINER_COMP SIN="false" JAC="true" SET="false"/>
    </STOLL:PATTERN_PRELOAD_CONDITION>'
)
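The millisecond conversion and the machine_list_xml fragment can also be prepared programmatically before building the INSERT. The following Python sketch is an illustration, not part of the delivered tooling; the helper names are made up, and the XML builder only reproduces the fragment shown above.

```python
import xml.etree.ElementTree as ET

def knitting_minutes_to_ms(minutes: float) -> int:
    """estimated_order_duration is transferred in milliseconds:
    20 min -> 20 * 60 * 1000 = 1200000."""
    return int(minutes * 60 * 1000)

def build_machine_list_xml(machine_name: str, pieces: int, bo_ignore: int = 1) -> str:
    """Hypothetical helper: build the machine_list_xml fragment for one machine,
    mirroring the STOLL:MACHINE_LIST element used in the INSERT above."""
    root = ET.Element("STOLL:MACHINE_LIST")
    ET.SubElement(root, "STOLL:MACHINE",
                  Name=machine_name, BoIgnore=str(bo_ignore), Pieces=str(pieces))
    return ET.tostring(root, encoding="unicode")

print(knitting_minutes_to_ms(20))  # 1200000
```

For example, build_machine_list_xml("312_ADF-3KW", 1) produces the same machine_list_xml value as the INSERT above, apart from whitespace.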

Determine the pieces produced by the still-running ticket-based production

You want to determine how many pieces of the running production have already been produced on the machines?

By querying the PPS database as follows, you receive the following information about the production:

  • how many pieces
  • from how many machines
  • on which machines
  • which ticket
  • which order
  • how many unfinished tickets

SELECT d.name AS device_name, executionsdone, numberofexecutions, t.name AS ticket_name, ordernumber AS order_name, patternfilename FROM Ticket t JOIN Device d on device_id=d.id JOIN Orders o ON o.id = t.order_id WHERE o.type in ('GENERAL','AUTO_PRODUCTION', 'GUIDED_PRODUCTION') and t.state ='IN_PROGRESS' order by order_name
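If you fetch the rows of this query into your own application (for example over an ODBC connection), a few lines of code can aggregate them. This Python sketch assumes each row is available as a dict keyed by the column names and aliases of the query; the helper name is made up.

```python
def summarize_production(rows):
    """Aggregate rows of the ticket query above. Each row is a dict with the
    columns device_name, executionsdone and numberofexecutions."""
    pieces_done = sum(r["executionsdone"] for r in rows)
    unfinished_tickets = sum(
        1 for r in rows if r["executionsdone"] < r["numberofexecutions"]
    )
    machines = sorted({r["device_name"] for r in rows})
    return pieces_done, unfinished_tickets, machines
```

The return value gives the total pieces produced, the number of tickets still missing executions, and the machine names involved.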

Determine produced sequences per day and machine:

  1. For example, if a sweater is knitted using a sequence and you want to determine how many sweaters were produced per day per machine over a certain period of time, you can do this using the EventList report:

    - Machines with SKR-ID 1706536365, 1710405527, 1710409205, 1711546385, 1706535846, 1711443250, 1607342198, 1704803012, 1718697144, 1718692632

    - Period from 2024-04-09 00:00 to 2024-08-01 00:00

    After generating a Secondary DB, you can use the EventList report. With it, the event filter Sequence number increased (1,4) is used.
    For testing purposes, you can do this with the pgAdmin tool. For the two SELECTs to run in the same session, the queries must be entered together in one SQL window.
    1
    Create Secondary DB tables

    SELECT * FROM skrpps_02.skr_create_secundary_db_tables_ex(''::text ,'mydb'::text,'eUnloggedTemp'::skrpps_02.e_sec_tabletype ,'skr'::text, ARRAY[1706536365, 1710405527, 1710409205, 1711546385, 1706535846, 1711443250, 1607342198, 1704803012, 1718697144, 1718692632],'2024-04-09 00:00'::timestamp,'2024-08-01 00:00'::timestamp,'2000','2:00:00'::interval);
    2
    Generate an individual EventList Report.

    - Machines and sequences are grouped on a daily basis and the quantity of sequences is counted.

    - The event table displays machine name, sequence name (may also contain pattern name), day and number of sequences.

    Note: The number of sequences will only be correct if production has run smoothly.
    This query cannot take skipped or aborted sequence elements into account.

    Select skrpps_02.skr_mcid2mcname( "MC_ID",'skr'::cstring) AS "MC_NAME",
    "SEQ_PATTERN_NAME",
    DATE_TRUNC('day', "TIME_STAMP") AS "DAY",
    COUNT(*) "NumbOfSequences"
    from 
    skrpps_02.eval_event_list(
    '' -- __sSchema text
    ,'mydb' -- _sTable_Prefix text
    ,'EN' -- _sLang text
    ,1 -- _iStartId bigint
    ,10000 -- _iLimit int
    ,ARRAY[[1,4]]::smallint[][2] -- _aFilterEvents smallint[][2]
    ,NULL -- _aFilterShift integer[]
    ,NULL -- _aFilterUsername text[]
    ,NULL -- _aFilterPattern text[]
    ,NULL -- _aFilterSeq text[]
    ,NULL -- _aFilterTicketUid bigint[]
    ,NULL -- _aFilterTicketProductionId text[]
    ,NULL -- _aFilterTicketProductionsub1Id text[]
    ,NULL -- _aFilterTicketProductionsub2Id text[]
    ,NULL -- _aFilterTicketCustomerId text[]
    ,NULL -- _aFilterTicketArticleid text[]
    ,NULL -- _aFilterUserState Smallint[]
    ,NULL -- _aFilterSintralState Smallint[]
    )
    GROUP BY skrpps_02.skr_mcid2mcname("MC_ID", 'skr'::cstring), 
    "SEQ_PATTERN_NAME", 
    DATE_TRUNC('day', "TIME_STAMP") 
    ORDER BY "DAY";


    Note: With the delivered Excel program (StollReportFromSkrToExcel.xlsb), you can easily configure the basic filter settings and copy the select query.
    See: How to Access the Database via ODBC
    Example result of the query in the pgAdmin Tool

You Want to Generate the Intermediate Tables and Evaluation with One Call

In this example, the auxiliary tables and the event list are generated in one call.
Note that in the event of an error, analysis is more difficult. The first skr_create_secundary_db_tables_ex column should contain the value 0 throughout, which confirms that the auxiliary tables could be generated for all machines.
In the example, the EventList report is executed after the auxiliary tables have been generated.

You can replace this part of the SELECT with another report based on the auxiliary tables.

SELECT skrpps_02.skr_create_secundary_db_tables_ex (''::text ,'mydb'::text,'eUnloggedTemp'::skrpps_02.e_sec_tabletype ,'skr'::text, ARRAY[1706536365, 1710405527, 1710409205, 1711546385, 1706535846, 1711443250, 1607342198, 1704803012, 1718697144, 1718692632],'2024-04-09 00:00'::timestamp,'2024-08-01 00:00'::timestamp,'2000','2:00:00'::interval)
, skrpps_02.skr_mcid2mcname("MC_ID",'skr'::cstring) AS "MC_NAME"
, * from
skrpps_02.eval_event_list (
'' -- __sSchema text
,'mydb' -- _sTable_Prefix text
,'EN' -- _sLang text
,1 -- _iStartId bigint
,1000 -- _iLimit int
,NULL::smallint[][2] -- _aFilterEvents smallint[][2]
,NULL -- _aFilterShift integer[]
,NULL -- _aFilterUsername text[]
,NULL -- _aFilterPattern text[]
,NULL -- _aFilterSeq text[]
,NULL -- _aFilterTicketUid bigint[]
,NULL -- _aFilterTicketProductionId text[]
,NULL -- _aFilterTicketProductionsub1Id text[]
,NULL -- _aFilterTicketProductionsub2Id text[]
,NULL -- _aFilterTicketCustomerId text[]
,NULL -- _aFilterTicketArticleid text[]
,NULL -- _aFilterUserState Smallint[]
,NULL -- _aFilterSintralState Smallint[]
);

Export the report to a CSV file:

Suppose you want to determine the run time of a machine on a given day.

The stop statistics provide this information in the RUN_DURATION column.

You can use the PostgreSQL COPY command to export the results table to a CSV file.

    1
    Create the Secondary DB for the desired day and machine.
    2
    Create the desired report.
    3
    Proceed in the same way for further reports, provided that the Secondary database is required.
    Otherwise, you can query the desired table directly.

You can test the call in the pgAdmin4 program:
SELECT * FROM skrpps_02.skr_create_secundary_db_tables_ex(
''::text ,'mydb'::text,
'eUnloggedTemp'::skrpps_02.e_sec_tabletype ,'skr'::text, ARRAY[1718692632],
'2024-06-25 00:00'::timestamp,
'2024-06-26 00:00'::timestamp,
'2000','2:00:00'::interval);

copy (
SELECT skrpps_02.skr_mcid2mcname("AGG_MC_ID",'skr') AS "AGG_MC_NAME"
, * FROM skrpps_02.eval_stop_statistic_ex
('' -- __sSchema text
,'mydb' -- _sTable_Prefix text
,NULL -- _aSortorder text[]
,false -- _bGroupByMachine boolean
,false -- _bGroupByShift boolean
,false -- _bGroupByUserName boolean
,false -- _bGroupByPattern boolean
,false -- _bGroupBySeq boolean
,false -- _bGroupByTicketUid boolean
,false -- _bGroupByTicketProductionId boolean
,false -- _bGroupByTicketProductionSub1Id boolean
,false -- _bGroupByTicketProductionSub2Id boolean
,false -- _bGroupByTicketCustomerId boolean
,false -- _bGroupByTicketArticleId boolean
,false -- _bGroupByDay boolean
,false -- _bGroupByWeek boolean
,false -- _bGroupByMonth boolean
,NULL -- _aFilterShift integer[]
,NULL -- _aFilterUsername text[]
,NULL -- _aFilterPattern text[]
,NULL -- _aFilterSeq text[]
,NULL -- _aFilterTicketUid bigint[]
,NULL -- _aFilterTicketProductionId text[]
,NULL -- _aFilterTicketProductionsub1Id text[]
,NULL -- _aFilterTicketProductionsub2Id text[]
,NULL -- _aFilterTicketCustomerId text[]
,NULL -- _aFilterTicketArticleid text[]
,NULL -- _aFilterUserState Smallint[]
,NULL -- _aFilterSintralState Smallint[]
) where "OVERALL_DURATION" <> "NO_DATA_DURATION"
) TO 'D:\PPS\Temp\StopStatistic.csv' WITH (FORMAT CSV, HEADER TRUE, DELIMITER E'\t') ;


The directory of the CSV file must grant write permission to the _Skr2DbUser.

You can split the call into a batch file and a *.sql file.

    1
    Copy the first text into a StopStatistic.bat batch file.
    2
    Copy the second text into the export_report.sql file.
    3
    Adjust the time period, the machine IDs, and the target directory.

The columns are separated by tabs, and the result is written to the CSV file 'D:\PPS\Temp\StopStatistic.csv'.
You can open the file in Excel via data import:

@echo off
REM File StopStatistic.bat

SET "PGPATH=C:\Program Files\Stoll\skr3_15\PostgreSql\15.12-1\bin"
SET PGPASSWORD=skrfrontend
SET SQL_FILE="D:\PPS\Temp\export_report.sql"
SET ERROR_LOG="D:\PPS\Temp\psql_error.log"

del %ERROR_LOG% 2>NUL

"%PGPATH%\psql" -U skrfrontend -d skr -h localhost -f %SQL_FILE% > %ERROR_LOG% 2>&1

if %ERRORLEVEL% NEQ 0 (
    echo Error executing PostgreSQL commands!
    echo Check %ERROR_LOG% for details.
    type %ERROR_LOG%
    pause
) else (
    echo CSV export completed successfully.
    echo Check %ERROR_LOG% for any warnings or notices.
    type %ERROR_LOG%
)


-- File export_report.sql
\set ON_ERROR_STOP on

SELECT skrpps_02.skr_create_secundary_db_tables_ex (
''::text,
'mydb'::text,
'eUnloggedTemp'::skrpps_02.e_sec_tabletype,
'skr'::text,
ARRAY[1718692632],
'2024-06-25 00:00'::timestamp,
'2024-06-26 00:00'::timestamp,
'2000',
'2:00:00'::interval
);

\copy (Select skrpps_02.skr_mcid2mcname ( "AGG_MC_ID",'skr') AS "AGG_MC_NAME", * from skrpps_02.eval_stop_statistic_ex ('' , 'mydb', NULL, true, false, false, false, false, false, false, false, false, false, false, false, false, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL) WHERE "OVERALL_DURATION" <> "NO_DATA_DURATION") TO 'D:\PPS\Temp\StopStatistic.csv' WITH (FORMAT CSV, HEADER TRUE, DELIMITER E'\t') ;
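Instead of Excel, the exported file can also be post-processed in a script. The following Python sketch (an assumption, not part of the delivered tooling) reads the tab-separated export with the standard csv module; the path and column names follow the example above.

```python
import csv

def read_stop_statistic(path):
    """Read the tab-separated CSV file produced by the \\copy call above
    and return a list of dicts keyed by the header row."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter="\t"))

# Example usage (path as in the export above):
# for row in read_stop_statistic(r"D:\PPS\Temp\StopStatistic.csv"):
#     print(row["AGG_MC_NAME"], row["RUN_DURATION"])
```

Because the export is written with HEADER TRUE, the column names of the report (for example AGG_MC_NAME and RUN_DURATION) can be used directly as dict keys.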