NXLog Legacy Documentation

Schneider Electric EcoStruxure Process Expert

Schneider Electric’s EcoStruxure Process Expert is a modern, distributed control system (DCS) designed to reduce the resources spent on engineering and commissioning. It also provides cost-effective solutions for mitigating engineering and integration risks. EcoStruxure Process Expert is deployed across a variety of industries, most commonly oil and gas, chemicals, utilities, cement, and mining.

NXLog can collect and process EcoStruxure Process Expert logs from the following sources:

Logs from Windows Event Log

There are two ways to collect logs from Windows Event Log: by Event ID or by log source.

NXLog can be configured to process EcoStruxure Process Expert logs that Windows Event Log collects as shown in the examples below.

This table lists the events generated by EcoStruxure Process Expert along with their corresponding Event IDs.

Table 1. EcoStruxure Process Expert Event IDs

Event ID   Description
0          Service started successfully.
260        Startup Requested Cache configuration STRUXUREWAREPEv461.

The following table lists each EcoStruxure Process Expert service that generates events in Windows Event Log, along with its source name and the path to its executable.

Table 2. EcoStruxure Process Expert log sources

Service Name / Display Name: EPEAuditTrail
Source: EPEAuditTrail
Path to Executable: C:\Program Files\Schneider Electric\EcoStruxure\Process Expert\AuditTrail\AuditTrailService.exe

Service Name / Display Name: Cache_c-_program files_intersystems_struxurewarepev46 / Cache Controller for STRUXUREWAREPEv461
Source: Cache Config STRUXUREWAREPEv461
Path to Executable: C:\Program Files\InterSystems\STRUXUREWAREPEv461\bin\cservice.exe

Service Name / Display Name: Schneider Electric SUT Service
Source: Schneider Electric SUT Service
Path to Executable: C:\Program Files (x86)\Schneider Electric\Software Update\SutService.exe
Description: SUT Service for Schneider Electric Software Update

Example 1. Processing EcoStruxure Process Expert logs based on Event ID

This example contains an NXLog configuration for reading and processing EcoStruxure Process Expert logs from Windows Event Log. Filtering is based on the value of the Event ID field.

Windows Event Log collected the following sample from the Cache Config STRUXUREWAREPEv461 log source, which has Event ID 260.

Event sample
Log Name:      Application
Source:        Cache Config STRUXUREWAREPEv461
Date:          9/9/2021 1:13:16 PM
Event ID:      260
Task Category: None
Level:         Information
Keywords:      Classic
User:          SYSTEM
Computer:      WIN-0NDMK54PLPR
Description:
Startup Requested Cache configuration STRUXUREWAREPEv461.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Cache Config STRUXUREWAREPEv461" />
    <EventID Qualifiers="16641">260</EventID>
    <Level>4</Level>
    <Task>0</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2021-09-09T20:13:16.376032300Z" />
    <EventRecordID>1330</EventRecordID>
    <Channel>Application</Channel>
    <Computer>WIN-0NDMK54PLPR</Computer>
    <Security UserID="S-1-5-18" />
  </System>
  <EventData>
    <Data>Startup Requested</Data>
    <Data>Cache configuration STRUXUREWAREPEv461. </Data>
  </EventData>
</Event>

In this configuration, the im_msvistalog module is used to collect Windows Event Log data. The QueryXML block contains a query that selects only those event records having an Event ID of 260 or 0. Data is then converted to JSON using the to_json() procedure of the xm_json module.

nxlog.conf
<Extension json>
    Module    xm_json
</Extension>

<Input eventlog>
    Module    im_msvistalog
    #XML query to read Windows Event Logs based on EventID
    <QueryXML>
        <QueryList>
            <Query Id="0" Path="Application">
                <Select Path="Application">
                    *[System[(EventID=260 or EventID=0)]]
                </Select>
            </Query>
        </QueryList>
    </QueryXML>
    # Conversion to JSON
    Exec      to_json();
</Input>
Output sample in JSON format
{
  "EventTime": "2021-09-09T13:13:16.376032-07:00",
  "Hostname": "WIN-0NDMK54PLPR",
  "Keywords": "36028797018963968",
  "EventType": "INFO",
  "SeverityValue": 2,
  "Severity": "INFO",
  "EventID": 260,
  "SourceName": "Cache Config STRUXUREWAREPEv461",
  "TaskValue": 0,
  "RecordNumber": 1330,
  "ExecutionProcessID": 0,
  "ExecutionThreadID": 0,
  "Channel": "Application",
  "Domain": "NT AUTHORITY",
  "AccountName": "SYSTEM",
  "UserID": "S-1-5-18",
  "AccountType": "Well Known Group",
  "Message": "Startup Requested Cache configuration STRUXUREWAREPEv461. ",
  "Data": "Startup Requested",
  "Data_1": "Cache configuration STRUXUREWAREPEv461. ",
  "EventReceivedTime": "2021-09-09T13:13:16.982131-07:00",
  "SourceModuleName": "eventlog",
  "SourceModuleType": "im_msvistalog"
}
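The effect of the EventID filter can be illustrated outside of NXLog. The following Python sketch applies the same acceptance test (Event ID 260 or 0) to JSON-formatted event records; the keep_event helper is hypothetical and only mirrors the XPath condition in the QueryXML block, since the real filtering happens before NXLog receives the data:

```python
import json

def keep_event(record_json, wanted_ids=frozenset({0, 260})):
    """Return True if the JSON event record carries one of the wanted Event IDs,
    mirroring the *[System[(EventID=260 or EventID=0)]] XPath condition."""
    record = json.loads(record_json)
    return record.get("EventID") in wanted_ids

# A trimmed-down version of the output sample above
sample = '{"EventID": 260, "SourceName": "Cache Config STRUXUREWAREPEv461"}'
matched = keep_event(sample)  # True: 260 is one of the selected Event IDs
```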
Example 2. Processing EcoStruxure Process Expert logs based on event source

This example demonstrates how to configure NXLog to parse and process EcoStruxure Process Expert log sources.

This event sample was taken from Windows Event Viewer.

Schneider Electric SUT Service event sample
Log Name:      Application
Source:        Schneider Electric SUT Service
Date:          9/9/2021 1:26:11 PM
Event ID:      0
Task Category: None
Level:         Information
Keywords:      Classic
User:          N/A
Computer:      WIN-0NDMK54PLPR
Description:
Service started successfully.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Schneider Electric SUT Service" />
    <EventID Qualifiers="0">0</EventID>
    <Level>4</Level>
    <Task>0</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2021-09-09T20:26:11.596342200Z" />
    <EventRecordID>1336</EventRecordID>
    <Channel>Application</Channel>
    <Computer>WIN-0NDMK54PLPR</Computer>
    <Security />
  </System>
  <EventData>
    <Data>Service started successfully.</Data>
  </EventData>
</Event>

Unlike the im_msvistalog instance in the previous example, which filters by Event ID, the QueryXML block in this configuration accepts only event records from the following three providers:

  • Cache Config STRUXUREWAREPEv461

  • EPEAuditTrail

  • Schneider Electric SUT Service

To provide more flexibility in choosing a post-processing solution, the xm_json module’s to_json() procedure is invoked, which not only converts the newly parsed fields to JSON, but also enriches each event record with the core fields.

nxlog.conf
<Extension json>
    Module    xm_json
</Extension>

<Input eventlog>
    Module    im_msvistalog
    # XML query to read Windows Event Log data
    <QueryXML>
        <QueryList>
            <Query Id="0" Path="Application">
                <Select Path="Application">*
                    [System[Provider[@Name='Cache Config STRUXUREWAREPEv461' or
                        @Name='EPEAuditTrail' or
                        @Name='Schneider Electric SUT Service']]]
                </Select>
            </Query>
        </QueryList>
    </QueryXML>
    # Conversion to JSON
    Exec    to_json();
</Input>

The following JSON-formatted event record represents the original Schneider Electric SUT Service event sample after NXLog parsing and processing.

Output sample in JSON format
{
  "EventTime": "2021-09-09T13:26:11.596342-07:00",
  "Hostname": "WIN-0NDMK54PLPR",
  "Keywords": "36028797018963968",
  "EventType": "INFO",
  "SeverityValue": 2,
  "Severity": "INFO",
  "EventID": 0,
  "SourceName": "Schneider Electric SUT Service",
  "TaskValue": 0,
  "RecordNumber": 1336,
  "ExecutionProcessID": 0,
  "ExecutionThreadID": 0,
  "Channel": "Application",
  "Message": "Service started successfully.",
  "Opcode": "Info",
  "Data": "Service started successfully.",
  "EventReceivedTime": "2021-09-09T13:26:11.652134-07:00",
  "SourceModuleName": "eventlog",
  "SourceModuleType": "im_msvistalog"
}
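As noted above, to_json() serializes both the newly parsed fields and the core fields into one JSON object. The following Python sketch approximates that merge; the to_json_like helper and its choice of core fields are illustrative only, based on the sample output above:

```python
import json
from datetime import datetime, timezone

def to_json_like(parsed, source_module):
    """Merge parsed event fields with a few core fields and serialize to JSON,
    approximating what NXLog's to_json() procedure produces."""
    core = {
        "EventReceivedTime": datetime.now(timezone.utc).isoformat(),
        "SourceModuleName": source_module,
        "SourceModuleType": "im_msvistalog",
    }
    return json.dumps({**parsed, **core})

record = to_json_like(
    {"EventID": 0, "SourceName": "Schneider Electric SUT Service",
     "Message": "Service started successfully."},
    "eventlog",
)
round_trip = json.loads(record)
```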

Collecting diagnostic data from log files

EcoStruxure Process Expert produces several types of file-based logs that NXLog can parse and process. This table lists these file-based logs and their locations.

Table 3. File-based EcoStruxure Process Expert logs

Server log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\Server.log

Server trace log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\ServerTrace.log

System info log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\SysInfo.log

Engineering client log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\EngClient.log

Engineering client communication log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\EngClientCommunication.log

Engineering client notification log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\EngClientNotification.log

Engineering client object store log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\EngClientObjectStore.log

Operation client log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\OpClient.log

Operation client communication log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\OpClientCommunication.log

HybridDCS virtual machine log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\HybridDCS.Vm.2020R2#0.log

Communication log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\Communication.log

Data access log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\DataAccess.log

Object store log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\ObjectStore.log

Performance log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\Performance.log

Audit trail log (.log):
C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs\AuditTrail.log

Migration log (.txt):
C:\ProgramData\Schneider Electric\Process Expert 2020 R2\Db\MigrationLogFile.txt

Activity log files

Schneider Electric EcoStruxure Process Expert records the activity of each system server, operation client, and engineering client. It then stores these activity logs in the C:\Users\<Username>\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs directory.

Server log and server trace log

Server.log and ServerTrace.log contain the information related to the EcoStruxure Process Expert system server component.

Both logs exhibit similar data structures and can be processed using similar configurations.

Example 3. Processing server logs

The following event sample is taken from Server.log.

Event sample
2021-09-14 06:21:55.2041 INFO  [T 29] Session [Id:21e80e02-694f-431c-8ac1-73c19f270a01 System:System_1 User:Administrator (Administrator)] Command (5f6fd43f-453f-40be-8251-3250f8e964cb) BuildAll (SchneiderElectric.ProcessExpert.Unity.Executable.UnityExecutable) completed in 30196.7928 ms SchneiderElectric.ProcessExpert.FoundationServices.CommandShell.ExecuteCommand

System server logs consist of records that contain predictable data patterns, from which a set of fields common to all logs can be parsed. Compare the values in the event sample above with the fields that can be parsed from it:

Table 4. Fields common to all server logs

Timestamp: 2021-09-14 06:21:55.2041
Severity: INFO
Event ID: [T 29]
Message: Session [Id:21e80e02-694f-431c-8ac1-73c19f270a01 System:System_1 User:Administrator (Administrator)] Command (5f6fd43f-453f-40be-8251-3250f8e964cb) BuildAll (SchneiderElectric.ProcessExpert.Unity.Executable.UnityExecutable) completed in 30196.7928 ms
Service: SchneiderElectric.ProcessExpert.FoundationServices.CommandShell.ExecuteCommand

As with many other log sources that contain a Message field (syslog records, for example), the Message field often contains multiple data values that can be parsed as additional fields; these fields are common to some records but missing from others.

Because these logs exhibit consistent data patterns, 14 regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

Some events can span multiple lines. This configuration uses the im_file module to collect file-based logs and the xm_multiline module to read multiline log records. To correctly process multiline events, a regular expression that describes the header line of an event needs to be defined. In the following xm_multiline extension instance, the HeaderLine directive specifies the regular expression used to find the header line of each event.

Enabling multiline event processing
<Extension multiline_srv>
    Module        xm_multiline
    # Parsing the header line
    HeaderLine    /\d+.\d+.\d+\s*\d+.\d+.\d+.\d+\s*[A-Z]+\s*\[.*?\]/
</Extension>
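To see how a header-line pattern delimits events, the following Python sketch groups raw lines into multiline records using the same regular expression as the HeaderLine directive. This is a simplified illustration, not NXLog's implementation:

```python
import re

# Same pattern as the HeaderLine directive above
HEADER = re.compile(r"\d+.\d+.\d+\s*\d+.\d+.\d+.\d+\s*[A-Z]+\s*\[.*?\]")

def group_events(lines):
    """Yield one event per header line; non-matching lines are
    treated as continuations of the current event."""
    event = []
    for line in lines:
        if HEADER.match(line) and event:
            yield "\n".join(event)
            event = []
        event.append(line)
    if event:
        yield "\n".join(event)

lines = [
    "2021-09-14 06:21:55.2041 INFO  [T 29] first event",
    "    continuation of the first event",
    "2021-09-14 06:21:56.0001 WARN  [T 30] second event",
]
events = list(group_events(lines))  # two events; the first spans two lines
```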

In the from_file input instance, the InputType directive references the xm_multiline extension instance by its name, multiline_srv, which enables NXLog to establish the beginning and end of each event.

The File directive used in the from_file instance of the im_file module defines the path and filename(s) of the log file(s) to be parsed. Since NXLog supports the use of wildcards with this directive, the wildcard character (*) used in Server*.log means that both Server.log and ServerTrace.log will be parsed.

To aid in parsing, the regular expression substitution operator s/\\r\\n/ /g is invoked to replace all occurrences of the \r\n pair (CR/LF, i.e., hex 0D 0A) with a space (hex 20), thus converting multiline event records to single-line records.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. Fields that do not require additional processing are captured using the named capture groups defined in the 14 *_REGEX constants. Within these regular expressions, the field name enclosed in angle brackets (< >) determines which event field each captured value is assigned to. Parsing takes place when these regular expressions are invoked in the conditional statement that begins with if $raw_event =~.

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop() procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
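The parsing flow described above (match named capture groups, convert the timestamp, serialize) can be sketched in Python with simplified stand-ins for the HEADER_REGEX and FOOTER_REGEX constants. The patterns below are approximations for illustration only; NXLog itself evaluates the *_REGEX constants from the configuration:

```python
import re
from datetime import datetime

# Simplified stand-ins for HEADER_REGEX and FOOTER_REGEX
HEADER = r"^(\d+-\d+-\d+\s+\d+:\d+:\d+\.\d+)\s+(?P<Severity>[A-Z]+)\s+\[(?P<EventId>.*?)\]\s+"
FOOTER = r"(?P<Message>.*?)\s*(?P<Service>SchneiderElectric\S+)\s*$"

line = ("2021-09-14 06:21:55.2041 INFO  [T 29] "
        "Command completed in 30196.7928 ms "
        "SchneiderElectric.ProcessExpert.FoundationServices.CommandShell.ExecuteCommand")

match = re.match(HEADER + FOOTER, line)
fields = match.groupdict()
# parsedate() equivalent: convert the first capture into a datetime value
fields["EventTime"] = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S.%f")
```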

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX         (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s*
define FOOTER_REGEX         (?<Message>.*?)\s*(?<Service>SchneiderElectric.*?)\s+

define SESSION_REGEX        (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s*
define COMMAND_REGEX        Command\s*\((?<CommandId>.*?)\)\s*(?<Command>[\w\s\(]*SchneiderElectric[\w\.\)]+)\s*
define QUERY_REGEX          Query\s*\((?<QueryId>.*?)\)\s*(?<Query>SchneiderElectric[\w\.]+)\s*
define COMPILE_REGEX        Compiled\:\s*(?<Compiled>[\$\w]+\s*[\w\.]+)\s*
define EXCEPTN_REGEX        (?<Exception>[\w\.]+)\:\s+(?<Exception_Comment>.*?)\s+(?<AddInfo>at\s+.*)
define SUBTASK_REGEX        SubTask\:\s*(?<SubTask>.*?)\((?<SubTaskId>.*?)\)\s*
define COMSRVTRANS_REGEX    (?<Action>CommitServiceTransaction)\s*\((?<CommitServiceTransactionId>.*?)\)\s*
define EXEC_REGEX           (?<Action>Execution)\s+of\s+(?<Command>SchneiderElectric.*?)\s+on\s+\
                            (?<ExecutionTool>.*?)\s+on\s+(?<VM>.*?)\s+on\s+(?<Host>.*?)\s+
define CALLBACK_REGEX       From\s*\[(?<CallbackFromIP>\d+.\d+.\d+.\d+)\]\s*(?<CallbackTimestamp>[\d\-]+\s*[\d\:\.]+)\s*\
                            (?<CallbackSeverity>[A-Z]+)\s*Vm\s*\[(?<CallbackFromVM>.*?)\]\s*
define DURTRANSCALL_REGEX   Duration\:(?<Duration>\d+\.\d+\w+)\,\s*Transaction\:(?<Transaction>.*?)\,\s*Call\:(?<Call>.*?)\s+\
                            (?<Method>.*?)\s+
define FILETRANS_REGEX      FileTransfer\s*(?<FileTransfer>\w+)\s*(?<File>.*?)\s*\<(?<FileSize>[\d\.\s]+.B)\>\s*
define SERVICE_REGEX        (?<Action>[\w\s]+\:\s*\w+)\s*(?:Id\:(?<Id>.*?))\s*(?:ClientId\:(?<ClientId>.*?))\s*\
                            (?:ClientHost\:(?<ClientHost>.*?))\s*(?:TotalSessions\:(?<TotalSessions>\d+))\s*

# Part of the log path defined as a constant
define PE_LOG_PATH          C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Extension multiline_srv>
    Module        xm_multiline
    # Parsing the header line
    HeaderLine    /\d+.\d+.\d+\s*\d+.\d+.\d+.\d+\s*[A-Z]+\s*\[.*?\]/
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\Server*.log'
    InputType   multiline_srv

    <Exec>
        # Replaces unwanted characters
        $raw_event =~ s/\\r\\n/ /g;
        $raw_event =~ s/\s{2,}/ /g;
    </Exec>

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %COMMAND_REGEX% %FOOTER_REGEX% %EXCEPTN_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %COMMAND_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %QUERY_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %COMPILE_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %SUBTASK_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %COMSRVTRANS_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %EXEC_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %FOOTER_REGEX% %EXCEPTN_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %FILETRANS_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SERVICE_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %CALLBACK_REGEX% %DURTRANSCALL_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %CALLBACK_REGEX% %FOOTER_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %FOOTER_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard event if it doesn't match any regular expression
        else drop();
    </Exec>
</Input>

After NXLog has read, parsed, and processed the original event sample (see Server log and server trace log), it outputs the following event.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-14T06:21:55.924549-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Command": "BuildAll (SchneiderElectric.ProcessExpert.Unity.Executable.UnityExecutable)",
  "CommandId": "5f6fd43f-453f-40be-8251-3250f8e964cb",
  "EventId": "T 29",
  "Message": "completed in 30196.7928 ms",
  "Service": "SchneiderElectric.ProcessExpert.FoundationServices.CommandShell.ExecuteCommand",
  "SessionId": "21e80e02-694f-431c-8ac1-73c19f270a01",
  "Severity": "INFO",
  "System": "System_1",
  "User": "Administrator (Administrator)",
  "EventTime": "2021-09-14T06:21:55.204100-07:00"
}

Engineering client logs and Operation client logs

Log files generated by engineering and operation clients store information about previous client activity. All of these log files exhibit similar data patterns and can be processed using a single NXLog configuration.

Example 4. Processing engineering client and operation client logs

The following sample is taken from the EngClient.log file. Its event record structure is identical to that of OpClient.log.

EngClient.log event sample
2021-09-16 01:53:02.6169 TRACE Query Invocation SchneiderElectric.ProcessExpert.Core.Queries.GetLicenseCategoryDetails completed in 39 ms SchneiderElectric.ProcessExpert.ClientEssentials.UIFramework.NLogLogger.Trace

The following sample exhibits the event record structure also shared by OpClientCommunication.log.

EngClientCommunication.log event sample
2021-09-16 01:52:40.2137 TRACE Host: localhost, Port: 9950 SchneiderElectric.ProcessExpert.ServiceCommunication.CommunicationServiceObject.IsServerReachable

The following sample is taken from EngClientNotification.log.

EngClientNotification.log event sample
2021-09-16 00:15:15.3425 INFO  Waiting for notification SEQ 12 SchneiderElectric.ProcessExpert.ServiceCommunication.NotificationReceiver.EnsureMessageOrder

The sample below is taken from EngClientObjectStore.log.

EngClientObjectStore.log event sample
2021-09-16 01:53:05.1818 TRACE ObjectStore.GetObjects(F.Type:Node Count:1 Fetch:1 Time:63ms) SchneiderElectric.ProcessExpert.ServiceCommunication.Caching.ObjectStore.GetObjects

As with many other log sources that contain a Message field (syslog records, for example), the Message field often contains multiple data values that can be parsed as additional fields; these fields are common to some records but missing from others.

Despite this wide variety of fields, the following fields are common to all client logs. Compare the values in the EngClientObjectStore.log event sample above with the fields that can be parsed from it:

Table 5. Fields common to engineering client and operation client logs

Timestamp: 2021-09-16 01:53:05.1818
Severity: TRACE
Message: ObjectStore.GetObjects(F.Type:Node Count:1 Fetch:1 Time:63ms)
Service: SchneiderElectric.ProcessExpert.ServiceCommunication.Caching.ObjectStore.GetObjects

These fields can be parsed using the following regular expressions:

Defining regular expressions for common fields
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s+\d+.\d+.\d+.\d+)\s+(?<Severity>[A-Z]+)\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

Because these logs exhibit consistent data patterns, 13 additional regular expressions are defined and assigned to constants, all ending in _REGEX, which together with the three defined above parse all of the possible fields these log sources might contain.

Defining regular expressions for fields associated with specific types of events
define COMMAND_REGEX    (?<Action>Command.*?)\s+(?<Command>SchneiderElectric.*?)\s+
define QUERY_REGEX      (?<Action>Query.*?)\s+(?<Query>SchneiderElectric.*?)\s+
define COMPONENT_REGEX  (?<Action>COMMIT|ONCLOSE)?\s*(?<Component>\w+)\#(?<ComponentNum>\d+)\s+
define ID_REGEX         Id\:\s*(?<Id>.*?)\s+
define EXCEPTN_REGEX    (?<Exception>[\w\.]+)\:\s+(?<Exception_Comment>.*?)[\s\-\>]*\
                        (?:(?<ExceptionAdd>[\w\.]+)\:\s+(?<ExceptionAdd_Comment>.*))?$

define HOSTPORT_REGEX   Host\:\s+(?<Host>.*?)\,\s+Port\:\s+(?<Port>\d+)\s+
define SUCCONN_REGEX    Success\:\s+(?<Success>\w+)\,\s+Connected\:\s+(?<Connected>\w+)\s+
define SERVURI_REGEX    ServiceURI\:\s+(?<ServiceURI>.*?)\s+
define HOSTNAME_REGEX   HostName\:\s+(?<HostName>.*?)\s+

define PROP_REGEX       (?<Properties>[A-Z]+\s*\d+)\s+

define GETOBJ_REGEX     \(ID\:(?<Id>\d+)\s+Time\:(?<Time>\d+\w+)\)\s+
define GETOBJX_REGEX    \(F\.Type\:(?<F_Type>\w+)\s+Count\:(?<Count>\d+)\s+\
                        Fetch\:(?<Fetch>\d+)\s+Time\:(?<Time>\d+\w+)\)\s+
define NOTIFCN_REGEX    NOTIFICATION\s+\[ID\:(?<NotificationId>[\w\-]+)\s+DESC\:\
                        (?<NotificationDesc>.*?)\]\s+

The File directive used in the from_file instance of the im_file module defines the path and filename(s) of the log file(s) to be parsed. Since NXLog supports the use of wildcards with this directive, the wildcard character (*) used in *Client*.log means that any file with a .log file extension in the specified directory having Client as part of its filename will be read and parsed.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. Fields that do not require additional processing are captured using the named capture groups defined in the 16 *_REGEX constants. Within these regular expressions, the field name enclosed in angle brackets (< >) determines which event field each captured value is assigned to. Parsing takes place when these regular expressions are invoked in the conditional statement that begins with if $raw_event =~.

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop() procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
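As a quick check of how these patterns decompose a record, the following Python sketch applies equivalents of HEADER_REGEX, HOSTPORT_REGEX, and SERVICE_REGEX to the EngClientCommunication.log sample. The patterns are approximate translations for illustration only:

```python
import re

# Approximate Python equivalents of HEADER_REGEX, HOSTPORT_REGEX, and SERVICE_REGEX
HEADER = r"^(\d+-\d+-\d+\s+\d+:\d+:\d+\.\d+)\s+(?P<Severity>[A-Z]+)\s+"
HOSTPORT = r"Host:\s+(?P<Host>.*?),\s+Port:\s+(?P<Port>\d+)\s+"
SERVICE = r"(?P<Service>SchneiderElectric\S+)"

line = ("2021-09-16 01:52:40.2137 TRACE Host: localhost, Port: 9950 "
        "SchneiderElectric.ProcessExpert.ServiceCommunication."
        "CommunicationServiceObject.IsServerReachable")

fields = re.match(HEADER + HOSTPORT + SERVICE, line).groupdict()
# fields now holds Severity, Host, Port, and Service
```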

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s+\d+.\d+.\d+.\d+)\s+(?<Severity>[A-Z]+)\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define COMMAND_REGEX    (?<Action>Command.*?)\s+(?<Command>SchneiderElectric.*?)\s+
define QUERY_REGEX      (?<Action>Query.*?)\s+(?<Query>SchneiderElectric.*?)\s+
define COMPONENT_REGEX  (?<Action>COMMIT|ONCLOSE)?\s*(?<Component>\w+)\#(?<ComponentNum>\d+)\s+
define ID_REGEX         Id\:\s*(?<Id>.*?)\s+
define EXCEPTN_REGEX    (?<Exception>[\w\.]+)\:\s+(?<Exception_Comment>.*?)[\s\-\>]*\
                        (?:(?<ExceptionAdd>[\w\.]+)\:\s+(?<ExceptionAdd_Comment>.*))?$

define HOSTPORT_REGEX   Host\:\s+(?<Host>.*?)\,\s+Port\:\s+(?<Port>\d+)\s+
define SUCCONN_REGEX    Success\:\s+(?<Success>\w+)\,\s+Connected\:\s+(?<Connected>\w+)\s+
define SERVURI_REGEX    ServiceURI\:\s+(?<ServiceURI>.*?)\s+
define HOSTNAME_REGEX   HostName\:\s+(?<HostName>.*?)\s+

define PROP_REGEX       (?<Properties>[A-Z]+\s*\d+)\s+

define GETOBJ_REGEX     \(ID\:(?<Id>\d+)\s+Time\:(?<Time>\d+\w+)\)\s+
define GETOBJX_REGEX    \(F\.Type\:(?<F_Type>\w+)\s+Count\:(?<Count>\d+)\s+\
                        Fetch\:(?<Fetch>\d+)\s+Time\:(?<Time>\d+\w+)\)\s+
define NOTIFCN_REGEX    NOTIFICATION\s+\[ID\:(?<NotificationId>[\w\-]+)\s+DESC\:\
                        (?<NotificationDesc>.*?)\]\s+

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\*Client*.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %COMMAND_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %QUERY_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %COMPONENT_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %ID_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX% %EXCEPTN_REGEX%/ or

        $raw_event =~ /%HEADER_REGEX% %HOSTPORT_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SUCCONN_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SERVURI_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %HOSTNAME_REGEX% %SERVICE_REGEX%/ or

        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %PROP_REGEX% %SERVICE_REGEX%/ or

        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %GETOBJ_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %GETOBJX_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %NOTIFCN_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard event if it doesn't match any regular expression
        else drop();
    </Exec>
</Input>

The following output samples show the processed messages in JSON format.

This event was parsed from the EngClient.log event sample.

Output sample of the EngClient.log in JSON format
{
  "EventReceivedTime": "2021-09-16T01:53:03.318546-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Action": "Query Invocation",
  "Message": "completed in 39 ms",
  "Query": "SchneiderElectric.ProcessExpert.Core.Queries.GetLicenseCategoryDetails",
  "Service": "SchneiderElectric.ProcessExpert.ClientEssentials.UIFramework.NLogLogger.Trace",
  "Severity": "TRACE",
  "EventTime": "2021-09-16T01:53:02.616900-07:00"
}

This event was parsed from the EngClientCommunication.log event sample.

Output sample of the EngClientCommunication.log in JSON
{
  "EventReceivedTime": "2021-09-16T01:52:40.306515-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Host": "localhost",
  "Port": "9950",
  "Service": "SchneiderElectric.ProcessExpert.ServiceCommunication.CommunicationServiceObject.IsServerReachable",
  "Severity": "TRACE",
  "EventTime": "2021-09-16T01:52:40.213700-07:00"
}

This event was parsed from the EngClientNotification.log event sample.

Output sample of the EngClientNotification.log in JSON
{
  "EventReceivedTime": "2021-09-16T00:15:16.342500-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Message": "Waiting for notification",
  "Properties": "SEQ 12",
  "Service": "SchneiderElectric.ProcessExpert.ServiceCommunication.NotificationReceiver.EnsureMessageOrder",
  "Severity": "INFO",
  "EventTime": "2021-09-16T00:15:15.342500-07:00"
}

This event was parsed from the EngClientObjectStore.log event sample.

Output sample of the EngClientObjectStore.log in JSON
{
  "EventReceivedTime": "2021-09-16T01:53:05.869078-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Count": "1",
  "F_Type": "Node",
  "Fetch": "1",
  "Message": "ObjectStore.GetObjects",
  "Service": "SchneiderElectric.ProcessExpert.ServiceCommunication.Caching.ObjectStore.GetObjects",
  "Severity": "TRACE",
  "Time": "63ms",
  "EventTime": "2021-09-16T01:53:05.181800-07:00"
}

Object store log

Example 5. Processing object store log

The following event sample is taken from ObjectStore.log. Although its event record structure is similar to those in the previous examples, it lacks a Message field and exhibits no variation in fields.

Object store log event sample
2021-09-17 11:40:17.8680 TRACE [T 38] Session [Id:e18a1ce1-cd3b-484b-a963-e00ca5a61035 System:Global User:Administrator (Administrator)] IObjectStoreService.GetObjects(F.Type:SchneiderElectric.ProcessExpert.Systems.Node, Count:1, Time:0ms) SchneiderElectric.ProcessExpert.FoundationServices.ObjectStoreService.GetObjects

Because these logs exhibit consistent data patterns, six regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. Fields that do not require additional processing are captured using the named capture groups defined in the six *_REGEX constants. Within these regular expressions, the field name enclosed in angle brackets (< >) determines which event field each captured value is assigned to. Parsing takes place when these regular expressions are invoked in the conditional statement that begins with if $raw_event =~.

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
# Regular expressions for the main part
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define SESSION_REGEX    (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s+
define GETOBJ_REGEX     (?<Method>\w+\.\w+)\((?:(?<Property_1>.*?)\,\s*(?<Property_2>[\w\.]+\:\d+))?\)\s+
define GETOBJX_REGEX    (?<Method>\w+\.\w+)\(F\.Type\:(?<F_Type>.*?)\,\s+Count\:(?<Count>\d+)\,\s+Time\:(?<Time>\d+\w+)\)\s+

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\ObjectStore.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %GETOBJX_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %GETOBJ_REGEX% %SERVICE_REGEX%/ or        
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

This event was parsed from the Object store log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-17T11:40:17.927585-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Count": "1",
  "EventId": "T 38",
  "F_Type": "SchneiderElectric.ProcessExpert.Systems.Node",
  "Method": "IObjectStoreService.GetObjects",
  "Service": "SchneiderElectric.ProcessExpert.FoundationServices.ObjectStoreService.GetObjects",
  "SessionId": "e18a1ce1-cd3b-484b-a963-e00ca5a61035",
  "Severity": "TRACE",
  "System": "Global",
  "Time": "0ms",
  "User": "Administrator (Administrator)",
  "EventTime": "2021-09-17T11:40:17.868000-07:00"
}

Audit trail log

Audit trail functionality allows the system server to connect to the syslog server and send audit trail messages.

Messages contain information that corresponds to user actions performed using engineering clients or the system server.

Typically, audit trail messages refer to a user action. These entries can be found in the notification panel of the engineering client.

The AuditTrail.log file stores status events from the audit trail service. These include starting/stopping the audit trail service, and connecting to the syslog server, among others.

Example 6. Processing audit trail log

The sample below represents the event structure of the audit trail log message, which is quite similar to the previously considered message types.

Audit trail log event sample
2021-09-19 11:38:00.2321 DEBUG [T 43] Session [Id:bf761b9b-d47e-42f5-b374-4284b1f0bd53 System:Global User:Administrator (Administrator)] Starting Audit Trail Windows Service... SchneiderElectric.ProcessExpert.FoundationServices.AuditTrailConfigurationService.ServerStatusChanged
Table 6. Fields common to the audit trail log
Field Data sample

Timestamp

2021-09-19 11:38:00.2321

Severity

DEBUG

EventId

T 43

Message

Session [Id:bf761b9b-d47e-42f5-b374-4284b1f0bd53 System:Global User:Administrator (Administrator)] Starting Audit Trail Windows Service…​

Service

SchneiderElectric.ProcessExpert.FoundationServices.AuditTrailConfigurationService.ServerStatusChanged

At the same time, messages may contain the following additional fields:

Table 7. Other audit trail log fields that may be present
Field Data sample

SessionId

bf761b9b-d47e-42f5-b374-4284b1f0bd53

System

Global

User

Administrator (Administrator)

Because these logs exhibit consistent data patterns, four regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups of the four *_REGEX constants. Within these regular expressions, field names are enclosed in angle brackets (< >), which determine the event field each captured value is assigned to. Invoking these regular expressions in the conditional statement that begins with if $raw_event =~ performs the parsing automatically.
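
The ordered fallback in the conditional statement (the most specific pattern first, the generic pattern last) can be sketched in Python; the patterns below are simplified illustrations, not the exact NXLog definitions:

```python
import re

HEADER = r"^(?P<Timestamp>\S+\s+\S+)\s+(?P<Severity>[A-Z]+)\s+\[(?P<EventId>.*?)\]\s+"
SESSION = r"Session\s+\[Id:(?P<SessionId>.*?)\s+System:(?P<System>.*?)\s+User:(?P<User>.*?)\]\s+"
MESSAGE = r"(?P<Message>.*?)\s+"
SERVICE = r"(?P<Service>SchneiderElectric\S+)$"

# Most specific pattern first, generic fallback last -- the same
# ordering the nxlog.conf conditional uses.
PATTERNS = [
    re.compile(HEADER + SESSION + MESSAGE + SERVICE),
    re.compile(HEADER + MESSAGE + SERVICE),
]

def parse(line):
    """Return parsed fields, or None to signal drop()."""
    for pattern in PATTERNS:
        match = pattern.match(line)
        if match:
            return match.groupdict()
    return None
```

A record that matches none of the patterns yields None, which plays the role of the drop() procedure.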

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define SESSION_REGEX    (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s*

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\AuditTrail.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

This event was parsed from the Audit trail log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-19T11:38:00.333899-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "EventId": "T 43",
  "Message": "Starting Audit Trail Windows Service...",
  "Service": "SchneiderElectric.ProcessExpert.FoundationServices.AuditTrailConfigurationService.ServerStatusChanged",
  "SessionId": "bf761b9b-d47e-42f5-b374-4284b1f0bd53",
  "Severity": "DEBUG",
  "System": "Global",
  "User": "Administrator (Administrator)",
  "EventTime": "2021-09-19T11:38:00.232100-07:00"
}

Communication log

Example 7. Processing communication log

The following event is taken from EcoStruxure Process Expert Communication.log.

Communication log event sample
2021-09-19 12:12:59.5253 DEBUG [T 16] ServiceURI: net.tcp://127.0.0.1:9950/StruxureWarePE/HostingService SchneiderElectric.ProcessExpert.ServiceCommunication.HostNameResolver.ReplaceHostNameWithIpAddress

Because these logs exhibit consistent data patterns, six regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups of the six *_REGEX constants. Within these regular expressions, field names are enclosed in angle brackets (< >), which determine the event field each captured value is assigned to. Invoking these regular expressions in the conditional statement that begins with if $raw_event =~ performs the parsing automatically.
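
As an illustration, a simplified Python rendering of the SERVURI_REGEX and SERVICE_REGEX constants (not the exact NXLog syntax) extracts the URI from the communication log event sample:

```python
import re

# The URI sits between the "ServiceURI:" label and the trailing
# service name; \S+ is a simplification of the lazy .*? in the
# original constant.
SERVURI = re.compile(
    r"ServiceURI:\s+(?P<ServiceURI>\S+)\s+(?P<Service>SchneiderElectric\S+)"
)

line = ("2021-09-19 12:12:59.5253 DEBUG [T 16] "
        "ServiceURI: net.tcp://127.0.0.1:9950/StruxureWarePE/HostingService "
        "SchneiderElectric.ProcessExpert.ServiceCommunication."
        "HostNameResolver.ReplaceHostNameWithIpAddress")

match = SERVURI.search(line)
print(match.group("ServiceURI"))
# net.tcp://127.0.0.1:9950/StruxureWarePE/HostingService
```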

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define ACTION_REGEX     (?<Action>\w+\.\w+)\s+
define SERVURI_REGEX    ServiceURI\:\s+(?<ServiceURI>.*?)\s+
define HOSTNAME_REGEX   HostName\:\s+(?<HostName>.*?)\s+

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\Communication.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %ACTION_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SERVURI_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %HOSTNAME_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

The output sample below shows the processed result in JSON format.

This event was parsed from the Communication log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-19T12:13:00.095747-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "EventId": "T 16",
  "Service": "SchneiderElectric.ProcessExpert.ServiceCommunication.HostNameResolver.ReplaceHostNameWithIpAddress",
  "ServiceURI": "net.tcp://127.0.0.1:9950/StruxureWarePE/HostingService",
  "Severity": "DEBUG",
  "EventTime": "2021-09-19T12:12:59.525300-07:00"
}

Data access log

Example 8. Processing data access log

The following event sample is taken from DataAccess.log.

Data access log event sample
2021-09-20 04:51:24.1155 TRACE [T 36] Session [Id:ab0d3e11-7a7d-4532-8d89-b5ec9ed00e44 System:Global User:Administrator (Administrator)] Transaction SubTask: <InitializeAfterStartup>b__3(3e09ce6d-4aac-4703-b4d9-26cdd4158066) took 0.079ms for saving SchneiderElectric.ProcessExpert.Data.Context.Commit

Because these logs exhibit consistent data patterns, 10 regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups of the 10 *_REGEX constants. Within these regular expressions, field names are enclosed in angle brackets (< >), which determine the event field each captured value is assigned to. Invoking these regular expressions in the conditional statement that begins with if $raw_event =~ performs the parsing automatically.
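
The lookbehind used by SUBTASK_REGEX can be demonstrated with a simplified Python equivalent run against the transaction fragment of the sample event:

```python
import re

# Python rendering of SUBTASK_REGEX. The lookbehind (?<=SubTask:\s)
# ensures SubTask is only captured when the literal "SubTask:" label
# precedes it.
SUBTASK = re.compile(
    r"Transaction\s+(?P<Transaction>.*?):\s+"
    r"(?P<SubTask>(?<=SubTask:\s).*?)\((?P<Id>.*?)\)"
)

line = ("Transaction SubTask: <InitializeAfterStartup>b__3"
        "(3e09ce6d-4aac-4703-b4d9-26cdd4158066) took 0.079ms for saving")

match = SUBTASK.search(line)
print(match.group("SubTask"))  # <InitializeAfterStartup>b__3
print(match.group("Id"))       # 3e09ce6d-4aac-4703-b4d9-26cdd4158066
```

The captured groups match the $Transaction, $SubTask, and $Id fields seen in the JSON output sample below.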

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define SESSION_REGEX    (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s+
define TRANSID_REGEX    Transaction\:(?<TransactionId>.*?)\,\s+
define LOADED_REGEX     Loaded\:(?<Loaded>\d+)\[Time\:(?<Loaded_Time>[\d\.]+\w+)\,\s+Data\:(?<Loaded_Data>.*?)\]\,?\s+
define SAVED_REGEX      Saved\:(?<Saved>\d+)\[Time\:(?<Saved_Time>[\d\.]+\w+)\,\s+Data\:(?<Saved_Data>.*?)\,\s+New\:(?<Saved_New>\d+)\]\s+
define SUBTASK_REGEX    Transaction\s+(?<Transaction>.*?)\:\s+(?<SubTask>(?<=SubTask\:\s).*?)\((?<Id>.*?)\)\s+
define COMMAND_REGEX    Transaction\s+(?<Transaction>\w+\.\w+.*?)\((?<Id>.*?)\)\s+
define PROP_REGEX       Transaction\s+(?<Transaction>.*?)\s+\((?<Property>.*?)\)\((?<Id>.*?)\)\s+

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\DataAccess.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %TRANSID_REGEX% %LOADED_REGEX% %SAVED_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %TRANSID_REGEX% %LOADED_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %SUBTASK_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %COMMAND_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %PROP_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %TRANSID_REGEX% %LOADED_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

The following output sample shows the processed event in JSON format.

This event was parsed from the Data access log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-20T04:51:24.750590-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "EventId": "T 36",
  "Id": "3e09ce6d-4aac-4703-b4d9-26cdd4158066",
  "Message": "took 0.079ms for saving",
  "Service": "SchneiderElectric.ProcessExpert.Data.Context.Commit",
  "SessionId": "ab0d3e11-7a7d-4532-8d89-b5ec9ed00e44",
  "Severity": "TRACE",
  "SubTask": "<InitializeAfterStartup>b__3",
  "System": "Global",
  "Transaction": "SubTask",
  "User": "Administrator (Administrator)",
  "EventTime": "2021-09-20T04:51:24.115500-07:00"
}

Virtual machine log

Example 9. Processing virtual machine log

The following event sample is taken from HybridDCS.Vm.2020R2#0.log.

Virtual machine log event sample
2021-09-20 11:36:04.9881 : Launch 5000 "c:\Pes\Vm\VBoxWrap.Guest.exe" 192.168.101.2 39181 Ok

Because these logs exhibit consistent data patterns, eight regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

This configuration uses the im_file module to collect file-based logs and the xm_multiline extension module to read multiline log records. To correctly process multiline event logs, a pattern needs to be defined as a regular expression that describes the header line of an event. In the following xm_multiline extension instance, the HeaderLine directive specifies the regular expression to be used for finding the header line of each event.

Enabling multiline event processing
<Extension multiline_vm>
    Module        xm_multiline
    # Parsing the header line
    HeaderLine    /\d+.\d+.\d+\s*\d+.\d+.\d+.\d+/
</Extension>
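
The grouping that xm_multiline performs with this HeaderLine pattern can be sketched in Python, with hypothetical sample lines:

```python
import re

# A line matching this pattern starts a new event; everything up to
# the next header belongs to the current event (mirrors the
# HeaderLine directive above).
HEADER_LINE = re.compile(r"^\d+.\d+.\d+\s*\d+.\d+.\d+.\d+")

def group_events(lines):
    """Group raw lines into multiline event records."""
    events, current = [], []
    for line in lines:
        if HEADER_LINE.match(line) and current:
            events.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        events.append("\n".join(current))
    return events

lines = [
    "2021-09-20 11:36:04.9881 : first event",
    "    continuation of first event",
    "2021-09-20 11:36:05.1200 : second event",
]
print(group_events(lines))
```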

In the from_file input instance, the InputType directive references the xm_multiline extension instance by name, multiline_vm, which enables the instance to establish the beginning and end of each event.

To aid in parsing, the regular expression substitution operator s/\\r\\n/ /g is invoked to replace all occurrences of the \r\n pair (CR/LF, i.e., hex 0D 0A) with a space (hex 20), thus converting multiline event records to single-line records. A second substitution, s/\s{2,}/ /g, then collapses any remaining runs of whitespace into a single space.
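
The effect of the two substitutions can be illustrated in Python on a hypothetical CR/LF-split record:

```python
import re

# The two substitutions from the Exec block: join CR/LF-split lines
# with a space, then collapse the leftover indentation whitespace.
raw_event = ("2021-09-20 11:36:04.9881 : Launch 5000\r\n"
             "    \"c:\\Pes\\Vm\\VBoxWrap.Guest.exe\" Ok")

raw_event = re.sub(r"\r\n", " ", raw_event)    # s/\r\n/ /g
raw_event = re.sub(r"\s{2,}", " ", raw_event)  # s/\s{2,}/ /g
print(raw_event)
```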

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups of the eight *_REGEX constants. Within these regular expressions, field names are enclosed in angle brackets (< >), which determine the event field each captured value is assigned to. Invoking these regular expressions in the conditional statement that begins with if $raw_event =~ performs the parsing automatically.

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)[\s\:]*
define MESSAGE_REGEX    (?<Message>.*)

define COPY_REGEX       (?<Action>.*?)\s+\"(?<File>.*?)\"\s+\"(?<Destination>.*?)\"\s+(?<Status>.*)
define LAUNCH_REGEX     (?<Action>.*?)\s+\"(?<Exec>.*?)\"\s+(?<IP>\d+\.\d+\.\d+\.\d+)\s+(?<Port>\d+)\s+(?<Status>.*)
define FILEEXITS_REGEX  (?<Action>.*?)\s+\"(?<FilePath>.*?)\"\s+(?<Status>.*)
define EXCEPTION_REGEX  (?<Exception>.*Exception.*?)\:\s+(?<ExceptionComment>.*?)\s+
define OBJNAME_REGEX    Object\s+name[\:\s\']*(?<ObjectName>.*?)[\'\.]*\s+
define ADDINFO_REGEX    (?<AddInfo>at\s+.*)

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension multiline_vm>
    Module        xm_multiline
    # Regular expression to look for the header of the message
    HeaderLine    /\d+.\d+.\d+\s*\d+.\d+.\d+.\d+/
</Extension>

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\HybridDCS.Vm.2020R2#0.log'
    InputType   multiline_vm

    <Exec>
        # Replaces unwanted characters
        $raw_event =~ s/\\r\\n/ /g;
        $raw_event =~ s/\s{2,}/ /g;
    </Exec>

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %EXCEPTION_REGEX% %OBJNAME_REGEX% %ADDINFO_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %COPY_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %LAUNCH_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %FILEEXITS_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %MESSAGE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

The following output sample shows the processed event in JSON format.

This event was parsed from the Virtual machine log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-20T11:45:15.464379-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Action": "Launch 5000",
  "Exec": "c:\\Pes\\Vm\\VBoxWrap.Guest.exe",
  "IP": "192.168.101.2",
  "Port": "39181",
  "Status": "Ok",
  "EventTime": "2021-09-20T11:36:04.988100-07:00"
}

Performance log

Example 10. Processing performance log

The event sample below is taken from Performance.log and exhibits the log file message structure, which is similar to the other EcoStruxure Process Expert log files.

Performance log event sample
2021-09-20 13:32:44.1836 TRACE [T 16] Session [Id:25d6030c-f7aa-419d-89af-181bfa1e4dd1 System:System_1 User:Administrator (Administrator)] BUILD ALL> Total 60.719s - Built 'Executable_1' SchneiderElectric.ProcessExpert.FoundationUtils.StopwatchLogger.Total

The event contains fields that are common for almost all EcoStruxure Process Expert log files:

Table 8. Fields common to the performance log
Field Data sample

Timestamp

2021-09-20 13:32:44.1836

Severity

TRACE

EventId

T 16

Message

Session [Id:25d6030c-f7aa-419d-89af-181bfa1e4dd1 System:System_1 User:Administrator (Administrator)] BUILD ALL> Total 60.719s - Built 'Executable_1' SchneiderElectric.ProcessExpert.FoundationUtils.StopwatchLogger.Total

The fields listed above can be parsed and processed by NXLog using the following regular expressions:

Defining regular expressions for common fields
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

At the same time, messages may contain the following additional fields:

Table 9. Other performance log fields that may be present
Field Data sample

SessionId

25d6030c-f7aa-419d-89af-181bfa1e4dd1

System

System_1

User

Administrator (Administrator)

Action

BUILD ALL

TotalTime

60.719s

Defining regular expressions for additional fields
define SESSION_REGEX    (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s+
define ACTION_REGEX     (?<Action>.*?)\>\s+
define ELAPSED_REGEX    (?:Elapsed\s+(?<ElapsedTime>[\w\.]+))[\s\-]+
define TOTAL_REGEX      (?:Total\s+(?<TotalTime>[\w\.]+))[\s\-]+
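
A simplified Python combination of ACTION_REGEX and TOTAL_REGEX (illustrative only, not the exact NXLog definitions) shows how the action and total time separate from the sample message:

```python
import re

# The action ends at ">", and the total time follows the "Total"
# keyword; the remainder becomes the message.
ACTION_TOTAL = re.compile(
    r"(?P<Action>[A-Z ]+?)>\s+Total\s+(?P<TotalTime>[\w.]+)[\s\-]+(?P<Message>.*)"
)

line = "BUILD ALL> Total 60.719s - Built 'Executable_1'"
match = ACTION_TOTAL.search(line)
print(match.group("Action"))     # BUILD ALL
print(match.group("TotalTime"))  # 60.719s
```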

Because these logs exhibit consistent data patterns, seven regular expressions are defined and assigned to constants, all ending in _REGEX, for parsing all of the possible fields this log source might contain.

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups of the seven *_REGEX constants. Within these regular expressions, field names are enclosed in angle brackets (< >), which determine the event field each captured value is assigned to. Invoking these regular expressions in the conditional statement that begins with if $raw_event =~ performs the parsing automatically.

The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime field.

The drop procedure discards records that do not match any of the regular expressions used in the conditional statement.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+
define MESSAGE_REGEX    (?<Message>.*?)\s*
define SERVICE_REGEX    (?<Service>SchneiderElectric.*?)\s+

define SESSION_REGEX    (?:Session\s*\[Id\:(?<SessionId>.*?)\s*System\:(?<System>.*?)\s*User\:(?<User>.*?)\])\s+
define ACTION_REGEX     (?<Action>.*?)\>\s+
define ELAPSED_REGEX    (?:Elapsed\s+(?<ElapsedTime>[\w\.]+))[\s\-]+
define TOTAL_REGEX      (?:Total\s+(?<TotalTime>[\w\.]+))[\s\-]+

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module        xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\Performance.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %ACTION_REGEX% %ELAPSED_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/ or
        $raw_event =~ /%HEADER_REGEX% %SESSION_REGEX% %ACTION_REGEX% %TOTAL_REGEX% %MESSAGE_REGEX% %SERVICE_REGEX%/
        {
            # Creates the timestamp
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any of the regular expressions
        else drop();
    </Exec>
</Input>

The following output sample shows the processed event in JSON format.

This event was parsed from the Performance log event sample.

Output sample in JSON format
{
  "EventReceivedTime": "2021-09-20T13:32:44.916056-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Action": "BUILD ALL",
  "EventId": "T 16",
  "Message": "Built 'Executable_1'",
  "Service": "SchneiderElectric.ProcessExpert.FoundationUtils.StopwatchLogger.Total",
  "SessionId": "25d6030c-f7aa-419d-89af-181bfa1e4dd1",
  "Severity": "TRACE",
  "System": "System_1",
  "TotalTime": "60.719s",
  "User": "Administrator (Administrator)",
  "EventTime": "2021-09-20T13:32:44.183600-07:00"
}

System information log

The EcoStruxure Process Expert SysInfo.log file provides a detailed overview of the main system components, such as the operating system, CPUs, physical memory, disk controller, and video controller. The log file is updated once the system server is started.

Example 11. Processing the system information log

The following event samples are taken from SysInfo.log and represent different entries, single-line and multiline.

Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expressions defined by HEADER_REGEX, PROPVAL_REGEX, and DRIVE_REGEX:

System information log input sample, single-line event
2021-09-25 22:07:47.0444 TRACE [T 10] ENVIRONMENT PARAM	SystemDirectory = C:\Windows\system32
Defining the HEADER_REGEX regular expression for common fields
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+(?<EventType>.*?)\t+\
                        (?<Property>.*?)\s+\=\s+(?<Value>.*)

Using HEADER_REGEX, the following fields can be parsed.

Table 10. Fields common to the system information log
Field Data sample

Timestamp

2021-09-25 22:07:47.0444

Severity

TRACE

EventId

T 10

EventType

ENVIRONMENT PARAM

Property

SystemDirectory

Value

C:\Windows\system32

System information log input sample with drive information, single-line event
2021-09-25 22:21:50.1003 TRACE [T 40] ENVIRONMENT PARAM	Drive Z:\ Properties = Volume Label : Shared Folders, Drive Type : Network, Drive Format : HGFS, Total Size : 208784060416, Available Free Space : 108368785408
Defining the DRIVE_REGEX regular expression for fields containing drive information
define DRIVE_REGEX      Volume\s+Label[\s\:]+(?<VolumeLabel>.*?)\,\s+Drive\s+Type[\s\:]+(?<DriveType>.*?)\,\s+\
                        Drive\s+Format[\s\:]+(?<DriveFormat>.*?)\,\s+Total\s+Size[\s\:]+(?<TotalSize>.*?)\,\s+\
                        Available\s+Free\s+Space[\s\:]+(?<AvailableFreeSpace>.*)

Using DRIVE_REGEX, the following fields can be parsed.

Table 11. Fields specific to events containing drive information
Field Data sample

VolumeLabel

Shared Folders

DriveType

Network

DriveFormat

HGFS

TotalSize

208784060416

AvailableFreeSpace

108368785408

SysInfo.log can also contain multiline events like the following one.

System information log input sample, event
2021-09-25 22:21:50.1927 TRACE [T 40] ENVIRONMENT PARAM	Operating System =
			BootDevice : \Device\HarddiskVolume2
			BuildNumber : 14393
			BuildType : Multiprocessor Free
			Caption : Microsoft Windows Server 2016 Standard
			CodeSet : 1252
			CountryCode : 1
			CreationClassName : Win32_OperatingSystem
			CSCreationClassName : Win32_ComputerSystem
			CSDVersion :
			CSName : WIN-0NDMK54PLPR
			CurrentTimeZone : -420
			DataExecutionPrevention_32BitApplications : True
			DataExecutionPrevention_Available : True
			DataExecutionPrevention_Drivers : True
			DataExecutionPrevention_SupportPolicy : 3
			Debug : False
			Description :
			Distributed : False
			EncryptionLevel : 256
			ForegroundApplicationBoost : 2
			FreePhysicalMemory : 5790392
			FreeSpaceInPagingFiles : 1228800
			FreeVirtualMemory : 7034056
			InstallDate : 20210909051158.000000-420
			LargeSystemCache :
			LastBootUpTime : 20210925205239.486214-420
			LocalDateTime : 20210925222150.137000-420
			Locale : 0409
			Manufacturer : Microsoft Corporation
			MaxNumberOfProcesses : 4294967295
			MaxProcessMemorySize : 137438953344
			MUILanguages : System.String[]
			Name : Microsoft Windows Server 2016 Standard|C:\Windows|\Device\Harddisk0\Partition4
			NumberOfLicensedUsers : 0
			NumberOfProcesses : 92
			NumberOfUsers : 1
			OperatingSystemSKU : 7
			Organization :
			OSArchitecture : 64-bit
			OSLanguage : 1033
			OSProductSuite : 272
			OSType : 18
			OtherTypeDescription :
			PAEEnabled :
			PlusProductID :
			PlusVersionNumber :
			PortableOperatingSystem : False
			Primary : True
			ProductType : 3
			RegisteredUser : Windows User
			SerialNumber : 00377-60000-00000-AA027
			ServicePackMajorVersion : 0
			ServicePackMinorVersion : 0
			SizeStoredInPagingFiles : 1310720
			Status : OK
			SuiteMask : 272
			SystemDevice : \Device\HarddiskVolume4
			SystemDirectory : C:\Windows\system32
			SystemDrive : C:
			TotalSwapSpaceSize :
			TotalVirtualMemorySize : 9698268
			TotalVisibleMemorySize : 8387548
			Version : 10.0.14393
			WindowsDirectory : C:\Windows

Normally, a long and complex regular expression would be required to parse such a large number of fields. However, a simple but effective approach is to use PROPVAL_REGEX and parse each line of this multiline event as individual events having only two fields, $Property and $Value.

Defining the PROPVAL_REGEX regular expression for all other fields
define PROPVAL_REGEX    (?x)^\t+(?<Property>.*?)\s+\:\s+(?<Value>.*)
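
This per-line approach can be sketched in Python with a simplified equivalent of PROPVAL_REGEX:

```python
import re

# Each indented "Property : Value" line of the multiline record
# parses independently into two fields.
PROPVAL = re.compile(r"^\t+(?P<Property>.*?)\s+:\s+(?P<Value>.*)")

lines = [
    "\t\t\tBootDevice : \\Device\\HarddiskVolume2",
    "\t\t\tBuildNumber : 14393",
]

properties = {}
for line in lines:
    match = PROPVAL.match(line)
    if match:
        properties[match.group("Property")] = match.group("Value")
print(properties)
```

Collecting the matches into a dictionary yields one $Property/$Value pair per line, with no dependence on HEADER_REGEX or DRIVE_REGEX.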

Using PROPVAL_REGEX, the following fields can be parsed from multiline event records without any dependence on HEADER_REGEX or DRIVE_REGEX.

Table 12. Multiline event records contain primarily two fields that can be parsed
Field Data sample

Property

BootDevice

Value

\Device\HarddiskVolume2
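To illustrate the approach, the following Python sketch parses one indented property line like the ones in the multiline event above. It is a translation of PROPVAL_REGEX, assuming only that NXLog's `(?<name>...)` named groups are written as `(?P<name>...)` in Python:

```python
import re

# Python translation of PROPVAL_REGEX; NXLog's (?<name>...) named groups
# become (?P<name>...) in Python syntax.
PROPVAL_REGEX = re.compile(r"(?x)^\t+(?P<Property>.*?)\s+:\s+(?P<Value>.*)")

# One tab-indented property line from the multiline event sample
line = "\t\t\tBootDevice : \\Device\\HarddiskVolume2"
m = PROPVAL_REGEX.match(line)
print(m.group("Property"))  # BootDevice
print(m.group("Value"))     # \Device\HarddiskVolume2
```

Because the lazy `(?P<Property>.*?)` group stops at the first whitespace-delimited colon, property names containing spaces are still captured intact.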

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined by HEADER_REGEX, PROPVAL_REGEX, or DRIVE_REGEX. Each field name, enclosed in angle brackets (< >), determines which event field the captured value is assigned to. In this example, depending on which of the three regular expressions an event matches, the event record is enriched with a subset of the following fields: $EventId, $EventType, $Severity, $Property, and $Value. If the event contains drive information, the following fields are also added: $VolumeLabel, $DriveType, $DriveFormat, $TotalSize, and $AvailableFreeSpace.

The parsedate() function is called to convert the captured timestamp to a datetime value, which it assigns to the $EventTime field.

The drop() procedure discards records that match none of the HEADER_REGEX, PROPVAL_REGEX, and DRIVE_REGEX regular expressions.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define HEADER_REGEX     (?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?<Severity>[A-Z]+)\s*\[(?<EventId>.*?)\]\s+(?<EventType>.*?)\t+\
                        (?<Property>.*?)\s+\=\s+(?<Value>.*)
define PROPVAL_REGEX    (?x)^\t+(?<Property>.*?)\s+\:\s+(?<Value>.*)

define DRIVE_REGEX      Volume\s+Label[\s\:]+(?<VolumeLabel>.*?)\,\s+Drive\s+Type[\s\:]+(?<DriveType>.*?)\,\s+\
                        Drive\s+Format[\s\:]+(?<DriveFormat>.*?)\,\s+Total\s+Size[\s\:]+(?<TotalSize>.*?)\,\s+\
                        Available\s+Free\s+Space[\s\:]+(?<AvailableFreeSpace>.*)

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\Users\Administrator\AppData\Roaming\Schneider Electric\Process Expert 2020 R2\Logs

<Extension json>
    Module      xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\SysInfo.log'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ /%HEADER_REGEX% %DRIVE_REGEX%/ or
           $raw_event =~ /%HEADER_REGEX%/
        {
            # Creates the timestamps
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        else if $raw_event =~ /%PROPVAL_REGEX%/
        {
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any regular expression
        else drop();
    </Exec>
</Input>

The following output samples depict the processed events in JSON format.

Output sample of the single line event in JSON format
{
  "EventReceivedTime": "2021-09-25T22:07:47.758428-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "EventId": "T 10",
  "EventType": "ENVIRONMENT PARAM",
  "Property": "SystemDirectory",
  "Severity": "TRACE",
  "Value": "C:\\Windows\\system32",
  "EventTime": "2021-09-25T22:07:47.044400-07:00"
}
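For reference, a hypothetical single-line entry consistent with the JSON output above can be parsed with a Python translation of HEADER_REGEX. The input line below is an illustrative reconstruction from the output fields, not an actual excerpt of SysInfo.log; the named groups use Python's `(?P<name>...)` syntax:

```python
import re

# Python translation of HEADER_REGEX; NXLog's (?<name>...) named groups
# become (?P<name>...) in Python.
HEADER_REGEX = re.compile(
    r"(?x)^(\d+.\d+.\d+\s*\d+.\d+.\d+.\d+)\s*(?P<Severity>[A-Z]+)\s*"
    r"\[(?P<EventId>.*?)\]\s+(?P<EventType>.*?)\t+"
    r"(?P<Property>.*?)\s+=\s+(?P<Value>.*)"
)

# Hypothetical log line reconstructed from the output sample fields
line = ("2021.09.25 22:07:47.044 TRACE [T 10] ENVIRONMENT PARAM"
        "\tSystemDirectory = C:\\Windows\\system32")
m = HEADER_REGEX.match(line)
print(m.group("Severity"), m.group("EventId"), m.group("Property"))
```

The first unnamed group captures the timestamp, which the NXLog configuration passes to parsedate() as $1.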
Output sample of the drive info event in JSON format
{
  "EventReceivedTime": "2021-09-25T22:21:50.733777-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "AvailableFreeSpace": "108368785408",
  "DriveFormat": "HGFS",
  "DriveType": "Network",
  "EventId": "T 40",
  "EventType": "ENVIRONMENT PARAM",
  "Property": "Drive Z:\\ Properties",
  "Severity": "TRACE",
  "TotalSize": "208784060416",
  "Value": "",
  "VolumeLabel": "Shared Folders",
  "EventTime": "2021-09-25T22:21:50.100300-07:00"
}

This event was parsed from the System information log input sample.

Output sample of the multiline event header in JSON format
{
  "EventReceivedTime": "2021-09-25T22:21:50.734408-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "EventId": "T 40",
  "EventType": "ENVIRONMENT PARAM",
  "Property": "Operating System",
  "Severity": "TRACE",
  "Value": "",
  "EventTime": "2021-09-25T22:21:50.192700-07:00"
}

This event was parsed from the System information log input sample.

Output sample of the multiline event property in JSON format
{
  "EventReceivedTime": "2021-09-25T22:21:50.734408-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Property": "BootDevice",
  "Value": "\\Device\\HarddiskVolume2"
}

Migration log

The MigrationLogFile.txt file contains information related to the database migration process. The log file is located in the C:\ProgramData\Schneider Electric\Process Expert 2020 R2\Db folder.

Example 12. Processing migration log

The following event samples are taken from MigrationLogFile.txt and represent two types of input entries.

The first event type contains the following list of fields:

Table 13. Migration log event type 1 fields that can be parsed
Field Data sample

Severity

[Info]

Timestamp

(09/27/2021 04:51:53)

Message

Database Global : Message 0 of 1 Global Templates converted successfully : Domain : ContentRepository

Input sample of event type 1
[Info] (09/27/2021 04:51:53) Database Global : Message 0 of 1 Global Templates converted successfully : Domain : ContentRepository

The following fields can be parsed from the second event type:

Table 14. Migration log event type 2 fields that can be parsed
Field Data sample

Timestamp

9/27/2021 4:51:53 AM

Message

Database migration for database System_1 sucessful : 0 Error(s) , 0 Warning(s) and 3 Information

Input sample of event type 2
9/27/2021 4:51:53 AM Database migration for database System_1 sucessful : 0 Error(s) , 0 Warning(s) and 3 Information

Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expressions defined by MIGR_TYPE1_REGEX and MIGR_TYPE2_REGEX:

Defining the MIGR_TYPE1_REGEX and MIGR_TYPE2_REGEX regular expressions
define MIGR_TYPE1_REGEX /(?x)^\[(?<Severity>\w+)?\]\s+\((\d+.\d+.\d+\s+\d+.\d+.\d+)\)\s*(?<Message>.*)/
define MIGR_TYPE2_REGEX /(?x)^(\d+\/\d+\/\d+\s+\d+\:\d+\:\d+\s+\w+)\s+(?<Message>.*)/
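The two input samples above can be checked against Python translations of these regular expressions. This is a minimal sketch, assuming only the syntax translation from NXLog's `(?<name>...)` groups to Python's `(?P<name>...)`, and that NXLog's `%T` strptime format equals `%H:%M:%S`:

```python
import re
from datetime import datetime

# Python translations of MIGR_TYPE1_REGEX and MIGR_TYPE2_REGEX
MIGR_TYPE1 = re.compile(
    r"(?x)^\[(?P<Severity>\w+)?\]\s+\((\d+.\d+.\d+\s+\d+.\d+.\d+)\)\s*(?P<Message>.*)")
MIGR_TYPE2 = re.compile(
    r"(?x)^(\d+/\d+/\d+\s+\d+:\d+:\d+\s+\w+)\s+(?P<Message>.*)")

# Input sample of event type 1
line1 = ("[Info] (09/27/2021 04:51:53) Database Global : Message 0 of 1 "
         "Global Templates converted successfully : Domain : ContentRepository")
m1 = MIGR_TYPE1.match(line1)
# Group 2 is the unnamed timestamp group; %T in NXLog equals %H:%M:%S
t1 = datetime.strptime(m1.group(2), "%m/%d/%Y %H:%M:%S")

# Input sample of event type 2 (the 12-hour timestamp includes AM/PM)
line2 = ("9/27/2021 4:51:53 AM Database migration for database System_1 "
         "sucessful : 0 Error(s) , 0 Warning(s) and 3 Information")
m2 = MIGR_TYPE2.match(line2)
t2 = datetime.strptime(m2.group(1), "%m/%d/%Y %I:%M:%S %p")

print(m1.group("Severity"), t1)
print(t2, m2.group("Message")[:18])
```

Both samples resolve to the same timestamp, which matches the $EventTime values in the JSON output samples further below.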

The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in MIGR_TYPE1_REGEX or MIGR_TYPE2_REGEX. Each field name, enclosed in angle brackets (< >), determines which event field the captured value is assigned to. In this example, depending on which of the two regular expressions an event matches, the event record is enriched with a subset of the following fields: $Severity and $Message.

The strptime() and parsedate() functions are called to convert the captured timestamps to datetime values assigned to the $EventTime field.

The drop() procedure discards records that match neither the MIGR_TYPE1_REGEX nor the MIGR_TYPE2_REGEX regular expression.

Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.

The following NXLog configuration combines all the steps described above.

nxlog.conf
# Regular expressions defined as constants to read the content of the logs
define MIGR_TYPE1_REGEX /(?x)^\[(?<Severity>\w+)?\]\s+\((\d+.\d+.\d+\s+\d+.\d+.\d+)\)\s*(?<Message>.*)/
define MIGR_TYPE2_REGEX /(?x)^(\d+\/\d+\/\d+\s+\d+\:\d+\:\d+\s+\w+)\s+(?<Message>.*)/

# Part of the log path defined as a constant
define PE_LOG_PATH      C:\ProgramData\Schneider Electric\Process Expert 2020 R2\Db

<Extension json>
    Module      xm_json
</Extension>

<Input from_file>
    Module      im_file
    File        '%PE_LOG_PATH%\MigrationLogFile.txt'

    <Exec>
        # Matches the events with a regular expression
        if $raw_event =~ %MIGR_TYPE1_REGEX%
        {
            # Creates the timestamps
            $EventTime = strptime($2,'%m/%d/%Y %T');
            # Formats the result as JSON
            to_json();
        }
        else if $raw_event =~ %MIGR_TYPE2_REGEX%
        {
            # Creates the timestamps
            $EventTime = parsedate($1);
            # Formats the result as JSON
            to_json();
        }
        # Discard the event if it doesn't match any regular expression
        else drop();
    </Exec>
</Input>

The following output sample represents a processed event in JSON format.

This event was parsed from the Input sample of event type 1.

Output sample of event type 1 in JSON format
{
  "EventReceivedTime": "2021-09-27T04:51:54.372805-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Message": "Database Global : Message 0 of 1 Global Templates converted successfully : Domain : ContentRepository",
  "Severity": "Info",
  "EventTime": "2021-09-27T04:51:53.000000-07:00"
}

This event was parsed from the Input sample of event type 2.

Output sample of event type 2 in JSON format
{
  "EventReceivedTime": "2021-09-27T04:51:54.372805-07:00",
  "SourceModuleName": "from_file",
  "SourceModuleType": "im_file",
  "Message": "Database migration for database System_1 sucessful : 0 Error(s) , 0 Warning(s) and 3 Information",
  "EventTime": "2021-09-27T04:51:53.000000-07:00"
}

Passive network monitoring

EcoStruxure Process Expert DCS allows building a control system from predefined objects integrated into tested, validated, and documented libraries.

The communication functionality provided by the integrated libraries allows the integration of Schneider Electric and third-party devices into the EcoStruxure Process Expert DCS using a variety of open and proprietary protocols, making the data of connected devices available to other control systems.

Modbus TCP

Modbus is an open, simple, and robust industrial communication standard. It has been combined with Ethernet to create the Modbus TCP communication protocol.

Modbus TCP is an open Ethernet protocol that does not require any license to develop a connection and can be used with any product supporting TCP/IP. Vendor support for Modbus TCP is well-established and there are numerous products currently using this protocol.

Since the Modbus TCP application layer is identical to the Modbus serial link, there is no need to convert protocols for routing between networks. Like the TCP/IP protocol it was built upon, Modbus TCP also supports IP routing.

Example 13. Capturing Process Expert Modbus TCP packets

The Dev directive of the im_pcap module specifies the network interface. The Protocol group directive defines the Modbus protocol to monitor. The Exec block of im_pcap calls the to_json() procedure of the xm_json module to convert messages to JSON.

nxlog.conf
<Extension _json>
    Module        xm_json
</Extension>

<Input pcap_modbus>
    Module        im_pcap
    # Name of a network device/interface
    Dev           \Device\NPF_{B89D13DD-37BB-4799-BFCD-6C9D326C481A}
    <Protocol>
        # Protocol type
        Type      modbus
    </Protocol>
    # Conversion to JSON
    Exec          to_json();
</Input>
Modbus TCP query sample
{
  "modbus.function_code": "Read Holding Registers (03)",
  "modbus.length": "6",
  "modbus.prot_id": "0",
  "modbus.query.read_holding_regs.qty_of_regs": "5",
  "modbus.query.read_holding_regs.starting_address": "0",
  "modbus.trans_id": "80",
  "modbus.unit_id": "1",
  "EventTime": "2021-10-07T05:16:15.251715-07:00",
  "EventReceivedTime": "2021-10-07T05:16:15.293361-07:00",
  "SourceModuleName": "pcap_modbus",
  "SourceModuleType": "im_pcap"
}
Modbus TCP response sample
{
  "modbus.function_code": "Read Holding Registers (03)",
  "modbus.length": "13",
  "modbus.prot_id": "0",
  "modbus.response.read_holding_regs.byte_count": "10",
  "modbus.response.read_holding_regs.registers": "216, 199, 193, 179, 162",
  "modbus.trans_id": "80",
  "modbus.unit_id": "1",
  "EventTime": "2021-10-07T05:16:15.262300-07:00",
  "EventReceivedTime": "2021-10-07T05:16:16.280934-07:00",
  "SourceModuleName": "pcap_modbus",
  "SourceModuleType": "im_pcap"
}
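The query sample above corresponds to a 12-byte Modbus TCP frame: a 7-byte MBAP header (transaction ID, protocol ID, length, unit ID) followed by the PDU. As a hedged illustration of the wire format, not of how im_pcap works internally, the following Python sketch rebuilds that request from the captured field values:

```python
import struct

# Field values taken from the Modbus TCP query sample above
trans_id, prot_id, unit_id = 80, 0, 1
function_code = 0x03          # Read Holding Registers
start_addr, qty = 0, 5        # starting_address and qty_of_regs

# PDU: function code plus two big-endian 16-bit arguments
pdu = struct.pack(">BHH", function_code, start_addr, qty)
# MBAP "length" counts the unit identifier plus the PDU bytes
length = len(pdu) + 1
frame = struct.pack(">HHHB", trans_id, prot_id, length, unit_id) + pdu
print(frame.hex())  # 005000000006010300000005
```

Note that the computed length of 6 matches the "modbus.length" field reported in the query sample.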

DNP3

The distributed network protocol (DNP3) is an open standard for interoperability between master stations, substation devices, Remote Terminal Units (RTUs), and Intelligent Electronic Devices (IEDs). DNP3 is widely used in North America by the oil and gas, utilities, and transportation industries.

The protocol was originally developed for serial communications, but the current DNP3 IP version supports TCP/IP-based networking.

Example 14. Capturing Process Expert DNP3 packets

This NXLog configuration uses the im_pcap module for passive network monitoring. The Dev directive of this module specifies the network interface. The Protocol group directive defines the protocol to capture. The Exec block of im_pcap calls the to_json() procedure of the xm_json module to convert messages to JSON.

nxlog.conf
<Extension _json>
    Module        xm_json
</Extension>

<Input pcap_dnp3>
    Module        im_pcap
    # Name of a network device/interface
    Dev           \Device\NPF_{B89D13DD-37BB-4799-BFCD-6C9D326C481A}
    <Protocol>
        # Protocol type
        Type      dnp3
    </Protocol>
    # Conversion to JSON
    Exec          to_json();
</Input>
DNP3 unsolicited response
{
  "dnp3.application_layer.control.con": "1",
  "dnp3.application_layer.control.fin": "1",
  "dnp3.application_layer.control.fir": "1",
  "dnp3.application_layer.control.sequence": "7",
  "dnp3.application_layer.control.uns": "1",
  "dnp3.application_layer.function_code": "Unsolicited Response",
  "dnp3.application_layer.internal_indications.already_executing": "0",
  "dnp3.application_layer.internal_indications.broadcast": "0",
  "dnp3.application_layer.internal_indications.class1_events": "0",
  "dnp3.application_layer.internal_indications.class2_events": "0",
  "dnp3.application_layer.internal_indications.class3_events": "0",
  "dnp3.application_layer.internal_indications.config_corrupt": "0",
  "dnp3.application_layer.internal_indications.device_restart": "0",
  "dnp3.application_layer.internal_indications.device_trouble": "0",
  "dnp3.application_layer.internal_indications.events_buffer_overflow": "0",
  "dnp3.application_layer.internal_indications.local_control": "0",
  "dnp3.application_layer.internal_indications.need_time": "0",
  "dnp3.application_layer.internal_indications.no_func_code_support": "0",
  "dnp3.application_layer.internal_indications.object_unknown": "0",
  "dnp3.application_layer.internal_indications.parameter_error": "0",
  "dnp3.application_layer.internal_indications.reserved": "0 (expected 0)",
  "dnp3.application_layer.object0.count": "1",
  "dnp3.application_layer.object0.group": "32",
  "dnp3.application_layer.object0.name": "Analog input event - single-precision, floating-point with time",
  "dnp3.application_layer.object0.point0.flags": "[ONLINE]",
  "dnp3.application_layer.object0.point0.index": "1",
  "dnp3.application_layer.object0.point0.time_of_occurance": "1633867737071",
  "dnp3.application_layer.object0.point0.value": "11876.329102",
  "dnp3.application_layer.object0.range": "2-octet count of objects",
  "dnp3.application_layer.object0.variation": "7",
  "dnp3.data_layer.control": "0x44",
  "dnp3.data_layer.control.dir": "0",
  "dnp3.data_layer.control.fcb": "0",
  "dnp3.data_layer.control.fcv": "0",
  "dnp3.data_layer.control.function_code": "Unconfirmed User Data",
  "dnp3.data_layer.control.prm": "1",
  "dnp3.data_layer.destination": "3",
  "dnp3.data_layer.length": "28",
  "dnp3.data_layer.source": "4",
  "dnp3.data_layer.start_bytes": "0x0564",
  "dnp3.transport.fin": "1",
  "dnp3.transport.fir": "1",
  "dnp3.transport.sequence": "12",
  "EventTime": "2021-10-10T02:08:57.076909-07:00",
  "EventReceivedTime": "2021-10-10T02:08:57.210239-07:00",
  "SourceModuleName": "pcap_dnp3",
  "SourceModuleType": "im_pcap"
}
DNP3 confirm
{
  "dnp3.application_layer.control.con": "0",
  "dnp3.application_layer.control.fin": "1",
  "dnp3.application_layer.control.fir": "1",
  "dnp3.application_layer.control.sequence": "7",
  "dnp3.application_layer.control.uns": "1",
  "dnp3.application_layer.function_code": "Confirm",
  "dnp3.data_layer.control": "0xC4",
  "dnp3.data_layer.control.dir": "1",
  "dnp3.data_layer.control.fcb": "0",
  "dnp3.data_layer.control.fcv": "0",
  "dnp3.data_layer.control.function_code": "Unconfirmed User Data",
  "dnp3.data_layer.control.prm": "1",
  "dnp3.data_layer.destination": "4",
  "dnp3.data_layer.length": "8",
  "dnp3.data_layer.source": "3",
  "dnp3.data_layer.start_bytes": "0x0564",
  "dnp3.transport.fin": "1",
  "dnp3.transport.fir": "1",
  "dnp3.transport.sequence": "7",
  "EventTime": "2021-10-10T02:08:57.186236-07:00",
  "EventReceivedTime": "2021-10-10T02:08:57.211453-07:00",
  "SourceModuleName": "pcap_dnp3",
  "SourceModuleType": "im_pcap"
}
DNP3 read data objects request
{
  "dnp3.application_layer.control.con": "0",
  "dnp3.application_layer.control.fin": "1",
  "dnp3.application_layer.control.fir": "1",
  "dnp3.application_layer.control.sequence": "14",
  "dnp3.application_layer.control.uns": "0",
  "dnp3.application_layer.function_code": "Read",
  "dnp3.application_layer.object0.count": "0",
  "dnp3.application_layer.object0.group": "60",
  "dnp3.application_layer.object0.name": "Class objects - Class 1 data",
  "dnp3.application_layer.object0.variation": "2",
  "dnp3.application_layer.object1.count": "0",
  "dnp3.application_layer.object1.group": "60",
  "dnp3.application_layer.object1.name": "Class objects - Class 2 data",
  "dnp3.application_layer.object1.variation": "3",
  "dnp3.application_layer.object2.count": "0",
  "dnp3.application_layer.object2.group": "60",
  "dnp3.application_layer.object2.name": "Class objects - Class 3 data",
  "dnp3.application_layer.object2.variation": "4",
  "dnp3.data_layer.control": "0xC4",
  "dnp3.data_layer.control.dir": "1",
  "dnp3.data_layer.control.fcb": "0",
  "dnp3.data_layer.control.fcv": "0",
  "dnp3.data_layer.control.function_code": "Unconfirmed User Data",
  "dnp3.data_layer.control.prm": "1",
  "dnp3.data_layer.destination": "4",
  "dnp3.data_layer.length": "17",
  "dnp3.data_layer.source": "3",
  "dnp3.data_layer.start_bytes": "0x0564",
  "dnp3.transport.fin": "1",
  "dnp3.transport.fir": "1",
  "dnp3.transport.sequence": "6",
  "EventTime": "2021-10-10T02:08:56.986131-07:00",
  "EventReceivedTime": "2021-10-10T02:08:57.210239-07:00",
  "SourceModuleName": "pcap_dnp3",
  "SourceModuleType": "im_pcap"
}
DNP3 data objects response
{
  "dnp3.application_layer.control.con": "0",
  "dnp3.application_layer.control.fin": "1",
  "dnp3.application_layer.control.fir": "1",
  "dnp3.application_layer.control.sequence": "14",
  "dnp3.application_layer.control.uns": "0",
  "dnp3.application_layer.function_code": "Response",
  "dnp3.application_layer.internal_indications.already_executing": "0",
  "dnp3.application_layer.internal_indications.broadcast": "0",
  "dnp3.application_layer.internal_indications.class1_events": "0",
  "dnp3.application_layer.internal_indications.class2_events": "0",
  "dnp3.application_layer.internal_indications.class3_events": "0",
  "dnp3.application_layer.internal_indications.config_corrupt": "0",
  "dnp3.application_layer.internal_indications.device_restart": "0",
  "dnp3.application_layer.internal_indications.device_trouble": "0",
  "dnp3.application_layer.internal_indications.events_buffer_overflow": "0",
  "dnp3.application_layer.internal_indications.local_control": "0",
  "dnp3.application_layer.internal_indications.need_time": "0",
  "dnp3.application_layer.internal_indications.no_func_code_support": "0",
  "dnp3.application_layer.internal_indications.object_unknown": "0",
  "dnp3.application_layer.internal_indications.parameter_error": "0",
  "dnp3.application_layer.internal_indications.reserved": "0 (expected 0)",
  "dnp3.data_layer.control": "0x44",
  "dnp3.data_layer.control.dir": "0",
  "dnp3.data_layer.control.fcb": "0",
  "dnp3.data_layer.control.fcv": "0",
  "dnp3.data_layer.control.function_code": "Unconfirmed User Data",
  "dnp3.data_layer.control.prm": "1",
  "dnp3.data_layer.destination": "3",
  "dnp3.data_layer.length": "10",
  "dnp3.data_layer.source": "4",
  "dnp3.data_layer.start_bytes": "0x0564",
  "dnp3.transport.fin": "1",
  "dnp3.transport.fir": "1",
  "dnp3.transport.sequence": "11",
  "EventTime": "2021-10-10T02:08:56.986861-07:00",
  "EventReceivedTime": "2021-10-10T02:08:57.210239-07:00",
  "SourceModuleName": "pcap_dnp3",
  "SourceModuleType": "im_pcap"
}
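The "dnp3.data_layer.control" values in the samples above (0x44 and 0xC4) pack several flags and a function code into a single octet. A minimal Python sketch of the decoding, assuming the standard IEEE 1815 data-link control layout (DIR in bit 7, PRM in bit 6, FCB in bit 5, FCV in bit 4, and a 4-bit function code in the low bits):

```python
def decode_dnp3_control(octet: int) -> dict:
    """Decode a DNP3 data-link control octet into its bit fields."""
    return {
        "dir": (octet >> 7) & 1,           # direction bit
        "prm": (octet >> 6) & 1,           # primary/secondary bit
        "fcb": (octet >> 5) & 1,           # frame count bit
        "fcv": (octet >> 4) & 1,           # frame count valid bit
        "function_code": octet & 0x0F,     # 4 = Unconfirmed User Data
    }

print(decode_dnp3_control(0xC4))
print(decode_dnp3_control(0x44))
```

Decoding 0xC4 yields dir=1, prm=1, fcb=0, fcv=0 with function code 4, in agreement with the per-bit fields im_pcap reports in the samples.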

IEC 60870-5-104

The IEC 60870-5-104 protocol extends the IEC 60870-5-101 protocol and combines transport, network, link, and physical layers to enable communication between control stations and substations using TCP/IP. NXLog can be configured to monitor network traffic that uses this protocol.

Example 15. Capturing IEC 60870-5-104 packets

To passively monitor network traffic, NXLog uses the im_pcap module. The Dev directive of this module specifies the network interface. The Protocol group directive is then defined twice to simultaneously monitor the iec104apci and iec104asdu protocols. The Exec block of this module calls the to_json() procedure of the xm_json module to convert messages to JSON.

nxlog.conf
<Extension _json>
    Module        xm_json
</Extension>

<Input pcap_iec>
    Module        im_pcap
    # Name of a network device/interface
    Dev           \Device\NPF_{B89D13DD-37BB-4799-BFCD-6C9D326C481A}
    <Protocol>
        # Protocol types
        Type      iec104apci
    </Protocol>
    <Protocol>
        Type      iec104asdu
    </Protocol>
    # Conversion to JSON
    Exec          to_json();
</Input>
IEC 60870-5-104 response sample
{
  "iec104.apci.receive_sequence_number": "2",
  "iec104.apci.send_sequence_number": "39",
  "iec104.apci.type": "Information (I)",
  "iec104.asdu.data": {
      "io": [
          {
              "ioa": 1020,
              "ie": [
                  {
                      "type": "R32",
                      "value": 29899.640625
                  },
                  {
                      "type": "QDS",
                      "invalid": false,
                      "not-topical": false,
                      "substituted": false,
                      "blocked": false,
                      "overflow": false
                  },
                  {
                      "type": "CP56Time2A",
                      "milliseconds": 35304,
                      "minutes": 59,
                      "hours": 13,
                      "day-of-week": 0,
                      "day-of-month": 10,
                      "month": 10,
                      "year": 21
                  }
              ],
              "ies": 3
          }
      ],
      "ios": 1
  },
  "iec104.asdu.dui.cause_of_transmission": "Spontaneous (3)",
  "iec104.asdu.dui.coa": "1",
  "iec104.asdu.dui.num_records": "1",
  "iec104.asdu.dui.org": "0",
  "iec104.asdu.dui.pn": "0",
  "iec104.asdu.dui.sq": "FALSE",
  "iec104.asdu.dui.test_bit": "0",
  "iec104.asdu.dui.type": "M_ME_TF_1",
  "EventTime": "2021-10-10T03:59:35.681739-07:00",
  "EventReceivedTime": "2021-10-10T03:59:35.871039-07:00",
  "SourceModuleName": "pcap_iec",
  "SourceModuleType": "im_pcap"
}
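The CP56Time2A element in the sample above spreads the timestamp across several fields: the milliseconds value carries both seconds and milliseconds, the year is an offset from 2000, and the day-of-week field is informational only. A hypothetical helper reassembling it might look like this:

```python
from datetime import datetime

def cp56time2a_to_datetime(ms, minutes, hours, day, month, year):
    """Rebuild a datetime from CP56Time2A fields; ms holds seconds + ms,
    and year is a two-digit offset from 2000 (day-of-week is ignored)."""
    return datetime(2000 + year, month, day, hours, minutes,
                    ms // 1000, (ms % 1000) * 1000)

# Field values taken from the IEC 60870-5-104 response sample above
print(cp56time2a_to_datetime(35304, 59, 13, 10, 10, 21))
# 2021-10-10 13:59:35.304000
```

This reconstructs the station-local timestamp embedded in the ASDU, which is independent of the $EventTime NXLog assigns when the packet is captured.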
Disclaimer

While we endeavor to keep the information in this topic up to date and correct, NXLog makes no representations or warranties of any kind, express or implied, about the completeness, accuracy, reliability, suitability, or availability of the content represented here. We update our screenshots and instructions on a best-effort basis.

The accuracy of the content was tested and proven to be working in our lab environment at the time of the last revision with the following software versions:

Schneider Electric EcoStruxure Process Expert 2020 R2
NXLog version 5.3.7166

Last revision: