Schneider Electric Citect SCADA
Citect SCADA is a Supervisory Control and Data Acquisition (SCADA) solution from Schneider Electric, which is used to manage and monitor processes in manufacturing, primary production, utilities delivery, and facilities management.
This document was written in accordance with Citect SCADA 2018 (version 8.10). All examples in this document were tested using this version.
Windows Event Log collection
NXLog can be configured to collect Windows Event Log entries generated by Citect SCADA.
Citect SCADA writes two types of Windows Event Log entries to the Application channel, both with Event ID 0:

- Schneider Electric SUT Service entries for Schneider Electric software updates
- Citect Runtime Manager log entries
NXLog can use the im_msvistalog module to process Event Log entries based on their Event ID and source values.
The NXLog configuration below uses the im_msvistalog module to read events with Event ID 0 from the Application channel of Windows Event Log using XPath filtering. The to_json() procedure of the xm_json module converts event records to JSON prior to routing them to any output.
<Extension json>
Module xm_json
</Extension>
<Input from_eventlog>
Module im_msvistalog
# XML query for reading Windows Event Log based on the Event ID value
<QueryXML>
<QueryList>
<Query Id="0" Path="Application">
<Select Path="Application">*[System[(EventID=0)]]</Select>
</Query>
</QueryList>
</QueryXML>
# Converting log data to JSON
Exec to_json();
</Input>
In the configuration sample below, NXLog uses the im_msvistalog module to read log entries from the Application channel of Windows Event Log based on the Schneider Electric SUT Service source.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record.
The output stream is formatted as JSON, which can be saved to a file, forwarded over the network, or published to a Kafka topic.
<Extension json>
Module xm_json
</Extension>
<Input from_eventlog>
Module im_msvistalog
# XML query for reading Windows Event Log based on the source name
<QueryXML>
<QueryList>
<Query Id="0" Path="Application">
<Select Path="Application">
*[System[Provider[@Name='Schneider Electric SUT Service']]]
</Select>
</Query>
</QueryList>
</QueryXML>
# Converting to JSON
Exec to_json();
</Input>
The output sample below shows a message with Event ID 0 after processing with NXLog.
{
"EventTime": "2020-05-12T08:12:57.737403+02:00",
"Hostname": "NXLog-Computer",
"Keywords": "36028797018963968",
"EventType": "INFO",
"SeverityValue": 2,
"Severity": "INFO",
"EventID": 0,
"SourceName": "Schneider Electric SUT Service",
"TaskValue": 0,
"RecordNumber": 912,
"ExecutionProcessID": 0,
"ExecutionThreadID": 0,
"Channel": "Application",
"Message": "Service started successfully.",
"Opcode": "Info",
"Data": "Service started successfully.",
"EventReceivedTime": "2020-09-14T13:14:26.675310+02:00",
"SourceModuleName": "from_eventlog",
"SourceModuleType": "im_msvistalog"
}
File-based logs
The majority of Citect SCADA log entries are stored in files. NXLog can be configured to collect and process file-based logs.
Citect SCADA log files contain time-stamped system data and are stored in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs directory.
The Drivers\ subdirectory beneath this directory contains driver log files, which are named after their respective drivers.
Logs related to licensing are stored in the C:\Users\<WindowsUser>\AppData\Local\Temp directory.
The Citect SCADA 2018 Log Files Table lists the exact location of all log files.
If the system uses separate processes, the component name is appended to the log file name. For example, the filename syslog.IOServer.Cluster1.dat shows how the server name, IOServer, and the cluster name, Cluster1, are appended to the syslog log type.
All data from the C:\ProgramData\Schneider Electric\ directory is duplicated in the C:\Users\All Users\Schneider Electric\ directory.
Configuring Citect SCADA logging
Citect SCADA logging can be configured using the Computer Setup Editor, which is part of the Citect SCADA suite.
This application modifies the citect.ini file located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Config\ directory.
This file can be used to adjust logging and to filter data by priority, category, or severity.
The citect.ini file also stores values for a comprehensive set of operating parameters that configure the operational settings of each computer in the Citect SCADA system.
These values are read by Citect SCADA on startup and determine how the application operates.
For more detailed information on how to configure logging, consult the Schneider Citect SCADA documentation in the Citect Studio application.
Citect SCADA 2018 log files table
The following table provides details about the various types of file-based logs generated by the Schneider Citect SCADA system.
| Log Type | File Ext. | Location | Description |
|---|---|---|---|
| change log | .log | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\ChangeLogs\* | Log file containing timestamps of the last unsaved changes, created after saving the project |
| syslog | .dat | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Low-level driver traffic, kernel, and user-defined messages |
| tracelog | .dat | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Managed-code events related to data subscription and updates |
| debug | .log | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Information about crashes and serious internal issues |
| ipc | .log | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\ | CTAPI communication traffic log |
| [driver] | .dat | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\Drivers\* (file mask is *PROTOCOL.[ClusterName].[IOServerName].dat) | Driver operation log |
| kernel | .dat | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\ | Copy of the kernel screens |
| | | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Records of non-default SCADA parameters |
| | | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Server reload log |
| tracelog.RuntimeManager | .dat | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\* | Runtime Manager operation log |
| | | C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\DBLog.*\* (file mask is C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\DBLog.Alarm.[ClusterName].[AlarmServerName]\DB*.log*) | - |
| | | C:\Users\<WindowsUser>\AppData\Local\Temp\ | - |
| | | C:\ProgramData\Schneider Electric\Software Update\SutService\Logs\* | Notification about updates, download and installation capabilities, and implementation of the Schneider Electric software improvement program |
Processing Citect SCADA log files
This section discusses each Citect SCADA log type individually with information about its files, as well as how to collect and process them with NXLog. An input sample is provided for each log type along with an example configuration and an output sample in JSON format. Displaying the processed logs as JSON objects not only makes the data more readable for human consumption, it is also a format readily ingested by many third-party systems used for data analysis.
Because these logs do not contain structured data, regular expressions are used in all of the configurations to accommodate the parsing needed to create the individual schema that best suits each log type. This ensures that all important data is collected and structured for any future data analysis. The flexibility of NXLog allows the use of named capturing when using regular expressions, so the resulting fields are automatically added to the output.
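As a quick illustration of named capturing, the Python sketch below shows how named groups turn a matched line into fields. This is an illustration only: NXLog uses PCRE-style (?<name>…) groups, which Python's re module spells (?P<name>…), and the pattern and sample line here are made up for the demonstration.

```python
import re

# A made-up three-field pattern: action, severity, then the rest of the line.
# NXLog/PCRE would write the groups as (?<Action>...) instead of (?P<Action>...).
pattern = re.compile(r"^(?P<Action>\w+)\s+(?P<Severity>\w+)\s+(?P<Message>.*)")

def parse(line):
    """Return the named capture groups as a dict, or None if no match."""
    match = pattern.match(line)
    return match.groupdict() if match else None

fields = parse("Update NoError Service started")
```

Each named group becomes a key in the resulting dictionary, just as NXLog turns each named group into an event field.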
Change log
Citect SCADA uses the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\ChangeLogs\ folder to record changes in project configuration.
All files are named according to the dbf-yyyy-mm-dd-hh-mm-ss-fff.log pattern.
A new log file is created when the project is saved and contains the timestamp of the last unsaved changes, so change log messages may be spread across several files.
Since each log file only contains the changes since the last save, you may want to process more than one of these files at the same time. In that case, you can use a wildcard character (*) in the file name.
For more details, see the File directive of the im_file module.
The example below demonstrates how NXLog can process change log messages.
Below is a change log sample.
Update + NoError Dummy Network Addresses mbserver Address: 127.0.0.1 -> 127.0.0.10
The sample contains the following fields:

- Action (Update in the sample above)
- Severity (NoError)
- Project Name (Dummy)
- Submenu (Network Addresses)
- Component (mbserver)
- Property, expressed in an Old Value → New Value format to represent the change (Address: 127.0.0.1 → 127.0.0.10)
To read the change log, NXLog can use the im_file module.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as CHANGELOG_REGEX:
define CHANGELOG_REGEX /(?x)^(?<Action>\w+)\t+\+\t+(?<Severity>\w+)\
\s+(?<Project_Name>\w+)\t+(?<Submenu>\w+\s?\w+)\
\t(?<Component>\w+)\t(?<Property>\w+)\:\
\s(?<Value_Change>.*)/
The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in CHANGELOG_REGEX.
Each field name is defined within angle brackets (< >), which determine the event field the captured value is assigned to.
The value of the $EventReceivedTime field is assigned to the $EventTime field.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record. The output stream is formatted as JSON, which can be saved to file, forwarded over the network, or published to a Kafka topic.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define CHANGELOG_REGEX /(?x)^(?<Action>\w+)\t+\+\t+(?<Severity>\w+)\
\s+(?<Project_Name>\w+)\t+(?<Submenu>\w+\s?\w+)\
\t(?<Component>\w+)\t(?<Property>\w+)\:\
\s(?<Value_Change>.*)/
# Part of the log path defined as a constant
define CHANGELOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%CHANGELOG_PATH%\ChangeLogs\test.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %CHANGELOG_REGEX%
{
# Creates the timestamp
$EventTime = $EventReceivedTime;
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2021-05-19T10:27:48.483702+03:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Action": "Update",
"Component": "mbserver",
"Project_Name": "Dummy",
"Property": "Address",
"Severity": "NoError",
"Submenu": "Network Addresses",
"Value_Change": "127.0.0.1 -> 127.0.0.10",
"EventTime": "2021-05-19T10:27:48.483702+03:00"
}
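The parsing above can be reproduced outside NXLog for testing. The Python sketch below applies a translation of CHANGELOG_REGEX to the input sample, rewriting (?<name>…) as (?P<name>…) and using re.VERBOSE for the (?x) flag; it assumes the fields in the raw change log file are tab-separated, as the \t+ tokens in the original expression imply.

```python
import re

# Python translation of CHANGELOG_REGEX from the configuration above.
CHANGELOG_REGEX = re.compile(
    r"""^(?P<Action>\w+)\t+\+\t+(?P<Severity>\w+)
        \s+(?P<Project_Name>\w+)\t+(?P<Submenu>\w+\s?\w+)
        \t(?P<Component>\w+)\t(?P<Property>\w+)\:
        \s(?P<Value_Change>.*)""",
    re.VERBOSE,
)

# The input sample, assumed tab-separated in the raw file
line = ("Update\t+\tNoError\tDummy\tNetwork Addresses"
        "\tmbserver\tAddress: 127.0.0.1 -> 127.0.0.10")
fields = CHANGELOG_REGEX.match(line).groupdict()
```

The resulting dictionary contains the same field values as the JSON output sample above.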
Syslog
The term syslog used here is not the same as the well-known syslog protocol. The name syslog is used only because Schneider Electric decided to name this particular file syslog.
The syslog.dat file is the primary log file for Citect SCADA.
It contains system information about low-level driver traffic, kernel messages, user-defined messages, and trace options (except some CTAPI traces).
This log file contains the following fields:

- Level
- Category
- Thread Id
- Driver Name
- Unit
- Function
- File
- Line
- Message
There is a separate syslog file per IOServer process for both the Drivers and the IOServer (in earlier versions of Citect SCADA there was one file for the IOServer and one file for the Drivers).
As indicated by the file-based logs table, syslog logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs directory.
The following NXLog example applies to all logs having names that begin with syslog.
In this example, only two fields are parsed from the syslog.dat log file: the date and time the event was logged, and the message.
However, messages that contain four or more consecutive asterisks (*) are dropped, since these are banner lines consisting only of asterisks.
Two different regular expressions are used to achieve this.
2020-05-12 12:59:38.103 +03:00 *************************************
2020-05-12 12:59:38.112 +03:00 *** Citect process is starting up ***
2020-05-12 12:59:38.135 +03:00 Running in 32 bit process architecture
2020-05-12 12:59:38.136 +03:00 Running in Console Mode
2020-05-12 12:59:38.164 +03:00 Data Execution Prevention (DEP), has been enabled successfully
To make the Exec directive block of the from_file input instance more readable, the first regular expression is defined as the constant SYSLOG_REGEX.
It parses the datetime and message values.
The named capture feature, in this case (?<Message>.*), creates a new $Message field containing the captured value.
The other, shorter regular expression is used with the drop() procedure in this input instance to filter out the unwanted asterisk banner messages.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define SYSLOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+)\s([+]\d+.\d*)\
\s(?<Message>.*)/
# Part of the log path defined as a constant
define SYSLOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%SYSLOG_PATH%\syslog.dat'
<Exec>
# A regular expression to drop event messages
# that contain four or more consecutive asterisks
if $raw_event =~ /.*[*]{4,}.*/ drop();
# Matches the events with the regular expression
if $raw_event =~ %SYSLOG_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1 + $2 + $3);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-06T09:35:17.597817+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "*** Citect process is starting up ***",
"EventTime": "2020-05-12T11:59:38.112000+02:00"
}
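The same drop-and-parse logic can be sketched in Python for testing, with (?<Message>…) rewritten as (?P<Message>…) and NXLog's parsedate() approximated by datetime.strptime():

```python
import re
from datetime import datetime

# Python equivalent of SYSLOG_REGEX from the configuration above
SYSLOG_REGEX = re.compile(
    r"^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+)\s([+]\d+.\d*)\s(?P<Message>.*)")

def parse_syslog(line):
    """Mimic the Exec block: drop asterisk banners, otherwise parse."""
    if re.search(r"[*]{4,}", line):          # the drop() filter
        return None
    match = SYSLOG_REGEX.match(line)
    if not match:
        return None
    # parsedate($1 + $2 + $3) concatenates the three captured groups
    event_time = datetime.strptime(
        match.group(1) + match.group(2) + match.group(3),
        "%Y-%m-%d %H:%M:%S.%f%z")
    return {"EventTime": event_time, "Message": match.group("Message")}

banner = "2020-05-12 12:59:38.103 +03:00 *************************************"
event = parse_syslog(
    "2020-05-12 12:59:38.112 +03:00 *** Citect process is starting up ***")
```

Note that the three-asterisk "Citect process is starting up" line survives the filter, which only drops runs of four or more asterisks; this matches the JSON output sample above.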
Tracelog
The tracelog.dat file contains managed code logging, mainly related to data subscriptions and updates.
In the PSIClient and CSAToPSI categories, Sent and Received notations (indicated with ⇐ and ⇒) are added to indicate tag data flow on the client side.
As indicated by the file-based logs table, tracelog logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs directory.
The following NXLog example applies to all logs having names that begin with tracelog., except the tracelog.RuntimeManager logs, which are discussed separately.
These log files contain the following fields:

- Process ID (PID)
- Severity
- Process Name
- Message
In this example, the IDEtracelog.dat log file provides four fields of interest in addition to each event's timestamp: PID, process name, severity, and a message.
Since no events need to be filtered out, only one regular expression is needed to parse these fields.
2020-05-15 08:12:53.388 +03:00 1744 0 Error Deployment at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
2020-05-15 08:12:53.388 +03:00 1744 0 Error Deployment at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
2020-05-15 08:12:53.388 +03:00 1744 0 Error Deployment at SchneiderElectric.CitectIDE.Controls.HttpRequestHelper.<TryHttpRequestAsync>d__a`1.MoveNext()
2020-05-15 08:12:53.389 +03:00 1744 0 Error Deployment HttpResponseException Response: Forbidden (403)
By defining the regular expression as the constant TRACELOG_REGEX, as well as the absolute path to the log file as the constant TRACELOG_PATH, the from_file input instance becomes more readable without such long strings.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as TRACELOG_REGEX:
define TRACELOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.)\s([+]\d+.\d*)\
\s(?<PID>\d+)\s{3}\d\s(?<Severity>\w+)\s*(?<ProcessName>\
\w+)\s*(?<Message>.*)/
The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in TRACELOG_REGEX.
Each field name is defined within angle brackets (< >), which determine the event field the captured value is assigned to.
To populate the $EventTime field, parsedate() returns a datetime value when provided the three concatenated string values captured in $1, $2, and $3 by the TRACELOG_REGEX regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define TRACELOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.)\s([+]\d+.\d*)\
\s(?<PID>\d+)\s{3}\d\s(?<Severity>\w+)\s*(?<ProcessName>\
\w+)\s*(?<Message>.*)/
# Part of the log path defined as a constant
define TRACELOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%TRACELOG_PATH%\IDEtracelog.dat'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %TRACELOG_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1 + $2 + $3);
# Converts the PID field to an integer
$PID = integer($PID);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-05T18:43:13.609841+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "HttpResponseException Response: Forbidden (403)",
"PID": 1744,
"ProcessName": "Deployment",
"Severity": "Error",
"EventTime": "2020-05-15T07:11:58.720000+02:00"
}
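For reference, the tracelog parsing can be approximated in Python, with re.VERBOSE standing in for the (?x) flag. The sample line below is constructed with three spaces between the PID and the following digit, as the \s{3} token in the original expression requires.

```python
import re
from datetime import datetime

# Python rendering of TRACELOG_REGEX from the configuration above
TRACELOG_REGEX = re.compile(
    r"""^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.)\s([+]\d+.\d*)\s(?P<PID>\d+)
        \s{3}\d\s(?P<Severity>\w+)\s*(?P<ProcessName>\w+)\s*(?P<Message>.*)""",
    re.VERBOSE,
)

line = ("2020-05-15 08:12:53.389 +03:00 1744   0 Error Deployment "
        "HttpResponseException Response: Forbidden (403)")
m = TRACELOG_REGEX.match(line)
event = {
    # parsedate($1 + $2 + $3) approximated with strptime()
    "EventTime": datetime.strptime(m.group(1) + m.group(2) + m.group(3),
                                   "%Y-%m-%d %H:%M:%S.%f%z"),
    "PID": int(m.group("PID")),          # integer($PID)
    "Severity": m.group("Severity"),
    "ProcessName": m.group("ProcessName"),
    "Message": m.group("Message"),
}
```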
Debug log
The debug.log file contains information about crashes and other serious internal issues.
If a crash occurs, it identifies the version and path of each DLL in use at the time of the crash, which can be used to confirm the validity of file versions.
As indicated by the file-based logs table, debug logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs directory.
The following NXLog example applies to all logs having names that begin with debug.
Aside from the timestamp, the only field of interest in this log file is the message field, so the configuration extracts a single field from the debug.log file in addition to the default fields added by NXLog.
All event lines are of interest, so every line is read and processed by NXLog; nothing is dropped.
2020-05-12 13:02:40.254 +03:00 BufPoolClose: non fatal, freed 12 inuse buffers in Code.String
2020-05-13 11:07:12.734 +03:00 BufPoolClose: non fatal, freed 12 inuse buffers in Code.String
2020-05-15 08:35:55.828 +03:00 BufPoolClose: non fatal, freed 12 inuse buffers in Code.String
2020-05-15 22:00:39.969 +03:00 BufPoolClose: non fatal, freed 12 inuse buffers in Code.String
2020-05-15 22:13:06.356 +03:00 BufPoolClose: non fatal, freed 119 inuse buffers in Code.String
Two constants are defined: DEBUGLOG_REGEX, the regular expression that processes the events, and DEBUGLOG_PATH, the absolute path to the log file.
This helps with the readability of the from_file input instance.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as DEBUGLOG_REGEX:
define DEBUGLOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.\d+.)([+]\
\d+.\d+.)\s(?<Message>.*)/
The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in DEBUGLOG_REGEX.
Each field name is defined within angle brackets (< >), which determine the event field the captured value is assigned to.
To populate the $EventTime field, parsedate() returns a datetime value when provided the three concatenated string values captured in $1, $2, and $3 by the DEBUGLOG_REGEX regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define DEBUGLOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.\d+.)([+]\
\d+.\d+.)\s(?<Message>.*)/
# Part of the log path defined as a constant
define DEBUGLOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%DEBUGLOG_PATH%\debug.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %DEBUGLOG_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1 + $2 + $3);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-06T09:44:18.656718+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "BufPoolClose: non fatal, freed 12 inuse buffers in Code.String",
"EventTime": "2020-05-12T13:02:40.254000+02:00"
}
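The debug log parsing can likewise be sketched in Python. One subtlety worth noting: the dot wildcards in DEBUGLOG_REGEX leave a space at the end of the second captured group, so the concatenated timestamp string keeps a space before the UTC offset, which the strptime() format below accounts for.

```python
import re
from datetime import datetime

# Python rendering of DEBUGLOG_REGEX ((?x) -> re.VERBOSE, (?<n>) -> (?P<n>))
DEBUGLOG_REGEX = re.compile(
    r"""^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.\d+.)([+]\d+.\d+.)
        \s(?P<Message>.*)""",
    re.VERBOSE,
)

line = ("2020-05-12 13:02:40.254 +03:00 "
        "BufPoolClose: non fatal, freed 12 inuse buffers in Code.String")
m = DEBUGLOG_REGEX.match(line)
# parsedate($1 + $2 + $3); note the space before the offset in the format
event_time = datetime.strptime(m.group(1) + m.group(2) + m.group(3),
                               "%Y-%m-%d %H:%M:%S.%f %z")
```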
Ipc log
This log stores information about CTAPI communication traffic.
As indicated by the file-based logs table, ipc logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs directory.
Some of the IPC logs are multiline, meaning a single event spans multiple lines, while others consist of single-line events.
These log files contain the following fields:

- Process
- Message

In this configuration, two constants are defined: IPCLOG_REGEX, the regular expression that parses the fields of the events, and IPCLOG_PATH, which defines the absolute path to the log file.
2020-05-20 08:22:35.478 +03:00 CtApi: _ctConnect: connection is established, sending login information.
2020-05-20 08:22:41.631 +03:00 CtApi Server: CAPIDoCmd: 0x10b start to process message.
"CLUSTER","",.................
2020-05-20 08:22:41.631 +03:00 CtApi: ctThread: IPCRead 0x10c successful (actual read 111).
CMB.o..........."2","4"
"NAME","DBTYPE_STR"
"COMMENT","DBTYPE_STR"
"ACTIVE","DBTYPE_STR"
"ERROR","DBTYPE_STR"
.
2020-05-20 08:22:41.631 +03:00 CtApi: ctThread: check piggy back message (actual read 111, request data 111, pippyback 0).
Each log message has a header (the timestamp), which is matched by a regular expression and used as the message boundary: each log message is appended with additional lines until the next header line is detected.
To correctly process multiline event logs, a pattern needs to be defined as a regular expression that describes the header line of an event.
In the following xm_multiline extension instance, the HeaderLine directive specifies the regular expression used for finding the header line of each event.
<Extension multiline>
Module xm_multiline
# Parsing the header line
HeaderLine /(\d+.\d+.\d+)\s(\d+.\d+.\d+.\d+.\d+)\s([+]\d+.\d+)\s(.*)/
</Extension>
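The effect of the HeaderLine directive can be sketched in Python: a line matching the header expression starts a new event, and every other line is appended to the current one. This is a simplified model of what xm_multiline does, not its actual implementation.

```python
import re

# The HeaderLine regular expression from the xm_multiline instance above
HEADER = re.compile(
    r"(\d+.\d+.\d+)\s(\d+.\d+.\d+.\d+.\d+)\s([+]\d+.\d+)\s(.*)")

def join_multiline(lines):
    """Group raw lines into events using the header line as the boundary."""
    events = []
    for line in lines:
        if HEADER.match(line) or not events:
            events.append(line)              # a header starts a new event
        else:
            events[-1] += " " + line         # like replacing \n with a space
    return events

sample = [
    '2020-05-20 08:22:41.631 +03:00 CtApi Server: CAPIDoCmd: 0x10b start to process message.',
    '"CLUSTER","",.................',
    '2020-05-20 08:22:41.631 +03:00 CtApi: ctThread: check piggy back message.',
]
events = join_multiline(sample)
```

The continuation line beginning with `"CLUSTER"` does not match the header expression, so it is folded into the preceding event.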
In the from_file input instance, the InputType directive references the xm_multiline extension instance by name, multiline, which enables the input instance to establish the beginning and end of each event.
Once the messages are on a single line, the first Exec directive block in the from_file input instance replaces the \r and \n characters.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as IPCLOG_REGEX:
define IPCLOG_REGEX /(?x)^(\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+)\s+([+]\d+.\d+)\
\s+(?<Process>\S+)\s+(?<Message>.*)/
The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in IPCLOG_REGEX.
Each field name is defined within angle brackets (< >), which determine the event field the captured value is assigned to.
To populate the $EventTime field, parsedate() returns a datetime value when provided the two concatenated string values captured in $1 and $2 by the IPCLOG_REGEX regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define IPCLOG_REGEX /(?x)^(\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+)\s+([+]\d+.\d+)\
\s+(?<Process>\S+)\s+(?<Message>.*)/
# Part of the log path defined as a constant
define IPCLOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Extension multiline>
Module xm_multiline
# Regular expression to look for the header of the message
HeaderLine /(\d+.\d+.\d+)\s(\d+.\d+.\d+.\d+.\d+)\s([+]\d+.\d+)\s(.*)/
</Extension>
<Input from_file>
Module im_file
File '%IPCLOG_PATH%\ipc.log'
# Specifies that the input is first split into events by the
# multiline extension instance
InputType multiline
<Exec>
# Replaces unwanted characters
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
</Exec>
<Exec>
# Matches the events with a regular expression
if $raw_event =~ %IPCLOG_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1 + $2);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-09T10:44:38.390139+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "_ctConnect: connection is established, sending login information.",
"Process": "CtApi:",
"EventTime": "2020-05-20T21:12:13.047000+02:00"
}
[driver] log
[driver] logs contain information about the operation of a particular driver and are named accordingly.
For example, the TCP/IP driver is logged in TCPIP.*.DAT.
As indicated by the file-based logs table, [driver] logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\Drivers\ directory.
Some of the [driver] logs are single-line logs, but some have events that span multiple lines.
The following NXLog example applies to all logs having names that begin with the name of a driver.
These log files contain the following fields:

- SourceType
- Message

This example processes the TCPIP.*.DAT log file.
For easier readability, a regular expression is defined as the constant TCPIPDRIVER_REGEX. Using this same method, the absolute path to the log file is defined as TCPIPDRIVER_PATH.
2020/05/20 22:59:57.757 Port Port1 connection OK
2020/05/20 22:59:57.762 Port Port1 Event - FD_WRITE
2020/05/20 22:59:57.780 Port Port1 Sent 12 bytes of data.
0000: 00 01 00 00 00 06 00 03 00 00 00 01
2020/05/20 22:59:57.799 Port Port1 Event - FD_READ
To correctly process multiline event logs, a pattern needs to be defined as a regular expression that describes the header line of an event.
In the following xm_multiline extension instance, the HeaderLine
directive specifies the regular expression to be used for finding the header line of each event.
<Extension multiline>
Module xm_multiline
# Parsing the header line
HeaderLine /(\d+.[\/]\d+.[\/]\d+)\s(\d+.\d+.\d+.\d+.\d*)/
</Extension>
In the from_file input instance, the InputType directive references the xm_multiline extension instance by name, multiline, which enables the input instance to establish the beginning and end of each event.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as TCPIPDRIVER_REGEX:
define TCPIPDRIVER_REGEX /(?x)^(\d+.[\/]\d+.[\/]\d+\d+.\d+.\d+.\d+.\d*)\
\s+(?<SourceType>\S+)\s+(?<Message>.*)/
Once the messages are on a single line, the first Exec directive block of the from_file input instance replaces the \r, \t, and \n characters.
The logic for parsing and filtering is defined within the Exec block of the from_file instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in TCPIPDRIVER_REGEX.
Each field name is defined within angle brackets (< >), which determine the event field the captured value is assigned to.
The strptime() function converts the captured timestamp to a datetime value, which it assigns to the $EventTime field.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define TCPIPDRIVER_REGEX /(?x)^(\d+.[\/]\d+.[\/]\d+\d+.\d+.\d+.\d+.\d*)\
\s+(?<SourceType>\S+)\s+(?<Message>.*)/
# Part of the log path defined as a constant
define TCPIPDRIVER_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Extension multiline>
Module xm_multiline
# Regular expression to look for the header of the message
HeaderLine /(\d+.[\/]\d+.[\/]\d+)\s(\d+.\d+.\d+.\d+.\d*)/
</Extension>
<Input from_file>
Module im_file
File '%TCPIPDRIVER_PATH%\Drivers\TCPIP.Cluster1.IOServer1.DAT'
# Specifies that the input is first split into events by the
# multiline extension instance
InputType multiline
<Exec>
# Replaces unwanted characters
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
$raw_event = replace($raw_event, "\t", " ");
</Exec>
<Exec>
# Matches the events with a regular expression
if $raw_event =~ %TCPIPDRIVER_REGEX%
{
# Creates the timestamp
$EventTime = strptime($1, "%Y/%m/%d %H:%M:%S");
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
<Output to_file>
Module om_file
File "C:\logs\output.txt"
</Output>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-09T12:37:51.259840+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "Port1 connection OK",
"SourceType": "Port",
"EventTime": "2020-05-20T22:59:57.000000+02:00"
}
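The driver log parsing can be approximated in Python as well. One difference from the configuration above: NXLog's strptime() with "%Y/%m/%d %H:%M:%S" tolerates the trailing ".757" fraction (zeroing the subseconds, as the output sample shows), whereas Python's datetime.strptime() requires the fraction to be spelled out in the format string.

```python
import re
from datetime import datetime

# Python rendering of TCPIPDRIVER_REGEX ((?x) -> re.VERBOSE)
TCPIPDRIVER_REGEX = re.compile(
    r"""^(\d+.[\/]\d+.[\/]\d+\d+.\d+.\d+.\d+.\d*)
        \s+(?P<SourceType>\S+)\s+(?P<Message>.*)""",
    re.VERBOSE,
)

line = "2020/05/20 22:59:57.757 Port Port1 connection OK"
m = TCPIPDRIVER_REGEX.match(line)
# Python needs the fractional seconds in the format string
event_time = datetime.strptime(m.group(1), "%Y/%m/%d %H:%M:%S.%f")
```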
Kernel log
The kernel.dat file contains a copy of the kernel screens in a human-readable form.
The example below demonstrates how to parse, process, and forward kernel log data over the network.
This sample contains the following fields:

- Mode
- Name
- Hnd
- State
- Prty
- Cpu
- Min
- Max
- Avg
- Count
Below is an input sample of a kernel log entry.
Sys Ker.Stat 0 sleep low 0.0 0.000 0.001 0.000 544
By defining the regular expression as the constant KERNEL_REGEX, as well as the absolute path to the log file as the constant LOG_PATH, the from_file input instance becomes more readable without such long strings.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression, defined as KERNEL_REGEX:
define KERNEL_REGEX /(?x)^(?<Mode>\w+)\s+(?<Name>[\w+\.]+)\s+(?<Hnd>\d+)\
\s+(?<State>\w+)\s+(?<Prty>\w+)\s+(?<Cpu>[\d\.]+)\s+\
(?<Min>[\d\.]+)\s+(?<Max>[\d\.]+)\s+(?<Avg>[\d\.]+)\s+\
(?<Count>\d+)/
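Outside NXLog, the pattern can be sanity-checked with a short Python sketch (illustrative only, not part of the NXLog deployment); PCRE's `(?<name>...)` named groups become `(?P<name>...)` in Python, and `re.VERBOSE` plays the role of the `(?x)` flag:

```python
import re

# Python rendering of KERNEL_REGEX; (?P<name>...) replaces PCRE's (?<name>...)
KERNEL_REGEX = re.compile(r"""^(?P<Mode>\w+)\s+(?P<Name>[\w+\.]+)\s+(?P<Hnd>\d+)
    \s+(?P<State>\w+)\s+(?P<Prty>\w+)\s+(?P<Cpu>[\d\.]+)\s+
    (?P<Min>[\d\.]+)\s+(?P<Max>[\d\.]+)\s+(?P<Avg>[\d\.]+)\s+
    (?P<Count>\d+)""", re.VERBOSE)

# The input sample from the kernel log shown above
sample = "Sys Ker.Stat 0 sleep low 0.0 0.000 0.001 0.000 544"
fields = KERNEL_REGEX.match(sample).groupdict()
```

Each named capture ends up as a string field, matching the JSON output shown later in this section.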
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in KERNEL_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
Since kernel log entries do not include a timestamp of their own, the value of the $EventReceivedTime field is assigned to the $EventTime field.
The drop procedure discards records that do not match the KERNEL_REGEX
regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record. The output stream is formatted as JSON, which can be saved to file or forwarded over the network to a remote host using om_tcp as shown in this example.
The following NXLog configuration combines all the steps described above.
define KERNEL_REGEX /(?x)^(?<Mode>\w+)\s+(?<Name>[\w+\.]+)\s+(?<Hnd>\d+)\
\s+(?<State>\w+)\s+(?<Prty>\w+)\s+(?<Cpu>[\d\.]+)\s+\
(?<Min>[\d\.]+)\s+(?<Max>[\d\.]+)\s+(?<Avg>[\d\.]+)\s+\
(?<Count>\d+)/
# Part of the log path defined as a constant
define LOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%LOG_PATH%\kernel.dat'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %KERNEL_REGEX%
{
# Creates the timestamp
$EventTime = $EventReceivedTime;
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
<Output out>
Module om_tcp
Host 192.168.0.127
Port 10500
</Output>
Below is the output sample after the input message has been processed by NXLog.
{
"EventReceivedTime": "2021-05-19T14:02:16.139998+03:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Avg": "0.000",
"Count": "544",
"Cpu": "0.0",
"Hnd": "0",
"Max": "0.001",
"Min": "0.000",
"Mode": "Sys",
"Name": "Ker.Stat",
"Prty": "low",
"State": "sleep",
"EventTime": "2021-05-19T14:02:16.139998+03:00"
}
Params log
The Params.dat
log file contains a historical record of non-default SCADA parameters.
As indicated by the file-based logs table, the Params
logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\
directory.
The following NXLog example applies to all logs having names that begin with Params.
.
These log files contain the following fields:
-
Category
-
Message
In this example, besides the core fields added by NXLog, two fields are parsed from the Params.dat
log file: $Category
and the $Message
field.
2020-05-12 13:00:08.008 +03:00 [SA_Library.Meter] UseDefaultPLCLimits= TRUE Default= FALSE
2020-05-12 13:00:09.168 +03:00 [MultiMonitors] Context1= MyPlant:Screen1 Default=
2020-05-12 13:00:10.245 +03:00 [Format] InfoAlarm_HD1080= {PriorityandState,24}{OnTime,90}{Item,100}{Name,240} Default= {PriorityAndState,25}{OnDate,80}{OnTime,90}{Name,250}{State,40}{Cluster,70}{Equipment,220}{Item,160}{UserName,100}{Comment,250}{Category,60}
2020-05-12 13:00:10.247 +03:00 [AlarmHeading] OnTime= On Time Default= OnTime
Defining the regular expression as the constant PARAMS_REGEX and the absolute path to the log file as the constant PARAMS_PATH keeps the from_file input instance readable by avoiding long inline strings.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as PARAMS_REGEX
:
define PARAMS_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+).\
\s(?<Category>\[\w+\]|\[\w+.\w+.\w+\])\s+(?<Message>.*)/
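As a quick offline check (a sketch for illustration, separate from the NXLog configuration), the same pattern can be applied in Python to the first sample line above:

```python
import re

# Python rendering of PARAMS_REGEX; (?P<name>...) replaces PCRE's (?<name>...)
PARAMS_REGEX = re.compile(r"""^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+).
    \s(?P<Category>\[\w+\]|\[\w+.\w+.\w+\])\s+(?P<Message>.*)""", re.VERBOSE)

# First line of the input sample shown above
sample = ("2020-05-12 13:00:08.008 +03:00 "
          "[SA_Library.Meter] UseDefaultPLCLimits= TRUE Default= FALSE")
m = PARAMS_REGEX.match(sample)
```

The captured $Category and $Message values correspond to the fields in the JSON output shown at the end of this section.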
The first Exec directive of the from_file input instance replaces each \t (tab) character with a space.
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in PARAMS_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime
field.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define PARAMS_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+).\
\s(?<Category>\[\w+\]|\[\w+.\w+.\w+\])\s+(?<Message>.*)/
# Part of the log path defined as a constant
define PARAMS_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%PARAMS_PATH%\Params.dat'
# Replaces unwanted characters
Exec $raw_event = replace($raw_event, "\t", " ");
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %PARAMS_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-06T13:11:37.529135+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Category": "[SA_Library.Meter]",
"Message": "UseDefaultPLCLimits= TRUE Default= FALSE",
"EventTime": "2020-09-18T13:18:19.890000+02:00"
}
Reloadlog
The reloadlog
files contain server reload entries.
As indicated by the file-based logs table, the reloadlog
logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\
directory.
The following NXLog example applies to all logs having names that begin with reloadlog.
.
These log files contain the following fields:
-
ServerName
-
Message
This example parses two fields of interest, the $ServerName
and $Message
fields in addition to the fields created automatically by NXLog.
2020-05-12 12:59:48.375 +03:00 Alarm Server: Alarm added: PMP02_Fault (0).
2020-05-12 12:59:48.375 +03:00 Alarm Server: Alarm added: CompanyUnit1_VLV11_Open (0).
2020-05-12 12:59:48.375 +03:00 Alarm Server: Alarm added: PMP04_Fault (0).
2020-05-12 12:59:48.376 +03:00 Alarm Server: Alarm added: CompanyUnit1_DRV04_Running (0).
In the first part of the configuration, two constants are defined: RELOADLOG_REGEX
, the regular expression for parsing fields and their values, as well as RELOADLOG_PATH
, which stores the absolute path to the log file.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as RELOADLOG_REGEX
:
define RELOADLOG_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<ServerName>\w+\s\w+:)\s(?<Message>.*)/
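The behavior of this pattern can be verified outside NXLog with a short Python sketch (illustrative only) against the first sample entry above:

```python
import re

# RELOADLOG_REGEX translated to Python named-group syntax
RELOADLOG_REGEX = re.compile(r"""^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)
    \s(?P<ServerName>\w+\s\w+:)\s(?P<Message>.*)""", re.VERBOSE)

# First line of the input sample shown above
sample = "2020-05-12 12:59:48.375 +03:00 Alarm Server: Alarm added: PMP02_Fault (0)."
m = RELOADLOG_REGEX.match(sample)
```

The first (unnamed) capture group holds the timestamp string that the configuration passes to parsedate().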
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in RELOADLOG_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime
field.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define RELOADLOG_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<ServerName>\w+\s\w+:)\s(?<Message>.*)/
# Part of the log path defined as a constant
define RELOADLOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%RELOADLOG_PATH%\reloadlog.dat'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %RELOADLOG_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-09-29T15:23:26.050139+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Message": "Server: Alarm added: PMP02_Fault (0).",
"ServerName": "Alarm",
"EventTime": "2020-05-13T09:04:40.306000+02:00"
}
Tracelog.RuntimeManager
These logs contain information related to the operation of the Runtime Manager.
As indicated by the file-based logs table, the tracelog.RuntimeManager
logs are located in the C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\
directory.
The following NXLog example applies to all logs having names that begin with tracelog.RuntimeManager
.
These log files contain the following fields:
-
EventID
-
Severity
-
Type
-
Message
This example processes the tracelog.RuntimeManager.UI.dat
log file.
In this case, the configuration parses four fields in addition to the core fields added by NXLog.
2020-05-20 22:43:13.949 +03:00 412 0 Warning RuntimeManager The Citect process (Client) is not responding.
2020-05-20 23:00:29.691 +03:00 5544 0 Error RuntimeManager The Citect process (Client) is stopped.
2020-05-20 23:05:05.797 +03:00 5544 0 Error RuntimeManager The Citect process (Client) is stopped.
2020-09-18 13:38:13.264 +02:00 4572 0 Error RuntimeManager The Citect process (Cluster1.TrendServer1) is stopped.
2020-09-18 13:38:13.438 +02:00 4572 0 Error RuntimeManager The Citect process (Cluster1.ReportServer1) is stopped.
At the beginning of the configuration two constants are defined, TRACELOGRM_REGEX
and TRACELOGRM_PATH
.
The former is the regular expression used for parsing the messages and the latter is the absolute path to the log file.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as TRACELOGRM_REGEX
:
define TRACELOGRM_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<EventID>\w+)\s+\d\s(?<Severity>\w+)\s+(?<Type>\w+)\
\s+(?<Message>.*)/
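The pattern can be sanity-checked against the first sample line above with a short Python sketch (illustrative only); the final `int()` call mirrors the configuration's integer() conversion of the EventID field:

```python
import re

# TRACELOGRM_REGEX rewritten with Python named groups
TRACELOGRM_REGEX = re.compile(r"""^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)
    \s(?P<EventID>\w+)\s+\d\s(?P<Severity>\w+)\s+(?P<Type>\w+)
    \s+(?P<Message>.*)""", re.VERBOSE)

# First line of the input sample shown above
sample = ("2020-05-20 22:43:13.949 +03:00 412 0 Warning RuntimeManager "
          "The Citect process (Client) is not responding.")
m = TRACELOGRM_REGEX.match(sample)
# Mirror of the configuration's integer() conversion
event_id = int(m.group("EventID"))
```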
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in TRACELOGRM_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The parsedate() function is called to convert the captured timestamp to a datetime value that it assigns to the $EventTime
field.
In the next step, the string value captured for the EventID
field is converted to an integer. This can be helpful if further analysis is required, for instance, when querying a range of EventIDs.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define TRACELOGRM_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<EventID>\w+)\s+\d\s(?<Severity>\w+)\s+(?<Type>\w+)\
\s+(?<Message>.*)/
# Part of the log path defined as a constant
define TRACELOGRM_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%TRACELOGRM_PATH%\tracelog.RuntimeManager.UI.dat'
<Exec>
# Matches the events with a regular expression
if $raw_event =~ %TRACELOGRM_REGEX%
{
# Creates the timestamp
$EventTime = parsedate($1);
# Converts the EventID field to an integer (default is string)
$EventID = integer($EventID);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-09T13:55:39.129810+02:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"EventID": 412,
"Message": "The Citect process (Client) is not responding.",
"Severity": "Warning",
"Type": "RuntimeManager",
"EventTime": "2020-05-20T22:43:13.949000+02:00"
}
Alarm server database log
Alarm Server processes are responsible for determining alarm conditions and generating alarms.
The example below reads the alarm server database log from the DBLog.Alarm.Cluster.Server\DB_001.log
file located in the Schneider Electric log folder.
This log file contains the following data:
-
Timestamp (15-MAY-2020 18:02:18.593 in the sample below)
-
TID or thread ID (174C)
-
Thread name (AlarmCookieMap)
-
Message (MinCookieId 1, MaxCookieId 250000)
Below is a sample alarm server database message:
15-MAY-2020 18:02:18.593 174C [AlarmCookieMap] MinCookieId 1, MaxCookieId 250000
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as ALARM_REGEX
:
define ALARM_REGEX /(?x)^(\d+\-\w+\-\d+)\s+\
(\d+\:\d+\:\d+\.\d+)\s+(?<Three>\d+\w+)\
\s+\[?(?<Four>[\w\d\s]+)\]?[\:\s]+(?<Message>.*)/
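Before wiring the pattern into the configuration, it can be exercised in Python against the sample message above (an illustrative sketch only; Python's `(?P<name>...)` syntax replaces PCRE's `(?<name>...)`):

```python
import re

# ALARM_REGEX in Python form; groups 1 and 2 capture the date and time parts
ALARM_REGEX = re.compile(r"""^(\d+\-\w+\-\d+)\s+
    (\d+\:\d+\:\d+\.\d+)\s+(?P<Three>\d+\w+)
    \s+\[?(?P<Four>[\w\d\s]+)\]?[\:\s]+(?P<Message>.*)""", re.VERBOSE)

# The sample alarm server database message shown above
sample = "15-MAY-2020 18:02:18.593 174C [AlarmCookieMap] MinCookieId 1, MaxCookieId 250000"
m = ALARM_REGEX.match(sample)
```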
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in ALARM_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The timestamp string is captured by the regular expression as $1
and $2
.
However, parsedate() cannot readily parse them due to the non-standard string format.
The strptime() function allows a custom format to be defined as the second argument, which it then uses for parsing and converting the string from the first argument to the datetime data type.
This value is then assigned to the explicitly defined $EventTime
field.
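The conversion can be illustrated in Python (a sketch only, with the two captured parts joined by a space); the `%T` specifier from the configuration corresponds to `%H:%M:%S`, and `%f` here consumes the millisecond part:

```python
from datetime import datetime

# The two capture groups from the sample entry above, joined with a space.
# %H:%M:%S.%f stands in for the configuration's %T time format.
date_part, time_part = "15-MAY-2020", "18:02:18.593"
event_time = datetime.strptime(f"{date_part} {time_part}", "%d-%b-%Y %H:%M:%S.%f")
```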
The drop procedure discards records that do not match the ALARM_REGEX
regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record. The output stream is formatted as JSON, which can be saved to file, forwarded over the network, published to a Kafka topic, or sent to an Apache NiFi processor.
The following NXLog configuration combines all the steps described above.
define ALARM_REGEX /(?x)^(\d+\-\w+\-\d+)\s+\
(\d+\:\d+\:\d+\.\d+)\s+(?<Three>\d+\w+)\
\s+\[?(?<Four>[\w\d\s]+)\]?[\:\s]+(?<Message>.*)/
define ALARM_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs\
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%ALARM_PATH%\DBLog.Alarm.Cluster.Server\DB_001.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %ALARM_REGEX%
{
# Creates the timestamp
$EventTime = strptime($1 + " " + $2, "%d-%b-%Y %T");
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
Below is the output JSON-formatted message after processing with NXLog.
{
"EventReceivedTime": "2021-05-19T17:46:04.732418+03:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Four": "AlarmCookieMap",
"Message": "MinCookieId 1, MaxCookieId 250000",
"Three": "174C",
"EventTime": "2020-05-15T18:02:18.000000+03:00"
}
Floating License Manager reg log
The Floating License Manager_reg.log
log contains information related to the operation of the Floating License Manager.
As indicated by the file-based logs table, the Floating License Manager reg log is located in the C:\Users\%USERNAME%\AppData\Local\Temp
directory.
These are multiline logs: each entry contains a set of system information spanning multiple lines, which is only valuable when correctly grouped together as an interdependent set of attributes.
In this example, the fields are parsed, extracted, and structured as a single JSON object.
These log files contain the following fields:
-
EventTime
-
ServerName
-
Severity
-
EvetGroupID
-
InfoType
-
ComputerName
-
UserName
-
OSVersion
-
Hypervisor
-
Product
-
LogLevel
-
FNPAPIVersion
-
LicensingServiceVersion
-
SrvActBrickdllVersion
-
SrvActBrickXdllVersion
-
FnpCommsSoapdllVersion
-
Message1
-
Message2
In addition, most of the above parameters have an associated Line ID. These fields are named with the word LineID appended to the name of the entry.
For example the corresponding Line ID for Product
is ProductLineID
.
These IDs are collected as well since they can be useful for event data analysis later on, when searching for patterns.
This example processes the Floating License Manager_reg.log
log file.
Altogether, 33 fields of interest are parsed in this example.
05-15-2020,08:34:07:987,SrvActBr,INF, 150, 8944,---------------------------------------------------------------------------
05-15-2020,08:34:07:987,SrvActBr,INF, 151, 8944,GENERAL INFORMATION:
05-15-2020,08:34:07:987,SrvActBr,INF, 153, 8944,Computer Name: DESKTOP-VCF8F0G
05-15-2020,08:34:07:987,SrvActBr,INF, 155, 8944,User Name: Engineer
05-15-2020,08:34:07:987,SrvActBr,INF, 157, 8944,Windows: Microsoft (build 14393), 64-bit
05-15-2020,08:34:07:988,SrvActBr,INF, 164, 8944,Hypervisor: Oracle VirtualBox
05-15-2020,08:34:07:988,SrvActBr,INF, 169, 8944,Product: Floating License Manager 02.04
05-15-2020,08:34:07:988,SrvActBr,INF, 172, 8944,Log Level: 1
05-15-2020,08:34:07:988,SrvActBr,INF, 176, 8944,FNP API Version: v11.16.4.0 build 252457 i86_n3
05-15-2020,08:34:07:991,SrvActBr,INF, 187, 8944,Licensing Service Version: v11.16.4.0 build 252457 2019/07/09
05-15-2020,08:34:07:991,SrvActBr,INF, 213, 8944,SrvActBrick.dll Version: 2.4.0.0
05-15-2020,08:34:07:991,SrvActBr,INF, 232, 8944,SrvActBrickX.dll Version: 11.16.4.0 build 252457
05-15-2020,08:34:08:186,SrvActBr,INF, 236, 8944,FnpCommsSoap.dll Version: 11.16.4.0 build 252457
05-15-2020,08:34:08:186,SrvActBr,INF, 286, 8944,---------------------------------------------------------------------------
05-15-2020,08:34:08:264,SrvActBr,INF,2695, 8944,No stored composite transaction requests found
05-15-2020,08:34:08:270,SrvActBr,INF, 628, 8944,Found 0 fulfillments in Trusted Storage
The fields listed above are parsed with a regular expression, which is defined as the constant FLMRG_REGEX in the first line of the configuration.
The absolute path to the directory of the log file is also defined in the same way as FLMRG_PATH
.
To correctly process multiline event logs, a pattern needs to be defined as a regular expression that describes the header line of an event.
In the following xm_multiline extension instance, the HeaderLine
directive specifies the regular expression to be used for finding the header line of each event.
<Extension multiline>
Module xm_multiline
# Parsing the header line
HeaderLine /(\d+-\d+-\d+),(\d+:\d+:\d+:\d+),(\w+),INF,\s+150(.*)/
</Extension>
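The effect of this HeaderLine pattern can be demonstrated with a short Python sketch (illustrative only): of the sample lines above, only the one carrying LineID 150 is recognized as the start of a new event (the dash run from the sample is truncated here for brevity):

```python
import re

# The HeaderLine pattern: only lines carrying LineID 150 start a new event
HEADER_LINE = re.compile(r"(\d+-\d+-\d+),(\d+:\d+:\d+:\d+),(\w+),INF,\s+150(.*)")

lines = [
    "05-15-2020,08:34:07:987,SrvActBr,INF, 150, 8944,----------",
    "05-15-2020,08:34:07:987,SrvActBr,INF, 151, 8944,GENERAL INFORMATION:",
]
matches = [bool(HEADER_LINE.match(line)) for line in lines]
```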
In the from_file
input instance, the InputType
directive is used for referencing the xm_multiline extension instance by name, multiline
, which will enable the instance to establish the beginning and end of each event.
In this case, the line bearing LineID 150 is used as the message boundary.
Each subsequent log message is appended to the current event record until the next header line is detected.
After the multiline events are read, the first Exec block in the from_file input instance removes the \r characters and replaces the \n and \t characters with spaces, collapsing each event onto a single line.
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as FLMRG_REGEX
:
define FLMRG_REGEX /(?x)^(?<EventTime>\d+-\d+-\d+,\d+:\d+:\d+:\d+),\
(?<ServerName>\w+),(?<Severity>\w+),\
\s(?<EventGroupLineID>150),\s(?<EvetGroupID>\d+),\-+\s\d+.\
\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\s+(?<InfoTypeLineID>\d+)\
,\s+\d+,(?<InfoType>\w+\s+\w+):\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\
\d*,\w+,\w+,\s+(?<ComputernameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<ComputerName>\w+.+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<UserNameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<UserName>\w+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<OSVersionLineID>\d+),\s\d+,\w+:\s+(?<OSVersion>\w+\s+\
\(\w+\s\d+\),\s\d+-\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,\s+(?<HypervisorLineID>\d+),\s+\d+,\w+:\s+(?<Hypervisor>\
\w+.\w+|\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<ProductLineID>\d+),\s+\d+,\w+:\s+(?<Product>\w+\s\w+\s\
\w+\s\d+.\d+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<LogLevelLineID>\d+),\s\d+,\w+\s\w+:\s+(?<LogLevel>\d+)\s+\
\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<FNPAPIVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<FNPAPIVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\w+)\s\d+.\d+.\
\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<LicensingServiceVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<LicensingServiceVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\d+\
\/\d+\/\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<SrvActBrickdllVersionLineID>\d+),\s\d+,\w+.\w+\s+\w+:\
\s+(?<SrvActBrickdllVersion>\d+.\d+.\d+.\d+)\s+\d+.\d+.\d+\d+.\
\d+.\d+.\d+.\d*,\w+,\w+,\s(?<SrvActBrickXdllVersionLineID>\
\d+),\s\d+,\w+.\w+\s+\w+:\s+(?<SrvActBrickXdllVersion>\d+.\
\d+.\d+.\d+\s+\w+\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<FnpCommsSoapdllVersionLineID>\d+),\s+\d+,\w+.\
\w+\s+\w+:\s+(?<FnpCommsSoapdllVersion>\d+.\d+.\d+.\d+\s+\w+\
\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\s+\d+,\s+\
\d+,\-+\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,(?<Message1LineID>\d+),\s\d+,(?<Message1>\w+\s\w+\s\w+\s\
\w+\s\w+\s\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<Message2LineID>\d+),\s+\d+,(?<Message2>.*)/
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in FLMRG_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The original event time is processed with the strptime() function from the $EventTime
capture group which converts the string to a datetime value.
Likewise the fields having names ending in LineID
, for instance, ComputernameLineID
, EventGroupLineID
, etc., should contain integer values.
The integer() function is used for converting their string values to integers.
This will facilitate any further processing or queries that might need to employ conditional statements requiring numeric comparison operators.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record and formatted as JSON prior to being routed to any output instances.
The following NXLog configuration combines all the steps described above.
# A regular expression defined as a constant to read the content of the logs
define FLMRG_REGEX /(?x)^(?<EventTime>\d+-\d+-\d+,\d+:\d+:\d+:\d+),\
(?<ServerName>\w+),(?<Severity>\w+),\
\s(?<EventGroupLineID>150),\s(?<EvetGroupID>\d+),\-+\s\d+.\
\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\s+(?<InfoTypeLineID>\d+)\
,\s+\d+,(?<InfoType>\w+\s+\w+):\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\
\d*,\w+,\w+,\s+(?<ComputernameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<ComputerName>\w+.+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<UserNameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<UserName>\w+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<OSVersionLineID>\d+),\s\d+,\w+:\s+(?<OSVersion>\w+\s+\
\(\w+\s\d+\),\s\d+-\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,\s+(?<HypervisorLineID>\d+),\s+\d+,\w+:\s+(?<Hypervisor>\
\w+.\w+|\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<ProductLineID>\d+),\s+\d+,\w+:\s+(?<Product>\w+\s\w+\s\
\w+\s\d+.\d+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<LogLevelLineID>\d+),\s\d+,\w+\s\w+:\s+(?<LogLevel>\d+)\s+\
\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<FNPAPIVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<FNPAPIVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\w+)\s\d+.\d+.\
\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<LicensingServiceVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<LicensingServiceVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\d+\
\/\d+\/\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<SrvActBrickdllVersionLineID>\d+),\s\d+,\w+.\w+\s+\w+:\
\s+(?<SrvActBrickdllVersion>\d+.\d+.\d+.\d+)\s+\d+.\d+.\d+\d+.\
\d+.\d+.\d+.\d*,\w+,\w+,\s(?<SrvActBrickXdllVersionLineID>\
\d+),\s\d+,\w+.\w+\s+\w+:\s+(?<SrvActBrickXdllVersion>\d+.\
\d+.\d+.\d+\s+\w+\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<FnpCommsSoapdllVersionLineID>\d+),\s+\d+,\w+.\
\w+\s+\w+:\s+(?<FnpCommsSoapdllVersion>\d+.\d+.\d+.\d+\s+\w+\
\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\s+\d+,\s+\
\d+,\-+\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,(?<Message1LineID>\d+),\s\d+,(?<Message1>\w+\s\w+\s\w+\s\
\w+\s\w+\s\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<Message2LineID>\d+),\s+\d+,(?<Message2>.*)/
# Part of the log path defined as a constant
define FLMRG_PATH C:\Users\<USERNAME>\AppData\Local\Temp
<Extension json>
Module xm_json
</Extension>
<Extension multiline>
Module xm_multiline
# Regular expression to look for the header of the message
HeaderLine /(\d+-\d+-\d+),(\d+:\d+:\d+:\d+),(\w+),INF,\s+150(.*)/
</Extension>
<Input from_file>
Module im_file
File '%FLMRG_PATH%\Floating License Manager_reg.log'
# Specifies that the input is first parsed by the regular
# expression in the multiline extension module
InputType multiline
<Exec>
# Replaces unwanted characters
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
$raw_event = replace($raw_event, "\t", " ");
</Exec>
<Exec>
# Matches the events with a regular expression
if $raw_event =~ %FLMRG_REGEX%
{
# Creates the timestamp
$EventTime = strptime($EventTime, "%m-%d-%Y,%H:%M:%S");
# Saves the LineID fields as integers (default is string)
$EventGroupLineID = integer($EventGroupLineID);
$EvetGroupID = integer($EvetGroupID);
$InfoTypeLineID = integer($InfoTypeLineID);
$ComputernameLineID = integer($ComputernameLineID);
$UserNameLineID = integer($UserNameLineID);
$OSVersionLineID = integer($OSVersionLineID);
$HypervisorLineID = integer($HypervisorLineID);
$ProductLineID = integer($ProductLineID);
$LogLevelLineID = integer($LogLevelLineID);
$FNPAPIVersionLineID = integer($FNPAPIVersionLineID);
$LicensingServiceVersionLineID = integer($LicensingServiceVersionLineID);
$SrvActBrickdllVersionLineID = integer($SrvActBrickdllVersionLineID);
$SrvActBrickXdllVersionLineID = integer($SrvActBrickXdllVersionLineID);
$FnpCommsSoapdllVersionLineID = integer($FnpCommsSoapdllVersionLineID);
$Message1LineID = integer($Message1LineID);
$Message2LineID = integer($Message2LineID);
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
The sample below depicts a processed event in JSON format.
{
"EventReceivedTime": "2020-10-05T14: 06: 54.146916+02: 00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"ComputerName": "DESKTOP-VCF8F0G",
"ComputernameLineID": 153,
"EventGroupLineID": 150,
"EventTime": "2020-05-15T08: 34: 07.000000+02: 00",
"EvetGroupID": 8944,
"FNPAPIVersion": "v11.16.4.0 build 252457 i86_n3",
"FNPAPIVersionLineID": 176,
"FnpCommsSoapdllVersion": "11.16.4.0 build 252457",
"FnpCommsSoapdllVersionLineID": 236,
"Hypervisor": "Oracle VirtualBox",
"HypervisorLineID": 164,
"InfoType": "GENERAL INFORMATION",
"InfoTypeLineID": 151,
"LicensingServiceVersion": "v11.16.4.0 build 252457 2019/07/09",
"LicensingServiceVersionLineID": 187,
"LogLevel": "1",
"LogLevelLineID": 172,
"Message1": "No stored composite transaction requests found",
"Message1LineID": 2695,
"Message2": "Found 0 fulfillments in Trusted Storage",
"Message2LineID": 628,
"OSVersion": "Microsoft (build 14393), 64-bit",
"OSVersionLineID": 157,
"Product": "Floating License Manager 02.04",
"ProductLineID": 169,
"ServerName": "SrvActBr",
"Severity": "INF",
"SrvActBrickXdllVersion": "11.16.4.0 build 252457",
"SrvActBrickXdllVersionLineID": 232,
"SrvActBrickdllVersion": "2.4.0.0",
"SrvActBrickdllVersionLineID": 213,
"UserName": "Engineer",
"UserNameLineID": 155
}
Software update log
The Schneider Electric Software Update (SESU) system automatically notifies customers about updates, provides download and installation capabilities, and implements the Schneider Electric software improvement program.
The example below demonstrates how to collect and process software update log entries from the Software Update\Logs\SESU.log file located in the Schneider Electric log folder.
This log file contains the following pieces of data:
-
Date (2020-05-12 in the sample below)
-
Time (09:13:17,597)
-
PID or process ID (5976)
-
Thread (1)
-
Severity (INFO)
-
Message
Below is a sample message which is generated by the software update system.
2020-05-12 09:13:17,597 [5976/ 1] INFO Load in UpdateManagerPersistent: decompressed file not found, path=C:\Users\Engineer\AppData\Local\Schneider Electric\Software Update\UpdateManagerData.xml (Load, line 0)
Because these logs exhibit consistent data patterns, specific fields can be parsed using the following regular expression defined as UPDATE_REGEX
:
define UPDATE_REGEX /(?x)^(\d+\-\d+\-\d+)\s+(\d+\:\d+\:\d+),\
\d+\s+\[(?<Three>.*)\]\s+(?<Severity>\w+)\
\s+(?<Message>.*)\((?<Four>.*)\)/
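Applied in Python to the sample message above (an illustrative sketch, separate from the NXLog configuration), the pattern yields the field values shown in the JSON output at the end of this section:

```python
import re

# UPDATE_REGEX with Python named groups; groups 1 and 2 capture date and time
UPDATE_REGEX = re.compile(r"""^(\d+\-\d+\-\d+)\s+(\d+\:\d+\:\d+),
    \d+\s+\[(?P<Three>.*)\]\s+(?P<Severity>\w+)
    \s+(?P<Message>.*)\((?P<Four>.*)\)""", re.VERBOSE)

# The sample software update message shown above
sample = ("2020-05-12 09:13:17,597 [5976/ 1] INFO Load in UpdateManagerPersistent: "
          "decompressed file not found, path=C:\\Users\\Engineer\\AppData\\Local\\"
          "Schneider Electric\\Software Update\\UpdateManagerData.xml (Load, line 0)")
m = UPDATE_REGEX.match(sample)
```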
The logic for parsing and filtering is defined within the Exec block of the from_file
instance of the im_file input module. All parsed fields that do not require additional processing are captured using the named capturing groups defined in UPDATE_REGEX
.
Each field name defined within angle brackets (< >) determines which event field the captured value is assigned to.
The timestamp string is captured by the regular expression as $1
and $2
.
However, parsedate() cannot readily parse them due to the non-standard string format.
The strptime() function allows a custom format to be defined as the second argument, which it then uses for parsing and converting the string from the first argument to the datetime data type.
This value is then assigned to the explicitly defined $EventTime
field.
The drop procedure discards records that do not match the UPDATE_REGEX
regular expression.
Then, by calling the to_json() procedure of the JSON (xm_json) extension module, the newly parsed fields along with the core fields are added to the event record. The output stream is formatted as JSON, which can be saved to file, forwarded over the network, or published to a Kafka topic.
The following NXLog configuration combines all the steps described above.
define UPDATE_REGEX /(?x)^(\d+\-\d+\-\d+)\s+(\d+\:\d+\:\d+),\
\d+\s+\[(?<Three>.*)\]\s+(?<Severity>\w+)\
\s+(?<Message>.*)\((?<Four>.*)\)/
define UPDATE_PATH C:\Users\User\AppData\Local\Schneider Electric
<Extension json>
Module xm_json
</Extension>
<Input from_file>
Module im_file
File '%UPDATE_PATH%\Software Update\Logs\SESU.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %UPDATE_REGEX%
{
# Creates the timestamp
$EventTime = strptime($1 + " " + $2, "%Y-%m-%d %T");
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
Below is the JSON-formatted message sample after it has been processed by NXLog.
{
"EventReceivedTime": "2021-05-19T19:45:59.822635+03:00",
"SourceModuleName": "from_file",
"SourceModuleType": "im_file",
"Four": "Load, line 0",
"Message": "Load in UpdateManagerPersistent: decompressed file not found, path=C:\\Users\\Engineer\\AppData\\Local\\Schneider Electric\\Software Update\\UpdateManagerData.xml ",
"Severity": "INFO",
"Three": "5976/ 1",
"EventTime": "2020-05-12T09:13:17.000000+03:00"
}
Processing all Citect SCADA logs with a single configuration
This configuration combines the configuration files from the File-based logs section. It provides the same functionality as the individual configurations, but from a single configuration file. The same technique can be used to combine any other sources and outputs.
In this case, an output instance is also included that sends the collected logs to Microsoft Azure Event Hubs using the om_kafka module. For more information on configuring NXLog to send data to Microsoft Azure Event Hubs, see the Microsoft Azure Event Hubs integration guide in the NXLog documentation.
Although this configuration contains five routes, all of which define the same output destination, it is written this way to enhance the readability of the configuration file. Combining all thirteen input instances as a list of inputs in a single route would achieve the same functionality.
The example below combines the processing of all the files listed in this topic.
For easier readability, the main sections of the example are separated with comment headers.
# ----------- REGULAR EXPRESSIONS FOR PARSING DATA -------------------
define SYSLOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.)\s([+]\d+.\
\d*)\s(?<Message>.*)/
define TRACELOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.)\s([+]\d+.\d*)\
\s(?<PID>\d+)\s{3}\d\s(?<Severity>\w+)\s*\
(?<ProcessName>\w+)\s*(?<Message>.*)/
define DEBUGLOG_REGEX /(?x)^(\d+.\d+.\d+.)(\d+.\d+.\d+.\d+.\d+.)([+]\
\d+.\d+.)\s(?<Message>.*)/
define IPCLOG_REGEX /(?x)^(\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+)\s+([+]\d+.\d+)\
\s+(?<Process>\S+)\s+(?<Message>.*)/
define TCPDRIVER_REGEX /(?x)^(\d+.[\/]\d+.[\/]\d+\d+.\d+.\d+.\d+.\d*)\
\s+(?<SourceType>\S+)\s+(?<Message>.*)/
define PARAMS_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+).\
\s(?<Category>\[\w+\]|\[\w+.\w+.\w+\])\s+(?<Message>.*)/
define RELOADLOG_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<ServerName>\w+\s\w+:)\s(?<Message>.*)/
define TRACELOGRM_REGEX /(?x)^(\d+.\d+.\d+\s\d+.\d+.\d+.\d+.\d+\s[+]\d+.\d+)\
\s(?<EventID>\w+)\s+\d\s(?<Severity>\w+)\s+(?<Type>\w+)\
\s+(?<Message>.*)/
define CHANGELOG_REGEX /(?x)^(?<Action>\w+)\t+\+\t+(?<Severity>\w+)\
\s+(?<Project_Name>\w+)\t+(?<Submenu>\w+\s?\w+)\
\t(?<Component>\w+)\t(?<Property>\w+)\:\
\s(?<Value_Change>.*)/
define KERNEL_REGEX /(?x)^(?<Mode>\w+)\s+(?<Name>[\w+\.]+)\s+(?<Hnd>\d+)\
\s+(?<State>\w+)\s+(?<Prty>\w+)\s+(?<Cpu>[\d\.]+)\s+\
(?<Min>[\d\.]+)\s+(?<Max>[\d\.]+)\s+(?<Avg>[\d\.]+)\s+\
(?<Count>\d+)/
define ALARM_REGEX /(?x)^(\d+\-\w+\-\d+)\s+\
(\d+\:\d+\:\d+\.\d+)\s+(?<Three>\d+\w+)\
\s+\[?(?<Four>[\w\d\s]+)\]?[\:\s]+(?<Message>.*)/
define UPDATE_REGEX /(?x)^(\d+\-\d+\-\d+)\s+(\d+\:\d+\:\d+),\
\d+\s+\[(?<Three>.*)\]\s+(?<Severity>\w+)\
\s+(?<Message>.*)\((?<Four>.*)\)/
define FLMRG_REGEX /(?x)^(?<EventTime>\d+-\d+-\d+,\d+:\d+:\d+:\d+),\
(?<ServerName>\w+),(?<Severity>\w+),\
\s(?<EventGroupLineID>150),\s(?<EvetGroupID>\d+),\-+\s\d+.\
\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\s+(?<InfoTypeLineID>\d+)\
,\s+\d+,(?<InfoType>\w+\s+\w+):\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\
\d*,\w+,\w+,\s+(?<ComputernameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<ComputerName>\w+.+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<UserNameLineID>\d+),\s+\d+,\w+\s+\w+:\
\s+(?<UserName>\w+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<OSVersionLineID>\d+),\s\d+,\w+:\s+(?<OSVersion>\w+\s+\
\(\w+\s\d+\),\s\d+-\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,\s+(?<HypervisorLineID>\d+),\s+\d+,\w+:\s+(?<Hypervisor>\
\w+.\w+|\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<ProductLineID>\d+),\s+\d+,\w+:\s+(?<Product>\w+\s\w+\s\
\w+\s\d+.\d+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<LogLevelLineID>\d+),\s\d+,\w+\s\w+:\s+(?<LogLevel>\d+)\s+\
\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<FNPAPIVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<FNPAPIVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\w+)\s\d+.\d+.\
\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s+(?<LicensingServiceVersionLineID>\d+),\s+\d+,\w+.\w+.\w+:\
\s+(?<LicensingServiceVersion>v\d+.\d+.\d+.\d+\s\w+\s\d+\s\d+\
\/\d+\/\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\w+,\
\s(?<SrvActBrickdllVersionLineID>\d+),\s\d+,\w+.\w+\s+\w+:\
\s+(?<SrvActBrickdllVersion>\d+.\d+.\d+.\d+)\s+\d+.\d+.\d+\d+.\
\d+.\d+.\d+.\d*,\w+,\w+,\s(?<SrvActBrickXdllVersionLineID>\
\d+),\s\d+,\w+.\w+\s+\w+:\s+(?<SrvActBrickXdllVersion>\d+.\
\d+.\d+.\d+\s+\w+\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\
\w+,\w+,\s+(?<FnpCommsSoapdllVersionLineID>\d+),\s+\d+,\w+.\
\w+\s+\w+:\s+(?<FnpCommsSoapdllVersion>\d+.\d+.\d+.\d+\s+\w+\
\s+\d+)\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\s+\d+,\s+\
\d+,\-+\s+\d+.\d+.\d+\d+.\d+.\d+.\d+.\d*,\w+,\
\w+,(?<Message1LineID>\d+),\s\d+,(?<Message1>\w+\s\w+\s\w+\s\
\w+\s\w+\s\w+)\s\d+.\d+.\d+\d+.\d+.\d+.\d+.\d+,\w+,\w+,\
\s+(?<Message2LineID>\d+),\s+\d+,(?<Message2>.*)/
# ------------------- PATHS TO LOG FILES --------------------------
define LOG_PATH C:\ProgramData\Schneider Electric\Citect SCADA 2018\Logs
define FLMRG_PATH C:\Users\Engineer\AppData\Local\Temp
define UPDATE_PATH C:\Users\User\AppData\Local\Schneider Electric
# -------------------- EXTENSION MODULES --------------------------
<Extension json>
Module xm_json
</Extension>
<Extension _IPCLOG_multiline>
Module xm_multiline
HeaderLine /(\d+.\d+.\d+)\s(\d+.\d+.\d+.\d+.\d+)\s([+]\d+.\d+)\s(.*)/
</Extension>
<Extension _TCPIPDRIVER_multiline>
Module xm_multiline
HeaderLine /(\d+.[\/]\d+.[\/]\d+)\s(\d+.\d+.\d+.\d+.\d*)/
</Extension>
<Extension _FLMRG_multiline>
Module xm_multiline
HeaderLine /(\d+-\d+-\d+),(\d+:\d+:\d+:\d+),(\w+),INF,\s+150(.*)/
</Extension>
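The xm_multiline instances above assemble multi-line entries by treating each line that matches HeaderLine as the start of a new record. A minimal Python sketch of that behavior, using a hypothetical header pattern and sample lines (not taken from an actual ipc.log):

```python
import re

# Hypothetical header pattern: date, time, and UTC offset at line start
HEADER = re.compile(r"^\d+-\d+-\d+\s\d+:\d+:\d+\.\d+\s\+\d+:\d+\s")

def assemble(lines):
    """Group lines into records; a header line starts a new record.
    Continuation lines are joined with spaces, mirroring the Exec
    blocks that strip \\r and \\n from $raw_event."""
    records, current = [], []
    for line in lines:
        if HEADER.match(line) and current:
            records.append(" ".join(current))
            current = []
        current.append(line.rstrip("\r\n"))
    if current:
        records.append(" ".join(current))
    return records

sample = [
    "2021-05-01 10:00:00.123 +03:00 first entry",
    "    continuation of the first entry",
    "2021-05-01 10:00:01.456 +03:00 second entry",
]
```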
# ---------------------- INPUT MODULES -----------------------------
<Input SYSLOG_from_file>
Module im_file
File '%LOG_PATH%\syslog.dat'
<Exec>
if $raw_event =~ /.*[*]{4,}.*/ drop();
if $raw_event =~ %SYSLOG_REGEX%
{
$EventTime = parsedate($1 + $2 + $3);
to_json();
}
</Exec>
</Input>
<Input TRACELOG_from_file>
Module im_file
File '%LOG_PATH%\IDEtracelog.dat'
<Exec>
if $raw_event =~ %TRACELOG_REGEX%
{
$EventTime = parsedate($1 + $2 + $3);
$PID = integer($PID);
to_json();
}
</Exec>
</Input>
<Input DEBUGLOG_from_file>
Module im_file
File '%LOG_PATH%\debug.log'
<Exec>
if $raw_event =~ %DEBUGLOG_REGEX%
{
$EventTime = parsedate($1 + $2 + $3);
to_json();
}
</Exec>
</Input>
<Input IPCLOG_from_file>
Module im_file
File '%LOG_PATH%\ipc.log'
InputType _IPCLOG_multiline
<Exec>
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
</Exec>
<Exec>
if $raw_event =~ %IPCLOG_REGEX%
{
$EventTime = parsedate($1 + $2);
to_json();
}
</Exec>
</Input>
<Input TCPIPDRIVER_from_file>
Module im_file
File '%LOG_PATH%\Drivers\TCPIP.Cluster1.IOServer1.DAT'
InputType _TCPIPDRIVER_multiline
<Exec>
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
$raw_event = replace($raw_event, "\t", " ");
</Exec>
<Exec>
if $raw_event =~ %TCPDRIVER_REGEX%
{
$EventTime = strptime($1, "%Y/%m/%d %H:%M:%S");
to_json();
}
</Exec>
</Input>
<Input PARAMS_from_file>
Module im_file
File '%LOG_PATH%\Params.dat'
Exec $raw_event = replace($raw_event, "\t", " ");
<Exec>
if $raw_event =~ %PARAMS_REGEX%
{
$EventTime = parsedate($1);
to_json();
}
</Exec>
</Input>
<Input RELOADLOG_from_file>
Module im_file
File '%LOG_PATH%\reloadlog.dat'
<Exec>
if $raw_event =~ %RELOADLOG_REGEX%
{
$EventTime = parsedate($1);
to_json();
}
</Exec>
</Input>
<Input TRACELOGRM_from_file>
Module im_file
File '%LOG_PATH%\tracelog.RuntimeManager.UI.dat'
<Exec>
if $raw_event =~ %TRACELOGRM_REGEX%
{
$EventTime = parsedate($1);
$EventID = integer($EventID);
to_json();
}
</Exec>
</Input>
<Input FLMRG_from_file>
Module im_file
File '%FLMRG_PATH%\Floating License Manager_reg.log'
InputType _FLMRG_multiline
<Exec>
$raw_event = replace($raw_event, "\r", "");
$raw_event = replace($raw_event, "\n", " ");
$raw_event = replace($raw_event, "\t", " ");
</Exec>
<Exec>
if $raw_event =~ %FLMRG_REGEX%
{
$EventTime = strptime($EventTime, "%m-%d-%Y,%H:%M:%S");
$EventGroupLineID = integer($EventGroupLineID);
$EvetGroupID = integer($EvetGroupID);
$InfoTypeLineID = integer($InfoTypeLineID);
$ComputernameLineID = integer($ComputernameLineID);
$UserNameLineID = integer($UserNameLineID);
$OSVersionLineID = integer($OSVersionLineID);
$HypervisorLineID = integer($HypervisorLineID);
$ProductLineID = integer($ProductLineID);
$LogLevelLineID = integer($LogLevelLineID);
$FNPAPIVersionLineID = integer($FNPAPIVersionLineID);
$LicensingServiceVersionLineID = integer($LicensingServiceVersionLineID);
$SrvActBrickdllVersionLineID = integer($SrvActBrickdllVersionLineID);
$SrvActBrickXdllVersionLineID = integer($SrvActBrickXdllVersionLineID);
$FnpCommsSoapdllVersionLineID = integer($FnpCommsSoapdllVersionLineID);
$Message1LineID = integer($Message1LineID);
$Message2LineID = integer($Message2LineID);
to_json();
}
</Exec>
</Input>
<Input CHANGELOG_from_file>
Module im_file
File '%LOG_PATH%\ChangeLogs\test.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %CHANGELOG_REGEX%
{
# Creates the timestamp
$EventTime = $EventReceivedTime;
# Formats the result as JSON
to_json();
}
</Exec>
</Input>
<Input KERNEL_from_file>
Module im_file
File '%LOG_PATH%\kernel.dat'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %KERNEL_REGEX%
{
# Creates the timestamp
$EventTime = $EventReceivedTime;
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
<Input ALARM_from_file>
Module im_file
File '%LOG_PATH%\DBLog.Alarm.Cluster.Server\DB_001.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %ALARM_REGEX%
{
# Creates the timestamp
$EventTime = strptime($1 + $2, "%d-%b-%Y %T");
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
<Input UPDATE_from_file>
Module im_file
File '%UPDATE_PATH%\Software Update\Logs\SESU.log'
<Exec>
# Matches the events with the regular expression
if $raw_event =~ %UPDATE_REGEX%
{
# Creates the timestamp
$EventTime = strptime($1 + $2, "%Y-%m-%d %T");
# Formats the result as JSON
to_json();
}
else drop();
</Exec>
</Input>
# ---------------------- OUTPUT MODULE -----------------------------
<Output to_eventhubs>
Module om_kafka
BrokerList <YOURNAMESPACE>.servicebus.windows.net:9093
Topic <YOUREVENTHUB>
Option security.protocol SASL_SSL
Option group.id <YOURCONSUMERGROUP>
Option sasl.mechanisms PLAIN
Option sasl.username $ConnectionString
Option sasl.password <YOUR Connection string–primary key>
CAFile C:\Program Files\nxlog\cert\<ca.pem>
</Output>
# ---------------------- ROUTES -----------------------------
<Route r1>
Path SYSLOG_from_file, TRACELOG_from_file, DEBUGLOG_from_file => to_eventhubs
</Route>
<Route r2>
Path IPCLOG_from_file, TCPIPDRIVER_from_file, PARAMS_from_file => to_eventhubs
</Route>
<Route r3>
Path RELOADLOG_from_file, TRACELOGRM_from_file, FLMRG_from_file => to_eventhubs
</Route>
<Route r4>
Path CHANGELOG_from_file, KERNEL_from_file, ALARM_from_file => to_eventhubs
</Route>
<Route r5>
Path UPDATE_from_file => to_eventhubs
</Route>
Network monitoring
NXLog’s im_pcap module can passively monitor network traffic and generate logs for various protocols. This provides another source for collecting events, based on the network communication to and from Citect SCADA devices and controller computers.
Citect SCADA can communicate with a variety of I/O devices, including PLCs (programmable logic controllers), loop controllers, bar code readers, scientific analyzers, remote terminal units (RTUs), and distributed control systems (DCS).
The I/O device communication in a Citect SCADA ecosystem typically consists of:
-
Citect SCADA I/O Server: a computer that directly connects to an I/O device.
-
I/O Device: a device that directly monitors and controls technological processes.
-
Transport medium: the physical communication medium and the low-level logic required to control it.
-
Protocol: the type of messages exchanged between the I/O server and I/O device.
The following protocols are supported by NXLog and are considered in the examples below:
Modbus TCP
The Modbus protocol is a bus-topology communication protocol based on a master/slave architecture. In Modbus TCP, data is transmitted over the Ethernet interface as TCP/IP packets.
The Citect SCADA MODNET driver supports TCP/IP communication with a range of Modbus TCP-compatible devices.
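The fields im_pcap reports for Modbus map directly onto the MBAP header and PDU of each TCP frame. The following Python sketch decodes a hypothetical Read Holding Registers query built to match the values in the sample output below (transaction ID 204, unit ID 1, 3 registers starting at address 0):

```python
import struct

# Hypothetical Modbus TCP frame: 7-byte MBAP header followed by the PDU
frame = bytes.fromhex("00cc"      # transaction id = 204
                      "0000"      # protocol id = 0
                      "0006"      # length = 6 (unit id + PDU)
                      "01"        # unit id = 1
                      "03"        # function code: Read Holding Registers
                      "0000"      # starting address = 0
                      "0003")     # quantity of registers = 3

# MBAP fields and function code are big-endian
trans_id, prot_id, length, unit_id, func = struct.unpack(">HHHBB", frame[:8])
start_addr, qty = struct.unpack(">HH", frame[8:12])
```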
In this example, the im_pcap input module is configured to listen on the
\Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
network interface for
network traffic that uses the Modbus protocol. The result is formatted as JSON,
then saved to a file.
<Extension _json>
Module xm_json
</Extension>
<Input pcap>
Module im_pcap
# Specifies the name of a network device/interface
Dev \Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
# Specifies the protocol type
<Protocol>
Type modbus
</Protocol>
</Input>
<Output file>
Module om_file
File "C:\output.txt"
# Formats the result as JSON
Exec to_json();
</Output>
Below are samples of a Modbus query and its response after processing by NXLog.
{
"modbus.function_code": "Read Holding Registers (03)",
"modbus.length": "6",
"modbus.prot_id": "0",
"modbus.query.read_holding_regs.qty_of_regs": "3",
"modbus.query.read_holding_regs.starting_address": "0",
"modbus.trans_id": "204",
"modbus.unit_id": "1",
"EventTime": "2021-07-06T11:47:18.251209+03:00",
"EventReceivedTime": "2021-07-06T11:47:18.367490+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
{
"modbus.function_code": "Read Holding Registers (03)",
"modbus.length": "9",
"modbus.prot_id": "0",
"modbus.response.read_holding_regs.byte_count": "6",
"modbus.response.read_holding_regs.registers": "2278, 2276, 2273",
"modbus.trans_id": "204",
"modbus.unit_id": "1",
"EventTime": "2021-07-06T11:47:18.263063+03:00",
"EventReceivedTime": "2021-07-06T11:47:18.367490+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
BACnet
The BACnet protocol is an accepted ISO standard for Building Automation and Control Systems (BACS). It allows integration with building management systems for heating and air conditioning, ventilation, lighting, security, and fire detection.
The Citect SCADA BACNET driver provides communication with BACnet networks and devices using the BACnet/IP option.
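The bacnet.bvlc.* fields in the output below correspond to the 4-byte BACnet Virtual Link Control (BVLC) header that precedes every BACnet/IP message. A minimal sketch, using the values from the sample event (Original-Unicast-NPDU, length 52):

```python
import struct

# 4-byte BVLC header: type, function, and total length (big-endian)
bvlc = bytes([0x81,         # BVLC type: BACnet/IP (Annex J)
              0x0A,         # function: Original-Unicast-NPDU
              0x00, 0x34])  # BVLC length = 52, including this header
btype, function, length = struct.unpack(">BBH", bvlc)
```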
In this example, the im_pcap input module is configured to listen on the
\Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
network interface for
network traffic that uses the BACnet protocol. The result is formatted as JSON,
then saved to a file.
<Extension _json>
Module xm_json
</Extension>
<Input pcap>
Module im_pcap
# Specifies the name of a network device/interface
Dev \Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
# Specifies the protocol type
<Protocol>
Type bacnet
</Protocol>
</Input>
<Output file>
Module om_file
File "C:\output.txt"
# Formats the result as JSON
Exec to_json();
</Output>
Below are samples of a BACnet confirmed request and its acknowledgment after processing by NXLog.
{
"bacnet.apdu.bacnet_confirmed_request.invoke_id": "132",
"bacnet.apdu.bacnet_confirmed_request.max_resp": "1476",
"bacnet.apdu.bacnet_confirmed_request.max_segs": "64",
"bacnet.apdu.bacnet_confirmed_request.more_segments_follow": "false",
"bacnet.apdu.bacnet_confirmed_request.segmented": "false",
"bacnet.apdu.bacnet_confirmed_request.segmented_accepted": "true",
"bacnet.apdu.bacnet_confirmed_request.service_choice": "Confirmed COV Notification (1)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.0.subscriber_process_id": "3",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.1.initiating_device_identifier.instance_number": "1",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.1.initiating_device_identifier.object_id": "device (8)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.2.monitored_device_identifier.instance_number": "0",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.2.monitored_device_identifier.object_id": "analog-value (2)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.3.time_remaining": "3561",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.4": "Opening Tag (4)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.0.property_identifier": "present-value (85)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.0.property_value.records.0": "Opening Tag (2)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.0.property_value.records.1": "-55000.000000",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.0.property_value.records.2": "Closing Tag (2)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.1.property_identifier": "status-flags (111)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.1.property_value.records.0": "Opening Tag (2)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.1.property_value.records.1": "in-alarm (0): false, fault (1): false, overriden (2): false, out-of-service (3): false",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.5.1.property_value.records.2": "Closing Tag (2)",
"bacnet.apdu.bacnet_confirmed_request.service_request.records.6": "Closing Tag (4)",
"bacnet.apdu.pdu_type": "BACnet-Confirmed-Request-PDU (0x00)",
"bacnet.bvlc.function": "Original-Unicast-NPDU (0x0A)",
"bacnet.bvlc.length": "52",
"bacnet.bvlc.type": "BACnet/IP (Annex J) (0x81)",
"bacnet.npdu.control": "0x000C",
"bacnet.npdu.control.contains": "BACnet APDU message (0)",
"bacnet.npdu.control.dst_spec": "DNET, DLEN, DADR, Hop Count absent (0)",
"bacnet.npdu.control.prio": "Normal message",
"bacnet.npdu.control.reply_expected": "Yes (1)",
"bacnet.npdu.control.src_spec": "SNET, SLEN, SADR present (1)",
"bacnet.npdu.version": "0x0001",
"EventTime": "2021-07-07T11:10:29.740191+03:00",
"EventReceivedTime": "2021-07-07T11:10:30.585834+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
{
"bacnet.apdu.bacnet_simpleack.original_invoke_id": "132",
"bacnet.apdu.bacnet_simpleack.service_ack_choice": "Confirmed COV Notification (1)",
"bacnet.apdu.pdu_type": "BACnet-Simple-ACK-PDU (0x02)",
"bacnet.bvlc.function": "Original-Unicast-NPDU (0x0A)",
"bacnet.bvlc.length": "19",
"bacnet.bvlc.type": "BACnet/IP (Annex J) (0x81)",
"bacnet.npdu.control": "0x0020",
"bacnet.npdu.control.contains": "BACnet APDU message (0)",
"bacnet.npdu.control.dst_spec": "DNET, DLEN, Hop Count present (1)",
"bacnet.npdu.control.prio": "Normal message",
"bacnet.npdu.control.reply_expected": "No (0)",
"bacnet.npdu.control.src_spec": "SNET, SLEN, SADR absent (0)",
"bacnet.npdu.version": "0x0001",
"EventTime": "2021-07-07T11:10:29.740619+03:00",
"EventReceivedTime": "2021-07-07T11:10:30.585834+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
DNP3
DNP3 (Distributed Network Protocol 3) is a standards-based, flexible, efficient, non-proprietary, layered communication protocol that offers higher data-transfer integrity than most other popular communication protocols.
It was developed to enhance interoperability between systems in the electric utility, oil and gas, water/wastewater, and security industries.
The Citect SCADA DNPR driver communicates with devices that support DNP3 protocol subsets Level 1 and Level 2.
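The dnp3.data_layer.* fields in the output below come from the DNP3 data link header. A Python sketch of a hypothetical header built to match the request event in the sample output (start bytes 0x6405, length 20, control 0xC4, destination 1, source 2); note that DNP3 multi-byte fields are little-endian, which is why the 0x05 0x64 start bytes read back as 0x6405:

```python
import struct

# First 8 bytes of a hypothetical DNP3 data link header (CRC omitted)
header = bytes([0x05, 0x64,   # start bytes
                0x14,         # length = 20
                0xC4,         # control octet
                0x01, 0x00,   # destination address = 1
                0x02, 0x00])  # source address = 2

# Multi-byte fields are little-endian
start, length, control, dest, src = struct.unpack("<HBBHH", header)
dir_bit = (control >> 7) & 1  # 1 = frame from the master
prm     = (control >> 6) & 1  # 1 = frame from the primary station
func    = control & 0x0F      # 4 = Unconfirmed User Data
```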
In this example, the im_pcap input module is configured to listen on the
\Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
network interface for
network traffic that uses the DNP3 protocol. The result is formatted as JSON,
then saved to a file.
<Extension _json>
Module xm_json
</Extension>
<Input pcap>
Module im_pcap
# Specifies the name of a network device/interface
Dev \Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
# Specifies the protocol type
<Protocol>
Type dnp3
</Protocol>
</Input>
<Output file>
Module om_file
File "C:\output.txt"
# Formats the result as JSON
Exec to_json();
</Output>
Below are samples of a DNP3 read request and its response after processing by NXLog.
{
"dnp3.application_layer.control.con": "0",
"dnp3.application_layer.control.fin": "1",
"dnp3.application_layer.control.fir": "1",
"dnp3.application_layer.control.sequence": "1",
"dnp3.application_layer.control.uns": "0",
"dnp3.application_layer.function_code": "Read",
"dnp3.application_layer.object0.count": "0",
"dnp3.application_layer.object0.group": "60",
"dnp3.application_layer.object0.name": "Class objects - Class 3 data",
"dnp3.application_layer.object0.variation": "4",
"dnp3.application_layer.object1.count": "0",
"dnp3.application_layer.object1.group": "60",
"dnp3.application_layer.object1.name": "Class objects - Class 2 data",
"dnp3.application_layer.object1.variation": "3",
"dnp3.application_layer.object2.count": "0",
"dnp3.application_layer.object2.group": "60",
"dnp3.application_layer.object2.name": "Class objects - Class 1 data",
"dnp3.application_layer.object2.variation": "2",
"dnp3.application_layer.object3.count": "0",
"dnp3.application_layer.object3.group": "60",
"dnp3.application_layer.object3.name": "Class objects - Class 0 data",
"dnp3.application_layer.object3.variation": "1",
"dnp3.data_layer.control": "0xC4",
"dnp3.data_layer.control.dir": "1",
"dnp3.data_layer.control.fcb": "0",
"dnp3.data_layer.control.fcv": "0",
"dnp3.data_layer.control.function_code": "Unconfirmed User Data",
"dnp3.data_layer.control.prm": "1",
"dnp3.data_layer.destination": "1",
"dnp3.data_layer.length": "20",
"dnp3.data_layer.source": "2",
"dnp3.data_layer.start_bytes": "0x6405",
"dnp3.transport.fin": "1",
"dnp3.transport.fir": "1",
"dnp3.transport.sequence": "60",
"EventTime": "2021-07-09T17:58:43.600716+03:00",
"EventReceivedTime": "2021-07-09T17:58:44.213449+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
{
"dnp3.application_layer.control.con": "0",
"dnp3.application_layer.control.fin": "1",
"dnp3.application_layer.control.fir": "1",
"dnp3.application_layer.control.sequence": "1",
"dnp3.application_layer.control.uns": "0",
"dnp3.application_layer.function_code": "Response",
"dnp3.application_layer.internal_indications.already_executing": "0",
"dnp3.application_layer.internal_indications.broadcast": "0",
"dnp3.application_layer.internal_indications.class1_events": "0",
"dnp3.application_layer.internal_indications.class2_events": "0",
"dnp3.application_layer.internal_indications.class3_events": "0",
"dnp3.application_layer.internal_indications.config_corrupt": "0",
"dnp3.application_layer.internal_indications.device_restart": "0",
"dnp3.application_layer.internal_indications.device_trouble": "0",
"dnp3.application_layer.internal_indications.events_buffer_overflow": "0",
"dnp3.application_layer.internal_indications.local_control": "0",
"dnp3.application_layer.internal_indications.need_time": "0",
"dnp3.application_layer.internal_indications.no_func_code_support": "0",
"dnp3.application_layer.internal_indications.object_unknown": "0",
"dnp3.application_layer.internal_indications.parameter_error": "0",
"dnp3.application_layer.internal_indications.reserved": "0 (expected 0)",
"dnp3.application_layer.object0.count": "3",
"dnp3.application_layer.object0.group": "30",
"dnp3.application_layer.object0.name": "Analog input - single-precision, floating-point with flag",
"dnp3.application_layer.object0.point0.flags": "[ONLINE]",
"dnp3.application_layer.object0.point0.value": "456.600006",
"dnp3.application_layer.object0.point1.flags": "[ONLINE]",
"dnp3.application_layer.object0.point1.value": "123.300003",
"dnp3.application_layer.object0.point2.flags": "[ONLINE]",
"dnp3.application_layer.object0.point2.value": "789.900024",
"dnp3.application_layer.object0.variation": "5",
"dnp3.data_layer.control": "0x44",
"dnp3.data_layer.control.dir": "0",
"dnp3.data_layer.control.fcb": "0",
"dnp3.data_layer.control.fcv": "0",
"dnp3.data_layer.control.function_code": "Unconfirmed User Data",
"dnp3.data_layer.control.prm": "1",
"dnp3.data_layer.destination": "2",
"dnp3.data_layer.length": "30",
"dnp3.data_layer.source": "1",
"dnp3.data_layer.start_bytes": "0x6405",
"dnp3.transport.fin": "1",
"dnp3.transport.fir": "1",
"dnp3.transport.sequence": "58",
"EventTime": "2021-07-09T17:58:43.668446+03:00",
"EventReceivedTime": "2021-07-09T17:58:44.213449+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}
IEC 60870-5-104
The IEC 60870-5-104 protocol standard is officially named “Network access for IEC 60870-5-101 using standard transport profiles”. The protocol is based on the existing IEC 60870-5-101 application and transport layer profiles, while adding a network link layer specification for Ethernet/TCP communication, allowing simultaneous data transmission between several devices and services.
The IEC870IP driver enables Citect SCADA to communicate via TCP/IP with devices that use the IEC 60870-5-104 communication protocol.
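The iec104.apci.* fields describe the 6-byte APCI that starts every IEC 60870-5-104 APDU. In an I-format frame, the 15-bit send and receive sequence numbers are stored shifted left by one bit across two octets each; the sketch below rebuilds the control fields from a hypothetical frame matching the sample event (send 365, receive 2):

```python
# 6-byte APCI of a hypothetical I-format frame matching the sample event
apci = bytes([0x68,         # start byte
              0x04,         # APDU length field (control fields only here)
              0xDA, 0x02,   # send sequence number N(S) = 365, shifted left
              0x04, 0x00])  # receive sequence number N(R) = 2, shifted left

is_i_format = (apci[2] & 0x01) == 0          # LSB 0 marks an I-format frame
send_seq = (apci[2] | (apci[3] << 8)) >> 1   # little-endian, then unshift
recv_seq = (apci[4] | (apci[5] << 8)) >> 1
```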
In this example, the im_pcap input module is configured to listen on the
\Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
network interface for
network traffic that uses the IEC 60870-5-104 protocol. The result is formatted
as JSON, then saved to a file.
<Extension _json>
Module xm_json
</Extension>
<Input pcap>
Module im_pcap
# Specifies the name of a network device/interface
Dev \Device\NPF_{475C04FC-859D-42F5-8BF1-765D0D6C6518}
# Specifies the protocol type
<Protocol>
Type iec104asdu
</Protocol>
<Protocol>
Type iec104apci
</Protocol>
</Input>
<Output file>
Module om_file
File "C:\output.txt"
# Formats the result as JSON
Exec to_json();
</Output>
Below is a sample of an IEC 60870-5-104 event after processing by NXLog.
{
"iec104.apci.receive_sequence_number": "2",
"iec104.apci.send_sequence_number": "365",
"iec104.apci.type": "Information (I)",
"iec104.asdu.data": {
"io": [
{
"ioa": 1020,
"ie": [
{
"type": "R32",
"value": 28079.349609375
},
{
"type": "QDS",
"invalid": false,
"not-topical": false,
"substituted": false,
"blocked": false,
"overflow": false
}
],
"ies": 2
}
],
"ios": 1
},
"iec104.asdu.dui.cause_of_transmission": "Spontaneous (3)",
"iec104.asdu.dui.coa": "1",
"iec104.asdu.dui.num_records": "1",
"iec104.asdu.dui.org": "0",
"iec104.asdu.dui.pn": "0",
"iec104.asdu.dui.sq": "FALSE",
"iec104.asdu.dui.test_bit": "0",
"iec104.asdu.dui.type": "M_ME_NC_1",
"EventTime": "2021-07-14T10:53:35.377132+03:00",
"EventReceivedTime": "2021-07-14T10:53:35.419664+03:00",
"SourceModuleName": "pcap",
"SourceModuleType": "im_pcap"
}