
How to send logs to an external SIEM platform

This page describes procedures for sending logs to an external Security Information and Event Management (SIEM) platform. These procedures apply to on-premises deployments of the solution. For SaaS deployments, contact OKIOK if you need access to the audit logs.

S-Filer Portal uses Log4J as its logging framework, which provides flexibility to send logs to SIEM platforms through different methods.

Warning

Changing the Log4J configuration is supported by OKIOK, but is not part of the standard installation procedure. It may be necessary to reapply these changes after an upgrade.

Overview

There are two main approaches to send logs to an external SIEM platform:

  1. Direct syslog integration: Use a syslog appender to send logs directly to a SIEM that acts as a syslog (or syslog-ng) server.
  2. File-based integration: Configure Log4J to write logs in JSONL (JSON Lines) format, then use a log collection agent (Logstash, Vector, Filebeat, etc.) to read the log files and forward them to the SIEM.

Method 1: Direct syslog integration

This method sends logs directly from S-Filer Portal to your SIEM platform using the syslog protocol. This is the most straightforward approach when your SIEM supports syslog reception.

For detailed instructions on configuring the syslog appender, see the How to send audits to syslog guide.

Advantages of direct syslog integration

  • Real-time log delivery
  • No intermediate file storage required
  • Lower latency
  • Simpler architecture

Considerations

  • Requires network connectivity between S-Filer Portal and the SIEM
  • May require firewall rules configuration
  • SIEM must support syslog reception (UDP, TCP, or TLS over TCP)
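
As a rough illustration, a Log4J Syslog appender for this method might look like the following sketch. The host, port, protocol, and application name are placeholder values to adapt to your SIEM's syslog endpoint; see the guide linked above for the supported configuration.

```xml
<Appenders>
    <!-- Hypothetical endpoint values: replace host and port with your SIEM's syslog listener -->
    <Syslog name="audit-syslog" host="siem.example.com" port="514"
            protocol="TCP" format="RFC5424" appName="sfiler" facility="AUDIT"/>
</Appenders>
```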

Method 2: File-based integration using SIEM agents

This method involves two steps:

  1. Configure Log4J to write logs in JSONL format to files
  2. Deploy a SIEM agent (Logstash, Vector, Filebeat, etc.) to read these files and forward them to your SIEM platform

Advantages of file-based integration

  • Works with any SIEM that supports file-based log collection
  • Can buffer logs if the SIEM is temporarily unavailable
  • Allows for log transformation and enrichment before sending
  • More flexible for complex SIEM architectures

Step 1: Configure JSONL logging

Configure Log4J to write audit logs in JSONL format. JSONL format stores one JSON object per line, making it easy for log collection agents to parse.

Here's an example configuration that writes audit logs to a JSONL file:

xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration monitorInterval="30" status="info" strict="true">
    <Appenders>
        <RollingFile name="audit-jsonl" fileName="logs/audit.jsonl" filePattern="logs/audit-%d{yyyy-MM-dd}.jsonl">
            <JsonTemplateLayout eventTemplateUri="classpath:sfiler-ecs-layout.json"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
                <SizeBasedTriggeringPolicy size="256MB"/>
            </Policies>
            <DefaultRolloverStrategy max="10"/>
        </RollingFile>
    </Appenders>
    <Loggers>
        <Logger name="AUDIT" level="INFO">
            <AppenderRef ref="audit-jsonl"/>
        </Logger>
    </Loggers>
</Configuration>

This configuration:

  • Writes audit logs to logs/audit.jsonl
  • Rotates files daily or when they reach 256MB
  • Keeps up to 10 archived files
  • Uses ECS (Elastic Common Schema) format for structured logging

For more information on using the ECS logging format, see the ECS Logging guide.
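
Because each line is a complete JSON document, consuming the file requires no multi-line parsing. A minimal sketch in Python illustrates this; the field names below are hypothetical ECS-style examples, and the actual fields are defined by the sfiler-ecs-layout.json template:

```python
import json

# A hypothetical ECS-style audit line, as it might appear in logs/audit.jsonl.
# Actual field names come from the sfiler-ecs-layout.json template.
line = '{"@timestamp": "2024-05-01T12:00:00.000Z", "log.level": "INFO", "message": "MFA validation failed for \'jsmith\'."}'

# One JSON object per line: each line can be parsed independently.
event = json.loads(line)
print(event["@timestamp"], event["message"])
```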

Step 2: Configure SIEM agent

After configuring JSONL logging, deploy and configure a SIEM agent to read the log files and forward them to your SIEM platform.

Example: Logstash configuration

If you're using Logstash as your SIEM agent, here's an example configuration:

ruby
input {
  file {
    path => "/path/to/sfiler/logs/audit.jsonl"
    start_position => "beginning"
    codec => "json"
    sincedb_path => "/var/lib/logstash/sincedb/audit"
  }
}

filter {
  # Add any transformations or enrichments here
  # For example, parsing timestamps, adding fields, etc.
  
  date {
    match => [ "@timestamp", "ISO8601" ]
  }
}

output {
  # Example: Send to Elasticsearch
  elasticsearch {
    hosts => ["https://your-elasticsearch-host:9200"]
    index => "sfiler-audit-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "your-password"
  }
  
  # Or send to other SIEM platforms
  # Example: Splunk
  # http {
  #   url => "https://your-splunk-host:8088/services/collector"
  #   http_method => "post"
  #   headers => {
  #     "Authorization" => "Splunk your-token"
  #   }
  # }
}

Example: Vector configuration

If you're using Vector as your SIEM agent, here's an example configuration:

toml
[sources.sfiler_audit]
type = "file"
include = ["/path/to/sfiler/logs/audit*.jsonl"]
read_from = "beginning"

[transforms.parse_json]
type = "remap"
inputs = ["sfiler_audit"]
source = '''
. = parse_json!(.message)
'''

[sinks.siem_output]
type = "elasticsearch"
inputs = ["parse_json"]
endpoint = "https://your-elasticsearch-host:9200"
bulk.index = "sfiler-audit-%Y.%m.%d"
auth.strategy = "basic"
auth.user = "elastic"
auth.password = "your-password"

Example: Filebeat configuration

If you're using Filebeat, here's an example configuration:

yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /path/to/sfiler/logs/audit*.jsonl
    json.keys_under_root: true
    json.add_error_key: true

output.elasticsearch:
  hosts: ["https://your-elasticsearch-host:9200"]
  index: "sfiler-audit-%{+yyyy.MM.dd}"
  username: "elastic"
  password: "your-password"

Log file location

By default, log files are written to the logs/ directory relative to the S-Filer Portal installation. The exact path depends on your deployment:

  • Standalone installation: Typically $SFILER_HOME/logs/
  • Docker installation: Check the volume mount configuration
  • Service installation: Check the working directory of the service

Ensure that the SIEM agent has read access to the log files directory.

Log rotation handling

S-Filer Portal rotates log files based on size and date. SIEM agents should be configured to:

  • Monitor the log file pattern (e.g., audit*.jsonl) to pick up rotated files
  • Handle file rotation gracefully without losing log entries
  • Track reading position to avoid re-reading old logs (using sincedb or similar mechanisms)
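
To illustrate what sincedb-style position tracking does, here is a minimal Python sketch. It is not part of S-Filer Portal; agents such as Logstash and Filebeat implement this internally, typically keyed on the file's inode so that a rotated file is detected as a new file and read from the start.

```python
import os
import tempfile

def read_new_lines(path, state):
    """Return lines appended to `path` since the last call.

    `state` maps inode -> byte offset: a rotated file (new inode) is read
    from the beginning, and an unchanged file is not re-read.
    """
    inode = os.stat(path).st_ino
    offset = state.get(inode, 0)
    with open(path, "r", encoding="utf-8") as f:
        f.seek(offset)
        lines = f.readlines()
        state[inode] = f.tell()  # persist this offset, like a sincedb file
    return [line.rstrip("\n") for line in lines]

# Simulate two polls of a growing JSONL log file.
state = {}
with tempfile.TemporaryDirectory() as tmp:
    log = os.path.join(tmp, "audit.jsonl")
    with open(log, "w") as f:
        f.write('{"message": "event 1"}\n')
    first = read_new_lines(log, state)   # first poll reads event 1
    with open(log, "a") as f:
        f.write('{"message": "event 2"}\n')
    second = read_new_lines(log, state)  # second poll reads only event 2
print(first, second)
```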

Choosing the right method

Choose the method that best fits your infrastructure:

  • Use Method 1 (Direct syslog) if:

    • Your SIEM supports syslog reception
    • You want real-time log delivery
    • You prefer a simpler architecture without intermediate agents
  • Use Method 2 (File-based with agents) if:

    • Your SIEM requires specific log formats or protocols
    • You need to transform or enrich logs before sending
    • You want to buffer logs for reliability
    • You're using a SIEM that doesn't support direct syslog

Contextual information

The solution provides contextual information for audits. These can be used in log formatting:

Name                 | Layout parameter  | Example
---------------------|-------------------|-------------------------------------
Audit message        | %m                | MFA validation failed for 'jsmith'.
Thread identifier    | %t                | pool-1-thread-3
Date and time        | %d                | See the Log4J doc for formats
Other standard Log4J | %p, %c, etc.      | See the Log4J doc for details
Account name         | %X{MDC_USER}      | jsmith
IP of the user       | %X{MDC_REMOTE_IP} | 172.134.23.46
S-Filer component    | %X{MDC_COMPONENT} | sfiler-gw-1
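
For example, a PatternLayout combining these values could look like the following; the layout string is illustrative, and you should adapt it to your formatting needs:

```xml
<PatternLayout pattern="%d{ISO8601} [%X{MDC_COMPONENT}] %X{MDC_USER}@%X{MDC_REMOTE_IP} %-5p - %m%n"/>
```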

Audits to log

See the Audit Reference page for the list of all supported audits in S-Filer Portal.
