Setup Secure (Encrypted) Syslog

Kieffer87
Communicator

Has anyone had luck setting up secure (encrypted) syslog with this add-on? It only mentions creating a TCP input, which would not be encrypted. Our Proofpoint instance is hosted in their cloud, so encryption between their cloud and our onsite Heavy Forwarder is imperative.

1 Solution

Kieffer87
Communicator

I ended up creating certificates and using the following configuration settings in inputs.conf. The key to making this work is the cipherSuite setting, which is not a default cipher.

[tcp-ssl://1518]
sourcetype = pps_log
index = proofpoint
disabled = false
acceptFrom = *comma-separated list of your cluster server IPs*

[SSL]
requireClientCert = false
serverCert = /opt/splunk/etc/apps/TA_pps/local/certs/combined.cer
sslVersions = tls1.2
cipherSuite = AES256-SHA

The serverCert should be a single combined file with the contents in the following order:

-----BEGIN CERTIFICATE----- 
(Your server certificate) 
-----END CERTIFICATE----- 
-----BEGIN CERTIFICATE----- 
(Your Intermediate certificate (if you have one)) 
-----END CERTIFICATE----- 
-----BEGIN RSA PRIVATE KEY----- 
(Your Private Key) 
-----END RSA PRIVATE KEY----- 
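
A minimal shell sketch of building that combined file, assuming hypothetical file names server.cer, intermediate.cer, and server.key:

# concatenate the server cert, intermediate cert, and private key in that order
cat server.cer intermediate.cer server.key > /opt/splunk/etc/apps/TA_pps/local/certs/combined.cer
# the file contains the private key, so restrict its permissions
chmod 600 /opt/splunk/etc/apps/TA_pps/local/certs/combined.cer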

Proofpoint will need to load this certificate chain as well.
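
Before pointing Proofpoint at the input, you can check that the listener negotiates TLS 1.2 with that cipher using openssl (hf.example.com is a placeholder for your Heavy Forwarder's hostname):

# a successful handshake prints the certificate chain plus the negotiated protocol and cipher
openssl s_client -connect hf.example.com:1518 -tls1_2 -cipher AES256-SHA < /dev/null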

mannyk1splunk
Loves-to-Learn

Hi @Kieffer87, I am trying to set up similar SSL on the Splunk Heavy Forwarder for one of our VMware application syslogs. I have a few queries about the solution you mentioned above.

1. Do we need a .cer file, or would a .pem do?

2. In the .cer/.pem file, do we need to include the private key?

3. Regarding the cipherSuite, do we need to get this from the source application that encrypts the data?

4. We have other default [SSL] config on the same Splunk server, so in that case should we use the specific SSL attributes in the [tcp-ssl://<port>] stanza instead?

xpac
SplunkTrust
SplunkTrust

Hey,
you could use [tcp-ssl://1234] in inputs.conf - it provides encrypted receiving of data.
However, best practice is to run a dedicated syslog server, which receives the data and writes it to disk, and have Splunk monitor those files. This helps with reliability: a syslog server restart might take less than a second, while restarting Splunk might take several minutes. You might lose data that comes in during such a restart, and restarts happen more often with Splunk instances than with syslog servers.
I'd therefore recommend setting up syslog-ng with encryption enabled and sending your data there.
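
For reference, a minimal sketch of such a setup; the port, file paths, and certificate names here are illustrative assumptions, not values from this thread:

source s_tls {
    network(
        transport("tls")
        port(6514)
        tls(
            key-file("/etc/syslog-ng/certs/server.key")
            cert-file("/etc/syslog-ng/certs/server.crt")
            peer-verify(optional-untrusted)
        )
    );
};

destination d_remote { file("/var/log/remote/${HOST}/messages.log" create-dirs(yes)); };

log { source(s_tls); destination(d_remote); };

Splunk would then pick up those files with a monitor stanza in inputs.conf, for example:

[monitor:///var/log/remote/*/messages.log]
sourcetype = pps_log
index = proofpoint
disabled = false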

Hope that helps - if it does, I'd be happy if you would upvote/accept this answer so others can benefit from it. 🙂

Kieffer87
Communicator

The Proofpoint cloud cluster caches a certain amount of logs, so a Splunk restart shouldn't result in any loss of logs.
