Operations (formerly Stackdriver) Integration with Splunk

Chaitanya Malpe
7 min read · Mar 9, 2021

Overview

This document shows you how to export selected logs from Stackdriver Logging to Pub/Sub for ingestion into Splunk. Splunk is a security information and event management (SIEM) solution that provides the Splunk Add-on for Google Cloud, which can ingest logs, events, and billing information from Google Cloud. With this add-on, you can export logs from Google Cloud to a Splunk installation.

If you already have a Splunk deployment, Stackdriver Logging lets you export logs from Google Cloud into it. This lets you take advantage of Google Cloud's native logging, monitoring, and diagnostics capabilities while still including these logs in your existing systems.

Set up the logging export

The following diagram shows the steps for enabling logging export to Splunk through Pub/Sub.

Set up a Pub/Sub topic

Follow the instructions to set up a Pub/Sub topic that will receive your exported logs.
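As a sketch, the topic can be created from the gcloud CLI. The topic and project names below match the examples used later in this document; substitute your own:

```shell
# Create the Pub/Sub topic that will receive exported log entries.
# "logs-export-topic" and "compliance-logging-export" are example names.
gcloud pubsub topics create logs-export-topic \
    --project=compliance-logging-export
```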

Turn on audit logging for all services

Data Access audit logs, except for BigQuery, are disabled by default. To enable all audit logs, follow the instructions to update the Cloud IAM policy with the configuration listed in the audit policy documentation. The steps include the following:

  • Downloading the current IAM policy as a file.
  • Adding the audit log policy JSON or YAML object to the current policy file.
  • Updating the Google Cloud project with the changed policy file.

The following is an example JSON object that enables all audit logs for all services.

"auditConfigs": [
  {
    "service": "allServices",
    "auditLogConfigs": [
      { "logType": "ADMIN_READ" },
      { "logType": "DATA_READ" },
      { "logType": "DATA_WRITE" }
    ]
  }
]
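The download-edit-update cycle above can be sketched from the gcloud CLI as follows ([PROJECT_ID] is a placeholder and the file name is an example):

```shell
# Download the current IAM policy for the project as JSON.
gcloud projects get-iam-policy [PROJECT_ID] --format=json > policy.json

# Edit policy.json to add the "auditConfigs" object shown above,
# then upload the modified policy back to the project.
gcloud projects set-iam-policy [PROJECT_ID] policy.json
```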

Configure the logging export

After you set up aggregated exports or logs export, you need to refine the logging filters to export audit logs, virtual machine–related logs, storage logs, and database logs. The following logging filter includes the Admin Activity and Data Access audit logs and the logs for specific resource types.

logName:"/logs/cloudaudit.googleapis.com" OR
resource.type:gce OR
resource.type=gcs_bucket OR
resource.type=bigquery_resource
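Before creating the sink, you can preview which entries a filter matches by using the gcloud logging read command. A sketch, where the limit and freshness window are example values:

```shell
# Preview up to 10 recent entries matching the export filter.
gcloud logging read \
    'logName:"/logs/cloudaudit.googleapis.com" OR
     resource.type:gce OR
     resource.type="gcs_bucket" OR
     resource.type="bigquery_resource"' \
    --limit=10 \
    --freshness=1h
```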

From the gcloud command-line tool, use the gcloud logging sinks create command or the organizations.sinks.create API call to create a sink with the appropriate filters. The following example gcloud command creates a sink called gcp_logging_export_pubsub_sink for the organization. The sink includes all child projects and specifies filtering to select specific audit logs.

gcloud logging sinks create [SINK_NAME] \
pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID] \
--organization=[ORGANIZATION_ID] --include-children \
--log-filter='logName:"/logs/cloudaudit.googleapis.com" OR \
resource.type:"gce" OR \
resource.type="gcs_bucket" OR \
resource.type="bigquery_resource"'

The command output is similar to the following:

Created [https://logging.googleapis.com/v2/organizations/your-organization/sinks/gcp_logging_export_pubsub_sink].
Please remember to grant `serviceAccount:gcp-logging-export-pubsub-si@logging-oyour-organization.iam.gserviceaccount.com` Pub/Sub Publisher role to the topic.
More information about sinks can be found at /logging/docs/export/configure_export

The command output includes the service account gcp-logging-export-pubsub-si@logging-oyour-organization.iam.gserviceaccount.com. This identity is a Google Cloud service account that has been created for the export. Until you grant this identity publish access to the destination topic, log entry exports from this sink will fail. For more information, see the next section or the documentation on granting access for a resource.

Set IAM policy permissions for the Pub/Sub topic

By adding the service account gcp-logging-export-pubsub-si@logging-oyour-organization.iam.gserviceaccount.com to the pubsub.googleapis.com/projects/compliance-logging-export/topics/logs-export-topic topic with the Pub/Sub Publisher permissions, you grant the service account permission to publish to the topic. Until you add these permissions, the sink export will fail.

To add the permissions to the service account, follow these steps:

  1. In the Cloud Console, open the Cloud Pub/Sub Topics page:
    GO TO THE TOPICS PAGE
  2. Select the topic name.
  3. Click Show info panel, add the service account as a member, and grant it the Pub/Sub Publisher role.
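Alternatively, the same grant can be made from the command line. A sketch, using the example topic and project names from this document and the service account returned by your sink creation:

```shell
# Grant the logging export service account permission to publish to the topic.
gcloud pubsub topics add-iam-policy-binding logs-export-topic \
    --project=compliance-logging-export \
    --member='serviceAccount:gcp-logging-export-pubsub-si@logging-oyour-organization.iam.gserviceaccount.com' \
    --role='roles/pubsub.publisher'
```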

After you create the logging export by using this filter, log entries begin to populate the Pub/Sub topic in the configured project. You can confirm that the topic is receiving messages by using the Metrics Explorer in Stackdriver Monitoring. Using the following resource type and metric, observe the number of message-send operations over a brief period. If you have configured the export properly, you will see activity above 0 on the graph.

  • Resource type: pubsub_topic
  • Metric: pubsub/topic/send_message_operation_count
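You can also query this metric directly through the Cloud Monitoring API. A sketch, where the project ID and time window are example values to adjust for your environment:

```shell
# List time series for the topic send-operation count over an example window.
curl -s -G \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://monitoring.googleapis.com/v3/projects/compliance-logging-export/timeSeries" \
  --data-urlencode 'filter=metric.type="pubsub.googleapis.com/topic/send_message_operation_count"' \
  --data-urlencode 'interval.startTime=2021-03-09T00:00:00Z' \
  --data-urlencode 'interval.endTime=2021-03-09T00:10:00Z'
```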

Configure the Splunk Add-on for Google Cloud

The Splunk Add-on for Google Cloud uses the Pub/Sub topic and a service account in Google Cloud. The service account is used to generate a private key that the add-on uses to establish a Pub/Sub subscription and ingest messages from the logging export topic. The appropriate IAM permissions are required to allow the service account to create the subscription and list the components in the Pub/Sub project that contains the subscription.
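Creating that service account and its private key can be sketched as follows. The account name is an example, and roles/pubsub.editor is an assumption made here so the add-on can create and list subscriptions; check the add-on documentation for the exact roles it requires:

```shell
# Create a service account for the Splunk Add-on (example name).
gcloud iam service-accounts create splunk-gcp-addon \
    --project=compliance-logging-export \
    --display-name="Splunk Add-on for Google Cloud"

# Assumed role: lets the add-on create and list Pub/Sub subscriptions.
gcloud projects add-iam-policy-binding compliance-logging-export \
    --member='serviceAccount:splunk-gcp-addon@compliance-logging-export.iam.gserviceaccount.com' \
    --role='roles/pubsub.editor'

# Generate the JSON private key to paste into the add-on's credential form.
gcloud iam service-accounts keys create splunk-gcp-addon-key.json \
    --iam-account=splunk-gcp-addon@compliance-logging-export.iam.gserviceaccount.com
```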

Follow the instructions to set up the Splunk Add-on. A simplified version of the process is as follows:

  1. If you already have a Splunk client set up, skip this step. Otherwise, download the Splunk Enterprise client, making sure to select the installation package for your operating system.
  2. Follow the installation instructions. These include selecting the correct Python version and choosing a username and password.
  3. Once installation is complete, Splunk Enterprise opens in a new browser window. Enter the username and password you set during installation to open the Splunk Enterprise home dashboard.
  4. To install the Splunk Add-on for Google Cloud Platform, click the Find More Apps option on the left.
  5. Search for "Splunk Add-on for Google Cloud Platform" in the search bar, and then click Install.
  6. Enter your Splunk website username and password, NOT the username and password used for the Splunk Enterprise client. Click Login and Install to install the add-on. This takes a few minutes.
  7. Once the add-on is installed, you are prompted to restart. Click Restart Now, wait a few minutes, and then log back in to your Splunk Enterprise client.
  8. Navigate to the home dashboard; the Splunk Add-on for Google Cloud Platform now appears in your Apps section.
  9. Next, set up the inputs and configuration for the add-on. First, navigate to the Configuration section by clicking the Configuration option at the top.
  10. Click Add Credential. Enter a name for the credential and the JSON content of the private key of the service account created earlier, and then click Add. The add-on validates the service account key and adds the credential.
  11. Next, configure inputs for Google Cloud Platform. Click the Inputs option beside the Configuration option at the top.
  12. Click Create New Input and select Cloud Pub/Sub. Enter a name, select the credential from the dropdown, select the GCP project from the project dropdown, and then select the correct Pub/Sub subscription. Leave the index at its default and click Add.
  13. To make sure Splunk is receiving messages from the Pub/Sub topic, click the Search option beside the Configuration option. Select the desired filter and click the search icon. All the audit logs from the Pub/Sub topic appear in the results.
  14. This confirms that the Splunk add-on is successfully pulling messages from the Pub/Sub topic.

By using Metrics Explorer in Stackdriver Monitoring, you can again confirm that the subscription that the Splunk add-on is using is pulling messages. Using the following resource type and metric, observe the number of message-pull operations over a brief period.

  • Resource type: pubsub_subscription
  • Metric: pubsub/subscription/pull_message_operation_count

If you have configured the export properly, you see activity above 0 on the graph.

Using the exported logs

After the exported logs have been ingested into Splunk, you can use Splunk as you would with any other data source to do the following tasks:

  • Search the logs.
  • Correlate complex events.
  • Visualize results by using dashboards.
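For example, ingested events can be searched from the Splunk CLI. A sketch; the sourcetype shown is the one the add-on typically assigns to Pub/Sub inputs, so verify it in your own setup:

```shell
# Search the last hour of events ingested from the Pub/Sub input.
# $SPLUNK_HOME is commonly /opt/splunk.
$SPLUNK_HOME/bin/splunk search \
    'sourcetype="google:gcp:pubsub:message" earliest=-1h'
```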
