Share that search! Building a content pack for Splunk Enterprise Security 4.0+

Splunk has initial support for exporting “content” (dashboards and correlation searches created by the user) to share with another team. What if you need something a little more complex, for example a lookup-generating search? This gets a little more complicated, but it is very doable by the average admin. Our mission here is to implement UC0029. What is UC0029? Glad you asked: each new malware signature detected should be reviewed by a security analyst to determine whether proactive steps can be taken to prevent infection. We will create this as a notable event so that we can provide evidence to audit that the process exists and was followed.

Source code will be provided, so I will not detail step by step how each object is created and defined in this post.

UC0029 Endpoint new malware detected by signature


My “brand” is SecKit, so you will see this identifier in content I have created alone or with my team here at Splunk. As per our best practice, adopt your own brand and use it appropriately for your content. There is no technical reason to replace the “brand” on third-party content you elect to utilize.

Note: as you go, ensure all knowledge objects are shared at the app level and owned by admin

      • Create the app DA-ESS-SecKit-EndpointProtection
        • This will contain ES-specific content such as menus, dashboards, and correlation searches
      • Create the working app SA-SecKit-EndpointProtection
        • This will contain props, transforms, lookups, and scheduled searches created outside of ES
      • Create the lookup seckit_endpoint_malware_tracker. This lookup will contain each signature as it is detected in the environment, plus some handy information such as the endpoint it was first detected on, the user involved, and the most recent detection.
      • Create empty lookup CSV files
        • seckit_endpoint_malware_tracker.csv (note you will not ship this file in your content pack)
        • seckit_endpoint_malware_tracker.csv.default
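As a sketch, the tracker lookup’s header row might look like this; the field names are illustrative, chosen to match the description above (signature, first/last seen, first endpoint and user), and the CSV shipped in the source package is authoritative:

```csv
signature,first_time,first_dest,first_user,last_time
```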

Build and test the saved search SecKit Malware Tracker – Lookup Gen. This search will use tstats to find the first and last instance of each signature in a time window and update the lookup if an earlier or later instance is found.
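The source code is linked below, but the shape of such a tstats lookup-generating search is roughly the following. The Malware datamodel and field names here are my assumptions based on the CIM, not the shipped search:

```spl
| tstats summariesonly=true earliest(_time) as first_time latest(_time) as last_time
    from datamodel=Malware.Malware_Attacks
    by Malware_Attacks.signature Malware_Attacks.dest Malware_Attacks.user
| rename "Malware_Attacks.*" as *
| inputlookup append=t seckit_endpoint_malware_tracker
| stats min(first_time) as first_time max(last_time) as last_time
    first(dest) as first_dest first(user) as first_user by signature
| outputlookup seckit_endpoint_malware_tracker
```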


      Build and test the correlation search UC0029-S01-V001 New malware signature detected. This search will find “new” signatures from the lookup we have created and create a notable event.

      “Make it default”: In both apps move content from local/ to default/. This will allow your users to customize the content without replacing the existing searches.

      “Turn it off by default”: It is best practice to ensure any load-generating searches are disabled by default.

        • add disabled = 1 to each savedsearches.conf stanza that does not end in “- Rule”
        • add disabled = 1 to each correlationsearches.conf stanza
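For example, in savedsearches.conf (stanza name matching the lookup-gen search above):

```ini
[SecKit Malware Tracker - Lookup Gen]
disabled = 1
```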

  • Create a .spl (tar.gz) containing both apps created
  • Write a blog post explaining what you did and how the searches work, and share the code!
  • Gain fame and respect, maybe a fez or a cape
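The packaging step can be sketched as follows. A scratch directory stands in for $SPLUNK_HOME/etc/apps, and the archive name is illustrative:

```shell
# Work in a scratch directory; on a real system you would run this where the apps live.
workdir=$(mktemp -d)
cd "$workdir"

# Placeholder app skeletons; the real content is what you built in the steps above.
mkdir -p DA-ESS-SecKit-EndpointProtection/default SA-SecKit-EndpointProtection/default

# An .spl file is simply a gzipped tarball with a different extension.
tar -czf seckit_endpoint_content.spl \
  DA-ESS-SecKit-EndpointProtection SA-SecKit-EndpointProtection
```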

The source code

Bonus: Delegate administration of content app

  1. Using your favorite editor edit app/metadata/local.meta
  2. Update the following permissions, adding the “ess_admin” role

## access = read : [ * ], write : [ admin,role2,role3 ]
access = read : [ * ], write : [ admin,ess_admin ]


Get started with Splunk App for Stream 6.4 for DNS Analysis

Passive DNS analysis is all the rage right now, and the detection opportunities it presents have been well discussed for some time. If your organization is like most, you are now being asked how you can implement these detection strategies. Leveraging your existing Splunk investment, you can get started very quickly with less change to your organization than one might think. Here is what we will use (older versions will work fine, but the screenshots will be a bit off):

  •  Splunk Enterprise 6.3.1
  • Splunk App for Stream 6.4

We will assume Splunk Enterprise 6.3.1 has already been installed.

Decide where to install your Stream app. Typically this will be the Enterprise Security search head. However, if your ES search head is part of a search head cluster, you will need to use an ad-hoc search head, a dedicated search head, or a deployment server. Note: current versions of Stream fully support installation on a search head cluster.

Note: If using the deployment server (DS) you must configure the server to search the indexer or index cluster containing your stream data.

  1. Install Splunk App for Stream using the standard procedures located here.
  2. Copy the deployment TA to your deployment server if you installed on a search head: /opt/splunk/etc/deployment-apps/Splunk_TA_stream
  3. On your deployment server create a new folder to contain configuration for your stream dns server group.
    • mkdir -p Splunk_TA_stream_infra_dns/local
  4. Copy the inputs.conf from the default TA to the new TA for group management
    • cp Splunk_TA_stream/local/inputs.conf Splunk_TA_stream_infra_dns/local/
  5. Update the inputs.conf to include your forwarder group id
    • vi Splunk_TA_stream_infra_dns/local/inputs.conf
    • Alter “stream_forwarder_id =” to “stream_forwarder_id =infra_dns”
  6. Create a new server class “infra_stream_dns” include both the following apps and deploy to all DNS servers (Windows DNS or BIND)
    • Splunk_TA_stream
    • Splunk_TA_stream_infra_dns
  7. Reload your deployment server
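Steps 3 through 5 can be sketched as a few shell commands. The sketch below uses a scratch directory and a stub inputs.conf; on a real deployment server the apps live under /opt/splunk/etc/deployment-apps:

```shell
# Scratch directory standing in for /opt/splunk/etc/deployment-apps.
base=$(mktemp -d)
mkdir -p "$base/Splunk_TA_stream/local"
echo 'stream_forwarder_id =' > "$base/Splunk_TA_stream/local/inputs.conf"

# Clone the TA into a group-specific app and set the forwarder group id.
mkdir -p "$base/Splunk_TA_stream_infra_dns/local"
cp "$base/Splunk_TA_stream/local/inputs.conf" "$base/Splunk_TA_stream_infra_dns/local/"
sed -i 's/^stream_forwarder_id =.*/stream_forwarder_id = infra_dns/' \
  "$base/Splunk_TA_stream_infra_dns/local/inputs.conf"
```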

Excellent! At this point the Splunk Stream app will be deployed to all of your DNS servers and sit idle. The next few steps will prepare the environment to start collection.

  • Create a new index. I typically create stream_dns and set up retention for 30 days.
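If you prefer to define the index in indexes.conf rather than the UI, a minimal sketch might look like this; the index name matches this post, and the paths and sizes should be tuned for your environment:

```ini
[stream_dns]
homePath   = $SPLUNK_DB/stream_dns/db
coldPath   = $SPLUNK_DB/stream_dns/colddb
thawedPath = $SPLUNK_DB/stream_dns/thaweddb
# 30 days * 86400 seconds
frozenTimePeriodInSecs = 2592000
```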

Configure your deployment group

  1. Login to the search head with the Splunk App for Stream
  2. Navigate to Splunk App for Stream
  3. If this is your first time you may find you need to complete the welcome wizard.
  4. Click Configure then “Distributed Forwarder Management”
  5. Click Create New Group as follows then click Next
    1. Name Infra_DNS
    2. Description Applied to All DNS servers
    3. Include Ephemeral Streams? No
  6. Enter “infra_dns” as the group id; this will ensure all clients deployed above pick up this configuration from the Stream app
  7. Search for “Splunk_DNS” and select each match then Click Finish
  8. Click on Configuration then “Configure Streams”
  1. Click on New Stream
  2. Setup basic info as follows then click Next
    1. Protocol DNS
    2. Name “Infra_DNS”
    3. Description “Capture DNS on internal DNS servers”
  3. We will not use Aggregation, so leave this as “No” and click Next
  4. The default fields will meet our needs so go ahead and click Next
  5. Optional step: create filters. In most cases requests from the DNS server to the outside are not interesting, as they are generated from client requests that could not be answered from the cache. Creating filters will reduce the total volume of data by approximately 50%.
    1. Click create filter
    2. Select src_ip as the field
    3. Select “Not Regular Expression” as the type
    4. Provide a regex that will match all DNS server IPs, for example “(172\.16\.0\.(19|20|21))” will match in my lab network.
    5. Click next
    6. Select only the Infra_DNS group and click Create Stream
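You can sanity-check the “Not Regular Expression” filter offline before deploying it. The addresses below are the lab DNS servers from this post; substitute your own:

```shell
# The src_ip pattern from the filter step; events matching it are dropped.
regex='(172\.16\.0\.(19|20|21))'

# A DNS server's own outbound lookups match the pattern (and would be filtered out)...
echo '172.16.0.20' | grep -Eq "$regex" && echo 'server: filtered'

# ...while an ordinary client address does not match, so its queries are kept.
echo '172.16.0.55' | grep -Eq "$regex" || echo 'client: kept'
```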

At this point Stream will deploy and begin collection; however, index selection is not permitted in this workflow, so we need to go back and set it up now.

  1. Find Infra_DNS and click edit
  2. Select the index appropriate for your environment
  3. Click save

Ready to check your work? Run this search, replacing index=* with your index:

index=* sourcetype=stream:dns | stats count by query | sort - count


Getting all the logs, avoid the syslog

Big data, open world: a utopia we may one day have. Today I want my logs, all of my logs, and then I want more. I often want to collect additional data such as:

  • Performance counters on Windows operating systems
  • Appended files on all platforms
  • Script and executable output to translate the odd and the weird stuff developers create

All too often there is resistance to this lofty goal of security information awareness. Why, might you ask? To be honest, security people often have a certain reputation. I’m not talking about the funk or the mother’s basement kind of reputation; there is a reputation for breaking the environment and stopping the business. IT ops is in agent overload: license compliance, monitoring, data loss prevention, AV, and endpoint security all want their agents. Log management is often late to the party and is viewed as a bridge too far. In some cases an ineffective solution was in place and there is resistance to replacing a legacy collection tool. Yes indeed, the reason people don’t want to install a proper collection tool is that the broken solution being replaced “worked just fine.” I have really, actually had this conversation.

I’m a Splunk user and customer turned consultant. I bleed green, but this isn’t about Splunk, though it does support the idea that using the Splunk tool set, including the Universal Forwarder, is the best choice. If your log collection tool is another enterprise-ready product, this applies to you as well.

Issue number 1: Supportability. Each agent will parse, or fail to parse, and provide log data in a unique format. Each security solution vendor will be best able to test with their native language (format) if supportability and testing are a goal. You want to use the best tool.

Issue number 2: Reliable delivery. Each agent from a commercial vendor using a native protocol will support acknowledgment and store-and-forward. A vendor-neutral agent using syslog will not, meaning you cannot assure any auditor, with any level of google-fu, that your log solution has integrity and is complete.

Issue number 3: Reliable resumption. Each commercial agent includes support for high-water-mark tracking for Windows events and tail tracking for files. Snare (unreliable), Lasso (unreliable), Logstash, Graylog, and Fluentd do not support this feature. Without it, any time the agent stops, abends, or the system reboots, data is lost. That is not acceptable for a regulated environment, including small matters like PCI, SOX, HIPAA, and GLBA, to name an American-focused few.

Issue number 4: The position that using a freeware or vendor-neutral collection tool is reliable places you alone, outside of industry support. Splunk, HP Logger, McAfee Nitro, and QRadar all provide reliable collection agents. Where support for syslog-based solutions exists, it is limited and second class at best.

Issue number 5: Cost. It’s not free; every issue encountered will cost human labor time and opportunity (delays), and potentially leave your company open to audit findings for non-compliance.

Issue number 6: The false belief that performance will be impacted by these vendor agents. While for some specific vendor agents and use cases this may be true, it is no more (or less) likely than with an unsupported log collection tool.

Splunk Universal Forwarder Version 6.2.3+ Ubuntu 15.04

Author: Ryan Faircloth

Summary: Using repositories for version management of the Splunk Universal Forwarder assists in ensuring managed Ubuntu systems are using the approved version of the software at all times.

Setup the repository server

  1. Install reprepro and nginx

    sudo apt-get install reprepro nginx packaging-dev -y

  2. Create a user to work with the repository

    sudo adduser --disabled-password --disabled-login --home /srv/reprepro reprepro

  3. Change user to our reprepro user; all commands for the repository should be executed using this ID

    sudo su - reprepro

Generate GPG Keys

  1. Change user to our reprepro user; all commands for the repository should be executed using this ID

    sudo su - reprepro 
  2. Create the default configuration for gpg by running the command

    gpg --list-keys

  3. Edit ~/.gnupg/gpg.conf
    • uncomment the line no-greeting
    • add the following content to the end of the file
    # Prioritize stronger algorithms for new keys.
    default-preference-list SHA512 SHA384 SHA256 SHA224 AES256 AES192 AES CAST5 BZIP2 ZLIB ZIP UNCOMPRESSED
    # Use a stronger digest than the default SHA1 for certifications.
    cert-digest-algo SHA512
  4. Generate a new key with the command gpg --gen-key

  5. Select the following options
    1. Type of key “(1) RSA and RSA (default)”
    2. Key size “4096”
    3. Expires “10y”
    4. Confirm “Y”
    5. Real Name “Splunk local repository”
    6. Email address: the repository contact; this generally should be an alias or distribution list
    7. Leave the comment blank
    8. Confirm and “O” to Okay
    9. Leave the passphrase blank and confirm; a key will be generated. Note the sub KEY ID, in the following example E507D48E

    gpg: checking the trustdb
    gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
    gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
    gpg: next trustdb check due at 2025-05-24
    pub 4096R/410E1699 2015-05-27 [expires: 2025-05-24]
    Key fingerprint = 7CB8 81A9 E07F DA7B 83FF 2E1B 8B31 DA83 410E 1699
    uid Splunk local repository <>
    sub 4096R/E507D48E 2015-05-27 [expires: 2025-05-24]

  6. Export the signing key’s public component and save this content for use later

    gpg --export --armor KEY_ID >~/

Configure reprepro

  1. Change user to our reprepro user; all commands for the repository should be executed using this ID: sudo su - reprepro

  2. Create the directory structure sudo mkdir -p /srv/reprepro/ubuntu/{conf,dists,incoming,indices,logs,pool,project,tmp}

  3. Change directories to the new repository cd /srv/reprepro/ubuntu/

  4. Edit the file /srv/reprepro/ubuntu/conf/distributions

  5. Update the file contents

    Origin: SplunkEnterprise
    Label: SplunkEnterprise
    Codename: ponies
    Architectures: i386 amd64 source    
    Components: main
    Description: Splunk Enterprise and Universal Forwarders for Debian based systems
    SignWith: YOUR-KEY-ID 
  6. Edit the file /srv/reprepro/ubuntu/conf/options

  7. Update the file contents

    basedir .

Load the packages

Load the packages using the following command syntax, replacing package.deb with the correct path to the splunkforwarder .deb file:

reprepro -S utils -P standard includedeb ponies package.deb

Setup the web server

  1. Create the file /etc/nginx/sites-available/vhost-packages.conf

  2. Use the following content, replacing packages.internal with the FQDN of the repository host
    server {
      listen 80;
      server_name packages.internal;
      access_log /var/log/nginx/packages-access.log;
      error_log /var/log/nginx/packages-error.log;
      location / {
        root /srv/reprepro;
        index index.html;
      }
      location ~ /(.*)/conf {
        deny all;
      }
      location ~ /(.*)/db {
        deny all;
      }
    }
  3. Increase the server name hash bucket size by creating the following file /etc/nginx/conf.d/server_names_hash_bucket_size.conf

  4. Use the following content server_names_hash_bucket_size 64;

  5. Enable the new configuration

    sudo ln -s /etc/nginx/sites-available/vhost-packages.conf /etc/nginx/sites-enabled/vhost-packages.conf
    sudo service nginx reload

Configure the repository on the client

  1. Edit the file
  2. Use the following content
    deb http://packages.internal/ubuntu/ ponies main
  3. Import the public key
    sudo apt-key add /tmp/
  4. Update the repository cache
    sudo apt-get update 

Install the Splunk Universal Forwarder

Run the following command

sudo apt-get install splunkforwarder

Configure the universal forwarder

  • Using best practices to manually create the org_deploymentclient configuration app
  • Using a deb-based configuration package
  • Using a configuration management system such as Puppet or Chef

Create and install a configuration package for the Universal Forwarder

In the following procedure, “org” should be replaced with the abbreviation of the organization using the configuration.

  1. Create the paths /srv/reprepro/org_debs/

  2. Create the path for the first version of the package, i.e. mkdir org-splunk-ufconfig-1

  3. Change to the new directory

  4. Create the following structure

    ├── DEBIAN
    │   ├── control (file)
    │   ├── postinst (file)
    │   ├── preinst (file)
    │   └── prerm (file)
    └── opt
        └── splunkforwarder
            └── etc
                └── apps
                    └── org_all_deploymentclient
                        └── default
                            ├── deploymentclient.conf (file)
  5. Edit the DEBIAN/control file as follows

    Package: org-splunk-ufconfig
    Section: base
    Priority: standard
    Version: 1
    Architecture: all
    Maintainer: Your Name <>
    Depends: splunkforwarder (>=6.0.0)
    Description: <insert up to 60 chars description>
    <insert long description, indented with spaces>

  6. Edit the DEBIAN/postinst

    #!/bin/sh
    /opt/splunkforwarder/bin/splunk enable boot-start -user splunk --accept-license --answer-yes
    service splunk start
  7. Edit the DEBIAN/preinst

    #!/bin/sh
    # Stop Splunk before upgrading if a previous installation is present
    file=/opt/splunkforwarder/bin/splunk
    if [ -f "$file" ]; then
        echo "$file found."
        service splunk stop
    else
        echo "$file not found."
    fi
  8. Edit the DEBIAN/prerm

    #!/bin/sh
    # Stop Splunk and remove boot-start before the package is removed
    file=/opt/splunkforwarder/bin/splunk
    if [ -f "$file" ]; then
        echo "$file found."
        service splunk stop
        /opt/splunkforwarder/bin/splunk disable boot-start
    else
        echo "$file not found."
    fi
  9. Update the contents of deploymentclient.conf with the appropriate information for your installation

  10. Add additional content as required for your deployment

  11. Change directories up to the parent of org-splunk-ufconfig-1

  12. Create the debian package with the command dpkg-deb --build org-splunk-ufconfig-1/

  13. Change to the repository directory /srv/reprepro/ubuntu

  14. Store the new package in the repository

    reprepro -S utils -P standard includedeb ponies /srv/reprepro/org_debs/org-splunk-ufconfig-1.deb

  15. Install the new package on the client using the command sudo apt-get install org-splunk-ufconfig. This will install the splunk forwarder package if it has not yet been installed.
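For step 9, a minimal deploymentclient.conf has roughly this shape; the clientName and the deployment server address are placeholders for your environment, not values from the source:

```ini
[deployment-client]
clientName = org_all_deploymentclient

[target-broker:deploymentServer]
targetUri = splunk-ds.example.org:8089
```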

Splunk Universal Forwarder Version 6.2.3+ Microsoft System Center 2012 R2

Author: Ryan Faircloth

Summary: Rapid deployment of the universal forwarder in a production environment is possible with a minimal amount of risk for the customer. The installation of a universal forwarder can be performed at any time without impact to the production system and without a reboot. A small caution is required: if an existing MSI installation has pending on-reboot actions, the installation of the Splunk universal forwarder (or any other MSI) may trigger a reboot by the SCCM client.



This guide will deploy the universal forwarder to all servers with a supported version of the Microsoft Windows Server operating system.

  • Create a new folder to contain Splunk related collections
  • Create one or more collection containing all systems which should receive the universal forwarder.
  • Create a collection containing all systems where any version of the universal forwarder has been deployed
  • Create an application definition to deploy the universal forwarder without configuration
  • Create an application definition to deploy an upgrade to the universal forwarder without configuration
  • Create a package containing a powershell script to configure the universal forwarder
  • Deploy the configuration script using a task sequence

Prerequisite Steps

Task                                              Responsible
Create CNAME for Deployment Server                DNS Admin
Install Splunk Enterprise on Server               Splunk Admin
Configure Splunk Instance as Deployment Server    Splunk Admin

Step by Step

Create the deployment collection folder

  1. Navigate to Device Collections

  2. Right click

  3. Create new folder

  4. Name the new folder “Splunk Universal Forwarders”

  5. Navigate to the new folder

Create a collection for deployment

  1. Right click and choose “Create New Device Collection”
  2. Name the collection “Splunk Deployment Collection for Servers”
  3. Select “All Desktop and Server Clients” as the limiting collection
    Create Device Collection
  4. Click Next
  5. Click Add to define the criteria used to determine which devices will receive the Universal Forwarder
  6. Click Query
  7. Name the Query “Server OS”
  8. Click Edit
  9. Click Show query language
  10. Enter the following query:
    select SMS_R_SYSTEM.ResourceID
    from SMS_R_System
    inner join SMS_G_System_OPERATING_SYSTEM
    on SMS_G_System_OPERATING_SYSTEM.ResourceId = SMS_R_System.ResourceId
    where SMS_G_System_OPERATING_SYSTEM.ProductType = 2
    or SMS_G_System_OPERATING_SYSTEM.ProductType = 3
  11. Click OK
  12. Click OK again
  13. Enable Incremental Update by checking the box
  14. Click Next
  15. Click Next
  16. Click Close

> Note: the collection will contain zero members until the update collection background task completes

Create a collection of all successfully deployed universal forwarders

  1. Right click and choose “Create New Device Collection”
  2. Name the collection “Splunk Deployment Collection for Deployed Forwarders”

  3. Select “All Desktop and Server Clients” as the limiting collection

  4. Click Next

  5. Click Add to define the criteria used to determine which devices will receive the Universal Forwarder

  6. Click Query

  7. Name the Query “Deployed Forwarders”

  8. Click Edit

  9. Click Show query language

  10. Enter the following query:

    select SMS_R_SYSTEM.ResourceID
    from SMS_R_System
    inner join SMS_G_System_ADD_REMOVE_PROGRAMS
    on SMS_G_System_ADD_REMOVE_PROGRAMS.ResourceID = SMS_R_System.ResourceId
    inner join SMS_G_System_ADD_REMOVE_PROGRAMS_64
    on SMS_G_System_ADD_REMOVE_PROGRAMS_64.ResourceId = SMS_R_System.ResourceId
    inner join SMS_G_System_INSTALLED_SOFTWARE
    on SMS_G_System_INSTALLED_SOFTWARE.ResourceId = SMS_R_System.ResourceId
    where SMS_G_System_ADD_REMOVE_PROGRAMS.DisplayName = "UniversalForwarder"
    or SMS_G_System_ADD_REMOVE_PROGRAMS_64.DisplayName = "UniversalForwarder"
    or SMS_G_System_INSTALLED_SOFTWARE.ProductName = "UniversalForwarder"
    order by SMS_R_System.Name

  11. Click OK

  12. Click OK again

  13. Enable Incremental Update by checking the box

  14. Click Next

  15. Click Next

  16. Click Close

Note: the collection will contain zero members until the update collection background task completes

Create Application Definitions

Download both the 32bit and 64bit versions of the Splunk Universal Forwarder into the source folder structure used for SCCM deployment applications. Do this for all versions currently deployed as well as the new version to be deployed.

In general the locations are similar to the path:

Create the application definition for the oldest deployed version of the Universal Forwarder first.

  1. Navigate to Applications in the Software Library screen
  2. Right click and create a new folder for Splunk definitions
  3. Right click on the new folder and choose Create New Application
  4. Locate the 64 bit MSI for this product version
  5. Click Next
  6. Click Next again
  7. Update the definition with the following information
    • Name: include version number and bitness, i.e. Universal Forwarder 6.2.3 (x64)
    • Publisher
    • Version
    • Update the command line by removing “/q” and appending “/quiet AGREETOLICENSE=Yes”

      Note it is very important that /q is replaced by /quiet

  8. Click Next
  9. Click Next
  10. Click Close
  11. Right click on the new application definition and click properties
  12. Select the deployment type tab
  13. Select the first deployment and click edit
  14. Select the program tab
  15. Update the uninstall command, replacing /q with /quiet
  16. Select the third Browse next to product code and select the MSI
  17. Click requirements
  18. Click add
  19. Select category = device, condition = operating system, and provide the supported 64-bit operating systems
  20. Create any additional requirements appropriate for your environment, such as memory and free disk space
  21. Click OK
  22. Click OK again
  23. Add a new deployment type define the 32 bit MSI type using the information above
  24. Edit the new type using the information above to set the product MSI and verify requirements
  25. Select the supersedence tab
  26. click add
  27. Click Browse and select the oldest prior version of the application deployed to replace
  28. Map old deployment type to new ensuring the types match
  29. Click OK
  30. Add any other replacements required
  31. Verify your work and click OK

Repeat the application creation process for all versions of the UF in production. If you are upgrading, monitor your deployment progress. You may continue with this procedure while the Universal Forwarder application is deployed.

Create a Configuration Script

  1. Create a source folder to contain the configuration script for example \\servername\source\splunk\scripts\UF_Config_V1
  2. The following script can be used as a template for the appropriate configuration for your site. At minimum the deployment server FQDN must be customized. Name the script configure.ps1
#Splunk Configuration Script for SCCM Task Sequence
#Locate Splunk based on the MSI registration

function Get-IniContent ($filePath) {
    $ini = @{}
    switch -regex -file $filePath {
        "^\[(.+)\]" { # Section
            $section = $matches[1]
            $ini[$section] = @{}
            $CommentCount = 0
        }
        "^(\#.*)$" { # Comment
            $value = $matches[1]
            $CommentCount = $CommentCount + 1
            $name = "Comment" + $CommentCount
            #$ini[$section][$name] = $value
        }
        "(.+?)\s*=(.*)" { # Key
            $name,$value = $matches[1..2]
            $ini[$section][$name] = $value
        }
    }
    return $ini
}

$location ="C:\Program Files\SplunkUniversalForwarder\"

#note if splunk may not be installed at the default location uncomment the following lines
#$list = Get-WmiOBject -Class Win32_Product | Where-Object {
# $_.Name -eq 'UniversalForwarder' -or $_.Name -eq 'Splunk' }

#$splunkprod = $list | where-Object { $_.InstallLocation }

#$location = $splunkprod.InstallLocation

$scriptappver = 2

$splunkcmd = $location + "bin\splunk.exe"
$staticapp = $location + "etc\apps\_static_all_universalforwarder\"
$staticdefault = $staticapp + "default\"
$staticlocal = $staticapp + "local\"

$staticdefault_dc = $staticdefault + "deploymentclient.conf"
$staticlocal_dc = $staticlocal + "deploymentclient.conf"
$staticdefault_app = $staticdefault + "app.conf"

if (!(Test-Path -Path $staticapp)) {new-item -ItemType Directory -Path $staticapp}

if (!(Test-Path -Path $staticdefault)) {new-item -ItemType Directory -Path $staticdefault}

if (!(Test-Path -Path $staticlocal)) {new-item -ItemType Directory -Path $staticlocal}

if (!(Test-Path -Path $staticdefault_app)) {
    new-item -path $staticdefault_app -ItemType File
    Add-Content -Path $staticdefault_app -Value "#Generated by scripting"
    #Add-Content -Path $staticdefault_app -Value "`r`n"
    Add-Content -Path $staticdefault_app -Value "[_static_all_universalforwarder]"
    Add-Content -Path $staticdefault_app -Value "author=Ryan Faircloth"
    Add-Content -Path $staticdefault_app -Value "description=Script Generated UF default configuration applied by SCCM"
    Add-Content -Path $staticdefault_app -Value "version=1"
    Add-Content -Path $staticdefault_app -Value "[ui]"
    Add-Content -Path $staticdefault_app -Value "is_visible = false"
}

$appconf = Get-IniContent $staticdefault_app
$appver = $appconf["_static_all_universalforwarder"]["version"]

if ($appver -ne $scriptappver) {
    if (!(Test-Path -Path $staticdefault_dc)) {
        new-item -path $staticdefault_dc -ItemType File
        Add-Content -Path $staticdefault_dc -Value "#Generated by scripting"
        Add-Content -Path $staticdefault_dc -Value "[deployment-client]"
        Add-Content -Path $staticdefault_dc -Value "clientName=ScriptDeployed|"
        Add-Content -Path $staticdefault_dc -Value "[target-broker:deploymentServer]"
        Add-Content -Path $staticdefault_dc -Value ""
        Add-Content -Path $staticdefault_dc -Value ""
    }
}


& $splunkcmd "restart"

Create a Package to contain the configuration script

  1. Create a new package folder Splunk
  2. Create a new folder on a network share Splunk_config_vX, where X is the version of the script, and include a customized version of the config script provided
  3. Right click on the package folder create package
  4. Name the package Splunk Configuration Script v1
  5. Select the source folder
  6. Click Next
  7. Click do not create a program
  8. Click next
  9. Click next
  10. Click Close
  11. Right click on the package and click “Distribute Content” using appropriate options for the environment. Do not click deploy
  12. Create the Task Sequence
  13. Create a new Task Sequence Folder “Splunk”
  14. Right click the Task Sequence Folder Create Task Sequence
  15. Name the task Splunk Config Vx
  16. Click Next
  17. Click Next
  18. Click Close
  19. Right click on the task sequence
  20. Click properties
  21. Click the advanced tab
  22. Select suppress task sequence notifications and disable this task sequence on computers where it is deployed
  23. Right click on the task sequence and choose edit
  24. Click Add, then General -> PowerShell script
  25. Set the script name i.e. configure.ps1 and execution policy=bypass
  26. Click OK
  27. Right click on the task and deploy it to the deployed-forwarders collection created earlier

Create the configuration task sequence

  1. Navigate to Software Library
  2. Navigate to Operating System Deployment
  3. Navigate to Task Sequence
  4. Optional Create a new folder called Splunk
  5. Right click and Create a new task sequence
  6. Select Custom Sequence
  7. Click Next
  8. Name the sequence i.e. Splunk Configuration Script Vx
  9. Click Next
  10. Click Next
  11. Click Close
  12. Right click on the task sequence
  13. Click properties
  14. Click the advanced tab
    • Select suppress task sequence notifications
    • disable this task sequence on computers where it is deployed
  15. Click Ok
  16. Right click on the task sequence and choose edit
    • Click Add, then General -> PowerShell script
    • Set the script name and execution policy=bypass
  17. Click OK
  18. Right click on the task sequence and deploy it to the deployed-forwarders collection created earlier