Get started with Splunk App Stream 6.4 for DNS Analysis

Passive DNS analysis is all the rage right now, and the detection opportunities it presents have been well discussed for some time. If your organization is like most, you are now being asked how you can implement these detection strategies. By leveraging your existing Splunk investment you can get started very quickly, with less change to your organization than one might think. Here is what we will use (older versions will work fine, however the screenshots will be a bit off):

  • Splunk Enterprise 6.3.1
  • Splunk App for Stream 6.4

We will assume Splunk Enterprise 6.3.1 has already been installed.

Decide where to install your Stream app. Typically this will be the Enterprise Security search head. However, if your ES search head is part of a search head cluster you will need to use an ad-hoc search head, a dedicated search head, or a deployment server. Current versions of Stream fully support installation on a search head cluster.

Note: If using the deployment server (DS), you must configure the server to search the indexer or indexer cluster containing your stream data.

  1. Install Splunk App for Stream using the standard procedures located here.
  2. If you installed on a search head, copy the deployment TA to your deployment server: /opt/splunk/etc/deployment-apps/Splunk_TA_stream
  3. On your deployment server, create a new folder to contain the configuration for your stream DNS server group.
    • mkdir -p Splunk_TA_stream_infra_dns/local
  4. Copy the inputs.conf from the default TA to the new TA for group management
    • cp Splunk_TA_stream/local/inputs.conf Splunk_TA_stream_infra_dns/local/
  5. Update the inputs.conf to include your forwarder group id
    • vi Splunk_TA_stream_infra_dns/local/inputs.conf
    • Alter “stream_forwarder_id =” to “stream_forwarder_id = infra_dns”
  6. Create a new server class “infra_stream_dns”, include both of the following apps, and deploy to all DNS servers (Windows DNS or BIND)
    • Splunk_TA_stream
    • Splunk_TA_stream_infra_dns
  7. Reload your deployment server
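Steps 2 through 5 above can be sketched as a short shell session. This is a hedged sketch: it runs against a scratch directory standing in for /opt/splunk/etc/deployment-apps, and the single-line inputs.conf it creates is illustrative, not the full file shipped with the TA.

```shell
# Scratch directory standing in for /opt/splunk/etc/deployment-apps on the
# deployment server; the stanza below is illustrative, not the complete TA.
DS_APPS=$(mktemp -d)
mkdir -p "$DS_APPS/Splunk_TA_stream/local"
printf 'stream_forwarder_id =\n' > "$DS_APPS/Splunk_TA_stream/local/inputs.conf"

# Steps 3-4: create the group TA and copy the default inputs.conf into it.
mkdir -p "$DS_APPS/Splunk_TA_stream_infra_dns/local"
cp "$DS_APPS/Splunk_TA_stream/local/inputs.conf" \
   "$DS_APPS/Splunk_TA_stream_infra_dns/local/"

# Step 5: set the forwarder group id non-interactively instead of using vi.
sed -i 's/^stream_forwarder_id.*/stream_forwarder_id = infra_dns/' \
    "$DS_APPS/Splunk_TA_stream_infra_dns/local/inputs.conf"
cat "$DS_APPS/Splunk_TA_stream_infra_dns/local/inputs.conf"
```

On a real deployment server you would run the same cp and sed against /opt/splunk/etc/deployment-apps, then reload the DS (e.g. /opt/splunk/bin/splunk reload deploy-server) as in step 7.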

Excellent! At this point the Splunk Stream app will be deployed to all of your DNS servers and sit idle. The next few steps will prepare the environment to start collection.

  • Create a new index. I typically create stream_dns and set up retention for 30 days.
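As a sketch, 30-day retention for a stream_dns index can be expressed in indexes.conf like this; the paths and index name here are assumptions, so adjust them for your environment:

```ini
[stream_dns]
homePath   = $SPLUNK_DB/stream_dns/db
coldPath   = $SPLUNK_DB/stream_dns/colddb
thawedPath = $SPLUNK_DB/stream_dns/thaweddb
# 30 days x 86400 seconds/day = 2592000 seconds before buckets are frozen
frozenTimePeriodInSecs = 2592000
```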

Configure your deployment group

  1. Login to the search head with the Splunk App for Stream
  2. Navigate to Splunk App for Stream
  3. If this is your first time you may find you need to complete the welcome wizard.
  4. Click on Configure, then “Distributed Forwarder Management”
  5. Click Create New Group as follows then click Next
    1. Name Infra_DNS
    2. Description Applied to All DNS servers
    3. Include Ephemeral Streams? No
  6. Enter “infra_dns”; this ensures all clients deployed above will pick up this configuration from the Stream app
  7. Search for “Splunk_DNS” and select each match, then click Finish
  8. Click on Configuration, then “Configure Streams”
  1. Click on New Stream
  2. Setup basic info as follows then click Next
    1. Protocol DNS
    2. Name “Infra_DNS”
    3. Description “Capture DNS on internal DNS servers”
  3. We will not use aggregation, so leave this as “No” and click Next
  4. The default fields will meet our needs, so go ahead and click Next
  5. Optional step: create filters. In most cases, requests from the DNS server to the outside are not interesting, as they are generated from client requests that cannot be answered from the cache. Creating filters will reduce the total volume of data by approximately 50%.
    1. Click create filter
    2. Select src_ip as the field
    3. Select “Not Regular Expression” as the type
      4. Provide a regex that will match all DNS server IPs; for example, “(172\.16\.0\.(19|20|21))” matches in my lab network.
    5. Click next
    6. Select only the Infra_DNS group and click Create Stream
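Before saving the filter, it is worth sanity-checking the expression. A quick sketch with grep -E, using the example lab addresses from step 4 above:

```shell
pattern='(172\.16\.0\.(19|20|21))'

# Traffic sourced from a DNS server itself should match (and be filtered out).
echo '172.16.0.20' | grep -E "$pattern"

# A client address should not match, so its queries are kept.
echo '172.16.0.50' | grep -E "$pattern" || echo 'no match: kept'
```

Note the pattern is unanchored, so an address like 172.16.0.190 would also match; anchoring the expression (for example ^172\.16\.0\.(19|20|21)$) is safer if your subnet contains such addresses.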

At this point Stream will deploy and begin collection; however, index selection is not permitted in this workflow, so we need to go back and set it up now.

  1. Find Infra_DNS and click edit
  2. Select the index appropriate for your environment
  3. Click save

Ready to check your work? Run this search, replacing index=* with your index:

index=* sourcetype=stream:dns | stats count by query | sort - count


Getting all the logs – Avoiding the WEC

I get asked about this one often, and I happen to have a bit of experience with it, which is rare; there is scant documentation on the technology from Microsoft or anyone else. I do know of some success being had with very specific low-volume use cases, but that’s not what I do. I’m a specialist of sorts: I walk off a Delta plane, drop my bag at a Marriott, then walk in to change someone’s world with data. Actual facts about their environment, from their environment, and I need and use data my customers don’t know they have. Which brings me to Windows Event Collection (WEC).

Customers ask me about it because it seems so easy. Let’s talk about the parts:

  • Group Policy, used to make changes to all systems in an environment
  • Remote PowerShell
  • COM/DCOM/COM+ and all of the RPC that goes with it
  • Kerberos authentication

How does it work?

  1. Group policy instructs the computer to connect to a collector and gather a policy
  2. Policy read causes a COM+ server to read the event log (yes, this is code you have not been running; it can and will impact your endpoints)
  3. A local filter determines what to do with each event (XML parsing with XPath and XSLT)
  4. RPC call using the computer account to the collector
  5. Denial (auth required)
  6. Authentication (event log write on the DC and on the collector)
  7. Serial write, with sync and block, to a round-robin database on the server. So if 300 events come in, they have to queue to get to disk.
  8. Close connection
  9. At the next poll period, go back to step 3

Lots of steps? Let’s ask about the failure modes:

  • What happens if my collector is down?
    • Answer: the client goes to sleep and retries; hope your logs don’t wrap.
  • What happens if my collector won’t come back up?
    • Answer: build a new one, open a change record, wait for approval, and explain to audit why you don’t have logs.
  • What happens to the format of the logs?
    • Answer: good question. I can’t explain what Microsoft is doing to these logs; if you know, please share.
  • What about log rotation and archival?
    • Answer: not possible. You need another tool to read them back and store them someplace (Splunk).
  • My collector isn’t keeping up; what do I do now?
    • Answer: well, hopefully the structure of your domain will support creating an assignment policy at the OU level. You might be able to use the same policy/collector pair at multiple OU points, but you might also need to break up the OUs to manage the policy.
  • Cross domain?
    • Answer: one or more collectors per domain.
  • Wait, I only want events XX and ZZYY from certain servers for compliance.
    • Answer: you get another collection policy.
  • I can’t make this work on server2134.
    • Answer: call Microsoft support, explain what event collection is, and hopefully convince that person it is supported.
  • My sensitive “application/service log” doesn’t use the event log.
    • Answer: a log file? This is Windows; who would do that?

Let’s compare to universal forwarders with Splunk:

  • What happens if my “indexer” is down?
    • Answer: the client connects to another indexer; in a production system the index itself is replicated and you retain access to all data.
  • What happens if my indexer won’t come back up?
    • Answer: data is replicated and still available.
  • What happens to the format of the logs?
    • Answer: we capture the original text of all logs.
  • What about log rotation and archival?
    • Answer: built in.
  • My indexer isn’t keeping up; what do I do now?
    • Answer: horizontal scaling. Splunk will help you plan for this with experience and performance data from real-world implementations.
  • Cross domain?
    • Answer: certainly. WAN? No issue. Cloud? Not a problem. VPN? Sure, why not.
  • Wait, I only want events XX and ZZYY from certain servers for compliance.
    • Answer: the deployment server will push a configuration based on the server names you select.
  • I can’t make this work on server2134.
    • Answer: call Splunk support (paid); we have real people with real knowledge, and a great community that has probably solved that problem before.
  • My sensitive system doesn’t use the event log; it writes to a file.
    • Answer: probably not a problem. Files, databases, and network capture can all be data sources; we do this all the time.

Getting all the logs – Avoiding the syslog

Big data, an open world: a utopia we may one day have. Today I want my logs, all of my logs, and then I want more. I often want to collect additional data such as:

  • Performance counters on Windows operating systems
  • Appended files on all platforms
  • Script and executable output to translate the odd and the weird stuff developers create

All too often there is resistance to this lofty goal of security information awareness. Why, you might ask? To be honest, security people often have a certain reputation. I’m not talking about the funk or the mother’s-basement kind of reputation; there is a reputation for breaking the environment and stopping the business. IT ops is in agent overload: license compliance, monitoring, data loss prevention, AV, and endpoint security all want their agents. Log management is often late to the party and is viewed as a bridge too far. In some cases an ineffective solution was in place and there is resistance to replacing the legacy collection tool. Yes indeed, the reason people don’t want to install a proper collection tool is that the broken solution being replaced worked just fine. I have really, actually had this conversation.

I’m a Splunk user and customer turned consultant. I bleed green, but this isn’t about Splunk; it does support the idea that using the Splunk tool set, including the Universal Forwarder, is the best choice. But if your log collection tool is another enterprise-ready product, this applies to you as well.

Issue number 1: Supportability. Each agent will parse, or fail to parse, and provide log data in a unique format. Each security solution vendor will be able to test best with their native format, if supportable and tested is the goal. You want to use the best tool.

Issue number 2: Reliable delivery. Each agent from a commercial vendor using a native protocol will support acknowledgment and store-and-forward. A vendor-neutral agent using syslog will not, meaning you cannot assure any auditor, with any level of google fu, that your log solution has integrity and is complete.

Issue number 3: Reliable resumption. Each commercial agent includes support for high-water-mark tracking of Windows events and tail tracking for files. Snare (unreliable), Lasso (unreliable), Logstash, Graylog, and Fluentd do not support this feature. Without it, any time the agent stops, abends, or the system reboots, data is lost. That is not acceptable for a regulated environment, including small matters like PCI, SOX, HIPAA, and GLBA, to name an American-focused few.

Issue number 4: Taking the position that a freeware or vendor-neutral collection tool is reliable places you alone, outside of industry support. Splunk, HP Logger, McAfee Nitro, and QRadar all provide reliable collection agents. Where support for syslog-based solutions exists, it is limited and second class at best.

Issue number 5: Cost. It’s not free; every issue encountered will cost human labor, time, opportunity (delays), and potentially leave your company open to audit findings for non-compliance.

Issue number 6: The false belief that performance will be impacted by these vendor agents. While for some specific vendor agents and use cases this may be true, it is no more likely (or less likely) than with an unsupported log collection tool.

Splunk Universal Forwarder Version 6.2.3+ Ubuntu 15.04

Author: Ryan Faircloth

Summary: Using repositories for version management of the Splunk Universal Forwarder helps ensure managed Ubuntu systems are using the approved version of the software at all times.

Setup the repository server

  1. Install reprepro and nginx

    sudo apt-get install reprepro nginx packaging-dev -y

  2. Create a user to work with the repository

    sudo adduser --disabled-password --disabled-login --home /srv/reprepro reprepro

  3. Change user to our reprepro user all commands for the repository should be executed using this ID

    sudo su - reprepro

Generate GPG Keys

  1. Change user to our reprepro user all commands for the repository should be executed using this ID

    sudo su - reprepro 
  2. Create the default configuration for gpg by running the command

    gpg --list-keys

  3. Edit ~/.gnupg/gpg.conf
    • uncomment the line no-greeting
    • add the following content to the end of the file
    # Prioritize stronger algorithms for new keys.
    default-preference-list SHA512 SHA384 SHA256 SHA224 AES256 AES192 AES CAST5 BZIP2 ZLIB ZIP UNCOMPRESSED
    # Use a stronger digest than the default SHA1 for certifications.
    cert-digest-algo SHA512
  4. Generate a new key with the command gpg --gen-key

  5. Select the following options
    1. Type of key “(1) RSA and RSA (default)”
    2. Key size “4096”
    3. Expires “10y”
    4. Confirm “Y”
    5. Real Name “Splunk local repository”
    6. Email address: the repository contact; generally this should be an alias or distribution list
    7. Leave the comment blank
    8. Confirm and “O” to Okay
    9. Leave the passphrase blank and confirm; a key will be generated. Note the sub KEY ID, * E507D48E * in the following example

    gpg: checking the trustdb
    gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
    gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
    gpg: next trustdb check due at 2025-05-24
    pub 4096R/410E1699 2015-05-27 [expires: 2025-05-24]
    Key fingerprint = 7CB8 81A9 E07F DA7B 83FF 2E1B 8B31 DA83 410E 1699
    uid Splunk local repository <>
    sub 4096R/E507D48E 2015-05-27 [expires: 2025-05-24]

  6. Export the signing key’s public component; save this content for use later

    gpg --export --armor KEY_ID >~/

Configure reprepro

  1. Change user to our reprepro user all commands for the repository should be executed using this ID sudo su - reprepro

  2. Create the directory structure sudo mkdir -p /srv/reprepro/ubuntu/{conf,dists,incoming,indices,logs,pool,project,tmp}

  3. Change directories to the new repository cd /srv/reprepro/ubuntu/

  4. Edit the file /srv/reprepro/ubuntu/conf/distributions

  5. Update the file contents

    Origin: SplunkEnterprise
    Label: SplunkEnterprise
    Codename: ponies
    Architectures: i386 amd64 source    
    Components: main
    Description: Splunk Enterprise and Universal Forwarders for Debian based systems
    SignWith: YOUR-KEY-ID 
  6. Edit the file /srv/reprepro/ubuntu/conf/options

  7. Update the file contents

    basedir .

Load the packages

Load the packages using the following commands syntax replace package.deb with the correct path to the splunkforwarder deb file

reprepro -S utils -P standard includedeb ponies package.deb

Setup the web server

  1. Create the file /etc/nginx/sites-available/vhost-packages.conf

  2. Use the following content, replacing packages.internal with the FQDN of the repository host
    server {
      listen 80;
      server_name packages.internal;
      access_log /var/log/nginx/packages-access.log;
      error_log /var/log/nginx/packages-error.log;
      location / {
        root /srv/reprepro;
        index index.html;
      }
      location ~ /(.*)/conf {
        deny all;
      }
      location ~ /(.*)/db {
        deny all;
      }
    }
  3. Increase the server name hash bucket by creating the following file /etc/nginx/conf.d/server_names_hash_bucket_size.conf

  4. Use the following content server_names_hash_bucket_size 64;

  5. Enable the new configuration

    sudo ln -s /etc/nginx/sites-available/vhost-packages.conf /etc/nginx/sites-enabled/vhost-packages.conf
    sudo service nginx reload

Configure the repository

  1. Edit an apt sources file, for example /etc/apt/sources.list.d/splunk.list
  2. Use the following content
    deb http://packages.internal/ubuntu/ ponies main
  3. Import the public key exported earlier
    sudo apt-key add /tmp/
  4. Update the repository cache
    sudo apt-get update 

Install the Splunk Universal Forwarder

Run the following command

sudo apt-get install splunkforwarder

Configure the universal forwarder

  • Using best practices to manually create the org_deploymentclient configuration app
  • Using a deb-based configuration package
  • Using a configuration management system such as Puppet or Chef

Create and install a configuration package for the Universal Forwarder

In the following procedure, “org” should be replaced with the abbreviation of the organization using the configuration.

  1. Create the path /srv/reprepro/org_debs/

  2. Create the path for the first version of the package, i.e. mkdir org-splunk-ufconfig-1

  3. Change to the new directory

  4. Create the following structure

    ├── DEBIAN
    │   ├── control (file)
    │   ├── postinst (file)
    │   ├── preinst (file)
    │   └── prerm (file)
    └── opt
        └── splunkforwarder
            └── etc
                └── apps
                    └── org_all_deploymentclient
                        └── default
                            └── deploymentclient.conf (file)
  5. Edit the DEBIAN/control file as follows

    Package: org-splunk-ufconfig
    Section: base
    Priority: standard
    Version: 1
    Architecture: all
    Maintainer: Your Name <>
    Depends: splunkforwarder (>=6.0.0)
    Description: <insert up to 60 chars description>
    <insert long description, indented with spaces>

  6. Edit the DEBIAN/postinst

    #!/bin/sh
    /opt/splunkforwarder/bin/splunk enable boot-start -user splunk --accept-license --answer-yes
    service splunk start
  7. Edit the DEBIAN/preinst

    #!/bin/sh
    # Stop the forwarder before upgrade if it is already installed
    # (assumes the default install path).
    file=/opt/splunkforwarder/bin/splunk
    if [ -f "$file" ]; then
        echo "$file found."
        service splunk stop
    else
        echo "$file not found."
    fi
  8. Edit the DEBIAN/prerm

    #!/bin/sh
    # Stop and disable the forwarder before removal
    # (assumes the default install path).
    file=/opt/splunkforwarder/bin/splunk
    if [ -f "$file" ]; then
        echo "$file found."
        service splunk stop
        /opt/splunkforwarder/bin/splunk disable boot-start
    else
        echo "$file not found."
    fi
  9. Update the contents of deploymentclient.conf with the appropriate information for your installation

  10. Add additional content as required for your deployment

  11. Change directories up to the parent of org-splunk-ufconfig-1

  12. Create the debian package with the command dpkg-deb --build org-splunk-ufconfig-1/

  13. Change to the repository directory /srv/reprepro/ubuntu

  14. Store the new package in the repository

    reprepro -S utils -P standard includedeb ponies /srv/reprepro/org_debs/org-splunk-ufconfig-1.deb

  15. Install the new package on the client using the command sudo apt-get install org-splunk-ufconfig. This will also install the splunkforwarder package if it has not yet been installed.
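The packaging steps above can be sketched end to end. This is a hedged sketch assuming a Debian/Ubuntu host with dpkg-deb available; the maintainer address is a placeholder, and the maintainer scripts from steps 6 through 8 are omitted for brevity:

```shell
# Build a minimal org-splunk-ufconfig package in a scratch directory.
WORK=$(mktemp -d)
PKGROOT="$WORK/org-splunk-ufconfig-1"
APPDIR="$PKGROOT/opt/splunkforwarder/etc/apps/org_all_deploymentclient/default"
mkdir -p "$PKGROOT/DEBIAN" "$APPDIR"

# Step 5: the control file (maintainer address is a placeholder).
cat > "$PKGROOT/DEBIAN/control" <<'EOF'
Package: org-splunk-ufconfig
Section: base
Priority: standard
Version: 1
Architecture: all
Maintainer: Your Name <admin@example.com>
Depends: splunkforwarder (>= 6.0.0)
Description: Splunk UF deployment client configuration
EOF

# Step 9: a stub deploymentclient.conf to be customized for your site.
printf '[deployment-client]\nclientName=ScriptDeployed\n' \
  > "$APPDIR/deploymentclient.conf"

# Step 12: build the package; the .deb lands next to the package directory.
dpkg-deb --build "$PKGROOT"
ls "$WORK"
```

The resulting $WORK/org-splunk-ufconfig-1.deb is what step 14 loads into the repository with reprepro includedeb.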

Splunk Universal Forwarder Version 6.2.3+ Microsoft System Center 2012 R2

Author: Ryan Faircloth

Summary: Rapid deployment of the universal forwarder in a production environment is possible with a minimal amount of risk for the customer. The installation of a universal forwarder can be performed at any time without impact to the production system and without a reboot. One small caution: if an existing MSI installation has created on-reboot actions, the installation of the Splunk universal forwarder, or any other MSI, may trigger a reboot by the SCCM client.



This guide will deploy the universal forwarder to all servers with a supported version of the Microsoft Windows Server operating system.

  • Create a new folder to contain Splunk related collections
  • Create one or more collection containing all systems which should receive the universal forwarder.
  • Create a collection containing all systems where any version of the universal forwarder has been deployed
  • Create an application definition to deploy the universal forwarder without configuration
  • Create an application definition to deploy an upgrade to the universal forwarder without configuration
  • Create a package containing a PowerShell script to configure the universal forwarder
  • Deploy the configuration script using a task sequence

Prerequisite Steps

Task                                              Responsible
Create CNAME for Deployment Server                DNS Admin
Install Splunk Enterprise on Server               Splunk Admin
Configure Splunk Instance as Deployment Server    Splunk Admin

Step by Step

Create the deployment collection folder

  1. Navigate to Device Collections

  2. Right click

  3. Create new folder

  4. Name the new folder “Splunk Universal Forwarders”

  5. Navigate to the new folder

Create a collection for deployment

  1. Right click and choose “Create New Device Collection”
  2. Name the collection “Splunk Deployment Collection for Servers”
  3. Select “All Desktop and Server Clients” as the limiting collection
  4. Click Next
  5. Click Add to define the criteria used to determine which devices will receive the Universal Forwarder
  6. Click Query
  7. Name the Query “Server OS”
  8. Click Edit
  9. Click Show query language
  10. Enter the following query:
    select SMS_R_SYSTEM.ResourceID
    from SMS_R_System
    inner join SMS_G_System_OPERATING_SYSTEM
    on SMS_G_System_OPERATING_SYSTEM.ResourceId = SMS_R_System.ResourceId
    where SMS_G_System_OPERATING_SYSTEM.ProductType = 2
    or SMS_G_System_OPERATING_SYSTEM.ProductType = 3
  11. Click OK
  12. Click OK again
  13. Enable Incremental Update by checking the box
  14. Click Next
  15. Click Next
  16. Click Close

> Note: the collection will contain zero members until the update collection background task completes

Create a collection of all successfully deployed universal forwarders

  1. Right click and choose “Create New Device Collection”
  2. Name the collection “Splunk Deployment Collection for Deployed Forwarders”

  3. Select “All Desktop and Server Clients” as the limiting collection

  4. Click Next

  5. Click Add to define the criteria used to determine which devices will receive the Universal Forwarder

  6. Click Query

  7. Name the Query “Deployed Forwarders”

  8. Click Edit

  9. Click Show query language

  10. Enter the following query:

    select SMS_R_SYSTEM.ResourceID
    from SMS_R_System
    inner join SMS_G_System_ADD_REMOVE_PROGRAMS
    on SMS_G_System_ADD_REMOVE_PROGRAMS.ResourceID = SMS_R_System.ResourceId
    inner join SMS_G_System_ADD_REMOVE_PROGRAMS_64
    on SMS_G_System_ADD_REMOVE_PROGRAMS_64.ResourceId = SMS_R_System.ResourceId
    where SMS_G_System_ADD_REMOVE_PROGRAMS.DisplayName = "UniversalForwarder"
    or SMS_G_System_ADD_REMOVE_PROGRAMS_64.DisplayName = "UniversalForwarder"
    order by SMS_R_System.Name

  11. Click OK

  12. Click OK again

  13. Enable Incremental Update by checking the box

  14. Click Next

  15. Click Next

  16. Click Close

Note: the collection will contain zero members until the update collection background task completes

Create Application Definitions

Download both the 32-bit and 64-bit versions of the Splunk Universal Forwarder into the source folder structure used for SCCM deployment applications. Do this for all versions currently deployed as well as the new version to be deployed.

In general the locations are similar to the path used for your other SCCM application sources.

Create the application definition for the oldest deployed version of the Universal Forwarder first.

  1. Navigate to Applications in the Software Library screen
  2. Right click and create a new folder for Splunk definitions
  3. Right click on the new folder and choose Create New Application
  4. Locate the 64 bit MSI for this product version
  5. Click Next
  6. Click Next again
  7. Update the definition with the following information
    • Name: include the version number and bitness, i.e. Universal Forwarder 6.2.3 (x64)
    • Publisher
    • Version
    • Update the command line by removing “/q” and appending “/quiet AGREETOLICENSE=Yes”

      Note: it is very important that /q is replaced by /quiet

  8. Click Next
  9. Click Next
  10. Click Close
  11. Right click on the new application definition and click properties
  12. Select the deployment type tab
  13. Select the first deployment and click edit
  14. Select the program tab
  15. Update the uninstall command, replacing /q with /quiet
  16. Select the third Browse, next to product code, and select the MSI
  17. Click requirements
  18. Click add
  19. Select category = device, condition = operating system, and provide the supported 64-bit operating systems
  20. Create any additional requirements appropriate for your environment, such as memory and free disk space
  21. Click OK
  22. Click OK again
  23. Add a new deployment type defining the 32-bit MSI using the information above
  24. Edit the new type using the information above to set the product MSI and verify requirements
  25. Select the supersedence tab
  26. click add
  27. Click Browse and select the oldest prior version of the application deployed to replace
  28. Map old deployment type to new ensuring the types match
  29. Click OK
  30. Add any other replacements required
  31. Verify your work and click OK

Repeat the application creation process for all versions of the UF in production. If you are upgrading, monitor your deployment progress. You may continue with this procedure while the Universal Forwarder application is deployed.

Create a Configuration Script

  1. Create a source folder to contain the configuration script for example \\servername\source\splunk\scripts\UF_Config_V1
  2. The following script can be used as a template for the appropriate configuration for your site. At minimum, the deployment server FQDN must be customized. Name the script configure.ps1
#Splunk Configuration Script for SCCM Task Sequence
#Locate Splunk based on the MSI registration

function Get-IniContent ($filePath) {
    $ini = @{}
    switch -regex -file $filePath {
        "^\[(.+)\]" { # Section
            $section = $matches[1]
            $ini[$section] = @{}
            $CommentCount = 0
        }
        "^(\#.*)$" { # Comment
            $value = $matches[1]
            $CommentCount = $CommentCount + 1
            $name = "Comment" + $CommentCount
            #$ini[$section][$name] = $value
        }
        "(.+?)\s*=(.*)" { # Key
            $name,$value = $matches[1..2]
            $ini[$section][$name] = $value
        }
    }
    return $ini
}

$location = "C:\Program Files\SplunkUniversalForwarder\"

#note if splunk may not be installed at the default location uncomment the following lines
#$list = Get-WmiObject -Class Win32_Product | Where-Object {
#    $_.Name -eq 'UniversalForwarder' -or $_.Name -eq 'Splunk' }
#$splunkprod = $list | Where-Object { $_.InstallLocation }
#$location = $splunkprod.InstallLocation

$scriptappver = 2

$splunkcmd = $location + "bin\splunk.exe"
$staticapp = $location + "etc\apps\_static_all_universalforwarder\"
$staticdefault = $staticapp + "default\"
$staticlocal = $staticapp + "local\"

$staticdefault_dc = $staticdefault + "deploymentclient.conf"
$staticlocal_dc = $staticlocal + "deploymentclient.conf"
$staticdefault_app = $staticdefault + "app.conf"

if (!(Test-Path -Path $staticapp)) { New-Item -ItemType Directory -Path $staticapp }
if (!(Test-Path -Path $staticdefault)) { New-Item -ItemType Directory -Path $staticdefault }
if (!(Test-Path -Path $staticlocal)) { New-Item -ItemType Directory -Path $staticlocal }

if (!(Test-Path -Path $staticdefault_app)) {
    New-Item -Path $staticdefault_app -ItemType File
    Add-Content -Path $staticdefault_app -Value "#Generated by scripting"
    #Add-Content -Path $staticdefault_app -Value "`r`n"
    Add-Content -Path $staticdefault_app -Value "[_static_all_universalforwarder]"
    Add-Content -Path $staticdefault_app -Value "author=Ryan Faircloth"
    Add-Content -Path $staticdefault_app -Value "description=Script Generated UF default configuration applied by SCCM"
    Add-Content -Path $staticdefault_app -Value "version=1"
    Add-Content -Path $staticdefault_app -Value "[ui]"
    Add-Content -Path $staticdefault_app -Value "is_visible = false"
}

$appconf = Get-IniContent $staticdefault_app
$appver = $appconf["_static_all_universalforwarder"]["version"]

if ($appver -ne $scriptappver) {
    if (!(Test-Path -Path $staticdefault_dc)) {
        New-Item -Path $staticdefault_dc -ItemType File
        Add-Content -Path $staticdefault_dc -Value "#Generated by scripting"
        Add-Content -Path $staticdefault_dc -Value "[deployment-client]"
        Add-Content -Path $staticdefault_dc -Value "clientName=ScriptDeployed|"
        Add-Content -Path $staticdefault_dc -Value "[target-broker:deploymentServer]"
        Add-Content -Path $staticdefault_dc -Value ""
        Add-Content -Path $staticdefault_dc -Value ""
    }
}

& $splunkcmd "restart"

Create a Package to contain the configuration script

  1. Create a new package folder Splunk
  2. Create a new folder on a network share, Splunk_Config_vX, where X is the version of the script, and include a customized version of the config script provided
  3. Right click on the package folder create package
  4. Name the package Splunk Configuration Script v1
  5. Select the source folder
  6. Click Next
  7. Click do not create a program
  8. Click next
  9. Click next
  10. Click Close
  11. Right click on the package and click “Distribute Content” using appropriate options for the environment. Do not click deploy

Create the configuration task sequence

  1. Navigate to Software Library
  2. Navigate to Operating System Deployment
  3. Navigate to Task Sequence
  4. Optional: create a new folder called Splunk
  5. Right click and Create a new task sequence
  6. Select Custom Sequence
  7. Click Next
  8. Name the sequence i.e. Splunk Configuration Script Vx
  9. Click Next
  10. Click Next
  11. Click Close
  12. Right click on the task sequence
  13. Click properties
  14. Click the advanced tab
    • Select suppress task sequence notifications
    • disable this task sequence on computers where it is deployed
  15. Click Ok
  16. Right click on the task sequence and choose edit
    • Click Add > General > Run PowerShell Script
    • Set the script name, i.e. configure.ps1, and execution policy = Bypass
  17. Click OK
  18. Right click on the task sequence and deploy it to the collection of deployed forwarders created above

Staying up to date with the updates: update smart with Microsoft and Secunia

It’s been a busy year already: Oracle’s Java, Adobe’s Flash, and so many updates to Windows. Most users by now have heard they should keep their Windows PCs up to date to avoid infection. Unfortunately, our adversaries have heard the same speech and are trying to deceive us with fake updates. First, reliable companies will not notify you by email, instant message, or advertisement that your computer is out of date and needs an update. You may see email or advertisements for new versions, upgrades, and subscription renewals. Some leading software companies are helping us stay secure through automatic or seamless updates, such as Google’s Chrome browser, the Firefox browser, and Adobe’s Flash. Security updates for these products will simply install in the background without needing your help. You can keep yourself safer by taking a few steps to secure your computer.

Let’s take care of our operating system first.

  • Open “Computer” on Windows 7 or “This PC” on Windows 8.1
  • Click on Control Panel in the menu bar.
  • Search for “Windows Update” (1) in Control Panel, then select “Turn automatic updating on or off” (2)


  • Set up the options (1) (2) (3) as shown below, then click OK (4)

Windows will now check all Microsoft products daily for updates and install them as needed. You will be asked to reboot your computer to finish applying updates; this is very important, so don’t put it off. Now what about non-Microsoft programs? Secunia provides a product called PSI to help us with this task.

Update the rest of our software with Secunia

First, download and install Secunia PSI. It is very important for you to download from this link, because a number of sites offer versions modified to include malware.

  • You will need to provide your name and email address.
  • Then look for the big “Download Now” button. “Try Now” is for a separate business grade product.


  • Run PSISetup.exe to install; it is a simple Next, Next, Finish, and the default choices will be best.
  • After you click Finish, the software will start to update your computer. I installed an old version of Java to demonstrate the process below:
  • After the updates complete you will see an updated list of software, and you are done.

PSI will patch software, but it will not upgrade it to a new major version. For example, a future Adobe Acrobat XII or the new Java JRE 1.8 will require you to visit the software vendor to download or purchase an upgrade at some point.
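The version check a patch monitor like PSI performs can be sketched roughly as follows. The product versions and the exact classification rule here are illustrative assumptions, not Secunia’s actual logic; the point is the difference between a routine patch and a major upgrade that PSI will not install for you.

```python
# Sketch of a patch-monitor version check: compare the installed version of
# a product against the latest known version and distinguish a routine
# patch (installed automatically) from a major upgrade (vendor visit needed).

def parse_version(text):
    """Turn a dotted version string like '7.0.45' into a comparable tuple."""
    return tuple(int(part) for part in text.split("."))

def check_update(installed, latest):
    """Classify the gap between the installed and latest versions."""
    inst, late = parse_version(installed), parse_version(latest)
    if inst >= late:
        return "up to date"
    if inst[0] != late[0]:
        return "major upgrade required"   # e.g. a new JRE generation
    return "patch available"              # the PSI-style automatic update

print(check_update("7.0.1", "7.0.45"))    # patch available
print(check_update("7.0.45", "8.0.0"))    # major upgrade required
```

The major/minor split is the design point: a patch monitor can safely apply same-generation fixes, but jumping generations may break things or require a new license, so it is left to you.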

Getting Started with KeePass Part 1

KeePass is an open source password manager. KeePass is simple to install and has a wide variety of personal security options; however, it does not directly integrate with any web browser. The significant plus of this solution is the cost: free.

Get started by downloading and installing the software from this site. 

  • Open KeePass by clicking on your Start menu, then All Programs, then “KeePass 2”
  • The first time you run the program you will be asked if KeePass can automatically check for updates. Enable this option.
  • KeePass will open up and look like this to start


  • We would like KeePass to start with Windows, so from the Tools menu click Options
  • Click the Integration tab and check “Run KeePass at Windows Startup”
  • Click OK

Now we are ready to create our first password database. For most users one database will be enough; however, it may make sense to create a separate database for information associated with a specific organization, with another database for personal information.

  • From the file menu click “New”
  • Create a folder named “KeePass” in your Documents folder
  • Name the database with a meaningful name such as “PersonalAccounts”
  • Create a master pass phrase with at least 12 total characters, using two words, 1 or more upper case letters, 1 or more symbols, and 1 or more numbers.
  • 1-Enter a descriptive name for this database
  • 2-Enter a default username that is either a username or email address that you will typically use for your accounts
    • 3- (Optional) Pick a color
  • Click the Security Tab
  • Change the iterations value to 15000
  • Click OK
  • First, let’s add a group under our Internet identities for social media: right click on “Internet” then click “Add Group”
  • Name the group “Social Media” and click OK
  • Click Add Entry
  • Fill out the entry with all of the information you have
    • 1- Title of the entry
    • 2- Your Username on this site
    • 3- Your password on this site (entered twice); if the quality estimate is less than 50 bits, a stronger password is advisable
    • 4- The URL of this site
    • 5- Click OK
  • The entry will now be listed under the social media group
  • Congratulations on creating your first entry! Open a web browser to the site you just created an entry for
  • Return to KeePass and select your entry, then choose copy username (green arrow or Ctrl+B)
  • Go to your web browser and paste in the username field
  • Return to KeePass and select your entry, then choose copy password (red arrow or Ctrl+C)
  • Note: you have 15 seconds to paste the value into the correct location; KeePass will then clear the clipboard to protect your information

Repeat the steps above for each web site or system you will use. When you are done with a work session choose “Lock Workspace” from the file menu to protect your information. Also don’t forget to save your database from the file menu after important changes.
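To see why the guidance above asks for length plus mixed character classes, here is a rough estimate of password entropy in the spirit of KeePass’s quality meter. This is a naive upper bound of my own (it assumes random, independent characters and ignores dictionary words and patterns), not KeePass’s actual algorithm.

```python
# Rough entropy estimate: bits ≈ length × log2(size of the character pool
# the password appears to draw from). Mixing classes grows the pool.
import math
import string

def estimate_entropy_bits(password):
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)   # 32 printable ASCII symbols
    return len(password) * math.log2(pool) if pool else 0.0

# A 12-character phrase mixing cases, a digit, and a symbol clears the
# 50-bit bar mentioned above; a short lowercase word does not.
print(estimate_entropy_bits("Blue!Tomato7"))
print(estimate_entropy_bits("password"))
```

By this measure an 8-character all-lowercase password sits under 40 bits, while 12 mixed characters land near 80, which is why the master pass phrase advice emphasizes both length and variety.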



Getting Started with Last Pass (Premium)

Last Pass Premium has been my personal choice in password managers for over two years now. The premium license takes care of a few requirements above and beyond what most users require, yet it remains user friendly.

  • Plug-ins for all major browsers
  • Support for Ubuntu Linux and Mac OS X
  • Support for iPhone, iPad, and Android devices
  • No limit on the number of devices in use.

Watch this video to get started with Last Pass.

You can elect to use my referral link for a free month of service.

The three videos in this playlist will give you a quick introduction to making use of Last Pass day to day.


Many of you will ask how secure Last Pass is. I’m glad you asked! Thinking about how secure something is means you are taking your personal security seriously. A better question to ask, though, is: what are my risks in using this software? The risk is that if someone obtains your master password, all of your accounts would be compromised. That can happen if someone is able to observe or guess your master password.

  • Reduce this risk by using a strong pass phrase to secure your account.
  • Only access your Last Pass account from devices you can trust. A device you can trust is a device you own and control, with no other users.
  • If you must access your Last Pass account from shared devices, do not save your master password on devices shared with others, including friends, family, or co-workers.

We can make a few small changes to make our information more secure. On your desktop, double click the Last Pass icon, log in, then click on Settings.

  • First, restrict login to only those countries in which you may travel frequently. This list can be changed at any time; be sure to visit this setting before traveling.




  • The second thing we will change is requiring entry of our master password before a password can be “shared”. This feature is security sensitive: it can be great for families, allowing you to securely share passwords for financial sites with family members, but it should not be used for enterprise credentials.
  • We also want to enter a “security” email address, which can be used to notify you of concerns with your account. A family member’s or work email address is frequently used here.



  • The last thing we should do is restrict login from mobile devices. After you have installed Last Pass on all of your devices, come to this screen to “restrict” and then enable each of your devices.



Last Pass supports the use of two factor authentication; selecting and enabling it is beyond the scope of this article.


Getting Started with DashLane 3 Premium

Dashlane 3 Premium is an alternative to Last Pass. Dashlane is generally a less technical program and does not support the Linux operating system. The biggest pro for Dashlane over others is its simplicity: there is no configuration required to use this software securely. The con for Dashlane is its lack of advanced features:

  • No ability to restrict usage by country
  • No ability to restrict login to certain devices
  • No ability to require two factor authentication such as a smart token or SMS message
  • Higher cost: $30 per year compared to $12 for Last Pass

To get started with six months free, use my referral link below

Watch this getting started video.

Joy to share

OK, many of you may know me; for those that don’t, I will share that I am not the kind of person you would count as full of joy. I’ve been told that if I am standing in a room, my default mode is simply scary. Not that I am an unhappy person by any means; if you are willing to risk approaching me, you will find my appearance is misleading. My devotion today has me asking: do I have joy to share?

Then Nehemiah the governor, Ezra the priest and teacher of the Law, and the Levites who were instructing the people said to them all, “This day is holy to the Lord your God. Do not mourn or weep.” For all the people had been weeping as they listened to the words of the Law. Nehemiah said, “Go and enjoy choice food and sweet drinks, and send some to those who have nothing prepared. This day is holy to our Lord. Do not grieve, for the joy of the Lord is your strength.” (Nehemiah 8:9, 10 NIV)

Nehemiah said something profound on hearing the law. He told the people that the very law that was condemning the nation was cause for joy. He realized that the condemnation was leading to salvation. He wanted everyone to share the joy, so much so that his instructions were to share with those that were not prepared. Are there people in my life that are not prepared to enjoy our salvation? Have I prepared enough joy to provide them with plenty and not be lacking myself? I think this wise governor was on to something; this might just be the very heart of the Gospel.