I'm always excited to take on new projects and collaborate with innovative minds.

Mail

say@niteshsynergy.com

Website

https://www.niteshsynergy.com/

Splunk

Splunk and New Relic are both popular observability and monitoring tools, but they were designed with different core strengths. The easiest way to understand them is through a side-by-side comparison, shown below.


Splunk vs New Relic

| Feature | Splunk | New Relic |
|---|---|---|
| Core focus | Log management, machine-data analytics, SIEM | Application Performance Monitoring (APM), full-stack observability |
| Primary users | Security teams, DevOps, SRE | Developers, DevOps, SRE |
| Data types | Logs, metrics, traces, events (strong in logs) | Metrics, traces, logs (strong in APM) |
| Deployment | Cloud + on-prem options | Mainly cloud SaaS |
| Query language | SPL (Splunk Processing Language) | NRQL |
| Visualization | Dashboards and log analytics | Strong UI for application monitoring |
| Pricing | Often expensive (data-ingestion based) | Generally cheaper and simpler pricing |
| Security use cases | SIEM, threat detection | Not mainly used for security |
| Ease of use | Powerful but steeper learning curve | Easier to start and use |

In short, Splunk excels at log analytics and security, while New Relic focuses more on monitoring application performance and user experience.


Architecture focus

Splunk

It can ingest and analyze almost any log or event data source for troubleshooting and analytics.


New Relic

It provides full-stack observability across apps, infrastructure, and networks with dashboards and alerts.


Typical use cases

Choose Splunk if you need

  • Deep log analytics across many data sources
  • Security monitoring and SIEM capabilities
  • On-prem or hybrid deployment options

Example teams: security operations, enterprise IT, DevOps/SRE.


Choose New Relic if you need

  • Application performance monitoring (APM)
  • Full-stack observability with a simple cloud setup
  • Developer-friendly dashboards and alerting

Example teams: developers, DevOps, SRE.

Quick example

Imagine a slow web application.

New Relic would show which transactions, services, or database calls are slow through APM traces and dashboards.

Splunk would let you search the application and server logs for errors or anomalies around the time of the slowdown.

 

Simple summary

| If your focus is… | Choose |
|---|---|
| Application performance | New Relic |
| Log analytics & security | Splunk |
| Developer observability | New Relic |
| Enterprise monitoring + SIEM | Splunk |

 

 

Splunk – Simple Overview

Splunk is a software platform used to collect, analyze, and visualize machine data and big data.

Machine data is generated by systems such as:

  • Servers and operating systems
  • Applications and websites
  • Network devices
  • Sensors and IoT devices

This data usually does not have direct business meaning, but it is very useful for:

  • Monitoring system health and performance
  • Troubleshooting failures
  • Detecting security threats

Splunk can process unstructured, semi-structured, and structured data, then allow users to search, analyze, and visualize insights using reports and dashboards.

Over time, Splunk has evolved from a simple log analysis tool into a powerful big data analytics platform.


Splunk Product Categories

  1. Splunk Enterprise
    Used by organizations with large IT infrastructures to collect and analyze data from applications, websites, devices, and sensors.
  2. Splunk Cloud
    Cloud-hosted version of Splunk Enterprise, available directly from Splunk or through AWS.
  3. Splunk Light
    A simplified version that provides basic search, reporting, and alerting for log data.

Key Features of Splunk

  • Data ingestion from almost any source
  • Fast searching and analysis with SPL
  • Reports, alerts, and dashboards
  • Scalable indexing of large data volumes

Splunk is a data analytics platform that collects, searches, analyzes, and visualizes machine data to monitor systems and gain insights.

 

Step 1 – Download the .deb Package

  1. Open the Splunk download page:
    https://www.splunk.com/en_us/download/splunk-enterprise.html
  2. Choose Linux as the platform.
  3. Select the .deb package (used for Ubuntu/Debian systems).
  4. Click Download Now.
  5. Log in or create a Splunk account if prompted.
  6. The Splunk Enterprise .deb installer will start downloading.

The .deb file is the installation package used by Ubuntu, similar to .exe files in Windows.

 

Splunk – Web Interface Overview


 

The picture below shows the initial screen after you log in to Splunk with the admin credentials.

Interface1

 

Administrator Link

The Administrator dropdown allows management of the admin account.

Using this option you can:

  • Change admin email ID
  • Reset or update admin password
  • Manage account settings

It also provides access to Preferences, where you can:

  • Set the time zone
  • Select the default home application
  • Customize the landing page after login


Interface2

Interface3

Settings Link

The Settings menu contains most of the core configuration features of Splunk.

From here you can:

  • Manage data inputs
  • Create lookup files and lookup definitions
  • Configure users, roles, and authentication
  • Manage indexes and apps
  • Configure alerts and data models

This section is mainly used for administration and system configuration.
 

Interface4

Search and Reporting Link

The Search & Reporting app is the most commonly used area in Splunk.

It allows users to:

  • Search ingested data
  • Create reports and alerts
  • Build visualizations and dashboards
  • Analyze logs and machine data

Users write Splunk Search Processing Language (SPL) queries here to analyze data.
 

Interface5

 

 

The Splunk Web Interface is a browser-based platform where users can search data, create reports, configure alerts, and manage system settings and users.

It includes key sections such as Administrator, Settings, and Search & Reporting.

 

 

Splunk – Data Ingestion (Simple Explanation)

 

Data Ingestion in Splunk means importing or loading data into Splunk so that it can be searched, analyzed, and visualized.

This process is done using the Add Data option available in the Search & Reporting App.


 

Steps for Data Ingestion in Splunk

Add Data

After logging in to Splunk:

  • Go to the Home screen
  • Click the Add Data icon

This opens the data upload wizard where you select the data source.

Gather the Data

You need a dataset to upload into Splunk.

Example:

  • Download sample datasets from the Splunk official website
  • Extract the downloaded file
  • The folder may contain files like:
    • secure.log
    • other log files generated by web applications

These log files simulate machine data for analysis.

 

Upload Data

  • Select the file (example: secure.log) from your system
  • Click the Next button to continue.

This uploads the file into the ingestion pipeline.

 

Select Source Type

Splunk automatically detects the data format.

Common source types include:

  • System logs
  • Web logs
  • JSON
  • XML

You can:

  • Use the default source type, or
  • Manually select a different type from the dropdown list.

 

Input Settings

Here you configure host information for the data.

Options include:

Constant Value

  • Specify the full host name of the machine generating the data.

Regex on Path

  • Use a regular expression to extract the host name.

Segment in Path

  • Extract host name from a specific path segment.
    Example: /var/log/server1/logfile

You also choose an Index Type:

  • Default Index – stores raw data for searching.
  • Summary Index – stores aggregated summary data.
  • History Index – stores search history.
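Once the data is ingested, searches can be scoped to the index and source type you configured. The query below is a minimal sketch; the index name, source type, and keyword are illustrative placeholders for whatever you chose during ingestion:

```spl
index=main sourcetype=linux_secure "Failed password"
```

Restricting the search to an index and source type like this is also a good performance habit, since Splunk scans far less data.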

 

Review Settings

Splunk shows a summary of all selected settings.

You should:

  • Review the configuration
  • Click Next to confirm.

 

 Data Ingestion Completed

After finishing:

  • Splunk confirms successful data ingestion
  • You can now:
    • Search the data
    • Create reports
    • Build dashboards
    • Set alerts

 

Splunk – Source Types (Simple Explanation)


 


What is a Source Type in Splunk?

A Source Type in Splunk defines the format and structure of incoming data.

When data is ingested into Splunk, the data processing engine automatically analyzes the data and assigns it a source type. This process is called Source Type Detection.

This helps Splunk:

  • Understand the structure of the data
  • Extract relevant fields automatically
  • Make the data easier to search and analyze

Example:
If Splunk receives a log file from an Apache web server, it automatically identifies it and applies the Apache log source type.
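Once a source type is applied, its automatically extracted fields can be used directly in searches. A small sketch, assuming Apache access logs ingested with the standard access_combined source type (which extracts fields such as status and clientip):

```spl
sourcetype=access_combined status=404 | stats count by clientip
```

This counts 404 responses per client IP without any manual parsing, because the source type told Splunk how to extract the fields.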


Supported Source Types

When uploading data using the Add Data feature:

  1. Upload a file.
  2. Open the Source Type dropdown.
  3. You will see many supported source types.

These include formats such as:

  • Logs
  • CSV files
  • JSON
  • XML
  • Database logs
  • Application logs

Source Type Sub-Categories

Each source type category can have multiple sub-categories.

Example:

  • Database category
    • MySQL logs
    • Oracle logs
    • PostgreSQL logs

Splunk can recognize these formats and automatically extract useful fields.


Important Pre-Trained Source Types

Splunk includes many pre-trained source types, meaning it already knows how to interpret certain log formats.

| Source Type | Description |
|---|---|
| access_combined | HTTP web server logs in NCSA combined format |
| access_combined_wcookie | Same as combined logs but with cookie information |
| apache_error | Apache web server error logs |
| linux_messages_syslog | Linux system log messages |
| log4j | Logs generated by applications using Log4j |
| mysqld_error | MySQL database error logs |
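These pre-trained source types can be referenced directly in searches. A minimal sketch, assuming Linux syslog data has already been ingested:

```spl
sourcetype=linux_messages_syslog | head 10
```

This returns the first ten matching Linux syslog events.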

 

A Source Type in Splunk identifies the format of incoming data, allowing Splunk to automatically classify logs and extract useful fields, making analysis faster and easier.

 

Splunk – Basic Search


 

What is Basic Search in Splunk?

Basic Search in Splunk allows users to find specific information from the ingested data (such as log files, system events, or machine data).

Splunk uses a Search Processing Language (SPL) to perform searches. With SPL, you can filter, combine, and analyze log data quickly.

The search feature is available in the Search & Reporting App in the Splunk web interface.

 

How to Perform a Basic Search

Open Search & Reporting

After logging in to Splunk:

  1. Go to the left sidebar.
  2. Click Search & Reporting.
  3. A search bar will appear where you can type search queries.

This is the main place where data analysis begins in Splunk.

 

Searching for a Single Term

You can search for a specific term present in the log data.

Example:

 
host=server1
 

This query will show all log events generated from the host named "server1".

Results appear in:

  • Events list (individual log entries)
  • Timeline chart showing event distribution over time.

 

Combining Search Terms

You can combine multiple terms to make the search more specific. By default, Splunk joins separate terms with an implicit AND, matching events that contain all of them in any order.

Example:

 
"login failed"
 

Using double quotes (" ") searches for the exact phrase.

Example result:

  • Only log entries that contain login failed together will appear.

 

Using Wildcards

Wildcards help search for multiple variations of a word.

Example:

 
fail* AND password
 

Here:

  • fail* matches words like:
    • fail
    • failed
    • failure
  • AND ensures the event must also contain password.

Other operators include:

  • AND – both terms must exist
  • OR – either term can exist
  • NOT – excludes a term

Example:

 
error OR failure
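These operators can also be combined with parentheses to control grouping. A hedged sketch (the keywords here are illustrative):

```spl
(error OR fail*) NOT debug
```

This matches events containing error or any word starting with fail, while excluding events that also contain debug.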
 

 

Refining Search Results

Splunk allows you to narrow down results directly from the event list.

Steps:

  1. Click a value in the search results (for example 3351).
  2. Select Add to Search.
  3. Splunk automatically adds it to the query.

Example refined query:

 
host=server1 3351
 

Now only events containing 3351 will be displayed.

Also:

  • The timeline graph updates automatically to reflect filtered results.

 

Key Components in Search Results

Timeline

  • Shows the number of events over time.

Events List

  • Displays individual log entries.

Fields Panel

  • Shows extracted fields from the data.

 

Basic Search in Splunk allows users to query ingested data using keywords or SPL queries. Users can combine terms, use wildcards, apply operators, and refine results to analyze log data efficiently.

 

Splunk – Field Searching


 

What is Field Searching in Splunk?

When Splunk ingests machine data (like log files), it automatically analyzes the data and breaks it into fields.
A field represents a single piece of information from an event.

Example fields from a log record:

  • host – server name
  • timestamp – time when event occurred
  • event type – login attempt, HTTP request, error, etc.
  • user – username involved
  • status – success or failure

Even if the data is unstructured, Splunk tries to extract key-value pairs and separate them based on:

  • Strings
  • Numbers
  • Dates

This automatic process is called Field Extraction.
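When automatic extraction misses a field, SPL's rex command can pull it out at search time using a regular expression with a named capture group. A minimal sketch, assuming log lines contain text like user=root (the host, field name, and pattern are illustrative):

```spl
host=server1 | rex "user=(?<user>\w+)" | stats count by user
```

The rex command creates a new field called user from each matching event, which the stats command then groups on.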


Viewing Fields

After running a search in Splunk:

  1. Click Show Fields in the search results page.
  2. Splunk opens the Fields sidebar.
  3. It displays all the fields extracted from the uploaded data (for example from secure.log).

Each field represents a column of information within the events.


Choosing Fields to Display

You can control which fields appear in the search results.

Steps:

  1. Click All Fields.
  2. A list of available fields appears.
  3. Select or deselect fields using checkboxes.

For every field Splunk also shows:

  • Number of distinct values
  • Data type (string, number, etc.)
  • Event coverage (%) – how many events contain that field.

This helps users understand how important or common a field is in the dataset.

 

Field Details

If you click on a field name, Splunk shows a Field Summary.

The summary includes:

  • All distinct values of that field
  • Count of each value
  • Percentage distribution

Example:
Field: status

| Value | Count | Percentage |
|---|---|---|
| success | 120 | 60% |
| failure | 80 | 40% |

This helps identify patterns and trends quickly.


Using Fields in Search Queries

Fields can also be used directly in the search query to filter results.

Example:

 
host=mailsecure_log date="15 Oct"
 

This query returns all events generated on 15th October from the host mailsecure_log.

Another example:

 
status=failed user=root
 

This search shows failed login attempts for the root user.

 

Field Searching in Splunk allows users to analyze specific parts of log data by using automatically extracted fields such as host, timestamp, user, and event type, making searches more precise and efficient.

 

Splunk – Time Range Search


What is Time Range Search?

In Splunk, Time Range Search allows users to filter and analyze events based on a specific time period.

Every search result page in Splunk shows a timeline graph at the top.
This graph displays how events are distributed across time.

Using this timeline, users can limit their search results to a specific time range, making analysis faster and more accurate.


Preset Time Range Options

Splunk provides predefined time ranges that can be selected easily.

Common preset options include:

  • Last 15 minutes
  • Last 60 minutes
  • Last 24 hours
  • Last 7 days
  • Previous week
  • Previous month

Example:
If you select Previous Month, Splunk will display only the events that occurred during the previous month.

The timeline graph updates automatically to show events for that selected period.


Selecting a Time Subset

You can also manually select a portion of time from the timeline graph.

Steps:

  1. Click on the timeline graph.
  2. Drag across the bars representing event counts.
  3. Splunk filters results for that selected time interval.

Important:

  • This does not re-run the query.
  • It only filters the already returned results.

This method is useful for quick analysis of specific time spikes.


Using Earliest and Latest Commands

Splunk also allows time filtering directly through search commands.

Two important commands are:

  • earliest
  • latest

Example:

 
earliest=-15d latest=-7d
 

Meaning:

  • Show events from 15 days ago to 7 days ago.

This method provides more precise control compared to manual timeline selection.
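Relative time modifiers also support snapping to a unit boundary with the @ symbol. A short sketch:

```spl
earliest=-7d@d latest=@d
```

This covers the last seven complete days: from midnight seven days ago up to midnight today, ignoring the partial current day.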


Nearby Events

Splunk can also display events occurring near a specific time.

Users can specify the time interval scale, such as:

  • Seconds
  • Minutes
  • Hours
  • Days
  • Weeks

Example use case:

  • Investigating events occurring just before or after a system failure.

This helps identify related events around the same time period.

 

Time Range Search in Splunk allows users to analyze log events within a specific time period using preset time ranges, timeline selection, or commands like earliest and latest.

 


Splunk – Sharing and Exporting Search Results

Overview

In Splunk, when you run a search query, the result is saved as a search job on the Splunk server.
These jobs can be shared with other users or exported as files so that the results can be used outside Splunk.

This feature helps teams collaborate and reuse search results without running the same query again.

 

Sharing the Search Result

After a search query finishes:

  1. Look at the top-right area of the search page.
  2. Click the Share icon (upward arrow).
  3. Splunk generates a URL link for that search job.

Using this link:

  • Other users can open the same search result.
  • They do not need to write or run the query again.

Important:

  • Users must have permission to access the shared result.
  • Permissions are managed through the Splunk Administration settings.

 

Finding Saved Search Jobs

All executed searches are stored as jobs in Splunk.

To view them:

  1. Go to the Activity Menu (top-right of the interface).
  2. Click Jobs.
  3. Splunk shows a list of saved search jobs.

Each job includes:

  • Search query
  • Owner (user who ran it)
  • Execution time
  • Expiration date

 

Managing Job Expiration

Search jobs expire automatically after a certain time to save server resources.

If you want to keep the results longer:

  1. Select the job from the jobs list.
  2. Click Edit Selected.
  3. Choose Extend Expiration.

This keeps the job available for future use.

 

Exporting Search Results

Splunk also allows exporting search results into files.

Steps:

  1. Run a search query.
  2. Click the Export button.
  3. Choose a file format.

Available formats:

  • CSV – used for spreadsheets like Excel
  • XML – structured data format
  • JSON – commonly used for APIs and applications

After selecting the format, the file is downloaded to your local system.

This allows results to be shared with people who do not use Splunk.
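As an alternative to the Export button, results can also be written to a CSV file from SPL itself using the outputcsv command. A minimal sketch (the search terms and file name are illustrative):

```spl
host=server1 status=failed | outputcsv failed_logins.csv
```

Note that outputcsv saves the file on the Splunk server (under the Splunk installation's var/run/splunk/csv directory) rather than downloading it through the browser.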

Splunk allows users to share search results through links or saved jobs, and to export results as CSV, XML, or JSON files for external use and collaboration.

 

Splunk – Search Processing Language (SPL)


 

What is SPL?

SPL (Search Processing Language) is the language used in Splunk to search, filter, and analyze data.

It allows users to:

  • Retrieve data from logs
  • Filter results
  • Perform calculations
  • Group and transform data

SPL queries are written in the search bar of the Search & Reporting app.

A typical SPL query uses a pipeline structure (|), where each command processes the result from the previous command.

Example:

 
error | head 3
 

This search finds events containing error and then shows only the first 3 results.


Components of SPL

SPL consists of four main components:

  1. Search Terms
  2. Commands
  3. Functions
  4. Clauses

 

Search Terms

Search terms are the keywords or phrases used to retrieve data from Splunk.

Example:

 
login failed
 

This query returns all events containing the words login and failed.

You can also search specific fields.

Example:

 
host=server1
 

This shows events generated from server1.

 

Commands

Commands tell Splunk what action to perform on the search results.

Commands are separated by the pipe symbol |.

Example:

 
error | head 3
 

Explanation:

  • error → search term
  • | head 3 → command showing only the first 3 results

Common commands include:

  • head – shows first few results
  • table – displays selected fields
  • stats – performs statistical calculations
  • sort – sorts the results
  • top – shows most frequent values
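Several of these commands can be chained in a single pipeline, with each stage processing the previous stage's output. A hedged sketch (the search term and field name are illustrative):

```spl
error | stats count by host | sort -count | head 5
```

This counts error events per host, sorts the hosts by count in descending order, and keeps only the top five rows.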

 

Functions

Functions perform calculations on fields in the dataset.

These are commonly used with commands like stats.

Example:

 
| stats avg(bytes)
 

Explanation:

  • avg() calculates the average value of the field bytes.

Other common functions:

  • sum() – total value
  • count() – number of events
  • max() – maximum value
  • min() – minimum value

Example:

 
| stats count
 

This counts the total number of events.

 

Clauses

Clauses help organize and rename results.

Common clauses include:

  • by – groups results
  • as – renames fields

Example:

 
| stats avg(bytes) by file
 

Meaning:

  • Calculate average bytes
  • Group results by file name

Example with rename:

 
| stats avg(bytes) as Avg_Size by file
 

Result:

  • Each file
  • Its average size

SPL (Search Processing Language) is the query language used in Splunk to search and analyze machine data.
It includes search terms, commands, functions, and clauses to filter, calculate, and organize data results.

 

Splunk – Search Optimization


 

What is Search Optimization?

Search Optimization in Splunk improves the speed and efficiency of search queries automatically.

Splunk has built-in optimization mechanisms that analyze the query and adjust the search process so that results are returned faster and with less resource usage.

The two main optimization goals are:

  • Early Filtering
  • Parallel Processing

 

Early Filtering

Early filtering means removing unnecessary data as early as possible during the search process.

Instead of processing all events, Splunk first filters out irrelevant events.

Benefits:

  • Reduces amount of data processed
  • Improves search performance
  • Avoids unnecessary calculations like lookups and evaluations

Example search:

 
fail OR failed OR password
 

Splunk first filters events containing these keywords before applying further processing.
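In practice, early filtering means putting the most restrictive terms in the base search rather than filtering later in the pipeline. A hedged sketch (the index, field, and time range are illustrative):

```spl
index=web status=500 earliest=-24h | stats count by uri
```

Because the index, field value, and time range are applied up front, the stats command only ever processes relevant events.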

 

Parallel Processing

Splunk uses multiple indexers to process searches simultaneously.

Steps:

  1. Search query is sent to indexers.
  2. Indexers process parts of the data in parallel.
  3. Processed results are sent to the Search Head.
  4. The Search Head combines and displays the final results.

Benefits:

  • Faster search execution
  • Better performance for large datasets

 

Analyzing Search Optimization

Splunk provides a tool called Job Inspector to analyze how a search was optimized.

Steps to access it:

  1. Run a search query.
  2. Open the Job menu below the search bar.
  3. Select Inspect Job.

The Job Inspector shows:

  • Execution time
  • Number of events processed
  • Optimization steps
  • Cost of each processing step

This helps users understand how Splunk executed the search.

 

Example

Search query:

 
fail OR failed OR password
 

Using Job Inspector, you can see:

  • Total events scanned
  • Time taken to return results
  • How Splunk optimized the query

 

Turning Off Optimization

Splunk also allows disabling built-in optimization for testing purposes.

This can be done using the noop command with its optimization flag turned off.

Example:

 
fail OR failed OR password | noop search_optimization=false
 

Purpose:

  • Compare performance with and without optimization
  • Identify whether optimization improves or slows down a specific query

Sometimes disabling optimization may give faster results, depending on the query.

 

Splunk – Transforming Commands


What are Transforming Commands?

Transforming commands in Splunk are used to convert search results into structured tables or statistical summaries.

These commands take the raw event data returned from a search and transform it into formats suitable for reports, statistics, and visualizations such as charts and dashboards.

Instead of showing individual log events, transforming commands produce aggregated results.

 

Examples of Transforming Commands

Some commonly used transforming commands include:

  • highlight
  • chart
  • stats

 

Highlight Command

The highlight command is used to highlight specific keywords in search results.

It helps users quickly identify important terms in large datasets.

Example:

 
error | highlight safari,butter
 

Explanation:

  • Splunk searches for error events
  • The words safari and butter are highlighted in the results.

This improves readability and analysis of log events.

 

Chart Command

The chart command transforms search results into a table format that can be visualized as charts.

Supported visualizations include:

  • Bar charts
  • Line charts
  • Column charts
  • Area charts

Example:

 
| chart avg(bytes) by filetype
 

Explanation:

  • Calculates average bytes
  • Groups the results by file type
  • Results can be displayed as a bar or column chart

This command is commonly used for data visualization in dashboards.

 

Stats Command

The stats command is one of the most powerful transforming commands in Splunk.

It performs statistical calculations on fields.

Common functions used with stats:

  • count() – number of events
  • sum() – total value
  • avg() – average value
  • max() – maximum value
  • min() – minimum value

Example:

 
| stats count by weekday
 

Explanation:

  • Counts events
  • Groups results by weekday

Output example:

| Weekday | Count |
|---|---|
| Monday | 20 |
| Tuesday | 15 |
| Wednesday | 18 |

This produces summary statistics instead of raw events.

Transforming commands in Splunk convert raw search results into structured statistical data that can be used for reports, charts, and dashboards.
Common examples include highlight, chart, and stats.

 

Splunk – Reports


What is a Splunk Report?

A Splunk Report is a saved result of a search query that displays statistics, tables, or visualizations based on event data.

Key points about reports:

  • Reports are created from search queries.
  • They run again each time they are opened, so results are always updated with new data.
  • Reports can be shared with other users.
  • They can be added to dashboards.
  • Some reports support drill-down, allowing users to click on a result and view the underlying events.

Creating a Report

Creating a report in Splunk is simple.

Steps:

  1. Run a search query in the Search & Reporting app.
  2. Click Save As.
  3. Select Report from the dropdown menu.

This opens the Create Report window.

You must provide:

  • Report Name
  • Description
  • Time Picker option

Time Picker:
If enabled, users can change the time range when running the report.

Finally, click Save to create the report.


Report Configuration

After saving the report, Splunk provides options to configure it.

Important configuration options include:

Permissions

Define who can view or edit the report.

Scheduling

You can schedule reports to run automatically at specific intervals.

Example:

  • Every hour
  • Every day
  • Every week

Scheduled reports are often used for monitoring and alerts.

Add to Dashboard

Reports can be added directly to dashboards for visualization.


Viewing a Report

After creating the report, click View to open it.

The report page shows:

  • Search query results
  • Charts or tables
  • Time range selection

Users can run the report again anytime to get updated results.

 

Modifying the Report Search

Sometimes the original search query needs to be updated.

Steps to edit the search:

  1. Open the report.
  2. Click Open in Search.
  3. The original SPL query appears in the search bar.
  4. Modify the query as needed.
  5. Save the updated report.

 

A Splunk Report is a saved search result that shows statistics and visualizations. Reports can be scheduled, shared with users, added to dashboards, and updated by modifying the original search query.


Phase 2 : 8AM To 12PM

Splunk - Dashboards



 

What is a Dashboard in Splunk?

A Dashboard in Splunk is a visual interface that displays data using charts, tables, and reports.

Dashboards are used to monitor systems and analyze data quickly.

Key idea:

  • A dashboard contains multiple panels.
  • Each panel displays a report, chart, or table.

This helps present important information in a visually organized way.

Example panels may show:

  • Login failures
  • Server activity
  • Website traffic
  • System errors

Creating a Dashboard

Dashboards are usually created from search results and visualizations.

Steps to Create a Dashboard

  1. Run a search query in the Search & Reporting app.
  2. Click the Visualization tab.
  3. Choose a chart type (for example Pie Chart).
  4. Click Save As → Dashboard Panel.

This will open the Create Dashboard Panel window.


Filling Dashboard Details

In the next screen you must enter:

  • Dashboard Title
  • Panel Title
  • Dashboard Description

You can either:

  • Add the panel to an existing dashboard, or
  • Create a new dashboard

Then click Save.


Viewing the Dashboard

After saving:

  1. Click View Dashboard.
  2. The dashboard will appear with the panel containing your chart.

Dashboard options include:

  • Edit
  • Export
  • Delete

Adding Panels to a Dashboard

Dashboards can contain multiple panels.

To add another panel:

  1. Create another search query and visualization.
  2. Click Save As → Dashboard Panel.
  3. Select the same dashboard.
  4. Add the panel and save.

Now the dashboard will display multiple charts in different panels.

Example:

  • Panel 1 → Pie chart (file counts by weekday)
  • Panel 2 → Bar chart (file sizes)

Advanced Dashboard Features

Splunk dashboards also support interactive inputs, such as:

  • Text boxes
  • Radio buttons
  • Dropdown menus

These allow users to filter dashboard data dynamically.

 

A Splunk Dashboard is a visual display of reports and charts organized into panels.
Dashboards help users monitor systems, analyze trends, and present data visually.

 

Splunk – Pivot and Datasets


 

What are Datasets in Splunk?

A Dataset in Splunk is a structured collection of data fields, similar to a table in a relational database.

Datasets make it easier to:

  • Analyze data
  • Filter records
  • Create pivot reports
  • Perform lookups
  • Build visualizations

Instead of working directly with raw logs, users can work with organized datasets containing selected fields.

 

Creating a Dataset

Datasets are created using the Splunk Datasets Add-on.

Steps:

  1. Install the Splunk Datasets Add-on from Splunkbase.
  2. Open the add-on in Splunk.
  3. Click Create New Table Dataset.

 

Selecting a Dataset Source

When creating a dataset, Splunk provides three options:

Indexes and Source Types

Use data that already exists in Splunk.

Example:

  • Logs from servers
  • Application logs
  • Network logs

Existing Datasets

Create a new dataset from an existing dataset.

Search

Use the results of a search query as the dataset.

Example:

 
index=web_logs
 

In many cases, users select an existing index as the data source.


Choosing Dataset Fields

After selecting the source, Splunk asks which fields should be included in the dataset.

Example fields:

  • _time (default field – cannot be removed)
  • bytes
  • categoryID
  • clientIP
  • file

These fields become columns in the dataset table.

After selecting the fields, click Done.

The dataset now looks like a structured table.

Finally, click Save As to store the dataset.

 

Creating a Pivot

A Pivot is a tool used to summarize and analyze dataset information.

Pivot reports perform aggregation, similar to pivot tables in Excel.

Example:

  • Count events
  • Calculate averages
  • Compare categories

 

Steps to Create a Pivot

Select Dataset

Go to the Datasets tab and select your dataset.

Choose Pivot Option

Click Actions → Visualize with Pivot.

Select Pivot Fields

In the pivot editor you define:

Split Columns

  • Field that will appear as columns.

Example:

 
categoryID
 

Split Rows

  • Field that will appear as rows.

Example:

 
file
 

Result

The pivot table will show counts of each categoryID for each file.

Example:

File  | Category 1 | Category 2 | Category 3
file1 | 5          | 2          | 3
file2 | 4          | 1          | 6

This allows users to quickly analyze patterns in the data.
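The pivot above is essentially a cross-tabulation: a count of events per (row, column) cell. A small Python sketch with made-up rows:

```python
from collections import Counter

# Hypothetical dataset rows: (file, categoryID) pairs.
rows = [
    ("file1", "Category 1"), ("file1", "Category 1"), ("file1", "Category 2"),
    ("file2", "Category 3"), ("file2", "Category 3"), ("file2", "Category 1"),
]

# A pivot with `file` as split rows and `categoryID` as split columns
# counts how many events fall into each (file, category) cell.
pivot = Counter(rows)

print(pivot[("file1", "Category 1")])
```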

 

  • Datasets in Splunk are structured tables created from log data.
  • Pivot is used to analyze datasets by summarizing data across rows and columns.
  • It helps create reports and visualizations without writing SPL queries.

 

 

Splunk – Lookups


What is a Lookup in Splunk?

A Lookup in Splunk is used to enrich search results by adding additional information from another dataset.

Sometimes log data contains codes or IDs that are not easy to understand.
For example:

productid
WC-SH-G04
DB-SG-G01

These values do not clearly explain the product.
Using a lookup table, we can map these IDs to meaningful descriptions.

Example lookup result:

productid | productdescription
WC-SH-G04 | Tablets
DB-SG-G01 | PCs

This process of matching fields from two datasets is called a Lookup.

 

Steps to Create and Use a Lookup

Create a Lookup File

First, create a CSV file containing the mapping values.

Example: productidvals.csv

 
productid,productdescription  
WC-SH-G04,Tablets  
DB-SG-G01,PCs  
DC-SG-G02,MobilePhones  
SC-MG-G10,Wearables  
WSC-MG-G10,USB Light  
GT-SC-G01,Battery  
SF-BVS-G01,Hard Drive
 

Important rule:
The field name must match the field in the dataset (productid).

 

Upload the Lookup File

Steps:

  1. Go to Settings.
  2. Click Lookups.
  3. Select Lookup Table Files.
  4. Click Add New.
  5. Upload the file (productidvals.csv).
  6. Select the destination app (usually Search & Reporting).
  7. Click Save.

Now the CSV file becomes a lookup table in Splunk.

 

Create a Lookup Definition

A Lookup Definition tells Splunk how to use the lookup table.

Steps:

  1. Go to Settings → Lookups.
  2. Click Lookup Definitions.
  3. Click Add New.
  4. Choose the uploaded lookup table.
  5. Save the definition.

Now Splunk knows which file to use for the lookup process.

 

Select the Lookup Field

Next, enable the field for searching:

  1. Go to Search → All Fields.
  2. Select the field productid.
  3. Splunk automatically adds productdescription from the lookup table.

 

Use Lookup in Search Query

Now you can use the lookup field in your search.

Example:

 
index=web_application | lookup productidvals.csv productid OUTPUT productdescription
 

Result:

  • Splunk matches productid values from the dataset
  • It retrieves the productdescription from the lookup table

The search result will now display product names instead of just IDs.

 

A Lookup in Splunk is used to add meaningful information to search results by matching fields with values from another dataset (usually a CSV file).

Example:

  • Product ID → Product Name
  • IP Address → Location
  • User ID → User Name

This improves data readability and analysis.
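The enrichment a lookup performs is essentially a dictionary join on the matching field. A minimal Python sketch using the example mapping above:

```python
# Hypothetical in-memory lookup table, mirroring productidvals.csv.
lookup = {
    "WC-SH-G04": "Tablets",
    "DB-SG-G01": "PCs",
}

# Raw search results containing only the cryptic productid field.
events = [{"productid": "WC-SH-G04"}, {"productid": "DB-SG-G01"}]

# The lookup matches productid and OUTPUTs productdescription into each event.
for event in events:
    event["productdescription"] = lookup.get(event["productid"], "unknown")

print(events[0]["productdescription"])
```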

 

Splunk – Schedules and Alerts


 

 

Overview

In Splunk, Scheduling and Alerts help automate monitoring tasks.

  • Scheduling → Runs reports automatically at specific times.
  • Alerts → Trigger actions when certain conditions are met in the data.

These features are widely used in system monitoring, security operations, and performance tracking.

 

Scheduling in Splunk

Scheduling means running a report or search automatically at predefined intervals.

Uses of Scheduling

  • Run reports daily, weekly, or monthly
  • Improve dashboard performance by generating results in advance
  • Automatically send reports via email
  • Automate system monitoring

 

Creating a Schedule

Steps:

  1. Open an existing report.
  2. Click Edit.
  3. Select Edit Schedule.

You will see scheduling options.

Example configuration:

  • Report runs every Monday at 6 AM.

 

Important Scheduling Features

Time Range

Defines the data time period used in the report.

Examples:

  • Last 15 minutes
  • Last 4 hours
  • Last week

 

Schedule Priority

Determines which report runs first when multiple reports are scheduled at the same time.

Higher priority reports run earlier.

 

Schedule Window

Allows a report to run within a flexible time window.

Example:

  • If window = 5 minutes
  • Report may run within 5 minutes of scheduled time

This helps balance system load.

 

Schedule Actions

After a scheduled report runs, Splunk can perform actions.

Examples:

  • Send email notification
  • Run a script
  • Save results to a file

This is configured using the Add Actions option.

 

Alerts in Splunk

Alerts are automatic actions triggered when specific conditions occur in search results.

Alerts help detect:

  • System errors
  • Security threats
  • Performance issues

Example:

  • Alert when login failures exceed 10 attempts
  • Alert when CPU usage exceeds 90%

 

Creating an Alert

Steps:

  1. Run a search query.
  2. Click Save As.
  3. Choose Alert.

This opens the Alert Configuration screen.

 

Alert Configuration Options

Title

Name of the alert.

Description

Explanation of what the alert monitors.

Permissions

Defines who can:

  • View the alert
  • Run the alert
  • Edit the alert

Options:

  • Private
  • Shared in App

 

Alert Type

Two types of alerts:

Scheduled Alert

  • Runs at specific intervals.

Real-Time Alert

  • Runs continuously in the background.

 

Trigger Condition

Defines when the alert should activate.

Examples:

  • Number of results
  • Number of hosts
  • Number of sources

Options include:

  • Once → trigger once when condition is met
  • For each result → trigger for every matching event

 

Trigger Actions

When the alert condition is satisfied, Splunk can perform actions such as:

  • Send email notification
  • Log the event
  • Run a script
  • Add results to a lookup file
  • Send webhook notifications
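The logic of a scheduled alert can be sketched in Python. All names here (`run_search`, `send_email`) are illustrative stand-ins, not Splunk APIs:

```python
# A minimal sketch of a scheduled alert: run a search, check the trigger
# condition, and fire an action. Everything here is a stand-in for illustration.
def run_search():
    # Stand-in for an SPL search returning failed-login events.
    return [{"user": "admin", "status": "failed"}] * 12

def send_email(message):
    # Stand-in trigger action; a real deployment would notify someone.
    return f"sent: {message}"

results = run_search()
THRESHOLD = 10  # e.g. alert when login failures exceed 10 attempts

alert_fired = len(results) > THRESHOLD
outcome = None
if alert_fired:
    outcome = send_email(f"{len(results)} login failures detected")
```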

 

Simple Summary

  • Scheduling automatically runs reports at defined times.
  • Alerts trigger actions when specific conditions occur in data.
  • These features help automate monitoring, notifications, and reporting in Splunk.

 

Splunk – Knowledge Management

What is Knowledge Management in Splunk?

Knowledge Management (KM) in Splunk is the process of organizing, managing, and sharing knowledge objects so users can better understand and analyze machine data.

It helps transform raw machine data → meaningful information by adding structure, fields, tags, lookups, and data models.

 

Main Goals of Knowledge Management

  1. Share knowledge objects with the correct users or teams.
  2. Normalize event data by maintaining consistent naming conventions.
  3. Remove duplicate or unused objects.
  4. Improve search and reporting performance.
  5. Create data models for Pivot users.

What is a Knowledge Object?

A Knowledge Object is a Splunk configuration or object that adds meaning to raw data.

When creating a knowledge object, it can be:

  • Private → only visible to the creator
  • Shared → accessible by other users or applications

Examples of Knowledge Objects

  • Saved searches
  • Field extractions
  • Lookups
  • Tags
  • Event types
  • Data models
  • Workflow actions

Uses of Knowledge Objects

When using Splunk regularly, many knowledge objects are created.
Without management, this can cause:

  • Duplicate objects
  • Confusing naming
  • Poor data organization

Knowledge Management helps:

  • Organize objects
  • Apply permissions
  • Improve search efficiency
  • Ensure proper usage across teams

 

Types of Knowledge Objects

Fields and Field Extractions

Fields are individual pieces of information extracted from raw data.

Example log:

 
IP=192.168.1.1 user=admin status=failed
 

Extracted fields:

Field  | Value
IP     | 192.168.1.1
user   | admin
status | failed

Two types:

Automatic extraction

  • Splunk extracts fields automatically.

Manual extraction

  • User defines custom fields.

These fields help structure raw logs into searchable data.
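For key=value logs like the example above, field extraction behaves like a regex capture. A minimal Python sketch of that idea:

```python
import re

# Raw log line from the example above.
raw = "IP=192.168.1.1 user=admin status=failed"

# A simple key=value extraction, similar in spirit to Splunk's automatic
# field extraction for key=value pairs.
fields = dict(re.findall(r"(\w+)=(\S+)", raw))

print(fields)
```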

 

Event Types and Transactions

Event Types

Used to group similar events together.

Example:

 
eventtype=login_failure
 

This may include all logs containing failed login attempts.

 

Transactions

A transaction groups related events across time.

Example:

  • User login
  • File access
  • Logout

All actions belong to one user session.

 

Lookups and Workflow Actions

Lookups

Lookups add extra information from external datasets.

Example:

productid | productname
P01       | Laptop

Lookup adds productname to search results.

Sources:

  • CSV files
  • External scripts
  • Databases

 

Workflow Actions

Workflow actions allow interaction with external tools or resources.

Example:

  • Click an IP address → open WHOIS lookup
  • Send data to another system

This helps integrate Splunk with other applications.

 

Tags and Aliases

Tags

Tags are used to group related events or fields.

Example:

Hosts in the New York office may all receive the tag:

 
tag=NY_OFFICE
 

This allows easy filtering.

 

Aliases

Aliases are used when different field names represent the same data.

Example:

Original Field | Alias
clientip       | ipaddress

Now both fields refer to the same information.

This helps normalize data across different sources.

 

Data Models

A Data Model is a structured representation of datasets.

It helps users analyze data without writing SPL queries.

Data models are mainly used by:

  • Pivot tool
  • Dashboards
  • Reports

Example data model structure:

 
Web Data  
   ├── User Activity  
   ├── File Access  
   └── Error Logs
 

Users can easily generate:

  • Tables
  • Charts
  • Reports

 

Knowledge Management in Splunk organizes and manages objects that give meaning to machine data.

Main knowledge objects include:

Knowledge Object | Purpose
Fields           | Extract information from logs
Event Types      | Group similar events
Transactions     | Link related events
Lookups          | Add external data
Tags             | Group related fields/events
Aliases          | Normalize field names
Data Models      | Structured datasets for analysis

 

Knowledge Management helps convert raw machine data into structured, meaningful, and searchable information in Splunk.

 

Splunk – Subsearching


What is Subsearch in Splunk?

A Subsearch is a search inside another search where the result of the inner search becomes the input for the outer search.

It is similar to a Subquery in SQL.

Key idea:

 
Outer Search [ Subsearch ]
 

Important rule:

  • The subsearch runs first
  • Its result is passed to the main (outer) search

Basic Syntax of Subsearch

In Splunk, subsearches are written inside square brackets.

Example structure:

 
main_search [ subsearch ]
 

Flow of execution:

  1. Subsearch runs first
  2. It returns results
  3. Those results are used in the main search

Example Scenario

Goal:

Find events where the file size is equal to the maximum file size and occurred on Sunday.

Steps involved:

  1. Find the maximum file size from logs.
  2. Use that value in the main search.

Step 1: Create the Subsearch

We first calculate the maximum file size.

Example:

 
index=web_application | stats max(bytes)
 

Explanation:

  • stats → statistical command
  • max(bytes) → finds the maximum value of the bytes field

Result example:

max(bytes)
10500

This value will be used by the main query.


Step 2: Add the Subsearch to the Main Search

Now we insert the subsearch into the main query.

Example:

 
index=web_application [ search index=web_application | stats max(bytes) as bytes ]
 

But usually we combine it with filters like day of week.

Example query:

 
index=web_application day="Sunday"   
[ search index=web_application | stats max(bytes) as bytes ]
 

How the Query Works

Execution steps:

1. Splunk runs the subsearch:

 
search index=web_application | stats max(bytes) as bytes
 

Result:

 
bytes=10500
 

2. Splunk inserts that result into the main search.

Final search becomes something like:

 
index=web_application bytes=10500 day="Sunday"
 

3. Splunk returns only events where:

  • bytes = maximum value
  • day = Sunday
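The two-phase execution above can be sketched in Python, with the inner computation feeding the outer filter; the sample events are invented for illustration:

```python
# Hypothetical events: (day, bytes) pairs.
events = [
    ("Sunday", 10500), ("Sunday", 8000),
    ("Monday", 10500), ("Tuesday", 9000),
]

# Phase 1 (subsearch): compute max(bytes) over all events.
max_bytes = max(nbytes for _, nbytes in events)

# Phase 2 (outer search): filter using the subsearch result plus the day filter.
matches = [e for e in events if e[1] == max_bytes and e[0] == "Sunday"]

print(matches)
```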

 

Key Characteristics of Subsearch

Feature         | Description
Execution Order | Subsearch runs first
Syntax          | Written inside [ ]
Result Use      | Used as input for the main search
Similar Concept | SQL subqueries

 

Simple Example

Example:

Find users who logged in successfully and are listed in another dataset.

 
index=login_logs [ search index=user_list | fields user ]
 

Here:

  • Subsearch gets list of users
  • Main search finds login events for those users

 

Advantages of Subsearch

✔ Allows dynamic filtering
✔ Supports complex queries
✔ Useful for data correlation
✔ Reduces manual query updates

 

Subsearch in Splunk is a search nested inside another search where:

  • The inner search runs first
  • Its results are passed to the outer search
  • Written inside square brackets [ ]

Example:

 
index=web_application [ search index=web_application | stats max(bytes) as bytes ]
 

This technique helps create dynamic and advanced searches.

 

Splunk – Search Macros


What are Search Macros?

Search Macros in Splunk are reusable pieces of SPL (Search Processing Language) that can be inserted into multiple searches.

Instead of writing the same long SPL query again and again, you can create a macro and reuse it.

Think of a macro like a function in programming.

Example idea:

 
macro_name(argument)
 

Splunk will replace the macro with the defined SPL query when the search runs.

 

Why Search Macros Are Used

Search macros help to:

  • Reuse common search logic
  • Simplify complex SPL queries
  • Make searches dynamic using arguments
  • Improve search readability and maintenance

Example use cases:

  • Reusable statistics queries
  • Common filters
  • Shared security detection logic

Creating a Search Macro

Steps to create a macro:

  1. Go to Settings
  2. Select Advanced Search
  3. Click Search Macros
  4. Click Add New

This opens the macro creation configuration page.


Example Scenario

Goal:

We want to calculate statistics of file sizes from logs.

Using the field:

 
bytes
 

We want to dynamically calculate:

  • Average file size
  • Maximum file size
  • Minimum file size

Instead of writing separate SPL queries, we create one macro.


Defining the Macro

Macro name example:

 
filestats(1)
 

Explanation:

  • (1) means one argument is required

Argument name:

 
fun
 

Macro definition SPL:

 
stats $fun$(bytes) by file
 

Here:

  • $fun$ is replaced by avg, max, or min
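At search time, Splunk substitutes the argument into the macro body. A Python analogy of that text substitution (the expansion function is illustrative, not a Splunk API):

```python
# Macro definition with a $fun$ placeholder, as in the filestats(1) example.
MACRO_DEFINITION = "stats $fun$(bytes) by file"

def expand_macro(fun):
    """Expand the macro by substituting the argument, as Splunk does at search time."""
    return MACRO_DEFINITION.replace("$fun$", fun)

print(expand_macro("avg"))
```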

Using the Macro in a Search

Macros are called inside backticks, for example: `filestats(avg)`.

Example 1 – Average File Size

Search:

 
index=web_application | `filestats(avg)`
 

Output:

file       | avg(bytes)
file1.html | 520
file2.jpg  | 1020

Example 2 – Maximum File Size

Search:

 
index=web_application | `filestats(max)`
 

Output:

file       | max(bytes)
file1.html | 980
file2.jpg  | 2100

Example 3 – Minimum File Size

Search:

 
index=web_application | `filestats(min)`
 

Output:

file       | min(bytes)
file1.html | 200
file2.jpg  | 500

Key Syntax Rules

Element        | Description
(1)            | Number of macro arguments
$argument$     | Used inside the macro definition
`macro_name()` | Used to call the macro

Example:

 
`filestats(avg)`
 

Advantages of Search Macros

✔ Reusable SPL code
✔ Shorter search queries
✔ Easier maintenance
✔ Dynamic arguments support
✔ Useful for large Splunk environments


Simple Summary

A Search Macro is a reusable SPL block that acts like a function.

Key points:

  • Defined in Settings → Advanced Search → Search Macros
  • Supports dynamic arguments
  • Used inside searches with backticks

Example:

 
`filestats(avg)`
 

This helps reuse complex search logic easily across multiple queries.

 

Splunk – Event Types

 

What is an Event Type?

An Event Type in Splunk is a saved search that categorizes a specific group of events based on defined criteria.

Instead of repeatedly writing the same search query, you can save the search as an event type and reuse it later.

In simple words:

Event Type = Named search that identifies a specific group of events.


Example Scenario

Suppose your logs contain HTTP status codes.

You want to identify all events where:

 
status = 200
 

This means the HTTP request was successful.

Instead of searching every time like this:

 
status=200
 

You can create an event type called:

 
status200
 

Now you can simply search:

 
eventtype=status200
 

Ways to Create Event Types

There are two main methods.


1️⃣ Creating Event Type Using Search

Step-by-step process:

  1. Run a search query.

Example:

 
status=200 day=Wednesday
 
  2. Click Save As
  3. Select Event Type

Then configure the event type.


Event Type Configuration Options

Name

Example:

 
status200
 

Tags (Optional)
Tags help categorize event types.

Example:

 
web_success
 

Color

Used to highlight matching events in search results.

Example:

  • Green → successful requests
  • Red → failed requests

Priority

If multiple event types match the same event, priority decides which event type appears first.


2️⃣ Creating Event Type from Settings

Another method:

Steps:

  1. Go to Settings
  2. Select Event Types
  3. Click New Event Type

Then enter the same search query manually.

Example:

 
status=200 day=Wednesday
 

After saving, the event type becomes available for future searches.


Viewing Event Types

To view existing event types:

  1. Go to Settings
  2. Select Event Types

This page shows:

  • Event type name
  • Search definition
  • Permissions
  • Priority

Using Event Types in Searches

After creating an event type, you can use it in SPL queries.

Example:

 
eventtype=status200
 

Splunk automatically expands it to:

 
status=200 day=Wednesday
 

Mixing Event Types with Other Searches

You can combine event types with other filters.

Example:

 
eventtype=status200 host=web_server
 

Result:

  • Shows successful HTTP events
  • Only from specific hosts

Matching events will appear highlighted with the chosen color.
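An event type behaves like a named, saved predicate applied to events. A Python analogy (the registry below is illustrative, not a Splunk API):

```python
# Event types behave like named, saved search conditions.
# Hypothetical registry mapping event type names to predicates.
EVENT_TYPES = {
    "status200": lambda e: e.get("status") == 200 and e.get("day") == "Wednesday",
}

events = [
    {"status": 200, "day": "Wednesday"},
    {"status": 404, "day": "Wednesday"},
]

# Searching `eventtype=status200` is equivalent to applying the saved predicate.
matches = [e for e in events if EVENT_TYPES["status200"](e)]

print(len(matches))
```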


Advantages of Event Types

✔ Simplifies complex searches
✔ Reusable search logic
✔ Improves search readability
✔ Helps categorize events
✔ Supports tagging and highlighting


Event Type vs Saved Search

Feature      | Event Type           | Saved Search
Purpose      | Categorize events    | Save a full search
Usage        | Used inside searches | Run directly
Highlighting | Yes                  | No

Simple Summary

An Event Type is a saved search that groups events based on defined criteria.

Example:

Search condition:

 
status=200
 

Event Type created:

 
status200
 

Usage:

 
eventtype=status200
 

This helps quickly identify and reuse specific types of events in Splunk searches.

 

Splunk – Basic Chart


 

What is a Basic Chart in Splunk?

Splunk provides powerful visualization features that convert search results into graphical charts. These charts help users analyze trends, patterns, and statistics visually.

Charts are created from search queries that produce numerical or statistical results.

Example data source:

 
index=web_application
 

Example goal:

Find average file size (bytes) and display it as a chart.


Step 1: Generate Statistical Data

Before creating a chart, the search must produce statistical output.

Example SPL query:

 
index=web_application | stats avg(bytes) by file
 

Result appears in the Statistics tab.

Example output:

file       | avg(bytes)
file1.html | 520
file2.jpg  | 1200
file3.png  | 980

This statistical data becomes the input for chart visualization.


Step 2: Create the Chart

Steps:

  1. Run the search query.
  2. Open the Statistics tab to confirm the data.
  3. Click the Visualization tab.

Splunk automatically creates a default chart, usually a Pie Chart.

Example visualization:

  • Each file represented as a slice
  • Slice size based on average bytes

Step 3: Change Chart Type

Splunk allows multiple chart types.

Common chart options include:

Chart Type | Purpose
Pie Chart | Shows proportions
Bar Chart | Compares values
Column Chart | Displays grouped comparisons
Line Chart | Shows trends over time
Area Chart | Displays cumulative trends

Example:

Switching from Pie Chart → Bar Chart makes file sizes easier to compare.


Step 4: Formatting the Chart

Splunk also allows customizing the chart appearance.

Click Format to modify chart settings.

Formatting options include:

Axes Settings

Control labels and scale of:

  • X-axis
  • Y-axis

Legends

Legends describe what each color or value represents.

Example:

 
Blue → file1.html  
Green → file2.jpg
 

Data Labels

Display actual numerical values on the chart.

Example:

 
file1.html → 520 bytes  
file2.jpg → 1200 bytes
 

Chart Orientation

Example:

  • Vertical bar chart
  • Horizontal bar chart

Horizontal charts are often easier to read when many values exist.


Example Full SPL Query

Example search generating chart data:

 
index=web_application | stats avg(bytes) by file
 

Visualization output:

  • Bar chart showing average file size per file

Advantages of Charts in Splunk

✔ Easy data visualization
✔ Identify trends quickly
✔ Improve dashboards
✔ Better data analysis
✔ Useful for reports and presentations


Simple Summary

A Basic Chart in Splunk converts statistical search results into visual graphs.

Steps:

  1. Run search with statistical function
  2. View results in Statistics tab
  3. Open Visualization tab
  4. Choose chart type
  5. Format chart if needed

Example SPL:

 
index=web_application | stats avg(bytes) by file
 

This can be visualized as:

  • Pie chart
  • Bar chart
  • Line chart

 

Splunk – Overlay Chart


What is an Overlay Chart?

An Overlay Chart in Splunk is used to display one chart on top of another so that multiple metrics can be compared in the same visualization.

Typically:

  • One metric is shown as bars or columns
  • Another metric is shown as a line overlay

This helps identify patterns, correlations, and trends between datasets.

Example comparison:

  • File size vs Average file size
  • Sales vs Average sales
  • Network traffic vs Standard deviation

Example Scenario

Suppose we want to analyze file sizes from web application logs across different days of the week.

We calculate:

1️⃣ Total bytes
2️⃣ Average bytes
3️⃣ Standard deviation of bytes

These metrics help understand file size distribution.


Step 1: Create the Base Chart

First create a chart with two metrics.

Example SPL query:

 
index=web_application  
| stats sum(bytes) as total_bytes avg(bytes) as avg_bytes by weekday
 

Example statistical output:

weekday | total_bytes | avg_bytes
Monday | 12000 | 800
Tuesday | 15000 | 900
Wednesday | 17000 | 950

This data can be visualized as a bar chart.


Step 2: Add a Third Variable

To create an overlay chart, add another statistical measure such as standard deviation.

Example updated query:

 
index=web_application  
| stats sum(bytes) avg(bytes) stdev(bytes) by weekday
 

Now the statistics tab contains:

weekday | sum(bytes) | avg(bytes) | stdev(bytes)
Monday | 12000 | 800 | 150
Tuesday | 15000 | 900 | 200
Wednesday | 17000 | 950 | 220

This extra field enables the overlay visualization.
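The three statistics can be sketched in Python to make the grouping explicit; the sample values are illustrative, and Splunk's `stdev` is the sample standard deviation, matching `statistics.stdev`:

```python
# Rough analogue of `stats sum(bytes) avg(bytes) stdev(bytes) by weekday`.
from collections import defaultdict
from statistics import mean, stdev

events = [
    {"weekday": "Monday", "bytes": 700},
    {"weekday": "Monday", "bytes": 900},
    {"weekday": "Tuesday", "bytes": 800},
    {"weekday": "Tuesday", "bytes": 1000},
]

groups = defaultdict(list)
for e in events:
    groups[e["weekday"]].append(e["bytes"])

stats = {
    day: {"sum": sum(v), "avg": mean(v), "stdev": stdev(v)}
    for day, v in groups.items()
}
print(stats["Monday"])  # sum 1600, avg 800, stdev ~141.42
```

The third column (`stdev`) is the one you would pick as the overlay field, while `sum` or `avg` stays on the bars.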


Step 3: Create the Overlay Chart

Steps:

  1. Go to Visualization Tab
  2. Click Format
  3. Select Chart Overlay

A configuration window appears.


Step 4: Configure Overlay Settings

Choose the field to overlay.

Example:

 
stdev(bytes)
 

Other optional settings include:

Option | Purpose
Title | Chart title
Scale | Secondary axis scale
Min/Max | Axis value limits
Interval | Axis spacing

Usually default settings work fine.


Step 5: Final Visualization

The final overlay chart typically shows:

  • Bars → total or average file size
  • Line overlay → standard deviation trend

This allows users to see:

  • Which days have higher variation
  • When file sizes deviate from the average

Example Practical Use Cases

Overlay charts are commonly used for:

Scenario | Example
Performance Monitoring | CPU usage vs average CPU
Network Monitoring | Traffic vs packet loss
Security Monitoring | Login attempts vs anomaly score
Business Analytics | Sales vs average sales

Advantages of Overlay Charts

✔ Compare multiple metrics easily
✔ Identify trends quickly
✔ Detect anomalies
✔ Improve dashboard analytics
✔ Provide deeper data insights


Simple Summary

An Overlay Chart in Splunk displays two charts together in one visualization.

Steps:

1️⃣ Create a chart with statistical values
2️⃣ Add a third metric
3️⃣ Use Visualization → Format → Chart Overlay
4️⃣ Select the overlay field

Example SPL query:

 
index=web_application  
| stats sum(bytes) avg(bytes) stdev(bytes) by weekday
 

This helps compare main metrics with trend indicators in the same chart.

 

 

Splunk – Sparklines


What are Sparklines?

A Sparkline is a small, compact chart that shows trends over time inside a table cell.
Unlike normal charts, sparklines do not display axes or labels. They appear as tiny line graphs that show how a value changes over time.

They are useful for quickly understanding trends or fluctuations in data.

Example idea:

File | Avg Bytes | Trend
file1.html | 800 | ▁▃▅▇
file2.jpg | 1200 | ▂▆▃▇

The small graph indicates how the value changed over time.


Why Sparklines Are Used

Sparklines help:

  • Show data trends inside tables
  • Save dashboard space
  • Quickly identify patterns
  • Compare multiple trends simultaneously

They are commonly used in monitoring dashboards and reports.


Step 1: Select the Fields

First, run a search that produces statistical values.

Example SPL query:

 
index=web_application  
| stats avg(bytes) by file
 

Result example:

file | avg(bytes)
file1.html | 820
file2.jpg | 1200
file3.png | 950

This statistical data will be used to create the sparkline.


Step 2: Create the Sparkline

To generate sparklines, use the sparkline() function with the stats command.

Example query:

 
index=web_application  
| stats sparkline(avg(bytes)) as trend avg(bytes) by file
 

Result:

file | avg(bytes) | trend
file1.html | 820 | (tiny graph)
file2.jpg | 1200 | (tiny graph)
file3.png | 950 | (tiny graph)

Each row now shows a mini trend graph representing changes in average bytes.
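The rendering idea behind a sparkline is simple: scale each value into a small fixed range and map it to a tiny glyph. The sketch below uses Unicode block characters as an illustration (this is not Splunk's renderer, just the concept):

```python
# Compress a numeric series into an inline trend using block characters.
BLOCKS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for flat series
    return "".join(
        BLOCKS[int((v - lo) / span * (len(BLOCKS) - 1))] for v in values
    )

print(sparkline([800, 820, 810, 900]))  # -> ▁▂▁█
```

No axes or labels are produced; only the relative shape of the series survives, which is exactly what a table cell has room for.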


Step 3: Time Range Effect

The sparkline graph depends on the selected time range.

Example:

Time Range: All Time

The sparkline displays the trend for the entire dataset.

Time Range: Last 30 Days

Only data from the last 30 days is used.

Effects:

  • Some files may disappear if they were not present in that period.
  • The sparkline shape changes based on recent trends.

Example Full Query

Example SPL query with sparkline:

 
index=web_application  
| stats sparkline(avg(bytes)) as trend avg(bytes) by file
 

Explanation:

Command | Purpose
stats | Generates statistical results
avg(bytes) | Calculates average file size
sparkline() | Generates mini trend chart

Advantages of Sparklines

✔ Shows trends in a compact format
✔ Works inside tables and dashboards
✔ No extra chart space required
✔ Useful for monitoring and comparisons


Real-World Use Cases

Sparklines are commonly used for:

Use Case | Example
Server Monitoring | CPU usage trend
Network Monitoring | Traffic fluctuation
Security Monitoring | Login attempts over time
Business Analytics | Sales trend per product

Simple Summary

A Sparkline in Splunk is a small trend chart displayed inside a table cell.

Key points:

  • Shows data trends over time
  • Does not include axes or labels
  • Created using sparkline() function

Example SPL:

 
index=web_application  
| stats sparkline(avg(bytes)) as trend avg(bytes) by file
 

This displays mini trend charts for each file.

 

Splunk – Managing Indexes


What is an Index in Splunk?

An Index in Splunk is a storage location where processed machine data is stored and organized for fast searching.

Indexing works similarly to database indexing: data is given structured references so searches can be executed quickly.

When data enters Splunk:

  1. The Indexer processes the data
  2. Data is stored in an index
  3. Searches retrieve data from those indexes

Default Indexes in Splunk

When Splunk is installed, it automatically creates three default indexes.

Index Name | Purpose
main | Default index where most ingested data is stored
_internal | Stores Splunk system logs and performance metrics
_audit | Stores user activity and audit logs

Example Search Using an Index

Example SPL:

 
index=main
 

This searches events stored in the main index.

Example:

 
index=_internal
 

This searches Splunk system logs.


Role of the Indexer

The Indexer component in Splunk is responsible for:

  • Processing incoming data
  • Creating indexes
  • Storing events in indexed format
  • Enabling fast search operations

So the flow is:

 
Data → Indexer → Index → Search
 
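The flow above can be sketched as a toy in-memory model (an analogy only; real indexes are on-disk bucket structures, and the index names here are illustrative):

```python
# Toy sketch of Data -> Indexer -> Index -> Search: events are routed
# into named indexes, and a search reads from one index at a time.
indexes = {"main": [], "_internal": []}

def ingest(event, index="main"):
    """The 'indexer' step: store the event in the target index."""
    indexes[index].append(event)

def search(index, **filters):
    """Return events from one index matching all field filters."""
    return [e for e in indexes[index]
            if all(e.get(k) == v for k, v in filters.items())]

ingest({"host": "web01", "status": 200})
ingest({"component": "Metrics"}, index="_internal")

print(search("main", host="web01"))  # events from the main index only
```

Scoping every search to an index is why `index=main` and `index=_internal` return completely different data.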

Checking Existing Indexes

You can view available indexes in Splunk.

Steps:

  1. Login to Splunk Web Interface
  2. Go to Settings
  3. Click Indexes

This displays a list of:

  • Default indexes
  • Custom indexes
  • Storage usage
  • Data size

Creating a New Index

Sometimes you may want to separate different types of data into different indexes.

Example reasons:

  • Security logs
  • Web application logs
  • Network logs

Steps to create an index:

  1. Go to Settings
  2. Click Indexes
  3. Click New Index

You will see a configuration screen.


Required Information

Field | Description
Index Name | Name of the new index
Storage Path | Location where data is stored
Max Size | Maximum storage allocation
Data Retention | Time period for storing data

Example:

 
index_web_app
 

Assigning Data to an Index

After creating an index, new data must be configured to use that index.

Steps:

  1. Go to Settings
  2. Select Data Inputs
  3. Choose Files & Directories
  4. Select the data source
  5. Assign the new index

Example:

 
index = index_web_app
 

Now all events from that data source will be stored in the new index.


Example Search with Custom Index

Example SPL query:

 
index=index_web_app
 

This searches only the data stored in the custom index.


Advantages of Using Multiple Indexes

✔ Better data organization
✔ Faster searches
✔ Easier data management
✔ Separate storage for different log types
✔ Improved security control

Example:

Index | Data Type
web_logs | Website logs
security_logs | Security events
system_logs | Server logs

Simple Summary

An Index in Splunk is a structured storage location for processed machine data.

Key points:

  • Indexing improves search performance
  • Default indexes include main, _internal, and _audit
  • New indexes can be created for better data management
  • Data sources can be assigned to specific indexes

Example SPL:

 
index=main
 

This searches events stored in the main index.

 

Splunk – Calculated Fields


 

What are Calculated Fields?

Calculated Fields in Splunk are new fields created by applying calculations or transformations to existing fields in events.

They are useful when you want to:

  • Modify existing data
  • Convert values to another unit
  • Extract part of a field
  • Create new derived information

These calculations are usually done using the eval command in SPL.


Why Calculated Fields Are Used

Calculated fields help:

  • Perform mathematical calculations
  • Transform data formats
  • Extract specific parts of a field
  • Create new analytical fields

Example uses:

Original Field | Calculated Field
bytes | byte_in_KB
date_wday | short_day

Example Scenario

Suppose the web_application log contains the following fields:

bytes | date_wday
2048 | Wednesday
4096 | Thursday
1024 | Monday

Goals:

1️⃣ Convert bytes → KB
2️⃣ Show only the first three characters of the weekday


Using the eval Function

Splunk uses the eval command to create calculated fields.

Syntax

 
eval new_field = calculation
 

Example 1 – Convert Bytes to KB

To convert bytes to KB, divide by 1024 (to reach GB you would divide by 1024 three times):

 
eval byte_in_KB = bytes / 1024
 

Full search example:

 
index=web_application  
| eval byte_in_KB = bytes/1024
 

Result:

bytes | byte_in_KB
2048 | 2
4096 | 4
1024 | 1

Example 2 – Extract First 3 Letters of Weekday

To extract the first three characters from the date_wday field, use the substr() function.

 
eval short_day = substr(date_wday,1,3)
 

Example search:

 
index=web_application  
| eval short_day = substr(date_wday,1,3)
 

Result:

date_wdayshort_day
WednesdayWed
ThursdayThu
MondayMon

Combining Both Calculations

You can apply multiple calculated fields in the same search.

Example SPL:

 
index=web_application  
| eval byte_in_KB = bytes/1024  
| eval short_day = substr(date_wday,1,3)
 

Result:

bytes | byte_in_KB | date_wday | short_day
2048 | 2 | Wednesday | Wed
4096 | 4 | Thursday | Thu
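The two eval steps can be mirrored in plain Python to show what they compute; note that bytes/1024 yields kilobytes, so the derived field is named byte_in_KB here. The events are illustrative, and this is an analogy rather than how Splunk executes eval:

```python
# Derive new fields from existing ones, like two chained eval commands.
events = [
    {"bytes": 2048, "date_wday": "Wednesday"},
    {"bytes": 4096, "date_wday": "Thursday"},
]

for e in events:
    e["byte_in_KB"] = e["bytes"] / 1024   # eval byte_in_KB = bytes/1024
    e["short_day"] = e["date_wday"][:3]   # eval short_day = substr(date_wday,1,3)

print(events[0])
# {'bytes': 2048, 'date_wday': 'Wednesday', 'byte_in_KB': 2.0, 'short_day': 'Wed'}
```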

Displaying Calculated Fields

After creating calculated fields:

  1. Click All Fields
  2. Select the new fields (e.g., byte_in_KB, short_day)
  3. They will appear in the search results table

Common Functions Used in Calculated Fields

Function | Purpose
eval | Create calculated fields
substr() | Extract part of a string
round() | Round numeric values
len() | Get string length
tonumber() | Convert string to number

Example:

 
eval rounded_value = round(bytes/1024,2)
 

Advantages of Calculated Fields

✔ Transform raw data into meaningful values
✔ Perform calculations during search
✔ Create reusable analytical fields
✔ Improve data interpretation


Simple Summary

A Calculated Field is a new field created from existing fields using calculations.

It is usually created using the eval command.

Example:

 
index=web_application  
| eval byte_in_KB = bytes/1024  
| eval short_day = substr(date_wday,1,3)
 

This produces new fields:

  • byte_in_KB → file size in kilobytes
  • short_day → shortened weekday name

Splunk – Tags

 

 

What are Tags in Splunk?

Tags are labels used to group specific field–value combinations in Splunk events.

They allow you to categorize events so that they can be searched easily with a single keyword instead of writing complex queries.

Tags are part of Splunk Knowledge Objects.


Why Tags Are Used

Tags help to:

  • Organize events
  • Simplify searches
  • Group similar field values
  • Improve data classification

Example:

Instead of searching multiple status codes:

 
status=503 OR status=505
 

You can create a tag called:

 
server_error
 

Then simply search:

 
tag=server_error
 

Fields That Can Be Tagged

Tags can be assigned to different Splunk fields such as:

Field Type | Example
host | server01
source | web_log
sourcetype | apache_access
event type | status events
field-value pairs | status=503

Example Scenario

Suppose we want to group server error status codes.

Error status codes:

 
503  
505
 

We assign them the tag:

 
server_error
 

So:

Field | Value | Tag
status | 503 | server_error
status | 505 | server_error

Steps to Create Tags

Step 1 – Expand Event

  1. Run a search
  2. Expand an event
  3. View the field list

Example field:

 
status=503
 

Step 2 – Edit Tag

  1. Locate the field
  2. Click Actions
  3. Select Edit Tags

Step 3 – Add Tag Name

Enter the tag name.

Example:

 
server_error
 

Apply this tag to:

 
status=503  
status=505
 

You must repeat the step for each value.


Searching Using Tags

Once tags are created, searching becomes easier.

Example Search

 
tag=server_error
 

This will return events containing:

 
status=503  
status=505
 

The events are returned even though the search does not mention those status codes explicitly.


Example Comparison

Without Tag

 
status=503 OR status=505
 

With Tag

 
tag=server_error
 

Tags make the query simpler and reusable.
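The mechanism is a one-to-many mapping from a label to field-value pairs. A rough Python sketch (an analogy, with illustrative events):

```python
# A tag expands to the field-value pairs it labels, so `tag=server_error`
# behaves like `status=503 OR status=505`.
tags = {"server_error": [("status", 503), ("status", 505)]}

def search_by_tag(events, tag):
    pairs = tags[tag]
    return [e for e in events if any(e.get(f) == v for f, v in pairs)]

events = [{"status": 200}, {"status": 503}, {"status": 505}]
print(search_by_tag(events, "server_error"))  # the two error events
```

Adding a new error code later only means extending the tag definition; every search that uses `tag=server_error` picks it up automatically.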


Tags vs Fields

Feature | Tags | Fields
Purpose | Group field values | Store event data
Usage | Simplify searches | Extract event information
Example | server_error | status=503

Advantages of Tags

✔ Simplifies complex searches
✔ Groups multiple field values
✔ Improves data categorization
✔ Helps in knowledge management
✔ Useful for dashboards and reports


Simple Summary

A Tag in Splunk is a label assigned to field-value combinations to group similar events.

Example:

 
status=503  
status=505
 

Tagged as:

 
server_error
 

Search query:

 
tag=server_error
 

This returns all events that match those tagged values.

 

What are Apps in Splunk?

Splunk Apps are packages that extend the functionality of Splunk.
They contain configurations, dashboards, reports, searches, field extractions, and visualizations designed for specific use cases.

Apps help users analyze specific types of data quickly without building everything from scratch.


Components of a Splunk App

A Splunk App may contain several components:

Component | Description
Dashboards | Visual displays of data
Reports | Saved searches with results
Alerts | Notifications triggered by conditions
Field Extractions | Extract fields from raw data
Data Models | Structured data representation
Lookups | External reference data

These components work together to provide a complete monitoring or analysis solution.


Examples of Popular Splunk Apps

Some commonly used apps include:

App Name | Purpose
Splunk App for AWS | Monitor Amazon Web Services
Splunk App for Windows Infrastructure | Monitor Windows systems
Splunk App for Unix and Linux | Monitor Linux servers
Splunk Security Essentials | Security analytics
Splunk IT Service Intelligence (ITSI) | IT service monitoring

These apps are usually downloaded from Splunkbase, the official marketplace.


Installing Splunk Apps

There are two common ways to install apps.

Method 1 – From Splunkbase

Steps:

  1. Go to Apps → Find More Apps
  2. Search for the desired app
  3. Click Install
  4. Provide Splunkbase credentials

Method 2 – Manual Installation

Steps:

  1. Download the app from Splunkbase
  2. Go to Apps → Manage Apps
  3. Click Install App from File
  4. Upload the .spl or .tgz package

Managing Installed Apps

You can manage apps using:

 
Settings → Manage Apps
 

From here you can:

  • Enable or disable apps
  • Upgrade apps
  • Configure permissions
  • Remove apps

Example: Using an App

Suppose you install Splunk App for AWS.

The app automatically provides:

  • AWS dashboards
  • Cloud monitoring reports
  • Prebuilt searches
  • Alerts for cloud resources

This saves time compared to building everything manually.


Advantages of Splunk Apps

✔ Ready-to-use dashboards and reports
✔ Faster deployment for monitoring systems
✔ Easy integration with external platforms
✔ Customizable for specific business needs
✔ Extend Splunk capabilities


Types of Splunk Apps

Type | Description
Technology Add-ons (TA) | Data collection and field extraction
Visualization Apps | Custom dashboards
Security Apps | Security monitoring
Infrastructure Apps | Server and cloud monitoring

Simple Summary

A Splunk App is a packaged extension that adds dashboards, reports, alerts, and configurations to Splunk.

Example:

 
Apps → Install → Use dashboards & reports
 

They help users analyze specific data sources quickly and efficiently.

 

Splunk – Removing Data

What Does Removing Data Mean in Splunk?

In Splunk, removing data means deleting indexed events from the system so they are no longer searchable.

However, Splunk does not normally delete individual events easily because data is stored in indexed buckets for fast searching. Instead, data removal usually happens by:

  • Deleting specific events using commands
  • Deleting entire indexes
  • Using data retention policies

Methods to Remove Data in Splunk

There are mainly three ways to remove data.

Method | Description
Using the delete command | Removes specific search results
Deleting an index | Removes all events stored in that index
Data retention policy | Automatically removes old data

1. Removing Events Using the delete Command

Splunk provides a delete command that marks events as deleted.

Example Search

 
index=main host=server1 | delete
 

This command:

  • Finds events from host=server1
  • Marks them as deleted

Important points:

  • Events are not physically removed immediately
  • They are hidden from search results

To use the delete command, your user role must have the can_delete capability.


2. Removing Data by Deleting an Index

If you want to remove all data, you can delete the entire index.

Steps:

  1. Go to Settings
  2. Click Indexes
  3. Select the index
  4. Delete or remove it

Example index:

 
index_web_logs
 

Deleting the index removes all events stored in that index.


3. Removing Data Using Data Retention Policies

This is the most common method in production environments.

Splunk automatically deletes old data based on:

  • Maximum storage size
  • Data retention period

Example configuration:

Setting | Example
Maximum index size | 500 GB
Data retention | 30 days

When the limit is reached:

  • Splunk automatically removes older data buckets
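Time-based retention can be sketched as a cutoff filter. This is an illustration of the policy's effect only; Splunk actually ages out whole index buckets rather than individual events:

```python
# Drop events older than the retention window.
from datetime import datetime, timedelta

def apply_retention(events, now, retention_days=30):
    cutoff = now - timedelta(days=retention_days)
    return [e for e in events if e["time"] >= cutoff]

now = datetime(2024, 2, 1)
events = [
    {"time": datetime(2023, 12, 1), "msg": "old"},
    {"time": datetime(2024, 1, 20), "msg": "recent"},
]
print(apply_retention(events, now))  # only the recent event survives
```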

Example Scenario

Suppose an index contains web logs.

Search:

 
index=web_logs status=404
 

If you want to remove these events:

 
index=web_logs status=404 | delete
 

These events will no longer appear in searches.


Important Notes

⚠ Splunk discourages frequent manual deletion because:

  • It can affect index integrity
  • It may cause performance issues

Instead, organizations usually rely on:

  • Retention policies
  • Index rotation

Advantages of Controlled Data Removal

✔ Saves storage space
✔ Maintains system performance
✔ Ensures compliance with data policies
✔ Prevents unnecessary data accumulation


Simple Summary

Removing data in Splunk means deleting events from indexes.

Main methods:

1️⃣ Using delete command

 
index=main host=server1 | delete
 

2️⃣ Deleting an index

3️⃣ Using automatic data retention policies

 

 

Splunk – Custom Chart

 

What is a Custom Chart in Splunk?

A Custom Chart in Splunk is a user-defined visualization created to display data in a specific way that is not available in the default chart options.

Splunk provides basic charts such as:

  • Pie chart
  • Bar chart
  • Line chart
  • Area chart
  • Column chart

But sometimes organizations need specialized visualizations, which can be created using custom chart modules or visualization apps.


Why Custom Charts Are Used

Custom charts are used when:

  • Default Splunk charts are not sufficient
  • Special visualization is required
  • Advanced dashboards are needed
  • Interactive analytics is required

Examples include:

  • Heat maps
  • Radar charts
  • Sankey diagrams
  • Advanced timelines

Creating a Custom Chart

To create a custom chart in Splunk:

Step 1 – Run a Search Query

First run a search that produces statistical data.

Example:

 
index=web_application  
| stats avg(bytes) by file
 

This produces a table like:

file | avg(bytes)
file1 | 2048
file2 | 4096

Step 2 – Open Visualization Tab

After the results appear:

  1. Click Visualization
  2. Choose Chart Type

Splunk will display a default chart such as pie chart or bar chart.


Step 3 – Customize the Chart

Use the Format option to modify:

Setting | Purpose
Axis labels | Label X and Y axes
Legend | Display chart legends
Data labels | Show values on chart
Colors | Customize appearance
Chart title | Add meaningful title

Using Custom Visualization Apps

Splunk also supports custom visualization apps from Splunkbase.

Examples:

Visualization | Use Case
Sankey Diagram | Data flow analysis
Heat Map | Density analysis
Bubble Chart | Multivariable comparison
Gauge Chart | Performance monitoring

These visualizations can be installed via:

 
Apps → Find More Apps → Install Visualization
 

Example Custom Chart Query

Example SPL:

 
index=web_application  
| stats avg(bytes) by date_wday
 

Visualization result:

Day | Avg Bytes
Monday | 3000
Tuesday | 2500
Wednesday | 3500

This can be displayed as:

  • Bar chart
  • Line chart
  • Custom visualization

Advantages of Custom Charts

✔ Better data visualization
✔ More meaningful dashboards
✔ Supports advanced analytics
✔ Improves decision making
✔ Highly customizable


Simple Summary

A Custom Chart in Splunk is a specialized visualization created using search results and advanced chart configuration.

Steps:

1️⃣ Run search query
2️⃣ Open Visualization tab
3️⃣ Choose chart type
4️⃣ Customize using Format options

Example SPL:

 
index=web_application  
| stats avg(bytes) by file
 
 

Splunk – Monitor Files


 


 

What is File Monitoring in Splunk?

File Monitoring in Splunk means tracking files or directories for new data and automatically indexing that data.

When new data is written to a monitored file (such as log files), Splunk reads the new entries and adds them to its index so they can be searched and analyzed.

This is commonly used for application logs, server logs, and system logs.


How Splunk Monitors Files

Splunk uses a monitor input to watch files or directories.

Process:

  1. User specifies file or directory path
  2. Splunk reads the file
  3. When new data appears, Splunk indexes it automatically

Example:

 
/var/log/apache/access.log
 

Whenever new logs appear in this file, Splunk captures and indexes them.
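The core trick behind monitoring is remembering how far into the file you have already read, then picking up only what was appended since. A minimal sketch of that idea (not Splunk's implementation; the log lines are illustrative):

```python
# Tail-style monitoring: track a read offset and index only new lines.
import os
import tempfile

def read_new_lines(path, offset):
    """Return (new_lines, new_offset) for data appended after `offset`."""
    with open(path) as f:
        f.seek(offset)
        return f.read().splitlines(), f.tell()

# Simulate a growing log file
path = os.path.join(tempfile.mkdtemp(), "access.log")
with open(path, "w") as f:
    f.write("GET /index.html\n")

lines, pos = read_new_lines(path, 0)   # first pass reads the existing line
with open(path, "a") as f:
    f.write("POST /login\n")           # new data arrives

new, pos = read_new_lines(path, pos)   # only the appended line is read
print(new)  # ['POST /login']
```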


Monitoring Directories

Instead of monitoring a single file, you can monitor a complete directory.

Example:

 
/var/log/apache/
 

If the directory contains subdirectories, Splunk can also monitor them recursively.

Example structure:

 
/var/log/apache/  
      access.log  
      error.log  
      archive/  
           old_logs.log
 

Splunk can monitor all files inside this directory tree.


Monitoring Network or Shared Directories

Splunk can also monitor:

  • Mounted directories
  • Shared folders
  • Network file systems

Condition:

✔ Splunk must have read permission for the directory.


Using Whitelist and Blacklist

Sometimes you only want to monitor specific files.

Splunk allows filtering using:

Option | Purpose
Whitelist | Include only specific files
Blacklist | Exclude specific files

Example:

Whitelist:

 
*.log
 

Blacklist:

 
debug.log
 

This means Splunk monitors all .log files except debug.log.
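Whitelist/blacklist matching is glob-pattern filtering. A small sketch of the same rule using Python's `fnmatch` (an analogy; the filenames are illustrative):

```python
# Mirror the `*.log` whitelist and `debug.log` blacklist above.
from fnmatch import fnmatch

def monitored(filename, whitelist="*.log", blacklist="debug.log"):
    return fnmatch(filename, whitelist) and not fnmatch(filename, blacklist)

files = ["access.log", "error.log", "debug.log", "notes.txt"]
print([f for f in files if monitored(f)])  # ['access.log', 'error.log']
```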


Important Behavior

If you disable or delete a monitor input:

  • Splunk stops checking the file for new data
  • But already indexed data remains searchable

So Splunk does not delete indexed data automatically.


Adding Files to Monitor (Using Splunk Web)

Steps:

Step 1 – Open Add Data

Go to:

 
Splunk Home → Add Data
 

Step 2 – Choose Monitor

Select:

 
Add Data → Monitor
 

This allows monitoring:

  • Files
  • Directories
  • Network paths

Step 3 – Select File or Directory

Example:

 
/var/log/web_application.log
 

Splunk automatically detects:

  • Source type
  • Timestamp
  • File structure

Step 4 – Confirm Settings

Splunk automatically configures:

  • Source type
  • Host
  • Index

Then click Submit.


Viewing Monitored Data

After configuration, Splunk begins indexing events.

Example search:

 
index=main
 

Result:

Timestamp | Event
10:01 | GET /index.html
10:02 | POST /login

If new log entries appear, Splunk updates the results automatically.


Common Logs Monitored by Splunk

Splunk often monitors logs such as:

Log Type | Example
Web server logs | Apache, Nginx
Application logs | Java, .NET
System logs | Linux syslog
Security logs | Firewall logs

Advantages of File Monitoring

✔ Real-time log monitoring
✔ Automatic data indexing
✔ Easy integration with applications
✔ Supports large-scale log analysis


Simple Summary

File Monitoring in Splunk allows the system to watch files or directories and index new data automatically.

Steps:

1️⃣ Add data
2️⃣ Choose Monitor
3️⃣ Select file or directory
4️⃣ Splunk starts indexing new data

Example monitored file:

 
/var/log/web_access.log
 

Whenever new logs appear, Splunk captures and indexes them for analysis.

 

1. Splunk – Sort Command

What it Does

The sort command arranges search results in ascending or descending order based on one or more fields.

It works like ORDER BY in SQL.

Syntax

 
... | sort <field>
 

Examples

Sort results in ascending order

 
index=web_application | sort bytes
 

This sorts events by bytes from smallest to largest.

Sort results in descending order

 
index=web_application | sort -bytes
 

The - prefix means descending order.

Example Result

file | bytes
file1 | 200
file2 | 400
file3 | 800
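The behavior maps directly onto Python's `sorted()`; the rows below are made up for illustration:

```python
# Python analogue of `| sort bytes` (ascending) and `| sort -bytes` (descending).
rows = [
    {"file": "file2", "bytes": 400},
    {"file": "file1", "bytes": 200},
    {"file": "file3", "bytes": 800},
]

asc = sorted(rows, key=lambda r: r["bytes"])                  # | sort bytes
desc = sorted(rows, key=lambda r: r["bytes"], reverse=True)   # | sort -bytes

print([r["file"] for r in asc])  # ['file1', 'file2', 'file3']
```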

2. Splunk – Top Command

What it Does

The top command finds the most frequent values in a field.

It shows:

  • Top values
  • Count
  • Percentage

Syntax

 
... | top <field>
 

Example

 
index=web_application | top file
 

This shows which files appear most frequently in the logs.

Example Result

file | count | percent
index.html | 120 | 40%
login.html | 80 | 27%
about.html | 50 | 17%

You can also limit the results.

 
index=web_application | top limit=5 file
 

This shows top 5 most frequent values.
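The same count-and-rank logic can be sketched with `collections.Counter`; the file names and counts below are illustrative, not real log data:

```python
# Python analogue of `top file`: most frequent values with count and percent.
from collections import Counter

files = ["index.html"] * 3 + ["login.html"] * 2 + ["about.html"]
counts = Counter(files)
total = sum(counts.values())

for value, count in counts.most_common(5):   # like `top limit=5 file`
    print(value, count, round(100 * count / total, 1))
```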


3. Splunk – Stats Command

What it Does

The stats command performs statistical calculations on fields.

It is one of the most powerful commands in Splunk.

Syntax

 
... | stats <function>(field)
 

Common Statistical Functions

Function | Purpose
count | Count events
sum | Add values
avg | Average
max | Maximum value
min | Minimum value

Examples

Count events

 
index=web_application | stats count
 

Average file size

 
index=web_application | stats avg(bytes)
 

Average bytes by file

 
index=web_application | stats avg(bytes) by file
 

Example Result

file | avg(bytes)
file1 | 400
file2 | 650
file3 | 900

Key Differences

Command | Purpose | Example Use
sort | Order search results | Sort logs by size
top | Find most frequent values | Top visited pages
stats | Perform calculations | Average response time

Example Using All Three Together

 
index=web_application  
| stats avg(bytes) by file  
| sort -avg(bytes)
 

What happens here:

  1. stats → calculates average bytes per file
  2. sort → sorts results in descending order
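The two-step pipeline can be sketched in Python to show the data flow between the commands (an analogy on made-up events):

```python
# stats avg(bytes) by file, then sort descending on the average.
from collections import defaultdict

events = [
    {"file": "a", "bytes": 100},
    {"file": "a", "bytes": 300},
    {"file": "b", "bytes": 900},
]

groups = defaultdict(list)
for e in events:
    groups[e["file"]].append(e["bytes"])

avg = {f: sum(v) / len(v) for f, v in groups.items()}          # stats step
ranked = sorted(avg.items(), key=lambda kv: kv[1], reverse=True)  # sort step
print(ranked)  # [('b', 900.0), ('a', 200.0)]
```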

Simple Summary

Command | What it Does
sort | Arranges results in ascending/descending order
top | Finds most frequent values
stats | Performs statistical calculations

 

 
