geostats - Splunk Documentation (2025)

Description

Use the geostats command to generate statistics to display geographic data and summarize the data on maps.

The command generates statistics that are clustered into geographical bins to be rendered on a world map. The events are clustered based on latitude and longitude fields in the events. Statistics are then evaluated on the generated clusters. The statistics can be grouped, or split, by fields using a BY clause.

For map rendering and zooming efficiency, the geostats command generates clustered statistics at a variety of zoom levels in a single search, and the visualization selects among them as you zoom. The number of zoom levels is controlled by the binspanlat, binspanlong, and maxzoomlevel options. The binspanlat and binspanlong options set the initial, lowest-zoom granularity. At each zoom level, the number of bins is doubled in both dimensions, for a total of 4 times as many bins for each zoom in.
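For example, here is a minimal sketch, assuming your events already contain the default lat and lon fields, that halves the initial bin size and raises the maximum zoom level (the values are illustrative, not recommendations):

... | geostats binspanlat=11.25 binspanlong=22.5 maxzoomlevel=12 count

With these values the lowest zoom level uses a 16x16 grid instead of the default 8x8 grid, and up to 13 zoom levels (0-12) are generated.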

Syntax

The required syntax is in bold.

geostats
[ translatetoxy=<bool> ]
[ latfield=<string> ]
[ longfield=<string> ]
[ globallimit=<int> ]
[ locallimit=<int> ]
[ outputlatfield=<string> ]
[ outputlongfield=<string> ]
[ binspanlat=<float> binspanlong=<float> ]
[ maxzoomlevel=<int> ]
<stats-agg-term>...
[ <by-clause> ]

Required arguments

stats-agg-term
Syntax: <stats-func> ( <evaled-field> | <wc-field> ) [AS <wc-field>]
Description: A statistical aggregation function. See Stats function options. The function can be applied to an eval expression, or to a field or set of fields. Use the AS clause to place the result into a new field with a name that you specify. You can use wildcard characters in field names. For more information on eval expressions, see Types of eval expressions in the Search Manual.

Optional arguments

binspanlat
Syntax: binspanlat=<float>
Description: The size of the bins in latitude degrees at the lowest zoom level. If you set binspanlat lower than the default value, the visualizations on the map might not render.
Default: 22.5. If the default values for binspanlat and binspanlong are used, a grid size of 8x8 is generated.
binspanlong
Syntax: binspanlong=<float>
Description: The size of the bins in longitude degrees at the lowest zoom level. If you set binspanlong lower than 33, the visualizations on the map might not render.
Default: 45.0. If the default values for binspanlat and binspanlong are used, a grid size of 8x8 is generated.
by-clause
Syntax: BY <field>
Description: The name of the field to group by.
globallimit
Syntax: globallimit=<int>
Description: Controls the number of named categories to add to each pie chart. There is one additional category called "OTHER" under which all other split-by values are grouped. Setting globallimit=0 removes all limits and all categories are rendered. Currently, grouping into "OTHER" works intuitively only for count and other additive statistics.
Default: 10
locallimit
Syntax: locallimit=<int>
Description: Specifies the limit for series filtering. When you set locallimit=N, the top N values are filtered based on the sum of each series. If locallimit=0, no filtering occurs.
Default: 10
latfield
Syntax: latfield=<field>
Description: Specify a field from the pre-search events that represents the latitude coordinates to use in your analysis.
Default: lat
longfield
Syntax: longfield=<field>
Description: Specify a field from the pre-search events that represents the longitude coordinates to use in your analysis.
Default: lon
maxzoomlevel
Syntax: maxzoomlevel=<int>
Description: The maximum number of levels to create in the quadtree.
Default: 9. Specifies that 10 zoom levels are created, 0-9.
outputlatfield
Syntax: outputlatfield=<string>
Description: Specify a name for the latitude field in your geostats output data.
Default: latitude
outputlongfield
Syntax: outputlongfield=<string>
Description: Specify a name for the longitude field in your geostats output data.
Default: longitude
translatetoxy
Syntax: translatetoxy=<bool>
Description: If true, geostats produces one result per binned location. This mode is appropriate for rendering on a map. If false, geostats produces one result per category (or tuple of a multiply split dataset) per binned location; essentially, the data is broken down by category. This mode cannot be rendered on a map.
Default: true
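As an illustration of the split-by options, the following sketch (the latitude, longitude, and category field names are placeholders, not part of the command) caps each rendered pie chart at five named slices plus OTHER:

... | geostats latfield=store_lat longfield=store_long globallimit=5 count by product_category

Setting translatetoxy=false on the same search would instead return one row per product_category per bin, which is useful for inspecting the underlying numbers but cannot be rendered on a map.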

Stats function options

stats-func
Syntax: The syntax depends on the function that you use. See Usage.
Description: Statistical and charting functions that you can use with the geostats command. Each time you invoke the geostats command, you can use one or more functions.
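For example, you can apply several functions in a single invocation. The following sketch reuses the eventlat, eventlong, and rating fields from the basic examples later in this topic:

... | geostats latfield=eventlat longfield=eventlong count avg(rating) max(rating)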

Usage

To display the information on a map, you must run a reporting search with the geostats command.

If you are using a lookup command before the geostats command, see Optimizing your lookup search.

Supported functions

You can use a wide range of functions with the geostats command. For general information about using functions, see Statistical and charting functions.

  • For a list of statistical functions by category, see Function list by category.
  • For an alphabetical list of statistical functions, see Alphabetical list of functions.

Memory and geostats search performance

A pair of limits.conf settings strikes a balance between the performance of geostats searches and the amount of memory they use during the search process, in RAM and on disk. If your geostats searches are consistently slow to complete, you can adjust these settings to improve their performance, but at the cost of increased search-time memory usage, which can lead to search failures.

For more information, see Memory and stats search performance in the Search Manual.

Basic examples

1. Use the default settings and calculate the count

Cluster events by the default latitude and longitude fields, "lat" and "lon" respectively, and calculate the count of the events.

... | geostats count

2. Specify the latfield and longfield and calculate the average of a field

Compute the average rating for each gender after clustering the events by the "eventlat" and "eventlong" values.

... | geostats latfield=eventlat longfield=eventlong avg(rating) by gender

Extended examples

3. Count each product sold by a vendor and display the information on a map

This example uses the sample data from the Search Tutorial. To try this example on your own Splunk instance, you must download the sample data and follow the instructions to get the tutorial data into Splunk. Use the time range All time when you run the search.


In addition, this example uses several lookup files that you must download (prices.csv.zip and vendors.csv.zip) and unzip. You must complete the steps in the Enabling field lookups section of the tutorial for both the prices.csv and the vendors.csv files. The steps in the tutorial are specific to the prices.csv file. For the vendors.csv file, use the name vendors_lookup for the lookup definition. Skip the step in the tutorial that makes the lookups automatic.

This search uses the stats command to narrow down the number of events that the lookup and geostats commands need to process.

Use the following search to count each product sold by a vendor and display the information on a map.

sourcetype=vendor_sales | stats count by Code VendorID | lookup prices_lookup Code OUTPUTNEW product_name | table product_name VendorID | lookup vendors_lookup VendorID | geostats latfield=VendorLatitude longfield=VendorLongitude count by product_name

  • In this example, sourcetype=vendor_sales is associated with a log file that is included in the Search Tutorial sample data. This log file contains vendor information that looks like this:
[10/Apr/2018:18:24:02] VendorID=5036 Code=B AcctID=6024298300471575
  • The vendors_lookup is used to output all the fields in the vendors.csv file that match the VendorID in the vendor_sales.log file. The fields in the vendors.csv file are: Vendor, VendorCity, VendorID, VendorLatitude, VendorLongitude, VendorStateProvince, and VendorCountry.
  • The prices_lookup is used to match the Code field in each event to a product_name in the table.

In this search, the CSV files are uploaded and the lookups are defined but are not automatic.

This search produces a table that is displayed on the Statistics tab.


Click the Visualization tab. The results are plotted on a world map. There is a pie chart for each vendor in the results. The larger the pie chart, the larger the count value.

In this screenshot, the mouse pointer is over the pie chart for a region in the northeastern part of the United States. A popup information box displays the latitude and longitude for the vendor, as well as a count of each product that the vendor sold.

You can zoom in to see more details on the map.

See also

Commands
iplocation
stats
xyseries
Reference information
Mapping data in Dashboards and Visualizations

Last modified on 15 April, 2021


This documentation applies to the following versions of Splunk® Enterprise: 7.1.0, 7.1.1, 7.1.2, 7.1.3, 7.1.4, 7.1.5, 7.1.6, 7.1.7, 7.1.8, 7.1.9, 7.1.10, 7.2.0, 7.2.1, 7.2.2, 7.2.3, 7.2.4, 7.2.5, 7.2.6, 7.2.7, 7.2.8, 7.2.9, 7.2.10, 7.3.0, 7.3.1, 7.3.2, 7.3.3, 7.3.4, 7.3.5, 7.3.6, 7.3.7, 7.3.8, 7.3.9, 8.0.0, 8.0.1, 8.0.2, 8.0.3, 8.0.4, 8.0.5, 8.0.6, 8.0.7, 8.0.8, 8.0.9, 8.0.10, 8.1.0, 8.1.1, 8.1.2, 8.1.3, 8.1.4, 8.1.5, 8.1.6, 8.1.7, 8.1.8, 8.1.9, 8.1.10, 8.1.11, 8.1.12, 8.1.13, 8.1.14, 8.2.0, 8.2.1, 8.2.2, 8.2.3, 8.2.4, 8.2.5, 8.2.6, 8.2.7, 8.2.8, 8.2.9, 8.2.10, 8.2.11, 8.2.12, 9.0.0, 9.0.1, 9.0.2, 9.0.3, 9.0.4, 9.0.5, 9.0.6, 9.0.7, 9.0.8, 9.0.9, 9.0.10, 9.1.0, 9.1.1, 9.1.2, 9.1.3, 9.1.4, 9.1.5, 9.2.0, 9.2.1, 9.2.2


FAQs

How do I monitor Splunk logs? ›

You can do this by going to the Splunk web interface and entering a search string. This will bring up a list of all the events that match your search. You can then use the Splunk filters to further refine your results and get the specific data that you require.

How to write subsearch in Splunk? ›

A subsearch is enclosed in square brackets [ ] and is processed first, when the search criteria are parsed. Because the top command returns the count and percent fields, the table command is typically used inside the subsearch to keep only the clientip value, as shown in the sketch below.
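A minimal sketch along those lines, assuming the Search Tutorial's web access data (sourcetype=access_*), uses a subsearch to find the most frequent client IP and then filters the outer search to events from that IP:

sourcetype=access_* [ search sourcetype=access_* | top limit=1 clientip | table clientip ]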

What is the function of stats command in Splunk? ›

The SPL2 stats command calculates aggregate statistics, such as average, count, and sum, over the incoming search results set. This is similar to SQL aggregation. If the stats command is used without a BY clause, only one row is returned, which is the aggregation over the entire incoming result set.
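For instance, the following sketch (the bytes and host fields are assumed to exist in the data) returns one row per host with the event count and the average event size:

... | stats count avg(bytes) AS avg_bytes BY host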

What data format does Splunk use? ›

Splunk can ingest data from a wide variety of sources, including files, directories, network events, and APIs. It supports common data formats such as CSV, JSON, and XML, as well as custom formats.

Is Splunk a good monitoring tool? ›

Besides that, Splunk generally has impressive log management capabilities. It can integrate ingested data from the different Splunk platforms and data sources into the Splunk Observability Cloud for centralized log management. This is done by the Log Observer Connect feature.

What are the different types of Splunk logs? ›

Common types of log data include application logs, system logs, network logs and security logs.

How to query logs in Splunk? ›

To search your logs, follow these steps: Navigate to Log Observer. In the content control bar, enter a time range in the time picker if you know it. Select Index next to Saved Queries, then select the indexes you want to query.

What is the limit of Splunk Subsearch 10000? ›

By default, subsearches return a maximum of 10,000 results. You might see variation in the actual number of results because any command that invokes a subsearch can change the default maxout value.

What are streamstats in Splunk? ›

The streamstats command is another statistical command in Splunk that is used to perform real-time statistical analysis on event streams, computing running statistics as each event flows through the search pipeline.
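As a small illustration (the bytes field is assumed), streamstats can maintain a running total and a running event number as events pass through the pipeline:

... | streamstats sum(bytes) AS bytes_so_far count AS event_number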

What is the difference between stats and eventstats in Splunk? ›

The eventstats command calculates the same statistical results as the stats command; the difference is that instead of replacing the events with a results table, it appends the aggregated values to the original raw events. The streamstats command, by contrast, uses only the events before the current event to compute the aggregate statistics that are applied to each event.
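For example (the bytes field is assumed), eventstats adds the overall average to every event, which you can then compare against each individual event:

... | eventstats avg(bytes) AS avg_bytes | eval above_average=if(bytes > avg_bytes, "yes", "no")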

What does coalesce do in Splunk? ›

The Splunk Search Processing Language (SPL) coalesce function takes one or more values and returns the first value that is not null.
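For example, assuming events where the client address can appear in either a src_ip or a dest_ip field (hypothetical field names), coalesce picks whichever one is present:

... | eval ip=coalesce(src_ip, dest_ip)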

What is rex in Splunk? ›

The rex command matches the value of the specified field against the unanchored regular expression and extracts the named groups into fields of the corresponding names.
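A short sketch (the log format and the user field name are assumptions) that extracts a named group from the raw event text into a new field:

... | rex field=_raw "user=(?<user>\w+)"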

What language does Splunk use? ›

Core Splunk is written in C/C++, afaik.

What databases does Splunk use? ›

Splunk doesn't use a database: Splunk is a search engine that stores raw data and indexes all of it, making it searchable. For more information, see https://docs.splunk.com/Documentation/Splunk/9.0.1/Deploy/Datapipeline .

Does Splunk encrypt data at rest? ›

Encryption at rest is available as a premium service enhancement that customers can purchase.

How do I access Splunk monitoring console? ›

Locate the Cloud Monitoring Console
  1. From anywhere in Splunk Web, select Apps.
  2. Select Cloud Monitoring Console.

How do I check Splunk history? ›

This feature can be used to get the complete list of search queries executed on Splunk over time. The search history feature can be accessed via the Splunk Web console by clicking on "Search & Reporting" App | Search. It takes the user to the search summary dashboard with the option to run search queries.

How do I check Splunk usage? ›

The first three dashboards accessed from the Cloud Monitoring Console > License Usage tab enable Splunk Cloud Platform administrators to monitor their Splunk Cloud Platform subscription entitlement and ensure they don't exceed their license limits.
