This command generates an event based on an input IP. It uses the Spur Context API, so you must have an active Spur subscription. The command takes one argument, 'ip', which is the IP address that will be passed to the Context API.
Single IP:
| spurcontextapigen ip="1.1.1.1"
Multiple IPs:
| spurcontextapigen ip="1.1.1.1,8.8.8.8"
This command enriches existing events with data from the Spur Context API, so you must have an active Spur subscription. The command takes one argument, 'ip_field', which is the field containing the IP address that will be passed to the Context API.
NOTE: The following examples assume you have uploaded the Splunk tutorial data: https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchTutorial/GetthetutorialdataintoSplunk
Simple example:
| makeresults
| eval ip = "1.1.1.1"
| spurcontextapi ip_field="ip"
Basic IP Query:
clientip="223.205.219.67" | spurcontextapi ip_field="clientip"
Enrich a list of distinct IPs:
clientip=* | head 1000 | stats values(clientip) as "ip" | mvexpand ip | spurcontextapi ip_field="ip"
The modular input allows you to insert feed data into a Splunk index. It uses the Spur Feed API, so you must have an active Spur subscription. The modular input takes two arguments: 'Feed Type' and 'Enable Checkpoint Files'. The feed type is the type of feed you want to pull from the Spur API and depends on your subscription level (anonymous, anonymous-residential, realtime). The enable checkpoint files option ensures that the same feed file is not processed multiple times. During setup you can override the Splunk defaults to insert into a different index. You can also use the interval setting to ensure the feed is ingested at your desired interval.
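For reference, a configured input might look like the following in inputs.conf. This is a minimal sketch: the stanza and setting names here are assumptions, so confirm them against the app's inputs.conf.spec or the Data Inputs UI.
# Hypothetical stanza -- the actual stanza and setting names are defined by the app
[spur_feed://anonymous]
feed_type = anonymous
enable_checkpoint_files = 1
index = spur
interval = 86400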
NOTE: You can monitor the progress of the feed by looking at the logs, which are written locally to /opt/splunk/var/log/splunk/spur.log. This file can be viewed directly or added to Splunk as a data input.
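If you choose to index the log, a standard monitor input works; the index and sourcetype below are illustrative choices, not app defaults:
[monitor:///opt/splunk/var/log/splunk/spur.log]
index = _internal
sourcetype = spur:log
disabled = 0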
index="spur" earliest_time=@d | head 1000
You can enhance Splunk's built-in iplocation command by replacing the default IP geolocation database with Spur's more accurate and comprehensive IP geolocation data. This allows you to leverage Spur's superior IP intelligence while using Splunk's native iplocation command syntax.
Download the Spur IP Geo database: download the latest version from:
https://feeds.spur.us/v2/ipgeo/latest.mmdb
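For example, from the command line (a sketch assuming token-based authentication via a Token header; check Spur's feed documentation for the exact header and token for your subscription):
curl -H "Token: $SPUR_API_TOKEN" -o spur-ipgeo.mmdb https://feeds.spur.us/v2/ipgeo/latest.mmdb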
Replace the default database using the Splunk Web interface (Recommended): upload the downloaded .mmdb file.
Alternative - Manual file replacement:
- Copy the downloaded .mmdb file to your Splunk installation directory:
  - Default location: $SPLUNK_HOME/share/GeoLite2-City.mmdb
  - Or configure a custom path using the db_path setting in limits.conf
- Restart your Splunk instance to load the new database file
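For example, on a single instance (assuming the download was saved as spur-ipgeo.mmdb):
cp spur-ipgeo.mmdb $SPLUNK_HOME/share/GeoLite2-City.mmdb
$SPLUNK_HOME/bin/splunk restart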
To use a custom file path or name, add the following to your limits.conf file:
[iplocation]
db_path = /path/to/your/spur-ipgeo.mmdb
For distributed deployments, ensure the .mmdb file is deployed to all indexers, as it's not automatically included in the knowledge bundle.
Test the enhanced IP geolocation with a simple example:
| makeresults
| eval ip="8.8.8.8"
| iplocation ip
This will return enhanced location data powered by Spur's IP intelligence, including more accurate city, country, region, latitude, and longitude information.
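Because the command itself is unchanged, the standard iplocation output fields still apply. For example:
| makeresults
| eval ip="8.8.8.8"
| iplocation ip
| table ip, City, Country, Region, lat, lon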
The Spur IP Geo modular input allows you to automatically ingest IP geolocation data into a locally stored MMDB file. You must have an active Spur subscription with access to the IP Geo feed. When configuring the input, select ipgeo as your feed type.
The app includes a spuriplocation command that enriches events with comprehensive IP geolocation data from the Spur IP Geo MMDB. This command can be used as an enhanced replacement for Splunk's built-in iplocation command, providing more detailed geographic and network information.
Prerequisites: This command depends on the Spur IP Geo modular input. Please configure the IP Geo feed input first before using this command.
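As a rough sketch, the corresponding inputs.conf entry might look like this (stanza and setting names are illustrative; confirm against the app's Data Inputs page):
[spur_feed://ipgeo]
feed_type = ipgeo
interval = 86400
Basic usage: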
| makeresults
| eval ip="1.1.1.1"
| spuriplocation ip_field=ip
ip_field (required): The field containing the IP address to look up
fields (optional): Comma-separated list of fields to include in the output. If not specified, all fields are included.
The spuriplocation command supports the following fields. You can use either the short field names or the full field names when specifying the fields option:
Short Name | Full Field Name | Description |
---|---|---|
country | spur_location_country | Country name (English) |
country_iso | spur_location_country_iso | ISO country code (e.g., "US") |
country_geoname_id | spur_location_country_geoname_id | GeoNames database ID for country |
subdivision | spur_location_subdivision | State/province name (English) |
subdivision_geoname_id | spur_location_subdivision_geoname_id | GeoNames database ID for subdivision |
city | spur_location_city | City name (English) |
city_geoname_id | spur_location_city_geoname_id | GeoNames database ID for city |
continent | spur_location_continent | Continent name (English) |
continent_code | spur_location_continent_code | Continent code (e.g., "NA") |
continent_geoname_id | spur_location_continent_geoname_id | GeoNames database ID for continent |
registered_country | spur_location_registered_country | Registered country name (English) |
registered_country_iso | spur_location_registered_country_iso | Registered country ISO code |
registered_country_geoname_id | spur_location_registered_country_geoname_id | GeoNames ID for registered country |
latitude | spur_location_latitude | Latitude coordinate |
longitude | spur_location_longitude | Longitude coordinate |
accuracy_radius | spur_location_accuracy_radius | Accuracy radius in kilometers |
timezone | spur_location_timezone | Timezone (e.g., "America/Chicago") |
as_number | spur_as_number | Autonomous System number |
as_organization | spur_as_organization | Autonomous System organization name |
error | spur_error | Error message (if any) |
Basic IP lookup with all fields:
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip
Get only basic location information:
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip fields="country,subdivision,city"
Get coordinates only:
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip fields="latitude,longitude"
Get network information:
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip fields="as_number,as_organization"
Enrich existing log data:
index=web_logs
| head 1000
| spuriplocation ip_field=client_ip fields="country,city,latitude,longitude"
Get detailed country information with IDs:
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip fields="country,country_iso,country_geoname_id"
Mixed field specification (short and full names):
| makeresults
| eval ip="8.8.8.8"
| spuriplocation ip_field=ip fields="country,spur_location_latitude,as_number"
The following fields are returned from the Context API and added to the streamed records:
"spur_as_number"
"spur_as_organization"
"spur_organization"
"spur_infrastructure"
"spur_client_behaviors"
"spur_client_concentration_country"
"spur_client_concentration_city"
"spur_client_concentration_geohash"
"spur_client_concentration_density"
"spur_client_concentration_skew"
"spur_client_countries"
"spur_client_spread"
"spur_client_proxies"
"spur_client_count"
"spur_client_types"
"spur_location_country"
"spur_location_state"
"spur_location_city"
"spur_services"
"spur_tunnels_type"
"spur_tunnels_anonymous"
"spur_tunnels_operator"
"spur_risks"
The records from the feed are inserted with no modifications. They adhere to the following JSON schema:
{
  "type": "object",
  "description": "IP Context Object",
  "additionalProperties": false,
  "properties": {
    "ip": {
      "type": "string"
    },
    "as": {
      "type": "object",
      "properties": {
        "number": {
          "type": "integer"
        },
        "organization": {
          "type": "string"
        }
      }
    },
    "organization": {
      "type": "string"
    },
    "infrastructure": {
      "type": "string"
    },
    "client": {
      "type": "object",
      "properties": {
        "behaviors": {
          "type": "array",
          "uniqueItems": true,
          "items": {
            "type": "string"
          }
        },
        "concentration": {
          "type": "object",
          "properties": {
            "country": {
              "type": "string"
            },
            "state": {
              "type": "string"
            },
            "city": {
              "type": "string"
            },
            "geohash": {
              "type": "string"
            },
            "density": {
              "type": "number",
              "minimum": 0,
              "maximum": 1
            },
            "skew": {
              "type": "integer"
            }
          }
        },
        "countries": {
          "type": "integer"
        },
        "spread": {
          "type": "integer"
        },
        "proxies": {
          "type": "array",
          "uniqueItems": true,
          "items": {
            "type": "string"
          }
        },
        "count": {
          "type": "integer"
        },
        "types": {
          "type": "array",
          "uniqueItems": true,
          "items": {
            "type": "string"
          }
        }
      }
    },
    "location": {
      "type": "object",
      "properties": {
        "country": {
          "type": "string"
        },
        "state": {
          "type": "string"
        },
        "city": {
          "type": "string"
        }
      }
    },
    "services": {
      "type": "array",
      "items": {
        "type": "string"
      }
    },
    "tunnels": {
      "type": "array",
      "uniqueItems": true,
      "items": {
        "type": "object",
        "properties": {
          "anonymous": {
            "type": "boolean"
          },
          "entries": {
            "type": "array",
            "uniqueItems": true,
            "items": {
              "type": "string"
            }
          },
          "operator": {
            "type": "string"
          },
          "type": {
            "type": "string"
          },
          "exits": {
            "type": "array",
            "uniqueItems": true,
            "items": {
              "type": "string"
            }
          }
        },
        "required": ["type"]
      }
    },
    "risks": {
      "type": "array",
      "uniqueItems": true,
      "items": {
        "type": "string"
      }
    }
  },
  "required": ["ip"]
}
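Because the records are stored as raw JSON, you can extract fields at search time with spath. A small example, assuming the default 'spur' index (field paths follow Splunk's JSON notation, e.g. as.organization and tunnels{}.type):
index="spur" earliest=@d
| head 1000
| spath
| table ip, as.organization, infrastructure, tunnels{}.type, risks{}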
Added IP Geo feed support
Added support for anonymous-ipv6 and anonymous-residential-ipv6 feeds in Splunk Data Inputs
Updated installation instructions to include list_settings as a required capability for users.
Fixed a bug in the spurcontextapi command where IP data was not being populated
Fixed a bug where IPv6 addresses would not be enriched; also fixed a bug where, if the first input resulted in an error, subsequent results would not show all available enrichment fields
Set the Python version to Python 3 in commands.conf
Adjusted the output schema of spurcontextapi to use arrays instead of joined strings
Fixed an issue where the app didn't work in a distributed environment