This command generates an event from an input IP. It uses the Spur Context API, so you must have an active Spur subscription. The command takes one argument, 'ip', which is the IP address that will be passed to the Context API.
| spurcontextapigen ip="1.1.1.1"
This command enriches existing events with data from the Spur Context API, so you must have an active Spur subscription. The command takes one argument, 'ip_field', which names the field containing the IP address that will be passed to the Context API.
NOTE: This assumes you have uploaded the Splunk tutorial data: https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchTutorial/GetthetutorialdataintoSplunk
Simple example:
| makeresults
| eval ip = "1.1.1.1"
| spurcontextapi ip_field="ip"
Basic IP Query:
clientip="223.205.219.67" | spurcontextapi ip_field="clientip"
Enrich a list of distinct IPs:
clientip=* | head 1000 | stats values(clientip) as "ip" | mvexpand ip | spurcontextapi ip_field="ip"
The modular input allows you to insert feed data into a Splunk index. It uses the Spur Feed API, so you must have an active Spur subscription. The modular input takes one argument, 'feed_type', which is the type of feed you want to pull from the Spur API and depends on your subscription level (anonymous, anonymous-residential, realtime). During setup you can override the Splunk defaults to insert into a different index. You can also use the interval setting to ensure the feed is ingested at your desired interval.
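As an illustration, a modular input configured this way typically corresponds to a stanza along the following lines; the stanza prefix, parameter names, and values shown here are assumptions for illustration and depend on how the app registers its input:

```conf
# Illustrative inputs.conf stanza (stanza name and parameters are assumed, not taken from the app).
[spur_feed://anonymous_feed]
feed_type = anonymous
index = spur
interval = 86400
disabled = 0
```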
NOTE: You can monitor the progress of the feed by looking at the logs. The logs are written locally to /opt/splunk/var/log/splunk/spurcontextapi.log. This file can be viewed directly or added to Splunk as a data input.
index="spur" earliest=@d | head 1000
The following fields are returned from the Context API and added to the streamed records:
"spur_as_number"
"spur_as_organization"
"spur_organization"
"spur_infrastructure"
"spur_client_behaviors"
"spur_client_concentration_country"
"spur_client_concentration_city"
"spur_client_concentration_geohash"
"spur_client_concentration_density"
"spur_client_concentration_skew"
"spur_client_countries"
"spur_client_spread"
"spur_client_proxies"
"spur_client_count"
"spur_client_types"
"spur_location_country"
"spur_location_state"
"spur_location_city"
"spur_services"
"spur_tunnels_type"
"spur_tunnels_anonymous"
"spur_tunnels_operator"
"spur_risks"
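The fields above can be used directly in downstream SPL. For example, a sketch that surfaces enriched events carrying a risk label (field names are from the list above; the source search and field selection are illustrative):

```spl
clientip=* | head 1000
| stats values(clientip) as ip | mvexpand ip
| spurcontextapi ip_field="ip"
| where isnotnull(spur_risks)
| table ip spur_infrastructure spur_client_proxies spur_risks
```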
The records from the feed are inserted with no modifications. They adhere to the following JSON schema:
{
"type": "object",
"description": "IP Context Object",
"additionalProperties": false,
"properties": {
"ip": {
"type": "string"
},
"as": {
"type": "object",
"properties": {
"number": {
"type": "integer"
},
"organization": {
"type": "string"
}
}
},
"organization": {
"type": "string"
},
"infrastructure": {
"type": "string"
},
"client": {
"type": "object",
"properties": {
"behaviors": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
},
"concentration": {
"type": "object",
"properties": {
"country": {
"type": "string"
},
"state": {
"type": "string"
},
"city": {
"type": "string"
},
"geohash": {
"type": "string"
},
"density": {
"type": "number",
"minimum": 0,
"maximum": 1
},
"skew": {
"type": "integer"
}
}
},
"countries": {
"type": "integer"
},
"spread": {
"type": "integer"
},
"proxies": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
},
"count": {
"type": "integer"
},
"types": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
}
}
},
"location": {
"type": "object",
"properties": {
"country": {
"type": "string"
},
"state": {
"type": "string"
},
"city": {
"type": "string"
}
}
},
"services": {
"type": "array",
"items": {
"type": "string"
}
},
"tunnels": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "object",
"properties": {
"anonymous": {
"type": "boolean"
},
"entries": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
},
"operator": {
"type": "string"
},
"type": {
"type": "string"
},
"exits": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
}
},
"required": ["type"]
}
},
"risks": {
"type": "array",
"uniqueItems": true,
"items": {
"type": "string"
}
}
},
"required": ["ip"]
}
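As an illustration, a hypothetical record conforming to this schema might look like the following; every value below is made up for demonstration and is not real Spur data:

```json
{
  "ip": "203.0.113.10",
  "as": { "number": 64496, "organization": "EXAMPLE-AS" },
  "organization": "Example Hosting",
  "infrastructure": "DATACENTER",
  "client": {
    "behaviors": ["TOR_PROXY_USER"],
    "concentration": {
      "country": "US",
      "state": "Texas",
      "city": "Austin",
      "geohash": "9v6k",
      "density": 0.82,
      "skew": 40
    },
    "countries": 2,
    "spread": 150,
    "proxies": ["EXAMPLE_PROXY"],
    "count": 12,
    "types": ["MOBILE", "DESKTOP"]
  },
  "location": { "country": "US", "state": "Texas", "city": "Austin" },
  "services": ["IPSEC"],
  "tunnels": [
    { "type": "VPN", "anonymous": true, "operator": "EXAMPLE_VPN" }
  ],
  "risks": ["TUNNEL"]
}
```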