Hi All,
I have created a query and it is working fine in search. I am sharing part of the code from the dashboard. In the first part of the call you can see I have hardcoded the earliest and latest time. I want to pass those as input values from the time input provided on the dashboard, and then run the remaining part of the query for the whole day (or, say, another time range), because it is possible that a request received during the mentioned time gets processed later in the day. How can I achieve this? Also, I want to hide a few columns at the end, like message GUID, request time and output time.
<panel>
<table>
<title>Contact -Timings</title>
<search>
<query>```query for apigateway call```
index=aws* earliest="03/28/2025:13:30:00" latest="03/28/2025:14:35:00"
Method response body after transformations: sourcetype="aws:apigateway"
| rex field=_raw "Method response body after transformations: (?<json>[^$]+)"
| spath input=json path="header.messageGUID" output=messageGUID
| spath input=json path="payload.statusType.code" output=status
| spath input=json path="payload.statusType.text" output=text
| spath input=json path="header.action" output=action
| where status=200 and action="Create"
| rename _time as request_time
```dedup is added to remove duplicates```
| dedup messageGUID
| append
```query for event bridge```
[ search index="aws_np"
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| rex field=_raw "type\": String\(\"(?<type>[^\"]+)"
| where source="MDM" and type="Contact" ```and messageGUID="0461870f-ee8a-96cd-3db6-1ca1f6dbeb30"```
| rename _time as output_time | dedup messageGUID
]
| stats values(request_time) as request_time values(output_time) as output_time by messageGUID
| where isnotnull(output_time) and isnotnull(request_time)
| eval timeTaken=(output_time-request_time)/60| convert ctime(output_time)| convert ctime(request_time)
| eventstats avg(timeTaken) min(timeTaken) max(timeTaken) count(messageGUID)
| head 1</query>
<earliest>$field1.earliest$</earliest>
<latest>$field1.latest$</latest>
</search>
<option name="drilldown">none</option>
</table>
</panel>
I got the solution for this by following what is mentioned in https://community.splunk.com/t5/Splunk-Search/Query-running-time/m-p/367124#M108287
Ok. Let me offer you some additional pointers here.
1. Whenever I see a dedup command I raise my eyebrows questioningly - are you sure you know how dedup works, and is it really what you want?
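To illustrate (a minimal sketch, not from your data): dedup keeps only the first event it sees per value of the listed field, and with the default reverse-time order that means only the most recent event per messageGUID survives - any earlier events with the same GUID, even if other fields differ, are silently dropped:
index=aws_np
| dedup messageGUID ``` keeps one event per messageGUID - the first one encountered, usually the most recent ```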
2. Your subsearch is highly suboptimal considering you're just looking for a single, relatively unique value of the GUID. As it is now, you're plowing through all the data for the given time range, extracting some fields (which you will not use later) with regex, and finally only catching a small subset of those initial events.
An example from my home lab environment. If I search
index=mail
| rex "R=(?<r>\S+)"
| where r="1u0tIb-000000005e9-07kx"
over all time, Splunk has to throw the regex at almost 11 million events and it takes 197 seconds.
If I narrow the search at the very beginning and do
index=mail 1u0tIb-000000005e9-07kx
| rex "R=(?<r>\S+)"
| where r="1u0tIb-000000005e9-07kx"
The search takes just half a second and scans only 8 events.
Actually, if you had your extractions configured properly for your events, you could just do a search like
index="aws_np" aws_source="MDM" type="Contact"
and it would work. You apparently don't have your data onboarded properly, so you have to do it the way you did in your search, but this is inefficient.
The same applies to the initial search, where you do a lot of heavy lifting before hitting the where command. By moving the raw "200" and "Create" strings to the initial search you may save yourself a lot of time.
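As a sketch of that idea (assuming those literal strings actually appear in your raw events), the first search could begin like this instead:
index=aws* sourcetype="aws:apigateway" "Method response body after transformations" "200" "Create" ``` raw terms let Splunk discard non-matching events early ```
| rex field=_raw "Method response body after transformations: (?<json>[^$]+)"
| spath input=json path="payload.statusType.code" output=status
| spath input=json path="header.action" output=action
| where status=200 AND action="Create" ``` still needed - the raw terms are only a coarse pre-filter ```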
3. To add insult to injury, your appended search is prone to subsearch limits, so it might get silently finalized and you will get wrong/incomplete results without even knowing it.
4. You are doing several separate runs of the spath command, which is relatively heavy. I'm not sure here, but I'd hazard a guess that one "big" spath, followed immediately by filtering fields so as not to drag them along and to limit memory usage, might be better performance-wise.
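A possible single-run variant (a sketch only - the field paths are taken from the original query and this is untested against your data):
| spath input=json
| fields header.messageGUID payload.statusType.code header.action ``` drop everything else immediately to limit memory usage ```
| rename header.messageGUID as messageGUID, payload.statusType.code as status, header.action as action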
5. You're statsing only three fields - request_time, output_time and messageGUID. Why extract the text field?
Hello @PickleRick, thanks for pointing out the issues. I will check my query and see how I can optimize it. The logging is not perfect, which is why I had to take this route. I will check and see how I can make it better. Also, I am not looking for one GUID - that part is commented out. I am looking at the whole set.
Hello @ITWhisperer, how can I override the time of the appended search?
What do you want to override it with?
hi @ITWhisperer, based on the date provided in the main search - let's say 12/Mar/2025 1 PM - 12/Mar/2025 1:30 PM - I want to use only the date, 12/Mar/2025, in the second one
| append
```query for event bridge```
[ search index="aws_np" [| makeresults
| eval earliest=strptime("12/03/2025","%d/%m/%Y")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest]
Also, I tried it like this:
[ search index="aws_np" [| makeresults
| eval earliest=strptime("12/03/2025","%d/%m/%Y")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest] host="test" app_environment=qa
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
but I am getting the below error:
Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side.
Subsearches are executed before the main search, so when the appended search is executed the field is not available. Without the full search, I cannot determine where the error might be coming from. The basic concept of using makeresults to provide new values for earliest and latest can be demonstrated to work with the following complete search:
| makeresults
| eval line="First"
| append
[search index=_internal
[| makeresults
| eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest] sourcetype=splunkd
| head 1
| eval line="second"]
Hello @ITWhisperer
| eval latest_time=strptime("03/12/2025:12:30:00", "%m/%d/%Y:%H:%M:%S")
| eval new_latest_time=(latest_time + 18000) ``` 18000 seconds = 5 hrs ```
| eval new_latest_time_str=strftime(new_latest_time, "%m/%d/%Y:%H:%M:%S")
| append
```query for event bridge```
[ search index="aws_np" earliest="03/12/2025:12:30:00" latest=$new_latest_time_str$ host="EventConsumer-mdm
I tried writing it as above (the query is not complete). I just wanted to share that I tried evaluating the values and then using them within the subsearch, but it gives the following error:
Invalid value "$new_latest_time_str$" for time term 'latest'
Don't override the earliest and latest in the first part of the search (then it will take the times from the input field). You can then override the earliest and latest in the appended search to be a different time frame.
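Sketched against the posted dashboard XML (token names assumed from it; the "..." stands for the rest of the original query), that would look roughly like:
<search>
<query>index=aws* sourcetype="aws:apigateway" ``` no earliest/latest here - the panel time input applies ```
...
| append
[ search index="aws_np" earliest=@d latest=now ``` explicit times here override the panel's range for the appended search only ```
... ]</query>
<earliest>$field1.earliest$</earliest>
<latest>$field1.latest$</latest>
</search>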
Hello @ITWhisperer, first of all thanks for spending time on this. I am trying to run a simple query (not a subsearch) as follows, but it is still not running. Basically, I am trying to understand how we can calculate a parameter value while the query runs.
index="aws_np" earliest="12/03/2025:13:00" latest=[| makeresults
| eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M")
| eval latest=relative_time(earliest,"+1d")
| table latest]
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| where type="Contact"
| stats count by source
index="aws_np" [| makeresults
| eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest]
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| where type="Contact"
| stats count by source
But when I run this I get:
Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side.
That's because you are not doing it exactly as I suggested - you don't need latest=[ - try it the way I suggested!
index="aws_np" [| makeresults
| eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest]
| rex field=_raw "messageGUID\": String\(\"(?<messageGUID>[^\"]+)"
| rex field=_raw "source\": String\(\"(?<source>[^\"]+)"
| rex field=_raw "type\": String\(\"(?<type>[^\"]+)"
| rex field=_raw "addBy\": String\(\"(?<addBy>[^\"]+)"
| where type="Contact"
| stats count by source
I tried it exactly the same way, but I still get: Error in 'search' command: Unable to parse the search: 'AND' operator is missing a clause on the left hand side.
This does not produce a parsing error for me
Which version of Splunk are you using?
I don't think Splunk parsing has changed that much since 9.1 (I'm using 9.4). Please share your full search (obfuscated as little as possible) so we can figure out where that error might be coming from.
index="aws_np" [| makeresults
| eval earliest=strptime("12/03/2025 13:00","%d/%m/%Y %H:%M")
| eval latest=relative_time(earliest,"+1d")
| table earliest latest]
Even with this bare minimum I am getting the issue when I run it. Am I really making some trivial mistake?
Could there be a possibility that the "last 24 hours" time range which got selected is playing a role in the error?