Description
What did you do?
I've been investigating a weird notification issue we have been seeing in our alert routing setup, which is a custom alert router that receives messages from alertmanager (and other sources). Some of the symptoms I've seen are:
- Duplicate notifications
- Rapid close/reopen of issues triggered by notifications
- Lots of updates to the same issue
I also looked at #1685, #1550, #1546, #1005 for inspiration.
After lots of digging, it turns out that our alertmanager cluster is sending messages that are not supposed to be sent. Our alert router was hiding that somewhat, as it has its own bugs 😄 So I took it out of the loop and made alertmanager send its alerts to a simple Go program that just logs them and returns an HTTP 200 OK.
package main

import (
	"io/ioutil"
	"net/http"
	"strings"
	log "github.com/sirupsen/logrus"
)

func handler(w http.ResponseWriter, r *http.Request) {
	log.WithFields(log.Fields{"url": r.URL, "method": r.Method, "client": r.RemoteAddr}).Info("HTTP request received")
	if r.Body != nil {
		b, _ := ioutil.ReadAll(r.Body)
		log.WithFields(log.Fields{"client": r.RemoteAddr, "body": mangle(b)}).Info("Body")
	}
	w.Write([]byte("OK"))
}

// mangle flattens the request body onto a single line so each payload is one log entry.
func mangle(b []byte) string {
	return strings.Replace(strings.TrimSpace(string(b)), "\n", "\\n", -1)
}
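A minimal main to serve this handler would look something like the following; the :8080 listen address is just an example, not the port we actually used:

func main() {
	http.HandleFunc("/", handler)
	// Example listen address; point the webhook_configs url at this port.
	log.Fatal(http.ListenAndServe(":8080", nil))
}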
Here are 6 simultaneous notifications. We expect all 6: every 15 minutes a canary alert fires (and ceases firing at the next quarter-hour) on each of 3 separate Prometheus boxes.
[note: all logs have been minimally redacted. Last octets of IP addresses and identifying parts of domain names have been scrubbed. The remaining parts of IP addresses and FQDNs are still unique]
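To make the payloads easier to read: as far as I can tell, the ts_rounded label in our canary alerts is the number of seconds since UTC midnight, rounded down to the quarter-hour (this is our canary's convention, not anything alertmanager does). A standalone sketch of that calculation:

package main

import (
	"fmt"
	"time"
)

// tsRounded reproduces (my reading of the payloads) the canary's ts_rounded
// label: seconds since UTC midnight, rounded down to the quarter-hour.
func tsRounded(t time.Time) int {
	utc := t.UTC()
	secs := utc.Hour()*3600 + utc.Minute()*60 + utc.Second()
	return secs - secs%900
}

func main() {
	// 2019-08-28 15:45:24 -07:00 is 22:45:24 UTC -> 81900, matching the first payload below.
	t := time.Date(2019, 8, 28, 15, 45, 24, 0, time.FixedZone("PDT", -7*3600))
	fmt.Println(tsRounded(t))
}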
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] HTTP request received client=172.16.X.X:41828 method=POST url=/
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"81900"},"annotations":{"description":"Canary alert for site3/production at 81900","message_firing":"Prometheus canary alert for site3/production at 81900 retrieved 81909 prometheus 1567032324","summary":"Canary alert for site3/production at 81900"},"startsAt":"2019-08-28T15:45:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"81900"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"81900"},"commonAnnotations":{"description":"Canary alert for site3/production at 81900","message_firing":"Prometheus canary alert for site3/production at 81900 retrieved 81909 prometheus 1567032324","summary":"Canary alert for site3/production at 81900"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"81900\"}"} client=172.16.X.X:41828
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] HTTP request received client=172.16.X.X:41828 method=POST url=/
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"81900"},"annotations":{"description":"Canary alert for site1/production at 81900","message_firing":"Prometheus canary alert for site1/production at 81900 retrieved 81912 prometheus 1567032324","summary":"Canary alert for site1/production at 81900"},"startsAt":"2019-08-28T15:45:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"81900"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"81900"},"commonAnnotations":{"description":"Canary alert for site1/production at 81900","message_firing":"Prometheus canary alert for site1/production at 81900 retrieved 81912 prometheus 1567032324","summary":"Canary alert for site1/production at 81900"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"81900\"}"} client=172.16.X.X:41828
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] HTTP request received client=172.16.X.X:41826 method=POST url=/
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"81000"},"annotations":{"description":"Canary alert for site1/production at 81000","message_firing":"Prometheus canary alert for site1/production at 81000 retrieved 81852 prometheus 1567032264","summary":"Canary alert for site1/production at 81000"},"startsAt":"2019-08-28T15:30:24.116517014-07:00","endsAt":"2019-08-28T15:45:24.116517014-07:00","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"81000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"81000"},"commonAnnotations":{"description":"Canary alert for site1/production at 81000","message_firing":"Prometheus canary alert for site1/production at 81000 retrieved 81852 prometheus 1567032264","summary":"Canary alert for site1/production at 81000"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"81000\"}"} client=172.16.X.X:41826
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] HTTP request received client=172.16.X.X:41826 method=POST url=/
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"81900"},"annotations":{"description":"Canary alert for site4/production at 81900","message_firing":"Prometheus canary alert for site4/production at 81900 retrieved 81918 prometheus 1567032324","summary":"Canary alert for site4/production at 81900"},"startsAt":"2019-08-28T15:45:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"81900"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"81900"},"commonAnnotations":{"description":"Canary alert for site4/production at 81900","message_firing":"Prometheus canary alert for site4/production at 81900 retrieved 81918 prometheus 1567032324","summary":"Canary alert for site4/production at 81900"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"81900\"}"} client=172.16.X.X:41826
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] HTTP request received client=172.16.X.X:41826 method=POST url=/
Aug 28 15:45:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:45:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"81000"},"annotations":{"description":"Canary alert for site4/production at 81000","message_firing":"Prometheus canary alert for site4/production at 81000 retrieved 81858 prometheus 1567032264","summary":"Canary alert for site4/production at 81000"},"startsAt":"2019-08-28T15:30:24.116517014-07:00","endsAt":"2019-08-28T15:45:24.116517014-07:00","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"81000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"81000"},"commonAnnotations":{"description":"Canary alert for site4/production at 81000","message_firing":"Prometheus canary alert for site4/production at 81000 retrieved 81858 prometheus 1567032264","summary":"Canary alert for site4/production at 81000"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"81000\"}"} client=172.16.X.X:41826
Aug 28 15:46:54 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:46:54-07:00] HTTP request received client=10.48.X.X:37754 method=POST url=/
Aug 28 15:46:54 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-28T15:46:54-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"81000"},"annotations":{"description":"Canary alert for site3/production at 81000","message_firing":"Prometheus canary alert for site3/production at 81000 retrieved 81849 prometheus 1567032264","summary":"Canary alert for site3/production at 81000"},"startsAt":"2019-08-28T15:30:24.116517014-07:00","endsAt":"2019-08-28T15:45:24.116517014-07:00","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"81000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"81000"},"commonAnnotations":{"description":"Canary alert for site3/production at 81000","message_firing":"Prometheus canary alert for site3/production at 81000 retrieved 81849 prometheus 1567032264","summary":"Canary alert for site3/production at 81000"},"externalURL":"http://observability-alert-871f98c.site3.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"81000\"}"} client=10.48.X.X:37754
Note that one notification comes from a different alertmanager. That should not be possible: the node at the top of the memberlist should be the only one sending notifications (the cluster has 3 nodes, and there were no failovers or restarts during this period). Clearly it didn't send this one, and the other node sent it after correctly waiting for its peers to time out.
Here is another, more devious instance.
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35044 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"45000"},"annotations":{"description":"Canary alert for site1/production at 45000","message_firing":"Prometheus canary alert for site1/production at 45000 retrieved 45012 prometheus 1566477024","summary":"Canary alert for site1/production at 45000"},"startsAt":"2019-08-22T05:30:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"45000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"45000"},"commonAnnotations":{"description":"Canary alert for site1/production at 45000","message_firing":"Prometheus canary alert for site1/production at 45000 retrieved 45012 prometheus 1566477024","summary":"Canary alert for site1/production at 45000"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"45000\"}"} client=172.16.X.X:35044
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35044 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"45000"},"annotations":{"description":"Canary alert for site3/production at 45000","message_firing":"Prometheus canary alert for site3/production at 45000 retrieved 45009 prometheus 1566477024","summary":"Canary alert for site3/production at 45000"},"startsAt":"2019-08-22T05:30:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"45000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"45000"},"commonAnnotations":{"description":"Canary alert for site3/production at 45000","message_firing":"Prometheus canary alert for site3/production at 45000 retrieved 45009 prometheus 1566477024","summary":"Canary alert for site3/production at 45000"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"45000\"}"} client=172.16.X.X:35044
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35044 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"44100\"}"} client=172.16.X.X:35044
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35046 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"44100\"}"} client=172.16.X.X:35046
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35046 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"45000"},"annotations":{"description":"Canary alert for site4/production at 45000","message_firing":"Prometheus canary alert for site4/production at 45000 retrieved 45018 prometheus 1566477024","summary":"Canary alert for site4/production at 45000"},"startsAt":"2019-08-22T05:30:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"45000"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"45000"},"commonAnnotations":{"description":"Canary alert for site4/production at 45000","message_firing":"Prometheus canary alert for site4/production at 45000 retrieved 45018 prometheus 1566477024","summary":"Canary alert for site4/production at 45000"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"45000\"}"} client=172.16.X.X:35046
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] HTTP request received client=172.16.X.X:35046 method=POST url=/
Aug 22 05:30:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"44100\"}"} client=172.16.X.X:35046
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] HTTP request received client=10.44.X.X:35162 method=POST url=/
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"44100\"}"} client=10.44.X.X:35162
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] HTTP request received client=10.44.X.X:35160 method=POST url=/
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"44100\"}"} client=10.44.X.X:35160
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] HTTP request received client=10.44.X.X:35176 method=POST url=/
Aug 22 05:30:28 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:28-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"44100\"}"} client=10.44.X.X:35176
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] HTTP request received client=10.44.X.X:35176 method=POST url=/
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site3/production at 44100","message_firing":"Prometheus canary alert for site3/production at 44100 retrieved 44949 prometheus 1566476964","summary":"Canary alert for site3/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"44100\"}"} client=10.44.X.X:35176
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] HTTP request received client=10.44.X.X:35160 method=POST url=/
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site1/production at 44100","message_firing":"Prometheus canary alert for site1/production at 44100 retrieved 44952 prometheus 1566476964","summary":"Canary alert for site1/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"44100\"}"} client=10.44.X.X:35160
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] HTTP request received client=10.44.X.X:35162 method=POST url=/
Aug 22 05:30:43 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-22T05:30:43-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"annotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"startsAt":"2019-08-22T05:15:24.116517014-07:00","endsAt":"2019-08-22T05:30:24.116517014-07:00","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"44100"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"44100"},"commonAnnotations":{"description":"Canary alert for site4/production at 44100","message_firing":"Prometheus canary alert for site4/production at 44100 retrieved 44958 prometheus 1566476964","summary":"Canary alert for site4/production at 44100"},"externalURL":"http://observability-alert-0b8eb8b.site2.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"44100\"}"} client=10.44.X.X:35162
As you can see, one alertmanager sends all the correct messages; then, 4 seconds later, another one sends firing messages for the old canary, followed by resolved messages for those same alerts another 15 seconds after that. Not only did it not wait for the configured interval, it also sent firing notifications for alerts that had long since resolved.
We've also seen instances where all 3 members of the cluster were involved, or where only one or two of the canaries re-fired. This doesn't happen every time, but it does happen at least once every few hours.
I am convinced that alertmanager's clustering is at fault here; the symptoms look like either a race condition between the cluster coordination and the alertgroup goroutines, or a bug in processing gossip messages. I'm convinced of this for three reasons:
- With only one alertmanager this does not happen
- If we mute all alertmanagers except the "leader" using https://github.com/seveas/alertmanager/commit/ebc4ec49ae5475cb1d8c917f124a59616038a53a, this also does not happen
- With lots of extra debug logging added (see below), gossip message handling is implicated, though I won't claim to fully understand the mechanics there
What did you expect to see?
Only the alertmanager at the top of the memberlist should be sending notifications, as all alertmanagers were up during these incidents.
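For reference, my (possibly imperfect) understanding of the mechanism that should guarantee this: each peer delays its notification flush by its position in the memberlist multiplied by the peer timeout, and a waiting peer should find the leader's notification-log entry via gossip and skip sending. A standalone sketch of that staggering, using alertmanager's default 15s --cluster.peer-timeout (the positions and output are illustrative):

package main

import (
	"fmt"
	"time"
)

// clusterWait sketches how I understand alertmanager staggers sending:
// the peer at position 0 flushes immediately, position 1 waits one
// peer-timeout, position 2 waits two, and so on. A peer that waits should
// see the leader's notification-log gossip and not send again.
func clusterWait(position int, peerTimeout time.Duration) time.Duration {
	return time.Duration(position) * peerTimeout
}

func main() {
	peerTimeout := 15 * time.Second // default --cluster.peer-timeout
	for pos := 0; pos < 3; pos++ {
		fmt.Printf("peer at position %d waits %s before flushing\n", pos, clusterWait(pos, peerTimeout))
	}
}

If that is roughly right, a peer at position 1 or 2 sending within a few seconds of the leader (as in the second excerpt above) shouldn't happen unless the gossiped notification log isn't being applied in time.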
What did you see instead? Under which circumstances?
See above.
Environment
- System information:
  Linux 4.19.0-0.bpo.5-amd64 x86_64 - Debian jessie
- Alertmanager version:
  0.18, 0.19-rc0
- Alertmanager configuration file:
---
global:
  slack_api_url: https://REDACTED
route:
  # We effectively disable grouping for now by setting the grouping interval to
  # 5 seconds. This makes sense for now because ninesapp is explicitly
  # ungrouping the alerts. In the future we should determine whether we have good
  # use cases for grouping, and split the routing here based on label. Adding
  # support to ninesapp (or its successor) to group alerts with a specific
  # label or annotation is fairly easy.
  #
  # Without this change to group_interval, notifications for alerts firing and
  # resolving are delayed by 5 minutes.
  group_interval: 5s
  # We don't have any inhibition rules configured, so no need to wait for
  # them.
  group_wait: 0s
  # We don't want to spam nines very often, but on the other hand, periodic
  # reminders are good, so only repeat notifications for alerts once a week.
  repeat_interval: 1w
  # Group by instance to reduce the number of alerts in a single webhook
  # payload. Without this change nines can take a long time to process a large
  # number of alerts and fail the request.
  group_by:
    - instance
  receiver: 'default-without-grouping'
  routes:
    # The canary alerts should not be grouped at all, so this gets a separate route.
    - receiver: 'canary'
      match_re:
        ts_rounded: '.+'
      # This special value completely disables grouping, but doesn't work in alertmanager 0.15
      #
      # group_by: ['...']
      group_by: ['instance', 'ts_rounded', 'ts_fetched', 'site', 'env']
receivers:
  - name: 'default-without-grouping'
    slack_configs:
      - send_resolved: true
        channel: '#observability-alerts'
    webhook_configs:
      - url: https://REDACTED
        http_config:
          basic_auth:
            username: REDACTED
            password: REDACTED
      # Testing receiver that logs all request bodies, for debugging purposes
      - url: https://REDACTED/
  - name: 'canary'
    slack_configs:
      - channel: "#observability-canary"
        title: "Canary alert"
        text: "{{ .CommonAnnotations.message_firing }}"
    webhook_configs:
      - url: https://REDACTED
        http_config:
          basic_auth:
            username: REDACTED
            password: REDACTED
      # Testing receiver that logs all request bodies, for debugging purposes
      - url: https://REDACTED/
- Logs:
I found the debug logging a bit lacking in detail, so I added more in https://github.com/seveas/alertmanager/commit/289fd1f9231ab66d65b321092011d4a503b4e88c and https://github.com/seveas/alertmanager/commit/534f48d7ad3db2b72a42916ba34f2c2dd8ca68b2.
These logs will therefore look somewhat unfamiliar to you, though parts of those patches could be useful upstream, as some of the extra debugging info did help.
receiver log:
The host at position 2(!) in the memberlist is sending something it should not send: old canary 39600, new canary 40500.
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"40500"},"annotations":{"description":"Canary alert for site3/production at 40500","message_firing":"Prometheus canary alert for site3/production at 40500 retrieved 40509 prometheus 1567077324","summary":"Canary alert for site3/production at 40500"},"startsAt":"2019-08-29T04:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"40500"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"40500"},"commonAnnotations":{"description":"Canary alert for site3/production at 40500","message_firing":"Prometheus canary alert for site3/production at 40500 retrieved 40509 prometheus 1567077324","summary":"Canary alert for site3/production at 40500"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"40500\"}"} client=172.16.X.X:48074
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"40500"},"annotations":{"description":"Canary alert for site1/production at 40500","message_firing":"Prometheus canary alert for site1/production at 40500 retrieved 40512 prometheus 1567077324","summary":"Canary alert for site1/production at 40500"},"startsAt":"2019-08-29T04:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"40500"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"40500"},"commonAnnotations":{"description":"Canary alert for site1/production at 40500","message_firing":"Prometheus canary alert for site1/production at 40500 retrieved 40512 prometheus 1567077324","summary":"Canary alert for site1/production at 40500"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"40500\"}"} client=172.16.X.X:48074
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"annotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"startsAt":"2019-08-29T04:00:24.116517014-07:00","endsAt":"2019-08-29T04:15:24.116517014-07:00","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"39600"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"commonAnnotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"39600\"}"} client=172.16.X.X:48074
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"40500"},"annotations":{"description":"Canary alert for site4/production at 40500","message_firing":"Prometheus canary alert for site4/production at 40500 retrieved 40518 prometheus 1567077324","summary":"Canary alert for site4/production at 40500"},"startsAt":"2019-08-29T04:15:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"40500"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"40500"},"commonAnnotations":{"description":"Canary alert for site4/production at 40500","message_firing":"Prometheus canary alert for site4/production at 40500 retrieved 40518 prometheus 1567077324","summary":"Canary alert for site4/production at 40500"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"40500\"}"} client=172.16.X.X:48074
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:15:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:15:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"39600"},"annotations":{"description":"Canary alert for site4/production at 39600","message_firing":"Prometheus canary alert for site4/production at 39600 retrieved 40458 prometheus 1567077264","summary":"Canary alert for site4/production at 39600"},"startsAt":"2019-08-29T04:00:24.116517014-07:00","endsAt":"2019-08-29T04:15:24.116517014-07:00","generatorURL":"http://observability-prom-b7a157f.site4.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site4","ts_rounded":"39600"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-b7a157f.site4.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site4","ts_rounded":"39600"},"commonAnnotations":{"description":"Canary alert for site4/production at 39600","message_firing":"Prometheus canary alert for site4/production at 39600 retrieved 40458 prometheus 1567077264","summary":"Canary alert for site4/production at 39600"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"39600\"}"} client=172.16.X.X:48074
Aug 29 04:16:54 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:16:54-07:00] HTTP request received client=10.48.X.X:35546 method=POST url=/
Aug 29 04:16:54 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:16:54-07:00] Body body={"receiver":"canary","status":"firing","alerts":[{"status":"firing","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"annotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"startsAt":"2019-08-29T04:00:24.116517014-07:00","endsAt":"0001-01-01T00:00:00Z","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"39600"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"commonAnnotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"externalURL":"http://observability-alert-871f98c.site3.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"39600\"}"} client=10.48.X.X:35546
Aug 29 04:17:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:17:24-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:17:24 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:17:24-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"39600"},"annotations":{"description":"Canary alert for site3/production at 39600","message_firing":"Prometheus canary alert for site3/production at 39600 retrieved 40449 prometheus 1567077264","summary":"Canary alert for site3/production at 39600"},"startsAt":"2019-08-29T04:00:24.116517014-07:00","endsAt":"2019-08-29T04:15:24.116517014-07:00","generatorURL":"http://observability-prom-c6efba1.site3.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site3","ts_rounded":"39600"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-c6efba1.site3.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site3","ts_rounded":"39600"},"commonAnnotations":{"description":"Canary alert for site3/production at 39600","message_firing":"Prometheus canary alert for site3/production at 39600 retrieved 40449 prometheus 1567077264","summary":"Canary alert for site3/production at 39600"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"39600\"}"} client=172.16.X.X:48074
Aug 29 04:17:26 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:17:26-07:00] HTTP request received client=172.16.X.X:48074 method=POST url=/
Aug 29 04:17:26 ops-default-46e3d7b.site5.github.net ./scratch[9825]: INFO[2019-08-29T04:17:26-07:00] Body body={"receiver":"canary","status":"resolved","alerts":[{"status":"resolved","labels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"annotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"startsAt":"2019-08-29T04:00:24.116517014-07:00","endsAt":"2019-08-29T04:15:24.116517014-07:00","generatorURL":"http://observability-prom-a5a5790.site1.github.net:9090/graph?g0.expr=observer_current_timestamp+%3E%3D+0\u0026g0.tab=1"}],"groupLabels":{"env":"production","instance":"observability-observer-production.service.site1.github.net:443","site":"site1","ts_rounded":"39600"},"commonLabels":{"alertname":"ObservabilityCanary","env":"production","exported_env":"prod","exported_site":"cp1","host":"observability-prom-a5a5790.site1.github.net","instance":"observability-observer-production.service.site1.github.net:443","job":"canary","service":"prometheus","severity":"sev1","site":"site1","ts_rounded":"39600"},"commonAnnotations":{"description":"Canary alert for site1/production at 39600","message_firing":"Prometheus canary alert for site1/production at 39600 retrieved 40452 prometheus 1567077264","summary":"Canary alert for site1/production at 39600"},"externalURL":"http://observability-alert-152ce57.site1.github.net:9093","version":"4","groupKey":"{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"39600\"}"} client=172.16.X.X:48074
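As an aside, when comparing the logged bodies above by hand it helps to flatten them into one line per alert. The sketch below is a throwaway helper, not part of the setup described here: it assumes a deliberately minimal struct covering only the fields being compared (receiver, status, groupKey, and each alert's status/startsAt/endsAt plus the site label), not the full version=4 webhook payload.

// flatten.go - minimal sketch: read one logged webhook body from stdin and
// print the fields used to line up notifications from the different peers.
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"time"
)

// alert maps only the per-alert fields compared here.
type alert struct {
	Status   string            `json:"status"`
	Labels   map[string]string `json:"labels"`
	StartsAt time.Time         `json:"startsAt"`
	EndsAt   time.Time         `json:"endsAt"`
}

// message is a minimal view of the version=4 webhook body shown in the logs.
type message struct {
	Receiver string  `json:"receiver"`
	Status   string  `json:"status"`
	GroupKey string  `json:"groupKey"`
	Alerts   []alert `json:"alerts"`
}

func main() {
	var m message
	if err := json.NewDecoder(os.Stdin).Decode(&m); err != nil {
		fmt.Fprintln(os.Stderr, "decode:", err)
		os.Exit(1)
	}
	fmt.Printf("%s %s %s\n", m.Status, m.Receiver, m.GroupKey)
	for _, a := range m.Alerts {
		fmt.Printf("  %-8s site=%s startsAt=%s endsAt=%s\n",
			a.Status, a.Labels["site"],
			a.StartsAt.Format(time.RFC3339), a.EndsAt.Format(time.RFC3339))
	}
}

Piping one of the logged bodies through it prints the group-level status/groupKey followed by one line per alert with its startsAt/endsAt, which makes it easier to line up what each alertmanager peer sent for the same canary window.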
Alertmanager debug log from the 10.48.x.x node (observability-alert-871f98c.site3.github.net):
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.117Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=10.48.25.49:44682
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.118Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) (len=2 cap=4) {\n (*types.Alert)(0xc00030e620)(ObservabilityCanary[6e0f958][resolved]),\n (*types.Alert)(0xc00030e690)(ObservabilityCanary[1477c88][active])\n}\n" valid_alerts="([]*types.Alert) (len=2 cap=2) {\n (*types.Alert)(0xc00030e620)(ObservabilityCanary[6e0f958][resolved]),\n (*types.Alert)(0xc00030e690)(ObservabilityCanary[1477c88][active])\n}\n"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.118Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[6e0f958][resolved]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.118Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[1477c88][active]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.118Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"40500\"}" msg=flushing alerts=[ObservabilityCanary[1477c88][active]]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.123Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"39600\"}" msg=flushing alerts=[ObservabilityCanary[7076047][active]]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=error ts=2019-08-29T11:15:24.123Z caller=dispatch.go:443 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site3\", ts_rounded=\"39600\"}" msg="failed to get alert" err="alert not found"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.131Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=172.16.41.15:34068
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.131Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) (len=2 cap=4) {\n (*types.Alert)(0xc0004e20e0)(ObservabilityCanary[7076047][resolved]),\n (*types.Alert)(0xc0004e23f0)(ObservabilityCanary[10e126f][active])\n}\n" valid_alerts="([]*types.Alert) (len=2 cap=2) {\n (*types.Alert)(0xc0004e20e0)(ObservabilityCanary[7076047][resolved]),\n (*types.Alert)(0xc0004e23f0)(ObservabilityCanary[10e126f][active])\n}\n"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.131Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[7076047][resolved]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.131Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[10e126f][active]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.132Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site1\", ts_rounded=\"40500\"}" msg=flushing alerts=[ObservabilityCanary[10e126f][active]]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.168Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=10.42.128.84:51772
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.169Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) (len=2 cap=4) {\n (*types.Alert)(0xc00016a9a0)(ObservabilityCanary[cdc6ba6][resolved]),\n (*types.Alert)(0xc00016aa80)(ObservabilityCanary[4b6f8f8][active])\n}\n" valid_alerts="([]*types.Alert) (len=2 cap=2) {\n (*types.Alert)(0xc00016a9a0)(ObservabilityCanary[cdc6ba6][resolved]),\n (*types.Alert)(0xc00016aa80)(ObservabilityCanary[4b6f8f8][active])\n}\n"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.169Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[cdc6ba6][resolved]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.169Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ObservabilityCanary[4b6f8f8][active]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.169Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"40500\"}" msg=flushing alerts=[ObservabilityCanary[4b6f8f8][active]]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.173Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}/{ts_rounded=~\"^(?:.+)$\"}:{env=\"production\", instance=\"observability-observer-production.service.site1.github.net:443\", site=\"site4\", ts_rounded=\"39600\"}" msg=flushing alerts=[ObservabilityCanary[cdc6ba6][resolved]]
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc001557e40)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:244036388 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:244036388 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:244036388 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:244036388 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc001557fc0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:260555231 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:260555231 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:260555231 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:260555231 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.347Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc0005c6180)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:148461074 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:148461074 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:148461074 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:148461074 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc0005c62c0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:150717058 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:150717058 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:150717058 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:150717058 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc0005c6500)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:331828434 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:331828434 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.348Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:331828434 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:331828434 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.358Z caller=dispatch.go:432 component=dispatcher aggrGroup="{}:{instance=\"observability-prom-a5a5790.site1.github.net\"}" msg=flushing alerts="[PrometheusQueryDurationSLOViolation[f56f211][active] PrometheusQueryDurationSLOViolation[baa42b5][active]]"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.457Z caller=api.go:209 component=api component=http msg="http request received" method=GET url=/-/ready remote=127.0.0.1:58136
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.527Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.527Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc00055ca00)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:171483707 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:171483707 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.527Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:171483707 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:171483707 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.527Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc00055cc00)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:208278699 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:208278699 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:208278699 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:208278699 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc00055ce40)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:285000296 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:285000296 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:285000296 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:285000296 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc00055d000)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:303646581 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:303646581 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:303646581 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:303646581 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.528Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc00055d200)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:331828434 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:331828434 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc00055d3c0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:227434412 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:227434412 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:227434412 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:227434412 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc00055d540)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:244036388 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:244036388 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc00055d640)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:260555231 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:260555231 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc00055d780)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:148461074 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:148461074 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.547Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.548Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc00055d880)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:150717058 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:150717058 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.610Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=172.16.26.15:25582
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.610Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) {\n}\n" valid_alerts="([]*types.Alert) {\n}\n"
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.727Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc000860c40)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:331828434 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:331828434 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc000860d80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:244036388 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:244036388 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc000861280)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:260555231 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:260555231 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc000861340)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:148461074 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:148461074 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.728Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc0008614c0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:150717058 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:150717058 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.747Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.747Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc000861800)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:160387553 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:160387553 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.747Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:160387553 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:160387553 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.747Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.747Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc000861a00)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:171483707 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:171483707 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc000861b80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:208278699 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:208278699 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc000861d00)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:285000296 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:285000296 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.748Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc0004b6080)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:303646581 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:303646581 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.927Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc001412c80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:227434412 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:227434412 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc001412dc0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:244036388 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:244036388 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc001412e80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:260555231 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:260555231 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc001412f80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:160387553 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:160387553 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.928Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc001413040)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:171483707 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:171483707 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc001413400)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:158708728 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:158708728 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:158708728 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:158708728 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc001413580)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:200539637 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:200539637 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:200539637 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:200539637 > "
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc001413700)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:206447300 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:206447300 > )\n}\n" err=null
Aug 29 04:15:24 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:24.949Z caller=nflog.go:542 component=nflog msg="gossiping new entry" entry="entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:206447300 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:206447300 > "
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.127Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc001413f00)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:208278699 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:208278699 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc001413fc0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:285000296 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:285000296 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc0015740c0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:303646581 > firing_alerts:5230837512232136682 > expires_at:<seconds:1567509324 nanos:303646581 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}:canary/webhook/0\": (*nflogpb.MeshEntry)(0xc0015742c0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site4\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" > timestamp:<seconds:1567077324 nanos:331828434 > resolved_alerts:8472568677447878230 > expires_at:<seconds:1567509324 nanos:331828434 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.128Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=176) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/webhook/1\": (*nflogpb.MeshEntry)(0xc001574380)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"webhook\" idx:1 > timestamp:<seconds:1567077324 nanos:148461074 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:148461074 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.185Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=172.19.14.33:37060
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.185Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) (len=1 cap=4) {\n (*types.Alert)(0xc0004e2a80)(ThanosBucketCompaction[f82c46b][active])\n}\n" valid_alerts="([]*types.Alert) (len=1 cap=1) {\n (*types.Alert)(0xc0004e2a80)(ThanosBucketCompaction[f82c46b][active])\n}\n"
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.186Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ThanosBucketCompaction[f82c46b][active]
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.186Z caller=api.go:209 component=api component=http msg="http request received" method=POST url=/api/v1/alerts remote=172.19.39.179:36108
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.186Z caller=api.go:456 component=api version=v1 msg="v1 alerts received" alerts="([]*types.Alert) (len=1 cap=4) {\n (*types.Alert)(0xc00030ec40)(ThanosBucketCompaction[e471eba][active])\n}\n" valid_alerts="([]*types.Alert) (len=1 cap=1) {\n (*types.Alert)(0xc00030ec40)(ThanosBucketCompaction[e471eba][active])\n}\n"
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.186Z caller=dispatch.go:104 component=dispatcher msg="Received alert" alert=ThanosBucketCompaction[e471eba][active]
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.327Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.328Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc000e71940)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"39600\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:158708728 > resolved_alerts:5964970159396630185 > expires_at:<seconds:1567509324 nanos:158708728 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.328Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.328Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc000e71ac0)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site1\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:200539637 > firing_alerts:7397217989310138163 > expires_at:<seconds:1567509324 nanos:200539637 > )\n}\n" err=null
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.328Z caller=delegate.go:160 component=cluster msg="update received" key=nfl
Aug 29 04:15:25 observability-alert-871f98c.site3.github.net alertmanager[17495]: level=debug ts=2019-08-29T11:15:25.328Z caller=nflog.go:526 component=nflog msg="state received" state="(nflog.state) (len=1) {\n (string) (len=174) \"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}:canary/slack/0\": (*nflogpb.MeshEntry)(0xc000e71b80)(entry:<group_key:\"{}/{ts_rounded=~\\\"^(?:.+)$\\\"}:{env=\\\"production\\\", instance=\\\"observability-observer-production.service.site1.github.net:443\\\", site=\\\"site3\\\", ts_rounded=\\\"40500\\\"}\" receiver:<group_name:\"canary\" integration:\"slack\" > timestamp:<seconds:1567077324 nanos:206447300 > firing_alerts:88340381542000293 > expires_at:<seconds:1567509324 nanos:206447300 > )\n}\n" err=null