As per the docs, the rate() function calculates the per-second average rate of increase. Our scrape interval is 15s. In the rate graph, we see occasional datapoints at 1.4. How can they be 1.4/s when the latency hasn't increased anywhere near 21 (1.4 * 15)? And why is the rate close to 0 most of the time, even when the latency itself is considerably higher than 0? If I increase the rate interval to 1h, I get a more accurate value, but it misses the occasional spikes.
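For concreteness, the two variants being compared would look roughly like this in PromQL. The metric name and the short range window here are placeholders, not the actual query:

```
# Short window: picks up the occasional spikes, but the values look noisy
# (hypothetical metric name and window)
rate(http_request_duration_seconds_sum[1m])

# 1h window: smoother and closer to the long-term average,
# but the short-lived spikes get averaged away
rate(http_request_duration_seconds_sum[1h])
```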