I get a stream of measurements and organize them into numpy arrays. The processing of the data can also be vectorized, so it feels like a good match.
I wonder how to elegantly and quickly stitch the incoming chunk arrays together into bigger ones. Is it efficient to drop data from my working-set numpy array (where I do the calculations) once I have processed it, or should I start over with a fresh working-set array every once in a while?
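For concreteness, the append-then-trim pattern I mean is roughly this (a minimal sketch; the `keep_last` window and the sum are placeholders for my real processing):

```python
import numpy as np

# Working-set array that grows with each incoming chunk and is trimmed
# after processing, as described above.
working = np.empty(0, dtype=np.float64)

def on_chunk(chunk, keep_last=4):
    """Concatenate a new chunk, run the (placeholder) vectorized
    computation, then drop samples that are no longer needed."""
    global working
    working = np.concatenate([working, chunk])
    result = working.sum()          # placeholder for the real calculation
    working = working[-keep_last:]  # drop already-processed samples
    return result
```

My concern is that `np.concatenate` copies the whole working set on every chunk, which is why I ask whether periodically starting over with a fresh array is the better approach.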
The chunks come in via MQTT. Do I need to number them and be ready to re-order the chunks, or can I blindly concatenate the incoming data?
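If numbering turns out to be necessary, the reordering I have in mind would look something like this sketch (assuming each chunk carries a hypothetical `seq` counter set by the publisher):

```python
class ChunkReorderer:
    """Buffer out-of-order chunks and release them in sequence order."""

    def __init__(self):
        self.pending = {}   # seq -> chunk, held until its turn comes
        self.next_seq = 0   # sequence number we expect to emit next

    def push(self, seq, chunk):
        """Store a chunk; return the list of chunks now ready in order."""
        self.pending[seq] = chunk
        ready = []
        while self.next_seq in self.pending:
            ready.append(self.pending.pop(self.next_seq))
            self.next_seq += 1
        return ready
```

The question is whether MQTT's delivery guarantees make this bookkeeping unnecessary in the first place.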
question from:
https://stackoverflow.com/questions/65920013/how-to-operate-efficiently-on-stream-of-numpy-arrays