The first argument to Value is typecode_or_type, which the docs define as:

typecode_or_type determines the type of the returned object: it is either a ctypes type or a one character typecode of the kind used by the array module. *args is passed on to the constructor for the type.

Emphasis mine. So you simply cannot put a pandas dataframe in a Value; it has to be a ctypes type.
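To make that concrete, here's a minimal sketch of what Value does accept - a ctypes type or an array-module style typecode - which is exactly why a dataframe won't fit:

from multiprocessing import Value
import ctypes

counter = Value('i', 0)              # 'i' is a typecode for a C int
total = Value(ctypes.c_double, 0.0)  # or pass a ctypes type directly
with counter.get_lock():             # Value provides a lock for safe concurrent updates
    counter.value += 1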
You could instead use a multiprocessing.Manager to serve your singleton dataframe instance to all of your processes. There are a few different ways to end up in the same place - probably the easiest is to just plop your dataframe into the manager's Namespace.
from multiprocessing import Manager
mgr = Manager()
ns = mgr.Namespace()
ns.df = my_dataframe
# now just give your processes access to ns, e.g. most simply:
# p = Process(target=worker, args=(ns, work_unit))
Now your dataframe instance is accessible to any process that gets passed a reference to the Manager. Or just pass a reference to the Namespace; it's cleaner.
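For example, a worker can read the shared dataframe straight off the namespace. Here's a minimal end-to-end sketch, with a hypothetical worker function and a stand-in dataframe where your real one would go:

from multiprocessing import Manager, Process
import pandas as pd

def worker(ns, work_unit):
    df = ns.df  # the namespace proxy fetches the dataframe from the manager process
    print(work_unit, len(df))

if __name__ == '__main__':
    my_dataframe = pd.DataFrame({'a': [1, 2, 3]})  # stand-in for your real dataframe
    mgr = Manager()
    ns = mgr.Namespace()
    ns.df = my_dataframe
    procs = [Process(target=worker, args=(ns, i)) for i in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

One caveat: each access to ns.df pulls a pickled copy from the manager process, so in-place mutations inside a worker won't be visible elsewhere unless you assign the modified dataframe back to ns.df.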
One thing I didn't/won't cover is events and signaling - if your processes need to wait for others to finish executing, you'll need to add that in. Here is a page with some Event examples, which also covers in a bit more detail how to use the manager's Namespace.
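If you do need that coordination, the manager can hand out an Event as well. A rough sketch of one process waiting on another, just to show the shape of it:

from multiprocessing import Manager, Process

def producer(ns, ready):
    ns.result = 42      # publish something on the shared namespace
    ready.set()         # signal that it's available

def consumer(ns, ready):
    ready.wait()        # block until the producer has signaled
    print(ns.result)

if __name__ == '__main__':
    mgr = Manager()
    ns = mgr.Namespace()
    ready = mgr.Event()
    procs = [Process(target=producer, args=(ns, ready)),
             Process(target=consumer, args=(ns, ready))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()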
(Note that none of this addresses whether multiprocessing is going to result in tangible performance benefits; this is just giving you the tools to explore that question.)