I am trying to store a pickled object in a Google Cloud Storage bucket, as part of a machine learning pipeline [tutorial][1] provided by Google that I am following. I have reduced the code to a minimal example that still throws the same error; in my actual code, the object is a class instance.
```python
import tensorflow as tf
import dill as pickle

filename = 'gs://my-bucket/obj.pkl'  # placeholder path into my bucket
obj = {'foo': 'bar'}

with tf.io.gfile.GFile(filename, 'wb') as f:
    pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)
```
When I run this, I get:

```
TypeError: Expected binary or unicode string, got <memory at 0x7fdc66d7d7c0>
```
In the tutorial this step worked fine because it used TensorFlow 1 and `tf.gfile.Open()`, which was removed in TensorFlow 2 and replaced by the `tf.io.gfile.GFile()` call above. Simply using the built-in `open()` also works, but of course that doesn't help me write to a bucket. I also tried
```python
with tf.io.gfile.GFile(filename, 'wb') as f:
    f.write(obj)
```
but it returns the same error. Please let me know what I am doing wrong, or whether there is an alternative approach for storing a pickled object directly in a bucket.
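One workaround I am considering is to serialize the object to bytes in memory first and hand `GFile` a plain `bytes` value, rather than letting `pickle.dump` write its own chunks. An untested sketch (the `gs://my-bucket/obj.pkl` path is a placeholder):

```python
import tensorflow as tf
import dill as pickle

obj = {'foo': 'bar'}
filename = 'gs://my-bucket/obj.pkl'  # placeholder GCS path

# Serialize to a single bytes object, then write it in one call.
data = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
with tf.io.gfile.GFile(filename, 'wb') as f:
    f.write(data)
```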
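Alternatively, would it be cleaner to skip TensorFlow here and upload through the `google-cloud-storage` client? A sketch of what I have in mind, with placeholder bucket and blob names:

```python
from google.cloud import storage
import dill as pickle

obj = {'foo': 'bar'}

# Upload the pickled bytes straight to the bucket.
client = storage.Client()
blob = client.bucket('my-bucket').blob('obj.pkl')  # placeholder names
blob.upload_from_string(pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL))
```

Many thanks for your help!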
[1]: https://cloud.google.com/dataflow/docs/samples/molecules-walkthrough#overview