multiprocessing - How to write a pipe object to a file in Python?


I have a requirement where I have to create a pipe for use in a process, say P1, and store P1's pipe information somewhere (maybe a file?) so that another process, say P2, can read the information from that place. I thought about using pickling, but it does not seem to work. I am facing the following issue.

  Python 2.7.6 (default, Mar 22 2014, 22:56:56)
  [GCC 4.8.2] on linux2
  Type "help", "copyright", "credits" or "license" for more information.
  >>> import pickle
  >>> from multiprocessing import Pipe
  >>> p1 = Pipe(False)
  >>> fp = open("sample.pkl", 'w')
  >>> with fp:
  ...     pickle.dump(p1, fp)
  ...
  Traceback (most recent call last):
    File "<stdin>", line 2, in <module>
    File "/usr/lib/python2.7/pickle.py", line 1370, in dump
      Pickler(file, protocol).dump(obj)
    File "/usr/lib/python2.7/pickle.py", line 224, in dump
      self.save(obj)
    File "/usr/lib/python2.7/pickle.py", line 286, in save
      f(self, obj) # Call unbound method with explicit self
    File "/usr/lib/python2.7/pickle.py", line 562, in save_tuple
      save(element)
    File "/usr/lib/python2.7/pickle.py", line 306, in save
      rv = reduce(self.proto)
    File "/usr/lib/python2.7/copy_reg.py", line 70, in _reduce_ex
      raise TypeError, "can't pickle %s objects" % base.__name__
  TypeError: can't pickle Connection objects
  >>>
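
To make the intent clearer, this is roughly the two-process flow I had in mind (the file name is only an example); the dump step is where it breaks:

  # intended flow, does not work as-is
  import pickle
  from multiprocessing import Pipe

  # --- in P1 ---
  recv_end, send_end = Pipe(False)            # one-way pipe, as in the session above
  with open("sample.pkl", "wb") as fp:
      pickle.dump(recv_end, fp)               # fails: TypeError: can't pickle Connection objects
  send_end.send("data for P2")

  # --- in P2, started independently ---
  with open("sample.pkl", "rb") as fp:
      recv_end = pickle.load(fp)              # never reached, the dump above already fails
  print recv_end.recv()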

I cannot simply hand this information to process P2 directly, because P2 is completely asynchronous with respect to P1 and no ordering between the two is guaranteed (by design). Any suggestions on how I can overcome this pickling issue, or on another way to achieve this?
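
One direction I have been wondering about (not sure if it is the right approach): instead of pickling the Connection object itself, have P1 publish a connection address that P2 can read from a file whenever it starts, using multiprocessing.connection.Listener and Client. A rough sketch, where the address file and authkey are just placeholders:

  import sys
  from multiprocessing.connection import Listener, Client

  ADDRESS_FILE = "pipe_address.txt"     # placeholder path known to both processes

  def run_p1():
      # P1 owns the listening end and publishes its address to a file.
      listener = Listener(("localhost", 0), authkey="secret")  # port 0: the OS picks a free port
      with open(ADDRESS_FILE, "w") as fp:
          fp.write("%s:%d" % listener.address)                 # e.g. "127.0.0.1:54321"
      conn = listener.accept()                                 # blocks until P2 connects
      print "P1 received:", conn.recv()
      conn.close()
      listener.close()

  def run_p2():
      # P2 runs independently and possibly much later; it reads the address back.
      host, port = open(ADDRESS_FILE).read().rsplit(":", 1)
      conn = Client((host, int(port)), authkey="secret")
      conn.send("hello from P2")
      conn.close()

  if __name__ == "__main__":
      # e.g. "python pipe_demo.py p1" in one terminal, then "python pipe_demo.py p2" in another
      run_p1() if "p1" in sys.argv else run_p2()

Here only the address goes through the file, not the Connection object, and P2 can be launched at any later time as long as P1 is still listening. Would something along these lines be the recommended way, or is there a better pattern?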
