asynchronous - Python asyncio: Task got bad yield
I am confused about how to play around with the asyncio module in Python 3.4. I have a searching API for a search engine, and I want each search request to run either in parallel or asynchronously, so that I don't have to wait for one search to finish before starting another.

Here is the high-level searching API that builds objects from the raw search results. The search engine itself doesn't use any kind of asyncio mechanism, so I won't bother with that part.
```python
# no asyncio module is used here
class Search(object):
    ...
    self.s = SomeSearchEngine()
    ...
    def searching(self, *args, **kwargs):
        ret = {}
        # do the raw searching according to args and kwargs, and build the wrapped results
        ...
        return ret
```
To try out async requests, I wrote the following test case to see how I can interact with the asyncio module.
```python
# here is the testing script
@asyncio.coroutine
def handle(f, *args, **kwargs):
    r = yield f(*args, **kwargs)
    return r

s = Search()
loop = asyncio.get_event_loop()
loop.run_until_complete(handle(s.searching, arg1, arg2, ...))
loop.close()
```
Running this with pytest, it returns `RuntimeError: Task got bad yield: {results from searching...}` when it hits the line `r = yield ...`.
I also tried another way:
```python
# same handle as above
def handle(..):
    ....

s = Search()
loop = asyncio.get_event_loop()
tasks = [
    asyncio.async(handle(s.searching, arg11, arg12, ...)),
    asyncio.async(handle(s.searching, arg21, arg22, ...)),
    ...
]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()
```
Running this test case with pytest passes, but a weird exception is raised from the search engine, and it says `Future/Task exception was never retrieved`.
Things I wish to ask:

- For my 1st try, is it the right way to use `yield from`, by returning the actual result from the function call?
- I think I need to add a sleep to my 2nd test case to wait for the tasks to finish, but how should I do that? And how can I get the function calls to return in my 2nd test case?
- Is this a good way to implement asyncio with an existing module, by creating an async handler to handle the requests?
- If the answer to question 2 is no, does every client that calls the class `Search` need to include `loop = get_event_loop()` and that kind of stuff to make the requests async?
The problem is that you can't just call existing synchronous code as if it were an `asyncio.coroutine` and get asynchronous behavior. When you call `yield from searching(...)`, you're only going to get asynchronous behavior if `searching` itself is an `asyncio.coroutine`, or at least returns an `asyncio.Future`. Right now, `searching` is just a regular synchronous function, so calling `yield from searching(...)` is going to throw an error, because it doesn't return a `Future` or a coroutine.
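To make that failure concrete, here is a minimal, self-contained sketch. It uses the modern `async`/`await` spelling, which on current interpreters replaces Python 3.4's `@asyncio.coroutine`/`yield from`, and `blocking_search` is a hypothetical stand-in for your `searching` method. Awaiting the plain return value of a synchronous function fails, while handing the call to an executor works:

```python
import asyncio

def blocking_search(query):
    # hypothetical stand-in for the synchronous searching() method
    return {'query': query, 'results': ['a', 'b']}

async def handle_bad(f, *args):
    # mirrors `r = yield f(...)` from the question: f() runs to completion
    # synchronously, and then its plain dict return value is awaited
    return await f(*args)

async def handle_good(f, *args):
    # hand the blocking call off to the default thread-pool executor
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, f, *args)

try:
    asyncio.run(handle_bad(blocking_search, 'python'))
except TypeError as exc:
    # on Python 3.4 this surfaced as "RuntimeError: Task got bad yield";
    # current interpreters raise TypeError because a dict is not awaitable
    print('failed:', exc)

print(asyncio.run(handle_good(blocking_search, 'python')))
```

The only difference between the two handlers is whether the blocking function's result is awaited directly (wrong) or wrapped in something the event loop can actually wait on (right).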
To get the behavior you want, you'll need to have an asynchronous version of `searching` in addition to the synchronous version (or drop the synchronous version altogether if you don't need it). You have a few options to support both:
- Rewrite `searching` as an `asyncio.coroutine` that uses `asyncio`-compatible calls for its I/O, rather than blocking I/O. This will make it work in an `asyncio` context, but it means you won't be able to call it directly in a synchronous context anymore. Instead, you'd need to provide an alternative synchronous `searching` method that starts an `asyncio` event loop and calls `return loop.run_until_complete(self.searching(...))`. See this question for more details on that.
- Keep your synchronous implementation of `searching`, and provide an alternative asynchronous API that uses `BaseEventLoop.run_in_executor` to run your `searching` method in a background thread:

```python
class Search(object):
    ...
    self.s = SomeSearchEngine()
    ...
    def searching(self, *args, **kwargs):
        ret = {}
        ...
        return ret

    @asyncio.coroutine
    def searching_async(self, *args, **kwargs):
        loop = kwargs.get('loop', asyncio.get_event_loop())
        try:
            del kwargs['loop']  # assuming searching doesn't take a loop argument
        except KeyError:
            pass
        # passing None tells asyncio to use the default ThreadPoolExecutor
        r = yield from loop.run_in_executor(None, self.searching, *args)
        return r
```
Testing script:

```python
s = Search()
loop = asyncio.get_event_loop()
loop.run_until_complete(s.searching_async(arg1, arg2, ...))
loop.close()
```
This way, you can keep your synchronous code as it is, and at least provide methods that can be used in `asyncio` code without blocking the event loop. It's not as clean a solution as if you actually used asynchronous I/O throughout your code, but it's better than nothing.

- Provide two completely separate versions of `searching`: one that uses blocking I/O, and one that's `asyncio`-compatible. This gives ideal implementations for both contexts, but it requires twice the work.