asynchronous - Python asyncio task got bad yield


I am confused about how to play around with the asyncio module in Python 3.4. I have a searching API for a search engine, and I want each search request to run either in parallel or asynchronously, so that I don't have to wait for one search to finish before starting another.

Here is the high-level searching API that builds objects from the raw search results. The search engine is using some kind of asyncio mechanism of its own, so I won't bother with that.

# No asyncio module used here
class Search(object):
    ...
    self.s = some_search_engine()
    ...
    def searching(self, *args, **kwargs):
        ret = {}
        # Raw searching according to args and kwargs, and build the wrapped results
        ...
        return ret

To try out async requests, I wrote the following test case to test how I can interact with that stuff using the asyncio module.

# Here is my testing script
@asyncio.coroutine
def handle(f, *args, **kwargs):
    r = yield from f(*args, **kwargs)
    return r

s = Search()
loop = asyncio.get_event_loop()
loop.run_until_complete(handle(s.searching, arg1, arg2, ...))
loop.close()

Running it with pytest, it returns RuntimeError: Task got bad yield: {results from searching...} when it hits the line r = yield from ....

I then tried another way.

# same handle as above
def handle(..):
    ....

s = Search()
loop = asyncio.get_event_loop()
tasks = [
    asyncio.async(handle(s.searching, arg11, arg12, ...)),
    asyncio.async(handle(s.searching, arg21, arg22, ...)),
    ...
]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()

Running this test case with pytest, it passes but with some weird exception raised from the search engine, and it says Future/Task exception was never retrieved.

Things I wish to ask:

  1. For the 1st try, is it the right way to use yield from, by returning the actual result from the function call?
  2. I think I need to add some sleep to my 2nd test case to wait for the tasks to finish, but how should I do that? And how can I get the returns from the function calls in my 2nd test case?
  3. Is this the way to implement asyncio with an existing module, by creating an async handler to handle the requests?
  4. If the answer to question 2 is no, does every client that calls the class Search need to include loop = get_event_loop() and this kind of stuff to make the requests asynchronous?

The problem is that you can't just call existing synchronous code as if it were an asyncio.coroutine and get asynchronous behavior. When you call yield from searching(...), you're only going to get asynchronous behavior if searching itself is an asyncio.coroutine, or at least returns an asyncio.Future. Right now, searching is a regular synchronous function, so calling yield from searching(...) is going to throw an error, because it doesn't return a Future or a coroutine.
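
To make that distinction concrete, here is a small self-contained sketch in the same Python 3.4 style as the rest of this post; the fake_search_async and fake_search_sync helpers are made up for illustration:

import asyncio

@asyncio.coroutine
def fake_search_async(query):
    # A real coroutine: the event loop can suspend here while waiting.
    yield from asyncio.sleep(0.1)
    return {'query': query, 'hits': []}

def fake_search_sync(query):
    # A plain synchronous function: returns a dict, not a Future or coroutine.
    return {'query': query, 'hits': []}

@asyncio.coroutine
def handle(f, *args):
    r = yield from f(*args)  # only works if f(*args) gives a coroutine or Future
    return r

loop = asyncio.get_event_loop()
print(loop.run_until_complete(handle(fake_search_async, 'python')))  # works
# loop.run_until_complete(handle(fake_search_sync, 'python'))  # raises an error
loop.close()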

To get the behavior you want, you'll need to have an asynchronous version of searching in addition to the synchronous version (or drop the synchronous version altogether if you don't need it). You have a few options to support both:

  1. Rewrite searching as an asyncio.coroutine that uses asyncio-compatible calls for its I/O, rather than blocking I/O. This will make it work in an asyncio context, but it means you won't be able to call it directly in a synchronous context anymore. Instead, you'd need to provide an alternative synchronous searching method that starts an asyncio event loop and calls return loop.run_until_complete(self.searching(...)). See this question for more details on that; a sketch of that layout is shown just below.
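
    A minimal Python 3.4-style sketch of that layout, with a hypothetical searching_sync wrapper and a placeholder coroutine body (both illustrative, not from the original code):

    import asyncio

    class Search(object):
        @asyncio.coroutine
        def searching(self, *args, **kwargs):
            # Placeholder: a real implementation would use asyncio-compatible
            # (non-blocking) calls for its I/O here.
            yield from asyncio.sleep(0.1)
            return {}

        def searching_sync(self, *args, **kwargs):
            # Synchronous entry point: runs the coroutine to completion
            # on an event loop.
            loop = asyncio.get_event_loop()
            return loop.run_until_complete(self.searching(*args, **kwargs))
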
  2. Keep your synchronous implementation of searching, and provide an alternative asynchronous API that uses BaseEventLoop.run_in_executor to run the searching method in a background thread:

    class Search(object):
        ...
        self.s = some_search_engine()
        ...
        def searching(self, *args, **kwargs):
            ret = {}
            ...
            return ret

        @asyncio.coroutine
        def searching_async(self, *args, **kwargs):
            loop = kwargs.get('loop', asyncio.get_event_loop())
            try:
                del kwargs['loop']  # assuming searching doesn't take a loop argument
            except KeyError:
                pass
            r = yield from loop.run_in_executor(None, self.searching, *args)  # Passing None tells asyncio to use the default ThreadPoolExecutor
            return r

    Testing script:

    s = Search()
    loop = asyncio.get_event_loop()
    loop.run_until_complete(s.searching_async(arg1, arg2, ...))
    loop.close()

    This way, you can keep your synchronous code as it is, and at least provide methods that can be used in asyncio code without blocking the event loop. It's not as clean a solution as if you actually used asynchronous I/O throughout your code, but it's better than nothing.
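
    As a side note on running several searches at once and getting their results back (the 2nd test case above), the searching_async coroutine can be driven with asyncio.gather, which waits for all of the coroutines and returns their results in order, so no manual sleep is needed. A sketch with made-up query arguments:

    s = Search()
    loop = asyncio.get_event_loop()
    results = loop.run_until_complete(asyncio.gather(
        s.searching_async('query one'),
        s.searching_async('query two'),
    ))
    # results is a list of the two return values, in the same order as the calls
    loop.close()
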

  3. Provide two separate versions of searching: one that uses blocking I/O, and one that's asyncio-compatible. This gives you ideal implementations for both contexts, but it requires twice the work.
