python - How to reuse / clone an sqlalchemy query -


It seems to me that going through the whole process of creating the expression tree and creating the query again is wasted time when using SQLAlchemy. Apart from the occasional dynamic query, the queries are the same during the whole life of the application (apart from the parameters, of course).

Is there a way to save a query once it's created and reuse it later with different parameters? Or maybe there's an internal mechanism that already does something similar?

It seems to me that going through the whole process of creating the expression tree and creating the query again is wasted time when using SQLAlchemy.

Do you have any estimates on how much time is wasted, compared to the rest of the application? Profiling here is extremely important before making your program more complex. Note that Reddit serves on the order of 1 billion page views a day and uses SQLAlchemy Core to query the database, and the last time I looked at their code they make no attempt to optimize this process - they build expression trees on the fly and compile each time. We have had users who determined that their specific system benefits from optimizations in these areas, however.

I've written some background on profiling here: How can I profile a SQLAlchemy powered application?
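As a rough illustration of what that measurement can look like, here is a minimal profiling sketch using the standard library's cProfile; the `session` and `Foo` names are placeholders for whatever Session and mapped class your application already has:

    import cProfile
    import pstats

    profiler = cProfile.Profile()
    profiler.enable()

    # Hypothetical workload: run the queries whose construction and
    # compilation overhead you want to measure.
    for i in range(100):
        session.query(Foo).filter(Foo.data == 'data %d' % i).all()

    profiler.disable()
    pstats.Stats(profiler).sort_stats('cumulative').print_stats(20)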

Is there a way to save a query once it's created and reuse it later with different parameters? Or maybe there's an internal mechanism that already does something similar?

There are several methods, depending on what APIs you're using and what areas you'd like to optimize.

There are two main portions to rendering SQL - there's the construction of the expression tree, so to speak, and the compilation of the string from the expression tree.
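To make the two phases concrete, here's a small sketch using the 1.x-era Core API that this answer assumes; the lightweight table()/column() constructs and the query shape are just illustrative:

    from sqlalchemy import bindparam
    from sqlalchemy.sql import column, select, table

    # A lightweight table construct, just enough to build an expression tree.
    foo = table('foo', column('id'), column('data'))

    # Phase 1: construct the expression tree; no SQL string exists yet.
    stmt = select([foo]).where(foo.c.data == bindparam('data'))

    # Phase 2: compile the tree into a (dialect-specific) SQL string.
    compiled = stmt.compile()
    print(compiled)          # roughly: SELECT foo.id, foo.data FROM foo WHERE foo.data = :data
    print(compiled.params)   # {'data': None}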

The tree itself, which can be either a select() construct if you're using Core or a Query() if you're using the ORM, can be reused. A select() has nothing associated with it that prevents it from being reused (and the same goes for insert(), delete(), update(), etc.).
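For example, the same insert() construct can be built once and executed any number of times; only the parameters change. A minimal sketch against an in-memory SQLite database, again using the 1.x-era API (the table and data are made up):

    from sqlalchemy import (Column, Integer, MetaData, String, Table,
                            create_engine)

    engine = create_engine('sqlite://')
    metadata = MetaData()
    foo = Table('foo', metadata,
                Column('id', Integer, primary_key=True),
                Column('data', String(50)))
    metadata.create_all(engine)

    # Build the insert() construct once...
    ins = foo.insert()

    # ...and reuse the same object for every execution; only the
    # parameter values differ between calls.
    with engine.connect() as conn:
        for value in ('data 1', 'data 2', 'data 3'):
            conn.execute(ins, {'data': value})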

In the ORM, a Query can be used with different sessions using the with_session() method. The win here is not as big, since the Query still produces the entire select() construct each time it is invoked. However, as we'll see below, there is a recipe that can allow this to be cached.
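A minimal sketch of that pattern, assuming the 1.x-era ORM API; the mapped class Foo, the engine, and the data values are illustrative:

    from sqlalchemy import Column, Integer, String, bindparam, create_engine
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import Query, sessionmaker

    Base = declarative_base()

    class Foo(Base):
        __tablename__ = 'foo'
        id = Column(Integer, primary_key=True)
        data = Column(String(50))

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    Session = sessionmaker(bind=engine)

    # Build the Query once, with no Session attached.
    base_query = Query(Foo).filter(Foo.data == bindparam('foo'))

    # Later, bind the same Query to whichever Session is current.
    s1, s2 = Session(), Session()
    rows_a = base_query.with_session(s1).params(foo='data 1').all()
    rows_b = base_query.with_session(s2).params(foo='data 2').all()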

The next level of optimization involves caching the actual SQL text that is rendered. This is an area where a little more care is needed, as the SQL we generate is specific to the target backend; there are also edge cases where various parameterizations change the SQL itself (such as using "TOP N ROWS" with SQL Server, where a bound parameter can't be used). Caching of SQL strings is provided using the execution_options() method of Connection, which is also available in a few other places, by setting the compiled_cache feature: you send a dictionary or other dict-like object which will cache the string form of statements, keyed on the dialect, the identity of the construct, and the parameters sent. This feature is used by the ORM for insert/update/delete statements.
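A minimal sketch of the compiled_cache option, again assuming the 1.x-era API; a plain dict serves as the cache, and the table/engine are just placeholders:

    from sqlalchemy import (Column, Integer, MetaData, String, Table,
                            bindparam, create_engine, select)

    engine = create_engine('sqlite://')
    metadata = MetaData()
    foo = Table('foo', metadata,
                Column('id', Integer, primary_key=True),
                Column('data', String(50)))
    metadata.create_all(engine)

    stmt = select([foo]).where(foo.c.data == bindparam('data'))

    # A plain dictionary acts as the compiled-statement cache; SQLAlchemy
    # keys it on dialect, construct identity and parameter keys.
    cache = {}
    with engine.connect() as conn:
        cached_conn = conn.execution_options(compiled_cache=cache)
        for value in ('data 1', 'data 2'):
            cached_conn.execute(stmt, {'data': value}).fetchall()

    # The cache now holds the compiled form of stmt; re-executing the same
    # construct skips the string-compilation step.
    print(len(cache))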

There's a recipe I've posted which integrates the compiled_cache feature with the Query, at BakedQuery. By taking a Query and saying query.bake(), you can run that query with a Session, and as long as you don't call any more chained methods on it, it should use the cached form of the SQL string each time:

    q = s.query(Foo).filter(Foo.data == bindparam('foo')).bake()

    for i in range(10):
        result = q.from_session(s).params(foo='data 12').all()

It's experimental and not used very often, but it's exactly what you're asking for here. I'd suggest you tailor it to your needs, keep an eye on memory usage as you use it, and make sure you follow how it works.

