Question

I was looking at this question and decided to try using bind variables. I used:

sql = 'insert into abc2 (intfield,textfield) values (%s,%s)'
a = time.time()
for i in range(10000):
    #just a wrapper around cursor.execute
    db.executeUpdateCommand(sql,(i,'test'))

db.commit()

and

sql = 'insert into abc2 (intfield,textfield) values (%(x)s,%(y)s)'
for i in range(10000):
    db.executeUpdateCommand(sql,{'x':i,'y':'test'})

db.commit()

Looking at the time taken for the two runs above, there doesn't seem to be much difference; in fact, the second one takes longer. Can someone correct me if I've made a mistake somewhere? I'm using psycopg2 here.
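
For reference, db.executeUpdateCommand is just a thin wrapper around cursor.execute, so without the wrapper the first test is roughly this (connection details are placeholders):

import time
import psycopg2

conn = psycopg2.connect('dbname=test')  # placeholder connection string
cur = conn.cursor()

sql = 'insert into abc2 (intfield,textfield) values (%s,%s)'
a = time.time()
for i in range(10000):
    cur.execute(sql, (i, 'test'))
conn.commit()
print(time.time() - a)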


Solution

The two queries are equivalent in PostgreSQL.

"Bind" is Oracle lingo. When you use it, the query plan is saved so the next execution is a little faster. PREPARE does the same thing in PostgreSQL.

http://www.postgresql.org/docs/current/static/sql-prepare.html
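
The link above shows the syntax. Even though psycopg2 doesn't prepare statements for you, you can issue PREPARE/EXECUTE yourself. A minimal sketch, with the connection string as a placeholder and the table from the question:

import psycopg2

conn = psycopg2.connect('dbname=test')  # placeholder connection string
cur = conn.cursor()

# PREPARE parses and plans the statement once; EXECUTE reuses that plan.
cur.execute("PREPARE ins (int, text) AS "
            "INSERT INTO abc2 (intfield, textfield) VALUES ($1, $2)")
for i in range(10000):
    cur.execute("EXECUTE ins (%s, %s)", (i, 'test'))

cur.execute("DEALLOCATE ins")  # prepared statements otherwise live until the session ends
conn.commit()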

psycopg2 supports an internal 'bind' (client-side parameter interpolation), not PREPARE, in cursor.executemany() and cursor.execute().

(But don't call it "bind" to Postgres people. Call it "prepare" or they may not know what you mean. :)
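
Because the binding is client-side, cursor.mogrify() will show you the exact statement psycopg2 sends to the server, with the values already interpolated. A quick sketch, assuming an open cursor cur:

print(cur.mogrify("insert into abc2 (intfield, textfield) values (%s, %s)",
                  (42, 'test')))
# prints something like: b"insert into abc2 (intfield, textfield) values (42, 'test')"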

OTHER TIPS


IMPORTANT UPDATE: I've looked through the source of all the Python libraries for connecting to PostgreSQL in the FreeBSD ports, and I can say that only py-postgresql does real prepared statements! But it is Python 3+ only.

py-pg_queue is also a neat little library implementing the official DB protocol (Python 2.4+).
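
To give an idea of what real prepared statements look like with py-postgresql mentioned above, here is a rough sketch (the connection URL is a placeholder and I haven't benchmarked it):

import postgresql

db = postgresql.open('pq://user:password@localhost/test')  # placeholder URL

# prepare() creates a server-side prepared statement; calling the returned
# object executes it with the bound parameters $1 and $2.
ins = db.prepare('INSERT INTO abc2 (intfield, textfield) VALUES ($1, $2)')
with db.xact():
    for i in range(10000):
        ins(i, 'test')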


You've missed the answer to that question: use prepared statements as much as possible. "Bound variables" are the better form of this; let's see:

sql_q = 'insert into abc (intfield, textfield) values (?, ?)'    # common form
sql_b = 'insert into abc2 (intfield, textfield) values (:x, :y)' # needs driver and db support

so your test should be this:

sql = 'insert into abc2 (intfield, textfield) values (:x, :y)'
for i in range(10000):
    # keyword binding like this needs a driver that supports the named paramstyle
    cur.execute(sql, x=i, y='test')

or this:

def _data(n):
    for i in range(n):
        yield (i, 'test')

sql = 'insert into abc2 (intfield, textfield) values (?, ?)'
cur.executemany(sql, _data(10000))

and so on.
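
With psycopg2 specifically, the paramstyle is %s / %(name)s rather than ? or :name, so the executemany() version would look roughly like this (assuming an open cursor cur and connection conn):

# psycopg2 uses the format/pyformat paramstyles, not qmark or named
sql = 'insert into abc2 (intfield, textfield) values (%s, %s)'
cur.executemany(sql, [(i, 'test') for i in range(10000)])
conn.commit()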

UPDATE: I've just found an interesting recipe for transparently replacing SQL queries with prepared statements while keeping the %(name)s placeholder style.
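
The recipe itself isn't reproduced here, but the general idea might look something like this rough sketch: wrap the cursor, rewrite %(name)s placeholders to $1, $2, ..., and PREPARE each distinct statement once (illustrative only; assumes each named parameter appears once per query):

import re

class PreparingCursor(object):
    """Sketch: turn %(name)s queries into PREPARE/EXECUTE behind execute()."""

    def __init__(self, cursor):
        self._cur = cursor
        self._prepared = {}  # sql -> (statement name, ordered parameter names)

    def execute(self, sql, params):
        if sql not in self._prepared:
            names = re.findall(r'%\((\w+)\)s', sql)
            stmt_name = 'ps%d' % len(self._prepared)
            body = re.sub(r'%\((\w+)\)s',
                          lambda m: '$%d' % (names.index(m.group(1)) + 1),
                          sql)
            self._cur.execute('PREPARE %s AS %s' % (stmt_name, body))
            self._prepared[sql] = (stmt_name, names)
        stmt_name, names = self._prepared[sql]
        placeholders = ', '.join(['%s'] * len(names))
        self._cur.execute('EXECUTE %s (%s)' % (stmt_name, placeholders),
                          [params[n] for n in names])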

As far as I know, psycopg2 has never supported server-side parameter binding ("bind variables" in Oracle parlance). Current versions of PostgreSQL do support it at the protocol level using prepared statements, but only a few connector libraries make use of it; the Postgres wiki notes this. Here are some connectors you might want to try (I haven't used them myself):

As long as you're using DB-API calls, you probably ought to consider cursor.executemany() instead of repeatedly calling cursor.execute().

Also, binding parameters to their query in the server (instead of in the connector) is not always going to be faster in PostgreSQL. Note this FAQ entry.

Licensed under: CC-BY-SA with attribution