Question

Let's say I have a model like this

class Foo(db.Model):
    id = db.StringProperty()
    bar = db.StringProperty()
    baz = db.StringProperty()

And I'm doing a GqlQuery like this

foos = db.GqlQuery("SELECT * FROM Foo")

I want to take the results of the GqlQuery and turn them into some sort of JSON string that I can manipulate from different languages.


Here's how I'm doing it now

  1. Add a method to the Foo class that converts it into a dictionary

    def toDict(self):
        return {
            'id': self.id,
            'bar': self.bar,
            'baz': self.baz
        }
    
  2. Loop through the GqlQuery results and manually add each Foo instance to a dictionary

    fooDict = {}
    
    for foo in foos:
        fooDict[foo.id] = foo.toDict()
    
    return simplejson.dumps(fooDict)
    

My approach above works but it feels kind of gross.

Is there a cleaner, more "Pythonic" way to handle this?

The end format doesn't have to be exactly what I've done above. It just has to be something that converts nicely to JSON so I can deal with it from Javascript/PHP/whatever.

Solution

Take a look at google.appengine.api.datastore. It's the lower-level datastore API that google.appengine.ext.db builds on, and it returns Entity objects, which subclass dict. You can query it using GQL with google.appengine.ext.gql, or (my personal preference) use the Query class, which avoids the need to construct text strings for the GQL parser to parse. The Query class in api.datastore behaves exactly like the one documented here (but returns the lower-level Entity objects instead of Model instances).

As an example, your query above can be reformulated as datastore.Query("Foo").all().
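
For instance, a rough sketch of that approach (the fetch limit, the use of Get(), and the simplejson import are assumptions; adjust to however you already load simplejson):

from google.appengine.api import datastore
import simplejson  # or however simplejson is already imported in your app

# Entity objects subclass dict, so each result can be handed straight to
# the JSON encoder without a toDict() method. String properties serialize
# as-is; keys, dates, etc. would still need converting first.
entities = datastore.Query("Foo").Get(1000)  # Get() takes a result limit
fooDict = dict((entity['id'], dict(entity)) for entity in entities)
return simplejson.dumps(fooDict)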

OTHER TIPS

I can't do too much better than that, but here are a couple of ideas:

class Foo(db.Model):
    id = db.StringProperty() # etc.
    json_attrs = 'id bar baz'.split()

# Depending on how easy it is to identify string properties, there
# might also be a way to assign json_attrs programmatically after the
# definition of Foo, like this
Foo.json_attrs = [attr for attr in dir(Foo)
                  if isStringProperty(getattr(Foo, attr))]

fooDict = dict((foo.id, dict((attr, getattr(foo, attr))
                             for attr in Foo.json_attrs))
               for foo in foos)

http://code.google.com/p/google-app-engine-samples/source/browse/trunk/geochat/json.py?r=55

The encoder method there will solve your GQL-to-JSON needs nicely. I'd recommend getting rid of some of the excessive datetime options... output time as an epoch? Really?
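
If you prefer to stay with the db.Model layer, a minimal sketch of the same idea is a JSONEncoder subclass that knows how to turn model instances and datetimes into JSON-safe values (the class and variable names here are illustrative, not the actual geochat code):

import datetime
import simplejson  # on App Engine this is often: from django.utils import simplejson
from google.appengine.ext import db

class ModelEncoder(simplejson.JSONEncoder):
    """Illustrative encoder: converts db.Model instances into plain dicts
    and datetime values into ISO strings so simplejson can emit them."""
    def default(self, obj):
        if isinstance(obj, db.Model):
            return dict((name, getattr(obj, name))
                        for name in obj.properties())
        if isinstance(obj, (datetime.datetime, datetime.date)):
            return obj.isoformat()
        return simplejson.JSONEncoder.default(self, obj)

foos = db.GqlQuery("SELECT * FROM Foo")
json_text = simplejson.dumps(dict((foo.id, foo) for foo in foos),
                             cls=ModelEncoder)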

You can use web2py on GAE and do:

db.define_table('foo', SQLField('bar'), SQLField('baz'))
rows = db(db.foo.id > 0).select()
### rows is a list, rows.response is a list of tuples
for row in rows: print dict(row)

It runs on Oracle, PostgreSQL, MSSQL, MySQL, etc. too.
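
Carried through to JSON, that looks roughly like this (a sketch, assuming the dict(row) conversion shown above yields JSON-safe values):

import simplejson

rows = db(db.foo.id > 0).select()
# Each Row converts to a plain dict, so the whole result set can be dumped.
json_text = simplejson.dumps([dict(row) for row in rows])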

Licensed under: CC-BY-SA with attribution