Since no one has any thoughts on this question, I'll document the workaround I came up with. I was able to get the write ops down to 4 by changing the bulkload.yaml file to this:
...
transformers:
- kind: Word
  connector: csv
  connector_options:
    encoding: utf-8
    columns: from_header
  property_map:
    - property: __key__
      external_name: word
      export_transform: transform.key_id_or_name_as_string
    - property: otherlangs
      external_name: otherlangs
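Since `columns: from_header` tells the loader to take property names from the CSV's first row, the input file needs a header whose column names match the `external_name` entries. A file like this (the rows are just illustrative, matching the output further down) would work:

```
word,otherlangs
abalone,fr|ormeaux|it|orecchie di mare|
```

The `word` column is mapped to `__key__`, so it becomes the entity's key name rather than a regular property, which is what saves the indexed-property writes.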
and changing my class to look like this:
from google.appengine.ext import db

class Word(db.Model):
    word = db.StringProperty(multiline=False)
    otherlangs = db.StringProperty(multiline=True)

    def __str__(self):
        # key().name() holds the word itself; encode for printing
        return "word: " + self.key().name().encode('utf8') + \
               ", otherlangs: " + self.otherlangs.encode('utf8')
And now the write ops are 4, which is nice.

Querying in the interactive console got trickier; it took me a while to figure out:
import words

foo = words.Word.get_by_key_name('abalone')
print foo
which produces:
word: abalone, otherlangs: fr|ormeaux|it|orecchie di mare|
What I don't understand is how to dynamically add new words so that the word becomes the key, the way the bulk uploader does it for me. But I'm not going to worry about that too much; as long as the bulk loader can do it, that's good enough for the moment, I guess.
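(Later edit: it turns out the datastore lets you choose the key name yourself when constructing an entity, which should give the same word-as-key layout the bulk loader produces. A minimal sketch — the `'abalone'` values are just illustrations, and this only runs inside the App Engine runtime or its datastore stub:)

```python
from google.appengine.ext import db

class Word(db.Model):
    otherlangs = db.StringProperty(multiline=True)

# Passing key_name= makes the word itself the entity's key,
# just like the __key__ mapping in bulkload.yaml does on import.
w = Word(key_name='abalone', otherlangs='fr|ormeaux|')
w.put()

# get_or_insert does the same thing atomically: it fetches the
# entity with that key name, creating it first if it doesn't exist.
w = Word.get_or_insert('abalone', otherlangs='fr|ormeaux|')
```

Either way, `Word.get_by_key_name('abalone')` then finds the entity exactly as in the console example above.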