Problem

I have written a program that bulk-inserts into Elasticsearch in batches of around 3000. The problem is that I need to convert these objects to JSON before executing the bulk insert request, and the JSON conversion has become the bottleneck of my whole computation.

Can anyone suggest a very fast way to convert an object to JSON in Java? My code looks like this:

  private String getESValueAsString(ElasticSearchValue elasticSearchValue) throws JsonProcessingException {
    ElasticSearchValue prevValue = null;
    if (stateType == StateType.OPAQUE) {
      prevValue = (ElasticSearchValue) elasticSearchValue.getPrevious();
    }

    elasticSearchValue.setPrevious(null);

    ObjectMapper om = new ObjectMapper();
    Map<String, Object> props = om.convertValue(elasticSearchValue, Map.class);

    if (stateType == StateType.OPAQUE) {
      props.put("previous", prevValue);
    }

    return om.writeValueAsString(props);
  }

Solution

Just found the issue: I was creating a new ObjectMapper for every serialization, and that was slowing down my whole processing. This is a very good guide, and following it improved my performance 100x:

http://wiki.fasterxml.com/JacksonBestPracticesPerformance
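The key point from that guide can be sketched as follows: Jackson's ObjectMapper is thread-safe once configured, so a single shared instance can serve every batch instead of being rebuilt per call. The class and method names below are illustrative, not from the original code:

```java
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class EsJson {
    // Built once and reused: ObjectMapper construction is expensive, but the
    // instance is thread-safe after configuration, so sharing it is safe.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static String toJson(Object value) throws JsonProcessingException {
        return MAPPER.writeValueAsString(value);
    }
}
```

The same reuse rule applies to `ObjectReader`/`ObjectWriter` instances obtained from the mapper, which the linked guide recommends for hot paths.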

Other tips

Why not insert JSON records into the BulkRequestBuilder in the first place? Something like this:

Client client = new TransportClient().addTransportAddress(new InetSocketTransportAddress("localhost", 9300));
BulkRequestBuilder bulk = client.prepareBulk();
.....
bulk.add(client.prepareIndex(<your index>, <your type>)
    .setSource(<your object>.toJson()));
....

and in the <your object> class

create Gson like this:

Gson gson = new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();

and method:

public String toJson(){
    return gson.toJson(this, <your class>.class);
}
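Putting the tip together, a minimal document class might look like the sketch below. The class and field names are illustrative; the point is that `excludeFieldsWithoutExposeAnnotation()` makes Gson serialize only fields marked `@Expose`, and that the Gson instance, like ObjectMapper, is thread-safe and should be built once and reused:

```java
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.annotations.Expose;

public class EsDocument {
    // One shared Gson instance; only @Expose-annotated fields are serialized.
    private static final Gson GSON =
        new GsonBuilder().excludeFieldsWithoutExposeAnnotation().create();

    @Expose private String id;      // included in the JSON output
    private String internalState;   // skipped: no @Expose annotation

    public EsDocument(String id, String internalState) {
        this.id = id;
        this.internalState = internalState;
    }

    public String toJson() {
        return GSON.toJson(this);
    }
}
```

The resulting string can be passed straight to `setSource(...)` on the bulk request, skipping the intermediate `Map` conversion from the question.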
License: CC-BY-SA with attribution
Not affiliated with Stack Overflow