Question

I'm having some trouble mapping a byte array to a MySQL database in Hibernate and was wondering if I'm missing something obvious. My class looks roughly like this:

public class Foo {
    private byte[] bar;

    // Getter and setter for 'bar'
}

The table is defined like this in MySQL 5.5:

CREATE TABLE foo (
    bar BINARY(64) NOT NULL
);

And the Hibernate 3.6.2 mapping looks similar to this:

<hibernate-mapping>
    <class name="example.Foo" table="foo">
        <property name="bar" column="bar" type="binary" />
    </class>
</hibernate-mapping>

I am using hbm2ddl for validation only and it gives me this error when I deploy the application:

Wrong column type in foo for column bar. Found: binary, expected: tinyblob

If using type="binary" in the mapping wouldn't cause Hibernate to expect the column's type to be binary (instead of tinyblob), I don't know what would. I spent some time Googling this but couldn't find the exact error. The solutions for similar errors were to...

  1. Specify a "length" on the <property>. That changes which type Hibernate expects, but it is always some variety of blob rather than the "binary" type it actually finds.
  2. Instead of declaring a "type" on the property element, nest a column element and give it a sql-type attribute. That works, but it would also make the mapping specific to MySQL, so I would like to avoid it if possible.

Does anything stand out about this setup that would cause this mismatch? If I specify type="binary" instead of "blob", why is Hibernate expecting a blob instead of a binary?

Solution

I believe the problem is type="binary".

That type is a generic Hibernate type; it does not map directly to DB-engine-specific types. Generic types are translated to different SQL types depending on the dialect you are using, and apparently the MySQL dialect maps the Hibernate type "binary" to tinyblob.

The full list of Hibernate types is available in the Hibernate documentation.

You have two options. The first is to change your CREATE TABLE script so the column is declared with a TINYBLOB data type; then the Hibernate validation will not fail and your application will work. This is the suggested solution.

The second option should be used only if you HAVE to use the BINARY data type in the DB. In that case you can specify a sql-type in the Hibernate mapping to force Hibernate to use the type you want. The mapping would look like this:

<property name="bar">
  <column name="bar" sql-type="binary" />
</property>

The main downside is that you lose DB-engine independence, which is why most people use Hibernate in the first place; this mapping will only work on DB engines that have a BINARY data type.
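
If you are using annotations rather than hbm.xml, the counterpart of sql-type is the columnDefinition attribute of @Column. The snippet below is only a sketch under assumptions that are not in the question: it is an annotated variant of the Foo class, and it adds a surrogate id column because JPA requires an identifier.

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "foo")
public class Foo {

    @Id
    private Long id; // assumed primary key column, not shown in the question

    // columnDefinition plays the same role as sql-type in the XML mapping:
    // it pins the column type to BINARY(64) for DDL and schema validation.
    @Column(name = "bar", columnDefinition = "BINARY(64)", nullable = false)
    private byte[] bar;

    public byte[] getBar() {
        return bar;
    }

    public void setBar(byte[] bar) {
        this.bar = bar;
    }
}

The same caveat applies: pinning the definition to BINARY(64) ties the mapping to databases that support that type.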

OTHER TIPS

What we ended up doing to solve a similar problem was to write our own custom UserType.

UserTypes are relatively easy to implement. Just create a class that implements org.hibernate.usertype.UserType and implement its methods.

In your Hibernate mappings, using a user type is pretty easy:

<property name="data" type="com.yourpackage.hibernate.CustomBinaryStreamUserType" column="binary_data" />

Simply put, Hibernate will use this class to read and write the data from the database; specifically, the methods nullSafeGet and nullSafeSet are used.

In our case, we used this to gzip-compress binary data before writing it to the database and to uncompress it as it is read out. This hides the fact that the data is compressed from the application using it.
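
For illustration, here is a rough sketch of what such a UserType could look like against the Hibernate 3.x org.hibernate.usertype.UserType interface. The class name matches the mapping shown above; the gzip handling is an assumption about how the compression described here could be wired up, not the original poster's actual code.

package com.yourpackage.hibernate; // hypothetical package, matching the mapping above

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

import org.hibernate.HibernateException;
import org.hibernate.usertype.UserType;

public class CustomBinaryStreamUserType implements UserType {

    public int[] sqlTypes() {
        return new int[] { Types.VARBINARY }; // stored as raw bytes in the column
    }

    public Class returnedClass() {
        return byte[].class;
    }

    public boolean equals(Object x, Object y) throws HibernateException {
        return Arrays.equals((byte[]) x, (byte[]) y);
    }

    public int hashCode(Object x) throws HibernateException {
        return Arrays.hashCode((byte[]) x);
    }

    // Called when Hibernate reads the column: gunzip the stored bytes.
    public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
            throws HibernateException, SQLException {
        byte[] compressed = rs.getBytes(names[0]);
        if (compressed == null) {
            return null;
        }
        try {
            GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[4096];
            for (int read = in.read(buffer); read != -1; read = in.read(buffer)) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        } catch (IOException e) {
            throw new HibernateException("Could not decompress column " + names[0], e);
        }
    }

    // Called when Hibernate writes the value: gzip it before binding.
    public void nullSafeSet(PreparedStatement st, Object value, int index)
            throws HibernateException, SQLException {
        if (value == null) {
            st.setNull(index, Types.VARBINARY);
            return;
        }
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            GZIPOutputStream gzip = new GZIPOutputStream(out);
            gzip.write((byte[]) value);
            gzip.close();
            st.setBytes(index, out.toByteArray());
        } catch (IOException e) {
            throw new HibernateException("Could not compress value", e);
        }
    }

    public Object deepCopy(Object value) throws HibernateException {
        return value == null ? null : ((byte[]) value).clone();
    }

    public boolean isMutable() {
        return true; // byte arrays are mutable
    }

    public Serializable disassemble(Object value) throws HibernateException {
        return (Serializable) deepCopy(value);
    }

    public Object assemble(Serializable cached, Object owner) throws HibernateException {
        return deepCopy(cached);
    }

    public Object replace(Object original, Object target, Object owner) throws HibernateException {
        return deepCopy(original);
    }
}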

I think there is an easy solution for mapping BINARY columns in Hibernate.

BINARY columns can be easily mapped to java.util.UUID in Hibernate entity classes.

For example, the column definition will look like this:

`tokenValue` BINARY(16) NOT NULL

The Hibernate entity will have the following code to support the BINARY column:

private UUID tokenValue;

@Column(columnDefinition = "BINARY(16)", length = 16)
public UUID getTokenValue() {
    return this.tokenValue;
}

public void setTokenValue(UUID tokenValue) {
    this.tokenValue = tokenValue;
}
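
As a side note on why BINARY(16) is a natural fit: a UUID is just 128 bits, so it round-trips losslessly through a 16-byte array, which is what ends up in the column. The standalone sketch below (the class and method names are made up for illustration) shows the equivalent conversion; Hibernate's UUID-to-binary handling does this for you.

import java.nio.ByteBuffer;
import java.util.UUID;

public class UuidBytesDemo {

    // Pack the UUID's 128 bits into 16 bytes, most significant half first.
    static byte[] toBytes(UUID uuid) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        buf.putLong(uuid.getMostSignificantBits());
        buf.putLong(uuid.getLeastSignificantBits());
        return buf.array();
    }

    // Rebuild the UUID from the 16 raw bytes.
    static UUID fromBytes(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        return new UUID(buf.getLong(), buf.getLong());
    }

    public static void main(String[] args) {
        UUID original = UUID.randomUUID();
        UUID roundTripped = fromBytes(toBytes(original));
        System.out.println(original.equals(roundTripped)); // prints true
    }
}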