Question

I have a Dropwizard application (0.7.0) for which I want to run integration tests.

I've set up an integration test using DropwizardAppRule, like this:

@ClassRule
public static final DropwizardAppRule<MyAppConfiguration> RULE =
        new DropwizardAppRule<MyAppConfiguration>(
                MyApplication.class, Resources.getResource("testconfiguration.yml").getPath());

When I try to run the test below against it, it fails because my migrations haven't been run. What is the best way to run the migrations?

Test:

@Test
public void fooTest() {
    Client client = new Client();
    String root = String.format("http://localhost:%d/", RULE.getLocalPort());
    URI uri = UriBuilder.fromUri(root).path("/users").build();
    client.resource(uri).accept(MediaType.APPLICATION_JSON).type(MediaType.APPLICATION_JSON).post(User.class, new LoginUserDTO("email@email.com", "password"));
}

Configuration:

public class MyAppConfiguration extends Configuration {
    @Valid
    @NotNull
    private DataSourceFactory database = new DataSourceFactory();

    @JsonProperty("database")
    public DataSourceFactory getDataSourceFactory() {
        return database;
    }

    @JsonProperty("database")
    public void setDataSourceFactory(DataSourceFactory dataSourceFactory) {
        this.database = dataSourceFactory;
    }
}


Solution

Thanks to Kimble and andersem for putting me on the right track. Here's what I came up with in my @BeforeClass method:

// Create the test database with the LiquiBase migrations.
@BeforeClass
public static void up() throws Exception
{
    ManagedDataSource ds = RULE.getConfiguration().getDataSourceFactory().build(
        RULE.getEnvironment().metrics(), "migrations");
    try (Connection connection = ds.getConnection())
    {
        Liquibase migrator = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), new JdbcConnection(connection));
        migrator.update("");
    }
}

OTHER TIPS

I ran into some concurrency issues when trying to do the database migration as part of the test case and ended up baking it into the application itself (protected by a configuration option).

private void migrate(MyAppConfiguration configuration, Environment environment) {
    if (configuration.isMigrateSchemaOnStartup()) {
        log.info("Running schema migration");
        ManagedDataSource dataSource = createMigrationDataSource(configuration, environment);

        try (Connection connection = dataSource.getConnection()) {
            JdbcConnection conn = new JdbcConnection(connection);

            Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(conn);
            Liquibase liquibase = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), database);
            liquibase.update("");

            log.info("Migration completed!");
        }
        catch (Exception ex) {
            throw new IllegalStateException("Unable to migrate database", ex);
        }
        finally {
            try {
                dataSource.stop();
            }
            catch (Exception ex) {
                log.error("Unable to stop data source used to execute schema migration", ex);
            }
        }
    }
    else {
        log.info("Skipping schema migration");
    }
}

private ManagedDataSource createMigrationDataSource(MyAppConfiguration configuration, Environment environment) {
    DataSourceFactory dataSourceFactory = configuration.getDataSourceFactory();

    try {
        return dataSourceFactory.build(environment.metrics(), "migration-ds");
    }
    catch (ClassNotFoundException ex) {
        throw new IllegalStateException("Unable to initialize data source for schema migration", ex);
    }
}
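
To wire this in, call migrate(...) early in the application's run method, before registering anything that touches the database. A minimal sketch, assuming a hypothetical migrateSchemaOnStartup boolean property added to MyAppConfiguration (it is not part of the configuration class shown in the question):

```java
// Sketch only: "migrateSchemaOnStartup" is a hypothetical property you
// would add to MyAppConfiguration alongside the DataSourceFactory.
public class MyAppConfiguration extends Configuration {
    @JsonProperty
    private boolean migrateSchemaOnStartup = false;

    public boolean isMigrateSchemaOnStartup() {
        return migrateSchemaOnStartup;
    }
}

public class MyApplication extends Application<MyAppConfiguration> {
    @Override
    public void run(MyAppConfiguration configuration, Environment environment) {
        // Run the (optional) schema migration before any resources that
        // depend on the schema are registered.
        migrate(configuration, environment);
        // ... register resources, health checks, etc.
    }
}
```

In the test configuration you would then set migrateSchemaOnStartup to true, and leave it false in production where migrations are run explicitly.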

Another approach that doesn't rely on importing Liquibase's classes directly is to run the db migrate command in the same way that you might from the command line, using the RULE:

@Before
public void migrateDatabase() throws Exception {
    RULE.getApplication().run("db", "migrate", ResourceHelpers.resourceFilePath("testconfiguration.yml"));
}

This approach also works for any other commands from any other bundles that you might want to run before starting the tests.
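
For instance, dropwizard-migrations also registers a drop-all subcommand, so you can wipe and rebuild the schema before each test the same way. A sketch (verify the exact flag name against your dropwizard-migrations version; drop-all requires a confirmation flag as a safety measure):

```java
@Before
public void resetDatabase() throws Exception {
    String config = ResourceHelpers.resourceFilePath("testconfiguration.yml");
    // Drop everything first so each test starts from a clean schema.
    // --confirm-delete-everything is the safety flag drop-all expects.
    RULE.getApplication().run("db", "drop-all", "--confirm-delete-everything", config);
    RULE.getApplication().run("db", "migrate", config);
}
```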

A small wrinkle: doing this with any command that extends Dropwizard's ConfiguredCommand (which all of the dropwizard-migrations commands do) will unnecessarily disable Logback when the command finishes. To restore it, you can call:

        RULE.getConfiguration().getLoggingFactory().configure(RULE.getEnvironment().metrics(),
            RULE.getApplication().getName());

I did it this way using Liquibase's API:

private static void migrate() throws Exception {
    DataSourceFactory dataSourceFactory = RULE.getConfiguration().getDataSourceFactory();
    Properties info = new Properties();
    info.setProperty("user", dataSourceFactory.getUser());
    info.setProperty("password", dataSourceFactory.getPassword());
    org.h2.jdbc.JdbcConnection h2Conn = new org.h2.jdbc.JdbcConnection(dataSourceFactory.getUrl(), info);
    JdbcConnection conn = new JdbcConnection(h2Conn);
    Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(conn);
    Liquibase liquibase = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), database);
    String ctx = null;
    liquibase.update(ctx);
}

Then I call it from an @BeforeClass method (which JUnit requires to be static):

@BeforeClass
public static void setup() throws Exception {
    migrate();
}

It's probably not the ultimate solution, and it depends a lot on the database you're using, but it works.
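
The same API can also tear the schema back down after the tests, since Liquibase exposes a dropAll() method. A sketch along the same lines as the migrate() method above:

```java
@AfterClass
public static void teardown() throws Exception {
    DataSourceFactory dataSourceFactory = RULE.getConfiguration().getDataSourceFactory();
    Properties info = new Properties();
    info.setProperty("user", dataSourceFactory.getUser());
    info.setProperty("password", dataSourceFactory.getPassword());
    org.h2.jdbc.JdbcConnection h2Conn =
            new org.h2.jdbc.JdbcConnection(dataSourceFactory.getUrl(), info);
    JdbcConnection conn = new JdbcConnection(h2Conn);
    Database database = DatabaseFactory.getInstance().findCorrectDatabaseImplementation(conn);
    Liquibase liquibase = new Liquibase("migrations.xml", new ClassLoaderResourceAccessor(), database);
    // dropAll() removes every object Liquibase can see in the default schema.
    liquibase.dropAll();
}
```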

What I do to achieve the same goal is to run the migration from within Maven.

Add this to the <plugins> section of your pom.xml:

<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.0.5</version>
  <executions>
      <execution>
          <phase>process-test-resources</phase>  
          <configuration>
            <changeLogFile>PATH TO YOUR MIGRATIONS FILE</changeLogFile>
            <driver>org.h2.Driver</driver>
            <url>JDBC URL LIKE IN YOUR APP.YML</url>
            <username>USERNAME</username>
            <password>PASSWORD</password>
            <dropFirst>false</dropFirst>
            <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
            <logging>info</logging>
          </configuration>
          <goals>
            <goal>dropAll</goal>
            <goal>update</goal>
          </goals>
      </execution>
  </executions>               
</plugin>

This will work with Maven from the command line. With this setting, Maven will use Liquibase's dropAll to drop all database objects and then run a migration, so every test starts with a clean, freshly migrated database.

When using this, I ran into issues with Eclipse: it complained about the lifecycle mapping not working for the execution tag of the plugin. In that case, add the following to the build section as well, so Eclipse can map the lifecycle properly:

<pluginManagement>
  <plugins>
    <plugin>
      <groupId>org.eclipse.m2e</groupId>
      <artifactId>lifecycle-mapping</artifactId>
      <version>1.0.0</version>
      <configuration>
        <lifecycleMappingMetadata>
        <pluginExecutions>
          <pluginExecution>
            <pluginExecutionFilter>
              <groupId>org.liquibase</groupId>
              <artifactId>liquibase-maven-plugin</artifactId>
              <versionRange>[1.0,)</versionRange>
              <goals>
                <goal>dropAll</goal>
                <goal>update</goal>
              </goals>
            </pluginExecutionFilter>
            <action>
              <execute />
            </action>
          </pluginExecution>
        </pluginExecutions>
        </lifecycleMappingMetadata>
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow