Question

I happily created my first NN and performed hyperparameter optimization through GridSearchCV. I just don't know what to do next. Do I have to fit it again with the best parameters that GridSearchCV() revealed? Is there an elegant way to do so? Otherwise, how should I proceed?
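From the scikit-learn docs I understand that GridSearchCV retrains the best parameter combination on the whole training set by default (refit=True), so maybe the fitted search object from the code below can already be used directly. A minimal sketch of what I mean (X_test1 is just a held-out set, not shown in my code):

# Sketch: with refit=True (the default), the fitted search object already holds
# a model refitted on all of X_train1/y_train1 with the best parameters.
best_model = grid_result.best_estimator_
y_pred = best_model.predict(X_test1)    # X_test1: held-out data (assumption)

Is that the intended way, or do I really have to rebuild and refit the model myself?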

def create_model(...

    model.compile(loss='mean_squared_error', optimizer=optimizer, metrics=['accuracy'])            
    return model

model = KerasRegressor(build_fn=create_model, verbose=0)

> hypparas
{'batch_size': [2, 6], 'optimizer': ['Adam', 'sgd'], 'opt_par': [0.5, 0.8]}

grid_obj = GridSearchCV(estimator=model
                        , param_grid=hypparas
                        , n_jobs=1
                        , cv=3
                        )

grid_result = grid_obj.fit(X_train1, y_train1, callbacks = [time_callback])


print("Best: %f using %s" %  (grid_result.best_score_, grid_result.best_params_), "\n")

means = grid_result.cv_results_['mean_test_score']
stds = grid_result.cv_results_['std_test_score']
params = grid_result.cv_results_['params']

for mean, stdev, param in zip(means, stds, params):
    print("%f (%f) with: %r" % (mean, stdev, param))

    Best: -0.941568 using {'optimizer': 'Adam', 'opt_par': 0.8, 'batch_size': 2} 

    -1.725617 (0.620383) with: {'optimizer': 'Adam', 'opt_par': 0.5, 'batch_size': 2}
    -1.595137 (0.224487) with: {'optimizer': 'sgd', 'opt_par': 0.5, 'batch_size': 2}
    -0.941568 (0.149151) with: {'optimizer': 'Adam', 'opt_par': 0.8, 'batch_size': 2}
    -1.338372 (0.523434) with: {'optimizer': 'sgd', 'opt_par': 0.8, 'batch_size': 2}
    -1.094907 (0.121018) with: {'optimizer': 'Adam', 'opt_par': 0.5, 'batch_size': 6}
    -1.588476 (0.569475) with: {'optimizer': 'sgd', 'opt_par': 0.5, 'batch_size': 6}
    -1.443133 (0.342028) with: {'optimizer': 'Adam', 'opt_par': 0.8, 'batch_size': 6}
    -1.275414 (0.331939) with: {'optimizer': 'sgd', 'opt_par': 0.8, 'batch_size': 6}
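
The only alternative I can think of is to pull best_params_ out and rebuild the regressor by hand, then fit it on the full training data. A rough sketch of that, assuming create_model() accepts optimizer and opt_par as keyword arguments (the epochs value is just a placeholder, it was not tuned above):

from keras.wrappers.scikit_learn import KerasRegressor

best = grid_result.best_params_    # {'optimizer': 'Adam', 'opt_par': 0.8, 'batch_size': 2}

final_model = KerasRegressor(build_fn=create_model
                             , epochs=100    # placeholder, not tuned above
                             , verbose=0
                             , **best)
# optimizer/opt_par are routed to create_model(), batch_size is routed to fit()
final_model.fit(X_train1, y_train1)

Is this manual refit necessary, or is it redundant given refit=True?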

No correct solution

Licensed under: CC-BY-SA with attribution