Question

I'm working on a deep neural network for text classification using Keras. To fine-tune some hyperparameters I'm using the Keras wrappers for the Scikit-Learn API, so I built a scikit-learn Pipeline for that:

from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV

def create_model(optimizer="adam", nbr_features=100):
    model = Sequential()
    model.add(Dense(512, activation='relu', input_shape=(nbr_features,)))
    ...
    model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=["accuracy"])
    return model

estimator = Pipeline([("tfidf", TfidfVectorizer()),
                      ('norm', StandardScaler(with_mean=False)),
                      ("km", KerasClassifier(build_fn=create_model, verbose=1))])
grid_params = {
     'tfidf__max_df': (0.1, 0.25, 0.5, 0.75, 1.0),
     'tfidf__max_features': (100, 500, 1000, 5000,),
      ... }

gs = GridSearchCV(estimator,
                  grid_params,
                  ...)

I want to pass the max_features parameter from the tfidf stage to the km stage as nbr_features. Is there any hack/workaround to do that?
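One common workaround (sketched here, not an accepted answer to this question) is to stop treating nbr_features as a grid parameter at all and instead derive it from the data at fit time: by the time the km stage's fit() runs, its X is the matrix produced by the tfidf/norm stages, so X.shape[1] already reflects whatever tfidf__max_features the grid search chose. The class names below are hypothetical, and a minimal stand-in base class is used so the sketch runs without Keras installed; in practice you would subclass keras.wrappers.scikit_learn.KerasClassifier, whose fit() builds the network via build_fn(**self.sk_params).

```python
import numpy as np

# Stand-in for KerasClassifier so this sketch is self-contained; the real
# wrapper stores build_fn kwargs in self.sk_params and calls build_fn in fit().
class KerasClassifierStandIn:
    def __init__(self, build_fn=None, **sk_params):
        self.build_fn = build_fn
        self.sk_params = sk_params

    def fit(self, X, y, **kwargs):
        self.model_ = self.build_fn(**self.sk_params)
        return self

# The workaround: override fit() so nbr_features is taken from the matrix
# produced by the upstream tfidf/norm stages, whatever tfidf__max_features
# the current grid-search candidate used.
class ShapeAwareClassifier(KerasClassifierStandIn):
    def fit(self, X, y, **kwargs):
        self.sk_params["nbr_features"] = X.shape[1]  # columns after TF-IDF
        return super().fit(X, y, **kwargs)

def create_model(optimizer="adam", nbr_features=100):
    # Dummy build_fn that records its arguments, standing in for the
    # Sequential-model builder from the question.
    return {"optimizer": optimizer, "nbr_features": nbr_features}

X = np.zeros((10, 500))  # pretend TF-IDF output with 500 features
clf = ShapeAwareClassifier(build_fn=create_model)
clf.fit(X, y=np.zeros(10))
print(clf.model_["nbr_features"])  # -> 500
```

With this pattern you would drop nbr_features from create_model's grid entirely and use the shape-aware subclass as the "km" step of the Pipeline; the only parameters left to search over are the genuine hyperparameters (max_features, optimizer, and so on).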

No correct solution

Licensed under: CC-BY-SA with attribution