Question

I'm trying to generate something like the following, which is a random sample from the real data distribution I'm trying to mimic:

numbers = [ 54.,  87., 103., 209., 356., 371., 383., 448., 452.,  38.,  37.,
          30.,  22.,  24.,  48.,  52.,  61.,  66.,  87., 150., 163., 406.,
         545., 557., 566., 508., 413.,  26.,  22.,  22.,  12.,  19.,  22.,
          26.,  28.,  90.,  93., 126., 155., 362., 476., 470., 455., 410.,
         345., 252., 233.,  62.,  45.,  42.,  40.,  38.,  37.,  26.,  27.,
          16.,  18.,  21.,  21.,  22.,  63.,  67.,  96., 177., 228., 331.,
         382., 183.,  38.,  31.,  14.,  13.,  13.,  18.,  19.,  21.,  23.,
          61.,  68.,  93., 104., 179., 273., 428., 446., 388.,  96.,  77.,
          19.,  18.,  13.,   4.,   0.,   0.,   0.,   0.,   0.,   0.,   0.,
           0.]


How would your Generator and Discriminator look?

Here's my attempt at a G and D that should be able to do it:

from tensorflow.keras.layers import (Input, Dense, Dropout, Reshape,
                                     Conv1D, Flatten, LeakyReLU)
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

def create_G(num=100):
  G_in = Input(shape=(num,))                       # shape must be a tuple
  x = Dense(num//2,activation=LeakyReLU())(G_in)   # // keeps unit counts integer
  x = Dropout(0.5)(x)
  x = Dense(num//4,activation=LeakyReLU())(x)
  x = Dropout(0.5)(x)
  x = Dense(num)(x)
  G = Model(G_in,x)
  G.compile(loss='binary_crossentropy',optimizer=Adam(learning_rate=0.001))
  return G

def create_D(num=100):
  D_in = Input(shape=(num,))
  x = Reshape((-1,1))(D_in)
  x = Conv1D(num,3,activation='relu')(x)
  x = Dropout(0.5)(x)
  x = Flatten()(x)
  x = Dense(2,activation='sigmoid')(x)
  D = Model(D_in,x)
  D.compile(loss='binary_crossentropy',optimizer=Adam(learning_rate=0.003))
  return D
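
For context, the stacked model I train G through looks roughly like this (a simplified sketch; make_gan is just a helper name I use):

def make_gan(G, D, num=100):
  # Keras only picks up trainable changes at compile time, so freezing D
  # here (after D was compiled trainable) means GAN.train_on_batch updates
  # only G, while D.train_on_batch still updates D
  D.trainable = False
  gan_in = Input(shape=(num,))        # noise vector, same length as G's input
  GAN = Model(gan_in, D(G(gan_in)))
  GAN.compile(loss='binary_crossentropy',optimizer=Adam(learning_rate=0.001))
  return GAN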

I believe I'm training the right way, with both fake and real examples, but my G is not able to fool the D. Here is an example of the values of G.predict(noise) (blue) compared to the real values (orange):

[plot: G.predict(noise) in blue vs. the real values in orange]
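
For completeness, a simplified sketch of how I run the training (batch size 1 over the sample above, just to illustrate; the labels are one-hot because D has two output units):

import numpy as np

G, D = create_G(), create_D()
GAN = make_gan(G, D)

real = np.asarray(numbers).reshape(1, -1)     # one real length-100 sample

for step in range(5000):
  noise = np.random.normal(0, 1, size=(1, 100))
  fake = G.predict(noise, verbose=0)
  # one-hot labels for D's two sigmoid outputs: [1, 0] = real, [0, 1] = fake
  D.train_on_batch(real, np.array([[1., 0.]]))
  D.train_on_batch(fake, np.array([[0., 1.]]))
  # train G (through the frozen-D stacked model) to be classified as real
  GAN.train_on_batch(noise, np.array([[1., 0.]]))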

What am I missing?

I'd appreciate any suggestions for a different architecture.


Solution

In case it helps someone in the future, this is what solved it:

from tensorflow.keras.layers import BatchNormalization

def create_G(num=100):
    # noise_size, wasserstein_loss, G_rmsprop_optimizer and D_rmsprop_optimizer
    # are defined outside these functions (see the sketch below)
    G_in = Input(shape=(noise_size,))
    x = Reshape((-1,1))(G_in)
    x = Conv1D(2,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    x = Conv1D(4,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    x = Conv1D(8,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    x = Conv1D(16,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    x = Conv1D(32,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    x = Conv1D(64,3,activation=LeakyReLU())(x)
    x = BatchNormalization()(x)
    x = Dropout(0.5)(x)
    # each 'valid' Conv1D trims 2 timesteps, so the flattened output has
    # length (noise_size - 14) * 100; noise_size = 15 gives length 100
    x = Conv1D(100,3,activation='tanh')(x)
    x = Flatten()(x)
    G = Model(G_in,x,name='Generator')
    G.compile(loss=wasserstein_loss,optimizer=G_rmsprop_optimizer)
    return G


def create_D(num=100):
    D_in = Input(shape=(num,))
    x = Reshape((-1,1))(D_in)
    x = Conv1D(num//2,3,activation=LeakyReLU())(x)    # // keeps filter counts integer
    x = Conv1D(num//10,3,activation=LeakyReLU())(x)
    x = Conv1D(num//50,3,activation=LeakyReLU())(x)
    x = Flatten()(x)
    x = Dense(128,activation=LeakyReLU())(x)
    x = Dropout(0.5)(x)
    x = Dense(64,activation=LeakyReLU())(x)
    x = Dropout(0.5)(x)
    x = Dense(32,activation=LeakyReLU())(x)
    x = Dropout(0.5)(x)
    x = Dense(8,activation=LeakyReLU())(x)
    x = Dense(2,activation='sigmoid')(x)
    D = Model(D_in,x,name='Discriminator')
    D.compile(loss=wasserstein_loss,optimizer=D_rmsprop_optimizer)
    return D
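
For reference, here is roughly how the missing pieces and the training loop fit together; a minimal sketch of the standard WGAN recipe (the noise_size value, the ±1 labels, the weight clipping, the n_critic count and the RMSprop learning rates are my assumptions, not spelled out above; a textbook WGAN critic would also end in a linear layer rather than the sigmoid kept here):

import numpy as np
import tensorflow as tf
import tensorflow.keras.backend as K
from tensorflow.keras.optimizers import RMSprop

noise_size = 15   # (15 - 14) * 100 = 100, matching the real samples' length

def wasserstein_loss(y_true, y_pred):
    # the usual Keras formulation: labels are +1 (real) and -1 (fake)
    return K.mean(y_true * y_pred)

G_rmsprop_optimizer = RMSprop(learning_rate=5e-5)
D_rmsprop_optimizer = RMSprop(learning_rate=5e-5)

G, D = create_G(), create_D()

# stack G and D with D frozen, so GAN.train_on_batch only updates G
D.trainable = False
gan_in = Input(shape=(noise_size,))
GAN = Model(gan_in, D(G(gan_in)), name='GAN')
GAN.compile(loss=wasserstein_loss, optimizer=G_rmsprop_optimizer)

real = np.asarray(numbers).reshape(1, -1)
ones, neg_ones = np.ones((1, 2)), -np.ones((1, 2))

for step in range(5000):
    for _ in range(5):                        # n_critic critic updates per G update
        noise = np.random.normal(0, 1, (1, noise_size))
        D.train_on_batch(real, ones)
        D.train_on_batch(G.predict(noise, verbose=0), neg_ones)
        for w in D.weights:                   # WGAN weight clipping
            w.assign(tf.clip_by_value(w, -0.01, 0.01))
    GAN.train_on_batch(np.random.normal(0, 1, (1, noise_size)), ones)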

Cheers :)

Licensed under: CC-BY-SA with attribution