Recurrent Neural Network

RNN

What is it?
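An RNN (Recurrent Neural Network) is a network designed for sequential data: at each time step t it combines the current input x_t with a hidden state h_{t-1} carried over from the previous step, typically as h_t = tanh(W_x x_t + W_h h_{t-1} + b), so information from earlier in the sequence can influence later predictions. Typical applications include text, speech, and time series.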

Types of RNNs
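By input/output shape, the usual taxonomy is:

- one to one: a fixed input and a fixed output (a plain feed-forward network)
- one to many: one input, a sequence of outputs (e.g. image captioning)
- many to one: a sequence in, one output (e.g. the sentiment classifier built below)
- many to many: a sequence in, a sequence out (e.g. machine translation)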

Forward Propagation
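A minimal NumPy sketch of the forward pass of a vanilla RNN; the names rnn_forward, Wx, Wh, and b are illustrative assumptions, not part of the original notes:

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b):
    """xs: list of input vectors, one per time step."""
    h = np.zeros(Wh.shape[0])          # initial hidden state h_0
    hs = []
    for x in xs:
        # h_t = tanh(Wx x_t + Wh h_{t-1} + b): same weights at every step
        h = np.tanh(Wx @ x + Wh @ h + b)
        hs.append(h)
    return hs                          # hidden state at every time step
```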

Back Propagation
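RNNs are trained with backpropagation through time (BPTT): the network is unrolled over the sequence and gradients flow backward step by step, each step passing through the same recurrent weights. A sketch continuing the forward pass above, assuming the per-step loss gradients dhs are given; rnn_backward is a hypothetical helper:

```python
def rnn_backward(xs, hs, dhs, Wx, Wh):
    """dhs[t]: gradient of the loss w.r.t. the hidden state h_t."""
    dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
    db = np.zeros(Wh.shape[0])
    dh_next = np.zeros(Wh.shape[0])    # gradient arriving from step t+1
    for t in reversed(range(len(xs))):
        dh = dhs[t] + dh_next          # loss gradient + recurrent gradient
        dz = dh * (1.0 - hs[t] ** 2)   # backprop through tanh
        h_prev = hs[t - 1] if t > 0 else np.zeros_like(hs[t])
        dWx += np.outer(dz, xs[t])     # shared weights: gradients accumulate
        dWh += np.outer(dz, h_prev)
        db += dz
        dh_next = Wh.T @ dz            # pass gradient back to h_{t-1}
    return dWx, dWh, db
```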

Problems with RNNs
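The main problem is vanishing and exploding gradients: the line dh_next = Wh.T @ dz above runs once per time step, so over T steps the gradient is scaled by the recurrent Jacobian T times, shrinking geometrically when its norm is below 1 and blowing up when it is above 1. Long-range dependencies therefore become very hard to learn. A toy illustration with made-up numbers:

```python
import numpy as np

g = np.ones(4)                      # some upstream gradient
J = 0.5 * np.eye(4)                 # recurrent Jacobian with norm 0.5
for _ in range(50):
    g = J.T @ g                     # one multiplication per time step
print(np.linalg.norm(g))            # ~2 * 0.5**50: effectively zero
```

Common remedies are gradient clipping for the exploding case and gated cells (LSTM, GRU) for the vanishing case.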


Long Short-Term Memory (LSTM)

How it works
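An LSTM cell carries a separate cell state c_t alongside the hidden state, controlled by three sigmoid gates: the forget gate decides what to erase from c_t, the input gate what to write, and the output gate what to expose as h_t. Because the cell state is updated additively, gradients can survive over many more steps than in a vanilla RNN. The Keras snippet below uses a single LSTM layer in a binary text classifier; max_len, max_words, sequences_matrix, and Y_train are assumed to come from a tokenization/padding step not shown here.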

```python
from keras.models import Model
from keras.layers import LSTM, Activation, Dense, Dropout, Input, Embedding
from keras.callbacks import EarlyStopping
from keras.optimizers import RMSprop
import matplotlib.pyplot as plt

def RNN():
  # Token IDs (length max_len) -> 50-dim embeddings -> LSTM -> dense head
  i = Input(name='inputs', shape=[max_len])
  x = Embedding(max_words, 50, input_length=max_len)(i)
  x = LSTM(64)(x)                      # final hidden state only
  x = Dense(256, name='FC1')(x)
  x = Activation('relu')(x)
  x = Dropout(0.5)(x)                  # regularization
  x = Dense(1, name='out_layer')(x)
  x = Activation('sigmoid')(x)         # binary classification output
  model = Model(inputs=i, outputs=x)
  return model

model = RNN()
model.summary()
model.compile(loss='binary_crossentropy', optimizer=RMSprop(), metrics=['accuracy'])

r = model.fit(
  sequences_matrix, Y_train,           # padded token sequences and labels
  batch_size=128,
  epochs=10,
  validation_split=0.2,                # hold out 20% for validation
  callbacks=[EarlyStopping(monitor='val_loss', min_delta=0.0001)])

plt.plot(r.history['loss'], label='loss')
plt.plot(r.history['val_loss'], label='val_loss')
plt.legend()
```
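EarlyStopping halts training once val_loss stops improving by at least min_delta, and plotting loss against val_loss makes overfitting easy to spot (validation loss rising while training loss keeps falling).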

Gated Recurrent Unit (GRU)
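A GRU is a lighter gated cell: it merges the LSTM's forget and input gates into a single update gate, adds a reset gate, and drops the separate cell state, so it has fewer parameters and often trains faster for comparable accuracy. A minimal sketch of the same classifier with the LSTM layer swapped for a GRU; build_gru is a hypothetical name mirroring RNN() above, and max_len/max_words again come from the assumed preprocessing:

```python
from keras.models import Model
from keras.layers import GRU, Activation, Dense, Dropout, Input, Embedding

def build_gru():
  i = Input(name='inputs', shape=[max_len])
  x = Embedding(max_words, 50, input_length=max_len)(i)
  x = GRU(64)(x)                       # drop-in replacement for LSTM(64)
  x = Dense(256, name='FC1')(x)
  x = Activation('relu')(x)
  x = Dropout(0.5)(x)
  x = Dense(1, name='out_layer')(x)
  x = Activation('sigmoid')(x)
  return Model(inputs=i, outputs=x)
```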