Flor Miriam Plaza del Arco / WASSA 2018
Commit 887c1e63, authored Jun 29, 2018 by Flor Miriam Plaza del Arco
Evaluation results
parent 2a2ba792
Showing 1 changed file with 26 additions and 0 deletions
results_evaluation 0 → 100644
Params:
vocab_embeddings = 100000
max_length_tweet = 40
Layers:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, Dropout

model = Sequential()
# Pre-trained embedding layer, kept frozen during training
e = Embedding(feature_size, EMBEDDING_DIM, input_length=max_len_input, weights=[embedding_matrix], trainable=False)
model.add(e)
# Recurrent layer over the embedded tweet sequence
model.add(LSTM(128, dropout=0.2, recurrent_dropout=0.2))
# Alternative (unused): bidirectional recurrent layer
# model.add(Bidirectional(LSTM(2, dropout=0.2, recurrent_dropout=0.2, return_sequences=True)))
# Each 128-dim LSTM output vector is projected down to a 32-dim feature vector
model.add(Dense(32, activation='tanh'))
model.add(Dropout(0.5))
model.add(Dense(len(CLASSES), activation='softmax'))
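As a sanity check on the layer sizes, the parameter count of the LSTM(128) layer can be worked out by hand. This is a sketch under one assumption: EMBEDDING_DIM = 200, taken from the "200 dim" mention in the original comment, is not stated explicitly in this file.

```python
# Parameter-count check for LSTM(128) above.
# EMBEDDING_DIM = 200 is an assumption, not confirmed by the file.
EMBEDDING_DIM = 200  # assumed input dimension per time step
UNITS = 128          # LSTM(128)

# An LSTM has 4 gates; each has an input kernel (EMBEDDING_DIM x UNITS),
# a recurrent kernel (UNITS x UNITS), and a bias (UNITS).
lstm_params = 4 * ((EMBEDDING_DIM + UNITS + 1) * UNITS)
print(lstm_params)  # 168448
```

The same formula with a different EMBEDDING_DIM gives the count Keras would report in `model.summary()` for this layer.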
Results evaluation:
Training accuracy: 53.418567
*** Results RNN_LSTM ***
Macro-Precision: 0.5226264521865173
Macro-Recall: 0.5156404166549265
Macro-F1: 0.5100454275009482
Accuracy: 0.5155875299760192
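For reference, the macro-averaged scores above are unweighted means of the per-class precision, recall, and F1. A minimal sketch of that computation, using hypothetical labels for illustration (the actual WASSA 2018 predictions are not in this file):

```python
# Macro-averaging: compute precision/recall/F1 per class, then take the
# unweighted mean across classes. Labels below are hypothetical.
y_true = [0, 1, 2, 1, 0, 2]
y_pred = [0, 1, 1, 1, 0, 2]
classes = sorted(set(y_true))

precs, recs, f1s = [], [], []
for c in classes:
    tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
    fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
    fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    precs.append(prec)
    recs.append(rec)
    f1s.append(f1)

macro_p = sum(precs) / len(classes)
macro_r = sum(recs) / len(classes)
macro_f1 = sum(f1s) / len(classes)
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
```

Note that macro-F1 is the mean of per-class F1 scores, not the harmonic mean of macro-precision and macro-recall, which is why the three macro numbers above can differ from each other.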