
Precision@n and recall@n in a Keras neural network


Basically, I want to compute top-k precision@n and recall@n for every epoch of a Keras neural network. After a long search I found these custom functions...

from keras import backend as K

def precision(y_true, y_pred):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision

def recall(y_true, y_pred):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    recall = true_positives / (possible_positives + K.epsilon())
    return recall
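Note that these two metrics compute *global* precision and recall over all predictions thresholded at 0.5, not precision@n / recall@n. A top-k version can be computed post-hoc from model scores in plain NumPy; the sketch below (the function `precision_recall_at_k` and the toy arrays are my own illustration, not part of the original code) shows one way, assuming one row of binary relevance labels and one row of predicted scores per sample:

```python
import numpy as np

def precision_recall_at_k(y_true, y_scores, k):
    """Average precision@k and recall@k over samples.

    y_true   : binary relevance matrix, shape (n_samples, n_items)
    y_scores : predicted scores, same shape
    """
    # Indices of the k highest-scored items in each row
    topk = np.argsort(-y_scores, axis=1)[:, :k]
    precisions, recalls = [], []
    for truths, idx in zip(y_true, topk):
        hits = truths[idx].sum()            # relevant items inside the top k
        precisions.append(hits / k)
        positives = truths.sum()
        recalls.append(hits / positives if positives else 0.0)
    return float(np.mean(precisions)), float(np.mean(recalls))

# Toy example: 2 samples, 4 candidate items each
y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 0, 0]])
y_scores = np.array([[0.9, 0.8, 0.1, 0.2],
                     [0.5, 0.4, 0.3, 0.2]])
p, r = precision_recall_at_k(y_true, y_scores, k=2)
print(p, r)  # 0.5 0.75
```

You would call this once per epoch on held-out scores rather than trying to express the top-k cut inside a batch-wise Keras metric.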

# This is the model compilation

model.compile(optimizer=RMSprop(lr=learning_rate), loss='binary_crossentropy', metrics=[precision, recall])

for epoch in xrange(epochs):
    t1 = time()
    user_input, item_input, labels = get_train_instances(train, num_negatives)

# Training

    hist = model.fit([np.array(user_input), np.array(item_input)], np.array(labels),
                     batch_size=batch_size, nb_epoch=1, verbose=0, shuffle=True)
    t2 = time()  # t2 was referenced below but never set in the original snippet
    loss, precision, recall = hist.history['loss'][0], hist.history['precision'][0], hist.history['recall'][0]
    print('Iteration %d [%.1f s]: loss = %.4f, precision = %.4f, recall = %.4f [%.1f s]'
          % (epoch, t2 - t1, loss, precision, recall, time() - t2))

This is my output, and it does not look right...

I hope you get the idea now...

1 Answer

  • 1

    Your best bet is the classification report capability in scikit-learn. It is a general classification evaluation function, but I use it with Keras often.

    from sklearn.preprocessing import LabelBinarizer
    from sklearn.metrics import classification_report
    
    from keras.models import Sequential
    from keras.layers.core import Dense, Dropout, Activation, Flatten
    from keras.layers.convolutional import Conv2D, MaxPooling2D
    from keras.layers.normalization import BatchNormalization
    
    from keras.datasets import cifar10
    
    # Load CIFAR-10 dataset
    (trainX, trainY), (testX, testY) = cifar10.load_data()
    trainX = trainX / 255.0
    testX = testX / 255.0
    
    # Transform labels from int to one-hot vectors
    lb = LabelBinarizer()
    trainY = lb.fit_transform(trainY)
    testY = lb.transform(testY)   # reuse the binarizer fitted on the training labels
    n_classes = len(lb.classes_)  # needed by the final Dense layer below
    
    # CNN architecture with Keras
    model = Sequential()
    model.add(Conv2D(input_shape=trainX[0,:,:,:].shape, filters=32, 
                     use_bias=True, kernel_size=(3,3)))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Conv2D(filters=96, use_bias=False, kernel_size=(5,5), strides=2))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Conv2D(filters=96, use_bias=False, kernel_size=(5,5), strides=2))
    model.add(Activation('relu'))
    model.add(Dropout(0.2))
    model.add(Flatten())
    model.add(BatchNormalization())
    model.add(Dense(256))
    model.add(Activation('relu'))
    model.add(Dropout(0.4))
    model.add(Dense(n_classes, activation="softmax"))
    model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    
    # Train the model
    n_epochs = 25
    batch_size = 256
    callbacks_list = None
    H = model.fit(trainX, trainY, validation_data=(testX, testY), 
                  epochs=n_epochs, batch_size=batch_size, callbacks=callbacks_list)
    print('Done!!!')
    
    # Evaluate TEST model class prediction accuracy
    print("[INFO] Evaluating network...")
    predictions = model.predict(testX, batch_size=batch_size)
    target_names = [str(x) for x in lb.classes_]
    print(classification_report(testY.argmax(axis=1),
                                predictions.argmax(axis=1),
                                target_names=target_names))
    
    # Evaluate TRAIN model class prediction accuracy
    print("[INFO] Evaluating network...")
    trainPreds = model.predict(trainX, batch_size=batch_size)
    target_names = [str(x) for x in lb.classes_]
    print(classification_report(trainY.argmax(axis=1),
                                trainPreds.argmax(axis=1),
                                target_names=target_names))
    

    The call to classification_report gives you per-class precision and recall statistics, like this:

    [INFO] Evaluating network...
                 precision    recall  f1-score   support

              0       0.78      0.83      0.80      1000
              1       0.85      0.92      0.88      1000
              2       0.77      0.58      0.66      1000
              3       0.66      0.54      0.59      1000
              4       0.72      0.72      0.72      1000
              5       0.72      0.63      0.67      1000
              6       0.69      0.91      0.78      1000
              7       0.81      0.83      0.82      1000
              8       0.85      0.87      0.86      1000
              9       0.82      0.86      0.84      1000

    avg / total       0.77      0.77      0.76     10000
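    As a quick standalone illustration of the numbers in a report like the one above, classification_report can also return them as a dictionary via output_dict=True. The toy labels below are my own example, not from the answer's CIFAR-10 run:

```python
from sklearn.metrics import classification_report

# Hypothetical toy labels, just to show the report's structure
y_true = [0, 0, 1, 1, 2]
y_pred = [0, 1, 1, 1, 2]

# output_dict=True returns per-class precision/recall as nested dicts
report = classification_report(y_true, y_pred, output_dict=True)
print(report["1"]["precision"])  # 2 of the 3 predicted 1s are correct -> ~0.667
print(report["1"]["recall"])     # both true 1s were found -> 1.0
```

    This makes it easy to log a single class's precision or recall per epoch instead of parsing the printed table.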
