Tested model:
=============
-------
notif_encoder: NotifEncoder(
  (model): Sequential(
    (0): Linear(in_features=29, out_features=64, bias=True)
    (1): ReLU()
    (2): Linear(in_features=64, out_features=32, bias=True)
    (3): Dropout(p=0.2, inplace=False)
  )
)
-------
evnt_encoder: EvntEncoder(
  (model): Sequential(
    (0): Linear(in_features=11, out_features=64, bias=True)
    (1): ReLU()
    (2): Linear(in_features=64, out_features=32, bias=True)
    (3): Dropout(p=0.2, inplace=False)
  )
)
-------
info_encoder: InfoEncoder(
  (model): Sequential(
    (0): Linear(in_features=28, out_features=64, bias=True)
    (1): PReLU(num_parameters=1)
    (2): Linear(in_features=64, out_features=32, bias=True)
    (3): Dropout(p=0.2, inplace=False)
  )
)
-------
embedder: ConvEmbedderSimple(
  (adapter): Linear(in_features=32, out_features=32, bias=True)
  (representer): ConvRepresenter(
    (convs): ParameterList(
        (0): Object of type: ConvRepresenterBlock
        (1): Object of type: ConvRepresenterBlock
      (0): ConvRepresenterBlock(
        (model): Sequential(
          (0): Conv1d(32, 64, kernel_size=(3,), stride=(1,), padding=(1,))
          (1): PReLU(num_parameters=1)
          (2): Conv1d(64, 32, kernel_size=(3,), stride=(1,), padding=(1,))
          (3): PReLU(num_parameters=1)
        )
      )
      (1): ConvRepresenterBlock(
        (model): Sequential(
          (0): Conv1d(32, 64, kernel_size=(3,), stride=(1,), padding=(1,))
          (1): PReLU(num_parameters=1)
          (2): Conv1d(64, 32, kernel_size=(3,), stride=(1,), padding=(1,))
          (3): PReLU(num_parameters=1)
        )
      )
    )
  )
  (final): Conv1d(32, 32, kernel_size=(22,), stride=(1,))
)

(Training dataset class occurrences: {4: 45, 3: 176, 1: 46, 2: 1}.)
svm finished: True; SV cnt 109; Trained SVM with LINEAR kernel with loss 920000.0000000066 and 109 outliers.
svm finished: True; SV cnt 97; Trained SVM with POLYNOMIAL (deg 2) kernel with loss 920000.0000057232 and 97 outliers.
svm finished: True; SV cnt 99; Trained SVM with GAUSSIAN (sigma 1) kernel with loss 920000.0000057232 and 97 outliers.

Results of classification of specially annotated notifications:
IDs of classified notifications: [11168292, 11169523, 11173814, 9961382, 9962202, 9971171, 10168557, 10213020, 10275769]
classes of classified notifications: [3, 3, 1, 3, 3, 1, 3, 2, 3]
SVM with LINEAR kernel: tensor([1., 1., 1., 1., 1., 1., 1., 1., 1.], dtype=torch.float64)
SVM with POLYNOMIAL kernel: tensor([1., 1., 1., 1., 1., 1., 1., 1., 1.], dtype=torch.float64)
SVM with GAUSSIAN kernel: tensor([1., 1., 1., 1., 1., 1., 1., 1., 1.], dtype=torch.float64)
logistic regression: tensor([[9.9913e-01, 8.7022e-04],
        [9.9989e-01, 1.1063e-04],
        [5.8318e-01, 4.1682e-01],
        [9.8751e-01, 1.2493e-02],
        [9.8005e-01, 1.9947e-02],
        [7.5498e-01, 2.4502e-01],
        [1.0000e+00, 4.7845e-07],
        [9.9959e-01, 4.1077e-04],
        [9.7019e-01, 2.9814e-02]], grad_fn=)
MLP classifier: tensor([[1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.],
        [1., 0.]], grad_fn=)

Results on test set:
SVM with LINEAR kernel: 1.0 (13/13)
SVM with POLYNOMIAL kernel: 1.0 (13/13)
SVM with GAUSSIAN kernel: 1.0 (13/13)
logistic regression: 0.8461538553237915 (11/13)
MLP classifier: 1.0 (13/13)
(class occurrence counts: {3: 11, 4: 2})
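
For reference, the three feature encoders in the summary above share the same two-layer MLP layout (in_features -> 64 -> 32 with dropout p=0.2), differing only in input width and activation. Below is a minimal PyTorch sketch of that layout; the FeatureEncoder class name and its constructor are assumptions introduced here for illustration and do not appear in the tested code, only the layer shapes and activations are taken from the printed summary.

    import torch
    import torch.nn as nn

    class FeatureEncoder(nn.Module):
        # Hypothetical reconstruction: two-layer MLP (in_features -> 64 -> 32)
        # with a configurable activation and dropout p=0.2, matching the layer
        # shapes printed in the model summary above.
        def __init__(self, in_features: int, activation: nn.Module):
            super().__init__()
            self.model = nn.Sequential(
                nn.Linear(in_features, 64),
                activation,
                nn.Linear(64, 32),
                nn.Dropout(p=0.2),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.model(x)

    # Instances with the widths and activations from the summary:
    notif_encoder = FeatureEncoder(29, nn.ReLU())   # NotifEncoder: 29 -> 64 -> 32, ReLU
    evnt_encoder = FeatureEncoder(11, nn.ReLU())    # EvntEncoder: 11 -> 64 -> 32, ReLU
    info_encoder = FeatureEncoder(28, nn.PReLU())   # InfoEncoder: 28 -> 64 -> 32, PReLU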