- Time: 13:30 -
- Attendees: Bob Zhang (Supervisor), Huang Yanzhen, Mai Jiajun
- We applied per-channel normalization to the data samples.
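A minimal sketch of what per-channel normalization could look like; the channel axis layout (C, D, H, W) and the use of per-sample statistics are assumptions, since the notes do not specify them:

```python
import numpy as np

def per_channel_normalize(x, eps=1e-8):
    """Normalize each channel of a (C, D, H, W) volume to zero mean, unit std.

    Hypothetical sketch: the axis layout and per-sample (rather than
    dataset-wide) statistics are assumptions, not stated in the notes.
    """
    mean = x.mean(axis=(1, 2, 3), keepdims=True)  # one mean per channel
    std = x.std(axis=(1, 2, 3), keepdims=True)    # one std per channel
    return (x - mean) / (std + eps)

x = np.random.rand(3, 8, 8, 8)   # e.g. a 3-channel 8x8x8 volume
y = per_channel_normalize(x)
```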
- We added a batch-normalization layer after each convolution layer (before the activation).
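The conv → batch-norm → activation ordering can be sketched as below. This is an inference-style batch norm written in plain NumPy for illustration (no learnable gamma/beta, no running statistics), and `conv_out` merely stands in for the output of one of the convolution layers:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize a (N, C, D, H, W) tensor with one statistic per channel,
    # computed over the batch and spatial axes (inference-style sketch).
    mean = x.mean(axis=(0, 2, 3, 4), keepdims=True)
    var = x.var(axis=(0, 2, 3, 4), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def relu(x):
    return np.maximum(x, 0.0)

# Ordering used in the notes: convolution -> batch norm -> activation.
conv_out = np.random.randn(4, 16, 8, 8, 8)   # placeholder conv output
activated = relu(batch_norm(conv_out))
```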
- We replaced the original MaxPool3D layer with our custom ResPool layer, which adds the batch-normalized data back to the original input (hence the name Residual Pooling layer). Unlike 3D max-pooling, which discards 87.5% of the original information (keeping only 1 of the 8 voxels in each 2×2×2 window), ResPool merely emphasizes the local max over the original signal, boosting the model's learning performance.
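One plausible reading of ResPool, sketched in NumPy. The notes do not pin down the exact formulation, so this is a hypothetical variant: the batch-normed max branch is added to an average of the original 2×2×2 window, so the seven non-max voxels still contribute instead of being discarded:

```python
import numpy as np

def res_pool(x, eps=1e-5):
    """Hypothetical ResPool sketch (2x2x2 window, stride 2) on (N, C, D, H, W).

    Assumption: the "original information" is retained via an average over
    each pooling window, and the batch-normed max branch is added on top to
    emphasize the local maximum. The real layer may differ in detail.
    """
    n, c, d, h, w = x.shape
    blocks = x.reshape(n, c, d // 2, 2, h // 2, 2, w // 2, 2)
    max_branch = blocks.max(axis=(3, 5, 7))    # emphasized local max
    avg_branch = blocks.mean(axis=(3, 5, 7))   # keeps all 8 voxels' information
    # batch-normalize the max branch per channel before adding it back
    mean = max_branch.mean(axis=(0, 2, 3, 4), keepdims=True)
    var = max_branch.var(axis=(0, 2, 3, 4), keepdims=True)
    return avg_branch + (max_branch - mean) / np.sqrt(var + eps)

x = np.random.randn(2, 4, 8, 8, 8)
y = res_pool(x)   # spatial dims halve, like MaxPool3D with kernel 2
```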
- We removed the 4th convolution layer (C=32 -> C=32) and added a fully-connected layer, which reduces over-fitting.
- Training: min validation loss improved from 19.0367 (200 epochs) to 11.0318 (25 epochs).
- Application: no after-detection bias correction is needed anymore.
