Releases: joaopauloschuler/neural-api
New examples, hard swish activation function, web server and more source code comments
New examples for beginners:
- One-neuron example that learns 2x - 3y + 4.
- One-neuron example that learns the OR logic operation.
- Example for the XOR logic function.
- Example for a non-linear function.
- Easy Delphi example for a quick start.
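As a taste of what the one-neuron examples do, here is a plain-Pascal sketch (deliberately not using the neural-api types, so every name below is illustrative) of a single linear neuron learning 2x - 3y + 4 by gradient descent:

```pascal
program OneNeuronSketch;
// Illustrative only: trains w1*x + w2*y + b to fit 2x - 3y + 4
// with plain stochastic gradient descent, no library calls.
const
  LearningRate = 0.01;
var
  w1, w2, b, x, y, Target, Output, Error: Single;
  i: Integer;
begin
  w1 := 0; w2 := 0; b := 0;
  for i := 1 to 100000 do
  begin
    x := Random * 2 - 1;        // random inputs in [-1, 1]
    y := Random * 2 - 1;
    Target := 2 * x - 3 * y + 4;
    Output := w1 * x + w2 * y + b;
    Error  := Output - Target;  // gradient of 0.5 * Error^2 w.r.t. Output
    w1 := w1 - LearningRate * Error * x;
    w2 := w2 - LearningRate * Error * y;
    b  := b  - LearningRate * Error;
  end;
  // The weights should approach w1=2, w2=-3, b=4.
  WriteLn('w1=', w1:0:2, ' w2=', w2:0:2, ' b=', b:0:2);
end.
```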
For advanced users:
- New hard swish activation function.
- New web server example.
This release also includes some bug fixes, better support for grouped convolutions, better documentation and source code comments.
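For reference, hard swish is commonly defined (for example in MobileNetV3) as x * ReLU6(x + 3) / 6; the new activation layer is assumed to follow the same formula. A minimal sketch of the math:

```pascal
program HardSwishSketch;
// Standard hard swish: hswish(x) = x * min(max(x + 3, 0), 6) / 6.
// The library's layer is assumed to implement this same formula.
function HardSwish(x: Single): Single;
var
  t: Single;
begin
  t := x + 3;
  if t < 0 then t := 0;  // ReLU6 of (x + 3): clamp to [0, 6]
  if t > 6 then t := 6;
  HardSwish := x * t / 6;
end;
begin
  WriteLn(HardSwish(-4):0:2);  // 0.00 (fully off below -3)
  WriteLn(HardSwish(0):0:2);   // 0.00
  WriteLn(HardSwish(4):0:2);   // 4.00 (identity above +3)
end.
```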
Faster backpropagation!
Updates in this new release v1.0.7 are:
- The CIFAR-10 Resized program has been added to the main README file. It resizes CIFAR-10 and CIFAR-100 images to 64x64 and 128x128 pixels.
- #77: Added results for the ResNet example.
- #78: Added TNNetLayer.ForcePositiveWeights.
- #79: Updated hypotenuse example.
- #81: Added FlipY with MultipleSamplesAtValidation.
- #84: Faster backpropagation by not backpropagating small values.
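The idea behind #84 can be sketched in a few lines: when the backpropagated error for an element is negligible, the corresponding update is skipped entirely. The threshold and names below are illustrative, not the library's actual implementation:

```pascal
program SkipSmallErrors;
const
  SkipThreshold = 0.001; // illustrative cutoff, not the library's value
var
  Errors: array[0..4] of Single = (0.5, 0.0002, -0.3, -0.00001, 0.01);
  i, Skipped: Integer;
begin
  Skipped := 0;
  for i := 0 to High(Errors) do
    if Abs(Errors[i]) < SkipThreshold then
      Inc(Skipped)  // gradient is negligible: skip the expensive update
    else
      { a real implementation would backpropagate Errors[i] here };
  WriteLn('Skipped ', Skipped, ' of ', Length(Errors), ' updates.');
end.
```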
ResNet, More Training, Less Validation and Loads Best Epoch
Variable Sample Sizes on CIFAR10 + Swish6 Activation Function
This release has:
- You can now define the validation sample size when loading CIFAR-10 and CIFAR-100 (#73). By tweaking this number, you may reach higher test accuracy when training your models.
- There is a new activation function TNNetSwish6. This activation function is a fantastic replacement for ReLU6 as it gives higher accuracy while maintaining numerical stability.
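Swish is x * sigmoid(x); by analogy with ReLU6, TNNetSwish6 is assumed here to cap the swish output at 6, which is where the numerical stability would come from. Check the library source for the exact definition. A sketch under that assumption:

```pascal
program Swish6Sketch;
// Assumed semantics: swish(x) = x * sigmoid(x), capped at 6.
function Swish6(x: Single): Single;
var
  s: Single;
begin
  s := x / (1 + Exp(-x));  // swish
  if s > 6 then s := 6;    // cap, as ReLU6 does for ReLU
  Swish6 := s;
end;
begin
  WriteLn(Swish6(0):0:2);    // 0.00
  WriteLn(Swish6(100):0:2);  // 6.00 (capped)
end.
```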
ChannelShiftRate Data Augmentation Property
This release implements new data augmentation fitting properties: TNeuralImageFit.ChannelShiftRate and TNeuralDataLoadingFit.ChannelShiftRate.
This release also fixes #70 - Using HasImgCrop breaks execution.
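As a rough sketch of what channel-shift augmentation does: each color channel of an image is scaled by an independent random factor. The property name comes from this release; the exact formula used by TNeuralImageFit may differ, so everything below is an illustration of the general technique:

```pascal
program ChannelShiftSketch;
const
  Rate = 0.1; // illustrative shift rate
var
  Pixel: array[0..2] of Single = (0.5, 0.5, 0.5); // one RGB pixel
  c: Integer;
  Factor: Single;
begin
  for c := 0 to 2 do
  begin
    Factor := 1 + (Random * 2 - 1) * Rate; // in [1 - Rate, 1 + Rate]
    Pixel[c] := Pixel[c] * Factor;         // scale this channel
  end;
  WriteLn(Pixel[0]:0:3, ' ', Pixel[1]:0:3, ' ', Pixel[2]:0:3);
end.
```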
More Layer Types! Better Sampling!
New Layers:
- TNNetChannelMulByLayer.
- TNNetCellMulByLayer.
- TNNetInterleaveChannels.
New Activation Functions:
API Additions:
- TNNetVolumeList.FillTag.
- TFileNameList.
- TNeuralImageLoadingFit #49.
- TNeuralThreadList.CalculateWorkingRange.
- TNeuralThreadList.GetRandomNumberOnWorkingRange.
- TVolume.GetValueCount.
- TVolume.GetSmallestIdxInRange.
- TNNetVolumeList.AddValue.
- TNNetVolumeList.Divi.
- TNNetNeuronList.GetMaxAbsWeight.
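To illustrate one of the additions: the semantics assumed here for TVolume.GetSmallestIdxInRange is "return the index of the smallest element within a sub-range". This is a standalone sketch, not the library's code:

```pascal
program SmallestIdxSketch;
{$mode objfpc}
// Assumed semantics for TVolume.GetSmallestIdxInRange: index of the
// smallest element in [StartIdx, StartIdx + Len - 1].
function GetSmallestIdxInRange(const Data: array of Single;
  StartIdx, Len: Integer): Integer;
var
  i: Integer;
begin
  Result := StartIdx;
  for i := StartIdx + 1 to StartIdx + Len - 1 do
    if Data[i] < Data[Result] then Result := i;
end;
var
  A: array[0..4] of Single = (3, 1, 4, 0.5, 2);
begin
  WriteLn(GetSmallestIdxInRange(A, 1, 3)); // 3 (the value 0.5)
end.
```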
Other Improvements:
- Code optimizations for non-x86 processors.
- Fixes #39 and #41.
Version 1.0!
After a long long time coding and testing, it's time to call it version 1.0! YAY!
Really comprehensive testing has been done. The last batch of testing came from a Generative Adversarial Network experiment that helped to find and fix plenty of bugs.
These are some of the images produced during this testing:
These are the main changes since the last release:
- We have a new convolutional layer type: TNNetConvolutionSharedWeights. Instead of having its own neurons and weights, this convolutional layer uses the neurons and weights of another existing layer. So, if you need two layers with the same neurons, you can add a TNNetConvolutionSharedWeights to your network. Why would you need something like this? Maybe your NN needs to learn the same patterns at different scales (big and small dogs are all dogs). You can follow this example.
- Added a FitLoading example.
- Updated documentation: #30 and #33.
- Fixed bugs #35 and #36.
- Added the CountNeurons method.
- TNNetDeMaxPool optimization.
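The weight-sharing idea behind TNNetConvolutionSharedWeights can be demonstrated with nothing but dynamic arrays, which share their buffer on assignment in Pascal. This is a concept sketch only, not the library's implementation:

```pascal
program SharedWeightsSketch;
{$mode objfpc}
// Two "layers" pointing at the same weight storage: an update made
// through either one is seen by both.
var
  LayerA, LayerB: array of Single;
begin
  SetLength(LayerA, 3);
  LayerA[0] := 1.0;
  LayerB := LayerA;       // LayerB now shares LayerA's weights, no copy
  LayerB[0] := 2.5;       // "training" through LayerB...
  WriteLn(LayerA[0]:0:1); // ...also changed LayerA's weight
end.
```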
There are still open bugs. But this is how life is.
I wish everyone happy pascal coding and to live long and prosper!
New Examples, Bug Fixes, and Layers
New examples:
- Learns Hypotenuse
- Simple Image Classifier with Parallel Convolutions
- Simple Plant Leaf Disease Classifier
Adds new layer types:
- TNNetReLUSqrt: applies ReLU followed by a square root.
- TNNetPower: computes the power function.
Documentation updates.
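The element-wise math behind the two new layers can be sketched as plain functions. The fixed exponent passed to PowerFn below is an assumption about how TNNetPower is parameterized:

```pascal
program NewLayersSketch;
// TNNetReLUSqrt: ReLU then square root (compresses large activations).
function ReLUSqrt(x: Single): Single;
begin
  if x > 0 then ReLUSqrt := Sqrt(x) else ReLUSqrt := 0;
end;
// TNNetPower sketch: x raised to a fixed power k, for x > 0.
function PowerFn(x, k: Single): Single;
begin
  PowerFn := Exp(k * Ln(x)); // x^k via exp/ln, valid for x > 0
end;
begin
  WriteLn(ReLUSqrt(9):0:2);   // 3.00
  WriteLn(ReLUSqrt(-4):0:2);  // 0.00
  WriteLn(PowerFn(2, 3):0:2); // 8.00
end.
```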