
Releases: joaopauloschuler/neural-api

New examples, hard swish activation function, web server and more source code comments

01 Jul 00:18

New examples for beginners:

For advanced users:

This release also includes some bug fixes, better support for grouped convolutions, better documentation and source code comments.

Faster backpropagation!

04 Jun 07:04 · 624c566

Updates in release v1.0.7:

  • The CIFAR-10 Resized program has been added to the main readme file. It resizes CIFAR-10 and CIFAR-100 images to 64x64 and 128x128 pixels.

  • #77: Added results for the ResNet example.
  • #78: Added TNNetLayer.ForcePositiveWeights.
  • #79: Updated hypotenuse example.
  • #81: Added FlipY with MultipleSamplesAtValidation (see the sketch after this list).
  • #84: Faster backpropagation by not backpropagating small values.
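
For the #81 item, a minimal training sketch is shown below. It assumes TNeuralImageFit exposes HasFlipY and MultipleSamplesAtValidation properties named exactly as in the changelog entry (HasFlipX is already part of the fitting API); the tiny network, batch size and epoch count are illustrative only.

```pascal
program FlipYSketch;
{$mode objfpc}{$H+}
uses
  neuralnetwork, neuralvolume, neuraldatasets, neuralfit;

var
  NN: TNNet;
  NeuralFit: TNeuralImageFit;
  ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes: TNNetVolumeList;
begin
  // Loads the CIFAR-10 training, validation and test sets.
  CreateCifar10Volumes(ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes);

  // Tiny illustrative classifier.
  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),
    TNNetConvolutionReLU.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1),
    TNNetMaxPool.Create(4),
    TNNetFullConnectLinear.Create(10),
    TNNetSoftMax.Create()
  ]);

  NeuralFit := TNeuralImageFit.Create;
  NeuralFit.HasFlipX := True;
  // Assumed property names, taken from the #81 changelog entry:
  NeuralFit.HasFlipY := True;                     // vertical-flip augmentation
  NeuralFit.MultipleSamplesAtValidation := True;  // average several samples per validation image
  NeuralFit.Fit(NN, ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes,
    {NumClasses=}10, {BatchSize=}64, {Epochs=}50);
end.
```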

ResNet, More Training, Less Validation and Loads Best Epoch

30 Dec 08:34 · 75871e8

This is what is new in v1.0.6:

  • #73: CIFAR-10 now uses only 2,000 samples for validation by default.
  • #74: Fitting methods now load the weights from the best-performing epoch.
  • #76: Fixed a bug in TNNetSum that prevented ResNet-style architectures from learning (see the residual-block sketch after this list).
  • #77: Added a ResNet example.
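
Because #76 concerns ResNet-style learning through TNNetSum, here is a minimal residual-block sketch built around that layer. TNNetSum adds its input layers element-wise; the layer sizes and depth are illustrative only, so see the ResNet example added in #77 for a real architecture.

```pascal
program ResidualBlockSketch;
{$mode objfpc}{$H+}
uses
  neuralnetwork;

var
  NN: TNNet;
  Shortcut, Trunk: TNNetLayer;
begin
  NN := TNNet.Create();
  NN.AddLayer(TNNetInput.Create(32, 32, 3));
  NN.AddLayer(TNNetConvolutionReLU.Create({Features=}16, {FeatureSize=}3, {Padding=}1, {Stride=}1));
  Shortcut := NN.GetLastLayer();  // branch point of the identity shortcut

  // Two convolutions on the main path; output shape matches the shortcut.
  NN.AddLayer(TNNetConvolutionReLU.Create(16, 3, 1, 1));
  NN.AddLayer(TNNetConvolutionLinear.Create(16, 3, 1, 1));
  Trunk := NN.GetLastLayer();

  // Element-wise sum of the main path and the shortcut (the TNNetSum fixed in #76).
  NN.AddLayer(TNNetSum.Create([Trunk, Shortcut]));
  NN.AddLayer(TNNetReLU.Create());

  NN.AddLayer(TNNetMaxPool.Create(4));
  NN.AddLayer(TNNetFullConnectLinear.Create(10));
  NN.AddLayer(TNNetSoftMax.Create());
  NN.DebugStructure();  // prints the resulting layer structure
  NN.Free;
end.
```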

Variable Sample Sizes on CIFAR10 + Swish6 Activation Function

11 Nov 19:03

In this release:

  • You can now define the validation sample size when loading CIFAR-10 and CIFAR-100 (#73). By tweaking this number, you may find higher test accuracy when training your models.
  • There is a new activation function, TNNetSwish6. It is a fantastic replacement for ReLU6 as it gives higher accuracy while maintaining numerical stability (see the sketch after this list).
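
Below is a minimal sketch of a small CIFAR-10 classifier using TNNetSwish6 where TNNetReLU6 would normally go. It assumes TNNetSwish6 follows the usual no-argument constructor of the other activation layers; the layer sizes, batch size and epoch count are illustrative, and the new validation-sample-size option from #73 is not shown because its exact parameter is not documented in these notes.

```pascal
program Swish6Sketch;
{$mode objfpc}{$H+}
uses
  neuralnetwork, neuralvolume, neuraldatasets, neuralfit;

var
  NN: TNNet;
  NeuralFit: TNeuralImageFit;
  ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes: TNNetVolumeList;
begin
  // Loads CIFAR-10 with the default validation split.
  CreateCifar10Volumes(ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes);

  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),
    TNNetConvolutionLinear.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1),
    TNNetSwish6.Create(),   // drop-in replacement for TNNetReLU6.Create()
    TNNetMaxPool.Create(2),
    TNNetConvolutionLinear.Create({Features=}64, {FeatureSize=}3, {Padding=}1, {Stride=}1),
    TNNetSwish6.Create(),
    TNNetMaxPool.Create(2),
    TNNetFullConnectLinear.Create(10),
    TNNetSoftMax.Create()
  ]);

  NeuralFit := TNeuralImageFit.Create;
  NeuralFit.Fit(NN, ImgTrainingVolumes, ImgValidationVolumes, ImgTestVolumes,
    {NumClasses=}10, {BatchSize=}64, {Epochs=}50);
end.
```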

ChannelShiftRate Data Augmentation Property

18 Oct 23:30

This release implements new data augmentation fitting properties: TNeuralImageFit.ChannelShiftRate and TNeuralDataLoadingFit.ChannelShiftRate.
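
Below is a minimal configuration sketch for the new property. The 0.1 value is an illustrative choice only, and the comment describes the presumed behaviour (a random per-channel intensity shift during training) rather than anything stated in these notes.

```pascal
program ChannelShiftSketch;
{$mode objfpc}{$H+}
uses
  neuralfit;

var
  NeuralFit: TNeuralImageFit;
begin
  NeuralFit := TNeuralImageFit.Create;
  // Presumed behaviour: randomly shifts each colour channel by up to 10%
  // during training. The value 0.1 is illustrative only.
  NeuralFit.ChannelShiftRate := 0.1;
  // ... build a TNNet and call NeuralFit.Fit(...) as usual ...
  NeuralFit.Free;
end.
```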

This release also fixes #70 ("Using HasImgCrop breaks execution").

More Layer Types! Better Sampling!

14 Oct 01:39

New Layers:

  • TNNetChannelMulByLayer.
  • TNNetCellMulByLayer.
  • TNNetInterleaveChannels.

New Activation Functions:

  • TNNetSwish #65.
  • TNNetReLU6 #69.

API Additions:

  • TNNetVolumeList.FillTag.
  • TFileNameList.
  • TNeuralImageLoadingFit #49.
  • TNeuralThreadList.CalculateWorkingRange.
  • TNeuralThreadList.GetRandomNumberOnWorkingRange.
  • TVolume.GetValueCount.
  • TVolume.GetSmallestIdxInRange.
  • TNNetVolumeList.AddValue.
  • TNNetVolumeList.Divi.
  • TNNetNeuronList.GetMaxAbsWeight.

Other Improvements:

  • Improvements on #48.
  • Fixes #21, #55, #58, #61, #66 and #68.
  • Updated documentation.
  • New plant leaf disease classification example #49.
  • Better input sampling at TNeuralImageLoadingFit and TNeuralImageFit #51.

Code optimizations for non-x86 processors

02 Apr 16:16

This version fixes #42, #43, #44 and #45.

There are also code optimizations for processors other than x86.

Fixes #39 and #41

07 Feb 18:06

This release fixes bugs #39 and #41.

Version 1.0!

17 Jan 04:49 · 8d0494d

After a long, long time of coding and testing, it's time to call it version 1.0! YAY!

Really comprehensive testing has been done. The last batch came from Generative Adversarial Network (GAN) testing, which helped to debug a lot.


These are the main changes since the last release:

  • We have a new convolutional layer type: TNNetConvolutionSharedWeights. Instead of having its own neurons and weights, this convolutional layer reuses the neurons and weights of another existing layer. So, if you need two layers with the same neurons, you can add TNNetConvolutionSharedWeights to your network. Why would you need something like this? Perhaps your NN needs to learn the same patterns at different scales (big and small dogs are all dogs, for example). You can follow this example (see also the sketch after this list).
  • Added a FitLoading example.
  • Documentation has been updated: #30 and #33.
  • Fixes bugs #35 and #36.
  • Added method CountNeurons.
  • TNNetDeMaxPool optimization.
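
For the shared-weights layer above, here is a minimal sketch. It assumes, as the description suggests, that the TNNetConvolutionSharedWeights constructor takes the existing layer whose neurons and weights it reuses; check neuralnetwork.pas or the linked example for the exact signature. The idea is that the same filters are applied again after pooling, so the same patterns are learned at a different scale without extra weights.

```pascal
program SharedWeightsSketch;
{$mode objfpc}{$H+}
uses
  neuralnetwork;

var
  NN: TNNet;
  SharedConv: TNNetLayer;
begin
  NN := TNNet.Create();
  NN.AddLayer(TNNetInput.Create(64, 64, 3));

  // Convolution applied at full resolution; this is the layer whose
  // neurons and weights will be reused below.
  SharedConv := NN.AddLayer(TNNetConvolutionReLU.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1));
  NN.AddLayer(TNNetMaxPool.Create(2));

  // Assumed constructor: the shared layer reuses SharedConv's neurons and
  // weights, so the same filters now run at half the resolution.
  NN.AddLayer(TNNetConvolutionSharedWeights.Create(SharedConv));
  NN.AddLayer(TNNetMaxPool.Create(2));

  NN.AddLayer(TNNetFullConnectLinear.Create(10));
  NN.AddLayer(TNNetSoftMax.Create());
  NN.DebugStructure();
  NN.Free;
end.
```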

There are still open bugs. But this is how life is.

I wish everyone happy pascal coding and to live long and prosper!

New Examples, Bug Fixes, and Layers

09 Oct 15:36 · 882bd9b

New examples:

Fixes bugs #31 and #32.

Adds new layer types:

  • TNNetReLUSqrt: applies a ReLU followed by a square root (see the sketch after this list).
  • TNNetPower: calculates the power function.
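
A minimal sketch of TNNetReLUSqrt is shown below, assuming it uses the usual no-argument constructor of the other activation layers. TNNetPower is not included because its constructor parameters are not given in these notes.

```pascal
program ReLUSqrtSketch;
{$mode objfpc}{$H+}
uses
  neuralnetwork;

var
  NN: TNNet;
begin
  NN := TNNet.Create();
  NN.AddLayer([
    TNNetInput.Create(32, 32, 3),
    TNNetConvolutionLinear.Create({Features=}32, {FeatureSize=}3, {Padding=}1, {Stride=}1),
    TNNetReLUSqrt.Create(),  // y = sqrt(max(0, x)): ReLU followed by a square root
    TNNetMaxPool.Create(4),
    TNNetFullConnectLinear.Create(10),
    TNNetSoftMax.Create()
  ]);
  NN.DebugStructure();
  NN.Free;
end.
```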

Documentation updates.