brainstorm how to get objects into brain.js recurrent neural network
The standard network uses:
[
{input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 }},
{input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 }},
{input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 }}
]
for training. Would it be possible to get something like that into the recurrent neural network? If so, how?
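For context, here is a minimal sketch of how that object format is consumed by the standard feedforward network (assuming the usual brain.NeuralNetwork API; the color passed to run() is illustrative):

const brain = require('brain.js');

// The standard feedforward network trains directly on {input, output} objects,
// where each key maps to a number between 0 and 1.
const net = new brain.NeuralNetwork();
net.train([
  { input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 } },
  { input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 } },
  { input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 } },
]);

// Returns an object of output keys with confidences, e.g. { white: 0.9, black: 0.1 }.
const result = net.run({ r: 0.1, g: 0.4, b: 0.7 });

The question is whether the recurrent networks can accept a comparable object-based format.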
Top GitHub Comments
The idea of brain.js is simplicity first. By not abstracting the base neural nets, we can optimize towards speed and keep the learning curve low for understanding what is actually happening in the neural net, keeping things composable and understandable. However, I’d be open to abstracting if it meant meeting that goal. Also, the LSTM and GRU are abstractions. In the end, I think we wanted to just get it working 😋. Like Addy Osmani said: “First do it. Then do it right. Then do it better.” Also, I’m unaware of the “DenseNet” (a Densely Connected Convolutional Network?) in our codebase that you speak of.
I applaud your hard work towards less is more, and the synaptic project is fantastic. The code we started with was https://github.com/karpathy/recurrentjs, so it is just a reflection of that code aimed toward the brain.js API. We also have ES6 src, which is compiled into ES5 and a special browser (and browser min) file. This is all done automatically, but it does add to the codebase. I’m also interested in where the figure “7 times smaller” comes from; it would be helpful for comparing other projects in the future. We are also in the middle of removing things like tests that bring in dependencies. Does synaptic2 use matrix operations?
The actual matrix library was brought in from recurrentjs; we just simplified it, cut down on matrix instantiation, and split it apart so that when we call .toFunction() we can get the inner values of the objects and reuse them as a non-function operation (an example of which I sent to my mother, which outputs “hi mom!”). Also, the actual matrix code fits in about 8 lines. The rest of it is some leftover (unused, and thank you for bringing that to my attention) methods from recurrentjs, plus utility functions for going to and from JSON, and math. The math was again brought in from recurrentjs and does very simple operations that will eventually be made to run on the GPU where available. If there is a library that would give us what we have (reused matrices, relu, tanh, rowPluck, sampleI, maxI), I’m all for using it.
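A hedged sketch of the .toFunction() flow described above, assuming the brain.recurrent.LSTM API; the training string and iteration count are illustrative and are not taken from the original “hi mom!” example:

const brain = require('brain.js');

// Character-level recurrent net trained on a single phrase.
const net = new brain.recurrent.LSTM();
net.train(['hi mom!'], { iterations: 1500 });

// toFunction() inlines the trained matrix values into a standalone function,
// so the result can run without the brain.js library loaded.
// Exact output shape depends on the brain.js version.
const standalone = net.toFunction();
console.log(standalone('hi')); // expected to continue the phrase, e.g. "hi mom!"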
Thank you for your time in looking at our codebase!
This is now supported for recurrent time step neural networks in brain.js 1.6.0+; see https://gist.github.com/robertleeplummerjr/713a47d5fd63e8e189f8cf5cbc0649cd.
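As a rough illustration of the pattern that gist covers (the key names, layer sizes, and values below are made up for the example; the gist is the authoritative reference):

const brain = require('brain.js');

// Time step recurrent net: inputSize/outputSize match the number of keys
// on each object in a training sequence.
const net = new brain.recurrent.LSTMTimeStep({
  inputSize: 2,
  hiddenLayers: [10],
  outputSize: 2,
});

// Train on an array containing one sequence of objects
// that all share the same numeric keys.
net.train([
  [
    { low: 0.1, high: 0.9 },
    { low: 0.2, high: 0.8 },
    { low: 0.3, high: 0.7 },
    { low: 0.4, high: 0.6 },
  ],
]);

// Given the start of a sequence, predict the next step as an object,
// e.g. something close to { low: 0.3, high: 0.7 }.
const next = net.run([
  { low: 0.1, high: 0.9 },
  { low: 0.2, high: 0.8 },
]);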