Neural Network

Aug 3, 2010 at 9:57am

Hello maxers,

here is a beta version of a "do it yourself" Neural Network external; it's still a work in progress, but as I don't know what I can do with a NN: if you have suggestions ;-)

PS: currently Mac only, but the sources are included (don't forget to add commonsyms.c to the project before building the [papa] external).

Attachments:
  1. DIY_NN_beta.zip
#51612
Aug 3, 2010 at 1:46pm

Hello maxers,

oops, sorry; the NN is arbitrarily limited to 128 nodes per layer; the previous [papa] crashed with more; that's not the case with this one!!!

Attachments:
  1. Oscar_Papa.zip
#185159
Aug 6, 2010 at 6:26pm

I have a really hard time understanding these; what are "target" and "learn"?

#185160
Aug 6, 2010 at 6:31pm

By the way, here's a Windows compile…

Attachments:
  1. papaoscarwin.zip
#185161
Aug 7, 2010 at 1:43am

okay, let’s see if I understand this correctly, if you give a network say:
target 1 2 3 4 and learn 4 3 2 1
as well as target 4 3 2 1 learn 1 2 3 4
then if you send it 1 2 3 4 you should get 4 3 2 1 on the output?
I’ve noticed I have to rebuild the network after I have sent it data, is this normal?

I’m also getting values way out of range of the values I sent the NN, maybe it’s a bug in my compile?

#185162
Aug 7, 2010 at 5:51am

Hello Veqtor,

yep, more or less: NNs are used to discriminate classes in the data provided;

I’ve noticed I have to rebuild the network after I have sent it data, is this normal ?

No; normally the weights computed by a learning process should not change,
especially if you select "Save Data With Patcher". After building the NN, you can delete all the objects between [oscar] and [papa]; they are just used as a UI to design home-made (recurrent) networks and as memory: all the NN processing is in [papa]; [oscar] is just a remote (I don't have my computer here, but I think you can delete [oscar] too). Once the NN is built you should never have to build it again.

I’m also getting values way out of range of the values I sent the NN, maybe it’s a bug in my compile ?

If you are using "float", it's common to use values between [-1., 1.]; if you are using "int", my external just scales [0, 127] to [-1., 1.]; I did it just to test with MIDI notes; if you send "float" you will get "float" out in [-1., 1.]; if you send "int" you will get "int" out in [0, 127] …

Most of the time it's better to use extreme (boolean-style) float values, [-1., 1.], in the learning process to obtain a good discrimination: "target -1" "learn -1. 0. 1. -1"; use MIDI values like I do only if you want to test "generative music stuff", but in that case I have no idea how; and if you find an interesting approach, tell me ;-)

To sum up: what is NOT normal: "int" values > 127 or < 0!!!
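
For illustration, here is a minimal C sketch of the int <-> float scaling described above (the helper names are hypothetical, not part of the external):

```c
#include <stdio.h>

/* Hypothetical helpers mirroring the scaling described above:
   "int" values are MIDI-style [0, 127], "float" values are [-1., 1.]. */
static double int_to_unit(long midi)            /* [0, 127] -> [-1., 1.] */
{
    return (midi / 127.0) * 2.0 - 1.0;
}

static long unit_to_int(double x)               /* [-1., 1.] -> [0, 127] */
{
    long v = (long)((x + 1.0) * 0.5 * 127.0 + 0.5);
    return v < 0 ? 0 : (v > 127 ? 127 : v);     /* clip to the legal range */
}

int main(void)
{
    printf("%f\n",  int_to_unit(64));   /* ~0.008, roughly the middle */
    printf("%ld\n", unit_to_int(-1.0)); /* 0                          */
    printf("%ld\n", unit_to_int(1.0));  /* 127                        */
    return 0;
}
```

Anything outside those ranges cannot come out of this mapping, which is why "int" values above 127 or below 0 point to a bug.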

Post a patch if you want more detailed comments.

PS: Concerning help files and other explanations (I'm referring to the post you sent about Zoulou too): given the lack of feedback I get about what I post on this forum … not your case (thanks for the Windows compile), and I'm not complaining, as I am like everybody: interested in my own work and not in that of others ;-) … I don't think it's very useful to spend more hours (than I already do) on PDFs (with my catastrophic English) and so on, while it's so easy to talk here.

#185163
Aug 7, 2010 at 7:52am

Hello Veqtor,

I’ve noticed I have to rebuild the network after I have sent it data, is this normal ?

Concerning the various "Colors", "Function Transfert Oscar" and "Function Transfert Papa" attributes: you have to rebuild the NN after changing those, but not for the others.

#185164
Aug 9, 2010 at 2:09am

Well, an interesting approach would be to use floats as fuzzy-logic boolean states; I can think of ways of using that to generate drum sequences.

#185165
Aug 13, 2010 at 4:04pm

So, question: what objects can be placed in between [oscar] and [papa], and what purpose do the different objects serve?

#185166
Aug 13, 2010 at 5:27pm

Hello Veqtor,

[pack] -> node with "sigmoid/logistic" function;
[-] -> node with "linear" function;
[+] -> node with "linear" function too, but WITHOUT a delaying feedback synapse; just useful to loop a node with itself; it cannot be connected backward;

all other objects are nodes with a "tanh" function (the three functions are sketched just below).
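
For reference, a small C sketch of those three transfer functions (the function names are made up for illustration, not the external's API):

```c
#include <math.h>

/* The three node transfer functions listed above. */
static double node_logistic(double x) { return 1.0 / (1.0 + exp(-x)); }  /* [pack]             */
static double node_linear(double x)   { return x; }                      /* [-] and [+]        */
static double node_tanh(double x)     { return tanh(x); }                /* every other object */
```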

HTH.

#185167
Aug 14, 2010 at 5:44am

Hello Veqtor,

About the feedback synapse (the red one): the weight of this synapse is 1. and cannot be changed by the learning process; nodes have two entries, one normal (A) and one for time-delay storage (B); a blue synapse sends the signal from a node to increment A of another one; a red synapse sends the signal from a node to increment B (except for the special [+] node, where it is A); propagation is done forward, layer per layer; the signal from a node is: f_transfer(A + B); at the end of the process B -> A and 0 -> B;

all that stuff may create a "t - 1" delay ;-)
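
To make that concrete, here is a toy C sketch of the A/B mechanics as described above (the struct and function names are invented for illustration; they are not [papa]'s internals):

```c
#include <math.h>
#include <stdio.h>

/* Toy sketch of the A/B node mechanics described above. */
typedef struct node {
    double a;              /* normal entry (A), incremented by blue synapses       */
    double b;              /* time-delay storage (B), incremented by red synapses  */
    double out;            /* last emitted signal                                  */
    double (*f)(double);   /* transfer function (tanh, logistic, linear)           */
} node;

static double node_fire(node *n)          /* signal = f_transfer(A + B)            */
{
    n->out = n->f(n->a + n->b);
    return n->out;
}

static void send_blue(node *from, node *to, double w)  /* learned weight -> A      */
{
    to->a += w * from->out;
}

static void send_red(node *from, node *to)  /* fixed weight 1. -> B (delayed)      */
{
    to->b += from->out;
}

static void node_end_of_pass(node *n)     /* B -> A and 0 -> B: the "t - 1" delay  */
{
    n->a = n->b;
    n->b = 0.0;
}

int main(void)
{
    node in  = { 0.5, 0.0, 0.0, tanh };   /* input node, A preloaded               */
    node out = { 0.0, 0.0, 0.0, tanh };   /* output node, fed back to `in` in red  */

    for (int t = 0; t < 3; t++) {         /* a few passes, layer per layer         */
        node_fire(&in);
        send_blue(&in, &out, 0.8);        /* forward, learned weight               */
        node_fire(&out);
        send_red(&out, &in);              /* feedback, only takes effect at t + 1  */
        node_end_of_pass(&in);
        node_end_of_pass(&out);
        printf("t=%d in=%f out=%f\n", t, in.out, out.out);
        in.a += 0.5;                      /* re-feed the input each pass           */
    }
    return 0;
}
```

Running it shows the value sent over the red synapse only reaching the input node on the following pass, i.e. the "t - 1" delay.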

#185168
Aug 14, 2010 at 6:21am

can I ask a random question?
I notice that "isidore cholesterol" has a similar icon to "vanille bechamel", and also that "pizza olives" often posts with a certain sense of familiarity in threads that are also frequented by "vanille bechamel". So my question is: are you three the same person? Or do you all know each other?

(I'm just curious… I'm guessing all 3 of you know each other? Like you're from the same town? Montpellier, France? But maybe you're all the same person? I'm only wondering because I'm trying to see if I can spot a similar personality in the posts… I don't mean it in any bad/accusatory way… I learn so much from your posts…)

oh, and great Neural Network stuff by the way, thanks!

#185169
Aug 14, 2010 at 9:40am

Hello Noob4Life,

you are right, of course; just ME; at the beginning I created accounts to post after the "big forum change" a few months ago while my password didn't work; then to distinguish between the computers I use to connect (macOS, Windows, Linux); and finally for no reason at all, just for fun; it is true that it can generate some confusion;

I know that there are several maxers in Montpellier, but I personally know none of them; and that's good ;-)

… i don’t like “serious discussions” ; long time ago i became a cook to escape to all that stuff, and sure profile reproduction is same process ;

anyway: now, to be compatible with netiquette, I am going to kill the chicks …

salutations,
pizza olives.

#185170
Aug 14, 2010 at 1:05pm

ah, nice.

I have one more random question, if you don't mind… how did you come up with the name "isidore cholesterol"? Is that "Isidore" from Le Comte de Lautréamont (Isidore Ducasse)?
(Les Chants de Maldoror is one of my favorite books)
just wondering…

in any case, all three of you are very helpful! thanks again.

#185171
Aug 14, 2010 at 2:05pm

Hello Noob4Life,

could be, as Alain Ducasse is a famous cook ;-) … but no, and I have no idea from which area of my brain it comes; I have never read "Les Chants de Maldoror"; maybe I will (it looks nice on Wikipedia's page); when I was a student, my favorite trio was Witold Gombrowicz, Arno Schmidt, and Boris Vian; now I don't read novels so much …

#185172
May 18, 2013 at 7:30am

Sorry for resurrecting this thread, but just to keep the links up to date:

https://github.com/nicolasdanet/Max/tree/master/Oscar

#249762
