Besides GANs, any major machine-learning breakthroughs?
#1


I, being poor, have only my dreams; I have spread my dreams under your feet; Tread softly because you tread on my dreams.
#2

I have already used GANs to generate all sorts of pictures...

Are there any other advances in AI that would be interesting to learn?

#3

How do you automate and optimise the NN structure for a particular problem, and let it adapt when new data comes in, instead of guessing and trying the number of layers, number of neurons, their interconnections, learning rate, momentum rate, activation functions, etc.?

In other words, how do you let the NN learn to learn, optimise, and adapt, all by itself, with no trial and error?
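One common answer to this today is automated hyperparameter search: instead of hand-tuning, sample configurations from a search space and keep the best. A minimal random-search sketch in plain NumPy; the dataset, the one-hidden-layer MLP, and the search space are all toy placeholders, not a recommendation for any particular problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR-like binary labels on 2-D points.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

def train_mlp(X, y, hidden, lr, epochs=200):
    """Train a one-hidden-layer MLP (tanh + sigmoid output); return final log loss."""
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)
        p = 1 / (1 + np.exp(-(h @ W2 + b2)))
        g = (p - y) / n                      # dLoss/dlogit for log loss
        W2 -= lr * h.T @ g
        b2 -= lr * g.sum()
        gh = np.outer(g, W2) * (1 - h**2)    # backprop through tanh
        W1 -= lr * X.T @ gh
        b1 -= lr * gh.sum(axis=0)
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Random search: sample configurations instead of hand-tuning them.
space = {"hidden": [2, 4, 8, 16], "lr": [0.01, 0.05, 0.1, 0.5]}
trials = []
for _ in range(10):
    cfg = {k: rng.choice(v) for k, v in space.items()}
    cfg["hidden"] = int(cfg["hidden"])
    cfg["lr"] = float(cfg["lr"])
    trials.append((train_mlp(X, y, **cfg), cfg))
best_loss, best_cfg = min(trials, key=lambda t: t[0])
print(best_cfg, round(best_loss, 3))
```

The same loop extends to any knob you can sample (momentum, activation, depth); libraries like Optuna or KerasTuner do exactly this with smarter sampling.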
#4

(13-08-2021, 04:19 PM)sgbuffett Wrote:  I have already used GANs to generate all sorts of pictures...

Are there any other advances in AI that would be interesting to learn?


[Image: safe-image.gif]
#5

(13-08-2021, 04:45 PM)WhatDoYouThink? Wrote:  How do you automate and optimise the NN structure for a particular problem, and let it adapt when new data comes in, instead of guessing and trying the number of layers, number of neurons, their interconnections, learning rate, momentum rate, activation functions, etc.?

In other words, how do you let the NN learn to learn, optimise, and adapt, all by itself, with no trial and error?

I have already thought through this problem.

In any problem space, the NN is to discover generalisations, e.g. what characteristics make a dog a dog.

The difficulty of this is unknown until you run it through an NN; if N layers is not enough, add another layer until you get the accuracy you want.

So one needs to figure out the "ease of generalisation" to determine the structure of the network needed.

I can do this by measuring the error vector after training an X-layer (say 3) NN, then using the error distribution to train another network to predict the number of layers needed to achieve a desired error.
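A hedged sketch of that idea: summarise a fixed-depth probe's error distribution on each task, then fit a "meta" regressor from those statistics to the depth that was needed. Real training runs are replaced here by a simulated difficulty-to-depth relationship, so every function and number below is an illustrative assumption, not the actual pipeline described above:

```python
import numpy as np

rng = np.random.default_rng(1)

def probe_error_stats(difficulty):
    """Simulated residual errors of a fixed 3-layer probe on one task."""
    errors = np.abs(rng.normal(loc=difficulty, scale=0.1 * difficulty, size=100))
    return np.array([errors.mean(), errors.std(), np.percentile(errors, 90)])

def layers_needed(difficulty):
    """Simulated ground truth: harder tasks need deeper networks."""
    return int(np.ceil(2 + 4 * difficulty))

# Build a meta-dataset over many simulated tasks.
difficulties = rng.uniform(0.1, 1.0, size=200)
F = np.stack([probe_error_stats(d) for d in difficulties])  # error features
t = np.array([layers_needed(d) for d in difficulties])      # required depth

# Meta-predictor: least-squares regression from error stats to depth.
A = np.column_stack([F, np.ones(len(F))])  # append bias column
w, *_ = np.linalg.lstsq(A, t, rcond=None)

# Predict the depth for a new task from its probe statistics alone.
new_stats = probe_error_stats(0.7)
pred = float(np.append(new_stats, 1.0) @ w)
print(round(pred))
```

In a real version, the features would come from actually training the 3-layer probe, and the regression could itself be a small NN rather than least squares.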

#6

(13-08-2021, 05:37 PM)sgbuffett Wrote:  I have already thought through this problem.

In any problem space, the NN is to discover generalisations, e.g. what characteristics make a dog a dog.

The difficulty of this is unknown until you run it through an NN; if N layers is not enough, add another layer until you get the accuracy you want.

So one needs to figure out the "ease of generalisation" to determine the structure of the network needed.

I can do this by measuring the error vector after training an X-layer (say 3) NN, then using the error distribution to train another network to predict the number of layers needed to achieve a desired error.

That's only the number of layers. There's still the number of neurons, their interconnections (mostly all-to-all), and so on... a lot of guesswork, and not optimised.
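Searching those remaining knobs jointly is what neural architecture search (NAS) tries to do; one simple family of methods evolves architectures by mutation and selection. A toy sketch where an architecture is a list of layer widths plus an activation name; the fitness function is a placeholder for real validation accuracy, and all names and numbers here are assumptions for illustration:

```python
import random

random.seed(0)

ACTIVATIONS = ["relu", "tanh", "sigmoid"]

def fitness(arch):
    """Toy proxy for validation accuracy: prefers ~3 layers of ~16 units with relu.
    A real NAS loop would train and evaluate each candidate instead."""
    widths, act = arch
    score = -abs(len(widths) - 3) - sum(abs(w - 16) for w in widths) / 16
    return score + (0.5 if act == "relu" else 0.0)

def mutate(arch):
    """Randomly tweak one aspect of an architecture."""
    widths, act = list(arch[0]), arch[1]
    op = random.choice(["width", "add", "drop", "act"])
    if op == "width" and widths:
        i = random.randrange(len(widths))
        widths[i] = max(1, widths[i] + random.choice([-4, 4]))
    elif op == "add":
        widths.append(random.choice([4, 8, 16, 32]))
    elif op == "drop" and len(widths) > 1:
        widths.pop(random.randrange(len(widths)))
    else:
        act = random.choice(ACTIVATIONS)
    return (widths, act)

# Evolve a small population: keep the top half, refill with mutants.
pop = [([8], random.choice(ACTIVATIONS)) for _ in range(8)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:4] + [mutate(random.choice(pop[:4])) for _ in range(4)]
best = max(pop, key=fitness)
print(best)
```

Published systems in this vein (e.g. Google's evolutionary NAS work) use the same loop with real training runs as the fitness, which is why NAS is so compute-hungry.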