Genetic neural network plays FlappyBird
Posted: Tue Oct 31, 2017 4:18 am
by MateConBigote
I started learning about neural networks, so I decided to create a neural network in Love2d capable of learning to play Flappy Bird.
After a lot of programming, this is what I got. I hope you like it.
GitHub link
Re: Genetic neural network plays FlappyBird
Posted: Tue Oct 31, 2017 6:15 am
by zorg
Seems like an interesting project, though the overall file structure is weird, mainly because you're not including everything in the .love as you normally should.
If it was dofile that failed you there (and it seems likely), then you should use require instead (or love.filesystem.load, though that doesn't execute the chunk it loads): LÖVE's require paths do search inside the .love archive, whereas standard Lua io functions don't.
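For example, a minimal sketch assuming a module file neuralnet.lua inside the .love (the file name is just an illustration):
Code: Select all
-- dofile goes through standard Lua io, so it can't see inside a .love archive:
dofile("neuralnet.lua")                -- works from a plain folder, fails from a .love

-- require searches love.filesystem, so it works in both cases:
local neuralnet = require("neuralnet")

-- love.filesystem.load returns the loaded chunk without running it:
local chunk = love.filesystem.load("neuralnet.lua")
local neuralnet2 = chunk()             -- execute it yourself to get the module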
Re: Genetic neural network plays FlappyBird
Posted: Tue Oct 31, 2017 2:15 pm
by MateConBigote
Actually, the structure of the code was deliberate, but now that I think about it, it was not a very good idea, since it leaves the program organized in a confusing way that is difficult to explain.
Now I'm working on restructuring it in a more standard way.
Re: Genetic neural network plays FlappyBird
Posted: Tue Oct 31, 2017 5:50 pm
by vrld
Fun project that leaves room for so much experimentation. Here are a few suggestions:
For the neural net:
- Use a different activation function, for example ReLU, leaky ReLU, sinc, etc. (here is a list; a few are sketched after this list). How do the different activation functions affect how fast the computer learns to play the game?
- Use different activation functions for each layer. Or even for each individual neuron.
- Use a different number of hidden neurons. How many neurons do you need at minimum?
- Use a different number of hidden layers. It can be shown that one hidden layer is enough in principle (the universal approximation theorem), but do more layers help or hurt training?
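To make the first two points concrete, here are a few drop-in alternatives to the usual sigmoid, written as plain Lua functions (just a sketch; adapt them to however your net applies its activation):
Code: Select all
local function relu(x)       return math.max(0, x) end
local function leaky_relu(x) return x > 0 and x or 0.01 * x end
local function sinc(x)       return x == 0 and 1 or math.sin(x) / x end
local function tanh(x)       return math.tanh(x) end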
For the genetic algorithm:
- Use tournament selection instead of random selection of parents for recombination (see the sketch after this list). How does that affect the learning speed?
- Use a different recombination strategy, for example, choose individual weights (not whole layers) randomly from either parent.
- Select more weights from the fitter parent.
- Use a different mutation strategy: instead of randomly replacing a whole layer, randomly replace individual weights.
- … or randomly change the activation function.
- Add a small random offset to a random weight, instead of replacing the weight entirely.
- Use elitism instead of randomly killing half of the birds: always keep the best of all birds.
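Here is a rough sketch of a few of those ideas (tournament selection, per-weight recombination, offset mutation). It assumes each bird stores a flat array bird.weights and a score bird.fitness; those field names are just placeholders for whatever your code actually uses:
Code: Select all
-- Tournament selection: pick k random candidates, keep the fittest.
local function tournament(population, k)
    local best
    for i = 1, k do
        local candidate = population[love.math.random(#population)]
        if not best or candidate.fitness > best.fitness then
            best = candidate
        end
    end
    return best
end

-- Per-weight recombination: each weight comes from either parent at random.
local function crossover(a, b)
    local child = {weights = {}}
    for i = 1, #a.weights do
        child.weights[i] = love.math.random() < 0.5 and a.weights[i] or b.weights[i]
    end
    return child
end

-- Offset mutation: nudge a few weights instead of replacing whole layers.
local function mutate(bird, rate, scale)
    for i = 1, #bird.weights do
        if love.math.random() < rate then
            bird.weights[i] = bird.weights[i] + (love.math.random() * 2 - 1) * scale
        end
    end
end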
And lastly, it might be fun to look inside the head of a bird. Here is a function that draws the output of a net for a range of possible inputs (aka the decision surface of the net):
Code: Select all
function paint(bird)
    local img = love.image.newImageData(300, 300)
    img:mapPixel(function(x, y)
        -- input x is always positive, input y may also be negative
        local input = mx{{x, y - 150}}
        -- forward pass through the net: hidden layer, then output layer
        local result = mx.replace(mx.mul(input, bird.hiden), sigmoid)
        result = mx.replace(mx.mul(result, bird.output), sigmoid)
        -- scale the activation (0..1) to a grayscale value (0..255)
        result = mx.getelement(result, 1, 1) * 255
        return result, result, result, 255 -- maybe add a color palette here?
    end)
    return img
end
You can use this function to plot the brain of the best and worst birds in your population. Comparing the images, does that tell you anything about good solutions?
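For example, assuming bestBird holds the fittest bird of the current generation (a placeholder name), you could display its decision surface like this:
Code: Select all
local brainImage = love.graphics.newImage(paint(bestBird))

function love.draw()
    love.graphics.draw(brainImage, 0, 0)
end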