
Is this an infant Neural Network, or just random code?

Posted: Sun Sep 28, 2014 3:06 am
by Andynonomous
Hi there,

I'm just wondering if anybody out there has ever tried simulating anything with neural networks? I'd like to eventually code an evolution simulation based on NNs, but I don't know if my code is viable or if I'm totally on the wrong track. If anyone has any experience, I'd love it if you could critique my (very basic, no back-propagation yet) Neuron code. Thanks in advance!

Code: Select all

Neuron = {}

-- count the entries in a table by walking it with pairs
function tablelength(T)
  local count = 0
  for _ in pairs(T) do count = count + 1 end
  return count
end

-- standard Lua prototype-style constructor
function Neuron:new(o)
	o = o or {}
	setmetatable(o, self)
	self.__index = self
	return o
end

-- store the input table and give each input a random weight in [-1, 1]
function Neuron:init(inputs)
	self.inputs = inputs
	self.weights = {}

	for i = 1, tablelength(self.inputs), 1 do
		self.weights[i] = love.math.random() * 2.0 - 1.0
	end
end

-- weighted sum of the inputs, squashed into (-1, 1) with tanh
function Neuron:fire()
	local weightedInput = 0
	for i = 1, tablelength(self.inputs), 1 do
		weightedInput = weightedInput + self.inputs[i] * self.weights[i]
	end
	return math.tanh(weightedInput)
end

Re: Is this an infant Neural Network, or just random code?

Posted: Sun Sep 28, 2014 9:22 am
by Robin
One thing: instead of that tablelength function, use the # length operator:

Code: Select all

   for i = 1, #self.inputs do
      weightedInput = weightedInput + self.inputs[i] * self.weights[i]
   end
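
The same change works in Neuron:init, something like:

Code: Select all

   for i = 1, #self.inputs do
      self.weights[i] = love.math.random() * 2.0 - 1.0
   end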
More generally, you just took your first step. It's a bit early to ask if you're on the right track. Just go for it. :)

Re: Is this an infant Neural Network, or just random code?

Posted: Mon Sep 29, 2014 6:57 pm
by undef
Depending on how many neurons you end up with, you might want to optimize this part:

Code: Select all

function Neuron:init(inputs)
   self.inputs = inputs
   -- local aliases are cheaper to look up inside the loop
   local r, weights = love.math.random, {}
   for i = 1, #inputs do
      weights[i] = r() * 2.0 - 1.0
   end
   self.weights = weights
end
This is faster in a big for loop, because Lua resolves names from the smallest scope outward: by making the random function a local, the call in the loop skips the global and table-field lookups that love.math.random would otherwise need on every iteration.
I also made a local weights table to avoid double indexing (self.weights[i]) in the loop; it only gets assigned to self.weights once, after the loop.

Other than that, like Robin already said, get rid of the tablelength function.
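
The same trick applies to Neuron:fire, since that's the loop that will run every time a neuron fires. A rough sketch (not benchmarked), pulling the two tables into locals before the loop:

Code: Select all

function Neuron:fire()
   -- local aliases so the loop doesn't index self on every iteration
   local inputs, weights = self.inputs, self.weights
   local weightedInput = 0
   for i = 1, #inputs do
      weightedInput = weightedInput + inputs[i] * weights[i]
   end
   return math.tanh(weightedInput)
end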

Re: Is this an infant Neural Network, or just random code?

Posted: Tue Sep 30, 2014 2:45 am
by Andynonomous
Thanks!