I am making a neural network in Clojure that takes a vector of integers and returns a data structure representing the layers of the network, so (make-layers [1 4 5]) would evaluate to:

[[0]          <-- input
 [0 0 0 0]    <-- hidden
 [0 0 0 0 0]] <-- output 
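For context, here is a minimal sketch of what make-layers does (the real implementation is in the gist linked below; this version just builds the zero-filled structure shown above):

(defn make-layers
  "Turn a vector of layer sizes into a vector of zero-filled layers."
  [sizes]
  (mapv #(vec (repeat % 0)) sizes))

(make-layers [1 4 5])
;; => [[0] [0 0 0 0] [0 0 0 0 0]]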

When I run my activation function on the weights of the network, however, I get a core.matrix error saying it cannot multiply a one-dimensional input vector by the transpose of the weights:

(core.matrix.operators/* inputs (transpose weights))
 user=> Incompatible shapes, cannot broadcast [1] to [4 2]

I understand why this is not working from the perspective of matrix multiplication, but I am not sure how to rewrite the function to deal with layers of arbitrary length.

Here is a gist that shows what I am working on: https://gist.github.com/gamma235/b8db845a512c60d123af


Solution

It sounds like you are using * (elementwise multiplication) when you really want to be using mmul (matrix multiplication).

If you are computing a length-4 hidden layer from a length-1 input vector, then your weight matrix should be a 4x1 matrix, e.g.:

(def weights [[1.0] [2.0] [3.0] [4.0]]) ;; a 4x1 weight matrix
(def input [1.5])
(mmul weights input)
=> [1.5 3.0 4.5 6.0]
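To handle layers of arbitrary length, you can store one weight matrix per pair of adjacent layers, each shaped [next-layer-size current-layer-size], and reduce over them with mmul. Here is a rough sketch along those lines (the names feed-forward, sigmoid and weight-matrices are made up for illustration, not taken from your gist):

(require '[clojure.core.matrix :as m])

(defn sigmoid [x]
  (/ 1.0 (+ 1.0 (Math/exp (- x)))))

(defn feed-forward
  "Propagate input through a sequence of weight matrices, applying the
   activation function elementwise after each matrix multiplication."
  [input weight-matrices]
  (reduce (fn [activations weights]
            (m/emap sigmoid (m/mmul weights activations)))
          input
          weight-matrices))

;; For the [1 4 5] network above, weight-matrices would hold
;; a 4x1 matrix followed by a 5x4 matrix.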

Hope that helps!
