First we set up the neural network machinery. We need autograd's derivative functions and its adam optimizer, plus a helper to build the initial weights and biases:

```python
import autograd.numpy as np
import autograd.numpy.random as npr
from autograd import grad, elementwise_grad, jacobian
from autograd.misc.optimizers import adam

def init_random_params(scale, layer_sizes, rs=npr.RandomState(0)):
    """Build a list of (weights, biases) tuples, one for each layer."""
    return [(rs.randn(insize, outsize) * scale,   # weight matrix
             rs.randn(outsize) * scale)           # bias vector
            for insize, outsize in zip(layer_sizes[:-1], layer_sizes[1:])]

def C(params, inputs):
    "Neural network functions"
    for W, b in params:
        outputs = np.dot(inputs, W) + b
        inputs = np.tanh(outputs)  # activation on every layer but the returned one
    return outputs

# initial guess for the weights and biases
params = init_random_params(0.1, layer_sizes=[1, 8, 3])
```

Now, we train our network to reproduce the solution:

```python
# the objective and callback are defined above
params = adam(grad(objective), params,
              step_size=0.001, num_iters=N, callback=callback)
```

I ran this block manually a bunch of times, but eventually you see that we can train a one-layer network with 8 nodes to output all three concentrations pretty accurately. So, there is no issue there: a neural network can represent the solution.

Next, we need the derivatives of the network output with respect to time. Here I show how to get them, and make sure that the derivatives we pull out are comparable to the derivatives defined by the ODEs above. Our time input vector has 17 time values, in a column vector. That leads to an output from the NN with a shape of (17, 3), i.e. the concentrations of each species at each time. The jacobian will output an array of shape (17, 3, 17, 1), and we have to extract the pieces we want from that. Why does it have this shape? The first and third dimensions are related to the time steps. The second dimension is the species, and the last dimension is nothing here, but it is there because the input is in a column. I use some fancy indexing on the array to get the desired arrays of the derivatives. I only figured this out by direct comparison of the data from a numerical solution and the output of the jacobian.

The effort seems to have been worth it though; we get a pretty good solution from our neural network.
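To make those shapes concrete, here is a minimal numpy-only sketch. The linear map standing in for the network and the finite-difference jacobian are hypothetical stand-ins (autograd's `jacobian` returns the same output-shape + input-shape layout), but the fancy indexing at the end is the same trick used above:

```python
import numpy as np

t = np.linspace(0, 10, 17).reshape(-1, 1)      # 17 time points, column vector

W = np.array([[1.0, 2.0, 3.0]])                # hypothetical linear "network"
def C(t):
    return t @ W                               # output shape (17, 3)

def fd_jacobian(f, x, eps=1e-6):
    """Finite-difference jacobian with the same layout autograd uses:
    output shape + input shape."""
    y0 = f(x)
    J = np.zeros(y0.shape + x.shape)
    for k in range(x.shape[0]):
        xp = x.copy()
        xp[k, 0] += eps
        J[:, :, k, 0] = (f(xp) - y0) / eps
    return J

J = fd_jacobian(C, t)
print(J.shape)                                 # (17, 3, 17, 1)

# dC_i/dt at each time step lives on the "diagonal" of axes 0 and 2;
# fancy indexing pulls out one (17,)-array of derivatives per species.
i = np.arange(len(t))
dCAdt = J[i, 0, i, 0]
dCBdt = J[i, 1, i, 0]
dCCdt = J[i, 2, i, 0]
print(dCAdt)                                   # ≈ 1.0 at every time step
```

The only non-zero entries of the jacobian are where the first and third indices agree, because each output time depends only on its own input time; that is why the diagonal extraction recovers exactly one derivative per species per time step.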