In this lecture, we will briefly review neural networks and take our first look at computation graphs. We will discuss what computation graphs are and their semantics as a general language for describing functions. Then we will see the two kinds of computations they allow, namely the forward pass and the backward pass.
Lectures
 Neural networks refresher
 Computation graphs
 Forward pass with computation graphs
 Backward pass with computation graphs
 Constructing computation graphs
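As a preview of the forward and backward passes, here is a minimal hand-rolled computation graph in plain Python. This is an illustrative sketch only (the `Node`, `add`, `mul`, and `backward` names are made up for this example, not the API of any library): each node records its value from the forward pass and a closure that propagates gradients to its parents during the backward pass.

```python
class Node:
    """One vertex of a computation graph (toy sketch, not a library API)."""
    def __init__(self, value, parents=()):
        self.value = value            # computed during the forward pass
        self.grad = 0.0               # accumulated during the backward pass
        self.parents = parents
        self._backward_fn = lambda grad: None

def add(a, b):
    # Forward: a + b.  Backward: d(a+b)/da = 1, d(a+b)/db = 1.
    out = Node(a.value + b.value, (a, b))
    def backward_fn(grad):
        a.grad += grad
        b.grad += grad
    out._backward_fn = backward_fn
    return out

def mul(a, b):
    # Forward: a * b.  Backward: d(ab)/da = b, d(ab)/db = a.
    out = Node(a.value * b.value, (a, b))
    def backward_fn(grad):
        a.grad += grad * b.value
        b.grad += grad * a.value
    out._backward_fn = backward_fn
    return out

def backward(root):
    # Topologically order the graph, then push gradients from the root
    # back toward the leaves (reverse-mode differentiation).
    order, seen = [], set()
    def visit(node):
        if node not in seen:
            seen.add(node)
            for p in node.parents:
                visit(p)
            order.append(node)
    visit(root)
    root.grad = 1.0
    for node in reversed(order):
        node._backward_fn(node.grad)

# Build f(x, y) = (x + y) * y with x = 2, y = 3.
x, y = Node(2.0), Node(3.0)
f = mul(add(x, y), y)          # forward pass: f.value == 15.0
backward(f)                    # backward pass: x.grad == 3.0, y.grad == 8.0
print(f.value, x.grad, y.grad)
```

Note that gradients are accumulated with `+=` rather than assigned: `y` feeds into the graph twice, so its gradient is the sum of the contributions along both paths, df/dy = (x + y) + y = 8.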
Readings
Matrices and Tensors
 The PyTorch tutorial on tensors
 The Matrix Cookbook is an invaluable resource for quickly looking up identities, operations, approximations, etc. involving matrices and vectors.
Computation graphs

An introduction to computation graphs, by Christopher Olah.

Chapters 4 and 5 in Goldberg, Yoav. “Neural network methods for natural language processing.” Synthesis Lectures on Human Language Technologies 10, no. 1 (2017): 1–309.

Goldberg, Yoav. “A primer on neural network models for natural language processing.” Journal of Artificial Intelligence Research 57 (2016): 345–420. See section 5.
Automatic differentiation

The Wikipedia article on automatic differentiation is quite thorough.

Griewank, Andreas. “Who invented the reverse mode of differentiation?” Documenta Mathematica, Extra Volume ISMP (2012): 389–400.