DFYS AutoDiff Documentation

Background

Introduction

Automatic differentiation (AD) is a family of techniques for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. Applications of AD include Newton's method for solving nonlinear equations, real-parameter optimization, probabilistic inference, and backpropagation in neural networks. AD has become extremely popular with the booming development of machine learning and deep learning techniques. Our AD software package enables users to calculate derivatives using the forward and reverse modes.

Our package's features include support for second order derivatives (including the Hessian matrix), root finding, optimization (Newton's method, gradient descent, BFGS), and backpropagation.

Mathematical Background

Automatic differentiation decomposes a complex function into a sequence of operations on elementary functions, evaluates the derivatives at each intermediate stage, and repeatedly applies the chain rule to obtain the derivative of the outermost function. We provide explanations of the related math concepts below.

Elementary functions

The class of functions consisting of the polynomials, the exponential functions, the logarithmic functions, the trigonometric functions, the inverse trigonometric functions, and the functions obtained from those listed by the four arithmetic operations and by superposition (i.e. composition) applied finitely many times.

Chain Rule - Used to compute the derivative of a composite function - Core of automatic differentiation

For the first derivative:

\[\dfrac{dy}{dx} = \dfrac{dy}{du}\cdot\dfrac{du}{dx}\]

For the second derivative:

\[\dfrac{\partial^2 y}{\partial x_i \partial x_j} = \sum_k\left(\dfrac{\partial y}{\partial u_k}\dfrac{\partial^2 u_k}{\partial x_i \partial x_j}\right) + \sum_{k,l}\left(\dfrac{\partial^2 y}{\partial u_k \partial u_l}\dfrac{\partial u_k}{\partial x_i}\dfrac{\partial u_l}{\partial x_j}\right)\]
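For example, differentiating \(f(x) = \cos(\pi x)\) (one factor of the function used in the examples below) via \(u = \pi x\) gives

\[\dfrac{df}{dx} = \dfrac{df}{du}\cdot\dfrac{du}{dx} = -\sin(u)\cdot\pi = -\pi\sin(\pi x)\]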

Topological Graph - Each node represents a variable - Arrows indicate topological order (the order of operations) and the operations themselves.
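For example, \(f(x) = \cos(\pi x)\exp(-x^2)\) decomposes into the sequence \(u_1 = \pi x\), \(u_2 = \cos(u_1)\), \(u_3 = x^2\), \(u_4 = \exp(-u_3)\), \(u_5 = u_2 u_4\): each intermediate \(u_i\) is a node, and an arrow points from each operand to the result of the operation that consumes it.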

Forward Mode Autodifferentiation

Visit each node in topological order, storing the value of each variable in its node. Let x denote the input variable with respect to which we differentiate. For a variable \(u_i=g_i(v)\), where \(\dfrac{dv}{dx}\) is already known, calculate \(\dfrac{du_i}{dx}= \dfrac{du_i}{dv}\dfrac{dv}{dx}\).
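To make this concrete, here is a minimal sketch of forward mode on \(f(x) = \cos(\pi x)\exp(-x^2)\), written in plain numpy and independent of this package's API: every intermediate carries the pair (value, derivative with respect to x), and each elementary operation applies its local chain rule.

import numpy as np

def fwd_cos(v):    # v = (value, d value / dx)
    return (np.cos(v[0]), -np.sin(v[0])*v[1])

def fwd_exp(v):
    return (np.exp(v[0]), np.exp(v[0])*v[1])

def fwd_mul(u, v): # product rule
    return (u[0]*v[0], u[1]*v[0] + u[0]*v[1])

x  = (1.0, 1.0)                  # seed: dx/dx = 1
u1 = (np.pi*x[0], np.pi*x[1])    # pi*x
u2 = fwd_cos(u1)                 # cos(pi*x)
u3 = (-x[0]**2, -2.0*x[0]*x[1])  # -x^2
u4 = fwd_exp(u3)                 # exp(-x^2)
f  = fwd_mul(u2, u4)
print(f)  # (-0.3678..., 0.7357...), matching the Getting Started results below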

Reverse Mode Autodifferentiation

Reverse mode consists of a forward computation and a backward computation.

Step 1: Forward Computation

Follow the topological order and store the value of each variable in its node.

Step 2: Backward Computation

Let y denote our final output variable and \(u_j\), \(v_j\) denote the intermediate variables.

  1. Initialize all partial derivatives \(\dfrac{dy}{du_j}\) to 0 and \(\dfrac{dy}{dy}\) to 1.
  2. Visit each node in reverse topological order. For a variable \(u_i=g_i(v_1,...,v_n)\) whose \(\dfrac{dy}{du_i}\) is already known, increment \(\dfrac{dy}{dv_j}\) by \(\dfrac{dy}{du_i}\dfrac{du_i}{dv_j}\), as sketched below.
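A minimal sketch of these two steps on the graph \(y = \sin(x_1)\cdot x_2\), in plain numpy and independent of this package's API:

import numpy as np

x1, x2 = 1.0, 2.0

# Step 1: forward computation, storing each intermediate value
u = np.sin(x1)          # u = sin(x1)
y = u * x2              # y = u * x2

# Step 2: backward computation, in reverse topological order
dy_dy  = 1.0                  # initialize dy/dy = 1 (all others start at 0)
dy_du  = dy_dy * x2           # increment dy/du  by dy/dy * d(u*x2)/du
dy_dx2 = dy_dy * u            # increment dy/dx2 by dy/dy * d(u*x2)/dx2
dy_dx1 = dy_du * np.cos(x1)   # increment dy/dx1 by dy/du * du/dx1

print(dy_dx1, dy_dx2)   # 2*cos(1) = 1.0806..., sin(1) = 0.8415...

The same values appear again in the autodiff.backprop demo below.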

Installation

Install Through PyPI

The easiest way to install autodiff is via pip. Just type pip install DFYS-autodiff in the command line.

pip install DFYS-autodiff

Install Manually

The user can choose to install autodiff directly from the source in this repository. We assume that the user has already installed pip and virtualenv (a quick sanity check follows the steps):

  1. clone the project repo by git clone git@github.com:D-F-Y-S/cs207-FinalProject.git
  2. cd into the local repo and create a virtual environment by virtualenv env
  3. activate the virtual environment by source env/bin/activate (use deactivate to deactivate the virtual environment later.)
  4. install the dependencies by pip install -r requirements.txt
  5. install autodiff by pip install -e .
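To verify the installation, one can run a quick sanity check (a minimal example; it assumes the install succeeded and that Variable and sin are importable from autodiff.forward, as used throughout this documentation):

from autodiff.forward import Variable, sin

x = Variable()
f = sin(x)
print(f.derivative_at(x, {x: 0.0}))  # expected: cos(0) = 1.0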

Getting Started

Univariate Functions

The standard workflow for autodiff is to first initialize a Variable, or several Variables. We then use these Variables to construct Expressions, which can then be queried for values and derivatives.

In [24]:
import numpy              as np
import matplotlib.pyplot  as plt
from mpl_toolkits.mplot3d import Axes3D

from autodiff.forward     import *

Suppose we want to calculate the derivatives of \(f(x) = \cos(\pi x)\exp(-x^2)\). We can start with creating a Variable called x.

In [3]:
x = Variable()

We then create the Expression for \(f(x)\). Note that here cos and exp are library functions from autodiff.

In [4]:
f = cos(np.pi*x)*exp(-x**2)

We can then evaluate \(f(x)\)'s value and derivative by calling the evaluation_at method and the derivative_at method. For the derivative_at method, the first argument specifies which variable to take the derivative with respect to, and the second argument specifies the point in the domain at which the derivative is to be calculated.

In [5]:
f.evaluation_at({x: 1})
Out[5]:
-0.36787944117144233
In [6]:
f.derivative_at(x, {x: 1})
Out[6]:
0.7357588823428846

The derivative_at method also supports second order derivatives. If we want to calculate \(\dfrac{d^2 f}{d x^2}\), we can add another argument order=2.

In [7]:
f.derivative_at(x, {x: 1}, order=2)
Out[7]:
2.895065669313077

Both the methods evaluation_at and derivative_at are vectorized: instead of passing in a scalar value, we can pass in a numpy.array, and the output will be f's value / derivative at all entries of the input. For example, we can calculate the value, first order derivative and second order derivative of \(f(x)\) on the interval \([-2, 2]\) simply by

In [8]:
interval = np.linspace(-2, 2, 200)
values = f.evaluation_at(   {x: interval})
der1st = f.derivative_at(x, {x: interval})
der2nd = f.derivative_at(x, {x: interval}, order=2)

Let’s see what they look like.

In [9]:
fig  = plt.figure(figsize=(16, 8))
plt.plot(interval, values, c='magenta',     label='$f(x)$')
plt.plot(interval, der1st, c='deepskyblue', label='$\dfrac{df(x)}{dx}$')
plt.plot(interval, der2nd, c='purple',      label='$\dfrac{d^2f(x)}{dx^2}$')
plt.xlabel('x')
plt.legend()
plt.show()
_images/Getting_Started_16_0.png

Multivariate Functions

The workflow with multivariate functions is essentially the same.

Suppose we want to calculate the derivatives of \(g(x, y) = \cos(\pi x)\cos(\pi y)\exp(-x^2-y^2)\). We can start with adding another Variable called y.

In [10]:
y = Variable()

We then create the Expression for \(g(x, y)\).

In [11]:
g = cos(np.pi*x) * cos(np.pi*y) * exp(-x**2-y**2)

We can then evaluate \(g(x, y)\)'s value and derivatives by calling the evaluation_at method and the derivative_at method, as usual.

In [12]:
g.evaluation_at({x: 1.0, y: 1.0})
Out[12]:
0.1353352832366127
In [13]:
g.derivative_at(x, {x: 1.0, y: 1.0})
Out[13]:
-0.27067056647322535
In [14]:
g.derivative_at(x, {x: 1.0, y: 1.0}, order=2)
Out[14]:
-1.0650351405815222

Now that we have two variables, we may want to calculate \(\dfrac{\partial^2 g}{\partial x \partial y}\). We can just replace the first argument of derivative_at with a tuple (x, y). In this case the third argument order=2 can be omitted, because the Expression can infer from the first argument that we are looking for a second order derivative.

In [15]:
g.derivative_at((x, y), {x: 1.0, y: 1.0})
Out[15]:
0.5413411329464506

We can also ask g for its Hessian matrix. A numpy.array will be returned.

In [29]:
g.hessian_at({x: 1.0, y:1.0})
Out[29]:
array([[-1.06503514,  0.54134113],
       [ 0.54134113, -1.06503514]])

Since the evaluation_at method and derivative_at method are vectorized, we can just as well pass in a mesh grid, and the output will be a grid of the same shape. For example, we can calculate the value, first order derivative and second order derivative of \(g(x, y)\) on the region \(x\in[-2,2], y\in[-2,2]\) simply by

In [20]:
us, vs = np.linspace(-2, 2, 200), np.linspace(-2, 2, 200)
uu, vv = np.meshgrid(us, vs)
In [21]:
values = g.evaluation_at(        {x: uu, y:vv})
der1st = g.derivative_at(x,      {x: uu, y:vv})
der2nd = g.derivative_at((x, y), {x: uu, y:vv})

Let’s see what they look like.

In [22]:
def plt_surf(uu, vv, zz):
    fig  = plt.figure(figsize=(16, 8))
    ax   = Axes3D(fig)
    surf = ax.plot_surface(uu, vv, zz, rstride=2, cstride=2, alpha=0.8, cmap='cool')
    ax.set_xlabel('x')
    ax.set_ylabel('y')
    ax.set_zlabel('z')
    ax.set_proj_type('ortho')
    plt.show()
In [25]:
plt_surf(uu, vv, values)
_images/Getting_Started_35_0.png
In [26]:
plt_surf(uu, vv, der1st)
_images/Getting_Started_36_0.png
In [27]:
plt_surf(uu, vv, der2nd)
_images/Getting_Started_37_0.png

Vector Functions

Functions mapping \(\mathbb{R}^n \mapsto \mathbb{R}^m\) are also supported. Here we create a VectorFunction that represents \(h(\begin{bmatrix}x\\y\end{bmatrix}) = \begin{bmatrix}f(x)\\g(x, y)\end{bmatrix}\).

In [30]:
h = VectorFunction(exprlist=[f, g])

We can then evaluate \(h(\begin{bmatrix}x\\y\end{bmatrix})\)'s value and gradient (\(\begin{bmatrix}\dfrac{\partial f}{\partial x}\\\dfrac{\partial g}{\partial x}\end{bmatrix}\) and \(\begin{bmatrix}\dfrac{\partial f}{\partial y}\\\dfrac{\partial g}{\partial y}\end{bmatrix}\)) by calling its evaluation_at method and gradient_at method; the first argument of gradient_at selects the variable. The jacobian_at method returns the Jacobian matrix (\(\begin{bmatrix}\dfrac{\partial f}{\partial x} & \dfrac{\partial f}{\partial y} \\ \dfrac{\partial g}{\partial x} & \dfrac{\partial g}{\partial y} \end{bmatrix}\)).

In [31]:
h.evaluation_at({x: 1.0, y: -1.0})
Out[31]:
array([-0.36787944,  0.13533528])
In [35]:
h.gradient_at(x, {x: 1.0, y: -1.0})
Out[35]:
array([ 0.73575888, -0.27067057])
In [33]:
h.jacobian_at({x: 1.0, y: -1.0})
Out[33]:
array([[ 0.73575888,  0.        ],
       [-0.27067057,  0.27067057]])

Libraries Demo

autodiff.forward

Univariate Functions

The standard workflow for autodiff is to first initialize a Variable, or several Variables. We then use these Variables to construct Expressions, which can then be queried for values and derivatives.

In [65]:
import numpy              as np
import matplotlib.pyplot  as plt
from mpl_toolkits.mplot3d import Axes3D
from autodiff.forward     import *

Suppose we want to calculate the derivatives of \(f(x) = \cos(\pi x)\exp(-x^2)\). We can start with creating a Variable called x.

In [66]:
x = Variable()

We then create the Expression for \(f(x)\). Note that here cos and exp are library functions from autodiff.

In [67]:
f = cos(np.pi*x)*exp(-x**2)

We can then evaluate \(f(x)\)'s value and derivative by calling the evaluation_at method and the derivative_at method. For the derivative_at method, the first argument specifies which variable to take the derivative with respect to, and the second argument specifies the point in the domain at which the derivative is to be calculated.

In [68]:
f.evaluation_at({x: 1})
Out[68]:
-0.36787944117144233
In [69]:
f.derivative_at(x, {x: 1})
Out[69]:
0.73575888234288456

The derivative_at method also supports second order derivatives. If we want to calculate \(\dfrac{d^2 f}{d x^2}\), we can add another argument order=2.

In [70]:
f.derivative_at(x, {x: 1}, order=2)
Out[70]:
2.8950656693130772

Both the methods evaluation_at and derivative_at are vectorized: instead of passing in a scalar value, we can pass in a numpy.array, and the output will be f's value / derivative at all entries of the input. For example, we can calculate the value, first order derivative and second order derivative of \(f(x)\) on the interval \([-2, 2]\) simply by

In [71]:
interval = np.linspace(-2, 2, 200)
values = f.evaluation_at(   {x: interval})
der1st = f.derivative_at(x, {x: interval})
der2nd = f.derivative_at(x, {x: interval}, order=2)
In [72]:
fig  = plt.figure(figsize=(16, 8))
plt.plot(interval, values, c='magenta',     label='$f(x)$')
plt.plot(interval, der1st, c='deepskyblue', label='$\dfrac{df(x)}{dx}$')
plt.plot(interval, der2nd, c='purple',      label='$\dfrac{d^2f(x)}{dx^2}$')
plt.xlabel('x')
plt.legend()
plt.show()
_images/Libraries_Demo_16_0.png

Multivariate Functions

The workflow with multivariate functions is essentially the same.

Suppose we want to calculate the derivatives of \(g(x, y) = \cos(\pi x)\cos(\pi y)\exp(-x^2-y^2)\). We can start with adding another Variable called y.

In [73]:
y = Variable()

We then create the Expression for \(g(x, y)\).

In [74]:
g = cos(np.pi*x) * cos(np.pi*y) * exp(-x**2-y**2)

We can then evaluate \(g(x, y)\)'s value and derivatives by calling the evaluation_at method and the derivative_at method, as usual.

In [75]:
g.evaluation_at({x: 1.0, y: 1.0})
Out[75]:
0.1353352832366127
In [76]:
g.derivative_at(x, {x: 1.0, y: 1.0})
Out[76]:
-0.27067056647322535
In [77]:
g.derivative_at(x, {x: 1.0, y: 1.0}, order=2)
Out[77]:
-1.0650351405815222

Now that we have two variables, we may want to calculate \(\dfrac{\partial^2 g}{\partial x \partial y}\). We can just replace the first argument of derivative_at with a tuple (x, y). In this case the third argument order=2 can be omitted, because the Expression can infer from the first argument that we are looking for a second order derivative.

In [78]:
g.derivative_at((x, y), {x: 1.0, y: 1.0})
Out[78]:
0.54134113294645059

We can also ask g for its Hessian matrix. A numpy.array will be returned.

In [79]:
g.hessian_at({x: 1.0, y:1.0})
Out[79]:
array([[-1.06503514,  0.54134113],
       [ 0.54134113, -1.06503514]])

Since the evaluation_at method and derivative_at method are vectorized, we can just as well pass in a mesh grid, and the output will be a grid of the same shape. For example, we can calculate the value, first order derivative and second order derivative of \(g(x, y)\) on the region \(x\in[-2,2], y\in[-2,2]\) simply by

In [80]:
us, vs = np.linspace(-2, 2, 200), np.linspace(-2, 2, 200)
uu, vv = np.meshgrid(us, vs)
In [81]:
values = g.evaluation_at(        {x: uu, y:vv})
der1st = g.derivative_at(x,      {x: uu, y:vv})
der2nd = g.derivative_at((x, y), {x: uu, y:vv})

Let’s see what they look like.

In [82]:
def plt_surf(uu, vv, zz):
    fig  = plt.figure(figsize=(16, 8))
    ax   = Axes3D(fig)
    surf = ax.plot_surface(uu, vv, zz, rstride=2, cstride=2, alpha=0.8, cmap='cool')
    ax.set_xlabel('x')
    ax.set_ylabel('y')
    ax.set_zlabel('z')
    ax.set_proj_type('ortho')
    plt.show()
In [83]:
plt_surf(uu, vv, values)
_images/Libraries_Demo_35_0.png
In [84]:
plt_surf(uu, vv, der1st)
_images/Libraries_Demo_36_0.png
In [85]:
plt_surf(uu, vv, der2nd)
_images/Libraries_Demo_37_0.png

Vector Functions

Functions mapping \(\mathbb{R}^n \mapsto \mathbb{R}^m\) are also supported. Here we create a VectorFunction that represents \(h(\begin{bmatrix}x\\y\end{bmatrix}) = \begin{bmatrix}f(x)\\g(x, y)\end{bmatrix}\).

In [86]:
h = VectorFunction(exprlist=[f, g])

We can then evaluate \(h(\begin{bmatrix}x\\y\end{bmatrix})\)'s value and gradient (\(\begin{bmatrix}\dfrac{\partial f}{\partial x}\\\dfrac{\partial g}{\partial x}\end{bmatrix}\) and \(\begin{bmatrix}\dfrac{\partial f}{\partial y}\\\dfrac{\partial g}{\partial y}\end{bmatrix}\)) by calling its evaluation_at method and gradient_at method; the first argument of gradient_at selects the variable. The jacobian_at method returns the Jacobian matrix (\(\begin{bmatrix}\dfrac{\partial f}{\partial x} & \dfrac{\partial f}{\partial y} \\ \dfrac{\partial g}{\partial x} & \dfrac{\partial g}{\partial y} \end{bmatrix}\)).

In [87]:
h.evaluation_at({x: 1.0, y: -1.0})
Out[87]:
array([-0.36787944,  0.13533528])
In [88]:
h.gradient_at(x, {x: 1.0, y: -1.0})
Out[88]:
array([ 0.73575888, -0.27067057])
In [89]:
h.jacobian_at({x: 1.0, y: -1.0})
Out[89]:
array([[ 0.73575888,  0.        ],
       [-0.27067057,  0.27067057]])

autodiff.rootfinding

The rootfinding module provides the function newton_scalar to find a root of a given function of arbitrarily many variables. It also works with the back propagation mode. For visualization purposes, we only show examples with up to 2 variables here.

Example 1: solve \(f(x) = x^2-4x = 0\) starting from \(x = 1\)

In [90]:
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from autodiff.forward import *
from autodiff.rootfinding import *
from mpl_toolkits.mplot3d import Axes3D
In [91]:
x = Variable()
f = x**2-4*x
result_d = newton_scalar(f,{x:1},max_itr=100)
In [92]:
xx= np.linspace(-np.pi,np.pi,100)
plt.plot(xx,xx**2,color = 'black')
plt.plot(xx,4*xx,color = 'blue')
plt.scatter([result_d[x]],[f.evaluation_at({x:result_d[x]})],color = 'red')
Out[92]:
<matplotlib.collections.PathCollection at 0x107e954e0>
_images/Libraries_Demo_50_1.png

Example 2: solve \(f(x,y) = x^2-xy = 0\) starting from \(x=1\) and \(y=10\)

In [93]:
x, y = Variable(), Variable()
f = x**2-x*y
result_d = newton_scalar(f,{x:1,y:10},max_itr = 100)
In [94]:
fig  = plt.figure(figsize=(16, 8))
ax   = Axes3D(fig)
us, vs = np.linspace(-1, 1, 200), np.linspace(-1, 1, 200)
uu, vv = np.meshgrid(us, vs)
zz = f.evaluation_at({x: uu, y:vv})
ax.plot([0], [0], [0], marker='o', markersize=15, c='green',alpha = .5)
surf = ax.plot_surface(uu, vv, zz, rstride=2, cstride=2, alpha=0.8, cmap='cool')
ax.plot([result_d[x]], [result_d[y]],
    [f.evaluation_at({x:result_d[x],y:result_d[y]})],
    marker='x', markersize=20, c='red',alpha = .8)
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
ax.view_init(30, 50)
plt.show()
_images/Libraries_Demo_53_0.png

autodiff.optimize

In [95]:
import numpy              as np
from   autodiff.forward   import *
import autodiff.optimize  as opt
from mpl_toolkits.mplot3d import Axes3D
import matplotlib.pyplot  as plt
%matplotlib inline

We include several basic optimization routines built on autodiff.forward. Here we'll use the Rosenbrock function to demonstrate the use of these optimization routines. The Rosenbrock function is defined as \(f(x, y) = (a-x)^2 + b(y-x^2)^2\). Here we use \(a=1, b=100\).

In [96]:
x, y = Variable(), Variable()
f = (1-x)**2 + 100*(y-x**2)**2
In [97]:
us, vs = np.linspace(-2, 1.5, 200), np.linspace(0.0, 3.5, 200)
uu, vv = np.meshgrid(us, vs)
values = f.evaluation_at({x: uu, y:vv})

The landscape of the function looks like below. The global minimum is at \([1, 1]\); it is marked by the red star.

In [98]:
def plt_surf(uu, vv, zz, traj=None, show_dest=False, show_traj=False):
    fig  = plt.figure(figsize=(16, 8))
    ax   = Axes3D(fig)
    if show_traj: ax.plot(traj[0], traj[1], traj[2], marker='>', markersize=7, c='orange')
    if show_dest: ax.plot([1.0], [1.0], [0.0], marker='*', markersize=15, c='red')
    surf = ax.plot_surface(uu, vv, zz, rstride=2, cstride=2, alpha=0.8, cmap='cool')
    ax.set_xlabel('x')
    ax.set_ylabel('y')
    ax.set_zlabel('z')
    ax.set_proj_type('ortho')
    plt.show()
In [99]:
plt_surf(uu, vv, values, show_dest=True)
_images/Libraries_Demo_61_0.png

autodiff.optimize.gradient_descent

Let's say we start from \((0.0, 3.0)\). We'll first use gradient descent to find the minimum. Gradient descent is implemented in autodiff.optimize.gradient_descent. Here we set the argument return_history=True to return the whole history of the optimization.

In [100]:
hist = opt.gradient_descent(f, init_val_dict={x: 0.0, y: 3.0}, max_iter=10000,
                            return_history=True)

We can plot our optimization path as below. We can see that gradient descent approaches the minimum slowly because the gradient around the minimum is small.

In [101]:
hist = np.array(hist)
us, vs = hist[:, 0].flatten(), hist[:, 1].flatten()
zs     = f.evaluation_at({x: us, y: vs})
plt_surf(uu, vv, values, (us, vs, zs), show_dest=True, show_traj=True)
_images/Libraries_Demo_66_0.png

autodiff.optimize.newton

We'll then use Newton's method to find the minimum. Newton's method is implemented in autodiff.optimize.newton. Here we set the argument return_history=True to return the whole history of the optimization.

In [102]:
hist = opt.newton(f, init_val_dict={x: 0.0, y: 3.0}, max_iter=10000,
                  return_history=True)

We can plot our optimization path as below. Newton's method makes use of second-derivative information. We can see that it takes far fewer steps to reach the minimum.

In [103]:
hist = np.array(hist)
us, vs = hist[:, 0].flatten(), hist[:, 1].flatten()
zs     = f.evaluation_at({x: us, y: vs})
plt_surf(uu, vv, values, (us, vs, zs), show_dest=True, show_traj=True)
_images/Libraries_Demo_71_0.png

autodiff.optimize.gradient_descent

Now let's look at the gradient_descent method again. Unlike Newton's method, it does not need the Hessian matrix to find the minimum; the trade-off is that the algorithm might get stuck in a local minimum and take more iterations.

In [104]:
hist = opt.gradient_descent(f, init_val_dict={x: 0.0, y: 3.0}, max_iter=10000,
                  return_history=True)

We see that gradient descent took a lot more steps than Newton's method.

In [105]:
hist = np.array(hist)
us, vs = hist[:, 0].flatten(), hist[:, 1].flatten()
zs     = f.evaluation_at({x: us, y: vs})
plt_surf(uu, vv, values, (us, vs, zs), show_dest=True, show_traj=True)
_images/Libraries_Demo_76_0.png

autodiff.optimize.bfgs

Lastly, we'll use BFGS to find the minimum. BFGS is a quasi-Newton method that approximates the Hessian matrix during the optimization. The optimization path of BFGS can be quite erratic, so we'll first just show the optimization result. It is \([1.0, 1.0]\), as we expected.

In [106]:
res = opt.bfgs(f, init_val_dict={x: 0.0, y: 3.0})
In [107]:
print(res[x], res[y])
1.00000000001 1.00000000001

Let's look at the plot for bfgs; we see it blows up before it gets to the minimum.

In [108]:
hist = opt.bfgs(f, init_val_dict={x: 0.0, y: 0.0}, max_iter=10000,
                  return_history=True)
hist = np.array(hist)
us, vs = hist[:, 0].flatten(), hist[:, 1].flatten()
zs     = f.evaluation_at({x: us, y: vs})
plt_surf(uu, vv, values, (us, vs, zs), show_dest=True, show_traj=True)
_images/Libraries_Demo_82_0.png

Let's take a closer look by excluding the very large values in the first few iterations.

In [109]:
hist_trim = hist[5:,:]
In [110]:
us, vs = hist_trim[:, 0].flatten(), hist_trim[:, 1].flatten()
zs     = f.evaluation_at({x: us, y: vs})
plt_surf(uu, vv, values, (us, vs, zs), show_dest=True, show_traj=True)
_images/Libraries_Demo_85_0.png

autodiff.plot

The plot module's plot_contour function takes in a single expression with exactly two component variables. It then uses either Newton's method or gradient descent to find the minimum of the given function. It plots the values of the function at different points on a contour map, with ranges specified by the user, and highlights the trajectory of the optimization algorithm as it approaches the minimum.

In [111]:
import autodiff.forward as fwd
import autodiff.optimize as opt
from autodiff.plot import plot_contour
In [112]:
x, y = fwd.Variable(), fwd.Variable()
f = 100.0*(y - x**2)**2 + (1 - x)**2.0
init_val_dict = {x: 0.0, y: 1.0}
plot_contour(f,init_val_dict,x,y,plot_range=[-0.5,0.8],method = "gradient_descent")
_images/Libraries_Demo_89_0.png

We see that Newton's method used merely 2 iterations.

In [113]:
plot_contour(f,init_val_dict,x,y,plot_range=[-1,1.5],method = "newton")
_images/Libraries_Demo_91_0.png

autodiff.backprop

In [134]:
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from autodiff.backprop import *
from autodiff.forward import *
from autodiff.rootfinding import *
import time

The backpropagation module is built upon the interfaces developed in the central code file autodiff.forward. It calculates the derivative of the root node of the computational graph with respect to each node in the graph. Therefore, with different root nodes, we should expect to see different derivative values. Suppose we have the following structure:

\(x = 1\), \(y = 2\)

\(c = \sin(x)\)

\(d = c \cdot y\)

Note: after one round of back propagation, the .bder attribute stores the answer from the last round until it is cleared when a new round is called.

In [135]:
x = Variable()
y = Variable()
c = sin(x)
d = c*y
back_propagation(c,{x:1,y:2})
print('derivative of c with respect to x is ', x.bder)
print('derivative of c with respect to y is ', y.bder)
back_propagation(d,{x:1,y:2})
print('derivative of d with respect to x is ', x.bder)
print('derivative of d with respect to y is ', y.bder)
derivative of c with respect to x is  0.540302305868
derivative of c with respect to y is  0
derivative of d with respect to x is  1.08060461174
derivative of d with respect to y is  0.841470984808

If we calculate by hand:

\[\begin{align} \dfrac{dc}{dx} &= \cos(1) = 0.54 \\ \dfrac{dc}{dy} &= 0 \\ \dfrac{dd}{dx} &= y\cdot\dfrac{dc}{dx} = 2\cos(1) = 1.08 \\ \dfrac{dd}{dy} &= c = \sin(1) = 0.84 \end{align}\]

Our backward mode is faster than forward mode when getting the derivatives of all nodes in a computational graph, because it caches intermediate results along the way.

Users can use our backward mode to build their own neural networks.

In [136]:
start1 = time.time()
x = Variable()
y = Variable()
c = sin(x)
d = cos(y)
e = sin(x)*cos(y)
f = tan(e)
for i in range(10000):
    back_propagation(f,{x:1,y:2})
end1 = time.time()
interval = end1-start1
print('derivative of f with respect to x is ', x.bder)
print('derivative of f with respect to y is ', y.bder)
print('derivative of f with respect to c is ', c.bder)
print('derivative of f with respect to d is ', d.bder)
print('derivative of f with respect to e is ', e.bder)
print('derivative of f with respect to f is ', f.bder)
print('derivative of f with respect to g is ', g.bder)
print('time taken is {} second'.format(interval))
derivative of f with respect to x is  -0.254837416116
derivative of f with respect to y is  -0.867211207612
derivative of f with respect to c is  0
derivative of f with respect to d is  0
derivative of f with respect to e is  1.1333910384
derivative of f with respect to f is  1
derivative of f with respect to g is  0
time taken is 0.43090200424194336 second
In [137]:
start2 = time.time()
for i in range(10000):
    forward_x = f.derivative_at(x,{x:1,y:2})
    forward_y = f.derivative_at(y,{x:1,y:2})
    forward_c = f.derivative_at(c,{x:1,y:2})
    forward_d = f.derivative_at(d,{x:1,y:2})
    forward_e = f.derivative_at(e,{x:1,y:2})
    forward_f = f.derivative_at(f,{x:1,y:2})
    forward_g = f.derivative_at(g,{x:1,y:2})
end2 = time.time()
interval = end2-start2
print('derivative of f with respect to x is ', forward_x)
print('derivative of f with respect to y is ', forward_y)
print('derivative of f with respect to c is ', forward_c)
print('derivative of f with respect to d is ', forward_d)
print('derivative of f with respect to e is ', forward_e)
print('derivative of f with respect to f is ', forward_f)
print('derivative of f with respect to g is ', forward_g)
print(interval)
derivative of f with respect to x is  -0.254837416116
derivative of f with respect to y is  -0.867211207612
derivative of f with respect to c is  -0.0
derivative of f with respect to d is  -0.0
derivative of f with respect to e is  1.1333910384
derivative of f with respect to f is  1.0
derivative of f with respect to g is  -0.0
0.8440079689025879

Back propagation is also integrated with the newton_scalar function from the rootfinding module.

Note that the cosine function has multiple roots, and Newton's method will only give you the first one it finds.

In [142]:
result_d=newton_scalar(d,{x:1,y:-1},max_itr = 25,method = 'backward')
In [143]:
print('x:',result_d[x])
print('y:',result_d[y])
print('function value:',abs(d.evaluation_at({x:result_d[x],y:result_d[y]})))
x: 2.84112466652
y: -1.5707963268
function value: 5.91243550575e-13

Implementation Details

High-level Design

Core Functions: Static Structure

The central data structures in autodiff are Expression and ElementaryFunction (the common interface shared by Add, Mul, Pow, Exp, Sin, ...). An Expression represents a mathematical expression. It is composed of one ElementaryFunction plus up to two sub-Expressions. Expression has two child classes: Variable, which represents a 'base' variable, and Constant, which represents a constant.
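For instance, the overridden operators and the factory functions (described below) assemble such a tree automatically. A sketch of the structure built for a small expression, in terms of the Expression constructor shown in the source listing later on this page:

from autodiff.forward import *

x = Variable()
g = sin(x) * exp(x)
# Structurally, g is (conceptually):
#   Expression(Mul,
#              Expression(Sin, x),   # built by the sin() factory
#              Expression(Exp, x))   # built by the exp() factory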

Core Functions: Dynamic Behavior

When an Expression's derivative_at method is called, it passes its sub-Expression(s) to its ElementaryFunction's derivative_at method. The ElementaryFunction's derivative_at method then computes the derivative based on the chain rule. In this process, the ElementaryFunction needs the values and derivatives of the sub-Expression(s), so it calls their evaluation_at and derivative_at methods and uses the returned values to calculate the derivative. In other words, Expressions and ElementaryFunctions call each other recursively until the base of the recursion is reached.

The base of this recursive process lies in the Constant class and the Variable class. When a Constant is asked for its derivative, it returns 0. When a Variable is asked for its derivative, it checks whether it is itself the variable the derivative is taken with respect to: if so, it returns 1.0; otherwise it returns 0.0.
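A small trace of the recursion, using the API shown earlier:

from autodiff.forward import *

x, y = Variable(), Variable()
g = x * y

# g.derivative_at(x, {x: 2.0, y: 3.0}) dispatches to Mul.derivative_at,
# which recursively queries the sub-Expressions:
#   x.derivative_at(x, ...) -> 1.0   (base case: var is self)
#   y.derivative_at(x, ...) -> 0.0   (base case: var is not self)
#   x.evaluation_at(...)    -> 2.0
#   y.evaluation_at(...)    -> 3.0
# and combines them by the product rule: 1.0*3.0 + 2.0*0.0 = 3.0
print(g.derivative_at(x, {x: 2.0, y: 3.0}))   # expected: 3.0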

On Second Order derivatives

The implementation of second order derivatives is conceptually very similar to the implementation of first order derivatives, except that it implements a different chain rule. The knowledge of the chain rule is encapsulated within the derivative_at method of each ElementaryFunction. Because every ElementaryFunction involves either one or two sub-Expressions, Faà di Bruno's formula is much less frightening to implement than it may seem.
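For example, for Mul the second-order chain rule reduces to the two-argument product rule below; the four terms are exactly what Mul.derivative_at in the source listing computes.

\[\dfrac{\partial^2 (uv)}{\partial x_i \partial x_j} = \dfrac{\partial^2 u}{\partial x_i \partial x_j}v + u\dfrac{\partial^2 v}{\partial x_i \partial x_j} + \dfrac{\partial u}{\partial x_i}\dfrac{\partial v}{\partial x_j} + \dfrac{\partial u}{\partial x_j}\dfrac{\partial v}{\partial x_i}\]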

Core Classes

The core class of autodiff is Expression and its child classes (Variable and Constant). They share the same interface: all implement their own evaluation_at and derivative_at methods. The dunder methods of Expression are overridden so that any operation on an Expression also returns an Expression. Variable and Constant inherit these dunder methods, so they behave the same way as Expression.

An Expression is composed of one ElementaryFunction and up to two sub-Expressions. ElementaryFunctions like Sin, Exp and Add implement the chain rule associated with the corresponding elementary function. Note that sin and exp are different from Sin and Exp: the former two are factory functions that return an Expression which has Sin or Exp as its ElementaryFunction.
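Because the factories return Expressions, calls nest naturally. A small example (exp's one-line factory appears verbatim in the source listing below):

from autodiff.forward import *

x = Variable()
h = exp(exp(x))
# h is Expression(Exp, Expression(Exp, x)): the outer Expression's
# sub_expr1 is itself an Expression, so evaluation_at recurses through
# two Exp nodes before reaching the Variable base case.
print(h.evaluation_at({x: 0.0}))   # exp(exp(0)) = e = 2.71828...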

External Dependencies

autodiff depends on numpy. All of autodiff's calculations are done in numpy for efficiency and the advantages of vectorization. The optimize module depends on scipy for solving linear systems. The plot module depends on matplotlib for plotting.

Functions Details

Autodiff.forward

forward API documentation

forward module

This file contains the central data structure and functions related to the forward mode auto differentiation.

"""
This file contains the central data structure and functions related to the
forward mode auto differentiation. 
"""

import numpy as np

class Expression:
    """ 
    This is a class for representing expression.
    It is the super class for variable and constant.
    """
    def __init__(self, ele_func, sub_expr1, sub_expr2=None):
        """ 
        The constructor for the Expression class. 
        
        PARAMETERS:
        =======
        ele_func: the function creating this expression
        sub_expr1: variable/constant composing this expression
        sub_expr2: variable/constant composing this expression, set to None
        for unary operations
        """
        self._ele_func  = ele_func
        self._sub_expr1 = sub_expr1
        self._sub_expr2 = sub_expr2
        self.val = None
        self.bder=0
    
    def evaluation_at(self, val_dict):
        """ 
        The wrapper function for the evaluation_at method of 
        self._ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a scalar value 
        """
        # self._sub_expr2 is None implies that self._ele_func is an unary operator
        if self._sub_expr2 is None: 
            return self._ele_func.evaluation_at(
                self._sub_expr1, val_dict)
        
        # self._sub_expr2 not None implies that self._ele_func is a binary operator
        else:
            return self._ele_func.evaluation_at(
                self._sub_expr1, self._sub_expr2, val_dict)
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The wrapper function for the derivative_at method of 
        self._ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable of interests for derivative calculation
        
        RETURNS
        ======== 
        a scalar value 
        """
        
        if type(var) is tuple: order=len(var)
        if var is self: 
            if   order == 1: return 1.0
            else: return 0.0
        
        # sub_expr2 being None implies that _ele_func is an unary operator
        if self._sub_expr2 is None:
            return self._ele_func.derivative_at(
                self._sub_expr1, var, val_dict, order)
        
        # sub_expr2 not None implies that _ele_func is a binary operator
        else:
            return self._ele_func.derivative_at(
                self._sub_expr1, self._sub_expr2, var, val_dict, order)
    
    def back_derivative(self,var,val_dict):
        """
        The wrapper function for the backderivative_at method of 
        self._ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values. Variables
        in val_dict are of atomic feature and cannot be further decomposed.
        var: variable with respect to which the function calculates derivative   
        
        RETURNS
        ========
        derivative of the immediate parent expression that contains var,
        taken with respect to var
        """
        if var is self: return 1.0
        if self._sub_expr2 is None:
            return self._ele_func.backderivative_at(self._sub_expr1,var)
        else:
            return self._ele_func.backderivative_at(self._sub_expr1,
                                                    self._sub_expr2,var)    



    def gradient_at(self, val_dict, returns_dict=False):
        """
        calculate 1st derivative of variables in val_dict using forward mode
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        returns_dict: the format of output
         
        RETURNS
        ========
        derivatives of the current expression with respect to the variables 
        in val_dict, stored in a dictionary or a 1-D numpy array
        """
        if returns_dict:
            return {v: self.derivative_at(v, val_dict) for v in val_dict.keys()}
        return np.array([self.derivative_at(var, val_dict, order=1) 
                         for var in val_dict.keys()])
    
    def hessian_at(self, val_dict):
        """
        calculate 2nd derivative of variables in val_dict using forward mode
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
         
        RETURNS
        ========
        second derivatives of the current expression with respect to the 
        variables in val_dict, stored in a 2-D numpy array
        """
        return np.array( [ \
                          [self.derivative_at((var1, var2), val_dict, order=2)
                           for var1 in val_dict.keys()]
                          for var2 in val_dict.keys() \
                          ] )
    
    def __neg__(self):
        """ Implement dunder method for neg """
        return Expression(Neg, self)

                
    def __add__(self, another):
        """ Implement dunder method for add """
        if isinstance(another, Expression):
            return Expression(Add, self, another)
        # if the other operand is not an Expression, then it must be a number
        # the number then should be converted to a Constant
        else:
            return Expression(Add, self, Constant(another))
    
    
    def __radd__(self, another):
        """ Implement dunder method for right add """
        if isinstance(another, Expression):
            return Expression(Add, another, self)
        else:
            return Expression(Add, Constant(another), self)
    
    def __sub__(self, another):
        """ Implement dunder method for subtraction """
        if isinstance(another, Expression):
            return Expression(Sub, self, another)
        else:
            return Expression(Sub, self, Constant(another))
    
    def __rsub__(self, another):
        """ Implement dunder method for right subtraction """
        if isinstance(another, Expression):
            return Expression(Sub, another, self)
        else:
            return Expression(Sub, Constant(another), self)
        

    def __mul__(self, another):
        """ Implement dunder method for multiplication """
        if isinstance(another, Expression):
            return Expression(Mul,self,another)
        else:
            return Expression(Mul, self, Constant(another))

    def __rmul__(self, another):
        """ Implement dunder method for right multiplication """
        if isinstance(another, Expression):
            return Expression(Mul,another,self)
        else:
            return Expression(Mul, Constant(another),self)
    
    def __truediv__(self, another):
        """ Implement dunder method for division """
        if isinstance(another, Expression):
            return Expression(Div,self,another)
        else:
            return Expression(Div, self, Constant(another))

    def __rtruediv__(self, another):
        """ Implement dunder method for right division """
        if isinstance(another, Expression):
            return Expression(Div,another,self)
        else:
            return Expression(Div, Constant(another),self)
    
    def __pow__(self,another):
        """ Implement dunder method for power """
        if isinstance(another, Expression):
            return Expression(Pow,self,another)
        else:
            return Expression(Pow, self, Constant(another))
    
    def __rpow__(self,another):
        """ Implement dunder method for right power """
        if isinstance(another, Expression):
            return Expression(Pow,another,self)
        else:
            return Expression(Pow, Constant(another),self)
    
    def __eq__(self, another):
        """ Implement dunder method for equal """
        if not isinstance(another, Expression):
            return False
        return self._ele_func == another._ele_func \
               and self._sub_expr1 == another._sub_expr1 \
               and self._sub_expr2 == another._sub_expr2
               
    def __ne__(self, another):
        """ Implement dunder method not equal """
        return not self.__eq__(another)
    
    def __hash__(self):
        """ Implement dunder method hash """
        return object.__hash__(self)   

class Variable(Expression):
    """ 
    This is a class for representing a variable. 
    """
    def __init__(self):
        """ 
        The constructor for the Variable class. 
        It takes no parameters.
        """
        self.val = None
        self.bder = 0
        return
    
    def evaluation_at(self, val_dict):
        """ 
        The function to evaluate the value of the Variable class
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ======== 
        a scalar value 
        """
        return val_dict[self]
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The function calculates derivative of variable class. 
  
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
        order: default set to 1 for 1st derivative, change to 2 for 
        higher order
        
        RETURNS
        ========
        scalar value  
        """
        if order == 1:
            return 1.0 if var is self else 0.0
        else:
            return 0.0
    
    def __eq__(self, another):
        """ Implement dunder method for equal """
        return another is self
    
    def __ne__(self, another):
        """ Implement dunder method for not equal """
        return not self.__eq__(another)
    
    def __hash__(self):
        """ Implement dunder method for hash """
        return Expression.__hash__(self) 

class Constant(Expression):
    """ 
    This is a class for representing a constant. 
      
    Attributes: 
       val: value of the constant
    """
    def __init__(self, val):
        """ 
        The constructor for the Constant class. 
        
        PARAMETERS:
        =======
        val: the value of the constant object
        """
        self.val = val
 
    def evaluation_at(self, val_dict):
        """ 
        The function to evaluate the value of the Constant class
        
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a scalar value 
        """
        return self.val
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The function calculates derivative of constant class. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
        order: default set to 1 for 1st derivative, change to 2 for 
        higher order
        
        RETURNS
        ========
        scalar value  
        """
        return 0.0
    
    def __eq__(self, another):
         """ Implement dunder method for equal """
         if isinstance(another, Constant): return True
         else:                             return False
    
    def __ne__(self, another):
         """ Implement dunder method for not equal """
         return not self.__eq__(another)
    
    def __hash__(self):
         """ Implement dunder method for hash"""
         return Expression.__hash__(self) 


class VectorFunction:
    """ 
    This is a class for applying operations to a vector of variables. 
      
    Attributes: 
        _exprlist: a list of expressions with respect to which the operations
    are applied 
    """
    def __init__(self, exprlist):
        """ 
        The constructor for VectorFunction class. 
        
        PARAMETERS:
        ======= 
        exprlist: a list of expressions with respect to which the class 
        functions are applied to  
        """
        self._exprlist = exprlist.copy()
    
    def evaluation_at(self, val_dict):
        """ 
        The function to apply evaluation_at to a vector of expressions. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a numpy array containing value of expressions in the self._exprlist. 
        """
        return np.array([expr.evaluation_at(val_dict) 
                        for expr in self._exprlist])
    
    def gradient_at(self, var, val_dict):
        """ 
        The function to apply derivative_at to a vector of expressions. 
  
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
       
        RETURNS
        ========
        a numpy array containing first derivative of expressions in 
        self._exprlist with respect to var. 
        """
        return np.array([f.derivative_at(var, val_dict) for f in self._exprlist])
    
    def jacobian_at(self, val_dict):
        """ 
        The function to calculate jacobian with respect to atomic variables in 
        val_dict. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a 2-D numpy array containing derivatives of the expressions in 
        self._exprlist with respect to the variables in val_dict. 
        """
        return np.array([self.gradient_at(var, val_dict)
                         for var in val_dict.keys()]).transpose()


class Add:
    """ 
    This is a class to wrap up static methods related to the add operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute addition of sub_expr1 with sub_expr2 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 + sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) + \
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1):
        """
        Calculate the derivative of (sub_expr1 + sub_expr2) with respect
        to var at the point given by val_dict, to the given order.
        """
        return sub_expr1.derivative_at(var, val_dict, order) + \
               sub_expr2.derivative_at(var, val_dict, order)
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of (sub_expr1 + sub_expr2) with respect to var
        """
        return 1

class Sub:
    """ 
    This is a class to wrap up static methods related to the sub operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute subtraction of sub_expr2 from sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 - sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) - \
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of (sub_expr1 - sub_expr2) with respect to var
        """
        return sub_expr1.derivative_at(var, val_dict, order) - \
               sub_expr2.derivative_at(var, val_dict, order)
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of (sub_expr1 - sub_expr2) with respect to var
        """
        if var == sub_expr1:
            return 1
        if var == sub_expr2:
            return -1 

class Mul:
    """ 
    This is a class to wrap up static methods related to the mul operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute multiplication of sub_expr1 with sub_expr2 using inputs 
        of variable values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 * sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) *\
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of (sub_expr1 * sub_expr2) with respect to var
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
                   sub_expr2.evaluation_at(val_dict)+ \
                   sub_expr1.evaluation_at(val_dict) *\
                   sub_expr2.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                term1 = sub_expr1.derivative_at(var, val_dict, order=2) \
                        * sub_expr2.evaluation_at(val_dict)
                term2 = sub_expr2.derivative_at(var, val_dict, order=2) \
                        * sub_expr1.evaluation_at(val_dict)
                term3 = sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr2.derivative_at(var2, val_dict, order=1)
                term4 = sub_expr2.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2 + term3 + term4
            else:
                return Mul.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of (sub_expr1 * sub_expr2) with respect to var
        """
        if var == sub_expr1:
            return sub_expr2.val
        else:
            return sub_expr1.val
               
class Div:
    """ 
    This is a class to wrap up static methods related to the div operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute division of sub_expr1 by sub_expr2 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 / sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) /\
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of (sub_expr1 / sub_expr2) with respect to var
        """
        if   order == 1:
            return  sub_expr1.derivative_at(var, val_dict) / \
                    sub_expr2.evaluation_at(val_dict)- \
                    sub_expr1.evaluation_at(val_dict) *\
                    sub_expr2.derivative_at(var, val_dict)/\
                    sub_expr2.evaluation_at(val_dict)**2
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                g = sub_expr2.evaluation_at(val_dict)
                term1 =  1/g    * sub_expr1.derivative_at(var, val_dict, order=2)
                term2 = -f/g**2 * sub_expr2.derivative_at(var, val_dict, order=2)
                term3 = -1/g**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                * sub_expr2.derivative_at(var2, val_dict, order=1)
                term4 = -1/g**2 * sub_expr1.derivative_at(var2, val_dict, order=1) \
                                * sub_expr2.derivative_at(var1, val_dict, order=1)
                term5 = 2*f/g**3 * sub_expr2.derivative_at(var1, val_dict, order=1) \
                                 * sub_expr2.derivative_at(var2, val_dict, order=1)
                return term1 + term2 + term3 + term4 + term5  
            else:
                return Div.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of (sub_expr1 / sub_expr2) with respect to var
        """
        if var == sub_expr1:
            return 1/sub_expr2.val
        elif var == sub_expr2:
            return -sub_expr1.val/sub_expr2.val**2
            
class Pow:
    """ 
    This is a class to wrap up static methods related to the pow operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute sub_expr1 to the sub_expr2 power using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 ** sub_expr2
        """
        return np.power(sub_expr1.evaluation_at(val_dict), 
                        sub_expr2.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of sub_expr1 ** sub_expr2 with respect to var
        """
        p = sub_expr2.evaluation_at(val_dict)
        if   order == 1:
            return p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
                   * sub_expr1.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                term1 = p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
                        * sub_expr1.derivative_at((var1, var2), val_dict, order=2)
                term2 = p*(p-1.0)*np.power(sub_expr1.evaluation_at(val_dict), p-2.0) \
                        * sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Pow.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        var: variable of interest
        
        RETURNS
        ========
        derivative of sub_expr1 ** sub_expr2 with respect to var
        """
        p = sub_expr2.val
        return p*np.power(sub_expr1.val, p-1.0)

def power(expr, p):
    return Expression(Pow, expr, Constant(p))
def sqrt(expr):
    return Expression(Pow, expr, Constant(0.5))


class Exp:
    """ 
    This is a class to wrap up static methods related to the exp operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute exponent of sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        exponent(sub_expr1)
        """
        return np.exp(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
                   np.exp(sub_expr1.evaluation_at(val_dict))
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = np.exp(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = np.exp(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                  * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Exp.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.exp(sub_expr1.val)
    
def exp(expr):
    return Expression(Exp, expr)

class Log:
    """ 
    This is a class to wrap up static method related to log operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """Compute the natural logarithm of sub_expr1 using variable values from val_dict."""
        return np.log(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """Calculate the derivative of log(sub_expr1) with respect to var using forward mode."""
        if   order == 1:
            return 1 / sub_expr1.evaluation_at(val_dict) * sub_expr1.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = 1/f * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -1/f**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Log.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1, var):
        """Calculate the local derivative of log(sub_expr1) with respect to sub_expr1, for back propagation."""
        return 1/sub_expr1.val

def log(expr):
    return Expression(Log, expr)
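
As a quick illustration of how Exp and Log compose through the chain rule, a sketch (again assuming the package's Variable class):

x = Variable('x')
f = log(exp(x))                # identity, so df/dx should be 1

f.evaluation_at({x: 1.5})      # 1.5
f.derivative_at(x, {x: 1.5})   # (1/e**1.5) * e**1.5 = 1.0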
        
class Neg:
    """ 
    This is a class to wrap up static method related to neg operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute negation of sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        negate sub_expr1
        """
        return -sub_expr1.evaluation_at(val_dict)
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -sub_expr1.derivative_at(var, val_dict, order)
    
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -1


class Sin:
    """ 
    This is a class to wrap up static method related to sin operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute sin of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sin of sub_expr1 
        """
        return np.sin(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
        np.cos(sub_expr1.evaluation_at(val_dict))
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 =  np.cos(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -np.sin(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                   * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Sin.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.cos(sub_expr1.val)
        
def sin(expr):
    return Expression(Sin, expr)
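
The tuple form of var used in the order=2 branch is exactly what hessian_at feeds in; for a single variable it reduces to the ordinary second derivative. A sketch, assuming Variable:

x = Variable('x')
f = sin(x)

f.derivative_at(x, {x: 0.5})                 # cos(0.5)
f.derivative_at((x, x), {x: 0.5}, order=2)   # -sin(0.5)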

class Cos:
    """ 
    This is a class to wrap up static method related to cos operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cos of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cos sub_expr1
        """
        return np.cos(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return -sub_expr1.derivative_at(var, val_dict, order) * \
                   np.sin(sub_expr1.evaluation_at(val_dict)) 
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = -np.sin(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -np.cos(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                   * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Cos.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -np.sin(sub_expr1.val)
        
def cos(expr):
    return Expression(Cos, expr)
    
class Tan:
    """ 
    This is a class to wrap up static method related to tan operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute tan of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        tan sub_expr1
        """
        return np.tan(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) /(np.cos(sub_expr1.evaluation_at(val_dict))**2)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = 1/(np.cos(f)**2) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = 2*np.tan(f)/(np.cos(f)**2) \
                        * sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Tan.derivative_at(sub_expr1, (var,var), val_dict, order=2)

        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return 1/(np.cos(sub_expr1.val)**2)

def tan(expr):
    return Expression(Tan, expr)
    
class Cotan:
    """ 
    This is a class to wrap up static method related to cotan operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cotan of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cotan sub_expr1
        """
        return 1/np.tan(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1): 
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if order == 1:
            return -sub_expr1.derivative_at(var, val_dict)/(np.sin(sub_expr1.evaluation_at(val_dict))**2)
        else: raise NotImplementedError('higher order derivatives not implemented for cotan.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -1/(np.sin(sub_expr1.val)**2)          

def cotan(expr):
    return Expression(Cotan, expr)
    
class Sec:
    """ 
    This is a class to wrap up static method related to sec operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sec of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sec sub_expr1
        """
        return 1/np.cos(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
               np.tan(x) * (1/np.cos(x))
        else: raise NotImplementedError('higher order derivatives not implemented for sec.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x =sub_expr1.val
        return np.tan(x)/np.cos(x)
        
def sec(expr):
    return Expression(Sec, expr) 

class Csc:
    """ 
    This is a class to wrap up static method related to csc operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute csc of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        csc sub_expr1
        """
        return 1/np.sin(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return -sub_expr1.derivative_at(var, val_dict) * \
               (1/np.tan(x)) * (1/np.sin(x))
        else: raise NotImplementedError('higher order derivatives not implemented for csc.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(1/np.tan(x)) * (1/np.sin(x))

def csc(expr):
    return Expression(Csc, expr) 

class Sinh:
    """ 
    This is a class to wrap up static method related to sinh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sinh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sinh sub_expr1
        """
        return np.sinh(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * np.cosh(x)
        else: raise NotImplementedError('higher order derivatives not implemented for sinh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return np.cosh(x)

def sinh(expr):
    return Expression(Sinh, expr) 

class Cosh:
    """ 
    This is a class to wrap up static method related to cosh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cosh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cosh sub_expr1
        """
        return np.cosh(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * np.sinh(x)
        else: raise NotImplementedError('higher order derivatives not implemented for cosh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.sinh(sub_expr1.val)

def cosh(expr):
    return Expression(Cosh, expr) 
    
class Tanh:
    """ 
    This is a class to wrap up static method related to tanh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute tanh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        tanh sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.sinh(x)/np.cosh(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        tanh = np.sinh(x)/np.cosh(x)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * (1-tanh*tanh)
        else: raise NotImplementedError('higher order derivatives not implemented for tanh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        tanh = np.sinh(x)/np.cosh(x)
        return 1-tanh*tanh

def tanh(expr):
    return Expression(Tanh,expr) 
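
Because Tanh.derivative_at reuses the evaluated tanh value, the familiar identity tanh'(x) = 1 - tanh(x)**2 can be checked directly (a sketch, assuming Variable):

x = Variable('x')
t = tanh(x)
v = t.evaluation_at({x: 0.3})
t.derivative_at(x, {x: 0.3})   # equals 1 - v**2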

class Csch:
    """ 
    This is a class to wrap up static method related to csch operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute csch of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        csch sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return 1/np.sinh(x)
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        # d = -csch(x)*cot(x)
        d = -(1/np.sinh(x)) * (np.cosh(x)/np.sinh(x))
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * d
        else: raise NotImplementedError('higher order derivatives not implemented for csch.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(np.cosh(x)/np.sinh(x))*(1/np.sinh(x))

def csch(expr):
    return Expression(Csch, expr) 

class Sech:
    """ 
    This is a class to wrap up static method related to sech operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sech of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sech sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return 1/np.cosh(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        Calculate the derivative of this expression with respect to var
        using forward mode.
    
        INPUTS
        =======
        sub_expr1: expression whose components include var (or which is var itself)
        val_dict: a dictionary containing variable names and values.
        var: variable of interest
        order: defaults to 1; set to 2 if the 2nd derivative is desired
        
        RETURNS
        ========
        derivative of the expression with respect to var
        """
        x = sub_expr1.evaluation_at(val_dict)
        # d = -sech(x)tanh(x)
        d = -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict)*d
        else: raise NotImplementedError('higher order derivatives not implemented for sech.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))

def sech(expr):
    return Expression(Sech, expr) 

class Coth:
    """ 
    This is a class to wrap up static method related to coth operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute coth of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        coth sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.cosh(x)/np.sinh(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        coth = np.cosh(x)/np.sinh(x)

        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * (1-coth**2)
        else: raise NotImplementedError('higher order derivatives not implemented for coth.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        coth = np.cosh(x)/np.sinh(x)            
        return 1-coth**2

def coth(expr):
    return Expression(Coth, expr)    

class Arcsin:
    """ 
    This is a class to wrap up static method related to arcsin operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute arcsin of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        arcsin sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.arcsin(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        d = 1/np.sqrt(1-x**2)
        #1/sqrt(1-x^2)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * d
        else: raise NotImplementedError('higher order derivatives not implemented for arcsin.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return 1/np.sqrt(1-x**2)

def arcsin(expr):
    return Expression(Arcsin, expr)
    
class Arccos:
    """ 
    This is a class to wrap up static method related to arccos operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute arccos of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        arccos sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.arccos(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        d = 1/np.sqrt(1-x**2)
        #-1/sqrt(1-x^2)
        if order == 1:
            return -sub_expr1.derivative_at(var, val_dict) * d
        else: raise NotImplementedError('higher order derivatives not implemented for arccos.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -1/np.sqrt(1-x**2)

def arccos(expr):
    return Expression(Arccos, expr)
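
The arcsin and arccos derivatives differ only in sign, so their sum is the constant pi/2 and its derivative vanishes. A sketch, assuming Variable and the Expression operator overloading relied on by sigmoid and logit below:

x = Variable('x')
f = arcsin(x) + arccos(x)

f.evaluation_at({x: 0.3})      # ~1.5708 (pi/2)
f.derivative_at(x, {x: 0.3})   # ~0.0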
    
class Arctan:
    """ 
    This is a class to wrap up static method related to arctan operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute arctan of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        arctan sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.arctan(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        d = 1/(1+x**2)
        # d = 1/(1+x**2)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * d
        else: raise NotImplementedError('higher order derivatives not implemented for arctan.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return 1/(1+x**2)

def arctan(expr):
    return Expression(Arctan, expr)

def logit(expr):
    return log(expr/(1-expr))

def sigmoid(expr):
    return 1/(1+exp(-expr))
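
sigmoid is composed entirely from operations defined above (Neg, Exp, Add, Div), so its well-known derivative s*(1 - s) falls out of the chain rule with no special casing. A sketch, assuming Variable:

x = Variable('x')
s = sigmoid(x)
v = s.evaluation_at({x: 0.0})    # 0.5
s.derivative_at(x, {x: 0.0})     # 0.25 == v * (1 - v)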

Functions

The convenience functions below wrap each elementary-function class into an
Expression; their one-line definitions appear in the module source above.

arccos(expr), arcsin(expr), arctan(expr), cos(expr), cosh(expr), cotan(expr),
coth(expr), csc(expr), csch(expr), exp(expr), log(expr), logit(expr),
power(expr, p), sec(expr), sech(expr), sigmoid(expr), sin(expr), sinh(expr),
sqrt(expr), tan(expr), tanh(expr)

Classes

class Add

This is a class to wrap up static method related to add operation

class Add:
    """ 
    This is a class to wrap up static method related to add operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute addition of sub_expr1 with sub_expr2 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 + sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) + \
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1):
        """
        Calculate the derivative of sub_expr1 + sub_expr2 with respect to
        var using forward mode: the sum of the operands' derivatives.
        """
        return sub_expr1.derivative_at(var, val_dict, order) + \
               sub_expr2.derivative_at(var, val_dict, order)
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        Calculate the local derivative of sub_expr1 + sub_expr2 with respect
        to sub_expr1, for use in back propagation.
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of the expression with respect to sub_expr1 (always 1
        for addition)
        """
        return 1


class Arccos

This is a class to wrap up static method related to arccos operation; its full source is listed in the module code above.

class Arcsin

This is a class to wrap up static method related to arcsin operation; its full source is listed in the module code above.

class Arctan

This is a class to wrap up static method related to arctan operation; its full source is listed in the module code above.

class Constant

This is a class for representing constant.

Attributes: val: value of the constant

class Constant(Expression):
    """ 
    This is a class for representing constant. 
      
    Attributes: 
       val: value of the constant
    """
    def __init__(self, val):
        """ 
        The constructor for VectorFunction class. 
        
        PARAMETERS:
        =======
        val: the value of the constant object
        """
        self.val = val
 
    def evaluation_at(self, val_dict):
        """ 
        The function to evaluation the value of constant class
        
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a scalar value 
        """
        return self.val
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The function calculates derivative of constant class. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
        order: default set to 1 for 1st derivative, change to 2 for 
        higher order
        
        RETURNS
        ========
        scalar value  
        """
        return 0.0
    
    def __eq__(self, another):
         """ Implement dunder method for equal """
         if isinstance(another, Constant): return True
         else:                             return False
    
    def __ne__(self, another):
         """ Implement dunder method for not equal """
         return not self.__eq__(another)
    
    def __hash__(self):
         """ Implement dunder method for hash"""
         return Expression.__hash__(self) 

Methods inherited from Expression

def back_derivative(self, var, val_dict):
    """
    The wrapper function for the individual backderivative_at
    function of self._ele_func.
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable names and values. Variables
    in val_dict are atomic and cannot be further decomposed.
    var: variable with respect to which the derivative is calculated
    
    RETURNS
    ========
    derivative of the expression with respect to its immediate child var
    """
    if var is self: return 1.0
    if self._sub_expr2 is None:
        return self._ele_func.backderivative_at(self._sub_expr1, var)
    else:
        return self._ele_func.backderivative_at(self._sub_expr1,
                                                self._sub_expr2, var)

def gradient_at(self, val_dict, returns_dict=False):
    """
    Calculate the 1st derivatives with respect to every variable in
    val_dict using forward mode.

    INPUTS
    =======
    val_dict: a dictionary containing variable names and values.
    returns_dict: if True, return a dictionary keyed by variable;
    otherwise return a 1-D numpy array.
     
    RETURNS
    ========
    derivatives of the current expression with respect to the variables
    in val_dict
    """
    if returns_dict:
        return {v: self.derivative_at(v, val_dict) for v in val_dict.keys()}
    return np.array([self.derivative_at(var, val_dict, order=1) 
                     for var in val_dict.keys()])

def hessian_at(self, val_dict):
    """
    Calculate the 2nd derivatives with respect to every pair of variables
    in val_dict using forward mode.

    INPUTS
    =======
    val_dict: a dictionary containing variable names and values.
     
    RETURNS
    ========
    2nd derivatives of the current expression with respect to the variables
    in val_dict, stored in a 2-D numpy array
    """
    return np.array([[self.derivative_at((var1, var2), val_dict, order=2)
                      for var1 in val_dict.keys()]
                     for var2 in val_dict.keys()])

Instance variables

var val (inherited from Expression)

class Cos

This is a class that wraps up the static methods related to the cos operation

class Cos:
    """ 
    This is a class that wraps up the static methods related to the cos operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cos of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cos sub_expr1
        """
        return np.cos(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return -sub_expr1.derivative_at(var, val_dict, order) * \
                   np.sin(sub_expr1.evaluation_at(val_dict)) 
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = -np.sin(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -np.cos(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                   * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Cos.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -np.sin(sub_expr1.val)

Ancestors (in MRO)

  • Cos
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of cos(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of cos(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of cos(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of cos(sub_expr1) with respect to sub_expr1
    """
    return -np.sin(sub_expr1.val)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative (1st or 2nd order) of cos(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: default to 1, set to 2 if the 2nd derivative is desired

RETURNS

derivative of cos(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate the derivative (1st or 2nd order) of cos(sub_expr1) with
    respect to var, using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: default to 1, set to 2 if the 2nd derivative is desired
    
    RETURNS
    ========
    derivative of cos(sub_expr1) with respect to var
    """
    if   order == 1:
        return -sub_expr1.derivative_at(var, val_dict, order) * \
               np.sin(sub_expr1.evaluation_at(val_dict)) 
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            term1 = -np.sin(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
            term2 = -np.cos(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                               * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Cos.derivative_at(sub_expr1, (var,var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
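For reference, the two order branches above implement the chain rule for \(y = \cos(f)\):

\[\dfrac{\partial y}{\partial x_i} = -\sin(f)\,\dfrac{\partial f}{\partial x_i}, \qquad \dfrac{\partial^2 y}{\partial x_i \partial x_j} = -\sin(f)\,\dfrac{\partial^2 f}{\partial x_i \partial x_j} - \cos(f)\,\dfrac{\partial f}{\partial x_i}\,\dfrac{\partial f}{\partial x_j}\]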

def evaluation_at(sub_expr1, val_dict)

Compute cos of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

cos(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute cos of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    cos(sub_expr1)
    """
    return np.cos(sub_expr1.evaluation_at(val_dict))

class Cosh

This is a class that wraps up the static methods related to the cosh operation

class Cosh:
    """ 
    This is a class that wraps up the static methods related to the cosh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cosh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cosh sub_expr1
        """
        return np.cosh(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * np.sinh(x)
        else: raise NotImplementedError('higher order derivatives not implemented for cosh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.sinh(sub_expr1.val)

Ancestors (in MRO)

  • Cosh
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of cosh(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of cosh(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of cosh(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of cosh(sub_expr1) with respect to sub_expr1
    """
    return np.sinh(sub_expr1.val)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the 1st derivative of cosh(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: only order 1 is implemented for cosh

RETURNS

derivative of cosh(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate the 1st derivative of cosh(sub_expr1) with respect to var,
    using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: only order 1 is implemented for cosh
    
    RETURNS
    ========
    derivative of cosh(sub_expr1) with respect to var
    """
    x = sub_expr1.evaluation_at(val_dict)
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * np.sinh(x)
    else: raise NotImplementedError('higher order derivatives not implemented for cosh.')
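The rule implemented above is the first-order chain rule for \(\cosh\):

\[\dfrac{\partial}{\partial x_i} \cosh(f) = \sinh(f)\,\dfrac{\partial f}{\partial x_i}\]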

def evaluation_at(sub_expr1, val_dict)

Compute cosh of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

cosh(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute cosh of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    cosh(sub_expr1)
    """
    return np.cosh(sub_expr1.evaluation_at(val_dict))

class Cotan

This is a class that wraps up the static methods related to the cotan operation

class Cotan:
    """ 
    This is a class that wraps up the static methods related to the cotan operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute cotan of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        cotan sub_expr1
        """
        return 1/np.tan(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1): 
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if order == 1:
            return -sub_expr1.derivative_at(var, val_dict)/(np.sin(sub_expr1.evaluation_at(val_dict))**2)
        else: raise NotImplementedError('higher order derivatives not implemented for cotan.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -1/(np.sin(sub_expr1.val)**2)          

Ancestors (in MRO)

  • Cotan
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of cotan(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of cotan(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of cotan(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of cotan(sub_expr1) with respect to sub_expr1
    """
    return -1/(np.sin(sub_expr1.val)**2)          

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the 1st derivative of cotan(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: only order 1 is implemented for cotan

RETURNS

derivative of cotan(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1): 
    """
    calculate the 1st derivative of cotan(sub_expr1) with respect to var,
    using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: only order 1 is implemented for cotan
    
    RETURNS
    ========
    derivative of cotan(sub_expr1) with respect to var
    """
    if order == 1:
        return -sub_expr1.derivative_at(var, val_dict)/(np.sin(sub_expr1.evaluation_at(val_dict))**2)
    else: raise NotImplementedError('higher order derivatives not implemented for cotan.')
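The rule implemented above is the first-order chain rule for \(\cot\):

\[\dfrac{\partial}{\partial x_i} \cot(f) = -\dfrac{1}{\sin^2(f)}\,\dfrac{\partial f}{\partial x_i}\]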

def evaluation_at(sub_expr1, val_dict)

Compute cotan of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

cotan(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute cotan of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    cotan(sub_expr1)
    """
    return 1/np.tan(sub_expr1.evaluation_at(val_dict))

class Coth

This is a class that wraps up the static methods related to the coth operation

class Coth:
    """ 
    This is a class that wraps up the static methods related to the coth operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute coth of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        coth sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.cosh(x)/np.sinh(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        coth = np.cosh(x)/np.sinh(x)

        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * (1-coth**2)
        else: raise NotImplementedError('higher order derivatives not implemented for coth.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        coth = np.cosh(x)/np.sinh(x)            
        return 1-coth**2

Ancestors (in MRO)

  • Coth
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of coth(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of coth(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of coth(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of coth(sub_expr1) with respect to sub_expr1
    """
    x = sub_expr1.val
    coth = np.cosh(x)/np.sinh(x)            
    return 1-coth**2

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the 1st derivative of coth(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: only order 1 is implemented for coth

RETURNS

derivative of coth(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate the 1st derivative of coth(sub_expr1) with respect to var,
    using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: only order 1 is implemented for coth
    
    RETURNS
    ========
    derivative of coth(sub_expr1) with respect to var
    """
    x = sub_expr1.evaluation_at(val_dict)
    coth = np.cosh(x)/np.sinh(x)
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * (1-coth**2)
    else: raise NotImplementedError('higher order derivatives not implemented for coth.')
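The rule implemented above uses the identity \(\coth'(u) = 1 - \coth^2(u)\):

\[\dfrac{\partial}{\partial x_i} \coth(f) = \left(1 - \coth^2(f)\right)\dfrac{\partial f}{\partial x_i}\]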

def evaluation_at(sub_expr1, val_dict)

Compute coth of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

coth(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute coth of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    coth(sub_expr1)
    """
    x = sub_expr1.evaluation_at(val_dict)
    return np.cosh(x)/np.sinh(x)

class Csc

This is a class that wraps up the static methods related to the csc operation

class Csc:
    """ 
    This is a class that wraps up the static methods related to the csc operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute csc of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        csc sub_expr1
        """
        return 1/np.sin(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return -sub_expr1.derivative_at(var, val_dict) * \
               (1/np.tan(x)) * (1/np.sin(x))
        else: raise NotImplementedError('higher order derivatives not implemented for csc.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(1/np.tan(x)) * (1/np.sin(x))

Ancestors (in MRO)

  • Csc
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of csc(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of csc(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of csc(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of csc(sub_expr1) with respect to sub_expr1
    """
    x = sub_expr1.val
    return -(1/np.tan(x)) * (1/np.sin(x))

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the 1st derivative of csc(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: only order 1 is implemented for csc

RETURNS

derivative of csc(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate the 1st derivative of csc(sub_expr1) with respect to var,
    using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: only order 1 is implemented for csc
    
    RETURNS
    ========
    derivative of csc(sub_expr1) with respect to var
    """
    x = sub_expr1.evaluation_at(val_dict)
    if order == 1:
        return -sub_expr1.derivative_at(var, val_dict) * \
           (1/np.tan(x)) * (1/np.sin(x))
    else: raise NotImplementedError('higher order derivatives not implemented for csc.')
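The rule implemented above is the first-order chain rule for \(\csc\):

\[\dfrac{\partial}{\partial x_i} \csc(f) = -\csc(f)\cot(f)\,\dfrac{\partial f}{\partial x_i}\]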

def evaluation_at(sub_expr1, val_dict)

Compute csc of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

csc(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute csc of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    csc(sub_expr1)
    """
    return 1/np.sin(sub_expr1.evaluation_at(val_dict))

class Csch

This is a class that wraps up the static methods related to the csch operation

class Csch:
    """ 
    This is a class that wraps up the static methods related to the csch operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute csch of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        csch sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return 1/np.sinh(x)
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        # d = -csch(x)*coth(x)
        d = -(1/np.sinh(x)) * (np.cosh(x)/np.sinh(x))
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * d
        else: raise NotImplementedError('higher order derivatives not implemented for csch.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(np.cosh(x)/np.sinh(x))*(1/np.sinh(x))

Ancestors (in MRO)

  • Csch
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of csch(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
var: variable of interest

RETURNS

derivative of csch(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of csch(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of csch(sub_expr1) with respect to sub_expr1
    """
    x = sub_expr1.val
    return -(np.cosh(x)/np.sinh(x))*(1/np.sinh(x))

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the 1st derivative of csch(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression whose components include var (or which may itself be var)
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: only order 1 is implemented for csch

RETURNS

derivative of csch(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate the 1st derivative of csch(sub_expr1) with respect to var,
    using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var (or which may
    itself be var)
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: only order 1 is implemented for csch
    
    RETURNS
    ========
    derivative of csch(sub_expr1) with respect to var
    """
    x = sub_expr1.evaluation_at(val_dict)
    # d = -csch(x)*coth(x)
    d = -(1/np.sinh(x)) * (np.cosh(x)/np.sinh(x))
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * d
    else: raise NotImplementedError('higher order derivatives not implemented for csch.')
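The rule implemented above is the first-order chain rule for \(\mathrm{csch}\):

\[\dfrac{\partial}{\partial x_i} \mathrm{csch}(f) = -\mathrm{csch}(f)\coth(f)\,\dfrac{\partial f}{\partial x_i}\]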

def evaluation_at(sub_expr1, val_dict)

Compute csch of sub_expr1 with inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

csch(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute csch of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    csch(sub_expr1)
    """
    x = sub_expr1.evaluation_at(val_dict)
    return 1/np.sinh(x)

class Div

This is a class that wraps up the static methods related to the div operation

class Div:
    """ 
    This is a class that wraps up the static methods related to the div operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute division of sub_expr1 by sub_expr2 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 / sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) /\
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return  sub_expr1.derivative_at(var, val_dict) / \
                    sub_expr2.evaluation_at(val_dict)- \
                    sub_expr1.evaluation_at(val_dict) *\
                    sub_expr2.derivative_at(var, val_dict)/\
                    sub_expr2.evaluation_at(val_dict)**2
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                g = sub_expr2.evaluation_at(val_dict)
                term1 =  1/g    * sub_expr1.derivative_at(var, val_dict, order=2)
                term2 = -f/g**2 * sub_expr2.derivative_at(var, val_dict, order=2)
                term3 = -1/g**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                * sub_expr2.derivative_at(var2, val_dict, order=1)
                term4 = -1/g**2 * sub_expr1.derivative_at(var2, val_dict, order=1) \
                                * sub_expr2.derivative_at(var1, val_dict, order=1)
                term5 = 2*f/g**3 * sub_expr2.derivative_at(var1, val_dict, order=1) \
                                 * sub_expr2.derivative_at(var2, val_dict, order=1)
                return term1 + term2 + term3 + term4 + term5  
            else:
                return Div.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if var == sub_expr1:
            return 1/sub_expr2.val
        elif var == sub_expr2:
            return -sub_expr1.val/sub_expr2.val**2

Ancestors (in MRO)

  • Div
  • builtins.object

Static methods

def backderivative_at(sub_expr1, sub_expr2, var)

calculate the local derivative of sub_expr1 / sub_expr2 with respect to var, used in back propagation

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
var: variable of interest (must be sub_expr1 or sub_expr2)

RETURNS

derivative of sub_expr1 / sub_expr2 with respect to var

@staticmethod
def backderivative_at(sub_expr1,sub_expr2,var):
    """
    calculate the local derivative of sub_expr1 / sub_expr2 with respect
    to var, used in back propagation

    INPUTS
    =======
    sub_expr1: expression or constant 
    sub_expr2: expression or constant 
    var: variable of interest (must be sub_expr1 or sub_expr2)
    
    RETURNS
    ========
    derivative of sub_expr1 / sub_expr2 with respect to var
    """
    if var == sub_expr1:
        return 1/sub_expr2.val
    elif var == sub_expr2:
        return -sub_expr1.val/sub_expr2.val**2

def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1)

calculate the derivative (1st or 2nd order) of sub_expr1 / sub_expr2 with respect to var, using forward mode

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: default set to 1, set to 2 if the 2nd derivative is desired

RETURNS

derivative of sub_expr1 / sub_expr2 with respect to var

@staticmethod
def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
    """
    calculate the derivative (1st or 2nd order) of sub_expr1 / sub_expr2
    with respect to var, using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: default set to 1, set to 2 if the 2nd derivative is desired
    
    RETURNS
    ========
    derivative of sub_expr1 / sub_expr2 with respect to var
    """
    if   order == 1:
        return  sub_expr1.derivative_at(var, val_dict) / \
                sub_expr2.evaluation_at(val_dict)- \
                sub_expr1.evaluation_at(val_dict) *\
                sub_expr2.derivative_at(var, val_dict)/\
                sub_expr2.evaluation_at(val_dict)**2
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            g = sub_expr2.evaluation_at(val_dict)
            term1 =  1/g    * sub_expr1.derivative_at(var, val_dict, order=2)
            term2 = -f/g**2 * sub_expr2.derivative_at(var, val_dict, order=2)
            term3 = -1/g**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                            * sub_expr2.derivative_at(var2, val_dict, order=1)
            term4 = -1/g**2 * sub_expr1.derivative_at(var2, val_dict, order=1) \
                            * sub_expr2.derivative_at(var1, val_dict, order=1)
            term5 = 2*f/g**3 * sub_expr2.derivative_at(var1, val_dict, order=1) \
                             * sub_expr2.derivative_at(var2, val_dict, order=1)
            return term1 + term2 + term3 + term4 + term5  
        else:
            return Div.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
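The order-2 branch above computes the second derivative of the quotient \(h = f/g\) term by term:

\[\dfrac{\partial^2 h}{\partial x_i \partial x_j} = \dfrac{1}{g}\dfrac{\partial^2 f}{\partial x_i \partial x_j} - \dfrac{f}{g^2}\dfrac{\partial^2 g}{\partial x_i \partial x_j} - \dfrac{1}{g^2}\left(\dfrac{\partial f}{\partial x_i}\dfrac{\partial g}{\partial x_j} + \dfrac{\partial f}{\partial x_j}\dfrac{\partial g}{\partial x_i}\right) + \dfrac{2f}{g^3}\dfrac{\partial g}{\partial x_i}\dfrac{\partial g}{\partial x_j}\]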

def evaluation_at(sub_expr1, sub_expr2, val_dict)

Compute division of sub_expr1 by sub_expr2 using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable name and values.

RETURNS

sub_expr1 / sub_expr2

@staticmethod
def evaluation_at(sub_expr1, sub_expr2, val_dict):
    """
    Compute division of sub_expr1 by sub_expr2 using inputs of variable
    values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sub_expr1 / sub_expr2
    """
    return sub_expr1.evaluation_at(val_dict) /\
           sub_expr2.evaluation_at(val_dict)

class Exp

This is a class that wraps up the static methods related to the exp operation

class Exp:
    """ 
    This is a class that wraps up the static methods related to the exp operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute exponent of sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        exponent(sub_expr1)
        """
        return np.exp(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
                   np.exp(sub_expr1.evaluation_at(val_dict))
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = np.exp(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = np.exp(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                  * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Exp.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.exp(sub_expr1.val)

Ancestors (in MRO)

  • Exp
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of exp(sub_expr1) with respect to sub_expr1, used in back propagation

INPUTS

sub_expr1: expression or constant
var: variable of interest

RETURNS

derivative of exp(sub_expr1) with respect to sub_expr1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate the local derivative of exp(sub_expr1) with respect to
    sub_expr1, used in back propagation

    INPUTS
    =======
    sub_expr1: expression or constant
    var: variable of interest
    
    RETURNS
    ========
    derivative of exp(sub_expr1) with respect to sub_expr1
    """
    return np.exp(sub_expr1.val)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative (1st or 2nd order) of exp(sub_expr1) with respect to var, using forward mode

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.
var: variable of interest
order: default set to 1, set to 2 if the 2nd derivative is desired

RETURNS

derivative of exp(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1, var, val_dict, order=1):
    """
    calculate the derivative (1st or 2nd order) of exp(sub_expr1) with
    respect to var, using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    var: variable of interest
    order: default set to 1, set to 2 if the 2nd derivative is desired
    
    RETURNS
    ========
    derivative of exp(sub_expr1) with respect to var
    """
    if   order == 1:
        return sub_expr1.derivative_at(var, val_dict) * \
               np.exp(sub_expr1.evaluation_at(val_dict))
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            term1 = np.exp(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
            term2 = np.exp(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                              * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Exp.derivative_at(sub_expr1, (var,var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
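The order-2 branch above uses the product form of the chain rule for the exponential:

\[\dfrac{\partial^2}{\partial x_i \partial x_j} e^{f} = e^{f}\left(\dfrac{\partial^2 f}{\partial x_i \partial x_j} + \dfrac{\partial f}{\partial x_i}\dfrac{\partial f}{\partial x_j}\right)\]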

def evaluation_at(sub_expr1, val_dict)

Compute the exponential of sub_expr1 using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable names and values.

RETURNS

exp(sub_expr1)

@staticmethod
def evaluation_at(sub_expr1, val_dict):
    """
    Compute the exponential of sub_expr1 using inputs of variable
    values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    exp(sub_expr1)
    """
    return np.exp(sub_expr1.evaluation_at(val_dict))

class Expression

This is a class for representing expressions. It is the superclass of Variable and Constant.

class Expression:
    """ 
    This is a class for representing expressions.
    It is the superclass of Variable and Constant.
    """
    def __init__(self, ele_func, sub_expr1, sub_expr2=None):
        """ 
        The constructor for VectorFunction class. 
        
        PARAMETERS:
        =======
        ele_func: the function creating this expression
        sub_expr1: variable/constant composing this expression
        sub_expr2: variable/constant composing this expression, set to non
        for unary operations
        """
        self._ele_func  = ele_func
        self._sub_expr1 = sub_expr1
        self._sub_expr2 = sub_expr2
        self.val = None
        self.bder=0
    
    def evaluation_at(self, val_dict):
        """ 
        The wrapper function for individual evaluation_at function of 
        self_ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a scalar value 
        """
        # self._sub_expr2 is None implies that self._ele_func is a unary operator
        if self._sub_expr2 is None: 
            return self._ele_func.evaluation_at(
                self._sub_expr1, val_dict)
        
        # self._sub_expr2 not None implies that self._ele_func is a binary operator
        else:
            return self._ele_func.evaluation_at(
                self._sub_expr1, self._sub_expr2, val_dict)
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The wrapper function for individual derivative_at function of 
        self_ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable of interests for derivative calculation
        
        RETURNS
        ======== 
        a scalar value 
        """
        
        if type(var) is tuple: order=len(var)
        if var is self: 
            if   order == 1: return 1.0
            else: return 0.0
        
        # sub_expr2 being None implies that _ele_func is a unary operator
        if self._sub_expr2 is None:
            return self._ele_func.derivative_at(
                self._sub_expr1, var, val_dict, order)
        
        # sub_expr2 not None implies that _ele_func is a binary operator
        else:
            return self._ele_func.derivative_at(
                self._sub_expr1, self._sub_expr2, var, val_dict, order)
    
    def back_derivative(self,var,val_dict):
        """
        The wrapper function for individual backderivative_at 
        function of self_ele_func
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values. Variables
        in val_dict are of atomic feature and cannot be further decomposed.
        var: variable with respect to which the function calculates derivative   
        
        RETURNS
        ========
        derivative of var with respect to the immediate parent that contain var
        """
        if var is self: return 1.0
        if self._sub_expr2 is None:
            return self._ele_func.backderivative_at(self._sub_expr1,var)
        else:
            return self._ele_func.backderivative_at(self._sub_expr1,
                                                    self._sub_expr2,var)    



    def gradient_at(self, val_dict, returns_dict=False):
        """
        calculate the gradient (1st derivatives) of the current expression
        with respect to the variables in val_dict, using forward mode
    
        INPUTS
        =======
        val_dict: a dictionary containing variable names and values.
        returns_dict: if True, return a dictionary keyed by variable;
        otherwise return a 1-D numpy array
         
        RETURNS
        ========
        derivatives of the current expression with respect to the variables
        in val_dict, stored in a dictionary or a 1-D numpy array
        """
        if returns_dict:
            return {v: self.derivative_at(v, val_dict) for v in val_dict.keys()}
        return np.array([self.derivative_at(var, val_dict, order=1) 
                         for var in val_dict.keys()])
    
    def hessian_at(self, val_dict):
        """
        calculate the Hessian (2nd derivatives) of the current expression
        with respect to the variables in val_dict, using forward mode
    
        INPUTS
        =======
        val_dict: a dictionary containing variable names and values.
         
        RETURNS
        ========
        2nd derivatives of the current expression with respect to the
        variables in val_dict, stored in a 2-D numpy array
        """
        return np.array([[self.derivative_at((var1, var2), val_dict, order=2)
                          for var1 in val_dict.keys()]
                         for var2 in val_dict.keys()])
    
    def __neg__(self):
        """ Implement dunder method for neg """
        return Expression(Neg, self)

                
    def __add__(self, another):
        """ Implement dunder method for add """
        if isinstance(another, Expression):
            return Expression(Add, self, another)
        # if the other operand is not an Expression, then it must be a number
        # the number then should be converted to a Constant
        else:
            return Expression(Add, self, Constant(another))
    
    
    def __radd__(self, another):
        """ Implement dunder method for right add """
        if isinstance(another, Expression):
            return Expression(Add, another, self)
        else:
            return Expression(Add, Constant(another), self)
    
    def __sub__(self, another):
        """ Implement dunder method for subtraction """
        if isinstance(another, Expression):
            return Expression(Sub, self, another)
        else:
            return Expression(Sub, self, Constant(another))
    
    def __rsub__(self, another):
        """ Implement dunder method for right subtraction """
        if isinstance(another, Expression):
            return Expression(Sub, another, self)
        else:
            return Expression(Sub, Constant(another), self)
        

    def __mul__(self, another):
        """ Implement dunder method for multiplication """
        if isinstance(another, Expression):
            return Expression(Mul,self,another)
        else:
            return Expression(Mul, self, Constant(another))

    def __rmul__(self, another):
        """ Implement dunder method for right multiplication """
        if isinstance(another, Expression):
            return Expression(Mul,another,self)
        else:
            return Expression(Mul, Constant(another),self)
    
    def __truediv__(self, another):
        """ Implement dunder method for division """
        if isinstance(another, Expression):
            return Expression(Div,self,another)
        else:
            return Expression(Div, self, Constant(another))

    def __rtruediv__(self, another):
        """ Implement dunder method for right division """
        if isinstance(another, Expression):
            return Expression(Div,another,self)
        else:
            return Expression(Div, Constant(another),self)
    
    def __pow__(self,another):
        """ Implement dunder method for power """
        if isinstance(another, Expression):
            return Expression(Pow,self,another)
        else:
            return Expression(Pow, self, Constant(another))
    
    def __rpow__(self,another):
        """ Implement dunder method for right power """
        if isinstance(another, Expression):
            return Expression(Pow,another,self)
        else:
            return Expression(Pow, Constant(another),self)
    
    def __eq__(self, another):
        """ Implement dunder method for equal """
        if not isinstance(another, Expression):
            return False
        return self._ele_func == another._ele_func \
               and self._sub_expr1 == another._sub_expr1 \
               and self._sub_expr2 == another._sub_expr2
               
    def __ne__(self, another):
        """ Implement dunder method for not equal """
        return not self.__eq__(another)
    
    def __hash__(self):
        """ Implement dunder method hash """
        return object.__hash__(self)   

Ancestors (in MRO)

  • Expression
  • builtins.object

Methods

def __init__(self, ele_func, sub_expr1, sub_expr2=None)

The constructor for the Expression class.

PARAMETERS:

ele_func: the elementary function creating this expression
sub_expr1: the expression/variable/constant composing this expression
sub_expr2: the expression/variable/constant composing this expression, set to None for unary operations

def __init__(self, ele_func, sub_expr1, sub_expr2=None):
    """ 
    The constructor for the Expression class. 
    
    PARAMETERS:
    =======
    ele_func: the elementary function creating this expression
    sub_expr1: the expression/variable/constant composing this expression
    sub_expr2: the expression/variable/constant composing this expression,
    set to None for unary operations
    """
    self._ele_func  = ele_func
    self._sub_expr1 = sub_expr1
    self._sub_expr2 = sub_expr2
    self.val = None
    self.bder=0

def back_derivative(self, var, val_dict)

The wrapper function for the individual backderivative_at function of self._ele_func

PARAMETERS:

val_dict: a dictionary containing variable names and values. Variables in val_dict are atomic and cannot be further decomposed.
var: variable with respect to which the function calculates the derivative

RETURNS

derivative of the current expression with respect to var, where var is an immediate child of the current expression

def back_derivative(self,var,val_dict):
    """
    The wrapper function for the individual backderivative_at 
    function of self._ele_func
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable names and values. Variables
    in val_dict are atomic and cannot be further decomposed.
    var: variable with respect to which the function calculates the derivative   
    
    RETURNS
    ========
    derivative of the current expression with respect to var, where var
    is an immediate child of the current expression
    """
    if var is self: return 1.0
    if self._sub_expr2 is None:
        return self._ele_func.backderivative_at(self._sub_expr1,var)
    else:
        return self._ele_func.backderivative_at(self._sub_expr1,
                                                self._sub_expr2,var)    

def derivative_at(self, var, val_dict, order=1)

The wrapper function for the individual derivative_at function of self._ele_func

PARAMETERS:

val_dict: a dictionary containing variable names and values.
var: variable of interest for the derivative calculation
order: 1 for the 1st derivative, 2 for the 2nd derivative

RETURNS

a scalar value

def derivative_at(self, var, val_dict, order=1):
    """ 
    The wrapper function for the individual derivative_at function of 
    self._ele_func
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable names and values.
    var: variable of interest for the derivative calculation
    order: 1 for the 1st derivative, 2 for the 2nd derivative
    
    RETURNS
    ======== 
    a scalar value 
    """
    
    if type(var) is tuple: order=len(var)
    if var is self: 
        if   order == 1: return 1.0
        else: return 0.0
    
    # sub_expr2 being None implies that _ele_func is a unary operator
    if self._sub_expr2 is None:
        return self._ele_func.derivative_at(
            self._sub_expr1, var, val_dict, order)
    
    # sub_expr2 not None implies that _ele_func is a binary operator
    else:
        return self._ele_func.derivative_at(
            self._sub_expr1, self._sub_expr2, var, val_dict, order)

def evaluation_at(self, val_dict)

The wrapper function for the individual evaluation_at function of self._ele_func

PARAMETERS:

val_dict: a dictionary containing variable names and values.

RETURNS

a scalar value

def evaluation_at(self, val_dict):
    """ 
    The wrapper function for the individual evaluation_at function of 
    self._ele_func
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable names and values.
    
    RETURNS
    ========
    a scalar value 
    """
    # self._sub_expr2 is None implies that self._ele_func is a unary operator
    if self._sub_expr2 is None: 
        return self._ele_func.evaluation_at(
            self._sub_expr1, val_dict)
    
    # self._sub_expr2 not None implies that self._ele_func is a binary operator
    else:
        return self._ele_func.evaluation_at(
            self._sub_expr1, self._sub_expr2, val_dict)

def gradient_at(self, val_dict, returns_dict=False)

calculate the gradient (1st derivatives) of the current expression with respect to the variables in val_dict, using forward mode

INPUTS

val_dict: a dictionary containing variable names and values.
returns_dict: if True, return a dictionary keyed by variable; otherwise return a 1-D numpy array

RETURNS

derivatives of the current expression with respect to the variables in val_dict, stored in a dictionary or a 1-D numpy array

def gradient_at(self, val_dict, returns_dict=False):
    """
    calculate the gradient (1st derivatives) of the current expression
    with respect to the variables in val_dict, using forward mode

    INPUTS
    =======
    val_dict: a dictionary containing variable names and values.
    returns_dict: if True, return a dictionary keyed by variable;
    otherwise return a 1-D numpy array
     
    RETURNS
    ========
    derivatives of the current expression with respect to the variables
    in val_dict, stored in a dictionary or a 1-D numpy array
    """
    if returns_dict:
        return {v: self.derivative_at(v, val_dict) for v in val_dict.keys()}
    return np.array([self.derivative_at(var, val_dict, order=1) 
                     for var in val_dict.keys()])

def hessian_at(self, val_dict)

calculate the Hessian (2nd derivatives) of the current expression with respect to the variables in val_dict, using forward mode

INPUTS

val_dict: a dictionary containing variable names and values.

RETURNS

2nd derivatives of the current expression with respect to the variables in val_dict, stored in a 2-D numpy array

def hessian_at(self, val_dict):
    """
    calculate the Hessian (2nd derivatives) of the current expression
    with respect to the variables in val_dict, using forward mode

    INPUTS
    =======
    val_dict: a dictionary containing variable names and values.
     
    RETURNS
    ========
    2nd derivatives of the current expression with respect to the
    variables in val_dict, stored in a 2-D numpy array
    """
    return np.array([[self.derivative_at((var1, var2), val_dict, order=2)
                      for var1 in val_dict.keys()]
                     for var2 in val_dict.keys()])
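To make the Expression API above concrete, here is a minimal usage sketch. The import path and the Variable constructor shown are assumptions for illustration (Variable is documented elsewhere in this reference); adjust them to the actual module layout of the installed package.

import numpy as np
from DFYS_autodiff import Variable  # hypothetical import; adjust to the real module path

x, y = Variable(), Variable()
f = x * y + x / y             # the dunder methods above build an Expression tree

point = {x: 1.0, y: 2.0}      # val_dict maps variables to their values
f.evaluation_at(point)        # value of f at (1, 2): 1*2 + 1/2 = 2.5
f.derivative_at(x, point)     # df/dx = y + 1/y = 2.5
f.gradient_at(point)          # 1-D numpy array of first derivatives
f.hessian_at(point)           # 2-D numpy array of second derivatives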

Instance variables

var bder

var val

class Log

This is a class that wraps up the static methods related to the log (natural logarithm) operation

class Log:
    """ 
    This is a class that wraps up the static methods related to the log
    (natural logarithm) operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """ Compute the natural logarithm of sub_expr1 with inputs of
        variable values from val_dict. """
        return np.log(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """ calculate the derivative (1st or 2nd order) of log(sub_expr1)
        with respect to var, using forward mode """
        if   order == 1:
            return 1 / sub_expr1.evaluation_at(val_dict) * sub_expr1.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = 1/f * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -1/f**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Log.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    
    @staticmethod
    def backderivative_at(sub_expr1, var):
        """ calculate the local derivative of log(sub_expr1) with respect
        to sub_expr1, used in back propagation """
        return 1/sub_expr1.val

Ancestors (in MRO)

  • Log
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the local derivative of log(sub_expr1) with respect to sub_expr1, used in back propagation

@staticmethod
def backderivative_at(sub_expr1, var):
    """ calculate the local derivative of log(sub_expr1) with respect
    to sub_expr1, used in back propagation """
    return 1/sub_expr1.val

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative (1st or 2nd order) of log(sub_expr1) with respect to var, using forward mode

@staticmethod
def derivative_at(sub_expr1, var, val_dict, order=1):
    if   order == 1:
        return 1 / sub_expr1.evaluation_at(val_dict) * sub_expr1.derivative_at(var, val_dict)
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            term1 = 1/f * sub_expr1.derivative_at(var,  val_dict, order=2)
            term2 = -1/f**2 * sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Log.derivative_at(sub_expr1, (var,var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
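The two order branches above implement the chain rule for the natural logarithm:

\[\dfrac{\partial}{\partial x_i} \ln(f) = \dfrac{1}{f}\dfrac{\partial f}{\partial x_i}, \qquad \dfrac{\partial^2}{\partial x_i \partial x_j} \ln(f) = \dfrac{1}{f}\dfrac{\partial^2 f}{\partial x_i \partial x_j} - \dfrac{1}{f^2}\dfrac{\partial f}{\partial x_i}\,\dfrac{\partial f}{\partial x_j}\]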

def evaluation_at(sub_expr1, val_dict)

Compute the natural logarithm of sub_expr1 with inputs of variable values from val_dict.

@staticmethod
def evaluation_at(sub_expr1, val_dict):
    """ Compute the natural logarithm of sub_expr1 with inputs of
    variable values from val_dict. """
    return np.log(sub_expr1.evaluation_at(val_dict))

class Mul

This is a class that wraps up the static methods related to the mul operation

class Mul:
    """ 
    This is a class that wraps up the static methods related to the mul operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute multiplication of sub_expr1 with sub_expr2 using inputs 
        of variable values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 * sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) *\
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
                   sub_expr2.evaluation_at(val_dict)+ \
                   sub_expr1.evaluation_at(val_dict) *\
                   sub_expr2.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                term1 = sub_expr1.derivative_at(var, val_dict, order=2) \
                        * sub_expr2.evaluation_at(val_dict)
                term2 = sub_expr2.derivative_at(var, val_dict, order=2) \
                        * sub_expr1.evaluation_at(val_dict)
                term3 = sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr2.derivative_at(var2, val_dict, order=1)
                term4 = sub_expr2.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2 + term3 + term4
            else:
                return Mul.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if var == sub_expr1:
            return sub_expr2.val
        else:
            return sub_expr1.val
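The four terms in the second-order branch of derivative_at implement the product rule for second derivatives, with \(f\) standing for sub_expr1 and \(g\) for sub_expr2:

\[\dfrac{\partial^2 (fg)}{\partial x_i \partial x_j} = g\dfrac{\partial^2 f}{\partial x_i \partial x_j} + f\dfrac{\partial^2 g}{\partial x_i \partial x_j} + \dfrac{\partial f}{\partial x_i}\dfrac{\partial g}{\partial x_j} + \dfrac{\partial g}{\partial x_i}\dfrac{\partial f}{\partial x_j}\]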

Ancestors (in MRO)

  • Mul
  • builtins.object

Static methods

def backderivative_at(sub_expr1, sub_expr2, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
var: variable of interest

RETURNS

the derivative of sub_expr1 * sub_expr2 with respect to var

@staticmethod
def backderivative_at(sub_expr1,sub_expr2,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression or constant 
    sub_expr2: expression or constant 
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    if var == sub_expr1:
        return sub_expr2.val
    else:
        return sub_expr1.val

def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default set to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of sub_expr1 * sub_expr2 with respect to var

@staticmethod
def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default set to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    if   order == 1:
        return sub_expr1.derivative_at(var, val_dict) * \
               sub_expr2.evaluation_at(val_dict)+ \
               sub_expr1.evaluation_at(val_dict) *\
               sub_expr2.derivative_at(var, val_dict)
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            term1 = sub_expr1.derivative_at(var, val_dict, order=2) \
                    * sub_expr2.evaluation_at(val_dict)
            term2 = sub_expr2.derivative_at(var, val_dict, order=2) \
                    * sub_expr1.evaluation_at(val_dict)
            term3 = sub_expr1.derivative_at(var1, val_dict, order=1) \
                    * sub_expr2.derivative_at(var2, val_dict, order=1)
            term4 = sub_expr2.derivative_at(var1, val_dict, order=1) \
                    * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2 + term3 + term4
        else:
            return Mul.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')

def evaluation_at(sub_expr1, sub_expr2, val_dict)

Compute multiplication of sub_expr1 with sub_expr2 using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable name and values.

RETURNS

sub_expr1 * sub_expr2

@staticmethod
def evaluation_at(sub_expr1, sub_expr2, val_dict):
    """
    Compute multiplication of sub_expr1 with sub_expr2 using inputs 
    of variable values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sub_expr1 * sub_expr2
    """
    return sub_expr1.evaluation_at(val_dict) *\
           sub_expr2.evaluation_at(val_dict)

class Neg

This is a class that wraps up static methods related to the neg operation

class Neg:
    """ 
    This is a class to wrap up static method related to neg operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute negation of sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        negate sub_expr1
        """
        return -sub_expr1.evaluation_at(val_dict)
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -sub_expr1.derivative_at(var, val_dict, order)
    
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return -1

Ancestors (in MRO)

  • Neg
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression or constant
var: variable of interest

RETURNS

the derivative of -sub_expr1 with respect to var, which is always -1

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression or constant 
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    return -1

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression whose components include var (or is itself var)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default set to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of -sub_expr1 with respect to var

@staticmethod
def derivative_at(sub_expr1, var, val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default set to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    return -sub_expr1.derivative_at(var, val_dict, order)

def evaluation_at(sub_expr1, val_dict)

Compute the negation of sub_expr1 using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable name and values.

RETURNS

-sub_expr1

@staticmethod
def evaluation_at(sub_expr1, val_dict):
    """
    Compute negation of sub_expr1 using inputs of variable
    values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    negate sub_expr1
    """
    return -sub_expr1.evaluation_at(val_dict)

class Pow

This is a class that wraps up static methods related to the pow operation

class Pow:
    """ 
    This is a class to wrap up static method related to pow operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute sub_expr1 to the sub_expr2 power using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 ** sub_expr2
        """
        return np.power(sub_expr1.evaluation_at(val_dict), 
                        sub_expr2.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        p = sub_expr2.evaluation_at(val_dict)
        if   order == 1:
            return p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
                   * sub_expr1.derivative_at(var, val_dict)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                term1 = p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
                        * sub_expr1.derivative_at((var1, var2), val_dict, order=2)
                term2 = p*(p-1.0)*np.power(sub_expr1.evaluation_at(val_dict), p-2.0) \
                        * sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Pow.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        p = sub_expr2.val
        return p*np.power(sub_expr1.val, p-1.0)
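Because the exponent \(p\) (the value of sub_expr2) is treated as a constant, both derivative methods follow from differentiating the power rule \(p f^{p-1}\) once more:

\[\dfrac{\partial^2 f^p}{\partial x_i \partial x_j} = p f^{p-1}\dfrac{\partial^2 f}{\partial x_i \partial x_j} + p(p-1) f^{p-2}\dfrac{\partial f}{\partial x_i}\dfrac{\partial f}{\partial x_j}\]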

Ancestors (in MRO)

  • Pow
  • builtins.object

Static methods

def backderivative_at(sub_expr1, sub_expr2, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression or constant
sub_expr2: constant (the exponent)
var: variable of interest

RETURNS

the derivative of sub_expr1 ** sub_expr2 with respect to var

@staticmethod
def backderivative_at(sub_expr1,sub_expr2,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    p = sub_expr2.val
    return p*np.power(sub_expr1.val, p-1.0)

def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression or constant
sub_expr2: constant (the exponent; treated as constant when differentiating)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default set to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of sub_expr1 ** sub_expr2 with respect to var

@staticmethod
def derivative_at(sub_expr1, sub_expr2, var, val_dict,order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default set to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    p = sub_expr2.evaluation_at(val_dict)
    if   order == 1:
        return p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
               * sub_expr1.derivative_at(var, val_dict)
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            term1 = p*np.power(sub_expr1.evaluation_at(val_dict), p-1.0) \
                    * sub_expr1.derivative_at((var1, var2), val_dict, order=2)
            term2 = p*(p-1.0)*np.power(sub_expr1.evaluation_at(val_dict), p-2.0) \
                    * sub_expr1.derivative_at(var1, val_dict, order=1) \
                    * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Pow.derivative_at(sub_expr1, sub_expr2, (var, var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')

def evaluation_at(sub_expr1, sub_expr2, val_dict)

Compute sub_expr1 to the sub_expr2 power using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
sub_expr2: constant
val_dict: a dictionary containing variable name and values.

RETURNS

sub_expr1 ** sub_expr2

@staticmethod
def evaluation_at(sub_expr1, sub_expr2, val_dict):
    """
    Compute sub_expr1 to the sub_expr2 power using inputs of variable
    values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: constant
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sub_expr1 ** sub_expr2
    """
    return np.power(sub_expr1.evaluation_at(val_dict), 
                    sub_expr2.evaluation_at(val_dict))

class Sec

This is a class that wraps up static methods related to the sec operation

class Sec:
    """ 
    This is a class to wrap up static method related to sec operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sec of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sec sub_expr1
        """
        return 1/np.cos(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
               np.tan(x) * (1/np.cos(x))
        else: raise NotImplementedError('higher order derivatives not implemented for sec.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x =sub_expr1.val
        return np.tan(x)/np.cos(x)
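Both derivative methods above apply the standard rule for the secant, combined with the chain rule on the inner expression:

\[\dfrac{d}{dx}\sec x = \sec x \tan x\]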

Ancestors (in MRO)

  • Sec
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of sec(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x =sub_expr1.val
    return np.tan(x)/np.cos(x)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression whose components include var (or is itself var)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default to 1; only the 1st derivative is implemented for sec

RETURNS

the derivative of sec(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * \
           np.tan(x) * (1/np.cos(x))
    else: raise NotImplementedError('higher order derivatives not implemented for sec.')

def evaluation_at(sub_expr1, val_dict)

Compute sec of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

sec sub_expr1

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute sec of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sec sub_expr1
    """
    return 1/np.cos(sub_expr1.evaluation_at(val_dict))

class Sech

This is a class that wraps up static methods related to the sech operation

class Sech:
    """ 
    This is a class to wrap up static method related to sech operation
    """
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sech of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sech sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return 1/np.cosh(x)
    
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        # d = -sech(x)tanh(x)
        d = -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict)*d
        else: raise NotImplementedError('higher order derivatives not implemented for sech.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))
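The quantity d computed above is the standard derivative of the hyperbolic secant, expressed with np.cosh and np.sinh only:

\[\dfrac{d}{dx}\,\mathrm{sech}\,x = -\,\mathrm{sech}\,x\,\tanh x\]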

Ancestors (in MRO)

  • Sech
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of sech(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.val
    return -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression whose components include var (or is itself var)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default to 1; only the 1st derivative is implemented for sech

RETURNS

the derivative of sech(sub_expr1) with respect to var

def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    # d = -sech(x)tanh(x)
    d = -(1/np.cosh(x)) * (np.sinh(x)/np.cosh(x))
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict)*d
    else: raise NotImplementedError('higher order derivatives not implemented for sech.')

def evaluation_at(sub_expr1, val_dict)

Compute sech of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

sech sub_expr1

def evaluation_at(sub_expr1,val_dict):
    """
    Compute sech of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sech sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    return 1/np.cosh(x)

class Sin

This is a class that wraps up static methods related to the sin operation

class Sin:
    """ 
    This is a class to wrap up static method related to sin operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, val_dict):
        """
        Compute sin of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sin of sub_expr1 
        """
        return np.sin(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) * \
                   np.cos(sub_expr1.evaluation_at(val_dict))
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 =  np.cos(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = -np.sin(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                                   * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Sin.derivative_at(sub_expr1, (var,var), val_dict, order=2)
        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return np.cos(sub_expr1.val)
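The second-order branch of derivative_at implements the chain rule for the sine:

\[\dfrac{\partial^2 \sin f}{\partial x_i \partial x_j} = \cos f\,\dfrac{\partial^2 f}{\partial x_i \partial x_j} - \sin f\,\dfrac{\partial f}{\partial x_i}\dfrac{\partial f}{\partial x_j}\]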

Ancestors (in MRO)

  • Sin
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of sin(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    return np.cos(sub_expr1.val)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default set to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of sin(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1, var, val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant 
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default set to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    if   order == 1:
        return sub_expr1.derivative_at(var, val_dict) * \
               np.cos(sub_expr1.evaluation_at(val_dict))
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            term1 =  np.cos(f) * sub_expr1.derivative_at(var,  val_dict, order=2)
            term2 = -np.sin(f) * sub_expr1.derivative_at(var1, val_dict, order=1) \
                               * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Sin.derivative_at(sub_expr1, (var,var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')

def evaluation_at(sub_expr1, val_dict)

Compute sin of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

sin of sub_expr1

@staticmethod
def evaluation_at(sub_expr1, val_dict):
    """
    Compute sin of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sin of sub_expr1 
    """
    return np.sin(sub_expr1.evaluation_at(val_dict))

class Sinh

This is a class that wraps up static methods related to the sinh operation

class Sinh:
    """ 
    This is a class to wrap up static method related to sinh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute sinh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sinh sub_expr1
        """
        return np.sinh(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * np.cosh(x)
        else: raise NotImplementedError('higher order derivatives not implemented for sinh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        return np.cosh(x)

Ancestors (in MRO)

  • Sinh
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of sinh(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.val
    return np.cosh(x)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression whose components include var (or is itself var)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default to 1; only the 1st derivative is implemented for sinh

RETURNS

the derivative of sinh(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * np.cosh(x)
    else: raise NotImplementedError('higher order derivatives not implemented for sinh.')

def evaluation_at(sub_expr1, val_dict)

Compute sinh of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

sinh sub_expr1

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute sinh of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sinh sub_expr1
    """
    return np.sinh(sub_expr1.evaluation_at(val_dict))

class Sub

This is a class that wraps up static methods related to the sub operation

class Sub:
    """ 
    This is a class to wrap up static method related to sub operation
    """
    @staticmethod
    def evaluation_at(sub_expr1, sub_expr2, val_dict):
        """
        Compute subtraction of sub_expr2 from sub_expr1 using inputs of variable
        values from val_dict.
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        sub_expr1 - sub_expr2
        """
        return sub_expr1.evaluation_at(val_dict) - \
               sub_expr2.evaluation_at(val_dict)
    @staticmethod
    def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        sub_expr2: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default set to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return sub_expr1.derivative_at(var, val_dict, order) - \
               sub_expr2.derivative_at(var, val_dict, order)
    @staticmethod
    def backderivative_at(sub_expr1,sub_expr2,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression or constant 
        sub_expr2: expression or constant 
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if var == sub_expr1:
            return 1
        if var == sub_expr2:
            return -1 

Ancestors (in MRO)

  • Sub
  • builtins.object

Static methods

def backderivative_at(sub_expr1, sub_expr2, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
var: variable of interest

RETURNS

the derivative of sub_expr1 - sub_expr2 with respect to var (1 if var is sub_expr1, -1 if var is sub_expr2)

@staticmethod
def backderivative_at(sub_expr1,sub_expr2,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression or constant 
    sub_expr2: expression or constant 
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    if var == sub_expr1:
        return 1
    if var == sub_expr2:
        return -1 

def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default set to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of sub_expr1 - sub_expr2 with respect to var

@staticmethod
def derivative_at(sub_expr1, sub_expr2, var, val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default set to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    return sub_expr1.derivative_at(var, val_dict, order) - \
           sub_expr2.derivative_at(var, val_dict, order)

def evaluation_at(sub_expr1, sub_expr2, val_dict)

Compute subtraction of sub_expr2 from sub_expr1 using inputs of variable values from val_dict.

INPUTS

sub_expr1: expression or constant
sub_expr2: expression or constant
val_dict: a dictionary containing variable name and values.

RETURNS

sub_expr1 - sub_expr2

@staticmethod
def evaluation_at(sub_expr1, sub_expr2, val_dict):
    """
    Compute subtraction of sub_expr2 from sub_expr1 using inputs of variable
    values from val_dict.

    INPUTS
    =======
    sub_expr1: expression or constant
    sub_expr2: expression or constant
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    sub_expr1 - sub_expr2
    """
    return sub_expr1.evaluation_at(val_dict) - \
           sub_expr2.evaluation_at(val_dict)

class Tan

This is a class that wraps up static methods related to the tan operation

class Tan:
    """ 
    This is a class to wrap up static method related to tan operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute tan of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        tan sub_expr1
        """
        return np.tan(sub_expr1.evaluation_at(val_dict))
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression or constant
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        if   order == 1:
            return sub_expr1.derivative_at(var, val_dict) /(np.cos(sub_expr1.evaluation_at(val_dict))**2)
        elif order == 2:
            if type(var) is tuple:
                var1, var2 = var
                f = sub_expr1.evaluation_at(val_dict)
                term1 = 1/(np.cos(f)**2) * sub_expr1.derivative_at(var,  val_dict, order=2)
                term2 = 2*np.tan(f)/(np.cos(f)**2) \
                        * sub_expr1.derivative_at(var1, val_dict, order=1) \
                        * sub_expr1.derivative_at(var2, val_dict, order=1)
                return term1 + term2
            else:
                return Tan.derivative_at(sub_expr1, (var,var), val_dict, order=2)

        else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        return 1/(np.cos(sub_expr1.val)**2)
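Writing \(\sec^2 f = 1/\cos^2 f\), the two second-order terms above implement

\[\dfrac{\partial^2 \tan f}{\partial x_i \partial x_j} = \sec^2 f\,\dfrac{\partial^2 f}{\partial x_i \partial x_j} + 2\tan f\,\sec^2 f\,\dfrac{\partial f}{\partial x_i}\dfrac{\partial f}{\partial x_j}\]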

Ancestors (in MRO)

  • Tan
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of tan(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    return 1/(np.cos(sub_expr1.val)**2)

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression or constant
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default to 1, set to 2 if 2nd derivative is desired

RETURNS

the derivative of tan(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression or constant
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    if   order == 1:
        return sub_expr1.derivative_at(var, val_dict) /(np.cos(sub_expr1.evaluation_at(val_dict))**2)
    elif order == 2:
        if type(var) is tuple:
            var1, var2 = var
            f = sub_expr1.evaluation_at(val_dict)
            term1 = 1/(np.cos(f)**2) * sub_expr1.derivative_at(var,  val_dict, order=2)
            term2 = 2*np.tan(f)/(np.cos(f)**2) \
                    * sub_expr1.derivative_at(var1, val_dict, order=1) \
                    * sub_expr1.derivative_at(var2, val_dict, order=1)
            return term1 + term2
        else:
            return Tan.derivative_at(sub_expr1, (var,var), val_dict, order=2)
    else: raise NotImplementedError('3rd order or higher derivatives are not implemented.')

def evaluation_at(sub_expr1, val_dict)

Compute tan of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

tan sub_expr1

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute tan of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    tan sub_expr1
    """
    return np.tan(sub_expr1.evaluation_at(val_dict))

class Tanh

This is a class that wraps up static methods related to the tanh operation

class Tanh:
    """ 
    This is a class to wrap up static method related to tanh operation
    """
    @staticmethod
    def evaluation_at(sub_expr1,val_dict):
        """
        Compute tanh of sub_expr1 with inputs of variable values from val_dict.
    
        INPUTS
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        tanh sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        return np.sinh(x)/np.cosh(x)
    
    @staticmethod
    def derivative_at(sub_expr1,var,val_dict, order=1):
        """
        calculate 1st derivative of var using forward mode
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        val_dict: a dictionary containing variable name and values.
        var: variable of interest
        order: default to 1, set to 2 if 2nd derivative is desired
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.evaluation_at(val_dict)
        tanh = np.sinh(x)/np.cosh(x)
        if order == 1:
            return sub_expr1.derivative_at(var, val_dict) * (1-tanh*tanh)
        else: raise NotImplementedError('higher order derivatives not implemented for tanh.')
    @staticmethod
    def backderivative_at(sub_expr1,var):
        """
        calculate 1st derivative of var using back propagation
    
        INPUTS
        =======
        sub_expr1: expression whose components include var(or itself be to var)
        var: variable of interest
        
        RETURNS
        ========
        derivative of var with respect to sub_expr1
        """
        x = sub_expr1.val
        tanh = np.sinh(x)/np.cosh(x)
        return 1-tanh*tanh
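The factor (1 - tanh*tanh) used in both methods is the standard identity

\[\dfrac{d}{dx}\tanh x = 1 - \tanh^2 x = \mathrm{sech}^2\,x\]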

Ancestors (in MRO)

  • Tanh
  • builtins.object

Static methods

def backderivative_at(sub_expr1, var)

calculate the 1st derivative with respect to var, used in back propagation

INPUTS

sub_expr1: expression whose components include var (or is itself var)
var: variable of interest

RETURNS

the derivative of tanh(sub_expr1) with respect to var

@staticmethod
def backderivative_at(sub_expr1,var):
    """
    calculate 1st derivative of var using back propagation

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    var: variable of interest
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.val
    tanh = np.sinh(x)/np.cosh(x)
    return 1-tanh*tanh

def derivative_at(sub_expr1, var, val_dict, order=1)

calculate the derivative with respect to var using forward mode

INPUTS

sub_expr1: expression whose components include var (or is itself var)
val_dict: a dictionary containing variable name and values.
var: variable of interest
order: default to 1; only the 1st derivative is implemented for tanh

RETURNS

the derivative of tanh(sub_expr1) with respect to var

@staticmethod
def derivative_at(sub_expr1,var,val_dict, order=1):
    """
    calculate 1st derivative of var using forward mode

    INPUTS
    =======
    sub_expr1: expression whose components include var(or itself be to var)
    val_dict: a dictionary containing variable name and values.
    var: variable of interest
    order: default to 1, set to 2 if 2nd derivative is desired
    
    RETURNS
    ========
    derivative of var with respect to sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    tanh = np.sinh(x)/np.cosh(x)
    if order == 1:
        return sub_expr1.derivative_at(var, val_dict) * (1-tanh*tanh)
    else: raise NotImplementedError('higher order derivatives not implemented for tanh.')

def evaluation_at(sub_expr1, val_dict)

Compute tanh of sub_expr1 with inputs of variable values from val_dict.

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

tanh sub_expr1

@staticmethod
def evaluation_at(sub_expr1,val_dict):
    """
    Compute tanh of sub_expr1 with inputs of variable values from val_dict.

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    tanh sub_expr1
    """
    x = sub_expr1.evaluation_at(val_dict)
    return np.sinh(x)/np.cosh(x)

class Variable

This is a class for representing a variable.

class Variable(Expression):
    """ 
    This is a class for representing variable. 
    """
    def __init__(self):
        """ 
        The constructor for the Variable class. 
        It takes no parameters. 
        """
        self.val = None
        self.bder = 0
        return
    
    def evaluation_at(self, val_dict):
        """ 
        The function to evaluate the value of the variable
        
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ======== 
        a scalar value 
        """
        return val_dict[self]
    
    def derivative_at(self, var, val_dict, order=1):
        """ 
        The function calculates the derivative of the variable. 
  
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
        order: default set to 1 for 1st derivative, change to 2 for 
        higher order
        
        RETURNS
        ========
        scalar value  
        """
        if order == 1:
            return 1.0 if var is self else 0.0
        else:
            return 0.0
    
    def __eq__(self, another):
        """ Implement dunder method for equal """
        return another is self
    
    def __ne__(self, another):
        """ Implement dunder method for not equal """
        return not self.__eq__(another)  # 'not' rather than '~': bitwise ~ on a bool returns a nonzero int
    
    def __hash__(self):
        """ Implement dunder method for hash """
        return Expression.__hash__(self) 
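A minimal usage sketch of the Variable interface documented above. The import path follows the backprop module further down this page; note that val_dict is keyed by the Variable objects themselves, as evaluation_at shows:

import autodiff.forward as fwd

x = fwd.Variable()
val_dict = {x: 2.0}                          # keys are Variable objects

x.evaluation_at(val_dict)                    # 2.0
x.derivative_at(x, val_dict)                 # 1.0, since dx/dx = 1
x.derivative_at((x, x), val_dict, order=2)   # 0.0, second derivative of a bare variable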

Ancestors (in MRO)

  • Variable
  • Expression
  • builtins.object

Methods

def __init__(self)

The constructor for the Variable class. It takes no parameters.

def __init__(self):
    """ 
    The constructor for the Variable class. 
    It takes no parameters. 
    """
    self.val = None
    self.bder = 0
    return

def back_derivative(self, var, val_dict)

The wrapper function for the individual backderivative_at function of self._ele_func

PARAMETERS:

val_dict: a dictionary containing variable name and values. Variables in val_dict are atomic and cannot be further decomposed.
var: the variable with respect to which the function calculates the derivative

RETURNS

the derivative, with respect to var, of the immediate parent expression that contains var

def back_derivative(self,var,val_dict):
    """
    The wrapper function for individual backderivative_at 
    function of self_ele_func
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable name and values. Variables
    in val_dict are of atomic feature and cannot be further decomposed.
    var: variable with respect to which the function calculates derivative   
    
    RETURNS
    ========
    derivative of var with respect to the immediate parent that contain var
    """
    if var is self: return 1.0
    if self._sub_expr2 is None:
        return self._ele_func.backderivative_at(self._sub_expr1,var)
    else:
        return self._ele_func.backderivative_at(self._sub_expr1,
                                                self._sub_expr2,var)    

def derivative_at(self, var, val_dict, order=1)

The function calculates the derivative of the variable.

PARAMETERS:

val_dict: a dictionary containing variable name and values.
var: the variable with respect to which the derivative is taken
order: default set to 1 for the 1st derivative, change to 2 for higher order

RETURNS

a scalar value

def derivative_at(self, var, val_dict, order=1):
    """ 
    The function calculates the derivative of the variable. 
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable name and values.
    var: variable whose derivative is the result of this function
    order: default set to 1 for 1st derivative, change to 2 for 
    higher order
    
    RETURNS
    ========
    scalar value  
    """
    if order == 1:
        return 1.0 if var is self else 0.0
    else:
        return 0.0

def evaluation_at(self, val_dict)

The function to evaluate the value of the variable.

PARAMETERS:

val_dict: a dictionary containing variable name and values.

RETURNS

a scalar value

def evaluation_at(self, val_dict):
    """ 
    The function to evaluate the value of the variable
    
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ======== 
    a scalar value 
    """
    return val_dict[self]

def gradient_at(self, val_dict, returns_dict=False)

calculate the 1st derivatives with respect to the variables in val_dict using forward mode

INPUTS

val_dict: a dictionary containing variable name and values.
returns_dict: the format of the output

RETURNS

the derivatives of the current expression with respect to the variables in val_dict, stored in a dictionary or a 1-D numpy array

def gradient_at(self, val_dict, returns_dict=False):
    """
    calculate 1st derivative of variables in val_dict using forward mode

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
    returns_dict: the format of output
     
    RETURNS
    ========
    derivatives of the current expression with respect to the variables
    in val_dict, stored in a dictionary or a 1-D numpy array
    """
    if returns_dict:
        return {v: self.derivative_at(v, val_dict) for v in val_dict.keys()}
    return np.array([self.derivative_at(var, val_dict, order=1) 
                     for var in val_dict.keys()])

def hessian_at(self, val_dict)

calculate the 2nd derivatives with respect to the variables in val_dict using forward mode

INPUTS

val_dict: a dictionary containing variable name and values.

RETURNS

the 2nd derivatives of the current expression with respect to the variables in val_dict, stored in a 2-D numpy array

def hessian_at(self, val_dict):
    """
    calculate 2nd derivative of variables in val_dict using forward mode

    INPUTS
    =======
    val_dict: a dictionary containing variable name and values.
     
    RETURNS
    ========
    2nd derivatives of the current expression with respect to the variables
    in val_dict, stored in a 2-D numpy array
    """
    return np.array( [ \
                      [self.derivative_at((var1, var2), val_dict, order=2)
                       for var1 in val_dict.keys()]
                      for var2 in val_dict.keys() \
                      ] )
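A hedged sketch of gradient_at and hessian_at. It assumes the overloaded arithmetic that the elementary-function wrappers above back, so that x*x*y builds an Expression; the ordering of the returned arrays follows the insertion order of val_dict:

import autodiff.forward as fwd

x, y = fwd.Variable(), fwd.Variable()
f = x*x*y                        # assumed operator overloading on Expression
val_dict = {x: 1.0, y: 2.0}

f.gradient_at(val_dict)          # array([4., 1.]):  [2xy, x**2] at (1, 2)
f.hessian_at(val_dict)           # array([[4., 2.], [2., 0.]]): [[2y, 2x], [2x, 0]]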

Instance variables

var bder

var val

Inheritance: Expression.val

class VectorFunction

This is a class for applying operations to a vector of variables.

Attributes: _exprlist: a list of expressions with respect to which the operations are applied

class VectorFunction:
    """ 
    This is a class for applying operations to a vector of variables. 
      
    Attributes: 
        _exprlist: a list of expressions with respect to which the operations
    are applied 
    """
    def __init__(self, exprlist):
        """ 
        The constructor for VectorFunction class. 
        
        PARAMETERS:
        ======= 
        exprlist: a list of expressions with respect to which the class 
        functions are applied to  
        """
        self._exprlist = exprlist.copy()
    
    def evaluation_at(self, val_dict):
        """ 
        The function to apply evaluation_at to a vector of expressions. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a numpy array containing value of expressions in the self._exprlist. 
        """
        return np.array([expr.evaluation_at(val_dict) 
                        for expr in self._exprlist])
    
    def gradient_at(self, var, val_dict):
        """ 
        The function to apply derivative_at to a vector of expressions. 
  
        PARAMETERS:
        =======
        val_dict: a dictionary containing variable name and values.
        var: variable whose derivative is the result of this function
       
        RETURNS
        ========
        a numpy array containing first derivative of expressions in 
        self._exprlist with respect to var. 
        """
        return np.array([f.derivative_at(var, val_dict) for f in self._exprlist])
    
    def jacobian_at(self, val_dict):
        """ 
        The function to calculate jacobian with respect to atomic variables in 
        val_dict. 
  
        PARAMETERS:
        ======= 
        val_dict: a dictionary containing variable name and values.
        
        RETURNS
        ========
        a 2-D numpy array containing the derivatives of the expressions in
        self._exprlist with respect to the variables in val_dict. 
        """
        return np.array([self.gradient_at(var, val_dict)
                         for var in val_dict.keys()]).transpose()
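A hedged sketch of VectorFunction under the same operator-overloading assumption as above:

import autodiff.forward as fwd

x, y = fwd.Variable(), fwd.Variable()
F = fwd.VectorFunction([x*y, x + y])   # assumed overloaded * and + on Expression
val_dict = {x: 2.0, y: 3.0}

F.evaluation_at(val_dict)              # array([6., 5.])
F.jacobian_at(val_dict)                # array([[3., 2.], [1., 1.]]), row i is the gradient of expression i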

Ancestors (in MRO)

  • VectorFunction
  • builtins.object

Methods

def __init__(self, exprlist)

The constructor for the VectorFunction class.

PARAMETERS:

exprlist: a list of expressions to which the class functions are applied

def __init__(self, exprlist):
    """ 
    The constructor for VectorFunction class. 
    
    PARAMETERS:
    ======= 
    exprlist: a list of expressions with respect to which the class 
    functions are applied to  
    """
    self._exprlist = exprlist.copy()

def evaluation_at(self, val_dict)

The function to apply evaluation_at to a vector of expressions.

PARAMETERS:

val_dict: a dictionary containing variable name and values.

RETURNS

a numpy array containing the values of the expressions in self._exprlist.

def evaluation_at(self, val_dict):
    """ 
    The function to apply evaluation_at to a vector of expressions. 
    PARAMETERS:
    ======= 
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    a numpy array containing value of expressions in the self._exprlist. 
    """
    return np.array([expr.evaluation_at(val_dict) 
                    for expr in self._exprlist])

def gradient_at(self, var, val_dict)

The function to apply derivative_at to a vector of expressions.

PARAMETERS:

val_dict: a dictionary containing variable name and values.
var: the variable with respect to which the derivatives are taken

RETURNS

a numpy array containing the first derivatives of the expressions in self._exprlist with respect to var.

def gradient_at(self, var, val_dict):
    """ 
    The function to apply derivative_at to a vector of expressions. 
    PARAMETERS:
    =======
    val_dict: a dictionary containing variable name and values.
    var: variable whose derivative is the result of this function
   
    RETURNS
    ========
    a numpy array containing first derivative of expressions in 
    self._exprlist with respect to var. 
    """
    return np.array([f.derivative_at(var, val_dict) for f in self._exprlist])

def jacobian_at(self, val_dict)

The function to calculate the Jacobian with respect to the atomic variables in val_dict.

PARAMETERS:

val_dict: a dictionary containing variable name and values.

RETURNS

a 2-D numpy array containing the derivatives of the expressions in self._exprlist with respect to the variables in val_dict.

def jacobian_at(self, val_dict):
    """ 
    The function to calculate jacobian with respect to atomic variables in 
    val_dict. 
    PARAMETERS:
    ======= 
    val_dict: a dictionary containing variable name and values.
    
    RETURNS
    ========
    a 2-D numpy array containing derivatives of variables in val_dict 
    with resepct to expressions in self._exprlist. 
    """
    return np.array([self.gradient_at(var, val_dict)
                     for var in val_dict.keys()]).transpose()

Autodiff.backprop


backprop module

This file contains the back propagation feature, using the interface designed in forward.py.

"""
This file contains the back propagation feature, using the interface 
designed in forward.py
"""
import autodiff.forward as fwd

def forward_pass(y, val_dict):
    """ 
    Evaluates each variable/constant and stores its value in the .val 
    attribute, using atomic variable values from val_dict, in a recursive 
    fashion, starting from the root node of the computational graph.
        
    INPUTS:
    =======
    y: the highest node (root) encompassing all variables in the 
    computational graph
    val_dict: a dictionary mapping variables to their values.
    """
    # forward pass: store the value of every node
    if type(y) == fwd.Expression:
        y.val = y.evaluation_at(val_dict)
        if y._sub_expr1 is not None:
            forward_pass(y._sub_expr1, val_dict)
        if y._sub_expr2 is not None:
            forward_pass(y._sub_expr2, val_dict)
    elif isinstance(y, fwd.Variable):
        y.val = val_dict[y]
    return

def initialize(top, y):
    """ 
    Initializes the derivative value (.bder) of each variable/constant in 
    the computational graph with respect to the root: 1 for the root 
    itself, 0 for every other node.
        
    INPUTS:
    =======
    top: the root node of the computational graph
    y: the node currently being initialized (pass the root on the first call)
    """
    if y == top:
        y.bder = 1
    else:
        y.bder = 0
    if not isinstance(y, fwd.Variable) and isinstance(y._sub_expr1, fwd.Expression) \
       and not isinstance(y._sub_expr1, fwd.Constant):
        initialize(top, y._sub_expr1)
    if not isinstance(y, fwd.Variable) and isinstance(y._sub_expr2, fwd.Expression) \
       and not isinstance(y._sub_expr2, fwd.Constant):
        initialize(top, y._sub_expr2)
    return

def backward(y, val_dict, depth=0):
    """ 
    Calculates the derivative value of each variable/constant in the 
    computational graph with respect to the root, in a recursive fashion, 
    starting from the root node.
        
    INPUTS:
    =======
    y: the highest node (root) encompassing all variables in the 
    computational graph
    val_dict: a dictionary mapping variables to their values.
    """
    # val_dict stores the basic (atomic) variables,
    # which cannot be further decomposed
    if type(y) == fwd.Expression:
        # skip Constants: initialize() never gives them a .bder attribute
        if y._sub_expr1 is not None and isinstance(y._sub_expr1, fwd.Expression) \
           and not isinstance(y._sub_expr1, fwd.Constant):
            y._sub_expr1.bder += y.bder*y.back_derivative(y._sub_expr1, val_dict)
            backward(y._sub_expr1, val_dict, depth+1)
        if y._sub_expr2 is not None and isinstance(y._sub_expr2, fwd.Expression) \
           and not isinstance(y._sub_expr2, fwd.Constant):
            y._sub_expr2.bder += y.bder*y.back_derivative(y._sub_expr2, val_dict)
            backward(y._sub_expr2, val_dict, depth+1)
    return

def back_propagation(y, val_dict):
    """ 
    The wrapper function for the three steps of back propagation.
    After calling this function, the .bder attribute of each variable/constant
    in the computational graph stores its first derivative with respect to
    the root node.
        
    INPUTS:
    =======
    y: the highest node (root) encompassing all variables in the 
    computational graph
    val_dict: a dictionary mapping variables to their values.
    """
    # get backprop derivative with respect to y at every node lower than y
    forward_pass(y,val_dict)
    initialize(y,y)
    backward(y,val_dict)
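
For example, a minimal sketch of the intended usage (assuming Variable and overloaded arithmetic from autodiff.forward; the names x and y are hypothetical):

import autodiff.forward as fwd
from autodiff.backprop import back_propagation

x, y = fwd.Variable(), fwd.Variable()
f = x*y + x

back_propagation(f, {x: 2.0, y: 3.0})
x.bder    # df/dx = y + 1 = 4.0
y.bder    # df/dy = x = 2.0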


Autodiff.rootfinding


rootfinding module

This file contains some root finding algorithms built on top of autodiff.

"""
This file contains some root finding algorithms built on top of autodiff.
"""



from autodiff.backprop import back_propagation

def newton_scalar(f, init_val_dict, max_itr, method='forward', tol=1e-8):
    """
    Newton's method for finding a root of a single expression
    
    INPUTS
    =======
    f: expression 
    init_val_dict: a dictionary mapping variables to their initial values
    max_itr: maximum number of iterations before the algorithm stops
    method: string, defaults to 'forward' (forward mode); any other value 
    uses back propagation
    tol: tolerance; the algorithm stops once the absolute value of f 
    falls below this threshold
    
    RETURNS
    ========
    a dictionary of variable values at the root of f
    """
    itr = 1
    val_dict = init_val_dict.copy()
    
    while True:
        evalf = f.evaluation_at(val_dict)
        if method == 'forward':    
            derif = {v: f.derivative_at(v, val_dict) for v in val_dict.keys()}
        else:
            back_propagation(f,val_dict)
            derif = {v:v.bder for v in val_dict.keys()}
        
        for v in val_dict.keys():
            val_dict[v] = val_dict[v] - evalf/derif[v]

        if abs(f.evaluation_at(val_dict)) <= tol: break

        if itr > max_itr:
            print("Exceeded allowable max iterations without finding a root.")

            break
        
        itr += 1
        
    return val_dict
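
The update inside the loop is the classic Newton iteration, applied to every variable in val_dict:

\[x_{k+1} = x_k - \dfrac{f(x_k)}{f'(x_k)}\]

A minimal usage sketch (assuming Variable and overloaded arithmetic from autodiff.forward; the names are hypothetical):

import autodiff.forward as fwd
from autodiff.rootfinding import newton_scalar

x = fwd.Variable()
f = x*x - 2.0                              # roots at +/- sqrt(2)
root = newton_scalar(f, {x: 1.0}, max_itr=100)
root[x]                                    # approximately 1.41421356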


Autodiff.optimize


optimize module

import numpy as np
from numpy.linalg import multi_dot
from numpy.linalg import norm
from scipy.linalg import solve

def bfgs(f, init_val_dict, max_iter=2000, stop_stepsize=1e-8):
    """
    Broyden–Fletcher–Goldfarb–Shanno (BFGS) method for finding the minimum 
    of a single expression
    
    INPUTS
    =======
    f: expression 
    init_val_dict: a dictionary mapping variables to their initial values
    max_iter: maximum number of iterations before the algorithm stops
    stop_stepsize: the algorithm stops once the 2-norm of the step falls 
    below this threshold
    
    RETURNS
    ========
    a dictionary of variable values at the minimum of f
    """
    variables  = [var for var in init_val_dict.keys()]
    curr_point = np.array([v for k, v in init_val_dict.items()])
    B          = np.eye(len(curr_point))
    
    for i in range(max_iter):
        
        # solve Bs = - (gradient of f at x)
        curr_val_dict = {var: val for var, val in zip(variables, curr_point)}
        f_grad = f.gradient_at(curr_val_dict)
        s = solve(B, -f_grad)
        if norm(s, ord=2) < stop_stepsize: break
            
        # x_next := x + s
        next_point = curr_point + s

        # y := (gradient of f at x_next) - (gradient of f at x)
        # x := x_next
        next_val_dict = {var: val for var, val in zip(variables, next_point)}
        y = f.gradient_at(next_val_dict) - f.gradient_at(curr_val_dict)
        curr_point = next_point
        
        # B := B + deltaB
        s, y = s.reshape(-1, 1), y.reshape(-1, 1)
        deltaB = multi_dot([y, y.T])/multi_dot([y.T, s]) \
                 - multi_dot([B, s, s.T, B])/multi_dot([s.T, B, s]) 
        B = B + deltaB
    
    return {var: val for var, val in zip(variables, curr_point)}


def newton(f, init_val_dict, max_iter=1000, stop_stepsize=1e-8, return_history=False):
    """
    Newton's method for finding the minimum of a single expression
    
    INPUTS
    =======
    f: expression 
    init_val_dict: a dictionary mapping variables to their initial values
    max_iter: maximum number of iterations before the algorithm stops
    stop_stepsize: the algorithm stops once the 2-norm of the step falls 
    below this threshold
    return_history: defaults to False. If True, return the trajectory 
    of the algorithm including the final answer
    
    RETURNS
    ========
    If return_history = False: a dictionary of variable values at the 
    minimum of f
    If return_history = True: the trajectory of the algorithm including 
    the final answer
    """
    variables  = [var for var in init_val_dict.keys()]
    curr_point = np.array([v for k, v in init_val_dict.items()])
    history = [curr_point.tolist()]
    
    for i in range(max_iter):
        
        curr_val_dict = {var: val for var, val in zip(variables, curr_point)}
        # solve (Hessian of f at x)s = - (gradient of f at x)
        f_grad = f.gradient_at(curr_val_dict)
        f_hess = f.hessian_at(curr_val_dict)

        step = np.linalg.solve(f_hess, -f_grad)
        if np.linalg.norm(step, ord=2) < stop_stepsize: break
        
        # x := x + s
        curr_point = curr_point + step
        history.append(curr_point.tolist())
    
    if return_history:
        return history

    return {var: val for var, val in zip(variables, curr_point)}



def gradient_descent(f, init_val_dict, learning_rate=0.001, max_iter=1000, stop_stepsize=1e-6, return_history=False):
    """
    Gradient descent for finding the minimum of a single expression
    
    INPUTS
    =======
    f: expression 
    init_val_dict: a dictionary mapping variables to their initial values
    learning_rate: the step size of each gradient update
    max_iter: maximum number of iterations before the algorithm stops
    stop_stepsize: the algorithm stops once the 2-norm of the change in 
    position between iterations falls below this threshold
    return_history: defaults to False. If True, return the trajectory 
    of the algorithm including the final answer
    
    RETURNS
    ========
    If return_history = False: a dictionary of variable values at the 
    minimum of f
    If return_history = True: the trajectory of the algorithm including 
    the final answer
    """
    variables  = [var for var in init_val_dict.keys()]
    curr_point = np.array([v for k, v in init_val_dict.items()])
    history = [curr_point.tolist()]
    
    for i in range(max_iter):
        
        prev_point = curr_point
        prev_val_dict = {var: val for var, val in zip(variables, prev_point)}
        f_grad = f.gradient_at(prev_val_dict)

        curr_point = curr_point - learning_rate*f_grad
        history.append(curr_point.tolist())
        if np.linalg.norm(curr_point-prev_point, ord=2) < stop_stepsize: break
        
    if return_history:
        return history

    return {var: val for var, val in zip(variables, curr_point)}
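
The deltaB computed in bfgs is the standard rank-two BFGS update of the approximate Hessian \(B\),

\[B_{k+1} = B_k + \dfrac{y_k y_k^T}{y_k^T s_k} - \dfrac{B_k s_k s_k^T B_k}{s_k^T B_k s_k}\]

where \(s_k\) is the step and \(y_k\) is the change in gradient between iterations. newton instead solves \(H_f(x_k)\,s_k = -\nabla f(x_k)\) with the exact Hessian, and gradient_descent takes steps \(x_{k+1} = x_k - \eta\,\nabla f(x_k)\) with learning rate \(\eta\).

A minimal usage sketch (assuming Variable and overloaded arithmetic from autodiff.forward, and the gradient_at and hessian_at methods used above; the names are hypothetical):

import autodiff.forward as fwd
from autodiff.optimize import bfgs, newton, gradient_descent

x, y = fwd.Variable(), fwd.Variable()
f = (x - 1.0)*(x - 1.0) + (y + 2.0)*(y + 2.0)   # minimum at (1, -2)

bfgs(f, {x: 0.0, y: 0.0})                       # approximately {x: 1.0, y: -2.0}
newton(f, {x: 0.0, y: 0.0})                     # one Newton step suffices for a quadratic
gradient_descent(f, {x: 0.0, y: 0.0}, learning_rate=0.1)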


Autodiff.plot


plot module

import autodiff.optimize as opt
import matplotlib.pyplot as plt
import numpy as np

def plot_contour(f, init_val_dict, x, y, plot_range=[-3, 3], method='gradient_descent'):
    """Plots a contour map of the expression of interest, finds the 
    minimum point using either gradient descent or Newton's method, and 
    plots the optimization trajectory on the contour map.
    
    INPUTS
    =======
    f: expression of two variables
    init_val_dict: a dictionary mapping variables to their initial values.
    x: the variable plotted on the x axis
    y: the variable plotted on the y axis
    plot_range: the range of both axes
    method: the method used to find the minimum point ('gradient_descent' 
    or 'newton')
    """
    if method == 'gradient_descent':
        a = opt.gradient_descent(f, init_val_dict, return_history=True)
    elif method == 'newton':
        a = opt.newton(f, init_val_dict, return_history=True)
    else:
        raise ValueError("method must be 'gradient_descent' or 'newton'")
    # first plot the contour
    xx = np.linspace(plot_range[0], plot_range[1], 100)
    yy = np.linspace(plot_range[0], plot_range[1], 100)
    xg, yg = np.meshgrid(xx, yy)
    z = np.zeros(shape=(len(xg.ravel()),))
    vals = yg.ravel()
    for i, val in enumerate(xg.ravel()):
        z[i] = f.evaluation_at({x: val, y: vals[i]})
    z2 = z.reshape(xg.shape)
    plt.contourf(xg, yg, z2, alpha=0.8, cmap="BuGn")
    # plot the optimization steps
    # (assumes init_val_dict is ordered as {x: ..., y: ...})
    x_gd = []
    y_gd = []
    for l in a:
        x_gd.append(l[0])
        y_gd.append(l[1])  # was l[0]: the y coordinate is at index 1
    plt.plot(x_gd, y_gd, '.', alpha=0.1)
    plt.show()
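
A minimal usage sketch (assuming Variable and overloaded arithmetic from autodiff.forward; the names are hypothetical):

import autodiff.forward as fwd
from autodiff.plot import plot_contour

x, y = fwd.Variable(), fwd.Variable()
f = x*x + y*y                    # minimum at the origin
plot_contour(f, {x: 2.0, y: 2.0}, x, y, plot_range=[-3, 3],
             method='gradient_descent')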


Software Organization

Directory Structure

The structure of autodiff’s project directory is as follows.

autodiff/

    __init__.py
    README.md
    forward.py
    backprop.py
    optimize.py
    rootfinding.py
    plot.py

tests/

    README.md
    test_forward.py
    test_backprop.py
    test_optimize.py
    test_rootfinding.py
    test_plot.py

docs/

    README.md
    milestone1.ipynb
    milestone2.ipynb
    source/
        Background.ipynb
        Getting Started.ipynb
        Implementation.ipynb
        index.rst
        Installation.ipynb
        Libraries_demo.ipynb
        Future Development.ipynb
        License.rst

.gitignore
.travis.yml
LICENSE.txt
README.md
requirements.txt
setup.cfg
setup.py

The source code lies in the directory autodiff, in which __init__.py makes autodiff a package. The file forward.py contains the source code for forward mode automatic differentiation. The file backprop.py contains the source code for reverse mode automatic differentiation (back propagation). The file optimize.py contains optimization routines built on top of automatic differentiation. The file rootfinding.py contains root-finding routines. The file plot.py contains utility functions for plotting.

The test suite lies in the directory tests. The test files are named after the modules they test.

The documentation lies in the directory docs. milestone1.ipynb is the version of the document submitted for milestone 1, and milestone2.ipynb is the version submitted for milestone 2. document.ipynb, which is this file itself, is the final document.

Other files in the root directory include: .gitignore, which specifies the files that should not be tracked by git; .travis.yml, which is the configuration file for Travis CI; LICENSE.txt, which is the license for this package; README.md, which is the README file for this package; requirements.txt, which specifies the dependencies of this package; setup.cfg, which is the configuration file for installing this package; and setup.py, which is the script for installing this package.

Modules

There are now five modules: autodiff.forward for forward mode automatic differentiation, autodiff.backprop for reverse mode automatic differentiation (back propagation), autodiff.optimize for optimization, autodiff.rootfinding for root finding, and autodiff.plot for plotting.
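
In other words, the whole public surface can be pulled in as follows (a sketch; in practice you import only what you need):

import autodiff.forward      # Expression, Variable, Constant
import autodiff.backprop     # back_propagation
import autodiff.optimize     # bfgs, newton, gradient_descent
import autodiff.rootfinding  # newton_scalar
import autodiff.plot         # plot_contour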

Test Automation

Continuous integration is handled by Travis CI, and test coverage is tracked by Coveralls.

Distribution

autodiff is distributed via PyPI.

Future Development

1. Optimization

One of the shortcomings we notice in our current design is that, during the calculation, the derivative/value of an expression at a single point may be evaluated multiple times. When the Expression tree is shallow, this does not have much effect on the computation time. However, when the Expression tree is deep, the time spent on redundant work can grow rapidly (exponentially in the depth, in the worst case), which can be a serious problem. We may want to add a cache to our library, so that when the same derivative/value is queried again, it is fetched from the cache instead of being recomputed. This could substantially accelerate our library in the case of complex Expressions.
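
As a hypothetical sketch of the idea (these names do not exist in the package yet), such a cache could be keyed on the expression, the variable, and the evaluation point:

# Hypothetical memoization helper -- not part of the current package.
_deriv_cache = {}

def cached_derivative_at(expr, var, val_dict):
    # Key on the identities of the expression and variable, plus the point.
    point = tuple(sorted((id(v), val) for v, val in val_dict.items()))
    key = (id(expr), id(var), point)
    if key not in _deriv_cache:
        _deriv_cache[key] = expr.derivative_at(var, val_dict)
    return _deriv_cache[key]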

2. Extensions

Since most machine learning problems can be formulated as optimization problems, and optimization routines can make use of automatic differentiation, we could develop a machine learning library on top of our library. Other possible extensions include: more visualization tools, more optimization methods, derivatives of order higher than two, and a neural network framework based on reverse mode automatic differentiation.

3. Improvement

If you wish to add features to the DFYS-autodiff package, please go to our GitHub repository, fork the repository, make the improvement, and submit a pull request to us.

This project is hosted on GitHub and PyPI.