{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.3" }, "colab": { "name": "Test-4.ipynb", "provenance": [], "collapsed_sections": [] } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "sLcU0QcZ8Z_Y", "colab_type": "text" }, "source": [ "#
CS568: Deep Learning
Spring 2020
" ] }, { "cell_type": "markdown", "metadata": { "id": "YhoGJ4FA8Z_b", "colab_type": "text" }, "source": [ "## Autograd\n", "\n", "Autograd is a Python package for automatic differentiation.\n", "\n", "* To intall Autograd:\n", " pip install autograd\n", "* There are a lot of great [examples](https://github.com/HIPS/autograd/tree/master/examples) provided with the source code\n", "* Autograd can automatically differentiate Python and Numpy code.\n", "* It uses reverse-mode automatic differentiation (AD) to compute the gradients of scalar-valued functions with respect to vector inputs.\n", "* Modern machine learning frameworks (TensorFlow, Theano, PyTorch) employ AD. " ] }, { "cell_type": "code", "metadata": { "id": "WUFPbw_9UtX-", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 68 }, "outputId": "b2308510-d00e-4b8b-f490-83c26a87200f" }, "source": [ "!pip install autograd # just run this, to install autograd using notebook" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "Requirement already satisfied: autograd in /usr/local/lib/python3.6/dist-packages (1.3)\n", "Requirement already satisfied: future>=0.15.2 in /usr/local/lib/python3.6/dist-packages (from autograd) (0.16.0)\n", "Requirement already satisfied: numpy>=1.12 in /usr/local/lib/python3.6/dist-packages (from autograd) (1.17.5)\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "_1hFLJE48Z_b", "colab_type": "code", "colab": {} }, "source": [ "import autograd.numpy as np # wrapper around numpy\n", "from autograd import grad # function to generate gradients\n", "import matplotlib.pyplot as plt" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "_2MvhDuv8Z_j", "colab_type": "code", "colab": {} }, "source": [ "def f(x1, x2): \n", " return (x1*x2 + np.sin(x1))\n", "\n", "def manual_f_grad(x1, x2):\n", " # Forward Pass\n", " w1 = x1\n", " w2 = x2\n", " w3 = x1*x2\n", " w4 = np.sin(x1)\n", " w5 = w3 + w4\n", " \n", " # Backward Pass\n", " \n", " \n", " return \n", "\n", "\n", "autograd_f_grad = grad(f)\n", "\n", "# Compare the results of manual and autograd functions\n", "print(autograd_f_grad(2.0, 1.0))\n", "print(manual_f_grad(2.0, 1.0))" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "NEg1eGK58Z_l", "colab_type": "code", "colab": {} }, "source": [ "" ], "execution_count": 0, "outputs": [] } ] }