Reading:
def f(x):        # <--- def keyword, then name, arguments, and a colon
    return x**2  # <--- return the output
y = f(4)
print("y =", y)
y = 16
This says
(recall we are exclusively working with real numbers in this class)
An artificial neuron is "on" or "off" depending on its inputs
This on/off behavior is modeled with the activation function, a scalar function of a scalar input
Popular functions, in historical order
Exercise: code these in python
Exercise: convert a step function into a sign function and vice versa
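A minimal sketch of the conversion exercise, assuming the common conventions step(z) = 1 for z ≥ 0 (else 0) and sign(z) = ±1; the affine maps `2*step(z) - 1` and `(sign(z) + 1)/2` convert between them:

```python
import numpy as np

def step(z):
    """Heaviside step: 1 if z >= 0, else 0 (one common convention)."""
    return np.where(z >= 0, 1.0, 0.0)

def sign(z):
    """Sign built from step: rescales {0, 1} onto {-1, +1}."""
    return 2 * step(z) - 1

def step_from_sign(z):
    """Step recovered from sign: rescales {-1, +1} back onto {0, 1}."""
    return (sign(z) + 1) / 2
```

Note that under this convention both functions assign z = 0 to the "on" value.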
$a$, $c$, and $z_0$ are constants.
Sigmoid Activation Function = Special case of Logistic function: $c=1$, $z_0=0$, $a=1$ (typically)
Exercise: code in python
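One possible solution sketch, assuming the logistic function takes the form $f(z) = c / (1 + e^{-a(z - z_0)})$ consistent with the constants named above:

```python
import numpy as np

def logistic(z, a=1.0, c=1.0, z0=0.0):
    """General logistic function: c / (1 + exp(-a * (z - z0)))."""
    return c / (1 + np.exp(-a * (z - z0)))

def sigmoid(z):
    """Sigmoid = logistic with c=1, z0=0, a=1."""
    return logistic(z)
```

Sanity checks: sigmoid(0) = 0.5, and the outputs approach 0 and 1 for large negative and positive inputs.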
The activation function that was popular when Deep Learning became famous.
Not smooth, but its gradient is still simple to compute.
Variants such as "Leaky ReLU" address the zero gradient on negative inputs.
Exercise: code in python
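One possible solution sketch for ReLU and the Leaky ReLU variant mentioned above (the slope `alpha` for negative inputs is a small constant, commonly 0.01):

```python
import numpy as np

def relu(z):
    """ReLU: max(0, z), applied elementwise."""
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):
    """Leaky ReLU: keeps a small slope alpha on negative inputs instead of zero."""
    return np.where(z >= 0, z, alpha * z)
```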
Relates two quantities to each other
This says
(recall we are exclusively working with real numbers in this class)
Examples:
What is $x$?
What is $x$?
Think of $x$ as the variable and $a$ as a vector of fixed parameters
Suppose both $x$ and $a$ were viewed as variables, what is $f$? ($\mathbb R^? \rightarrow \mathbb R^?$)
Any linear function $\mathbb R^n \rightarrow \mathbb R$ can be described as an inner product
Exercise: prove by using the linearity assumption to expand $f(x_1e_1+\dots+x_ne_n)$.
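The expansion in the exercise can be sketched as follows (writing $e_1, \dots, e_n$ for the standard basis vectors):

```latex
\begin{aligned}
f(x) &= f(x_1 e_1 + \cdots + x_n e_n) \\
     &= x_1 f(e_1) + \cdots + x_n f(e_n) && \text{(linearity)} \\
     &= a^T x, \qquad \text{where } a_i := f(e_i).
\end{aligned}
```

So the vector $a$ representing the linear function is just the values of $f$ on the basis vectors.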
A linear function with an offset
consider $a = \begin{pmatrix}2\\1 \end{pmatrix}$, and $b=1$
Draw the linear function $f(x)=a^Tx$ and the affine function $f(x)=a^Tx+b$
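A small numerical sketch of the two functions above (the plotting itself is left to the reader); it shows that the affine value is always the linear value shifted by the constant offset $b$:

```python
import numpy as np

a = np.array([2.0, 1.0])  # the constants from the example above
b = 1.0

def f_linear(x):
    """Linear: f(x) = a^T x."""
    return a @ x

def f_affine(x):
    """Affine: f(x) = a^T x + b."""
    return a @ x + b
```

Note that the affine function does not pass through the origin: f_affine(0) = b, while f_linear(0) = 0.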
print("dosage = " + str(np.round(dosage, 2)))
print("blood concentration = " + str(np.round(conc_noisy, 2)))
plt.scatter(dosage, conc_noisy, color='black', linewidth=1.5);
dosage = [   0.   166.67  333.33  500.   666.67  833.33 1000.  ]
blood concentration = [17.06 31.89 35.01 34.18 65.33 79.27 47.47]
Exercise: make two python functions to implement the following function composition
Compute $h(4)$
Describe as a composition of simple functions
What values can $y$ take? Where are those values attained?
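The specific composition for the exercise is not reproduced here; as a hypothetical illustration of the pattern, suppose the inner function is $f(x)=x^2$ and the outer function is $g(x)=x+1$, so $h = g \circ f$:

```python
def f(x):
    """Hypothetical inner function (substitute the one from the exercise)."""
    return x**2

def g(x):
    """Hypothetical outer function (substitute the one from the exercise)."""
    return x + 1

def h(x):
    """Composition h = g o f, i.e. h(x) = g(f(x))."""
    return g(f(x))
```

With these hypothetical choices, computing $h(4)$ means evaluating the inner function first ($f(4)=16$) and then the outer one ($g(16)=17$).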