# Building basic functions with numpy

## Sigmoid function, np.exp()

The _sigmoid_ function, also known as the logistic function, is the activation function used in the logistic regression algorithm.

• The function is monotonically increasing in x, and its value rises smoothly from 0 to 1.
• At x = 0 the value is 0.5; as x → +∞ the value approaches 1, and as x → −∞ it approaches 0.

The derivative of the sigmoid function: sigmoid_derivative(x) = σ′(x) = σ(x)(1 − σ(x))
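A minimal sketch of both functions using np.exp (the function names follow the text above and are illustrative, not a fixed API):

```python
import numpy as np

def sigmoid(x):
    """Compute the sigmoid of x; works element-wise on scalars or arrays."""
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    """Compute the gradient sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1 - s)
```

Because np.exp applies element-wise, the same code handles a scalar or a whole numpy array at once.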

## Reshaping arrays

Two common numpy functions used in deep learning are np.shape and np.reshape().

• X.shape is used to get the shape (dimension) of a matrix/vector X.
• X.reshape(…) is used to reshape X into some other dimension.

Reshaping is a common operation. For example, a 3D image can be represented as an array of shape (length, height, depth=3), but to feed it into a model we need to convert it into a vector of shape (length\*height\*3, 1). In other words, we "unroll" or "reshape" the 3D array into a 1D vector.
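A hedged sketch of this unrolling step (the function name `image2vector` is illustrative):

```python
import numpy as np

def image2vector(image):
    """Flatten an image of shape (length, height, depth) into a column
    vector of shape (length*height*depth, 1)."""
    return image.reshape(image.shape[0] * image.shape[1] * image.shape[2], 1)
```

For example, an array of shape (3, 3, 2) becomes a column vector of shape (18, 1).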

## Normalizing rows

The axis argument specifies along which axis an operation runs: in NumPy, axis 0 is vertical (down the columns) and axis 1 is horizontal (across the rows).
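A minimal sketch of row normalization using axis=1 (the function name follows the usual course convention and is illustrative):

```python
import numpy as np

def normalizeRows(x):
    """Normalize each row of the matrix x to have unit L2 norm."""
    # axis=1 computes the norm across each row; keepdims=True preserves
    # an (n, 1) shape so broadcasting divides every row by its own norm.
    x_norm = np.linalg.norm(x, axis=1, keepdims=True)
    return x / x_norm
```

The division relies on broadcasting: an (n, m) matrix divided by an (n, 1) column of norms scales each row independently.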

## Broadcasting and the softmax function

# Vectorization

In deep learning, you deal with very large datasets. Hence, a non-computationally-optimal function can become a huge bottleneck in your algorithm and can result in a model that takes ages to run. To make sure that your code is computationally efficient, you will use vectorization. For example, try to tell the difference between the following implementations of the dot/outer/elementwise product.
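As a small sketch of that difference for the dot product (the input vectors are illustrative), the explicit Python loop versus its vectorized equivalent:

```python
import numpy as np

x1 = np.array([9, 2, 5, 0, 0, 7, 5, 0, 0, 0, 9, 2, 5, 0, 0])
x2 = np.array([9, 2, 2, 9, 0, 9, 2, 5, 0, 0, 9, 2, 5, 0, 0])

# Classic (slow) implementation: an explicit Python loop.
dot_loop = 0
for i in range(len(x1)):
    dot_loop += x1[i] * x2[i]

# Vectorized implementation: one call into optimized compiled code.
dot_vec = np.dot(x1, x2)

assert dot_loop == dot_vec
```

On small inputs the two are indistinguishable, but on large arrays the vectorized version is typically orders of magnitude faster because the loop happens in compiled code rather than in the Python interpreter.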

Note that np.dot() performs a matrix-matrix or matrix-vector multiplication. This is different from np.multiply() and the `*` operator (equivalent to `.*` in Matlab/Octave), which perform an element-wise multiplication.
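A short sketch of the two products side by side (the matrices are illustrative):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Matrix product: rows of A dotted with columns of B.
mat = np.dot(A, B)        # [[19, 22], [43, 50]]

# Element-wise product: corresponding entries multiplied.
elem = np.multiply(A, B)  # same result as A * B -> [[5, 12], [21, 32]]
```

Keeping the two straight matters: both accept the same arguments here, so mixing them up produces wrong numbers rather than an error.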

## Implement the L1 and L2 loss functions

# Conclusion

What to remember:

• Vectorization is very important in deep learning. It provides computational efficiency and clarity.
• You have reviewed the L1 and L2 loss.
• You are familiar with many numpy functions such as np.sum, np.dot, np.multiply, np.maximum, etc.
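The L1 and L2 losses reviewed above can be sketched as follows (yhat is the vector of predictions, y the vector of true labels; the function names are illustrative):

```python
import numpy as np

def L1(yhat, y):
    """L1 loss: sum of absolute differences between predictions and labels."""
    return np.sum(np.abs(y - yhat))

def L2(yhat, y):
    """L2 loss: sum of squared differences between predictions and labels."""
    return np.sum((y - yhat) ** 2)
```

Both are fully vectorized: np.abs, the subtraction, the squaring, and np.sum all operate on whole arrays with no Python loop.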