This is not an extensive, detailed guide to machine learning libraries. It is a small extract of the common features of these libraries and their usage.

There are many Python packages used in ML; here are some of the important ones.

• NumPy - Everything mathematical: formulas, series, multidimensional arrays, and operations on them.
• SciPy - Statistical and scientific analysis; an extension of NumPy.
• Pandas - Reading, writing, and manipulating data from/to CSV, databases, Excel, etc.; cleaning and filtering.
• Seaborn - Data visualization; built on top of Matplotlib.
• Matplotlib - Data visualization; Seaborn is an extension over Matplotlib.
• Scikit-learn - Ready-to-use ML algorithms, built on NumPy and SciPy.

# Numpy

## Numpy array

• NumPy provides the `numpy.dtype` class to define custom homogeneous structured datatypes. Its signature (no need to memorize it) is `numpy.dtype(object, align, copy)`.

```python
import numpy as np

dt = np.dtype([('age', np.int8)])
a = np.array([(10,), (20,), (30,)], dtype=dt)
np.sort(a, order='age')  # can sort by the custom-defined field
```

• A closer look at `array`. Its signature is `numpy.array(object, dtype=None, copy=True, order=None, subok=False, ndmin=0)`.

```python
import numpy as np

a = np.array([1, 2, 3], ndmin=2, dtype=complex)  # [[1.+0.j, 2.+0.j, 3.+0.j]]
```

• Read order of an n-dimensional array. With C order, for shape (2, 3, 4) the last axis (4) varies fastest, then 3, then 2. With F order, the first axis (2) varies fastest. This does not change how the array is stored in memory; only the read order changes.
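A minimal sketch of the two read orders, using `ravel` on a small 2x3 array (values chosen purely for illustration):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)  # [[0, 1, 2], [3, 4, 5]]

# C order: the last axis varies fastest, i.e. read row by row
c_read = a.ravel(order='C')     # 0 1 2 3 4 5

# F order: the first axis varies fastest, i.e. read column by column
f_read = a.ravel(order='F')     # 0 3 1 4 2 5
```

Note that both reads come from the same array in memory; only the traversal order differs.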

• Iterating over an array:

```python
import numpy as np

a = np.arange(12).reshape(3, 4)
# default order is 'C'; op_flags defaults to ['readonly']
for x in np.nditer(a, order='F', op_flags=['readwrite']):
    print(x, end=' ')
```

• Broadcasted iteration over a matrix and a vector (the shapes must be broadcast-compatible, e.g. (3, 4) with (4,) or (1, 4)):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)  # shape (3, 4)
b = np.arange(4)                 # shape (4,), broadcast against a
for x, y in np.nditer([a, b]):
    print("%d:%d" % (x, y), end=' ')
```

## Numpy linear algebra

An eigenvector v and eigenvalue λ of a matrix A satisfy $Av = \lambda v$: the eigenvector gives a direction that A only scales (the ratio between its components is what matters), and the eigenvalue gives the magnitude of that scaling. The eigenvector components need not sum to 1, but they are often normalized to give a clearer picture.
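A small sketch of computing eigenvalues and eigenvectors with `np.linalg.eig` (the matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# w holds the eigenvalues; the columns of v are the eigenvectors,
# so A @ v[:, i] == w[i] * v[:, i] for each i
w, v = np.linalg.eig(A)
```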

Pseudo-inverse - Sometimes $A^{-1}$ is not defined because $|A| = 0$. In that case we compute an X that is as close as possible to an inverse: whereas a true inverse satisfies $AA^{-1} = I$ (i.e. $AA^{-1} - I = 0$), the pseudo-inverse X makes $AX - I$ as close to 0 as possible.
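A sketch using `np.linalg.pinv` (Moore-Penrose pseudo-inverse) on a deliberately singular matrix:

```python
import numpy as np

# singular: the second row is 2x the first, so det(A) == 0
# and np.linalg.inv(A) would raise LinAlgError
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

X = np.linalg.pinv(A)

# X is the best "almost inverse": among other properties, A @ X @ A == A
```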

Hermitian Matrix - $A_{ij} = \overline{A_{ji}}$: the matrix equals its own conjugate transpose (each entry is the complex conjugate of the corresponding transposed entry).

Echelon Matrix - each row's leading (first nonzero) entry is 1 and sits at least one column to the right of the leading entry of the row above; all-zero rows come last.

Orthogonal Matrix - $AA^T = I$ (equivalently, $A^T = A^{-1}$).
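A quick check of the orthogonality property, using a 2-D rotation matrix as a standard example of an orthogonal matrix:

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by 45 degrees

# for an orthogonal matrix, the transpose is the inverse: Q @ Q.T == I
check = Q @ Q.T
```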

Matrix Rank - the number of linearly independent rows. It is computed by row-reducing the matrix to echelon form, using row operations to zero out the linearly dependent rows.
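NumPy does the reduction for us via `np.linalg.matrix_rank`; a sketch on a matrix with one dependent row:

```python
import numpy as np

# the second row is 2x the first, so only one row is independent
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

rank = np.linalg.matrix_rank(A)  # 1
```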

LU Decomposition - $A = LU$, where L is a lower triangular matrix and U is an upper triangular matrix.
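NumPy itself does not expose an LU routine; SciPy (mentioned above) provides `scipy.linalg.lu`. Note it returns an extra permutation matrix P from partial pivoting, so the factorization is $A = PLU$:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])

# P permutes rows (pivoting), L is lower triangular, U is upper triangular
P, L, U = lu(A)
```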

Cholesky Decomposition - $A = LL^T$ (or $LL^*$ in the complex case), where L is a lower triangular matrix; A must be Hermitian (symmetric, for real matrices) and positive-definite.
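A sketch with `np.linalg.cholesky` on a small symmetric positive-definite matrix (chosen for illustration):

```python
import numpy as np

# symmetric and positive-definite (both eigenvalues are positive)
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

L = np.linalg.cholesky(A)  # lower triangular factor with A == L @ L.T
```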

QR Decomposition - $A = QR$, where Q is an orthogonal matrix and R is upper triangular.
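A sketch with `np.linalg.qr` (the input matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Q is orthogonal (Q.T @ Q == I), R is upper triangular, and A == Q @ R
Q, R = np.linalg.qr(A)
```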

SVD - $A = USV^T$, where $U^TU = I$, $V^TV = I$, and S is a diagonal matrix of the singular values.
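A sketch with `np.linalg.svd`; note it returns the singular values as a 1-D array `s` (the diagonal of S) and $V^T$ directly rather than V:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, -2.0]])

U, s, Vt = np.linalg.svd(A)  # s is sorted in descending order

# reconstruct: A == U @ diag(s) @ Vt
```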