1. Inverse of a matrix
A⁻¹ = Adj(A) / det(A)
where Adj(A) is the adjoint of A, i.e. the transpose of the matrix of cofactors of A.
Inverse of a diagonal matrix: take the reciprocal of each diagonal entry, i.e. diag(d₁, …, dₙ)⁻¹ = diag(1/d₁, …, 1/dₙ).
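As a quick numerical check of the adjoint formula, here is a minimal sketch in pure Python for the 2×2 case (the function name is illustrative, not from the notes):

```python
def inverse_2x2(m):
    # m = [[a, b], [c, d]];  A^-1 = Adj(A) / det(A)
    a, b = m[0]
    c, d = m[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix")
    # adjoint (adjugate) of a 2x2 matrix: swap the diagonal, negate the off-diagonal
    return [[d / det, -b / det], [-c / det, a / det]]

print(inverse_2x2([[4, 7], [2, 6]]))  # det = 10
```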
2. Adjoint of a matrix
Let A be an n×n matrix; then A·Adj(A) = Adj(A)·A = det(A)·I.
3. Trace of a matrix
Let M be an n×n matrix; then trace(M) = Σ mᵢᵢ, the sum of the diagonal entries, which also equals the sum of the eigenvalues of M.
4. Matrix of cofactors
If B is the matrix of cofactors of A, then bᵢⱼ = (−1)^(i+j) Mᵢⱼ, where Mᵢⱼ is the minor obtained by deleting row i and column j of A.
5. Area of a triangle in determinant form
Three points A(a₁, b₁), B(a₂, b₂), C(a₃, b₃) in the plane. Area of triangle ABC = ½ |a₁(b₂ − b₃) + a₂(b₃ − b₁) + a₃(b₁ − b₂)|, i.e. half the absolute value of the determinant with rows (a₁, b₁, 1), (a₂, b₂, 1), (a₃, b₃, 1).
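The determinant form of the area can be sketched directly in Python (function name illustrative):

```python
def triangle_area(p1, p2, p3):
    # Area = 1/2 |x1(y2 - y3) + x2(y3 - y1) + x3(y1 - y2)|
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

print(triangle_area((0, 0), (4, 0), (0, 3)))  # right triangle with legs 4 and 3
```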
2. Differential equations
1. Integrating factor: for dy/dx + P(x)y = Q(x), the integrating factor is IF = e^(∫P dx) and the solution is y·IF = ∫Q·IF dx + C.
2. For roots α ± iβ of the characteristic equation, the solution is y = e^(αx)(C₁ cos βx + C₂ sin βx).
3. Odd and even functions
f(t) = e^(−|t|) satisfies f(−t) = f(t) (it equals eᵗ for t < 0 and e⁻ᵗ for t > 0), so it is an even function; an odd function would satisfy f(−t) = −f(t).
3. Calculating the curl of a vector
Curl of a vector field F = F₁i + F₂j + F₃k:
∇ × F = (∂F₃/∂y − ∂F₂/∂z)i + (∂F₁/∂z − ∂F₃/∂x)j + (∂F₂/∂x − ∂F₁/∂y)k
4. Taylor series
A Taylor series is an infinite sum giving the value of a function f(z) in the neighbourhood of a point a in terms of the derivatives of the function evaluated at a:
f(z) = Σₙ₌₀^∞ f⁽ⁿ⁾(a)(z − a)ⁿ / n!
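A partial sum of the series already reproduces the function closely; a minimal sketch for eˣ about a = 0 (function name illustrative):

```python
import math

def taylor_exp(x, terms=20):
    # partial sum of e^x = sum over n of x^n / n!  (Taylor series about a = 0)
    return sum(x ** n / math.factorial(n) for n in range(terms))

print(abs(taylor_exp(1.0) - math.e) < 1e-12)
```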
5. Ratio test
Ratio test for whether a given series Σaₙ is convergent or divergent: compute L = lim |aₙ₊₁/aₙ|. The series converges if L < 1, diverges if L > 1, and the test is inconclusive if L = 1.
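The limit can be approximated numerically by taking the ratio at a large n; a small sketch (function name illustrative):

```python
def ratio_test_limit(a, n=50):
    # approximate L = lim |a(n+1) / a(n)| by evaluating at a large n
    return abs(a(n + 1) / a(n))

# for a_n = 1/2**n the ratio is exactly 1/2 < 1, so the series converges
print(ratio_test_limit(lambda n: 1 / 2 ** n))
```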
6. Hyperbolic functions
sinh x = (eˣ − e⁻ˣ)/2, cosh x = (eˣ + e⁻ˣ)/2, tanh x = sinh x / cosh x; tanh x → ±1 as x → ±∞.
7. Eigenvalues
The characteristic equation for the eigenvalues is the expanded form of det(A − λI) = 0; its roots are the eigenvalues.
If X is an eigenvector of A with eigenvalue k, then AX = kX; an eigenvector must be a non-zero vector.
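For a 2×2 matrix the characteristic equation reduces to a quadratic in the trace and determinant; a minimal sketch assuming real eigenvalues (function name illustrative):

```python
import math

def eigenvalues_2x2(m):
    # det(A - λI) = 0  →  λ² - tr(A)·λ + det(A) = 0
    a, b = m[0]
    c, d = m[1]
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

print(eigenvalues_2x2([[2, 1], [1, 2]]))  # symmetric, so eigenvalues are real
```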
8. Average value of f(x)
Average value of f(x) over [a, b] = (1/(b − a)) ∫[a→b] f(x) dx
9. Finding a saddle point
At a critical point (fₓ = f_y = 0), compute D = fₓₓ·f_yy − fₓy²; the point is a saddle point if D < 0.
10. Sum of tan⁻¹A and tan⁻¹B
tan⁻¹A + tan⁻¹B = tan⁻¹((A + B)/(1 − AB)), valid for AB < 1.
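The identity is easy to spot-check numerically for a pair with AB < 1:

```python
import math

# tan⁻¹A + tan⁻¹B = tan⁻¹((A + B) / (1 - A·B)), valid for A·B < 1
A, B = 0.3, 0.4
lhs = math.atan(A) + math.atan(B)
rhs = math.atan((A + B) / (1 - A * B))
print(abs(lhs - rhs) < 1e-12)
```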
11. Calculating the gradient of a function
grad ɸ = ∇ɸ = (∂ɸ/∂x)i + (∂ɸ/∂y)j + (∂ɸ/∂z)k. To find the unit vector normal to the surface ɸ = constant, calculate n = ∇ɸ/|∇ɸ|.
12. Divergence, curl and gradient of a vector
Divergence of the curl of a vector a is zero: ∇·(∇ × a) = 0; similarly, the curl of a gradient is zero: ∇ × (∇ɸ) = 0.
13. Cauchy–Riemann equations
For a pair of real-valued functions of two real variables u(x, y) and v(x, y), the Cauchy–Riemann equations are ∂u/∂x = ∂v/∂y and ∂u/∂y = −∂v/∂x. Then f = u + iv is complex-differentiable at a point if and only if the partial derivatives of u and v satisfy the Cauchy–Riemann equations there (with the partials continuous).
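The equations can be verified numerically with central differences; a sketch for f(z) = z², where u = x² − y² and v = 2xy (function name illustrative):

```python
# f(z) = z²:  u(x, y) = x² - y²,  v(x, y) = 2xy
# Cauchy-Riemann: ∂u/∂x = ∂v/∂y  and  ∂u/∂y = -∂v/∂x
def check_cr(x, y, h=1e-6):
    u = lambda x, y: x * x - y * y
    v = lambda x, y: 2 * x * y
    # central-difference approximations of the partial derivatives
    ux = (u(x + h, y) - u(x - h, y)) / (2 * h)
    uy = (u(x, y + h) - u(x, y - h)) / (2 * h)
    vx = (v(x + h, y) - v(x - h, y)) / (2 * h)
    vy = (v(x, y + h) - v(x, y - h)) / (2 * h)
    return abs(ux - vy) < 1e-6 and abs(uy + vx) < 1e-6

print(check_cr(1.5, -0.7))
```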
14. Laplace transform
1. Laplace transform of a function f(t): L{f(t)} = F(s) = ∫₀^∞ e^(−st) f(t) dt
2. Periodic function
Laplace transform of a periodic function with period T:
L{f(t)} = (1/(1 − e^(−sT))) ∫[0→T] e^(−st) f(t) dt
3. Improper integral by Laplace transform
The value of an improper integral of the form ∫₀^∞ e^(−st) f(t) dt can be found by recognising it as the Laplace transform F(s) evaluated at the given value of s.
15. Complex numbers
1. Polar form of a complex number
x + iy = r(cos θ + i sin θ) = re^(iθ), where r = √(x² + y²) and θ = tan⁻¹(y/x).
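Python's standard library exposes this conversion directly:

```python
import cmath, math

z = 1 + 1j
r, theta = cmath.polar(z)  # r = |z| = √(x² + y²), θ = atan2(y, x)
print(r, theta)  # r ≈ √2, θ ≈ π/4
```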
16. Definite integrals
1. For an even function: ∫[−a→a] f(x) dx = 2 ∫[0→a] f(x) dx (for an odd function the integral is zero).
2. ∫ dx/(1 + x²) = tan⁻¹x + C
3. Breaking the limits of a definite integral: ∫[a→b] f(x) dx = ∫[a→c] f(x) dx + ∫[c→b] f(x) dx
17. Green's theorem
∮_C (M dx + N dy) = ∬_R (∂N/∂x − ∂M/∂y) dx dy
18. Fourier series
1. Infinite Fourier series
The solution is the infinite Fourier series f(x) = a₀/2 + Σₙ₌₁^∞ (aₙ cos nx + bₙ sin nx), with aₙ = (1/π) ∫[−π→π] f(x) cos nx dx and bₙ = (1/π) ∫[−π→π] f(x) sin nx dx (for period 2π).
19. Probability density function
For a continuous random variable X with density f(x): f(x) ≥ 0, ∫[−∞→∞] f(x) dx = 1, and P(a ≤ X ≤ b) = ∫[a→b] f(x) dx.
20. Random variable probability function P(X)
where X is the random variable.
For a continuous random variable, the total area under the density curve = 1.
21. Normal distribution
The normal distribution is symmetrical about its mean μ; mean = median = mode.
23. Residue of a function f(z)
For a simple pole at z = z₀: Res(f, z₀) = lim (z→z₀) (z − z₀) f(z)
24. Newton–Raphson method
xₙ₊₁ = xₙ − f(xₙ)/f′(xₙ). The Newton–Raphson method converges in one step if the function is linear.
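The iteration can be sketched in a few lines (function name and tolerances illustrative):

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    # x_{n+1} = x_n - f(x_n) / f'(x_n)
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / df(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# root of x² - 2 = 0, i.e. √2
print(newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0))
```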
25. Double integral
26. Euler's method
yₙ₊₁ = yₙ + h·f(xₙ, yₙ), where the step size h is given in the question.
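A minimal sketch of the update rule (function name illustrative):

```python
def euler(f, x0, y0, h, n):
    # y_{k+1} = y_k + h * f(x_k, y_k)
    x, y = x0, y0
    for _ in range(n):
        y += h * f(x, y)
        x += h
    return y

# dy/dx = y, y(0) = 1: with h = 0.1 Euler gives (1.1)^10 ≈ 2.594,
# an underestimate of the exact value e ≈ 2.718 at x = 1
print(euler(lambda x, y: y, 0.0, 1.0, 0.1, 10))
```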
27. Numerical integration
1. Trapezoidal rule: ∫[a→b] f(x) dx ≈ (h/2)[f₀ + 2(f₁ + f₂ + … + fₙ₋₁) + fₙ], with h = (b − a)/n
2. Simpson's 1/3 rule: ∫[a→b] f(x) dx ≈ (h/3)[f₀ + 4(f₁ + f₃ + …) + 2(f₂ + f₄ + …) + fₙ], n even
3. Simpson's 3/8 rule: ∫[a→b] f(x) dx ≈ (3h/8)[f₀ + 3(f₁ + f₂ + f₄ + f₅ + …) + 2(f₃ + f₆ + …) + fₙ], n a multiple of 3
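The trapezoidal and Simpson's 1/3 rules can be sketched as follows (function names illustrative); Simpson's 1/3 rule is exact for polynomials up to degree 3:

```python
def trapezoidal(f, a, b, n):
    h = (b - a) / n
    s = f(a) + f(b) + 2 * sum(f(a + i * h) for i in range(1, n))
    return s * h / 2

def simpson_13(f, a, b, n):  # n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))  # odd ordinates
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))  # even ordinates
    return s * h / 3

# ∫₀¹ x² dx = 1/3
print(trapezoidal(lambda x: x * x, 0, 1, 100))
print(simpson_13(lambda x: x * x, 0, 1, 100))
```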
29. i raised to the power i
iⁱ = e^(−π/2) ≈ 0.2079 (principal value)
1. Summation of n: 1 + 2 + … + n = n(n + 1)/2
2. Probability of A union B: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
3. Probability of A intersection B: P(A ∩ B) = P(A)·P(B|A); for independent events, P(A ∩ B) = P(A)·P(B)
32. Probability of A given B: P(A|B) = P(A ∩ B)/P(B)
34. Summation involving both AP and GP
35. Taylor series expansion of eᶻ
eᶻ = Σₙ₌₀^∞ zⁿ/n! = 1 + z + z²/2! + z³/3! + …
36. Linearly independent set of functions
For a set of functions f(x), g(x), h(x) to be linearly independent, the Wronskian W(f, g, h) ≠ 0.
37. Residual of a function at x = a
To find residual of a function at a particular point
For matrix A
39. Cumulative distribution function
Cumulative distribution function of a random variable X: F(x) = P(X ≤ x) = ∫[−∞→x] f(t) dt
40. Idempotent matrix
A matrix A is idempotent if A² = A.
If AB is an idempotent matrix, then AB = BA.
41. Regula-Falsi method
The next iterate is the x-intercept of the chord joining (a, f(a)) and (b, f(b)): c = (a·f(b) − b·f(a))/(f(b) − f(a)). The sub-interval on which f changes sign is retained, so the root remains bracketed and the method converges.
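The bracketing iteration can be sketched as follows (function name and tolerances illustrative):

```python
def regula_falsi(f, a, b, tol=1e-10, max_iter=100):
    # requires f(a) and f(b) of opposite sign (root bracketed)
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "root must be bracketed"
    for _ in range(max_iter):
        # x-intercept of the chord through (a, f(a)) and (b, f(b))
        c = (a * fb - b * fa) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fa * fc < 0:          # root lies in [a, c]
            b, fb = c, fc
        else:                    # root lies in [c, b]
            a, fa = c, fc
    return c

# root of x² - 2 = 0 bracketed in [0, 2]
print(regula_falsi(lambda x: x * x - 2, 0.0, 2.0))
```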
43. Ramp function
r(t) = t for t ≥ 0 and 0 for t < 0; L{r(t)} = 1/s²
44. Analytic function
Let f(z) be defined as f(z) = u(x, y) + iv(x, y). For f(z) to be analytic, the Cauchy–Riemann equations must hold, with continuous partial derivatives. If f(z) is analytic, then f′(z) = ∂u/∂x + i·∂v/∂x.
45. Types of errors
1. Integral square error: ISE = ∫₀^∞ e²(t) dt
2. Integral absolute error: IAE = ∫₀^∞ |e(t)| dt
3. Integral time-weighted absolute error: ITAE = ∫₀^∞ t·|e(t)| dt
46. Directional derivative
The directional derivative D_u f(x₀, y₀, z₀) = ∇f · û is the rate at which the function f(x, y, z) changes at the point (x₀, y₀, z₀) in the direction of the unit vector û. It is greatest along the direction of ∇f, the gradient, and this maximum rate of change equals |∇f|.
47. Exact differential
M dx + N dy is an exact differential when ∂M/∂y = ∂N/∂x.
48. Initial value problem
49. Linear differential equation
A first-order differential equation is said to be linear if it can be written as dy/dx + P(x)y = Q(x).
50. Sum of squares of errors
SSE = Σᵢ (Yᵢ − yᵢ)²
where yᵢ is the value calculated from known correlations and Yᵢ is the measured data point. It is used to fit the value of a variable in an equation to a set of data points.
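The definition is one line in Python (function name illustrative):

```python
def sum_squared_errors(measured, predicted):
    # SSE = Σ (Yᵢ - yᵢ)²
    return sum((Y - y) ** 2 for Y, y in zip(measured, predicted))

print(sum_squared_errors([1.0, 2.0, 3.5], [1.0, 2.5, 3.0]))  # 0.25 + 0.25
```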
51. Miscellaneous facts
- The transpose of a square matrix A has the same eigenvalues as A.
- The gradient of a scalar quantity is always a vector.
- If A and B are 3×3 matrices with AB = 0 and A is non-singular, then B = 0 (rank 0); in general, rank(A) + rank(B) ≤ 3.
- Three equations are independent and have a unique solution only if the coefficient determinant is non-vanishing.
- For the solution of a differential equation to be y = (C₁ + C₂x)e^(mx), the roots of the characteristic equation must be equal (repeated root m).
- A set of homogeneous equations has non-trivial solutions only if det[A] = 0.
- If A is an m×n matrix with rank n and B is an n×p matrix with rank p, with m ≥ n ≥ p, then rank(AB) ≤ min(n, p) = p. The rank of a matrix equals its number of linearly independent rows.
- The trapezoidal rule gives the exact integral when f(x) is linear.
- For three vectors to be coplanar, their scalar triple product (determinant) must be zero.