Derivative Calculator

Compute derivatives of functions numerically and view a reference table of basic derivative formulas.

MATHEMATICS

The Derivative Calculator computes function derivatives numerically at a given point.

Uses the central difference method. Supports polynomial, trigonometric, exponential, and logarithmic functions. Includes a table of 20 basic derivative formulas and 7 differentiation rules.

Supported symbols: +, -, *, /, ^, sqrt, sin, cos, tan, ln, log, e, pi, x

Calculator information

How to use this calculator

  1. Enter the function f(x) in standard notation, for example: x^2 + 3*x - 5 or sin(x)*exp(x).
  2. Enter the point x at which the derivative should be evaluated, e.g. x = 2.
  3. Select the derivative order: 1 (f'), 2 (f''), or 3 (f''').
  4. Adjust the step h if needed (default 1e-5, which roughly balances truncation and round-off error for central difference).
  5. Press Calculate to obtain the numerical f'(x) and visualize the tangent line.
  6. Review the table of basic derivative rules (polynomial, sin, cos, exp, ln) and the chain/product rules. Tip: a step size smaller than about 1e-10 lets floating-point round-off dominate the result; h = 1e-5 is generally near-optimal for central difference.
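The steps above can be sketched in Python. This is a hypothetical illustration, not the calculator's actual implementation: the expression evaluator here is a minimal stand-in (note that Python writes powers as ** rather than ^).

```python
import math

# Hypothetical sketch of the calculator's pipeline: parse a typed f(x),
# then differentiate it numerically at the requested point.
ALLOWED = {name: getattr(math, name) for name in
           ("sin", "cos", "tan", "sqrt", "exp", "log", "pi", "e")}
ALLOWED["ln"] = math.log  # the calculator's "ln" is the natural log

def make_f(expr):
    """Turn an expression string like 'x**2 + 3*x - 5' into a callable."""
    return lambda x: eval(expr, {"__builtins__": {}}, {**ALLOWED, "x": x})

def derivative(f, x, h=1e-5):
    """First derivative via central difference: (f(x+h) - f(x-h)) / (2h)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = make_f("x**2 + 3*x - 5")
print(round(derivative(f, 2.0), 6))  # f'(x) = 2x + 3, so f'(2) = 7.0
```

A real calculator would use a proper parser rather than eval; the restricted namespace here only limits, not eliminates, the risks of evaluating user input.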

Numerical Central Difference & Basic Derivative Rules

f'(x) ≈ (f(x+h) - f(x-h)) / (2h); f''(x) ≈ (f(x+h) - 2*f(x) + f(x-h)) / h^2
  • f(x) = function being differentiated
  • x = evaluation point
  • h = small step size (1e-5 default)
  • Central difference has error O(h^2), more accurate than forward/backward difference O(h)
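The two formulas translate directly to code; a minimal sketch:

```python
import math

def d1(f, x, h=1e-5):
    # Central first difference, truncation error O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

def d2(f, x, h=1e-4):
    # Central second difference; a slightly larger h is used because
    # dividing by h^2 amplifies round-off for very small steps
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

print(d1(math.sin, 0.0))  # analytically cos(0) = 1
print(d2(math.exp, 0.0))  # analytically e^0 = 1
```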

Basic rules: d/dx (x^n) = n*x^(n-1); d/dx sin(x) = cos(x); d/dx cos(x) = -sin(x); d/dx e^x = e^x; d/dx ln(x) = 1/x. Chain rule: d/dx f(g(x)) = f'(g(x))*g'(x).
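Each entry in the rule table can be spot-checked against the numerical derivative; a quick verification sketch:

```python
import math

h = 1e-5
d = lambda f, x: (f(x + h) - f(x - h)) / (2 * h)  # central difference

# Spot-check the rule table numerically at x = 1.5
checks = [
    (lambda x: x ** 4, lambda x: 4 * x ** 3),  # power rule: n*x^(n-1)
    (math.sin, math.cos),                      # d/dx sin(x) = cos(x)
    (math.exp, math.exp),                      # d/dx e^x = e^x
    (math.log, lambda x: 1 / x),               # d/dx ln(x) = 1/x
]
for f, fprime in checks:
    assert abs(d(f, 1.5) - fprime(1.5)) < 1e-6
print("all rules verified")
```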

Worked example: Calculate f'(2) for f(x) = x^3 - 2x + 1

Given:
  • f(x) = x^3 - 2x + 1
  • x = 2
  • h = 1e-5
Steps:
  1. Analytical: f'(x) = 3x^2 - 2; f'(2) = 3*4 - 2 = 10
  2. Numerical: f(2.00001) = (2.00001)^3 - 2*(2.00001) + 1 = 5.00010000...
  3. f(1.99999) = (1.99999)^3 - 2*(1.99999) + 1 = 4.99990000...
  4. f'(2) ≈ (5.00010000 - 4.99990000) / (2 * 1e-5) = 0.0002 / 0.00002 = 10.000
  5. The numerical result matches the analytical value of 10.

Result: f'(2) = 10. The tangent line at (2, 5) has a slope of 10, with equation: y = 10x - 15.
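The worked example above reduces to a few lines of Python:

```python
def f(x):
    return x ** 3 - 2 * x + 1

h = 1e-5
slope = (f(2 + h) - f(2 - h)) / (2 * h)      # central difference at x = 2
intercept = f(2) - slope * 2                  # tangent: y = slope*x + intercept
print(round(slope, 3), round(intercept, 3))   # 10.0 -15.0
```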

Frequently asked questions

What is the difference between central, forward, and backward difference?
Forward: (f(x+h)-f(x))/h, error O(h). Backward: (f(x)-f(x-h))/h, error O(h). Central: (f(x+h)-f(x-h))/(2h), error O(h^2), the most accurate of the three. Because central difference needs points on both sides of x, it cannot be used at a domain endpoint: forward difference is used at a left endpoint (where x-h is undefined) and backward difference at a right endpoint.
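The accuracy gap is easy to demonstrate by comparing all three schemes at the same step size:

```python
import math

x, h = 1.0, 1e-3
f, exact = math.exp, math.exp(1.0)  # d/dx e^x = e^x

forward  = (f(x + h) - f(x)) / h            # error O(h)
backward = (f(x) - f(x - h)) / h            # error O(h)
central  = (f(x + h) - f(x - h)) / (2 * h)  # error O(h^2)

# At h = 1e-3, the central error is several orders of magnitude smaller
print(f"forward  {abs(forward - exact):.1e}")
print(f"backward {abs(backward - exact):.1e}")
print(f"central  {abs(central - exact):.1e}")
```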
When are numerical derivatives less accurate?
Numerical derivatives are problematic when: 1) the function is not smooth or has discontinuities near x; 2) the function oscillates rapidly (e.g. sin(1/x) near 0); 3) h is too large (truncation error dominates) or too small (floating-point round-off dominates). For exact derivatives of symbolic expressions, use a CAS such as SymPy or Wolfram Alpha.
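The trade-off between truncation error (h too large) and round-off error (h too small) can be seen by sweeping h:

```python
import math

# f = sin, f'(1) = cos(1); compare three step sizes
exact = math.cos(1.0)
for h in (1e-2, 1e-5, 1e-12):
    approx = (math.sin(1.0 + h) - math.sin(1.0 - h)) / (2 * h)
    print(f"h = {h:.0e}  |error| = {abs(approx - exact):.1e}")
# The mid-range h = 1e-5 wins: h = 1e-2 suffers truncation error,
# while h = 1e-12 typically suffers floating-point round-off.
```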
What are real-life applications of derivatives?
Derivatives measure rates of change: velocity = derivative of position with respect to time; acceleration = derivative of velocity. In economics: marginal cost = derivative of total cost; marginal revenue = derivative of revenue. In engineering/optimization: critical points (f'=0) for minima/maxima of cost, profit, or efficiency. In medicine: tumor growth rate, drug elimination rate (pharmacokinetics).
What are partial derivatives for multivariable functions?
The partial derivative ∂f/∂x measures the change in f with respect to x while holding the other variables constant. For example, for f(x,y) = x^2 + 3xy: ∂f/∂x = 2x + 3y and ∂f/∂y = 3x. This calculator handles single-variable functions only; for multivariable functions, use Wolfram Alpha, SymPy, or Mathematica. The gradient collects all partial derivatives into a vector; divergence and curl extend differentiation to vector fields.
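The same central-difference idea extends to partial derivatives by perturbing one variable at a time; a sketch using the f(x,y) example from the answer above:

```python
def f(x, y):
    return x ** 2 + 3 * x * y  # the example from the text

h = 1e-5

def df_dx(x, y):
    # Perturb x, hold y constant
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def df_dy(x, y):
    # Perturb y, hold x constant
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

print(round(df_dx(1.0, 2.0), 6))  # 2x + 3y = 8.0
print(round(df_dy(1.0, 2.0), 6))  # 3x = 3.0
```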
What is the chain rule and when is it used?
The chain rule is for derivatives of composite functions: if y = f(g(x)), then dy/dx = f'(g(x)) * g'(x). For example, y = sin(x^2); dy/dx = cos(x^2) * 2x = 2x*cos(x^2). The chain rule is critical in neural-network backpropagation: the gradient of the loss with respect to parameters is computed using layered chain rule from output back to input.
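The sin(x^2) example can be confirmed by comparing the chain-rule result against a central difference:

```python
import math

def y(x):
    return math.sin(x ** 2)

def dy_dx(x):
    # Chain rule: outer derivative cos(x^2) times inner derivative 2x
    return 2 * x * math.cos(x ** 2)

x, h = 0.7, 1e-5
numeric = (y(x + h) - y(x - h)) / (2 * h)
print(abs(numeric - dy_dx(0.7)))  # agreement to ~1e-10
```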

Last updated: May 11, 2026