
Time-scale calculus

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Aurumvorax (talk | contribs) at 03:47, 24 July 2003.

Overview

Time-scale calculus is a unification of the theory of difference equations with standard calculus. It was introduced in 1988 by the German mathematician Stefan Hilger, and it has applications in any field that requires simultaneous modelling of discrete and continuous data.

Basic Theory

Define a time scale (or measure chain) T to be a nonempty closed subset of the real line R.

Define

sigma(t) = inf{s in T : s > t}   (forward jump operator)
rho(t) = sup{s in T : s < t}     (backward jump operator)

Let t be an element of T. Then t is

    left dense if rho(t) = t,
    right dense if sigma(t) = t,
    left scattered if rho(t) < t,
    right scattered if sigma(t) > t,
    dense if both left dense and right dense.

Define the graininess mu of a time scale T by mu(t) = sigma(t) - t.
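These definitions can be made concrete with a short sketch. The following Python helpers (hypothetical names, not a standard library) compute the jump operators and graininess for a finite time scale stored as a list of reals, using the usual convention that sigma of the maximum point (and rho of the minimum point) is the point itself.

```python
# Minimal sketch of the jump operators and graininess on a finite time scale.
# The function names (sigma, rho, mu) mirror the notation in the text.

def sigma(T, t):
    """Forward jump operator: inf{s in T : s > t}.
    By convention, returns t itself when t is the maximum of T."""
    later = [s for s in T if s > t]
    return min(later) if later else t

def rho(T, t):
    """Backward jump operator: sup{s in T : s < t}.
    By convention, returns t itself when t is the minimum of T."""
    earlier = [s for s in T if s < t]
    return max(earlier) if earlier else t

def mu(T, t):
    """Graininess: mu(t) = sigma(t) - t."""
    return sigma(T, t) - t

T = [0, 1, 2, 5]
print(sigma(T, 1))  # -> 2  (1 is right scattered)
print(rho(T, 5))    # -> 2  (5 is left scattered)
print(mu(T, 2))     # -> 3
```

On this example every point of T is scattered; for T = R every point would be dense and mu would be identically zero.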

Take a function f : T -> R. (The codomain could be any Banach space, but take it to be the real line for simplicity.)

Definition: the generalised derivative (or delta derivative) f^delta(t) is the number, when it exists, with the property that for every epsilon > 0 there exists a neighbourhood U of t such that

|f(sigma(t)) - f(s) - f^delta(t)(sigma(t) - s)| <= epsilon|sigma(t) - s|, for all s in U.
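At a right-scattered point the definition pins the derivative down exactly. Taking s = t in the inequality gives, for every epsilon > 0,

```latex
\left| f(\sigma(t)) - f(t) - f^{\Delta}(t)\,(\sigma(t) - t) \right|
  \le \varepsilon\,|\sigma(t) - t|.
% Since t is right scattered, sigma(t) - t = mu(t) > 0; divide through by mu(t)
% and let epsilon -> 0 to obtain the difference-quotient formula:
f^{\Delta}(t) = \frac{f(\sigma(t)) - f(t)}{\mu(t)}
```

So at scattered points the delta derivative is a difference quotient, while at right-dense points it reduces to the usual limit.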

Take T = R. Then sigma(t) = t, mu(t) = 0, and f^delta = f' is the ordinary derivative of standard calculus. Take T = Z (the integers). Then sigma(t) = t + 1, mu(t) = 1, and f^delta(t) = f(t + 1) - f(t) is the forward difference operator used in difference equations.
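The two special cases can be checked side by side. This sketch (illustrative names, not a standard API) takes f(t) = t^2 and computes the delta derivative on T = Z as a forward difference, and on T = R as an ordinary numerical derivative.

```python
# The two classical special cases of the delta derivative, for f(t) = t^2.

def f(t):
    return t * t

def delta_derivative_Z(t):
    # T = Z: sigma(t) = t + 1 and mu(t) = 1, so f^delta(t) = f(t + 1) - f(t).
    # For f(t) = t^2 this is (t + 1)^2 - t^2 = 2t + 1.
    return f(t + 1) - f(t)

def derivative_R(t, h=1e-6):
    # T = R: mu(t) = 0 and f^delta reduces to the ordinary derivative f'(t),
    # approximated here by a forward difference quotient with small step h.
    return (f(t + h) - f(t)) / h

print(delta_derivative_Z(3))        # -> 7   (forward difference: 2*3 + 1)
print(round(derivative_R(3.0), 3))  # -> 6.0 (ordinary derivative: 2*3)
```

The discrete answer 2t + 1 differs from the continuous answer 2t by exactly mu(t) = 1 times the second-order term, which is the kind of discrepancy time-scale calculus keeps track of uniformly.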