# Definition
In differential geometry, we define tangent vectors more abstractly.
Consider a [[Curve]] passing through a point $P \in M$ for some [[manifold]] $M$, described by the equations $x^i = x^i(\lambda)$, where $\lambda$ parametrizes the [[Curve]]. Moreover, consider a [[Differentiable Functions on a Manifold|differentiable function on]] $M$, $f(x^i)$. At each point of the [[Curve]], $f$ has a value. Therefore, along the [[Curve]], there is a [[differentiable function]] $g(\lambda)$ which gives the value of $f$ at the point whose parameter value is $\lambda$:
$
g(\lambda) = f(x^i(\lambda))
$
Differentiating and using the chain rule gives:
$
\frac{dg}{d\lambda} = \sum_i \frac{dx^i}{d\lambda} \frac{\partial f}{\partial x^i} = \frac{dx^i}{d\lambda} \partial_i f
$
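The chain-rule identity above can be checked symbolically. The following is a minimal sketch using `sympy`, with a hypothetical curve $x^i(\lambda) = (\cos\lambda, \sin\lambda)$ and function $f(x, y) = xy$ chosen purely for illustration:

```python
import sympy as sp

lam, x, y = sp.symbols("lambda x y")

f = x * y                                   # a differentiable function on M (example choice)
curve = {x: sp.cos(lam), y: sp.sin(lam)}    # a curve x^i(lambda) (example choice)

# Left-hand side: differentiate g(lambda) = f(x^i(lambda)) directly.
g = f.subs(curve)
lhs = sp.diff(g, lam)

# Right-hand side: chain rule, sum_i (dx^i/dlambda)(df/dx^i) evaluated on the curve.
rhs = sum(sp.diff(curve[v], lam) * sp.diff(f, v).subs(curve) for v in (x, y))

assert sp.simplify(lhs - rhs) == 0          # the two expressions agree identically
```

Any other differentiable $f$ and curve could be substituted; the identity holds for all of them, which is what licenses writing $d/d\lambda$ as an operator on its own.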
Since this is true for any function $f$, we can write:
$
\frac{d}{d\lambda} = \sum_i \frac{dx^i}{d\lambda} \frac{\partial}{\partial x^i}
$
In the ordinary view of [[Euclidean Space|Euclidean vectors]], one can say that the set of numbers $\{\frac{dx^i}{d\lambda}\}$ are components of the vector tangent to $x^i(\lambda)$. The $dx^i$ are displacements and the $d\lambda$ simply change the scale. Since each [[Curve]] has a unique parameter, there is a unique set $\{\frac{dx^i}{d\lambda}\}$ which are said to be the components of *the* tangent to the [[Curve]]. Thus, by our definition of a parameterized differential [[Curve]], each [[Curve]] has a *unique* tangent vector.
Every vector is tangent to an infinite number of curves through $P$ as shown in the figure below.
![[Pasted image 20210123200206.png]]
Since a [[Manifold]] carries no intrinsic notion of displacement, we need to define vectors more abstractly, without such a notion (i.e. not regard $dx^i$ as a displacement). In our definition, we should rely only on infinitesimal [[Neighborhood|neighborhoods]] of points of $M$.
Suppose $a, b \in \mathbb{R}$ and $x^i = x^i(\mu)$ is another [[Curve]] through $P$. Then at $P$ we have:
$
\frac{d}{d\mu} = \sum_i \frac{dx^i}{d\mu} \frac{\partial}{\partial x^i}
$
and
$
a \frac{d}{d\lambda} + b \frac{d}{d\mu} = \sum_i \left( a \frac{dx^i}{d\lambda} + b \frac{dx^i}{d\mu}\right) \frac{\partial}{\partial x^i}
$
Now, the numbers $\left( a \frac{dx^i}{d\lambda} + b \frac{dx^i}{d\mu}\right)$ are the components of a new vector which is tangent to *some* [[Curve]] at the point $P$. Thus, there must exist a [[Curve]] with some parameter $\phi$ for which:
$
\frac{d}{d\phi}= \sum_i \left( a \frac{dx^i}{d\lambda} + b \frac{dx^i}{d\mu}\right) \frac{\partial}{\partial x^i}
$
and we get:
$
\frac{d}{d\phi} = a \frac{d}{d\lambda} + b \frac{d}{d\mu}
$
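The closure argument can be verified in coordinates: applying $a\,d/d\lambda + b\,d/d\mu$ to a function agrees with contracting the combined components with $\partial f/\partial x^i$. A sketch with two hypothetical curves through $P = (0,0)$, chosen for illustration only:

```python
import sympy as sp

lam, mu, a, b, x, y = sp.symbols("lambda mu a b x y")
f = x**2 + y                   # a test function on M (example choice)

# Two curves through P = (0, 0), reached at parameter value 0 (example choices).
c1 = {x: lam, y: lam**2}       # parametrized by lambda
c2 = {x: sp.sin(mu), y: mu}    # parametrized by mu

# Components dx^i/dlambda and dx^i/dmu of each tangent vector at P.
v1 = [sp.diff(c1[v], lam).subs(lam, 0) for v in (x, y)]
v2 = [sp.diff(c2[v], mu).subs(mu, 0) for v in (x, y)]

# Left-hand side: (a d/dlambda + b d/dmu) applied to f at P.
lhs = (a * sp.diff(f.subs(c1), lam).subs(lam, 0)
       + b * sp.diff(f.subs(c2), mu).subs(mu, 0))

# Right-hand side: combined components contracted with df/dx^i at P.
grad = [sp.diff(f, v).subs({x: 0, y: 0}) for v in (x, y)]
rhs = sum((a * v1[i] + b * v2[i]) * grad[i] for i in range(2))

assert sp.simplify(lhs - rhs) == 0
```

The numbers $a\,v_1^i + b\,v_2^i$ are exactly the components of the tangent to the curve parametrized by $\phi$ in the text.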
We can conclude that *the directional derivatives along curves form a [[vector space]] at $P$*. Closure under linear combinations was shown above, and the remaining vector-space axioms are straightforward to verify.
> **Tangent Vector**
> The tangent vector to a [[curve]] parametrized by $\lambda$ on a [[manifold]] $M$ is:
> $
> \frac{d}{d\lambda} = \sum_i \frac{dx^i}{d\lambda} \frac{\partial}{\partial x^i}
> $
> From the equation above it follows that $\{\partial/\partial x^i\}$ defines a [[basis]] for this [[vector space]], and the [[Components|components]] of $d/d\lambda$ in this [[basis]] are $\{dx^i/d\lambda\}$.
This view of vectors has three advantages:
1. It does not involve displacements over finite separations.
2. It makes no essential reference to coordinates: $d/d\lambda$ exists as an operator independently of any coordinate system, so you do not need to specify how a vector transforms under a change of coordinates.
3. The derivative is a kind of motion along the [[curve]], which is conceptually what a tangent vector generates. This couples analysis and geometry with very powerful consequences.
Note that now, only vectors at the same point $P$ can be added together. Vectors at two different points have no relation with one another.
# Exponentiation of the Operator $d/d\lambda$
^b5cea1
Suppose we have a $C^\omega$ [[Differentiable Manifold|analytic manifold]]. Then the coordinate values $x^i(\lambda)$ of points along the [[Integral Curves]] of $\pmb{Y} = d/d\lambda$ are [[analytic functions]] of $\lambda$ by definition. The coordinates of two points with parameter values $\lambda = \lambda_0$ and $\lambda = \lambda_0 + \epsilon$ are related by:
$
\begin{align}
x^i(\lambda_0 + \epsilon) &= x^i(\lambda_0) + \epsilon \frac{dx^i}{d\lambda}\Bigg\vert_{\lambda_0} + \frac{\epsilon^2}{2!} \frac{d^2x^i}{d\lambda^2}\Bigg\vert_{\lambda_0} + \ldots \\
&= \left(1 + \epsilon \frac{d}{d\lambda} + \frac{\epsilon^2}{2!} \frac{d^2}{d\lambda^2} + \ldots\right)x^i \Bigg\vert_{\lambda_0} \\
&= \exp\left(\epsilon \frac{d}{d\lambda}\right)x^i\Bigg\vert_{\lambda_0} \equiv \exp(\epsilon \pmb{Y})\, x^i\Bigg\vert_{\lambda_0}
\end{align}
$
Since $\epsilon d/d\lambda$ is an infinitesimal motion along the [[curve]], the exponentiation gives a finite motion along the [[curve]].
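This can be made concrete with a coordinate function whose Taylor series terminates, so the operator exponential is exact after finitely many terms. A sketch, assuming the illustrative choice $x(\lambda) = \lambda^3$ along an integral curve of $\pmb{Y} = d/d\lambda$:

```python
import sympy as sp

lam, eps = sp.symbols("lambda epsilon")
x = lam**3                     # coordinate along the curve (example choice; polynomial, so the series terminates)

# Operator exponential: sum_n (eps^n / n!) d^n/dlambda^n applied to x.
# Six terms suffice here because all higher derivatives of lambda**3 vanish.
moved = sum(eps**n / sp.factorial(n) * sp.diff(x, lam, n) for n in range(6))

# The result is x evaluated at lambda + eps: a finite motion along the curve.
assert sp.expand(moved - x.subs(lam, lam + eps)) == 0
```

For a general analytic coordinate function the series is infinite, and exponentiation converges to $x^i(\lambda_0 + \epsilon)$ within the radius of convergence.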
# Tensor Algebra
In the language of tensors, a vector is a $(1,0)$ [[Tensor]].