Ordinary Differential Equations
2-1 Integrating Factor $\mu(t)$
A general first-order linear differential equation in standard form is
$$ \frac{dy}{dt} + p(t) y = g(t) $$
For convenience, we also write the equation in the form
$$ P(t)\frac{dy}{dt} + Q(t) y = G(t) $$
Then, to solve this ODE, we look for an integrating factor $\mu(t)$ s.t. multiplying through by $\mu(t)$ turns the equation into $(\mu(t) y)' = k(t)$ for some known function $k(t)$.
How to find the Integrating Factor
$$ P(t) \frac{dy}{dt} + Q(t)y = G(t) $$
$$ \mu (t) P(t) \frac{dy}{dt} + \mu (t) Q(t)y = \mu (t)G(t) $$
$$ \mu (t) \frac{dy}{dt} + \mu (t) \frac {Q(t)}{P(t)}y = \frac {\mu (t) G(t)}{P(t)} $$
$$ [\mu (t) y]' = \frac{\mu (t) G(t)}{P(t)} $$
The last step requires $\mu '(t) = \frac{Q(t)}{P(t)} \mu (t)$, so
$$ \mu (t) = e^{\int \frac{Q(t)}{P(t)}\, dt} $$
2-2 Separable Differential Equation
The general first-order differential equation is
$$ \frac{dy}{dx} = f(x ,y) $$
To identify this class of equations, we write it in the form
$$ M(x, y) + N(x ,y)\frac{dy}{dx} = 0 $$
When $M$ is a function of $x$ only and $N$ is a function of $y$ only, the equation becomes
$$ M(x)dx + N(y)dy = 0 $$
and we can integrate both sides: $\int M(x)\,dx + \int N(y)\,dy = c$.
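A minimal symbolic check of the method (a sketch assuming the sympy library; the equation $dy/dx = x/y$ is an illustrative choice, not from the notes):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# Separable equation (illustrative): dy/dx = x / y,
# i.e. y dy = x dx, which integrates to y^2/2 = x^2/2 + c.
ode = sp.Eq(y(x).diff(x), x / y(x))

# sympy separates the variables and integrates both sides
sols = sp.dsolve(ode, y(x), hint='separable')
sols = sols if isinstance(sols, list) else [sols]
for s in sols:
    # checkodesol returns (True, 0) when s satisfies the ODE
    print(s, sp.checkodesol(ode, s))
```

Both branches $y = \pm\sqrt{x^2 + C}$ come from solving the implicit relation $y^2 = x^2 + C$ for $y$.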
2-4 Differences Between Linear and Nonlinear DEs
THM-1 (Existence and Uniqueness for First-Order Linear ODE)
Statement
If $p(t)$ and $q(t)$ are continuous on an open interval $I$ containing $t_0$, then $\exists !\ y = \phi (t)$ satisfying
$$ y' + p(t)y = q(t) \quad (1) $$
for all $t \in I$, together with the initial condition
$$ y(t_0) = y_0. \quad (2) $$
Proof
Since $p(t)$ is continuous on $I$, we set $\mu(t) = e^{\int_{t_0}^{t}p(s)\,ds} \neq 0 ,\ t \in I$ (so $\mu '(t) = \mu(t) p(t)$).
We have $\mu \in C^1$ ($\mu$ is differentiable).
Multiplying (1) by $\mu(t)$, we obtain $[\mu(t)y]' = \mu(t)q(t)$.
Since $q(t)$ is continuous on $I$, we can integrate from $t_0$ to $t$:
$$ \mu(t) y(t) = \int _{t_0}^{t} \mu(s)q(s)\, ds + C $$
$$ y(t) = \mu^{-1}(t)\left[ \int_{t_0}^{t} \mu(s)q(s)\, ds + C \right] $$
$y = \phi(t)$ exists, is differentiable, and satisfies (1). Also, from the initial condition (2) and $\mu(t_0) = 1$, we determine uniquely $C = y_0$. Hence, there is only one solution of (1)-(2).
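The explicit solution formula can be verified on a concrete case (a sympy sketch; the choices $p(t) = 2$, $q(t) = e^{-t}$, $t_0 = 0$ are illustrative, not from the notes):

```python
import sympy as sp

t, s, y0 = sp.symbols('t s y0')

# Illustrative choices (not from the notes): p(t) = 2, q(t) = exp(-t), t0 = 0
p = sp.Integer(2)
q = sp.exp(-t)

# Integrating factor mu(t) = exp(∫_0^t p(s) ds), so mu(0) = 1
mu = sp.exp(sp.integrate(p.subs(t, s), (s, 0, t)))

# Solution formula from the proof: y(t) = mu(t)^(-1) [ ∫_0^t mu(s) q(s) ds + C ], C = y0
y = (sp.integrate((mu * q).subs(t, s), (s, 0, t)) + y0) / mu

# Verify that y' + p y = q and that y(0) = y0
print(sp.simplify(y.diff(t) + p * y - q))  # 0
print(y.subs(t, 0))                        # y0
```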
THM-2 (First-Order Nonlinear ODE)
Statement
Let $f(t ,y)$ and $\frac{\partial f}{\partial y}$ be continuous in some rectangle
$$ R = \{ (t, y)\ | \ \alpha < t < \beta ,\ \gamma < y < \delta \} $$
containing $(t_0 ,y_0)$. Then, in some interval $(t_0 -h , t_0 + h) \subset (\alpha ,\beta)$ for some $h > 0$,
$$ \exists !\ y = \phi(t) \ \text{s.t.} \ y'(t) = f(t ,y) \ \text{and} \ y(t_0) = y_0. $$
Remark
- Thm-2 implies Thm-1 (the linear case): in Thm-1, $f(t ,y) = q(t) - p(t)y$ is continuous and $\frac{\partial f}{\partial y}(t ,y) = -p(t)$ is continuous, so there is only one solution in some interval $(t_0 - h ,t_0 + h) \subset I$.
- Uniqueness: we can be sure that the graphs of two distinct solutions cannot intersect.
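As a standard illustration (an added example, not from the original notes) of why continuity of $\frac{\partial f}{\partial y}$ matters, consider
$$ y' = y^{1/3} ,\quad y(0) = 0. $$
Here $f(t ,y) = y^{1/3}$ is continuous, but $\frac{\partial f}{\partial y} = \frac{1}{3}y^{-2/3}$ is discontinuous at $y = 0$, and uniqueness fails: both $\phi_1(t) = 0$ and $\phi_2(t) = \left(\frac{2t}{3}\right)^{3/2}$ (for $t \geq 0$) satisfy the IVP.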
2-5 Autonomous DEs, Population Dynamics
Autonomous
An autonomous equation has the form
$$ \frac{dy}{dt} = f(y) $$
Ex1 (Exponential Growth)
$\phi (t)$ is the population of the given species at time $t$. Consider
$$ \frac{dy}{dt} = ry $$
- $r > 0$: the population increases
- $r < 0$: the population declines

Assume $r > 0$ and $y(0) = y_0$; then $\phi(t) = y_0 e^{rt}$.
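A quick numerical sanity check of $\phi(t) = y_0 e^{rt}$ (a Python sketch; the values $r = 0.5$, $y_0 = 10$, $T = 2$ are illustrative, not from the notes): Euler iterates of $dy/dt = ry$ converge to the exact solution as the step size shrinks.

```python
import math

# Illustrative parameters (not from the notes)
r, y0, T = 0.5, 10.0, 2.0

def euler(n):
    """Approximate y(T) for dy/dt = r*y, y(0) = y0, using n Euler steps."""
    h, y = T / n, y0
    for _ in range(n):
        y += h * r * y  # y_{k+1} = y_k + h * f(y_k)
    return y

exact = y0 * math.exp(r * T)  # phi(t) = y0 * e^{rt} evaluated at t = T
for n in (10, 100, 1000):
    print(n, abs(euler(n) - exact))  # the error shrinks as n grows
```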
Ex2 (Logistic DEs)
Replace the constant rate $r$ by a function of the population:
$$ r = h(y) $$
Consider $h(y)$ s.t. $h(y) \approx r > 0$ when $y$ is small, and:
- $h(y)$ decreases as $y$ grows larger
- $h(y) < 0$ when $y$ is sufficiently large
Assume that $h(y) = r-ay ,\quad a > 0$
Consider the logistic equation:
$$ \frac{dy}{dt} = (r-ay)y $$
$$ \frac{dy}{dt} = r\left(1- \frac{y}{k}\right)y , \quad \text{where } k =\frac{r}{a} $$
- Find the constant (equilibrium) solutions: $y = 0$ and $y = k$.
- Study the behavior of solutions between the equilibria: by the Existence and Uniqueness Theorem, no solution can intersect the equilibrium solution $y = k$.
- Find $\frac{d^2y}{dt^2}$ to determine where solutions are concave up or concave down.
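The qualitative picture can be checked numerically (a Python sketch; $r = 1$, $k = 100$, and the initial values are illustrative, not from the notes): solutions starting in $(0, k)$ increase toward $k$, solutions above $k$ decrease toward $k$, and the equilibria stay fixed.

```python
# Logistic equation dy/dt = r*(1 - y/k)*y with illustrative parameters
r, k = 1.0, 100.0

def simulate(y0, t_end=20.0, n=20000):
    """Euler integration of the logistic equation from y(0) = y0."""
    h, y = t_end / n, y0
    for _ in range(n):
        y += h * r * (1 - y / k) * y
    return y

print(simulate(1.0))    # approaches the equilibrium k = 100 from below
print(simulate(150.0))  # decreases toward k = 100 from above
print(simulate(0.0))    # stays at the equilibrium y = 0
```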
2-6 Exact Differential Equations
THM
Let $M$, $N$, $M_y$, $N_x$ be continuous in $R = \{ (x ,y) \mid \alpha < x < \beta ,\ \gamma < y < \delta \}$. Then the equation
$$ M(x ,y) + N(x ,y)\frac{dy}{dx} = 0 \quad (*) $$
is exact in $R$ $\Longleftrightarrow \ M_y(x ,y) = N_x(x ,y)$ at each point in $R$.

Proof
($\Rightarrow$) Suppose (*) is an exact differential equation, so $\exists \ \psi$ s.t. $\frac{\partial \psi}{\partial x} = M(x ,y) \ ,\ \frac{\partial \psi}{\partial y} = N(x ,y)$. Then $M_y(x ,y) = \frac{\partial^2 \psi}{\partial y \partial x}$ and $N_x(x ,y) = \frac{\partial^2 \psi}{\partial x \partial y}$. Since $M_y$ and $N_x$ are continuous, $\psi_{xy}$, $\psi_{yx}$ are continuous. Hence $\frac{\partial^2 \psi}{\partial y \partial x} = \frac{\partial^2 \psi}{\partial x \partial y}$, i.e. $M_y = N_x$.

($\Leftarrow$) Find a $\psi$ s.t. $\psi_x = M\ ,\ \psi_y = N$. Integrating $\psi_x = M$ w.r.t. $x$:
$$ \psi(x ,y) = \int_{x_0}^{x} M(s ,y)\ ds + h(y) = Q(x ,y) + h(y) $$
where $Q(x ,y) = \int_{x_0}^{x} M(s ,y)\ ds$ and $x_0$ is some specified constant with $\alpha < x_0 < \beta$.
Differentiating $Q(x ,y) + h(y)$ w.r.t. $y$:
$$ \psi_y(x ,y) = \frac{\partial Q}{\partial y}(x ,y) + h'(y) = N(x ,y) $$
so $h'(y) = N(x ,y) - \frac{\partial Q}{\partial y}(x ,y)$.
To show that $N(x ,y) - \frac{\partial Q}{\partial y}(x ,y)$ depends only on $y$, differentiate w.r.t. $x$:
$$ \frac{\partial}{\partial x}\left[ N - \frac{\partial Q}{\partial y} \right] = N_x - \frac{\partial^2 Q}{\partial x \partial y} = N_x - M_y = 0. $$

2-9 First-Order Difference Equations
Definition
$$
y_{n+1} = f(n ,y_n) ,\quad n = 0 ,1 ,2 , \dots
$$
This is called a first-order difference equation. If $f$ is a linear function of $y_n$, the difference equation is linear; otherwise, it is nonlinear. A solution of a difference equation is a sequence of numbers $y_0 ,y_1 ,y_2 , \dots$ that satisfies the equation for each $n$.

General case
$$
y_{n+1} = f(y_n), \quad n = 0 ,1, 2 , \dots
$$
Then $y_1 = f(y_0) ,\quad y_2 = f(y_1) = f(f(y_0)) = f^2(y_0)$. In general, the $n^{th}$ iterate $y_n$ is $y_n = f^n(y_0)$.

Equilibrium solutions: solutions for which $y_n$ has the same value for all $n$.

Linear Equations
Case-1
$$
y_{n+1} = \rho_n y_n ,\quad n = 0 ,1 ,2 , \dots
$$
In general, $y_n = \rho_{n-1} \cdots \rho_0 y_0 ,\quad n = 1 ,2 , \dots$. If $\rho_n = \rho \ \ \forall n$, the equation becomes $y_n = \rho^n y_0$.

Case-2
$$
y_{n+1} = \rho y_n + b_n ,\quad n = 0 ,1 ,2 , \dots
$$
$$
y_1 = \rho y_0 + b_0
$$
$$
y_2 = \rho(\rho y_0 + b_0) + b_1 = \rho^2 y_0 + \rho b_0 + b_1
$$
$$
y_n = \rho^n y_0 + \sum_{j = 0}^{n-1} \rho^{n-1-j} b_j
$$
In the special case where $b_n = b \neq 0$ for all $n$, the difference equation is $y_{n+1} = \rho y_n + b$, and
$$
y_n = \rho^n y_0 + (1 + \rho + \cdots + \rho^{n-1})\, b.
$$
If $\rho \neq 1$, we can write the solution in the more compact form
$$
y_n = \rho^n y_0 + \frac{1 - \rho^n}{1 - \rho}\, b.
$$

3-1 Homogeneous DEs with constant coefficients
Definition
Many second-order ordinary differential equations have the form
$$
\frac{d^2y}{dt^2} = f\left(t ,y ,\frac{dy}{dt}\right).
$$
It is linear if the function $f$ has the form
$$
f\left(t ,y ,\frac{dy}{dt}\right) = g(t) - p(t)\frac{dy}{dt} - q(t)y.
$$
In general, if $g(t) = 0$, the equation is homogeneous. We consider the constant-coefficient homogeneous equation
$$
ay'' + by' + cy = 0.
$$
We start by seeking exponential solutions of the form $y = e^{rt}$, where $r$ is a parameter to be determined. Then it follows that $y' = re^{rt}$, $y'' = r^2 e^{rt}$. Substituting these expressions,
$$
(ar^2 + br + c)e^{rt} = 0.
$$
Since $e^{rt} \neq 0$,
$$
ar^2 + br + c = 0.
$$
This equation is called the characteristic equation of the differential equation. Assuming that its roots are real and different, let them be denoted by $r_1$ and $r_2$, where $r_1 \neq r_2$. Then $y_1(t) = e^{r_1 t}$ and $y_2(t) = e^{r_2 t}$ are solutions, and so is
$$
y = c_1 e^{r_1 t} + c_2 e^{r_2 t}.
$$
Initial Value Problem (IVP)
Consider the initial values $y(t_0) = y_0 ,\ y'(t_0) = y_0'$. Substituting these into the general solution determines $c_1$ and $c_2$.

3-2 Solution of Linear Homogeneous Equations; the Wronskian
Differential Operator
Let $p$ and $q$ be continuous functions on an open interval $I$: $\alpha < t < \beta$. The cases $\alpha = -\infty$, or $\beta = \infty$, or both, are included. Then, for any function $\phi$ that is twice differentiable on $I$, we define the differential operator $L$ by the equation
$$
L[\phi] = \phi'' + p\phi' + q\phi.
$$
For example, if $p(t) = t^2 ,\ q(t) = 1+t ,\ \phi(t) = \sin 3t$, then
$$
L[\phi](t) = -9\sin 3t + 3t^2 \cos 3t + (1+t)\sin 3t.
$$
We usually write the homogeneous equation $L[y] = 0$ in the form
$$
y'' + p(t)y' + q(t)y = 0,
$$
with the initial values $y(t_0) = y_0 ,\quad y'(t_0) = y_0'$.

Theorem 3.2.1 Existence and Uniqueness Theorem
Consider the initial value problem
$$
y'' + p(t)y' + q(t)y = g(t) ,\quad y(t_0) = y_0 ,\quad y'(t_0) = y_0',
$$
where $p$, $q$, and $g$ are continuous on an open interval $I$ that contains the point $t_0$. This problem has exactly one solution $y = \phi (t)$, and the solution exists throughout the interval $I$.

Theorem 3.2.2 Principle of Superposition
If $y_1 ,\ y_2$ are two solutions of the differential equation $L[y] = y'' + p(t)y' + q(t)y = 0$, then the linear combination $c_1y_1 + c_2 y_2$ is also a solution for any values of the constants $c_1 ,\ c_2$.

proof
$$
\begin{aligned}
L[c_1y_1 + c_2 y_2] &= [c_1y_1 + c_2 y_2]'' + p[c_1y_1 + c_2 y_2]' + q[c_1y_1 + c_2 y_2] \\\\
&= c_1y_1'' + c_2y_2'' + c_1py_1' + c_2py_2' + c_1qy_1 + c_2q y_2 \\\\
&= c_1[y_1'' + py_1' + q y_1] + c_2[y_2'' + py_2' + q y_2] \\\\
&= c_1L[y_1] + c_2L[y_2] = 0
\end{aligned}
$$
since $L[y_1] = L[y_2] = 0$.

3-4 Repeated Roots; Reduction of Order
Solution of Repeated Roots
In this section, we solve the case $r_1 = r_2$. For $ay'' + by' + cy = 0$, the characteristic equation has a repeated root when $b^2 - 4ac = 0$; then
$$
r_1 = r_2 = -\frac{b}{2a},
$$
and $y_1(t) = e^{-bt/(2a)}$ is one solution. To find a second solution, we assume that
$$
y = v(t)\, y_1(t) = v(t)\, e^{-bt/(2a)}.
$$
Substituting into $ay'' + by' + cy = 0$, cancelling the factor $e^{-bt/(2a)}$, and rearranging the terms, we find that
$$
v''(t) = 0,
$$
then we have $v(t) = c_1 + c_2 t$. Thus $y$ is a linear combination of the two solutions
$$
y_1(t) = e^{-bt/(2a)} ,\quad y_2(t) = t\, e^{-bt/(2a)}.
$$
Since $W[y_1 ,y_2](t) = e^{-bt/a}$ is never zero, $y_1$ and $y_2$ form a fundamental set of solutions.

Reduction of Order
Suppose that we know one solution $y_1(t)$, not everywhere zero, of
$$
y'' + p(t)y' + q(t)y = 0. \quad (b)
$$
To find a second solution, let
$$
y = v(t)\, y_1(t).
$$
Substituting $y$, $y'$, and $y''$ into equation (b) gives
$$
y_1 v'' + (2y_1' + p y_1)v' + (y_1'' + p y_1' + q y_1)v = 0.
$$
Since $y_1$ is a solution of equation (b), the coefficient of $v$ vanishes, and this becomes a first-order differential equation for the function $v'$:
$$
y_1 v'' + (2y_1' + p y_1)v' = 0.
$$

3-6 Variation of Parameters
Method of Variation of Parameters
In this section, we introduce a general method; in principle at least, it can be applied to any equation, and it requires no detailed assumptions about the form of the solution. In general, we consider
$$
y'' + p(t)y' + q(t)y = g(t),
$$
where $p$, $q$, and $g$ are continuous functions. We assume that we know a fundamental set of solutions $y_1$, $y_2$ of the corresponding homogeneous equation, with general solution $c_1 y_1(t) + c_2 y_2(t)$. The idea is to replace the constants $c_1$, $c_2$ by functions $u_1(t)$, $u_2(t)$ resp.:
$$
y = u_1(t) y_1(t) + u_2(t) y_2(t).
$$
Assume that
$$
u_1'(t) y_1(t) + u_2'(t) y_2(t) = 0, \quad (a)
$$
so that $y' = u_1(t) y_1'(t) + u_2(t) y_2'(t)$. Now, we substitute $y$, $y'$, and $y''$ into the original equation. Since $y_1$ and $y_2$ are homogeneous solutions, the equation reduces to
$$
u_1'(t) y_1'(t) + u_2'(t) y_2'(t) = g(t). \quad (b)
$$
Equations (a) and (b) form a system of two linear algebraic equations for the derivatives $u_1'(t)$ and $u_2'(t)$ of the unknown functions. By Cramer's rule, we have
$$
u_1'(t) = -\frac{y_2(t)\, g(t)}{W[y_1 ,y_2](t)} ,\quad u_2'(t) = \frac{y_1(t)\, g(t)}{W[y_1 ,y_2](t)}.
$$
Integrate both of them and substitute into $y = u_1(t) y_1(t) + u_2(t) y_2(t)$ to obtain a particular solution.
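A concrete exercise of the method (a sympy sketch; the equation $y'' + y = \sec t$ with $y_1 = \cos t$, $y_2 = \sin t$ is an illustrative choice, not from the notes):

```python
import sympy as sp

t = sp.symbols('t')

# Illustrative equation (not from the notes): y'' + y = sec(t)
g = 1 / sp.cos(t)
y1, y2 = sp.cos(t), sp.sin(t)  # fundamental set for y'' + y = 0

# Wronskian W[y1, y2](t) = y1*y2' - y1'*y2
W = sp.simplify(y1 * y2.diff(t) - y1.diff(t) * y2)  # = 1 here

# Cramer's-rule formulas u1' = -y2*g/W and u2' = y1*g/W, then integrate
u1 = sp.integrate(-y2 * g / W, t)  # = log(cos(t))
u2 = sp.integrate(y1 * g / W, t)   # = t

yp = u1 * y1 + u2 * y2  # particular solution

# Verify that yp'' + yp - g simplifies to zero
print(sp.simplify(yp.diff(t, 2) + yp - g))  # 0
```

The resulting particular solution is $y_p = \cos t \,\ln(\cos t) + t \sin t$.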