
Thursday, November 28, 2013

 

\(\pi(x)\)

  \[ \begin{aligned} \pi(x) \thicksim \frac{x}{\ln x} \quad \text{as } x \rightarrow \infty \\ \end{aligned} \]
where \(\pi(x)\) is the number of primes less than or equal to \(x\). A nice video.
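As a quick numeric illustration (the sieve below and its function name are mine, not from the post), the ratio of \(\pi(x)\) to \(\frac{x}{\ln x}\) drifts toward 1:

```python
import math

def prime_count(x: int) -> int:
    """Count primes <= x with a simple Sieve of Eratosthenes."""
    if x < 2:
        return 0
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return sum(sieve)

for x in (10**3, 10**4, 10**5, 10**6):
    print(x, prime_count(x), prime_count(x) / (x / math.log(x)))
```

The convergence is quite slow, which is part of what makes the Prime Number Theorem interesting.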


 

Taylor Series in \(\mathbb{C}\)

Let \(f: U \rightarrow \mathbb{C}\) be analytic and let \(\{|z - z_0| < r\} \subset U\). Then in this disk, \(f\) has a power series representation:
  \[ \begin{aligned} f(z) = \sum_{k=0}^\infty a_k (z - z_0)^k, \quad |z - z_0| < r, \quad \text{where } a_k = \frac{f^{(k)}(z_0)}{k!} \\ \end{aligned} \]
The radius of convergence of this power series is \(R \ge r\). Note this means an analytic function is determined entirely in a disk by all its derivatives \(f^{(k)}(z_0)\) at the center \(z_0\) of the disk!

In other words, if \(f\) and \(g\) are analytic in \(\{|z - z_0| < r\}\) and if \(f^{(k)}(z_0) = g^{(k)}(z_0)\) for all \(k\), then \(f(z) = g(z)\) for all \(z\) in \(\{|z - z_0| < r\}\) !

More at Analysis of a Complex Kind.


 

\(\zeta(s)\)

For \(s \in \mathbb{C}\) with \(\textrm{Re } s > 1\), the zeta function is defined as
  \[ \begin{aligned} \zeta(s) = \sum_{n=1}^\infty \frac{1}{n^s} \end{aligned} \]
We can readily see \(\zeta(s)\) converges absolutely in \(\{\textrm{Re } s > 1\}\). What's interesting is that the convergence is uniform in \(\{\textrm{Re } s > r\}\) for any \(r > 1\), which can be used to show that \(\zeta(s)\) is analytic in \(\{\textrm{Re } s > 1\}\).

Riemann was able to show that \(\zeta(s)\) has an analytic continuation to \(\mathbb{C} \setminus \{1\}\), and this continuation satisfies \(\zeta(s) \rightarrow \infty\) as \(s \rightarrow 1\).

Some interesting facts:

Riemann Hypothesis

    In the strip \(\{0 \le \textrm{Re } s \le 1\}\), all zeros of \(\zeta\) are on the line \(\displaystyle \{ \textrm{Re } s = \frac{1}{2} \}\)

which has significant implications for the distribution of prime numbers and the growth of many important arithmetic functions, but remains unproven.

Amazingly,
  \[ \begin{aligned} \zeta(s) = \prod_p \frac{1}{1 - p^{-s}} \end{aligned} \]
where the product is over all primes ! Why ? Observe
  \[ \begin{aligned} \zeta(s) &= \frac{1}{1^s} + \frac{1}{2^s} + \frac{1}{3^s} + \cdots \\ &= \big(\frac{1}{1^s} + \frac{1}{2^s} + \frac{1}{4^s} + \frac{1}{8^s} + \cdots\big)\big(\frac{1}{1^s} + \frac{1}{3^s} + \frac{1}{9^s} + \cdots\big)\big(\frac{1}{1^s} + \frac{1}{5^s} + \frac{1}{25^s} + \cdots\big) \cdots \\ \end{aligned} \]
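Each factor is a geometric series \(\sum_k p^{-ks} = \frac{1}{1 - p^{-s}}\), and unique factorization makes every \(n^{-s}\) appear exactly once when the factors are multiplied out. Numerically, truncations of the sum and the product agree nicely; a small sketch at \(s = 2\), where \(\zeta(2) = \pi^2/6\) (helper names are mine):

```python
import math

def primes_up_to(n: int) -> list:
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, math.isqrt(n) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p in range(2, n + 1) if sieve[p]]

def zeta_partial(s: float, N: int) -> float:
    """Truncated Dirichlet series for zeta(s)."""
    return sum(1.0 / n**s for n in range(1, N + 1))

def euler_product(s: float, P: int) -> float:
    """Truncated Euler product over primes up to P."""
    prod = 1.0
    for p in primes_up_to(P):
        prod *= 1.0 / (1.0 - p ** (-s))
    return prod

exact = math.pi**2 / 6  # zeta(2)
print(zeta_partial(2, 100000), euler_product(2, 1000), exact)
```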

More at Analysis of a Complex Kind.


Tuesday, November 26, 2013

 

\(\sum_{k=0}^\infty z^k \)

The closed form of \(\displaystyle \sum_{k=0}^\infty z^k = \frac{1}{1 - z} \) for \(|z| < 1\) looks intuitively similar to that of the real counterpart \(\displaystyle \sum_{k=0}^\infty x^k = \frac{1}{1 - x} \) for \(|x| < 1\). However, for \(|z| = 1 \) and \( z \not = 1\), the terms \(z^k\) keep walking around the unit circle, so the series never converges !

Writing \(z = r \,e^{i \,\theta}\), on one hand, we can easily split the series into real and imaginary parts
  \[ \begin{aligned} \sum_{k=0}^\infty z^k = \sum_{k=0}^\infty r^k \cos(k\,\theta) + i \sum_{k=0}^\infty r^k \sin(k\,\theta) \end{aligned} \]
On the other hand,
  \[ \begin{aligned} \frac{1}{1-z} = \frac{1}{1 - r\,e^{i \,\theta}} = \cdots = \frac{1 - r \cos \theta + i r \sin \theta}{1 - 2r \cos \theta + r^2} \\ \end{aligned} \]
Check it! This necessarily means that when \(|z| < 1\), ie when \(r < 1\),
  \[ \begin{aligned} \sum_{k=0}^\infty r^k \cos(k \, \theta) = \frac{1 - r \cos \theta}{1 - 2r \cos \theta + r^2} \\ \sum_{k=0}^\infty r^k \sin(k \, \theta) = \frac{r \sin \theta}{1 - 2r \cos \theta + r^2} \\ \end{aligned} \]
!
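These two closed forms are easy to sanity-check numerically (helper names are mine):

```python
import math

def series_parts(r: float, theta: float, n_terms: int = 2000):
    """Partial sums of sum r^k cos(k theta) and sum r^k sin(k theta)."""
    re = sum(r**k * math.cos(k * theta) for k in range(n_terms))
    im = sum(r**k * math.sin(k * theta) for k in range(n_terms))
    return re, im

def closed_forms(r: float, theta: float):
    """The closed forms derived from 1/(1 - z) with z = r e^{i theta}."""
    d = 1 - 2 * r * math.cos(theta) + r * r
    return (1 - r * math.cos(theta)) / d, r * math.sin(theta) / d

print(series_parts(0.9, 1.2))
print(closed_forms(0.9, 1.2))
```

With \(r = 0.9\), two thousand terms are far more than enough for the partial sums to match the closed forms to machine precision.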

More at Analysis of a Complex Kind.


Saturday, November 23, 2013

 

Maximum Principle Theorem

Let \(f\) be analytic in a domain \(D\) and suppose there exists a point \(z_0 \in D\) such that \(\left|\,f(z)\,\right| \le \left|\,f(z_0)\,\right| \, \forall z \in D\). Then \(f\) is constant in \(D\) !

As a consequence, if \(D \subset \mathbb{C}\) is a bounded domain, and if \(f: \overline D \mapsto \mathbb{C}\) is continuous on \(\overline D\) and analytic in \(D\), then \(\left|\, f \,\right|\) reaches its maximum on \(\partial D\).

(If I understand the notation correctly, \(\overline D\) refers to the union of the open domain \(D\) and its boundary, whereas \(\partial D\) refers to only the boundary.)

More at Analysis of a Complex Kind.


 

Liouville's Theorem

If \(f\) is an entire function (ie analytic in the whole complex plane) and is bounded, then \(f\) must be a constant

! Not hard to prove using Cauchy's Estimate. For example, since \(\sin z\) is not constant, by Liouville's Theorem we know that \(\sin z\) must be unbounded in the complex plane.

A rather counter-intuitive example: if \(f = u + iv\) is an entire function with \(u(z) \le 0 \, \forall z \in \mathbb{C}\), then \(f\) must be constant ! The proof starts by considering the function \(g(z) = e^{f(z)} \), which is entire and satisfies \(\left|\, g \,\right| = e^u \le 1\), so \(g\) (and hence \(f\)) is constant by Liouville's Theorem.

Furthermore, Liouville's Theorem can be used to prove (via contradiction) the Fundamental Theorem of Algebra, which states that any polynomial
  \[ \begin{aligned} p(z) = a_0 + a_1z + \cdots + a_n z^n \end{aligned} \]
with \(a_0, \cdots, a_n \in \mathbb{C} \), \(n \ge 1\) and \(a_n \not = 0\) has a zero in \(\mathbb{C}\). In other words, there exists \(z_0 \in \mathbb{C}\) such that \(p(z_0) = 0\). As a consequence, polynomials can always be factored into linear factors in \(\mathbb{C}\) (but this is not the case in \(\mathbb{R}\)) ! So
  \[ \begin{aligned} p(z) = a_n(z - z_1)(z - z_2) \cdots (z - z_n) \end{aligned} \]
where \(z_1, \cdots , z_n \in \mathbb{C}\) are the zeros of \(p\). A typical example would be \(p(z) = z^2 + 1\).

More at Analysis of a Complex Kind.


 

Cauchy's Estimate

Suppose \(f\) is analytic in an open set that contains \(\overline{B_r(z_0)}\), and \(\left|\, f(z) \,\right| \le m\) holds on \(\partial B_r(z_0)\) for some constant m. Then for all \(k \ge 0\),
  \[ \begin{aligned} \left|\, f^{(k)}(z_0) \,\right| \le \frac{k!\,m}{r^k} \\ \end{aligned} \]
which turns out to be remarkably not hard to prove based on Cauchy's Integral Formula. If I understand the notation correctly, \(\overline{B_r(z_0)}\) is the closed disk of radius \(r\) centered at \(z_0\), whereas \(\partial B_r(z_0)\) is the boundary circle of that disk.
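A small numeric sketch with \(f(z) = \frac{1}{1-z}\), \(z_0 = 0\) and \(r = \frac{1}{2}\), where \(f^{(k)}(0) = k!\) and the maximum on the circle is \(m = 2\) (the code is my illustration, not from the course):

```python
import cmath, math

# f(z) = 1/(1 - z), z0 = 0, r = 1/2: here f^{(k)}(0) = k! exactly,
# and on |z| = 1/2 the maximum of |f| is m = 1/(1 - 1/2) = 2.
r = 0.5
m = max(abs(1 / (1 - r * cmath.exp(2j * math.pi * j / 1000))) for j in range(1000))
for k in range(6):
    actual = math.factorial(k)             # |f^{(k)}(0)|
    bound = math.factorial(k) * m / r**k   # Cauchy's estimate: k! m / r^k
    print(k, actual, bound)
```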

More at Analysis of a Complex Kind.


 

Cauchy's Integral Formula

Let \(D\) be a simply connected domain, bounded by a piecewise smooth curve \(\gamma\), and let \(\,f\) be analytic in a set \(U\) that contains the closure of \(D\) (ie \(D\) and \(\gamma\)). Then for all \(w \in D\),
  \[ \begin{aligned} f(w) = \frac{1}{2\pi i} \oint_\gamma \frac{f(z)}{z - w} \,dz \\ \end{aligned} \]
!

For example, what is the value of
  \[ \begin{aligned} \oint_{\left| z \right| = 2} \frac{z^2}{z - 1} \,dz \\ \end{aligned} \]
?

This is not an easy integral to evaluate directly, but with Cauchy's formula, the problem can be reduced to simple pattern matching ! Can you see why the value is \(2 \pi i\) ? Now, how about
  \[ \begin{aligned} \oint_{\left| z \right| = 1} \frac{z^2}{z - 2} \,dz \\ \end{aligned} \]
?

Don't get tricked, and hopefully you can see the value is clearly zero.

An amazing consequence of this formula is that as long as \(f\) is analytic in an open set \(U\), then \(f'\) is also analytic in \(U\) ! Indeed,
  \[ \begin{aligned} f'(w) = \frac{1}{2\pi i} \oint_\gamma \frac{f(z)}{(z - w)^2} \,dz \\ \end{aligned} \]
And in general,
  \[ \begin{aligned} f^{(k)}(w) = \frac{k!}{2\pi i} \oint_\gamma \frac{f(z)}{(z - w)^{k+1}} \,dz \\ \end{aligned} \]
!

Doesn't this feel like déjà vu, perhaps somehow related to the coefficient of the \(k\)th term of a Taylor series expansion ?

For example, what is the value of
  \[ \begin{aligned} \oint_{\left| z \right| = 2\pi} \frac{z^2\sin z}{(z - \pi)^3} \,dz \\ \end{aligned} \]
? It becomes easy to realize it's exactly \(-4\pi^2i\). The incredible Cauchy !
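Both examples can be confirmed by brute-force numerical integration over the parametrized circle (the quadrature helper is mine):

```python
import cmath, math

def contour_integral(f, center: complex, radius: float, n: int = 20000) -> complex:
    """Approximate the closed contour integral of f over |z - center| = radius
    using the trapezoid rule on z(t) = center + radius * e^{it}."""
    total = 0j
    dt = 2 * math.pi / n
    for j in range(n):
        z = center + radius * cmath.exp(1j * j * dt)
        dz = 1j * radius * cmath.exp(1j * j * dt) * dt
        total += f(z) * dz
    return total

# oint_{|z|=2} z^2/(z-1) dz  ->  2*pi*i * f(1) with f(z) = z^2
I1 = contour_integral(lambda z: z**2 / (z - 1), 0, 2)                # expect 2*pi*i
# oint_{|z|=2pi} z^2 sin z/(z-pi)^3 dz  ->  (2*pi*i/2!) * f''(pi)
I2 = contour_integral(lambda z: z**2 * cmath.sin(z) / (z - math.pi)**3,
                      0, 2 * math.pi)                                # expect -4*pi^2*i
print(I1, I2)
```

The trapezoid rule converges very fast here because the integrand is smooth and periodic on the contour.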

More at Analysis of a Complex Kind.


 

Cauchy's Theorem

Suppose \(D\) is a simply connected domain in \(\mathbb{C}\), \(\,f\) is analytic in \(D\), and \(\gamma: [a, b] \mapsto D\) is a piecewise smooth, closed curve in \(D\) (so that \(\gamma(b) = \gamma(a)\)), then
  \[ \begin{aligned} \oint_\gamma f(z)\,dz = 0 \\ \end{aligned} \]
!

Corollary

Suppose \(\gamma_1\) and \(\gamma_2\) are two simple closed curves (ie neither of them intersects itself), oriented clockwise, where \(\gamma_2\) is inside \(\gamma_1\). Suppose \(f\) is analytic in a domain \(D\) that contains both curves as well as the region between them, then
  \[ \begin{aligned} \oint_{\gamma_1} f(z)\,dz = \oint_{\gamma_2} f(z)\,dz \\ \end{aligned} \]
!

For example, suppose \(R\) is the rectangle with vertices \(−2−i, 2−i, 2+i, −2+i\), the integral
  \[ \begin{aligned} \oint_{\partial R} \frac{1}{z - z_0} \,dz \\ \end{aligned} \]
can have zero vs. non-zero value depending on \(z_0\). Can you see why, and how to compute the non-zero value ?

More at Analysis of a Complex Kind.


Friday, November 22, 2013

 

Primitive

A primitive of \(f\) on \(D\) is an analytic function \(F: D \mapsto \mathbb{C}\) such that \(F' = f\) on \(D\), where \(D \subset \mathbb{C}\).

If \(f\) is continuous on a domain \(D\) and if \(f\) has a primitive \(F\) in \(D\), then for any curve \(\gamma: [a,b] \mapsto D\),
  \[ \begin{aligned} \int_\gamma f(z)\,dz = F(\gamma(b)) - F(\gamma(a)) \\ \end{aligned} \]
Observe that the value depends only on the endpoints of \(\gamma\); in particular, the integral of \(f\) over any closed curve in \(D\) is zero.

When does \(f\) have a primitive ?

By Goursat's Theorem, if \(D\) is a simply connected domain in \(\mathbb{C}\), and \(f\) is analytic in \(D\), then \(f\) has a primitive in \(D\).

More at Analysis of a Complex Kind.


Thursday, November 21, 2013

 

\(ML\)-Estimate

Suppose \(\gamma\) is a curve and \(f\) is continuous on \(\gamma\), then
  \[ \begin{aligned} \left| \int_\gamma f(z) \, dz \,\right|\, \le \int_\gamma \left|\, f(z) \,\right|\, \left| \, dz \,\right| \\ \end{aligned} \]
In particular, if \(\left|\, f(z) \,\right| \le M \) on \(\gamma\), then
  \[ \begin{aligned} \left| \int_\gamma f(z) \, dz \,\right|\, \le M \int_\gamma \, \left| \, dz \,\right| = M \cdot \text{length}(\gamma) \\ \end{aligned} \]
!

For example, suppose \(\gamma(t) = t + it, 0 \le t \le 1\), what is an upper bound of \(\displaystyle \int_\gamma z^2\,dz \) ? It should be easy to check that it's \(2\sqrt{2}\). It turns out a better, and indeed exact, estimate is \(\displaystyle \frac{2}{3}\sqrt{2}\), using the first part of the theorem !
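A numeric check of both the crude \(ML\) bound and the sharper estimate (the midpoint-rule sketch is mine):

```python
import math

# gamma(t) = t + i t on [0, 1], f(z) = z^2.
# Crude ML bound: M = max |z|^2 = |1 + i|^2 = 2, length(gamma) = sqrt(2)  ->  2 sqrt(2).
# Sharper bound: integral of |f(z)| |dz| = int_0^1 2 t^2 sqrt(2) dt = (2/3) sqrt(2).
n = 20000
dt = 1.0 / n
sharp = 0.0
for j in range(n):
    t = (j + 0.5) * dt                  # midpoint rule
    z = t + 1j * t
    sharp += abs(z) ** 2 * abs(1 + 1j) * dt
exact = abs((1 + 1j) ** 3 / 3)          # |int z^2 dz| = |gamma(1)^3/3 - gamma(0)^3/3|
print(sharp, exact, 2 * math.sqrt(2))
```

The sharper bound coincides with the exact modulus here because \(z^2 \gamma'(t)\) points in a fixed direction along this curve.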

More at Analysis of a Complex Kind.


 

Arc Length

Given a curve \( \gamma: [a, b] \mapsto \mathbb{C}\), how do we find its length ?
  \[ \begin{aligned} \text{length}(\gamma) &\approx \sum_{j=0}^{n-1} \,\left|\, \gamma(t_{j+1}) - \gamma(t_j) \,\right|\, \\ &= \sum_{j=0}^{n-1} \frac{\,\left|\, \gamma(t_{j+1}) - \gamma(t_j) \,\right|\,}{t_{j+1} - t_j} (t_{j+1} - t_j) \\ \text{length}(\gamma) &= \int_a^b \,\left|\, \gamma'(t) \,\right|\, dt \qquad\\ \end{aligned} \]
!

Here is the definition of Integration with respect to Arc Length:
  \[ \begin{aligned} \int_\gamma f(z)\,\left|\, dz \,\right| = \int_a^b f(\gamma(t)) \,\left|\, \gamma'(t) \,\right|\,dt \end{aligned} \]
Note when \(f(z) = 1\),
  \[ \begin{aligned} \int_\gamma f(z)\,\left|\, dz \,\right| = \int_\gamma \,\left|\, dz \,\right| = \int_a^b \,\left|\, \gamma'(t) \,\right|\,dt = \text{length}(\gamma) \end{aligned} \]
!

More at Analysis of a Complex Kind.


Monday, November 18, 2013

 

Integration in \(\mathbb{C}\)

Let \(f: [a, b] \rightarrow \mathbb{R} \) be continuous. Then
  \[ \begin{aligned} \int_a^b f(t)\,dt &= \lim_{n \rightarrow \infty} \sum_{j=0}^{n-1} f(t_j)(t_{j+1} - t_j) \\ F(x) &= \int_a^x f(t)\,dt \\ F'(x) &= f(x) \,\,\, \text{ for } x \in [a, b] \\ \end{aligned} \]
Let \(\gamma\) be a curve ie a smooth or piecewise smooth function in \(\mathbb{C}\):
  \[ \begin{aligned} &\gamma: [a, b] \rightarrow \mathbb{C} \\ &\gamma(t) = x(t) + iy(t) \\ \end{aligned} \]
If \(f\) is complex-valued on \(\gamma\),
  \[ \begin{aligned} \int_\gamma f(z)\,dz &= \lim_{n \rightarrow \infty} \sum_{j=0}^{n-1} f(z_j)(z_{j+1} - z_j) \\ \end{aligned} \]
where \(z_j = \gamma(t_j)\).

But
  \[ \begin{aligned} \int_\gamma f(z)\,dz &= \int_a^b f(\gamma(t)) \, \gamma'(t)\,dt \\ \end{aligned} \]
Why ? This is easy to see, as
  \[ \begin{aligned} z &= \gamma(t) \\ dz &= \gamma'(t)\,dt \\ \end{aligned} \]
Such integration over the curve is called the Path Integral. Furthermore, if \(g(t) = u(t) + iv(t) \), then
  \[ \begin{aligned} \int_a^b g(t)\,dt &= \int_a^b u(t) \,dt + i\int_a^b v(t) \,dt \\ \end{aligned} \]
What is \(\displaystyle \int_{\left| z \right| = 1} \frac{1}{z}\,dz\) ? Kind of cute.

Turns out, in general, \(\displaystyle \int_{\left| z \right| = 1} z^m \,dz = 2\pi i\) when \(m = -1\) but zero otherwise !
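This is easy to verify numerically (helper name is mine):

```python
import cmath, math

def circle_integral(m: int, n: int = 4096) -> complex:
    """Numerically integrate z^m over the unit circle |z| = 1."""
    total = 0j
    dt = 2 * math.pi / n
    for j in range(n):
        z = cmath.exp(1j * j * dt)
        total += z**m * 1j * z * dt     # dz = i e^{it} dt
    return total

for m in (-3, -2, -1, 0, 1, 2):
    print(m, circle_integral(m))
```

Only \(m = -1\) produces \(2\pi i\); every other power integrates to (numerically) zero.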

More at Analysis of a Complex Kind.


Sunday, November 17, 2013

 

Induction

Linear Induction

  \[ \begin{aligned} &\phi[a] \\ &\forall \mu.(\phi[\mu] \implies \phi[s(\mu)]) \\ \hline \\ &\forall \nu.\phi[\nu] \\ \end{aligned} \]
It's crucial that the signature consists of no other object constants or function constants !

Tree Induction

  \[ \begin{aligned} &\phi[a] \\ &\forall \mu.(\phi[\mu] \implies \phi[f(\mu)]) \\ &\forall \mu.(\phi[\mu] \implies \phi[g(\mu)]) \\ \hline \\ &\forall \nu.\phi[\nu] \\ \end{aligned} \]

Structural Induction

The most general form of induction !
  \[ \begin{aligned} &\phi[a] \\ &\phi[b] \\ &\forall \lambda. \forall \mu.((\phi[\lambda] \land \phi[\mu]) \implies \phi[c(\lambda, \mu)]) \\ \hline \\ &\forall \nu.\phi[\nu] \\ \end{aligned} \]

More at Introduction to Logic.


Saturday, November 16, 2013

 

Domain Closure

Induction for finite languages is trivial. We simply use the Domain Closure rule of inference. For a language with object constants \(\sigma_1, \cdots, \sigma_n\),
  \[ \begin{aligned} \phi[\sigma_1] \\ \cdots \,\,\, \\ \phi[\sigma_n] \\ \hline \\ \forall \nu.\phi[\nu] \end{aligned} \]
If we believe a schema is true for every instance, then we can infer a universally quantified version of that schema.

More at Introduction to Logic.


 

Riemann Mapping Theorem

What conformal mappings are there of the form \(f: \mathbb{D} \mapsto D\), where \(\mathbb{D} = B_1(0)\) is the unit disk and \(D \subset \mathbb{C}\) ?

If \(D\) is a simply connected domain (ie open, connected, no holes) in the complex plane, but not the entire complex plane, then there is a conformal map (ie analytic, one-to-one, onto) of \(D\) onto the open unit disk \(\mathbb{D}\).

We say that "\(D\) is conformally equivalent to \(\mathbb{D}\)".

To find a unique conformal mapping \(\, f\) from \(D\) to \(\mathbb{D}\), we need to specify "3 real parameters".


Now how can we use the above to help construct the Riemann map \(h\) from \(\mathbb{D}^+\) to \(D\) ? Try and you should find the solution:
  \[ \begin{aligned} h = f \circ g \circ f^{-1} \end{aligned} \]
More at Analysis of a Complex Kind.

 

Composition of Möbius transformations

Every Möbius transformation is the composition of mappings of the type:
  \[ \begin{aligned} z &\mapsto az \qquad \text{(rotation & dilation)} \\ z &\mapsto z + b \quad \text{(translation)} \\ z &\mapsto \frac{1}{z} \qquad \text{(inversion)} \\ \end{aligned} \]
which all map circles and lines to circles or lines.

More at Analysis of a Complex Kind.


 

Finding Möbius Transformation

  1. The composition of two Möbius transformations is a Möbius transformation, and so is the inverse.
  2. Given three distinct points \(z_1, z_2, z_3\) and three distinct points \(w_1, w_2, w_3\), there exists a unique Möbius transformation \(f : \hat{\mathbb{C}} \mapsto \hat{\mathbb{C}}\) that maps \(z_j\) to \(w_j , j = 1, 2, 3\), namely
    \[ \begin{aligned} f = f_2^{-1} \circ f_1 \end{aligned} \]
    where \(f_1\) maps \(z_1, z_2, z_3\) to \(0, 1, \infty\), and \(f_2\) maps \(w_1, w_2, w_3\) to \(0, 1, \infty\).
For example, what is the Möbius transformation that maps \(0\) to \(-1\), \(i\) to \(0\), and \(\infty\) to \(1\) ?
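One candidate answer, my own guess rather than anything stated in the post, is \(\displaystyle f(z) = \frac{z - i}{z + i}\); a quick numeric check:

```python
def f(z: complex) -> complex:
    """Candidate Möbius transformation (z - i)/(z + i) -- my guess."""
    return (z - 1j) / (z + 1j)

print(f(0))        # expect -1
print(f(1j))       # expect 0
print(f(10**9))    # expect approximately 1, standing in for infinity
```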

More at Analysis of a Complex Kind.


Friday, November 15, 2013

 

\(w + \overline w = 1\)

  \[ \begin{aligned} \therefore \text{Re } w & = \frac{1}{2} \\ \end{aligned} \]
!

which happens to be the image of the circle \(K = \{z: \left| z - 1 \right| = 1 \}\) under the inversion \(w = 1/z\).


 

\(\left| z - 3 \right| = 1\)

What does \(\left| z - 3 \right| = 1\) look like ? A circle of radius 1, centered at 3.

What does \(\displaystyle \left| \frac{1}{z} - 3 \right| = 1\) look like ?
  \[ \begin{aligned} \left| \frac{1}{z} - 3 \right| &= 1 \\ \left| \frac{1-3z}{z} \right|^2 &= 1 \\ \left| 1-3z \right|^2 &= \left| z \right|^2 \\ (1-3z)\overline{(1-3z)} &= z \overline z \\ (1-3z)(1-3\overline z) &= z \overline z \\ 1-3z-3\overline z + 9z\overline z &= z \overline z \\ 8z\overline z - 3z - 3\overline z + 1 &= 0 \\ z\overline z - \frac{3}{8}z - \frac{3}{8} \overline z + \frac{1}{8} &= 0 \\ (z - \frac{3}{8})(\overline z - \frac{3}{8}) - \frac{3^2}{8^2} + \frac{1}{8} &= 0 \\ \left| z - \frac{3}{8} \right|^2 &= \frac{1}{64} \\ \left| z - \frac{3}{8} \right| &= \frac{1}{8} \\ \end{aligned} \]
\(\therefore\) A circle of radius \(\displaystyle \frac{1}{8}\), centered at \(\displaystyle \frac{3}{8}\).
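A quick numeric confirmation: sampling points on the small circle, each should satisfy the original equation (the check is mine):

```python
import cmath, math

# Points on |z - 3/8| = 1/8 should satisfy |1/z - 3| = 1.
for k in range(12):
    z = 3 / 8 + (1 / 8) * cmath.exp(2j * math.pi * k / 12)
    print(k, abs(1 / z - 3))   # expect 1.0 every time
```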

More at Analysis of a Complex Kind.


Thursday, November 14, 2013

 

Möbius Transformation

Möbius Transformation is a function of the form:
  \[ \begin{aligned} f(z) = \frac{az + b}{cz + d} \end{aligned} \]
where \(a, b, c, d \in \mathbb{C}\) such that \(ad - bc \not= 0\). (This looks surprisingly similar to the determinant of a 2 by 2 invertible matrix !)

A Möbius transformation with \(c = 0\) and \(d = 1\) is called an affine transformation.

More at Analysis of a Complex Kind.


Wednesday, November 13, 2013

 

Conformal Mapping

Intuitively, a conformal mapping is a “mapping that preserves angles between curves”.

If \(f:U \rightarrow \mathbb{C}\) is analytic and if \(z_0 \in U\) such that \(f′(z_0) \not = 0\), then \(f\) is conformal at \(z_0\) !

Observe the contrapositive: if a function \(f\) is not conformal at \(z_0\), then either \(f\) is not analytic there (so it fails the Cauchy-Riemann equations) or \(f'(z_0) = 0\). The function \(\overline z\), which fails the Cauchy-Riemann equations everywhere, is a good example.


 

\(\gamma(t) = te^{it}\)

What does the graph look like ?

In Desmos, choose polar form and enter:
  \[ \begin{aligned} r = \theta \end{aligned} \]


Tuesday, November 12, 2013

 

Log(\(z\))

  \[ \begin{aligned} z &= \left| z \right| \cdot e^{i \cdot \text{Arg}(z)}, \qquad\qquad\qquad\text{in polar form} \\ \text{Log}(z) &= \ln{\left| z \right|} + i \cdot \text{Arg}(z), \qquad\quad\quad\text{the principal branch of logarithm} \\ \text{log}(z) &= \ln{\left| z \right|} + i\left(\text{Arg}(z) + 2k\pi\right), \quad\,\,\text{where } k \in \mathbb{Z} \\ \text{log}(z) &= \ln{\left| z \right|} + i\cdot\text{arg}(z), \qquad\quad\quad\, \text{a multi-valued function} \\ \end{aligned} \]
Observe Log(\(z\)):
  \[ \begin{aligned} z &\mapsto \left| z \right|, \qquad\quad\,\,\,\text{continuous in }\mathbb{C} \\ z &\mapsto \ln{\left| z \right|}, \qquad\,\,\;\text{continuous in }\mathbb{C} \setminus \{0\} \\ z &\mapsto \text{Arg}(z), \qquad\text{continuous in }\mathbb{C} \setminus (-\infty,0] \\ \end{aligned} \]
Recall \(-\pi \lt \text{Arg}(z) \le \pi \). So as \(z\) approaches the negative real axis, \(\text{Arg}(z)\) tends to \(\pi\) if approaching from above, but \(-\pi\) if from below.
  \[ \begin{aligned} \therefore \text{Log}(z) \text{ is continuous in } \mathbb{C} \setminus (-\infty,0]. \end{aligned} \]
A similar line of argument shows that \(z^2\) is continuous on all of \(\mathbb{C}\), but the principal branch of \(\sqrt{z}\) is only continuous on \(\mathbb{C} \setminus (-\infty,0]\). Do you see how ?

Quiz: what is \(\displaystyle\text{Log}(-e^xi) \) ?







Answer:

Observe \(\displaystyle z = -e^xi \,\) lies on the imaginary axis and therefore \(\displaystyle\text{Arg}(z) = -\frac{\pi}{2}\)
  \[ \begin{aligned} -e^x i &= e^x(0 - i) = e^x e^{-\frac{\pi}{2}i} \\ \text{Log}(-e^xi) &= x -\frac{\pi}{2}i \\ \end{aligned} \]
\(\Box\)
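Python's `cmath.log` implements exactly this principal branch, so the answer can be spot-checked numerically:

```python
import cmath, math

for x in (-1.0, 0.0, 2.5):
    z = -math.exp(x) * 1j               # the point -e^x i
    w = cmath.log(z)                    # principal branch: ln|z| + i Arg(z)
    print(w, complex(x, -math.pi / 2))  # the two should agree
```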

More at Analysis of a Complex Kind.


Thursday, November 07, 2013

 

Complex sine and cosine

  \[ \begin{aligned} \cos x &= \frac{e^{ix} + e^{-ix}}{2} \\ \sin x &= \frac{e^{ix} - e^{-ix}}{2i} \\ \end{aligned} \]
This works whether \(x\) is \(\theta \in \mathbb{R} \) or \(z \in \mathbb{C}\). Do you see why ?

Furthermore, it's not difficult to show their relationships with hyperbolic functions:
  \[ \begin{aligned} \sin z &= \sin(x + iy) = \sin x\cosh y + i\cos x\sinh y \\ \cos z &= \cos(x + iy) = \cos x\cosh y - i\sin x\sinh y \\ \end{aligned} \]
Why would this be useful ? It's particularly handy when there is the need to evaluate some form of conjugate or absolute value, where it's helpful to break down the function into real and imaginary parts.
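A quick numeric check of both identities against a library implementation (helper names are mine):

```python
import cmath, math

def sin_via_parts(z: complex) -> complex:
    """sin(x + iy) = sin x cosh y + i cos x sinh y."""
    x, y = z.real, z.imag
    return complex(math.sin(x) * math.cosh(y), math.cos(x) * math.sinh(y))

def cos_via_parts(z: complex) -> complex:
    """cos(x + iy) = cos x cosh y - i sin x sinh y."""
    x, y = z.real, z.imag
    return complex(math.cos(x) * math.cosh(y), -math.sin(x) * math.sinh(y))

z = complex(0.7, -1.3)
print(cmath.sin(z), sin_via_parts(z))
print(cmath.cos(z), cos_via_parts(z))
```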

Did I mention already ? Euler, the truly incredible.


Wednesday, November 06, 2013

 

\(\left| z \right|^2 = z \cdot \overline z\)

It's rather easy to show that:
  \[ \begin{aligned} \left|z\right|^2 = z \cdot \overline z \,\,\,\,\,\text{ where } z = x + iy \end{aligned} \]
How about:
  \[ \begin{aligned} \left | z + 1 \right |^2 = (z + 1) \cdot (\,\overline z + 1) \end{aligned} \]
?

As a challenge, consider:
  \[ \begin{aligned} Re(e^{\frac{z}{z+1}}) = \left( e^{\frac{\left|z\right|^2+x}{\left|z+1\right|^2}}\right) \cdot \cos\left(\frac{y}{\left| z+1 \right| ^2}\right) \end{aligned} \]
where the left hand side refers to the real part of \(\displaystyle e^{\frac{z}{z+1}}\). Does the equation hold ? Note in particular the right hand side is a real number, though expressed in terms of quantities from the complex plane !
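It does hold, as a numeric spot check at a few points suggests (helper name is mine):

```python
import cmath, math

def rhs(z: complex) -> float:
    """The proposed real-valued right hand side."""
    x, y = z.real, z.imag
    d = abs(z + 1) ** 2
    return math.exp((abs(z) ** 2 + x) / d) * math.cos(y / d)

for z in (0.3 + 0.4j, -2 + 1j, 1.5 - 0.7j):
    lhs = cmath.exp(z / (z + 1)).real
    print(lhs, rhs(z))
```

The key step is \(\frac{z}{z+1} = \frac{z(\overline z + 1)}{|z+1|^2} = \frac{|z|^2 + x + iy}{|z+1|^2}\), after which \(\text{Re}\, e^{a+bi} = e^a \cos b\) finishes it.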

More at Analysis of a Complex Kind.


 

\(e^{\overline{z}} = \overline{e^{z}} \)

In other words, \(e\) raised to the power of the conjugate of \(z\) is equal to the conjugate of \(e^z\), where \(z \in \mathbb{C}\). Not immediately obvious but not hard to show.


Tuesday, November 05, 2013

 

Cauchy-Riemann Equations

The Cauchy-Riemann Equations are useful for checking differentiability in the complex plane, and for computing derivatives.

Suppose
  \[ \begin{aligned} f(z) = u(x,y) + i \cdot v(x,y) \end{aligned} \]
is differentiable at \(z_0\). Then

  1. the partial derivatives \(u_x, u_y, v_x, v_y\) exist at \(z_0\),
  2. \(u_x = v_y\), and
  3. \(u_y = -v_x\).
Furthermore,
  \[ \begin{aligned} f'(z_0) &= f_x(z_0) = u_x(x_0,y_0) + iv_x(x_0,y_0) \\ &= -i \cdot f_y(z_0) = -i \cdot (u_y(x_0,y_0) + i \cdot v_y(x_0,y_0)) \\ \end{aligned} \]
!

Conversely, suppose
  \[ \begin{aligned} f = u + i \cdot v \end{aligned} \]
is defined on a domain \(D \subset \mathbb{C}\). Then \(f\) is analytic in \(D\) iff \(u(x,y)\) and \(v(x,y)\) have continuous first partial derivatives on \(D\) that satisfy the Cauchy-Riemann equations. The key point is that continuity of the partial derivatives is necessary for the converse direction to work.

More info at Analysis of a Complex Kind.


 

\(x \cdot \lvert x \rvert^2\)

What's peculiar about the function
  \[ \begin{aligned} f(x) = x \cdot \lvert x \rvert^2 \end{aligned} \]
in terms of differentiability ? More on this later ...


Monday, November 04, 2013

 

\(Re(z), \, \bar z\) & \(\lvert z \rvert^2 \)

A complex function \(f\) is analytic if \(f\) is differentiable at each point \(z \in U\) where \(U\) is an open set in \(\mathbb{C}\). A function which is analytic in all of \(\mathbb{C}\) is called an entire function.

Consider the complex functions:
  \[ \begin{aligned} f(z) &= \text{Re }(z) \\ f(z) &= \bar z \\ f(z) &= \lvert z \rvert^2 \end{aligned} \]
Can you see why they are not analytic ?


Saturday, November 02, 2013

 

Fitch Proof

Is this a correct Fitch proof ?

  1. \( p(x) \)  Assumption
  2. \( p(x) \lor \lnot p(x) \)  Or Introduction 1
  3. \( \forall x. p(x) \lor \lnot p(x) \)  Universal Introduction 2
  4. \( p(x) \implies \forall x. p(x) \lor \lnot p(x) \)  Implication Introduction 1-3

More information at Introduction to Logic.


 

Universal Elimination

Is this a correct single step inference in the Fitch system for Relational Logic ?
  \[ \begin{aligned} \frac{\forall x . \forall y . (p(x) \wedge q(y))}{\forall y.(p(y) \wedge q(y))} \end{aligned} \]
More information at Introduction to Logic.


Friday, November 01, 2013

 

\(\forall x .(\exists y. p(x,y) \implies q(x)) \)

It can be shown that if
  \[ \begin{aligned} \forall x .(\exists y. p(x,y) \implies q(x)) \end{aligned} \]
holds, then
  \[ \begin{aligned} \forall x .\forall y.(p(x,y) \implies q(x)) \end{aligned} \]
holds. Do you see why ?




















Turns out it's due to the simple reason that
  \[ \begin{aligned} \forall x .(\exists y. p(x,y) \implies q(x)) \end{aligned} \]
is not the same as
  \[ \begin{aligned} \forall x .(\exists y. (p(x,y) \implies q(x))). \end{aligned} \]
In particular, the sentence
  \[ \begin{aligned} \exists y. p(x,y) \implies q(x) \end{aligned} \]
does not mean there exists \(y\) for which the implication holds. Rather, it means if there exists \(y\) so that \(p(x,y)\) holds, then \(q(x)\) holds. In other words, the quantifier "exists y" applies only to the antecedent and not to the consequent.
