Lambert W

What value of \(x\) satisfies \(x^x=2\)?
Published May 24, 2022

From [1]:

The Lambert W function is denoted \(W(x)\) and is the inverse of the function \(f(x) = xe^{x}\).

Thus, \(W(x) = f^{-1}(x)\).

For \(f(x)\) we have that:

  - \(f'(x) = (1+x)e^{x}\), so \(f\) is increasing on \([-1,\infty)\), and we restrict it to that interval to make it invertible
  - the domain of the restricted \(f\) is \([-1,\infty)\)
  - the range of the restricted \(f\) is \([-\frac{1}{e},\infty)\), since its minimum value is \(f(-1) = -\frac{1}{e}\)

The last two bullet points mean that the domain and range of \(W(x)\) are \([-\frac{1}{e},\infty)\) and \([-1,\infty)\) respectively.
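
As a quick numeric check of these endpoints (a minimal sketch; scipy.special.lambertw computes the principal branch):

from scipy.special import lambertw
import numpy as np
# at the left endpoint -1/e of the domain of W, W attains its minimum value -1
lambertw(-1/np.e), np.allclose( lambertw(-1/np.e).real, -1 )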

For \(W(x)\) we also have that:

  1. \(W(f(x)) = W(xe^{x}) = x\) for \(x \ge -1\)
  2. \(f(W(x)) = W(x)e^{W(x)} = x\). In other words, \(e^{W(x)} = \frac{x}{W(x)}\)
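
Both identities are easy to verify numerically. Here is a minimal sketch (the test points are arbitrary; \(x \ge -1\) keeps us on the principal branch that scipy computes):

from scipy.special import lambertw
import numpy as np
x = np.linspace(-0.9, 3.0, 50)                # stay inside [-1, oo) for identity 1
np.allclose( lambertw( x*np.exp(x) ).real, x )
w = lambertw(2.0)                             # identity 2 at the arbitrary point 2
np.allclose( w*np.exp(w), 2.0 )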

Solve \(x^{x} = 2\)

Take the log of both sides to get \(x\ln(x) = \ln(2)\). Writing \(x = e^{\ln(x)}\), this becomes \(\ln(x)e^{\ln(x)} = \ln(2)\).

Applying the Lambert W function on both sides produces \(W(\ln(x)e^{\ln(x)}) = W(\ln(2))\) or that \(\ln(x) = W(\ln(2))\).

Finally, we get that \(x = e^{W(\ln(2))}\).

from scipy.special import lambertw
import numpy as np
# x = e^{W(ln 2)}; scipy's lambertw returns a complex value, hence the +0j below
x = np.exp( lambertw( np.log(2) ) )
x, np.allclose( np.power(x,x) , 2 )
((1.5596104694623694+0j), True)

Thus, the solution of \(x^{x} = 2\) is \(x \approx 1.55961\).

Solve \(x^{2}e^{x} = 2\)

Take the square root of both sides (looking for the positive root) to get \(xe^{x/2} = \sqrt 2\). Divide both sides by \(2\) to get \(0.5xe^{0.5x} = \frac{1}{\sqrt 2}\).

Applying the Lambert W function on both sides produces \(W(0.5xe^{0.5x}) = W(\frac{1}{\sqrt 2})\) or \(0.5x = W(\frac{1}{\sqrt 2})\).

Finally, we get \(x = 2W(\frac{1}{\sqrt 2})\).

# x = 2 W(1/sqrt(2))
x = 2 * lambertw(1./np.sqrt(2))
x, np.allclose( x*x*np.exp(x), 2 )
((0.9012010317296661+0j), True)

Solve \(x + e^{x} = 2\)

Exponentiating both sides we get \(e^{x}e^{e^{x}} = e^{2}\). Applying the Lambert W function on both sides produces \(W( e^{x}e^{e^{x}} ) = W(e^{2})\), or \(e^{x} = W(e^{2})\).

Thus, \(x = \ln(W(e^{2}))\).

# x = ln(W(e^2))
x = np.log( lambertw( np.exp(2) ) )
x, np.allclose( x + np.exp(x) , 2 )
((0.4428544010023887+0j), True)

Find the minimizer of \(x \ln(\frac{x}{u}) + \frac{1}{2\lambda}(x-v)^{2}\) over \(x > 0\) (suppose \(u, \lambda > 0\)).

For \(x > 0\), \(x\ln(\frac{x}{u})\) has second derivative \(\frac{1}{x} > 0\), and the quadratic term is convex since \(\lambda > 0\). The objective is therefore a sum of convex functions, so it is convex and a minimizer exists.

Taking the derivative and setting equal to zero we get: \[1 + \ln x - \ln u + \frac{x}{\lambda} - \frac{v}{\lambda} = 0\]

Hence, \(\ln x + \frac{x}{\lambda} = \ln u + \frac{v}{\lambda} - 1\).

Exponentiating both sides we get:

\(e^{\ln x + \frac{x}{\lambda}} = e^{\ln u + \frac{v}{\lambda} - 1}\) or \(xe^{\frac{x}{\lambda}} = ue^{\frac{v}{\lambda} - 1}\).

Dividing both sides by \(\lambda\) we have \(\frac{x}{\lambda}e^{\frac{x}{\lambda}} = \frac{u}{\lambda}e^{\frac{v}{\lambda} - 1}\).

Applying the Lambert W function on both sides gives: \[ W(\frac{x}{\lambda}e^{\frac{x}{\lambda}}) = W(\frac{u}{\lambda}e^{\frac{v}{\lambda} - 1})\] or

\(\frac{x}{\lambda} = W(\frac{u}{\lambda}e^{\frac{v}{\lambda} - 1})\).

Thus, \[x = \lambda W(\frac{u}{\lambda}e^{\frac{v}{\lambda} - 1})\]
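
As a sanity check we can plug this \(x\) back into the first-order condition (a minimal sketch; u = 1.5, v = 0.7, and lam = 2.0 are arbitrary test values, not from the source):

from scipy.special import lambertw
import numpy as np
u, v, lam = 1.5, 0.7, 2.0                      # arbitrary, with u > 0 and lam > 0
x = lam * lambertw( (u/lam)*np.exp(v/lam - 1) ).real
# the derivative 1 + ln(x/u) + (x - v)/lam should vanish at the minimizer
x, np.allclose( 1 + np.log(x/u) + (x - v)/lam, 0 )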

References

[1] blackpenredpen, "Lambert W Function Intro and x^x=2." YouTube, 2018. https://www.youtube.com/watch?v=sWgNCra93D8