Symbolic computations associated with independent sets in cographs and increasing subsequences in separable permutations

The aim of this SymPy (Python) worksheet is to provide all the computations needed to obtain the exact and numerical results of Sections 4-5 of the paper "Linear-sized independent sets in random cographs and increasing subsequences in separable permutations".

Notation and definitions are taken from this paper.

Sec.4 Linear-sized independent sets in cographs

Recall the notation of Section 4: for fixed $\beta\in (0,1)$ the random variable $X_{n,\lfloor \beta n\rfloor}$ is the number of independent sets of size $\lfloor \beta n\rfloor$ in a uniform random labeled cograph of size $n$.

We prove that there exist some computable constants $B_\beta >0$, $C_\beta >0$ such that \begin{equation} \mathbb{E}[X_{n,\lfloor \beta n\rfloor}] \sim B_\beta n^{-1/2} (C_\beta)^n. \end{equation}

We now provide all the code needed to obtain exact expressions and numerical approximations of $C_\beta$ with arbitrary precision (and also to prove that $C_\beta >1$ for small enough $\beta$).

Exact results: parametrization with $y$

First it is proved that $\beta$ can be written as $\beta=\Psi(y)$ where $$ \Psi(y):=\frac{ \left( e^{y}-2 - \log(e^{y}-1) \right) (e^{y}-1) } { 2y + 1 - e^{y} + (e^{y}-1) (2y - 1- \log(e^{y}-1) ) } $$ and where we have put $$ y=L\left(r(u(\beta))\right) $$ (see below for the expressions of the functions $r,u$).

In Sec.4 of the paper it is then proved that $C_\beta$ can be written as $$ C_\beta= \frac{2\log(2)-1}{r(u(\beta)) u(\beta)^{\beta}}= \frac{2\log(2)-1}{G(y)F(y)^{\beta}} = \frac{2\log(2)-1}{G(y)F(y)^{\Psi(y)}} $$ where \begin{align*} F(y)&=\frac{e^{y}-2 - \log(e^{y}-1)}{2y + 1 - e^{y}}\\ G(y)&= 2y + 1 - e^{y}. \end{align*}
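The following cell sets up these quantities in SymPy (a minimal sketch; the Python names `Psi`, `F`, `G`, `C` are ours):

```python
import sympy as sp

y = sp.symbols('y', positive=True)

# beta = Psi(y): parametrization of beta by y
Psi = ((sp.exp(y) - 2 - sp.log(sp.exp(y) - 1)) * (sp.exp(y) - 1)) / (
    2*y + 1 - sp.exp(y) + (sp.exp(y) - 1)*(2*y - 1 - sp.log(sp.exp(y) - 1)))

# F(y) = u(beta) and G(y) = r(u(beta))
F = (sp.exp(y) - 2 - sp.log(sp.exp(y) - 1)) / (2*y + 1 - sp.exp(y))
G = 2*y + 1 - sp.exp(y)

# C_beta expressed through the parameter y
C = (2*sp.log(2) - 1) / (G * F**Psi)
```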

Therefore we can plot $C_\beta$ as a function of $y$:
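For instance (a minimal sketch reusing the symbols of the previous cell; the plotting range is indicative, $y=\log 2$ corresponding to $\beta\to 0$):

```python
# Plot C_beta against the parameter y (y = log(2) corresponds to beta -> 0)
sp.plot(C, (y, 0.01, 0.69), xlabel='y', ylabel='C_beta')
```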

Numerical approximations: parametrization with $\beta$

First, a small function that uses the bisection (dichotomy) method to compute an implicit function numerically:
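A possible implementation (a sketch; the name `bisect_solve`, the tolerance and the bracket used in the example below are ours):

```python
def bisect_solve(f, a, b, tol=1e-12):
    """Return x in [a, b] with f(x) ~ 0, assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    assert fa * fb <= 0, "f must change sign on [a, b]"
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return (a + b) / 2

# Example: find the value of y at which C_beta = 1 (C and y from the cells above)
C_num = sp.lambdify(y, C - 1, 'mpmath')
y_star = bisect_solve(C_num, 0.05, 0.3)
```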

Thus we find that $C_\beta=1$ for $y \approx 0.13363615\dots$.

We can now give the $\beta_0$ of Theorem 1.3 in the paper. It is the value of $\beta$ beyond which $C_\beta <1$ (the orange spot in the above plot). Namely $$ \beta_0 = \Psi\left( C^{-1}(1) \right), $$ where $C$ is viewed as a function of $y$, as in the plot above.

Therefore:
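For instance (continuing from the previous cells, `y_star` being the solution of $C=1$ found above):

```python
# beta_0 = Psi(y*), where y* is the value found above with C(y*) = 1
beta0 = sp.lambdify(y, Psi, 'mpmath')(y_star)
print(beta0)
```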

Finding numerically the argmax of $C_\beta$
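One way to do this numerically (a sketch reusing `bisect_solve`; the bracket $[0.2, 0.6]$ is our guess, read off the plot above):

```python
# The argmax of C_beta corresponds to the zero of dC/dy
dC = sp.lambdify(y, sp.diff(C, y), 'mpmath')
y_argmax = bisect_solve(dC, 0.2, 0.6)

beta_argmax = sp.lambdify(y, Psi, 'mpmath')(y_argmax)  # corresponding value of beta
C_max = sp.lambdify(y, C, 'mpmath')(y_argmax)          # maximal value of C_beta
print(beta_argmax, C_max)
```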

Expansion of $C_\beta$ when $\beta \to 0$

Finally we compute the Taylor expansion of $C_\beta$ which is useful in the proof of Theorem 1.3 (ii):
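A sketch of this computation: we expand $\beta=\Psi(y)$ around $y=\log 2$ (which corresponds to $\beta\to 0$); the auxiliary symbol `t` stands for $y-\log 2$.

```python
# Expand beta = Psi(y) around y = log(2); t stands for y - log(2)
t = sp.symbols('t')
print(sp.series(Psi.subs(y, sp.log(2) + t), t, 0, 3))
```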

This shows \begin{equation} \beta=\Psi(y)=(y - \log(2))^2/(2\log(2)-1) +\mathcal{O}((y - \log(2))^3).\tag{$45$} \end{equation} This allows us to compute the expansions of $u(\beta),r(u(\beta))$:
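Since $u(\beta)=F(y)$ and $r(u(\beta))=G(y)$, it suffices to expand $F$ and $G$ around $y=\log 2$ (same conventions as in the previous cell):

```python
# Expansions of u(beta) = F(y) and r(u(beta)) = G(y) around y = log(2)
print(sp.series(F.subs(y, sp.log(2) + t), t, 0, 3))
print(sp.series(G.subs(y, sp.log(2) + t), t, 0, 3))
```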

Finally, if we plug eq. (45) into the above, we get \begin{align*} r(u(\beta))&=2\log(2) - 1 - \beta(2\log(2)-1) + \mathcal{O}(\beta^{3/2}),\\ u(\beta)&=2\beta+ \mathcal{O}(\beta^{3/2}). \end{align*}

This allows us to find that $$ C_\beta=1+\beta|\log(\beta)| +\mathrm{o}(\beta\log(\beta)). $$

Sec.5 Increasing subsequences in separable permutations

In Section 5 we apply the same general strategy to the series $S_+(z,u)$, $S_-(z,u)$ which are solutions of

\begin{align*} S_-(z,u) &= \frac{S^2}{1-S}+\left(\frac{(S_-(z,u) +z+zu)^2}{1-(S_-(z,u) +z+zu)}+zu+z -S\right)\times \left( \frac{1}{(1-S)^2}-1\right),\\ S_+(z,u) &= \frac{(S_-(z,u) +z+zu)^2}{1-(S_-(z,u) +z+zu)} \end{align*}

where $S(z)$ is the solution of $S=z+S^2/(1-S)$.

We focus on $S_-$, which is the solution of $$ S_-(z,u) = \frac{S(z)^2}{1-S(z)}+\left(\frac{(S_-(z,u) +z+zu)^2}{1-(S_-(z,u) +z+zu)}+zu+z -S(z)\right)\times \left( \frac{1}{(1-S(z))^2}-1\right)=:G(z,S_-,u). $$ The associated characteristic system is then:

\begin{align}\label{eq:CharacteristicSystem} \frac{S(r)^2}{1-S(r)}+\left(\frac{(s +r+ru)^2}{1-(s+r+ru)}+ru+r -S(r)\right)\times \left( \frac{1}{(1-S(r))^2}-1\right)&=s, \tag{$\star 1$}\\ \left( \frac{1}{(1-(s+r+ru))^2} - 1 \right)\left( \frac{1}{(1-S(r))^2}-1\right)&=1. \tag{$\star 2$} \end{align}

We can solve this system seeing $S(r)$ as a parameter:
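A sketch of how the system $(\star 1)$--$(\star 2)$ can be handed to SymPy, with $S(r)$ treated as a parameter (denoted `Y` below; all Python names are ours):

```python
import sympy as sp

r, s, u, Y = sp.symbols('r s u Y')   # Y plays the role of the parameter S(r)
w = s + r + r*u                      # shorthand for s + r + r*u

eq1 = sp.Eq(Y**2/(1 - Y)
            + (w**2/(1 - w) + r*u + r - Y) * (1/(1 - Y)**2 - 1),
            s)                                               # (star 1)
eq2 = sp.Eq((1/(1 - w)**2 - 1) * (1/(1 - Y)**2 - 1), 1)      # (star 2)

sols = sp.solve([eq1, eq2], [r, s], dict=True)
```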

Keeping the solution with $s>0$, we obtain \begin{align}\label{eq:Simpli_r_u} r(u)&= \frac{1}{u + 1} \left(2 y - 2 \sqrt{2y - y^2} + 1\right),\\ s(u)&= -2y+\sqrt{2y - y^2} \end{align} where $y:=S(r(u))$.

Exact results: computing everything as a function of $y$

It is easy to check that $$ y\in (0,1-\tfrac{\sqrt{2}}{2}] \mapsto u(y) \in [0,+\infty), $$ is a one-to-one mapping.

Recall that $S$ is a solution of $S=z+S^2/(1-S)$. By differentiating this equation we get $$ S'=1+\frac{S' S(2-S)}{(1-S)^2} $$ which gives a simple expression of $S'$ as a function of $S$:
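A minimal sketch (the symbol `dS` stands for $S'$):

```python
# Solve S' = 1 + S'*S*(2 - S)/(1 - S)**2 for S'
S, dS = sp.symbols('S dS')
Sprime = sp.solve(sp.Eq(dS, 1 + dS*S*(2 - S)/(1 - S)**2), dS)[0]
print(sp.simplify(Sprime))
```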

Using eq. (2.14) in M. Drmota, "Asymptotic distributions and a multivariate Darboux method in enumeration problems" (1994), we have that $$ \beta= \frac{ uG_u(r(u),s(u),u) }{r(u)G_z(r(u),s(u),u)} $$

We first have \begin{align*} G_u\left(r(u),s(u),u \right) &= \frac{r}{(1-r-s-ru)^2}\times \left( \frac{1}{(1-S(r))^2}-1\right)\\ &=r\times \left( \frac{1}{(1-S(r))^2}\right) \qquad \text{(using the second equation $(\star 2)$ of the characteristic system)}. \end{align*}

We now want to find an expression for $G_z$. \begin{multline} G_z\left(r(u),s(u),u \right)=\frac{SS'(2-S)}{(1-S)^2}+\left(\frac{u+1}{(1-r-s-ru)^2}-S'\right)\times \left( \frac{1}{(1-S(r))^2}-1\right)\\ +\left(\frac{(s +r+ru)^2}{1-(s+r+ru)}+ru+r -S\right)\times \left( \frac{2S'}{(1-S)^3}\right). \end{multline}

Therefore \begin{align} \beta=\frac{ uG_u(r,s,u) }{rG_z(r,s,u)} &= \frac{u(y)\times r\times \left( \frac{1}{(1-y)^2}\right)} { r\times \bigg(\frac{SS'(2-S)}{(1-S)^2}+\left(\frac{u+1}{(1-r-s-ru)^2}-S'\right)\times \left( \frac{2S-S^2}{(1-S)^2}\right) +\left(\frac{(s +r+ru)^2}{1-(s+r+ru)}+ru+r- S\right)\times \left( \frac{2S'}{(1-S)^2(1-S)}\right)\bigg)}\\ &= \frac{u(y)} {SS'(2-S)+\left(\frac{u+1}{(1-r-s-ru)^2}-S'\right)\times \left( 2S-S^2\right) +\left(\frac{(s +r+ru)^2}{1-(s+r+ru)}+ru+r- S\right)\times \left( \frac{2S'}{1-S}\right)} \qquad (\star) \end{align}

Finally, if we use $$ S'=\frac{(S - 1)^2}{S(S-2) + (S - 1)^2}, $$ we can deduce from the above an "explicit" expression of $\beta$ as a function of $y$.

We compute the denominator of $(\star)$:
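A sketch of this computation, with everything expressed through the parameter $y=S(r(u))$. Here we also use that $r=y-\frac{y^2}{1-y}$ (obtained by inverting $S=z+S^2/(1-S)$ at $z=r$), from which $u(y)$ follows via the expression of $r(u)$ above; all Python names are ours.

```python
# Everything below is a function of the parameter y = S(r(u)).
# Inverting S = z + S^2/(1 - S) gives r = y - y^2/(1 - y); u(y) then follows
# from r(u) = (2y - 2*sqrt(2y - y^2) + 1)/(u + 1).
y = sp.symbols('y', positive=True)

Sprime_y = (y - 1)**2 / (y*(y - 2) + (y - 1)**2)     # S' as a function of y
r_y = y - y**2/(1 - y)                               # r as a function of y
u_y = (2*y - 2*sp.sqrt(2*y - y**2) + 1)/r_y - 1      # u as a function of y
s_y = -2*y + sp.sqrt(2*y - y**2)                     # s as a function of y
w_y = s_y + r_y*(1 + u_y)                            # s + r + r*u

# Denominator of (star)
denom = (y*Sprime_y*(2 - y)
         + ((u_y + 1)/(1 - w_y)**2 - Sprime_y)*(2*y - y**2)
         + (w_y**2/(1 - w_y) + r_y*u_y + r_y - y)*(2*Sprime_y/(1 - y)))

beta_y = sp.simplify(u_y / denom)                    # beta as a function of y, by (star)
```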

Numerical approximations: parametrization with $\beta$

Finally, we compute the value of $\beta$ for which $E_\beta=1$:
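The quantity $E_\beta$ is defined in the paper; by analogy with $C_\beta$ in Section 4 we assume below that $E_\beta=\frac{3-2\sqrt{2}}{r(u(\beta))\,u(\beta)^{\beta}}$, where $3-2\sqrt{2}$ is the radius of convergence of $S$ (this expression is our assumption, not taken from the present worksheet). Under this assumption, the bisection routine of Section 4 applies directly; the bracket below is our guess.

```python
# ASSUMPTION (not from this worksheet): E_beta = (3 - 2*sqrt(2)) / (r(u) * u**beta),
# by analogy with C_beta in Section 4, 3 - 2*sqrt(2) being the radius of convergence of S.
E_y = (3 - 2*sp.sqrt(2)) / (r_y * u_y**beta_y)

# Solve E(y) = 1 with the bisection routine of Section 4
E_num = sp.lambdify(y, E_y - 1, 'mpmath')
y_star_sep = bisect_solve(E_num, 0.02, 0.15)
beta_star_sep = sp.lambdify(y, beta_y, 'mpmath')(y_star_sep)
print(y_star_sep, beta_star_sep)
```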

Expansion of $E_\beta$ when $\beta \to 0$
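A sketch of this expansion: we expand $\beta(y)$ (from the cell computing the denominator of $(\star)$) around $y_0=1-\tfrac{\sqrt{2}}{2}$, which corresponds to $\beta\to 0$.

```python
# Expand beta(y) around y0 = 1 - sqrt(2)/2 (i.e. beta -> 0); t stands for y - y0
t = sp.symbols('t')
y0 = 1 - sp.sqrt(2)/2
print(sp.series(beta_y.subs(y, y0 + t), t, 0, 3))
```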

Hence $$ \beta(y)=\frac{1}{\tfrac{3}{4} \sqrt{2}-1}\left(y - 1 + \tfrac{\sqrt{2}}{2}\right)^2 + \mathcal{O}\left(\left(y -1+\tfrac{\sqrt{2}}{2}\right)^3\right), $$ therefore $$ y=1-\tfrac{\sqrt{2}}{2}+ \sqrt{\beta}\sqrt{\tfrac{3}{4} \sqrt{2}-1} + \mathrm{o}(\sqrt{\beta}). $$ We find the same asymptotics for $E_\beta$ as for $C_\beta$: $$ E_\beta=1+\beta|\log(\beta)| +\mathrm{o}(\beta\log(\beta)). $$