I was trying to calculate the limit of the sequence defined by
$$a_1=k,\quad a_2=l,\quad a_{n+1}=\frac{a_n+(2n-1)a_{n-1}}{2n},\qquad k, l\in\mathbb{N},\ k<l.$$
I had no idea where to start, so I brute-forced the problem on my PC for various pairs $(k, l)$, computing terms out to $a_{10^{10}}$, in the hope that a pattern would emerge. This is what my PC thinks $a_\infty$ is for each $(k, l)$:
$$(1, 2) \Rightarrow 3-\sqrt{2}$$ $$(1, 3) \Rightarrow 5-\sqrt{8}$$ $$(2, 3) \Rightarrow 4-\sqrt{2}$$ $$(1, 4) \Rightarrow 7-\sqrt{18}$$ $$(2, 4) \Rightarrow 6-\sqrt{8}$$ $$(3, 4) \Rightarrow 5-\sqrt{2}$$ $$....$$
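For reference, this is roughly the loop I ran (a minimal Python sketch; the function name `limit_estimate` and the cutoff `n_terms` are just placeholders, and my actual run went out to $a_{10^{10}}$):

```python
import math

def limit_estimate(k, l, n_terms=10**6):
    """Iterate a_{n+1} = (a_n + (2n-1)*a_{n-1}) / (2n), starting from a_1 = k, a_2 = l."""
    a_prev, a_cur = float(k), float(l)  # a_1, a_2
    for n in range(2, n_terms + 1):
        a_prev, a_cur = a_cur, (a_cur + (2*n - 1) * a_prev) / (2*n)
    return a_cur  # approximately a_{n_terms + 1}

# Compare the numerics with the conjectured closed form 2l - k - (l - k)*sqrt(2).
for k, l in [(1, 2), (1, 3), (2, 3), (1, 4), (2, 4), (3, 4)]:
    print((k, l), limit_estimate(k, l), 2*l - k - (l - k) * math.sqrt(2))
```

Each new term is a convex combination of the previous two, so the whole sequence stays between $k$ and $l$ and plain floating point is enough to see the pattern to a few decimal places.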
It seems that $a_n\rightarrow 2l-k-(l-k)\sqrt{2}$ as $n\rightarrow\infty$ (e.g. for $(1, 4)$: $7-\sqrt{18}=7-3\sqrt{2}=2\cdot 4-1-(4-1)\sqrt{2}$). How can I show this mathematically, without using brute force?