Proving independence of two variables in a joint distribution, using cumulative distribution functions

I have two variables $X$ and $Y$ with the following joint probability density function:

$$ f(x,y) = \begin{cases} \frac14 (1+xy) & \text{if } |x| < 1,\ |y| < 1,\\ 0 & \text{otherwise.} \end{cases} $$

The problem is to prove that $X$ and $Y$ are not independent, but that $X^2$ and $Y^2$ are. I calculated the marginal density functions of $X$ and $Y$, and since their product does not equal the joint density, $X$ and $Y$ are not independent.
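For reference, here is a sketch of the computation I did for that first part:

$$ f_X(x) = \int_{-1}^{1} \frac14 (1+xy)\,dy = \frac12 \quad (|x|<1), \qquad f_Y(y) = \int_{-1}^{1} \frac14 (1+xy)\,dx = \frac12 \quad (|y|<1), $$

so $f_X(x)\,f_Y(y) = \frac14 \ne \frac14(1+xy)$ whenever $xy \ne 0$.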

However, I wasn't sure about the second part, so I reviewed the given solution. There, independence was not proven by showing

$$ f_{X^2,Y^2}(u,v) = f_{X^2}(u) \cdot f_{Y^2}(v). $$

Instead, it was shown that the joint cumulative distribution function factors in the same way, and that this implies independence:

$$ P(X^2 \le u,\ Y^2 \le v) = P(X^2 \le u) \cdot P(Y^2 \le v). $$
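For context, here is a rough sketch of how that factorization can be verified for this density (my own working for $0 \le u, v \le 1$, not quoted from the solution):

$$ P(X^2 \le u,\ Y^2 \le v) = \int_{-\sqrt{u}}^{\sqrt{u}} \int_{-\sqrt{v}}^{\sqrt{v}} \frac14 (1+xy)\,dy\,dx = \frac14 \cdot 2\sqrt{u} \cdot 2\sqrt{v} = \sqrt{u}\,\sqrt{v}, $$

because the $xy$ term integrates to zero over the symmetric rectangle. Likewise $P(X^2 \le u) = \int_{-\sqrt{u}}^{\sqrt{u}} \frac12\,dx = \sqrt{u}$ and $P(Y^2 \le v) = \sqrt{v}$, so the two sides agree.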

I couldn't find any reference saying that this implies independence; I believed the implication only worked with the density functions. It is also not mentioned on the Wikipedia page for the cumulative distribution function.

So I'm wondering: is this also a valid way to prove independence? Can I use this technique whenever it applies?

Answers: 1

To elaborate on Dilip's comment: if we could write $f(x,y)$ in product form $g(x)h(y)$, then $$f(x,y)f(w,z)-f(x,z)f(w,y)=0\tag1$$ for all $(x,y), (w,z)\in A\subset \mathbb{R}^2$, where $A$ has full Lebesgue measure. But for your function $f$, equation (1) says $${1\over 16}(x-w)(y-z)=0,\tag2$$ which holds only if $x=w$ or $y=z$. Thus for each $(x,y)$, equation (2) holds only for $(w,z)$ in a set of Lebesgue measure zero. This proves that $X$ and $Y$ are not independent.
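Spelling this out as a sketch: if $f(x,y) = g(x)h(y)$ for some functions $g$ and $h$, then

$$ f(x,y)f(w,z) = g(x)h(y)\,g(w)h(z) = g(x)h(z)\,g(w)h(y) = f(x,z)f(w,y), $$

which is equation (1). For the density at hand, expanding the products gives

$$ \frac{1}{16}\bigl[(1+xy)(1+wz) - (1+xz)(1+wy)\bigr] = \frac{1}{16}(xy + wz - xz - wy) = \frac{1}{16}(x-w)(y-z), $$

which is equation (2).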
