Hey folks, I've been hearing some wild claims lately about China's influence in the US. I'm curious whether China actually holds significant economic or political sway over the US. I'm not looking for a political debate, just informed opinions from folks who know the subject. Thoughts?