Who was the one determining if information was false? If that checker is a human, then they have bias as well. If one side is literally lying and spreading misinformation much more than the other side (and yes, multiple studies found that right-wing-leaning users and media were sharing MASSIVELY more false and misleading info), then they will be acted on more. Even more so when that's often the side trying to imply that violence is a proper solution to arguments.
"Authors of the study followed 9,000 politically engaged Twitter users, half Democratic and half Republican, in October 2020. The authors continued keeping track of their Twitter habits for six months after the 2020 election.
The study did find a disparity between how many users from each party were suspended — 7.7% of the Democrats compared to 35.6% of Republicans.
Republicans on Twitter, however, “shared substantially more news from misinformation,” the study found."
The bias wasn't left/right due to politics; it was that the Right simply spread more lies. Stop lying and they would get banned less... an easy solution they seemingly want to ignore. Same issue the Right took with Google search results. They do things most people (the western world has a left lean, that's just a fact) don't like and they get negative headlines, but they were confused by that and blamed Google's algorithm instead of just thinking "hmm, maybe I should do FEWER things that people don't like... or do MORE things people DO like to balance it out..." They seem to only ever want to see NICE stories, not ACCURATE stories, when you google them.
The funny part is you are sitting here saying it is OK not to apply the rules equally because it fits what you want. That you can claim one side is lying and spreading misinformation, without ever a thought that the extremes of both parties spread equal amounts of disinformation, is pretty funny.
I would also point out that the "side" you seem to favor talks about killing cops a lot and was in favor of riots in cities. Both of those things seem pretty violent.