re: Bret Weinstein now has 2 strikes against him on YouTube

Posted by Sneaky__Sally
Member since Jul 2015
12364 posts
Posted on 6/16/21 at 10:30 pm to
quote:

Ultimately it's still their site, their choice. I'm not taking their side, simply explaining it is what it is.


Let me stop you right there.

Is it your belief that anything that is legal is acceptable? If it's legal, then no harm, no foul?

This is terrible on so many levels. There are issues where they set a low bar that is selectively enforced. They prevent competitors from thriving by colluding to get them kicked off hosting providers. They are potentially doing something very dangerous by blocking the discussion of treatments during a pandemic. They are intertwined with politics and have a lot of influence over the political process. Their influence and manipulation dwarf anything the Russians did, if that matters to you. The list of reasons why this is all bull shite is long.



Not to mention that it is a very young technology - so perhaps we will need to adjust how we treat it. I'd like to see more discussion around defining a concept like "social pollution," or some phrasing like that, where we figure out which underlying features of these social media sites (for example, looking at what their content recommendation algorithms are actually designed to do) are causing large societal problems, and whether there are any limitations that could be employed that would result in a better outcome for the country and society as a whole.

They have so much money to put forth and can influence people's behavior patterns - particularly looking at large groups - so first I would like for us to get people in office, or at least advisors, who can actually get their heads around what is going on underneath the surface and how designs could be tweaked.

For example, if we have content recommendation algorithms that are in some way helping radicalize people as a byproduct of their function to maximize screen time / logged-in time, is there a way they can be adjusted toward a more acceptable outcome?

Texting and driving is another - we certainly need self-responsibility and parental control, but when you have something with billions of dollars behind it designed to keep people clicking, that seems to conflict with the idea that we want our 16-year-old drivers, whose brains are not fully developed, not to text and drive.

Those are just a couple of easy examples - but to circle back here, it would seem to me that (1) we need clear rules, uniform application, etc., and (2) to avoid my "social pollution," when an issue comes up with a public figure who has a large audience, we need some appearance of a non-partisan, non-corporate-controlled understanding of exactly what was done to warrant removal, and have it be clearly separate from and above similar actions by other people who have not been removed.

This starts to conflict with the entire political and media apparatus, which wants to push conflict, existential threats, and s*** like that because it generates interest, views, donations, urgency, etc. - but they don't want it pushed so far that something bad (like Jan. 6th) happens.

I think you see similar stuff on Twitter, where they don't actually want to enforce the policies that would limit people getting riled up, as that would reduce usage of their platform.

I don't really have a strong grasp of what all of that would look like, but I think it would help if we could see attitudes and discussion shifting to ask how we can guide this relatively new technology (social media) in a manner that still generates activity while reducing the negative outcomes that are already becoming apparent in the fairly short time we have been exposed to it en masse.

ETA: I know, tl;dr
This post was edited on 6/16/21 at 10:31 pm
Posted by Oilfieldbiology
Member since Nov 2016
37733 posts
Posted on 6/17/21 at 5:45 am to
quote:

For example, if we have content generation algorithms that are in some way helping radicalize people as a byproduct of their function to maximize screen time / logged in time, is there a way that can be adjusted to a more acceptable outcome.


The algorithms aren’t designed to divide. People are just much more likely to remain engaged with stuff they agree with, hence you continue to watch people you agree with on these platforms as opposed to people you disagree with.

Soon enough, all you are seeing is people who reconfirm your worldview and opinions, while people who disagree with you only see things that reconfirm their worldview.

It’s not designed to happen that way; it’s just the unintended consequence of trying to keep you engaged for ad revenue.
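To make the point concrete: the feedback loop described above can be sketched in a few lines of Python. Everything here (the engagement model, the numbers, the function names) is a toy assumption for illustration, not any platform's actual ranking code - the only thing the ranker optimizes is predicted engagement, yet the feed still narrows.

```python
import random

# Toy assumption: users engage more with items close to their own view.
def predicted_engagement(user_lean, item_lean):
    return 1.0 - abs(user_lean - item_lean)

# Rank purely by predicted engagement (a stand-in for ad revenue).
# "Division" appears nowhere in the objective.
def rank_feed(user_lean, items):
    return sorted(items,
                  key=lambda lean: predicted_engagement(user_lean, lean),
                  reverse=True)

random.seed(0)
items = [random.uniform(-1.0, 1.0) for _ in range(1000)]  # item "leanings"
user = 0.8                                                # a strongly leaning user

top10 = rank_feed(user, items)[:10]
# The entire top of the feed clusters around the user's existing position.
print(all(abs(lean - user) < 0.1 for lean in top10))
```

The design choice that matters is the objective: nothing in `rank_feed` targets polarization, but maximizing a similarity-flavored engagement score is enough to make the top of the feed converge on what the user already believes.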
Posted by RogerTheShrubber
Juneau, AK
Member since Jan 2009
263293 posts
Posted on 6/17/21 at 12:42 pm to
Yes, it's a private business, but we have antitrust laws that we are selectively ignoring here.
