Recent comments in /f/nottheonion

Micheal42 t1_ja3kmuo wrote

Aren't we then treating Twitter more as a technology than as a company? And aren't we considering ourselves at war with ISIS? Either way, we enact sanctions against anyone working with Russia, whom we aren't directly at war with, so why should someone who works with ISIS be treated any differently?

For me the line isn't about individuals, it's about groups. ISIS shouldn't get better treatment or exceptions in places we wouldn't make them for Russia.

−3

waheifilmguy t1_ja3jkvk wrote

Yeah but he is woke for succumbing to pressure from a left wing political group. That’s a giant problem, isn’t it? I was told it was.

Anyway, the point is that he is long since dead. He has entrusted his work to his estate. So any changes that come, come from "him" and his people and the changing times we live in.

Recent Broadway productions of Carousel and Porgy and Bess had rewrites to remove domestic violence and racial content. I'm sure the long-deceased authors would have said "bullshit!" as they would have a right to, but that is neither here nor there. Do you want to let these shows remain current, or do you want them to disappear into the dustbin of history?

I am not answering the question, I am asking it. There are rarely easy answers when it comes to art. I have a film from 2007 that has explicit racial invective in it. I know I have to take it out, even if it completely makes sense for the characters and the story (neo-Nazis...), because I know no one wants to hear people yelling the N word in a movie in '23.

So here we are, riding the crest of the wave of changing times, trying to find a balance.

0

I-do-the-art t1_ja3gj32 wrote

It has the potential to affect cow, chicken, pig, etc… corporate farm profits. No bueno for corporate controlled governments like the US.

More seriously though, one of the lies commonly stated as to why you shouldn't eat these meats is that they are unregulated and can be filled with unknown and hazardous chemicals, which is true. But that's a misdirection, because they could regulate them and then that wouldn't be a problem at all.

1

dpdxguy t1_ja3gig0 wrote

I agree that this was probably a training issue. The article implies as much. I'll also note that the article says that in prior similar situations, the employee involved did the right thing, making it sound like this was a one-off situation.

3

Hand-Picked-Anus t1_ja3g1l1 wrote

Right? Sat on hold with Verizon for an hour and a half before they even answered the other day. Ended up having to call them back four times over the last week, and every time it was an hour wait at least, BEFORE they even picked up.

They're lucky that there aren't many Volkswagen owners out there.

2

Hand-Picked-Anus t1_ja3fm6c wrote

They will ALWAYS blame the employee in these situations. I would be amazed to find out that the employee even knew an emergency option existed. We are talking about some poor kid in an Indian call center, 90% odds. It's very unlikely his software even lets him do anything other than ring people up or save whatever data they've handed over. Allowing low-tier employees the ability to just hand over location data is asking for trouble. At worst, the employee probably should have referred him to someone higher up and failed to do so.

2

HaikuKnives t1_ja3f435 wrote

The argument is that a platform can't be held responsible for content generated by users, even if those users are themselves reprehensible. Take away this protection, and swap out ISIS for any other group whose speech others might object to (LGBTQ+ rights activists, the Communist party, US secessionists, garden-variety racists, drag show promoters, OnlyFans content creators, etc.), and you'll see that platforms will be forced to turn off user content or moderate it so heavily that nothing even advertiser-unfriendly would be allowed on the platform.

I condemn ISIS, and their message. I do not condemn the medium that ISIS used.

8