Recent comments in /f/technology

hodor137 t1_j9x0ilo wrote

Not true at all. Encryption that isn't designed and actually implemented to be fully sender-to-receiver can easily be subverted and read by 3rd parties. In the messaging/Signal/WhatsApp context people refer to it as "end to end encryption", but that term by itself doesn't really guarantee anything.

I'm not sure how exactly Signal and these other messaging apps implement their encryption, but they could easily claim end to end encryption while offering governments a "back door" to decrypt and read everyone's messages. Signal is saying they won't do that.

I've never bothered to use Signal, but you either have to trust their word, or they have to do a really good job of proving to you that only the end users control their own private encryption keys. From everything I've heard, including this, they're great and trustworthy - but you still have to trust them.
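The property being described here, that only the endpoints ever hold the private keys, can be sketched with a toy Diffie-Hellman exchange. To be clear, this is purely illustrative: Signal's real protocol (X3DH plus the Double Ratchet) is far more involved, and the parameters below are nowhere near real-world security.

```python
import secrets

# Toy Diffie-Hellman key agreement, for illustration only.
# Real E2EE systems use vetted groups/curves and much larger parameters.
p = 2**127 - 1   # a Mersenne prime; fine for a demo, not for security
g = 3

a = secrets.randbelow(p - 2) + 2   # Alice's private key, never leaves her device
b = secrets.randbelow(p - 2) + 2   # Bob's private key, never leaves his device

A = pow(g, a, p)   # public values: this is all a relay server ever sees
B = pow(g, b, p)

# Each side combines its own private key with the other's public value.
alice_shared = pow(B, a, p)
bob_shared = pow(A, b, p)
assert alice_shared == bob_shared   # same secret, derived independently
```

The point is that the server in the middle only ever relays `A` and `B`; without one of the private keys it can't derive the shared secret, which is the claim you'd want Signal to prove.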

−35

LiberalFartsMajor t1_j9wz6zq wrote

Right off the bat, this is wrong.

>Google and other big tech financially benefits and exploits the content of others for that financial benefit, often within highly controlled and manipulated eco systems.

No they don't. Any website can direct Google not to index it if it doesn't want to be linked - though that would destroy its traffic from Google. These media companies want to have their cake and eat it too, when what they should be doing is dying in a corner quietly.
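For what it's worth, the opt-out mechanism is just robots.txt, and you can see how it behaves with Python's standard-library parser. The domain and rules below are hypothetical:

```python
from urllib import robotparser

# A hypothetical robots.txt that blocks only Google's crawler.
rules = """
User-agent: Googlebot
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://example.com/article"))  # False
print(rp.can_fetch("Bingbot", "https://example.com/article"))    # True
```

Any publisher that genuinely didn't want Google linking to it could serve exactly this; the fact that they don't is the point being made above.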

25

A1kmm t1_j9wxxff wrote

ChatGPT is a language model, optimised for finding a suitable output text for a given input text. It is trained on natural language understanding and processing - its input is characters, but words, grammar, and basic logic/facts are emergent properties.

It can memorise times tables and solve basic maths problems, but it can't devise an approach to solve larger problems (it can't even add and subtract larger numbers in combinations it hasn't seen before, even ones that would be trivial for humans).

None of that makes it very good for tasks like controlling a drone (which would be heavily about image processing) compared to a human.

Other developments in AI from the 2010s, such as those in the image classification space, would help a lot more for that application.

7

A1kmm t1_j9wwzic wrote

Although attributing ransomware is difficult, everything that has been leaked and is public suggests most of the perpetrators are in CSTO (i.e. Russia-allied) countries, which at least informally encourage attacks on non-CSTO countries. Leaked policies from criminal organisations suggest they generally do not target victims in CSTO countries. CSTO countries rarely have extradition treaties outside the CSTO - no CSTO country has an extradition treaty with the United States, for example. Authorities do sometimes work together despite the absence of a treaty when they are aligned (e.g. Armenia has extradited to the US before) - but that is unlikely to happen for ransomware criminals who only target victims outside the CSTO.

So I don't think they need immunity from their own government, and they don't fear extradition as long as they don't go to a non-CSTO country. Sometimes they do travel overseas and find out that the government tolerance for their activities doesn't extend outside the CSTO.

Data leaks from criminal organisations to non-CSTO governments (combined with what those governments collect and share themselves) make it much more likely that the criminals get picked up if they do travel.

7