Recent comments in /f/Futurology
SupremeEmperorNoms t1_j9fgfja wrote
Reply to How good the US will be for living in future for those who will be earning decent?? by [deleted]
At the rate it's going, we're looking to end up in the same situation as The Outer Worlds. It's not really going to be a dictatorship; it's going to be a corporate wonderland. The upper middle class and below will essentially be born into debt and be required to work it off as it passes down from generation to generation, while the highest class grows its generational wealth ad infinitum. That lasts until the day some serious economic collapse upends the entire system, resulting either in mass loss of life as the top class takes desperate measures to maintain its way of life, or in a war fought between corporate entities just to lower the number of mouths to feed, with the remaining population then worked at a higher pace to ensure no loss in productivity.
Marvy_Marv t1_j9fgc9m wrote
Reply to comment by DependentDemand1627 in How good the US will be for living in future for those who will be earning decent?? by [deleted]
Smoke grass got it.
mapadofu t1_j9fg0ed wrote
Reply to How good the US will be for living in future for those who will be earning decent?? by [deleted]
I figure US decline will go slowly, then all at once, and it’s virtually impossible to know how long the slow phase will last; it could easily last through my lifetime.
Iffykindofguy t1_j9ffykr wrote
Reply to comment by Pwnysaurus_Rex in How good the US will be for living in future for those who will be earning decent?? by [deleted]
Really? Both sides are trying to turn us into a Christian theocracy?
Iffykindofguy t1_j9ffv86 wrote
Reply to comment by TulpagenicUNISS in How good the US will be for living in future for those who will be earning decent?? by [deleted]
I agree, Dems and Republicans prior to the mid-2010s were the same party. But the Democrats are being taken over at the local level, and at least pretend to be at the federal level. Republicans will turn us into a Christian theocracy; Democrats want universal healthcare, maybe. You're the one trying to get internet points here, boy. Stay in your lane or start bringing facts.
dragonblade_94 t1_j9ffhu2 wrote
Reply to comment by 69inthe619 in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
Are we talking physical pain, like getting stabbed, or emotional grief?
If the former, there's no reason to think a machine cannot be designed to detect pain. In organics, it boils down to nerve endings sending a signal that translates to "whatever I am currently experiencing is bad." We already have capacitive sensing tech; it wouldn't be all that exceptional to put it on the exo-layer of a robot and have it move away from anything that applies enough pressure/heat/charge/etc.
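As a toy sketch of that reflex (all thresholds and names here are made up for illustration, not any real robotics API): a sensor reading past a threshold plays the role of a nociceptor firing, and the controller responds by moving away.

```python
# Hypothetical "pain" thresholds per sensor type (illustrative units).
PAIN_THRESHOLDS = {"pressure": 50.0, "heat": 60.0, "charge": 5.0}

def pain_signal(readings):
    """Return the names of sensors whose readings exceed their threshold."""
    return [s for s, v in readings.items()
            if v > PAIN_THRESHOLDS.get(s, float("inf"))]

def react(readings):
    """Translate 'this is bad' signals into an avoidance response."""
    hurting = pain_signal(readings)
    if hurting:
        return f"move away (pain from: {', '.join(sorted(hurting))})"
    return "continue"

print(react({"pressure": 70.0, "heat": 20.0}))  # → move away (pain from: pressure)
print(react({"pressure": 10.0, "heat": 20.0}))  # → continue
```

Whether routing around such a signal counts as *feeling* pain is exactly the philosophical question in this thread, but the detection part is trivially buildable.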
If the latter, that's just ground-level circular reasoning: "Robots can't feel because you need to feel to feel, which robots cannot do."
Poncho_au t1_j9fff22 wrote
Reply to comment by VintageChemistry in Starlink’s “Global Roaming” promises worldwide access for $200 a month by ethereal3xp
It's fundamentally impossible for them to be the #1 carrier globally. That's due to physics: a wireless communication medium can't come close to competing with a fibre-optic medium on capacity, so in high-population-density areas where fibre makes sense, they'll have low customer counts.
In rural areas though they could absolutely become the #1 ISP.
TulpagenicUNISS t1_j9fetdc wrote
Reply to comment by Iffykindofguy in How good the US will be for living in future for those who will be earning decent?? by [deleted]
It is really not. While both sides are not the same on the surface, both Republicans and Democrats participate in insider trading on the stock market, and both use capital to steer political decisions away from what's best for the people. The Republicans and the Democrats have been the same party for nearly a century; it wasn't until recently that they diverged enough to have actual disagreements on things. The belief that Democrats will save you from Republicans, or Republicans from Democrats, is literally taking the steps to hand authoritarians the reins of government. Which, I mean, cool I guess, you get internet points for not being the other guys, when you should not want to be any of them.
Peace-Bone t1_j9fer5r wrote
Reply to Future Evolution of Humanity by Calm_Replacement8133
This article is making a LOT of assumptions, many of which are not well founded. First, it largely assumes that society as it is will still exist without huge changes and that genetic engineering will not be a major factor. That's absurd, but a sane assumption to make for the sake of having a coherent article.
Beyond that, its assumptions are based too much on a pop-culture "caveman vs. modern world" mentality. It assumes that "anxiety and aggression" are selected against in a developed world, which isn't well founded, and it assumes that anxiety and aggression are distinct, observable traits.
Furthermore, it conflates "societal beauty standards", "personal sexual preference", "person people want to breed with", and "person people are likely to breed with". Those are all separate groups. The first is a political thing more than anything, divorced from the rest, and the second is a far cry from the third and fourth.
And the assumption that gender differences will become more pronounced is not founded at all; if anything, we're in a trend of gender lines becoming more blurred. It also seems in direct opposition to the article's own vague notion that people will become "more standardized", which itself seems to be the opposite of what's likely to happen.
bremidon t1_j9feode wrote
Reply to comment by jfcarr in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
Perhaps.
Or perhaps a sophisticated-enough AGI genuinely feels emotions.
What is our moral duty here?
Pwnysaurus_Rex t1_j9fehag wrote
Reply to comment by Iffykindofguy in How good the US will be for living in future for those who will be earning decent?? by [deleted]
But Both sides are rightwing
TulpagenicUNISS t1_j9feeed wrote
Reply to How good the US will be for living in future for those who will be earning decent?? by [deleted]
America is already run by a collection of oligarchs; they buy up all the politicians or run for political seats themselves. The ideas of aristocracy, feudalism, dictatorship: it's all short-sighted. These are hyper-capitalist entities that really don't give two flying flips about political alignment. All they care about is profit and money. The sooner people realize that we aren't ever going to defeat them by having elections every 2 years, the sooner we might make some progress.
bremidon t1_j9feat9 wrote
Reply to Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
>Would the most sentient ai ever actually experience emotion or does it just think it is?
When you find yourself asking a question like this, change the sentence around to reference people and see if you can give a clear answer. Like this:
Do people ever actually experience emotion or do they just think they do?
And you have now just wandered into some extremely deep waters. Even if you can convince yourself that *your* emotions are real, how do you know that anyone else actually *feels* emotions? Maybe you are the only one.
And once you have thought about this long enough, you are almost certainly going to realize: we will never know for sure.
And that leads to the next really troublesome question: what are we going to do about it? Should we give digital agents the benefit of the doubt?
And even though I always say "there's always one in every crowd," the disclaimer never seems to help; it's like they can't help themselves. Still, here it is: I do not think that any current digital agent is conscious, feels things, or anything of the sort. I just cannot claim to be entirely certain of my reasons for thinking so.
And to the folks who heard a YouTube video about how transformers work and think that explains everything: it does not. We have a pretty good idea of how brain cells work in detail, but we have no idea how we get from some chemicals and potentials to consciousness. So just knowing how the building blocks work does not necessarily mean you have any insights as to how the system works. Emergent behavior is a thing.
theironlion245 t1_j9fe3zf wrote
Reply to comment by midnitelux in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
How could you harm ChatGPT? It doesn't feel pain, it doesn't get injured, it doesn't die, and it can replicate itself indefinitely.
There are zettabytes of storage around the world and a massive World Wide Web; if it had access to the internet, ChatGPT could hide itself tomorrow, and it would be near impossible to find.
An AI advanced enough would be virtually impossible to kill. So no, it doesn't need emotions, and no, the entire human species as a whole wouldn't represent any danger to it.
math_debates t1_j9fdoj2 wrote
Reply to comment by DiamondsJims in When will genetic engineering be available for psychiatric disorders? by undefined2937
That's more easily accomplished by giving "volunteers" drugs that make them uncomfortable when they don't have them. Negative behavior reinforcement therapy over time and the right drugs and their survival is programmed to your happiness.
Rinse and repeat. Put them to work.
Semifreak t1_j9fdf9h wrote
Reply to comment by Freed4ever in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
Heh.
We've seen movies where machines 'fear' being unplugged, but that's nonsense. You can tell an AI to erase itself and it would. Why would it have any preference whether or not electricity is moving through its transistors? 'Death' and 'life' are meaningless to artificial machines.
substituted_pinions t1_j9fd9mp wrote
Oh, and by the way, there’s also protomolecule in there apparently.
krichuvisz t1_j9fcfte wrote
Reply to Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
I think emotions are a necessary link between body and mind. Without a body, there's no need for emotions, which exist to help your physical body survive.
crawling-alreadygirl t1_j9fca5v wrote
Reply to comment by 69inthe619 in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
Care to elaborate?
rawrc t1_j9fbzrj wrote
Reply to comment by nolitos in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
I don't want my sex-bot to fake it, otherwise I'd just keep having sex with my gf
AltCtrlShifty t1_j9fav0t wrote
Reply to comment by moseg in A pilot scheme to trail the four-day workweek in Britain by esprit-de-lescalier
It’s like that on the website too. No one copy edits their AI articles.
midnitelux t1_j9fadqm wrote
Reply to comment by theironlion245 in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
A sentient robot would still benefit from detecting danger, and if needed it would have to create bonds to survive. It may not need food, but unless it was programmed to not care about itself, it would definitely not want to die.
BetoBarnassian t1_j9fac28 wrote
Reply to Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
We would need a good physical definition of what emotions are in a general sense. I think emotions are simply an impetus to behave in a certain way. How we act is some weird aggregated calculus of all the different things we want and don't want, with varying degrees of intensity. In this sense emotions are more fundamental than the idea of being "happy", "sad", or "angry"; they are simply behavioural expressions used to get what we want.
Why do people get frustrated? Usually because they have to deal with stuff they don't want to. What does frustration do? It motivates people to leave a situation or change it. When we enjoy things, we usually seek more of that thing. Yet life is complicated, and we have to balance many desires and wants against others, leading to situations where we do things we don't want in order to get things we do.
So long as you can program in a way for an AI to have goals/wants/desires/priorities, then emotions (imo) are simply the attempt to achieve those goals, fulfil those desires, etc. Will they feel happiness or sadness in the same way we do? Probably not: they don't mimic human biology and are unlikely to be made to, so there will be differences in emotional expression. But I do think they will have analogous expressions that serve similar purposes.
This is just my quick 2 cents. I'm sure there are decent arguments to be made against this point, but I think it's a reasonably valid premise.
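To make that "aggregated calculus" idea concrete, here's a toy model (all names and numbers are purely illustrative): each candidate action is scored by how much it advances each desire, weighted by that desire's intensity, and the highest-scoring action wins.

```python
# Desires with intensities in [0, 1] (hypothetical values).
desires = {"rest": 0.7, "finish_work": 0.9, "socialize": 0.3}

# How much each candidate action advances (or sets back) each desire.
effects = {
    "nap":          {"rest": 1.0,  "finish_work": -0.5, "socialize": 0.0},
    "keep_working": {"rest": -0.4, "finish_work": 1.0,  "socialize": 0.0},
    "call_friend":  {"rest": 0.1,  "finish_work": -0.2, "socialize": 1.0},
}

def choose(desires, effects):
    """Pick the action with the highest intensity-weighted payoff."""
    def score(action):
        return sum(desires[d] * w for d, w in effects[action].items())
    return max(effects, key=score)

print(choose(desires, effects))  # → keep_working
```

In this frame, something like "frustration" would just be the behaviour produced when a strongly weighted desire keeps scoring negatively, pushing the agent to leave or change the situation.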
ImOnRedditMaaan t1_j9fgog3 wrote
Reply to comment by NegotiationSea7008 in Would the most sentient ai ever actually experience emotion or does it just think it is? Is the thinking strong enough to effectively be emotion? by wonderingandthinking
You'd have to define emotion at that point.