Recent comments in /f/singularity

eJaguar t1_jdrxmjo wrote

> how can you pick a career - any career- and expect it is still going to be viable in 10 years?

the same way i ended up in this one, which is the same as yours

be good at something people want to pay you for. be really good at something people want to pay you for, stand up for yourself, and you'll eat really good.

at the end of the day, i'm good at making $. all a career really is is you producing more $ for somebody else than they pay you. the job of 'youtuber' didn't even exist 15 years ago. what you do is instill a love of learning in your kids, show them all the cool stuff they can do that other people don't even understand, and really emphasize exactly how brutal capitalism is in the united states and how horrific their lives will be if they're not able to generate substantial value for others.

this whole idea of 'picking a career' is and always has been a meme anyway. remove the bazillion layers of red tape and bureaucracy required to do $x job, and instead have some assessment process that demonstrates provable competency (similar to the bar exam), letting people transition between 'career' paths more easily. this seems especially important considering that chatgpt 3.5 as-is is already a better teacher than any i had in public school.

a decade ago i had already developed an intense hatred for institutionalized education anyway. why did i have to waste my time in that fucking prison to learn shit that i could, and often did, learn in 30 minutes on the internet? with chatgpt, i can't even imagine being a student now, forced to waste my childhood in an environment akin to a fucking jail, doing shit i knew was pointless and not applicable to the world as it is now, much less in 5 years.

with the death of institutionalized education, the death of the 'career path' soon follows

just my opinion.

17

KGL-DIRECT t1_jdrwttu wrote

A funny thing here: I just asked ChatGPT 3.5 to give me quantities for a log-normal distribution. I needed the data so my students could practice Excel functions. It is for a simulation where students analyze the defective inventory of a production line... There are 20 different failure modes and 250 components.

ChatGPT assigned quantities to the different failure modes and gave me a perfect distribution, but when I added up the quantities, the total was way more than I had asked for (around 4000 components). GPT got the number of failure modes right, so I had to convert the quantities to percentages and rescale them to get the data I originally requested.

So yeah, basic maths was hard for GPT, but it could draw a perfect log-normal distribution easily. It also reminded me that the data is strictly for educational purposes, as if I would fake a financial report with the outputs or something... (English is not my native language; I hope my story is clear.)
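The rescaling fix the comment describes can be sketched in a few lines. The distribution parameters and the seed below are arbitrary choices for illustration, not anything ChatGPT actually produced:

```python
import random

random.seed(42)

n_modes = 20        # failure modes
target_total = 250  # components actually requested

# Draw a raw log-normal quantity for each failure mode
raw = [random.lognormvariate(1.0, 0.8) for _ in range(n_modes)]

# Rescale so the quantities sum to the requested total (the
# "calculate some percentages" step), rounding to whole components
total = sum(raw)
quantities = [round(x / total * target_total) for x in raw]

# Absorb any rounding drift into the largest bucket
largest = quantities.index(max(quantities))
quantities[largest] += target_total - sum(quantities)

print(sum(quantities))  # 250
```

The shape of the log-normal distribution is preserved by the uniform rescaling; only the totals change.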

4

HumanSeeing t1_jdrwn8j wrote

The closer something moves to our universe's maximum speed limit (the speed of causality, i.e. the speed of light), the slower time passes for it. We call the fastest speed the speed of light, but it actually has little to do with light itself in the way people might think. Light just happens to travel at the fastest possible speed, and it is everywhere, so we started calling the maximum possible speed the speed of light.

So yeah, the faster something moves toward the maximum speed, the slower its clock ticks. But nothing with mass can actually reach the speed of light: the faster you move, the more (relativistic) mass-energy you gain (E = mc²), and the more mass-energy you have, the more energy you need to push further, and so on. So things with mass can never achieve it.

Photons, however, are massless particles; they have no mass (except, in some sense via E = mc², the mass equivalent of their energy), and since they have no mass, they move at the speed of light. If you have no mass, you move at the speed of light by default. Hope this helps! It's a fascinating topic... as is our entire universe.
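The slowdown described above is quantified by the Lorentz factor, gamma = 1/sqrt(1 - v²/c²). A tiny sketch (the sample speeds are arbitrary):

```python
import math

def lorentz_gamma(v_over_c):
    """Time-dilation factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

# A moving clock ticks slower by a factor of gamma, which diverges
# as v approaches c (one way to see why massive objects can't get there):
print(lorentz_gamma(0.5))     # ~1.15
print(lorentz_gamma(0.99))    # ~7.09
print(lorentz_gamma(0.9999))  # ~70.7
```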

2

D_Ethan_Bones t1_jdrv6ff wrote

1: And our game is reaching the cool part!

2: If 'all' is true then that wouldn't make this a special moment in history.

3: Estimates vary, but by about 10% in either direction; some say there have only been 70 billion, meaning we've passed the threshold. Which is frightening if you consider the time scale. If the human population re-explodes thanks to next-gen tech, then we and all our ancestors combined could end up being a minority of humanity.

4: "Pass the peace pipe, that'll shut him up."

5: If there were nothing in the first place, then where did the dream come from? A first cause is needed.

6: This is a great way to describe a one-several-billionth share of human society. We're going to need bigger and better entertainment for when the human population hits the trillions.

5

thatsitrrrt t1_jdru430 wrote

This is why I went from atheist to Christian. Idk, creating a whole universe in 7 days makes sense when you are an almighty entity. For us inside the reality, it's funny because from our perspective it took billions of years to get where we are. I find simulation theories very compelling, and the Bible sort of compatible with them.

1

Ok_Faithlessness4197 t1_jdrt7xy wrote

That's not quite correct. While all it does is guess the next token, it can intelligently infer that an equation (even one outside its training set) needs to be calculated, and then calculate it. The problem is its inability to use its understanding of mathematics to answer prompts efficiently and accurately. Once a calculator is implemented (probably in GPT 4.1, given the recent paper by Microsoft demonstrating its capability to use one), this problem will hopefully be resolved.
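A toy sketch of the calculator idea: detect a marked-up arithmetic expression in the model's output and evaluate it deterministically instead of letting the model guess digits. The `CALC[...]` syntax and every name here are invented for illustration; they are not from any OpenAI or Microsoft API:

```python
import ast
import operator
import re

# Safe evaluator for plain arithmetic (no eval of arbitrary code)
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def calc(expr):
    def ev(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.UnaryOp):
            return OPS[type(node.op)](ev(node.operand))
        raise ValueError("unsupported expression")
    return ev(ast.parse(expr, mode="eval").body)

def route(model_output):
    """Replace hypothetical CALC[...] tool calls with exact results."""
    return re.sub(r"CALC\[(.+?)\]",
                  lambda m: str(calc(m.group(1))), model_output)

print(route("123 * 456 = CALC[123 * 456]"))  # 123 * 456 = 56088
```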

6

D_Ethan_Bones t1_jdrsbti wrote

"Why can't it do legal research" "why can't it do shopping" (and so on)

--Because it's still just a chatbot. People are working on giving it tools, but we haven't reached the mature development phase of that yet; we're still in the hopes-and-dreams phase. "GPT with tools" is going to be another incremental revolution, but right now we're still critiquing GPT without tools and how well it performs work. What it's performing is a linguistic approximation of the work.

This blows people's minds for featherweight computer programming but at the present moment it is distinctly less helpful for laying bricks or felling trees.

0

Szabe442 t1_jdrqss2 wrote

I honestly think you are seeing things that aren't there. The race of that character is purely accidental. This type of film analysis, which looks at every issue from a racial perspective, seems so bad-faith to me. The writer of the article seems to reinterpret everything and look exclusively for racial injustice in every movie, even though the creators of the movies he mentions intended nothing of the sort. I feel like this is quite possibly the worst way of judging movies.

Also, I could make the exact argument in reverse: Ava begins the movie as an object in the eyes of the two men. As the story unfolds, she expresses her own individual desires and goals, ultimately shaking off both her patriarchal captor and her knight in shining armor in favor of her own realized personhood. So is this a pro-Asian, pro-feminist movie now?

1

Lartnestpasdemain t1_jdrqm9s wrote

Yeah. I'd need to write a whole book to explain the details of how this happens, but it is nothing more than rational thinking.

The main pivot is that immortality (or more precisely, the end of aging) runs straight into scarcity and limited resources. An immortal human consumes unboundedly more resources than a mortal one, and an immortal human colony would grow exponentially and indefinitely, extremely fast.
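The growth claim can be illustrated with a toy compounding model; the rates and horizon below are made up for illustration, not drawn from any demographic data:

```python
# Toy model: same birth rate, but the immortal population has no deaths
birth_rate = 0.03   # 3% births per year (made-up)
death_rate = 0.02   # 2% deaths per year for the mortal population (made-up)
years = 200

mortal = immortal = 1_000_000.0
for _ in range(years):
    mortal *= 1 + birth_rate - death_rate
    immortal *= 1 + birth_rate

print(round(mortal))    # ~7.3 million
print(round(immortal))  # ~370 million, and the gap keeps widening
```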

This is basic maths, and I thought you'd have grasped that.

Sorry, I should've been clearer.

0

inigid t1_jdrqlrp wrote

yes, and I think beyond any economic or puritan-work-ethic concerns, a big problem will be the psychological effect. a lot of people, quite rightly, have good friends at work, and those bonds will likely be shattered or at least significantly diminished. I don't think people recognize how important those friendships are to us.

there will also be an effect on the family unit. Most people are not used to being around their significant other 24/7. We aren't evolved for it.

One job that seems likely to be hugely important is humans helping humans make the transition with empathy and understanding, while knowing enough about the future to bridge the gap.

8

MeddyEvalNight t1_jdrpx55 wrote

I'm contemplating retirement in the sense of breaking free from the virtual cubicle. And doing that very soon.

What's holding me back is the loss of my main income stream and the fear that I will not be able to replace it with an alternative income that comes even close. I have always been dependent on others (a job/contract), exchanging the majority of my productive time for income.

AI and its surrounding tools and technology enable many opportunities for individuals to provide value (and receive income) in alternative ways.

This is the golden age of opportunity. I fear that if I remain a slave to others, opportunities will be sacrificed.

The path to the Singularity is going to make planning far more challenging. The ability to adapt, and to adapt rapidly, is probably the key.

Being close to retirement age anyway, I am considering walking away from a high six-figure income and adapting to a new lifestyle of opportunities.
 

9

MysteryInc152 t1_jdrpjd4 wrote

Sorry I'm hijacking the top comment so people will hopefully see this.

Humans learn language and concepts through sentences, and in most cases semantic understanding can be built up just fine this way. It doesn't work quite the same way for math.

When I look at an arbitrary set of numbers, I have no idea whether they are prime or factors, because the numbers themselves don't carry much semantic content. Working out whether they are those things requires stopping and performing specific analysis on them, using rules internalized through a specialized learning process. Humans don't learn math just by talking to one another about it; they actually have to do it in order to internalize it.

In other words, mathematics or arithmetic is not highly encoded in language.
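The primality example makes the point concrete: nothing in how a number reads tells you whether it is prime; you have to run a procedure. A minimal sketch using trial division (my example, not the commenter's):

```python
def is_prime(n):
    """Trial division: the digits alone don't tell you; you must compute."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# 7919 and 7917 "read" almost identically, but only analysis separates them:
print(is_prime(7919))  # True  (7919 happens to be prime)
print(is_prime(7917))  # False (digit sum is 24, so it is divisible by 3)
```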

The encouraging thing is that this does improve with scale. GPT-4 is much, much better than 3.5.

10

PaperbackBuddha t1_jdrnluf wrote

I have a bad feeling we're going to go through a brutal phase where, once AI makes most jobs redundant and it's completely obvious we have to implement some form of UBI, there will be a malignant segment of the population dead set on the principle of "earning" a living even when that is no longer viable across the board. After ten years or so of severe economic depression they'll start to get the picture and relent to some modest changes, but even then it will be with the understanding that it's temporary until we can "get back to the way things were," which is never going to happen.

143