Recent comments in /f/singularity

mutantbeings t1_ja5eflp wrote

Your team decides what data to even train it on. There will be sources of data that a culturally diverse team will think to include that a non-diverse team won’t even know exist. This is a well-known phenomenon in software dev: diverse teams build better software on the first pass due to more varied embedded lived experience. Trust me, I’ve been doing this for 20 years and see it all the time as a consultant, for better or worse.

1

Yuli-Ban t1_ja5dm1x wrote

There will indeed be a mass unemployment drive.

What I don't get is everyone's cyberdelic utopianism: the assumption that we all get UBI and everyone's happy.

I'm not even saying the rich will kill everyone. I'm saying "Humans don't behave that way." Humans crave stability and the status quo, and the perception that our actions matter and have meaning.

Mass automation, even with UBI, is only going to anger hundreds of millions of people who expected relative career stability. Unless you want a billion screaming Luddites, you have to account for this and offer some form of employment, no matter how BS. The shift to an automated-slave economy should not happen overnight.

Unfortunately it seems we've elected to do the stupid thing in the name of endless growth. Hopefully something decent emerges from this regardless, but I can absolutely see disaster looming as a result of technologists and managers assuming "We have the technology, so it must be used; anyone complaining just has to cope and adapt."

And then a billion screaming Luddites smash data centers and vote in anti-progress politicians. "Who could have possibly predicted humans don't like instability and uncertainty??"

17

mutantbeings t1_ja5dlm2 wrote

White folks hold cultural and political hegemony in post-colonial states, as well as historic economic privilege that continues to this day in most cases, so persecution of them wouldn’t show up as much in training data, simple as that. The dominant culture always faces less persecution than various disempowered minority groups; surely it’s obvious why that rates lower. This is kind of a convincing argument in favour of that, too, because an AI just takes in training data; it wasn’t born on one side or the other itself.

−1

Mr_Goaty_McGoatface t1_ja5dgbl wrote

As a professional software engineer who has worked closely with AI and machine learning, especially lately, you have nothing to worry about.

Since the 50s, there has been a series of solutions, nearly one every decade, that were all meant to "eliminate the professional software engineer." From languages that were meant to be so easy that business people could use them, to no-code/low-code solutions, to ML-powered coding solutions.

AI coding can be impressive, but it basically only works on pointlessly general, well-understood, extremely closed-ended problems. And even then, I've yet to see ChatGPT consistently produce anything better than CS 101 algorithms without errors or shortcomings.
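For a sense of scale, a "CS 101 algorithm" here means something like binary search (my illustrative example, not one the commenter names): a small, well-specified, closed-ended problem that coding tools can usually reproduce.

```python
def binary_search(items: list[int], target: int) -> int:
    """Return the index of target in a sorted list, or -1 if absent.

    Binary search is a classic CS 101 exercise: small, well specified,
    and closed-ended -- the kind of problem current AI coding tools
    handle best.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2  # Python ints don't overflow, so this is safe
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3
```

The subtle off-by-one choices (`<=` vs `<`, `mid + 1` vs `mid`) are exactly where generated versions of even this textbook problem tend to slip.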

Even if it improves significantly, it'll just join the ranks of every other "engineer killer" tech as a reasonable support tool that lets business people make basic software. We'd need to be talking about AGI before you'd need to be worried, and I don't care what the fanboys and hype men say: we're not even playing in the same ballpark as AGI for now.

4

TheRidgeAndTheLadder t1_ja5dax7 wrote

>Yep. And one reason it’s important we build culturally diverse teams that will minimise the intensity of bias.

How can the makeup of the team impact the data?

>This is common knowledge in the tech industry already because it shows up in all kinds of software dev and there are some really embarrassing horror stories out there about bias from teams lacking any diversity at all

The phrase is "garbage in, garbage out," not "garbage supervised by the correct assembly of human attributes."

1

mutantbeings t1_ja5c86q wrote

Yep. And one reason it’s important we build culturally diverse teams that will minimise the intensity of bias. This is common knowledge in the tech industry already because it shows up in all kinds of software dev, and there are some really embarrassing horror stories out there about bias from teams lacking any diversity at all.

1

mutantbeings t1_ja5bwou wrote

Nah, that’s not super important. In the tech industry we all know that unconscious bias affects the tech we build; it’s a super important consideration whether or not it’s conscious. It’s one reason why building a culturally diverse team matters: it minimises the intensity of unconscious bias. There are actually a lot of conscious things you can do to reduce it, but it’ll never go away completely.

0

mutantbeings t1_ja5bldb wrote

And this is THE most important point we all need to take home about AI: its values always reflect those of its creators.

And the creators tend to be greedy capitalist corporations, so I expect this bias chart to change substantially as further tweaks are made, and not for the better.

1

thecoffeejesus OP t1_ja5azdo wrote

Correct. Big numbers get bigger.

But if time is infinite, and matter isn’t, eventually all states of matter that can exist will, no matter how large that number is.

Think about it like this:

If you put an apple in a vacuum box, and let it sit there for infinite time, the apple will decay into nothingness.

But eventually, there will be a point in time when you can open the box, reach in, and grab an apple that’s exactly like the one you put in. Blemishes and everything.

It might be trillions and trillions of years from now, it might be tomorrow.

If nothing ever comes in or out of the box, the atoms that used to be the apple will cycle through every possible state, over and over, forever.

They will at some point in time be in every state they can possibly be.

If time is infinite and the box is inert, then there will be infinite points in time when you can open the box and find an apple that is in exactly the same state as the one that originally went inside the box. And every other kind of apple those atoms could make.
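The finite-matter-plus-infinite-time intuition can be sketched with a toy simulation. This is my simplification, not the commenter's: treat the box as a system that hops among a small finite set of states at each tick (a real apple has astronomically many states, so real return times would be unimaginably long).

```python
import random

def time_until_return(n_states: int, start: int, seed: int) -> int:
    """Steps until a system hopping uniformly at random among
    n_states revisits its starting state."""
    rng = random.Random(seed)
    state = start
    steps = 0
    while True:
        state = rng.randrange(n_states)  # hop to a random state
        steps += 1
        if state == start:
            return steps

# With a finite state space, every run returns in finite time --
# the toy analogue of the box eventually containing "an apple" again.
returns = [time_until_return(1000, 0, seed) for seed in range(100)]
print(max(returns))
```

The return time is finite in every run, but its typical size grows with the number of states, which is why the comment hedges with "it might be trillions and trillions of years from now."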

This is just a philosophical thought experiment, but it’s informing real world experiments.

People are working on figuring out if this is how our universe works or not.

1

Talkat t1_ja5aj44 wrote

I personally think so, absolutely. 20 years is a very long time. Tesla will start mass-manufacturing their robot in 2-3 years, and will keep all the robots produced for internal use for another 2-5 years.

So in 4-8 years they will start selling them. The robots will be connected to a data center, where a strong AI will understand the command/job given to it and then send simple instructions back to the robot.

I think self-driving will be solved well before then, so they will have lots of compute to help with training.

Connecting utilities is a walk in the park. However, I believe the housing will likely be built off-site, with options to be self-sufficient. Otherwise the robots just connect it to the utilities.

Additionally, by then you will have autonomous heavy machinery as well (e.g. bulldozers), so the AI at HQ will be able to create the plans for the entire project and then send instructions to hundreds of humanoid robots and machines, with Tesla cars/semis to move everything around.

1

coolbreeze1990 t1_ja59pkq wrote

I’m with you again but I believe that democratic socialism is about how we elect government officials and pass laws. Some of those laws regulate the economy. That economy can be capitalist.

I’m all for one vote for each person. It’s a great idea. But we still need a bureaucracy to carry out the system of enforcing laws, at the very least. Well, you need them to write the laws that we should all have the opportunity to vote on, too. I agree.

Those laws should be there to protect people and the environment, I agree. I’m proposing that the current system in the US is ineffective at regulating businesses because politicians are paid by those businesses, for profit, to regulate them ineffectively. This is not OK.

I’m saying we need a dramatic overhaul of the government we have right now. This could look like democratic socialism for sure. I agree.

But the economic system is a different topic. Are you hoping for a massive wealth distribution? If everyone makes equal pay for everything, that would kill innovation. People need an incentive to work harder to make new things and provide better services. This is what a capitalist free market economy is about.

Communist and socialist economic systems? Idk, maybe we just haven’t gotten them right yet and it’s possible. I’m open to it if that’s what the people want. But what we’ve seen so far is that, in the worst cases, they lead to dictatorships run by an elite few with an exploited working class.

In the best cases it looks more like Scandinavia. But really think about where most of the innovation in the world comes from. So much of the entertainment. So much of the best and brightest in all areas of work come from America. And this may very well be a super American point of view. It is. I love this country and what it stands for. I just think it needs big improvements in how it runs almost everything. But we can do that.

But I don’t think we need to burn it down and start over. We’re on the right track and have led the world in so many ways. We certainly have our problems, but people here are in the top 1% of the world’s population in regards to money. Ofc we all see that the billionaires and corporations need regulation. And the government needs an overhaul.

Idk. What do you think? Thanks for engaging. This is a fun conversation. I’m not super educated on a lot of this stuff, so if I’m way off, bear with me. This is just what I’ve learned so far and feel to be true for me.

1

shadowworldish t1_ja58la6 wrote

And behind the skeletons of telephone operators, elevator operators, bellhops, skycaps, and door-to-door electric meter readers, who have been seated for more than 40 years. (And the very old typewriter makers, workers in one-hour photo development labs, and Blockbuster employees who will become skeletons soon.)

And behind the dust of buggy-whip manufacturers, blacksmiths and bustle-makers who passed a century ago.

3