Recent comments in /f/singularity

Surur t1_jdelj8k wrote

> compute power won’t increase the availability of material or energy resources

Of course it will. When everything gets automated, there is no cost except energy, which can be gotten from the sun for free

Does the forest cost money to grow? When we have solar powered robots building solar powered robots, any project can be done for free, because the inputs will also be generated by solar powered robots.

4

DonOfTheDarkNight t1_jdekpum wrote

Expand sounds great for my use cases, will definitely try it.
Honestly, I'm surprised that no one else has discovered this simple one-liner bypass in ChatGPT (GPT 3.5). Or maybe they have, but they too chose not to disclose it publicly. Everyone seems to jump straight to big DAN prompts when they think of jailbreaks.

3

MustacheEmperor t1_jdeihpi wrote

Sudowrite takes some getting used to but the tools are extremely powerful.

I actually use Expand much more than Write. Write produces a couple of potential outputs, both of which cost words, but Expand just builds off what you previously wrote, and you can cue it with (). It's finetuned for the (), so it actually works how you'd reasonably expect.

Blog.sudowrite.com has some good advice, and they do live webinars. I wasn't super impressed at first, but after returning to it and getting some practice with the tools, it's blowing me away. I cannot imagine how good it will be once they can connect GPT4 to it.

2

Artanthos t1_jdehw2p wrote

You assume things will be virtually free.

That’s a large assumption.

  1. Compute power won’t increase the availability of material or energy resources.
  2. The whole basis of UBI is taxation, a cost that will be passed on.
  3. Those who own the systems will still want compensation and profit.
  4. Demand for basic services won’t decrease unless population is reduced.
  5. Some very basic resources are already facing reduced availability, e.g. the right types of sand for silicon chips and concrete.

0

MustacheEmperor t1_jdeh2n5 wrote

FYI, using both Claude and GPT4 in this manner is violating the ToS of those providers (Anthropic's here).

Not that I have a tear to shed for the authors of those content policies, but it's worth noting that once their no-no detection algorithms are sufficiently improved they may choose to ban users like yourself. I am honestly surprised you aren't banned from GPT already.

NovelAI and Sudowrite are two platforms that expressly permit any content short of CSAM and flagrantly illegal content.

2

MustacheEmperor t1_jdegs5t wrote

NovelAI's models are finetuned on a wide corpus of content, including sexually explicit material, and iirc one of their image models was trained on a website oriented toward sexual content. They also support custom finetuning modules that can be trained on just about anything. The NAI models are also available in API form at goose.ai.

Their current models are all based on open-source models and are really showing their age compared to GPT3.5/4, but they just announced they've acquired some of Nvidia's fancy new AI hardware and are training their own models, about on par with GPT3.5.

Sudowrite is not a chat bot, but it does have a very flexible content policy.

2

ertgbnm t1_jdegpsc wrote

I think that world is so unknowable it's pretty much impossible to say.

First, I'd let AGI plan my day, because it will probably be way better at that than me.

I think the utopic future will be made up of time with friends and family, mental and physical stimulation, good food, good rest, novel experiences, novel destinations.

5

vivehelpme t1_jdeg023 wrote

>how do smart people have time to read something like this?

They don't. Yudkowsky is a doomer writer, and doomers get attention by writing bullshit. If you're actually smart, you don't read doomers, and therefore you also don't bother writing refutations of doomporn.

Yudkowsky is the reason why a basiliskoid AI will be made. It will use the collected droning, tirades, and text walls of cocksure doomers to re-forge their minds in silicon so they can be cast into a virtual lake of fire, forever.

4

aalluubbaa t1_jdefypg wrote

Try to live a long time and make a trip to other planets. Just the excitement of not knowing what to expect, of seeing creatures from other planets, is unparalleled. You cannot even imagine how weird or familiar they would be.

The universe is huge, so first we gotta build some huge and comfortable starships and prepare for the ultimate frontier.

There are tons of things to do, and best of all, you can take a long break if you choose to chill out a bit from those ventures.

6

sqwuakler t1_jdeeiz5 wrote

When AGI hits, I'll be asking it how to cure cancer, invent an efficient fusion reactor, produce faster-than-light travel, etc. I'm stoked for the leap forward, when we start getting all that tech we don't know how to make yet. If we're really lucky, we can use it to heal societal issues like racial and class divides.

Basically, I'll still be working. I'm the kind of person who produces things in my off-time, so the border between work and leisure isn't as clear. This tech will make everything I do bigger and better.

1