Recent comments in /f/technology

Hrmbee OP t1_jaezir8 wrote

>ShrimpApplePro reports that accessories like AirPods and cables are already being manufactured overseas based on the standard. Any cables that aren’t MFi-certified will be “limited in data and charging speed.”
>
>What does MFi stand for? Well, it’s “Made for iPod,” which isn’t a device that exists anymore (RIP to the mp3 players of yore), but the certification program was implemented back in 2005. Apple expanded it when the iPhone and iPad were introduced and rebranded it as MFi in 2012 after the iPhone 5 adopted the Lightning standard—remember going from 30-pin connectors to Lightning connectors? What a journey it’s been. In addition to helping standardize cables, MFi certifies all sorts of gadgets and accessories to label what’s safe for Apple users, including headphones, speakers, and even smart home devices. The only caveat to this program is that accessory makers have to pay a licensing fee of about $100/year. It only applies to manufacturers of electronic accessories, however, particularly those that don’t utilize an Apple standard like MagSafe.
>
>While it’s easy to see this as another way that Apple is sealing in its walled garden, Android manufacturers practice the same exclusivity with charging cables. OnePlus, under the Oppo brand, uses the red cable motif for its charging standard. The brand has long offered a faster charging specification than the rest of the Android brood within its ecosystem. And now that it’s adopted SuperVOOC, buying the right cable and adapter is essential to reach full 80W charging speeds. Its latest release, the OnePlus 11, can charge fully in about 30 minutes with the cable and adapter included in the box.

One of my ongoing frustrations with cables and connectors in computing more broadly is the proliferation of standards that use the same physical plugs. Without clear markings, it's often impossible to know what a cable is capable of until you plug it in - and even then it isn't always clear. Manufacturers should be doing more to ensure a consistent experience for all users here.
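To make the point concrete, here's a minimal sketch of the ambiguity: several specs share the USB-C plug while differing enormously in throughput. The figures are simplified nominal maximums from the published USB specs, and the lookup table and helper function are hypothetical, purely for illustration:

```python
# Illustrative sketch: different specs share the physical USB-C plug, so two
# visually identical cables can differ enormously in capability.
# Figures are simplified nominal maximums from the published USB specs.
USB_C_CABLE_SPECS = {
    "USB 2.0":         {"data_gbps": 0.48, "notes": "charge-only look-alikes often stop here"},
    "USB 3.2 Gen 1":   {"data_gbps": 5.0,  "notes": "a.k.a. USB 3.0 / SuperSpeed"},
    "USB 3.2 Gen 2":   {"data_gbps": 10.0, "notes": "SuperSpeed+"},
    "USB4":            {"data_gbps": 40.0, "notes": "needs a certified 40 Gbps cable"},
    "Thunderbolt 3/4": {"data_gbps": 40.0, "notes": "same plug again, different logo"},
}

def fastest_guaranteed(specs):
    """With no markings, you can only assume the lowest common capability."""
    return min(s["data_gbps"] for s in specs.values())

print(fastest_guaranteed(USB_C_CABLE_SPECS))  # an unmarked cable might only do 0.48 Gbps
```

In other words, absent labeling, the rational assumption about an unknown cable is the worst case - which is exactly the user-experience problem.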

27

Skullpt-Art OP t1_jaez8yn wrote

Have there been any examples of what you've described? The two nearly identical images, one produced by a human and one produced by AI with explicit direction, as opposed to four generated close-enough ones?

Also, I think the argument is that using AI to create art is closer to what an Art Director does than to what an artist does. You can give directions to a person or a computer, but that doesn't mean you are the one putting pen to paper, so to speak.

2

SwagginsYolo420 t1_jaey7qm wrote

That may sound convincing to somebody unfamiliar with the software, but how the process is characterized is misleading.

Certainly, the less information given to the AI, the more random the output. However, if you sculpt your prompt to include all of the desired photographic and image factors, you can produce a very specific result. The AI model can simulate all of the photographic factors listed if you instruct it to do so: lens type, exposure time, lighting, film stock, etc.
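As a sketch of what "sculpting" a prompt looks like in practice - the subject, factor names, and values below are made-up examples, and the helper function is hypothetical, not any tool's real API:

```python
# Hypothetical sketch: assembling a highly specific text prompt from the kinds
# of photographic factors mentioned above (lens, exposure, lighting, film stock).
def build_prompt(subject, factors):
    """Join a subject with comma-separated photographic descriptors."""
    return ", ".join([subject] + [f"{k}: {v}" for k, v in factors.items()])

factors = {
    "lens": "85mm f/1.4",
    "exposure": "1/200s",
    "lighting": "golden hour rim light",
    "film stock": "Kodak Portra 400",
}

prompt = build_prompt("portrait of an elderly fisherman", factors)
print(prompt)
```

The more of these descriptors you pin down, the narrower the space of plausible outputs the model can land in - which is the whole point being made above.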

A person could take an actual photo, then recreate that image from scratch through the AI by providing enough information to sculpt the output with all the factors involved in the photo's composition.

That would leave you with two nearly identical images created by the same person, but only one of them eligible for copyright protection, even though both required the same compositional choices on the part of the image creator.

I recommend everyone play with software like Midjourney or Stable Diffusion themselves, learn the basics of how AI prompting is done, and see how specificity on the part of the artist/software user is always going to be necessary to produce the desired result.

Certainly with the comic book artist in the article, it should become obvious that the images in question weren't generated completely at random: they're all done in a similar style that serves to illustrate the story, and in a specific order that matches the written text. That can't occur at random; it required very specific decisions by the artist for each image.

0

SILENTSAM69 t1_jaex3to wrote

Yeah, very true. At least methane is less of a concern, considering its cycle is so short-lived compared to CO2, which takes thousands of years to pull out of the system. I see some people getting confused that methane traps more heat yet scientists are less concerned about it - the atmospheric lifetime of the gas is a big part of why.
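The lifetime argument can be made concrete with a toy first-order decay model. The ~12-year figure is a commonly cited atmospheric lifetime for methane; treating CO2 removal as a single 300-year time constant is a big simplification (real CO2 uptake happens over multiple timescales), so take this only as a rough illustration:

```python
import math

def fraction_remaining(years, lifetime):
    """Toy first-order decay: fraction of an emitted pulse still airborne."""
    return math.exp(-years / lifetime)

# ~12-year lifetime is a commonly cited value for methane; 300 years for CO2
# is only a crude single constant standing in for a multi-timescale process.
print(fraction_remaining(50, 12))   # methane pulse after 50 years: ~1.5% left
print(fraction_remaining(50, 300))  # CO2 pulse after 50 years: ~85% left
```

So even though a methane pulse traps far more heat while it's up there, most of it is gone within decades, while most of a CO2 pulse is still warming the planet - which is why the two gases are weighted so differently.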

1

[deleted] t1_jaewa51 wrote

I’m not insulting you, I’m just correcting your inconsistencies and incorrect information. Your arrogance is astounding.

You aren’t gaming at 4K if you’re getting your claimed fps in Star Citizen. And yes, 45-75 fps is good for 4K on a 3090. Even with a 13900K CPU, you will still see that same drop; it’s the GPU that’s the bottleneck at that point. It sounds more like you’re gaming at 2K at most, which will yield a higher fps on your GPU. Did you even set up the Nvidia power settings or go into the in-game graphics settings to turn the graphics up? Do you even own a 4K monitor?

The “my rig is top notch” statement is moot when you never went into your BIOS and manually set up your hardware in the first place.

Open Task Manager: if you have a 13900K CPU, I’ll bet you’re running your DDR5 RAM at 2100 MHz. If you spent the money on an i9-13900K, you’d better have DDR5-6000 RAM, properly set up in the BIOS. And if you bought the “K” variant of the CPU, you’d better have at minimum a 360mm AIO or a custom loop with the clock manually set to 6.0 GHz; otherwise you wasted your money on hardware you won’t push to its max potential.

Again, you’re also comparing stock clocks. A 9900K OC’d to a minimum of 5.0 GHz with DDR4-4400 won’t bottleneck a 4090 running 4K - only if you’re running it at 8K.

A 13900K’s stock clock of 5.6 GHz with E-cores disabled is not much higher than the OC’d 9900K’s, and it won’t make a difference over it unless you OC that 13900K to 6.0-6.2 GHz (for 8K gaming).

0