Hacker News | rkrisztian's comments

It's the 3D cache, as I wrote in my other response. It has to be powered on at all times, so it affects even the idle power usage.


It has little to nothing to do with the 3D cache.

The IOD (I/O die) is extremely inefficient for all desktop Zen CPUs, as it never truly idles.


Their APUs don't have the problem from the reviews I've seen, but yes the I/O die has been the bane of the Zen platform when it comes to idle power consumption.

To make matters worse, the X570 chipset basically runs this I/O die upside down as a chipset, and it sucks twice as much power at idle as the X470 chipset it replaced. I expected them to replace this hack of a product used for the high end when ASMedia's efforts were delayed, but all that platform got was B550. It was pretty clear they weren't chasing this part of the market during AM4's heyday; no real idea where they are at now with chipsets on AM5. But given how few people talked about how crappy that chipset was in this respect, I guess they might be right that it wasn't important to most people.


What are you talking about? AMD has been really good in the power efficiency department until the 3D CPUs, which use extra power for cache memory that simply cannot be turned off. Plus, Intel started applying the 3nm fabrication process, while AMD is still at 4nm. But previously, Intel was at 10nm for a long time (see the i9-13900K, for example), while Ryzen went to 5nm much sooner (see the Ryzen 9 7900X).


Nothing, I'm making this up, except it's been confirmed by pretty much all desktop Zen users:

https://www.reddit.com/r/Amd/comments/1brs42g/amd_please_tac...

I don't bloody care that AMD CPUs seem to be more power efficient than Intel's. For most people their CPUs are completely idle most of the time and Zen CPUs on average idle at 25W or MORE.

Many Zen 4 and Zen 5 owners report that their desktop CPUs idle at 40W or more even without the 3D cache.


I can't confirm the 40W, my Ryzen 9 7900 (non-X) consumes 1W to 3W at idle on Windows 10.


Please post a HWiNFO64 screenshot.

I have reasons to believe you're making this up.

Not a single user has seen such low idle power consumption for desktop Zen AMD CPUs.


Could it be that some cores are constantly being woken up by something?

I mention that since you seem to be on Windows, which itself has a hard time just shutting up, and that is also easily paired with bad drivers, stupid software and bad peripherals.


> I mention that since you seem to be on Windows, which itself has a hard time just shutting up, and that is also easily paired with bad drivers, stupid software and bad peripherals.

I happen to be on Fedora Linux 42 and Windows 11 but my primary OS has been Linux for almost 30 years now.

Idle power consumption under Windows and Linux is exactly the same. Linux doesn't have any magical tricks to make it lower.

Windows has more services running in the background, but they don't meaningfully affect idle power consumption at all.
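
For what it's worth, you can sanity-check this on Linux without HWiNFO64, assuming your kernel exposes RAPL counters under /sys/class/powercap (the exact path may differ, and reading it may need root). A rough sketch:

    import java.io.File

    // Rough idle-power check via the Linux powercap (RAPL) interface.
    // Assumption: a counter like /sys/class/powercap/intel-rapl:0/energy_uj exists
    // for the CPU package (the "intel-rapl" name is also used on AMD with recent kernels).
    fun main() {
        val counter = File("/sys/class/powercap/intel-rapl:0/energy_uj")
        val before = counter.readText().trim().toLong()   // microjoules
        Thread.sleep(10_000)                               // stay idle for 10 seconds
        val after = counter.readText().trim().toLong()     // note: the counter can wrap around
        val joules = (after - before) / 1_000_000.0
        println("Average package power: %.1f W".format(joules / 10.0))
    }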

The entire Reddit thread confirms my statement, and hundreds of reviews confirm what I said, yet it's

> paired with bad drivers, stupid software and bad peripherals.

It's kinda hard to be an AMD fan when you live in an alternative reality, huh?


> It's kinda hard to be an AMD fan when you live in an alternative reality, huh?

I don't know, as I'm not intimately familiar with either concept. I meant to say: if both measure idle power but come up with different results, are they measuring the same thing? Could hardware and software differences influence idle power? What does an "idle power reading" actually measure?


Sure, it's reported by the Radeon Software, but I'll check HWiNFO64 soon and post a screenshot.


52 minutes to post a screenshot?

Guess someone doesn't want to be embarrassed.


Someone please flag this comment for the insult. I was at work, taking a break, and I wasn't on my personal computer.


I am treated so unfairly, it's unbelievable. People who post insults get away without punishment, but my innocent screenshot is invisible.


I posted my screenshot yesterday but it's not visible. I don't care anymore.


Please someone flag the comment above for offensive language ("I don't bloody care")


Exactly, your phone can break or get stolen any time. Plus I just don't want to limit myself to a single device.


Unfortunately, in Germany almost all banks force you to use an unmodified Android phone (so no de-Googled ones) as the 2FA device. There are other solutions like code generators, but they require extra payment.


Buy an older iPhone for ~$150. Install financial apps on it and don't use it for anything else. Keep it in a safe place, only carry it around if you must.

If you need to manage non-trivial amounts of money through your phone, having a specific device to do that is a no-brainer.


Is the risk that someone's going to steal my phone, forcibly hold it to my face, and wire my money somewhere? So far I've known two close friends who got mugged; the robber didn't think of this. The last time I tried intentionally wiring a large amount of money to someone, it took forever and involved tons of approvals.


It's common in London: phones are being stolen for access to financial accounts, not for the value of the phone itself. They steal the phone out of your hands while it is unlocked. For example:

https://www.bbc.com/news/articles/cy8y70pvz92o.amp

I'm not sure exactly how they get around security features, perhaps by social engineering customer support, if they have enough PII.


Uhm, yeah, in order to actually wire money in my banking app I need to input a fingerprint. Smart people developed these apps; banks are not stupid.

Obviously people can still kidnap you and torture you but that's no different from before smartphones.


Maybe if it's a random Android phone with Cash App


What I missed from the article is the usual: biometric authentication is not secure.

https://www.youtube.com/watch?v=tJw2Kf1khlA

(Yes, I'm linking to YouTube because, contrary to popular belief, some channels are actually informative, or at least make the content easy to understand.)

I would never use my fingerprint for authentication, because it's a flawed concept. The problem is that your fingerprint is not a password. It's more like a username. That's because you leave your fingerprint everywhere; it's practically public information. The same can be said about your face.


Biometrics are like identification, yes. They check that it's you. Then, knowing it's you, the device retrieves a password stored on-device and uses that password for auth.

The auth still uses a password. The password is just indexed by your face or fingerprint, and only locally, on-device.

That means the attacker would need the device to ever get at the password in the first place. Then they'd need to be able to break into the device. The latter you can argue is easy or hard, depending on perspective, but they'd need both your faceprint or fingerprint, and a reliable way to replicate it that can fool the reader.

If your fingerprint or faceprint leaks to the world, the attacker would still need your physical device, and would still need to find a way to fool the physical reader with a replica of your faceprint or fingerprint.

In that sense, it's more secure than a password.
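
On Android, for example, this is roughly how the pattern looks with the hardware-backed Keystore plus the AndroidX BiometricPrompt API. A minimal sketch of the idea, not anyone's actual banking app; the key alias, prompt text and (omitted) error handling are placeholders:

    import android.security.keystore.KeyGenParameterSpec
    import android.security.keystore.KeyProperties
    import androidx.biometric.BiometricPrompt
    import androidx.core.content.ContextCompat
    import androidx.fragment.app.FragmentActivity
    import javax.crypto.Cipher
    import javax.crypto.KeyGenerator

    // The secret key lives in the hardware-backed keystore and is only usable
    // after a successful biometric check.
    fun createCredentialKey() {
        val keyGen = KeyGenerator.getInstance(
            KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore")
        keyGen.init(
            KeyGenParameterSpec.Builder(
                "credential_key",  // placeholder alias
                KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT)
                .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
                .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
                .setUserAuthenticationRequired(true)  // key is locked behind user auth
                .build())
        keyGen.generateKey()
    }

    fun unlockCredential(activity: FragmentActivity, cipher: Cipher) {
        val prompt = BiometricPrompt(
            activity,
            ContextCompat.getMainExecutor(activity),
            object : BiometricPrompt.AuthenticationCallback() {
                override fun onAuthenticationSucceeded(
                    result: BiometricPrompt.AuthenticationResult
                ) {
                    // Only now can the cipher (backed by the keystore key) be used
                    // to decrypt the credential stored locally on the device.
                    val unlocked = result.cryptoObject?.cipher
                    // ... decrypt and use the stored credential with `unlocked` ...
                }
            })
        val info = BiometricPrompt.PromptInfo.Builder()
            .setTitle("Unlock stored credential")
            .setNegativeButtonText("Cancel")
            .build()
        prompt.authenticate(info, BiometricPrompt.CryptoObject(cipher))
    }

The biometric data itself never leaves the secure hardware; the app only learns "the reader accepted this finger/face", and the key stays unusable without that.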


That YouTube video is bad, if not outright wrong.

First of all, like the other commenter said, these days biometrics are rarely used as a key itself (which is how they are often portrayed in old movies). Instead, they are used as a method to gain access to the key. This is quite literally the case with some biometric Yubikeys - the key is the Yubikey, but to get it to work it needs your biometrics. Are you saying it would be better to have a key with no access control at all? Or one with a passcode (just watch the linked WSJ article from TFA - the guy was able to steal data from phones with passcodes, but biometrics would have made that attack vector much more difficult). Phones work pretty much the same way, perhaps the downside being that people often don't consider their phones as something that needs the same level of guarding as an actual key.

And just as importantly, what these kinds of YouTube videos often miss is the old adage "I don't need to outrun the bear - I just need to outrun you." That is, unless you are a particularly high-value target (and you would know if you are), any security that makes you much more difficult to hack than the person using Princess123 as their password means thieves give up and go to the easier target first.


It's a bit of an aside, but your disclaimer intrigued me: YouTube is extraordinarily useful, and the popular belief is that it is; I'm not at all sure the opposite is anywhere near a popular belief. It's like defending a recommendation to watch How It's Made on the basis that there are also less informative shows broadcast on the same medium.


So if I give you my fingerprint on a cup, can you get into my phone?


On the GrapheneOS forum you will see a lot of bad opinions about F-Droid, for example this:

> It doesn't matter that the app is trustworthy, because F-Droid are extremely incompetent with security and the apps you install from F-Droid are signed by F-Droid rather than the developer.

https://discuss.grapheneos.org/d/20212-f-droid-security-in-s...

https://discuss.grapheneos.org/d/18731-f-droid-vulnerability...

They also say, if you use F-Droid, at least use F-Droid Basic:

> Dont use the main F-Droid client. Android is pretty strict about SDK versions and as F-Droid targets legacy devices, it is very outdated.

https://discuss.grapheneos.org/d/11439-f-droid-vsor-droid-if...

> If the app is only available on F-Droid / third party F-Droid repo, use F-Droid Basic and use the third party repo rather than the main repo if available.
>
> If the app is available on Github then install the APK first from Github then auto-update it using Obtanium. Be sure to check the hash using AppVerifier which can be installed from Accrescent (available on the GrapheneOS app store).

https://discuss.grapheneos.org/d/16589-obtainium-f-droid-bas...

By the way, while GrapheneOS recommends Accrescent, I don't use it anymore because they can't even add apps like CoMaps, while some of the apps they actually added are proprietary.


> the apps you install from F-Droid are signed by F-Droid rather than the developer.

That doesn't seem like a con if you take the context into account: F-Droid is not shipping pre-built binaries from the developer, it asks for a buildable project from the developer.

If the upstream dev's source repo is compromised, so will his own binaries be anyway.


> [A]pps you install from F-Droid are signed by F-Droid rather than the developer.

Having recently gone through the F-Droid release process, I learned that this is not necessarily the case anymore.

F-Droid implements the reproducible builds concept. They re-build the developer's app, compare the resulting binary sans signature block, and if it matches they distribute the developer-signed binary instead of their re-built binary.

This is opt-in for developers, so not all apps do it this way. I'd sure like to know how common this is; I wonder if there are any statistics.
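
The comparison itself is conceptually simple. Here's a rough sketch of the idea (this is not F-Droid's actual tooling, which among other things handles the APK Signing Block; the file names below are hypothetical): compare every entry of the two APKs except the signature files.

    import java.util.zip.ZipFile

    // Compare two APKs entry by entry, ignoring the v1 signature files under META-INF/.
    // If everything else matches, the builds are "the same app, different signature".
    fun contentDigest(path: String): Map<String, Long> {
        val digest = mutableMapOf<String, Long>()
        ZipFile(path).use { zip ->
            for (entry in zip.entries()) {
                if (!entry.name.matches(Regex("META-INF/.*\\.(RSA|DSA|EC|SF|MF)"))) {
                    digest[entry.name] = entry.crc  // CRC32 of the entry's contents
                }
            }
        }
        return digest
    }

    fun main() {
        // Hypothetical file names: the developer's signed APK vs. F-Droid's rebuild.
        val same = contentDigest("app-developer-signed.apk") == contentDigest("app-fdroid-rebuilt.apk")
        println(if (same) "reproducible: ship the developer-signed APK" else "mismatch")
    }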


F-Droid only uses reproducible builds for a tiny portion of apps, and there are still significant disadvantages. It depends on the app developers always complying with F-Droid's rules, otherwise users are left without updates. F-Droid only checks that the build matches; they do not review/audit the apps and will not catch hidden malicious behavior or simple non-compliance with their rules. WireGuard's app deliberately broke F-Droid's rules by including a self-updater, which was not noticed by F-Droid and was shipped by F-Droid. WireGuard used this to start taking over updates for itself and migrate their users away from F-Droid. F-Droid eventually found out when the WireGuard developer brought it up many months later, and couldn't do anything beyond dropping the app. It had taken over updates for itself already and F-Droid wasn't in the picture anymore.

The process adds a significant delay for updates but it does not actually protect users from developers in any meaningful way. This real world example with WireGuard demonstrates that.


If the signatures are the same, what difference does it make which binary is distributed?


What is the same is the checksum of the resulting binary.


Also learn grammar, please. If you mean a canvas based on IPv6, then write "IPv6-Based Canvas". If you mean IPv6 has based the canvas, then write what you just did.


As the domain name is "openbased.org", and with "based" being Internet slang, I don't think your correction applies. The name describes a based canvas built on IPv6, hence IPv6 Based Canvas.

The English language is rather particular when it comes to the order of adjectives, but according to https://dictionary.cambridge.org/grammar/british-grammar/adj..., "opinion" comes first, and "based" is very much an opinion. I suppose that means the grammatically correct name would be "based IPv6 canvas" if you're being pedantic.


Please read the article again; he does explain the torture:

> our phones make us feel powerless; as if we must fight our devices to get anything done. There’s a constant barrage of notifications, and by the time you have dealt with them, chances are you have forgotten what you wanted to do in the first place. Then there is Gemini, Google’s artificial intelligence bot, which won’t leave you alone. Press the home (middle) button for half a second too long, and it pops up, offering to “assist” you.

The closed-source part has been explained too, as it links another article: https://www.osnews.com/story/142553/rumour-google-intends-to...

"The Android Open Source Project has been gutted over the years, with Google leaving more and more parts of it to languish, while moving a lot of code and functionality into proprietary components like Google Mobile Services and Google Play Services. Taking “Pixel Android” closed source almost feels like the natural next step in the process of gutting AOSP that’s been ongoing for well over a decade."


Cool idea, but I could just run a Commodore 64 emulator on my PC these days.


True, but it just doesn't hit the same nostalgia spot.

Emulators are all about running the software. This is more about the full package, you know? The breadbin in all its glory. Going clickety-clack with its keyboard. It brings back memories.

I own a TheC64 by RetroGames just for this. All the convenience of HDMI and instant loading software with the form factor of the C64 of my youth, with a working keyboard with PETSCII characters!

If you just want to play games, agreed, you just need VICE.


I'm disappointed. 8B is too low for GPUs with 16 GB VRAM (which is still common in affordable PCs), where most 13B to 16B models could still be easily run, depending on the quantization.
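
As a rough rule of thumb (my own back-of-the-envelope math, not vendor numbers): the weights take about params × bits-per-weight / 8 bytes, plus a few GB for KV cache and runtime overhead, so a ~14B model at 4-bit still fits comfortably in 16 GB.

    // Back-of-the-envelope VRAM estimate; the 2 GB overhead allowance for KV cache,
    // activations and runtime is a guess and depends heavily on context length.
    fun approxVramGb(paramsBillions: Double, bitsPerWeight: Double, overheadGb: Double = 2.0): Double =
        paramsBillions * bitsPerWeight / 8.0 + overheadGb

    fun main() {
        println("14B @ 4-bit: ~%.1f GB".format(approxVramGb(14.0, 4.0)))  // ~9 GB
        println("8B  @ 8-bit: ~%.1f GB".format(approxVramGb(8.0, 8.0)))   // ~10 GB
    }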


If you guess at random, that's a 25% probability of getting the right answer, which gets you 7/28 points on average. I got 14/28 by trying hard and I still hate the result, but it's also true that the questions were largely impractical: no one parses dates like this in a real production app. We always validate the date format first. So no one should feel bad about their results.
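
(That 7/28 is just the expected value of guessing, 28 × 0.25 = 7; a quick simulation, assuming four equally likely options per question:)

    import kotlin.random.Random

    // Expected score from pure guessing on 28 four-option questions: 28 * 0.25 = 7.
    fun main() {
        val trials = 100_000
        var total = 0
        repeat(trials) {
            total += (1..28).count { Random.nextInt(4) == 0 }  // one "correct" option out of four
        }
        println("Average score: %.2f / 28".format(total.toDouble() / trials))  // prints ~7.00
    }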

