Hacker News | new | past | comments | ask | show | jobs | submit | oerdier's comments

One doesn't exclude the other. I still program myself; in fact, I have more time to do so because the LLM I pay some billionaire for takes care of the mundane stuff I used to have to do myself. What I pay the billionaire is a laughable fraction of the value of the time and energy I now have left over for meaningful innovation.


I think it's healthy for a population not to have paying with credit as part of day-to-day life: it effectively means paying with money you might not actually have and going into debt. How many US citizens are crippled by credit card debt, and the interest on it?


You can pay with a credit card as a convenience + fraud protection mechanism without ever paying interest by just paying your bill off every month.

When chip and pin was first rolled out, Europeans were shocked by the low security of swipe cards in the US. The reason that wasn’t an issue for Americans was (and still is) that credit cards have excellent fraud protections.

If someone steals my credit card, it is the bank’s problem, not mine.

The risk of paying by debit card on a regular basis is unfathomable to me, even with fancy tech to try to make it secure.


I wasn't suggesting individuals shouldn't be able to pay with credit. I have a credit card myself, which I use when I can't pay with debit. I was suggesting that for a population as a whole, having credit payments be so commonplace leads to crippling debt issues, which, as far as I can believe "the reports", is an issue in the US.

Your comment on the risk of paying with debit cards surprised me. I've never considered it a risk at all. It made me realize that perhaps here (in the Netherlands) we have consumer protection systems in place, in addition to the payment systems, that prevent any issues.


> + fraud protection mechanism

This is an Americanism; my European debit card has consumer protection clauses pretty much on par with credit cards.


True. In the US, getting money back after fraud is easier with a credit card than with a debit card, at least it was a decade ago. Things may have changed.


You also get contactless payment on debit cards.


It's not about credit/debit, it's about phone/card. Americans tend to use "credit card" as a generic term for payment cards.

And yes, phone NFC payment is one of those technically unnecessary conveniences that's really easy to get used to. You probably already have your phone out or at least accessible in like one second, paying with it instead of pulling out your wallet and finding a card or even cash is just sooo nice. I hate that I've gotten this used to it.

That being said, you can still get NFC payment on a rooted or reflashed phone. Instead of Google Wallet, find a bank or card provider that has their own app. I use the Curve "proxy card" and it works fine.


In my case, sliding my card out of my wallet is faster than unlocking my phone, given the inconsistent fingerprint reader on my Google Pixel when it's in its case (and I am too clumsy to use a smartphone without a case covering both sides; I've broken too many screens already). Some people just keep the card in their smartphone case too.

I also see a lot of people struggling because they need to pay while being on a call or because their smartphone is just way too big to be handled comfortably with one hand given the size of their hands.


Whether you use a debit or credit card is unrelated to the UX. Both are available with Google Wallet or physically.


Very different risks.

A physical credit card can be used by anyone holding it, or actually just standing close.

Contactless payment with a phone works only after the phone is unlocked (*).

Physical card contactless payments have a limit (I don't remember it) after which you have to use the PIN. At least all of mine do. Boring. Payments with Apple/Google/Garmin have a higher transaction limit.

I could probably go on longer, but these are the main ones for me.

(*) I know it can be done without, but it also can be done with.


After getting shoulder bursitis two years ago (although the direct cause was sports, not desk habits), I dove into the world of split ergo keyboards. I did get one (a Kyria v3) and learned to type on it at an acceptable speed, although still significantly slower than my speed on a regular keyboard.

Wanting to optimize my layout, I researched my typing behavior and logged my keystrokes (storing these logs as securely as I would a password). Analysis gave me notable insights (e.g. by far my most used keys are the arrow keys, for selecting text), but my main conclusion was that even during a regular full day of programming, preferring my keyboard over my mouse (tiling window manager, hotkeys, a browser extension to virtually click on elements using keys), I don't actually type that much, and when I do, it is in bursts of never more than 20 seconds or so.
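For anyone wanting to try a similar analysis: tallying a keystroke log takes only a few lines of Python. This is a minimal sketch that assumes a hypothetical plain-text log format with one key name per line; the logger and format I actually used are beside the point.

```python
from collections import Counter

def top_keys(log_path, n=10):
    """Return the n most frequent keys from a plain-text log, one key per line."""
    with open(log_path, encoding="utf-8") as f:
        counts = Counter(line.strip() for line in f if line.strip())
    return counts.most_common(n)

# Quick demonstration with an in-memory log instead of a file:
sample = ["Down", "Down", "Right", "a", "Down"]
print(Counter(sample).most_common(2))  # [('Down', 3), ('Right', 1)]
```

Counter.most_common does the sorting for you, so the whole "analysis" is really just a frequency table over whatever granularity you log (keys, chords, bigrams).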

Although I find typing on a split fun and comfortable, I went back to a regular keyboard because the hit in productivity is not worth it for me. The experiment did teach me how to improve my ergonomics: I optimized my desk height and bought a very flat, narrower keyboard with the completely unused numpad section chopped off ("TKL"), so if I do grab my mouse there is less travel.


This kind of translation problem is the focal point of Star Trek: The Next Generation, season 5 episode 2, "Darmok" (1991).

I watched it for the first time after somebody referenced it, as I did just now, as an example of this kind of problem. Despite knowing the point of the plot beforehand, I still found the episode interesting.

I wish I could mention this episode here for language enthusiasts to enjoy without revealing the main plot point (that idioms in languages are hard to translate). Shaka, when the walls fell. But I think the very act of mentioning it in a thread on this topic does so unavoidably. Temba, at rest.


Check out E.D. Hirsch Jr.'s work, e.g. 'Why Knowledge Matters'.


There are such legal, cultural and economic differences between countries that a no-homework policy might work in one country but not at all in another.


Not being able to solve basic math problems in your mind (without a calculator) is still a problem. "Because you won't always have a calculator with you" was just the wrong argument.

You'll acquire advanced knowledge and skills much, much faster (and sometimes only) if you have the base knowledge and skills readily available in your mind. If you're learning about linear algebra but you have to type in every simple multiplication of numbers into a calculator...


> if you have the base knowledge and skills readily available in your mind.

I have the base knowledge and skill readily available to perform basic arithmetic, but I still can't do it in my mind in any practical way because I, for lack of a better description, run out of memory.

I expect most everyone eventually "runs out of memory" if the values are sufficiently large, but I hit the wall when the values are exceptionally small. And not for lack of trying – the "you won't always have a calculator" message was heard.

It wasn't skill and knowledge that was the concern, though. It was very much about execution. We were tested on execution.

> If you're learning about linear algebra but you have to type in every simple multiplication of numbers into a calculator...

I can't imagine anyone is still using a four-function calculator, certainly not in an application like learning linear algebra. Modern calculators are decidedly designed for linear algebra; they need to be, given the rise of things like machine learning that are heavily dependent on it.


Critical thinking is not a generic, standalone skill that you can practise in a targeted way. That is, critical thinking doesn't translate across knowledge domains. To think critically you need extensive knowledge of the domain in question; that's one reason why memorizing facts will always remain necessary, despite search engines and LLMs.

At best what you can learn specifically regarding critical thinking are some rules of thumb such as "compare at least three sources" and "ask yourself who benefits".


I think you'd find many would disagree with each of those claims.


I hope they'll apply the critical thinking rule of thumb to check for themselves what modern research has to say on this!

Edit: And how can you critically assess if that research is any good? To do it well you need... domain knowledge.


And would they amount to a larger number than those who oppose vaccines?


Great initiative, but the landing page on Firefox Android is quite annoying because the animated text wraps and keeps changing the vertical position of the text I'm trying to read.


At https://www.hedy.org/learn-more many of the listed 'research' items are bachelor's theses supervised by the creator of Hedy, with topics such as 'creating Syntax highlighting for Hedy'. To me this comes across as padding and disingenuous.

