Hacker News | nicoty's comments

Filesystems like zfs, btrfs and bcachefs have snapshot creation and rollbacks as features.


I know I'll sound like a rube but somehow it rubs me the wrong way that the rich and powerful are spending billions trying to establish multiplanetary civilisations despite the fact that we still have plenty of unfixed problems here at home that also deserve attention and resources, if not more so.


It's not either/or, and you shouldn't reduce things to false dichotomies.


It is though, when you have a fixed budget, even at planetary scales.


Are we still talking about private enterprises, not space agencies?


They are indeed private companies, being subsidised by space agencies.


Are they being subsidized the same way my employer subsidizes my lifestyle?

If I sell steel, grain, boots, or launch services to the government and that gives me profits that I invest into some aspect of my business, I’m not sure that “subsidized by” is the clearest term.


I'm not sure it is an open market that anyone could join. The US government seems to favour domestic supply. Therefore I believe it is a subsidy, yes.


Pretty much all of the commercial rocket companies are in America, and there is competition between them. And the amount of money that SpaceX gets from the government is quite small compared to its overall operating expenses. They launched 90% of all mass to orbit, most of which is commercial.

The “SpaceX lives on government subsidies” thing is a myth.


It's probably more because of the news than the world. It's not possible for there not to be plenty of unfixed problems. No matter how many problems we solve -- and we've solved a hell of a lot over the past decades or century -- other existing problems will take their place as seeming to be important.

If I'm being too extreme, can you describe a world where you'd consider enough problems have been solved that it's worth spending billions colonizing space?


> that the rich and powerful are spending billions trying to establish multiplanetary civilisations

Citation needed. What are the current projects to make this happen? Starship is a work in progress, but that by itself won't be able to create a colony out of thin air.


I can't see that pipedream being anything but the largest company town ever. Who needs scrip when you have to work for... every vital resource, even air?

Not Rube-ish or rubish at all, IMO. I believe they're more interested in power or recreational drug use than problem-solving. Horses for courses.


It's even worse than that: They're messing up the environment we do have by burning tons of fuel in pursuit of their pipe dreams.

Instead of establishing multiplanetary civilisations, they're burning our single-planetary atmosphere in their hubris and ego.


Like how capsaicin makes food feel hot even when it isn't?


Yes, my point is that our senses often portray a reality that doesn't exist. Why should we assume free will is any different?

We don't even have a coherent and agreed-upon definition. Every attempt at operationalising it results in it not being detectable. It's time we admit that there is no scientific basis for free will. It's not a scientific belief.


That's kind of where compatibilism comes in. Most definitions of free will pit the feeling that we have it (and would like to have it) against the evidence that this feeling doesn't prove anything.

Sapolsky's approach is to come up with a definition that can't be met and then declare that it can't be met. Compatibilism is more about finding a definition that is consistent with the feeling, because without the feeling there isn't really anything to anchor the idea to. It doesn't use the feeling of making decisions as proof of any power as such; it treats the feeling more like a measurement of the concept. Doing what we feel like doing is free will. What we feel like doing can be caused by anything, and it still wouldn't matter.

Considering the inverse situation makes it seem like any other definition of freedom would be intolerable. If you 'freely' chose to do X but consistently had the perception of wanting to do not-X, it seems like you would not have a happy life. Similarly, the alternative to determinism for your decisions seems like literal chaos. If something has no cause, it is literally random. Any pattern you can discern would indicate that it has a cause.

Then of course you get into the nature of what a cause is, and consequently the nature of time itself. Something travelling backwards in time, interacting with something you can see, appears uncaused, because there was literally no preceding event that indicated it was going to happen (it wasn't preceding; it came from the other direction).

That's one of those weird things I wonder about when people ask why is there more matter than antimatter, or why does times arrow mainly point in one direction. It feels like riding the crest of a wave wondering why all the water is going the same direction.


Thanks for the spoilers


I implemented an iterative, stack-based DFS iterator in JS last year for a project that I didn't end up using it on. Maybe someone else can find some use of it: https://gist.github.com/nothingnesses/5f974a43a2da5d1d8a6b9c...
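For reference, the shape of such an iterator can be sketched like this (a fresh sketch with a made-up `TreeNode` shape, not the gist's actual code):

```typescript
// Iterative, stack-based depth-first traversal, written as a generator.
// `TreeNode` is a hypothetical node shape for this sketch, not the gist's API.
interface TreeNode<T> {
  value: T;
  children?: TreeNode<T>[];
}

function* dfs<T>(root: TreeNode<T>): Generator<T> {
  const stack: TreeNode<T>[] = [root];
  while (stack.length > 0) {
    const node = stack.pop()!;
    yield node.value;
    const kids = node.children ?? [];
    // Push children in reverse so the leftmost child is visited first.
    for (let i = kids.length - 1; i >= 0; i--) {
      stack.push(kids[i]);
    }
  }
}

const tree: TreeNode<number> = {
  value: 1,
  children: [{ value: 2, children: [{ value: 4 }] }, { value: 3 }],
};
console.log([...dfs(tree)]); // → [ 1, 2, 4, 3 ]
```

Because it's a generator, traversal is lazy: you can stop early without visiting the whole tree.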


Is it possible to press multiple keys on opposite axes at the same time? E.g. q and z or w and x on the qwerty layout.


The keyboard supports it, but your hands likely do not! You likely can't flex and extend your fingers simultaneously.

If you really wanted to do this for some reason (chording?) you could use too-small finger keys and press harder. But this would defeat the main point which is to be aggressively ergonomic.


What's wrong with static type systems?


I've summarized my opinion on this here: https://doi.org/10.5281/zenodo.15118670

In normal programming languages, I see static type systems as a necessary evil: TypeScript is better than JavaScript, as long as you don't confuse types with terms...

But in a logic, types are superfluous: You already have a notion of truth, and types just overcomplicate things. That doesn't mean that you cannot have mathematical objects x and A → B, such that x ∈ A → B, of course. Here you can indeed use terms instead of types.

So in a logic, I think types represent a form of premature optimisation of the language that invariants are expressed in.


Your invocation of Strong AI (in the linked paper) seems like a restatement of the Sufficiently Smart Compiler [1] fallacy that’s been around forever in programming language debates. It’s hypothetical, not practical, so it doesn’t represent a solution to anything. Do you have any evidence to suggest that Strong AI is imminent?

[1] https://wiki.c2.com/?SufficientlySmartCompiler


I routinely describe code that I want in natural language, and it generates correct TypeScript code for me automatically. When it gets something wrong, I see that it is because of missing information, not because it is not smart enough. If I needed any more evidence for Strong AI since AlphaGo, that would be it.

I wouldn't call it vibe coding; I just have much more time to focus on the spec than on the code. I'd call that a sufficiently smart compiler.


This is a very reductive definition of types, if not a facile category error entirely (see: curry-howard), and what you call "premature optimization" -- if we're discussing type systems -- is really "a best effort at formalizations within which we can do useful work".

AL doesn’t make types obsolete -- it just relocates the same constraints into another formalism. You still have types, you’re just not calling them that.


I think I have a reference to Curry in my summary link. Anyways, curry-howard is a nice correspondence, about as important to AL as the correspondence between the groups (ℝ, 0, +) and (ℝ \ 0, 1, *); by which I mean, not at all. But type people like bringing it up even when it is not relevant at all.

No, sorry, I really don't have types. Maybe trying to reduce all approaches to logic to curry-howard is the very reductive view here.


If your system encodes invariants, constrains terms, and supports abstraction via formal rules, then you’re doing the work of types whether you like the name or not.

Dismissing Curry–Howard without addressing its foundational and inextricable relevance to computation and logic isn't a rebuttal.


Saying "Curry-Howard, Curry-Howard, Curry-Howard" isn't an argument, either.

I am not saying that types cannot do this work. I am saying that to do this work you don't need types, and AL is the proof for that. Well, first-order logic is already the proof for that, but it doesn't have general binders.

Now, you are saying, whenever this work is done, it is Curry-Howard, but that is just plain wrong. Curry-Howard has a specific meaning, maybe read up on it.


Curry–Howard applies when there’s a computational interpretation of proofs — like in AL, which encodes computation and abstraction in a logic.

You don’t get to do type-like work, then deny the structural analogy just because you renamed the machinery. It’s a type system built while insisting type systems are obsolete.


You seem to know AL very well, I didn't even know that there is a computational interpretation of AL proofs! Can you tell me what it is?


Personally, I would enjoy it if TLA+ had types, though, and TLA+ belongs to the land of logic, right? I do not know how it differs from the abstraction logic referred to in your writing and your other whitepapers.

What is commonly used is a TypeOK predicate that verifies that your variables have the expected type. This is fine, except your intermediate values can still end up taking unintended values, so you won't spot the mistake until you evaluate the TypeOK predicate, and not at all if the checker doesn't visit the right corners of the state space. At least TypeOK can be much more expressive than any type system.

There is a new project in the same domain called Quint, it has types.


Practically, in abstraction logic (AL) I would solve that (AL is not a practical thing yet, unlike TLA+, I need libraries and tools for it) by having an error value ⊥, and making sure that abstractions return ⊥ whenever ⊥ is an argument of the abstraction, or when the return value is otherwise not well-defined [1]. For example,

    div 7 0 = ⊥, 
and

    plus 3 ⊥ = ⊥, 
so

    plus 3 (div 7 0) = ⊥.

In principle, that could be done in TLA+ as well, I would guess. So you would have to prove a predicate Defined, where Defined x just means x ≠ ⊥.

[1] Actually, it is probably rather the other way around: Make sure that if your expression e is well-defined under certain preconditions, that you can then prove e ≠ ⊥ under the same preconditions.
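As a loose analogy only (in TypeScript rather than AL, with a `bottom` sentinel standing in for ⊥), the propagation convention might look like:

```typescript
// ⊥ modelled as a distinguished sentinel value; every operation propagates it.
// This is an illustrative analogy, not AL itself.
const bottom = Symbol("⊥");
type Val = number | typeof bottom;

const div = (x: Val, y: Val): Val =>
  x === bottom || y === bottom || y === 0 ? bottom : x / y;

const plus = (x: Val, y: Val): Val =>
  x === bottom || y === bottom ? bottom : x + y;

// div 7 0 = ⊥, so plus 3 (div 7 0) = ⊥ as well.
console.log(plus(3, div(7, 0)) === bottom); // → true
```

Here `Defined x` would simply be the runtime check `x !== bottom`.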


This looks like you've defined the abstract lattice extension as a proper superset of the concrete semantics. That sort of analysis is entirely subsumed by Cousot & Cousot's work, right? Abstract Interpretation doesn't require static types; in fact, the imposition of a two-level interpretation is secondary. The constraints on the pre-level are so that the behavior of the program can be checked "about as quickly as parsing", while also giving strong correctness guarantees under the systems being checked.

Moving the whole thing to dynamic behavior doesn't tell us anything new, does it? Lisps have been tagged & checked for decades?


I have not defined any "abstract lattice extension" explicitly; which is nice, why would I need to know about lattices for something as simple as this? It is just a convention I suggest to get a useful Defined predicate, actually. Nobody can stop you from defining mul 0 ⊥ = 0, for example, and that might make sense sometimes.

I would suggest that abstraction logic compares to type theory as Lisp compares to Standard ML; but that is just an analogy and not precise.


My problem with your interaction on this forum, so far, is that it kind of ignores the practical (artisan) craft of programming. The use of static type systems by working engineers isn't math; it's more akin to "fast, formal checking". You can assert that you don't like types, or that they're useless, all day long; but every working engineer knows you're wrong: static type systems have solved a real problem. Your arguments need to be rooted in the basic experience of a working engineer.

Single-level programming has its place, but runtime checking is too error prone in reality to build large software. If you think this isn't true, then you need to show up to the table with large-scale untyped software. For instance: the FORTH community has real advantages they've demonstrated. Where's your OS? Your database? Your browser? If you had real advantages, you'd have these things in hand already.

Otherwise, you're just another academic blowhard using a thin veneer of formalism to justify your own motivated reasoning.


Strong opinion! Also, it seems you didn't read my interaction so far carefully enough. You also didn't read the summary I linked to, otherwise you would know that I fully approve of the current use of types in programming languages, as they provide lightweight formal methods (another name for what you call "fast, formal checking"). But I think we will be able to do much better: we will have fast, formal checking without static types. Not today, not tomorrow, but certainly not more than 10 years from now.


I'm reading through your paper, and I really like your ideas. I just think you're approaching the engineers wrong.


> At least TypeOK can be much more expressive than any type system.

Can you clarify what you mean by that? Dependent types or more practically refinement types (à la F*) can embed arbitrary predicates.


Right, I was referring to type systems in relatively popular real-world programming languages of today, i.e. Haskell, OCaml, Rust :).


> But in a logic,

I am not sure if I misunderstand you. Types are for domain, i.e. real-world, semantics: they help to disambiguate human language, and they make explicit the context which humans just assume when they talk about their domain.

Logic is abstract. If you implied people should be able to express a type system in their host language, that would be interesting. I can see something like Prolog as type annotations, embedded in any programming language, it would give tons of flexibility, but then you shift quite some burden onto the programmer.

Has this idea been tried?


Types for real-world semantics are fine, they are pretty much like predicates if you understand them like that.

The idea to use predicates instead of types has been tried many times; the main problem (I think) is that you still need a nice way of binding variables, and types seem the only way to do so, so you will introduce types anyway, and what is the point then? The nice thing about AL is that you can have a general variable binding mechanism without having to introduce types.


AL as described sounds like it reinvents parts of the meta-theoretic infrastructure of Isabelle/HOL, but repackaged as a clean break from type theory instead of what it seems to be — a notational reshuffling of well-trod ideas in type theory.

What am I missing?


I have been an Isabelle user and/or developer since about 1996, so similarities with Isabelle are certainly not accidental. I think Isabelle got it basically right: its only problem (in my opinion) is that it is based on intuitionistic type theory as a metalogic and not abstraction logic (nevertheless, most type theorists pretty much ignored Isabelle!). Abstraction logic has a simple semantics; ITT does not. My bet is that this conceptual simplicity is relevant in practice. We will see if that is actually the case or not. I've written a few words about that in the abstraction logic book on page 118, also available in the sample.

> — a notational reshuffling of well-trod ideas in type theory

Always fun to encounter disparaging comments (I see that you deleted the other one in the other thread), but I wrote this answer more for the other readers than for you.


> Always fun to encounter disparaging comments

I can't imagine why anyone would ever treat your bombastic proclamations about type theory with suspicion.


What is bombastic about them?


I'm not a logician, but do you mean that predicates and their algebra are a more granular and universal way to describe what a type is.. basically that names are a problem?


Yes and no. Yes, predicates are more flexible, because they can range over the entire mathematical universe, as they do for example in (one-sorted) first-order logic. No, names are not a problem, predicates can have names, too.


so if names are not an issue, the problem with the usual static type systems is that they lack a way to manipulate / recombine user-defined types to avoid expressive dead ends?


What is the type of x / y, for x : ℝ and y : ℝ?


What is the type of x and y? If it's ℝ, why wouldn't x / y be of type *ℝ (hyperreals)?

I'm just your average programmer, not into abstract logic stuff, but got curious.


The hyperreals don't fix the x/0 or 0/0 problems. Infinitesimals have a well-defined division and take the place of a lot of uses of 0 in the reals, but the hyperreals still have a 0, and the same problem posed in their comment exists when you consider division on the hyperreals as well.

I'm also curious what they intended, but there aren't many options:

1. The question is ill-posed. The input types are too broad.

2. You must extend ℝ with at least one additional point representing the result. Every choice is bad for a number of reasons (e.g., you no longer have multiplicative inverses and might not even have well-defined addition or subtraction on the resulting set). Some are easier to work with than others. A dedicated `undefined` value is usually easy enough to work with, though sometimes a single "infinity" isn't terrible (if you consider negative infinities it gets more terrible).

3. You arbitrarily choose some real-valued result. Basically every theorem or application considering division now doesn't have to special-case zero (because x/0 is defined) but still has to special-case zero (because every choice is wrong for most use cases), leading to no real improvements.
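For what it's worth, option 2 can be sketched as a total function in TypeScript, using `undefined` as the extra point (an illustration only, not a claim about any particular language's semantics):

```typescript
// Total division on ℝ ∪ {undefined}: x/0 (including 0/0) maps to undefined.
// Callers are forced by the return type to handle the extra point explicitly.
function divTotal(x: number, y: number): number | undefined {
  return y === 0 ? undefined : x / y;
}

console.log(divTotal(7, 2)); // → 3.5
console.log(divTotal(7, 0)); // → undefined
console.log(divTotal(0, 0)); // → undefined
```

The cost is exactly the one described above: every downstream operation must either propagate `undefined` or special-case it.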


Good point, forgot about 0/0.


I'm tempted to say ℝ but it seems too obvious


I would say ℝ ∪ {undefined}


undefined because of x / 0? ℝ includes infinity


It's undefined because there isn't an obvious choice:

- At x:=0, every real is a reasonable limit for some path through the Euclidean plane to (0,0). Even ignoring the 0/0 problem, x/0 still has either positive or negative infinity as reasonable choices. Suppose you choose a different extension, like having only a single point at infinity; then you give up other properties, like being able to add and subtract all members of the set you're considering.

- However you define x/0, it still doesn't work as a multiplicative inverse for arbitrary x.

A good question to ask is why you want to compute x/0 in the first place. It is, e.g., sometimes true that doing so allows you to compute results arithmetically and ignore intermediate infinities. Doing so is usually dangerous though, since your intuition no longer lines up with the actual properties of the system, and most techniques you might want to apply are no longer valid. Certain definitions are more amenable to being situationally useful (like the one-point compactification of the reals), but without a goal in mind the definition you choose is unlikely to be any better than either treating division as a non-total function (not actually defined on all of ℝ), or, equivalently, considering its output to be ℝ extended with {undefined}.

Though it doesn't directly bear on your question: ℝ typically does not include infinity. ℝ̄ is one symbol sometimes used for the reals extended with two infinities, though notation varies both for that and for the one-point compactification.


Is there any programming language based on abstraction logic?

This is all a bit too abstract for me right now but seems interesting.


There is nothing practically usable right now. I hope there will be before the end of the year. Algebraic effects seem an interesting feature to include from the start, they seem conceptually very close to abstraction algebra.


An HTML version that I can just read on a phone would be nice.


Forget it Jake, it’s land of Lisp.


Still somewhat janky. I use it on my work machine (since it at least seems a bit faster than using VirtualBox) and regularly run into issues where npm won't build my project due to the existence of symlinks [1,2]. WSLg windows also don't yet have first-party support from the windowing system [3]. I also remember having trouble setting up self-signed certs and getting SSL working.

1. https://stackoverflow.com/questions/57580420/wsl-using-a-wsl... 2. https://github.com/microsoft/WSL/issues/5118 3. https://github.com/microsoft/wslg/issues/22


> First tier, high cost of living should be for high earners, singles or childless people, they should pay more taxes while second tier areas should offer UBI, and should generally subsidise people having children

That seems unfair to me. Why should singles and childless people subsidise people with children?


> Why should singles and childless people subsidise people with children?

My point was mainly that that kind of first tier city would attract those kind of people, not specifically that they should be targeted.

If you lived in a 2nd tier zone you shouldn't pay extra in taxes even if you're childless.

But specifically to your question, I do think that it's fair that childless people should pay more taxes, because having a stable population is a requirement for a stable society and single/childless people aren't doing their part.

That duty can be offloaded to parents with more children, but they should be compensated for that.

You can frame it however you want -- they pay more taxes, parents pay less taxes, parents get tax rebates, parents get higher UBI per child; the outcome is the same.

I do think that first tier cities leech off and profit from the work of the parents and educators of the people who migrate there (and generally the whole area -- it takes a village to raise a child), and profit from exorbitant property taxes. So I think the only way to solve this curse is higher taxes (if required) on those areas and subsidised living in less desirable places -- provided that those 2nd tier places produce competent citizens.


Progressive income tax already does this "taxing first-tier cities a higher percentage" thing. The tax bands are the same in all cities, so people living in the cheaper, low pay provincial cities already pay less tax as a proportion of income. This effect is very stark in comparing London to the rest of the UK.
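The arithmetic is easy to see with a toy marginal-band calculation (the bands below are rough UK-like figures chosen for illustration, not exact rates):

```typescript
// Marginal tax bands: each [upperBound, rate] applies to income
// between the previous bound and upperBound.
const bands: [number, number][] = [
  [12_570, 0.0],    // personal allowance
  [50_270, 0.2],    // basic rate
  [Infinity, 0.4],  // higher rate
];

function taxDue(income: number): number {
  let tax = 0;
  let prev = 0;
  for (const [upper, rate] of bands) {
    const taxable = Math.min(income, upper) - prev;
    if (taxable <= 0) break;
    tax += taxable * rate;
    prev = upper;
  }
  return tax;
}

// Same bands everywhere, but the higher (London-ish) salary
// pays a larger share of income in tax.
console.log((taxDue(60_000) / 60_000).toFixed(3)); // → 0.191
console.log((taxDue(30_000) / 30_000).toFixed(3)); // → 0.116
```

So with identical bands nationwide, the high-pay city's effective rate is substantially higher, which is the implicit cross-subsidy described above.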


> Why should singles and childless people subsidise people with children?

children are the future of our society. even in today's system, children will be the ones paying your pension when you retire. putting that burden on parents alone is what is unfair. how would you like your pension to be measured based on the number of children you raised? maybe if you have no children you shouldn't get any pension at all?


Where did you get the idea anyone will be paying my pension when I retire?


Capital accumulation without labor is a fiction. If nobody had any more children, all stocks would gradually decline to zero.


Pension is also a fiction (as in, I do not have one).


do you expect not to get any pension at all and just live on your own savings? do you live on a farm growing your own food?

you can't live in this society without relying on others, and they can't live without relying on you. and unless you want humanity to die out, future generations need our support.


This is more of a "100% certain" situation than an "expect" situation. Are you mixing pensions up with 401ks and social security?


i was using it as a general term for whatever money you receive after you retire except personal investments or savings. i could not find out how 401ks work, specifically i did not find out whether 401k works like any other pension scheme or not. the wikipedia page on pensions does not say that 401k is not a pension, therefore i figure it is included in the definition.

i just checked the 401k page, and it says: a 401(k) plan is an employer-sponsored, defined-contribution, personal pension (savings) account.

so it's a pension.

but what we call it doesn't really matter. more important is the question if the payout depends on future generations paying in. in my brief search i could not tell whether the 401k is protected against that or if it even can be protected. but if it is i'll have to retract my claim. my apologies.


Basic social security yield math.


Copied verbatim from the AI generated summary:

To choose the best rice cooker, consider these factors:

Top Brands: Zojirushi is often considered the best brand, with Cuckoo and Tiger as close contenders. Aroma is considered a good budget brand [1].

Types: Basic on/off rice cookers are good for simple white or brown rice cooking and are usually affordable and easy to use [2].

Considerations: When buying a rice cooker, also consider noise levels, especially from beeping alerts and fan operation [3].

Specific Recommendations: Yum Asia Panda Mini Advanced Fuzzy Logic Ceramic Rice Cooker is recommended for versatility [4]. Yum Asia Bamboo rice cooker is considered the best overall [5]. Russell Hobbs large rice cooker is a good budget option [5]. For one to two people, you don't need a large rice cooker unless cost and space aren't a concern [6]. Basic one-button models can be found for under $50, mid-range options around $100-$200, and high-end cookers for hundreds of dollars [6].

References: [1] What is the best rice cooker brand? : r/Cooking - Reddit (www.reddit.com) [2] The Ultimate Rice Cooker Guide: How to Choose the Right One for Your Needs (www.expertreviewsbestricecooker.com) [3] Best Rice Cooker UK | Posh Living Magazine (posh.co.uk) [4] Best rice cookers for making perfectly fluffy grains - BBC Good Food (www.bbcgoodfood.com) [5] The best rice cookers for gloriously fluffy grains at home (www.theguardian.com) [6] Do You Really Need A Rice Cooker? (The Answer Is Yes.) - HuffPost (www.huffpost.com)

