> Trump's one quality that may save us all from this quagmire is his ability to do a complete 180 on his previously committed path - "TACO" as his detractors like to call it.
If you haven't been paying attention, Trump has declared victory and called it quits roughly every other day for the past several weeks. It hasn't stuck, principally because Iran is the main actor that can decide whether or not to call it quits, and they have no reason to call it quits until they believe that Trump is actually serious in calling it quits.
One of the most surreal things is the sheer disconnect going on. The energy sector and everyone who's impacted are basically running around going "the strait's gonna be closed for months, we're turbofucked." The finance people are betting that the crisis will be over if not tomorrow then next week at the latest. And Trump et al are acting as if the crisis ended yesterday.
> I wonder if it will weigh on them at all if another school gets blown up or another thousand people die while they slow-walk the vote on the next war powers resolution.
The Democrats are the minority party. They don't control the agenda of legislative votes. But sure, blame them for the things they don't control, rather than the Republicans who want to avoid embarrassing their dear leader even as he leads his party to what looks to be utterly crushing defeats in the next elections with some of the most historically unpopular policies ever.
> I wonder if C++ has some hairy concepts and syntax today on par with Rust's more difficult parts.
Unqualified name lookup has been challenging in C++ since even before C++11. Overload resolution rules are so painful that it took me weeks to review a patch, simply because I had to back out of trying to make sense of the rules in the standard. There are several slightly different definitions of initialization. If you really want to get in the weeds, start playing around with std::launder and std::byte and strict aliasing rules and lifetime rules, and you'll yearn for the simplicity of Rust.
C++ is the absolute most complex of any of the languages whose specifications I have read, and that's before we get into the categories of things that the standard just gives up on.
> start playing around with std::launder and std::byte and strict aliasing rules and lifetime rules, and you'll yearn for the simplicity of Rust
Annotations like std::launder, lifetime manipulation, etc solve a class of problems that exist in every systems language. They inform the compiler of properties that cannot be known by analyzing the code. Rust isn't special in this regard, it has the same issues.
Without these features, we either relied on unofficial compiler-specific behavior or used unnecessarily conservative code that was safe but slower.
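To make that concrete, here's a minimal Rust sketch (the function name is mine) of the same kind of feature: unsafe code handing the compiler a property it could not have deduced from the code alone.

```rust
// Hedged sketch: like std::launder and friends in C++, Rust lets unsafe
// code assert a fact the optimizer could not have proven itself.
fn get(v: &[u8], i: usize) -> u8 {
    // SAFETY: the caller guarantees i < v.len(); asserting it lets the
    // compiler drop the bounds check it would otherwise have to keep.
    unsafe {
        core::hint::assert_unchecked(i < v.len());
        v[i]
    }
}

fn main() {
    assert_eq!(get(&[10, 20, 30], 1), 20);
}
```

Without the assertion the indexing is still correct, just conservatively checked, which is exactly the "safe but slower" fallback described above.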
> Rust isn't special in this regard, it has the same issues.
This is both fundamentally true and misleading. Rust has to solve the same issues but isn't obliged to make all the same bad choices to do that and so the results are much better.
For example, C++ dare not perform compile-time transmutations, so it just forbids them, and a whole bunch of extra stuff landed to work around that; in Rust they're actually fine, and so you can just:
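A minimal sketch of what that looks like (names are mine, assuming a current Rust where transmute is allowed in const evaluation):

```rust
// In a const, the transmutation runs during compile-time evaluation and
// the result is checked. 2 is not a valid bit pattern for bool, so
// uncommenting this line is a compile-time error, not runtime UB:
// const BAD: bool = unsafe { core::mem::transmute::<u8, bool>(2) };

// 0 (or 1) is a valid bit pattern, so this compiles and yields false:
const OK: bool = unsafe { core::mem::transmute::<u8, bool>(0) };

fn main() {
    assert!(!OK);
}
```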
That blows up at compile time because we claimed the bit pattern for the integer 2 is a valid boolean and it isn't. If we choose instead 0 (or 1) this works and we get the expected false (or true) boolean instead of a compiler diagnostic.
C++ could allow this but it doesn't; rather than figure out all the tricky edge cases they just said no, use this other new thing we made.
> For example C++ dare not perform compile time transmutations
I am confused by this assertion. You can abuse the hell out of transmutations in a constexpr context. The gap between what is possible at compile time and run time became vanishingly small a while ago.
I think your example is not illustrative in any case. Many C++ code bases work exactly like your example, enforced at compile-time. That this can be an issue is a hangover from retaining compatibility with C-style code which conflates comparison operators and cast operators. It is a choice.
C++ can enforce many type constraints beyond this at compile-time that Rust cannot, with zero effort or explicit type creation. No one should be passing ints around.
Surely "We have many different ways to do this, each with different rules" is exactly the point? C++ 20's std::bit_cast isn't necessarily constexpr, by the way, although it is for the trivial byte <-> boolean transmutation I mentioned here.
I see that C++ people were more comfortable with the "We have far too many ways to initialize things" examples of this problem but I think transmutation hits harder precisely because it sneaks up on you.
bit_cast and reinterpret_cast do different things: one works at the value level, the second preserves address identity (and it is problematic from an aliasing point of view).
Not sure what any of this has to do with initialization though.
FWIW, the direct translation of your rust code is:
constexpr char y = 2;
constexpr bool x = std::bit_cast<bool>(y);
It fails on clang for y=2 and works for y=1, exactly like Rust.
GCC produces UB for y=2; I don't know if it is a GCC bug or the standard actually allows this form of UB to be ignored at constexpr time.
What is the rust equivalent of reinterpret_cast and does it work at constexpr time?
edit: I guess it would be an unsafe dereference of a casted pointer. Does it propagate constants?
Firstly, that's not a direct translation because you're making two variables and I made none at all. Rust's const is an actual constant, it's not an immutable variable. We have both, but they're different. The analogous Rust for your bit cast example would make two immutable variables that we promise have constant values, maybe:
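A sketch of that shape (deliberately kept close to the C++ version, and so rejected by the compiler):

```rust
// Two immutable bindings promised to be constant. As written this is
// rejected at compile time, since 2 is not a valid bit pattern for bool
// (and the lowercase names would draw style warnings besides):
const y: u8 = 2;
const x: bool = unsafe { core::mem::transmute::<u8, bool>(y) };
```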
Of course this also won't compile, because the representation for 2 still isn't a boolean. If it did compile you'd also (by default) get angry warnings because it's bad style to give these lowercase names.
I also don't know if you found a GCC bug but it seems likely from your description. I can't see a way to have UB, a runtime phenomenon, at compile time in C++ as the committee imagines their language. Of course "UB? In my lexer?" is an example of how the ISO document doesn't understand intention, but I'd be surprised if the committee would resolve a DR with "That's fine, UB at compile time is intentional".
I understand that "these are different things" followed by bafflegab is how C++ gets here but the whole point of this sub-thread is that Rust didn't do that, so in Rust these aren't "different things". They're both transmutation, they don't emit CPU instructions because they happen in the type system and the type system evaporates at runtime.
So this is an impedance mismatch, you've got Roman numerals and you can't see why metric units are a good idea, and I've got the positional notation and so it's obvious to me. I am not going to be able to explain why this is a good idea in your notation, the brilliance vanishes during translation.
I'm using two variables because numeric literals have the wrong type and bit_cast rejects transmutations between differently sized types.
I could have written it as x = bit_cast<bool>(char{2}), but does it really make a difference?
I don't know enough rust to know what's the difference between its const and c++ constexpr. It might not be a meaningful difference in C++.
> So this is an impedance mismatch, you've got Roman numerals and you can't see why metric units are a good idea, and I've got the positional notation and so it's obvious to me. I am not going to be able to explain why this is a good idea in your notation, the brilliance vanishes during translation.
There are plenty of rust users on HN that are capable of kind, constructive, and technically interesting conversations. Unfortunately there are a small few that will destroy any goodwill the rest of the community works hard to generate.
> I could have written it as x = bit_cast<bool>(char{2}), but does it really make a difference?
Not really, that's also a variable. We're running into concrete differences here, which is what I was gesturing at. In C++ you've got two different things, one old and one new, and the new one does some transmutations (and is usually constexpr) while the old one does others but isn't constexpr. It's not correct to say that reinterpret_cast isn't a transmutation; for example, it's the recognised way to do the "I want either a pointer or an integer of the same size" trick in C++, which is exactly that. Let me briefly explain, as much to ensure it's clear in my head as yours:
In C++ we have an integer but sometimes we're hiding a pointer in there using reinterpret_cast, in Rust we have a pointer but sometimes we're hiding an integer in there using transmute [actually core::ptr::without_provenance but that's just a transmute with a safe API]. Of course the machine code emitted is identical, because types evaporate at compile time the CPU doesn't care whether this value in a register "is" a pointer or not.
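As a hedged sketch of the Rust side of that trick (the function name is mine, the address 0x1000 is an arbitrary made-up value that is never dereferenced, and this assumes a recent Rust with the strict-provenance pointer APIs):

```rust
// Stash a plain integer in a pointer-typed value. The pointer carries
// no provenance, so it may not be dereferenced; it just holds the bits.
fn int_as_ptr(addr: usize) -> *const u8 {
    core::ptr::without_provenance(addr)
}

fn main() {
    let p = int_as_ptr(0x1000);
    // The integer survives the round trip through the pointer type.
    assert_eq!(p.addr(), 0x1000);
}
```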
Anyway, yes the issues are the same because ultimately the machines are the same, but it's not true that C++ solved these issues the only way they could be addressed, better is possible. And in fact it would surely be a disappointment if we couldn't do any better decades later. I hope that in twenty years the Rust successor is as much better.
I don't know a way to express actual constants in C++ either. If there isn't one yet maybe C++ 29 can introduce a stuttering type qualifier co_co_const to signify that they really mean constant this time. Because constexpr is a way to get an immutable variable (with guaranteed compile time initialization and some other constraints) and in C++ we're allowed to "cast away" the immutability, we can actually just modify that variable, something like this: https://cpp.godbolt.org/z/EYnWET8sT
In contrast it doesn't mean anything to modify a constant in either language, it's not a surprise that 5 += 2 doesn't compile and so likewise Rust's core::f32::consts::PI *= 2; won't compile, and if we made our own constants we can't change those either. We can write expressions where we call into existence a temporary with our constant value, and then we mutate the temporary, but the constant itself is of course unaffected if we do this.
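A tiny runnable illustration of that last point (names are mine):

```rust
const FIVE: i32 = 5;

fn main() {
    // Call into existence a temporary carrying the constant's value...
    let mut tmp = FIVE;
    tmp += 2; // ...and mutate the temporary.
    assert_eq!(tmp, 7);
    // The constant itself is of course unaffected.
    assert_eq!(FIVE, 5);
}
```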
This can be a perf footgun. You will see newcomers write Rust where they've got a huge constant (e.g. a table of 1000 32-bit floating point numbers) and code which just indexes into the constant in various parts of their program. If the index values are known at compile time this just optimises to the relevant 32-bit floating point number, because duh, but if they aren't, it's going to shove that entire table on your stack everywhere you do this, and that's almost certainly not what you intended. It's similar to how newcomers might accidentally incur copies they didn't mean to in C++ because they forgot a reference.
I'm afraid that's just C++ being C++ and you are deep into UB; you can't really modify a constexpr value at runtime, and if you cast away its constness with what is effectively a const_cast you are on your own. This will print "0, 3", which is obviously nonsense:
constexpr int x = 3;
((int&)x) = 0;
char y[x];
std::print("{}, {}", x, sizeof(y));
The output might change according to the compiler and optimization level.
You can also move the problematic sequence into a constexpr function and invoke it in a constexpr context: the compiler will reject the cast now.
Enum constants can't become lvalues, so the following also won't compile:
enum : int { x = 3};
((int&)x) = 0;
So I guess that's closer to the rust meaning of constant.
FWIW, notoriously you could modify numeric literals in early Fortrans and get into similar nonsense UB.
edit: in the end [1] it seems that you take exception to constexpr values not being prvalues in C++. I guess it was found more convenient for them to have an address [2]. That doesn't make them less constant.
[1] or at least I think you do, it is not clear to me what you have been trying to claim in this discussion.
[2] C++ will materialize prvalues into temporaries when their address is needed (and give them an unique address), I guess it was thought to be wasteful for large constexpr objects, and avoids the rust pitfall you mentioned.
> I'm afraid that just C++ being C++ and you are deep into UB
Of course, but the reason to even do this is only to illustrate that it's just another variable, nothing more. As much for myself as for you.
I've heard the stories about older languages but I assume that's not a thing on a modern Fortran and I'm sure we agree it's a bad idea.
The main thrust of this sub-thread was that languages can, and I believe Rust did, choose to solve the same issues but in a better way and so "This is too complicated in C++" doesn't translate to "It will be too complicated in every language". I think some C++ people have a version of the "End of History" nonsense, which is a nineteenth century idea. If you think noteworthy change happened after that and so history didn't end then hopefully you agree that makes no sense for general world history, and perhaps you can agree likewise C++ isn't the final apex of programming languages.
Is there a reason Rust would not (as it was done in the ‘good ole days’) index the table via pointer arithmetic from .data? Also, I'm assuming that because you are discussing new devs, they are not making the implementation decision to place the table on the heap and using Rust's subscript operator, which I would understand Rust not doing by default. I cannot think of a reason that the table should ever be put on the stack for reading a single value, so that seems an oddly pessimistic default. I could be missing something regarding how Rust handles literal data ‘written out’ into source though.
The table is on the stack because we conjured into existence a temporary, so it's exactly as if the programmer had conjured the variable as a local by hand. Suppose our table is named SOUP
const SOUP: [f32; 1000] = [ /* whatever */ ];
let foo = something(blah_blah, blah) * SOUP[4];
// The optimiser will see that SOUP[4] is exactly say 1.5_f32 so it'll just do
// the same as if we'd calculated something(blah_blah, blah) * 1.5_f32
However
let foo = something(blah_blah, blah) * SOUP[blah];
Now blah is a variable, we might need any value from SOUP, the optimiser doesn't know what values it might have - so a temporary is conjured into existence, equivalent to:
let tmp = SOUP;
let foo = something(blah_blah, blah) * tmp[blah];
You do presumably recognise that this now puts SOUP on the stack right? The temporary is equivalent, but without the explicitness.
Now if you know what you're doing you would of course have one single place in your program where you do this:
static SOUP: [f32; 1000] = [ /* whatever */ ];
And now there's an immutable global named SOUP and like "the good ole days" we don't keep writing this huge data blob to the stack only to subsequently look at a single element. But that's not the thing the noob wrote so that's not what they get.
"Sufficiently smart compilers" are a very gradual miracle. In theory a compiler could realise this is a good idea, in practice today I doubt you will find such a compiler, so just write explicitly that you want a single variable if that's what you want.
In my experience, conversions are one of the things that maximum warning levels do excellent static analysis for nowadays. In the last 15 years I've hardly had a couple of problems (init vs paren initialization). All narrowing etc. is caught out of the box with warnings.
My point is, in your exact example both reinterpret_cast and C-style casts have the exact same behavior, making the example bad. If you want to showcase a deficiency of C++, it would make sense to pick something where the difference between cast types actually matters.
The right strategy to use C++ efficiently is to set warnings to the maximum as errors and take the core guidelines or similar and avoid past cruft.
More often than not (except if you inherit codebases, though clang has a modernize tool) most of the cruft is either avoidable or caught by analyzers. Not all.
But overall, I feel that C++ is still one of the most competitive languages if you use it as I said and with a sane build system and package manager.
Certificate revocations are not required to be reported after the expiration date, so you can no longer reliably check if a certificate has been revoked (e.g., because its underlying key was exfiltrated or because it was misissued).
Honestly, the Fourth Crusade in 1204 was more of a "real" death of the Byzantine Empire than the conquest of Constantinople in 1453. Although the largest remnant of the Byzantine Empire was able to reconquer Constantinople in 1261, the city's population never recovered (it went from ~400k in 1204 to ~50k in 1453). The 14th century saw it riven with a series of civil wars, which the Ottomans used to expand their foothold into the remnants of the Byzantine Empire. By 1453, Constantinople was unable to really defend itself without garrisons from the major European states like Hungary and Venice, and Mehmet II was able to conquer the city before those states could get their forces sent out.
I've been trying out AI over the past month (mostly because of management trying to force it down my throat), and have not found it to be terribly conducive to actually helping me on most tasks. It still evidences a lot of the failure modes I was talking about 3 years ago. And yet the entire time, it's the AI boosters who keep trying to say that any skepticism is invalid because it's totally different than how it was three months ago.
I haven't seen a lot of goalpost moving on either side; the closest I've seen is from the most hyperbolic of AI supporters, who are keeping the timeline to supposed AGI or AI superintelligence or whatnot a fairly consistent X months from now (which isn't really goalpost-moving).
> How big a share of the desktop market do the BSDs have compared to Linux? I imagine it’s quite small, unfortunately.
Good stats are hard to come by, but the Linux : BSD ratio is probably no smaller than the Windows : Linux ratio (which is actually shrinking these days--Linux seems to be closing in on ~3% desktop share). That puts the BSDs overall in the 0.01% range, which is really too little market share to accurately measure.
There are three main problems with trying to offer a simple answer to the question of "what is the first computer?"
The most obvious of the problems is that a computer isn't a singular technology that springs up de novo, but something that develops from antecedents over a long, messy transition that requires a judgement call as to when the proto-computer becomes an actual computer, a judgement call which is obviously going to be biased by the other considerations. Consider, for a more contemporary example, what you would argue is the "first smartphone" or the "first LLM." Personally, I think the ENIAC is still somewhat too proto-computer for my tastes: I'd prefer a "first" that uses binary arithmetic and has stored programs, neither of which is true of the ENIAC.
The second major issue is that it's also instructive to look at the candidates' influence on later development. Among the contenders for "first computer," it's unfortunately kinda clear that ENIAC has the most lasting influence. ENIAC's development produced the papers that directly inspired the next generation of machines. Colossus is screwed here because of the secrecy of the code-breaking effort. Meanwhile, Zuse and the Z3 suffer from being on the losing end of WW2. The ABC has a claim here, but it's not clear whether the developers of ENIAC drew influence from it.
The final major issue isn't so much an issue by itself but rather something that colors the interpretation of the first two issues: national pride. An American is far more likely to weight the influence and ingenuity of the ENIAC and similar machines to label one of them the "first computer." A UK person would instead prefer to crown Colossus or the Manchester Baby. A German would prefer the Z3.
In many ways the ENIAC was more like an FPGA than a computer. It was programmed with patch cables connecting the different computational units as well as switches, and had no CPU as such. The cables had to be physically rerouted when changing to a new program, which took weeks. My understanding is that it was eventually programmed to emulate a von Neumann machine around 1948/49. As far as I understand, this was done mainly by Jean Bartik based on von Neumann's ideas.
If this is correct, it was not a von Neumann machine originally, but it eventually became one, and at approximately the same time as the Manchester Baby.
I'm writing my own programming language right now... which is for an intensely narrow use case: I'm building a testbed for comparing floating-point implementations without messy language semantics getting in the way.
There are lots of reasons to write your own programming language, especially if you don't care about it actually displacing existing languages.