
Do you dislike type inheritance, or only implementation inheritance? My view is that type inheritance is incredibly useful, both for single-system programming and for RPC, whereas implementation inheritance creates brittle systems.
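To make the distinction concrete, here is a minimal Python sketch (the names are illustrative, not from any particular codebase): type inheritance only promises an interface, while implementation inheritance couples a subclass to the parent's internals.

    from abc import ABC, abstractmethod

    # Type inheritance: Sink declares an interface only. Implementers
    # promise behaviour but share no code with the base class, so the
    # same contract works locally or behind an RPC boundary.
    class Sink(ABC):
        @abstractmethod
        def write(self, data: bytes) -> None: ...

    class FileSink(Sink):
        def __init__(self, path: str) -> None:
            self._f = open(path, "ab")

        def write(self, data: bytes) -> None:
            self._f.write(data)

    # Implementation inheritance: BufferedSink reuses FileSink's internals.
    # Any change to FileSink's private attributes or write() semantics can
    # silently break this subclass, which is where the brittleness comes in.
    class BufferedSink(FileSink):
        def __init__(self, path: str) -> None:
            super().__init__(path)
            self._buffer = b""

        def write(self, data: bytes) -> None:
            self._buffer += data
            if len(self._buffer) >= 4096:
                super().write(self._buffer)
                self._buffer = b""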


"The only difference is how it presents itself to the switch (ie, says its a Cisco optic), not actual difference in performance."

That's not the only difference. I have had situations where I ran equivalent optics side by side: one was hot to the touch and the other was not. They do contain different components. In that test, the atgbics SFP was cool and the other clone unit was hot. My dealer was able to get me in contact with someone technical at atgbics (the cool-running unit) who explained the difference: "The DSP might be say 13nm where more modern more expensive ones are 5nm."

But you definitely do not need to pay for "genuine" optics to get high-reliability optics. You just need to shop around among the clones; atgbics is a clone.


Before I comment, a disclaimer about my small scale: I run probably three hundred SFP+s and have less than five years of experience with optics. I don't have stock tracking for the individual manufacturers, and the failure-rate comments here are based on gut feel only. (There will be other people here used to far larger scales.)

I bucket it into three options: genuine, clone, and good clone.

We had a bad run with fs.com QSFP+s. Their SFP+s have been better for me, but I reckon I have had a couple fail.

Atgbics has been a reliable clone supplier of SFP+s for us. I don't think I have had any of those fail, and they have been my main vendor for a while now. You can order them programmed with personalities for Cisco, etc.

Part of fs.com's edge is that it is so easy to place an order and get fast delivery. My main site is in a different country from where I live, and I do a few trips a year. Several times they have made low-notice projects possible.


We accidentally ordered a load of “Generic brand” 100G QSFPs from FS. Everything appeared fine from the perspective of the switch and the cards, with OK status reported for everything, _except_ that the lasers never turned on. Switching to an Extreme switch made the switch end work fine, but not the server.

Turns out Mellanox/NVIDIA hardware is _really_ picky about which optics its cards will accept. We bought a box from FS that reprograms the compatibility firmware, and the modules worked instantly (FS also offered to take them back and reprogram them, but we needed them fast).

This was a big shock after dealing with nothing but CAT5/6/RJ45, which has been stable and common for decades.


> Atgbics SFP+s have been a reliable clone supplier for us.

With the caveat that I'm a USian and my scale is even lower than yours (ten 10Gbit SFP+ modules in my apartment, which doubles as home, office, and lab, running trouble-free for the past three years), I've found 10Gtek to be a reliable supplier. You can order 10Gbit SFP+ modules straight from them for 14 USD per module. Though shipping straight from them is currently pretty terrible: $35 if you're spending less than $800.

Stores like Newegg will often meet or beat that per-module price and offer free shipping if you buy a bundle of four or more... but modules with the personality you want may not be in stock.


Valve is not the market maker here; they are the exchange.


Don’t they also 100% control the supply though?


Yes, Valve controls supply. That strengthens my point.

Market makers do not control the supply of goods. They provide resting liquidity for pre-existing goods.

Similarly, market makers do not get to establish the rules of their own "reality". Market makers are participants in a venue. It is the venue/exchange that sets the rules.
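If it helps, here is a rough Python sketch of the two roles; everything in it is hypothetical and only meant to show who sets the rules and who supplies liquidity.

    # The exchange owns the venue: it keeps the order book and applies
    # its own matching rules. Participants do not get to change those rules.
    class Exchange:
        def __init__(self) -> None:
            self.bids = []   # resting buy orders: (price, qty, owner)
            self.asks = []   # resting sell orders: (price, qty, owner)

        def submit(self, side: str, price: float, qty: int, owner: str) -> None:
            (self.bids if side == "buy" else self.asks).append((price, qty, owner))
            self.match()

        def match(self) -> None:
            # Venue-defined crossing logic would live here.
            pass

    # A market maker is just one participant. It posts resting quotes on
    # both sides of the book, providing liquidity in goods it does not issue.
    class MarketMaker:
        def quote(self, venue: Exchange, fair_value: float, spread: float = 1.0) -> None:
            venue.submit("buy", fair_value - spread, 10, owner="mm")
            venue.submit("sell", fair_value + spread, 10, owner="mm")

In this framing, Valve occupies the exchange role; controlling the supply of items is a separate role again, and not one a market maker has.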

User Bengalilol seems to have inferred that because Valve made the venue, he can refer to them as the "market maker". This is not correct. Words have meaning. The meaning of "market maker" is well established in the context of exchanges, and it is the wrong terminology for Valve's role.


Microserfs?


Without convenience it will not be successful as a common currency. It does not need convenience to succeed in other ways, for example as a store of value.


Sorry, I should have been more specific. "Succeed as a common currency" is more what I meant; I think the store-of-value argument stands.


Literally any scarce and durable good can be a store of value. A pound of Osmium is about a million bucks. So building massive server farms to store something equivalent to an inert rock is kind of uninteresting.


I tend to run my tmux session for months at a time on my office workstation. When I remote in to that computer, I can type ‘tmux attach’ and all my context is there. I might have four long-arc dev projects running at once, plus my planning system, all within those windows.

On our datacentre servers, I also have tmux running. It is fast to connect to these hosts, attach tmux and continue from where I left off.

Another use case: it is common for corporates to require devs to use Windows desktops, but to then give them a headless Linux host in a datacentre for development work. Here, you use PuTTY to connect to the Linux host, fullscreen it, and run tmux. On your desktop you have Outlook, Office, PuTTY and a browser, and no dev tools. You can do all your planning and dev work on the Linux host, using your favourite ten-thousand-hour text editor and building your own tools, and this becomes your hub. You lose awareness that you are connected to it from a locked-down Windows host. Corporate security reboots your Windows host for patching several nights in a row, and it causes you no hassle because your work context is in the tmux session on another host.


Summary: The mainstream ear has changed. As a result, traditional choral compositions have become less accessible to mainstream audiences, but the form of choral music remains accessible. People who participate in choral music train themselves into a traditional taste as a side-effect of participation.

Sung pentatonic music seems to be accessible to everyone from a young age.

For most things beyond that, our brains need exposure to the form to be able to appreciate it. This affects rhythm, melody and instruments.

My three-year-old hates the sound of guitar distortion. I am confident he will acclimatise to it.

Accessibility of traditional choral music will be influenced by what the audience knows. People who grew up with sung carols at Christmas will be more open to it than people who have grown up with post-war pop Christmas music.

Everyone now living in the developed world has been exposed to beat-backed major/minor easy-listening music by television, films, car radio and shopping centres. This is recent. People a hundred years ago did not have the same ear. The large choral work Elijah was easy-listening to audiences who had heard sung mass hundreds of times.

In 2025, a church music director wanting a twentieth century composer would schedule Rutter easily - Rutter writes music that suits the ear of pop Christmas. They would prefer Howells if they thought the congregation had a more traditional ear. They would schedule Messiaen only for a particular occasion.

The OP wrote - "It has struck me that most recommenders and lovers of choral music [are] themselves singers (or conductors) of choral music."

It is easy to get involved, so many people who are curious get involved. Once involved, people will find their tastes becoming traditional as a side-effect of exposure to the repertoire. This creates a running division between people who participate and the mainstream.

Note that until about seventy years ago, almost everyone who loved music would have participated in it, even if only by singing to young children or helping out at church. Outside royal circles, the practice of loving music yet being a pure consumer is a recent phenomenon.

Some forms of choral music have a different relationship with pop than high-church music does. For example: gospel, or choral accompaniment to rock songs like /Under the Bridge/ or /You Can't Always Get What You Want/. The Beatles were a mainstay of post-war pop, but /Because/ on Abbey Road has the character of a Renaissance choral work; George Martin was classically trained.

The mainstream ear may be making another shift now, toward more sophisticated beats with closer melodies (smaller pitch jumps) and simpler chords. If it happens, we will see evidence of it in popular Christmas music. As far as I know there has not been a new addition to that repertoire since /All I Want for Christmas Is You/, which is twentieth-century pop.


It is a terminal multiplexer. You will be able to find YouTube videos. The GP is talking about a tool called GNU Screen. If you need a more distinct token to search on, try “tmux”.



I want to ask about the bureaucracy aspect. I have never written a science grant application, but I expect that some of it comes about because the application process is designed to ensure good governance around the proposals. Do you agree? For the fluff that genuinely has no productive value, do you have any explanation for why it is there?

Could LLM participation be blowing holes in good-governance measures that were only weakly effective, and therefore be a good thing in the long term? Could the rise of the practice drive grant arrangements toward better governance?


These are very good questions, and I only have vague answers because it's not easy to understand how bureaucratic systems come to be, grow, and work (and it's not my speciality), but I'll try to do my best.

Indeed, some of the fluff is due to the first reason. For example, the data management plan (where you need to specify how you're going to handle data) has good intentions: it's there so that you explain how you will make your data findable, interoperable, etc., which is a legitimately positive thing, as opposed to e.g. never releasing the research software you produce and making your results unreproducible.

But the result is still fluff: I (well, Gemini and I) wrote one last week. It's 6 pages, and what it says could be said in 2-3 lines: we use a few standard data formats, we will publish all papers on arXiv and/or our institutional repository, software on GitHub, data on GitHub or data repositories, and all the relevant URLs or handles will be linked from the papers. That's pretty much all, but of course you have to put it into a document with various sections and all sorts of unnecessary detail.

Why? I suppose in part due to the requirements of some disciplines "leaking" into others (I can imagine that for people who work with medical data it's important to specify in fine detail how they're going to handle the data, but not for my projects, where I never touch sensitive data at all). And in part due to the tendency of bureaucracies to grow: someone adds something, and then it's difficult to remove it because "hey, what if for some project it's relevant?", etc.

Then there are things that are outright useless, like the Gantt chart. At least in my area (CS), you can't really Gantt-chart what you're going to do in a 5-year project, because it's research. Any nontrivial research should be unexpected, so beyond the first year you don't know exactly what you'll be doing.

Why is that there? I suppose it can be a mix of several factors:

- Maybe, again, spill from other disciplines: I suppose in some particular disciplines a Gantt chart might be useful. Perhaps if you're a historian and you're going to spend one year at a given archive, another year at a second archive, and so on. But in CS it's useless.

- Scientists who end up in bureaucratic roles are those who don't actually like doing science that much, so they tend to focus on the fluff rather than on actual research.

- Research is unpredictable, but funding agencies want to believe they're funding something predictable. So they make you plan the project, and then write a final report on how everything turned out just as planned (even if this requires contorting facts) to make them happy.


