Hacker News | shadowgovt's comments

There also seems to be the assumption that people of a particular race can't be racist against members of the same race.

"Uncle Tom's Cabin" is an old story by now, and "One of the good ones" an old meme.


The author did not make such a statement; rather (it appears to me), they were surprised by the lack of empathy exhibited by said similar-origin agents. A feeling that is shared by a lot of people who are opposed to the inhumane brutality being meted out, without repercussions, by ICE agents.

Because race is a story we tell ourselves, it turns out to not be as hard as one might assume to bend the story to make ingroups and outgroups out of people that, to an outside observer, should have more in common than their interactions suggest they do.

The relevant concept of racism here is bigotry + power. So you give anyone power and suddenly their view on race matters.

The homeless black guy hating white people? What does that matter, he can't go shoot them in the back.


[flagged]


Anyone can do that. How often do you think a racist CEO funds a fascist president who funds the oppression of immigrants, killing them?

Anyway, you are in personal-responsibility land and have been told you are the master of your domain. Rarely do people who think individuality is paramount care about society. Your example alone allows you to fixate on anecdotes because they hit your dopamine.


For what it's worth, that kind of lumping of drivers is more-or-less one of the metrics Waymo is using to self-evaluate. Perfect safety when multi-ton vehicles share space with sub-300-pound humans is impossible. But they ultimately seek to do better than humans in all contexts.

It's true, but the main reason I haven't just switched to an iPhone is the ecosystem that lets me write apps without having to pay Apple money or use their computers.

If Google is narrowing their moat on this, there are a lot fewer reasons for me, personally, to stay on the platform.


Sure, but the alternative ain't better for it, no?

I'm not quite sure I catch your meaning; "it" is an unbound pronoun in that sentence.

If I assume "it" means "programming on a mobile device": yes, it is. Apple cares an awful lot about the developer experience, has massive support, and a deep well of shareable knowledge. Google is about the same (the developer experience is a little patchier; I'd generously call Google's approach to devex on Android "bag-of-cats vision" and since one is not developing on, generally, a vertically-integrated tech stack, one has to struggle a bit more to get the tools set up and maintained).

The big selling point for Android is freedom of that stack, and if they throw sand in those gears, the benefits of the vertically-integrated stack that you have to pay-to-play start to become actually enticing.


XML was abandoned because we realized bandwidth costs money and while it was too late to do anything about how verbose HTML is, we didn't have to repeat the mistake with our data transfer protocols.

Even with zipped payloads, it's just unnecessarily chatty without being any more readable.


That doesn't match my memory, though it's been a while now!

I remember the arguments largely revolving around verbosity and the prevalence of JSON use in browsers.

That doesn't mean bandwidth wasn't a consideration, but I mostly remember hearing devs complain about how verbose or difficult to work with XML was.


Your memory is correct. Once compression was applied, the size on the wire was mostly a wash. Parsing costs were often greater but that's at the endpoints.
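The "mostly a wash" claim is easy to sanity-check. Here's a rough sketch (the record shape and field names are invented for illustration) comparing raw and gzipped sizes of equivalent JSON and XML payloads:

```python
import gzip
import json

# Hypothetical record set, purely for illustration.
records = [{"id": i, "name": f"user{i}", "active": True} for i in range(100)]

json_payload = json.dumps(records).encode()

# The same records, hand-rolled into an equivalent XML document.
xml_items = "".join(
    f"<record><id>{r['id']}</id><name>{r['name']}</name>"
    f"<active>{str(r['active']).lower()}</active></record>"
    for r in records
)
xml_payload = f"<records>{xml_items}</records>".encode()

print(f"raw:     json={len(json_payload)}B  xml={len(xml_payload)}B")
print(f"gzipped: json={len(gzip.compress(json_payload))}B  "
      f"xml={len(gzip.compress(xml_payload))}B")
```

The raw XML is noticeably bigger (all those closing tags), but repeated tags are exactly what gzip compresses best, so the gzipped sizes land much closer together; the remaining difference is mostly parsing cost at the endpoints.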

But one of those endpoints is a client on a mobile phone, which when we started with Internet on mobile devices wasn't a particularly powerful CPU architecture.

OK, but XML is a pretty solid format for a lot of other stuff that doesn't necessarily need network transmission.

This is true, but if other formats work for those purposes and also network transmission, they'll start to edge out the alternative of supporting two different protocols in your stack.

The article addresses this.

If bandwidth was a concern, JSON was a poor solution. XML compresses nicely and efficiently. Yes, it can be verbose to human eyes, but I don't know if bandwidth is the reason it's not used more often.

JSON absolutely isn't perfect, but it's a spec that you can explain in ~5 minutes, mirrors common PL syntax for Dict/Array, and is pretty much superior to XML in every way.
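The "mirrors common PL syntax" point is concrete: in most dynamic languages a JSON document parses straight into native dicts and arrays, with no schema or binding layer. A minimal Python illustration (the document contents are invented):

```python
import json

# An invented document, just to show the direct mapping to native types.
doc = '{"name": "ada", "tags": ["x", "y"], "count": 3, "active": true}'

parsed = json.loads(doc)  # object -> dict, array -> list, true -> True
assert parsed["tags"] == ["x", "y"]

# Serialization is the same one-liner in reverse, and it round-trips.
round_tripped = json.loads(json.dumps(parsed))
```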

Sure, but the argument is bandwidth, which is what I'm comparing the two solutions against.

To my mind, this is the huge bit that should not be overlooked.

So much infrastructure is there to support doing "it" in the Cloud, for all definitions of "it." If we can vibe-code bespoke one-offs to solve our problems, a lot of that Cloud interaction goes away... And that stuff is expensive and complicated.

Hypothetically, open source app stores (I'm counting apt here) address this, but then it's someone else's solution to my problem, which doesn't quite fit my problem perfectly.

This approach to software engineering could be what 3D printing is to tangible artifacts (and I mean that including the limits of 3D printing regarding tangible artifacts, but even still.)


There's also speed - no matter what you do, a local app will simply be faster than using one from a SaaS provider.

But they'd have to be on the same network as me to do that attack, right?

Yep, like ECHELON and friends are. The metadata recorded about your (all of our) traffic is probably enough to perform the timing attack.

Hey, if ECHELON snuck a listener into my house, where six devices hang out on a local router... Good for them, they're welcome to my TODO lists and vast collection of public-domain 1950s informational videos.

(I wouldn't recommend switching the option off for anything that could transit the Internet or be on a LAN with untrusted devices. I am one of those old sods who doesn't believe in the max-paranoia setting for things like "my own house," especially since if I dial that knob all the way up the point is moot; they've already compromised every individual device at the max-knob setting, so a timing attack on my SSH packet speed is a waste of effort).
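For reference, the knob in question is presumably OpenSSH's ObscureKeystrokeTiming (added in OpenSSH 9.5, on by default), which pads interactive sessions with fake keystroke packets so packet timing no longer reveals inter-keystroke timing. A client-side config sketch:

```
# ~/.ssh/config (requires OpenSSH 9.5+)
Host *
    # Send chaff packets at fixed intervals to mask keystroke timing.
    ObscureKeystrokeTiming yes
```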


An interoperable search index access standard might work. We've done something similar for peering and the backbone of the IP-layer interconnects themselves.

You have to make it economically preferable, and there's no known solution to this. Large networks are still using their positions to bully smaller ones off the IP-layer internet backbone.

But in this current climate, they can admit it and then dare Google to tell them to stop... After Google has just had an antitrust ruling against it for dominating the search market.

Google doesn't really have a leg to stand on and they know it.


In social networks, revenue is enhanced by stickiness.

Anger increases stickiness. Once one discovers there are other people on the site, and they are guilty of being wrong on the internet, one is incentivized to correct them. It feels useful because it feels like you're generating content that will help other people.

I suspect the failure of the system that nobody necessarily predicted is that people seem to not only tolerate, but actually like being a little angry online all the time.


Every time I see people talk about 'dead internet' I think about what web sites looked like before Google dropped PageRank into the whole ecosystem and utterly killed the utility of putting huge blobs of black-on-black text at the bottom of a page for keyword-match purposes. This was then shortly followed by Google becoming so popular they could use user behavior itself as a signal for site quality (much to the chagrin of every new publisher ever who suddenly had a catch-22 of "Nobody visits my site because Google won't uprank it because nobody visits my site").

The Internet has never been dead. Or alive. Ever since it escaped its comfortable cage in the university / military / small-clique-of-corporations ecosystem and became a thing "anyone" can see and publish on, there has forever been a push-pull between "People wanting to use this to solve their problems" and "People wanting eyeballs on their content, no matter the reason." We're just in an interesting local minimum where the ability to auto-generate human-shaped content has momentarily overtaken the tools search engines (and people with their own brains) use to filter useful from useless, and nobody has yet come up with the PageRank-equivalent nuclear weapon to swing the equation back again.

I'm giving it time, and until it happens I'm using a smaller list of curated sites I mostly trust to get me answers or engage with people I know IRL (as well as Mastodon, which mostly escapes these effects by being bad at transiting novelty from server to server), because thanks to the domain name ownership model pedigree of site ownership still mostly matters.

