borg16's comments

vibe around and find out

Folks have created software by "vibe coding". It is now time to "face the music" when doing so for production-grade software at scale.


At this point it may be considered a form factor that has been deprecated, despite the advantages it brings.


I guess folks in Azure wanted to show some solidarity with their AWS brethren.

(couldn't resist adding it. i acknowledge this comment adds no value to the discussion)


Azure goes down all the time. On Friday we had an entire regional service down all day; two weeks ago, same thing, different region. You only hear about it when it's something everyone uses, like the portal, because in general nobody uses Azure unless they're held hostage.


Yeah, I'm regretting my decision to buy an Xbox now. Every once in a while, everything goes down.


I never really understood the concept of small teams. Managing a small team really does not provide the scale and benefit that a medium-to-large team does. The lost bandwidth of that small team's manager, or the same manager's extra salary, seems like something the company could use in other places.

But often such teams in FAANG come up as a by-product of someone's empire building, and that is unfortunate for others involved in it.


The lead is also doing a lot of the regular dev work. Their work is small and self-contained enough that it doesn't need more headcount but also doesn't make sense to merge into another team.

I could even see there being a 1-person team, but their slow tooling and red tape creates the need for extra headcount.


Can someone tell me what some common use cases of such a tool are? Maybe I'm not the intended customer base, but I am curious to know what others are using it for.


It's not for everyone but I can think of 2 use cases immediately:

1. If you're a developer, then you're probably using SQLite to store data, and having a GUI for checking/modifying the database is probably handy.

2. Most applications by other developers (even massive companies) use SQLite to store data under the hood, even if the file doesn't appear to be named `.sqlite`. So if you want to tweak certain settings that aren't exposed to the end user, you can use this to do that in a more user-friendly way, rather than cracking out a command-line tool for SQLite changes.


To add to what xmprt and msephton have said, people have told me they use it for:

  - Storing results for scientific research
  - Local analysis of data exported from server-based databases
  - Experimenting with database designs before exporting SQL to codebases
  - Maintaining relational data where a website or app is not needed (e.g. tutors keeping client records)
  - Recovering data from databases used by other products (e.g. phone backups, discontinued apps)


It's for manipulating SQLite databases with a GUI, rather than in a web page or at the command line. I previously used V2 of Base and the user interface was excellent, and this looks to be even better.


I think your use of the phrase "terrifying forum" is aptly justified here. That has got to be the most unsettling subreddit I have ever come across on Reddit, and I have been using Reddit for more than a decade at this point.


I hadn’t had “that funny feeling” [0] for a while but yep that sub hit me like a truck.

It’s worth bearing in mind that it’s fairly small as subreddits go, I guess.

[0] https://youtu.be/ObOqq1knVxs?si=N5iqaCi5KZer0tsV


I keep telling myself it is satire or some sort of larping but we all know it isn’t.


I read in an earlier thread about this on HN: "this is a classic example of a data-driven product decision", i.e. we can reduce costs by $x if we just stop goo.gl links, instead of actually wondering how this would impact the customers.

Also helps that they are in a culture which does not mind killing services on a whim.


The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.

I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

They also might have wanted to use the domain for something else.


How much of a burden could this really be?

The nature of something like this is that the cost to run it naturally goes down over time. Old links get clicked less so the hardware costs would be basically nothing.

As for the actual software security, it's a URL shortener. They could rewrite the entire thing in almost no time with just a single dev. Especially since it's strictly hosting static links at this point.

It probably took them more time and money to find inactive links than it'd take to keep the entire thing running for a couple of years.
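To make the parent's point concrete: a static redirect service really is a few dozen lines. Here's a minimal sketch using only Python's standard library; the short codes, destinations, and port are all made up for illustration, and a real deployment would of course back the table with a database rather than a dict.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical read-only table of short codes -> destinations.
LINKS = {
    "/abc123": "https://example.com/some/long/path",
    "/xyz789": "https://example.org/another/page",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = LINKS.get(self.path)
        if target:
            self.send_response(301)             # permanent redirect
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404, "unknown short link")

    def log_message(self, *args):               # keep the demo quiet
        pass

def serve(port: int = 8080) -> HTTPServer:
    """Build the server; call .serve_forever() on the result to run it."""
    return HTTPServer(("", port), RedirectHandler)

# serve(8080).serve_forever()  # uncomment to actually run it
```

Since the link set is frozen, the whole thing is effectively a static key-value lookup; the hard parts Google cites (abuse scanning, internal infra migrations) live outside this code, which is rather the point of the thread.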


"How much of a burden could this really be?"

My understanding from conversations I've seen about Google Reader is that the problem with Google is that every few years they have a new wave of infrastructure, which necessitates upgrading a bunch of things about all of their products.

I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

If a product has an active team maintaining it they can handle the upgrade. If a product has no team assigned there's nobody to do that work.


My understanding is that (at least at one point) binaries older than about six months were not allowed to run in production. But APIs are "evolving" irregularly so the longer you go between builds the more likely something is going to break. You really need a continuous build going to stay on top of it.

Best analogy I can think of is log-rolling (as in the lumberjack competition).


Google is famously a monorepo and is basically the gold standard of CI/CD.

What does happen is APIs are constantly upgraded and rewritten and deprecated. Eventually projects using the deprecated APIs need to be upgraded or dropped. I don't really understand why developers LOVE to deprecate shit that has users but it's a fact of life.

Second hand info about Google only so take it with a grain of salt.


Simple: you don't get promoted for maintaining legacy stuff. You do get promoted for providing something new that people adopt.

As such, developing a new API gets more brownie points than rebuilding a service that does a better job of providing an existing API.

To be more charitable, having learned lessons from an existing API, a new one might incorporate those lessons learned and be able to do a better job serving various needs. At some point, it stops making sense to support older versions of an API as multiple versions with multiple sets of documentation can be really confusing.

I'm personally cynical enough to believe more in the less charitable version, but it's not impossible.


I agree this is an overriding incentive that hurts customers and companies. I don't think there's an easy fix: designing and creating new products demonstrates more promotion-relevant capabilities than maintaining legacy code does.


> I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.

Arrival of the new does not necessitate migration.

Only departure of old does.


They deprecate internal infrastructure stuff zealously and tell teams they need to be off of such and such by this date.

But it's worse than that because they'll bring up whole new datacenters without ever bringing the deprecated service up, and they also retire datacenters with some regularity. So if you run a service that depends on deprecated services you could quickly find yourself in a situation where you have to migrate to maintain N+2 redundancy but there's hardly any datacenter with capacity available in the deprecated service you depend on.

Also, how many man-years of engineering do you want to spend on keeping goo.gl running? If you were an engineer, would you want to be assigned this project? What are you going to put in your perf packet? "Spent 6 months of my time, and also bothered engineers on other teams, to keep this service that makes us no money running"?


> If you were an engineer would you want to be assigned this project?

If you're a high flyer, trying to be the next Urs or Jeff Dean or Ian Goodfellow, you wouldn't, but I'm sure there are many thousands of people able to do the job who would just love to work for Google, collect a paycheck on a $150k/yr job, and do that for the rest of their lives.


I'd like to encourage you to consider the following two perspectives:

1. A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think it would pass that well.

2. A Google middle manager trying to figure out if an engineer working exclusively on non-revenue projects is actually being useful or otherwise; this is made more complex by about 30% of the workforce trying to go for the rest and vest option provided by these projects.


> A senior Google leader telling the shareholders "we've asked 1% of our engineers, that's 270 people, costing $80M/year, to work on services that produce no revenue whatsoever." I don't think it would pass that well.

The business case for this is that Google loses a bunch of money in B2B (cloud mostly, potentially AI in the future) because professional users (developers etc.) don't believe that products will be supported. Every time Google shuts down a service like this, that perception is reinforced. We're investing this money into these services to change our brand perception and help us make more money in the future.

As a bonus, this kind of cultural change would also force them to rebuild their engineering systems (and promotional systems) to make this easier. This may not have mattered for Search/Ads but it will matter if they actually care about winning in cloud and AI.


A Google shareholder that shortsighted might as well ask why they have an HR department or custodians to maintain the offices; after all, those don't generate income either.

The manager in the trenches can tell if there's actual work happening. Moving goo.gl from the internal legacy system to the new supported one doesn't magically happen; code needs to change for it to work after the old system gets shut off.


Because it costs money to run things, and no one wants to pay for something that they aren't getting career value for.


A lot of Google infra services are built around the understanding that clients will be re-built to pick up library changes pretty often, and that you can make breaking API changes from time to time (with lots of notice).


But if you don't turn down the old, then you're supporting those systems endlessly, forever. At some point, it does become cheaper to migrate everything to the new.


And you could assign somebody to do that work, but who wants to be employed as the maintainer of a dead product? It’s a career dead-end.


> If a product has no team assigned there's nobody to do that work.

This seems like a good eval case for autonomous coding agents.


> How much of a burden could this really be?

You know how Google deprecating stuff externally is a (deserved) meme? Things get deprecated internally even more frequently and someone has to migrate to the new thing. It's a huge pain in the ass to keep up with for teams that are fully funded. If something doesn't have a team dedicated to it eventually someone will decide it's no longer worth that burden and shut it down instead.


I assume the general problem is people using these links for bad purposes, and Google having to deal with moderating them.


I think the concern is someone might scan all the inactive links and find that some of them link to secret URLs, leak design details about how things are built, link to documents shared with 'anyone with the link' permission, etc.


> I think the concern is someone might scan all the inactive links

How? Barring a database leak, I don't see a way for someone to simply scan all the links. Putting something like Cloudflare in front of the shortener with a rate limit would prevent brute-force scanning. I assume Google semi-competently made the shortener (using a random number generator), which would make it pretty hard to find links in the first place.

Removing inactive links also doesn't solve this problem. You can still have active links to secret docs.


To make the URLs actually short, you need to use most/all of the keyspace.

Back when it was made, shorteners were competing to see who could make the shortest URL, so I bet a brute force scan would find everything.
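The density argument is easy to put numbers on. A back-of-envelope sketch, assuming 6-character base-62 codes (goo.gl code lengths varied, so treat this as illustrative) and some made-up scan rates:

```python
# How hard is it to enumerate a 6-character base-62 short-link keyspace?
ALPHABET = 62   # [a-zA-Z0-9]
CODE_LEN = 6

keyspace = ALPHABET ** CODE_LEN
print(f"{keyspace:,} possible codes")        # 56,800,235,584

for rate in (1_000, 100_000):                # hypothetical requests/sec
    days = keyspace / rate / 86_400
    print(f"at {rate:,} req/s: ~{days:,.0f} days to scan everything")
```

At 1,000 req/s the full scan takes a couple of years, but a distributed scanner at 100,000 req/s finishes in about a week, and if the shortener kept codes maximally short (filling the keyspace densely, as the parent describes), a large fraction of guesses would hit live links.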


> You can still have active links to secret docs.

If they have a (passwordless) URL, they're not secret.


> My guess would be that it was a security and maintenance burden that nobody wanted.

Cloudflare offered to run it and Google turned them down:

https://x.com/elithrar/status/1948451254780526609


> I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.

Yeah, I can't imagine it being a huge cost saver. But I'm guessing the people who developed it moved on long ago, and it stopped being a cool project. And depending on the culture inside Google, it just doesn't pay career-wise to maintain someone else's project.


>The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.

It's a strange thing to consider 'since 2018' to be "a long time". Only in tech circles is this so, not in normal life.


I really doubt it was about security/maintenance burdens. Under the hood, goo.gl just uses Firebase Dynamic Links which is still supported by Google.

Edit: nevermind, I had no idea Dynamic Links is deprecated and will be shutting down.


Firebase Dynamic Links is shutting down at the end of August 2025.


I had no idea. It's too late to delete my comment now.

It's a really ridiculous decision though. There's not a lot that goes into a link redirection service.


Documents from 2018 haven't decayed or somehow become irrelevant.


I think the problem with URL shorteners like Google's, which include the company name, is that to the layperson there is possibly an implied level of safety.

Here is a service that makes Google basically $0 and confuses a non-zero number of non-technical users when it sends them to a scam website.

Also, in the age of OCR on every device, they make basically no sense. You can take a picture of a long URL on a piece of paper and just copy and paste the text instantly. The URL shortener no longer serves a discernible purpose.


Shorter URLs mean fewer characters to encode in a QR code.


And how does that matter? The QR gets read either way.


No, the code is smaller and more readable; also, a shortener means an additional tracking layer.


Less complex QR codes are easier to scan, especially at a distance
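The URL-length effect on QR density can be sketched concretely. The capacities below are the published byte-mode limits at the lowest error-correction level (L) for the first few symbol versions; the helper function and example URLs are mine, for illustration only:

```python
# Byte-mode capacity (in bytes) at error-correction level L,
# for QR versions 1-6. Higher version = denser, harder-to-scan code.
CAPACITY_BYTES_L = {1: 17, 2: 32, 3: 53, 4: 78, 5: 106, 6: 134}

def min_qr_version(url: str) -> int:
    """Smallest QR version (1-6) whose byte-mode/L capacity fits the URL."""
    n = len(url.encode("utf-8"))
    for version, cap in sorted(CAPACITY_BYTES_L.items()):
        if n <= cap:
            return version
    raise ValueError("longer than version 6 holds; extend the table")

print(min_qr_version("https://goo.gl/abc123"))   # 21 bytes -> version 2
print(min_qr_version(
    "https://example.com/very/long/path?utm_source=poster&utm_campaign=x"
))
```

So a 21-byte shortened link fits a version-2 code, while a typical long URL with tracking parameters needs a visibly denser symbol, which is the scanning-at-a-distance point above.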


How much does it really cost Google to answer some quick HTTP requests and redirect, vs. all their YouTube videos, etc.?


"security and maintenance burden" == "cost" == "cost-driven decision"


Capital inputs are one part of the equation. The human cost of mental and contextual overhead cannot be reduced to dollars and cents.


Sure it can. It takes X people Y hours a day/week/month to perform tasks related to this service, including planning and digging up the context behind it. Those X people make Z dollars per year. It's an extremely simple math equation.


Emotional labor doesn’t show up on a balance sheet.


Goo.gl didn't have customers, it had users. Customers pay, either with money or their personal data, now or in the future. Goo.gl did not make any money or have a plan to do so in the future.


One wonders why they don't, instead of shutting down, display a 15s unskippable YouTube-style interstitial ad prior to redirecting.

That way they'd make money, they could fund the service without having to shut it down, and there wouldn't be any linkrot.


This is such an evil idea.


Why is it evil? Assume that a free URL shortener is a good thing and that shutting one down is a bad thing, and note that every link shortener has costs (not just the servers; there are constant moderation needs, as scammers and worse use them) and no revenue. Then the only possible outcome is for them all to eventually shut down, causing unrecoverable linkrot.

Given those options, an ad seems like a trivial annoyance to anyone who very much needs a very old link to work. Anyone who still has the ability to update their pages can always update their links.


This is how every URL shortener on the internet used to work.


Well, it's either that or a paywall. Pick your poison.


The monetary value of the goodwill and mindshare generated by such a free service is hard to calculate, but definitely significant. I wouldn't be surprised if it was more than it costs to run.


And also the ongoing demonstration of why you should never trust Google.

"Here's a permanent (*) link".

[*] Definitions of permanent may vary wildly.


Which raises the obvious question -- why make a service that you know will eventually be shut down because of said economics. Especially one that (by design) will render many documents unusable when it is shut down.

While I generally find the "killed by Google" thing insanely short-sighted, this borders on straight-up negligence.


There was a time in Google where anything seemed possible and they loved doing experimental or fun things just for doing them. The ad machine printed money and nobody asked questions. Over time they started turning into a normal corporation with bean counting.


I always figured most of the real value of these URL-hashing services was as a marketing tracking metric; that is, sort of equivalent to the "share with" widgets provided, which conveniently also dump tons of analytics to the services.

I will be honest: I was never in an environment that would benefit from link shortening, so I don't really know if any end users actually wanted them (my guess: mainly Twitter), and I always viewed these hashed links with extreme suspicion.


One of the complaints about Google is that it's difficult to launch products due to bureaucracy. I'm starting to think that's not a bad thing. If they'd done a careful analysis of the cost of jumping on this URL-shortener bandwagon, we wouldn't be here. Maybe it's not a bad thing that they move slower now.


I would bet that the salaries paid to the product managers behind shutting this down, during the time they worked on shutting it down, outweigh the annual cost of running the service by an order of magnitude.


At this point, anyone depending on Google for anything deserves to get burned. I don't know how much more clearly they could tell users that they have absolutely no respect for them, short of drone-shipping boxes of excrement.


If companies can spend billions on AI with nothing in return and be okay with that, in the way of giving away free stuff (okay, I'll admit it's not completely free, since you are the product, but still free),

then they should also be okay with keeping the goo.gl links, honestly.

It sounds kind of bad for goodwill, but this is literally Google; the one thing Google is notorious for is killing its products.


This is basically modern SV business. This old data is costing us about a million a year to hold onto. KILL IT NOW WITH FIRE.

Hey, let's also dump $100 billion into this AI thing this year without any business plan or ideas to back it up. HOW FAST CAN YOU ACCEPT MY CHECK!


Nobody wants to catch a falling knife, everybody wants to attach a lanyard to the moon rocket.


Hard to imagine costs were ever a factor.

For a company running GCP and giving away things like Colab TPUs for free, the cost of running a URL service would be a trivial rounding error at best.


Outside of bandwidth, I could run this entire service on a Raspberry Pi. No, I'm not exaggerating. It's just a lookup from the short code at the end of the URL to the full URL.

I've handled far more traffic on single machines 20 years ago.
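A rough sizing check supports the single-small-box claim, at least for storage. All the inputs here are guesses (link count, bytes per row), so treat it purely as a back-of-envelope:

```python
# Could one small machine hold the whole table? Assume (made-up numbers)
# 1 billion links at ~120 bytes each: short code + destination URL +
# row overhead, e.g. in a SQLite file on an attached SSD.
links = 1_000_000_000
bytes_per_link = 120

total_gb = links * bytes_per_link / 1e9
print(f"~{total_gb:.0f} GB of storage")      # ~120 GB

# Traffic side: even a sustained 1,000 redirects/sec is ~86M hits/day,
# far beyond what a shortener frozen since 2018 plausibly sees.
print(f"{1_000 * 86_400:,} requests/day at 1k req/s")
```

So the table fits on a cheap SSD rather than in RAM, and serving cached redirect lookups at realistic rates is well within one modest machine, which is the parent's point: compute cost was never the constraint.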


Arguably, this is them collecting the wrong types of data to inform decisions, if that isn't represented in the data.


All while data and visibility are part of the business.

Like other things they've spun down, there must not be value in the links.


For all HN commenters: if you are not paying for it, you are not a customer, and thus you should not complain.


Yeah, their efforts seem to show no care for their consumer-level subscribers; everything they do seems to be tiered toward enterprise customers.


Shout out to "A Tribe Called Quest" for the title (my guess).


It's a third of the price to begin with. I think Readwise has a winner in its Reader app, but they sure do charge a premium for it. You can get the same functionality in Linkwarden or Pinboard for a fraction of Readwise's subscription pricing.

