
Yes, relatively successful. Just because these treaties aren't 100% effective doesn't mean they lack effect. It is the job of UN weapons inspectors to ensure chemical weapons are not being stockpiled. Stockpiles are harder to hide than the smaller quantities that many labs can produce. As for Syria, as the article you linked to states, it isn't a signatory of the treaty banning chemical weapons. But Syria is hardly out of the gaze of the 'world police': it's at the centre of one of the major conflicts of the moment, one involving the international community.

As for the idea that AI is harder to control than chemical weapons: if we were just talking about software, fine, but hardware is part of the equation and needs to be manufactured. This hardware comes in varying levels of sophistication. At the crudest end you have something like the drone-plus-gun combo that hit the news in the last couple of weeks; at the more sophisticated end you have complex robotics designed to be far more versatile. One end of this scale is available to Joe Public but is easier to fight against; the other is available only to those with deep pockets and could be genuinely hard to fight against. Furthermore, in both cases, they are physical objects. Making these physical objects illegal to own and operate is the goal. Do you oppose this?



> Just because these treaties aren't 100% effective doesn't mean they lack effect. It is the job of UN weapons inspectors to ensure chemical weapons are not being stockpiled. Stockpiles are harder to hide than the smaller quantities that many labs can produce.

I'm not trying to say it needs to be 100% effective to have an effect; I was just asking what your definition of "successful" meant in relative terms. We're getting a little off track here, but the big takeaway is that the process of creating chemical weapons doesn't have many areas in which further technological advancement can help people (some exist, sure, but I'm not convinced there are many). This is the opposite of AI, where there are thousands of applications in everyday life, from driving to medical equipment; much of that knowledge and technology can easily be moved into the military sector.

> but hardware is part of the equation and needs to be manufactured.

But why? Yes, I'm sure they would build specialized hardware, but it's not like they need to. There is plenty of equipment on the ground and in the air controlled either by a remote human or by a human directly interfacing with the machine. There is no reason these existing human control points couldn't be swapped out for a relatively advanced AI, should one be created in the future.

So hardware is part of the equation, but nothing needs to be radically altered. In fact, it may be advantageous to keep the same-looking hardware so the enemy doesn't know an AI is controlling it.

> Furthermore, in both cases, they are physical objects. Making these physical objects illegal to own and operate is the goal. Do you oppose this?

Making what objects illegal to own and operate? Objects controlled by AI, objects that contain weaponry, objects that contain weaponry and AI? How do you sufficiently define AI? What constitutes a weapon? Can the drone itself be considered a weapon?

It may make sense to make owning certain dangerous things illegal, but I'm not convinced it solves the problem we're discussing. In all honesty, if someone could mass-produce a machine carrying a decent amount of weaponry then, depending on a ton of details, I could see such machines overtaking towns or cities; hell, maybe even small countries, depending on how they're built. So it's tough to simply declare it illegal for members of the UN, so they don't research it at all, while non-members of the UN end up researching it and developing something incredible.


A few follow-up points:

1. The knowledge necessary to create chemical weapons is chemistry. Are you sure there are not many positive uses of chemistry?

2. Military drones controlled by humans that can also be controlled by AI also need to go in order for this proposed ban to be effective.

3. The scale of the research matters. Yes, some groups may choose to ignore the treaty and develop autonomous weapons, but you can monitor for large-scale stockpiling of such weapons and counter their use. What we don't want is for these weapons to be easy to come by in quantities large enough to pose a widespread security risk. We'll never eliminate the development completely, but we can keep it a smaller problem than widespread use would be.


1. Chemistry, yes, but I'm not sure how much positive knowledge comes out of this specific area of chemistry. Granted there will be some, but I don't think it's even close to comparable with AI research. Chemical weapons need to break down areas of the body as efficiently as possible, so I'm not sure many positive applications come out of that (though again, I'm sure at least some would).

2. Yeah, but considering how many of those exist, and how easy it is to weaponize even consumer drones, I don't think any type of ban is possible here, even if it were universally agreed to be a net positive action.

3. I'm not sure how we could target anyone working on this, though. AI is going to live largely in software. Granted, we don't yet understand what a really good AI will look like, and it's possible it will need more specialized hardware to better support a neural net, but I can't imagine that footprint would be large enough to track, or even to see, even if tracking were necessary.

Developing software that can optionally control machines is just not possible to monitor and counter. Someone in their home may end up creating the most advanced AI, and we'd have no idea until it's deployed somewhere.





