Hey HN! We're building Docstring, a tool that automatically generates documentation for Python, Ruby, and JavaScript (more languages coming soon), and then makes sure the docs stay up to date whenever the code changes so you don't have to. We integrate into your IDE (currently VSCode and Vim, with IntelliJ products soon) and generate docs inline with your code, just as you'd write them normally. When you save, the code and docs are uploaded to us so you can browse and read them online and see stats such as which portions of your codebase are the most under-documented.
Background: My co-founder and I have been programming for 10+ years each, and like many of you I'm sure, have run into old, incomplete, or entirely incorrect documentation more times than we can count. We believe that this happens for a few main reasons:
First, existing tools don't really support you in writing docs. They may autogenerate a few lines of boilerplate, but nothing links the docs to the code itself. This is true for top-of-(function|class) docstrings, and even more so for documentation that lives in completely separate files like READMEs, architecture diagrams, or a wiki. We believe that by being smarter about understanding what is documenting what, we can make the experience better for everyone, on both the writing side and the reading side.
Second, doc writing is time consuming for something that feels like it gives little immediate gain, and it isn't ingrained in many companies' engineering cultures. In reality, it helps your teammates _now_, since they can more easily comprehend what you wrote, and in the long term it becomes necessary as people come and go (or even just to remind yourself of what you wrote a few months ago). It's just hard to be driven to do something when the payoff may be a few months down the line, especially when you have a deadline to ship something next week.
We're curious to hear what you all think both about the product and about the issue in general.
This looks like one of those kinds of tools that, if it works well, would be very difficult to remove from my digital tool belt once I get used to it. I was just thinking this morning about how shitty documentation can be worse than no documentation, so this is kind of serendipitous.
My only concern is about the code that is uploaded to your servers. I can see people who maintain proprietary or private repos being averse to having their code on someone else's system (i.e., exactly the companies who would pay money per head for this). That leaves people who maintain open source software or only work on small personal projects, whom I can't see paying this much for an auto-documenting feature.
Unless I'm missing/misunderstanding something of course.
Yep, nothing worse than wasting time because of bad information.
We're planning to offer a self-hosted version a bit down the line, exactly for those companies that need complete control (for compliance reasons or otherwise). As for pricing, we'll be watching how people engage and may adjust the price over the next little while.
Looks like the project hasn't been updated in a while? The last files on the downloads page are from 2019-06... I believe https://www.gharchive.org/ is still updated though.
I'm having trouble understanding what the goal of this is as well. It seems like the quick summary would be "ML-based forecasting/prediction in a box," but the readme is making all of these broad claims...
We're looking to make timeseries AI easier for developers to integrate into their applications by providing tooling and patterns that are familiar to them. We're a group of devs who were trying to add intelligence to one of our projects but were struggling with the existing tooling. We wanted patterns that were more familiar to us as devs, like a quick debugging loop, easily consumed packages, etc.
It's a SHA-256 hash; you can reproduce it with `shasum -a 256 server.py`.