I'm with you, and actually bullish on this being a viable way forward.
48V DC has already been eyed as a potential standard. It doesn't need massive cables to deliver decent power (4A is ~200W), there is enough hardware around from use cases like EVs and boats to make it work, and many battery solutions already 'talk' 48V without a lossy voltage step-down, etc.
Big plus is that the regulation is A LOT less strict for <48V DC compared to 110/220/240V AC.
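To make the 48V arithmetic concrete, here's a back-of-the-envelope sketch (the cable resistance is an assumed figure, roughly 1.5mm² copper counting both conductors; real cables vary):

```python
# Power, current, and cable loss at 48V DC.
# CABLE_OHMS_PER_M is an assumed illustrative value (~1.5mm^2 copper,
# out-and-back); check real cable specs before sizing anything.
VOLTAGE = 48.0            # volts
CABLE_OHMS_PER_M = 0.023  # ohms per metre of cable run, both conductors

def delivery(power_w: float, length_m: float) -> None:
    current = power_w / VOLTAGE      # I = P / V
    r = CABLE_OHMS_PER_M * length_m
    loss = current ** 2 * r          # I^2 * R dissipated in the copper
    drop = current * r               # voltage sag at the load end
    print(f"{power_w:5.0f} W over {length_m:3.0f} m: "
          f"{current:5.1f} A, {loss:5.1f} W lost, {drop:4.1f} V drop")

for p in (200, 500, 1000):
    delivery(p, 10)
```

At ~200W the numbers are benign; push toward a kilowatt and the I²R loss at 48V is why the cables get thick fast.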
AC makes power distribution easier (transformers let you step the voltage up and down cheaply, and three-phase plays nicely with rotating machinery). So it's correct to say it's easier to move over long distances.
Additionally, and I'm really simplifying: because stepping the voltage up is so easy with AC, you can move a lot more power at a lower dissipation cost. This is part of why quite a bit of high-power equipment is AC-native (i.e. no AC-DC-AC conversion): think of the motors in various appliances, etc.
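The dissipation point is just I²R scaling: for a fixed power, the current falls linearly with voltage, so resistive loss falls with its square, and AC historically won because transformers make the step-up nearly free. A minimal sketch, with an arbitrary illustrative line resistance:

```python
# Resistive loss for moving a fixed power at different line voltages.
# loss = I^2 * R = (P / V)^2 * R -- quadratic payoff for raising V.
LINE_R = 10.0   # ohms, arbitrary illustrative line resistance
POWER = 50e6    # 50 MW to move

for kv in (50, 150, 400):
    v = kv * 1e3
    i = POWER / v
    loss = i ** 2 * LINE_R
    print(f"{kv:3d} kV: {i:6.0f} A, loss {loss / 1e6:6.2f} MW "
          f"({100 * loss / POWER:.2f}% of the power sent)")
```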
It doesn't have to stay that way: investment in DC motors for cars has pushed the industry to optimize designs and get similar motor power output at lower energy consumption.
That said, if you're an appliance manufacturer with an addressable user base of billions on AC and only a 'potential new user base' on DC... you might just swallow the cost of a DC/AC converter rather than produce two variants of the most complex / costly part (the motor, in this case).
That wheel has turned. The king of long-distance transmission is now HVDC, to the point of sometimes being used intra-grid and not just for interconnects.
That is correct. When that will happen is the hard thing to guess.
A lot of electricity production is still rotating-machine based (think gas turbines, water turbines, etc.), so there is a nice benefit to having AC at the source and in distribution.
The infrastructure would need to change, and with the average lifetime of a substation in the 50-75 year range, it's hard to expect a complete overhaul of the distribution system overnight.
It's also hard for me to gauge the power loss difference between the two scenarios: (AC production, AC distribution, AC/DC conversion, DC consumption) versus (DC production, DC step-up to HVDC, DC distribution, DC step-down, DC consumption). Even 1% at national scale means millions, so the entire business case might be anchored there.
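As a way to frame that comparison: end-to-end efficiency is just the product of the per-stage efficiencies. Here's a sketch where every number is a placeholder assumption (real figures depend entirely on the equipment), purely to show how the comparison would be structured:

```python
# End-to-end efficiency = product of per-stage efficiencies.
# ALL stage values below are placeholder assumptions for illustration,
# not real-world figures.
from math import prod

ac_chain = {
    "generation": 0.98,
    "AC transmission + distribution": 0.94,
    "AC->DC conversion at the device": 0.92,
}
dc_chain = {
    "generation": 0.98,
    "step-up to HVDC (converter station)": 0.985,
    "HVDC transmission": 0.97,
    "step-down to consumer DC": 0.97,
}

for name, chain in (("AC chain", ac_chain), ("DC chain", dc_chain)):
    eta = prod(chain.values())
    print(f"{name}: {eta:.3f} end-to-end, {(1 - eta) * 100:.1f}% lost")
```

Even with made-up numbers like these the gap is several percent, which is exactly the 'even 1% means millions' territory.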
I'm sure there are people smarter than me here who can shed some light on this.
An axiom of inevitabilism, especially among the highest echelons, is that believing something is inevitable ends up making it a reality. It's the kind of belief that shapes reality itself.
In simple terms: the fact that the Googles, Anthropics, and OpenAIs of the world have a strong interest in making LLMs the way AI pans out will most likely ensure that LLMs become the dominant paradigm — until someone else, with equal leverage, comes along to disrupt them.
Just wanted to call out how well written this blog post is, not necessarily from a substance standpoint (which in my opinion is very good as well), but from a fluidity and narrative standpoint.
It's quite rare in this day and age. Thank you, OP
Genuine security-newbie question.
What's the worst-case scenario, from a security standpoint, of using this type of solution? I get that the authentication could be compromised, and probably some internal ports would be exposed publicly too... what else?
Good question. I think the absolute worst case is that the tunnel and VPS are compromised and someone gains access to the private network. We advise people in the docs to always consider this a possibility and to secure Newt and what it has access to. A less severe case is a bypass in the forward auth, letting someone reach the web page of a private service without passing the user/pass auth, etc.
We are always looking for security experts to review the code and to pen test the application. Please hammer it and let us know at security@fossorial.io if there are any issues!
I've been running Pangolin for a couple of months now, and instead of Newt I use my router's WireGuard client in a VLAN.
Any "wanted" traffic is then routed via DNAT/firewall to my home server.
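For anyone curious, a sketch of what the DNAT side of that can look like on a Linux router (interface name, addresses, and port are illustrative placeholders; nothing here is Pangolin-specific):

```sh
# Forward traffic arriving over the WireGuard interface (wg0, assumed)
# for TCP 443 to the home server; addresses are placeholders.
iptables -t nat -A PREROUTING -i wg0 -p tcp --dport 443 \
  -j DNAT --to-destination 192.168.50.10:443
# Allow return traffic for flows already in the conntrack table.
iptables -A FORWARD -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
# Only the forwarded flow is allowed from the tunnel into the VLAN.
iptables -A FORWARD -i wg0 -d 192.168.50.10 -p tcp --dport 443 -j ACCEPT
iptables -A FORWARD -i wg0 -j DROP
```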
That's already the case, and it's called model distillation: you use LLMs to generate labels, then train a dedicated smaller model (usually a plain NN) that runs at ~1000x lower inference cost.
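A toy sketch of that pipeline, with the expensive LLM-labeling step stubbed out as a hardcoded list and scikit-learn standing in for the small student model (both choices are just illustrative):

```python
# Toy distillation pipeline: an LLM labels raw text once (expensive),
# then a small classifier is trained on those labels and serves all
# future traffic cheaply. The "LLM labels" here are a hardcoded stub.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in for "ask the LLM to label each example" -- in practice a
# one-off, offline batch of API calls.
llm_labeled = [
    ("refund not processed after two weeks", "billing"),
    ("charged twice for the same order", "billing"),
    ("app crashes when I open settings", "bug"),
    ("login button does nothing on mobile", "bug"),
    ("love the new dark mode", "praise"),
    ("great update, much faster now", "praise"),
]
texts, labels = zip(*llm_labeled)

# The small "student" model: TF-IDF features + logistic regression.
student = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
student.fit(texts, labels)

# Inference is now a cheap local call -- no LLM in the loop.
print(student.predict(["the app keeps crashing on startup"]))
```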
I think beyond the technical aspect it's a product and packaging problem.
All the effort is going into productizing foundation models and the apps built on top of them, but as that plateaus, distilled models and new approaches will probably get more time in the sun. I'm hopeful that if that's the case, we'll see more weird stuff become available.