You're completely correct; that's fair criticism. In my excitement I skipped the basics. Here's a quick breakdown:
What it does: It's a new heuristic optimization algorithm that quickly finds high-quality solutions to the MAX-CUT problem (and other combinatorial problems).
What is MAX-CUT: It's a classic NP-hard problem where you split a graph's nodes into two groups to maximize the number of edges between the groups. It's fundamental in computer science and has applications in circuit design, statistical physics, and machine learning.
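To make that concrete, here's a toy illustration on a made-up 4-node graph (this is just for explanation, it's not code from the repo):

```python
# Toy MAX-CUT example on a hypothetical 4-node graph (not from the repo):
# choose a split of the nodes into two groups and count the edges that cross it.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]

def cut_size(partition, edges):
    """Number of edges whose endpoints land in different groups."""
    return sum(1 for u, v in edges if partition[u] != partition[v])

partition = {0: 0, 1: 1, 2: 1, 3: 0}   # nodes 0 and 3 vs. nodes 1 and 2
print(cut_size(partition, edges))      # -> 4 of the 5 edges cross the cut
```

MAX-CUT asks for the split that makes that count as large as possible.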
How it works (The "Grav" part): It treats parameters like particles in a gravitational field. The "loss" creates an attractive force, but I've added a quantum potential that creates a repulsive force, preventing collapse into local minima. The adaptive engine balances these forces dynamically.
Comparison: On the small, dense graphs in the post, the script finds cuts that exceed the 0.878... ratio the famous Goemans-Williamson algorithm guarantees (that's a worst-case, in-expectation bound, so beating it on specific instances means the heuristic is finding near-optimal cuts there, not that the guarantee is broken). It's not just another gradient optimizer; it's designed for complex, noisy landscapes where Adam and others plateau.
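If anyone wants to sanity-check that kind of claim themselves, here's how I'd do it on a tiny graph: brute-force the true optimum and compare ratios. Again, a sketch, not the benchmark script from the post:

```python
# Sanity-check sketch (not the benchmark from the post): brute-force the
# true optimum on a small graph, then compare any candidate cut to it.
from itertools import product

def best_cut_value(nodes, edges):
    best = 0
    for bits in product([0, 1], repeat=len(nodes)):
        part = dict(zip(nodes, bits))
        best = max(best, sum(part[u] != part[v] for u, v in edges))
    return best

# ratio = cut_size(candidate_partition, edges) / best_cut_value(nodes, edges)
# Goemans-Williamson promises an expected ratio of at least ~0.878 in the
# worst case; exceeding that on specific small instances is consistent with it.
```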
I've updated the README with a "Technical Background" section. Thanks for the push—it's much better now.