The Transmission Control Protocol is like a crossing guard for the internet, regulating traffic to keep things flowing. Sure, engineers are constantly working to improve it, but it's man-made, so there's always room for human error. Researchers at MIT, though, have created a computer system that could fix all that, and make the internet two to three times faster.
The system is called Remy, and it uses some basic criteria to churn out congestion-control algorithms. A user tells Remy a few characteristics of a network, like how much bandwidth it has, how much that fluctuates, how many people are using it, what kinds of things they use it for, and so forth. The user then specifies the metric to gauge performance by, like throughput (how much data moves through the network at a time) or delay (how long it takes data to travel through the network). Remy takes all of this and tests a bunch of different algorithms to see which one best optimizes that metric. It doesn't test every possible algorithm, because that would take forever; instead, it seeks out small changes that improve performance.
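The search described above can be sketched in miniature. This is a hypothetical toy, not MIT's actual code: a congestion rule is reduced to a two-number parameter vector, the network simulator is a crude stand-in, and the objective (reward throughput, penalize delay) only loosely mirrors the trade-off Remy optimizes. The point is the loop at the bottom: rather than enumerating every possible algorithm, it tries small random tweaks and keeps whichever ones score better.

```python
import random

def simulate(params, bandwidth=10.0, users=4):
    """Crude stand-in network simulator: returns (throughput, delay) for a rule.

    'aggressiveness' is how hard each user pushes data; 'backoff' is how
    quickly the rule relieves a building queue. Both are invented knobs.
    """
    aggressiveness, backoff = params
    offered = users * aggressiveness
    throughput = min(offered, bandwidth)        # link can't carry more than bandwidth
    # Queueing delay grows once the link saturates; backing off relieves it.
    delay = 1.0 + max(0.0, offered - bandwidth) / max(backoff, 0.1)
    return throughput, delay

def objective(params):
    """Score a rule: reward throughput, penalize delay."""
    throughput, delay = simulate(params)
    return throughput - 2.0 * delay

def search(steps=2000, seed=0):
    """Greedy local search: try small tweaks, keep only improvements."""
    rng = random.Random(seed)
    best = [1.0, 1.0]                           # start from a bland rule
    best_score = objective(best)
    for _ in range(steps):
        candidate = [max(0.1, p + rng.uniform(-0.2, 0.2)) for p in best]
        score = objective(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

best_rule, score = search()
print("best rule:", best_rule, "score:", score)
```

In this toy, the search converges on a rule that fills the link without overloading it. Remy's real search space (rules mapping observed congestion signals to sending behavior) is vastly larger, which is why its runs take hours even on powerful hardware.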
In tests, the algorithms generated by Remy almost always performed significantly better than those made by humans. Remy's thought process still takes about 12 hours per run, but that work happens ahead of time, and its powerful computer brain can sift through far more candidate algorithms than a weak little human brain ever could.
Hari Balakrishnan, an author of the paper, says:
When you have even a handful of connections, or more, and a slightly more complicated network, where the workload is not a constant (a single file being sent, or 10 files being sent), that's very hard for human beings to reason about. And computers seem to be a lot better about navigating that search space.
Remy has only been tested in the lab, not on the wild wild west of the real open internet. But you'd be hard-pressed to find someone who complains that their internet works too well, so we'll take just about anything with the potential to make our connections faster. [MIT via PopSci]