Accelerating the grapevine effect
Gossip is an efficient way to share information across large networks and has unexpected applications in solving other mathematical and machine-learning problems.
By looking at classical gossip algorithms from a novel perspective, KAUST Professor Peter Richtarik has found a way to significantly speed up gossip-based information sharing and, in the process, has discovered new applications for this efficient mathematical approach.
Gossip involves the sharing of information between individuals in a network and can be applied mathematically in both human social networks and data networks, such as distributed sensors.
“A network is a collection of nodes, each connected to other nodes via links,” explains Richtarik. “In social networks, for instance, individuals are connected to others via friendship links. In sensor networks, sensors could be connected if they are close enough to communicate over a wireless connection.”
In many real-world applications, it is useful to perform calculations based on the data stored by all nodes across a network, such as computing the average of the private data stored by each node, which is known as the average consensus problem. However, because communication is limited to direct links between nodes, this is very challenging in practice.
“The idea of gossip algorithms is to perform this calculation by pairwise communication among randomly selected friends and to repeat this process until all individuals learn the result,” says Richtarik. “This mimics the way gossip works among humans. It’s surprising that it is possible to show mathematically that this simple communication strategy can solve a global, network-wide problem.”
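This pairwise-averaging protocol can be sketched in a few lines (a minimal illustration; the ring network, node values, and round count are made up for the example):

```python
import random

def gossip_average(values, edges, rounds=10000, seed=0):
    """Randomized pairwise gossip: each round, a random edge (i, j) is
    activated and both endpoints replace their values with the average."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        x[i] = x[j] = (x[i] + x[j]) / 2
    return x

# Ring network of 5 nodes, each holding one private value
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
x = gossip_average([1, 2, 3, 4, 5], edges)
# all node values approach the network average, 3.0
```

Note that each pairwise exchange preserves the sum of all values, which is why the protocol converges to the true average rather than some other consensus value.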
In collaboration with Nicolas Loizou from the University of Edinburgh in Scotland, Richtarik has been studying randomized gossip algorithms and their connections to other branches of mathematics and computer science.
Their theoretical study revealed a deep connection between randomized gossip algorithms and linear algebra, specifically the problem of solving systems of many equations in many unknowns. They also established a direct link to one of the most famous algorithms in machine learning, the stochastic gradient descent method, which is used to train the deep-learning models employed in almost all industrial applications. These insights helped the researchers develop new and much faster gossip protocols.
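The link to stochastic gradient descent can be made concrete: sampling a random edge (i, j) and taking one SGD step, with step size 1/2, on the penalty (x_i - x_j)²/2 reproduces the pairwise-averaging gossip update exactly (a minimal sketch; the function name and example values are illustrative):

```python
def sgd_step(x, i, j, step=0.5):
    """One stochastic gradient step on f(x) = (x[i] - x[j])**2 / 2,
    the penalty for the sampled edge (i, j) disagreeing."""
    g = x[i] - x[j]          # gradient w.r.t. x[i]; w.r.t. x[j] it is -g
    x = list(x)
    x[i] -= step * g
    x[j] += step * g
    return x

# With step size 1/2, nodes 0 and 2 land exactly on their mutual average:
x = sgd_step([1.0, 5.0, 9.0], 0, 2)
# → [5.0, 5.0, 5.0]
```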
“We were able to develop an accelerated gossip algorithm that needs many fewer gossip rounds to reach the average consensus value,” says Richtarik. “Our method needs only the square root of the number of rounds needed by a classical gossip algorithm: 100 rounds instead of 10,000, for example. We proved this mathematically and observed this acceleration in practice as well.”
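The accelerated scheme adds a momentum term to each gossip step in the spirit of the heavy ball method. A minimal sketch of this idea follows; the network, the momentum parameter beta, and the round count are illustrative choices, not the tuned settings from the paper:

```python
import random

def heavy_ball_gossip(values, edges, rounds=3000, beta=0.5, seed=0):
    """Pairwise gossip with a heavy-ball momentum term:
    new state = gossip step + beta * (current state - previous state)."""
    rng = random.Random(seed)
    x, x_prev = list(values), list(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        # plain gossip step on the sampled edge
        y = list(x)
        y[i] = y[j] = (x[i] + x[j]) / 2
        # momentum: keep moving in the previous direction of change
        x, x_prev = [yk + beta * (xk - pk)
                     for yk, xk, pk in zip(y, x, x_prev)], x
    return x

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
x = heavy_ball_gossip([1, 2, 3, 4, 5], edges)
# node values approach the network average, 3.0
```

The momentum term does not change the sum of the node values, so the protocol still converges to the true average.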
Loizou, N. & Richtarik, P. Accelerated gossip via stochastic heavy ball method. 56th Annual Allerton Conference on Communication, Control, and Computing, 2–5 October 2018, IEEE, 927–934.