Explanation
The first observation is that all spiders whose leg counts share a common factor greater than $1$ form a complete graph: every pair of spiders in such a group are friends with each other.
Thus, we can enumerate the prime factors of every spider's leg count and, for each prime, keep track of the spiders whose leg counts it divides, which lets us store the edges efficiently. Two spiders are friends iff they both appear in some prime's list together.
However, a naive BFS from the starting node will take too long, since a single prime shared by many spiders already creates a group with on the order of $N^2$ edges. The key optimization is that it's always optimal to "use" a prime at most once in our path.
For example, take a path $a \rightarrow b \rightarrow c$ where all three leg counts are divisible by the same prime $p$, so we've used the common divisor $p$ two times. Here, we can cut out $b$ and move directly from $a$ to $c$, since $a$ and $c$ are already friends through $p$.
This allows us to do an optimized BFS from the starting spider while keeping a global set of visited prime factors. When we process a spider, we iterate through its prime factors and skip the ones that have already been used. For each unused factor, we mark it as used and push every unvisited spider in that factor's list onto the queue. Otherwise it's standard BFS, with parent pointers stored so that we can backtrack and recover the actual path at the end (see the sketch below).
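As a rough sketch of this idea (the names here, such as spider_primes and prime_to_spiders, are illustrative placeholders and not taken from the implementations that follow), the BFS might look like this, assuming we have already computed each spider's prime factors and, for each prime, the list of spiders it divides:

#include <map>
#include <set>
#include <vector>
using namespace std;

/**
 * BFS over spiders where each prime's complete graph is expanded at most once.
 * @return parent pointers, from which the path can be backtracked
 */
vector<int> bfs(int start, const vector<vector<int>> &spider_primes,
                const map<int, vector<int>> &prime_to_spiders) {
	int n = spider_primes.size();
	vector<int> parent(n, -1);
	vector<bool> visited(n, false);
	set<int> used_primes;
	vector<int> order{start};  // vector used as a FIFO queue
	visited[start] = true;
	for (int head = 0; head < (int)order.size(); head++) {
		int cur = order[head];
		for (int p : spider_primes[cur]) {
			// skip primes whose friend group was already added to the queue
			if (used_primes.count(p)) { continue; }
			used_primes.insert(p);
			for (int nxt : prime_to_spiders.at(p)) {
				if (!visited[nxt]) {
					visited[nxt] = true;
					parent[nxt] = cur;
					order.push_back(nxt);
				}
			}
		}
	}
	return parent;
}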
Implementation
Time Complexity: $\mathcal{O}(N\sqrt{M})$, where $M$ is the largest number of legs among the spiders.
C++
#include <algorithm>
#include <cmath>
#include <iostream>
#include <map>
#include <set>
#include <vector>

using namespace std;

/** @return n's prime factors */
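// NOTE: the original snippet is truncated above. The code below is one
// possible completion following the explanation; identifiers (factorize,
// prime_to_spiders, ...) and the exact input/output format are assumptions
// rather than the original author's code.
vector<int> factorize(int n) {
	vector<int> primes;
	for (int d = 2; d * d <= n; d++) {
		if (n % d == 0) {
			primes.push_back(d);
			while (n % d == 0) { n /= d; }
		}
	}
	if (n > 1) { primes.push_back(n); }
	return primes;
}

int main() {
	int n, start, finish;
	cin >> n;
	vector<int> legs(n);
	for (int &l : legs) { cin >> l; }
	cin >> start >> finish;
	start--;
	finish--;

	// for each prime, the spiders whose leg counts it divides
	map<int, vector<int>> prime_to_spiders;
	vector<vector<int>> spider_primes(n);
	for (int i = 0; i < n; i++) {
		spider_primes[i] = factorize(legs[i]);
		for (int p : spider_primes[i]) { prime_to_spiders[p].push_back(i); }
	}

	// BFS from the starting spider, expanding each prime at most once
	vector<int> parent(n, -1);
	vector<bool> visited(n, false);
	set<int> used_primes;
	vector<int> order{start};  // vector used as a FIFO queue
	visited[start] = true;
	for (int head = 0; head < (int)order.size(); head++) {
		int cur = order[head];
		for (int p : spider_primes[cur]) {
			if (used_primes.count(p)) { continue; }
			used_primes.insert(p);
			for (int nxt : prime_to_spiders[p]) {
				if (!visited[nxt]) {
					visited[nxt] = true;
					parent[nxt] = cur;
					order.push_back(nxt);
				}
			}
		}
	}

	if (!visited[finish]) {
		cout << -1 << endl;
	} else {
		// backtrack through the parent pointers to recover the path
		vector<int> path;
		for (int at = finish; at != -1; at = parent[at]) { path.push_back(at); }
		reverse(path.begin(), path.end());
		cout << path.size() << "\n";
		for (int i = 0; i < (int)path.size(); i++) {
			cout << path[i] + 1 << (i + 1 < (int)path.size() ? ' ' : '\n');
		}
	}
}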
Java
import java.io.*;
import java.util.*;

public class FriendlySpiders {
	public static void main(String[] args) throws IOException {
		BufferedReader read = new BufferedReader(new InputStreamReader(System.in));
		int spiderNum = Integer.parseInt(read.readLine());

		Map<Integer, List<Integer>> trains = new HashMap<>();
		Set<Integer>[] primeFactors = new HashSet[spiderNum];