CSES - Fixed-Length Paths II
Authors: Andi Qu, Benjamin Qi, Michael Chen
With Centroid Decomposition
Time Complexity: $\mathcal{O}(N \log N)$
This solution is a simple extension of CSES Fixed-Length Paths I's solution.
Since we want to count the number of paths of length between $a$ and $b$ instead of paths of a fixed length $k$, a node with depth $d$ in the centroid's $i$-th child's subtree will contribute $\sum_{j = a - d}^{b - d} \texttt{cnt}[j]$ paths instead of $\texttt{cnt}[k - d]$ paths to the answer.
We could calculate this sum using a range-sum data structure (e.g. a BIT). However, this would add an additional $\mathcal{O}(\log N)$ factor to the complexity, which might be too slow.
Another way is to calculate and maintain the partial sums of $\texttt{cnt}$. We can calculate the partial sums in time proportional to the size of the subtree, and each time an entry of $\texttt{cnt}$ is incremented, we can update the partial sums in $\mathcal{O}(1)$ time.
This allows us to keep the total work done at each level of the centroid decomposition at $\mathcal{O}(N)$ and the overall time complexity at $\mathcal{O}(N \log N)$.
C++
#include <bits/stdc++.h>
typedef long long ll;
using namespace std;

int n, a, b;                // number of nodes and the allowed path length range [a, b]
vector<int> graph[200001];  // adjacency list of the tree
int subtree[200001];        // subtree sizes, used when finding centroids
ll ans = 0, bit[200001];    // ans accumulates the number of valid paths
int total_cnt[200001]{1}, mx_depth;
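To make the partial-sum idea concrete, here is a minimal, self-contained sketch of crediting a node at depth $d$ with $\sum_{j = a - d}^{b - d} \texttt{cnt}[j]$ paths in $\mathcal{O}(1)$ using partial sums instead of a BIT. The helper `range_count` and the array `pre` are hypothetical names used only for this illustration; this is not the full solution above.

C++
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

// Range sum of cnt over depths [lo, hi], where pre[j] = cnt[0] + ... + cnt[j].
ll range_count(const vector<ll> &pre, int lo, int hi) {
	hi = min(hi, (int)pre.size() - 1);
	if (hi < 0 || lo > hi) return 0;
	return pre[hi] - (lo > 0 ? pre[lo - 1] : 0);
}

int main() {
	int a = 2, b = 3;  // allowed path lengths
	// cnt[j] = number of already-processed nodes at depth j from the centroid
	vector<ll> cnt = {1, 2, 3, 1};
	vector<ll> pre(cnt.size());
	partial_sum(cnt.begin(), cnt.end(), pre.begin());

	int d = 1;  // depth of the node currently being processed
	// paths through the centroid that end at this node and have length in [a, b]
	cout << range_count(pre, a - d, b - d) << "\n";  // cnt[1] + cnt[2] = 5
}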
Faster Solution
Time Complexity: $\mathcal{O}(N)$
We use the same idea as the above solution, but with prefix sums instead of a BIT and with small-to-large merging in a single DFS instead of centroid decomposition. This works because we can merge two prefix sum arrays $A$ and $B$ in $\mathcal{O}(\min(|A|, |B|))$ time.
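As a sketch of why such a merge is cheap, the snippet below stores each subtree's depth counts cumulatively from the deep end (entry $d$ holds the number of nodes at depth at least $d$), so adding a shallower array into a deeper one only touches the shallower array's entries. This representation and the helper `merge_into` are illustrative assumptions, not necessarily the exact layout in Ben's code below.

C++
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

// Add B's counts into A, where both arrays store "number of nodes at depth >= d".
// Only the first |B| entries of A change, so the cost is O(min(|A|, |B|)).
void merge_into(vector<ll> &A, const vector<ll> &B) {
	assert(B.size() <= A.size());
	for (size_t d = 0; d < B.size(); d++) A[d] += B[d];
}

int main() {
	// Subtree 1 has {1, 2, 1} nodes at depths 0, 1, 2; subtree 2 has {1, 3} at depths 0, 1.
	vector<ll> A = {4, 3, 1};  // cumulative-from-the-deep-end sums of {1, 2, 1}
	vector<ll> B = {4, 3};     // cumulative-from-the-deep-end sums of {1, 3}
	merge_into(A, B);
	for (ll x : A) cout << x << " ";  // combined counts {2, 5, 1} -> prints "8 6 1"
	cout << "\n";
	// Nodes at depths in [lo, hi] can then be counted as a difference of two entries,
	// which is exactly the range query the path counting needs.
}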
If we let $d_i$ denote the maximum depth in node $i$'s subtree, then the number of operations our algorithm performs while merging can be calculated by
$$\sum_{i = 1}^{N}\left(\sum_{c \in \text{children}(i)}(d_c + 1) - d_i\right),$$
since at node $i$ we keep the deepest child's array and merge every other child's array into it, and $d_i = \max_{c \in \text{children}(i)}(d_c) + 1$ for internal nodes.
Now,
$$\sum_{i = 1}^{N}\sum_{c \in \text{children}(i)}(d_c + 1) = \sum_{i \neq \text{root}}(d_i + 1) = (N - 1) + \sum_{i \neq \text{root}}d_i$$
and
$$\sum_{i = 1}^{N}(-d_i) = -d_{\text{root}} - \sum_{i \neq \text{root}}d_i.$$
Summing these two expressions and adding the $\mathcal{O}(N)$ work needed to visit each node once gives
$$(N - 1) - d_{\text{root}} + \mathcal{O}(N),$$
which is $\mathcal{O}(N)$. Below is Ben's code implementing this idea:
C++
#include <bits/stdc++.h>
using namespace std;

using ll = long long;
using db = long double;  // or double, if TL is tight
using str = string;      // yay python!

using pi = pair<int, int>;
using pl = pair<ll, ll>;
using pd = pair<db, db>;