# Square Root Decomposition

Author: Benjamin Qi

Splitting up data into smaller chunks to speed up processing.

Focus Problem – read through this problem before continuing!

You should already have done this problem in Point Update Range Sum, but here we'll present two more approaches. Both run in $\mathcal{O}((N+Q)\sqrt N)$ time.

Resources:

- CPH: Blocking, Mo's Algorithm
- CF

## Blocking

See the first example in CPH.

```cpp
int n, q;
vi x;           // array values
ll block[450];  // block[i] = sum of the i-th block
int BLOCK;      // block size, roughly sqrt(n)

ll sum(int a) {  // sum of first a elements of array in O(sqrt(n)) time
	ll ans = 0;
	F0R(i, a / BLOCK) ans += block[i];
	FOR(i, a / BLOCK * BLOCK, a) ans += x[i];
	return ans;
}
```

## Batching

See the CPH section on "batch processing."

Maintain a "buffer" of the latest updates (up to $\sqrt N$). The answer for each sum query can be calculated with prefix sums and by examining each update within the buffer. When the buffer gets too large ($\ge \sqrt N$), clear it and recalculate prefix sums.

```cpp
int n, q;
vi x;    // array values
vl cum;  // prefix sums of x

void build() {  // recompute prefix sums in O(n)
	cum = {0};
	trav(t, x) cum.pb(cum.bk + t);
}

int main() {
	// read input, maintain the buffer of updates, answer queries ...
}
```

## Mo's Algorithm

See CPH 27.3 and the CF link above.

Resources:

- CF: very brief description

Low constraints (e.g. $n=5\cdot 10^4$) and/or high time limits (greater than 2s) can be signs that sqrt decomp is intended.

In practice, it is not necessary to use the exact value of $\sqrt n$ as a parameter, and instead we may use parameters $k$ and $n/k$ where $k$ is different from $\sqrt n$. The optimal parameter depends on the problem and input. For example, if an algorithm often goes through the blocks but rarely inspects single elements inside the blocks, it may be a good idea to divide the array into $k<\sqrt n$ blocks, each of which contains $n/k > \sqrt n$ elements.

As another example, if an update takes time proportional to the size of one block ($\mathcal{O}(n/k)$) while a query takes time proportional to the number of blocks times $\log n$ ($\mathcal{O}(k\log n)$) then we can set $k\approx \sqrt{\frac{n}{\log n}}$ to make both updates and queries take time $\mathcal{O}(\sqrt{n\log n})$.
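To see where $k\approx \sqrt{\frac{n}{\log n}}$ comes from, balance the two per-operation costs:

$$\frac{n}{k} = k\log n \iff k^2 = \frac{n}{\log n} \iff k = \sqrt{\frac{n}{\log n}},$$

at which point both an update and a query cost $\frac{n}{k} = k\log n = \sqrt{n\log n}$.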

Solutions with worse complexities are not necessarily slower (at least for problems with reasonable input sizes, e.g. $n\le 5\cdot 10^5$). I recall an instance where a fast $\mathcal{O}(n\sqrt n\log n)$ solution (the $\log n$ coming from a BIT) passed while an $\mathcal{O}(n\sqrt n)$ solution did not ... So constant factors are important!

## On Trees

The techniques mentioned in the blogs below are extremely rare but worth mentioning.

Some more discussion about how sqrt decomp can be used:

Resources:

- CF: format isn't great, but the tree example is OK

## Problems

### A

Problems where the best solution I know of involves sqrt decomp.

| Source | Difficulty | Tags   |
| ------ | --------- | ------ |
| CF     | Easy      | Sqrt   |
| JOI    | Easy      | Sqrt   |
| POI    | Easy      | Sqrt   |
| YS     | Normal    | Sqrt   |
| APIO   | Hard      | Sqrt   |
| JOI    | Hard      | SOS DP |
| Plat   | Very Hard | Sqrt   |
| DMOJ   | Very Hard | Sqrt   |
| DMOJ   | Very Hard | Sqrt   |

### B

Problems that can be solved without sqrt decomp, but you might as well try to use it!

| Source   | Difficulty | Tags                  |
| -------- | --------- | --------------------- |
| JOI      | Normal    | 2DRQ, Mo's Algorithm  |
| IOI      | Normal    | Sqrt                  |
| Plat     | Hard      | Sqrt                  |
| CF       | Hard      | Sqrt                  |
| TLX      | Hard      | Sqrt                  |
| CSA      | Hard      | Sqrt                  |
| Old Gold | Hard      | Convex Hull           |
| CF       | Very Hard | Convex Hull           |
| IOI      | Very Hard | Sqrt                  |
| Plat     | Very Hard | Sqrt                  |
| IOI      | Very Hard | 2DRQ                  |
