Description

Given an n * n matrix grid of 0's and 1's only, we want to represent the grid with a Quad-Tree. Return the root of the Quad-Tree representing the grid.

Examples

Input:grid = [[0,1],[1,0]]
Output:[[0,1],[1,0],[1,1],[1,1],[1,0]]
Explanation:

The grid is not uniform, so the root is an internal node [0,1] with four leaf children: top-left [1,0], top-right [1,1], bottom-left [1,1], and bottom-right [1,0]. Each node is serialized as a pair [isLeaf, val], listed in level order.

Input:grid = [[1]]
Output:[[1,1]]
Explanation:

Minimal case: the single-element grid is uniform, so the whole tree is one leaf node.

Input:grid = [[1,1,0,0],[1,1,0,0],[1,1,1,1],[1,1,1,1]]
Output:[[0,1],[1,1],[1,0],[1,1],[1,1]]
Explanation:

The 4x4 grid is divided into four 2x2 quadrants:

  • Top-left [[1,1],[1,1]] is uniform (all 1s), so it becomes a leaf node [1,1].
  • Top-right [[0,0],[0,0]] is uniform (all 0s), so it becomes a leaf node [1,0].
  • Bottom-left [[1,1],[1,1]] is uniform (all 1s), so it becomes a leaf node [1,1].
  • Bottom-right [[1,1],[1,1]] is uniform (all 1s), so it becomes a leaf node [1,1].

Since the four children are not all identical, the root is an internal node [0,1] with those four children.

Constraints

  • n == grid.length == grid[i].length
  • n == 2^x where 0 ≤ x ≤ 6
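
A natural approach is recursive divide and conquer: if a sub-grid is uniform, emit a leaf; otherwise split it into four half-size quadrants and recurse. Below is a minimal sketch of that idea. The `Node` class and the `serialize` helper (which produces the level-order [isLeaf, val] pairs shown in the examples) are assumptions made here for a self-contained illustration; an actual judge typically supplies its own node definition and serializer.

```python
class Node:
    # Assumed quad-tree node: val (True/False), isLeaf flag, four children.
    def __init__(self, val, isLeaf, topLeft=None, topRight=None,
                 bottomLeft=None, bottomRight=None):
        self.val = val
        self.isLeaf = isLeaf
        self.topLeft = topLeft
        self.topRight = topRight
        self.bottomLeft = bottomLeft
        self.bottomRight = bottomRight


def construct(grid):
    """Build the Quad-Tree for an n x n grid, where n is a power of 2."""
    def build(r, c, n):
        first = grid[r][c]
        # If every cell in this sub-grid matches the first one, it is a leaf.
        if all(grid[i][j] == first
               for i in range(r, r + n) for j in range(c, c + n)):
            return Node(first == 1, True)
        half = n // 2
        # Otherwise split into four quadrants and recurse.
        return Node(True, False,
                    build(r, c, half),            # top-left
                    build(r, c + half, half),     # top-right
                    build(r + half, c, half),     # bottom-left
                    build(r + half, c + half, half))  # bottom-right
    return build(0, 0, len(grid))


def serialize(root):
    """Level-order list of [isLeaf, val] pairs (hypothetical helper,
    matching the example output format)."""
    out, queue = [], [root]
    while queue:
        node = queue.pop(0)
        out.append([int(node.isLeaf), int(node.val)])
        if not node.isLeaf:
            queue += [node.topLeft, node.topRight,
                      node.bottomLeft, node.bottomRight]
    return out
```

On the first example, `serialize(construct([[0,1],[1,0]]))` yields [[0,1],[1,0],[1,1],[1,1],[1,0]]. The uniformity check scans each sub-grid, giving O(n^2 log n) time overall; this is comfortably fast for n ≤ 64 as allowed by the constraints.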
