#!/usr/bin/python
# Filename: AdvTreeDS.py

# Advanced Tree Data Structures, Spatial Data (n-dimensional) Search and Indexing

# Trie Tree
# re(trie)val tree
# a character tree for storing words; each sub-tree holds the set of words sharing the same prefix
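# A minimal dict-based sketch of the idea above (the class and method names are mine, not standard):

```python
class Trie:
    """A minimal trie; each node maps one character to a child node."""
    def __init__(self):
        self.children = {}
        self.is_word = False

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def search(self, word):
        node = self
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.is_word

    def starts_with(self, prefix):
        """All words stored under one node share that prefix, per the note above."""
        node = self
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True
```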

# Patricia Tree
# an improved trie tree
# each branch compares one bit of the encoded character (key), 1 or 0 (somewhat like a Huffman tree)
# BTW, I don't think it must be a full binary tree; it depends on the encoding

# Best BST
# the BST with the least accumulated (weighted) search cost
# key set: [A, B, C, D]
# weight sequence: [p1, p2, p3, p4, q0, q1, q2, q3, q4]
# q0 is the weight of key < A, p1 the weight of key = A, q1 the weight of A < key < B, and so on
# C(i, j) = w(i, j) + min over i < k <= j of (C(i, k-1) + C(k, j))
# is the formula for the least cost; solve it with dynamic programming
# 3 tables:
# a) w(i, j) = sum of q(i..j) + sum of p(i+1..j), initialized first
# b) c(i, j): c(i, i) = 0 initialized
# c) r(i, j): the k that achieves the minimum C(i, j)
# finally, use the r table to build the tree
# a) r(0, n) is the root
# b) r(0, r(0, n)-1) and r(r(0, n), n) are the roots of the two sub-trees
# c) repeat recursively
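# A sketch of the three tables in Python (function names are mine; p is 1-based with p[0] unused):

```python
def optimal_bst(p, q):
    """Optimal BST by dynamic programming.
    p[k] is the weight of key k; q[k] is the weight of the gap just after
    key k (q[0] is the weight of searches below the first key).
    Returns the cost table c and the root table r described above."""
    n = len(p) - 1
    w = [[0] * (n + 1) for _ in range(n + 1)]
    c = [[0] * (n + 1) for _ in range(n + 1)]
    r = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        w[i][i] = q[i]                       # table a), base case
    for span in range(1, n + 1):
        for i in range(n - span + 1):
            j = i + span
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            # C(i, j) = w(i, j) + min over i < k <= j of C(i, k-1) + C(k, j)
            best_k = min(range(i + 1, j + 1),
                         key=lambda k: c[i][k - 1] + c[k][j])
            c[i][j] = w[i][j] + c[i][best_k - 1] + c[best_k][j]
            r[i][j] = best_k                 # table c): the k giving the minimum
    return c, r

def build_tree(r, i, j):
    """Rebuild the tree from the r table: r(i, j) roots the keys i+1..j."""
    if i >= j:
        return None
    k = r[i][j]
    return (k, build_tree(r, i, k - 1), build_tree(r, k, j))
```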

# AVL(Adelson-Velskii and Landis) Tree
# 1) property
# a) can be empty
# b) height = O(log(n))
# c) if a tree is an AVL tree, its sub-trees are also AVL trees, and |hL - hR| <= 1
# 2) re-structuring(keep balance)
# balance factor bf(t) = hR - hL
# a) insert value
# i) insert, re-evaluate h and bf
# ii) find the lowest node where balance is broken
# iii) LL/RR case: single rotation
# iv) LR/RL case: double rotation
# b) delete value
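# A minimal sketch of the insert path above (names like AVLNode are mine; deletion is omitted):

```python
class AVLNode:
    def __init__(self, key):
        self.key, self.left, self.right, self.h = key, None, None, 1

def height(t):
    return t.h if t else 0

def bf(t):
    return height(t.right) - height(t.left)   # balance factor hR - hL, as above

def update(t):
    t.h = 1 + max(height(t.left), height(t.right))

def rotate_left(t):
    r = t.right
    t.right, r.left = r.left, t
    update(t)
    update(r)
    return r

def rotate_right(t):
    l = t.left
    t.left, l.right = l.right, t
    update(t)
    update(l)
    return l

def avl_insert(t, key):
    if t is None:
        return AVLNode(key)
    if key < t.key:
        t.left = avl_insert(t.left, key)
    else:
        t.right = avl_insert(t.right, key)
    update(t)                       # i) re-evaluate h and bf on the way back up
    if bf(t) < -1:                  # ii) left-heavy imbalance found here
        if bf(t.left) > 0:          # iv) LR case: extra rotation on the child first
            t.left = rotate_left(t.left)
        return rotate_right(t)      # iii) LL case: single rotation
    if bf(t) > 1:                   # right-heavy, mirror cases
        if bf(t.right) < 0:         # RL case
            t.right = rotate_right(t.right)
        return rotate_left(t)       # RR case
    return t
```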

# Self-Organized Linear List
# 1) count
# keep items sorted by visit frequency (count); needs extra space to record the count
# 2) move to front
# move the most recently visited item to the front (cheap in a linked list)
# 3) move 1 step (transpose)
# swap the most recently visited item with its predecessor, so the list changes gradually
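# The last two strategies, sketched on a plain Python list as a stand-in for a linked list
# (function names are mine):

```python
def mtf_access(items, key):
    """Move-to-front: pull the visited item out and reinsert it at index 0.
    Returns the index where key was found (roughly the search cost)."""
    i = items.index(key)
    items.insert(0, items.pop(i))
    return i

def transpose_access(items, key):
    """Move 1 step: swap the visited item with its predecessor, if any."""
    i = items.index(key)
    if i > 0:
        items[i - 1], items[i] = items[i], items[i - 1]
    return i
```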

# Splay Tree
# a series of double splays, plus a single splay if needed, moves the most recently visited item to the root (or some other ancestor)
# this makes the tree more balanced, so its height depends less on the insertion order
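# A sketch of splaying, using the top-down variant rather than the bottom-up double splays
# described above (SNode and the function name are mine):

```python
class SNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def splay(t, key):
    """Top-down splay: moves key (or the last node on its search path) to the root."""
    if t is None:
        return None
    header = SNode(None)            # temporary root for the left/right side trees
    left_max = right_min = header
    while True:
        if key < t.key:
            if t.left is None:
                break
            if key < t.left.key:    # zig-zig: rotate right first
                y = t.left
                t.left, y.right = y.right, t
                t = y
                if t.left is None:
                    break
            right_min.left = t      # link t into the right side tree
            right_min = t
            t = t.left
        elif key > t.key:
            if t.right is None:
                break
            if key > t.right.key:   # zig-zig: rotate left first
                y = t.right
                t.right, y.left = y.left, t
                t = y
                if t.right is None:
                    break
            left_max.right = t      # link t into the left side tree
            left_max = t
            t = t.right
        else:
            break
    left_max.right, right_min.left = t.left, t.right   # reassemble
    t.left, t.right = header.right, header.left
    return t
```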

# Semi-Splaying Tree
# if curr.parent == root: single rotation (zig)
# elif zig-zig: semi-splay (one rotation), curr = curr.parent
# else (zig-zag): full splay (double rotation)
# repeat until curr == root

# K-D Tree
# a binary tree; each level compares one dimension of the key
# for an n-dimensional vector:
# l mod n is the dimension compared at level l; if less, go to the left child, otherwise the right child, then l += 1
# 1) insert:
# a) compare recursively; if found, return
# b) if not found, add it as the empty child of the last comparison
# 2) delete:
# a) find the left-most (minimum) or right-most (maximum) node in the deleted node's dimension to replace it
# can be used to search for points within a given Euclidean distance of another point, though not very simply
# the height of the tree depends on the insertion order
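# A sketch of insert and exact search (node layout [point, left, right] is my choice):

```python
def kd_insert(t, point, depth=0):
    """A node is a list [point, left, right]; level depth compares dimension depth % n."""
    if t is None:
        return [point, None, None]
    d = depth % len(point)
    if point[d] < t[0][d]:
        t[1] = kd_insert(t[1], point, depth + 1)
    else:
        t[2] = kd_insert(t[2], point, depth + 1)
    return t

def kd_search(t, point, depth=0):
    """Exact-match search, following the same one-dimension-per-level rule."""
    if t is None:
        return False
    if t[0] == point:
        return True
    d = depth % len(point)
    child = t[1] if point[d] < t[0][d] else t[2]
    return kd_search(child, point, depth + 1)
```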

# PR(Point-Region) Quad-Tree
# a full 4-way tree
# divides space into four equal-sized square quadrants: NorthEast, SE, SW, NW
# each split or merge happens at the center of a cell
# can be used to search for points within a given Euclidean distance of another point, though not very simply
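# A sketch of PR quadtree insertion (class name and cell layout are mine; assumes distinct points):

```python
class PRQuad:
    """One square cell centered at (cx, cy) with half-width `half`.
    A leaf stores at most one point; inserting a second point splits the
    cell into NE/SE/SW/NW children at the cell center."""
    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half
        self.point = None    # payload while this cell is still a leaf
        self.kids = None     # [NE, SE, SW, NW] once the cell has split

    def _quadrant(self, x, y):
        if x >= self.cx:
            return 0 if y >= self.cy else 1   # NE or SE
        return 3 if y >= self.cy else 2       # NW or SW

    def insert(self, p):
        if self.kids is None:
            if self.point is None:
                self.point = p
                return
            old, self.point = self.point, None   # split: push the old point down
            self.kids = [None] * 4
            self._down(old)
        self._down(p)

    def _down(self, p):
        q = self._quadrant(p[0], p[1])
        if self.kids[q] is None:
            h = self.half / 2
            centers = [(self.cx + h, self.cy + h), (self.cx + h, self.cy - h),
                       (self.cx - h, self.cy - h), (self.cx - h, self.cy + h)]
            cx, cy = centers[q]
            self.kids[q] = PRQuad(cx, cy, h)
        self.kids[q].insert(p)
```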

# R Tree
# something like a B+ Tree
# 1) each item in a branch node is <reference, rectangle (or higher-dimensional box)>
# the reference points to its child node
# the rectangle is the smallest rectangle that can contain all rectangles in that child
# 2) each item in a leaf is <Oid, rectangle>
# Oid is a reference to a spatial object
# the rectangle is the smallest rectangle that can contain the object
# all leaves are on the same level
# 3) insert
# a) select a leaf (least-enlargement rule)
# b) distribute entries on overflow (node split)
# i) pick seeds (to initialize the two groups)
# ii) pick next (PickNext algorithm)
# iii) add it to a group (least-enlargement again)
# 4) R* Tree: also considers overlap
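# A sketch of the least-enlargement rule used in step 3a (function names and the
# (xmin, ymin, xmax, ymax) rectangle layout are my assumptions):

```python
def mbr_union(a, b):
    """Smallest rectangle containing both; rectangles are (xmin, ymin, xmax, ymax)."""
    return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))

def area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def choose_child(entries, rect):
    """Pick the <reference, rectangle> entry whose rectangle needs the least
    area enlargement to contain rect (ties broken by smaller area)."""
    def cost(entry):
        _, mbr = entry
        return (area(mbr_union(mbr, rect)) - area(mbr), area(mbr))
    return min(entries, key=cost)
```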

# Decision Tree
# training a decision tree is a long story

# Game Tree

