Shouldn't CodeLlama 34B have 16K context and rope_theta 1M? (#3)
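For reference, a minimal sketch (assuming the `transformers` `AutoConfig` API and the upstream `codellama/CodeLlama-34b-hf` repo id) that prints the two fields in question so they can be compared against this repo's `config.json`:

```python
# Minimal sketch: inspect max_position_embeddings and rope_theta for a Llama-family config.
# The repo id below is the upstream reference config, used only for comparison;
# swap in the repo under discussion as needed.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("codellama/CodeLlama-34b-hf")

print(config.max_position_embeddings)  # expected 16384 (16K) for CodeLlama
print(config.rope_theta)               # expected 1000000.0 (1M) for CodeLlama
```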