
bigyi-15b

I recently made bigstral-12b, and when I saw the awesome new Yi-9B model I decided to make an embiggened version of it too.

This is a merge of pre-trained language models created using mergekit.

Bigyi-15b is a base / completion model, so there is no chat template.

It has a 4k context.
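
Since it's a base model, you prompt it with plain text and let it complete. Here is a minimal sketch using transformers; the repo id below is a placeholder for wherever the merge is hosted, and the generation settings are just examples:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/bigyi-15b"  # placeholder; substitute the actual repo path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# no chat template: feed raw text and let the model continue it
prompt = "Here is a recipe for Mai Tai:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))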

Example

Here is a recipe for Mai Tai:

1: 3 parts rum, 2: 3 parts pineapple juice, 3: half a cup of lime juice, 4: 6 to 8 fresh or frozen pineapple chunks, 5: crushed ice. Mix all ingredients except ice and pour into glasses with ice. Garnish with a pineapple slice.

Here is an implementation of 2-sum in golang:

func twoSum(nums []int, target int) []int {
  if len(nums) <= 1 { return nil }
  // map from value to its index so complements can be looked up in O(1)
  m := map[int]int{}
  for i, n := range nums {
    // find the complement of the current number in the map
    comp := target - n
    if j, ok := m[comp]; ok {
      return []int{j, i}
    }
    m[n] = i
  }
  return nil
}

Merge Details

Merge Method

This model was merged using the passthrough merge method.

Models Merged

The following models were included in the merge:

01-ai/Yi-9B

Configuration

The following YAML configuration was used to produce this model:

dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 12]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [6, 18]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [12, 24]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [18, 30]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [24, 36]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [30, 42]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [36, 48]
    model: 01-ai/Yi-9B
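
Because passthrough simply stacks the listed slices, the overlapping 12-layer windows above turn Yi-9B's 48 layers into an 84-layer model, which is where the roughly 15B parameter count comes from. A quick sketch of that arithmetic, assuming mergekit's half-open layer_range convention:

# slice list copied from the config above; each layer_range is [start, end)
slices = [(0, 12), (6, 18), (12, 24), (18, 30), (24, 36), (30, 42), (36, 48)]
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 84 layers in bigyi-15b, up from 48 in Yi-9B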