runtime error
Space failed. Exit code: 1. Reason:
Couldn't find '/home/ubuntu/.ollama/id_ed25519'. Generating new private key.
Your new public key is:

ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAsbam93fm49G/jpiSqotszZQWBfpOngy5quWJrUQPEV

2024/06/03 04:01:20 routes.go:1028: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
time=2024-06-03T04:01:20.533Z level=INFO source=images.go:729 msg="total blobs: 0"
time=2024-06-03T04:01:20.534Z level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-03T04:01:20.535Z level=INFO source=routes.go:1074 msg="Listening on [::]:11434 (version 0.1.39)"
time=2024-06-03T04:01:20.538Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama3619178421/runners
time=2024-06-03T04:01:24.607Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx2 cuda_v11 rocm_v60002 cpu cpu_avx]"
time=2024-06-03T04:01:24.609Z level=INFO source=types.go:71 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="123.8 GiB" available="17.8 GiB"
[GIN] 2024/06/03 - 04:01:25 | 200 |      56.059µs |       127.0.0.1 | HEAD     "/"
[GIN] 2024/06/03 - 04:01:25 | 200 |  56.898373ms |       127.0.0.1 | POST     "/api/pull"
pulling manifest
Error: pull model manifest: file does not exist
Container logs:
Fetching error logs...