
When BAML raises an exception, it is always an instance of `BamlError` or one of its subclasses, so a single `except BamlError` (or equivalent `catch`) block catches every BAML-specific error.

## Example
<CodeGroup>
```python Python
from baml_client import b
from baml_py.errors import BamlError, BamlInvalidArgumentError, BamlClientError, BamlClientHttpError, BamlValidationError
from baml_py import BamlAbortError

try:
  b.CallFunctionThatRaisesError()
except BamlError as e:
  print(e)


try:
  b.CallFunctionThatRaisesError()
except BamlValidationError as e:
  # The original prompt sent to the LLM
  print(e.prompt)
  # The LLM response string
  print(e.raw_output)
  # A human-readable error message
  print(e.message)
  # Complete error history (includes fallback attempts)
  print(e.detailed_message)
```


```typescript TypeScript
import { b } from './baml_client'
// For catching parsing errors and cancellation errors, you can import these
import { BamlValidationError, BamlClientFinishReasonError, BamlAbortError } from '@boundaryml/baml'
// The rest of the BAML errors contain a string that is prefixed with:
// "BamlError:"
// Subclasses are sequentially appended to the string.
// For example, BamlInvalidArgumentError is returned as:
// "BamlError: BamlInvalidArgumentError:"
// Or, BamlClientHttpError is returned as:
// "BamlError: BamlClientError: BamlClientHttpError:"


async function example() {
  try {
    await b.CallFunctionThatRaisesError()
  } catch (e) {
    if (e instanceof BamlAbortError) {
      // Handle cancellation
      console.log('Operation was cancelled:', e.message)
      console.log('Cancellation reason:', e.reason)
    } else if (e instanceof BamlValidationError || e instanceof BamlClientFinishReasonError) {
      // These fields may be missing, so access them defensively.
      // The original prompt sent to the LLM
      console.log(e.prompt)
      // The LLM response string
      console.log(e.raw_output)
      // A human-readable error message
      console.log(e.message)
      // Complete error history (includes fallback attempts)
      console.log(e.detailed_message)
    } else {
      // Handle other BAML errors
      console.log(e)
    }
  }
}

```

```go Go
// Error handling support coming soon for Go
// Currently, Go functions return standard (non-typed) Go errors
```

```ruby Ruby
# Example coming soon
```
</CodeGroup>


## BamlError

Base class for all BAML exceptions.

<ParamField
  path="message"
  type="string"
>
  A human-readable error message.
</ParamField>

### BamlInvalidArgumentError

Subclass of `BamlError`.

Raised when one or more arguments to a function are invalid.

### BamlClientError

Subclass of `BamlError`.

Raised when a client fails to return a valid response.

<Warning>
In the case of aggregate clients like `fallback` or those with `retry_policy`, only the last client's error **type** is raised. However, the complete history of all failed attempts is preserved in the `detailed_message` field, allowing you to debug the entire fallback chain.
</Warning>
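Because only the last error type surfaces, it helps to log `detailed_message` when it is present. A minimal helper sketch (the function name is illustrative, not part of the BAML API) that prefers the full attempt history over the short message:

```python
def describe_baml_error(e: Exception) -> str:
    """Return the richest description available on a BAML error.

    Prefers `detailed_message` (which records every failed fallback or
    retry attempt) and falls back to the plain string form of the error.
    """
    detailed = getattr(e, "detailed_message", None)
    return detailed if detailed else str(e)
```

Inside an `except BamlError as e:` block, `describe_baml_error(e)` then shows the whole fallback chain rather than just the final failure.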

#### BamlClientHttpError

Subclass of `BamlClientError`.

Raised when the HTTP request made by a client fails with a non-success status code.

<ParamField
  path="status_code"
  type="int"
>
  The status code of the response.

Common status codes are:

- 1: Other
- 2: Other
- 400: Bad Request
- 401: Unauthorized
- 403: Forbidden
- 404: Not Found
- 429: Too Many Requests
- 500: Internal Server Error
</ParamField>
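Since `status_code` is exposed, application code can branch on it, for example retrying with exponential backoff on 429 while failing fast on 401. A sketch (the helper and the `retryable` predicate are illustrative, not BAML APIs; in real code you would catch `BamlClientHttpError` rather than bare `Exception`):

```python
import time

def call_with_backoff(fn, max_attempts=3, base_delay=1.0,
                      retryable=lambda e: getattr(e, "status_code", None) == 429):
    """Call fn, retrying with exponential backoff while the error is retryable."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as e:
            # Re-raise immediately on the last attempt or non-retryable errors.
            if attempt == max_attempts - 1 or not retryable(e):
                raise
            time.sleep(base_delay * (2 ** attempt))
```

For example, `call_with_backoff(lambda: b.CallFunctionThatRaisesError())` would retry rate-limited calls but re-raise a 401 immediately.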


#### BamlClientFinishReasonError

Subclass of `BamlClientError`.

Raised when the finish reason of the LLM response is not allowed.

<ParamField
  path="finish_reason"
  type="string"
>
  The finish reason of the LLM response.
</ParamField>

<ParamField
  path="message"
  type="string"
>
  An error message.
</ParamField>

<ParamField
  path="prompt"
  type="string"
>
  The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
</ParamField>

<ParamField
  path="raw_output"
  type="string"
>
  The raw text from the LLM that failed to parse into the expected return type of a function.
</ParamField>

<ParamField
  path="detailed_message"
  type="string"
>
  Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
</ParamField>

### BamlValidationError

Subclass of `BamlError`.

Raised when BAML fails to parse a string from the LLM into the specified object.

<ParamField
  path="raw_output"
  type="string"
>
  The raw text from the LLM that failed to parse into the expected return type of a function.
</ParamField>

<ParamField
  path="message"
  type="string"
>
  The parsing-related error message.
</ParamField>

<ParamField
  path="prompt"
  type="string"
>
  The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
</ParamField>

<ParamField
  path="detailed_message"
  type="string"
>
  Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
</ParamField>

### BamlAbortError

Subclass of `BamlError`.

Raised when a BAML operation is cancelled via an abort controller.

<ParamField
  path="message"
  type="string"
>
  A message describing why the operation was aborted.
</ParamField>

<ParamField
  path="reason"
  type="any"
>
  Optional additional context about the cancellation. This can be any value provided when calling the `abort()` method.
</ParamField>

## Handling Cancellation

When operations are cancelled via abort controllers, specific errors are thrown:

<CodeGroup>
```python Python
import asyncio

from baml_client import b
from baml_py import AbortController, BamlAbortError
from baml_py.errors import BamlValidationError

async def example():
    controller = AbortController()

    # Cancel after 5 seconds
    async def cancel_after_timeout():
        await asyncio.sleep(5)
        controller.abort('timeout')

    asyncio.create_task(cancel_after_timeout())

    try:
        result = await b.ExtractData(
            input_text,
            baml_options={"abort_controller": controller}
        )
    except BamlAbortError as e:
        if e.reason == 'timeout':
            print("Operation timed out after 5 seconds")
        else:
            print(f"Operation was cancelled: {e.message}")
    except BamlValidationError as e:
        print(f"Validation failed: {e.message}")
```

```typescript TypeScript
import { b } from './baml_client'
import { BamlAbortError, BamlValidationError } from '@boundaryml/baml'

async function example() {
  const controller = new AbortController()

  // Cancel after 5 seconds
  setTimeout(() => controller.abort('timeout'), 5000)

  try {
    const result = await b.ExtractData(inputText, {
      abortController: controller
    })
  } catch (e) {
    if (e instanceof BamlAbortError) {
      if (e.reason === 'timeout') {
        console.log('Operation timed out after 5 seconds')
      } else {
        console.log(`Operation was cancelled: ${e.message}`)
      }
    } else if (e instanceof BamlValidationError) {
      console.log(`Validation failed: ${e.message}`)
    }
  }
}
```

```go Go
import (
    "context"
    "errors"
    "fmt"
    "time"
)

func example() {
    // Create context with 5 second timeout
    ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    defer cancel()

    result, err := b.ExtractData(ctx, inputText)
    if err != nil {
        if errors.Is(err, context.DeadlineExceeded) {
            fmt.Println("Operation timed out after 5 seconds")
        } else if errors.Is(err, context.Canceled) {
            fmt.Println("Operation was cancelled")
        } else {
            // Handle other errors
            fmt.Printf("Error: %v\n", err)
        }
    }
}
```

```ruby Ruby
begin
  controller = Baml::AbortController.new

  # Cancel after 5 seconds in another thread
  Thread.new do
    sleep(5)
    controller.abort('timeout')
  end

  result = b.extract_data(
    input_text,
    baml_options: { abort_controller: controller }
  )
rescue Baml::AbortError => e
  if e.reason == 'timeout'
    puts "Operation timed out after 5 seconds"
  else
    puts "Operation was cancelled: #{e.message}"
  end
rescue Baml::ValidationError => e
  puts "Validation failed: #{e.message}"
end
```
</CodeGroup>

For more information on using abort controllers, see the [Abort Controllers guide](/guide/baml-basics/abort-signal).

## LLM Fixup: Dealing with Validation Errors

Our parser is very forgiving, allowing for structured data parsing even in the presence of
minor errors and thought tokens in the LLM response. However, certain types of errors are
too ambiguous to handle without the help of an LLM.

In cases where your LLM is having trouble producing valid data from the output schema, you
can use this 'fixup' recipe to get valid data:

1. Write a Fixup Function. For example, if your original function is called `Foo` and it
returns `MyClass`:

```baml BAML
function FixupFoo(errorMessage: string) -> MyClass {
    client GPT4o
    prompt #"
        Fix this malformed JSON. Preserve the same information.

        {{ ctx.output_format }}

        Original data and parse error:
        {{ errorMessage }}
    "#
}
```

2. Then call the fixup function from your client code in response to validation errors:

<CodeGroup>
```python Python
from baml_client import b
from baml_py.errors import BamlValidationError

try:
    result = b.Foo(myData)
except BamlValidationError as e:
    result = b.FixupFoo(str(e))
```
```typescript TypeScript
import { b } from './baml_client'
import { BamlValidationError } from '@boundaryml/baml'

async function example() {
  try {
    const result = await b.Foo(myData)
  } catch (e) {
    if (e instanceof BamlValidationError) {
      // JSON.stringify(e) would yield "{}" since Error properties are
      // non-enumerable; pass the error message instead.
      const result = await b.FixupFoo(e.message)
    }
  }
}
```

```go Go
// Example coming soon.
```

```ruby Ruby
begin
  result = b.foo(my_data)
rescue Baml::ValidationError => e
  result = b.fixup_foo(e.message)
end
```
</CodeGroup>

### Choosing a Model

LLMs are good at reconstituting data, so it is often possible to use a less
powerful model for your fixup function than the model you used to produce
the original data. The difficulty of producing valid JSON data depends on
the complexity of the schema and the details of your data payload, so be
sure to test your fixup function on realistic data payloads before moving
to a smaller model.
