import os
import sys

from dotenv import load_dotenv

load_dotenv()

sys.path.insert(
    0, os.path.abspath("../..")
)  # Adds the repo root to the system path so the local litellm package is importable

import pytest

import litellm

def test_bedrock_agents():
    litellm._turn_on_debug()
    response = litellm.completion(
        model="bedrock/agent/L1RT58GYRW/MFPSBCXYTW",
        messages=[
            {
                "role": "user",
                "content": "Hi just respond with a ping message",
            }
        ],
    )

    print("response from agent=", response.model_dump_json(indent=4))

    # assert that the message content has a response with some length
    assert len(response.choices[0].message.content) > 0

    # assert we were able to get the response cost
    assert (
        response._hidden_params["response_cost"] is not None
        and response._hidden_params["response_cost"] > 0
    )

def test_bedrock_agents_with_streaming():
    # litellm._turn_on_debug()
    response = litellm.completion(
        model="bedrock/agent/L1RT58GYRW/MFPSBCXYTW",
        messages=[
            {
                "role": "user",
                "content": "Hi who is ishaan cto of litellm, tell me 10 things about him",
            }
        ],
        stream=True,
    )

    for chunk in response:
        print("chunk=", chunk)