---
title: "AsyncFileSystem"
hideTitleOnPage: true
---

## AsyncFileSystem

```python
class AsyncFileSystem()
```

Provides file system operations within a Sandbox.

This class implements a high-level interface to file system operations that can
be performed within a Daytona Sandbox.

#### AsyncFileSystem.\_\_init\_\_

```python
def __init__(api_client: FileSystemApi,
             ensure_toolbox_url: Callable[[], Awaitable[None]])
```

Initializes a new AsyncFileSystem instance.

**Arguments**:

- `api_client` _FileSystemApi_ - API client for Sandbox file system operations.
- `ensure_toolbox_url` _Callable[[], Awaitable[None]]_ - Ensures the toolbox API URL is initialized.
  Must be called before invoking any private methods on the API client.

#### AsyncFileSystem.create\_folder

```python
@intercept_errors(message_prefix="Failed to create folder: ")
async def create_folder(path: str, mode: str) -> None
```

Creates a new directory in the Sandbox at the specified path with the given
permissions.

**Arguments**:

- `path` _str_ - Path where the folder should be created. Relative paths are resolved based
  on the sandbox working directory.
- `mode` _str_ - Folder permissions in octal format (e.g., "755" for rwxr-xr-x).
  

**Example**:

```python
# Create a directory with standard permissions
await sandbox.fs.create_folder("workspace/data", "755")

# Create a private directory
await sandbox.fs.create_folder("workspace/secrets", "700")
```
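
The `mode` strings above are standard Unix octal permissions. If you are unsure what a given mode expands to, Python's `stat` module can render it locally (no Sandbox needed):

```python
import stat

# Parse the octal string, then OR in a file-type bit so filemode()
# renders the familiar ls-style permission string.
assert stat.filemode(stat.S_IFDIR | int("755", 8)) == "drwxr-xr-x"
assert stat.filemode(stat.S_IFDIR | int("700", 8)) == "drwx------"
```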

#### AsyncFileSystem.delete\_file

```python
@intercept_errors(message_prefix="Failed to delete file: ")
async def delete_file(path: str, recursive: bool = False) -> None
```

Deletes a file from the Sandbox.

**Arguments**:

- `path` _str_ - Path to the file to delete. Relative paths are resolved based on the sandbox working directory.
- `recursive` _bool_ - If the path is a directory, this must be True to delete it.
  

**Example**:

```python
# Delete a file
await sandbox.fs.delete_file("workspace/data/old_file.txt")
```

#### AsyncFileSystem.download\_file

```python
@overload
async def download_file(remote_path: str, timeout: int = 30 * 60) -> bytes
```

Downloads a file from the Sandbox and returns its contents as a bytes object.
This method is useful when you want to load the file into memory without saving it to disk,
and is suitable only for smaller files that fit into memory.

**Arguments**:

- `remote_path` _str_ - Path to the file in the Sandbox. Relative paths are resolved based
  on the sandbox working directory.
- `timeout` _int_ - Timeout for the download operation in seconds. 0 means no timeout. Default is 30 minutes.
  

**Returns**:

- `bytes` - The file contents as a bytes object.
  

**Example**:

```python
# Download and save a file locally
content = await sandbox.fs.download_file("workspace/data/file.txt")
with open("local_copy.txt", "wb") as f:
    f.write(content)

# Download and parse JSON content
import json

content = await sandbox.fs.download_file("workspace/data/config.json")
config = json.loads(content.decode('utf-8'))
```

#### AsyncFileSystem.download\_file

```python
@overload
async def download_file(remote_path: str,
                        local_path: str,
                        timeout: int = 30 * 60) -> None
```

Downloads a file from the Sandbox and streams it to a local file.
This method is useful for downloading larger files that may not fit into memory.

**Arguments**:

- `remote_path` _str_ - Path to the file in the Sandbox. Relative paths are resolved based
  on the sandbox working directory.
- `local_path` _str_ - Path to save the file locally.
- `timeout` _int_ - Timeout for the download operation in seconds. 0 means no timeout. Default is 30 minutes.
  

**Example**:

```python
import os

local_path = "local_copy.txt"
await sandbox.fs.download_file("tmp/large_file.txt", local_path)
size_mb = os.path.getsize(local_path) / 1024 / 1024
print(f"Size of the downloaded file {local_path}: {size_mb:.2f} MB")
```

#### AsyncFileSystem.download\_files

```python
@intercept_errors(message_prefix="Failed to download files: ")
async def download_files(files: List[FileDownloadRequest],
                         timeout: int = 30 * 60) -> List[FileDownloadResponse]
```

Downloads multiple files from the Sandbox. If the files already exist locally, they will be overwritten.

**Arguments**:

- `files` _List[FileDownloadRequest]_ - List of files to download.
- `timeout` _int_ - Timeout for the download operation in seconds. 0 means no timeout. Default is 30 minutes.
  

**Returns**:

- `List[FileDownloadResponse]` - List of download results.
  

**Raises**:

- `Exception` - Only if the request itself fails (network issues, invalid request/response, etc.). Individual
  file download errors are returned in the `FileDownloadResponse.error` field.
  

**Example**:

```python
# Download multiple files
results = await sandbox.fs.download_files([
    FileDownloadRequest(source="tmp/data.json"),
    FileDownloadRequest(source="tmp/config.json", destination="local_config.json")
])
for result in results:
    if result.error:
        print(f"Error downloading {result.source}: {result.error}")
    elif result.result:
        print(f"Downloaded {result.source} to {result.result}")
```

#### AsyncFileSystem.find\_files

```python
@intercept_errors(message_prefix="Failed to find files: ")
async def find_files(path: str, pattern: str) -> List[Match]
```

Searches for files containing a pattern, similar to
the grep command.

**Arguments**:

- `path` _str_ - Path to the file or directory to search. If the path is a directory,
  the search will be performed recursively. Relative paths are resolved based
  on the sandbox working directory.
- `pattern` _str_ - Search pattern to match against file contents.
  

**Returns**:

- `List[Match]` - List of matches found in files. Each Match object includes:
  - file: Path to the file containing the match
  - line: The line number where the match was found
  - content: The matching line content
  

**Example**:

```python
# Search for TODOs in Python files
matches = await sandbox.fs.find_files("workspace/src", "TODO:")
for match in matches:
    print(f"{match.file}:{match.line}: {match.content.strip()}")
```

#### AsyncFileSystem.get\_file\_info

```python
@intercept_errors(message_prefix="Failed to get file info: ")
async def get_file_info(path: str) -> FileInfo
```

Gets detailed information about a file or directory, including its
size, permissions, and timestamps.

**Arguments**:

- `path` _str_ - Path to the file or directory. Relative paths are resolved based
  on the sandbox working directory.
  

**Returns**:

- `FileInfo` - Detailed file information including:
  - name: File name
  - is_dir: Whether the path is a directory
  - size: File size in bytes
  - mode: File permissions
  - mod_time: Last modification timestamp
  - permissions: File permissions in octal format
  - owner: File owner
  - group: File group
  

**Example**:

```python
# Get file metadata
info = await sandbox.fs.get_file_info("workspace/data/file.txt")
print(f"Size: {info.size} bytes")
print(f"Modified: {info.mod_time}")
print(f"Mode: {info.mode}")

# Check if path is a directory
info = await sandbox.fs.get_file_info("workspace/data")
if info.is_dir:
    print("Path is a directory")
```

#### AsyncFileSystem.list\_files

```python
@intercept_errors(message_prefix="Failed to list files: ")
async def list_files(path: str) -> List[FileInfo]
```

Lists files and directories in a given path and returns their information, similar to the ls -l command.

**Arguments**:

- `path` _str_ - Path to the directory to list contents from. Relative paths are resolved
  based on the sandbox working directory.
  

**Returns**:

- `List[FileInfo]` - List of file and directory information. Each FileInfo
  object includes the same fields as described in get_file_info().
  

**Example**:

```python
# List directory contents
files = await sandbox.fs.list_files("workspace/data")

# Print files and their sizes
for file in files:
    if not file.is_dir:
        print(f"{file.name}: {file.size} bytes")

# List only directories
dirs = [f for f in files if f.is_dir]
print("Subdirectories:", ", ".join(d.name for d in dirs))
```

#### AsyncFileSystem.move\_files

```python
@intercept_errors(message_prefix="Failed to move files: ")
async def move_files(source: str, destination: str) -> None
```

Moves or renames a file or directory. The parent directory of the destination must exist.

**Arguments**:

- `source` _str_ - Path to the source file or directory. Relative paths are resolved
  based on the sandbox working directory.
- `destination` _str_ - Path to the destination. Relative paths are resolved based on
  the sandbox working directory.
  

**Example**:

```python
# Rename a file
await sandbox.fs.move_files(
    "workspace/data/old_name.txt",
    "workspace/data/new_name.txt"
)

# Move a file to a different directory
await sandbox.fs.move_files(
    "workspace/data/file.txt",
    "workspace/archive/file.txt"
)

# Move a directory
await sandbox.fs.move_files(
    "workspace/old_dir",
    "workspace/new_dir"
)
```

#### AsyncFileSystem.replace\_in\_files

```python
@intercept_errors(message_prefix="Failed to replace in files: ")
async def replace_in_files(files: List[str], pattern: str,
                           new_value: str) -> List[ReplaceResult]
```

Performs search and replace operations across multiple files.

**Arguments**:

- `files` _List[str]_ - List of file paths to perform replacements in. Relative paths are
  resolved based on the sandbox working directory.
- `pattern` _str_ - Pattern to search for.
- `new_value` _str_ - Text to replace matches with.
  

**Returns**:

- `List[ReplaceResult]` - List of results indicating replacements made in
  each file. Each ReplaceResult includes:
  - file: Path to the modified file
  - success: Whether the operation was successful
  - error: Error message if the operation failed
  

**Example**:

```python
# Replace in specific files
results = await sandbox.fs.replace_in_files(
    files=["workspace/src/file1.py", "workspace/src/file2.py"],
    pattern="old_function",
    new_value="new_function"
)

# Print results
for result in results:
    if result.success:
        print(f"{result.file}: replaced successfully")
    else:
        print(f"{result.file}: {result.error}")
```

#### AsyncFileSystem.search\_files

```python
@intercept_errors(message_prefix="Failed to search files: ")
async def search_files(path: str, pattern: str) -> SearchFilesResponse
```

Searches for files and directories whose names match the
specified pattern. The pattern can be a simple string or a glob pattern.

**Arguments**:

- `path` _str_ - Path to the root directory to start search from. Relative paths are resolved
  based on the sandbox working directory.
- `pattern` _str_ - Pattern to match against file names. Supports glob
  patterns (e.g., "*.py" for Python files).
  

**Returns**:

- `SearchFilesResponse` - Search results containing:
  - files: List of matching file and directory paths
  

**Example**:

```python
# Find all Python files
result = await sandbox.fs.search_files("workspace", "*.py")
for file in result.files:
    print(file)

# Find files with specific prefix
result = await sandbox.fs.search_files("workspace/data", "test_*")
print(f"Found {len(result.files)} test files")
```
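
The patterns accepted here follow the usual shell-style glob rules. As a local sanity check (assuming the Sandbox applies standard glob semantics), Python's `fnmatch` behaves the same way:

```python
import fnmatch

names = ["main.py", "utils.py", "README.md", "test_app.py"]

# "*.py" matches any name ending in .py
assert fnmatch.filter(names, "*.py") == ["main.py", "utils.py", "test_app.py"]

# "test_*" matches names with the given prefix
assert fnmatch.filter(names, "test_*") == ["test_app.py"]
```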

#### AsyncFileSystem.set\_file\_permissions

```python
@intercept_errors(message_prefix="Failed to set file permissions: ")
async def set_file_permissions(path: str,
                               mode: Optional[str] = None,
                               owner: Optional[str] = None,
                               group: Optional[str] = None) -> None
```

Sets permissions and ownership for a file or directory. Any of the parameters can be None
to leave that attribute unchanged.

**Arguments**:

- `path` _str_ - Path to the file or directory. Relative paths are resolved based on
  the sandbox working directory.
- `mode` _Optional[str]_ - File mode/permissions in octal format
  (e.g., "644" for rw-r--r--).
- `owner` _Optional[str]_ - User owner of the file.
- `group` _Optional[str]_ - Group owner of the file.
  

**Example**:

```python
# Make a file executable
await sandbox.fs.set_file_permissions(
    path="workspace/scripts/run.sh",
    mode="755"  # rwxr-xr-x
)

# Change file owner
await sandbox.fs.set_file_permissions(
    path="workspace/data/file.txt",
    owner="daytona",
    group="daytona"
)
```

#### AsyncFileSystem.upload\_file

```python
@overload
async def upload_file(file: bytes,
                      remote_path: str,
                      timeout: int = 30 * 60) -> None
```

Uploads a file to the specified path in the Sandbox. If a file already exists at
the destination path, it will be overwritten. This method is useful when you want to upload
small files that fit into memory.

**Arguments**:

- `file` _bytes_ - File contents as a bytes object.
- `remote_path` _str_ - Path to the destination file. Relative paths are resolved based on
  the sandbox working directory.
- `timeout` _int_ - Timeout for the upload operation in seconds. 0 means no timeout. Default is 30 minutes.
  

**Example**:

```python
# Upload a text file
content = b"Hello, World!"
await sandbox.fs.upload_file(content, "tmp/hello.txt")

# Upload a local file
with open("local_file.txt", "rb") as f:
    content = f.read()
await sandbox.fs.upload_file(content, "tmp/file.txt")

# Upload binary data
import json
data = {"key": "value"}
content = json.dumps(data).encode('utf-8')
await sandbox.fs.upload_file(content, "tmp/config.json")
```

#### AsyncFileSystem.upload\_file

```python
@overload
async def upload_file(local_path: str,
                      remote_path: str,
                      timeout: int = 30 * 60) -> None
```

Uploads a file from the local file system to the specified path in the Sandbox.
If a file already exists at the destination path, it will be overwritten. This method uses
streaming to upload the file, so it is useful when you want to upload larger files that may
not fit into memory.

**Arguments**:

- `local_path` _str_ - Path to the local file to upload.
- `remote_path` _str_ - Path to the destination file in the Sandbox. Relative paths are
  resolved based on the sandbox working directory.
- `timeout` _int_ - Timeout for the upload operation in seconds. 0 means no timeout. Default is 30 minutes.
  

**Example**:

```python
await sandbox.fs.upload_file("local_file.txt", "tmp/large_file.txt")
```

#### AsyncFileSystem.upload\_files

```python
@intercept_errors(message_prefix="Failed to upload files: ")
async def upload_files(files: List[FileUpload],
                       timeout: int = 30 * 60) -> None
```

Uploads multiple files to the Sandbox. If files already exist at the destination paths,
they will be overwritten.

**Arguments**:

- `files` _List[FileUpload]_ - List of files to upload.
- `timeout` _int_ - Timeout for the upload operation in seconds. 0 means no timeout. Default is 30 minutes.

**Example**:

```python
# Upload multiple text files
files = [
    FileUpload(
        source=b"Content of file 1",
        destination="/tmp/file1.txt"
    ),
    FileUpload(
        source="workspace/data/file2.txt",
        destination="/tmp/file2.txt"
    ),
    FileUpload(
        source=b'{"key": "value"}',
        destination="/tmp/config.json"
    )
]
await sandbox.fs.upload_files(files)
```


## FileUpload

```python
@dataclass
class FileUpload()
```

Represents a file to be uploaded to the Sandbox.

**Attributes**:

- `source` _Union[bytes, str]_ - File contents as a bytes object, or a path to a local file. If a bytes
  object is provided, ensure it fits into memory; otherwise, provide a local file path, whose contents
  will be streamed to the Sandbox.
- `destination` _str_ - Destination path in the Sandbox. Relative paths are resolved based on
  the sandbox working directory.

## FileDownloadRequest

```python
@dataclass
class FileDownloadRequest()
```

Represents a request to download a single file from the Sandbox.

**Attributes**:

- `source` _str_ - Source path in the Sandbox. Relative paths are resolved based on the sandbox
  working directory.
- `destination` _Optional[str]_ - Destination path on the local filesystem where the file content
  will be streamed. If not provided, the file is downloaded into an in-memory bytes buffer
  (which may cause memory issues for large files).

## FileDownloadResponse

```python
@dataclass
class FileDownloadResponse()
```

Represents the response to a single file download request.

**Attributes**:

- `source` _str_ - The original source path requested for download.
- `result` _Union[str, bytes, None]_ - The download result: the local file path (if a destination
  was provided in the request) or the file contents as bytes (if not). None if the download
  failed or no data was received.
- `error` _Optional[str]_ - Error message if the download failed, None if successful.
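
Because `result` may be a local path (`str`), raw content (`bytes`), or `None`, callers typically branch on its type. A minimal illustrative helper (the function name is hypothetical, not part of the SDK):

```python
def describe_download(result):
    """Classify a FileDownloadResponse.result value (illustrative only)."""
    if result is None:
        return "failed or no data received"
    if isinstance(result, bytes):
        return f"in-memory content ({len(result)} bytes)"
    return f"saved to {result}"

assert describe_download(None) == "failed or no data received"
assert describe_download(b"abc") == "in-memory content (3 bytes)"
assert describe_download("local_config.json") == "saved to local_config.json"
```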

