124 changes: 124 additions & 0 deletions blacksheep/docs/openapi.md
@@ -222,6 +222,130 @@ components: {}
tags: []
```

### Request body binders support

/// admonition | Enhanced in BlackSheep 2.6.0
type: info

BlackSheep 2.6.0 adds full OpenAPI documentation support for `FromText` and `FromFiles` binders. These binders are now automatically documented with appropriate request body schemas and content types.

///

BlackSheep automatically generates OpenAPI documentation for various request body binders. The following examples assume the `docs` handler has been set up as described in the [Built-in support](#built-in-support-for-openapi-documentation) section.

#### FromJSON

Documented with `application/json` content type and the appropriate schema:

```python
from dataclasses import dataclass
from blacksheep import FromJSON, post


@dataclass
class CreateUserInput:
name: str
email: str
age: int


@docs(
summary="Create a new user",
responses={201: "User created successfully"}
)
@post("/api/users")
async def create_user(input: FromJSON[CreateUserInput]):
return {"user_id": 123}
```

The OpenAPI documentation automatically includes the request body schema for `CreateUserInput`.
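For this handler, the generated specification contains a request body section roughly like the following sketch (the exact output depends on the BlackSheep version and configuration):

```yaml
paths:
  /api/users:
    post:
      summary: Create a new user
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/CreateUserInput'
components:
  schemas:
    CreateUserInput:
      type: object
      properties:
        name:
          type: string
        email:
          type: string
        age:
          type: integer
```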

#### FromFiles (since 2.6.0)

Documented with `multipart/form-data` content type:

```python
from blacksheep import FromFiles, post


@docs(
summary="Upload files",
responses={201: "Files uploaded successfully"}
)
@post("/api/upload")
async def upload_files(files: FromFiles):
return {"files_count": len(files.value)}
```

The OpenAPI documentation automatically documents this as a file upload endpoint with `multipart/form-data` encoding.

#### FromText (since 2.6.0)

Documented with `text/plain` content type:

```python
from blacksheep import FromText, post


@docs(
summary="Store text content",
responses={201: "Text stored successfully"}
)
@post("/api/text")
async def store_text(content: FromText):
return {"length": len(content.value)}
```

#### Mixed multipart/form-data (since 2.6.0)

When combining `FromText` and `FromFiles` in the same endpoint, BlackSheep generates appropriate `multipart/form-data` documentation:

```python
from blacksheep import FromFiles, FromText, post


@docs(
summary="Upload files with description",
responses={201: "Upload completed successfully"}
)
@post("/api/upload-with-metadata")
async def upload_with_metadata(
description: FromText,
files: FromFiles,
):
return {
"description": description.value,
"files_count": len(files.value)
}
```

The OpenAPI specification will correctly document both the text field and file upload field as part of the `multipart/form-data` request body.
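As a sketch, the request body for this handler is described roughly as follows, with properties named after the handler parameters (the exact output may vary):

```yaml
requestBody:
  content:
    multipart/form-data:
      schema:
        type: object
        properties:
          description:
            type: string
          files:
            type: array
            items:
              type: string
              format: binary
```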

#### FromForm

Documented with `application/x-www-form-urlencoded` or `multipart/form-data` content type:

```python
from dataclasses import dataclass
from blacksheep import FromForm, post


@dataclass
class ContactForm:
name: str
email: str
message: str


@docs(
summary="Submit contact form",
responses={200: "Form submitted successfully"}
)
@post("/api/contact")
async def submit_contact(form: FromForm[ContactForm]):
return {"status": "received"}
```

### Adding description and summary

An endpoint description can be specified either using a `docstring`:
165 changes: 158 additions & 7 deletions blacksheep/docs/requests.md
@@ -177,6 +177,13 @@ kinds.

#### Reading a form request body

/// admonition | Improved in BlackSheep 2.6.0
type: info

Starting from BlackSheep 2.6.0, `request.form()` and `request.multipart()` use `SpooledTemporaryFile` for memory-efficient file handling. Small files (<1MB) are kept in memory, while larger files automatically spill to temporary disk files. The framework automatically cleans up resources at the end of each request.

///
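The spooling behavior described above can be illustrated with the standard library class it is built on; the 1 MB threshold here mirrors the documented default, and `_rolled` is a private attribute of `SpooledTemporaryFile`, used only to observe the spill:

```python
from tempfile import SpooledTemporaryFile

# Small writes stay in memory; once the payload exceeds max_size,
# the buffer transparently rolls over to a temporary file on disk.
with SpooledTemporaryFile(max_size=1024 * 1024) as buffer:
    buffer.write(b"small payload")
    in_memory = not buffer._rolled  # still an in-memory buffer

    buffer.write(b"x" * (2 * 1024 * 1024))  # exceeds the 1 MB threshold
    spilled_to_disk = buffer._rolled  # now backed by a real temp file
```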

=== "Using binders (recommended)"

```python
@@ -259,35 +266,179 @@ kinds.
# data is bytes
```

#### Reading files
#### Reading files and multipart/form-data

/// admonition | Significantly improved in BlackSheep 2.6.0
type: info

BlackSheep 2.6.0 introduces significant improvements for handling `multipart/form-data` with memory-efficient streaming and file handling:

- **Memory-efficient file handling**: files use `SpooledTemporaryFile`; small files (<1MB) stay in memory, while larger files automatically spill to temporary disk files
- **True streaming parsing**: New `Request.multipart_stream()` method for streaming multipart data without buffering the entire request body
- **Automatic resource cleanup**: The framework automatically calls `Request.dispose()` at the end of each request to clean up file resources
- **Better API**: `FileBuffer` class provides clean methods (`read()`, `seek()`, `close()`, `save_to()`) for uploaded files
- **Streaming parts**: `FormPart.stream()` method to stream part data in chunks
- **OpenAPI support**: `FromText` and `FromFiles` are now properly documented in OpenAPI

///

Files are read from the `multipart/form-data` payload.

=== "Using binders (recommended)"

```python
from blacksheep import FromFiles
from blacksheep import FromFiles, post


@post("/something")
@post("/upload")
async def post_files(files: FromFiles):
data = files.value
# files.value is a list of FormPart objects
for file_part in files.value:
# Access file metadata
file_name = file_part.file_name.decode() if file_part.file_name else "unknown"
content_type = file_part.content_type.decode() if file_part.content_type else None

# file_part.file is a FileBuffer instance with efficient memory handling
# Small files (<1MB) are kept in memory, larger files use temporary disk files
file_buffer = file_part.file

# Read file content
content = file_buffer.read()

# Or save directly to disk
await file_buffer.save_to(f"./uploads/{file_name}")
```

=== "Directly from the request"

```python
from blacksheep import post, Request


@post("/upload-files")
async def upload_files(request: Request):
files = await request.files()

for part in files:
# Access file metadata
file_name = part.file_name.decode() if part.file_name else "unknown"

            # part.data contains the entire file content as bytes
            file_bytes = part.data

# Or use the FileBuffer for more control
file_buffer = part.file
content = file_buffer.read()
```

=== "Memory-efficient streaming (2.6.0+)"

For handling large file uploads efficiently without loading the entire request body into memory:

```python
from blacksheep import post, Request, created


@post("/upload-large")
async def upload_large_files(request: Request):
# Stream multipart data without buffering entire request body
async for part in request.multipart_stream():
if part.file_name:
# This is a file upload
file_name = part.file_name.decode()

# Stream the file content in chunks
with open(f"./uploads/{file_name}", "wb") as f:
async for chunk in part.stream():
f.write(chunk)
else:
# This is a regular form field
field_name = part.name.decode() if part.name else ""
field_value = part.data.decode()
print(f"Field {field_name}: {field_value}")

return created()
```

=== "Mixed form with files and text (2.6.0+)"

Using `FromFiles` and `FromText` together in the same handler:

```python
from blacksheep import FromFiles, FromText, post


@post("/upload-with-description")
async def upload_with_metadata(
description: FromText,
files: FromFiles,
):
# description.value contains the text field value
text_content = description.value

# files.value contains the uploaded files
for file_part in files.value:
file_name = file_part.file_name.decode() if file_part.file_name else "unknown"

# Process the file
await file_part.file.save_to(f"./uploads/{file_name}")

return {"description": text_content, "files_count": len(files.value)}
```

##### Resource management and cleanup

BlackSheep automatically manages file resources. The framework calls `Request.dispose()` at the end of each request-response cycle to clean up temporary files. However, if you need manual control:

```python
from blacksheep import post, Request


@post("/manual-cleanup")
async def manual_file_handling(request: Request):
try:
files = await request.files()

for part in files:
# Process files
pass
finally:
# Manually clean up resources if needed
# (normally not required as framework does this automatically)
request.dispose()
```

##### FileBuffer API

The `FileBuffer` class wraps `SpooledTemporaryFile` and provides these methods:

- `read(size: int = -1) -> bytes`: Read file content
- `seek(offset: int, whence: int = 0) -> int`: Change file position
- `close() -> None`: Close the file
- `async save_to(file_path: str) -> None`: Asynchronously save file to disk (must be awaited)

```python
from blacksheep import FromFiles, post


@post("/process-file")
async def process_file(files: FromFiles):
for file_part in files.value:
file_buffer = file_part.file

# Read first 100 bytes
header = file_buffer.read(100)

# Go back to start
file_buffer.seek(0)

# Read entire content
full_content = file_buffer.read()

# Save to disk
await file_buffer.save_to("./output.bin")
```

#### Reading streams

Reading streams enables reading large-sized bodies using an asynchronous