Commit b0671aa

chore: prepare v0.4.2 (#1041)
1 parent: 39f83d4

File tree

8 files changed: +87 −33 lines


Cargo.lock

Lines changed: 29 additions & 29 deletions
Some generated files are not rendered by default.

Cargo.toml

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ members = [
 resolver = "2"

 [workspace.package]
-version = "0.4.1"
+version = "0.4.2"
 authors = ["LakeSail <[email protected]>"]
 edition = "2021"
 homepage = "https://lakesail.com"

docs/guide/formats/delta.md

Lines changed: 13 additions & 1 deletion
@@ -98,14 +98,26 @@ df.write.format("delta").mode("overwrite").option("overwriteSchema", "true").sav

 ### Time Travel

-You can use the time travel feature of Delta Lake to query historical versions of a Delta table.
+You can use the time travel feature to query historical versions of a Delta table.

 ```python
 df = spark.read.format("delta").option("versionAsOf", "0").load(path)
+df = spark.read.format("delta").option("timestampAsOf", "2025-01-02T03:04:05.678").load(path)
 ```

 Time travel is not available for Spark SQL in Sail yet, but we plan to support it soon.

+### Column Mapping
+
+You can write Delta tables with column mapping enabled. The supported column mapping modes are `name` and `id`. You must write to a new Delta table to enable column mapping.
+
+```python
+df.write.format("delta").option("columnMappingMode", "name").save(path)
+df.write.format("delta").option("columnMappingMode", "id").save(path)
+```
+
+Existing Delta tables with column mapping can be read as usual.
+
 ### More Features

 We will continue adding more examples for advanced Delta Lake features as they become available in Sail.
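For context on the `timestampAsOf` option added above: conceptually, it resolves to the latest table version committed at or before the given timestamp. A minimal pure-Python sketch of that lookup, using a hypothetical `commits` mapping (an illustration of the semantics only, not Sail's implementation):

```python
from datetime import datetime


def resolve_version_as_of(commits, ts):
    """Return the latest version committed at or before `ts`.

    `commits` maps version number -> commit datetime. This illustrates
    the semantics of `timestampAsOf`; it is not Sail's actual code.
    """
    eligible = [version for version, committed in commits.items() if committed <= ts]
    if not eligible:
        raise ValueError("no table version exists at or before the given timestamp")
    return max(eligible)


# Hypothetical commit history for a three-version table.
commits = {
    0: datetime(2025, 1, 1),
    1: datetime(2025, 1, 2),
    2: datetime(2025, 1, 3),
}
print(resolve_version_as_of(commits, datetime(2025, 1, 2, 12, 0)))  # -> 1
```

Note that a timestamp earlier than the first commit has no matching version, which is why the sketch raises an error in that case.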

docs/guide/formats/iceberg.md

Lines changed: 12 additions & 0 deletions
@@ -78,6 +78,18 @@ SELECT * FROM metrics WHERE year > 2024;

 :::

+### Time Travel
+
+You can use the time travel feature to query tags, branches, or historical versions of an Iceberg table.
+
+```python
+df = spark.read.format("iceberg").option("snapshotId", "123").load(path)
+df = spark.read.format("iceberg").option("timestampAsOf", "2025-01-02T03:04:05.678").load(path)
+df = spark.read.format("iceberg").option("branch", "main").load(path)
+```
+
+Time travel is not available for Spark SQL in Sail yet, but we plan to support it soon.
+
 ### More Features

 We will continue adding more examples for advanced Iceberg features as they become available in Sail.
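The three Iceberg read options above select a snapshot in different ways: directly by `snapshotId`, through a named `branch`, or by commit time with `timestampAsOf`. A minimal sketch of that selection logic, assuming hypothetical `snapshots` and `refs` mappings (an illustration of the option semantics, not Sail's actual code):

```python
def resolve_snapshot(options, snapshots, refs):
    """Pick an Iceberg snapshot ID from read options.

    Illustration only. `snapshots` maps snapshot_id -> commit time in
    epoch millis; `refs` maps branch or tag name -> snapshot_id.
    """
    if "snapshotId" in options:
        return int(options["snapshotId"])
    if "branch" in options:
        return refs[options["branch"]]
    if "tag" in options:
        return refs[options["tag"]]
    if "timestampAsOf" in options:
        ts = options["timestampAsOf"]
        eligible = {sid: t for sid, t in snapshots.items() if t <= ts}
        if not eligible:
            raise ValueError("no snapshot exists at or before the given timestamp")
        # Latest snapshot committed at or before the timestamp.
        return max(eligible, key=eligible.get)
    # Default: the current snapshot on the main branch.
    return refs["main"]


# Hypothetical table with three snapshots and two refs.
snapshots = {100: 10_000, 200: 20_000, 300: 30_000}
refs = {"main": 300, "v1-tag": 100}
print(resolve_snapshot({"timestampAsOf": 25_000}, snapshots, refs))  # -> 200
```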

docs/reference/changelog/index.md

Lines changed: 28 additions & 0 deletions
@@ -5,6 +5,34 @@ next: false

 # Changelog

+## 0.4.2
+
+_November 13, 2025_
+
+- Supported column mapping for Delta Lake ([#985](https://github.com/lakehq/sail/pull/985)).
+- Supported time travel for Iceberg ([#1039](https://github.com/lakehq/sail/pull/1039)).
+- Supported Unity Catalog ([#1005](https://github.com/lakehq/sail/pull/1005)).
+- Improved Iceberg integration ([#1006](https://github.com/lakehq/sail/pull/1006), [#1009](https://github.com/lakehq/sail/pull/1009), and [#1042](https://github.com/lakehq/sail/pull/1042)).
+- Added the `luhn_check` SQL function ([#909](https://github.com/lakehq/sail/pull/909)).
+- Improved the following SQL functions ([#909](https://github.com/lakehq/sail/pull/909) and [#1024](https://github.com/lakehq/sail/pull/1024)):
+  - `bit_count`
+  - `bit_get`
+  - `getbit`
+  - `crc32`
+  - `sha`
+  - `sha1`
+  - `expm1`
+  - `pmod`
+  - `width_bucket`
+  - `bitmap_count`
+  - `to_date`
+- Added the `try_avg` SQL aggregate function ([#1012](https://github.com/lakehq/sail/pull/1012)).
+- Supported the `try_sum` and `try_avg` SQL aggregate functions in window expressions ([#1040](https://github.com/lakehq/sail/pull/1040)).
+
+### Contributors
+
+Huge thanks to [@davidlghellin](https://github.com/davidlghellin) for the contribution!
+
 ## 0.4.1

 _November 2, 2025_
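The `luhn_check` function introduced in #909 computes the standard Luhn checksum, commonly used to validate identifiers such as credit card numbers. A pure-Python sketch of the algorithm for reference (Sail's implementation is in Rust; this only illustrates the semantics):

```python
def luhn_check(number: str) -> bool:
    """Return True if `number` is a non-empty digit string passing the Luhn checksum.

    Pure-Python illustration of what the `luhn_check` SQL function
    computes; not Sail's actual implementation.
    """
    digits = [int(c) for c in number if c.isdigit()]
    if not digits or len(digits) != len(number):
        return False
    total = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0


print(luhn_check("79927398713"))  # classic Luhn test number -> True
```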

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [project]
 name = "pysail"
-version = "0.4.1"
+version = "0.4.2"
 description = "Sail Python library"
 authors = [
     { name = "LakeSail", email = "[email protected]" },

python/pysail/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -2,6 +2,6 @@
 # We have a CI step that verifies this.
 # We cannot use the Hatch dynamic version feature since the project
 # may be built with Maturin outside of Hatch.
-__version__: str = "0.4.1"
+__version__: str = "0.4.2"

 __all__ = ["__version__"]

python/pysail/tests/spark/iceberg/test_iceberg_time_travel.py

Lines changed: 2 additions & 0 deletions
@@ -1,3 +1,4 @@
+import platform
 import time
 from datetime import datetime, timezone

@@ -44,6 +45,7 @@ def test_iceberg_time_travel_by_snapshot_id(spark, tmp_path):
     catalog.drop_table(identifier)


+@pytest.mark.skipif(platform.system() == "Windows", reason="not working on Windows")
 def test_iceberg_time_travel_by_timestamp(spark, tmp_path):
     table_path = tmp_path / "tt_by_timestamp"
     table_path.mkdir(parents=True, exist_ok=True)
