
Commit

Python - Update CHANGELOG and bump to 0.9.3 for release
n1t0 committed Oct 26, 2020
1 parent 466f530 commit 2364d37
Showing 5 changed files with 22 additions and 6 deletions.
20 changes: 18 additions & 2 deletions bindings/python/CHANGELOG.md
@@ -4,15 +4,26 @@
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.9.3]

### Fixed
- [#470]: Fix a hanging error when training with a custom component
- [#476]: TemplateProcessing serialization is now deterministic
- [#481]: Fix SentencePieceBPETokenizer.from_files
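The determinism fix in [#476] comes down to a general principle: serializing a map by insertion order produces different bytes for logically equal objects, while sorting the keys yields one canonical form. A minimal plain-Python sketch of that idea (illustrative only, not the library's actual Rust serializer):

```python
import json

# Two logically equal special-token maps, built in different insertion orders.
a = {"[CLS]": 101, "[SEP]": 102}
b = {"[SEP]": 102, "[CLS]": 101}

# Plain dumps follows insertion order, so equal maps can serialize differently:
naive_differs = json.dumps(a) != json.dumps(b)

# sort_keys=True yields a single canonical byte string for both:
canonical_a = json.dumps(a, sort_keys=True)
canonical_b = json.dumps(b, sort_keys=True)
```

A deterministic serialization matters in practice because tokenizer files are often diffed and content-hashed.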

### Added
- [#477]: UnicodeScripts PreTokenizer to avoid merges between various scripts
- [#480]: Unigram now accepts an `initial_alphabet` and handles `special_tokens` correctly
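The idea behind the UnicodeScripts PreTokenizer in [#477] — never let a merge span characters from different writing systems — can be sketched in plain Python. This is a rough approximation that buckets characters by the first word of their `unicodedata` name; the real pre-tokenizer uses full Unicode script data:

```python
import unicodedata

def script_of(ch):
    # Crude script bucket from the Unicode character name, e.g.
    # "LATIN SMALL LETTER A" -> "LATIN", "CJK UNIFIED IDEOGRAPH-4E16" -> "CJK".
    try:
        return unicodedata.name(ch).split()[0]
    except ValueError:
        return "UNKNOWN"

def split_by_script(text):
    # Start a new piece whenever the script bucket changes.
    pieces, current, prev = [], "", None
    for ch in text:
        s = script_of(ch)
        if prev is not None and s != prev:
            pieces.append(current)
            current = ""
        current += ch
        prev = s
    if current:
        pieces.append(current)
    return pieces
```

With this split in place, a BPE/Unigram model trained on the pieces can never learn a token that mixes, say, Latin and CJK characters.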

## [0.9.2]

### Fixed
-- [#464] Fix a problem with RobertaProcessing being deserialized as BertProcessing
+- [#464]: Fix a problem with RobertaProcessing being deserialized as BertProcessing

## [0.9.1]

### Fixed
-- [#459] Fix a problem with deserialization
+- [#459]: Fix a problem with deserialization

## [0.9.0]

@@ -248,6 +259,11 @@
delimiter (Works like `.split(delimiter)`)
- Fix a bug with the IDs associated with added tokens.
- Fix a bug that was causing crashes in Python 3.5

[#481]: https://github.com/huggingface/tokenizers/pull/481
[#480]: https://github.com/huggingface/tokenizers/pull/480
[#477]: https://github.com/huggingface/tokenizers/pull/477
[#476]: https://github.com/huggingface/tokenizers/pull/476
[#470]: https://github.com/huggingface/tokenizers/pull/470
[#464]: https://github.com/huggingface/tokenizers/pull/464
[#459]: https://github.com/huggingface/tokenizers/pull/459
[#420]: https://github.com/huggingface/tokenizers/pull/420
2 changes: 1 addition & 1 deletion bindings/python/Cargo.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion bindings/python/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "tokenizers-python"
-version = "0.9.2"
+version = "0.9.3"
authors = ["Anthony MOI <[email protected]>"]
edition = "2018"

2 changes: 1 addition & 1 deletion bindings/python/py_src/tokenizers/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "0.9.2"
+__version__ = "0.9.3"

from typing import Tuple, Union, Tuple, List
from enum import Enum
2 changes: 1 addition & 1 deletion bindings/python/setup.py
@@ -6,7 +6,7 @@

setup(
name="tokenizers",
-version="0.9.2",
+version="0.9.3",
description="Fast and Customizable Tokenizers",
long_description=open("README.md", "r", encoding="utf-8").read(),
long_description_content_type="text/markdown",
