Add BindCombinator and PureCombinator #6

Open · wants to merge 2 commits into `master`
13 changes: 13 additions & 0 deletions README.md
@@ -119,6 +119,19 @@ There are several kinds of combinators included in `better-parse`:

* `foo asJust bar` can be used to map a parser to some constant value.

* `bind`, `useBind`

The bind combinator resembles the map combinator, but where `map` transforms a parser's result into a plain value, `bind` uses the result to produce a *new parser*, which is then run on the remaining input. This lets a later parser depend on what an earlier one matched.

```kotlin
val aOrB = Token("aOrB", "[ab]")
val ba = Token("ba", "ba")
val afterAOrB = aOrB bind { if (it.text == "a") ba else aOrB }
// Parser<TokenMatch>: parses "a" or "b", then runs `ba` after an "a" and `aOrB` after a "b"
```

* `someParser useBind { ... }` is a `bind` equivalent that takes a function with receiver instead, so the parsed value is available as `this` (e.g. `it.text` becomes just `text`).
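
To make the control flow concrete, here is a library-independent sketch of these semantics (a toy parser over a list of strings, *not* the better-parse API): the transform inspects the first parser's result and picks the parser for the remaining input.

```kotlin
// Toy model of bind/pure (NOT the better-parse API): a parser returns
// the parsed value plus the remaining input, or null on failure.
fun interface ToyParser<T> { fun parse(input: List<String>): Pair<T, List<String>>? }

fun token(s: String) = ToyParser { input ->
    if (input.firstOrNull() == s) s to input.drop(1) else null
}

// pure succeeds with a constant value and consumes nothing.
fun <T> pure(value: T) = ToyParser<T> { input -> value to input }

// bind runs the receiver, builds a new parser from its result,
// and runs that parser on the remaining input.
infix fun <A, T> ToyParser<A>.bind(transform: (A) -> ToyParser<T>) = ToyParser<T> { input ->
    parse(input)?.let { (value, rest) -> transform(value).parse(rest) }
}

// The second parser depends on what the first one matched:
val contextSensitive = token("a") bind { first -> token(first + "b") }

fun main() {
    println(contextSensitive.parse(listOf("a", "ab")))  // (ab, [])
    println(contextSensitive.parse(listOf("a", "c")))   // null: the derived parser fails
    // Binding into pure leaves a parser's behavior unchanged:
    println((token("a") bind { pure(it) }).parse(listOf("a")))  // (a, [])
}
```

This is the ability `map` lacks: a mapped parser always consumes the same input, while a bound parser chooses what to parse next based on what it has already seen.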

* `optional(...)`

Given a `Parser<T>`, tries to parse the sequence with it, but returns a `null` result if the parser failed, and thus never fails itself:
@@ -0,0 +1,32 @@
package com.github.h0tk3y.betterParse.combinators

import com.github.h0tk3y.betterParse.lexer.TokenMatch
import com.github.h0tk3y.betterParse.parser.ErrorResult
import com.github.h0tk3y.betterParse.parser.ParseResult
import com.github.h0tk3y.betterParse.parser.Parsed
import com.github.h0tk3y.betterParse.parser.Parser

/** Parses the sequence with [innerParser] and, if that succeeds, applies [transform] to the [Parsed]
 * value to obtain a new parser, which is then run on the remaining input.
 * Otherwise returns the [ErrorResult] of the [innerParser].
 * @sample BindTest */
class BindCombinator<T, R>(
    val innerParser: Parser<T>,
    val transform: (T) -> Parser<R>
) : Parser<R> {
    override fun tryParse(tokens: Sequence<TokenMatch>): ParseResult<R> {
        val innerResult = innerParser.tryParse(tokens)
        return when (innerResult) {
            is ErrorResult -> innerResult
            is Parsed -> transform(innerResult.value).tryParse(innerResult.remainder)
        }
    }
}

/** Applies the [transform] function to the successful result of the receiver parser and then runs
 * the returned parser on the remaining input. See [BindCombinator]. */
infix fun <A, T> Parser<A>.bind(transform: (A) -> Parser<T>): Parser<T> = BindCombinator(this, transform)

/** Like [bind], but [transform] is a function with receiver, so the parsed value is available
 * as `this`. See [BindCombinator]. */
infix fun <A, T> Parser<A>.useBind(transform: A.() -> Parser<T>): Parser<T> = BindCombinator(this, transform)
@@ -0,0 +1,14 @@
package com.github.h0tk3y.betterParse.combinators

import com.github.h0tk3y.betterParse.lexer.TokenMatch
import com.github.h0tk3y.betterParse.parser.ParseResult
import com.github.h0tk3y.betterParse.parser.Parsed
import com.github.h0tk3y.betterParse.parser.Parser

/** Returns [Parsed] of [value] without consuming any input */
class PureCombinator<T>(val value: T) : Parser<T> {
    override fun tryParse(tokens: Sequence<TokenMatch>): ParseResult<T> = Parsed(value, tokens)
}

/** Returns [Parsed] of [value] without consuming any input */
fun <T> pure(value: T): Parser<T> = PureCombinator(value)
52 changes: 52 additions & 0 deletions src/test/kotlin/BindTest.kt
@@ -0,0 +1,52 @@
import com.github.h0tk3y.betterParse.combinators.*
import com.github.h0tk3y.betterParse.lexer.DefaultTokenizer
import com.github.h0tk3y.betterParse.lexer.Token
import com.github.h0tk3y.betterParse.parser.toParsedOrThrow
import org.junit.Assert.assertEquals
import org.junit.Test

class BindTest {
    val a_or_b = Token("a_or_b", "[ab]")
    val b_and_a = Token("b_and_a", "ba")
    val c = Token("c", "c")
    val lexer = DefaultTokenizer(listOf(b_and_a, a_or_b, c))

    @Test fun testSuccessfulBind() {
        val tokens = lexer.tokenize("aba")
        val result = a_or_b.bind {
            when (it.text) {
                "a" -> b_and_a
                "b" -> a_or_b
                else -> c
            }
        }.tryParse(tokens)
        assertEquals("ba", result.toParsedOrThrow().value.text)
    }

    @Test fun testSuccessfulBindUse() {
        val tokens = lexer.tokenize("baccba")
        val result = (b_and_a useBind {
            when (text) {
                "ba" -> c
                "a", "b" -> b_and_a
                else -> a_or_b
            }
        }).tryParse(tokens)
        assertEquals("c", result.toParsedOrThrow().value.text)
    }

    @Test fun testBindPure() {
        val tokens = lexer.tokenize("ba")
        val result = b_and_a.tryParse(tokens)
        val resultBindPure = b_and_a.bind { pure(it) }.tryParse(tokens)
        assertEquals("ba", result.toParsedOrThrow().value.text)
        assertEquals("ba", resultBindPure.toParsedOrThrow().value.text)
    }

    @Test fun testError() {
        val tokens = lexer.tokenize("bbbb")
        val resultWithoutBind = b_and_a.tryParse(tokens)
        val resultWithBind = b_and_a.bind { pure(it.text) }.tryParse(tokens)
        assertEquals(resultWithoutBind, resultWithBind)
    }
}
26 changes: 26 additions & 0 deletions src/test/kotlin/PureTest.kt
@@ -0,0 +1,26 @@
import com.github.h0tk3y.betterParse.combinators.pure
import com.github.h0tk3y.betterParse.lexer.DefaultTokenizer
import com.github.h0tk3y.betterParse.lexer.Token
import com.github.h0tk3y.betterParse.parser.UnparsedRemainder
import com.github.h0tk3y.betterParse.parser.toParsedOrThrow
import com.github.h0tk3y.betterParse.parser.tryParseToEnd
import org.junit.Assert.assertEquals
import org.junit.Assert.assertTrue
import org.junit.Test

class PureTest {
    val a = Token("a", "a")
    val lexer = DefaultTokenizer(listOf(a))

    @Test fun testSuccessfulPure() {
        val tokens = lexer.tokenize("a")
        val result = pure(42).tryParse(tokens)
        assertEquals(42, result.toParsedOrThrow().value)
    }

    @Test fun testNotConsumesInputPure() {
        val tokens = lexer.tokenize("a")
        val result = pure(42).tryParseToEnd(tokens)
        assertTrue(result is UnparsedRemainder)
    }
}
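
A closing note on how the two new combinators compose: `map` can be derived from `bind` and `pure` (transform the value, then succeed without consuming more input), which is the sense in which `bind` generalizes `map`. A library-independent sketch of that derivation, using a toy function-typed parser rather than the better-parse API:

```kotlin
// Toy function-typed parser (NOT the better-parse API):
// it returns the parsed value plus the remaining input, or null on failure.
typealias ToyParser<T> = (List<String>) -> Pair<T, List<String>>?

fun <T> pure(value: T): ToyParser<T> = { input -> value to input }  // consumes nothing

fun <A, T> bind(p: ToyParser<A>, transform: (A) -> ToyParser<T>): ToyParser<T> = { input ->
    p(input)?.let { (value, rest) -> transform(value)(rest) }
}

// map falls out of bind + pure: lift the transformed value back into a parser.
fun <A, T> map(p: ToyParser<A>, f: (A) -> T): ToyParser<T> = bind(p) { pure(f(it)) }

// A parser for any single item, and a mapped parser built from it:
val anyItem: ToyParser<String> = { input -> input.firstOrNull()?.let { it to input.drop(1) } }
val itemLength = map(anyItem) { it.length }

fun main() {
    println(itemLength(listOf("hello", "world")))  // (5, [world])
    println(itemLength(emptyList()))               // null: the inner parser failed
}
```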