Merge pull request scala#8596 from tegonal/documentation
bundle of documentation improvements
OlivierBlanvillain authored Mar 30, 2020
2 parents 2dfb359 + abc35b7 commit 0d03af4
Showing 15 changed files with 72 additions and 61 deletions.
2 changes: 1 addition & 1 deletion docs/docs/reference/contextual/context-functions.md
@@ -112,7 +112,7 @@ With that setup, the table construction code above compiles and expands to:
```
### Example: Postconditions

As a larger example, here is a way to define constructs for checking arbitrary postconditions using an extension method `ensuring` so that the checked result can be referred to simply by `result`. The example combines opaque aliases, context function types, and extension methods to provide a zero-overhead abstraction.
As a larger example, here is a way to define constructs for checking arbitrary postconditions using an extension method `ensuring` so that the checked result can be referred to simply by `result`. The example combines opaque type aliases, context function types, and extension methods to provide a zero-overhead abstraction.

```scala
object PostConditions {
```
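The `PostConditions` object is truncated in this diff view. Based on the surrounding description, a sketch of the full construct might look as follows (the extension-method syntax shown is the current Scala 3 form, not the Dotty syntax of this commit's era):

```scala
object PostConditions {
  opaque type WrappedResult[T] = T

  def result[T](using r: WrappedResult[T]): T = r

  extension [T](x: T)
    def ensuring(condition: WrappedResult[T] ?=> Boolean): T = {
      // Inside this object, WrappedResult[T] is a transparent alias of T,
      // so `x` can be passed as the context value directly.
      assert(condition(using x))
      x
    }
}

import PostConditions.{ensuring, result}

// `result` refers to the checked value; the opaque alias adds no runtime wrapper.
val s = List(1, 2, 3).sum.ensuring(result == 6)
```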
2 changes: 1 addition & 1 deletion docs/docs/reference/dropped-features/package-objects.md
@@ -31,7 +31,7 @@ The compiler generates synthetic objects that wrap toplevel definitions falling

- all pattern, value, method, and type definitions,
- implicit classes and objects,
- companion objects of opaque types.
- companion objects of opaque type aliases.

If a source file `src.scala` contains such toplevel definitions, they will be put in a synthetic object named `src$package`. The wrapping is transparent, however. The definitions in `src` can still be accessed as members of the enclosing package.

Expand Down
47 changes: 24 additions & 23 deletions docs/docs/reference/enums/desugarEnums.md
@@ -19,33 +19,34 @@ some terminology and notational conventions:

- _Class cases_ are those cases that are parameterized, either with a type parameter section `[...]` or with one or more (possibly empty) parameter sections `(...)`.
- _Simple cases_ are cases of a non-generic enum that have neither parameters nor an extends clause or body. That is, they consist of a name only.
- _Value cases_ are all cases that do not have a parameter section but that do have a (possibly generated) extends clause and/or a body.
- _Value cases_ are all cases that do not have a parameter section but that do have a (possibly generated) `extends` clause and/or a body.

Simple cases and value cases are collectively called _singleton cases_.

The desugaring rules imply that class cases are mapped to case classes, and singleton cases are mapped to `val` definitions.

There are nine desugaring rules. Rule (1) desugars enum definitions. Rules
(2) and (3) desugar simple cases. Rules (4) to (6) define extends clauses for cases that
are missing them. Rules (7) to (9) define how such cases with extends clauses
map into case classes or vals.

1. An `enum` definition
```scala
enum E ... { <defs> <cases> }
```
expands to a `sealed` `abstract` class that extends the `scala.Enum` trait and
an associated companion object that contains the defined cases, expanded according
to rules (2 - 8). The enum trait starts with a compiler-generated import that imports
the names `<caseIds>` of all cases so that they can be used without prefix in the trait.
```scala
sealed abstract class E ... extends <parents> with scala.Enum {
import E.{ <caseIds> }
(2) and (3) desugar simple cases. Rules (4) to (6) define `extends` clauses for cases that
are missing them. Rules (7) to (9) define how such cases with `extends` clauses
map into `case class`es or `val`s.

1. An `enum` definition
```scala
enum E ... { <defs> <cases> }
```
expands to a `sealed abstract` class that extends the `scala.Enum` trait and
an associated companion object that contains the defined cases, expanded according
to rules (2 - 8). The enum trait starts with a compiler-generated import that imports
the names `<caseIds>` of all cases so that they can be used without prefix in the trait.
```scala
sealed abstract class E ... extends <parents> with scala.Enum {
import E.{ <caseIds> }
<defs>
}
object E { <cases> }
```
2. A simple case consisting of a comma-separated list of enum names
}
object E { <cases> }
```

2. A simple case consisting of a comma-separated list of enum names
```scala
case C_1, ..., C_n
```
@@ -69,7 +70,7 @@

4. If `E` is an enum with type parameters
```scala
V1 T1 > L1 <: U1 , ... , Vn Tn >: Ln <: Un (n > 0)
V1 T1 >: L1 <: U1 , ... , Vn Tn >: Ln <: Un (n > 0)
```
where each of the variances `Vi` is either `'+'` or `'-'`, then a simple case
```scala
@@ -81,7 +82,7 @@
```
where `Bi` is `Li` if `Vi = '+'` and `Ui` if `Vi = '-'`. This result is then further
rewritten with rule (8). Simple cases of enums with non-variant type
parameters are not permitted.
parameters are not permitted (however, value cases with an explicit `extends` clause are).

5. A class case without an extends clause
```scala
@@ -201,4 +202,4 @@ Cases such as `case C` expand to a `@static val` as opposed to a `val`. This all
explicitly declared in it.

- If an enum case has an extends clause, the enum class must be one of the
classes that's extended.
classes that's extended.
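To make rules (1) and (2) concrete, a minimal sketch (the expansion shown in comments is simplified; in current Scala 3 the parent trait is `scala.reflect.Enum`, while at the time of this commit it was `scala.Enum`, and the case-construction helper is compiler-internal):

```scala
enum Color {
  case Red, Green, Blue   // simple cases, rule (2): desugared to vals in object Color
}

// Roughly, the compiler expands this to (simplified):
//   sealed abstract class Color extends scala.reflect.Enum { ... }
//   object Color {
//     val Red: Color = ...; val Green: Color = ...; val Blue: Color = ...
//   }

val green: Color = Color.Green
```

Singleton cases thus cost one `val` each, with no extra class per case.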
16 changes: 10 additions & 6 deletions docs/docs/reference/new-types/match-types.md
@@ -39,7 +39,7 @@ type LeafElem[X] = X match {
```
Recursive match type definitions can also be given an upper bound, like this:
```scala
type Concat[+Xs <: Tuple, +Ys <: Tuple] <: Tuple = Xs match {
type Concat[Xs <: Tuple, +Ys <: Tuple] <: Tuple = Xs match {
case Unit => Ys
case x *: xs => x *: Concat[xs, Ys]
}
@@ -124,17 +124,21 @@ The third rule states that a match type conforms to its upper bound

Within a match type `Match(S, Cs) <: B`, all occurrences of type variables count as covariant. By the nature of the cases `Ci` this means that occurrences in pattern position are contravariant (since patterns are represented as function type arguments).

## Typing Rules for Match Expressions
<!-- TODO revise this section, at least `S` has to be invariant according to the current implementation -->

## Typing Rules for Match Expressions (Work in Progress)

<!-- TODO document the final solution and remove (Work in Progress) -->

Typing rules for match expressions are tricky. First, they need some new form of GADT matching for value parameters.
Second, they have to account for the difference between sequential match on the term level and parallel match on the type level. As a running example consider:
```scala
type M[+X] = X match {
type M[X] = X match {
case A => 1
case B => 2
}
```
We'd like to be able to typecheck
We would like to be able to typecheck
```scala
def m[X](x: X): M[X] = x match {
case _: A => 1 // type error
@@ -144,14 +148,14 @@ def m[X](x: X): M[X] = x match {
Unfortunately, this goes nowhere. Let's try the first case. We have: `x.type <: A` and `x.type <: X`. This tells
us nothing useful about `X`, so we cannot reduce `M` in order to show that the right hand side of the case is valid.

The following variant is more promising:
The following variant is more promising but does not compile either:
```scala
def m(x: Any): M[x.type] = x match {
case _: A => 1
case _: B => 2
}
```
To make this work, we'd need a new form of GADT checking: If the scrutinee is a term variable `s`, we can make use of
To make this work, we would need a new form of GADT checking: If the scrutinee is a term variable `s`, we can make use of
the fact that `s.type` must conform to the pattern's type and derive a GADT constraint from that. For the first case above,
this would be the constraint `x.type <: A`. The new aspect here is that we need GADT constraints over singleton types where
before we just had constraints over type parameters.
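While match expressions over match types remain tricky, reduction on a concrete scrutinee type is straightforward. A runnable illustration, patterned on `LeafElem` above (the alias name `Elem` is hypothetical):

```scala
type Elem[X] = X match {
  case String   => Char
  case Array[t] => t
}

// When the scrutinee is a concrete type, the match type reduces statically:
val c: Elem[String] = 'a'      // Elem[String] reduces to Char
val n: Elem[Array[Int]] = 42   // Elem[Array[Int]] reduces to Int
```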
2 changes: 1 addition & 1 deletion docs/docs/reference/other-new-features/control-syntax.md
@@ -39,4 +39,4 @@ The rules in detail are:
### Rewrites

The Dotty compiler can rewrite source code from old syntax and new syntax and back.
When invoked with options `-rewrite -new-syntax` it will rewrite from old to new syntax, dropping parentheses and braces in conditions and enumerators. When invoked with with options `-rewrite -old-syntax` it will rewrite in the reverse direction, inserting parentheses and braces as needed.
When invoked with options `-rewrite -new-syntax` it will rewrite from old to new syntax, dropping parentheses and braces in conditions and enumerators. When invoked with options `-rewrite -old-syntax` it will rewrite in the reverse direction, inserting parentheses and braces as needed.
6 changes: 3 additions & 3 deletions docs/docs/reference/other-new-features/explicit-nulls.md
@@ -17,13 +17,13 @@ Instead, to mark a type as nullable we use a [type union](https://dotty.epfl.ch/
val x: String|Null = null // ok
```

Explicit nulls are enabled via a `-Yexplicit-nulls` flag, so they're an opt-in feature.
Explicit nulls are enabled via a `-Yexplicit-nulls` flag.

Read on for details.

## New Type Hierarchy

When explicit nulls are enabled, the type hierarchy changes so that `Null` is subtype only of
When explicit nulls are enabled, the type hierarchy changes so that `Null` is only a subtype of
`Any`, as opposed to every reference type.

This is the new type hierarchy:
@@ -33,7 +33,7 @@ After erasure, `Null` remains a subtype of all reference types (as forced by the

## Unsoundness

The new type system is unsound with respect to `null`. This means there are still instances where an expressions has a non-nullable type like `String`, but its value is `null`.
The new type system is unsound with respect to `null`. This means there are still instances where an expression has a non-nullable type like `String`, but its value is actually `null`.

The unsoundness happens because uninitialized fields in a class start out as `null`:
```scala
```
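The truncated snippet presumably continues along these lines (this mirrors the uninitialized-field example from the explicit-nulls documentation; it compiles even without the `-Yexplicit-nulls` flag):

```scala
class C {
  val f: String = foo(f)           // `f` is not yet initialized here, so `foo` receives null
  def foo(f2: String): String = f2
}

val c = new C()
// c.f has declared type String, but its value is null: the type system is unsound here.
```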
5 changes: 3 additions & 2 deletions docs/docs/reference/other-new-features/indentation.md
@@ -4,9 +4,10 @@ title: Optional Braces
---

As an experimental feature, Scala 3 enforces some rules on indentation and allows
some occurrences of braces `{...}` to be optional.
some occurrences of braces `{...}` to be optional.
It can be turned off with the compiler flag `-noindent`.

- First, some badly indented programs are ruled out, which means they are flagged with warnings.
- First, some badly indented programs are flagged with warnings.
- Second, some occurrences of braces `{...}` are made optional. Generally, the rule
is that adding a pair of optional braces will not change the meaning of a well-indented program.

12 changes: 6 additions & 6 deletions docs/docs/reference/other-new-features/opaques-details.md
@@ -20,22 +20,22 @@ The general form of a (monomorphic) opaque type alias is
```scala
opaque type T >: L <: U = R
```
where the lower bound `L` and the upper bound `U` may be missing, in which case they are assumed to be `scala.Nothing` and `scala.Any`, respectively. If bounds are given, it is checked that the right hand side `R` conforms to them, i.e. `L <: R` and `R <: U`. F-bounds are not supported for opaque types: `T` is not allowed to appear in `L` or `U`.
where the lower bound `L` and the upper bound `U` may be missing, in which case they are assumed to be `scala.Nothing` and `scala.Any`, respectively. If bounds are given, it is checked that the right hand side `R` conforms to them, i.e. `L <: R` and `R <: U`. F-bounds are not supported for opaque type aliases: `T` is not allowed to appear in `L` or `U`.

Inside the scope of the alias definition, the alias is transparent: `T` is treated
as a normal alias of `R`. Outside its scope, the alias is treated as the abstract type
```scala
type T >: L <: U
```
A special case arises if the opaque type is defined in an object. Example:
A special case arises if the opaque type alias is defined in an object. Example:
```
object o {
opaque type T = R
}
```
In this case we have inside the object (also for non-opaque types) that `o.T` is equal to
`T` or its expanded form `o.this.T`. Equality is understood here as mutual subtyping, i.e.
`o.T <: o.this.T` and `o.this.T <: T`. Furthermore, we have by the rules of opaque types
`o.T <: o.this.T` and `o.this.T <: T`. Furthermore, we have by the rules of opaque type aliases
that `o.this.T` equals `R`. The two equalities compose. That is, inside `o`, it is
also known that `o.T` is equal to `R`. This means the following code type-checks:
```scala
@@ -48,7 +48,7 @@ def id(x: o.T): o.T = x

### Toplevel Opaque Types

An opaque type on the toplevel is transparent in all other toplevel definitions in the sourcefile where it appears, but is opaque in nested
An opaque type alias on the toplevel is transparent in all other toplevel definitions in the sourcefile where it appears, but is opaque in nested
objects and classes and in all other source files. Example:
```scala
// in test1.scala
@@ -69,10 +69,10 @@ object test1$package {
val x: A = "abc"
}
object obj {
val y: A = "abc" // error: cannot assign "abc" to opaque type A
val y: A = "abc" // error: cannot assign "abc" to opaque type alias A
}
```
The opaque type `A` is transparent in its scope, which includes the definition of `x`, but not the definitions of `obj` and `y`.
The opaque type alias `A` is transparent in its scope, which includes the definition of `x`, but not the definitions of `obj` and `y`.
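The scoping rules discussed above — transparent inside the defining scope, abstract outside it — can be illustrated in one runnable sketch (`make` and `value` are hypothetical helper names):

```scala
object o {
  opaque type T = Int
  def make(x: Int): T = x     // inside `o`, T is a transparent alias of Int
  def value(t: T): Int = t
}

def id(x: o.T): o.T = x       // outside `o`, T is abstract: only its bounds are known

val t: o.T = o.make(3)
// val bad: Int = t           // error outside `o`: o.T is not known to be Int
```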


### Relationship to SIP 35
2 changes: 1 addition & 1 deletion docs/docs/reference/other-new-features/open-classes.md
@@ -13,7 +13,7 @@ open class Writer[T] {
/** Sends to stdout, can be overridden */
def send(x: T) = println(x)

/** Send all arguments using `send` */
/** Sends all arguments using `send` */
def sendAll(xs: T*) = xs.foreach(send)
}

@@ -43,14 +43,14 @@ parameter `pi`. Then `f` will conform to the function type `ProductN[T1, ..., Tn
A type `Ti` fits a parameter `pi` if one of the following two cases is `true`:

* `pi` comes without a type, i.e. it is a simple identifier or `_`.
* `pi` is of the form `x: Ui` or `_: Ui` and `Ti` conforms to `Ui`.
* `pi` is of the form `x: Ui` or `_: Ui` and `Ti <: Ui`.

Auto-tupling composes with eta-expansion. That is an n-ary function generated by eta-expansion
can in turn be adapted to the expected type with auto-tupling.

#### Term addaptation
#### Term adaptation

If the a function
If the function
```scala
(p1: T1, ..., pn: Tn) => e
```
@@ -73,7 +73,7 @@ Translation of such tuples would use the `apply` method on the tuple to access

### Migration

Code like this could not be written before, hence the new notation would not be ambigouous after adoption.
Code like this could not be written before, hence the new notation would not be ambiguous after adoption.

Though it is possible that someone has written an implicit conversion from `(T1, ..., Tn) => R` to `TupleN[T1, ..., Tn] => R`
for some `n`. This change could be detected and fixed by `Scalafix`. Furthermore, such a conversion would probably
@@ -3,8 +3,8 @@ layout: doc-page
title: threadUnsafe annotation
---

A new annotation `@threadUnsafe` can be used on a field which defines a lazy
val. When this annotation is used, the initialization of the lazy val will use a
A new annotation `@threadUnsafe` can be used on a field which defines a `lazy
val`. When this annotation is used, the initialization of the lazy val will use a
faster mechanism which is not thread-safe.

### Example
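The example section is truncated in this view; it presumably shows something along these lines (`@threadUnsafe` lives in `scala.annotation`):

```scala
import scala.annotation.threadUnsafe

class Hello {
  // Initialized with the faster, non-thread-safe lazy-val scheme.
  @threadUnsafe lazy val x: Int = 42
}

val h = new Hello
```

The annotation only changes the initialization mechanism, not the observed value, so it is safe whenever the lazy val is never raced by multiple threads.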
17 changes: 9 additions & 8 deletions docs/docs/reference/other-new-features/tupled-function.md
@@ -26,18 +26,19 @@ sealed trait TupledFunction[F, G] {
The compiler will synthesize an instance of `TupledFunction[F, G]` if:

* `F` is a function type of arity `N`
* `G` is a function with a single tuple argument of size `N` and it's types are equal to the arguments of `F`
* `G` is a function with a single tuple argument of size `N` and its types are equal to the arguments of `F`
* The return type of `F` is equal to the return type of `G`
* `F` and `G` are the same kind of function (both are `(...) => R` or both are `(...) ?=> R`)
* `F` and `G` are the same sort of function (both are `(...) => R` or both are `(...) ?=> R`)
* If only one of `F` or `G` is instantiated, the second one is inferred.

Examples
--------
`TupledFunction` can be used to generalize the `Function1.tupled`, ... `Function22.tupled` methods to functions of any arities ([full example](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-tupled.scala))
`TupledFunction` can be used to generalize the `Function1.tupled`, ... `Function22.tupled` methods to functions of any arities.
The following defines `tupled` as an [extension method](../contextual/extension-methods.html) ([full example](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-tupled.scala)).

```scala
/** Creates a tupled version of this function: instead of N arguments,
* it accepts a single [[scala.Tuple]] argument.
* it accepts a single [[scala.Tuple]] with N elements as argument.
*
* @tparam F the function type
* @tparam Args the tuple type with the same types as the function arguments of F
@@ -46,11 +47,11 @@
def [F, Args <: Tuple, R](f: F).tupled(using tf: TupledFunction[F, Args => R]): Args => R = tf.tupled(f)
```

`TupledFunction` can be used to generalize the `Function.untupled` methods to functions of any arities ([full example](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-untupled.scala))
`TupledFunction` can be used to generalize `Function.untupled` to functions of any arity ([full example](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-untupled.scala)).

```scala
/** Creates an untupled version of this function: instead of single [[scala.Tuple]] argument,
* it accepts a N arguments.
/** Creates an untupled version of this function: instead of a single argument of type [[scala.Tuple]] with N elements,
* it accepts N arguments.
*
 * This is a generalization of [[scala.Function.untupled]] that works on functions of any arity
*
Expand All @@ -64,7 +65,7 @@ def [F, Args <: Tuple, R](f: Args => R).untupled(using tf: TupledFunction[F, Arg
`TupledFunction` can also be used to generalize the [`Tuple1.compose`](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-compose.scala) and [`Tuple1.andThen`](https://github.com/lampepfl/dotty/blob/master/tests/run/tupled-function-andThen.scala) methods to compose functions of larger arities and with functions that return tuples.

```scala
/** Composes two instances of TupledFunctions in a new TupledFunctions, with this function applied last
/** Composes two instances of TupledFunction into a new TupledFunction, with this function applied last.
*
* @tparam F a function type
* @tparam G a function type
```
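The arity-generic machinery above is easiest to see at a fixed arity: the standard library's `Function2.tupled` and `Function.untupled` are the `n = 2` instances of what `TupledFunction` generalizes.

```scala
val add: (Int, Int) => Int = _ + _

// The tupled form takes one Tuple2 argument instead of two arguments.
val addTupled: ((Int, Int)) => Int = add.tupled

// And Function.untupled goes back the other way.
val addAgain: (Int, Int) => Int = Function.untupled(addTupled)
```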
4 changes: 4 additions & 0 deletions docs/docs/reference/soft-modifier.md
@@ -4,6 +4,10 @@ title: Soft Modifiers
---

A soft modifier is one of the identifiers `opaque` and `inline`.
<!--
TODO this is most likely outdated; it should at least contain `extension` in addition.
Worth maintaining? Or maybe better to refer to internal/syntax.md?
-->

It is treated as a potential modifier of a definition, if it is followed by a hard modifier or a keyword combination starting a definition (`def`, `val`, `var`, `type`, `class`, `case class`, `trait`, `object`, `case object`, `enum`). Between the two words there may be a sequence of newline tokens and soft modifiers.

2 changes: 1 addition & 1 deletion tests/run/tupled-function-tupled.scala
@@ -16,7 +16,7 @@ object Test {
}

/** Creates a tupled version of this function: instead of N arguments,
* it accepts a single [[scala.Tuple]] argument.
* it accepts a single [[scala.Tuple]] with N elements as argument.
*
 * This is a generalization of [[scala.FunctionN.tupled]] that works on functions of any arity
*
4 changes: 2 additions & 2 deletions tests/run/tupled-function-untupled.scala
@@ -95,8 +95,8 @@ object Test {

}

/** Creates an untupled version of this function: instead of single [[scala.Tuple]] argument,
* it accepts a N arguments.
/** Creates an untupled version of this function: instead of a single argument of type [[scala.Tuple]] with N elements,
* it accepts N arguments.
*
 * This is a generalization of [[scala.Function.untupled]] that works on functions of any arity
*
