**Mentor:** Araq ([@Araq](https://github.com/Araq)), def- ([@def-](http://github.com/def-))

#### A strong theoretical foundation for Nim's parallel statement

**Description:**

Nim uses a fresh idea to model fork&join parallelism via its ``parallel``
builtin, which performs a range of additional checks at compile time,
such as array bounds checking and the so-called "disjoint check". The disjoint
check ensures that the datasets are disjoint: every processed element is
accessed by only one worker. This way no locking is required and yet (memory)
safety is guaranteed.
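To make that guarantee concrete, here is a minimal sketch of the property the disjoint check must establish: no array element is touched by more than one worker. It is written in Python because the property itself is language-agnostic; the function name and the range-based representation are illustrative, not the Nim compiler's actual machinery.

```python
def index_sets_disjoint(worker_ranges):
    """worker_ranges: one (lo, hi) half-open index range per worker.
    Returns True iff no index is claimed by two workers."""
    seen = set()
    for lo, hi in worker_ranges:
        for i in range(lo, hi):
            if i in seen:
                return False  # two workers would touch element i -> unsafe
            seen.add(i)
    return True

# Splitting a[0..99] into four chunks of 25 needs no locking:
print(index_sets_disjoint([(0, 25), (25, 50), (50, 75), (75, 100)]))  # True
# Overlapping chunks would be rejected by the check:
print(index_sets_disjoint([(0, 30), (25, 50)]))                       # False
```

The compiler of course cannot enumerate indices at compile time; it must prove the same fact symbolically, which is exactly where the machinery discussed below comes in.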

Unfortunately, how exactly this checking is performed is not
yet documented: it is done in a fairly ad hoc fashion that mostly uses
the transitivity of inequality relations and some simple control flow analysis.

However, in the context of *optimization* and code generation there has been
a lot of progress in commercial compilers in the form of auto-vectorization.
One framework that enables auto-vectorization is the polyhedral model.
The polyhedral model deals with iteration spaces and serves to parallelize loop
programs automatically by applying algebraic transformations. Since it is a
traditional optimization pass in a compiler, it has to prove that these
transformations are correct. This is remarkably similar to the task of proving
code written by a human being correct. So the idea of this task is to
leverage the polyhedral model for program verification as opposed to program
optimization.
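As a toy illustration of the kind of question such a checker answers, consider two affine array accesses ranging over an iteration space: do they ever touch the same cell? A real polyhedral tool (e.g. isl) decides this exactly with integer programming over the dependence polyhedron; the Python sketch below (a hypothetical helper, not Nim compiler code) simply enumerates a small iteration space to show what is being decided.

```python
def accesses_overlap(n, write_idx, read_idx):
    """write_idx/read_idx: affine functions mapping an iteration i
    to an array index. Returns True if some pair of iterations
    touches the same cell (a potential dependence)."""
    writes = {write_idx(i) for i in range(n)}
    reads = {read_idx(j) for j in range(n)}
    return not writes.isdisjoint(reads)

# a[2*i] = ... ; ... = a[2*i + 1]:
# 2*i == 2*j + 1 has no integer solution, so the accesses never
# conflict and the iterations may run in parallel:
print(accesses_overlap(100, lambda i: 2 * i, lambda i: 2 * i + 1))  # False

# a[i] = ... ; ... = a[i + 1] does conflict (take i == j + 1):
print(accesses_overlap(100, lambda i: i, lambda i: i + 1))          # True
```

The polyhedral formulation replaces the enumeration with a symbolic emptiness test on the set `{ (i, j) | 0 <= i, j < n, write_idx(i) = read_idx(j) }`, which works for symbolic `n` as well.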

See these papers for an introduction to the polyhedral model:

http://www.cs.ucla.edu/~pouchet/doc/cc-article.10.pdf

http://icps.u-strasbg.fr/~bastoul/research/papers/Bas04-PACT.pdf

Here is a sketch of a possible implementation:

* Implement an AST to polyhedral model transformation.
* Implement a semantic checking pass that works on the polyhedral model and
  proves the array bounds correct as well as that the datasets are disjoint.
* In addition, prove that the code is free of "write after write",
  "write after read", etc. conflicts.
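The last step can be sketched as a classification over read/write sets. The plain-sets-of-locations representation below is a deliberate simplification (Python, with made-up names) of what a polyhedral formulation would express with dependence polyhedra:

```python
def hazards(reads_a, writes_a, reads_b, writes_b):
    """Given the read/write location sets of two fragments that may
    run in parallel, report which hazard classes would arise."""
    found = set()
    if writes_a & writes_b:
        found.add("WAW")      # write after write: final value depends on order
    if (reads_a & writes_b) or (writes_a & reads_b):
        found.add("WAR/RAW")  # read/write ordering conflicts
    return found

# Two workers that read and write only their own half: safe.
print(hazards({0, 1}, {0, 1}, {2, 3}, {2, 3}))  # set()
# Both workers write element 1: a write-after-write conflict.
print(hazards(set(), {0, 1}, set(), {1, 2}))    # {'WAW'}
```

A `parallel` block would be accepted only when this classification comes back empty for every pair of spawned workers.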

**Desirable skills**: Knowledge of optimizers.

**Difficulty:** Hard

**Mentor:** Araq ([@Araq](https://github.com/Araq))

## Standard Library

#### Enhance and expand standard library documentation