Several Tweagers agreed to gather notes here about our ICFP (International Conference on Functional Programming) experiences.
Other posts in this series:
Day 5 - Friday
These notes follow Arnaud, Noon and Richard through their day at ICFP, roughly in order. (The talks are not publicly posted at the time of publication of this post, but we expect they will be over the coming weeks.)
Arnaud — Parafuzz: Coverage-guided Property Fuzzing for Multicore OCaml programs: this talk is set in the context of multicore OCaml, which has algebraic effects. The authors use algebraic effects to abstract over concurrency primitives, so that they can test concurrency properties with the AFL fuzzer (AFL generates random inputs, here random schedules, while trying very hard to trigger code paths that were not covered by previous runs).
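The core design can be sketched without effect handlers at all: make every scheduling point explicit data, and let the fuzzer choose the interleaving. Below is a minimal Haskell sketch of that idea (my own toy model; all names are hypothetical, and the paper itself works with OCaml's effect handlers and AFL, not this code):

```haskell
-- Each thread is a list of atomic steps over shared state. In Parafuzz
-- the steps are delimited by algebraic effects; here they are explicit.
type ThreadSteps s = [s -> s]

-- Run the threads under a fuzzer-chosen schedule: each entry names the
-- thread that takes its next step. A fuzzer like AFL would mutate this
-- schedule while chasing uncovered code paths.
run :: [ThreadSteps s] -> s -> [Int] -> s
run threads s0 = go threads s0
  where
    go _  s []     = s
    go ts s (i:is) =
      case splitAt i ts of
        (pre, (step:rest):post) -> go (pre ++ rest : post) (step s) is
        _                       -> go ts s is  -- finished or bad index: skip

-- Two racing "threads": different schedules expose different outcomes.
raceA, raceB :: ThreadSteps Int
raceA = [(* 2)]
raceB = [(+ 1)]
```

Starting from state `1`, the schedule `[0, 1]` yields `3` while `[1, 0]` yields `4`: the non-determinism is surfaced as plain input data that a coverage-guided fuzzer can mutate.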
Arnaud — Wibbily Wobbly Timey Camly: an OCaml library to deal with time, timezones, time intervals, etc. The talk focused a lot on finding out whether a particular date belongs to a set of dates. These sets are built by assembling constraints. It’s an unusual, but very convincing, design.
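To give a flavour of the "sets built from constraints" design, here is a small sketch in Haskell; the names are hypothetical and this is not the library's actual (OCaml) API:

```haskell
import Data.Time.Calendar (Day, fromGregorian, toGregorian)

-- A set of dates represented as a predicate; constraints compose by
-- boolean algebra, and membership is just applying the predicate.
newtype DateSet = DateSet { member :: Day -> Bool }

-- Elementary constraints.
inMonth :: Int -> DateSet
inMonth m = DateSet $ \d -> let (_, m', _) = toGregorian d in m' == m

onDayOfMonth :: Int -> DateSet
onDayOfMonth n = DateSet $ \d -> let (_, _, n') = toGregorian d in n' == n

-- Intersection and union of date sets.
(/\), (\/) :: DateSet -> DateSet -> DateSet
DateSet p /\ DateSet q = DateSet (\d -> p d && q d)
DateSet p \/ DateSet q = DateSet (\d -> p d || q d)
```

For example, `member (inMonth 8 /\ onDayOfMonth 27) (fromGregorian 2021 8 27)` is `True`.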
Noon — FARM starts today! I’m so excited; I’ve wanted to attend FARM for years, and this is my first opportunity!
- Nice language that has scheduling and state.
- Has a focus on composition over live-coding (cf. Extempore, say).
Noon — Unfortunately, the next speaker wasn’t able to make it, so there was no talk.
Noon — Bit disappointed that FARM and the Haskell Symposium are on at the same time. I’ve ended up attending Haskell, and I hope to watch FARM at a later point.
Practical Normalization by Evaluation for EDSLs, by Nachiappan Valliappan, Alejandro Russo, and Sam Lindley
- Prefer shallow over deep embedding, to piggy-back on features of the host language.
- Downside is we’ve lost syntax for the operations.
- Maybe one way is to interconvert between expression representation and host (?) representation
- Doesn’t always work; sometimes there’s no unique choice.
I had heard the term “normalization by evaluation” a number of times in the literature, but I never really understood it until watching this talk: the idea is to take a bit of syntax, interpret it into a semantic domain (that is, reinterpret it as an expression in the host language) and then inject the result back into the encoding of the syntax of the object language. The only problem is, now that I’ve described it thusly, it feels vacuous once again: what is the difference between “normalization by evaluation” and just evaluating terms in your object language? The term keeps cropping up, but I don’t really get it.
Let me think a bit more about what the words mean: normalization is the process of finding a normal form, where a normal form is (generally) a distinguished member of an equivalence class, useful for checking membership in the equivalence class. That is, if we want to identify all semantically equal expressions, we want to notice that `1 + 1` and `2` are semantically equal; thus, they are in the same equivalence class. A good way to notice this is to choose one distinguished member of each equivalence class: this is the normal form. For `1 + 1` and `2`, that normal form will be `2`. Evaluation is a separate process by which we take an expression and simplify it according to well-established rules of evaluation.
Putting this together, we might surmise that “normalization by evaluation” is an approach for checking semantic equality of expressions, using this recipe:
To determine whether `e1` and `e2` are semantically equal:
- Evaluate `e1` to yield value `v1`.
- Evaluate `e2` to yield value `v2`.
- Declare that `e1` is semantically equal to `e2` iff `v1` is syntactically equal to `v2`.

This process does not really care what `v1` and `v2` are, just whether
they're syntactically equal. So maybe that really is the essence of
normalization by evaluation. And I’ve probably struggled to understand this
because this process seems “obvious” to me, and so I cannot really imagine
another way of checking semantic equality of expressions.
In the end, I’m not sure whether this definition of normalization-by-evaluation is correct, and I’m afraid I got stuck on this during the talk and did not extract other useful bits.
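To check my own understanding, here is that recipe played out on a toy object language; this is a sketch of mine, not code from the paper:

```haskell
-- A toy object language: integer literals and addition.
data Expr = Lit Int | Add Expr Expr
  deriving (Eq, Show)

-- Interpret syntax into a semantic domain (here, host-language Int).
eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b

-- Inject a semantic value back into object-language syntax.
reify :: Int -> Expr
reify = Lit

-- Normalization by evaluation: interpret, then read back.
normalize :: Expr -> Expr
normalize = reify . eval

-- Semantic equality, decided by syntactic equality of normal forms.
semEq :: Expr -> Expr -> Bool
semEq e1 e2 = normalize e1 == normalize e2
```

Here `normalize (Add (Lit 1) (Lit 1))` and `normalize (Lit 2)` are both `Lit 2`, so `semEq` declares the two expressions equal.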
- Concurrency is frustrating; we don’t want non-determinism!
- Maybe algebraic effects can help?
- Idea: Annotate the resources and modify these to control what can be done.
- It all works!
Richard — Seeking Stability by Being Lazy and Shallow: Lazy and shallow instantiation is user friendly, by Gert-Jan Bottu and Richard A. Eisenberg.
I will not comment further on this talk other than to include this shameless plug, and to publicly state how much I enjoyed working with former Tweag intern Gert-Jan on developing this paper. I do hope to submit a proposal requesting lazy instantiation in future versions of GHC.
Noon — I had wanted to watch the Linear Haskell, Today and Tomorrow talk, but it was scheduled alongside ShutdownPL (and FARM). I really want to support ICFP’s efforts for more DEI (diversity, equity, and inclusion) content, so I feel compelled to attend ShutdownPL; I’ll have to catch the other talks at a later time.
Richard — Linear Haskell, Today and Tomorrow, keynote by former Tweager Jean-Philippe Bernardy
This is a fine keynote describing the state of linear types in Haskell today and sketching out how we can extract more from them in the future. In particular, it describes the long-term plan for supporting pure mutable-in-place structures, as long as we can verify statically (via linear types) that there are never two live copies of the structures.
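In type terms, the trick can be sketched roughly as follows. This is a signature-only sketch, not compilable code; the names are modeled loosely on the linear-base library and should be treated as assumptions, not its real API:

```haskell
{-# LANGUAGE LinearTypes #-}
-- The multiplicity annotation %1 says the array must be consumed
-- exactly once, so no two live copies of it can ever be observed,
-- and in-place writes stay observationally pure.
alloc :: Int -> a -> (Array a %1 -> Ur b) %1 -> Ur b  -- scoped allocation
write :: Array a %1 -> Int -> a -> Array a            -- consumes, returns the updated array
freeze :: Array a %1 -> Ur (Vector a)                 -- escape to an immutable view
```

Because `write` consumes its argument, the old version of the array can never be read after a mutation, which is exactly the static guarantee the keynote leans on.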
- This was a bold and strong talk. The main discussion was around the need to address problematic people at the end of the so-called “pipeline problem” in STEM; i.e., there’s no point bringing people into a community if they are just going to leave because they are not welcomed.
- So, one of the key ideas I got from this talk was to consider why people leave. I think this is a very powerful concept and often quite hard to do.
- Another key idea from this talk was highlighting the damage done to communities by publicly supporting people known to be engaging in problematic behaviour. While this is perhaps very obvious, it can also be quite subtle: the talk discusses the legal issues some community leaders in the specific example were facing, and the cost of speaking up, which is often very high, especially if you are not in a privileged position.
- One of the most practical ideas I came out of this talk with is this: survey people who are in (or leaving) your community, and find out what they are thinking. It doesn’t have to be entirely negative; it could also solicit comments on things that are going well, new ideas, etc. One idea that came up in discussions with my partner afterwards was what I’m calling a “Living Code of Conduct”: a mechanism for flagging behaviour (good or bad!) and aligning it with a specific code-of-conduct item managed by the community. It’s probably best done anonymously, but doesn’t have to be; there could be lots of variants. In any case, if you’re at all interested in this idea, do reach out to me; I’d love to chat more about it!
- Really enjoying the presentations by Wen; she has a very engaging style.
- First time I’ve heard about session types, but I think I got a good feel for them.
- Overall I really enjoyed it!
- Box and wire diagrams.
- Symmetric Monoidal Categories (SMCs).
- The issue with Arrows is that they don’t represent these box-and-wire diagrams very well; in particular, they fail to capture parallel composition.
- Idea is to implement a nice notation for working with SMCs in Haskell.
- Perhaps has applications to quantum computing!
- So of course I’m interested, and will try to do a bit more reading.
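For what it’s worth, the parallelism point can be seen in how Arrows build parallel composition; this tiny example is my own illustration, not from the talk:

```haskell
import Control.Arrow ((***))

-- Arrows can place two boxes side by side with (***), but it is defined
-- as  f *** g = first f >>> second g : the encoding sequences the two
-- boxes, which is why Arrows model the string diagrams of symmetric
-- monoidal categories awkwardly.
sideBySide :: (b -> c) -> (b' -> c') -> (b, b') -> (c, c')
sideBySide = (***)
```

`sideBySide (+1) (*2) (3, 4)` gives `(4, 8)`; semantically the two boxes are parallel wires, but the Arrow representation has already picked an order for them.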
- Overall, my experience at ICFP has been great.
- I feel very full of FP/Type theory knowledge! I feel like I learned a lot of new words and concepts that I will inevitably have to google again, but now they’ll spark a pleasant memory in my mind :)
- I met several very friendly people, and also watched many lovely talks that I enjoyed; hopefully some friendly faces from ICFP will be appearing on the Compositional podcast in the next few months!
- Thanks to the organisers and all the speakers for their hard work!