05:10 <pyro-> now i am writing a cpsTransform function :D
05:10 <pyro-> it already works, but the current version introduces superfluous continuations
05:10 <pyro-> so i am trying to fix :D
05:10 <ReinH> pyro-: Here's a CPS transform function: flip ($)
05:11 <pyro-> i will find out about flip
05:11 <ReinH> @src flip
05:11 <lambdabot> flip f x y = f y x
05:11 <ReinH> pyro-: the essence of CPS can be described as follows:
05:11 <ReinH> :t flip ($)
05:11 <lambdabot> b -> (b -> c) -> c
05:12 <ReinH> is the type of a function which takes a value and produces a suspended computation that takes a continuation and runs it against the value
05:12 <ReinH> for example:
05:12 <ReinH> > let c = flip ($) 3 in c show
05:12 <lambdabot> "3"
05:12 <ReinH> > let c = flip ($) 3 in c succ
05:12 <lambdabot> 4
05:13 <mm_freak_> direct style: f x = 3*x + 1
05:13 <mm_freak_> CPS: f x k = k (3*x + 1)
05:13 <mm_freak_> the rules are: take a continuation argument and be fully polymorphic on the result type
05:13 <mm_freak_> f :: Integer -> (Integer -> r) -> r
05:13 <mm_freak_> as long as your result type is fully polymorphic and doesn't unify with anything else in the type signature, you can't do anything wrong other than to descend
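mm_freak_'s two rules make a tiny runnable sketch (only `f` is from the log; the continuations passed to it below are the obvious ones):

```haskell
-- direct style:  f x = 3*x + 1
-- CPS: take a continuation argument k and be fully polymorphic in r
f :: Integer -> (Integer -> r) -> r
f x k = k (3 * x + 1)
```

Passing `id` recovers direct style (`f 2 id` is `7`), while passing any other continuation, e.g. `f 2 show`, hands the result straight to it.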
05:38 <ReinH> So for example, if you have an incredibly simple expression language like data Expr a = Val a | Neg a | Add a a
05:38 <ReinH> a (more) initial encoding of an expression would be Add (Val 1) (Neg (Val 1))
05:38 <ReinH> A (more) final encoding might be (1 - 1) or even 0
05:39 <ReinH> The initial encoding generally is more flexible (you can still write a double-negation elimination rule, for instance)
05:39 <ReinH> the final encoding is less flexible, but also does more work up-front
05:40 <ReinH> More initial encodings tend to force you to use quantification and type-level tricks; CPS and pre-applied functions tend to appear more in final encodings
05:40 <ReinH> An even smaller example:
05:40 <ReinH> \f z -> foldr f z [1,2,3] is a final encoding of the list [1,2,3]
05:41 <ReinH> pyro-: I'm not really a lisper, but I'm always looking for good reading material
05:41 <ReinH> for bonus points, the foldr encoding is *invertible* as well :)
05:44 <ReinH> pyro-: the relevance is that you seem to be using the cps transform in a more initial encoding than I usually see it
05:44 <ReinH> not that this is at all bad
05:46 <bitemyapp> ReinH: where does the invertibility in the final encoding come from?
05:46 <ReinH> foldr (:) [] :)
05:46 <ReinH> it's not generally so
05:46 <bitemyapp> > foldr (:) [] [1, 2, 3]
05:46 <lambdabot> [1,2,3]
05:47 <bitemyapp> I may not understand the proper meaning of invertibility in this case.
05:47 <bitemyapp> Do you mean invertibility from final to initial encoding?
05:47 <ReinH> Just that, yes
05:47 <bitemyapp> how would it get you back to final from initial?
05:47 <ReinH> I'm not sure if that's the correct term
05:47 <bitemyapp> I don't think it is, but the intent is understood and appreciated.
05:48 <bitemyapp> invertibility implies isomorphism, implies ability to go final -> initial -> final
05:48 <ReinH> well, there is an isomorphism
05:48 <bitemyapp> well, we've established final -> initial, where's initial -> final for this example?
05:49 <bitemyapp> I figured it was a morphism of some sort, but with only a final -> initial and not a way to get back, I wasn't sure which.
05:49 <ReinH> toInitial k = k (:) []; toFinal xs = \f z -> foldr f z xs
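ReinH's `toInitial`/`toFinal` pair compiles as written once the final encoding gets a type; `FinalList` is my name for it, not from the log:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A final encoding of a list is its foldr, pre-applied to the list.
type FinalList a = forall r. (a -> r -> r) -> r -> r

toFinal :: [a] -> FinalList a
toFinal xs = \f z -> foldr f z xs

-- Inverting: feed the folded-in (:) and [] back to the suspended foldr.
toInitial :: FinalList a -> [a]
toInitial k = k (:) []
```

The round trip `toInitial . toFinal` is the `foldr (:) []` identity from the log.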
Note: all of the options for playing with lists and queues and fingertrees come with trade-offs.
Finger trees give you O(log n) appends and random access, O(1) cons/uncons/snoc/unsnoc etc. but _cost you_ infinite lists.
Realtime queues give you the O(1) uncons/snoc. There are catenable output restricted deques that can preserve those and can upgrade you to O(1) append, but we've lost unsnoc and random access along the way.
Skew binary random access lists give you O(log n) drop and random access and O(1) cons/uncons, but lose the infinite lists, etc.
Tarjan and Mihaescu's deque may get you back worst-case bounds on more of these operations, but we still lose O(log n) random access and infinite lists.
Difference lists give you an O(1) append, but alternating between inspection and construction can hit your asymptotics.
Lists are used by default because they cleanly extend to the infinite cases, anything more clever necessarily loses some of that power.
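As a small illustration of the difference-list point above (a sketch; `DList` here is just a function type, not the dlist package):

```haskell
-- A difference list represents a list as a "prepend me" function,
-- so append is plain composition: O(1) per append.
type DList a = [a] -> [a]

fromList :: [a] -> DList a
fromList = (++)

toList :: DList a -> [a]
toList d = d []

append :: DList a -> DList a -> DList a
append = (.)
```

Repeatedly calling `toList` between appends re-walks the accumulated prefix each time, which is exactly the "alternating between inspection and construction" cost mentioned above.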
02:25 <Iceland_jack> You can gain your intuition by working these things out for increasingly more involved types
02:26 <bartleby> I was trying this, but it looked like a product type... I'm doing something wrong
02:26 <bartleby> let me see this case
02:26 <Iceland_jack> sure
02:27 <bartleby> wait, if I have one pattern for True and another for False, does it count as a single function? or two?
02:28 <Iceland_jack> If they're two patterns in the same function then it's the same function
02:28 <Iceland_jack> I.e. in the function definition
02:28 <Iceland_jack> f True = ...
02:28 <Iceland_jack> f False = ...
02:28 <Iceland_jack> 'f' is a single function
02:29 <Iceland_jack> and for the first ellipsis '...' you have one of three choices (b = {b1, b2, b3}) and same for the second one
02:29 <pyro-> does b^a include non total functions?
02:29 <Iceland_jack> no
02:29 <pyro-> why is that?
02:30 <Iceland_jack> Because it breaks all sorts of reasoning and makes it more complicated
02:30 <pyro-> :D
02:30 <bartleby> no? I thought that was what I was missing...
02:30 <Iceland_jack> bartleby: How many functions of type
02:30 <Iceland_jack> Bool -> ()
02:31 <bartleby> yes, that's where I'm confused. I'd guess one?
02:31 <Iceland_jack> Right, because the only choice is
02:31 <Iceland_jack> fn True = ()
02:31 <Iceland_jack> fn False = ()
02:31 <bartleby> matching True and False, but only returning ()
02:32 <Iceland_jack> so the number of functions |Bool -> ()| is |()| ^ |Bool|
02:32 <Iceland_jack> |()| ^ |Bool|
02:32 <Iceland_jack> = 1 ^ 2
02:32 <Iceland_jack> = 1
02:32 <bartleby> ah, I think I get it
02:33 <Iceland_jack> And there are 2 functions from
02:33 <Iceland_jack> Bool -> ()
02:33 <Iceland_jack> conversely
02:33 <Iceland_jack> oops, () -> Bool I meant
02:33 <pyro-> Just by sitting in this channel I am learning things :D bartleby, how is it that the cardinality of a type has interested you? I haven't even heard the term before
02:33 <Iceland_jack> 'const False' and 'const True' respectively
02:33 <bartleby> Iceland_jack: because 2^1
02:33 <Iceland_jack> Precisely
02:34 <Iceland_jack> pyro-: You should definitely read up on the 'Algebra of Algebraic Data Types'
02:36 <Iceland_jack> It also means that you can define Bool as
02:36 <Iceland_jack> type Bool = Either () ()
02:36 <Iceland_jack> rather than
02:36 <Iceland_jack> data Bool = False | True
02:36 <Iceland_jack> since
02:36 <Iceland_jack> 1 + 1 ≅ 2
02:37 <Iceland_jack> Given the recent pattern synonyms extensions (PatternSynonyms) you can even use the same constructors and pattern match
02:37 <pyro-> Thats interesting
02:37 <Iceland_jack> type (+) = Either
02:37 <Iceland_jack> type BOOL = () + ()
02:37 <Iceland_jack> pattern TRUE = Right () :: BOOL
02:37 <Iceland_jack> pattern FALSE = Left () :: BOOL
02:38 <Iceland_jack> and then
02:38 <Iceland_jack> not :: BOOL -> BOOL
02:38 <Iceland_jack> not TRUE = FALSE
02:38 <Iceland_jack> not FALSE = TRUE
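Iceland_jack's sketch, assembled into a form that compiles: the standalone pattern signatures replace the inline `:: BOOL` annotations from the log, `not` is renamed `not'` to avoid shadowing the Prelude, and the `COMPLETE` pragma tells GHC the two patterns cover `BOOL`:

```haskell
{-# LANGUAGE PatternSynonyms #-}

type BOOL = Either () ()

pattern TRUE :: BOOL
pattern TRUE = Right ()

pattern FALSE :: BOOL
pattern FALSE = Left ()

{-# COMPLETE TRUE, FALSE #-}

not' :: BOOL -> BOOL
not' TRUE  = FALSE
not' FALSE = TRUE
```

Since the synonyms are bidirectional, `TRUE` and `FALSE` work both as patterns and as expressions.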
02:38 <pyro-> what about values instead of types? 1 + 2 = 2 + 1 works for Int. what about algebra for values of other types?
02:38 <Iceland_jack> pyro-: You're not actually using numbers
02:38 <Iceland_jack> 1 is just a nice and confusing way to refer to the type ()
02:38 <pyro-> i understand
02:38 <bartleby> whoa, easy there boy! I'm overheating with 2^2 here
02:38 <Iceland_jack> not the value 1
02:38 <bartleby> :-D
02:38 <pyro-> thanks
02:39 <Iceland_jack> bartleby: Slowing down :)
02:39 <pyro-> actually that i'm not using numbers is kind of the point right?
02:39 <Iceland_jack> well it makes the analogy with elementary arithmetic clearer
02:39 <bartleby> pyro-: you are counting possible values of that type
02:40 <Iceland_jack> So you can write '2' for Bool because Bool has two things
02:40 <bartleby> so Either () Bool has three because: Left (), or Right True, or Right False
02:40 <Iceland_jack> Maybe Bool would be 3
02:40 <Iceland_jack> Yes exactly
02:40 <Iceland_jack> and thus
02:40 <Iceland_jack> Either () Bool ≅ Maybe Bool
02:41 <Iceland_jack> and also
02:41 <Iceland_jack> Maybe a ≅ Either () a
02:41 <Iceland_jack> If you define
02:41 <Iceland_jack> Maybe b = 1 + b
02:41 <Iceland_jack> Either a b = a + b
02:41 <Iceland_jack> then it becomes fairly clear
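The isomorphism Maybe a ≅ Either () a can be spelled out directly (the function names are mine):

```haskell
-- Maybe b ≅ 1 + b: Nothing plays the role of Left (),
-- Just plays the role of Right.
toEither :: Maybe a -> Either () a
toEither Nothing  = Left ()
toEither (Just a) = Right a

toMaybe :: Either () a -> Maybe a
toMaybe (Left ()) = Nothing
toMaybe (Right a) = Just a
```

Both round trips are the identity, which is what the ≅ claims.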
02:44 <bartleby> ah, I think it clicked here. I managed to list Bool -> Bool, four different functions
02:46 <Iceland_jack> and then for Bool -> Three where |Three| = 3 you have 3 independent choices for True and False so you have 3 * 3 = 3^2
02:46 <Iceland_jack> and so forth
02:46 <Iceland_jack> hope this clears things up a bit
02:46 <bartleby> I was unsure about partial functions, but now it makes sense. It's just permutations of b I think (not sure if permutation is the right word)
02:47 <bartleby> how many arrangements with `a` elements of type `b` can I make?
02:51 <bartleby> Iceland_jack: thank you. I see that I have that page bookmarked, but I think I didn't get that Functions sections at the time
02:52 <bartleby> in fact, it's still confusing...
02:52 <bartleby> "Then each of First, Second and Third can map to two possible values, and in total there are 2⋅2⋅2 = 2^3 = 8 functions of type Trio -> Bool"
02:53 <bartleby> counting like this I was only seeing First->True, First->False, Second->True, Second->False... 6, like a product
02:54 <Iceland_jack> You have to map all the values
02:54 <Iceland_jack> so the first function might be
02:54 <Iceland_jack> f1 First = False
02:54 <Iceland_jack> f1 Second = False
02:54 <Iceland_jack> f1 Third = False
02:54 <Iceland_jack> And the second function might be
02:54 <Iceland_jack> f2 First = True
02:54 <Iceland_jack> f2 Second = False
02:54 <Iceland_jack> f2 Third = False
02:54 <bartleby> yeah, I missed that. Thinking about combinations is easier IMO. True True True, True True False, ...
02:55 <bartleby> reminds me of truth tables :)
02:55 <Iceland_jack> writing False as 0 and True as 1 you get
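The 2⋅2⋅2 = 8 count can be checked mechanically by enumerating every function `Trio -> Bool` as its table of outputs; writing False as 0 and True as 1, each one is a row of a three-variable truth table (`Trio` is from the quoted page; `funs` is my name for the enumeration):

```haskell
data Trio = First | Second | Third

-- One function per choice of output for each of the three inputs:
-- |Bool| ^ |Trio| = 2^3 = 8 functions in total.
funs :: [Trio -> Bool]
funs = [ \t -> case t of First -> a; Second -> b; Third -> c
       | a <- [False, True], b <- [False, True], c <- [False, True] ]
```

`length funs` is 8, and `map ($ First) funs` etc. recover the truth-table columns.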
Much of Guy Steele's work here pertained to a desire to be able to parallelize calculation. This is a laudable goal.
The main issue with a naïve magma approach Steele proposed for Fortress is that you have zero guarantees about efficient splittability. All the mass of your magma could be on one side or the other.
The benefit is that without those guarantees infinite magmas make sense in a lazy language. You can have infinitely large trees just fine, that go off to infinity at any point not just at the right.
This has a certain pleasing structure to it. Why? Well, lists aren't really the free monoid if you allow for infinitely recursive use of your monoid! You have unit and associativity laws, and by induction you can apply them a finite number of times, but reassociating an infinite tree from the left to the right requires an infinite number of steps, taking us out of the constructive world we can program in. So ultimately a free Monoid (allowing for infinite monoids) is something like Sjoerd Visscher's
```haskell
newtype Free p a = Free { runFree :: forall r. p r => (a -> r) -> r }
type List = Free Monoid
```
Here we borrow the assumptions of unit and associativity from the target r and generate something using them. It is an almost vacuous but now correct construction, whereas associating to the right to make a list required us to be able to right-associate infinite trees. You can view this as a sort of quotient on a magma, where you guarantee to only consume it with monoidal reductions.
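Filling in the missing pieces around Visscher's newtype (the definition is repeated so the block stands alone; `single` and `toList` are my names for the evident injection and reduction):

```haskell
{-# LANGUAGE ConstraintKinds, RankNTypes, KindSignatures, FlexibleInstances #-}

import Data.Kind (Constraint, Type)

newtype Free (p :: Type -> Constraint) a =
  Free { runFree :: forall r. p r => (a -> r) -> r }

type List = Free Monoid

single :: a -> List a
single a = Free (\k -> k a)

-- Unit and associativity are borrowed wholesale from the target r.
instance Semigroup (Free Monoid a) where
  Free f <> Free g = Free (\k -> f k <> g k)

instance Monoid (Free Monoid a) where
  mempty = Free (\_ -> mempty)

-- Consuming with a monoidal reduction; here the target r is [a].
toList :: List a -> [a]
toList xs = runFree xs (\a -> [a])
```

The Monoid laws hold only up to the promise that consumers instantiate r at an actual lawful Monoid, which is the quotient mentioned above.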
Binding/substituting on a (unital) magma can now take longer than O(n). Why? Because now I have to walk past all the structure. You can replace this with Oleg and Atze's "Reflection without Remorse", but walking down a unital Magma structure doesn't necessarily decrease n.
In the absence of infinite trees, you usually want some form of balance depending on what you want to do with the structure. e.g. turning it into a catenable deque gives you efficient access to both ends and lets you still glue in O(1) or O(log n).
Switching to a finger tree gives you guaranteed O(log n) splits, but now merges go from O(1) to O(log n)
In a general magma the split is potentially completely lopsided. You can 'steal work' but as often as not you likely steal a single unit, or in a unital magma, possibly nothing.
The cost of these richer structures is you lose the continuous extension to the infinite case, but when trading O(n) or worse for O(log n) it is often worth making that trade-off.
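The finger-tree trade-off above is what the containers package's Data.Sequence implements; a quick sketch of the operations mentioned:

```haskell
import qualified Data.Sequence as Seq
import Data.Sequence (Seq, (|>), (<|), (><))
import Data.Foldable (toList)

example :: Seq Int
example = Seq.fromList [1, 2, 3]

snocced, consed, appended :: Seq Int
snocced  = example |> 4          -- O(1) amortized snoc
consed   = 0 <| example          -- O(1) amortized cons
appended = example >< example    -- O(log(min n m)) append

halves :: (Seq Int, Seq Int)
halves = Seq.splitAt 2 example   -- O(log(min i, n - i)) split
```

And, as the note says, `Seq` is spine-strict: the price of these bounds is that `example` cannot be infinite the way a plain list can.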
<ajcoppa> OscarZ: you know about (->) from the types of functions, right?
<OscarZ> ajcoppa: yes
<ajcoppa> ((->) t) is a partial definition that describes a function that takes in a value of type t
<OscarZ> ajcoppa: im a bit confused.. so does "instance Functor ((->) t) where" mean that were defining that any function that takes a parameter is a Functor instance?
<ajcoppa> OscarZ: yep!
<OscarZ> ajcoppa: is that different than saying instance Functor (a -> b) where ?
<ajcoppa> OscarZ: it is different, because functors need to have a kind of * -> *
<ajcoppa> as an example, when defining a functor for Optional, you don't say instance Functor (Optional Int) where -- you say instance Functor Optional where
<pjdelport> OscarZ: "Functor ((->) r)" is a lot like "Functor (Either a)" and "Functor ((,) a)"
<pjdelport> OscarZ: You're declaring an instance for a partially-applied type, which itself will get applied to a remaining type (which is the "slot" that the functor operates on)
<pjdelport> OscarZ: So in the same sense that "instance Functor Maybe where ..." means: fmap :: (a -> b) -> Maybe a -> Maybe b
<pjdelport> "instance Functor ((,) x) where ..." means: fmap (a -> b) -> (x,a) -> (x,b)
<pjdelport> Or in equivalent syntax: fmap :: (a -> b) -> ((,) x) a -> ((,) x) b
<pjdelport> where ((,) x) is what "f" gets replaced with
<pjdelport> So for "instance Functor (Either x) where ...", you have fmap :: (a -> b) -> Either x a -> Either x b
<pjdelport> and for "instance Functor ((->) r) where ...", you have fmap :: (a -> b) -> (->) r a -> (->) r b
<pjdelport> or equivalently: fmap :: (a -> b) -> (r -> a) -> (r -> b)
<OscarZ> oh.. i think i finally get it
<pjdelport> OscarZ: To read that more intuitively, just remember what the "slot" that fmap operates on is for each Functor instance: for lists, the "slots" are all of a list's elements. For Maybe, the "slots" are Just values (if any). For Either, the "slots" are Right values.
<pjdelport> OscarZ: For functions, (->), the "slot" that you're operating on is the *result* of the function.
<pjdelport> So applying "fmap f" to a function value like (r -> a) uses f to "replace" the result value 'a' with a 'b', yielding a new function (r -> b).
<pjdelport> (Just like how applying "fmap f" to a list value [a] uses f to "replace" each element 'a' with a 'b', yielding a new list [b].)
<OscarZ> to implement Functor functions with signature a -> a -> a, should it be instance Function ((->) r t) where ?
<pjdelport> OscarZ: a -> a -> a, or (a -> (a -> a)), is just a special case of (a -> b) where b = (a -> a)
<OscarZ> pjdelport: so like Functor (Either a) matches (Either a) part in Either a b, Functor ((->) r) matches just the first a in a -> (a -> a) ? in this case the slot would be of type (a -> a) ?
<pjdelport> OscarZ: Yep.
<OscarZ> ok thanks.. i think i get.. apparently im not used to functions as first-class citizens :) somehow got confused when it was a function instead of something like Either a b
<ajcoppa> it is very normal to need some time to wrap your head around this instance in particular
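The upshot of the `(->) r` discussion in one definition: since fmap operates on the *result* slot, fmap for functions is just composition (the instance is in base; `doubleThenSucc` is a made-up example name):

```haskell
-- For the ((->) r) functor, fmap f g = f . g:
-- "replace" the result of g with f applied to it.
doubleThenSucc :: Int -> Int
doubleThenSucc = fmap succ (* 2)
```

So `doubleThenSucc 5` doubles to 10 and then succs to 11, exactly as `succ . (* 2)` would.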
Then I wanted to write a function that would reverse the (>>= gen), and I came up with this function.
```haskell
revgen (x:xs) =
  case (null xs) of
    False -> case ((head xs) - x) of
      1         -> 1 + revgen xs
      otherwise -> 0 : revgen xs
    True -> 1 : []
```
But it doesn't work, I think because revgen returns both numbers and lists. But I can't come up with a way to do this any other way with one function. Also the nested cases look ugly as hell, but I guess they work as intended because at least it compiles.
* * * * *
meditans:
Given the definitions:
```haskell
gen x = [1..x]
f xs = xs >>= gen
```
If you want a function g such that g . f = id, I think you could use: