When I looked at the “remove” functions of Common Lisp, I noticed they take optional arguments :start and :end to designate the part of the list you want to manipulate: http://clhs.lisp.se/Body/f_rm_rm.htm
I cannot find an elegant counterpart to :start/:end in Racket.
I can split a list into several sub-lists, change the relevant sub-list, and then combine them again, but that doesn't give me the feeling of “that's the way”.
Here is a more specific question:
remove the duplicates between indices [1, 6) from (list 0 1 1 3 3 5 5), as elegantly (and as briefly?) as possible.
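For concreteness, here is the split-and-combine version I mean (using plain take/drop); it produces the right answer, but doesn't feel idiomatic:

```racket
#lang racket

;; Split into prefix / middle / suffix, dedupe only the middle, reassemble.
(define lst (list 0 1 1 3 3 5 5))
(append (take lst 1)                                    ; prefix, indices [0, 1)
        (remove-duplicates (take (drop lst 1) (- 6 1))) ; middle [1, 6), deduped
        (drop lst 6))                                   ; suffix, indices [6, ...)
;; => '(0 1 3 5 5)
```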
This looks like one of those rare occasions where "batteries included" comes with some caveats.
That said, if I needed such functionality, I'd probably just go with something like:
(define (remove-duplicates+ lst #:start (start 0) #:end (end #f))
  (for/fold ((rr '())
             (es (set))
             #:result (reverse rr))
            ((e (in-list lst))
             (i (in-naturals)))
    (if (and (>= i start)
             (or (not end)
                 (< i end)))
        ;; inside [start, end): drop e if it has been seen already
        (if (set-member? es e)
            (values rr es)
            (values (cons e rr)
                    (set-add es e)))
        ;; outside the range: keep e unconditionally
        (values (cons e rr)
                es))))
It took me some time to fully digest your code; it does feel like the Racket-y way.
I don't think it is worth it, though, because this would open a time-consuming black hole of adding #:start and #:end to all kinds of functions.
However, I do want to come up with a generic macro to apply a general list function to part of a list, like
(apply-to-range func lst start end)
or
(apply-to func lst (in-range start end [step])) ;; the last part gives the indices of the wanted range
;; `func` takes a (sub)list as its argument.
so that one may simply write (apply-to-range remove-duplicates '(0 1 1 3 3 5 5) 1 6).
But I suspect it cannot be implemented efficiently this way without mutable lists (splitting and recombining the list seems unavoidable)?
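For what it's worth, a non-mutating apply-to-range is only a few lines with take/drop. It does split and recombine, as suspected, but note that append copies only the prefix and the transformed middle; the suffix is shared, not copied:

```racket
#lang racket

;; Sketch of the proposed `apply-to-range`: apply `func` to the [start, end)
;; slice and splice the result back between the untouched prefix and suffix.
(define (apply-to-range func lst start end)
  (append (take lst start)                       ; prefix [0, start), copied
          (func (take (drop lst start)           ; middle [start, end),
                      (- end start)))            ;   transformed by `func`
          (drop lst end)))                       ; suffix [end, ...), shared

(apply-to-range remove-duplicates '(0 1 1 3 3 5 5) 1 6)
;; => '(0 1 3 5 5)
```

Since `func` receives a plain (sub)list, any list-to-list function works, e.g. (apply-to-range reverse lst start end) reverses just that slice.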
I used this topic as a prompt to explore lenses a little bit more (I'm no expert at using them).
I think there are circumstances where using lenses is neat; however, I am unsure how well they are optimized by Racket, i.e. whether using them introduces a considerable performance hit. I haven't tested.
For code that needs to be fast / should have fewer dependencies, I would use one of the other solutions.
That said I think it would be nice if these lenses / generic collections were integrated into the language and there was no overhead (that I speculate is currently there).
The (experimental) Glass: Composable Optics package has 3 traversals, which don't allow the number of elements to change during the traversal; that seems reasonable to me, considering the complications above.
It would probably be easier to map elements that are supposed to disappear to a nothing value and then filter that value out in a later step; ideally this would be supported by optimizations so it doesn't become more expensive. I haven't looked into all of this very deeply, so I might be off on some things.