Responses to common criticisms of Lisp languages?

Just prefacing this by saying that my goal with this post isn't to incite a flame war or be rude: I'm a beginner to Lisps who has a soft spot for them. I learned to code with SICP, and shortly after I somehow stumbled upon Dybvig's beautiful book on the Scheme programming language, and eventually Essentials of Programming Languages by Friedman & Wand. I attribute my success today as a self-taught engineer to having been exposed to these texts, all of which use Scheme as a didactic language. Also, I'm not sure whether this was the right subforum to post this question; apologies if not.

My goal is more to understand how folks respond to the criticisms in the second post pictured below:

Screenshot of a Hacker News comment by kovrik on Nov 13, 2014 that reads: Could someone give an advice on how to grok Lisp, please? I write Clojure code, read Lisp books, articles etc. But I don't feel 'magic' yet. How did you get 'addicted' to Lisps?
Screenshot of a Hacker News comment by jonesjames on Nov 13, 2014 replying to the previous comment: That's because the magic is a lie. I have this book. I've also worked on many Scheme systems and the internals of CMUCL. I was once a Lisp weenie. The things that made Lisp special are not so special today. Basic things like garbage collection or dynamic typing exist everywhere. The more esoteric things, like CLOS, are esoteric for a reason (they are very difficult to use and impossible to master). The one feature that people still hype has always been a double-edged sword: macros. Scheme entirely ran away from the macros that Common Lisp has, and for good reason. But Scheme still has not developed a hygienic replacement that is equally powerful (and easy to use). They never will. It's been decades now. The problem with macros and pet DSLs is that the best advice for using them is: don't. They can subtly alter the semantics of your code, and they conflate run-time vs compile-time. Not many people can elegantly weave through all these dimensions at once and not make a disaster. Homoiconicity has been oversold. The practice of CONSing everywhere is just awful, and slow. It's not the '80s anymore. CONS doesn't make sense today. When you have first-class functions, closures and GC, you have the best things from Scheme/Lisp.

I've heard this sentiment repeated elsewhere (HN, Twitter, coworkers): the claim goes that in the 90s and earlier, the Lisp features mentioned (plus dynamic typing) were quite rare in other programming languages. This made it so that people familiar with Lisps appeared to wield superpowers. But the features that made Lisp special then are basically ubiquitous in modern languages today. On this view, Lisp languages are little more than historical artifacts: whatever useful ideas were there have already been absorbed more broadly, and the languages themselves are weighed down by their history and awkward syntax (e.g. the cons point in the last screenshot).

I can see how macros could be problematic -- that's one of the things I have a hard time with whenever I need to work on a Ruby project: these magical mini-DSLs are everywhere, whereas with Python code following reasonable PEP8 style, I can pretty quickly orient myself and reason about what is actually happening.

I also sort of have this impression that, in the programming language implementation world, the Lisp community(ies) is a bit insular and academic. Insular in the sense of being isolated from industry/most people who are writing code, and isolated from modern approaches to programming language implementation. It seems that in the Lisp world, programming language implementation depends on everything being S-expression based or possible to implement using macros. The stuff that seems to have broader effects on industry and the way people think about languages lately (see Roslyn, LSP, safety, type systems, formal verification) seems to not be coming from Lisp folks. I realize this impression is largely due to ignorance -- I have pretty narrow experience with Lisps, haven't talked to many Lisp people.

I also know Lisps have had a major impact on PL implementation historically -- just wondering if that continues to be true today. Is there any recent research coming out of the Lisp community that has influenced the broader PL implementation world, and industry/programmers more broadly?

As mentioned earlier, I have a fondness for Scheme and Hy (a Lisp embedded in Python) because the former was the first language I programmed in, and I find them simple and fun, which I think all of these criticisms fail to consider. Rubyists often say that their language is a joy to write code in, and while I struggle to orient myself in a Ruby codebase, I do think that developer happiness is an important language feature. Another point is community -- languages don't exist in a vacuum; they're written by people and for people. My experience is pretty limited, but as a newcomer, the communities around other well-known functional languages don't feel very welcoming, whereas the Racket community seems quite intentional about building a culture that is welcoming to newcomers and generally accessible.

Anyway, I'm interested to hear what folks who know much more than me have to say. Again, apologies if this is the wrong subforum

3 Likes

Just for fun I will reply to this pseudo-flamebait. (You did an ok job keeping it from being over the top.)

Let me just write one response so I don't spend too much time on this.

Is any recent research from the Lisp community influencing PL implementation/industry/programmers? (You don't define recent... )

One common dismissal of academic work is that it "is not relevant to the real world." Do you intend the question this way? Citing what you definitely consider to be influential work would start this off on the right foot. Is Julia groundbreaking? Does a cover story on language-oriented programming in a major academic journal indicate an influence?

Do you think Clojure/Script are Lisps? Do you think that high salaries for these languages represent impact? (According to annual surveys, the salaries for Clojure/Script programmers were near the very top of averages.) If so, I think you have an answer. If not, you need to explain a whole bunch of words in your post.

Thanks for your reply

Genuinely was not trying to agitate or stir the pot in bad faith -- not sure how I could have phrased my question differently to avoid it coming off that way. Sorry for that

As far as "relevance to the real world" goes: I study pure math, and I was once rejected from a job specifically because I mentioned Erlang and the founder said he thought I was more of a computer scientist than an engineer. I definitely don't think research should be purely motivated by industry in any discipline. I mentioned industry because I think when an idea bleeds from academia into industry, it's a signal of influence, but an idea can certainly be influential and interesting without that.

I think I did mention a few examples of what I think to be influential -- LSP and the work around the Roslyn compiler, formal verification, and type systems. I do think Julia is interesting, yes

I guess specifically I'm seeing a lot more (obvious) influence in e.g. JavaScript and C# and Python from ML-family languages than anything obviously originating from Lisp-family languages, but I'm also much less familiar with Lisps than I am with ML-family languages, and in general I'm really not that familiar with either. Someone in the Discord spoke to this by mentioning that a lack of familiarity with Lisps makes their impact easier to miss -- for example, hygienic macros are now more of a thing in languages like Rust and Julia and Lean, and likewise with algebraic effect handlers and other features.

Some folks from the Discord server posted some interesting replies:

jimpjorps wrote:

I don't have enough industry or academic experience to really weigh in on most of it, but I do think a couple of those particular complaints in the screenshotted post are overblown, like the one about cons cells being archaic; with the resurgence in functional programming lately, a lot of languages have rediscovered/elevated linked lists as an efficient data structure for iteration, especially since they lend themselves well to declarative definitions of iteration
though that also ties into the whole "other languages have caught up so Lisp isn't so 'magical'" idea
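
To make the cons point a bit more concrete, here's a tiny Racket sketch of mine (not from the Discord) of what declarative iteration over cons cells looks like:

    #lang racket
    ;; A cons list is just nested pairs; iterating over it can be written
    ;; declaratively rather than with explicit pointer chasing.
    (define xs (cons 1 (cons 2 (cons 3 '()))))   ; same value as '(1 2 3)
    (for/sum ([x (in-list xs)]) x)               ; => 6
    (map (lambda (x) (* x x)) xs)                ; => '(1 4 9)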

camoy wrote:

I would say that comment is not terribly well-informed. Other languages are still looking at Lisps for inspiration and as a source of new ideas, and no doubt will continue to do so for years to come. Hygienic macros have become, only in the last few years, quite common in some mainstream languages like Rust and Julia. At RacketCon just last year we saw a talk that was bringing ideas from Racket/Rhombus’s macro system into Lean. Languages like OCaml are shipping with algebraic effect handlers, which have their roots in delimited continuations, which were pioneered by Lisps. Clojure’s high-performance immutable data structures have been hugely influential on other languages. I could go on and on and on…
Sorry, but I can’t help myself because there’s another obvious thing to mention: gradual typing. Nowadays seemingly everyone uses TypeScript, but that line of research has a long history in the Scheme community, starting at least in the early ’90s with Fagan’s soft typing.
^ I think that not knowing history (and maybe it's hard to find out, certainly informally) is part of what makes it hard to see the influence of Lisp (or whatever, actually)

100 g-exps wrote (responding to the Scheme & hygienic macros comment in the screenshot):

it's a blatant lie, see racket's syntax/parse. also, common lisp's unhygienic macro system can be implemented using scheme macros (and vice-versa actually).
it's one of these "listen to me kid, I've seen shit, all you know is a lie" posts - you're likely to believe them because the author sounds like somebody who knows their stuff. it's not always the case :mew:
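To make the "vice versa" direction concrete, here's a minimal sketch of mine (not from the Discord) of deliberately breaking hygiene in Racket with datum->syntax -- essentially how defmacro-style macros are emulated on top of a hygienic expander:

    #lang racket
    ;; An intentionally unhygienic macro: 'it' is injected into the caller's scope.
    (define-syntax (anaphoric-if stx)
      (syntax-case stx ()
        [(_ test then else)
         (with-syntax ([it (datum->syntax stx 'it)]) ; re-introduce 'it' in the use-site context
           #'(let ([it test])
               (if it then else)))]))

    (anaphoric-if (assq 'b '((a 1) (b 2)))
                  (cadr it)      ; 'it' names the value of the test expression
                  'not-found)    ; => 2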

usao wrote:

I don’t think it’s possible to implement hygienic macros without altering the expansion model
And pedantically speaking, hygienic macros are more powerful, not the other way around

2 Likes

Thanks for that -- I appreciate it and I'm aware of Clojure, but I didn't know about that statistic. I guess to be more specific, by influence I mean more "impact on other languages" -- ideas from ML-family languages, like pattern matching (which I think of as an ML feature, but I also don't know that much, so please correct me), have been entering mainstream languages like JavaScript and C# and even Python, whereas I have been seeing less that obviously originates from Lisp languages.

Some folks in the Discord spoke to that though, and showed me several compelling examples which I shared in my last post.

Are 10 examples enough?

-- catch and throw (in the 1960s) became exception handling (for certain reasons, it is "officially" attributed to CLU from the mid 1970s)
-- heap allocation and garbage collection came from Lisp
-- every data exchange language (XML, JSON) is a bad version of S-expressions
-- mutable cells -- as opposed to variable assignment -- became boxes (ref) in ML and Rust
-- lambda, i.e. first-class functions, from Lisp and Scheme got into a fair number of languages, including ML (from the 1970s); to be fair, they floated around in 'theory' and have many origins
-- optional types showed up in CL first; CLOS used to be pronounced C Lost but that turned out to be wrong :-); see TypeScript
-- first-class continuations from Racket entered SML/NJ in 1987, OCaml in the 1990s, and the Haskell continuation monad in the late 90s
-- delimited continuations became available in Scala by 2010 or so; a week ago their Haskell implementation was presented at LambdaDay in Kraków
-- first-class and recursive modules jumped from Racket to ML in the 00s
-- Rust borrows Racket's primitive macros (and they keep adding power); also see sweet.js
-- Rust has traits, which are partly due to Bracha/Cook and Racket, and detoured through Nierstrasz's Squeak group

9 Likes

On the specific topic of macros, I'd say the comment starting "That's because the magic ...." doesn't demonstrate a particularly deep understanding of the state of the art in the research world. That said, there's no doubt that there are still problems to overcome that are legitimately considered research.

I'll point you to Matthew et al's recent efforts on Rhombus as somewhere to look at how people are thinking about overcoming the sexp surface syntax.

Overall on macros, I'd say that there is more that's good and ready than is missing. I recently gave a talk on the technical aspects of macros that you might want to watch. Here's a link to youtube plus other details. tl;dr: a lot has happened in the 40 some years since hygiene was first introduced and it has significantly changed the macro-based programming experience.

Hope that helps.

7 Likes

Beautiful, yes, thank you!

I'm a beginner to Lisps, maybe intermediate at Racket in particular, and not very familiar with the programming language research timeline. I've dabbled in many different programming languages and still use a mixed cocktail of them today. As someone who came to Racket only relatively recently, but with a fair bit of experience in non-Lisp languages, I'll share a couple of thoughts.

I've always appreciated that many of the programming language features I love the most originated in Lisp or Scheme, and I think we should all be happy that they are becoming ubiquitous. What attracts me to Racket is a unique cocktail of features that I suspect reflects both the Lisp origins of the language, and the community being tuned in to the state-of-the-art (but I could be wrong, so corrections are appreciated).

Support for, and ubiquitous use of, dynamically scoped variables (parameters) in the standard library makes functional programming significantly more ergonomic. I've heard this comes from CL but I haven't seen it much elsewhere.
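
A minimal sketch of what that looks like (my own toy example, not one from the standard library docs):

    #lang racket
    ;; A parameter is a dynamically scoped variable with a lexically visible name.
    (define indent-level (make-parameter 0))

    (define (show msg)
      (printf "~a~a\n" (make-string (indent-level) #\space) msg))

    (show "top")                       ; no indentation
    (parameterize ([indent-level 4])
      (show "nested"))                 ; any call made here sees indent-level = 4
    (show "top again")                 ; binding is restored automatically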

Coroutines are integrated into the standard library. I always believed that the killer feature of Go was that the task multiplexer wasn't just tacked on as an optional library, but every I/O function in the standard library was multiplexer-aware, which in turn propagates to third party libraries that use the standard library. Obviously, Go didn't invent this, and Erlang arguably has a lot more going for it than just this simple property, but it makes a huge difference, and I'm pointing out Go because it seems (IMO) to be largely carried by popularizing this alone. As far as I can see, the question of how to handle concurrency is still hotly debated in language design, but taking the coroutine stance is a major advantage over languages that don't take a stance whatsoever. Impressively, Java recently managed to sneak in coroutines with Project Loom.
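
A minimal sketch of what that looks like in Racket (my own toy example): lightweight threads plus a channel, where blocking operations suspend only the current thread, not the whole program.

    #lang racket
    (define ch (make-channel))

    ;; A producer thread; channel-put blocks until someone receives.
    (define producer
      (thread (lambda ()
                (for ([i (in-range 3)])
                  (channel-put ch i)))))

    ;; The main thread consumes; channel-get blocks only this thread.
    (for ([_ (in-range 3)])
      (printf "got ~a\n" (channel-get ch)))

    (thread-wait producer)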

A recent development of the above is the idea of structured concurrency, where many languages are adding "nursery" types to better handle coroutines. While in those languages it's suggested to pass these around as explicit parameters/static binding, they do sound a lot like Racket's custodian type. I've never actually used custodian in a program so I could be way off the mark here.
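
Here's a minimal sketch of what I mean, in case it helps (again, I haven't used this in anger, so treat it as an illustration rather than best practice):

    #lang racket
    ;; A custodian manages the threads, ports, etc. created under it;
    ;; shutting it down reclaims all of them at once, nursery-style.
    (define cust (make-custodian))

    (parameterize ([current-custodian cust])
      (thread (lambda () (let loop () (sleep 1) (loop))))
      (thread (lambda () (let loop () (sleep 1) (loop)))))

    ;; ... later: kill both workers (and close any ports they opened)
    (custodian-shutdown-all cust)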

The next one is more of a long shot because I was only recently introduced to it and I have no idea what I'm talking about, but is the Racket unit mechanism the equivalent of ML-style modules? I haven't seen this used in the "Racket wild", but a common criticism of languages with monomorphization, like C++ and Rust, is slow compilation and poor handling of optional dependencies. Parameterization at the module level seems like it might be a relevant topic for the next generation of languages trying to tackle this, while some languages already have it.
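
To make the comparison concrete, here's a rough sketch of a unit parameterized over a logger interface and linked later (I may be off on idiomatic usage, so corrections welcome):

    #lang racket
    (require racket/unit)

    ;; A signature is an interface; a unit imports and exports signatures
    ;; and is linked against other units later, like applying an ML functor.
    (define-signature logger^ (log-line))
    (define-signature stack^ (push))

    (define-unit stack@
      (import logger^)
      (export stack^)
      (define (push s v)
        (log-line "push")
        (cons v s)))

    (define-unit console-logger@
      (import)
      (export logger^)
      (define (log-line msg) (displayln msg)))

    ;; Link the two units and pull their exports into this module.
    (define-values/invoke-unit/infer (link console-logger@ stack@))
    (push '() 1)   ; prints "push", => '(1)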

Macros/metaprogramming/CTFE have already been discussed, but many modern and popular languages still don't have these facilities, and in some that do, the implementation is still in flux. Scheme has a stable and time-tested story here, which is nice.
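
As a tiny illustration of that stable story (my own example, nothing fancy): the classic swap macro, which is safe precisely because expansion is hygienic.

    #lang racket
    ;; Hygiene means the 'tmp' introduced by the macro can never capture
    ;; or be captured by a 'tmp' at the use site.
    (define-syntax-rule (swap! a b)
      (let ([tmp a])
        (set! a b)
        (set! b tmp)))

    (define tmp 1)   ; a user variable that happens to be named tmp
    (define x 2)
    (swap! tmp x)
    (list tmp x)     ; => '(2 1), no accidental capture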

Gradual typing has also been mentioned but it's interesting to see how it's still extremely relevant - just recently there was a lot of media hype about Mojo and it's interesting to see it from the perspective of gradual typing. I'd consider Racket at the bleeding edge here.
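
For what it's worth, a minimal sketch of the Racket side of this (Typed Racket), just to make it concrete:

    #lang typed/racket
    ;; A typed module; untyped Racket modules can require it and call greet,
    ;; with contracts enforcing the declared types at the boundary.
    (: greet (-> String String))
    (define (greet name)
      (string-append "Hello, " name))

    (greet "world")   ; => "Hello, world"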

Now, the thread is about Lisp and not Racket, but my point is that I think the academic leanings of many Lisps, Racket included, are not something that was merely an advantage in decades past, but are still an advantage today. I'm even less familiar with other Lisps, but I've seen enough to know that there are very interesting developments happening there too. One project that comes to mind is the Goblins library, where development is now mainly centered around Guile and its nascent WebAssembly target, and where much of the focus is capability-based security, a hot topic right now.

The two cents of a layman, so take it with a grain of salt :slightly_smiling_face:

7 Likes

IMO, the answer to your question is just the old dichotomy: "I wanna make something, gimme libs!" vs. "I wanna make something completely new, gimme lisp!"

Programmers swearing by lisp often are not making your run-of-the-mill 100000th-of-the-same REST app. And yes, there are certainly better tools to do that. If you compare what is first class / emphasized in the standard library of various (non-)lisps, the differences will be very obvious.

Thanks for the comment -- sorry, I don't think I ever asked specifically about making CRUD apps / REST services, so obviously I wasn't making myself clear. That's my fault for not asking a clearer, more focused question. If I were to make things a bit more focused, I would phrase my question this way:

Lisp was obviously deeply influential on the development of other programming languages, contributing ideas like GC, dynamic typing, and macros, among others. Most modern programming languages now have those features. Is it the case, then, that modern programming languages have already absorbed all the useful parts of Lisp, and so have nothing more to learn from it? Or are Lispy languages still innovating and influencing the development of modern languages (and likewise being influenced by other languages)?

Despite my phrasing of the question, I do want the answer to be yes, and I assumed it was the case, otherwise there wouldn't be such robust and active communities around these languages both in academia and industry. Others pointed out some great examples that confirmed this to be the case -- optional typing in TypeScript, traits in Rust, macros in Rust and Julia, among lots of other beautiful examples. Would love to know about others.

1 Like

Just wanted to include here a great response from one of the lead devs of CHICKEN Scheme, who commented on the lobste.rs topic linking to this thread:

As a Scheme implementer, my view is probably biased, but allow me to do a hot take: Yes, the mainstream has caught up.

This means that all the smug Lisp weenies from yesteryear who were trying to use the advanced features of Lisp as a selling point have “won” in the sense that other languages now have integrated most of those things that made Lisp unique. They managed to drag the languages closer towards Lisp, which means the programmers can keep on using their chosen languages (so they have also “lost” in that sense). Hell, even the quintessential clown car language PHP now has closures and other advanced features. So much so that it is often regarded as having come into its own and being a proper, respectable language.

However, the mere fact that people are even asking this question points to a deeper truth. Nobody asks such questions about other languages. Lisps have been ahead for so long, and that’s because Lisp is the perfect language research vehicle. It is eminently hackable, you can add new features that seamlessly integrate with the language easily. When object orientation became a thing, Lisp just absorbed it and it became a natural extension of the language (and while at it, they threw in some fancy-ass metaprogramming). And if your new idea is different enough, you can bootstrap a new s-expression based language (which will look like Lisp, but can be something else entirely) in no time, as the core of a compiler or interpreter is quite tiny and you can lean on the existing runtime system. So for new research, Lisp is well-suited. This is also the reason these Racket folks are so defensive in their responses to OP: they can’t understand how anyone couldn’t see that. But that’s mostly because they’re academics, and the question at hand is about practical uses.

Personally, I prefer working in Scheme because it is a language that fits well in my head. It’s quite small and anything that gets added to it “feels” native and well-designed. It has a certain self-contained elegance to it that you don’t see in many other places. The way to implement things typically “falls out of” the design. As a simple example, consider adding a wrapper to an existing procedure. In Lisps that’s trivial: you just redefine the procedure and wrap it up in a lambda. In Python you can do this too, except when you’re in a class. This adds all sorts of artificial complications, so they invented decorators to make it more convenient. Everything added to languages like that feels bolted on. “Just add more syntax and it’ll be easy”, they say. Except it leads to concept bloat and everything just becomes more complicated and you have to know more and more of these subtle details of how things interact. It just sucks the joy out of it, for me at least. And if your mind works like mine, perhaps Lisp is for you.

So, for the working programmer without any interest in languages for their own sake, you’re probably better off using a mainstream language, and there’s little point in learning a Lisp (especially if you are a hater of all the parentheses). But that’s actually true for any non-mainstream language. Especially because $API_OF_THE_DAY is probably not implemented yet in non-mainstream languages and you have a lot more work to do.
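
To make the wrapper example from that reply concrete, here's a minimal sketch of mine (not the original commenter's): at a REPL you can literally re-define the procedure, and in a module set! does the same job.

    #lang racket
    (define (shout s) (string-upcase s))

    ;; Wrap the existing procedure: log every call, then delegate to the original.
    (set! shout
          (let ([original shout])
            (lambda (s)
              (printf "shout called with ~s\n" s)
              (original s))))

    (shout "hi")   ; prints the log line, => "HI"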

6 Likes

Kernel by John Shutt is fascinating.

Loko Scheme https://scheme.fail/
http://metamodular.com/closos.pdf
Convex

I can't find Algebraic Data Types in Lisp. I guess they are beyond Lisp.

1 Like

@DKordic
Did you look at pkgs.racket-lang.org ?

Some options:

  1. Algebraic Data Types for compilers

  2. Algebraic Racket

  3. GitHub - pnwamk/datatype: (Somewhat) Algebraic Data Types for Racket

  4. The language for EoPL also implements define-datatype.

However, there's lots of space between a "run-of-the-mill 100000th-of-the-same REST app" and "something completely new". I think that's the reason why it's often still difficult to decide whether Lisp/Scheme or something else is a better fit for a project.

Also, it gets more complicated because many projects have parts that are "run-of-the-mill" and parts that are "completely new" (or quite new).

Good point. I also think that's one aspect to take into account when choosing a programming language.

It sometimes frustrates me that in the Racket ecosystem the ratio of language-manipulating tools compared to tools for "real world" problems is so big. That said, I'm aware that for programming language researchers manipulating language concepts is their "real world" problem, so I understand why this ratio is what it is. :slight_smile:

Edit: The ratio itself maybe isn't the main problem. It's more the combination of the ratio and the fact that the Racket community is so small that there's - compared to many other languages - rather little support for "real world" problems.

1 Like

Yes:

  • Manual JIT is still something that can only be done with some lisps; I read it is coming to LLVM and GCC;
  • I learned about Futamura projections and 3-Lisp by way of Kernel and Nada Amin, and those have real-world impact if/when understood;
  • What I call "transparent async", that is, async without the keywords async or await (Racket has transparent async);
  • You can find it in non-lispy systems, but continuations travelling over the network are really neat;
  • Multi-tier applications built out of a single code base; see hop.js by Manuel Serrano; tl;dr: it is more general than JavaScript universal web apps;
  • I only read about real-time GC when looking at Pinball Tristan;

I almost forgot to mention tail-call and tail-call-modulo-cons optimization; having something that "Just Works" is the definition of... boring technology.
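
To make the "Just Works" point concrete, a tiny sketch of mine: the loop below runs in constant stack space because proper tail calls are guaranteed, so no dedicated loop construct is needed.

    #lang racket
    ;; The recursive call is in tail position, so it reuses the current frame.
    (define (count-down n)
      (if (zero? n)
          'done
          (count-down (sub1 n))))

    (count-down 10000000)   ; => 'done, no stack overflow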

A lot of things in the Western world do not make sense anymore. There is a (to me) clear tendency towards herd mentality and away from individual intellectual and educational strength, towards groupthink and leveling out the capacity of people inside a team (or, if you prefer, averaging out). Clever programmers used to be the norm 30 years ago; now they are looked down upon as people who are in the way of producing quality solutions because they are too different from the team, and this is somehow an impediment. It's not as if code quality or solutions have improved over the years - most programmers these days are glorified glue monkeys who connect things together on the assumption that the interfaces exposed to them are guaranteed to work, but without an understanding of how they work (increased complexity).

People who go into LISP/Scheme-style languages tend to be what I call "independent contributors" (not to be confused with corporate structural layers like "individual contributor"), and they tend to be on the "smart" side, interested in math and/or how things work at a deeper level.

The main complaint I hear about macros is that they change the language so much (in the particular problem solution) that another programmer on the team will have a hard time understanding what is going on and will often have to "reverse engineer" what is happening, not to mention whoever takes over maintenance of the code. You mean to tell me that the team member should be as smart as someone else but also be given the time to figure things out without having someone breathe down their neck?

So, what is the solution to this problem for the majority of "average" people? Ban you and find languages that are "simple" and "easy to learn and understand and use". It's the same thing with everything else in society, isn't it? "Everything" is getting "simpler" because more people with fewer faculties are expected to use "everything". This applies to everything from basics like a coffee machine to things like cars and, yes, programming computers.

Part of the above lies in capitalism and its inherent pressure-creating nature - there is an expectation of continuous creation/production and competition, and the pressure is always on and growing. With it, complexity increases because people are trying to do more and more and are expected to know more and more. Paradoxically, a university graduate from the 1930s had a smaller body of knowledge to master but was very likely better educated than today's equivalent graduate (just my opinion, based on 30 years in the industry and hiring new graduates to this day). To facilitate all this, the natural knee-jerk reaction is to average out and dumb down, and the first area where that is natural and visible is the tools and the people.

Anyway, you may get the impression from what I wrote above that I am an elitist who thinks they are smarter than others. I am not - Scheme and macros and Racket have not been easy for me at all. But something not being easy, and requiring brain power and work to learn, used to be a point of pride (and yes, there are sometimes things that are unnecessarily complicated, but that is not what LISP is - it is actually quite simple - it is the complexity that arises from this simplicity and the underlying concepts that people find complicated).

2 Likes

Totally off topic. So let’s have fun :slight_smile:

A lot of things in the Western world do not make sense anymore.

Yeap :slight_smile:

Part of the above lies in capitalism and its inherent pressure creating nature

Capitalism — to re-use this awful Marxist-American term here — created all of this, including the ‘independent contributor’. Feudal societies or road-to-serfdom societies never knew such individuals (at the percentage level we see them here).

Anyway, you may get the impression by what I wrote above that I am an elitist .. Scheme ..

When I started teaching here, the ‘education dean’ yelled at me (yes, ‘yell’) that Scheme was for elitists and introducing it here would leave everyone behind. The opposite happened .. companies stood in line to hire our grads after a few years, even during the financial crisis. — Now that we’re leveling — like the rest of society — our students get laid off from internships (co-ops) and some grads have trouble finding jobs. More than half — probably three quarters — avoid any course that doesn’t have an A-ish average. Strangely enough, those are also having a hard time on internships.

1 Like

I wrote a reply :slight_smile: but this IS way off topic and I decided the reply was not appropriate for the original topic. Maybe we can start a new one. Thank you @EmEf

1 Like

There are a lot of Scheme voices in here, naturally. I'd like to hear more from Common Lisp and Clojure users about this topic. One thing that I didn't see in here was REPL-driven development. DrRacket doesn't implement it, so that might be part of the reason. Rich Hickey added it for Clojure and it's been in Common Lisp (and others?). My experience is limited, but I enjoy it much more than the REPL in other languages or their re-compilation process.

Some of the comparisons in these replies, like TS implementing optional typing, seem to gloss over how the features are implemented. I'm reminded of Greenspun's Tenth Rule:

Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.

I was just reading Beautiful Ideas in Programming: Generators and Continuations. From that post:

Rather, what I want to do is to demonstrate that generators are special cases of a much more powerful construct - continuations. Continuations allow programmers to invent new control structures, and it is the foundation upon which iterators, generators, coroutines, and many other useful constructs can be built.

Many languages have iterators, generators, etc. Do they have continuations? I don't know any that implement their iterators and generators via continuations.
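
I found it illuminating to sketch that construction myself. Here's a rough, simplified Racket version of how a generator can be built from call/cc -- it glosses over the edge cases that racket/generator handles properly, so treat it as an illustration:

    #lang racket
    ;; A stripped-down sketch of building a generator from first-class
    ;; continuations: 'yield' jumps back to the caller, and the next call
    ;; jumps back into the suspended producer.
    (define (make-generator producer)
      (define return-k #f)   ; where to jump to hand a value back to the caller
      (define resume-k #f)   ; where to resume the suspended producer
      (define (yield v)
        (call/cc
         (lambda (k)
           (set! resume-k k)
           (return-k v))))
      (lambda ()
        (call/cc
         (lambda (k)
           (set! return-k k)
           (if resume-k
               (resume-k (void))
               (begin
                 (producer yield)
                 (return-k 'done)))))))

    (define next
      (make-generator
       (lambda (yield)
         (for ([i (in-range 3)])
           (yield i)))))

    (next)   ; => 0
    (next)   ; => 1
    (next)   ; => 2
    (next)   ; => 'done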

So we might read that other languages are acquiring the things Scheme or Lisp implemented long ago. My push-back concerns the quality of those implementations and the cruft they leave behind in those languages, which will (soon, surely) have to be deprecated.

One commenter used PHP as an example. I use PHP all day at work and I can assure you that closures and lambdas are only just starting to approach usability. Consider the closure over a variable. For a long time you had to specify which variables the lambda had access to: function ($value) use ($variableInParentScope) {...}. Then in 7.4 arrow functions arrived, which can access variables in the same scope as the lambda definition. Oh, but arrow functions don't allow multiple lines; you have to use the older form for multi-line lambdas. Both closures and lambdas are implemented, but they still need more improvement. I'd guess it's the same in other languages.
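
For contrast, here's what the same kind of closure looks like in Racket -- no capture list, no single-expression restriction. Just my own sketch to make the comparison concrete:

    #lang racket
    (define greeting "Hello")

    ;; Multi-line body; closes over 'greeting' automatically.
    (define greet
      (lambda (name)
        (define message (string-append greeting ", " name "!"))
        (displayln message)
        message))

    (greet "world")   ; prints and returns "Hello, world!"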

One comment described how there's a difference between developers who write software and others who write software with an interest in the language and its elegance. I think Lisp/Scheme developers fall into the latter category. If you're in the latter category then isn't it nice to implement something in the language without having to wait for the maintainers of the language to implement it?

Lastly, what if the fun of Lisp is the magic?

2 Likes