I admit I'm one of those students who never used Racket in a non-academic setting (mostly because I needed to contribute to already-existing projects written in different languages), and I was taught Racket by one of its main contributors, John Clements, at Cal Poly San Luis Obispo. Still, learning Racket planted a seed in me that would later grow into a love of programming languages beyond industry-standard imperative ones.
I took a two-quarter series of classes from John Clements: the first was a course on programming language interpreters, and the second was a compilers course. The first course was taught entirely in Racket (then called PLT Scheme, in the DrScheme environment). As a guy who loved C and wanted to be the next Dennis Ritchie, I remember hating Racket at first, with all of its parentheses, feeling restricted by immutability, and needing to express repetition using recursion. However, we gradually worked our way toward building a Scheme meta-circular evaluator. The second course was language-agnostic. Our first assignment was to write an interpreter for a subset of Scheme, and we were allowed to use any language. I was tired of Racket and wanted to code in a much more familiar language: C++. Surely this was cause for a sigh of relief, right?
It turned out that C++ was a terrible choice for the job. I ended up writing a complex inheritance hierarchy of expression types, which could have easily been implemented using Racket's pattern matching capabilities. Additionally, C++ requires manual memory management, and this was before the C++11 standard with its introduction of smart pointers. Finally, I learned how functional programming paradigms make testing so much easier, compared to using object-oriented unit testing frameworks and dealing with mutable objects. I managed to get the project done and working in C++, but only after a grueling 40 hours.
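To give a flavor of the difference: the expression hierarchy that cost me a whole C++ class hierarchy is a few structs plus `match` in Racket. This is a hypothetical sketch with made-up names, not the actual assignment code:

```racket
#lang racket

;; Variant types are just structs; no visitor pattern or virtual dispatch needed.
(struct num (n) #:transparent)
(struct add (l r) #:transparent)
(struct mul (l r) #:transparent)

;; Pattern matching replaces the entire inheritance hierarchy.
(define (eval-expr e)
  (match e
    [(num n) n]
    [(add l r) (+ (eval-expr l) (eval-expr r))]
    [(mul l r) (* (eval-expr l) (eval-expr r))]))

(eval-expr (add (num 1) (mul (num 2) (num 3)))) ; ⇒ 7
```

In C++ (pre-C++11, at that), each variant was a class with a virtual `eval`, plus manual `delete`s for every subtree; here the garbage collector and `match` do all of that work.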
I never complained about Racket after that.
In graduate school, I was taught Scala and Haskell by Cormac Flanagan, who also contributed to Racket. Sometime after graduate school, I got bitten by the Smalltalk and Lisp bugs, hard. Now I do a little bit of research on programming languages when I'm not busy teaching classes as a community college professor. I find Futamura projections quite fascinating.
I'm glad I was taught programming languages from John Clements and Cormac Flanagan. They planted seeds that later bloomed into a love for programming languages.
that's an often repeated misconception about lisps.
lisps are pretty good at low-level programming, but then you'll need to make some compromises like abandoning the reliance on the GC and managing memory manually (which is still a lot easier than in other languages due to the metaprogramming capabilities).
there are lisps that can compile themselves to machine code in 2,000-4,000 LoC altogether (i.e. compiler and assembler included; https://github.com/attila-lendvai/maru).
i'm not saying that there are lisp-based solutions that are ready for use in the industry. what i'm saying is that the lisp language is not at all an obstacle for memory-limited and/or real-time programs. it's just that few people use them, especially in those fields.
and there are interesting experiments for direct compilation, too:
BIT: A Very Compact Scheme System for Microcontrollers
http://www.iro.umontreal.ca/~feeley/papers/DubeFeeleyHOSC05....
"We demonstrate that with this system it is clearly possible to run realistic Scheme programs on a microcontroller with as little as 3 to 4 KB of RAM. Programs that access the whole Scheme library require only 13 KB of ROM."
"Many of the techniques [...] are part of the Scheme and Lisp implementation folklore. [...] We cite relevant previous work for the less well known implementation techniques."
People always point this out as a failure, when it is the contrary.
A programming language being managed doesn't mean we need to close the door to any other kind of resource management.
Unless it is something hard real-time (and there are options there as well), we get to enjoy the productivity of high-level programming while still having the tools at our disposal to do low-level systems work, without having to mix languages.
C++ is one of my favourite languages, and I got into a few cool jobs because of my C++ knowledge.
However, given the option, I would mostly reach for managed compiled languages as a first choice, and only turn to something like C++ if really, really required, and even then probably as a native library that gets consumed, instead of 100% pure C++.
I didn’t know you liked C++. I’ve been reading your posts, and your advocacy of the Xerox PARC way of computing, for a few years now. I’ve found that most Smalltalkers and Lispers are not exactly fond of C++. To be fair, many Unix and Plan 9 people are also not big C++ fans, despite C++ also coming from Bell Labs.
Back when C++ was becoming famous, my favourite programming language was Object Pascal, in the form of Turbo Pascal, having been introduced to it via TP 5.5 OOP mini booklet.
Shortly thereafter Turbo Pascal 6 was released, and I got into Turbo Vision, followed by Turbo Pascal for Windows 1.5 on Windows 3.1 the year thereafter.
I was a big Borland fan, so when you bought the whole stack it was Object Pascal/C++; naturally C was there just because all C++ vendors started as C vendors.
On Windows and OS/2, C++ IDEs shared a lot with Smalltalk and Xerox PARC ideas in developer experience; it wasn't the "vi + command line + debuggers are for the weak" kind of experience.
See Energize C++, which Lucid built as it pivoted away from Common Lisp, with Cadillac being what we would call an LSP nowadays, where you could do incremental compilation at the method level and hot reload.
You're right: although C++ was born on UNIX at Bell Labs, that point of view exists, and it's also a reason why I always had much more fun with C++ across Mac OS, OS/2, Windows, BeOS, and Symbian, with their full-stack frameworks and IDE tooling.
However, with time I moved into managed languages, application languages, where it is enough to make use of a couple of native libraries if really required, which is where I still reach for C++.
I use it professionally. My favorite is its seemingly complete lack of bad behavior:
"3" + 1 is neither "4", "31", nor 4. It's an error.
0 is not false, avoiding the endless confusion other languages cause with filters and &&s.
For loops don't leave every closure over the loop variable seeing the final value by the time it's finally called.
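To make those three points concrete, here's a small sketch of each (strings don't silently coerce, only #f is false, and `for` binds a fresh loop variable per iteration):

```racket
#lang racket

;; (+ "3" 1) is a contract violation at runtime, not "31" or 4.

;; Only #f is false; 0 and '() are truthy.
(if 0 'truthy 'falsy)            ; ⇒ 'truthy
(filter values (list 0 1 #f 2))  ; ⇒ '(0 1 2), only #f is dropped

;; Each iteration of `for/list` binds a fresh i, so closures
;; capture their own value, not the final one.
(define thunks
  (for/list ([i (in-range 3)])
    (lambda () i)))
(map (lambda (t) (t)) thunks)    ; ⇒ '(0 1 2)
```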
And some positives:
Immutable/functional is the default, but mutability is easy too.
Nice optional, keyword, and variable arity support.
Straightforward multithreading, semaphores, shared state, and unshared state.
Excellent module system:
- renames both in and out, including prefixes, all applied to arbitrary scopes of identifiers (I may be using inaccurate terminology)
- nested submodules
- automatic tests and/or "main" submodules
.....etc.......
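A quick illustration of two items from that list, the argument support and the automatic test submodule (the function name here is made up for the example):

```racket
#lang racket

;; Optional argument (#:greeting has a default), keyword argument,
;; and variable arity (extras collects any remaining positional args).
(define (greet name #:greeting [greeting "Hello"] . extras)
  (string-append greeting ", " name
                 (if (null? extras)
                     ""
                     (format " (~a)" (length extras)))))

(greet "Ada")                 ; ⇒ "Hello, Ada"
(greet "Ada" #:greeting "Hi") ; ⇒ "Hi, Ada"
(greet "Ada" 1 2 3)           ; ⇒ "Hello, Ada (3)"

;; The test submodule runs under `raco test` but is not loaded
;; when the module is required normally.
(module+ test
  (require rackunit)
  (check-equal? (greet "Ada") "Hello, Ada"))
```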
If I could be granted a wish, though, it would be for nice struct syntax, but I think that's in Racket's successor Rhombus; I haven't personally tried it yet.
I also sometimes wish it was slightly Haskell-ier in various ways, as did the talented individual who created Hackett.
If I were to guess why it's not used, it's because it's not used, which has a kind of downward-spiral force going on with it. If you're a random guy in charge of 200 dudes at BigCo, your first thought probably isn't "We should rewrite this whole thing in Racket!", it's probably more like "We should fire everyone and have Claude rewrite our codebase into Rust!", then tell your boss you saved 200*0.5M a year and ask for a commensurate bonus. But if you're solo and in charge of designing, implementing, and maintaining a system with 1 or 2 people over the next 20 years, you can use whatever language you want, and Racket's a pretty good choice.
None of this really ruins the language for me, considering pros vs. cons as a whole, but sometimes it slows me down: by the time I finish mentally spelling out and typing the struct accessors, I've half forgotten the context I was in. And in general I'm sensitive to "eye bleed". Sometimes Racket looks like:

  (define define match-define define define-values define begin cond define define)

when the real meat of the algorithm is more like:

  cond

...which is where Haskell's "where"s, "|"s, and "="s shine.
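To make the struct-accessor complaint concrete (with hypothetical names): every level of nesting spells out a full accessor name, where a dot-notation language would write `e.pos.x`:

```racket
#lang racket

(struct vec (x y) #:transparent)
(struct entity (pos vel) #:transparent)

(define e (entity (vec 1 2) (vec 0 0)))

;; Each nesting level needs the full generated accessor name:
(vec-x (entity-pos e)) ; ⇒ 1
;; versus something like e.pos.x in a language with dot access.
```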
I'm sure I've over-answered your question but it's the holidays and I'm bored :)
edit: Since Racket uses dot already, it would probably have to be a different character, or the other way around.
University is there to open people's horizons, to teach you how to learn, to see computing systems in action that most people in programming bootcamps never deem possible, unless they are curious enough to learn about computing history.
Sometimes it takes a couple of years before a seed grows. I for one had a professor who said: "I am not here to teach you C or Java. I am here to teach you computer programming." and then went on to take us on a tour through various paradigms, including Prolog, back then DrScheme (which turned into Racket), C, Java, and Python. At the time I didn't understand Scheme at all. Didn't understand the idea of passing a function as an argument, so deeply rooted in the imperative world I was. But a couple of years later, I came upon HN and comments mentioning SICP ... Didn't that professor teach us something about that? What if I took a look and started learning from this book everyone is recommending?
And there it was. I worked through approximately 40% of SICP's exercises and came away with a very solid grasp of Scheme and Racket, and for any hobby project I would take out Racket and try to build it. Along the way I learned many things that I would still not know today had I stuck with only mainstream imperative languages. I wouldn't be half the computer programmer that I am today without going the SICP and Scheme way. I also worked through The Little Schemer. What an impressive little book it is!
So it is far from what you claim. In fact even a little exposure to Scheme once upon a time can make all the difference.
Everyone gets to choose which language they use for their personal projects.
Where are all the Racket personal projects?
N.B. I say this as someone who personally contributed small fixes to Racket in the 90s (when it was called mzscheme) and 00s (when it was called PLT-Scheme).
I view Racket as an academic language used as a vehicle for education and for research. I think Racket does fine in its niche, but Racket has a lot of compelling competitors, especially for researchers and professional software engineers. Those who want a smaller Scheme can choose between plenty of implementations, and those who want a larger language can choose Common Lisp. For those who don't mind syntax different from S-expressions, there's Haskell and OCaml. Those who want access to the Java or .NET ecosystems could use Scala, Clojure, or F#.
There's nothing wrong with academic/research languages like Racket, Oberon, and Standard ML.
I wish Standard ML had a strong ecosystem and things like a good dependency/package manager. I really liked it. But there is even less of an ecosystem around it than around some other niche languages, and I've gone down the rabbit hole of writing everything myself often enough to know that at some point I will either burn out or hit the limits of my mathematical understanding. For example: how to get a normal distribution when the standard library only gives you a uniform one. There are many approaches to approximating it, but to really understand them you need to understand a lot of math.
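For what it's worth, the usual uniform-to-normal trick is the Box-Muller transform, and it needs less math than the approximation methods. A sketch in Racket (keeping to this thread's language; the same few lines port directly to SML), assuming only a uniform `random`:

```racket
#lang racket
(require racket/math) ; for pi

;; Box-Muller: two uniform(0,1] samples ⇒ one standard-normal sample.
;; z = sqrt(-2 ln u1) * cos(2π u2)
(define (normal-sample)
  (define u1 (- 1.0 (random))) ; shift (0,1] to avoid (log 0)
  (define u2 (random))
  (* (sqrt (* -2.0 (log u1)))
     (cos (* 2.0 pi u2))))

;; The sample mean over many draws should sit near 0:
(define xs (for/list ([_ (in-range 10000)]) (normal-sample)))
(/ (for/sum ([x xs]) x) 10000.0) ; ≈ 0
```

The sine counterpart of the same formula gives a second independent sample per pair of uniforms, if you want to avoid wasting draws.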
Anyway, I like the language. Felt great writing a few Advent of Code puzzles in SML/NJ.
Racket is my first choice for most code I write these days and I've published a fair number of libraries into the raco package manager ecosystem in hopes other people using Racket might find them useful too.